[Binary content removed: tar archive of Zuul CI output (var/home/core/zuul-output/, owner core) containing logs/kubelet.log.gz, a gzip-compressed kubelet log. The compressed payload is not recoverable as text.]
_ \ϵ5m1 >A(Z0;qJKN@ћH~c)N)H>{OOzD}?<.N7o\-DO`ͥ#rK>0 I?ʼn95JP]||0U\OjS5y?> O5E+ɣ}*Rzv9F"rrN.Swb*\I5%xuo?瘨wyw[ 2LߟDO G s Mm149g=Ym92_U ϗ]Z(6tQ g >x˘-rcPCuINO` ``OYY%gr`d=#(wR?:GL#_ۨ $Zc8C"29)q8qNjW3;2ՄGxMhr6bd`NIJx4t'tCBUr6Fw9zUKNQV+0`oux+){+{bKs4SQ'`YƘcxtݏⰴ;kH)EԒ)Ej Z53GeŃE=O=ŀ1T)b+t@HmojRz\TҧLvXO`UQ̛"K.l o:b3e WV'!_@}O#w!WxyqgD 5e@iآIe3Co]ξָo _ԐH{ݟ@<[%X{f,lYkó Gg:+~nMMg'ՍL_i?ߌU|`?MJߛ&H hv^NIX܁z=9d/'3cIqr H$j:ThQjcI&zGm}+iW/ ?(Fzl%^v C!8ҚՋ]UúQ~T䦮uO-D!tyNwI <Ԡ0=൤,RZz)8Fz&Is ,0JX}Y Q!51~N3I 2x2o#GWm#Wl&AA)5O}:AtЧ>h:ԝukZw֝ukʏj!nu qV9}]ൽrM/xZd ׯtCwe[~_r}-wK}K=ҷ^#^l'$%''EusS)mS%Y 3{tm_DQ0kJ*Y0R`D$T/b4fqg9iY/4ڸa!HA!A,Dc<4%Fg[Pyd(AZ3ci/%%\2N%0l\Әll>8)ĉI&t8U[mw1lpu_=JE>sy۫`6X4Zm3i%Q- ZY$ֶXq,@nY$nTt'; Ĺ-؝NfSkQdC*N<={vՄeonQ^? LNmh6a grh+ˡ}ʡb{)#-;,d8Pp j E)#I8 b &h31.AY 0B<0:o;cƌLy-#chnDdƇ2g#]ƣ:3OK"A/U]5Yqu5Vӷւ=ߢYnUw8\؍x>6lw=}:j>:cymZuN ńc!u ez}jC6ʩլ;%,'ܺ}Aƛ;d]٢煖a4ot}"yȶo=O醎 WM1Eo?K^s{ƚOnۻ67G6iMF3&aV}huls&xQT̉Xo"Ԑ1K8+ EFdOrrt$N:jEh\/.~TWzWك/@z.)mQ:XFR̂1",DDꥦ[FL h#2&"{mz/TLN?e0Fן5W+[Eן{ig+{oM_[1_VfAurQ_!7idY@oNm-V~EnuohAv3q2trrYN2WHf'`N0,NB[ 3 1^1Wm'mzԖ6f i͡LO=LBn:-wLZni’[/* |y+Z홦# 8>7*o94o32w;{IN#Zh-BVgA}$(jdǾ2qţy'/TZج9IvE_73Tk>eN>0cizsJ8.ğ BI bdQ# N3(4ĩe^ S'M4L?gboP[Xs;&Z0&at@ߏ1DA뫩=wΊ*+~}ݘEU[u=\ZRbɬS֖\"#Z3 E^EC-ny[VE.WI^ǁ: \hM"gQGǨ"+Y,81܃+7pUB'#% $! bB <* s~]`FgXXr rYDŨ 崡pO^z>>JxU+oQT!b;/4 rb!F3.&&ΣR1gՌB})n`P>RL@qNq飶ěF/pvJ*fFf͸Df]u!ͬ i ~?fva|<-Y-WwW/~Ӡ4KWqQk-D" X{NHȤR) e<`o]$RN? Jڤ@$R:"*Ffm`\Ї:y+z`} :XqyJRP:Y1 !f`E;$$GQ>rAYf#g>吤E#f]5"ˬY;&la6rcHSũֆsQ(˪ 6SBb%#1 C \Ir,i$53Gc5b6r(zU[ЋV1:qɮzQd֋Ӌ^\F)qV#C9$c69"R҈HI-J2 &ӋЋǢqDZC>܃ XE:T`]Ԋ{zzVE@{ ~yaG?2hX#~,z̝vX-"ro0ӕ3a=ko[9(`O"Qԫ@?t;ݝf0-=o;k;m:~Ns>Ib>DQ$ER"5x+F0eo4KTN JTP}VTMՓI8r=˃6ڨ-Zcם 0Ƕ>U:\RBĀxxk#F-9U +%Vp2Y@'6k,ZJXy7h/8lz0}P-Ik.w9 })4^Oޠru fu:˿&$r~{yOKqX`]Qaz5=}oK|sZ& uO6 r+=TϤRZLsԶJEc4襙:5n[wKd2b(mu?62',W1WT,YΈIƕN:U#avArI֤ dNj SS!:˓6bL9sYDqU-rds@%PtVR{zF ٘xN,`5 Xnu1RrB Ke)}Ig3r1+DҚr Zi&r$Zc\Ւޡ]('+RbDa{J_9D26>%9]V>*3h{T\M.}PEJ/ƃ RՅ3BM[GG«_k8-Xѕ7YVJ _WS"(ix8xY_Y˰C2솵E|6tIEd:{#CZ0Ei,4b.C2 xBa& S\.T`-[˭f!I̊oV&CcuzimwahCMڬU}c\[*2JVd)"y`9|B $jEqt*d'"m)V98X.7Yꋳq*L28llפ@*vԳ>y]mSLg튃#aZ-K}cقc5lk3(0hLcAW=+9 W>Eq-*wˢRE!,}[k6oFFCR}FHKZa0ϼu2J YTιb>t0zժȴAM&$O r);{edet6%ztJ V[g$CwF#K-ͩP NnrV7) Q6$|IfIzlH\6yGk%^X#3٤D+sVg&|]b +ք:8]E)xiΥ€9Ƥ22-q,(6hK^;T)h>+?kgEu!+lS`+Q䘴L"ʜ9Uk[U+{;`'͙w ,8L$oDfM 11g $5 䕶Nъ$o&6J ńR׀٦R刖u^3bv gP@ xKDL 6ݢG`&z\BLHdS惖̷ǽ2V<)F$300( Bk'1`|  kي{1Wvok)wCD'lSnRͲ+)+)"Q.$.$XIxC掗D_8-xtxϑJB/:}QT[MA^_2ZPtֆl,`IZpMh3IzBQݟd'{(?U~K;0u%YRB$FUI.ҮBIQfc1Ʋ) ]ÁڟjkZkc }٬ pȚcI+M9 6xo#=6}KOrMx>ϕ1)2 L>5WڊG4;/uʍ0$|>0§Ҩ0…u9kI{璘3hO0عeLר#n()qj%ȍFFcGgٻQ,9$zcɻĬRMbBׄn8'R\\}c#JN?}=uǟuՂd%٧i$z, \DKZѐ!eԆߗW'7TM9?v4mݙ4/nFX~<~r|6ݢrv0j̹s{z67]pYxߝ5dGByHM@0a,bĘZF N>g݌='GN9*գ&4j\5[@Dզ%Pʝ?G3)y sSw+/Wo( P/Y2)c'~! ?Q'LK"Jc2ng$'?~Ǔ?p0'Ïoi/38|5m[m M͆E"|qUSno5oPc-B )M~4nܹr5l*s[g+T5ȹȡ" g'3e `_#`컌H6řhϏ!}~Fg?%eu w> pƸd!SAMt`S.dLjW`=wd& w0Lhve+:̙h}Pq)iA QXH:zҢ;CM|#79U&]nYݡ[xʬy1IˎϞAkPϜaN r0hy,Rne6rmY7]zF16_q79 3rWm(~;WQ]1KW 2ܔ`&X;>:pKg{qAWg(n0SFe+=o֪`j Jg'^4h!lC(O*+fĕ,ވqU}l]{{qU%+~=r}WZnw]\(KWF$^Up%߿gSRrqBWь_ 1W9r!P)W)6_6e}ϾD\˼U=yA@ ]yUBY`OG樣IKxT1D)e"G#9QjaIr!MVaOH)gDLh ISΦ`lDjxdRxVш1[\Uo=~ c}?ƢQI0pP![Y*qfuPh50Z #dyZoT1zZZbʻL -ka %G::f3&{jn ,ܧn+?(?|-d-i>`W$5DQy8OF⊌"}/:-ٹF͕%LfGyJ"v9H<cUsCX"kr<| 28" *k)d%$%$2 TX1:I e0ܦ=rcp[8dՙ 1] WRUY3Vw"w&0wNrV7D}7'og/qA5V^,k(,u=Թ7P<_z b\٘Βկ JF)AՅ+{.~2#B.1wv_`uEդ =_Dk%6eE-)*~M;*kr5= ͞6q3uh~%L~zOѥI=]N)-p9u? .>h+kIs6Yux-۬@i3*/3d ܆g xBaF1er"T`8HD3ef6uZ6d[6$e-6abMվٻ6$U@FÀq:sI.Xఆ]]mH/)qݯz%k$lJ5,Q3隞zʌGk)|z [JVrs,yfDI )F&<@Ԑb{. 
8UyftvVt֊]qw[ pC?tW7k~ٜMy<§ bީ ֌u!:׶k1sKoo7<}AQ~a[)pR!6򰛻EԆV yAVpfd9:f@B0VW7lCjV7jv5m&okY[t/^WրkB"yuQ6 fjK9haǿc{ZIq̸FaM"4OuMrh;?Y}x} L3Zs.Xd̈( SAFOA%G r÷*ttHCmbgnMo,5u1 q(1*0)}4$(KU@:%z%0K1&Cr/{" 5.9*iS;DK ؃+Ԣ;RֲK-.ءѩx ui>rɢU(C CK I0נ%3ؖ9Y/7`d#|Jp@IHZ::EkʂZ7B!-<^;±TCUAKT v^p DQ\ʂ2G@`SUh)ӁKYrAm>0U /y:\II֌ȹ[3*ta5VʺPp;{V{2/ 2Xܰp Gɟ_Єhq4)ϳz 2DIH>Ei yEŲҕ\$QHQԦ4T$zfY(ʛ XWYcW#n'b֮Ї^M>8&-yل@ouFuyE1FjDUY#^#V `1u%c%P2 p UUgXE1jUNeJg"AV OZ(+ DUY#V#gSWI/Ζ)YKՋ^Խ^mDς,JSNX+3c ^GV|'nm6& 1j}CчոPNAUl(T%V;Jђ}|Mڸ{w'Nx;#Ztp|'J@p#kzyRcRAi̙Z0LF:)~RI'η"u~~֞*0\?}{TjN}[o+xR 3>ƪM*A>L ;&]^ 5YU}[xٯZ Vr;;--rMdz|uVl^R8БL/G򗊴f~~ؠmQ2'Yɢ%8Еxz.ѧ+r%ks([uzgl;[8G'Ga<9RHr}I++B ` 6 *ES1-gR!29R'(zƼ Pym*`Rj'.V#gMCɎ0:lKUp.C%h,<ɂq˛oTFm9Vx4QfS2Cj:8e+5AH"`%Kp/gǒ\yu|-PC1rJ#v$'*t\dvZp>OZ a|" 䥹D{%̥dsL1V}Hkt*{QEqTקjQYB6x4HT*(Gx! ZRD`, U➸xAU!J,)A/fŰBuoKU5Z.HiDBY&$&lz2ѩ$EꟕP7h R֕I'CK@YOD׹h=Ux>D"6` v rM.|p aNFG80 0Geu Z,IR(΃Rj4 zLs"u!aRY<rI/ ENcX1-eP݉tJ|)g,!sIh5^ .f5̘)I$k̠ͥv\W".Ap\)D= ӝ_{ABA&Y^} 7~tg#L kUĥ|ȐKpaʞyǶ#f"i"+ay^b0}rYicO/yn^):y7kdJGu9%8g\2qFJ.(fxGfώpŹogEG1K!mY7[\MNN Ϲy'8acF*,̾XRC\s lt Mzz)li)<=R-(fpW旗0#ن~y՞tq:8m",>|n۵@0^M/c]#yaD0 :Yg$ ǓVG Y,|2}ZNrhT/;GQ5jZ62R>LbF+JDd_ݴCXiRtpP/K]2/ԁ%Hi_MgFIm^;cOIHۏOݫߞ^|.˿b7|qv}\TYjU#  _V6_]l#]E_VJP:f+p'] c* ϮQM@6?1_:0;q2yWr~NC 35t3u0LQkцNSwё&R_sPpio|oSt( =dF@8u~$EGo$z{ݪwxq_ 7<,±+ D8LeV /BkfJ[MiRY͘[A2's]ppկi*o?beg~#Eҡl2f3<֦/*jW:DAeέ52uH/#e*t. &GDOU9ZvI=A{D#U1B灘AE|$Kp;Ȩy%d8|:_jgM0Ae~}:v!ħz4덿f*1h.C<[b{jtku%BOH0d5S+dHUeprbL2˧d9.MEoY ImM>8hBZ Fcu>5 YUͪwnq`lY*\{uGqKL(o+"K_ᤂh!/%@6&9P`L *y!:]KD/6"m~t|Xr*-j7e%e؎H뙐Q><^E&y?9]6 [ q*'Gޞl\F JC UJF/%@i?']+B-P,&. Vc<kGL 9PuŠx=4Fܳ6ڣgVk#B4ty{«e\ G.?_m'iy3ALJ>0XVmB=\a(ZFC}C2 # zH)< <ГO1k}D)+Oɨ&TdT6c9p_G<8Q'kLIZ䃫&UO[.޲$/v<$*3A㈎  dŽϦr!U9N|28BբhoīfUB}5w5~j:^tTjlQdZT,іEzrwL^ ,PFbV'sIM*94"k@U=L( sǚ[]>ɞ1Lw=!4u!b 5JlbYC)1hwNVyW#Ch=Z|AM#p3c#!0V..ْϽ#^6T>R78oç-$ܫIՓoê-q^yz媏T-/Sgj| ^ywQJ0rUYcNg A *3&Sy4*]=TK)1y_ GR!A\|kGd8ޞ8=c?vӌB>/HԞmy֐E:hMYǛ'CNg煉 ! r:NI5"P!mHiY+T}_,k(ЂX@@sƉ?$B5J*`` f*]K;NʥsohNziG"a0J.=)&赺EFk^17"b!+ V(Fj(:&5UvȍeVT-*KP{&n '|R9};{D=omd!ELTQ֩6ZKU$a\hxzRPbcht)UCv".9K$ H&dw$~q1\>:i}GxV٥i%JI2T4Qu&)Yc}ݴc_?|!^%ȝ ;(.у1/׫ l_Wׅդvy!^_!W^Bn' yr]P0H* 1a 0<ʳX=g?md)bUxt37Y$-ۢѝ VdK%VCi 1}%c9y%rYpyYhgN#k.}5r"r@;Sj:ZzsGj4\PZƁ@  x& ::63.Q1vM;ʮOff0ަr)l.&VmHCʵvF^q9Q4S4@OK G}??#f"HEg+#J/!TumN%j=jt8ic&$x%JE<~.WU ۗB5j5LԚ(H88/Nf/W8hm efʤ DlUQ'2ZSetnjv6g /zoj}:\]8Hbdb`[Ah6. Z\EQT<6`lHi}t5&Amaq'|dX}dXfҭ%)+ O c3ϸ.ja<}N(*8ˀix@0Ĭ9R$ge,@&ߣ\.."׵/f˪q[0:KЗJ`tdqJ$K>dLQNXv= ZV7@7qxbjvCvzkrx&g%os*U  %!`H>y wo}$C>4[t]~0٢q_s]7XȀȚ<f5F>A&ͳx"@v >W nEl:ީ$jYܨMv:. mR 5WY$TKӝjcZ 0* jPI!f5 X!.;EGCLCO;k4omϦ|0Kn2.^9 *"A:Ly~hkkC^ޭ'œ^R̿5cXxr0OS.+S++xM?\ &[k( QT:$g6U\,8M?Z, i.z8hV!阕Y,eEoqI~:xri,Z7,&ă~Խ?ƕ#%o]o޼2Gmu;E|Oɐh^nGgV5FۏPj9H?\yOUW~X]lyt.vaINքs{)Q$v׳媭wGd3S>6R>ڛG:6 lxubG Yl+da=ыcN'˭'uTP~mn}VRtNFVϳyn>[-#Zc7T\+?1?'OEn<֛CxK%-ӹ%I9(_99͏io&|qk -DS''Hwo?jowo[Y 'eɹu*>@OC^Cx롍; S_L]u=y˸>GqqY/~@`[^w 8;HW`fj~hu ;k0tn}}K&3 -Y76S?lM"j:bʖ3d*.+HnlSd9pcJE'[Ơ#kODŽօ({^rt!QMU[LqHLBA1!e2Qyңz܄ Wt WVԋЭ 8[=O³s<ϟ'-x<)s 0r<3LቬxaYmqXOCh(duNi0Qedqރ0ڱAסq?򞽆iXZg+u5 AS)_Lɘ&Mi{ =޳%dowӽ#^&})#m3@*C^KoLrHB,ٻ6rdWyٙb`f; ` Ɩ<~e[[qHiȮ*~E֥f1q7Uz6#-WWzk l׃;+~v690еH념h4ɐֲ6pشqg.iH(˦kk1 w_h{W+ gG%SP r(ѧh,ZIPD\IKLtwo,}{W uh㔶Ng^\nK+:6SZ:K*M—\҃V!`}^? 
R4{ڴ)Kd6)`r\8MKv[=|b , hꃠօÔtbrP [(JA{15A/bD2+ᘵ,QkRs)!w6F tdSg~@OwqBvǪO+,MIo|@f 竽V|T"{ tn"czLH1`pLA/{ } {i\6G_m@M ˪bo}BRK<:dz|,}J7o4׌sf$̓$^]NŹܙT{ؓ${ܓdEJٸJiòAmPBRMrɒ-H6 R`UN4BR2YpRȅyo9b@H-ZȑJJ9;ts0"웸|t9n,7A%㷛\/o?nQxxkܭ;t 5PzzΛB샱mη»w=lܘJ.rJVҦ1ͅ+@h{ץ׏fzDdߐ |AjοnfwO9>õf!le3ݻݶzip z^i&O'A.O/s0,Ɋ6=oOjtv|N8*ˌt@gM :rѡy%cDnz${+P D eQ$d<:,`B,FOvΫWׅ&"@sOStzZA- LQh<yRA:aQVD!a&0 R3L$cB+ hc橣yƓأ<ߚq3]!b%Rp&& xP{dxUPqM4 W}pݺrt᧹GOLi:UoUc׮.>".켮:;RAGL$**L$j!&PaD-TjGHTA8U!XN|!CQWc ^6kTWhl<uU<uUGW>ǫTW1U!"r?uUEuuUTzJ3a%? u2 ϺRWZúJX^2hy ݿ_% $Zy#cWB"(K3Rjj<\8}nJ4< '% \ M75&Ԥʻ`B{yW:T:QgA:!KA@f " 1GkIi(M\ F>SS#dv(A+E-Q"Bh+p'!lO9 @YƉ7X kIZ[&h, 2MJDI0gEe9FΎr՗.ٞq-Z2=,픂1欁si0C1L9A8CJzAN4%* Y^joz=?Y">B%1zJzLӓ UD3GzU j˳Nv;h#+D*OܟM`h2L:c '!\:XkF{QEqt*G5+{<= ?wLTeAssoLhA]BF%]NKL ~x"B]MRSHыvEȾ·ǽ2V|ҀԚ89#ʼ2@ZMr"E}PÖ|ޥ1xG V6߯:Ib>+TLhxB{IE_BdD*g[A|M8f Y>QZKjptE,i I"#AH IFpN21Ib .Nxѧ,L2iBFUI.ҮީƠceS z^Bϵ5r }٬ Z8d̓2'62ؘ"^ ^UAj[W5 厗5 YgK#E&k >Wó8;j ɚ+aD$]Ɵ۳$&|GޅQ 7 f:w\OڣL[~<3<1Y !c3 GD+!Gn:MhzIJ(fliKrX.1ԁi֓[L%d>ӣҨ́Ĺz\_ǎ8aX8U$Uqm_޼9<8^QK.Qq];ZѐM؝el6I_^qJ[H|R-~?5yyq?gO~ituy:Upm$1lmYە"boWENޞrqGdlHvH(׏ti0f0˟vUfyGh(Ze'ή =sx>89Պ>bF8~;$H )Kb't}5%x%`]ꗝ0V߮i2RLzq^ᕛ]B°lEGq25Og' ̻X] ULUl /NIH_}_~~>p0wD{\@F̣IoomiC929 RتCb? \r6m4u@t8Ռs +z(Å}5b?rl GfA[aaMOK.?uu w> pƸd!SAM0E0)lU:/ڥ8CIƦL;cN@!^ӵ* (1xS^ɳθ6Mmߓ}UjΘ?Ͷ-lG [>uw?PQ=6`Nxz9v9jREЀpDy:A0MI㎔Ny8yvyS.U[5ؚ** ΒqQ꬏/h]CN"+%* }SVcXJdYq+(W:L] !U;/M՝OY5g)+#x$\ 3K4N$5!TDR{jyZ/c(Q+uZMWe&e=1LtJaL@LR+S:mIppBgoW/oΎ)̻n|s\o)?#Gc&G캽j[k7R;fhѠ\n|U|+v AZ0EE  &Ԟ*K4z AF$hބmYjxA@Hf98 dSQɔM4MUsͩoۗ5,e1FUL"Ϻwkޡ05W_=TvmFHhH QT09R?R|7uVgZwv=#;L`bASJ)jt,+`dE&P\i\ہ [pKB̅ hWF t` Y$Q3ѭ9z&&~>ߔ>b>X}:[>ݴ\ZOnTlJj;|{X}u۝eWt(>j3XL@*&I{&PQ5gPa@ Bi< ۂ&,R(!YU&PCj&T,VjOer293]ymHNɧsu[ vp-jMλRUd&/6*(#}Nѓ4ޙ1kNfy3 GCJJ-_6=DACKkJAGPEU f6"VJA 4Ni%~#ǻ_7xʲcnej3)]|OxWm4]gzaoZŖ5{(AרgޖMUėzx[BVo,>8Qjc`C[ySKi#it&|WJF[`AчOK/nIm:͍H'$qGjMn.fV,x&-cW[AD?Teqؿb,WU<{%kt5KEgCLrb#W2/tdڠ ([TdFL9xb8=sM5;9:Ps7ٟ/Nr%c/RЪ*A=pSR5KTj*+V( rv.f`WbZf_^~[U7 xM@n%7$(w[gk  'R>׎\F>8V^UfsATu$l7g+`EAECoO_6DHM; kfGJ8OTVHD-(=GoO}[.Z ) ޔj(˶T5G#kX`I% 5 K߀b;qs}*1t4|̏_VRo| hnv\UZPbͳDn1w"cji푅3)/veCq*+ v`E5+c=a\b "2&YA#~Sv*4YiRPVV l9`9b k*⛮bҤud v>]2 ٟW}֣k{3B*T L؄cxuVBeI^d /O[\1G-E!+=Yi8ON*H6.P.!z5gB-W9gZZ&OEaȱ_]itER$)sUZR޻6E&/r* :l>9cj49i"" \cl? \ =ܸTrȗTP2HFnُJ7,63^3 ~-!}1-1+7T~q_4ޏnPUDeʒK^sl0a{c } -X*أQ`81ʶ] T|蝦Ws#v5\PP{`$\9Ex{KF՘,lř.1H<=yJ1Xl!33TVt23"E82F sY&6'ꌇݚF*d bRDΈ"sƧtsULs#A,kZ_ٌMv[4]m芈BK-2ك g"Sqs)!%5xC^A1AgD֜^ȸ8]]>'_gYR\tq8NRUEdڡ#@l={Sd%6f;)KY3m)F\<. v炇exx{]dE?t󭄷)s$#{w#+Y Mj/(7 t.> Gw CxGtb$-s[r*;Ws.d*z* %Е4fQՙ+3O5L]srr>]_|`96NOU+8S:~>P/s@6OGMOemOK_Fy16V.k\\\\<4?9 gz- bl94ohm:U ZӺ~[7V%@n 5uE?̯5b=m~l|(A` s)@$&V:KF)j![B蒌zR/׿M/پCf AC-|l!@5&hg﬒`2&+dӖ$VjXJdYMt+(WgIEm=yijL ݚl) !z.21g;eQ.7o9 ъQGrUu؁j*tA@!#$OmHР~ZXT'(ŻgڶqdsVٲk?~l97 VHcD4=%OdR",}4@g)ճo`KQ?wbEJN[ٚ.Df~+ë?Ce e1c )@240$8BuYeoE$'ԣ&Jnmi%  -Jv$'n1Pmh6g?sWJ{-z>4K{i;Ǚ ;6),XL J}? U|6'Ui4Y;Gw=@]: D$L$kI1Gr;:`(n497vX,P >BU_g> !欁 ->18 TMXp5DS&F fgrdaZ|%qU!Wl` t>h6rhRٍ2gwʧ%WZ*U?O^{ H8ȟ|UӇ#c1_Ӓ*blz`+ydmESGBJokqDX"cLjm]ICTNG+)`m]%5 mfPSٍ0,OB? 3xC/)sQVU_?=Y\q6m9cFS#ElNh4\#:hV.5Oi,+ғ=xf]MtbܮtrlE)F)enz'<3j UakȦ$Q1K2Y/;)D0m p 4CVfhAkTG(qmI `<fn<\&ǂ㩈fDqǗ҅(-9[ S9Slծcl+ P- EĜ1Asc gD"qz$梗TfZ( Fa1[͊cubaRTِ 9Dߌ)́Mϝ,b:MXqx,x8;6p٫x lk_х+go^n.26[ V"cpﳱ0Y{|~:Yh qyN#qa4!O>fWܞ}4gٿ؞՛,*V1j!*%X|H-\mC؊jC&ɱY֟Y)sPH@1gFFGԃk|_S;\.'zq=ws')-@dqlSgP }3I5-Ws3E̩?5q̨F4fn=Cc"ՙىKD3O3GLg[ߒc/D1@g=bV2Ʒ"_>_IMsqg~姅))IILE6ʙ~C! 
{ttA?[`5NIWV &'1uE9R[G~Q1c'Wŷ¸վSYlWVY]]~^̀G< (A}XSwǢii]]q#3F!-2vگqb:8Ɯ*%ɡPX,T\jBAՊ} f9#Qm oj}gZ){7sfBegz~5WsF"T2L56eT78Jʨo!KFD[634PMxRg@O͜|ydP48S %p)au@0KL)g$KfIN PΆ"z CM1h G=_Nl| =Ѥ`;KWe˿#8H}.Yvx^/Y:'貧`Rh.hCɛz1'*TEsi%"x2al+?QODɀ!VZl) %ȣ.fcՑ&¸h\wIzw0%>xCag˸,<]}3>!jk }r\ZH^D^|[5s ooW@NGO aR -ut2!BX!fA$2S_ZtΓ<&x[b^ῦDsdɩwaZqecL`WɄ CF'Ia~Dw3}i?5-Vh<`j#j[WK6u&Rt1="]]g}4\[>Y4Ya}O)Khxud'uY;2vTNySW~:,ԥKiu*U7K7,D?Y7$&/lSv58)($p&)1܀2̾6mX %"+Wgˣw+}߇YfRvzߵFӦeIM0x8!T) mr[V3ZZ4W X D) Ān\@$b3+S.?t 7=йbjuhm??R{?}ۑO>hig*ph4̜8iAK?LSsғ1D_OLj]>u_6  `r2,!haQ#B4iKh ¸ǥCum9kd~yq~z:`ݫ4Z kEa46+DN,]?Bh)cWqoinvqşge::GW%Y='b2źpqTRV258K\RwN')aڧ a T> ?o*.5%0W'[~WD>Kbir!$j`L+z f_ǒ+SKy cy]koG+Lqe<  XC5EjHY>OIl[$vtuթ{o|Q&kSը daPN9:UsLM.ϏTkZhgVQ \\(Ofb7I8<=ɳ(/3|J28z7׵K]ؚHf#n׬޾]摜}T1N G~yMk__q8e'KQ9>n=lf?ksߧW~/zy6êp .msnӠz6[ncb(©6k&3;\ !M=@鼩Ѝ"LnV''\!G5GQ;h8dgpث፽2C6 $G >?>7uz"~Lho4 ](RZroCM<\2ͣC/̿1ϧ5]3*F7<ę#=߿8E~t?~zϿN('q/q:j O>kSko5װEK>lW|~ۯjj (C&4م?Ӹro#ݖ&I{p/oiJwe,NUz@2Սf2<΅w>?'Mc%p gP @+þo4֏/RzJ]dYESB >j-= )DZo9c6E;:[[GLxaBsGW&JTrˤķ輌 nQDqP,pP|-;*ruP W(Uw:{Y;bN:-))jՖ6}2Jpv!oO#xT"r|1HFMPz"P%Z3T +\kcMLmpf YAo]9i>;LP##ur)\ІHo$с>i^*KǞ>(W4ݷY\Z 񷯃P.r vԻ>%ޗkdk׻mwW?}7~ Vƾ>l٦wMZ;azPB\Lq45L6[i>s7 c2Ĺ8UL?gHNYҲHR gPy`NŮ~ 'V{9F J=>NR254rgk:%* *7F+B'g=Qjq/@ziyO'UOf^ݺ~(lV-vN{QO\UE~i?ղ-~D ?5EtQl ]e2Z2J;zt[DW++ m ]e*䴣WHW;jJgi2\*U f+AGW Hd2oth ]eR:]eS_]-_==" gW8%'EWtE;SA- XUY[*wP Qj:zt[DW7,EWP[*e*++.r@{-EW.2Za2J)pek ]!`CWn{**ԼWHWB- شtQNzt%?&xo ]eJdteNW%5]BR41-^`TguJY\tŽ'yZ𛜲2eN>Vg9?&uPR9q*o*Td!.Ͽ=qL7OxXsY +ǟ=`woy7$X'[ځsRIU֑Qςe!pl FD~>ew2-!Zt39bdԔ>ToGn7n~۔CQ?q{i'S֋'bZC~> ԥ'-W/OTgǵ(%WObƹJFA+PUV"8iRC.%aqRxG,ٔ*xM#KrB& `+ EɩING"pg=Ue?ruPHY"!(O!g cpbp,0惩w/BM&ǰ_ЎoP`ٜ%OEdַz .Mn&`eR ζ7=t+i 7BX羯P7tn! ߧemCwWxwjzq8]@)M$irOs [j 1Nd6MA0\ǃ8 uPͧAigj0gvS0jBk̘0SliY1͓`Oqͷ;2t$6{ӰyUrʛB̉ڗnv!$]5 L~{#1O6H_2ClsL#禨P}/'{ƔLP93@ ǥIbܿ tٍ{4{5 Vu[k.{z3t̢97[͛x)ksXz3j~gՌ1ƖǛ7ebwQI.I^@`"PC=puRrIg(bxy3ΐ r_<2yυB"FEDI7 )䴰6&q' "I^^>VxWZlnܲ?u/PQ:>1LY๪y (J90.3e9Fak_OrˆcQ4hy8"S[4O.xۨObit}O~=ĎJÅ>ywTdGI"!9. X3(x2\ S'Y9 ]3Z-/o>֒ք}C|;sJtс!䌓0V(smF;PpT *qBbnY3%]TcfOSS63|;ͫ&ύZ)Vk6ȷ~<& 6Lk;i&^1( .cy]i\eh$`ȳbJ#ˏ(>R%AQLZWa21%)ZF3 Q|]f>3NDY^:g3:N i .TH&yZQN\Ne-nR0\2IpZqj)q--(Jt$Hg-GpHk T:RD j]J1q|z!'m?+^\ǗhUr<:(%n֡b5zTA5 $TUR%*eFU !9S&C! 8 8l :%*I8`Tif,af,'b3cW.䅹w\H,7=,#2y7Tv0] Me}& ;1FPǕb(48XVeg#>feFsL\"C<v9mc"^Œ] ÌFðb kkwl+pґuT:jIZS_ ir^Zh" jzԐQ2ZQ!ׄHdi/QpD IQ|X |X+LˆǮ:Fq>g,YpIX+I kBDML ո 0<eD)!RQ@A+AKgDG=j9US#L`r0؝DmFn`{wALW7W?Z52Rh^%e|)(5b"XK"J-,e-mlA,!5T=P)8N> +@X!g&6*XZ5  4q0;c$:'-Hx:xo% L6/G5xYBh99§pyn+w1m g/di %aL* AZ9('xU*Rr$Y{;̋vءZzԜxPiq  (%_"~2[ Y \DQ LQ1Z"8DŽ#ӣ d=lֺPˏI3P4P#rҞ1!#uz@Vq48xbM,#4 ` x3dKU)ypp @ d8^hV -Yk mX8[Sq1YUX&·Gd8SE!8ǭ"1#xnVV@%MXtySH^ ۪ڄqȪlXoGى[l\ZR2MHhM6lrƓMN9)*?6B+ 2z{ (lP+p8ZfVpX\vFE2^9,&wFl8 ?כ{?Gl4 fgE$ΈS3t :'I4{'M-VR(l65sP)c$pT)sxpv(Cd(4.(C`9K6ً]L|h|0rk.$.G3"`|tv'𣶊+mrx=2uP0P{=9JxdɃLUԚ oP[PiQ"xc0{߇\wSB y#O~7ǟ޿F[7QhkIkZp5_дiilo4װDdiW6vٮz=,T*D#X Ņոrg?KHh?z;G/~i;O&fpTzziVӕ)=2~\0C?L@6xhͅWX/ߑIe;u%g3O DZJ{/Sxm&YNVSё&mBD-W&JTrˤYt^FK(8 ](q^hVOt=mYڪ|x+U7:nmՆ8aloRqxΐQb= _3ZMz^hoN9W᧋CKƳj=G%cmg| 8 oףVǥv̳z.G0P.*$HYnpѣV2 ̒]%^MrxBT}Q h+MnɣyCώ φ{3Z rtD]8T}2凓;Ug"LNn1>23c{ o>~#:'"+ٛnBolF_{Y}6WUҹ&2CWm,}lq#E7T"o+F\CX6r7͟/eʴX@3oɕf Y6|:̗1Ԝ8{ΙхҮw܌hU=B1wNUKJ-@ Ly&r)B*S: B,OdNKj1U-c.jKlsLRj"SXC#w&@()Q d>X+BY W}z9 Pzַi!3/o қ <[·PoCP*>;o p|gNž9hV!3ʒbGmq#IHNuGr-:nő'0)H" \:=hHFQyCj4) Aznl $ 2& O|:sN"Ds|sC棉g֟)'j޸l< \3Xė6R Jp̖:XMaǂl':ɶ~퀟I8&QN--9J FS% PXiC p 1*$Qɀ9=0bTmY`2Qf$r:_'D|9ͥm3V%k ޿!AdeEG$sEpO9Vdy@#cg%igRiUhښ"uI{ST! 
U³sߊQT6GzXST~EYnȳT1_䴓8*Ú(˝e)9N%*ؐJFPjgHo|Jj,eTjخM%_`ǒm60{5 wGÆA ݞ8m"nE!q$O'=q`ZĢs4VL{5Qf" \ FqHkƥQAIA `Q  L؛Ѥ67b܈ZhZtŨVmx9D<Qr}ȉWj.v/Z4n[!/rF BΔ2+02m{9L%Ƴ)zO6FVW"W Q6F%7ۥkժSO kuBմC mrUxj 1+:&U`-f(z/$877ak Ve|K:|U19?Ot'xx ϔ,oۣ*)3j:+o9q9v1s`ʦ$Ŧrn =aYEU˲M*EYLm5WSt_M~n .+Bȇ] DHc? ꑇKaN2kgeVEyɊBF2,QKW>;M M[)2no9:XM *"%GEow8ƭúsn'r ˛3.LR8ާ햺w nyʦ|f^Enn{ac[lU[|[8Gv+bxuǫ_WwՇ^z0Gt^ xebGJ>HXy*~yOS$!6?.Ǎ~=W^1s/o]믗g^f"n{krܦNLDv#sIH'>E|YWxD%ZAtAg EK(F TIl(^Z-2ee$zbM'8X'K{NV {uϓ~ I\t[w.}/|zzzgC'ު,b)I矓 EʔgSRiGDxD2F`x `0L>ym01:6 8x! YȈC$SetHZ|}OA)ji5_o3uux0iD;s(>o '*n;x|o{7y>?|eEFN6e6*mԌ2jvB1LL(0Oo?zBψN_=4B#|cx^ &`LHUI^L7x̞sBVh0(Fc{Fu;CC )uf7 \ WQī=~rnO};fѝ.cAw  Z8|P{51<]^dY&/i.?}_v=0on?G8u^+?A< ֋?Ǘszi y -="r]~E!$OZiV+^i٨VjfLu:Iī H/Mp3|[tHW _XUo V9MNIÛ4WAbi&fz:?nEы+$Hf&W-0no<6W1u?QK}sz9~Y (l9vT(DЖz-Ŏɡ`wFZD6[a:jAN A2vb6#+e; )m 1:uٟ93nΡh#ڍywEG!R%&",aRuֺw`*%EԺRSdDK%B1Mba5 *l))'GA526S82Uaa38 uc,#xc]KyKpఫŏE.O'GlNWDI 1K'\ HI͡ &bQE{6նʍ&{>)CPZUx!*aH1b7S8buH1YLjc.Hy "E }8{-=>e ъYI5Y@IЖ$58CPgQgbIYDTYو82F/|H$9}cޑ6/q%Q &(z!ػ& Hj DH+ 2ː.JɻUX%rB]ZЭ2Яs*j<:f>LJ RoC|aZsJfjČūF4/F¿/:cD*bgmij&yk;a*?o|J5F#{۲r2Fq&jA:A,)hQxmt-;Av[qJdpYH/@bp.IXLΰ)+lh]J@IAWn_KZ.\Q@<֎}>nUU<}R^y6InctFytnyG)$:K:Ѕ%"u䩔 j㻗;RY7~p^[7ru.cVLfS{BⓂ8]rjr鬠~7@z90qMQ򔔌B6A"Z"cBoY8 N:QepRR0#Cbf h[d =M]2)*:)_>_̾8qs.yI y\]nAuS4I*rڛ:`HhE6o@"smS#syn >Rs*OS(B2hQ'k}o2[*X qckRZ 9K ʂd>f"'4-6gC?$|5Sg~99j9`kPRRU!25?rv !hp >:bң(EZW@='Q (H"(`s8c F)T 'vxr j'yҝ7)sf#sY̦D% }5T{a'e&qVSIMGhp'Y!x6("I*%`MJ 2HQTOK,6&e,qnx"º}2>ơf&rΧV7$'+)ԙBeS":R)MTSIg38q7a3 9;tcnO_r7^HZ.HMިiBѼd]TrF(¡ Bg,GY=yB͡E[fyj=݉O_we+]CgfSs6#?^ޏQc0ْS~I&W6łb hXqe@Cm5ITJ$`.X`]Ak1ee9 Ԛ9:Շ}?XLj~IF!kQ09"W,;}֓&*eR:dqH߰uE_+{=5{x_d@dt@VQ_Km%x(ցB{Df=< ꏽA|c*|6κ嬃*[RERL;N0Ef|]cR:k !갆Xhobqښ#TOO(: WЊQTd Qhs޵4r,eė$G:"fs(nOY3vWݏ2J̤ݟuU٥P1g칵ji9)1qԦYxr3Ÿλtv:8[ Lm`\ 8ֲJE0ԊI'Xw}P%LՔΚȐ2\,G0}(F &)|T /.hUMt7y#6Uk"-Ӊ0#0hҧK_1K]٨>r꣠`-4NYJ3(]J~9|KD!9M3?%&guA'Viw4UHn⠨ogn0>Io?y6>:wG}LTѻX ~Ud~}?[uu[ޜ4Ul Ҍס+lroU _ڝJ(M~i7n8{ iMoJtf0]nbQCIUMp8w7-~طYY%gr`d5#(wRS4mGFsR!rKLA8 !XI劶wFK{Gp Q6<^9#F z: -I0:IIG? 
uۄjej[_Hc>FS [JF0 +0+I:!xIqkpO=>%fgN{J Wޙ<(pC8M.0EyfP~%T$v/(_޿yv$."W1a L %* j5ϪEpEU9;{uP m0&{f帡gO-Xpof+0vW~/UoWKwV|ov痪&œwd3=e|@v$8897̳j/goQ  Fcv ؂di;?129)`Er6g%O1d/?U"|fov<C_6d >"s  CelmrݛFHS}{}.*9ds7O> l7akIaաb@]뎺'et)KigH` r4 s8Uԗfr-c1QQʉ78Ŝ9s˭S5@beBTۏV 8PA\(w2 T3`NLj9+ 4g=g5/q5go^yɅRw/yZLfbX;JPDPe]3a ӆ$ iraEn9MɎ4g*ɉ7$'ڸxJr(i]ixql1a*,QpiLhǀqZb qkjPJqTQܦ2:2,"1r8[K։#RHD#i9(5v  /iM*Ή,E9QӆKQ yl=%0h.b!u̸SXbz;ބQg2-Oo_s͙%&W!aOfLnvxcVIh09ּF/m[4` q4F^8ء .S{#hE:vςIʣUh4BkA0( F刊` ;P(0Cm'mgJh4Mv~}#zl)*,_ɲp 0N4𚴑&Z(4sm| #ΨbS⭭Pi]]4pnJð낃o@Ip8ٲm^ i)φxFX!%Lr6,i~{M,Bͭ$G~[eXOXQzO-k9-4EX %eREK1bk5)ug MoR؈ZhZ^&𼄑9J^怋y>sl@i݋QoC\\u$apQ]XиӐ*χU~j?>Q,c3vvw/haI~RJ,y,,L'WV9.1jreۅZoFZ@9!g8sPH)9v=I;3&Xz1>1gIhd(ƅ*egN {oQĎd:V&O/a@:F"މbz,Ù^)(^|vo(#+:mqkrTa5ɄX/A ׯ?'4BQ9|1 420fnѤCv?ll*E2Hv{&۝w_ s&L{njh$uk),!aҿR3[gKm=Տgqp>=]8V0S[,93"'l5G rxjpg5Q͔_M鬉 i #QrчbT(`bZGr.l0mlPմ*AwS7jc[%&2 9s>?&.}tup6(G+?p S}A J7˗qʢS~Mnt A\_@FY4(37*횸" M Ƨ0I7ަG?p?zwk7/R&FԓYI PnmzsT5H3fg_h tIW1|%jw+C<4ߦ ކUwװ^M̖T>1c(Is n ߴ1md~VəY`H)ʝFgĔi1~F] xpa0G.(%Z/s0S '"DTh}gԽwd jw<$t^ m0XGnB!g2ڈ!B@/ڔ sjHR$PSm(<'twPlBaz>~m\imL,9FКn4.נݯ${TnEmxBxc~NٕV`.eVj gsc̱|:c qXKy$"kY"k5awi&L-mFWru_ñ%ɹK?,9wRAWTIɟ R(G0Sf^})O[Ƣ+*?I 78Ŝ9s˭ʼnuX_Vuk ӤmY&&VQR`*ƌƁZ0cĜE NVFvjq_Jj4ռ&-< N.߽tt?_)Ho&(oz$Z/ݪCҾ0yH%hNS cd^c1ԈJ8:b:LNGW:oPt]ixql1a*,QpiLhǀqZb qkUi: h|IC!S@\HS3򢫽F{jaX6 +ALjߛPe{ mT4/MȲ%&W!aOfLnv[/{w|nΧ35-K֪LhObkmO!=5pa^Ȓ#3,?VdK\e?V]$d?k8v{eC:%_U%U[N'ǹv9i@ he6rC& PHrk\| x2*k/V޶7{k|֣W3yR*^tr!5(ZRZ-ء oLi8 m1h_|%)NlI[OJӇR|l=O3:(dJsxu4cF ϷAirrsR{@4w8tݟ '8(`y kDwe/%h$<xSk{?ޡZ0ez %KVA*;Ec~L?_^_w"o'w8]jsn",US-wm:n]qol,m\3XtG,64xh@f49{&ɡG[>86e%к'-oe.:orIe&y**hDI8ʇ @-ƛPpTx Qc)~^$er>YsH <6 #G9*: #/9M>XxNjA->QyzDP| sHF8~YYyI%Jl&#hB"6[Mrm2^rv*Gc<Ejl\r`ZF!匎R,G%DU9CQ!v#'nx $"+ fQ=de9FΎrv|8>q!. 9r^ZC1 L-"%'4xIN7(f9Kߋ@MZwÀ᠈SBLӓ u3:zSfO[@Z G?ZH9 Rec4G+P(5$cs +*+[o,[;j]2X0& X̘{LXZtňVWKC==" ,|N;xqkAΊ=ȣDTL& i *Ȟ|"$ Žq6ƚ0v,QCYY:[ ^MpxꗐIV&!M;y !QR }3jΕyY⡳4kNF^q5RFtqs#9`WHz iGvEJ[v=@rp8yaˇmoi=EYZ痳a\ҤH}aeZI&G/z&_BJ}HЎ+ '"GBjgъ3g<~8 BkO|Z*/Aoęas=W3 I3׫jT;s>ZvI[Ӈ6}< tskxu.Otq滿~Ӌ?f3,[]5щ_N|Wga7,~9mz*fUlb!6@J)ijӬY5CƀÇlhmi/]-ZyOVe!ZoNVz[n~ ϖF]λ2[=%^I('tN+oޭ_ޕԼzl`ǗP G(8f@ Npw2O~Ϡ~(, }r2%/L,9 e%"ȭd$I@=!4CLGZ%Y Bq.?؜+Ⱥٻޡ/%rEOșvd%b""36@C;"XȜǨ1#pkhV+7TyS* @:$!2<ȴ*d'ńE f:1)˜+'/HH8 D1㹂q %U#gb[5wt}%>P7>+ӭWtla %#l_g3`rwio__q~?ȓ| Js6>="Mw4~侰wN:-h:HǚdtmFFw^bUB 7{R:TA2Ǟ,[Ebu'QfTfw>(-Fm[Ux5)-Dpf P!a5~<+|Յ. {ytTn^]9O|m3NdEZ3by+%JN$D5px]B(1%5еj}N]ۃUq|ʿ VxKT_x)|/vs==W}]auzsf魼~_=Mn"sk vr}k$Tr4c!櫝S4ܟ>n(=0;%568\gtUh򁲁ypiihe1l _<0Iat^0zk|vne2w vlb%:oZD_ Sؾo}JO7<>aff & ))iCe"W{R$ͫYz}I2*dks.|RTϜ = ͙ȳ 1HгːHIF^UQzC>kE5 Y|>,^.2EF{U;մͅE`|+R߿\a p^ͽuw؉U#~G>$>¹ EJI wq+rz͗ɟ n_ Ȋe6P6^ l4goɩػ}i0垨BO`N; BJFRɍ&1d P0,DYx9j;rׯ'8)dv(xEjqX_+nw3m7Ot 'A(w^0Kc*#3iY [5-d셷έ p:60Zt.֛VۙX+-o{ɥ~q5˧ݏUU`wI8oo|9e̡#Cj碥s l',O" ^kcY6JPXP"VKۄɑ/H>jʢ XR1M K-:̑c$ㅮlV#gOn~='͠y9Nvm{N75]{nn) PyuȖ9m8?fL=?wz|9kH"r#iMbmO.uD xe,etӍXĴ[]ڧy|f]v3|FnxaEOs߁#+9|xt;\%]bk.n1tZgZUmůN/7 usMXΈ/@00]<dnȇR5WzQ!68 "1 _ $=\ß_Guv(*%c74#h)Le-RO~+g۱BuÿZw}0\6>(_=}t> ]׏2wȳk~V7̵}A5ohSpëbF-.d4{Z`?4茏 ['an@86\Z^^,d %zk=J6CuFGoA7 ؀aAIFj b2)(hτoW]4FHJ;G .7$"Z TrX4#qE%IDĶI'Ӟ`ّ?,zbBxHIH.ګa6j]P& ̈́4z Ȍ#3VYSU+Py 63xA˹\HJN`2{@4f*C, CL΋)ڜMT3)ښ95c=[.BYY^{u2ٻ6dW~IH} 0YǻNBwu5E*h[Y俟Ehj(eq5U_UeVqK>Ӌ).7OwU?fxvab w4q.i ٛ*0,J[&`*ƞEmJMc&mW p Θ2Cc;؝sQ\31Ekw'DׁEh|0df=Q^вDV(EBipinS۬@2!CdELtML H¢&H:>!fQȹ[6FBIQ;m5"t׈FOF0\^ȜJ:2 qA3.%6rƴR:A. =,$M,"YuqWlP$Rҋe>:;mX/^/z6bIǼ,8 cdf $Ʃč d0%z /3}Ӈ;Pa֖$? 
яhRrH9!ǧQ ܾWi'>O:XU70U['׿O$LG.%l2OBz&1YM+l3Q,;Z;ܱs5jޓuu6^+HZە/`JjVq:Mihe:ʹ_h c/P:]"+P;K&aZ /8G $d JVMkk@ꠂP ^i2&]5}趟Mr=:vJBR4lBT2e'\Hs 㺜8X#s讎m޾+8.MUu$U5p<{ VNҨ:ʦO/AU]6MrםAmK?_FoFw_9Z}kjY|{l-^~A-dW^KrGb7{#uG 遠#sJvh:/BI|<$&x2I$pӵ^483 =OWg{(]Ez8*G\V^ LF-QX GwRyΧȲRUqΐI* JgѼ_aa2w3r ]_η/pTom4K >}_Gv{eȫ7DpΘ}xlFcҟ>_4 :d:/YZPC):EGVS쑽 ^*Mr $ ,7d<0ԐI 2ة @IJ 06䳌I:s md%%\BglXjB]gb-"T0x n^zҪ7 k?;?߳!32(Nf-đYNB!AEqf/ Bl:9I)ֻ٢9yn)$BBu9wK833>Y1ޖ(pM1h+zWtەeIo,[pk3 hjk 379Vjlw8>/J)/Ֆ]ێZ [y DTP!$༊ 3{p* ##m2fAm Kƀkv&إO{9;^٦D$Nv,g5r>ھp6F!K9rAZ0ǘt%"%'4IN`D%,C/]υz= ?9XA=%ϔS֘Ibz2r@rbڹPP χI+34I;qȜ-a6FC 'hJ= +N4bW< YWMoI/Oe^b :; ˺K,{5W[Qi4QsQy"]5G$SW v﷌-irx׎b* :`AyI7ZIb};\pi׹$:bӬciI 45 hy1kf% ĠU=޺Zm+9Je2,JR( !OC)U1 T!lj`n9&|NS5~}ز,Wg~>$i֚A3%a 6X!MP_L WF!S!sIh5^jΦU˪'N$M1Jfc4XuБ$"y w >V"]Ԑ ž^XF1,m*VJNd$53O5HRsւgeSQz1žjkh1.YNZF沶,j"G̕^$]}.xg^>R9~l ? 6i'\W]j,ȓ ^U׌r1-98r>g\2qp^-%xtsRiŵOY@|bNen&+rMEit^\Zc_ VpEIՙobhf.[G:mF cuufyIh(2e'ˉ^98d먂>%Fm}W֦tZ ?,MIJ^nz(痒4NX)_U?; Xz\,crzKV0(Vqh<9 .qv-T ~˫j8/W88;!!^?݋ߗ/} {ώ_¯h]Λ*u${˽|0CCK  ˷^L&._e^j< !m~Lnh8k*x1[S8 XCC2\,_9pƾg[ F=?2kB70(ߜl.|@P g+ZL"*h=]RJ4bo=wd 0Ss )# v9BkZŀ:y0#2D.ɗl3ZOZt/PBUHg<񙤌lsɇ^{zC^&kVE ]Z_ *% K]5s׋tĪzLL_0/ajPPزq)x~\  3cG&4B& 0UAĐMB3e3 !xLyC~/oG쯣1.*-M0e}:)zXDҴf̦+M8 { ][8G$5Ϋl;sLPgD 5Z+q꫻]Y;4ۏ^20XKk9G!Z\b,C2x\f6ʈXTXm;޳vw2S 6܏czA*~"A eVKYxkcs爂c>WiO}գem羅zu t]mDZ!"=ogHkՍppָ/ҼI Ñ:QiW.Xf YC{@,v;#3seQo~;.1@Z[m :Hk͙0nу=(@@Z߬{yhtX,*값2">Z!RkDFJ:2L{&! xv\mUSg8P>zt_*Ӗ>Nj)mZs_!3cVF+P>y*F @:E "YFc3#~2O9ŻaYכ&`#r Y =WȊ}t}7oIQ߿<ϭD%hCxMßރKC1W>\MGl/vԥs4lBxNxYK\m@ٛR A UC )E;-HL\ gWޭ}XM-3Zl>"Z47a+rNGƽ/g7Vx:k+GϷ(f~:KW(1Yln ۅ=52W9kJUm:I"׮BEf?!o&غiMp.lf3[;@lI%fb 37 ʴo!y0ț?]sm36ta#;t(zp)lIT^%QyÐNKl~T$* ITv~s[?JtmJ[=:pȓR1KpKl@:2sE~2u S=Dɰ |"%ϳ5T,f# IFC;2DIiHu/9@\̐PcrYDWqHUƹLmjly:.H7wp:_ 2 [p}yQj#N<]ŦC|F VYjK >L2AGust` 3rIe.wӸ eA$He,:,Y$ hIʵK}L^GlCyShPVkTnW/x(ÓYNXdepP0 DwtQhD{FqIoIW qs2. k;X%Yȝ'_w R8h[1@[nm =mIxY\须2Mq.sʒc_qџzR\q|8[=!EU&pqltB*\Yf{ J8Υ.l3$1hك^+30srTdNdۏNCŊvc nx(ϓB*P.-3;V$C7JDbR|晩AXQB-N 8)dZϖO2ܿJ^թOE6rnd{n>Fw9\O_xҗgt]unCCyB7{@eS>H!@<~v2db1Jq~I {';,XJnN(U7nut1&ڰ=nĖ%U`m('P)#'Q5 =*Ak꣫j)q YL@@9eUAK 9jϹV3.6MeIVC1Lv~?F$Ņ h&/lÍVI Eq:)ppRKgT[N fd6+BDs!^kjBmϮ:jI&hSfcߪ6ON=|:1Z=bS}^UTނw;PKLZeܰG pgA*:(!1ptc#M,GA傑:fyt)+LR:R ͌]PTBcSLɁbnnI7o7w,˶TC4x*i وĝA[К3 2CY8SזFda(Ξ 6B<hnەjlrHm*#v5u6#⒋y,]mv['VS`V;{9Z:&D P4ԽRi CFːzA$X 'nB1 ?p>fi@:i*a5u6a# 0 "V""TFDGĵ3>qO񎦏w﨣 yisˤS,*G)n d2:Z'8dU׀ΎLwƳ|d2L*3Xy4E%rm:5|LA;̮6Rˏח§*Ӿ \M&'|֫^o>6` e#Wd b0P4Rn7?ތf~ `QZٌ;t̩/v{bQY! 9h)ie 5D^( N-( e)9.qY'%k{(r[C`^~$_;SCc(<zehyⶃ;5Sn!`Ex,.;)P)pr#`sGV죪/ط}a"}y RNԵ SSєz}yM6nd۰f7vE˲~PrjӒ!3+ڑkW K3j[l&mq8G Bm ~4ؒ`yEŮgom!y0ț?]MmV<-N]o?N#05ݟ7]/|Ԝ{xgyv"  AnjS[=QR'Abx8$$ZdSA)yKg]4ăfxJj4a^\ҊE$rw%q=5UG. 
#*7L4#gG,P.f?>XKw͙/.Wݭ_=3^GmoZw~{cޞO.,xkbT YQ\1 @2DL ;8(Rd)H7 “ B%0KAk$He$L 2^Ama<=>Z|Uwsz\F3R3IF0U3TДlP0忱OAubP2tQRZ[vg~W!>H_jU7}>_XƝ}0o|h@D,`)PYzUcK(A)VJ${Z~>~ӵF?rF?Zm?>;ԈKULtr?V_=I-2>;/ٻ<!y tǾ ȥ_ϯƾFEFoO}\/y;*r&ܝN7z%B~bNk];5J5goIA> b^4e9jw^mg-ܶ ~|ޗpYg}Ҭ8M<a6ׇW opq^1 G^-/oD: VnC޵Ɩz_rՖ!R~F2mLv$c[HM_M!-شToU48>␻7@gLN2b>Z>*t~9}瑍ofo4hPW?K9 WNq7x0EKZ+&uWg'iveY{$,Mw7ey7]1 n7>1IlUȖ )'g`|O=gqld>֭aW"Zx2O;<ϧdnK* Pz&TF;/8o M$yB`VY &E//B1Yx>wS3(Bڐ6,AqJa5!%f d)@ U崛˫Caߌ7a/3k=Z&Wsn)GY'f:j#-|au^2otGq[-gr"nyi3dW TٌҶdI훇_!q?^NMި#\޳B%v(=o(T6X`q%YD̶BY@W ;DlO+[ʤ *](\lmٵ"~A]`f6u݌O=tyv]ޯǛMֹ%D[xæ 9&VdDc!R2`ZSTD"uXR*!V TT =L\tSA%ɕ9YYs)* R9#c;]6}P7B=`ᓗX0U?x/nh2~/pe9f",ާԬʰiL!U4vOC5()CPZUԖӃ',lr |oKEDt9#v PPw =hG^GLl)dX"f}4e щVI5:坰m1)g$̚XVeeő1?I"Q16#~7D?Pb}~+l:Ѹ_H\|8m(`C Qe1Ox!񎘴B / S0 Y, muI3$2]@F*+]H$]='d!I}.{[dg&!A,YoZnE.=y{\>=b.]tn^o x_IwUn ͎&kxz]Ydl*1B*(,[i:jTCWQvQ(u}^o_V%/ݝo@sl;:,W-y]5XX2e yeNūL}jo(M%d@GR >Ifڥyn`¼ c-J܄ u"5VHs]7CRw؟NW3kc'-Vk&-f^NqgecN`f9 g(e!BEB[:Dcd,Zg ^ ^J E-㏛Cn1cxhJQc_ SJE4MGNy<5ӫYVju1n\$ɕgvAꦸhL*_Ȫk2]lHlТ2l2N j֊X jB$ m&McAZBgS?L!0 -W!V*dTŹW QHz!tP͠Pr x͎˙!kA^7f(g\_vRQs."R)4>hB,)e[@XҼE악yI.PH%YQ_ /z~a?yAQPaFϙ L&T L)XNF=Az/'4Ik=m 9b+%F|m`/. 3f&yBpe&qP ^:#4Y(~ xm:[L"R !D#R 2HV[zKf&eU6#´]rWYBLYf,§·'Ul^S gSJ#Rj:Bb6؊cTftҡF'ռn&_[|tkJHU@lR&^]Nsq ~TW .c[_xWzw~OB_ OtQONOoUY7zF=v8*l1\YQȑQ~F:#6^^D2?%%3L=$ty.LI^O˟V1|mfuTlXݧz=iQzb}"Z]-;W|_FhjѥqTF|-q]va|2bn}uMlqyjϯowR7[ń<_Qd3,)ZIC\34^n37MoN}ef||qFd+9;N&%_'bb pF" GgUۺ†SQRR)4`ʂ&۔GH%,RjT1ht2x˲\x};dVb (1Y\pJc0_#|-շu̱JuymwsPW%?Տ,}L%:}I'8&zF ,Xe%{% {< U'yB>놰B.ڕbrU ȬJes@B*+E<\[N$J[/։ZTe QQtZF+\SN֜遂QM5vTF>4_gq;l-Q:EZ="r4rɳH2,]zSꃙ~]{euwuԓg9𷑧6aLVw:D:6/]()BZxBy~&O NwkΊLU9P F.Nn5!䦏|v~ `9tF&W$,̾e͛WGcirj&g[Ḋ6_`HgP*&?uTө~/?jzQ~n ,m1̥~[ovQ6jNwa[PUK%׷tY hF"Pe3a`,|8&zrf7iemnu6ȪV+*”N0[#apRW_wuWkZJ?u~؛bfvᥓ[]R L |Ͼҁ9/ yoqek*@R(;7E_>ӿ??>|gw?;~elW]P{w:@zuSMC{b5fQ^]+i]TKb7 P~ί3O5le4qP*tP*-lcJ2%f``w'`Ud~VəY`H)ʝu~l& @5a\PJ8$R)2H=$9]і֨{hLoȯW g2ڈ!B[N95FK$)qL(⩶N`,x u=jꨮW4&z.GQO 2D ][zC8 .yt./ܛէIW씰i|̨_뢳yAU'N!RnJůKǏZQҝD&Szc"ny#ޫR*x/Z`&Y-[x#i"/utlCE7Msܹ3oЙkj!y,:s I}  u^h^_d߼OJu|0\ǽu%2M j :u6ZԠ;Cw˷Ø ƐLa:|ɍ!PM#t9y}y_Pi0c4 2\YyMrlc_,Z+ *|H]QqIud9Rzxxs,GIX{ɱ6$Q3-9hÃ&̴ѠfGPa{m躕ۻ˴<@Bl11vY4a#A2sVk/;[[,} czWK.r;'+͏? hc:a^Ag flrSY6Lr໫&gի/σAmh'|B~E瞃@- UaׯFBoJ[{q}AU` Td383Ml▱ٹ,Hc`bҙV)42S=LU)XO]5:LLrܥܞB($S1f4j|5#,RFXp4g퉳*% GS{w)ַtyX^9m(Bǂz[ܧ{VހxOP xsr6GABLR9n[ҝf@YBwn|$y`0@4a*,Qpi,hǀqZb qk(/r 8M(rn%Ew9*"V_"D4yYug;/ 4tb8ԝhXAe)C' oIx䮹4 @ s.hq#~1UCK.`Ɋasvw&_97/MfC<8زX=$G?OrD2dH+,f;T  j$cGE]`;Mvhܻ.;qw`3(VVغZ\M/26n_fcci4§?āDbb Xg`;Fha-SPt͆G"hVaF͝QFG wP`QEUXw8p8U:Ϟz#l=VEh-G,1 ia}4Z(43m|#֨|VeAɳWKnCuCb2Wc{qm#!- S>=?=B{֞KEڬ=/0kPH"q0p~Mӆ8Lxg?M3JnrnQElf%Y[p/t^KC{=҄#Ga$z9kIYT*RpiiM ęFpn\;RV9N!mx6M7SM`/|ynl5ؖ7R{K F[Et`KQ;Ugx= ^_= diP\mjsfƛ0CG2Fʌa0SUJH J@ 0ʭ3yBA1jqai.:9X\kĂQ´¼>iU,Q5)iZ`[A,4 ٔ(紲g*'$ Fhp6}g DjQ@*:BAϩ5/ah6O6P07Nr68dݭ x&`GiusT'Gl~!  
=r¢1Mp`U"+VNb )nt4Ea/тѵYQb>?pF36ۓ&s eGv0SD8c "Y0`^o i~;Kǁjq!)#$u*ה{j{`j!,UX- YF0c2b=6@lD4h;5Ån>&!ݑ; &GA f(>Zרp__f2 YּyRn[.wUGӼNDduJg;k8 0J~VSu1y5RP:S(zdK2[+Ji_Sm핗 F %7CW7{;:2?zydn׊nd8pQom3gEY_-ɖ<{vj2]h`GFKi7$N=P.O7zɱR/Ez!¼qWbũDk9h", quf!cNg:p0 z[ L@9.ֽ}q4H8:6h%Q+m9( AV4xoUOi5s<a$QH)9H u`"-T{0 0fQ*D0UXo}b]k.h8$0SEM[v=3FbN6PiVJ3WJjArq+7Ik-)e"+Rjj*0ރ{bUT' < Ir}>~"du{8_GXDtFM"`7X"S59LHA:fpǧ5>M&#BI*veEΏq 0:(FޗA?HCSD1|0A T9Oz5:o{sޛ^\S7 |!5ݐW*׸.A~U{X#\4\V#KMj(rՓ:skNΫ_Vł|͙W㪊dJU VՍ%UjBZ!,wX PUXt_zo486D-{D-]tqa&43l/}{9.au%il&2b<"WHkG%w릸˟.jЛNܧ#TH˥zw`c^ԡT^hZWk\IWe| %bR^LtԼ8޲Ӭ[eOh“zjvԔ379c[y㩓ϩ=.K|&")MV35 [8#q42\iH`l6ZK#ɑ1c5tLQ P8C#LPv6@zz,1^}$\~W d,YN_ ͢-˲_G?_QgK^<]V^SQ{L٧N/SwY3>Fm0K/ӓ܉J)噏E',=eVh0,U9rx>?VL{%箒k$>$叫>8|n Flzݰvgϵ?Z6a?7_gV/juvgNNgB8Zl@=DjoF\uވqiz:[Z^p?> ފ~ 3IiQES v!z%򱫍-tSz lbR:W#hi!KNZ4H&0Qbsv 6]gRfmHZ|:a ![X0TL>ޠӊ-'т$䮊}Ax"&"^.zyg9kFΖrW-aT>\SRtNSr&[@بEd"%5ڬeh)ZӏSAT1HI I0DQ`&Q19rBu:[Btˠ8(x܌'ngIu{ocZe tҕ*yK$I@*^t5Z]afR[5tF]t/zg 5c j,|j)|xq2YJw 't3ZhEV`RJ"^7.k0STqmC7-AOհbo ̗t%jCo]/nHDBY%$ &lyn^I2϶k>9 ^B -c v Oe̖Ov64?&_G\,_rNyjQNaot2~跍S&,c`)zQzyeu=ܓgyW]r&kk^Þ PRH-{Yƽ8qڛ<ޣqV9`H2"iơp&kr'#^,gg,uJK# X鞟{U*.hrzMYկo#\kZ|Ih{P3]+D5H'yWMgNjG.oqvT)yͣZԿ-h8=£,$?oT^~{9!dsg~܁φϗ?СG LZPXR+9T*791{y2_i㺠 NФ1hl`t-v,vSau;AXDd+LR5Qۤ\%>BD N"DA4)D#&(Ƥ_&@@YyE[9.qRH3rnN a ]лKX~XrӼ&(Gqhۢ %g=/#n-!!qC61PuԼteEggA"U*5IEY ~6#g\klW>r??;ג؛xAWƠxzrfiݚڜZۛwҰ3d$]͵BH*a"HV!UlsDH-<.)ɧ,)mQV)>HQBQ:@YP 6#f g}rD N!A0LJaep͍N)XKZ:,䕋KJS٥u0/Daw7(YWqVD $(HyFVƆT36TfheAԫ *8. 7.hu+6L^Y8f!BZRP GCZG,Kش\8`0%\;Ӛ#"rֱ$IOb=b~g<{fkAԠȀQqiTPx0(gs7H&x}{:yz)mx5E϶Uә>՝NTm`'Z,_ v煠$?mڼpLӆk$?ieb6J+&Ydƿ8 jҏWޭgY3i,|d?׻yJkRL?E8ZSN}!i }4m[V]+j]lhl/5O)Ҡ _v݅`w!]v݅W(ej\mu7[L1 J+W3[IJE#BsnN$ hʃ 䕅4hmU$rf(H㨕sH *^҇*&hR 5A)΄ل\M9* Y@/ g7B8Wv類û>>'hjLU2X8?z72S8]!/C'Q9sc @<$ KݙlgS]&;uTf .]ky!~ä"XkQ`0ҙȌsV,r(t X)eyR+v,RR-ȍ%fRiWS6R3RH P )CJ@mrpk1YΞ|tgfOuE0$5Mom[zGN~M=ذfwtw=>B%m=ȭC8ˣ6]^?Sx uUz{Nl۾uز–U+wmyxwm=/.9+Og{~>2l/g[Gwڜ ^H^4ӟ7ݝ@WmNI[ga]RS$}Ȁ:nB\Pʻ] |U#q%J޳ȉT[VxS(Z-$_9|r\rh08v jڙ&1:&%Nk$ F$ GHTsU$(IJ7W]6p)}Gs@v񸛷ێuE7ڦ&ƧNFi"I'@:X9XА( A(R@ugb,& CgNhDvҀk:DhD!p) <. ”K'[)Y{ |_$xt[[m;#ǕmA{QmkA9Z2;*9e W0\gxdv:ӳtKhξp3qF⍓ aٷԐ8ωf8lB=Hƕ,'yO˩%B\_m_MdK\Ox4OY/. [hr'{1*?͎hNsK/0:E*zv ⎟{7h@B_蛜i:>۶MNb<:?Fqv_TPǙ7dm/MF7v%)WWY/NR 0}1{I=?uoA/W|F y_p.Sķ9nRCprMVkAZyP/M ߣ` gt/ԡݫ}eB.}##S5Х&_া)?tlCS!PuNoÞRSgې6v_J#zo%[Rg]tQJ=J]N{ǃ}{6 }453: ˏ؛(-&T+oAj.;[e9L0Zk_ <6f"a#3li6)Sgϼ'NcbQ$;9f1Et}yv˶3$l"J(E D%X kCFhHPk8s:)9;q)rp>=3!y&+p.wFP8ﹰ\HTR`4Q4h@ HJ$49-GTdD4W3zE^@iݎi6Rmmbpӣxq])(ά .<{[cWR~|EFPdO+VJl)|4t"T!q_ԕK)WZ:eWH;Zs>hŎ|XGHRGZc$ &IF"gML.Vg[9=0>-oyzq}ӏ;.Ƹ%)ȽwEBFT\434z#1#]5e˪}Uݯ:g/]5/*\3WB\R ⪬Ș2V Em ZD)kWReK bFd őxq )8N> +@Xd&cY˵pAb]".*`/  FCdiWqӫ$־8 '`5t^msɋ,7酊,;tRH PL%//;!bя&~&R?pQ4ywbo.Ex3|zހ}?P?-kN+ o?ym$7y,FE=2I84pѣ1l+[=N| ]m[0a/F[lM͵8>fs||vzԶvzkBO]zMo"۫p5׏3p߷~ܢ]و4>n'Cͅ .Oz-0@'ԾL_$.ʅ2 וUA' 1Б4W:\*}dIdu@8FFGJSvS2ܯٍ)r<ʖ*m6䌬ci \EƠB2׊rrpkq`$Auj)QlFV 2("i 2O}J\k:NѾATV&콚 `.[LN^ln*=Fܫ(j( 9`:(%n!xRI=Rf  J<9P"Bg;UӨ#s8 . k% p-1VI! "#.E4" 785I*ʃq=4DU'f79f[28\|=jEp_-|-&09t9JW.&ptxLcE<6x3V㻻_KUZ0+HTjQqK.Wl=y:,9g \-suvXcVw*Vj,9-hcjy8 sLzOnunA !8uf&i87\g%y.sa(0qr cʉR#^ʪw?)ϞFG| s Mف>v9#7Mq"(`DMTD=sJ4%MkCdFƞ)fU`ԋ')Z)!hɁGv@Rp‚T?_wqmpv8[ژ%<0/ȷx h=Q?r֓W>C3myĺI1VEqNRdE $"@2#9:WP缪\A;ͭMsN8uViP&\&-j\JF'K)Z$]9%r8C( rZ|3q騔!|:Nt\^gG=?MR3yx>\A_BvQRRRihR `9Nqb1q\pv6:eHJu*P |x*?'xh? 
Y \DQLQ!QcDq GG*A2%*POJz姃O6;sڢ21PhA%!$G֫|`X%tꈣhTQd⠓:,Gy2ﷀwY#WT)•p^l*hPev5e&ݘ`ȡjV>6W]8ꀡpNE's<M(<ƕRwE2eQ ً;qXcJ Zf৸ z5 _񖲺\mҭ!(!P ]ɑMa*ƓnM1 Whiz1DWd+sBxP=?g?/G Ȅ+%X -Xi=M%ʩ $2tr{xhOxܖߣPxixDܠKZQTL =ށtBSt!xK& ՛\j&6F ɜ 2bӎs!D&Be(&lT\RΛflٖiY{wm@\DYM`iWN|*I1mݣsh`+(j"Eܛ5RJLUq5V$mق*ԈhQ"y{5J83k7W GAkʺS ^p1q9oa#B{6QD'%PwX'GwtA4jޣa%)ÈSh\)l( αSzW5N嗔QS\IC˓(N*^4s*PdHgb 61bؖI\vl[? qɟ&h0jaAo.;"}bLR|[ K7̤m[.+VWB > 'Mq˙%zT3 rzNL\&S$^r~m]Vd%ol Dnv!p5H߿-߽R Rx,JBaaXX8{K[|pd]4^Dϟ$dJ*Pu6Lz~7uM;Yumµ<{1*M֧V~]|0ڰsa?iw>nveEmմ:;o!Z2[rsKۚa[Qdac3 1(8!iOmkכb͇maL#z\K ɬ m~9;}r;NȽkqJ-<'2&lg=R-ZZqhijFԢu>( 2 :H1@*PY:M$덢^X 5ַ\J8scc IJGLH xWhsN?@XoYks[VdrT}&KN|KOSWHBEn|t&LL3אrRcuv\翏Ǯ3X/m%HJ-uL,m#>n~ƙA'kczjزU)T-!c8J$VNUah|}v9UgKmM#^mև^Mg׼/m? zޙE: IBgGȀ9p|Ļp9 8't<"c\%ig DC)c):魶&H]Ҟ-UhIJ<;kmpvh4MNi,VN{ST~EYɳ4hAfɶ*іlM϶(5]̶*$Mɡ0W WC[ BNWʎNZDW6+khkV@*\MhЕ"k"BE`ص&BɦUF):J k]!`dk uDW-kUFIz*|-+x{ Wƺh2ʕNWCP񭫽 ʎLWGøZ)CW,= ::t)l[QALajYH ~)hܫbU7-T:x*qOa_@9z^̢VrYyb)HjYrKRdyi1-H.M~5@Q>=g3̪ᘥVVf׊π4y1rG?->ȳg?l YB; 2-ũ.\g7JZj+lfT2X?ZF@rP/oDPQ`aNGx]7:V*N__."Vm]nZ]6״;mg;>KmcThAj4m2cEqRV;I[Mol+\tfi/&v2aSZ+"o+b=a|dgC }?pCGanfrBb^|7?Ki~]S0?@b%AǕصZ)w Iٕ( dgBJQ"#zD %=˘[b#``5.v+Z:$t;ԴsOL-+Li{*ehu`?ttut>1:-+,@0_]trhe*Tuut%Mt+pMk xg0]"]WJּ=U+i[*=ɃPj ҕ6<@dk*պ-thMRȎNU:'mA WɶЕI3H*Ԣ+#4Et9kuhu Ո>>' +TTX6h ʻf}>`Xa3 006C4W{&sIB/9DisǕٗo72g#Y~=R"L-ܙhI{9:E +i`38q:ԃL-O-xIk[!1&8ʇK-8s;g+g G? C#NN!n3Vr 8I-\Pi4Hk[#92K!X"6ayXPd8}rp!Dq!"zAϩ̓'t;on,}l%$Sr!U-1:WkUnzeq!Lm 6ͷl>/#Se kkP oSM:,0e*:1g„IV&m=5Quali4Qh) mGylNPD6%K d~j*pb)Uf뀬-PRJk]re{]ZߪИU07T3F+PtRU=QKʇ<wDF0S"Y炙qĐu hQJAnʒ&dKX*0Bgw@>R;Ylnc0j!SwUa}pxĨt # AuqI%B~O_ݛ\Ysզ:uY(/`TJwC4*ܽOIzM59QJj MQ Ar#钍5w:r|XvIkNֹ[ʱ'Wsqu~1 mƂs VRj() AF5K!RXuݷn\ D@K$XZFR-+搌wr26z=򰡥27g ,(R>IA:ZaPB Bт:K+zn'0L'mTBф i'S BAmQxi,h**u(:ݡ-!x<4ӊ#:CI0kD(QTzsPyUo<X,X]A8Fn ee ,q/YgeV\ZB6tGV- YZec`t" mZ'S7f=ak7E5hS5B-E9Q G6NqA)9x X5lGD&0VN ddR!}EQ$xIiN +QQY[4 W TW7"z,8(&`ΫmO J+Y;F͐jPob~8w(c)hduP BHP%D&T+Di?hx"WX)[K1 : 9 & b :TjM\I!0Ej>dt já<l24~ :zojMt%Ji7 ] B!Ь3K*-d@B[TАcݾ8'YƸgU]%׽6Q Yd^$L'HH/>XҠDtP 6CZb@$F=(aL!#A KwA={thg .USϘ J(N; zPIk4BK.}oGdu=9g:9>oU/x "Q}`mFL`-d3x:p4@.A[0JhIWo#f BX9Y4<ѣyԄ0 4_4+Xq==dHiDh2Q,C,bw W4 h^=K`r!YF[[yQx+$@e6 *YȩՏoTG^ s mG7j%eD!v"}IwӍ?ݞyLEUv!ȕSе?T4(ccP.u%1h /fs7DR& +28Œbk ) x0 M-`гV W nk 9%bkVcCXDA5DC7HX] @N%m1Y5(+VF/CσrFL"|R:W|݆i63+I@̴LZU'J)C2~ȃ 8wG{w[0qVcCe t׈Bj(4cvf:PF._t3,*k$k4Y,phm@ [SW`$XHď퐄M٤$E}>FJP0\qih}{yM/]ώXxO˴gM\ϝIv4f`.`5BgekOak0vBšb1j1֚cfԌ]߼Ual+4Ou'/PDuN M蓻~#wt(䍡"UA  _0ۢ#ɡf$U#1t \:'X&W!Xڨbj#gmAiomEEJ@:XV=Եj@רMk&=Lb2Վ*WZ@׶䠧g[:?9801>|›k΁ >b1di1`EH&XkʛvLpyP/FF̆ M F9,`=KF$=ki0kN’1 +Itk /&"0@.: lnT *KcbH0r:@jW!t$b4uŲsL\HPλ]~Fz+ZP^){KAApՋ7B{qv x"`!רLyM5u%*=/_ZM5dP]oy:ר能i-O/?\V?|bBR\l[mk=?9Z]nҮwtW!^=OX^a7Gܞpw7gG/^h/p{N3>~u7B9nߺ߽9xVm>9cV=ɿfuk׆ÑG]7, Ār| 7..6@ Mԝ@ҋY:,t'8 $N qH@'8 $N qH@'8 $N qH@'8 $N qH@'8 $N qH@'8 $N q:Ȩy@dt0qnZ#̀@@ycN @\@'8 $N qH@'8 $N qH@'8 $N qH@'8 $N qH@'8 $N qH@'8 $NgBE=qӋqmO  @(щH@'8 $N qH@'8 $N qH@'8 $N qH@'8 $N qH@'8 $N qH@'8 |@y=(9@yN (Ng:'8 $N qH@'8 $N qH@'8 $N qH@'8 $N qH@'8 $N qH@'8 $N qz@n>ݹ>YVS~{}rCo`ܵvv~Uݗ@_q Cp1.1db\hS7.1JG= GKWX qt9O\c1-S+Fi3h|pKRW/f/.-h+Fy ]=J&%G da1tpBW?2(wUCeIA7y\-]=!ma(yZt@WQK^d:E n*@ Apn{z|ۓ >X^7r؝׳5_$9`=@.nyq1^v+fo(}LkvnmVEsp՛mKEWnܸYyWA|⶯y@~Jn$!OOoOv'>tն꓁8o\&oꧽ=fH~8Ӱ8^&p%?^G]>Zf ^{~k?l;O_8JNn\л2<1Ÿ?~ic= ]~vI1_ htq#w/\]kje\vgU+pdt~L}6j$m4 ԞR#Py(9ʩ˾:˛V1,]7\F=U¹vlv`kSYkԃ +>3gr!*,0يK雓qRl{үFzi} Ѩ}Mz`@KVGvgyut*/ww]ѯIPw"ڲ:6|_ǹ`Y~P$Ç,g3hHq5H6LY3ꯪˆRW@ctoK]%r9>uPI?gb 1d :uURҳz)(?!u62J r*QY]=Cu5A XkEi ~oRI7\Y_M' ZV>dI审$?`@,7%$94#ih%9eb<, 0~*珛䉙/E翤&Z.Y|S0\S^}Yg^*a 49MSjų_x3 pGuE ^޴pg1-( .xkR0Oӱ )4>Z ߬^lj->~M7G6}{ |7]з}@RWK8|Ũ0Q .J~|((4FZ=foR/o故 .}a$ҍ!2zc7ɮXe 4pqA@S/ xd!R&R/5eDDL ` 
Xy$RDļ$|&2_PB"eэt1ڽ|6+VZ uZ@7 nM ̓|/V%,.|:9̼_V%/=YVl A,t6 ?W_3ckZf 1Wfb67rךqn55S7^~PaIw|쵲2aDBP5; v(UTsc%bD֟?|tfa0xP}M:v724f3a>KlǾ593("LoreUE'?Of&/В_/b4K͇jmWa:_:I ^t9G>X?z2HgKsKw/AG}oyTF;,/`M%uV'j"a^fJZz!'3id 7V#{ GC@ߑ!yr=IK]֜xƌʚ旅l 1,VQid9p f[$FzV"$tBI z ( e5Bi2#$ 9Spݸ*-(vd3ZP/&{ /X XDcnQs044nƬU^:!Keo{EiQ,+Z wJFQXD;xC@QTJ! Ty %56Fr֣T8:o D Xf F8ZaK) NJHrfP4TJF9`ŤԌHdDԈ:N"W B M[i(g\qLb0QcR >FI0REG&Y OXiۉb#PԞEMZ'IgJb`K0"1!5W0LjӃ gE * [df"\ r^[miWLX5WHGĊV0V,)步F 9 p-3$HlPV3enL^0 ꑎB! (1L*1Ni1 D$7~ &am ~ޏQz\L"9 }j!|k.'[51Lhƥ[*a61."`fş].779r"day c2R&b@rN!t'I J1u]3K;ҝA ^vPgwJ`pVtb !vr͜\d>G>&. x,>NX^tH31"Y췫ߖ_旓^QF(8Lp_C4[]ˮM X+ 7VP*&?t,)'7#~.x^Ub@EI̥p_-7c[q~\㞯fT0eյ[b|{Ku͐f$"[2[DF\G[J> zvr0Zd*Awd]}2t5d$5́}OKo&xwqg\wV)\qj)qqrYm>䊿%f;~xzYl3uܲm5QE]]7AHoo?~ۗ_~e^\(^Kz0 ??@ՏhZ4j*֠iFr+rMojwY\oCXͷrKpgiMnv@CrJDkpH8jTqXL8 !XI޵ܟ 8Ywy 0rFh+"̢u"̩1Z"I`Ou|¤ycn'}wN>^i)ri :0=M #Dcpce=z;zH{3$iay[cR;ĬW逰 9HysQK){PTnelʭ.e`\hs?:s(VϠ}FY6 r໛zs$ڷ95_EVVo4g%CUX1Q5i2vdgH(&\lDVN[#7_գwTH~ɽgHu@Z:]pjN ZGK!I3jg>LLj9+j$ SSk9+ lw=iA[g72lUAr( 嶠zmar0KrmY 8@,HqZdt8:b80I>惡@:C{؁E~gy`0@[uC%2 .-&Q)/p (%vnRS1G Eέ=#"( \$JG@i"I˛ԭ{4t&ޢԽREM Q*KΕN3tNwkQ(u:I~〟["@N -(JRh `QkM F2yTԵ ?/i#K)OX{jIn͠Xm#,r{lt޶[m#i #C1AYA\R{#hE:v/G"hVafZ{9_K܌ԏꗁEkou/aSK/EJ =!E=HDiǀ^잮:}NOUo5ELhB $< :Uņ߃Yo"mKrYjVtOi5E(wAYJ/9v`X L"5YT"xƆT3gjo*6!L=DRbvCl-*,zs 1-ncZl-K0%핰mӚ#"vֱ$80zŢߏ:=L3+8-  E,8׌Kxͩ LۧICl#c#njGjmΉ Zl+m|8[(ܙ^tMp>/D}\X{Z_DBL8JZk׎3[KJfWɑ?A1D6YH֙'AA9WbThU q\GgX W%=V2 !BehgBL l"*G9KIP'AqŅv5?5#=_M&-WX;m|Dar^j%AxrItfXxϵq<B2߃Fy{{ gl-7쉭T {d~;7Z*`ó/IP032 Zij7-`P0 C3ў,_Զ\>ve]}qv:jնp}}5_E`ޡaۅw=->WrК;wDhw׹׏9a>eu;ܺ^=>Iu=e%Gnl;Ė5;k{;MF#twf<*~5 oW[ꞎ[plzGijڷm6)[aCs/ѡwPn6mbkdÛקX+5)q"IMe57ꈬB\W(͎纨aݟ~xLwKF5p#$Y9*T b7WCbxnJob(vZFܭf6w7k-n6}-AyWr2=߻d!Z؈nYp&t`*Pſ !HE  m3`x `9K:9"iH<*@(S6A.E\&ֵlQ^^ҫ$xzrul<}qej'& jix^jt=ZP(CTV'^%,J&3 La3 [CK(H|/(V3PHnL@tD3Ǥ&ngJSy2]~ FW dQr<&`)ʷ/E퓔&hADB/|u{[&}G оA=Vv#,ėJxv..øVaRT7NpKXRCR8 W!Exqe8oYm`v/s 9W1=xd>}\S?6[A> \y8m슞j6p֐֮aW95[m<o=}VOB={է{oM\\Ξ~[Cƍmh{3N>ŬeT , ^q.-s' /ʹ2cz6X+@̖ן%*E DP}kk\i鴖CeKs8liΏ-[QO@YˏKaIﭖI2I-F$w9Q(5CR䬉i: ҝ>0>,f?fͧ;.Ƹ%)਽gEBFT\/+4z#1z]k|'{uݯ.SwAv3SZh>LTB*+2筍պ vQ:ǮQq:zْB1`(Bq @^Ch 䅏 zaheV3#ovQdm,g`-u+Y . 
Yܻ9ؑb>pz$6,we;nfѤc (˘jp^Z8WDUO0+J/?UGw>2%8-(Y$YQ<{-+1C-ǨQB)&֠ʔlcSy`_GD!AGdxōV EtW+8͖hldee1Q ɤy/D<:b\*N)}ieWj8uGZϑZ;VN^{YRIJ=ҫzY{T|YW:(%n!yRI=Jf (UR%*EFU !9SM@)Cy-THƱ`c)U1HBJ#cGr\b!/|ƒȮoyogѭKSAmYq&`h2:E#D N*ʘRr27E,:j1=12lrxǣr0[Bmc`"|L+] g?b$lHcAb1`_U0V:JZgBhJXNDTq΢J3 ʾMB*P!#3*kB$^ȴHօ$(0~2/fq1[q$ᆱ0*-/Z&09I%ث^oe&jIzU\oМ5_\m$iIp%vR꧸|}TOtf6=f܌lZ <ԋl6/^|^y\ԟay{2nLG[mE N(Ϊ!Q5_&>8jkVt " bkkv2!-]yT˓VDZt6BO6Y\'/jyIfU%t Jͼ>8; z }^u[/?%wN,8 Qj5"V'7rtIP&s\66:__dS @=ieQ +[beU)B#T'vDZa0~* :OZ>;z9SQk b IQ-jn)qO)uZ/' ).8 M[j\TFyZk!-qu}BUQ|de rSsf%zg$Ү~?ffl+|Oi|O/aO as@fUS{DNN۲(0%I6vIlY,pmN-np9}֊NwiG2m}~*m'JEj51E+z]dX5ސ]T_SW0y (37ÛFxͷ%6m8ql%7RynZY?*n lɄ@ȣ.06{=l^Zi ZV .,rlȯE})9u]V+3m\?TK|wbgvͻޘnW(o3?A;2MwȆJdq>Sdo(PoC>d 0L}&j#>]<|D *&bI=E'| srB&IXO`(!$@k}!-* +hg5=⛽Q L{ܢ{lc=C!S:;Ԭr 'k\t5nIzD#xw:OV2vW6Kw[_LlFSIv> -m2,BmJ GgF(1LtmAtNe7I8tz紖dY$A< h:%5L65JH ݙħ4ٲ!3rY'z)w'߸ִL 1:[]0*ͬr(ZBk %Z AES͝b1Ǣlh&z%"~t3j9pnoa^|v$F󲤌ۦeHA;cUT5êT@l d!6ɄLV sA(x"E6bP=YoYZKlf6o{R}*~ ^Z @JPyTr&[Q-f;XN ]Tf%(XOxR.אOLS)jQ) ''aB' D&TYRf>W^Pk[O;i̻ Xg#[i>"SI090K uICtڗAp"8t~q3j 2BxcAw#u4Y &Yt$@o 4&#r% #䪷Ļ[tń}7zg?F"2|0gu8XA: :#tsRa .2hѳT(dSP=sӪXS<.ş4)͢]>.Zm0E,gY{h0.l?Xk۞PD@ccRy%ML4eLPz߃\ GA^JN>.ƭʫܤ6dƹ1[M-ڔ~ g_E 1$6>C9+JZKa6(|5&,&eUccHƒ_uI' 4̪#N.n+69J΃祎M ,-c 6B!@'J _\bæ3 /#mDW%Az<%'BL>0 -XBbc)wc~NhtK}&5^䙫\mP="6Ő M%eØ'UfD߼j+=8vz?}1=USB>aLV״})RH%`LuV)~N|ڟ|O'[z+(jʘ $EpVOO!1~{XƳJb~IUAFg$]}x7^+^l2%4Jwp~2OkC.jin0˥M GIJ͒kk=^݌u2;c{ۋoaֶɼloU+"&U߾9k=6Gz5 t ̲VsX~eF4>me`ŪO'=yxԠ:GqEvڹV'l6zԁ 6b򫌪G@̪Jyy-?NK^VG?_|v;ښPKYrY4ty_N4*ZgZΖx<#f_ݏ׿~{~=_u_Sbu9~jl1/ۛ1tkho1zs _g\;^˭N5c̀@_-%86u4N|^_DP0b2 I!Iڋ &)d@@EC= 鲃7)ڀ`"|1YҨNArl@VSkOQ%9^Ƣk \50=M Rc(˶ul:܆>+mg-Yݲ.;\;C6ٖXvFwGHb{)И}($KBgN30(]9B;q}\RUbcunvDWHm\;'}8ϫL-7sݛZ?O?;K*'V^>RϯY:]ygnm=fߗlIPWFu@jt{U:@'T\65ZH^$狮z]sYv&xJʤ ˘J$) Ygb.E]T:i ;U򀳶YWUëTN`խv le!Qb N&!\OY121)b :#hi.Vk@$Y+ON=?em:w?ee \;0ԍoj@GL{ANe'wR>" GK߉BgK1@'ĻX4NH@ϛd=_bkjG8x*P1҃'8;?a',2F[J--(zr!()OFZR4jǐXSQcXTf{V-W$}6 BHgoY3xu9~B'k$\v؆O[/{]޻xq~4;gݚ?mƾݡl *)R&Q,`Bh!8G,%Jlc!DREGOc(HO E$wuަs7Y/'+Kcym=ۭ(o+i, vkKr~%0|"S6^Jj|t!>fj"v-ڍ7;^sB*0,_C:#*mٜbZKdJʫ`":Ud,]a`,Aǫ%3'70i L^_hLi?LUOT5fYF *])M. t)恆Zϡzv&biù|5ȓI1k #Xl ǒH:lC3|v}%JUW|)MBS# @犵EtYI]D-hBA DɖlM b{ѓ"YŢBR.:ZEfε 3XAz}dgX*(-NEW:P(?)]-L ,,l0O4],Vzlq+`q^Xg]mɑEi_/z6JL`7lYoke^H1K:ns 'r1 ʊ1cЇoN.(UcM4 B%' a/uBZ~q]Cƺ0:n(yD@ײXV+<!s2])+?mO>Ga*F+6Y/o4EՠX)L@6~H @:W?6(b*EMlH0Ae&RjzBFj}.9^At lY;*"\} т 9iєotA,'UsjZ4uk Xa)/'{l 3?>KYmP'*23+F!NT< `S2~IHA Kt0_Mr쯞rG ˪byvWy" C%^#bcFMdA'wynj(;PNvM,I/ D:/3JIkd =E l ØdM0@QK9E & rpEe,u}165Øi;TJmjߝO>: c}nwv:Oց_<ϫ74=gj{8_!ZRf X`o |Ţ_e)GJv{zHJ!)%Q HhF=5UO=S]u9MƬpf ‡=]wcV"Sv;E[L%wgCQ?[cݻt^- z{n?gwqxkz0(1qn}[_}nyDdG{^ߊ[8"q_zQ:auP\9U،܆bscAӏW|խhZrY2Ř(ˊS!B"o8@N}_WY%U|v%ZIdAg Er$+J0-Y%VCG P>EZGvaÅlėZx5MVyg_9sk7@, 1ȴB"YzhƁT\F.hRA܉X R־t'ⱖ{Ҹ++;)_/֌KL:g6\Uf"_]]\2geL~QΊ皸©f.f‹osrYgH-FU,3];1{|o-f>QXsV%ioǝ2U:2_YM_=䇫7ͣcr+ϧo;Bs_/S͒Mxws>ݠ YVC}gXkzZCkT߭[å&):R4+'=1ts` [ËC_}8fxsV1Z e^. 
a&=C"?l]+76NJ+ztB<ޞٴ4GKp})5?Ԕk5ӺkSǎ*I#ҧpPO"h-EfnFMʼneߨ/'ѿ9nR!FTFJf>:Ҝiy VyK0 ed@ U |7Si$L7j$uЗ78zD3 [8::~SilW_!`Dqۢ\ݵ2'fU"iR)i^zI'%Id)O$Q3D'\[+FLTkBk8'^F3ȝA;(k E98wBJO";*wY3s->=߮I, !=+őD셟7;DV$jR1J%]͢Y MÒ8uw2Į-cWs`슗eX 0V6PX"p5^U;v1S㐭׼P*z^]O"'Y0X$W5$+QŬ"y6:N]n{F18I$uWYW'}nW;Bu' ӧB :ǁbG7Z'+I7ܜ_\-kJR/R2تY!y`EhaQsrE 9e-b۬I9ZtvDpz a4lj)N$DC ~>>slϫ.6r(+ +5\PkzG"` ꕄZh3N0vO++Umw% ُ/(tbNt֭M㉚}jH=Xzl41>g#$R6qw||G;;_hdlJ2)lY$1_}FpWWuY'KVI!Ųaª}KtN)Yo~MTn/_&O.SX_jV<g_~˗I+Jc=_&%,E'fu=7lYsyo[8u ?^o@խcˍ$\i9ɷݺ-R^ +{wU{6F߭tj}u7\̯K}G&7s_hT{#ܻuln8<ߊ:O?]ӛ_)e>U]Òe|]޳ }DK[J_ :1wvXQ3*[rܹ~ngrs?/͈hY;/)r7hURB#Ow[|_y8V '?]?j%3`6ąl+Xh6#^WWD;({8U#@Q]H=1c8elȹy^xjNOg1ׂ61ͫ?~%bR([2KI#-Y3VMB$ݧ|.!ic2!46cEeUY=&]"he&Gɩy2Ph`n3lb$f$!M"A;PT@?`!f,4JIJDW'_0KUk=b"uQ9:/CM|_Yc}"*7-QsvZ.xm+L-$;7{Eժ"7,BZ$}.F)^eyj"eEk\q @]0Mg3^\wdv9BbRwIm2*VN kiG@ʖ’ - ).IE F3`Cp;-s?Jc4U &JBN ,tЩN%f2x)B Ca[IcQ֦h"EAa2pJmuŸEgMKa YE%d h*' ϊH8 d0! hŝ]Ka[T|PX.-m. v98l|@T q1$E f0aJ Ȃ@ B-u q]h"؞MvZfߕ49ac[-k^q`<f^x1u@KqY=P۪dpҫhR0 RnU L͇(yi+ vXmQ|QE (%(x$L䴬yE,CvLDJôL燍Uc4nh=ŋ ":t@2GM8ڂΎd P?`yrx*fE 8aN겒w2<}Xyv?<'B\ro9TX] M("f_[b =N x@6DK x1j"!]%bT15ʪ+@ @0EcY!G:&oC@EL3y20jxrI'#6Qv4F,pZnkT5>pP e#s4pqإ6FuF 39(k $P~2C4"+YƢnT\1xXeH!(r",U@k6֜ß?倱ڳ0M`U#42Ko^5JR\WoĽQEZz7j[ 3]$5$oYBeڋ!K-hL6b1GXiڎm64繨)8) #"篆GC{o>*c(uR"˥P4LT@=Bn%%AO p'r5쀱U߸YúEc4Bz_a+@1\ƒr:@b 94R&xyư2aNPʢHbQuT+F|RW=ATXD t.1#dnІE8zV29.XZ#e̍L #R5YH>9XM݅$Ơ~B1X>ܳFrqנ\U8/ڒL^5ʾawU@֨2 G`93#ae%3J z̀|Q = {!ֿ?H8Z{q~cvNQ}6DU˥71.ńE@苛PDCJ,9vR@0[`4bM(uiU~ޮHx;g&^m#:ًH)t akiU9LJXI-dn:)C<;d_N\9Δ۲ٍ͘jDo V"& 6¾&P3W&PָjV:EL tw9wmb~ֆ]kMeX,[boF9f$Ƈ߹<:_Sx_(JHTzeq*Nss)i_>>YOS{`)[˟wsS7oy7aopVF Yk\ YŜ,k[kp4`/c( `[UۂV? <}1WNXE!c}z7q i;޾*`n,8E߿}p{9 u8}0.[8@{u{g_};%XLEYRrL9U.OfB2f;ɔLXōՂ:s׶{N輪}i~_8:am>gHZ;n^+`rVh?)"6s~ Cw̗S%ѡ6\xī%^-jWKZxī%^-jWKZxī%^-jWKZxī%^-jWKZxī%^-jWKZxī%^-jWj%&MZ\^ :yK~Z% ~u>Z}q:t[sϳP9EESb-xeV@-QϳtwJޟh'<]D[>YG8>$-Vh`X*\Ƌ%\tV>Adؘ89~W}4,=DL+].>~Oo]6CDWhI¢՘f{(ٴǤL0C25dn#NyGkt,/iSJz7a2r9F"F iצm-Hu|,cOeW&g4+xv>_^} ߋn vVw>w6Tx$ ={hEEeZT겨{\E*s} ӫ%x<z;Ы0}A6J;odFe$o6|m֓<,wϛ:NԣKNXA=M~H@w%z4犓DIDIDIDIDIDIDIDIDIDIDIDIDIDIDIDIDIDIDIDIDIDIDI|=Z#&ݠ聹RX_D4$zFΌ.ѳI"yfH Ӵ<|Kzr#FPiKW!Nrv(*=O "sA#_bO95Z:,NO%!/wrl}JHir. z}qqW:džG^^^3,RbtokD۳-q텰Ĺ&5qsMk\8Ĺ&5qsMk\8Ĺ&5qsMk\8Ĺ&5qsMk\8Ĺ&5qsMk\8Ĺ&5qsMk\kĸY8^З]@E.f|b:2pNv%!!$3؋|GwSΖֿ/~*3iCn_E%?R:!o4 L $SVAF=u#:9z7 sh#|n ϼ:|^r;vsK8lH8Ar&IX_8xMr30#7&W#7kr3mc |r3^3fnӽv׋_GJJ;xT Rp&,VT=Gqm>K6dlx}H5٠J;\(0t* YʲȃsͩZAf͜k6DDU]" qz֗6JWۿ]w-_Xuۇ\5U2_0wA|VVQ"1ÜVUV[;i>{[?Mn{)F-B4[ZB"OzkHDHDHDHDHDHDHDHDHDHDHDHDHDHDHDHDHDHDHDHDHDHDHD{[ԃ[qYj\P;} tGL?]\Qe6cLʆL.ZۢӊvDz=1|X =Ejp}RO6eyETPjmSZӶ>{TI| јclF):93E%Ul~Alѵ>o,t6qjj ^7X(Mȁ½h{=QJ΃4{rUi8!͢zyzqlthƔoti jm\\lwË'{iyb0.?NV> \nTːXvtv^fKke۾ȟfW=L~ǖdG;O 1gM-풺lzuSCO?Vc6WG݈mݛOJfx?)|ϝi-7o9v.g;^3d;hJ/rjOiC;ڍN.BLJlOac2 GGG)|,P ?"^X6vpov7o_pt5oHF zex}iX~zbx)-^Gޤhcӣ]͛lK}(ArܙFsؿo~Jk}L>S~WrUB^&`&ms3R–30+(C{ګ &WPt,8`p2f׵yi2toa/Bɒ{u٤jA;js|Sv>uxi+&){۳څj[ꮒhS ;_⎕j|uNɸ 1T߮y!tk/ 넅K9FR׍wv7vJ$~Q6eep|f YY{ xj;2^+O{4MM߫cSLr+fBI!=Y7w1̄):IHvy}KhD5s5q̙Ld~Z#Z3YS-Dp?Mɧ{7 >a5k N/剣GLKj%ui(S @$ٻɫlYX:+6T'(taڛAj$|[ 9)R0 NJϞFL9$J$aZ |L#_5k5޸oek[3R?K"=>%C0knz,nR}9TBҍ/Ub+lNG8N ((?&V(q!W,CC 6Rk xk %.ąT]Bm+qъ=Qh_f2H8 0PwFxAC%Rڗ0>)t#a977g=y>``Q,0q@t: DJՂH1QZ]&k3^UtBRnSW~%5h]v ڻ}ַ^]h><5+f(*7mbbS $=po!M&z{Gg)ʒeH AqMv 9zr!iJ#hQ%&&|9.ɜ>WEl1N1 W6쪑F[ i,:f7gUncq_pd"m_F<ǹ@튉t[G_:y&R2ph<`Y"'C" a1TݷY2Y\2隘X0"  ud@u9wqdE1nFjDUY#^#q%LJPzqQΣY#VKPkܥ biA)xdFt3f@6NZI.#(dzX #$-HAUֈȹ_#^%zԞ⤫ Mj\^4bW6dw ey騔/=ě@2`'M&kŧЋۢqǶ > ٴ[G |q9iEF\t~cvqX_&`3ҧUBn(l0ߠWI C%b;>QxG;B(=r]Nh1nc\YuD zR\7e.r[^ţEd0ѻ653g-砧LhRk{W= a2IT0IJĆV0*BcFL Y ^c:S^쫭θ'̒lzCZZόmSK`FNqqy+K„,Drj22G&bL©R'4Q𠢯g! 
O!FMR&1' H_\5rInG|MY)ja&ʗBI?r _i.;)n˜$:qY_=PI2@8e/-0ÃO:[Jv1,BFV6f$==5`yї_: )!VʓE6Z*W@Q8Ɖԛ&r:8&i/qlS"O9kY+W&1ӛ)V0*3εtt1&3(gFyIN,n'ѰE|V_ O*V%+M2g$1=Al9sErbcMp>qR»ݕHGkP3?f6F>zX)4MZCbͽ⨢8Kh5(SG3u=^gN ! f1cR347@XLB[#mGh8KMiCzkʂUcMOT>1uPFXLUin˂ɤqBVɗtBHlXe۪8dS鐕C~NЋIX%M{Iӥn}u֐((]@Bl*>]diV6邬J@V:ۻR sF'|IJ4xnR@@0Zjj=ibxŃCosQ ɚm3 !]hd[ACƐ 2^YNvE2DB cJ\%)sFyHS{)u21 6tq"]fϷ,'{Kˍ69|Vcq20[o xME-JU(d*`#&_u:0o|Kv-ڮ%Ixde y UNH[p Hl2d2w *-C 4.bza!kED9T a9Qy(+uW cJ;dѳWa8cl;ýxУXd99'lSΩ/h!zA+H>=z!ec'˓ `䥶iHR).+,^)YFip=.{ k=.b8SW\'+t>8icX.*09Ős`&ߧ?ҝi}O/<ȓL5}v%0PSLp%l\]Lұ]NI 撚kfְ8+9ƒTڅi l!+Za0k3pQ97 ' p0NncbHfD+Cimu5١FHVhֺƿ,No,R=õ)r4z0cezu{qv1$fN11h]킣Xmyypز򙔾{eӈ%-fYޑ 4>iu`YŲGWBOϗY[=kwf]ztRYS'#ebM,/&x.L]T|i?^I(5COuɴN*NrA25:9;n1DwiB+nAן}xyw/>ͯ?x>{¿s9Oݥ$Gˣ|6]injiS+Yg μ-/yKcB,/۸_΍?H7&POZhN(%=S35 Z)Ei )yE:@K{:7]pǖy>sz>\KD @DD|󤇎ޕ\?ImÀ8z`f bL IYU7I[ɢEyڀ%}<ްSİsR>h+Ǽ :{.mU0 bQ]B z 6٨% ׉'Z:Ü˄3n5G#1J:s$3-O1琣wCi GcBM܃ ڨ<A8 =1ٹP\oK%T˙¿nam U0fx_> oGhYU?9|޵:z?ߎ:K&%AZg4x;,\gh ΖڅNѵԃ=z"H%h Yt<j dWO^PDQ @h TZbk/E%s]95S}/*:瀳(5 ƏR(TB8K{E)Q)FXBU'k4OYN W>73i$HϦy=H맂Ӷv8{DPO iqGӶckT^qZ9YU;TsFi 8Hw$'WܑPqGz}HeB23fɘh\XVV.%eKuIk#6 HJ 4W{V [Yug9tơO#& Lk2NgϗS3LǾ2cʌ[LHwG1beyy%14xt:'_~F%Y+}-> >;p`km7M3^1.{gޗF?*UM˚T1, E@0| b.<%PI5wx)uZ~9Yd)˕^ m9zAߵu1Ռ O10!a+kZde>].NѼ_ΰ"}?0Lyq%5SauPŋ<ȆWwQ9Êf`aTM~lDFNHDPHGe:ggLO> v4'<"?RZzA $KC2&Ǡ @Q W8g{mV|݇wro/ms%377~< 'xwUSB)*i@n|!a.^km.hϻ_~{`awl& =I$ŭnsM>>nCy}["J;FZ+j͕uEС7/ɼhNܼOμhNȼhN¼#'JҦS4*GnT#a xARƸ4I5N䘍Ԙx G* ԀUzl3:pLr)+x*l(֝Ɖ9_>%sL#`e<\~ܨ=ӳ&Mɤ7q0qMWnbz[2f/vwZߤ'ewKF9YT[g "{"k^ Yz{HaptBx^[MKW67cmT+'\Tp5cYmh$osmv=ݰ4AX[HԎ\A&[|mV:N_gX$9R΅lt@({ƴEKEpJ&fZ>罥BehuyKY)* ugЁGY)N$/^b;$ō*MAp)'^ A$Í"]ERHĨ䀕іBFyIdx#4@D3.QTyĄҒ]lߨnz0$\S,w1:+zɽ]&t#[n0owYWEږ7eYz<]mecT $"V l8CE欐JDUÞu,)=>uʘ8ezsH,:&%4D#1)%dR9clJ1^X2兼0/=/<*/\(mv.卦au< Ƿ7UaMMB.iȝwFϵf#AyS>$3e,7< ,q&WhfN(cU)@"AJug;Q\1OAJx F*QzW`(+Mȓ).*sXY[H%f#.C2Tk;\%n?߾ϯ//o/^e\5uAꬳ .tP<7?|FӶTޢinM ^r ߥ]Ur~FлWa!1~@K/ܤ#^G&6IwF8jt5pA X J$1*8 A9PdR.T.&%kԻ^^Pw󿧤uX^xt? NYO8 9,hKX"'.")nO&MŲ{GAoJyĝVNg{NMpU4>Lܬ(vz/6R(OvH_x2KS?\\sѓ;m9|k"YMbcu5]a|3 CP ǙF ˧- 8q>?ջ]_GGQ|}ߤD8HkwGn0[t/,*%x'vjqYߕ]5>H&ɨ|-qiA:6~Nlf%Ogr 8fSlb`Y[cėZkd=2?Y?$(-9Z@J mQ)% tH/j~2`)kLI 1q]6xL,h5T"=Ѩ)l~ΞSԘd:tLOܹO'Ӌn_>v.bj9U{uy>oym݊dpHK@P0b+܎F::Yp`"9l.G4R*`u4[mMr=)Z U4Qwv3x|6ei^{.bNcWTdYY=yE(Jx,Jwo|r}}UEp bIQ-n)qQjU =pX^'J{&j|x!>GCEqҷؖ/v m5U_,,qSE &%0\ a"-B { bPb$O|9|Jin vo3h+a ג7wPm1,h_īFY#6lk}c[/N"X Piq h"ʙ3V-1hrX"3)fU`Cp,rge}6'j%'u<3`(UN+?)w춫J^"_.R() tDOµ["yS*$b,*f$9 hԻ("͉Iq$>:8oZ%6*g:T&\%roh@ 3kVJV$.Ʋe,I+ƙ4BTrh5Rpst :qENpqo9ڲProCJQ*9B2+N,F$8NFl>vJwUJsYCO?h?-,.z(\1Z" E*"% (' TS/'UtTQWҞIu*! 
^*Щ*"WT3l,[{IKLWT)8/HH6urUU>-*bbBKV'"8LG Y_9TXa[TQ' sܚ(#E^]$Ag1Th;Μ *.H\?2pf #$ IA2' pZST@tRPX5!R:(MmyU(*ea)6 7'aQ'& G/~VeH8B^A~TLFCX޴<0(4ȶy=7I;9)!R KJ)It`)}:$qZj1A޴GQgWl-:w HnL={9YYڛcw$ըD^r/$554& h"BIFЪE߾:.l!?w&E}RsP1hxϢ1:&%Nk$ F$ GHTs LE =ͬUF:vvSS 1pߢh_N>\K)b7>6}2dr4<\Ւd!Z؈bYp&t"8KW9XА( A(R@uGy-$*DPҀk:DhD!p) xI])[69b\z8*&3 1&(ř@L PU s0Z_:p vjT po6~5% ݭb{7Y76 C^-bP R1zKۥxE}]mu7 H_xoע˦9 > S#ix.ch"\N`D@|1՜Dg*J窲)ulJT hj'MHC^"M$$X"OI J}nH6F YaQ]-,E*^^D٢D-ƀ!q82uIeL%@t3svHlalC  &䂱z¦(3 Y;[4a8 H}K$MZ_ߤFcZGQɢƐwAgg>)Ԍ2wD@6LL OLr=G yPH L|Kt**dEN@ %ݬM-3KgF/^!`eӱ6*C"H;B} x1-F~B~hh?s  ^xq҇/yg_9KA&E {B`'M L$p`hѱg!zme`]tg`z`oyWx59'jxv\Oǧ]`yYW9Ebz3GWg(#oYK#ޟosfU:ΧGګ.x|>:x{8,.緬CGČ~nmNt^~Wsrh4H뷷$o-+feVݛbaFyIݟ~0;~v^O񝝞z|Zce˅*OZ} '8Fy`kj djryYӲᅡWWX[^M;q2~!8\Ƀˏd<=D>x#$RХV-r۾Ypx elýuKb^^kjI|^ pΎpWExt{;Nv67%-B&h9,h2zyTGө1pǚx߷;su,}^j2Y-_o=CNfQQ f gtc?iG.<86%|W0?fxq/]۩jojx_Dᤔb[$ӥC,sޔmR1iڴ/LK٣Vt󺖼PK^3Ϟzľup.דpIoN'%\N]'&iQ_, ٰA[UlQյYٜMKy͖>K|4 >6kl 1_,dc0 !# kTV `=ҒFb*fd^#Xg{dF/Պ|cDLI[ֱ%dd3N` I4EbA2*r:'FcxN0[l^/g}_@Y]XW??06B?S-s窅~xqhMo0 ~hx+]A_4ыQ{[GcOT-}dLTBNfA9HƝSQ1vrH+6"˾:)_NUV_t%?w/jg Jic6lVd$+Ccֆ X{Uꚧ, a~Ati仆 O6Chsu$`֞^+60CcF8X-w\!B;DHQVϖ Dv,@ lYrԢ$Ȣ5eBbΆ\"R&abA.ٜԑ(->L*`L flQ) }%Ik,{~f3ƧKE|HVl>{dr쭣籎Cx VQw9>=φ UTP2411j(v{腶;kwh0ߏp>JCN+ zFhs.8 ۈꝩW T; TѤT۩`͐O} ޮP32r Y )$=H-bȊYł,h5wVxWT[J%c(uY˄*D&Җ.#]9Z{vy\PN,u!|g͇#cMQyٻI\sQVxA/]Xg2* $u:')2:%cJ碥baⒷK*#e_AdM`ajy*Ͱg슅1 FvUxu]- _n9fkv3K??a|v#0A) +( I!F!lʰiI昊*}[[V:=Qu0+cTZUԎDj2[a߮vS& $*"oyd\}Afѣvo~xtE'p1{!2E-KL^{q:e(iÍ`ھ-4!e( {I0d)R>96\ljx،sydǮh#G9>:HFf M,щ,VB 2%Kb(3M1x)þ1YSg;:QLdrL.ڏ=ie@#HTL1"6#qD|wF#lfb͸dW\tqOEA/5gEv\`(.efq)pq_w  vgn|4E?g?>S>ُ*auqZ[?u&pd/ߵQ]!8C"10$.Q O|>!kጏedXt2;94JтTJ[_V>*dxQY\b6hl< `tim@"gQ.SRu1WKm~_?ܫ0׹,Xj-Lt9^a3Y7bz{UW U\%?RdzfhtD! E$x@}:M.Wa~oغ}1ve/ꙉxj벓E)t! \J fk`"fR&;IBJ&c[R@<1W5H>!b)!Zل- O_7#g|н1'WheG_ S!wzS;w?&mXd,°Tyfi /TB[! (Q3 R`7Hk;pNC{v}6eO{9;2"x^ca F`:tVcU*K}ƥk( jPZ $32Ioj&UAuf=^SK1.tc9kFΖr퓘݋I, D\ ()mC-9+F8" ʙYd"N.kNc/-Eׯ@N?3;&4A$@:@ˌN L.TqRa9k̈́zSSMt5OZ ~=`#{i>3"sIK53 _zh{5bO{in} &MN$PJRBɊ\dMP% 3ީ^M70mS*|m;+MQE M&4ZSjѮXsq .+cl4]C7׳X6')մϜ^.ݘ{5d'XՋ}=9=X\Y|T$9'ba7GXdŧ@hGCme$="sy5^)ae_,~޽\$^AK`l X"O2K8 ccVMJrfԨ;ºUFGT/R$QW$ f:.EjA!DmKz݌SK5mK18kb-{MAPJD'Jj@b.#VvEnuNg")UUa ڐ2* \-~ج%_l #LiUݡc@tQnH)] &xS*4%F"1RO0D:?|jX~\ _{y, 9~8wof53c>0ҳ3P2BI!PRC"ngW 4 w˟ BÜURjy Iɭ.Ċ|r~^̦r*Ru ]:nbxgY~Zuu||[84TZZiP)%׿<nmJf]qV[M5:1p2sOkx?va̿뫳ŁW=İh1O{ޮH@>>}uhbJaJ޿ŦeԆe@(/6.,oY3'F]|2,7zvƛU޸MnWk[:ee~2M3Ư2#y(y]}e_]3 qj}2oߎRmx|)łkK%T8΃dz_$Ib_zؒ2##/G7m,nÎ!iqLjod(,,Z% XYEQq|GmCoX|&Zf KUD̗8/⌘7>}qO9ӓ~t\f):j{ 5!gC䷯ڶ54Zbhmzی\rø? pkbC1_S0W[Hh$d'c'% S㒅d Lnڄ IM 6P{wKeLCC_Ypz}́R}Pq)iAUQXHDDN+: }m&H.;>L΢KXՇz@SZ?=wTLm ߭]:^^gt/]WU&Tʻ@* {7't~On,=(SG`%A5\+z#CZ0EiJ8e6" ? 4AbLYpcrYDgT`8HD3$m~6-W/IbVOO|>~XLe׋b7XVv-NŒ((' BHGMv"NݢdwV%˹q*L28lld BLk(`kӹ7'Ǘ,7ei=Z:o)͛޲| R#*w)i/>|8g d;KЩ¬Y幑py9w=%)P*gkV܁ e9YB|fj~!u-؍w &} _>khA*2ȦJ 0ZPJlyŁYak‘ %vcޒM۠͒䑓s)&ͪRAISٵM7aȄ {U_7]h9if$Y*/hSb9ԗ  %3љ] 2M〹J9)e&&1HwS/E_+iyN"%)1uV*u\u)JD%`2LcXtAVDzcx9YP^2bXGcJy9uKXQ-3HkZ"wY +$Zpj[%uTBe+?X,`C}`nW*sn}&Kg>kx6J=s]z@,{ +^LP؝,[tl)&vc3ܗ3casY# {fuІoNxfV Q֝Uu1Wu9@}{NHO=-4WضFFGVx춲!-`VËYiZR8˳6rtu! 0FbC2*0X{@,~˚v:{e')K9BSZ`€PVG$g( V{WKLtw3U ;]+:1w̘i\^M2/f_vZs1jJ}/ÿr{<",J>-Bɶ>Mp4oHxd0u'׫ܸ>uϦ4߄}&㴯iyojn-LݪF3:-cn7?dVX$tj1S]_-ݲnIMoPp̍g+[y,|K3thɫ_󄶫yA)3{lQ<>$4\MraXM.O?njlOv˧Mq}E)\6 كzUރȥ6/Kݦ[~ċp`&u2զ{V)K&xebK9 J;2#qeXЕv:gX1<+9 ;I`ȕ!m5L'3ocs.&48^hS2Id}22f&rDNi6Ev)NNʤ_95vGxh"P }*7QQ_lv +dً|tYh&°Z 423S{msO{>;;m6Ȕ3dAiwkW eYdOz 9?Y>*D)9&4 '*2g'z Qo'v#I#qdHZ ̚Ĩ1V¿IjZ^p"8ivuѕ,hkY~{@6j:)Ub97k&4 Y.!  h{.3HW-zDhnW~}AO|Ȯ-32+{dxzy gD ,ɞ'ЎYVw+0VŧpGk+wC|NЫiQk_DT-▛Ǜ_. 
u`-p轉M1(#ӛPe(+Rd5sIg'^#HsCB*uai3q0t)"V`ERgw yi0&H (:/%`J57O%JK0vZ2lJ uD KR)_ʹUmeH0 cM[G~e+ԧ V8Lb\H\`*\& /\rN;{XAU*S]"L o|-س [OL?w{1/8q6'dim:9a;_~ 0Z|t]0Ōd4\5MRc/gnb<Z0r\%.?kS ov3~w3dc(Τs8=?m$}!ey5+OIBY|2&f_iEK4}3,ig[ʽ9RDNtIV 7}Gˬ['t00bbTa'@OlTn?3sEfkљ5'^[)//H3ukVMMrar~^wmH_aUn(}^\%+$pl~!% ɡgN]4q]c%>8S͈Bԇ5唑=7nFoOWV:M:jm{|u%zgf[ufOam).u P>e~9_8m$X.aW$tyl?͌CM+_*uRꬑ$(?Ŏh pz6by ? B Hzr3^O3Qzyh =}.K~\!!=Un鰼:x&??tkW)d ŋYԒmMlwTE]kAeOwknѬ(z0Y2 HrV"|eVlOgY:NYIS(ENˉ+X/ٔN|!ByivhRyE<~ޛGz9Lfӻqa/aM85۫Ofv}?ίS_Gt>w?%ޢKR*/| XB`9>.xw~/W2{k/gA}ڣGGqz;O2OqHWRb[qV0`Sq%8OKDZk2J ~P lR^v5M'|(zA ElmQ9X*k[qxD#=UsQ|y.g'tt< o֫Hx\6XуR`Ə<1n}7fC̵b8.`"WA U '̥c#ee[i&nTVjJT3`nuQ#U U0joQ#A$k45`&&[~ TԚF09T(PƐf<)1R*5c5 %WӸVڊ!\lĦ lOb6Lhg~\\)09ڷW!@IyKܽ/-C}j@z0:җ}~7I%`{YF #ֺgٻ㗡vS).u|'fͩ>]㟿/j\AlpT1GSt&̙<3vbnr)%ZC'*_lQM⏧ ѥ3%W$&ŠI ȧ音&d3{g-9aR˗l|Tqp#u=Aqf&7F޶d1SFN-mؚFн=>[ܥ$Hcџ&:0_08"}=y~Gˈ4f X<3*! +wmzmCM%L$CZ?=R˶nsWlD dq?~!̋@k4Bl\j=ӥǏY lBĨ{@1 -?_fy(eb]9bBvw N%$e]~q W=yHfz- l06Ȳ<  ߜ:+-MJOg .]]qG^WTM&I6rܘq|ӴdK9z1P}X]RR!HSW}\KI2^x @>=vx~E_8Dzu_#!+|qǙK;V:0uz n;Q uK\ֱ=,qZ48x9ݝ8]8%@:Z}V{Jc`$Jn\ #S>CoGf`dّjYH :+ΧXb Z3[#M6cH%/8ut#QLG12#r(cwiyns V94!'Ve xZDD5QȜhc9ޓ8(3!L2K8ɪ2EW9s0K.2=MŒS"$Nw/&|V#"=s~A.f9|*n/–P_@GMrqWԔP^D1Š+w62"L/4BۃvpҿCqN:MG 6%Ƹ͵°H 9tu kHEwVx~kKUI%ԭ/Tw[.r8B.N*DƤℋW)A0Kn $e K B#D1H",gvIDOrA:8VC$kuϿm^z|)(v`>%$}tt)MR L1wŒ5ƤWs<`OQJ = apFH@R~T8X8rC Za((8߫eh.r,0jñ 'jvGo3¨|:VkztezLQ2=*_.ugƀQ`R#:$՚`X27[FʹL5릓ל +2grz>p{(Qۣ2.+2&Ֆ@ * IωH$VJ40M'jM'rHEyPdBt=O$rb\^}_5!%>v1,tH񣷋?Ƨs@tMBOL&?x]P{XĘWO {X%c+g,1odK2^U&/Ak\!-YDY!Dײj y} ~}Twwf,y M5o Ǐ>JX k*=>Lu렒Db0kդwL( ynla4w)B fAIN30 h7*HDyMf/k1`7*$㨸v1 d P' 1 PkI8Q˝Pq w+sl~,Jlf3zrQ%k\sL >L3e:4CfQsZ; 88/f[~Vy,llqj8l;o9 l-SLy.*r罋 <7WU+yKf9H:S Åip}w Pv49\WVKp^2e;a:EwҸOS~t= VRk-8LjW,ZffhglޮvSƋI3\̞Anpx?L~._i$'~XB@ە^њnj״/3|婇wEJ~_`5wc_7Hc%YjCJ۽K/I+ M,#|P_㷵es~~L[LM\f&}c;@_2Gt!gf0ݾ6m+X{Lj)4Dj 5ۢDdwT/mX'4J1^| E X@jsJ@KbnxPSVz 8XL'+-W9%\zNrk}|êPN,H2rFӘ7d`rWU-1?L>gBMƥw畩I_FlF+q$ )79ai)X@Ƽ2x5:_9qH b[㇟NL,n-}Ph$2͵FZ#T`BX•j+0 AIMٛT+Q@4|Kl0X'.V'~6MF-J, Zׯ`_ GokH+nnG拳q 4npϔ\ @o_spsM|rő87=+ ϵ$ՉwKWF:릩"`ݚ EuQǺwAQz kx֭ h#uS%yB֔)t:QVf/4պ!q=Z$N֪.KCbBHg5֤[V.*/W?I>H?{G9Y$d}{=<om;+Y. rfl4f{$`GjuW}EWu OEܩ߭N:&*+o~Xwʏ|zݚ?\\~6wWӗ){utp떘׳CogT%^mn>fveT&ȀrԥfJ.ȍ=&FR~q@FFRl3cݥZ'Y[#ޫ#JG*vv~^]맛Eοy_1`&D&..2IwB Wa$jY*6pQFzy2)Eٵ |e Td扢HDeB@`Dͱ-q5TNxpq&. 
680%F92~ mBSuǓ&H)o>ʯo2Ӂo c13րz,)tjFq\Rz۝-gn[ҞW@^٨)C!d~g \ ,B > y B 9yCB %wvrvU0asoi%)3qZ-l M8m$8@P@<|g' ^9ٵ楓,]:)K Kp[,(ϙiCZR"qfD 5ķBܨ$OqNmx.*{"uZG;+3!X9 jPZgb/D_b0F$d!"P(^\p.ԲĊ 8t(AUwB-Z#g3 -tU<\1Aj!6$HT>St,.4dcԂQD ܥe(:)2*liȸ3)hb9^TaѱwKuԚ*CLNA.Uђogtu]rM䠚,F b<< ;gt  qX8)hif-ma2ƏB$b0҉xx?l?5gjij=m*hYxAo[v:XqGh]>yVwWv[q;F+{ljo>BP&:@ͼ)c!ewOכnj]%.+WW)ߒ*R+p2{"!m:w:5/@:6Ņ>Y 2i]V߼_UDEr%YO\(;ʚG-%29JviWB姭s״u;:)w(A)G#7gC>4 2}C 7h#vި^=umnA4'Pρ&v}8i+.QAE] ܇(c}ˑ1S;i*т{̠{]%Y"#s8ϐ4` < |lrm})zjYwoM+VL'X>: 7m.nH-WxnH hȬ fó2;aHFuILzzrkc=,JAf>c #t @aeG_5z3IYF1qHU\$UqQl4\&:bHuT},ZHR^p'b*?6*Y)ՂU!@ Ejify(Reu9^f}U?_UucL_ Zx?JdcL$ h4<-biI.o, щ9I IxVYq!y2.3 f7Φ1;W#+φ4,%cx^V΄Ϫ \*U{ZUV _DO|^)^Lgo?5JVy.,"UL`OEڔ3tz=Av:yPp!5+X^hfz+ӧc&֒Dn ?An&[==/dt=-Qpzbck;V`z(ߩ?>EZ qp`XJmCn aԐ7.[gzh69:ݤ!PFrIv8UAދT!V0)*V;fmRH:J^?)4{J+8]*RpAF@8VP<*@_>RfAyμPN 朳X#U+&F$\DYIBqM[~"lZ#%L2VK-k[N) 6"3޴HQP81"~*M1/"u nzMT|Ig-RqĊ8_BJ-l2`%`hXO\I=~=_mq)CSdtޢJRfx)l]KPh9 h#9 c"nN8w?H-=bR rXJчXp<ٴ6`yI,8·hɘR&ihFVA)F6_l)݊nc|)-N>n\Sn ons"ybEs#;ǔZf L#ZNKe Y9l˹Y,lHH :@!Vd.-C9P"TvyG2qL]Uo+KQIMGf?;{< ) z5pAq)U+*(QUTOV?c B '1V߮.;F\f=|vݱjmIjA~>ҳWOOnbUoQ#D1~ge/pTS?b$'S-;cAP+>JxDw⟬ f{J𠨋EHKcZK3{)8*8@=qGMgzAf]O1_ @DwSA5*}\o1'cH\)Gt4﷦evE 3X,S_ȕ&c7ggws|Mp7ׁg^D<DkoQxXI'KX5i{ gD}@TCXhS C Wt>P`TH+<=$k .4ܢmҺz qe|0 NUjAБ{QN=~g GnK§$==/*m'.6q"wM: sF{lԮ=su+%n\{;eZz5G1;bLIp1\N-F xe$$+_J)o,o׶Txjj/45vNF-;G[U^\Ticd2n6-̏e\/D(dS>{޶Ѫb ˱Zg5{uR؁͓>8)Yygeo5O&ޤLyBrɜau ޴BtU", `|ЭP^H&eКNqG):UMy ݱb4aEbUE`,K!h5NZǘi!(m6xq4\migJ2DFG3]ޒVmG(A( s VsmWK\~X":1o ~/R~ox{>65xo$>am?.;*o%,C[1/\_2duPT~/ܬO⫒^_'p"w,S}9+pS`Mife:e ʡY tՕmח_H:;ϚNqΤ8!#[յe:Ug'YՄ@ٞ/>#!QWMUo%AV($[8W,F9pBϖ'TW  +uLߧLu%*jgJM\ >_6gke2ZWyV7Jvw\Zy:d5*)z;˘Q4bsY6*>&1;ZGtLyeB.NEzVݽٜ:bDx,ya:Eu%ҴRS&|(ryFޒ˞(E?ޕ5q$鿂<+`v-yccg׎ne&uA$M@ef5qInlZ2qTg}WUe~|jK61Ņ?^k2VzU1j4}u"N-|v3ߗ &/ț o򂼩/q|?x uZD>Ѡwm :0A 88 ~yS1y=x{rV7W >z^+Xm_]>gq38~ ŭuFWL\ɗN(i["EG8pȨYH1V0:"64?}C5ձ k9,jR^7`-,|KIjMuW38Xǘ.zj$XCmaN2Od-,N+z:VTZ\Zt'~y6s'ẂKbC.]Iq<9./Dw ᔷH/_cۥ ;Q+%wi:Es&1D`6iDBߞ&҈T[!Kx *20ɹ ^5ѫjn=wwəW_Ϧ[\aq_cYayv<=~^,ǵ ?/^1<_XaFG\J3fQ5;`ŲJ} -GOrhon)¬@y3O,MYEɢ5;o-"T'{WZI6%{zz-:Ey;uNmQ91%wrs:WKgimro**bx筀k^\}ܹLi3o!3iWCfطP 1\DkOҲvSխ+OcX-א\XkDFN#h] sUr+՚< t+G$w"#?XU>ϏN%NFfZm'nB^󀲲O^;:?@W\zaqVcAGh+8UK4[[ JD3hy_S[Fڭ ELA֩zڭڭ-%M1[aڭ EtM ;06 MmtJ,N?Lapp[N.>8& F=˫ؽX#&Ii&iͯiQ>Ԯ J4Ó,˯]ly0OϞ6@[YF~~ZY=Y0{Nw#Ma Kj]=Œ3XRȒicKm]'LBײH:g8-yfL5Y'RT(5+-sg.yW>N& ن;_oքf&ɮ&:e҇9-fٟ=KB"֨^ezW'SP3=kdsZ.W!ZRwOt+KqUR܂`C^,C3RYe>+=V o~! kIlv]ae IOSm bwm%[a=H,wlz iǫ/:k%Cg5M6*xfV[PU{"'5auӠ"hR51@z!gr :ZN7UQU3l|Dau (YߢaBvQLs2.F>Qowvia﫨}hrs^K~8ކر?{.;3 Bg9Yr{j r-@ "YBdO_*҂uz{a@ic%KpAl(P$皙IIɰ>U+H5 !mNe%O)7 l`bk!IRntnbj,ɞ/lԀ/x7le6H;S`N;-Ӝ-UEgVLebdjGCy?!]s"gg/VѺBL]Y;PjO;/Ԝߧ3$9^vǍ"P8XKBA "Xڹ3Υɧsۆ0 3C Ѡ:b&jne?„i1r9f$kxe5}ňb9^Hx%X Q.%o~Oe7|I-s{xiA袕Q,c&p%A6PaB%yR<ŭxw旬*%} r3I%$:Cl 9R; Go #Jj/ 0fNt6_rԇ\]<{dˆ$*]&*ԫ4cev b UGnF+Z9$4W!dQaR{@(GQڔQ<*Qo tKPOf GMRBDCTtʤf2i&!QKsƵᵔV8 ^ܺ;wMnn͍'W󾮋O!PA?-Gy*ʘ1^ZUcPi?g6/1>]Í3~0` ߥg@uSNg?/LJ;|&pm̠bKtƠ8|3SK ʘwziPFPuւʕEWZ,:axDCm-n-0 T VBeR*2o,&!bƛAAJ9**(@·c",m ?CXR p7&7gLBt'YfCnoJ5>ֻZ6+Z9o *w'm)!@Uq۠Jr0"CIQ7ۋJW,F {8Э%cQ JͲ@u`e* VUES4߹+UsЂ=*}2Uϳ OF|aUW}K<WT {8z*)( ?oWy#0.b o!$ ]8x%[r O^fH ErPl7ISYI p+Ps62.dx3C֭̔-Pܒq}<>R00_50eORRmPqEźB;HڗR\YF$ៈ7.AƆ2sxgIj zW\[W\P Mfh$iD_.z"TNQ 8(R(F3#ptSfA#Oy2uj'B9yMFYs >gjjb:2ѻd(HeĐ?DE[ 1C=cދu>ΟJ3!)J"\A:N!L~10cBt$'"XL \C*%RR44.FoޑZ:q̘%I Dj X+Qb<5I{Լ$ "J]˲{kgYbDmNsODF8Մ9 25i"5*L|JqmRGD H(5bUȒLRoA!(o±WᴴfbǖL׹E(TH}/-dR[Cl+b *[sĀ)-FY&1 ҈cKrVc#ۈ!] YMbl Af E͌BGi}Q۫W?Sf(׆O[=?ߐ^q?'Gi+[(zq\nѧmf}fCuCDniV>KM JNxuc~֞` F7\RW}u$M>_tj09C! 
m$bC;fY>Q4/cY;Z!Cj&tQs/#Y 7ưzsuP4J68,ڜ';v!`/Jm'x5T"?4Od4}źI]Mћ,7YFo2?Q 2 BaU^(Pf!z tHF/y6o*.\?FDXh*"- VOCqˬz`vk4ơW2dD ̽}-mI3F$1Ž)uFj* EgZzB$Hn+HV& W?KY!l$$3NEPЊ^yQa[1e\/*.׷ PCA8GuGA†;U²^Uaz-%M)*t).ً7Q{'jp R>b:G{PSN2HvPtNY,j)o%''2֩| r!&dH6eAz`w=@!*YVK̿oP)f\B[J''#qxNLqW)e e Z-r`R|wc*99Ҙ 5TDy!4Nzc(A,Taoo6qbcAnoݝ->“9}-2}^n.Ұ_5cn`<_Ġ-9%K{MeAKg j_YUi ggiO|[a G#@Ja`i%LzDx}*wF5Z5x`bD1qgx#E4M(FKx¤!L7(5zwh;I/R}qm ?*n*cW3Y/ vN|&-LP#_Ftr9*Ճn8mz4)7o/hX];) uZ{xx]|&:;oh5Y@$Foʓl xb.'N[ٛuk(M_X9 Ƿ~<E Zc K*\TYe{KʕhAn˔h^Jh;4tIj=j zKSG̿h^k)-^!BBZhkLOy], d2&b뤣jWɺAVpB ez@q&|{;*8>yW%/g% ATtָ'2WIgM'pvwְЎs óf=uDZj+84rߴ2ip"LBHЯΣb>(b*7W.z0- wbԲN|9ϳ =⿅sy{E^"2웋ZL o2je,6#urt1O,+h>YL\H\q}yd-j]ƥ~^_xmTo/PPi!<XYYU!*V_x<QkBoEvy_~4_E\A Z$I2\ukt7$,>7jfaqW%хMA$ľ cJ +>OѲzn6-d&Pd!볦XɎzWKvzw7/l9yfJfןiEg;~auS26Q8$틓Ro?? :&T !*s0ƅbDee)baCجhRیԂ =8eTa3mP+!=ٓ64qtVgM6+\Np_qqqr1TpGL5UTksP ܑenj]OfKiyadHZ+ w&$(*g'LFifAsC 1_MaXĈNk[Y1RHánLT%]0k:Y:A JgBvj5xIݞWYUL hP1zT߀ !*4[z]1e&><|r:)+ R c.J.U\Ş5O#LڹVnW$0EvHvӉ썜\ 5,ugG"g ըTq8^pԬM+$m%фC9gԃ2с8]ѯq*0wïw/sҕUT0*&ϝ; PKTXj H3cAdRzvnnBʃ};4$̱Vla#}Md>ĩ8ehk"И. nT7{Ψңł,(P%0 y%=k<]TDBoz8$ *Һi}RW=PDovisЎ (Q}a8 &8+):@i}zVi[h6퓯n>.:yDx4?EGaF3aQ18k ZfdgfCu; 7{Q9GNy/N9Hbw\d3= '|oi($y_(yAhҎsJ`Ѯ&v!-*_ꂸ8D*{ԄeS;mz l#G 0t@5'Afh(Qbg7M5 lP;Hb8-s:5Xoصm[h`*QM]iVЫ5J 8D>cgR8=uv!Ff?/9z'/$Vi4ua+F,U8 md4 ult. !ww:ԆDn<1s8z@+R( /s pDh V1;\3ѨwXQԢL~U&H_AK!HO=)2Es Rƶ+WQQ@Eˀ sxWk<(F=pN{$2J`wGjw[D5{E?i~q򻋄 H{) ~9pZ󑱟! p|n6 C5=iOYC͈J}^?tI'ZuWpbl $9\2 VRu=I:!,t5J7pe *4\ .M8mW!i!X ݖ{>>0VLdVd1aUfY >]cF&O0KȢT"+ܥ??0Iȅ3B&M%PΰH .k\j]Tbym/&Pb3eʘy&Li+:k̈RT%a}uh152|L5]6Y ͻ \QY@S2w5p&?%$NɀՑ9D~x_pJP_Q*+1Ϸ<4I5ĢtPVupd7NNku\(:r58lx:*ƅbk@q ׌D{![0 jyӃuQ`M),(yƲ kk֤5ckY/>`mxjʁTQ#D5t =kH3Z%bۮ0-s t-!0kFKcū4V+q,!i,9T )Y&Zej]oX)k{ 5 )"icOι\rGxP#խsTmq4&VqF=+O?ׂLEnTf6t:K,GO>3у0XJ=5X#[wɿض'׈5CoqU̥˘vJonU1ΕXVڳVw7Uٖ2ز֜pBwJX^jLCAX,Co<yp貔mC)"H.M]R]46]_ZGrEr1x&ZrB+:rn2S-5G4l?fp'VG%x=껝ut f]Mi+vv|ٶ'\=_%&{!J–rZF|]o@Oݦl4˷6Q W!9kVƽysͥ^kã}_AQptܦ>lv@0YO^F4 &ǻ/G6rsZ4|p2'#ýd2(T0]F)d1#ɦ7p0V"n%$09A@؂6B«Af \*N`p6'hrtΥވܾm$ܛDEulMdKftjkHknLBaѳ`k/^-Lg8){ ac7QJ iu?x`_]C4'Vp{;<&Wb{')[w|/۾M>M#꒨䬰ɕe ߈rd:~N+̗ ЋHFsdf%Ox29tT~B>ju&č@7ocF9Q8znl/f]i2Bh&R;膹ሁ xp0SKf6RD, G+-"4Rs|i7Bܰ/7T:zs/:::l9CMTXѕ.KR~9㎽˸>29aye`G /q1PWg_k~=.'7;S)YR)YR)YR)YQBoM!^ LA@5"R(K)"*I:qiaLĈ24yf(#!ZCʠImP 2bV01GXk &bj',8Oz%gʝ57q;ROGz:ӑ{:nD-jLq],tYܒ%KqK▬bޚ~ KDb)כV1_DA&AgDIT}G*-'JaA:(XmLZn“?OwIS!e' ˌ#_: <*,b!9?8} }|?}!:;p) ی.?r~kŻWolOB_~ ;^35ow_l~f{z wͻtww_ݚ^`͋__owo۷ڛ^MOX˽w_us7\ɸ3m7F%_&ן'褳qq ׍{|NVpC{u0G_o1@_70/)Ou}Mɽ˼a2˜XK_((~C3`03;frԌ~?ΠN{'6jzbO?wV59'Ib)qGNF⋋Tʯݻä.|q ƥչ@|~c8t6@fu]noq~*x|޼3Ꝃ,m N>=k['@"_vK,i|KU]-=v)T~ OkmnFS7OI&O٭ O,)-;Jo㒢.C$M|q$"/sn_/ƭMIp/ Ɩ}~u}]?<);?^{o/3 w_iD3qj-kF+~zp_nFed'4ԛNꁌk!:;3;]ަ5L{u䁖X\f#@d2j E87n <:|0M~ ubul-&#}ݟQOoNZX*н^ 6sM{@io۟67<\zȒ85IF @(s0oIs|znhZϯE+U|yߧ.K^'|\E&w>h΂L R!LsX&k]QE }o .*TϿB͊޺?*{>**I N wK)hcC΁six;6;N"2Ø:E<%Z8u/p!J/3R! 
\Yh|+~Ms>sz[k* q/L <nBDE$J q ='wΡY^4qH"Iz=^*ҫH,k}sdj7{+KKމ9jĬ=er6Ri 9]ѭlQfl\h9Yq5n?d &?'?jxRWiA~S$7RâBo\竓3u&8R5JXo++TLi2Ff̀DP՛ !.R΁wV5:JM4%*P:bIuY3ڄmuuH KLպ"@D$٪ʙ4dP|ZGCтDpR A\B>`BDq!b#~ɛ03U&^{Si 8Mh݈#(h&!*ѪjD3H-pBhs HeN*Ny;Ud^0j#uC:3yǗ|ʗ|j^kUokrXWQFeNW KJ[9h@12Ƶs$8t3I XGՈ6_A'!]ioS\mLf|XoUXoU͛cm]kH %۴kh#8/WkTW_kѡEz dBJ"&o W7@dLI.HERH~CҌkDO\$UfRRq#GWSj6㓖0pKTؠ{$ܐt,d0*HACISd$1$&<eHVQK ݙqF720eVq0@p&Q " =>cFt'#HJ~ [_ԽG]ӓ_.Ƙgu}q&> ~p5FQ$>y!+hVΉ&däFC@735.'84[ JiXª"YI#z(΃Tq,|"K"W #TDthS):݅-P P CԳ\jSr/Cq7Z}UӥFn.QH"?|=zD3>toEBQ\tV݄ۏ+ K* e筻x_1sqtCM굽wy7)Viˑ#ʌja+ vrAh pg4Qf.{uA%M<~2"4OnmrΉd^.],P-|ZuՒVVۦa>mA߁iZє'L2hv~UOaCxocvҿA\MUwkכKwK8\t>VzdߜQw~x~xgJm`Hy?= Uou'|SaC_*FM31/%AI޾IW&+hX}Yd$wpVSb`nJ@O ?y?śÑBdO*X0 +>Ac</F}-2$s՛(P ]Q[vF4N?Au6$h 6s+s@QPTQ2m2\ğ倞> ϳsc2 fŘM7A`qx8RG)l, MtSwg'h<"d(=zpzMTxb$:`H4hP1 )]paxȬ:!E_ł@NڔqIrNZiM9iUNZVU9iVea ɸ`*uq“VJD55'IWBTV)H5:lUAIedmŠeZɎaG]tn;<pj:F;oxG-s6džJ6DQ*hs):CALTtPAE>4OQa%@*MՇ$N K1"!!tI2Iqڢ PBaߨ](FIG_!z݋o8ndbp&RqKdjr^,~ !UL`_%OeFJ`֓?Q.%ݧ'ڵ$W"F7~bi㾍XRi|wJdLp&X@R,bx0^!1sz#;f\?J> .zw>wvbCQpO$cq9DJ ,F`-I)uF1ju|2"ND6d+C ۫++[_8o,8! I<״B*2*1)<[? ڳ#Upq(hHHyIlLX>ĄT;(Ph9cd@MvCl6EsHg)蟢֧fK"{E.}X|DSIVB\ 4@5nStԀOw`/C8ҽ e>eT0 GEDO?op{".:%. 7<8G7J*%-qLF/ \b%FŞmF-6g(C \7F \E \e! ʽ: (ldV@WI9$IBJ&>r?h6Ϣ Z ӄ7'XX W#v/,HCt dUuҟ;y)9#$x|bJ1@XV hRwS)BC(B"t-Bw}Bw| +Cpw^/nqIuJJ[*<NU1&6$2w]Z<Z=*>}m V谮Gu{2eX,}, k&Ik)lSCl6nOˢ6fIKU;PvLU|^(jE*8**x ?S1aMG {-jw^N't1hϫoэ>!oNwq~O޵Ƒ#_1d-`LbY%=jK.eI[CܺP`HPm-l$Ku߼Ycö}3ϖVp,S}KDKtz\w`tq\Ӈ叇@NS0@fʡlR:s)>^8`&JRzLL,59!VoH\M>[j59sL?n{}eq^:>%9^gg0Թ-0s~S߼pt}^>aT9%G95b93l+Skj.Ƨ$gV Zqz3TWa1sW<|Έqc &i,QGL<.1Lst bBV;]:*V=BGԘX4X*01]¤~8  >}/w7uD|)y'IFUNTFm%Zg>EBjSSjT]0Bٕ0pY ~2,AeIt.1iT|%N`|R>Ŗw@3={o?'Kר.5canEε6\ -v&e=\SݓS/}K+62 m mǒjE` 66EZay W߷TR\Mxs .49yp aW8WBtut6XK$ӌydLHBbh QMY\KSsWB/o[zވ,>'oƖ&OdF}Otׂ?)ΦZZ(;$߭T}&#q0OJΫpjpYgmo!Y\)&2SMpȴR$v* kA@3U@6IW$WA{"je]8Gl+IL\5qo&spx2~W%\[*_ C=O>|t#dJEMU"Nj`dĊ ob:$L4Vb#gXŧ 7D|"HY~;+ޥN=_{3pr$Ly٥)ަVTƋ`UR!]O9j:aze/6Z()Gj[`B%)5Xdb2Wϫ6UA{ Ú28bjHOb $ob 22f7ȋ12Bv\ahl!c{%q "o|[@J <5T\ :,AXN1N5qe7S k r[?c|q^$OwDI7O'޹A aIʸ`yƺ(eg@rlJoC=eةC<`$(+?y߲G$3^#[ G n(e@yV|d%v{vFqm)bs3PDJu"%O v usA.{UqŲ"j8"a6G%!rQ^Aȸ(ɨ:KUhd%< AGG*F$52 y#,ǥ0ƩKX<'u`PjH6^emF +PF5s$!yDCBt>(jRBgn f$XL~3~dثjUpˎ9rnP}|0 T/#,<"0 co&溅-}BF26`~HEѲ[wDKx4Ѕ3ɨ|޻3bLDS6DT"!F6mQ,- x!VqDxd^Ȭ,'xekY66.uћkU#")/dֺvEv)ZW-̮M.d&cēfKaj$#ڽdU8Rcw[a"e^mVF@Y>*169Pu*%>h"e|acȾ-H@!d:6N6Ucf#JŖV-r mB*9;7bf9+r~xM}1qE2U>M6['L^m˩9]9WzlrnD C iI]|/%: 1'5ACZq*8`98{QQ*}r4ىœjJG.l}k!gy˖#BOU!5]$&I'¢86?TC d&wo!~lM&Sz*{lOYlIL2x81&A7 CCrj"W.>&j|vwњqVJc\x{ꨁmpe/#;Xy=v[d t^ί%D?3/pdи3Yc(x{@,E9*2y6BKPĎ9C%a]J/O - r: -ѽ_!s>* B]`g19\==jmz߾^/]Cr]bd*z768P郵>% QTš ޝC%3҃[+w8guHs~G񆬿D`X W¯q@qI}k3.{C5R-H#$%BA/U;n7q1wE8X"K,[DPu2gH 9qAAG)5" P% (]ݝf ю8RG6~ivhFJLy@WUT^̨ Ðg Y  ʣ{g|PYU;.ye7wk8RɴaY3#`^!(R\>wnбIn1q'rbmwWonvhfoa6q#NS>2E?\5Bu].Bd b/2ʎLOx؏C16_b<+Bda6|uTHXT+~9j2~$MH<~)dwnP5 Pƛ -_Ch嘷G@Y4V 6PέECs1nP'RHL`^qK@+<7'3C[c= ֎@l7*/Uz)[C՚t_J8Bi IS(PeOLO+DDH'MBm>Ӭ{Ƞ]9`U/%9[5"u|0c>Kao{{5O&q@'o[3nG<= ħc`;NyS(}޽s"]KL#7mq|<;6C,޵#".Nۼo!,3@&;9A7Ɏ5#K$;,Rruc-Jn`7NwŪb]T!9#H5݊Z”E]g&}sT}2vHlDugHXѲAᢐ8S+BИ94EY0dsP^b({Y{+5L:<㯻 Rbin;R?t%JYgW#}4 9'(cQj$L9‚QZ]Eg6,h1|r\+Xwuզ ٢ɡeL %5XQY5\ηsQu[j@R!`C*mt [Mӆ\cm@ n!]ci8Vśj .[vb 3]O3ゕP)kdz dT2·#/ EN&Y>67}۱9d^ ,S]EGxYH]3'ͭb/A Y= |vqDm"}${kηxGR35 N&Lz:o jKaa: @6um"_/_@ 6N%Sz29yJoX}NtS`~nR-p~[,il|E_ލnO}ukӱۙMt̚sVdS9 b3rșT]ζ!^WvQZ0f=ȩQ`ng-Y5\.+N+WN/CτRX?=05xCDbWpСsdjp!-f`L)VLE#r\=,aB# 5Cx٦I+?hhLq fnGZx+i`ړ ?[ Nrl؆;Ƭ#ϧ&9RL?ϭү@ B ^o[`$ .$RBtb́?b@wtbI :K)0@|ֽ%o WX"O7,ZВk[dH"r-mY`s|_ =A9 P&AE0`FT0,)#~UڪwMAaSju&t<?լ|w8|Ûp1YBF7dZk B*\Ol$#%*7rj&[Dq,-%\S`$u*Kii!SNf0fNnٟcU1\BpiqpJf(1AKwCWc;&9ʎbl=+P"3.P*t`pb(/H"G96 
D\cIzjdWD#~ SDF9e]n7]q"GCXq+#Y.#I& 9Tef %EU}x}^ d1P8&* n%- Ν&(>µUVf,ѫzKJ|e!**hyY4{ -ѪG'>ҟl0\pс):E>SPVoY%z? ~kم>Uo>,|6jcVܖ"t|}t8{ VTVaD"(5wԥ,ˌKb%ct?z.jf:u۩pq%' 8e~>u3_;+[yj04_6=#Hh}Q4Cux$U_jE1h뾢v'#*$JgR[淣yvm$.8R=u#BˌOl"0('wv Ξ#"}38tu5Q>褝1 䥒Hܞq!bà[R6]|ΧX+i`g~ g&BL*, gkj'7U1CָoxwUyE;<1U7훗rGa0\{H^2o/hX\__ܟvQ@fqK['JpEd(eR%..zFyIG"-fu3Dy,8&ԅ8r}cƸlv=~^ 1}$D]*B՟RUtL"M|@y幟EJm5d%*[Jk F-V^KЪ>`McZBu+ȢȀX UuDen-!)(ިk&T:{"FIѪ+ HHD% q "^K<KgJ[rFu.v<5uc[-VJQFWwYا#0N)G[b cQb *]6RSYؕ0)_YM!tI$t# 8, {vH3Rf"mOJ. 53[<-I&))2"5˓]\chVޜ,T(/8$ $E=%/B[J XgE"Dk;z I<>r ުUy YaΠ-Q2F*01C=n=Gy~&rC_T~nTRyfG]@귘[T`?GE )]-͝)Z<6(ᒥ2Ć [g$"*䛞i͞Q[EZ2RrsIOY}3uv{R α0-ޛtX'nMKoi3H~IxUZ0'\#Ц%" =TuTWb׸[2%OcdRNڡ u~/mčDy'^KQGsϱfHF>}V#nO,sYhm_}mR hCuu%mx5Uz /C/yJF58^}Y!ib/y D n\ BJ)8]ra] $}ZDRO; l~x1j, 4o,oXb Bs-b@`"Z $1J[(FsT[IR˅Q:-/j6G1`Fēb H;80RB0ƙ[$\#SHeVB2QʚC#۝3t‰xX_i~->te/: 8ʻOf (`)g)KajQJtzD$Cukp'uB Qy inUǃGB %s"1m=/7k]&Ht]?pyˋW"A*eno`6jݙ;R;WE>#8dFjFXjp9r&RiP.wN3'(d;|"c~3(R(g3=gM$O'valGYDaqs :WYiO'x;;+{ԌgS̘0+Fg<{uƍ($!4t-yƺ];7.\:c[J9@Z(L48RHQ!-? S vyEgwpxU|7*P"I-Y{HN oLpvdPagpU 5*ϭ)2E<RLj-2eϑ0$w?lx7kFWG#S!)7{~œGb}2 ?**fWݻքLE nZ*05" e-곏u3:kP4{ xJk61d4m}؆;ՁUG+0 cCء:583,^DUm@36PwynD^䚋2Z5 HZD f K (3#8CE)Xk^2sl |J- C늨m">i#a5cY!Owb\"ǒ/9&szr>qI/ \<~r tdP92J<^b`/~L/ϛI.-wY!!Uyj)K̨־˨ٻm$WTm`Ua/J%uIn?쥦@ȒG'I hicmDyFw(D0CqenW;cF8p v'=9Cޑnj,%:X6=^Ň_/ǻYqg%H8\Ƚ /*'J  -,BJQjK#Ho|(}M8)c0KȒ[wLq TY`"Yì9rX*$PjP֯&}4H6$d=}a6C9{ea1a|HFu޽5 Ycqts*`_WTNLt^G@,uڶ5LHA%DVssGQaxPCE,L;P5WfRu{+we֯ZQ)-U+ & FZH*J YP&}X$_=$9].[(YrĭA)"Jn~DarW;:G8 jR "8X !ӟU1ݩ)Qh.;=c D*}[=Ծ~?g';ryB9K#kA]R2/dPs{z]{̓] 5[^fW g5ti!vJ.!۽iIAYK7l@VQ_9exĢ#(RRtJnC7O|'ϗBJL^+„p&cplT!hW]rZNM&1Q &Q,;e~ȘJKol?eيmt^ Xc6chU+U^d@)0 g~Pu;N"!TOA> ǡ9Xx\'Wg"X$f .^"(7)>~r4^"6j1)%:L*]&>HeZEAFp :֐L2.g(ы_.D,bJt HPe**SW=cܿol=f1(A2yaN:Cq= K*YQc(ɼY4$ɱV lP.\mf{Z)^ڷw?Z/7nbndRjսɻ-s˫*lew8<@O@`&\ ‚Br/f؄AxUtv\,'OݍD͹FȎlv`[W %6di$o+GTGN?̝50kiПWO|r*()Nb֝i^43V?kVPjJ+#V%.@`>k{9f{9r*zvAT9w3XedsGϑ>Ʃ:]PM1ygv~*G*gpŹ8ƾʞhsTX:h!.= ٵA !2IS4~5wXEz,Tt+:MUsu}=cͤPM`&)<:ὟHHGk!"\(:1:n7`/C>pާvULM:#4>.>݆09xl$cndm7$Y5`&e;*nN#Ľ+rAݫm9KV}3Lͯ#֍i_CVD@\ uAFZv^u[VD$@:";ke Z#OF9*iZ;SHKY7Sa:7܅4\u]{# 3$9y%$ ^BX[|e DP9x)Ÿf/2 /ۼL\DTp>3|vS*>Ԙ/bJ翽Σj!#Bt#l!E= :FRҿ% +0Fδá9h[Ѕ{#U26ԍ1Ҽr]O}359q<,ڒq$c)z$tk< %n M#1Fqi0WP( !h]Yb'F˸FCVeVWޛWF=c%8Yz>#$ uy|RG_O]>͌Bv*<+:r&z*vS%GS㤜 QZp\1é|0ǃ|ّeG=Ctv}yYwY[pďJD]ruHJ!ul4 z8./;ucX`M(JQ(+( fK4,,6JkJ4<: `&QC#ݗWz/6z)4/}uG!ɯze.'k~vpFu`H;f~ W<8 O^ (-)\w4FS O8!-4kKsWrȈvʔw kdɭ~ ) ' ~x8].*sgȧގZNt>~W7ՄBՍ_yv֣=j=duj:bj[b}xC>n~m3:xI_~ah$QT++H}uHf( X9Ĺ  ,LaArt ,BY^mj:|}.xt4yJzVn0#/ۿvD.=l'|}r~rZAX VV+֓/OoBQJO{#D$dBJKq2J:FrfYI q1AaBZ?Y-i4,KZ SS dU;<z,촜i3?kWJ(Ɵess\wz6UKW],Ac)ƿd:j~_zzӇ\{ጿ'^Y˟:0;t~R_e]$F{sVN$O+/]2l7ۺ?Mm+7w~]ks7+,}Ƚ 2UdSD.Ä"RcocfH )QjHEJD3n4N7t:)`C2j @(mUvrQ)ogoQJpHIBJBu .B!5\wE0oBy IYUjxjP[oKVh֍k*TevuL_l&V٫+(G_ʞ2#(X٧rkd_Z4.Qvpu|0'Ӌo/Ao.M χvpe9^^Tt9d%|yZl8 ?M~{R7Jܕ%Ǻd|j? *p)(kv8J{QAGzur U]<צ7C{ G2j.2Xrշqr25V㓳r,d^1@n؇Xr҅i0_5]< `$￀>ﳗ/ûڇ*^FLw;=8~C0Rpux|vM)NҷgzU}6?ͫ:z;P&4VOiYPo{:4m ^S(%e"&{><9J5TIxxj[F۷?y)Li6q+Yf\j.{g"W]y4%۫˕/R󿊷)}_$Hy<: }ПR 4aA!蔎FA@L3W&H O/9FE1:e }Ih!*hnMǚܤyw{ro"w%/eJE\m.^3u(@[H\;2 #zGwdv2 8 TlK` 6b, weOse)ih6;67byh~mTv'm*o#ܱ߱|\ddCPD3`R )rd7HKLj IR,Y,$;$Dͱ?8 q)Nzr+޽{IwmrCmPhl%>Kvo0a})5S,bh8PH9Ժ 8a"S2&. 6hJ`ƹk.cEEw~-釔LC(Csouȳ3.:O\e:#:|8o]rp٥LY.OayEu,\E1W.T*swNs/;|;J C}̛RFL!E4hG}uHwLE63莉bRӕ+2ڔ50Ne2Tj'r"(OߔI"@9˥VN`B ݑGB͎H莄n~fH%<`2|ER`bd(:$Z m -_-܉.BnK6d^r ; F. ]@7ńw#%A,m?N6Z+yR֮׽.*@v~|0^Qǃ\ܚ:Hch'/ƟFo܄ " ߜF_I5׈+AhyWe6U7K"$m|$_!VըGQZ*ʨS Mt*{}{qBQ٨BSFכ;ZvP"qE9=,*&Xe J! o[Š){CT{k1v.Cd!A%L#) TsnE,V#%8֧=al<$tkC44. 
nLn\q>/[H7L޲Pl乡dA9_"U>.M+C+5}0ڼ7<>{a0YsG_5):[u'u{w~y x`VNF<941sXcM0FŬ@jHR)(}) T2n$9Ej:0; 4Qyd (`##0c:bf4LDuzbt>@N 7L)av_~+N; bzs/]e:'_>"_/g]J)YZO#qDٸK aBo@4(οNl˹)q8$Cd5/wiOny a$E<֝zx^5 JJ%ժR};lGH1˛Jf>Ā`.V|Keщ瘆 [㴏dz_I׊:+ZxbH?8&3S`\(-kb0B9t" Dֆp@`֏uҏwm/-af}stI&DI/M RA0NJ4(a`6&$ @bۏuֱۗɇ4P9U>\5&BbЅ/tA?|REcD[ꌂŜN,$Ocsl T2N\Z|{XQ7Xaf԰Sas{qv0HQF<% Vyݍ6rGM܊DĊ@<8~mkEkwNp#$E8vøHTgv;e !t2⃠,2'K`j@E)M`=X[.#3(@Ƶvtvtv|TIKlbVO /҆<6{A͑# xPB1z½VA ХaQX XT>qZ5yPŕx:iM6c6:rc#QzdHiF{5NF.I8:@u}%fEC9qX3sXr, b]PӘIɘRhiNpĺcZ!vScFFlQѨiۀժuOgdD1(iz( |ҪF9W9RT dvfD'M!Iq(.qntŕU֋(,mP["@/5ɤ8Mb9g$Z$I2 |nT~>CFq,X|^cb?Kf0ݻ.;]㤑e=iHs q0 v' _ Iz# qJҝAU< "+#(8A) RoF"b+Q.h&^KqNx"mdЙ#04D 0UW:Ҝ#(GDQE͈h GrkK%UǁPB9qV:FN:$LG2E9r5㯭p섫0z:#4鸒HX vkQ(VdL(:fsȱba - Ηp(,cd<KL'rs^ı!OFn8C+5Ί4q̥!ngG|Dk?o qpuRDV\ucJ64Å" AIipWuhkEBs]+nluˮڝ/$URom]BaF-L4|G];Zn-X"e6 !Cn8~8q`|e35G mZ۝mV}um еd({`BmJ)~w1*3Yo̷Zw^ߌMmam/Q)%7v5|!tl[wŽǚ/[̇+KhX`:͉۾MfzU˧{7Bw&r0ezua,Ȇr A6 jװTBjXZsUОő?@u0ǕHu^8X雗dzdzVӫ*=Ƈ?{vz`4׈>|nUAOjpA^>n96AL0tVsxª$˛Z~[?{b͍e' 봑ON1]ԝ}՝z`T#ŏ_wׁM }unʅٌr߽JTQ_<)4E,lI,;'kma:Yޕ5q$鿂DCUu2^ͼ̮&F*aQ$eѸjv7p(,ʫ2UmsTPd0et. kCxzo qo4 ٍ{e;ފ9;4.%g_tƿT^{)_gK&iςrΠ%JR94:]}n,@$S=)ԔɔWϣb幰arG:`J&r-xد!د%ڸ_z#Z_7,bZSe~Jg.~ByKVdƴ8gׄ'qώY176@_q*k6I#S͆O4rҩ@DuDT9x(1Ph_/:V5 * @QU[I")%کSzz/ Nn{Yġŏ5x'tLD"^<+(?*kv*(# J">#RFw,e{Gvhq.:p=s(<4v&sxky(gah5bHy 7 2ڰNek 6ڪIZQD'PmHL)q-^e6J3wT&y]EY+I{d(4v]Ga"1¯>Aյat[ɚ>n$%ƧVǥ~'$b7yֳx{gf2zׂL s{M^k {<ҟ Tƶ`1Mߡsgc{ˮBv:a)3y5QCH*'+9g9cR՞-8"k*7^B1< c'H0Z+㼩DCޒ/5bwT˾3-Bua5|3b'Y A_A/zm#Y5^A/vp3`%!v^xjy#i zd[>X$4^M{=kӞƆqiIǔhmZ m?edQdlĵ S%(XN])0 I.64h(a0R>iGg Y4j#8˵X+褐AjhM_G+2u R2L'4k~b:Ѣɫ=[|w,dT[zNn&~trSThĤ֢b@\[b j/owM=اڎ}l~7Cc4_29_C!_V(lL%Ӭe `.YaV/О_,zKV61]XmlIa|2]w͚m{Ar"p3+\MߪI)R)Kp'*?.oWq TS&J*.AWN:_) I%C)I%IT!dg&?sI#VsF$+ NUZ*hN~u*VƂwxQb^Wg5-*_J8G( GG w,II/bf{:fѳk K r(mkOɊ Z4XΓpv mO(䚑G~o @ugiyzݐ<|fq: OX Ө?d ŸEAO t._&oBt}UWazj"CՒ:R`sLe\VP_v8x$J3 uDDhIk# (w$84B7ԸT=] ,Ŵ |ehȪ\#&- C.Vr ,ANQHQk |@B+DZ (-II*cx&냀d:pw=l2>Z1pp:dUu刏RQڗ 4ۄ DSLjRjAWh0: /,@c]96jn49ps+k֊w֪_J@ݮG(rَ k\ѭ.΅dZCTP^E PQk~$(9n߈aZQ7a4ИjUP4NVK ,/R4؁4$<2k"0Bl|":Hr  k :f[)5H(u/,u\VYiF@ܽ٠Jn,um:R:rzY2S$2u".xwӢt-T".az nz %\G3J:^&ms-ODqnRPr1gYz SmMOBM[t[JyCsI) T Ic6KֺpsgnO;e(#J˩eR_4C8>&+>UhٜDXBePI*dɠNQ}Tir~i .,q¶\ ^ߛ'2ګ巵T8odcp]5w#_f:U,W|˭qLGXA7aŶ_}F^j316K8ˆ r<4Mjd:ǟǗ7ynQdd?!O"Ԉ0FOUB]d*#QF"ؘ8 biLbCr;Vm >b#=ȟ5nJ{~š- Q׽ډv- ߰pASx/<8A#cGE3 @EZʄX˕ l ):/lVEG Hϵ :+s>rkeB+6g0ۆ-#tHO t>lGzq,˒i}؄hw?˒#~߀d0R%= )T'-%)Q %=6q™3$98;&I)Z}4w59@؟]{C]3<Ժq3_N:PN;Y *vΌmQCMu V4zwb*?я>Ut>8 yJ83UR/FF'V Ɋ$Sab4QX{}`OXͲ_kMgռjB<ׯ0#737걄-)H)0sU>uk o L |eꜳ[\wVR-/S4>V_NuPa#RR{(\v7/"k6:|օPt1x5=o/i;b4SABG*˵6F4u| QgGyC-l9KLU`Ҭ*Zi0`;nұ臢d35s.>?GUjFh'%?ż{zf^⏌y+g+֟]뉟OLOǷi$&YQ#nŮb4ͲT`xVPptwp ^`Tzgݳ6[N[Dsmd92:j2F)jx(JA>\?+gYhp(jI(ԅJۛ΃.ŔN_'S*a j/АLPB%YBötdɔpǞr=&"AWR%q̻{m7 DBmqNGYHsU`{5|.I .<=S#2 J86jeLnQ04ލ4[^L PΜSz"9BRµy<\!HiT@ȝmCT)ѵgnoűtj'H tKWzh,8*zpy, :b1$B; K1)Z\ۛµ =zX oٮAt$*"XH ׈!'r8 J|hsPSUv:׷oǕ{<ܞ2/V| 0:QRU{&45xF͏ƕ/_y>O#$ jA nVx7Xƃ&5wé2Rhf3bgXȄP+]&#ڈHI!5Z RDgCܢl4Fm BE$EMNP D"j8c "55 D :`%SMTkI[[;(.<\.]$ ]3VIz"5G34f b%ʣj!iz/IC0P̄e5CF45졠qô44hIuB! .Z#q/iQFT'up5%`T9&W绡 4uPc &znTūXĶ"wn͠\H\n"\ѹГ 1:9)/:H/Gz] N>;~< .\J컫h)kvt3(Q\짢武˄EF TJ;:.zia|dVs;@77sASR~04*#yʄ@#V+͉eVYerK`UՅ U>Ք CcTe.Uؖ̓aw)͊?#Bߩ xw|aXHDfhEڅc͠h15Ni(hGVp0CيH-DoDK<~ǔrALZo1D@@bKM'D>D81щ++'Eӛ" I4C&YoήԞ/5A}Q2th~v<^t׀x7x z <:f=|Х_9" ݡ⑖@\3.~)e.qۻ</t+\n\7FWz}uTtiBÿMy[Nds"wUܦ ;􆮏Ebt:j@p_YrM:oRFbTgXD$w`]J`y{jQXn={$kdG?Wp'gyFЗ>zȅ|Ej%>S?vr;8Z\m|тMwVBS='!9YEHϲF2Ǒ3KIeOl 2t5 fy窋˕) nz K`T<<$ߚλېZxj7*;_[l4nn%7z` $L;\mhE&2oCdLSk!.x'f{b(c1j{0 Gr`E: Ȍ) Iϋ4g̎92-(%Ԋ,:dB9rVMjkI 5i%4`^eFk`,0&uuvs;)y] ua/U0{T \`)mSBS*c9C[NtK:=y[V: Y…m%A|{NOiǭ! 
%JvqgY`Իcv6z/]$+ܪy*;ߤËpdG ~ , t!PNzzCdK3I\P x|,9gEa vC~Z/%ȡOZj7 lA_qi4ʾDž_/E"ZY.$Rh& HYi5gto +BJ߿lN[7wWa<.F&)y#soF*~4p<,rqu֓Lea6K,4e.k1g}0Ci#HMY{^lD ZQ!orǣ"&i+bvN~reyy5bA ѣ?L?]3匤Lf;qo#=ɒoFUVi_/F_64,NxywTxr/\K2_.R]U>R2*4i:̯.}1sIb'N?K)g),SW#C~Fft$)c%\35A'bAeQYeicFI&hy"i0LJJvJ͝kp\ BA%( @脖dhYtv)s_6̓xn cLek$\74Ln=g$O":K2( :%STL6]3H ݩ٤&O# ;@,rVbwgv;[l2_8B$ q-vVO֡@+G 0>o`΅u.Ƶ]b~5O;G2]~B8R@F߯l/K6ǣ) .xtBǻJ/V7^bJ3Q>aIA%Q|*s uFY9K=g)笜S)*0g6q#җ\-eWWC.]nsI%|ʖ YIَw+$% D`f8ҺX2O7ݍF7Ѭ(;WFwGH2 "pgvTgv^$1'rѤec쒧\.,~V`/;VDbʠĆ!RvS BWEÿ7${ٕvTJ;i8% |%>_V ;̦~-n2`%zO%?AGR1U|}iQ4$WSrEc J 1Q63~w) vJr$2dQop³#A;A^)1\Ǽ`J)5Tt V^\c "6dЂMQDC'{d$iO),/N}ܮ~to[7!?υvgk'=Ⱥ.|u2Y|uè.)OB0_~yW$ܗ#Xkd^}2U@pMUSʧ4ڔSK(I+9ǾU~j=ޣlL4 C9"#gM A 6X/¤k.Er ؏3^WFRlPˊ)28 gт sKѨAyF%Z.43;%XrkKYV<:"D Y$:x,UVciJS UWEr#yƙ]e|ZFTY(~7[q*j. 8L0Y.sf&ߋEfPd{;ѡo{ aS&NyDD{`S(i4"rI*ګ71/^{TZ58P!x&xWO%l@vfNsa1W`Y b8o ST4#5DԚXI/!joqH{Jz$jEE.nDtX4?zwcP,F"=Y(QgͩO* aT*g%RE3UKg6O 0x)'Ԇ5a8jkya D*L{9"XRmhcιqKKL{ jpg^cDRsq]Y WwtA`pG`訫wi˓u51'8GM JfE1Z&A{>P(5K繫*|kBa=(KAJѯ~=\*{ %‘V)Hcj>k8xb3.I ';5<I}5@wӤ5Q!O]1%rH 8n!$I BZ9e@\[bJ ^j-Bq7Lk!Rh_;?{piû}Q3Dey]D1R缱{:Wl&Arx=TQiPT((ӨKVVU0RPYPWt UZrDmmʗuВ7醽mw?HءBT7L^;xZ˿L7_~p-=W%5yV =}r#u֛G1o3V#kXxuK AU/MPyX^_}ݫ|6̖t|vU<ѪIVN'^]mc0N'?\~8zhNog{[AsvӾ V7 ӮB]j 6J$9BXU`\{K/~_1fͷ~Y|PCu/> \oGs4唬ߏJn Y_?0Fu[\';*ġHp=>lUs5-8ߜExJߴؔn 0ndNaE)gI҉fK8ߜE`&ljF2ሎ;s<įCa?XD_N1Cx]@KN^haspL8)2{42RqXXjvz{kPΗ[{IHǜkvyͷn u?VJL{3JpV,5pK1rf5AWF$FoË޶n=hXw|J(W{ yG 楐kuH.y qYhcԐ2By{r_4BD~G(5 J7We\sYG@r#a+YӘyYW$j5׬ g=B^zHdil_ƬwAS9i'a ar'?qzpbFWR (J*LHk/yClol^B{O#\3 m8XWm%y1ڰch-AQM^J{oHBPgkZX\:*lLwScP7vרg>16ւVS 9 00F0Ld`x$1uɌVq|#8{]xv'gB?S9 ZuYg Q85(hQDtFpKHԳqu'0W/&T).0iO[FPMF''7K5?hΆˁ#>~>_gB{O#P56v̊ƃYPKuDjhG4b8h?Xw+EQ(֓r}@ZokX1ltWLuZ݈>wԔ%@ǔw,+#q+k]yVRVk]I];]h/pUe{ë8zI5O[+ׇ̮o( q>ߪ2.)k͇wʞ/9%[,}j͢]L$yurquq[yҭp.[Ɓ~P_/ίSnqo8:2De4z RNIW8bc.Vnk:;^Fd9TvNy\F׎PGrRku@vC(]J#MejF(/`+JYHNh\vJ<Ñ ǜrծv@`vv;Bi*;H$:ݣ&}EբEA.٩lvJUL՚#`sJ *=i1PH"QYnV댘6.Zr1*QX40HO 8I]\t3}ɋߧwܳ/,9t!6&ݧ4cõU Cy%ip=J9E$S5hRHh!=֗"o8M,J' GM}r߳|/V{3t&) !Q` Q#HMbq0qtdRO/+--c5-BPJq7|NNAu0S[nt"6pL).!h 5Odt*h ҉iwLtòaJ2޸g6^7[ZҘ.ɣzᲺaHrMw]JC%hVY}Ÿa>zI`USSpppIG~w!'H3&=x_BԺ}w7׷KȱcwG#=i}QDl >|0Oja}{{9T`Dz~f9ɚ-Pk։ `p*(Ow/=%-2RI%ob"3׊qxRq_[˧'frk rE1ZQeͷ2R\ 1"_Ljڶ+Bx(a 3 y7{>jp[Emp:Fhue2!/ouzԸXvq(6|}K݆jDRZ\-WaeS+ƻ oO JjCFJ-)SGѼہ@Y7Áhv-;6qqvew4zN߂sB;`ogíLlt5^]&.Ǵca#Q:w_*>Ϊ9^%QiIERVNY9!N(Rg6!)'Vzp@,P&*B@hI`.m{x;Υ7^r w!ym&D9"RH k#\=)=:Whx4FJ%Cԍ4/ٹ併DD" LPIG90 ፊH9"=2C.$$ъrwFmKKKԩ,h( Hsp/)$MA29ցd)1B&g_MRd㝠z@ւN@8Բwn'N9rnCBQVN{j1^{zl: exoit/^;{OUs!ŋJebwgjf.}-"V!OFPeT2UtЅhZwwg7( >Fjz po\^g~ŕ#<׿ݪNVgpٻ-`fJ RgAUBZ T^tr[|MJntM! o~G"JJpiEFvtdO5&Fp> ~iQ=(o@|̑G*#4^oIKF.pH 7=Eo廲;!fhrDnJ |z, qtg̓{ftoY3>;h7/5\ '$/Aa7soFȝn#]Eza (E("S>Ɏ,4ϒx!n$- "/O12l-$U+n _'\92Kw;W+#Lr<3\kr剄2Il\f2'ud**0$O*p DKK:'yA : Iׁ.&q6 {$.FC |1d~k~_MC쌹0GVQڂQ{5{{^kˬ/f̚ RZFK)i$*j S+%JA3bY@X! Y[L H5dݽRb:⹥[ JP5Ff5В@](RQFNLX=ӖZX!$qKhԠLF5q#̺Ԫc ;2_/62Y!'7Pe[fzbi(ф?ҏX~Vk0[tc?믨s{lY 큧Xg(fm~zA ?j߄+;w+>?bI "ض![AԒuy(xAR2 LԡRjKAm0.d\%Ecmoz^vt GI%iy/^'h_x:⌡Ytgrݝho_qU!jJǰꅡ@ eDUJHP{S#V鐢bNy #QG2Ah$QQ!$ zZA@EysJ U^%,b #!*']qֱJ~bLxZ `SW Bcdp@ e>e&lpګ"ׅeP(8Tnť ..e2qCSg8z[JPYRxDQ%c k@G%MRGix!Q.e؎SH5Z҄TBxB:+"4:і cіt1R{Oo'yuW+ns/-ťtxXS\vFS++]LAih6˞B#dԚQ؈nJM# vmoEK37 ?ٟaSQdR(#Bȭk=Fbv݆Nq9zFu@iRSO;x00e^$9$\?ho-li(E$4pZ2ՆN@5m *)5Z֬Od.&4Je`7,@ݜ R9qIpf^#] kk8s"TiBsтאeiB(*c-=ԩT+⁗1m P4G~p棨Pf`9V*o9UɡQTh$´QWL7;{P_Ckf[)h!Slf gE*'qҸu=|a [NU%8 يYZ/o &Lf'_p s@%&~VJGS:ou e6T1hH}G#`ݪSvẺ8?Zۈ6+*X].P-hꉍ,;( ^0 bm?-K)CZ_OLqsey7e>2! 
{;pv>3ko!{ @t9'SMs|ҝOn]._0g %( ˏ7l$\ꪖ3ʵ;_ dWcw=>,/xf=Rg.ڸGPLR3[o*l´΂M%7 @Dɦ.%t"ϒ-sS,WpmkV`2VM@Z2Q;\orN=eXggqnu4KA _YȄ:cW|r0c^:&.vH*.DIlcn:(ꐟCmn>e s+`y,4 w!x΋t`jŚ^&|S3U\Dch`0x6Z{Y@%O@2蓸 ;t'ހ.H+ݲI6v<ɓps.4Il㮤mŠnOA3˧+/2LSuSb3f}'X \ .3j!Pv&3+Y.0@i~*TJ#1ZC&Kk $eb<\D)7US*jhr(MLkX{$ZSG+){>}vaO:퐖Ty SjL?óOIZ^y{eg=\irR{sDr*Wۨ4_%vtyey|Ǜ<q~8Xϓjr<[dsb+lC\}4A572ϫ/ʿLBڔi8u˄y5OUBUSID DUw->mFodz~d Bht` gu'r,6M<>B&4v?u-vo}v<ܥ\ڮZk#]9j)朝_E{3 j~e9~Nu*w~{QIY:Bm0 ODyH<ǒ2^&\ۛO,)X|QG!`RTA܆e ! cCBGk\׮`UwĠ14:xe1/ 3CUd{k@ L\dq2 ʯUs}I#Y,І}>""3+˃,Xlt"`AROݷ>~zbcK'!7l&_cV˱%VOt>zd'Nsz{;`l\SNk}q#ϣ93~8{[dv} Y:3¶ݠ/2H ֟[^mNUS8t⓭x˧#Mv^Wn1FjՋQ2qjvZ(w^|ybjZgo^E (Ejv%*!^%+ K(E4n AR\K*)M_8Z'Ur{j3_( &CMꤸzyN nENvrRXh9{IAq>dXUސg!ZAD^Ysc>4V쬊H7˗XD' ‡VICUR^IitwUXFj|$Z3Ip %\9[%thfu%F-(몾g{?~wuU^䒣eUQ%V{DhN-k%a`Ħ'F.ڟ |imN ȼ,3J,f'i͔ffCVrե?}آEt9/${tε.YJi !5c%;Ugfl=-}a^t?=fgj~*Z$ۯ'5/#D0ϸքo v1Ͳ? Y;k窻w/3^OK֗ s 9xwpv*.=A̰c P7S`np`MR70Ff==Z~I{y <}!PSMJSB-ɚuֿ-"Vr3'͗Kq~#-z zsFk/'%+]"Chshd('VR(1&fmP_(R̥%7b*B Va"*\,63ٲ82u`^`&U4,&2ҴZv>J@5reRZ,A2?o4ݕ2)QIy~%HI{&1)yS$iX>C율xj ʴd80S5-X92{@&_JAox/)3TOgTy[Cж5?mۚrJ' Ea, e*j+*īHb1WR*sOCo4ЛoZҗl՜-*:+N,*A+[Kr hU 㬖]##J( 꼪ⷸZy剒X|Ue HT !sC @uy%JI1*{ƪ\т[#t`mVID܁ @/^0^kQ'EăBNl [&$wq+DS.HUZT=K)F+dsUF`] 4*2Ib)b"A T]8 TVMȖV8?0QHzF9Zdߺ:Km)Jzsdq#꬟ ֓:GyN3=GXmDB`ͤCxo̽֯y:mTBJX[g^Y>i 9{R79{$=$i_'$<V^kGz{)qaSsKR~*Nȓr|k J>^=;dwF}f |z!@L ``0'B;noZgݖ 䏿裿wKzOr8~(Gisq`,@zْr@VQ8X+Uc4<ڪ$c2x/L&4.ra 0-0z=J#7t\\Vv}bŷZ|!o}$ڛn67%eF; OA5%t,bCLb ``;t">-^HMnPogF;K/Ocg`hr~iK/85`LNg6e櫍zd81YS3Pct=޹;1gH8[Bh9btp0>az`LWgGV08Fܡث?cLғg @LiTF/g]ػ {Ȉ75efBNh߽˷WXMܚ= 铿=!gUy[". ݎBcK@j;ux1Lux[v=4YA=k%tp{p;;SN6H-g].Wto2}p>,:zʴΕibT{sVr۬rǥC'%*m;/ ԓP >kѯiOL;i"IQ!?SMg~&84S򭗅};Yn>ukQ7=]HJns_KwX(vnaus@hn|yCY|jt`Sٙ0ʱ謬*r?v:yLV U]=쯢JCf.'rWn DDۄ@dbV7MNVKDӶ5ʤS%-֏6],—\R6zͱf! X9k ,SYi -`f@]cM2V9}96Fm^"^2e,̻ŒU@)q wemzdT"G 0q ʴ)&)9N>UMJ*RlUK؁[bZ9A ˈQ#D}s9MP-SYJ]_Q븏e+kHuoCԱ"e^>iC|R+iEΎg{g ^WHHG!5xҩCq-2 zR,F91 #,WCa5oqlk1dIY{b>Tِ Lc(JikP}>!pJ,YabuGppwK,YA\hKK!d{-APC|~JCs^Ȏ8A?p~Pe/N)h3ܱ2U],tF$XA!>f-8%/4Zn[0R٩1']rc % D/wH)c{fRXY(XeEZ,X`p,p?QGܬañ%m\5OqŢrY89guL3t'"J=%%PםB9Oa,QC$RaQAGI&'1hVMxsWRb|0/,^~K EI>C)S$+"s 46+/a҃:U"YS`,'(^,FNIG拻\ڐ$/@Zmn9f!a0$_$[_b.sVHXXL~lT*2%zYN*@r:맭uN8Ζ_l'Ktnd*YQF!ՁҎqc2|MܣFԏT9>&0J{B-@.]#,i!nu1 `.=J\GGa'5v.[1Wmb6(8oA#ܙŭРev@lY' bKN?bhg.J3Oh_e)"EYM;lkzy2˳(DGѭ3Hϱ6VլWmԖmk ︓#!or#>oWBO%o) n4Fdָ7Aad0s 8h铳AN E+w f@Y=̢|ila9 [?Žzǡ}:ð6<܀ňyد:3Hs؜jd-;Q" :ᬼDns._4]iX @#bPbX &~ q@|!Wv :3y'ܓr]bjlGĐ }۬XƩz}Mt|TeC^*#|bيeulUo!_LźFŬl? Jb>d `.A Dq i\`Da8"%f2MII߯gM)s4.KCb7 ]I=;m8 ƅ|QzEY7'ywFCo=#wfM+^p.la_b=GTw1xR[x  сpH^/9u_6_2$oFnLe<S,Y|%.3a!:E8Iֶle\,L?NXeC]f+0[_2l\1FY v'?w?٪c[d"&Wr{D ˏ=>~8j88W8 "Nnx?f[϶`ɠk%[5-G)فw2xwDžY#`WqJǜ{6 Z½jkIyPH1! 
`;3D7(st^:}DñCRܥ.3dS3 ML-YiÜE,Uh `dTXwjqlY7G B/ɿ9:ӛ|j6YA~/Rh4{p\a{3^˵XaBoŲC H0{$*0YDDp$@"݌ǘ'z0&}vKd)ISΌ'`b^jrq)&Xuɾٵes~^chVQau&h(_e5}ztY pI&ϯշ,WV?q|Qog@,8z,F?LYbvߗ]_OaTpY13V;5~SkwѦ@ >}{tS~~M% [>qK$#"<'(CBp3LE;q^*OҾm>Wa,u2WNic2E~Nh¯̈~ugfteҠ2w*C 1V ]g|QJEqew|” Dug}?}Z9\ -ܜH:GS{(&_P2 Or"$Svun8آus偍踎QǺ޽1 gsed~B2zR=&`cضuTs!جV^cxC56c=m/T`sS3Ya6?{(KTQL/ m l:6| IN o6 o]zF g~~.!ͯoբ-\zx*s /DntÀ60pyQO_McJvL vZIf!"O`@HWoΣga=z'$$̲S}CɼQeHTp|m]|3nؼ}iҢ+#=?fI}Wf!TY*z&ۇI%qޯ]x[̟eU %gl $T2&\E"ބk<8 9ͳDٙ,fJ$^kPTᅕ1<WYOmg.̪4cJ={RAġz $OFa^Do))-D0IIVCA2TqT1#0 ;(p0c}>pX=1K|5h<![DԸ89BЈ'9LBU̡Ǭa1f5RuCZɃ|9%u@΅=w6/_|_t1Hg<( @sO,}ABn i~:2iAWvNYHD7zKz[{VK{uhJ/G g0!b}PdL>_K0E7_\losW* _j%saSP>H/2ȌY4bj"%bhH %osC̀<^N&r\O^߬/ R, kTs"_z|C.7"7Ds& I m\AF'Ai\7锂 'Z=PC/ qe _Ysϓ @.!2v¤tO6E{g!K Pzߴ ցuHFf7E*d\pBHj >0*ɾE vIG"I{k +l*-@:t>$vH !;ranxs񝬖H{ÉL|o&0ߖ9欩9~1@dW2a!1܇!uTWbeM'{9R&"-tY%(QU_ eDy_R0ԋcVT5  IVvD^@9aT*btm,8rC@G.0GݫtRtBZ!o-=k#3'eT 0oȘ + /@Cx A,Hc4l$2mN “^q$<&y7o+A`C'aXD b^c[!ŃK C(C(   72ds $=>13/徲n(Hߎs1?%2 Kv\"4)d,")%ʊ2*bg,E"Evv(1 'Xo UuDE[,N5^`W1_@!)ݻ9(F ВQhqٝA l$XyϠA(qhc u_@uF즁9Ar2[RQQV_\nmշ^qo.{ُ_ݵ57rKʕP۸ғwJ˗$uEG+$? )b!ڲE|}z#K[.@ܴ5NVyK-|&i,S"p0**!YͥSK)8I+N!Z; 2 !?HnRPnecIѼIL(|oIYB41V +eVFR*3&4 Cw#Oq~fc9 ǰJ.`Ut;U6G)) 3. %)4:DH *,AUSWkDQm)fy0`Cm@a$;~+Lj~ oX__|mXQ1哽EH?[#j~sҖdUvE.?4|v3.1zdx_u﫮}UjWfnYtB-ATJ҉RE TS㸊jc/I{10 Hϓۏ#G?F}Q\< 3Ghcm2MB`-o۰AOޫ^*GWͨ*46:G[=KDp(hR((VDB9%T)fUt749hlH3XW n$'8Ef>Y=iNζ"-w!ד4o|< };A,XO˄08ϥwU Ύg07+~\4] ha /|b{i!gO]=t_5,//ς$Oc8V왒Y/϶μZqpXQlL !jpxq8#Ƒ, =*dvP(~  b5jj ةFz4ET7%YH|g>oVR"vL& ᜳ;$Gp?QO1v7Lv9S>ƩƯqT#Q{qRkWH\I }P&* 8 >PE/'i\3;b@Dt#CF$n3NTĭ*QM@BP. z(  P/1~¬h gX *XVVf@u#k%U7d+!weEXJ'ྙ}'h9ɼQѤUGw=\"_5UDvUr>y_H$LjCѐm$4y$0 Dam0 gR#QQ 5@UW'P$g"مW@EDe!Cg$.1!C=D8#)u_`_)De M%S< (8s/Y9u*()(jD 1Qƈq-C-2Ё QD%#@HPI8hMq}&7 so2 #&x(SL"F \m #$mCJ9>{밑QV$n~*'94\@HGIr-X4#%|1&V"AҠI.]&_ǭfQ2 Q,čL2 -tM@nMPA(o#焥@lUtidjϢuI(o3L>a$G\ 3"X4+uE\§{T kO_DLgpps3xyb;y%I߹[- ܲFity t ly]8nTj%0YQxN*7; W4N;qH(T(skM[3[&% HBZuGNn i ~^zl!֝1sR9U7d7{bJByn?LV-gM,soVyG_p;dgX$w^޻>-Ή6dG|x7i?|aash%R׿)mi9 mՓPEkّ3Y7;Fc&[xե?'5kC0ʡ$o67E;w\j.e牏5(V Ξ{D#BrW#$\ryѹVWp<"|SÃS8Ʃ0 E)=}STptSnKmal+/\k(N?-.ٍQ[ =Nfp4AQ{G MֶB͐}nռ7`4#IYMj ygh Kx4S6\pEka%i]oY/{ WXRK7.^ .Ѐn޻U3o+ce9)lN1㣈w+Sfb0o_@gPW;VP oʷ'' ״ϚSL z!w_jH{x +CIa|3*.pFP_9nM²*+ >mJC } Kj>`P3)ô =`!Plբa%O\f|TU=81/c6 N)=}G(cf~~PL=8 E͠St9٢a̖ .rǠh.<z]L]8Ef'm?vG`('F;!c<5![IǃkcIƥIجLfG1$L>Al27;ʃ& 省SzE僓Dur9DntѠYk8B{ /^!aG}m^!s2h"Pn{`U<7h0sV{&8TEO(*; twgQXd*s|Zu׭=V~Shu};z' )DiK.Mn0{n>ջi]% < Z1 DCfc1&rd.4 9(j08R/7D%128B8HnM8%R6&&V LwW9舠l:'dD *x mB" &BqU-Y52(J{Gz oePSgTp/JGxPp UPV SD58~zwӪcϻ&Ϝ-$gr{gX`t-ed8W_I&a,ȵ;@OcPk ZO+&EXqs腎.|9:g:Dv͡3[&\IԄzR){I6UPA> #aiYU ŔKNC*QL&!gdCD%tkL#f][,[WDnL7BԒPIJ2MΡȹ0!Qm@>gZ2q To!ջ;|s4҆*|,eu!ZEDsA 5Ҧ}SDJl` u-5. -2-'14TXBzv;TS_LJ"xccÆd!O21 O7eVdnh̨j.Z )N&+K ZwF eԁ}UmN›쬮bi.SMf~?}8#H{?Ynvl8x{}]*:͢M A+@QՉRE zUV'V*.ջٜbPu[_;QvD4_:UL /nc? #"dFt!EbBN|Ȓ.@§ Dbj6Z&'2cMȇ3d,ne<ٻvfޭ.MQVԻճ7nޜd}V]tyw;ح h<$@i[86'2yHw/e4%w~z%[KWvZc*pQdvt5SVNiKow!p]u-Y/aY?/>.&?fgg3*:RLhfTkFNR휊>~*HRs9;TetR,uۥfDLI|2PMw_0"N 8ӧ젨-6i3V *ϩFY>eE&.E4Egm4MJQT6͚ճ2I9/VCYv0D>_S G,V -!oʯ OQ 8sXi>X}WUf4W= vߦhoG*p[ôex"'-l2`N"lONWtzw^(0(QjgrA#ȶW^Og4acKDz6d?7aDy ʞ$~s}gPMgɤd#3mʶ)}st{}cbU 0ʉT@ZZrxkNԂYOe~?LPcmQWz3qB G@PJ$~ǘjCNJ?*n]p5/3S3)249uq\o&#}ڣBRkvd, R;,c˄Q19 ;d31P8DŽ]2qbi|$!/8%e=u$t[;0t (11gmó5Pῦ? bh;O]\աSaXT,|g&R=X+ 3=>)שv[uazjWs vH|6k)Dg\F*)Z6M`VW.O(|Lwjr)==G?>负'%#^Wu=te2Ӳlf 0ơ̰l;}VyNzRf%)GÎn1Fr:+V 1 wFRNAh& =0F"=íj~{Wx cyQ0kَy[b,\;hԶ憋cmz]ޛ~|wSQ #t4AܝK97KǬL]&4L^+1'B a]W}`B ފ 3V[_c@. 
U'ʷ˜/eCsLJʸu*gj>4Pvuw }Y1Y 7A4q# mJtSW\bG#5ma-G3jVjOI!u{#Fb2\skٞw|S{;OEżcK/&LGY`nxNs5WPKݷ+Gwn #YدpÈf6KCj(IGqiVʇ {G0*yh )4q:" %$C*˒s֫`X!FʾmcEITwulABs'G7­쮉 zAS)v7=tVCo"T6>aIY, -nU$PSGyS9J':6dVFrPJv<Ѥ8?#mEs͜.22yQonb1m"$m B +\R(eqJuFr X3jE`f9\rm2 .x tWtrv1r"<_ m>we3{1Hr4}G M fG$7m,:+&EG)ڇ$RRcr"l?!4`TJԖWe=]TƫMR^Z}oOb-) m[5u$to4|I2PYD̪gRxB%`ϴ헟$IGjc ؔuӭ Ժ&]}u0znjEA=b;yw;:m8]g2hqo2mc sy7nOHcjR];=Bvmls,_6лʄ@Y9RI;zCGyx҄TSEOtTYTEX028`p<%a񻀃"e./g24c"‹\gnU>qI҆8n9⬃t.2 1_֮K#Q8k(rW !c'yߨ7ǨIL(Y/^hMXc<Т4R`d dDPx4Jڏ)RsҤ)5RaPKڟ^-Te/DQ }M~![le yi]X1gA])2u|xHV> g| 1V$y!\a{nTDЖ㯣 6H!*E!}ni+vjggt/(N5fFŸ, Z3QƝJ %Io^:O'L661&f^OJM6ZI: !^p­۶DRvdJHLn O]p]ߍI/-Jmdoۤ$>KE- DNCRZ4dYnp ݮ_$[KgxISkERy+DwٙƖ>\.rDf} rO< JpCG[NޔiXߔIF&K/'eç Np!0C^|.?SbxW@RVˬMT~ H Q<4X"pZ!ҊuT1hli HmF @p+NT,wֱY C1/a(UdH,Cfi;^U^R {u`Opy֒m1?t'J P.ǕNa 2*T+ʕ'1,W@gMEz(O!q~e%J[4fLކLA'eHZK+w٠0oWy/6?t. 6˨nT+jz7"x=Q㮹4H@G꬗.̧C\\H]S[N~&uMj+x?Mz~zۯϿݕsO׫jW7WUonn}:j0[ `ܕ\Ͼdocӛd|S @^P.cZ,5r "H^q^6t-]wI4UݸuB<~\3\ΠA Z)=Vx󡨖|\H*֎ !ڭZj)1' gzSuy x$rҋ* J KӋM\T}:⬹K:A. S(KrDazjl){fi.2M&|/+F}bQĵ+&AQ"@$\q{ϑ LqY9*?`P K1YA .r͂$< `ʇPHp9z!r\Թ rqiE2Jlj{c UR[j"\qMeM(}r/{Cސt{8|_īi>k`q6ptHXsI/hf8 l?=5#d6Y~q>ΡE$V%O 9{t?"N݄>Yd}'M ɍ; 6X=e6!t @dTS4ֹ2F9:,= ~Ӗ{܇sL393^F Q7tm) ^a"ÏיU9xaB]@eK F<} 0$1|$CmE#)ݒ^#+vk4>LxtPMWbU$Bi1 /`*0H`YX0KulBXuY$MLCA5԰(VrO4#Z~^`B^P˜gvOG,߂3س7u`MXoM]Jd [~0md7gAj/@[y.a@L߽5nnm:7iɾz?聆%DD=LxG $-ɛhz}^U7#?}zf|͇*$*PZ500nY4p ,AL2,q ;ǭ\ 5ʪ>xom5g2;ߟg yg y֠xU\9ӁV&D+Fc Xk)auCVGDb꼚Wj^CP~CqJ,PT~t:/Z^wаW{b^M0s y,;kuO/]ޟYpto^P91P- Yd>e)B=FjAH? *O\MJa~cN ːCbkRi"m_]GE!f я O^No=vv` sQpw[ld;x~в$Pm] AL30{IY]Ho[ VCRM:N޺u>} D^W17I׭loZs%N*:;>W^-Nj6vXp N~FϫOash.G:[uWhkmOѾ#R.;i'ӌO! &닺?pY%qFM GotANPO:A[s7b;pʨ f;*1daUn<8R.4P! XQb3(.ZH`1@1x}7@@tQG me+z S5fZ`47¡ZcpP$Ѝkakb+lV3$5^.ݻ|ZGraㆮ/?K_~u/+++كdO{kB~02bl4o8Jl@|| 0fQS]fґeR :j IК )"F4A'ƚ)8wb6J@7Y4pV8#+,o^Z04l^_13B5:"[a 6a uJB¦72E,KfP@3Z:«0vX@zPq {}IaW8&~L$M@e T$p@L~ -gTRv܋(D=b`tis""L[ \E(RL9c`"8wi7J⋑m3p$~^[Z4a{ozKD6*eT쉱 a LXLW *`:%"U#X+yFIe0<> :o)`  U"AcAO !$ d5$iWCz/Nh֩ \#OVڰsޤ}xڰE6,%ۍBmDridFF*^Sog+Z-r{찬awqϲ_{$-67SYyqC%}ڑ{t*(?wŖ@Bz!/!V!qVZIVi)>gjw3SH]EJTVסNAu/N=ǺYVevB9;c>%b ;fBZ)7UNCSփfSyrNy6̏} 8<6AqeFk'x]8gaA\pXvxcgEP^_} Ia;, 6tD6jN=M"{-؄d%&0ǣd)FYLacRv.&e i2&'.(.ZP*8&`J`ae*ΔJb{0hRfsc `z6`+_y7h)|Qa\bSլ$ҌvD6Ձ[5*a*T }EqzuwUvP›4'.IustX MR|vQmM'9ϟ=sE.T63 y%qFq]Qt. \)N:Ny wQ+sNݳ75 4jEucD,wYApE[-SYN,.3 C'Eqt M>bǼ3X"[h"S]IJ^e[Npru!GT(& J]Fht@Qb_N>wZph5erT I&AHٍpw#c¨EڕEa )fNÖmф?71qpF;Am 6&ֻϿ|WW^x?h8f6Nm**Emk[Yu'HN]=HiN"N;%TDVl4 q^źGŽ"\9fBi3L`ĴĜq. ÎN:}RX}d(g66Nߑ˗ fc<N;\Rᾛ# ~1Ί+U/֗Q=|iIR͇wd,MwN3 b`T(Z1ݭMf8ΪO8"1-O, S%RڗrBQa%BHQ~ڀapBRT?R/҃rf Djq*YBMᔪiՍTf+ ~.g&RrLp2S;hs[a.9A8y{qyk*^6mE[X?RYGxG(o^>?n]`E#Um ly0`ha:֒~[201M i~F3H'IŐ3]eti/ymftB6κ#e>Vڪ+^ן.)ki𤙫ET)y0^#[IɁ jO=3O")&M-O^he7ݎ9[>Sf? b"J>L;tn"Ռnw]7jQqtqDR7O[rw# ծlve`㊛V5/g[U])+}=?USu=)$Iv.+PG7r0_~{O.Gr!}/,?KD<c& %cQ.( "mzbF2b0 Q6_/ZBBsڊ^vnƬpk >\Ѩtw<=D/r hF=A Ʈfw)0uoM縫wbE z  D{pV@ Т.hZ=Z̉,џ1dm<9 l'qr\M'>m:'hœ8\ϊBLwœY!8btWH߿I('Dv\6FD6gntRC|$&&ń QʰE1 9 ٗs%$֠%癏FÖpy>_N~>B*mE ]ԝB~F!{ǐ\||n!esW|@NsԵ+mc!E'=".8G\rɣS(Lk,M_#45%v5q$+9Gr >zߛt]Ǯ&h#񨃜-,U^z˛OI2Y4*zѯLԩ*Rʚ'Vb]kqOp ?P+؇-hٺ~o,;?{"PaJ} b=k)!e*Ls^Q`ޚ#'d>h菼Y80,%+1 P<5 8rD1t\%Lcs:"Y<H#&:̠0@'D3E,w'Sf/Sda"$Fƻ^X- Ȥ!0. 
?3Y%}-)hBmx6Iz?'@*' 4"mK`"mcP荝%mZR[tfC^u>k^L`/`[^~ɧ^]$<{D> 쟾 v4I~ &WL`v|v3¿?ɷşl}߽ͳ,,w`WYw?|?|ϋ}n>ݗH_5χEowwmb_᝝l?IYWAl6M3=oǣ9Y#[j+._mnVE9==CiM&qouC@DTQNf$դHJ g}^y֤gGL _|Y6Iﳝ !VNX M&Û%a/-:`oNDF%Y^mhwEV29{8S/zdoҝSNL; `7|Q︀u`ˁ6d5?} ۑ:&ՙ'2ltov|5ed~of6|o^twuxz4ud·X'oߞ`xZ}__y&6px~}32%Y oϓaXe~ SL~%wfy?2"~ـNj<MHF/IQ"h8sp\Mn8o`Ͽ_^O ~jzY0kt)Ʒ1Y\f߿a1u;*v x=eX^.mH0ܞNo:<&׳;6* <`3`(% >wi“n Nʳo&;Da4Os_gE$"A$̝:X^kz(:) &%|5D*rP)epqIp)cN~P ǦOŽx%H-93ww·6{>iǦs;0cۊ}=:fr O~0q6"ڔ_8{)?:&AC:I"6QIKds12Q >eLWha>+"Ȝx9$^w=YHzgMun…ًFf[̷ɳ|+~N@3pNߌ?WR+x'ݫ|~lCa|Ȫ]%b6;TF$Jpjf| u61&T,Zy5$iQ5;[Υ(̜Csު\g}}RꪱYG.M@i[U1 |KӷVxK;vE-HhT?;ϡ'j ,Š(HTHk(䓅܀s8I`9 D X UvT??+'h$FAH, X,N! B&P4>3aJ%_ rn rX` a+ʣc%ZøbclWS|ߙumR}v]1bKj [׸=Xx1cK[뾛CϏi8mUx8~~:pgN.Uxӊ5+jʶ5ԄQeK]ئ* kS!i-y\n*\`cOڧ F]c-MG5 lc2_-QqW(mQJnE}PH>'c:b|fmswUtmHikcrнV~wECFV[Ϛ0ؙ8G2|S,&yi@oYx;Q2 %8 s NG0O'p.ŲXۤht[ eͲO¾gz+y4eqRӰRpMS,0>s_no&IW+T!| \5iAD\O2^j4WP%9m.YJ ni4IIt / A<>q؎WQKpphzw}%ڵO\^xF|)&UaUx`CI J}QO~ujbk&9#?=S/2Fw].wWt?t#ueUkXG+x ES]麱:_Lu%UOs]㯈ے{۳-Tջ/rYF_»~y'@Y ֎{xmewl?fіK9 ]|J&B9S_»RoTiXD)1KTTBne!/L+HA-⬔9%G>)tۣ󺛎 cw6ѥc`Rɨ`2bgb4]I sHm70Bj{4Y׻mb +zdJZ\nTR+m(3ɑ)/_m5 z#v-=ؿ.9@0c88OYV?9j*ky`c}%8؞r{jI`mZ>mR7ڎj/M=}0TR7 ! ޾xy| P3) "`qQ+Xӝ4Ɯpj(A6BQbC0"ce ehe2f'7yM(FDX(H0 9ܟ1fA =@ !4ò9od[{${qTQ D]Rǹ_u.C:.BZz׌Й-=ŗ?PqffI<>_܄&Kvj5N!f$[P`[ Y~ybk$,1+]&&Fr֝j.뺻JXsZ/wQ# gx"jŤI>U~Jj$zio4w_k̸ZX۔ܸ T胃_d].ߖ5:(pQM/$k0@-!YˇoD5f\k4KꓦQՑJچt3$g)]?KҪE^rrxr֑MhZpʿXc(KDdL)<1Ql"P 80W0 I708oTˡHmY4!Ďt_%:])~hE25QѫV[lF0W@m:id;sRH~Xr)la*3!$o)q|_}%<ik0<.-NTk?#*5<.h  #No{&Rˆ~ի0S\HnW?Ng5>#iKl[b&[ \ޣJ4)@=wGxIXb#^̃Rb/q"h9ej# #HƂXøpb+1L 1 ]l@j҇ev:"P֙yp.՛ c\gI:OjqFͥ'~E_D5m'QpF],۹ vz#t qxB;`ɔ*4 %(S/^j59^j)tk?_ MX3v,ߣG\/xQH dQnba( fsVdU\Q > av.(E>E"ASo4o;T53aWQ[@.MK]5_,)s)^F/ٴ-\!)gxT#ġXSwɹqԶZBDJq G5{w%D<3ۅ3Q'uq'ag_u;~?eJy1=9'Ǽ.cO0FP#VY.52a:E,"' x&(J=u>v]㋊Kꈰisè B"dR$DRf8}0?D[zI]ZK꺕89 ʑyNH NrчXixܣ=-L&J'G eg?΄+) MNfUq;Bہ?*aNv@KN(jI -%>,tsQzߏ sEɆ<}wsK>\?*0';}+ȩue|g\|MӦvT;7Fjo\QASHG@l\Tñ{9G+ T?ZB۪PLUtΡlEQ9ML;h%U+u\FJyd>qFkA4&QBQ 0-֋~ç a5Ss2ɯ[%ZF+=.2.WP  mr1[1j3l`P$*A#}q/TAWC8g׍2$IDG<~.ڪuDOQU섞d|Rk rpuJ"Kh&:x&:|:.W|VC971k$ :"hC&q@B^K)?>t*ӂfq'܅`HcC#ƀ @h RD=niŬq".E.AwfgP4C\rz 7Ld~mJfs$M]_>o&ʒ4(@%c5D/v, 4( 9{ 9ͻ&wF%emڀr'-7w2BbVؼIn ZAnrZ>Hȸ*$[d3ڔ5hh*̄W e?k4Yeh.M=^yG.BH$y(`CR%r$yW Y]"]_)a,8 gMju%zmQNGִ6iY}"8|RY%P9C{5s*\q%“'66$GJҎq'ZШB()**:@L}@8} Eq}U^دOgh9ប N3#%DUO=Er$uq9{BrjpdML0RLLQ.K5z N )J// Qiz.*q ¨|t9& ^ sJ9az;qN.f UAo(UG&(+IQqd \24wPTŚpuDY]Mt)d,A 18$=Jn-=0n3o$Ä ;cKyu}ӗZOځ4|hK2m4Qo)7 cq%gg*Z=ݸU3)nMg%n|% @"D,\*WJ{aj$$ #hTU-+T(?Z5pguIY4FS 򠄢6hTc,nS7TwAlkI Zύhv`;.d4uJg C$f)5!H\|RzF%H+UN$_<!)B|v}OI٭LNΠr&5[wp ppgt ZjHh\b;6_ œbU}fp;\Jf+2yNݹll2HB7SCj]l"dlxAJ&yAw33x Q* *ԌݶWk/Q %{n(g"BʆwrΖlv/wCnQW<!6ux}wn&2~zu}:f*l?]ڛ8r2 .6YQYZ,wxԇy,wq!~G_PR%1bd/8+'!+}%byTJJqw2\V/qh@IGpWdl&\2FM=N;dÍGʯ(EhRX;P<ӵFMi*ViE# 8)P.YJ|HčtB'!Іб};BV TזZߤ9E6-"t] Bpg8J_rvC 1~>筱N%[ȧYáߔI'N6˅8_*DCp;p={m T@nqU9J]E& HmDC5Ӈovvqs.\"!z;Ϯ$Q87BPYws3/>Yp4cBw8ޗGNS-|1 7v7vѣ;Ʈ G5Z8|h42YmLZLW!LLC5K1s%V-L[zޭ06 Ch ٸj7&ꋃ^ZF T=,1-Q(sTKͫ6kZӾ&n~_lCm _ojf}<\兀FE$FeQ+a,8^`so.p?^\hpd:'NL7LKF2+0NH™ %+sQa1SDH+S].9Hz{YwTPEnZ.yxcӇSs}Cz#=Eq ңV^ïA ~ yBCjZ#AO[?Pw߼9^OQd.C W-k6.Cfjjә}|/?$قo--,ftߍ3؏~cϥ؀+yNj5z!0׶۲O8,egm"`_)mQEsOWѿ.d3|4oE rݕ?8t.ُ~BK+k`7Q~!(#\y5T.yƺClFl6g"^4?N<\U>ȿ,O;I#h8]Ğmw7ǛEL|y|Lny[ȔxQ kw#\Đ©pArA#~ǧ{+HB>scsJսt4@ &]/jC@x76x wj ? ?;{T*^Of(3'M 8N'i4P?%FzŠa:|h:®T+g<zi-HeQyXH2 R ^@  4Mœ:n!,C晰dXhT:&' 9A0C.sܝ7AxbȰnγ{rF1ZVOVjHv 3Rg`ʺ8q8qZ<NTͨ>^+㖵1Z,kc nB@/1[pP;J{E?/FdU G9lMc,J2 Rgt:-XlR`,<< ̽< ;:-߻C/Is۲`+T{tpPp8tgAQp`:\ ʨW:V( :}C'QEØWʨCXbچ:{wt$͸Ӳjs8w+ C eW}M3OOoMleaB!xXF6 +? 
3ﮌywf!$?L-DmHEDAIq ٿX0Yhd5,9?9/K3ޕ5q$鿂EaBlL'٧ F"mږ7o6xID5ZLRʬ3T~-\\,cEJמ8-kO'NwO@0)$m&Da @)Sai-J@`DG Io9+ο!~3y?_˅ǽ fqըs"@}y/Lj4o/cs-S2KGq&a )2J)\N"{E ґHF'6. (=n4 MV ^/t)ɗ<Ϥu8m9$ 17@i"'dX5>4b&nx'`ȼkj5]ː m)<8*ZG.ĥ{*V,q3rLvB]%^T]e͹nO˦)ʓm[#I˒ժϝWlѫ$jgݿ>bEmy*oq=t~q)#Lws7M#gOot>_,7uJOO.N\WGPUoNfvln=Չy4+RQNX%>غåy<]ff ^b}~={<3`}ҳJX}~Eዺοv'7U2|<( YMNVOw[v}8\&-Ͼ<ۼfG-]]2J@>T9؅BX|~DK>4% ZY%w\#U&1B Ŝ#wID @b?%jKӊ3GPl1?+oPqyX&nVH s ϡUD>^ċ[<ߕ ?FTjOIpY<FUrZ&χdJ-A*"g7(As~pWTfps v# `ڄ"/\!xn*.\²u ct{:vbn5WkJŷcb3ݔp("͢ Rkd.f֧_WWI_]L\Y_|%u벿//фFMH4VЉ2tIg3ڤ蔇?ЧO/)O_6pJC߾2RvO]jR)x $-!DBsw_#]Fdԩ4?j{+Sau *hԺ0}]lvgmN"MѮrmȭQ[-C@V9" E98,6x۝mB%+}8IZN9¸.@2j ))WghRd"Py;bZYG+(uPީeY4Qn ՞=$^ d!B m)ML5_{ |أ۵I>ϛ }ֻΒz1!MG'b 쳨IݶY2ka* R-n=@%([ۥwt_vy_~g;.+w:n-B9 7i&-abO2Ott@Hg0Ν?Fe3fKٱ0Ki: []kTVv?|S*bIBP|TTgic&k=(Z-HD zWH}JP){[~GE)cx'>\kwA'ґc>"ڒwÅsLpXIq丑@:' xccu'{" s˫.>-SzMYP*!34(!'E#'DyK_b@F<+ [)E@}α̈́qx!{օNpkɡlK ]:P!qte5K 'p@=LYoÓ~`ZvʐAR@ֵ)ȩPZW5H1[ݻTScf?`0CDWӀ(va{\/Jkc} &oEkaoa}GtZ **S?;k p;ZP΢Ǜmׂ9kK}%|w- 1bVApr5KЪ-ɕc>v S)߳Ⱦg+iwůʝShh5B "QGCo-DZ1;aqDʔ:΋vaRbqJKt~WŇw'(^5e l*4: 4% ՚U~њ0N }d5m*xQ  ipwnp|jV+be=ű=3DDfBY`P1 n !۾YPfX@P`ԯAf JdtәE%9]q(ڇja5wO-rT h{x={i #꥟/"yq}2eɂT;&?v:I. _kBțW33X|q;Nxsy/ \G[6ӓ@!0f:.ه ?TU(y#y:ݕreY Lp\ͯV?++.m FDiaqʅTV9^`c mg'׿ٹ{}7һ: n2yuݲ4iKV%4R闙J (䔫JƜHH^|w(0qC @xϟl\΁O%w1Gki"RvYp(!xlȒZ)Fp4k;] Ҧ7-D+IݢhOⲣZׅ\dz -9|=3؄`̨[1-~LC>hÌƧlOK?SuBz]*"BP/ (=jIF\Ƹo*=g;5oƎay;Bjz: mrϨBĝQA=۾,DӪJJeg_k )#m(!s/%o9h5` m)3kY͉f/FGx!6&eTòȃǵAh֔ 00)YGݵwc6[ ms=wTH !`hG"ARRŃu([S;P\J*HIR _iekAgmsM&+'=nNzq L8-Uho$u8G9S \b cmm@?r=ze5 .HC;"9SZ/57kYIːZ$#hqP?WfDv r^ d!BIPhLmb(30yu29e PF]Yo#G+^222hlYٝ'F-%D^o$TEݶ w*VE}q}Gۘ!xϑ( YK/.`r Q{sY8_W9arbّA$M:&g%W\S_2e5ףxUC0ڪw?)ꐭJ rBAeY+m8s N8y zxul}R𚈘,R$Kw:uCG3*QU1N`.֑𒳃35%Z 3W%2Kzz/6b\4CF*qBs_lolQHoT) lJfEӒӻ6D2޲R,>5Wy)E.H< *4[B젭~QU*L=y~Y _ՎZj 0Bp\)"I 3c!`3q`*{@1"äNGI2 Q@ dXA9oOBkw:Yz HLe~u˽:YJ% ԹG:k3fL$ J6E$NCmj6!!Ta-IFav7n<ŤGutc; v(D@sfβfK,+v!D ,nM 1[{^2wSeyxgU׫̶**Y̪߳O>s>~^Ҟ?]_5ooPRU+Fqc>YE6//㿾}7s s"͓y:|:ywvҜ YyNM҅LdUӧNnsh3OjKӯLjF< MazfCwޟ< ۤ[фQA撛:Gͤ&6AI1_&4s ҅jU6i-~݉EEKt{5'?<ښm7feS0.B@Qco/k.8feL⃉IB_{X #KA\]ZAP;1'qs4iSs_\.?O~Yj4WVtu$n՚u]P/|`"' B8 60{LօWcO!ݮW5ˎrrgTn{HԒ̫ bj~3jFgᷗ]ϋ %6BNno+opVI|l=9;;'}d%)Ǘi iZ6z4*8+4%,Pir{v\nl2]zOhՐ5oNzZ/@UJ؃IEۥv4Aٝbm*QC:2 <_Nrh>6vDha4PjsU=%ыz~UaߪњAvP`p}̖=&i cUJ{ӯ74 [/_/5KުWx0FFW&vhED=?_H$4 dI>#?@PªZƕ=H=VqIl ^s bEk~vBWA~:^sVl!` \D2BD%&&+BBALrBqG~I~F Y YK+SiI]mIrיj71wGmʈk6֩CJkZoc5R{wjP=7e[nyD`:,q4;ʎu##x:n* ΀{_^X)`M(M܈{ փ@ kc ̮tՏ~]åIi0VX'C$Ǖ}8c1Ncz{Ŗ 6 šG %a)Ӏ@9zF@rQr%F;4/Hկ ?kk^ t(jm\QfyOq֮re| Ǡ5o@qrK2hkƺXƩ.7X%v&6- #A|ZIar&ftg4Nh[xhW"Nѐ+PV0YY1h8WLGxvds4]^]|SO'?\O<wyWQÅ`>zv+]M}BeURY7F1ǵltlaHT ^_ȴ_TrpE}6]MnBj Wg]oϣk4{mev1O1A5隷a1bBr8r2* Z FM8FG !|]Tu@2|v~v/y{-dJ3f߲lYZԡ@P,x18S)=Vq0WE?mu;{#-7fŸZjtUm/-wwjvcJB埿~{ iO=\iYts.Qm|uNٽP\5q~oL@qwT?J$ZvGxn4 BۻP&p/c+b/F|Ѝ3uXo\H%| ιN1g.(Z. ^9Cd 1JA.Ơc.АL]$ u+8R5 ><Ц@r6Ą!'l 0Z#bmz^MkNm~x#CDzѽ<=12ݬ`]boeS'F X-Zk4\W"eXWr3GZ"9ݖ8s{|)=vSǀkV6ȷR y>Sz7:]7_ѱ]ޜ^ I]s썟?9o./ãϪua={x/_yqddYVEgRq\{$84Rq\hyE: q_42iMQJ2^pN_$mF"d" ,%Zg>B;2nO;;yB_}tuHʡP) xUf5n1!ݰB@Ud39zOgZn4 ZXMG#2HH6.VB/,_̨_7HIG<4l#!T)meN2fۯklN_P3_khW&&֍ڃ&]}wlOeNޝovvpE)t)~[V%&ۯmɚqDF`5H, b Y[d&A.L5?g, 1 Be4p j+k*C]ndҒmG4v[jYn5ϓ2k`5,6z+\g/O)?Hr27ܝ !\8} sz]^&ɕNTOlWN @E-i8u54@fᰧ!3֯ g2xx/,c[ƻlC;_>R;!./up(\{iCpF8.d@uU䔳ũ YG"(v?~ڤךtem^н0wjKdF.b0%rO]/9 P7; \ۃj`E_Ɏ^v@4H[t%E@H:K!$1Hl}Bo{i?vU巏6NPڑ(x/;Fy h՞ZļŖuۻuB_n4fz~L^iu]fW?4FtsW;C($S?-c-ȑZ2Q!wzwGJ|Hx(4O/mbeiޠA6|45 B:%DEg,ϲgC ,! SNJS!S*ь2\\/HYo7:)JIKs立޵#En;|? EY, 63_v`l2DIv&w-%Ifտ"Ub>;,DK B)LIgsMn/]Di[~N-aNd1P Y +7nN^*0c0\ $E5)F1Aʫ,!9%hxOXl58moIMl f%HP>HX._A ! 
R[eds8/4ݞ9 ~q.~8\Nݖo_߿OR"hK1*[%!uN;inP,0zqnF9f/+5הnq_7wumݏ?l>=70S1pH V(~3*:̗Sj1ẫo%ו/˺ZSj]YSs9&Z[7 mcD)S( R/†L`:+/vy&v4t|4˜ggL k)dXƈtRBgQ@;6~vmKa`K' 鄮I~{{+#4ȋbO/vǗ#l10e#>BgLb͖1NAí`_hkv _T1 J|iz?-^F .Jgkqx"Кԏm1iI~Vd*sAkEOt1SHT [`Qwi dެ@9ъwy9ʩ] δ'ckR)Y楉IVH9a VAXlin,<^Xʱ;;K=Ψs $5蛀zGOab9ӈkb$eiK%vYPҶs-F4*E~ qcjQ͎Q͚Ө*0pBSi=J5 @AL¡h3FiCUZ 8n)".B[ׇcgJpGq{JV6xGHe,FvgUe DiG3R1L#<$l5(pBWFSklh_'r-2Wy~52%ZmxOg3c")L/d?] ts a!r sw0e`ϡ> ;F8!O%cHo̽qƓOa5."1zfF+:e2)9X3# iUOXF8Vb+$*bEo%q5;%OMIek67ZBJR+AA(+7Q;zBW0 `Znۊ""giBuΔ) @ ,#Np"4.wd6\5xpUan0v1k%U{ WZ1!@4 14B̜LǿY 6MÈZ>>#;\LT* Bl:i6Hr˩ "A X)(4*,w4Mjܽw;vx kPD%aP=QF9k:eeJ;4y-xUA+%krt G7ux :a_oUH;_WiWz &70ueP`#Euv%pF nJ90c{%3b02:_~\8@a:mtNgD^x!sy)J),_8>)z1m)O'jDE'P5K9q& H}sn ])6wrrJ雷뛔hXQŌl>k/fkb$P]cƒذeb;čmCN -m >&7K!+":wލd07>5ʇ:qrxhoFCi7рI w4vN9W≜٭jX\щ"uN&PKGϊĄ*rVʰs.@wz ֌R>bkP3[|@xNDs|< |-Tp.;{֮ʑ父R"h })=lBp)gjE8i6N0WubX&D|,iA*ը`E}&FvíO*ac04C^H-(ћv) vV>VHAš-c>T.C %qh[ѓe"cvhBtᶭ2I} eɲB+~Q6kSN_Xz8ǜ㶎+|V+pEIV:%7T&y3XIʼnD< ' -H):ilVY&jvuHH$'ԢBIV]gsw)xvE JT~'T|b%CUhi/n2i.{&IJS@y>iO?H}gD`An1m'䬵M@m }֫௰nnVG@0d!gJ27ℒ< (V-V)[;5SFxo.~ksю|gS"}j:W}EA5oq3cٓd&86- )euS(bDr*g)):|k[`"ȏq:M ϶ڒ#Ifȋl |UY3U _9j{ i G f=LJiXnDY1 5IJO\leփaRUWb[oe"/ϗށHO~yn "2DϤ?/{ӻ i;^S-~ Z*32O$Jz2`*Q={g̀=na\ ֩|EO%B&V~Ӓ8;q+P& cRvU/}w :"v$r6:cfaBcI??'nbb覘0fz%(U*G B*s,j9*2ot$"(7x=_?櫕7t5Ђ ،Ә|Q>5_9rüN9G\M4nvS4VG PuSO4 3dЉЗ+LIk#R<[e0FؗJnq:PfaOXτr(9V{E-7K_v)cN-9Cظ#XQd`ffl[`sc APcWAYBr :Vnj`vщSJaWLkR]r-d黥Ga:]P ZB Ll}XR8]p,b9pZBÄ aTn)sy*aK#q.Ve}Hd.ω[!޿5Dҭ!]C?A*Ynb?yEO>DnVs] )xyƀ!nJV~IJX i?~?OhWgB2&]絊8*&46z,jQ9PbopꤕR:Jn^ *C*֥m ]җy`[$].\~4* ,D2=ͭںsmIе(cjPcka6\ut9f U2TieC$25e+*D)Ƥ5V_' ׎[DƗV:]m2vZ}wuZx|a,ru L|Tf׻`Pɢ lȌ>M#x.NNVud|w~v|(\jCBZc)c}<"P"@%KLX.T!!bRFz̷+ ˞Ɔ Uk4tWHX0Xq¤AZbːHsc+ᰃ/Y:GjpԠ8L(=R {;įYE-:Y/"1"52Ј8S0cK'2#bh5_ f$ D,? }}Y]Ex8G̸0tT{(\O8 uZjH 6igIp^%*sQZ*Od8, z^Ctl @ױn&J]?]ms{2M93tBy%z6qqSL)!Ea=LIg8T@\vF^" YJпztCM`n\[ϏA[\ekwP˫;n1׮QAig U& |s=>{5"zPj%B 5|Pk; 9Rq-yt"(f%Я_4J?_1ޙ݇g3=سbYXNlٚomlslwF{t L$nkw8x{pΝ.i GdbB;I;iKpH+ Bc߰Z=] DmSi̪j9Á~ sƅNdALRz' N|;4ݑA?D@Ew^t(=2䚠-0D5!4j܍o RiC[LiGT-, * 3筓1#lyC,yea$FFX*m&TѡcZCԊWv!\!J-c.Di93tZW%*hڶzDe #`iBuB2p|-+0am(n8N^ z1HkT0A!.JBlYBpƽ)a|"+AVt&T[_Di ,03+[h2RNeBe&k&"CaMMNg2$)Usm B B?NnMSUܚۿ#O!< IO =S2I`ݺ:IMa g?ogS fC''~wM +rLS.W:s5Gt{ZaUw]_uշǓ뗏*^"EtZ*8xp`61qV2PQ]|쭑R N(rMU\#l.),ϨB4NiT6lm"lM4æ4vن70A>wq4VoۻZ$z&,䍛hMqJ5dB< ,z-p #A0e@H˜**R N Jad[K1!M(|5K̳֊~/&hH,1w8Yﻓњ+v-'_*kHiIIYSW pR'4VhJ0*gZ "(wpycnͧ~8Ivȷ|Sy͞v8 _|Eѻ?fO7vPM*buI9q1r;ҏD"S?ж$nT]In/e)'ꁭߡ ި*N<|<]܁ҥ;hn~z/+,](݇Ӫ'*ٓ"ﱑ='ww婃~cGR:R+;r1H)}FjN5ES[7n96%5WReȑF bb:Ϩ:u-zݲ_4ջ5a!oDlJ/[nz7D얋ARG3*u :U;OڙԀqoSPyqK6ӐXAiJ&.#߁(ɊFd5/Nn_}I!"wb=gBvJ{=qg%)gmI's?5ٲJ^V;WMRt3J˖P?n=oB-Yk|Y[""a]|YH@жƛ ʶ|\xZX$iM}~DPŘQ=:X z !X!>&+%qX;Ѵ{xqW(nWt-^n:), 4[]܉JczTJNLƕXʧs'Vڏprod`(Hy۫ i+hVJ$=TybuOzJC/Hl6>]m@Ӝ8+I{1xE(WwCE\3 Վ |tDհXu<w}KQqL3!k[G[s޾[̣Al~KIs H͐R1^8s51AuS"e^ڇ.6]Ns #|'"7(JǷ#+Le (@\ZOM%ъiȒA>&s1"z@[K'b2:bcnB[sw_ME%;b)Y^.u<[@fڛ]p?_-d_]8*;~v.N>Ğ}J RpzvkxF/FS/Dm6i'uz;td9kXrt>ZO16hd+$ՙmIG%f1lf\2?!l DxK.(@xωE ;hG(lse߰hܚVz.xžف>~wUtxgyoKzY|>001ײ*[+'VjcD\O7.B !y_'7y:WY}wt6_LJk'.NS<^!tεͿNfWsLJ[>K6p+0t-{ts{3E1]T,3n#7o*N1<:pB$8PҪ31|C&WzV_p@eo_7RLYdvPELI~&B,Lr3C5O0`31yi 9x~?f.s]h$?qѭ7O\ggJ@gT:JI+8*8ImRMic;XݲɩD'MTaTDe!%X8`Bq&$"z(Gq%>d2dfq(pf6~H\Lxwu4J6=*(jqkJ V&yR[z xoҰ|AV OgY!I{y [?t̠b{5n϶cT:fXP墜 }WC TKDn 5tO=X  |ٍKv]V':x tϫM:8ޝ_\=ف?&˧UO P<(u*҈,Ul>|2Q }{VW.Ыv*JȁCrlٷ#lrXlbo24?`T_?5.aw!y!RȼYfGf8wvil'P w΁݌ 7p(D1CW8wcXe $eFWčAzS#IX)=3%QDHD@2dagyz+^t>l@aP~ן]GdP ]5[  hHzC  +U^ЂZJϬAoG >nme/1Obba4t}li2zCY$x ScrK[~`H*=׆.-{tၸеo|;hط-tcH1) `ooW'/GdtބKƴ(W;h-?e/z1@ jJ]!酿o9ftC\.lym͚OÊ7+^)ݦå\k ޾;.ίO(`E$EȨńJvLWcU#Jsf:zg1Bl _~O9ہ#dgJfLXbt9t8znȳi'ePKAV4K 
x?yOXgiE3>=[rrm_<xucBNM´)ܟWsx+_-I6L!8FuȰ7)T2rpcu !+0 2@ XF%!C8RyHLp&`FaT3ZC< "9ykuȊiI$U~N$*,Ѣ`٦cƐԡm!A#;{)vgv:j\ె 5^}S?"?S[0AvV=cl&P CnIΞΫ; \8Quz-Z Jf) bq$)$G\2MU^/|`1ڕ 60] WYPX"Ih{<Φ_T^GalW]Ojecɋ^ZѷE; w~b6#򔪵}pg b-j{W"K/U~Rzʇ~^a2# E4Kh%-}GvDH(Viqٷv˿oHn]H7.eJ !9n5M{voa'쟽5.S߽D!ށ{ޜczmVmTa@wznv{מ` o-Ыm٤ 56UP?b; :d&#cbvX#-%v`Lq2HRTEe*= ɻ O>M$ƆB"1GSqy$B mUQ)ݩr3<^뗏oIyǓb , $Š5][%/@>D^ /8p]4CLbV':Qo[( g&.mlA-"%xllH#M3lȬ 沯 gs:?ckUϪc-L5О?~67Bq6o`rtÕ|JL!]7#G*E0LJ㣲|sqUa?o⏘MKfѱ0|:^XܗA^!J(J (.b0^ h*I-VR)c8aNs@McXd)V}ы*!2#f6Å~|DeAbnfZaO**uȴYk Ek1>?x7E`)ӊ#wɔTʹ&쑄>{dHQkDw#Lj-b$=prCWv$}L?'\QCޣ,OgJjyHܙ`nj2FTuAV[qgNZ7UuvoWca evN\hnd@ הvܔj9zpMFWMqTՔڔj8uڔ*C UN0pw, -HujSk-C:cA=u@Qk _9NDT;Jt{PG_b.@*N7HkW2.2ltMe;jVi#ԼIewҺyq.)٣;DlypyRQb# бv! %.fTs9E`Zm,FgI)U "B2 K$W[*Y(33|7 1A?Zk TPLL8tS\oǎ8{@w8 )跪7TK"bRt.M"Nz)߾lO3Bqə E)*1Yi(1:f46*zI P; ͇X~0 `2PD*i5zXa{/颿P "!o^~R0[]m7 $bLewF1lTB "9ЀVV<1L++@5j@R xNg")PlȢ<hF\0FhfKX@XM׎Q,d,,rZ0QlwG&yT a@=d%-;>f 榏u٠"GeAJG;!OF*:XRԅ|"#Sm&+wr1Hw4nGi(JF\-{ڭ EtC QXj@$7@'O & {Tr})\ ΝTgsLBb|1!  8+t Nmz%1!Xy96." wrB{O31"-و$B\t^2fIy)O{O?\e5 p&޾;.ί.}2mvJ.,u9h޾pQWNM2n\jW_c㝲 F#jP  %z(ZUBi$ty@PwY׹ܡl(ʂyqeFc6YT %W,&J<|R{1qz RN8+pإu#IB/ޙ>Ѓ6n2 !Okyn7HI%GUEj4-}bi2|=[[lJ۳P4HqBͻ&CE6,,۴t2g~n+Y'7aJmRm?![w7k$d-R|n-# C2g&CIbh/?D3oYlM'J5fwؗd.EN·񩓐oA*E[=<0v[NOP7Pf#Ipٞ6Al$j޶#%2g4 TJEgW7Z)@VB(y򡌏VS(NQ2'KqㅧrJM-YrCҘ*lrV OAUŠЩ.&RŋWn_vc{LIW;VXtXٖ3u"i.2W}B L"!2V MM.U`'"~% ؎>VȪ%l7 p_~ƫIt v;sT(jF[iw +$itaITV; Bv5Ef+FؕFڂx"uI%ADhZS`fYqXy M.22ΛwD\4r>2wȝd5&cŏ/amfUUƄ]zpŚiT߁ӆf PMfkj/~߇Ud6 8A1cqX˰qqX `3j $nKF#νǃءh)zđ&RC3hhes&kqiۻ8L3+!x:ʺ.j\".dsvBZP* ІL<"u[uW0=ئOݛu؆Df` VȠ](. }˛-2m(7x=LMa m͔nhr8jW3'O сC8MmPļ$Q|N,$%w(uʞ1@cSHd*Z) FxZTY%_G!:^@w)+P znѣRMz)ӿ̐oTXeM^gV//t/!e vaKZb[iDQ`vW+yS.nc9窼W_{vZ)C@У'ʟ>HҎ6sZX_q}5I1 gwlB=wWNwКbLK-kSO9לbN.+)O {P%W9F8:P y "eVtPL -=l Ԕd=5Zv #0' h܃{)bfhTouqg2K݂|c]hւϯ]4'dr)3Nk/_Ew_4g m_f6;`%JփwgQ0ѹk5/'B#csi6s&.\d ]:LY6\AOW-/o{??}ۯCv+b/n+SlLACJ{wa . ht(L{1An~"s䶭.{hPL||vZ(׬ێ.Ç2nApN;Gɣ{g H>? cvշ2J:Jce*#wz.]ˡ yKf1HA07Y>I..L G-V&38 x. fvw}q`&WZv}"WeZZSEfǸ8a|%RؿC}L`"=U]C |y{).:?}\1 |}>~q0v?Z:.XŒ: ~>\*Au R AuQ@6j@, KDo-]Y{Y *-Ue-S *[4*kjPAUYkU!k^VfgL)HRɘ {#l4OAD; F%LҤ{w7 %$I&"}Fq\(MȌV$<|Q5 ܓjq}>-qYNKUMнY nsTAO+DUܛ UT[V ]cG2wVO8hԶWaR%ޤ 3HNȗ_^ LJ L.1d)zIj,Y}o2=U\ưI<։Si1[;|e_Vt6~8mW+DCM(k8eȡ ̇dm0Kr0=ihSh 2z} =O##֟F>A%b[ ) `/ fap̚f? E-y;KdKn/p-W_]i0?_-Z698D*2(km{<# mnNxaq]Ѱ7ܺC߅Ofv?W5D֕9s$8čm%Uep6DHh6 Qr^u ^/uI=-^,6#pFSH |HTcJ(ax b{1X$EgpFT8P RPȣ;1Q&G2A4.J6+b7 f*S!C` 27$yNJRɔ.I避Hvk>L1xD/a1L$%I+J<ࡗcȁ"aK]v/5Q?``2ؚ]B?|hJe"rn<)T`= 3|(h R8.psA}H}l;˲ |JtX)ql}$ϩzm8)lFEK-XǢX w@g*@K 6_d[}ʘ/P_-jB鮟 o Pr5LoPq)ŏ\?Ԃb6A^|S@m4vˋ󤐁[Wtԣ ݶKa"> {f(|E_3v8u) @T7uK" ?oR>k8}"8;3ǣ X4'Jk;ш4vm^. TvuJOj࿪~ֈSg't|wZ75h;_% \󍽝ݦFu}ږ"&yݷKEԚkbL ߇9hMekY%SJPa.X;?!܎$Q{^?Xo}|PMt9TЅ̇hYÔeڲJRo,SnODXXBCeV80SYq**g6k5R%-:*kŰ&zi=baFqx -w4'HoYzc' #Z!i:n-T{.\+mĚ.m:0͓D:~Lp>&I-rN;枍L}{*MRL}_TxyF$Y~8l2f"j˄Jg?}k{goʋE%F5 w6r-X6 e59ꕵb?MYm2 X:QP.VwQ)Xќa DMWs1AmCӞfxgjFsp2/HPj{Q۽8Q{v V9;s^g ŜSA(6H-Ln|''IsLlj&5"l P!xk)I gYMT : `V+܂oAmUAqp)TRڲk#JAPwҭSǓ a`eݕ_ę|9ٰGrhC_>Ts5op-aT0>ɉǃ$"#PJIb"7y(j+_xAKiaa|% fI  _FҊPj1?qaHd] hSoI(=c\UYL җ"-|pQpqK]ڢb cd{KpAs.(#ib'"Ti*5hJ9.m,gLR9}GK,)Ÿ%m.0%>@5 >|DSʎ<[+BW:RRu Qa#DK.m=ک(BG3AVkeDB,/3|4yQ$Ԍl(ەHG`1$:|+p\p1RN$seH.g`ױum&IA")v%&FM3Lc+%NY)!|)˜D`*JƎT iNpDO %۞5T"Gv>K+ ZNz Хx#}%v()uBENK1CU D)y@ټ:.*u\KvbC ~dq"NɜOirI!E-KU!Z̝QOs*a-0p7>u;8q 8_3њAU+UbSZ² xD\H"=?k \fUTrN7DbjD? X20mޥV__#h8jZNGB*ODFI֩4oe?tM("?& +֡D i^]hX.NsY㵧)w] lm{'o-.ڪk8_XN#ʸz a˞) Ὲ GxiG[,~i`]P;p}A,?zkg#NӲ|)Ғ6vQmjWxp ;Hb"}mדHs` S#? 
G";NGMذe+07iH#d"ΓNEoşEs^\ 49E"RgJz9LGܝb/X SD~"6Y~[AoE#a..zҒ6vQ !iI`BB0Z%+nqo6M@lc9-xS%z9\!q(?edCm}if0Z8?M8A{Ifp7$W/jۦ.ll "egUmJϵ)^B8(Lwz-]a* Џ;5&vR?S7m52\)E5?5u<Pp0V̇lMAnpc}%1E;KNrv']ks'D9AX}1CEFXA8qx*`X8-QDl4^Hc(5Վ %Hf:kݳf=gU;u,RWWF$K06>uͷZQ[\ :D H~t"ˎ;xxLFEER\nʬ!tk%Q4x06ݎ ֶhI"j(?RᨕTʪ|+p4,k7pҙ;`DW4fV_Gp ArBE]s` B3_@ZhcM<[_"՗)yWˇLHbt̤oM0ӂBmGLoSI -qEo N\ɐ պ-tnP鈁AH#+TR<;L)He&+xovB"s WY[\~V0"]ϴof?G.)͖/Y&ޅ 9S`pm3w+&+ib_߼KF f 5w?oV+Zm'y@|ԦDS,icW͏VR mfq0,噡bběp02ſy-o3- lV=V PSimPx6Ux0aR+n8H_n]Ż 6hun#A- kCk)ؖݛ?K0AS]u͊}E@})[2^)05%~G )#8%c]M\Ct9sP56Gq.VyYLsY&0Dԣ׶1`Wyuj?"qD\(乎t![j?p%$i8Qxq~g1OrbΩR[% Fjl=@~hE"$\  G=9W\> FB9+8,7J-iޞI%eazy~o$% R~=p)z'y*R k-ڦݽ[[dAStD+f&~tY]vFmg4Y%=<{f8!D!icx!DWPIOr3lv؉K"LX%`fbnPluB!K`Dau4iֵxaT ie<6c;"\|*a14FF%Vx1bc&(J3LE~xѱ7 @󍧍A S ׶?:56O[pͲ&zyRy}I`E1<6Eo?'3T/e!<]p-HG ]TNAMk!/S@nՓ\eAGΫlf*qytNLwbO,m8t6f~wh4pR?O,~&i~ [pogַp{e¿W?Y_^}ٻcqF-?-h )\2m'!x+׆5M~lQm _(ۆ_ ~Z$ c5ߍM|>&?y?ϲ=f_0dziH`wdrfaZq屛ZQ=|2ucKf4L/+d~V~:'(|N @@&m$>_ȾM= ]~l4c2 ߛf[2Ϧ$ssfUXiO_QóotĦ;j۶2޺B j|oOG; 3A\ew}  z{|J/|}śs}$y8cЧ~׵p󧉳ts6y&~˄w,t;_mg׹_ /NF7ECN\@K[O}0;l2H܍ 6.t,3? Eoo^;`μם˛Agҙ޳wֿaXMnGg~-tO~MG~buY+/SpnD8{իa<~9 (I{_yrǠ4\0&g0{o+|~? eLp$tp_nUh\`"8YO}vy68}.4: ^]^*\S J!'( c6)8xH?uSGGf\(K ܚ8{YJ6<W}tv Y]yv!Vs$ !("Nr+0I )A0+bB40DV^;92Q,ipGT_zObݻ5:ǥ\h-XhkLqb$gS)&1c μDk .5w\ۻw8u=d*1ahB!.=P0 7;/TJ"-iz]@w{_ 'y )|eǃb/la֪^/ln3`U1/jPsg[T j"u&$0JMο|AvWrN{ 2\0,w{_/{)*oLM7v©žs=/<8nvg3 P9j6&o{qټYh *?Kvve.CNmUdNWss*?A*~n2{-G/#F$FVڼ65ppfoA%4={|pn_@eSp7]o'p}7 D^emv~6<F/B3)8]7޿[%O3rd1odp]i\׉pi(Ydib$qwGE.0WrLѫ])vU53A]p`#C@!YS@qջtwWh-I>?9>?9>8v'=>av|xTx7HYZ$1hdA 壝]#(1/,kOoo?~/?|[\f%0=9z;ZIH6Po?knEwP3&]| ]i:> l;i#+ b~؃3Bߝ3q3C2519EB3!T94HT(Zxj:縜5awAh.}ˈ,7cĐsh'>69Ͻ[z }4`\JS QsH̏zj5ognɚ7ˮcT2,@WHGQs΂w.>'5a=cVr{ɂx[͝"-MWCJ]!=&RC!jڟd@M?%%zfFE/X62e>{nݑ}[,M=ʵ"Tʕ/B*7\W١'փ>DpCܥ{ Gف½R\f G]vLJ Bb^/ñzxw iQE~HfWz癲CG2iS}Ё(Tt(&JHA( Z4MAp,QMXtpmaMV$'U0=,؇{֫rttXaN.%LQ(;LJQEjUhyu_Ӫ*5ʔ!ZJ-nNOArj Q2|EPKʥ{DjjTi:$0PO;قR,PWt_]S[nV/֯k#j/W-aXѯ|g2;8J#y 8NjA4&RBj!O,3ZHDD`Am3/o-U+9*eN3'!nj>8ςSDU^#>c;?FfmN݃i%wץJ:+4r T YV`UTq"jS%U`V+g$#J8l҅_#oYʞJں6Q;˴ 6ָ@TTk[C-%nKhXSɼ3; TE0g,Z_ǣL_# 5:~}s1Ns-ŢrkY*3~8]2w%S7*ASѮkPw$bޯ!JD^Ciz;1IW&*xNt)8w 0j^j=FCLF &៍t$KhHA#PaK@H,\x+7#ݶڢ4dz\_^'3{s%' ~COϼѥKP}w_b{ХJ=Ln &\S} zthWmzbmإc=pbRS18P >^l MKEJ ?qjeB):`Χ 8c?Q.1vS!Tdzy\U:XDV::RL6պjy@#WK^\-挎(P[Ǎ#̛ YDziI=_0Yxoxǘ!7+Or}y_n^o`οs~QՕ-rEX 6&P'DhlÖj˅ b,Z{ Q6lC}ʡf9Qd8NH9Eʚ鏙qN}5ٲY~ 5]3PU : @Q.B39- p?tÏo;%H͠owoW:i{pr`BVfLeqSbBaMiaPa͹i4`FgZ?0++(+ K;Y aо6(TMЛVZfOYNۦ29oK-8#mzI-ӊf>qQ&{AH-#xSvyI\ߍrbtb{lH:l,5B)@z;OќHS _M,ZY-)hU^fw\&BT.eurf()ٳy;1ʂ~3?ϦtHXe]+#F9n?lcb]hCyT#Q{*eMݞJjNhZhN<{[àGJPN$9[Oއ釫g[vAj3Ϫf[ov =|<%|ӖE陆grk6zG9wAV %HcYwB>9Gz9jZAi!p] Z1!JI2p7dg,^$%Ht87cѼg,R yy>!(!C{ӽ,R7x>ga$5b$v7FKo8e&B5D}1}SjH5 ӏ߮?UpJCgzf0BpP:3J9Pu!ϙ`%Vcڦ{5;ܖ}\F >]c$ݒ'2JxU8J ZRTЃT EgZ~Fu"u osJ"^FrA6}WjY@ QoUo-zq>/'6g}DQ}(F$W ADex+Oe Rə99'Z㛀@y‹Ż;P37 gTl9C &JHH*I%E! 
}J<\gBs($Io*\4yBWB(?\uUvY}ΧC'lۿQ -me <ƕQ.QިF"UXNjJ*`׷uE=7cm/jjK ۚEZ[k\L\k˖ck@&-x::"_%qVF/LzCA=>SZ 9V)mmbG Ψj =#)${=3]!0B*:T4x(AN蘃NG.k3683Xh< ȝ\Q%q404~~To?awW(^NP+}O4E͘׸j"+s(ZDo :u-Nt*\/#HM^C"N_bt@ֳWXG \rr\,VtE$qm^1-uMEPa%8*H\m>g Y=Zre DՁx뾐V "1Z2U ;M2n%Z6 t\W*j8'Dxr $$P0B-TPO<Q%3$ DWpI$F6c0:d{ zV96xT+%]ZsbҢzb"~!EYHD<X`M(i%xr -,z8,+QiCiոIb`{³#Q0[[\#1e5(;MZs_(V4j"+ OBZ ]q-΁@"aġZ dp!T PQ7Z(+EE?+~]8]6M._6i"K^Q߯zHI#z#ҷ ^͞_{M(6,D|Ahh$TI;2EXs‰ LL${IGQ`DMP"'{אJτ[2T9JHґIw@°pA1 ۅȱ 8O_ _& >i0S7_n*ԘldUNC2*ik2Z$sˤd`9#DM40 &>ՂKYZ%>^McK C'ÖO"$NE̖L2艗c49>5:zaT]B bI%6}1ݯzó3>d@-ȓ& DmPȄtdxMNDHv-|"l>|'4,ޠd{^|!}tTq7 : 1 $IW`ܐ$T)3!''Trm$C2i4Yp"cdF7ѯ(dm| 瑬9ٺɦ oS3>ٺ˖Qk>,9 2y H)\3,B`'4rw3'5a||kHgr_`+I%^͚ٳw5H?mW8iTbW?**w>BDUwD 1wdͦi]{{O$L.bG8蕺^HٛxFjbj]mȅޛ2(LD .6#/]ǀ1t 8/;9: .&6)c"eޥd c `\WZԤZ>KJe&?*q) IB.2EF%B'Xx.3L~ ͝zЌ?p>)-T)88/\IMP2oMp x9pƜ$stYHI?Ǵ adȭ с6df.<%i' )dV1&]Uc#g>hMFdC­䬹Nse0X "cHit|TG1m &]ǐz0:#I"$dL|$ tTlupd7?0e Aƍdf|5Ec0en&ŬV+>-#-F'{Ejru:$糳P6Cn#`!bJӆ5k6eզW)syk )*>:$q~9gPo2n}? o%|n;nʬo?L-/ w@_u]?}zO$ O7nyV5R~e[V*O> Ϭ4ILŃ-J9ukfst~U4\!Lir 8& ^V;Yڑy,^.7jnQ&J9K5rZE1}*(F)mU8-:Z-! TҜl5EȆ {|ɳՆf ǝnc^ mH+Ŵ1;XIQ6E$sGpFw|ol%|ݴ=Ogb3gϗ&N25s: If\~t}~U}C jA7B́Һ7*Ҫ;o\s0T8B.>޻8]wVg?˘Nr7SlU h\T >ɾzg~Ti6~Ot g~d?} [xƽQV1G(bo~k~k~nx+"g܌ãh"\?B|wуnx#䌉*wAG!Iw?ODg@S#G- 958j!8yK3F-j0뚲jpmbtZJefʺ>Ms|Lix?8{f%eS~]ޙrq7)4왏)[L=ݾQd^1=:,kFF\ 56Wpk(>!{xk8kaRάo+&+&Rg=[!YZrilRbGEo9iщry u$*'oRqʎi4(az\-HGS8#v¶!iiٸ|e_5T|JKO0@ǙJ҇XgEAӪRҗrglJ3>;tGW׷,Hc٣uwMɧ/gY 4S.eQ!0%$4H@k݀MDKͻkڜ_JҕF4>U6|v"]3Giݝ ɹ|e/8JsE,( /CJ'ME5{30p26JQ1G{ͅP2`1A0IgRDGd>2  }UaIjouۼ͕&0ǚ 1DVUO\q=IZH#M!NE%sF=V|d%Iғ.Os` BLm ^WÐUθU7]wOVm{ ,UkE%l$>ѧr|q=yO;G]wW7nsy9 ,IKvե}oЩ>qe 1ɽ}M<^eUbyzgI H־lu07nips&^t;UDD`;[>m35?k R9 h~oe}Jf;q#~FટBsW ;Bm_ b<aaƒ񢡺j5zv޾ Fk7hS[ n5?i-ڌVf@Ԁ*wSb6%Vg?˘ߔVixM{yݛ* gՏ{| #c@pV%5ct!$&@<{4RIP+Mh.%cm^(BE|; :)kߛ7h}1X`qr˅X޺ z%l#!1iF{ˆtzݺ2+~Z~!Z\J%EME`= 9ɑBb\;ѐcAظ"tjmrsxfԒjЬqw&^<Җ,pwmQ4/aնOUX~IG6?5yZ41\{]ˉ 8.x3K%]f;yB:\xwZuO*/>ߞ>?p{)nH|ޫ[Mnxloۋ|{OL  czɍ_@ɼ_Nlf{kK>SdmR;1V],j [jq)`kv["1QQr[-đtsNĉ.%*O u~0@;xMB_&^&oc1;I)-eW- $'J`e#Sh.K\/+րj.y2Ak)kXxW@$7E4rEgo36"N!bqku}9Hf%Pyo@aR 1^\grcb=x'?+ʾ_vl°~{l1`NQM 0zcՒѻ_>[X(rv6RbBIޣuvIҐ]N]dWdE=ME:Ԏ;%hՔQ}N]ڟ@= *j2Յp8 f*<0%Y`7ܳw5p8uAj<^_gV4ri`+\NȱnpKr1volhclO^Ά}8C/}V^Y-@:zz-[;՗ë.Hs%"ư@U ^"UHڟ3XFi"BjT\ V=Wqo)VHnexV %.YI,mƵ"BgYYd9!zR$t\A$ZpU-R#]5UD#qʊ”#KjhL %SJ $)9ƹѶTM&y[UcDEGD?Y*h:9_Dr1dR0vDQ NOO)~/0:,~w_QB/)A ̋dm=x#N6Oo}a=G>C#N6dtqGX^inTa2\@MCDZ6\|0@9"4õ%ύGQ::rmSݩwL-PkX K)'ص]Q݅1ι%$}hvr8vjt@%l i EJK2@#ִN()ջCfl ] Vf4_탣b]VV۳lq~5Ϸ>w G:߾52&2BS:]SƇZ{j=m+ }h w#wٴ:KA[0Y^@zuʖ 7W8 q .\ą,Rrm ʔ*ɥ͑:g&c%ReX(X S3=5ޠ?ތ6͢mMEoF.C ­v~f߭~0EqX]wkp[$z5FPյ"TDdpv?YA2MPseA(ÍRJ2$(HF5jhҿ(lHwN>| MgK-wro4]ɁJhV n"a?vJD_1gZVq!06[1< IѮf/HղUphh/!"%,H݉BI#X,2J%s ,ҚL8B=cŏD@&.ֆ ܡda։C;SE_,v7o߿o4q%haص0\ &x/Rk L߾mM6CVLzggմB7B&jB좪wvmolD'i@e኷4,^~6Kh^ɗS67Fo2Ks,*>7}iZJzWc͜;\+rW=m:AID\@G$WRtR4dUz,gFRjN4WZTDts(s-Ҍb$ JhaLqiԬIb~5U05,"ZL/#mv5kl>=x+<:t(=O#/p@Զ^d R)~Rr*@\K[b(ef$8/deLX]`pUF`h~ 7„k:Jad vw:.᱐k/_Ύ&ҝzEiZlJkV㫥k/X6^ ڛyj~6\dƎ5Ӻu?= s1C3JkNȲ:vw3^noAu*D\R EJ+%&CRf,{znS{++Hm=Ti񸒯kqְ302&lu˃.\N\3ٴ:p ;z;2uviWgWl*/trGld፣6߮ތ?8'W1C90uX9\o.x?|f=Ls+W> ^CƹɺpY1RRznOg.,CZt7-ɖ y}躠v}O4zV~?7Ż}=20W N1^@ nJS5F,SDPPMNx&3If3rR"JkA'u*KAT05LRbFRjL2-,'-HA U̔HPAdep0`ӂHD8j~?gm3pQ3bj쯾d!w6iÏڱ5(8oDc K!g; fD6g?‰/y{FcDǏ}YJlTNՖowP4lx1DTZIFWrm˭@|;`Y^M}ucKƪ1~>p >p }3]I4YX>{3dβʺ ȅ֛g/׿$f]Ew݅[twevr~[=p=$7Si<ƵjPZ$4>}; {k Cft+ hEx`N\f*[$(gp8_- kO9(5В0o+o- Yoz6lXq /m7׳ `yǠ!Wh P!|PrJ- ޲!$U]bVgaQ&Le3ܿ8A&Cvߺ[x!<9HMR!$6ْM;p*%ݷAG%Vv;3RHUF+s{}D{-(%Ak,,t|!deEIAd0hH.5Gxa#i9_ *`-z#[.D9$ :!eҀlNf(j#heiE ʖ"̪] TEߌodW}uo>_ZbƀI\k@~ݤ׵M넱ӯg]󙼖S@Q>ni`(֛٨e 3>.T1T$]џ!6s`?UȾ)n)xLy0 a9g6P4o6V &~ ne~{$|OonnK&M4jDOݿ bLy9Jb\金H4[v 
NKoU~BKAƠ*)U8Pe@œ7 Cqa@4͕(!cFRԬVu~~IQ'3{76fMS@v=Nvvm,U}yLn'Íƶj2eT**΍ -Qtb\J$KĽcGe&yr&M)TF2ηT^_q~oDfwIl?)a֪n%S MZ ^hl6䒥_oQ1ҍ~A[LV؀0ߔ8xdB)Qy:yz9`֩R9\TwF1,֨'@PTSo^Q6QkrBoZD4UoGn{TB\AyXuz}עCi0]m>*i$IHId; FOdoNHVkV<86c6HֺX\>Q ֚ [ƹguJ[^9c :r˥eRX,[ >@U%%㉁ ĹE%DAgPraƃ#eLnPz'S.TsFF 2VivNF{C2魺45umtoy}(XB2H:Ueukg^tqAGqUMYP?Rzplj̼7`UR~X,[R g(#G%ZZUZ\uN'yP^Ĥ%"UE.XY^G̯.C;kMqvRul&&ZYEeLDox,,q+JR)#-bӼ3ѕjI6زZkf(wĨ17(yɒ(03+[lSܵ|{9*T9ަ6*A6mTZh,@U?Y n8y4hqQ^r9tLX(%]k&t˄& α!d?}|pZcHIdmt eCln\UR ՙ#tOc+nAtTԻ3|dwCU^r܎c8~zz}jo2/#sKgk ]W=) ! .0 KP2_ز ic 3jZ rGx`:hvAP|cXc8q9*cZMV(65vCn(ݞP) uwRE]+>- tS5F$\&[jno&}naך9n~w!דsrOV()4o'[qd1vݞZCμ;q)X04!'Du8(tpE],~삧^[e9í^eKm۔M |{8lA6/SxP{` xq5g<φ<`C#f<2Τ :T} K FkU`K%.5N&^F̖(gj 6h788*{Z.w%ߚ%ѫփcE*(dgMTwX v35PfR3%p@8.{OB?) ps([WOQK$%H?솧FPvpBdoO51"i9R['d|pr 7k$C{+?'T0P:gњ:@Ž|>'~id:Iԉu *"E%8 v{-dvXQ"kɁN.LHK} ,% b-kc7#ͫIqٻs<.$fQS]c?~fܰs'vC v^wY)4U_|?v(c1hh(O6ϊw9LB_&6Sr wWODYPa~$ܶ3[(*)[]>k JNHTL72χ,:Iӥ+8@5(i˯]%t=٫]E?i>gyipp% Nޞ_< 9 O{~l!p kP8+q\\Ncq]D p~g+Z]?<_pr;- uӗy"\tUZfS@=z}Vxj"0 (03+|\`tNV1/R%/3Ŋw9cX.X}7b6"%USphy,ʀe,P`%ૃm% .xފ9oy Zjc6\W=+;9.9nޟ1+QH/e#=Q%1Δ2)h9a xg^%VyrϳZ)X~.ph,!Q т!PϨdf+υ)Yf!JP:ی}DQFCKa5:8 %|K,c&ըӻժ C`lBF!Z>^y6ם&L8+ra(#GiL[ET :AyMsI[6Zc-iƳ~rMv y<%şơTd+nzaVԺeh$y#%YeQx!0&{.zRgmD$Q9&ːi.a5(ٶ7SweA*²–,xL4C,!I"s!bHɘ{4"J,ɦ. !B'.}qPJ1='e~}*?>LFZΪڟf]TU<}g.MzzʏwvW"4w#޼it*"^\g~xg;'hT)UJC=w3%H$,If.9:=FؖQ %ػFn%WAv`fK`w^J<#˓ p%nI-[-& "_Ud*Z&KSs^;sZ6n!tݤܸx|Hd^.lZFv,}Nשod;w~@[Ѽ͉eqj]7iy_/˚~qg+c(2iIJedMQ}fđ}$@LK }v}&#J6mC 5J%U\_ Ozp=(ctqFKѧRC֊QlI+Yy@Zd3&)`'j|fehȼ.osۚfVr,Jp2 M%7/BO/\I->}w1YY6VG%vƺ/h9ӕ,C kb4|ChSTN>+u UR,7׍FaV7 䲙6Z$Iթ19mu3u#YuULX>mY85r (^9#JP3XlseI``ZhTk&N$dDZΞi8v3mIs)gnɜ^A1 -dNDgɇAZ9J$u>P% 42*bJr1hɈs $"c"K%JA4Wu!sBnHQ֋k]8^NF~aC ^(g}&FoV.4{ k0lo>Mi͛[iwВ3tC[h卧⺍ɧM^W\ >킺\dP8yoNi???(p=}rI`?|" Mponj8έn6Cw1Lv Ut3p{-7nt} Կ\(&y­@{ߣA0}0м&;>|'㵹Js8zaO?:˞:~I뜟osH6䐼8mΛ ,#_5>U9:Ϫ[Jkŭ-W xCñYDlt; %EgpN8(&tQ"Oa}qC .9 /U?N'YH>ǡ4m& cd%NKqnʏMy]-CV)*[,,M)r耸JɄrG7}>8WaM&8.Vp#`WMzߧԡceg~Q9"Oړ),XyAV }oya< d 7^8F z$P?5$>غ+bP&o?2%Q*{KUTTCi}6Trn̙۠ sAiET")i&7ESJ.F"[I^ѤޯMvK8b5JxGy<Yh|MXW3($P/x`3%8Z8j9 &54F t?f#1nF00V U// ,SH \h}m^P롇WlB-PP56@7Y%TSGfLc)].SڞWg2e ץ }^| e0to{k#: I՗LkU&->xh%MrM'ws}ܷnq/_*'䍢}w(3p !tPx{!EEs*4.y+d#6ڳ}M֊4$\\֕SKB/Aߗϓf7K. /~|ɯ-ZRGC) -kn-{!=<,4#rdM=.O 2˶绫+F?ygixWXbBBo$kw??H.Knj]Ө'77[]Zc?| kĘ^w%z|]< qTzF`pK4!K_yL|qmd\j%K.8;YJP Ie)zI0gXeŹsvtĮ1DΣ28YI!)R"*Pƀ$BB!֐ܳDlR""EQ5U5PJR ;%ڐq׸HaY%iO+QvD *ƈɂ E /hu7rmg{xrmFe+뻉4[AnQF(w[Ɨ!kLj4J;ō9]!XyTƩ:Wo35![g4krT+2Jن)5]Y4UXk䊈1qP۰Pmk,yLF{!9YA5U_(+%N]dP>]NĠwɶ|_iCVNyh2yּɻZlCV9mFɧ iBmL޽TLBZ^gfۛCmM_,Ŏ|!ԧEģLX&2? :-yWXvu^>C*lKU7D􊓲3TBV[ C?]4g?6iG 0ۂh! 'M4'M6{}<^l=}jCjJuMH^pyCfЭˍ.)( G SX΄6c]祭%qqcn,)~d,CZtǧ,A11Ǵ  |QXzKK^ llHnHL: ʥļ1[-Q^rɜWp@~z 2*biEB*-pms fiv fi(sUP:@Qh4\ZXL]`R$nQȕ!aap%z{jyw=07XXl]#~ 6oakltFUw4Gڍ]i Ǡͧ8 {.5@[fMk+iݪ<B@9꤉ޯS}-vPu@P5)yQ~(3\=%k Rtn(إ2~='p A1mݷYt$X#gO4MY.I } 鳒kHIG̦i`,hgm03pbLyvOqAB"!Y %xA̧3-9g"bLZdrA" `qlWTuh|,%W$4鷲|>c0 YCt&̸}wLv'~>>(Y猸K;~ys~f߬~ΪoVc7IdH+Hgc2gyd6>q(OkF,ף:M9BD_ EIw{QoDTd/98G'{ON,<%;='>%za\5z'05*ħNaa,cX!^ap3K jYq@߯ԡAV;Ƙ4zڼW,/2ȂQ@ | 0oxK j:߂ L"Y'(NM=ItoIS": b*֭.NsVpmĒ XN cxR(5xCas1=6*Zs@G*(RϬո`1is'Ƕ=;:q^j9NX[r~meEC,_Ӌ/0 . <>8A^(̶0 ӗQ/GfpF1P8441~8epd2r4Z)N"wnV9nV9-*ez} ϣiQ "RcK)gu@G!6:wMo)u5`N@05XV6;লrߝ1܈eFѥE :N hΏQĦTs*TRZ-TH%UO)U j򟗎  >ϏwzG3%D)~\<0_⻴rv+:=$nFج,'/f~ϋZrvy[~oXR~jϟ?_^3hKL*ӏ?}tO 8;IA)$:^~~=/E`2mATﯔU걥PdREplc\jI,M3e"ᑑUG¬ȼZQegݒ7(t:;OnM&5/(˨W= iIDVQt*-&T]RlM[r|:CWw1aȅ@1^ Vǝ-^]*ۯ.;[WY&;^TR-;z-[!@K:%Yp& g NP{Dyf& dBSN(!-|enTjePe5)0xLG8MR. 
ݞ#8}[2TlwyN#|x̸R{:\t|#2)1D e&j\ T)RB-b*}cc~@0v:0,8In[a#✟ NdU0v^iElDf0GUƄ_$aԫAwDAQ)&o_Z'K U*p8Iͳ|˒TU(.Rsjpښ2G8[0 wG{uܥ~vߚ1\~Zxټ/WaQӱ6O ks?jq>]}J=N~<)*#voRz}p[ryIBqm">v'M&vkŠFtZhO>vkhvCBq[T2w &JQA)QU#QP~"Zf]eydKæuaٱ}yyu/%Y9ӋptШ{揾"1ˋpy{0C>c9{rČq?fI&3Όtjb!э-཮@x!ZԈoaE:P8θ fU}"a oP 9pgJ OcH21PI822A -JqGObl<ޖ N"U=`ALf5l~FHLHN! `эDbV+cB(8a[ާueCxj#HϏLj%SYݟfui˞S6) *rό>K!c|⤌^[z3b i :TjT{k!Eqd%Qd\x&s5oUx~ A5}9$8'E4qaUE 'ҙfi&A wts*.r+2 EH!cLqB & QI0[9f,+T/V؂J zBo{i!v?!ݔO84LEM')\t~bB&.V@a>H!y5H8Ӡx 4  cqDBpq ΊU81r:Z@ri*A#[)ZӜZ7K X7.pVaupS0緟v\@`5 w@becʣ A .mAZd-nU`bղwָ} \m1Go GigB5"דW n|\ xTkQ7Ϊȣ&^M.-RXһ҂>Z A GpOycf(_̗Wg-`Ob-)@R$MJ%^j 2'q/h:|؄q01:'Zӱ&@Hgct$J ^'@s&"Ft1u`9UV }ʨ٧k姑iivv5WP{hjdڕw?RK߹ݠAռĚTbT>G5|3Ea Q30j}T*8 d+K-7JT-8 _Ѣ[gKZH$\+*If'T$1T4{"ͫ (%QHO^ Լph='J<4K> SxՀAr!,.Lg_ya'3p{J{*p2) Nr0)Ý/R]1 y s֥a;{><\ZYu.zxW҇[z?[vRs;zBЬ(n=@hٻ6ndWXz9gP*?lVήSq}ɖ 3%?IIe$s<894n4^@PQ K)B7U)Zw'9v>ZsۮcyuCB+RqbVbm(?R)"/.ݝ͞3ԜL g(td>xi;f!RE) `^KZ֌C4JÁYiUNMYꊗ SBQEF\Wڸ$h&+ ]R!²]+SJD౷S0&dm{s֝ t"B;:Ä#2aCDˉ^ hmiءI*g1ۭ9exfb&`N;1d(DVj.{$LzH≄!xtS*ػ0n{y-#1z& BJi=TEyI'`EXUxQIZM(TMIIYˠQtX5!8ʔȐ 1ByCUeSE]ag=|61޻BQfWBhl;NJeYH=¼>eacL\;PId/ Cud?p.B\5k?#6CߓR9V)%?/ءZz]쭞Y:.awͺ5* 9xz>-VMܽL^Oq9ɓ {qs?+'7zOG~׳WuUfX~;Ɗr7WG|13pS$uzt~ʣ]^xXE5ypsNy,tE(+W~+WE IȑhLyzbG1Gb11h8zn%[ 9r)/8֘vCAb":c4nÝXRznU[ 9r)JF(ݤAĎD yƤO'jr"Z\+i$֎=u`+5ұD &Ԃ4D\+m|]XO}SOG꤫MMHIn'TE4/= V3r,-VUͺ^!*S.撤ٓ15%x7Ob)\ȝ*.+ϗGĚvώWa٭^||}vYߝϮ.&PzgյMN?d궄3ds,A3//c51o׸[/zQ|\υ;Zx&Kհ#v:],7XV[n_U~FxQT SWHA%39)PU@.2K-+X?|/~x5pg ΍Ob[g5NYbl4\уo[>Hv=6gvkDTe:Dq_\t]I ;ee+JsN>FKi;W;$7%vZ,Ad@D*}-fXJV%_ykrSVNAK%@xE9 ol^Pz^R*"ˌ&2(';-;/|XC{ ~q{o?YXŅ^c4O+?Xk8cL׵ofWӟ>/|s/jym`i:@* >fWXH)GDQ8}E$96UaȺ y6\XeJP@绍Pdv۶._νE PNRQn4 `enœ2(#9xRP9* >@WUjvQ6xRաj>ٽ8ݥ6% +_ kM*>B( I) VUѬ+h pZ0,DN0҂1;g~ws^9 ˗ %W]s'*.'=yɧƜJXVp8|ݺ~zǰǭf}[\nhx}m&}UKSҲU11w/}wK1F j@)'9EҺ~YϬk"ʝoHUj(de*]HKP3TQR,hQK6MRпwonoW^gMDxL{ ޔOExfAf u E/8:HvA 1 WUaV,' *`( j g*˒*SpEJmJR1;`yZ~^E'n&!=W9&峮[#d%`}dh;2%D Wqޏuk$>D7 9L1V:`w#&(uY=A$Q Ŵ7"d"k +=\bCl!rr6)"l'"#kp_Y2 ߡO XQ|e-KxT0NҶ3 2**/>`3Dub#,x(wGH  |HA\مي_I #pJm@ @ȔM ݉1?)ڛ)yaI s A>TFqLts cŚ;FhGiLj)3L#/|*cMr) ko@9軷7_'&ڭ,1)V|y':+\tW׫kf| *ѳ{d 37Zn==;[DwZĎOjcG v8^~ԮfPGU꼮Py*FG#TGrY׻MLγwd<#=tGcλh ?βcji9D߂a ڎjCB˗)]ҬQ\+o+^|7{0ACdh/.;hz owBd\nvhz)ڶ}=TSJ=ƍg1ʕhG_hO5/h!N:S%`Mg((Ph?SnEGPY(~Y HͅzF6۷]tC.^IH#/+}Ti>J y^KF0̧zTZ?7ϯ>\ZfgbBjLLio3©s$>PLs./ejŊ$6`:ylHA: [l~@9Å'&,';-2f^0J ͠%ĭ *D߈T Ŀ]LPq~{24W^|Klպ&w쉺s_N䢨c*H{JCLSܾYg.XBH<Iy.Og]wPO) 제BB\DdJq`4nyunNM[<ݑD=kRHȑhLё5n(lX BD'v&ڭGH"pҟRHȑhL <m&1v Dtbh<DPBeW|UAGzigme :'z l:-Go=0~RXyhNߛϷTO ¡=[" 'YBBjY=論[w9!O!qz8OvJ0`A>9l5hSHˡp$z8y}Yg`|{[\ gb{A;V@#*T^bE.f$*S2Cȅ]nw~/5nF* [cԐ{I:h1w^rĀBbL*sFD)!c Ъ(%GF?mb׎#P?gsOI=a #iTo{o& k/;=\YoUi׼}p~5P ܺxZ,r)W\d5cjGZod q5x*zg}gu_t5X֧W^'e &/?|CI0+:ȳ-/ịA"ɏ˪)< }\L2SgmX ٚv($nyRoLII- tdk[J[ ^jFi[#xs ZKHRDZc̦ T+k'fpqL CaO(9c1vQ?Wc5h1ʣ}] 򤷩ӖF$1Yεr' S9hA[ظ 4lAE88pw?DZˬΝvN9*5Gs+y+>R#o;sA!VZAP34 }>ixn5}PqHq:P5}^*Np\*"м/{(R3F?3}{宸R>f܂xh{{[GOŏL/Ň_ m=M*~s+TJp`huV&ubbrKpyR]`ƣG PRN2aэ3 ESN41x>]xo7jN]ԟQݩVQ688>i(t $ 쨳@0Xh?LTmYfTMwצ>-kR3JނU/ͰI+67n%h_.8u6jn'M.ނ3F鸙F}טv_,7y; 9gxk5`k(a2PLyA(M_+qK3n~1E vb&/jߤkɑ(l[غe~}zqQYn4WlnL>Z<9`*8՞{l}Ov7HxGRɱl䨫;J9f"5HсVgYCNj KMbxtA)]J%0 G8>d5RJNG01P1Ekl,7'yj=⑒l[TLYg1^  Ê0X.q|"#c'lF+ƨ*j[N5шnW!֔m?ۏvH-Tq:éywEM*w?ޚ;iQZ (074Y&;LXNm]WM,z$l6tw59Ѝ2faEMk.Lҹ`|Tк4GT +wy,v;bNF9VIפ6ѵx %k_<-'^/! 
~5YL& !3iliQޟi5aṋ-Dr*,TrGJ=ݨ`Жok&/d'-Pޤ ^X3}Ȃǃ8t84\Swq߷OI)5Me>8ZhNŤQϺW jVj6Bmm=pgĨYSD p(┕Ck Qą`-*& ٲndBBR#aFl֥ gWRj҇G;t2&9jN4o9f0ǯIO'/QRhz!LJZ2t "*Q \F VeU ȯ7HL4fQ/ihyּi{nC-Y#vdoʡ/n7q1ۋVsqwo-p?Z`Էo0m s(s 2k!4`G\4'9KBңGM-{LhlƂo.QT%o9sv\sv J3I:l:%i#ݔ' 3Fs-<3^dhKS{O=e;˕!V ӢMT y+FK=~4fE"q +q?>akS%F#ؕ_vqe~~zÁ6Ö7q~pg_n`Ōq 0}wQuFO|++%mBFf s8Lή+ymQ=l%p57i?˂E]ZL"[:k' ]piaSJpE5 7TFHa A(z.DQ 6sfm7ߧ!P˞Jf}</aܗJ1h;V }g6RNr)S0@qQYHU:&Ҭoe ~n  ˣf2nS}9|!T#!lKk BwrjdV?,3W"YBW-1d\Ԉq<+:V Ym&cHfX%ZNDU:8eܘ@$QzMo !LVHQ11Zh)Ush$cU洽 V.cĆPi /qY5"am`!*F&Vڏ+1NZS b t1I̥1^( VKC-B:y 2 R0㉈{RPo#fӑ[ƸV Ăkm#GEbϬl/0H<= I&ؒW3SlRfjMLjVU,Yemt7nSC꼵3?,Ҹ=p #Y?k 7>(ka":,6<&6| www1^_?Wwlh<}NdGі0=&.KEީI^ ~1]y".вjN'#Kjx?C{(`;A1Y٧sА7HJh,ݯnS8CTޮB9" EQrQ_F`~0h}Ux NH !=ch Ck8l'EpN^]]q+6Sv2` [׆im=Yy.B`;i1l)(c..k9Ŭ|Cu `5Sp^\J>D<μb?+vQhXjX į'hz5k~V6L$vM7}TmH&" |N3Mo{R&-PIP)] {m B\( C `"B`t{0&8N7 [8g`kGO7GL-nr㤡tuA:]H'CՀ,DȦ;gy)P܁Cp}ԒGC1Gc~dNMÙ=vx]li?a|p6Z K;y&㾃|,FQgϾd zjaU9Ԁunj]^I א˖%Qgڐ I~zhAՈMV'^~~y Py8|~մDx2_Q F֗%Cj]. 0L{աUcq&JMZ9Y4!ysfqxV4Nhk/tXqb:dʅμA)Yǭ~:PfTV'j)ʘRAeHbo 39Y爢PS̉ Kte[=2^{97㶀 \[&hcRnx|ƅ0֍ݮ@O$ǨE*xkW}O-[m@z}D圮v/} 2ՙK{ VH=DZ &T0zkn4 zI|Mf .!Lڄ6Ir/Fwd4.QNFH+ ^ٖ Ɣ􋅕]2a/w n)OpY3sP9O=dߞ3p I(yaR.`bG9(=W`Fk gs ƫ7رi36`eT P+ DT JT< :ՄcWUj-V^?WbX1^ ֧ Bq&&PI*&-u[}uQ_vĊ2vpvYe RNbJr7ffU"P~M;8P$ ᳇{36Qf .g*kx`:J>&GMo)7>~pjFp\9 {XQ+EAҫKwHCW0C ÌBɌQ,OeEaƋ MO_ƣy-hOQԥBPd4z`}=< Y&wn9߮}f?>lxj;o^ys:_Ňf*8wf_NqK|>򤕋-<X YRt~dEHChhG1Vѩ;Gv«?θҝiDք|"ZGD\2vxPOXb^rR~ޠLkT/B_~]< <):\vqu}9Üĕ{8ةAR)>^LөLiM5%5(:tڵ-薥fѺCb| fPU~|ێbt,K07 ?ߢO'wo82}z՘[BPgbh6CIwNa)At[! 4yh0P9/ a_>4 yDRԒ ?\%K",wKc]*ḉpahx{bJY)f˖~  ~>!4>\T]|V_V%/& dBDK ) ŶEyB~aMG%1mǯ]*-dÐ'p5#M닡8VBx9Hw„tDB%FXjՉMX߭ GQ 5;{WhU4z9Ư 1\d~(KQn,eil: %{I3A"SPmYFlxt3R툄_@3CV|ȑ~x5niYD˸Iؖ{: _g\ړWZV5v2WQ jȌC,j0! gŁ̋`F -F9Dj)71b"ɰAg:;>Ě 椨=s>,jƝ9eXǧ$W#4XM2„H*AӐ̍gf&Oo@HdRhvdvaAD|}MTcssePpɤ3|}qLFj9U* W\-.Ql<1&.Av*<WQ: 3A.Si{t I o<_ 9Xn1oZC`\=ΦW`_=|Pi?|m6D*c찂B +NɊN/8)+ho^ gaE.{^0)uݞ5gۂ_&nqE,jHg%3YK@EӦvTܴSp:V6rTG#ɛ[r64.jl6$& cSZ4>/Zj,l:XXI0tEx4Lk8fz,xtL #Zp)µ\j dL6.m^dbVK,2 #>HH0)ij !2npbj%SVct3X,3b+9|\diX&YT<&dF^1Iiy6NnShX)YDžXK?k 5z yEa~ K4~=`Zd >~lc~v}q *SΝͿ[ߙ,y8H>v%alշ`dhJ(X*&?t㾇 0APA:R}v^#P3Q?3Ny@Ǚ6|G]Ԥ"LOw'}`>CvO4QDHۛml%-T:a%3%ߵe*ɜr3NɮMLnBBr,S Yw?odɽ?MҮMВ\Te`ɰٕ14u7,- /n5ЩK޵ө 5=%{CANi(xONώ47x[Se8k4!5h"]7tmm||? -.T+8CԕlX*խi|^])j-ɛhF 0Pof90dڲU( riu$dX@+f$#O9]Ow}V". 
8H,D+?z1uK`G/e}PkAŎ?&srzB]q->1Y%.ceBDf>"Ubii+khX$ Gd(BTl݊ƐN:b%eaL| 9Nm@{3.ØiC5W*D*뀠:a%⋗l'qhn@]Q $~ Y, CFeLSXXlȑe0dQn4RFPM˄met$i5CPLuv' dL'!!\2,{6w5&Zס$(V2Exۦ),XNiwc%aqg{QI@ϐr_\{T>iб!=˦Z~I_Lc{]du4J×[IS&d44YLBWvʬ |s=MH]  Pdyę#JyoK3 `!k 04 shi18 f^ 5ȴj/=,!K'ro*S̚@(1Lew9aS[|u)L>2iAB~ `mHro/Pás V-.,sI:rޕ57r#lq}76vf?:82Iݻ"%xH(X:::ėDH|p32/jua!D l Fm(j?GnNim"AZ}{z.,䕛hM1:ݶwcrޭ̅md{ҩ]/h%,䕛hMq<]m0w+ tJhݎ:l0V\օrͷ6mJpITfkyr-RYʵdgGZu/2.aʺ,Τ JQ\iΚ὘=B##]rsSCxY( cW X)i@Jjw[Ӹ<^POtT*K=N~|ƽߨ0Cb>B[FzS<]o# Sz<ß/;&Yic|K$KB'K^ k9d*F jϊ~]@{շȡH:[+4kT>Yw 7HR[F=#S8K0}eT͢lP i_C[ZTg Xq%pw<[L6=sXw0z~ZŘPd\."rAjN,"EW5X.#q;8qԳ bXRwj>K眗rn)*}(eG|g-5鉿Y O[I- G\?~7PE.DDh$_훓I5ߜ')N5sw|瓀dK#ۓ\iFq]KoqRI(NG彟@Hk(354?~w4'0{&[gXT[U1#ye啖V.wR| zf ;@rTI՗@]OŸ@Uv%0\ h~\|9aGoKR=FyC+05뚲?/17oy`RnfB[c<9D1e`d, 4OcZh^xQLmI픒-` 68h"@3ppf&@<ƈ4Ny`C=sPz FB e 6fG&;Ƈmf68`e5?\Ek:ԭ>cpŧ ڨG`&p7+-\BjTRuTmDS]T"@qi!u wO j&<(=Nu]>=#LA(+&L萠W$Š %bLNv^Ͽ>A:pF1,6n,|t'1ڶ=}heCFp g:k43hH0Jslk_Z|->N8\j@Jˣ Y@{ye^FȁŤ@aywvcڝ"p0,50̹Ar)'3 @8lNbC< ;_Xzy^u&SPXwb[-Mq.D7#u4t;*эd1 % 'ϻR胅衈Z .zFPG(ڈ94l'21UDb8rLV \zeF)XK Y5NNs޿ZHg#s Rc #R >9G4K{x.r\oaGnq{Ӳ,l.t4A/ӢduO rUt_WoQcCDP1S[<=]r ) w\H1oS ٶ^hPg֘)Bfq죜/Li{y<yþ{m:"mLJ3I!HSgi%%n(-V6zkMDBF r D68)Ν\rS ^"!#w 6K>z Q*^7NAkz'{U'OlI3SX3Q  z"q1IvEІVFjl_o &%kcS$XSfJM8O-8rg`K}NFa,i6kv\%A`j,W*aOXX4*jɖ,WkpR҂mz$i/,ɌenJ4D1b5xu;P!it,J`FʽbB8qPnn5]{xwc`` %zkn.?su L=]C*2tɋKoV2ݯӍBX% NT6hSI z^fQc5%,)uiRߜ OU'ѯ-S'E`ü=fmB0p^U+19FGSf)1m-l&I7VSFijc ! K2w rh)<8diNbQ1¸ IC><{G%ŵq]ݙϳ6\\O'A8 hๅdv8H6ES Kw+L>w4|fl6*V}_lveO4d~Jc`I"8%aS^05|sҹjօHon'dfn263+ vQ+gmh=rQ;؛ŠF {!@g:vڭ',#fqw_T_'vղQc8+c_&%T~Wl*Aܘp(VBJYm6{`/r<6RVXcaѱr}!jDZBE~Nv1$knvEGoW*\n0 m}GIo~=u)7fsR+$F-3<%z=qav~uAx(hwmB85K|V tJihRu!B|7s۞T[ؤAɪk={O@ y#udI:%)@K"춫ZTO־((c-ċ' 6eB(Cޏ:hWjF{3жwӏV+CEU2_fGD I߷g1`.K)atAD ?k-pT{i(z/\@fh%bI;}őԃESRi)Ƿj*\K4]wE{1hui]>.ܢ+1:*B]S!m.Mϙ(v|1/Ttܪ"2߭f1Cαќ̴#DZQ:;hlЮZfp4z=^yzvz8wv_1ΈfT@oq@S6*7 گgƇ֖ƨA2hyWNÙ^1LX`R7?!+S$N V̢Hъ=F!beaH0LܸIPjE|*n6KX)p)0-)6j%Q #{u=} Xtv_<,rtf,1j#V`rNQ"y1Mj \x5.<]juS̨*QMKJh PBSLZdo2 $sWK`B4QY ΥMTBk%zF8اd(BQû10=$@ xF6NCgNp|1lFɚVAhr0|6뮍x=4)cLʹʍæ`4bKw}sָi03IpeHNI#Mjgs yLԋ eօm"tZJuV ޸UJgnrC7Y3xq=P}W0z8O5Os$](:pb֙';o~̯qڴa~_WoF&>,]ݜwH_.e{.`eXI$WjI-nɱ'fbHnm| ^[Kz7+ʂlc%yU|+L拳@ב]Wf˝~ޕ>k_&pJY/XL[Uud!DslJUnmG~'2s0Y:<~yֲw6[|q?af+}M}_\":ZLtBq$)Ed:7}Bv;F%r|&˽Ű֬;:M-J6K@pJ[ w"IS8edB ?M]\%1ݺ FtuZϷ7)-IZ9ƋS !r8< e VXLtIbls UڨUMk\ crH4'I^*8cC3 8%498,%#I 0jEoz KǑ6ND$IAY_y#5}V,[DUff Ѵ7g&+"SxkWEo 6M6=iuo5PrSCҊ8"$=-97ntRH遜zUHֽy-%3[`}=1S7sd,^=D5q˒UaVyZVнt"+Ӂ3q _k,||QY2]]ut5^Gx]uU $5b!N:aeɰpHXƤ% \jks6(×P_nWx|^(Hk#E[}Gy^jA 9^hYQ$L:WyUxi.vqD;Қ;wL@=IJbTLK$g[p9a+[}p' ~%H^e?(ֶQ&pJ~Bź}KB|HwziuOfiW{*^%}3pɣ'ikΤpn~~ 0;3m=Qmm\[ۦO-/ /9n'nܔwFr &aQn~?6 wB@C WkWuuZ;ahi ӂٖ#Aq$*|95*\w= U_ )NUHzrUs 1рz/J"DjhPqbMu!Ʒc r7ZFc[-!8SG UJo]M?rz}r-u潙O橤RꎄЛJQ9aZ }LGCR*1_ejƜb+厶+S9*ԷK[R*D?REE-} chŵ̞+TIBfwSȟܥ&OZĒadF `\pdu&'v~uTۧϫ )eI*xz/wKMObK]JugR7%GN4?{TszA*I:W)d8[\K.ՐLkaLMYD Ns3(VCߖ=}3^*`5' hQפ?Wj'ߖPޠ~I-knj_i#7TGP1o.ĸJ7 o+;*/vd7ZL1޹-*vAumoX ЊWkwY1,]Q5br? (.j`Ł.>x ^rc|:q㷅-2@S]G]G]G]WU~L$wܖQgm)eV(㌄Et  ;eѥTOi7G +Cb;mjۻ-z7Un)kīo2HKDQĺ^|ED~Znv:HF0,n]\~1^=M*unu`gG Fz3C9 ^y˸'TDRIfoa0X|s/6A / ~LDK, tiA1IFA[ NK% XKT"ZjeVI05q0z|Pt, 1k˽Ǝ ahJC!y)(8RKkQZ,Yե81gE=5?=6QJv:[< PzGiG+xnGgf~=BT> ?a?}Wo"P)v'0cOsXn8՚|?Ʃ Q9ٔh]z$ʟpU-/T $+GR]rϥ 3jZKT#L+ ZQNl[;l-alMᇪR fov6!&pkоIi߃}K8@Z˫5``o61ܸHJi6KX)qJŴذ`졞xp9HVh{"Qg~:&4R#MS8x&s(ӕ7ʂYW.9d1)\yCKҽA6G6rJe>>=LּI=6c'>lXwmuaG5WZ)>ҭF\ۛ=7~DF|f\ كA qtك, ILbup}?<EeI$۾F, ;R<-Һ` s)hgf C#3MkF. 
PGT.N5:6RS$p xv'{ \a+q0-#N0Sk(ɐ)O@^4p $D-zNPlko1Ȓib``f.PK.V)B#s,r1Hm }*TስcD.,䕛hM u3wl.)Fv+ ݲ;ݺWnY6%GޭZڙ$gшXŀڹM&B^)aHz~EPPtR.c%,&L˝mK+|V@3fќcQagUuac')l+r4¡J'WIKd)ES(%jZ6'9JDR)tAc!9]FR5P2:Am//|kk_\V~^WΌl' <-:!J:wܑt,wL ?#Fßb: e.R@ߠ  m $X:E XY((ƪp;MP(wG5*CRr 1' x-V- S^DBc ID)=q@LܧLW]({GYUB/ѫxS}Ul}?ҩFShUNK۴L»[X*/Ps A} {*u=o8! O]vYCL@BV*Zp"ę+\I$/l(@pyp AkEuRIiހxE3tJH RU(CdK6"| (|3=MlTs+.,D!)J*S!&CY%n !b@̤;$G8ݻLdupH&m+-,.غ?&`qϬ,w˫Li;rLDo׍*wb;T%.{Pbִ.ͧݙ/acZ6|捖*n1TFP03ZR+8ɇAiH Y-庾 f(u˯Ňɴ K5B*Ab䦢+[_3[2dKRZEYŬRƤDx1܉:-ZSTC/ 4UVjR)XbAFPYR#;`UXi%bRMZvP7#$JY w&X)wBIkJ!3HJND5Ӿ!Q`GnI;A u( ?(I%LmeAt,P4zUJ\T[t?yw]ABN@cu .lf@}||q8l^fM7L9f0ޭ>PO_ 3[0c!ߖ>4DűDHZ!}wJsW( oЃVI52ǀ!ǀ烼f,v֞hц$^^U='5?>dӣO5ZS7#iaΖut4ԇR_W|<``zEZ;6S[]{,=p\z`5w.6@21t }~aI㭣/jۋ}9ZW!j"{CWN4B8WY| fkY' % oQPqtAPCK.K˫L(8B+VjSbH-u$T<dEڣ`LOZ0h^M4xXGmN5wvZv?[6PeQ5>g#BN_'Ϟa:7'_4]sfU&{Xw7ow޾f96o1c6z/_V%J7%̝6x%#/2u˻A|n9;ra$|"%ST ޲nZ3jX BD'v.|5C\n0RH.Q2o { >AEx Q[:P/\DdߧAr-щv;ggj2Sϭ Y\ugJD@@P/ө4ۥ 1PTEl!9'Y@YP/wÆzp gJnQ̩9.6(vm?7ҭ<%mv0)ϼ|z:ugmpl 9 f& M`h=W7Kď}M|^#_FC ҧcnlZmܔFPI%y-GNXuDJ%a\;jfBp]nߋT+T>P-k5s K=;=#(C_@0(l7X!ZU'ą9*cĩA8Pjs1䞳4`'GB lpJ{4/qS+D"[Kݠ>Vɗ;6?T#wBdk}E_ suML&ŹT—!8#CLQN`c+u}]PG@}5/a$FzwĨ~CzGނ%. Wad()L}]QH{z0&jM @*u]]{2U>i4=}Xi_c|-nxN$H!HN8 ( ޯ~ ++dm D00 ^A`bheH0bl D EkuTuȵRʢ*T,p/kkdJJ|/_-1| R2љ?8;G>^u8b@oc{sE .bx]CD)Jt{ i J!+)i(!z(uMLi:VV >Ϩy镒UKnJ=Onbc;Do뙚8YiU;&Ү'{EO/7Ňfv)&բ맳5J#*Z0K*;s È7u?7슐IOR_JxҤ>ۊcw10lp[aNG>"c@C1CΔiDaI}*2 f]b[pB:&8  D3zP&oj ;'.>i-wo]h i ]BN(zyQ='ΝbNYdcըF/] F֩HdX /CX|}!nm cܔk5X&f/5 O7ή}83IԛiKg>A)>ω3y?2Η~KuW~  P(1yMr#\_M.|$WN>#G3FIARca1=Ws(jn3W5+,!WoQ꜆|'ҧ@]OÈ s ~t愄4ޘg_m3$Y  eA([T B:+gV_-`?d1qF@*b[ɛ/cvv}&-VIqs!AE_@'j{!l h#x6ݿO d)q2[??L HU8-q2w<}N%F* m(ḐTTeFD,6ם &Xn8' hOw|zUbncȐS )&f]Л/[ɲa!Us ¶1[2-,rޙdР|2&/^J O`ރ0\Y0P hq8 ma:Uo)tJ|v|^qF;0ruߍG>>Bxr4qKg7,fyiyG!ˡpKȏ3r{عM9๙arkP ܚ^mbηlQ805:Zwjݯԓӝq[_ ]53pAye%q % }N Ç>qл%f!7I4MNe9v9 ‹m]:]eatts ohPp/]C}=+ Wzs8,z}ԅ:'^G.h/ݤo_\^iƘm&f_KK,KƓ.D+”2G=$J&h2n^~ AlLRcA8=FG8rn!svE# s&?;ӧC/lBzS+\2%n\db5ne/Ąlepn1m.Pl@ BAE&^ ԑ,/X^b0vʭq![vm&NvPnp"E&>,Fd퐺o3F._ܷ Lw0!:\o2  `[ 9^[6r F@ |c`;s9maKdn[YݲDdd8FRUdȇve"mܿAQp8ٿ.aqik'UsX·S뛢5X炞WῊ|X>|@b9W>cW͑%G? FgC^jrjԐ Ƃ\U>6hC.;kAǘ>{Tٻ*w_/W#8Y;sۖruX bZ?j|Noϳ6£U $f$v}¹H<_Q2 BP6:=}W:UU%[iNwXZ "}X:IL1n׸3|dUR|l]I()[|6 J,}r+ڐRIwb_;a:u~75; ' ^eC{2cqN۠gZ;:+~Ro3t:7(Vpam;HeVš.| M4 OꖲF ש uMHy;&.nGvA"N L8̼9#|N9@qjwvbu:?>zXk_-ְy к_zK[`j'3 A LZOcU& C;zz'F qo`S;ly] 0a/¨1MeM{4/w^G|u@ !$EP,#A}Lsl#ǐ}%s,& ' x񙌖7 n=YGgG >h݆EGkQN|^"ѯyt 'm_.>?qkϮw?piv"l [nqpS(p\"|P\[(e [JEKR%Ig.J=f91wO.6_O}/G bݨk=_vdqU:^R<$ R+ 5NSs8ҜLl}t;㚃^{7Э'3yE+RR yI~{0>1/zS tRV О㏬Y3ԍT[ѰrwŦG<6>J&y}ۏaHP9Rw28Qe/GPh5Xj~y?%xoiO^pd: z~| 鸫WAs&Z #I{tٗ!z$߯A_6S^Orl_¨?0}򄔚.X׋260-&Ta!Ԏ (IR۴ؤc5kehWA;|7hlDP:/— f x ĥ="}da.rUG#3ȸ !FA[iwo/)gxcl[-[gb2vQ!:g ۟^޼[0)zc:Kjae)uγwp K9YUW)(UPr:݃܁ J+Z]L- w/;wN\(/+ [\u$@JQ0 ;]ծ!׃HB%bKe7Yy@ !((3PL`^<;S!$# y˷o(TTF` &:++k׷u朓 \c/=\++C׎|(߽>MoJFb|ebRmw$1s*n neJjMhyzvhU -n=\"p.r3u@Ry~!S!qM+34d82E]Q:nh28؏!u[23 ` y*(ud4Pg7?0еevք,b$s^o.Ss@uoL 0eC7 ʹVի%y|5ɑTxf:ˎwm|!L 7sw?3gey3>V`W*I"U-a;Pb!w w<u'x!1[ |EŦzژ`BR4eEu+J/o,O$Lw?T-LY}e;39^ey!jR&O2&P$8%0ClqZyg$uJ׍#,_¨"Wč wxqգ`䀠U[UN\ rtzGoPe [k] ~t*QFk#sۖs. W$0aKj$<@cXiio[ BcwCCĈCjA {Z ! 
^'oATJa93.J/' \7P9ԁ)2?O%W׮0ȆHKR\@F=<@VY G[c6_&օ+ йtX '儕6N č4#$1.ʑ&Q%1h1janl ]Ʊ: yP-& $,d%@ `cw_`4=C@f Óvϴ+Ѕ|!aj!3Q w'/25NqBP@"pRMA "]AU1FŊIϧ <\pqʰe sT0hIx[-S'+@/RGYE㑐{Jppb=ŧv=BQWx!M!2$Vfgii4dCεÜ*S =`B@̠o'uZE8(:rc!7gQކqJO֨ij޸v-%i>c^^<,PT /Q3޻k?xc_8:RV7j?ߝDqIB)?{m-ɥ«VR+KR*FDݐ+٩+ wWCrHb83|Җei9z~ oS^-~0vOn'ǭ;Ohy:3-eVpb7w;G9&e}ePmtfN6V y 0XjZkyNɼ1f,\P8a #u~F50޸-:bQ;>|"c7ܱ,-W&imzZ?>}6mX= -- #5 I{*D@DT]A*k%XhO2 9fݧrRg?gMܱ0\}'+󛧹_^.v=# c7\e~>UCgNk}_s\,oosSiN_Oi,gj-k9XiUdΖCU /.BH,Ur1z$\:%=gJt 0ƈ-HD'4ScI+i 9la&e N$J}reJtkVdԞ!ἰ^Bt̰'N{TS[Q(:.j31ZF{QCaii&%H(.:q ;J7D i[ẕR2g71crO%Vy| _84{#)IQ_yeea| CyQav[ݿ}&X/kLZ8PW4>Yd<@k[W! xo9;iL1n 5F24=VF,9Nd'퇓nr"li&.^$?%sS@Y2 5F't֩F*FsvL\(5qaDL2Ụ~`~鬱|5>k^w74fx&INW0!yM[xF)vJoA/l| cygi  Wv~Ҏ«Mn,0'a?_A/7k>(<0ǪArjq2m0p 솻H+4!oƪN|0֌Ю tG2م4 sPC qRjarJWkt7jcّGq r єKO*']|s2Kcz@Psaލ剕 lJ en2 u.)?rRXr]>!^ĻOn8ӱ͏ڔ_iAY/gzp\ȱ9_Kng۳Gfxz/lyl1-LEfl,1d&xHfL[7cGk`KY/aȦO#i 7ٴlMOɦ'd`agƼ[љ4RH{#;ϣi1;뫽;#GcBẁhh&f'̿I}5ܔe@o7xBVv.m߻|-YfQ,+?" _Z\&>n$y>S|3L}Li0B(4W)$"+eCK@!(Кڅ|YP7췷7Z*4;ۛ>nS?rE)|Ŝ7z{򃗱5^uiȔ^(4*2! h5Xyͣ G"fBCBTf->TFD 05?&XE_JUi52&2 ]ו*F!gbRgNܡ.w;Б@"Jgsua1ҵm,.Y9._WVk% rHJ+!h1"-HnR.VU^}x7c& Xf fBȷ/ 1A |:%EHF5 ,N0.ATiM,mBC*` XmsZjRӖU K6!5m+#3ADeɤʤBJ19f+r6 䖌@.֓MV놖sӾHn!D酰.jWR Y[梒6)Bd^[R!i 敛 Pګyo5)TR!I&X0F'LN/\I%2-~cLؗ`ہf`-6@TU(ʹVqÄrf9mMÈAXgݬ-eP\mⱋŋ{5 ,qWMopAzի*fp0L7ҜNdc֩d㢛ltY\Y#Q]ZH)T!9iEJ"sDr4$ceK3*|:q9Qn}úhvI. ѩ- h >M('DR)Bg-[BPz"JPKPF*G~>`:޶A`0I?QAp0ً z<}揮E}S _F~k]m;O6VUЧ4,ě1$d>.)Exol!75wQ.駜2)gһ(K2ҍBK5:Gɠ -4aYbk.g11LUynOvy{[D.*(fy! |ƍzZ6҂hRpYϸ`d i w+j<"M'\ۼ5 ޳t/ %onqͿFC[~/'&~@=K!M?ARXMg!՜4[QF 7Rg/ݧ#YJmeBYM/Gwng\?H*wNC&3!qPc/&T=p [oS؋ƞCbD_MnHP-ʌ ˛4orR᪚)6SNUt5i(]FQ<9S:^V1zD#5Kvgpyg<YF?Z?w".m9?v#.vj9\XyTn$Ƽ~Q|XW5W\ۑ@p룕ηsp`chlC2?L#ڄ@aދF&#퇓n 3ai4ULyYJ+S'JCin) &._8#Gkv52co:#^-r%Fx5w¯-^YotDx&ɞ!AfuXoַ]ML9ڝcbVqAg"ր,z?݆Ӽ i8rl}\[ Rb ĹƚuSiW75W P Pm=n7,%e.˞mqIDnȋTt#~c 2ǥkDuDvHpq#F-d$Δl4l<ꤔ0j4 %]*gDym(piI?ףR=3sNECb yvYZ3!/er3>W%GZA\cmkDdgdž~̰|T+[J34>xu}&R`+Me+ށFDٳ9t~u:@ȑ9%|0\|=q!l+=)@L:3A(6 FFK5"Z fK~5Y|1a̜Q!YHZ TeT(TN#*&*$SMf͚@7":Ѭ6*Wd4K>HD[i\`;6.fQ2%kGZ{x-FzFFhI1 Zyfi3I¡O:8@&0l г*z֠+)JS6^;ZjuNH) wPbQUd1:lKá&+KX{O@1RI;ڿ@ەA5:04J9cR/8fyK?Lmi*0\Zg{ùl@&Zpحj2TZu k&tQ[DtCXx/7Th%30:a֛'Kd$~]FKEhwoin[NҿV_4"+eErPq;vV 7Ռ+'̒5!XKVJ` \Fg@ d||-gkHc!8d< m}7޶+ p85{a"*l_/ v0t,ډ/@fr>Bi˧ÛP=KEo@f4PUS?0냟3Q@ A&7Qw탻n`l4 C2Yú2M_X9}}Hn-uҡn [8cȵxĮXu}s"깍oå]חd uiȬ88ʵǎõw1aBU;2i}|f={&!㚝w$; ̘d oGg'Vbő:R#G|FĠp!63*{1[?*?'xv#[*-D@ Z\Y*6]֞=FyZ ةbߣ a)©хn/]::bHT4Xǭ1չuHJF2)`ӳV{_-y cu~Jw_ C'Wi؝KZ@@n/7)&E`&EwʃhQhC 5xEZ Y(`%L4k.1Va4O7\+_pug5of4,f]QҙKׯ.31XV7Gd'l'h}KV}pK0ژ%exn|aB K]X0YiN@|Ⳑ X0m0P5 mք=x:T?_"fyɂ||` I}/:t\ {<}N^ڢ S+ F9Ye00O?^\%2R'V>,:r?(5-򏪳:/,™F]~&Yհ<ܳVDXՕʪĝWF& ZH{j BJ?/! ,됃j"k,#S;紂F%W8ýhg A 1Jz H$+b -Gl; uZ3V,<,܏ 0biM,"GRqlPAB(I5L$<{\̬u9E%G]drJGBD/dHQwJL:m XWgWc79@)Hp_FWр5ZwW%(²ĸTmS@8ߖu^._]eEV:q'h6o2$p[RyVv몕)1 1&B$BLՁ p~7OP󸡞ЫRJSJ!:Ec^EE3>Ją: /^p0Q54d™ur1)Skm=n=H寻KQfuL̮?\[Ō)J Qs1AYuڧH\dJAnjHJA`*ȧ T~,Ui健V^,|_04ˎT)w?v1eT0rcpd ܙSA_\+j%%wf9O;WEOck`T+8mH~謰P?Om !-Nz-M(\A NLE#CGʒJ0P,y&`ˋ *T`EbFh1.K*vW_/y5sұL@s "L#HcЂG8.Z"GZa1^$d0v>FEe‚S*)񠽼HIsB30fTsM]bQ1 EV`R ?H;LT 5bP``HF$ 1`;iW',`b]`+BNޭ9{ ~)9H?it"2TܻK_OD'_D5xmHA2/tJ|I7Җ#Ovi˹A0򽊏8pw>Y~O%rc#viƇ鞄y`Eyggmc3}rCsX!.luX& fڐu3{aCn'%IjW!KQ.|Q04xJW ݘͿ󛦄j2Mslҳ%ɀNA-ql8iA^(/V{Rn`aA#rގSsp#r3/mp3~TdHYؚA ,JA/poBBD"S1Kt))qF T)c@xc}E;;Y:Ӓ;;gr _sU_岹w= q Y$qk2hQIB(QL#`m;=z+1[ Vf=XOkm]c~,itUIhGv BX \$ MZi_S*,7θd=Q(f{ Z%1 HhNu04Eu$p1H qW:(5.ReD ֜jR w lQ"ki v-%"DhPb"$:0dߥ:' !X]-'![M^h0 .HՈW$1XX@985ՔaϾn iV1Bti{`Q3h05en#R!nC$ r@k1:&(+1Ic.$L"̘;]W1Qd슏7V@[hNNb0Y~6lV6/HjʞH{bSY=Yx::]Tg]|,ORoF|:Zr+n\䳋dtTܤtwYnc_~sɮ"d_> \7[#߆jQ \)9^7aW]L”7xnݕ9ՄiAfЮq qb{e9Vj6c! 
Kq{bPk^1*僑IS- Kl˃ȶϨ|y>-aZ9;S|م|"J8xkO1/BA}Fv>:a`ڭ@s[h+_Ź8'QA}FvN/ nݺ\D˔V_>橄pq!%qD0wEIwRH<n^e-2)VӤo}2  &)cZEKWn}$Rc- k06  jq–#lILjI|;RZ4?DIϜT\)d5'fK1Hm(3t\#"t$KKc'N,BF DWT٣ZWӊ2_GQ&-;M=/Ň ]`(H]&ݥ) yojnw ˿t.ƴje1q'hfX?lo]ꀃw?*[Wζ? DÑyZF%o[0'[eDC]bHtn$LVrcѫY,?Lݡi^ԯULԾc6?2E=tK$ <,8]1 )>4[˪iDŽr+UU9O۷cKd\j}17U^$9_ћG J+̠!d<^h&x9I +Ic9$IcU8e!5}u(gҚ%O>CdHiG`?wg_zRzXDS ϓqx8e'8g^P)TVp(2&Y UT i@p"+yJ͹z=/áW znC70L5BT+!iwٶ U5ˆʉQ!B4&"V?Z.y~hTΔ2. z1waa ]%6Im$ɷUI^E4JLXeH;0qAdmH&x TD kl?x~,1y~1ZEԁz1Z0v4:]ayv}7oq-tXExoWu9%ťaׄh]I&m%-t&QۇN&fo.2)duF6GߧuQfeh_~\>Uǵzbw' A'|\oj܊y~U~j%Hj3@e%ᠪi)R~Ź cxIi/~xLW;OT>} 9+ޕ7m,MH}[JH%TeL(&HŊL;xPc;柟}8V\c…">$lL@ܶ@kgeB㕭l% igq~ HnTlڅ;/Fﳊmvi<~n>hvx+hG02^vGh o8G^xv~ٿ{g8at˛W?^]v_x;iO0nnݚ/:|uNj߮@9y}pӽztѾ'RD {Ccg=*2C;pz9qƟ ŶރM2'č>8 O%d!En L1jk?Q` r3+[40Ќ)O󒛻{0Y4.(4=2qlhBS2Ʒ+h*]ދKWx^/֌M6x[TxoA=/z9;@jr3tܩǺ]_Bۿ_\]}^܌}2~˫nzh^M;@F /`0nqc>cK^oUbZϣdykov0It$ Sl/Ë`82kPB߾m HXFWi{\vۯ[7^ Qa ޹b<ҷKǝvzSǵ/gpàGt^4v/r@Z&p{BB=K^Vȥ4/"ez Bg lGV& +[;'3z0V Hh ݊٘37.в@)/U z ő^:5R |g>vgj,2>7Fbє=YdWss 7b@ke֤=w+:Vu$]:uwWthE!ڍ[`ˡ?;ΨB-}atinZ3ԱRcRJNR*m`sQ9T߫Ќ٣}c5{^"ڍA*ӻ$O%H$=R>$ "_˗ϰ"q΋~{۽3h"LV5޽}XTӦgTcXw%-$Ɯ0z9F`{{V f.F@"2k3schoA)l6RTCoh:aEB\k-ب?JTs;f(iHF C2e$1 M2|WrV Y+% ~ 'ْ((gKgF"oL,Uדo~34^̤ h tJB so5^Y2ih5nzXJ6!qvވG,"EgQnnekf2b9ǼSi~&2s56ă>Cr}GJ8>h b*5 3&孙{2Au2eP2ǖJO,#kaYe$ A9F@wi{]k\PvM\TAѷ1!y8n5"zpه[)&TL*TK+EB deP?YL}q\mLYBRrtKSUӴ 2KYftQJt@*JNQ'LŽS[ovqL1gIݱII&uop|.񙲆8Tlc9^0KNU1l&q)01a0 Z:I[8H}?cCp'!j::#3c)^%`T,Q*n;KB1Nba s9eOQ}J5feX,(ev;΃J2͙m`"+YCVuvrrxO;<тv2:fO SwCf/.ՐCRD / kgf2ʊ/HQ9\)Y6 X[['JfX=8ls3:2@JpV!LX`X)&%cqWJP-֕usjnV7`>,q9#߲Ykps%ttѕ[3\-NހO?RA,fQ؅p.9t/|r-ލW1QgB[Zvxa7rwGc޲UlJ3rb3$Jq3Ln F'J"U&’lbyA-u@{aUk_!,*5ӷ i%K*7hKjLݒw4V8"r a-\U+/9\҈Wm .0`nڃEMnf^zSyK +Axbm[Yvc[)a(T<_e:T VtvCt槸q ӄo3?wduFdVwKS{d}ئb&xs=)qOKү8J3Zq8 drG5bccMl{׋7o/~%ˆ7;N5j7oZE:΄;m q\e/9_++S.!?jL%uPb ɈYçp"TSB"(R'"'\N;20HW^bl|b}CD@RW BSj=V[d漎NGTOOp1•W);BR%+`#Bd3si /Ю_\)FV䮇'TK_D FSR0a8ڥȧi=x ,"%$q 'c'#InP6pcÇғpO'ji"p7|cMOS-0rATrSE a|c{3V zE+yun2f4R;%q#gx D^_棼*GJ?L#3"8FhcDЫTs1`_ I{Xrg[ $S)ArK/=OXkdI.*Fl,v k11$xZ)fxLPJԸXjѰK׹96cJ:ȀFcR'K.wjql:iȋFι>;Ss}0ZpL=Os`&=/NvT Q'#*ּzis,!.PUd)LZEo'xzcSCOknJ*OjjY>++-,ei5-'e!P(eƉDrXaQ8iH'vtŌQuRF)dd;wnx .w41Qբ(]Oewyϲ$ўAQ,0I9ABkͨ+32CS>ӋpVdL'Nܙg9 +[xr4eXɕu6AB$r0c'ۋ1e'p/g}xl6jDc4V=t'#홭?lxuձë^Qw$c 9Ctcʥڕ.F (wl %ž]Y7+ z=}(feyccfV3/P@Mݤlٚ M)G7(nA"2Bf ! S ~ϸj__ TM߹ٳ܉^ b|{co|_ Oz7&u 'eO%8Qr#~eh̬R/a5_{I5[9.2_,wUdb7-0Ąr_QC*>Pہ$BFFf%g]fI*L9G8+ 0a&TH/R2ǂuΔtTB=(Uw{w||@VGJjYyg/c_ո|tĥT &!oz|H6y]ǷhCbn_"o|]ε=atTqL6^%c#)E:1+~x<(/HQ J`;+.lx66hx!B>y> ޽RP}Ln#CYuc" Y>ut0IbaL3>Tm*P('X^0$CdcFX-(PY5.0Ԙ"A F ܁{c1hh#p0'-wdFjx\/ze ~6frX"`]]j$hyBE #ﵴZ5 Ae(LanaֻbuT4^\@dA" ,v7D80+bBzU ATC%RG,/K4*"^siASG\ӗDTǚ1 ~J n,)V0%C.={;nQj1RwǾdS2P-C39 FdT'Cz^VY𮇼`Lb`rQoOaaԍ,ÝfW߯Vy⫾I2˅&tl!ɐfAw'%sTsۦAN<;w(y;Ae]~A P = b6\g$kakL)tib03KS)j,*-4SA )|\Cqլ}ɱ#"a{v[v*n"2eIݰ7>kdP,%ӗd>e{2CC϶ rfwYYY(X<»3g`f򾹥!9ǧ;G<) DtB/'O 8o'ۓ=jNBGh7ZHDőe O҂ L٦x:>Gbը\?\]ī)E1SzXepZtutAgKw9INWۙ2jrXR*~^tF-x}lM;ɪ ]w4^z+SxNqyY˿~N VΗ׍p͒)./@n5Qb":sn3"O9MIn]H ,jtT}WF3j\ RD'w6|XO%j.$䅋hLFKuwh@:gAٟ9G+|A,8O'|zH e* ;T7߾CꞚm5*d,ߴR<(0ZL'a~ϴv(ԸWrVvldCƻq vB0d]'Mo~]v<+;F`lȤ]F7FzZ}lr6ԫe3.Mt4oԤ;Z:H$Z ){YHN'#rL$T?Y 0K5OONŎ䢐Y$ UbRn (mmb_BMqoÇ\*5Ly[I`ͽ$)=Jvt$w\ɶ{drṪk\B>uƑwUE9Hyoc9+nߧ|%!`_śT_6p9ǂYu3Ng=i,d%&x>җhD(wk`YT7~6 j` 7h5DΥ%N0]=J7PC ͤOE[Z ,YޝD"+ԉ_W(]s|.l=G/CBe⛼W 55 9`C-au68~}ˆR1TQY-en>Yݶh4WW7AT!xT_Ogi5~[Yꃩ&c{*WsFD7ϗG\! 41&DaQkMs'l)m_oV9M3(+<؄2dWڻScfHSq98)z [mZ38KBUZϽAI&Ѧc><-M/+yN*uO{8 $v>(0-X%8qx"RxI'ԉvoCC"iFDSPl^ ʹ`FpHbĤ ,%ް:[OǑ3~G@/aW χ/fSxߍVnA,-_5ܐ~wdq5Xauw2ѷ:2^ #\te`),NRRdHa2D P-`NjRGob`̟dzk͏[MoDוDؘoCG '! 
-AN=^MFz5~STU]S2ͯRXd &HR̐;P e%])bY"_)Ɔ˅~SlNQ# z􉺇(}&:F&naP C.pv`f`c sh8iU}sZF'reT[8yyj$%aTl4Ĉ+K|{`[WźB'c\X]mqIb0z٬Ϡ,?đTa M")c4iouxm {.}(@ܸJ{gKx1n{ NX6&FBђ!"|[m=6*`33(w{xȕDwxo-^.HPV NE$[D%`:Efi>8*PïV J\* Oo"^q(3oy ٫Zc\aG 0-+!pb"찐)ψzV:1wƚx MҦ,L"pM5 RYaugb/g-X?f<3EU.8IP>3=~(JB̏%ȝm 9 1IJ$Xn_ Z["x b,V XDi=<Ȝ2¡ N\~:=~30 ϹMRRq`OM &]<^V+q'aq'gq&Gh%ƾ/].+w09^1'LO&i*`pjdIFW6Ssqw>Ncl^B-%Xx>sDL3K#VA=y1}s Dvhvws+w>E2,䓢s}/ǖ=)Rf5xp2pFT?^7%wmmHZhUMMdJ6y̺p#K(3߷ArxϝT”+6C׍.)Žq!A3!iǼOjRBXx90`1 o#HeW?girãG k`[U=?Zdᗏ[fK[l}\EJI`O;?ݹ;S'ku?{فBVJD9}^}48J$@eU#Ma%Wi~qR0Nk;Zw5^b *E:o9*2m`刽4^:tVDaxw/dcN\~a51˟O*Wc;̔O 7SVȺG4z޸]?uY|EAlKߍjPHOT(w)_3t4:tWy ?Sw˴<\ai'mY nnׂqz6VLM0$zMafG bgQÈmz5R;B~\;f{łu ZJˁ͞\O,"c^?졖 wvWcvY?U$aye>aڨZAf Huڧ hkOj)!U< IH!گ#S4B`[/Cv-zzmI+ [Y܍Q`Yws.>/2s?6=~xwϝ~[Zwǒ]Ε~*=wSr"A&肉Jo~}޾@5~i-gWz8n{II9a+Yo?|rq2ڞ".ߡ bˆZ1N__=XﯖXuWdބVwؔf:~jxl7 JP:%> </NmnRTaa1yA0py}6kӚϣiİuZ;Ϯ*e(|m-VOnqGEy^vi낮!aC Ϝ[=Tv³vO9݉Wb6 7'шvFfR3'ӌnx䌪&JO7b) &OqvSRfqya Q9JCq5%tcjqǙwx<ռj8\cZ us/ #e87%JF:A$x(]w6/2 .m˨e?;'w6㍘9&Fˎ|y/1K<+fL41ϻ  nlHk#Չk6*3Ӓx6ԯlԑzYl`P 01KE\d(D3r)3VF#P;$ "2DŽDBȤ:i,&'t4ǐ#7ڥN3 \d5djf+dg k{2dJYj EVfQO\Rg+A7\ [;+x o=AĔ" 2MEVngyY9Teegxwj I _"^"3Kgv9tZ0Etĵ1W֥I;+EImb 1hGҁ>y AcQy !ZiM2xn|D4zb{-d[+w,K sp2G2B:iDfH{]җU6D+nk&bjD̩iWƓUJ@$f ч1$!2]^t0ķ7{ML #V2IXeYadj\LօVr#1yjq&H]2_\pP:;( (%?].3odX/8{⟯/!wfkwͲrqfyHOi/Qъa6p cp!iV ί:J 7&Td$N ֨!W'އONgei0aAfdoM~L [ _gWqRVJ*np _!~rikZkSsul{3~~JSpjt(3rcIiR +=”6GortlH'Sz'd(d}@*@Mi6 rEʼ_SjJ%VK+Z'5O>QQw:[.L{SNgw9ed7}cgeU w J.#mLkܖx̢f%Py.ZVȚ$ɱ'~iPKF잓q6Hcf)zObD&՜Wܜ\5z3I1{Wwd$Vy?KC^,&Izͼʞ'Li7-T>[E5&D-rX4^{̐%4y:K ! gy eɡ᭲(31TCɼWZo_ifB4Y ~ФAOI?j&rn܅ =c3* U wO r/<5C>uM=w3`WKѾ=w3lsYC{5^E(;H6>ZMPG&{PgJ@Ƿ5\ YOWWw t~-6^3]m=k(Bx4UYVY,]C(( }Ӧj2q+1k#2WLbVosܥ|Qνd|ye~{Ggw7$_/oi\@OVd>5)eڋT8O'wg6)Ëj`a>c ' 5/9eњA\ eC0fxo˵*];ztY}.Ri˧nٌmJ G4Q CZ;ڵR蒼EKClS5[yt3#Wn};;裒r#FJЃ3~7Ī{f6ҌR?|X:{Ѐmq؉muVNGgn2Q~sA~e|y}Zwb~լ4 4ֹrhgFgegUv|úgcA0! ˺p0AqQ/گhsZNt M@!xd ntPQB۟As0H&ߩng㻝ȵeGhy>UV쯓-n̺VO~߿|wr~eEM~zHE}6uaGlI;e\?/ݲr;5뮺FNa;bN~OrW,wQXmDIF%Y-l_y` ޜZlh{X́.j uC9?-P|}7P?:C![h+7MP:nb̭USM@w.{)ݲ$ňe.kݡ?E\̌ܬr~qo/C<-f3&cR%) k`V[%#NHO8d>it~ڻ%}v1Hj(G|^Eء>_=l\` <˾ -Ao]r#JprJSa%~${Qw3wÃRaZ;븨Uû;Lްjb5"m(6Dف:gGفVRV FiO{ 3Mi8 @S)ʁ6+en'ѭ%/6Ĺ[֍g+\>̈Bgk.rGitfudԴ`}cd%E;V :~-u@:y/~*`.`ls׳(lt9 ]K˄,h )D 1ɣN)gAvs{7RRfX7Y/d@K~l Ɠ;Osq-QK>,g8>Z"Ar=ÀɥBv녯}w>^/W WK{xQZn6= gC#h`_uIЋgYʏzpA|>Q0K)4,s3\$L R k^M^ L6K`)CZFG Cp:G *c%qh$BsqLޫ.v>ۗ\Ujz_S~)zWV8^-(W(ȅI-~~^' "뎋Nd`t U佤D=g(I#7SuX99K7}ӣW]LyVmYw}kIzc]x0Xǂ>)VzP,z2Ur=2FX&PFPcD-T[]Q#7>iJ WgZ@NQҒ603! ׺le4|X_Z86;U <:hߨo-`sDZ;`t*zMcQ,83IXQ"W^vLss4e0GeEu>lp 9M ,;5~8ʭ0GQj|ʼϑ)p$9%8{H[l$^{`I$*2 xC*N[\>CEpEqX,cT>G-f`ЗMk 9z.iKQDƘAJ!B5(.Z.TECoz  TJ:Q%nlA%,b4r@^h"p1ג0X2.+ګz}GOd$c2I(r-6d@ Ah*ܠy?ɇ ;UKbRBUN>ҟԪN%*O%!HM:eH/]<8p5Č.lܪjGH<H rMx.2 ̉9$*-eGDˆhB&呔HYjC!>i9 D@xc!:KҚ ,z+bOsExF_Wd=*D8CGRe'`c(zj .G*srZLε`S,@[8it DР7$)މBASbєBB笥)Vb N )}XjLT))biJ:kiC B O4Cŭ^b#j<"H wdeQYS"W5'9)J9Zq+11WH9]2DQ[R}qm~ wMC fMyل{o  R-xލKƕx<;^ܶC{/޶DPp S"'g)/{=R`M,z b') BWn]pz)+A5[^~X1^hM({[8zq dc&={15G)v7,nzC4%O3Qj!v&dEJL1nh0 (Pm1 Ŕ)H#"= HQ2kE& ވ7 kNQG& Sl#LA2jjM4wlAt,G=3<ޱd*R/%Ca׊NdjP70R/u&G}',08ap~^,SSwkeɅ?ee~vGg ']W8uJ4r3@<8topV}@m yNXQjE|ǔO9ma){\qlR;J:BBIwY!O]u~l27[׻Q+HK^&6oyCf9 =Wl]pJ{S& W$F2<%wr_t_>#^K777=&hv]~ŧ}&`лkw cC;Vνv H3o']r__}rZkFwɘ?9ơ_W5U{GfeNoIf(sF0+zl(J?nM$ Q`ɵW&Y#)BXz[z6.B{G%-hFS78ݟoie:O/^[N+-}dQyg! rŋ׽V Wk[Z֟+A9lgOiiuyZUj^ӫuE_{IV7g%8)y`+j(#e-Pb>oKuM6ViO~1NAakYPBr4(MyE" yYUqyi Jɓ6umcyflN],%Zg]VqHQYқKIߵ/W5y;{ۡ]6I􎺓.) 
n) {>qN/sr4'<"BD|;Svvʔ{bh[ŏ3p.QiaĢzyLr1g)J[yث޵#bupUEiw_ ^#+oQv:l<:(v #.dWKJbt.tB7AUM>Z=}ו fh8Z-UP)H).X{ug{e*Z *SrUWXYJ2h!y*U|J8i OmaC: c*_n.:<}?]O&3Zښ_x N{I!?lUR(]mkcyAgaw/J~?⬞67NoLYN;ROWh^]\F{"Po!)Hyqჴ=66i'ވ|&ר'S14+u=ג[Pu^) tB_zhs~hV/7blw8f*Y?UZo& a-[޲U\_$aQ+}T[ Ȭ4Q?xu͕gsv%֜Uo; }~týbZ0ʣzz',cwiZ2֫?Ì3{;ʈ v#~i* #UwSԜ^"ضF^|I1CJ48.yJk|m+( G./B*۽/%_\S_e(ψ+adݨKF3B" k#/z2**mRUN-@Sy@N/"~x;#$E ~Zu?_Mp j"#<6lbC6"PV̍-hA1 d*ժi T sG5)0UD1-b{Z8L8 hu,L5Uc"_RC$L.X&PXbgrQx%*y8x!ĔK).Rh  ZTj;U'HM`% EU!;%ҫJ $ =h)\jKH[sob唕iPj϶FgB뛜 )ESM#8ʰ*Ԑ4ʘAkNK)l4NI*/uHY6Q!`l`HRT`.EXZK'f9V|Ԓ!!;eԥ`VeNQAzM2U"ݒkЯRXj؋?WAmɅU62eFT-=GY=d)r欤ގ㤖cL2D@?ZtfvKe %%E@KԖ`~Ԭɪc Yp3w3rxUIRJGO:m{L"&N# Yk2b2mI,ɔиqHvYelSlm=E\v%_Q62SLV´IPSxZtO,S΍4>uđQ:vLQkTK+bd%zKݹaJHdUR8H@X\02kYLH e%IJ"um9tF'5th%ЛD麭bdxJ%^[g.HL8 H°]AERo =-[.W{oϭVٲﺂTc9e?ӻr{KeދBQ}f54yq_ŅڡD^qҷ-!V2,wyc"1*,dHXqϾ[)26s/cXOo㭯 0 AH@|ߞzZ80=S>c:D?FeѪAQc?L>pv0;ȸ7TՁA7'ז<F9z@Ig>J9ooǪ5Ex)ꄇOSMy4|ǂu4wj"8EXO[hF@;SvEVZνyƚժe &k7I9Oj_9@nLbi.i]>\ܥz.1c=@N$j{)y-㘨R`[*4ovn'Mfz O@xi$)}fv+;~]cHX]ɋ:Dמ !I%TH1A%'E};,!Ͻ% ws˻:bͻ9c=;Dg<ƪgǍp G[^LN{s#ӸUy94[(Yx' +>v0&̘} ̽Ny&[j>S5$[5 F Vs "堜GV+Mbd)8F.sM@s RCg;օ)C$M"@ r왪"/L>TtHṞrb!Z})CI`r҅$rڊu22C"&t$dj^lWHKo1XC.dCrWSl j 6W pVGB<O+9)W&W9Ny=+*&kI?ѩK D񈡰$:WzX:LlP51!mn_@ ;#l/% ?Y7~?W$uDna WaFEm {;ly9/W_)+^& 9I`}BȄ!T*|lwi˚Yi^"%{8~CN{\r΁x⎒3:*IEwM[fi3vkp ٩/yX^1^yN %Q=p珨3'FW>3y<gFz>3ohTWB؞k|x~xg{UHi-.\svj1A gQSwY=7Imf8_Vh;v=_w!N|dhmgdg;hxOt<_`kKN{drv7HqR2fSU\%q"[)\wrvL%HEhgjGCa]Hf뭡4dv yt[[(rN:.h oQ [(lޅ ^+Eڈn\O0qwnF}`у鼅rƧN8mp#n7潯QD#Ӏ )Z<wϼFk{{\\7I @ڝWw, @ɤ ϟ-xsܧU~2\on5 $2;-_0wn~5i[N<[H{ӛ4S˨ƽ z'bw}=oo*VB'Z$2I'ZT͆g-U oJҁ|aͥ/>M/֭֚ŧӼgg^M]ũPȚSGjJ{gS{4rv`0_6=52uJiE~DqWqCݵNQT P强:*)BYV2jDmn8g@ M]pxA5o[qfjmOhKpv`=i59?oU̞],іc;s8Jl6' A{[9VV D˜a nþ"@;Ayq.%E>5¼>2rJ\*8kn(4/1t87Ho!P[CJ-jiw>*Bi2IMd8|I*hb$j)Fp;b=beTp b| ǶtZiZ6(a6ϔ a2ȘiuW2HPK)}L/hLYI(e(մĊmjalmS計j[ډZ Cx-@1Qhov*,h4#NUtΰ赠=VLFgY$*8$?r\ mTxbT¢ Z(QF,aTgI%g -h.-p1ZΔHtR[Q*0@@PB&4T#Vqe\Dc1 _{]QTN0/&MUg9UʋﱩŃ씕F[E%\:X]o)!ՎBsײ#1bI)ش-r{ N]osf]_|~: |X;>F0L0NI0{#e""^İvPbw:82dct7%|jN&+8x#9}'W􁔜襻rf^m E/YQ?:6ރ>\19l~ D '=ɱ|R=ۏlҭ@YYp4#3=x8]hR2qC_w<ėK}?TeT?3g=F { J!˖o}{@#. p׊M1OWw('ыybJm b!jdE"}-N"H ]I1ۡTk:-*ISʥ8j!eeR2BN^Sz(FE_n Y!Hk? |Za* ?=&8'tRybQ]d,/6킨Ax ~kɞ$/O x,H> fw;K8cGTQF{{a*0BI)C7Q9RBD͖ˏxnϙPc2V{Z lemM+M )"V4;S^f%gu DF}_ |,x<V&|LIi|*0mMK bNd8aڈe$@M &fx1c}cr,0H @v"1Gs2˚2k,HQf1m-Z1TBhM R}tYY" hr{MR%^02Z 0e)u-v Xꈂ@ ԌGa" }* m Lg7_R ҦP)|_75nբ1&~p "8EuJeػ.; WXVv_I#oa"j[?s_=?>6Saw}M,M>J1Sh!󳅽)Fã3F(\]%*v ^vr%DƑϵpw4E5_gjВt}sWs78n|pk./H1d @~f03MTf!n-rf dF5V7;O]ȵs35Rs; !x 4DFqP[\l2K-8 6ʛcٖ3 Hűdy lBWLL׀$S0M`eֈ PNܫ P5 O.DDRǩG$x lOtT3UQ@x3w}y il/nۼvBk.!"ʢTYϘƄA":UDuLsŇN PS'u i^QR*P^t;WM&\~]P[Wy]-Z*DZ9ZE3Mso|'Gw?b1"sڔof`{EwF*e[W˶벭m=<{*BK n+bWWOn2-A3PuﶩտWƩe튜*v/or&,oϗ.j`ޞퟛRn]}\U?}p GB5jHQ惾xM&_#tonfPjUUC2~WůwNUeBg+ytn[PiWt ѻ[rҎA}Fu;KV5֭7m UNvA&zdݒ8A}Fu;`1muۺf4׺ !_v)|˺ $?VT{oj#PLMМVO)FSh3AӒ E>|ʏAlf+)`FZ^og(LFx{]Y7GIgi7|CuK j^^} m~"r׸ZlTFZA4jT:JV+U]E45jYAʔs=X@ ',+|C'Tk.6AEJ1|@'Kֻȧ1 ӵ"?@Y^YyY2G܊|chZw=:rB0aNX]v{k7ċ$^!4jfeA~`Yu)N 14Re| 2O + G G z}ڇ0)%E}˷6Iጼ>x6"Uڄ7w}F ~5mw?.C.k{|wS24 RJb$|I1n {A*,< C5HP6d ;`i)Rx:lYkٱs)qs1eȽ:nz0%AQ#([sz J6* ь)* 6"sb@vۗ/}4@1NW.͐ ]}g? 
73䴭Y_ٿH.}ׄWѾֿ"Sf)g2 PåtfKΫ3x0sZ3XGHtŸDS1>J!*屨LGBv}4S Z*Qw40ZԐ#DP1k#睩 կT!@ sG}:fڀ0(y <ꄗu=A* 9wP|D 4\Nh$8IF* 4xt?5EVh$7gA$P0N_VySAV^R4ֵs֣e8U,W)`;1n3!feΕ>]~y_FN }%9BTs?eDj3}blv W\Pա{JD6pXG;dk W|DU+fs @08٫xHaޕ6#"KcR@=-M oّWѢJSO?Z;u>}qd֘wTbx2|̊5Gُ/yLڙ3in^P.QqXIQ91PE @(kR D]!e]j_\o~n $\ή\&v؆.g@`'g36kW7f7좊n|:peģh2 ,M4ʦZzSIoGn:16Bۈ͠Ѻoӻ7(br+}&wՁĶE0`-.wKa!'nm (v&ډ >7xcWTA,pYi~4c\}Q)@^0[rz= yTPzmIH8s[av|iws Q:~xΗbS^lyF\ OgEef]ڥ2cRFBړ}0ANW,A"k2 މ}Xd`9P-N%  i!i$eT+Tt9{ўYJD'\atK]U:WlNNbw^.ܩˈՖ[~Ė7%8ޫ CU]\+>^eu* J,Z:3}YLR2ʠBfGx.@k]i`] z,jga$N9Ml -"!ԳtN]z);Yo4?!`Ir k0noK*ɶo'wcD#sےƿ+Bl͉a$_AwWfpJD\& /o-F _RəoTpq$GqRo@(RE^w~z|]<.Ob{9'%M/` RۭLG,a& P/>v^D' ɧvnΟnW{ KθEIn+J*w=.\Og2ų}M"dLCm9R'bjG^Ф}wjՏxn %2QYB*R$zAZ'>-Api08y]o;'11t'Z^y> lEm1Z NREq:^k(H y@!`*1gP 7}&팙:-dPS G 2dȪہT{}kVF8d94WR* 0$ K1׵9 c-k*4R=YTG)P, 5nG9)T혩)8kcLA(*FT\!=̞D+$rA1r%= B <>VH9 bọQrěŽfD[VvT o͹,-.e5/o !ʍ Y#Z.רfwviɶsn=\r_ ;ܿ^R%ZioVˋ ؿZ1%ԮlU>̶h4wvYk,jYq_5Fɲ}zQ׿4{ɑ9_^pAkK5fyALX`T?<捰5Oujv̗!Bzb!`VS$v9ʶ T&r޾Hz!|;V JSҪKAé***\۟oj!xWŔf=yL yC˞iϞ9z tG~[nX㦨[|9!h\2l[]ۛ7 eʆwFDܚD]H_|+QT@ģY0x4M< @B9k7>4ZB#`Xjo k<ueoZkOӡfߌ[\+)#p4Z>d}һŷڕ[Ҵį#Y;"i+X7$(IGΙZ4i{w]޲ Ed\M$7GGB L4h9!b[` i9}z; )AGp1>q@Ki;qg/J٩B:EV >*%}vcz<6ַ{?&H]5hgo: 7: bҝdrPTH)g z?tՕuegFIJ6?Ym90ZN-Shݲo/S*fj~;u[v& l"xr_\P̚aєp?.JcVǕ-?v{Yuz7o(iaNʢeDPU5C6cb ry\"؆.g?741MkZ5mV7f߱OsM9vǷjb-t^h\J !rTJT~ ,@,Qّ.n7DHNQɁB|0a|04)AbscӀBuY"N;KxW?~{^oʺPuYjrL9ّ6~\ Wp G÷V#1zmL.vKtgBdRggf;t|(OyޖC97p虂s^􂝖R)6C#̑"µ3kgۆ&(gP C} JiWĔ%ZL@Mrބ؆-ǟw15rŕ;HZs>X؋.{V BlQǪ2#*RJ"1t{ *2]5(a@i} / Q"T tO-z/d:]'bj f¤tSԽq,G8Q=p|J棭P_þ|h!J1ܾ?ci82prSO7.[ᣅ/Db ^$erޠKޔc⤵8,A0IJZt:pN(%&b-۬4˕7~f7W{k )i{JSdQ`V1niwPV޺h~ڤ HE#2W+:evQ}epNIc2Ҕ٘Gƒ<<{a!΀,un 7|$vG}m(5n!]^َW}L\PjqAK0W *YO>qI@uKl=Tzug"iex0ޘ;/ڝWSZ7}4e Usӏ3D f{!($n*ݧMڛKc.0-`gRvVdMYrpj}܆ݗ 7VnǴ7`\¢< /^yxFQ嗟-"Ŋ @#RcsgݾӾ1Ktp8Cʸm!d$MRe9+clWB]ktXΞ\WIi%k\6 =hI9DqMן?wH:#DK!~ty3Q?_[xr?՘+>,cPu(|{D)4f~4͟X&bx)DdMWy9XnbǓrzKR BK7]wߠ7|薍o."4k =5Y Ïg@ߏ_X`[#NQ>MVKŸ~z[đđ!Yk")3NSޏqi2{Nv y=,WV 5'D1FY< n>f< nVq)2;bC "CP:ӉXA0,l*YZ@͋ Aab̏}N&I1?M ؅)ϣx<]q{fSzF] 2t-w\bv{s q/UR+''x2{pm?q(4y>h4rWɚ$lެtzqjOw7xe@~k8ƲclD32!1&4Y@ Xla!"0So:(yEv+dm@S IȖAN$5+I,',Sm/YNlbDғ )56{qfC=djS0R3OSL1 c,cM-ՂQD18Tc *yHG"#E~kU_Meko)yӆ%Z3#b&7ϮiĴfmwKKl5|N%Ae&!@8X UGri{S,};oh6 S)Aj ]n>MѼOi>[ i{_zٚwIN _7מvIwGazNHCGYuI)=cT{d+w X| qQKT|q3_k}@[vn q.w ו;sܒVNr_5tR ' /$烈'˅M?lavHX^Ж^oN&&d}/M1f (͌XD%B u 6A@<:hNNtp6gkc_ck2!YREK3.|K|ւj[;xsְ 1*,WU2 qIKu ݜ#ߥ"m]xߵ]$YQTR[;WMZQ TA<ݩISzJLQ _ O#GL^~Y<ß>꼹4Fɫ3 rGJC|2 4:̀ѓj=qcT7\a"\0nYeB2u R GDaaIF@2D0KľwBv98]SԹә*!'齈0 %A)9%-m[P)h D'-n juJHǽ҃Pupq j^~*+.rN'K\ Ld0XtIc`Dn^].!u #\;ֱDyq2}񦁳+ 7C)Y.F*ڊtMXgmXX6S%z9&7(k>0Aꌾ^4uE\㠊hsRk؆K{/fQ~ȜfkHȘWBY_0ÐuPǹ{bE(Ƚ0պ=L+]G+Z_/W/o's/fV̽/XuY(x{|xaFk7z9QO w_;\Jr2PTM[q­7E7zA`tz#I4kޢ +a*Xr` 4~95'b\Z Wo'w~&AB\uFi![iەs6DbLKr+Gvj[yݓ6 TDaƍ8RX\ 0 FȆ!Be8 Tn+N2vZi[U( Rf "RP" bXq,NT.%ԥ}9{rIiқ\`nO>ƫhmU N!P#D)~<{}l@ty:}bӏ~~/3x>[7//% x4}w:Z*o{/w i9 76GE͈t!C\xr2:<]Ğ4"RSLlƒw2v7rr Tv R i$N̍Y.~8O`0w;{)j Zճ!EĺAk,Ο&H36n24N(dQ+F;r7vBRJP(;OJ%c()̐H+r8NTG*HO+#(>j]<-kb"18aUrO:DIE HQ09R RFo'FF 0%TÚֽ^A N\m-T-År r6R,Xq%SI `5P<؄*-'ŗjiULj}!,5,!jM)BU2 r L<{pB1?{Nr/oZОlOsBP㩫>Yr9u&m_f0hHn" ~1LҘ<*诡kaoBY"U "u~u7Zfpr~HDK?)@t_\X)I*X/(LZGYl4o ~?-HW{C_=Hi-~(`*rҋρ AϺX Ԍ$%_U'X6U8V]*Asě}'u$ 0…:qZ˛@5\˛?泧WybƫB͛Ȩ)o:߂!D롃xxu@q U?ZL沺: Ǿ:Lji]kW 'ZGnqR5S-1K}tCt͘ZAUXs"^D5rВU@7|FD({t-치? 
ojkks!L"}8"1#_QoUPʳ$(Bt?RfFD!T}bKu[䋄Tt 4 BP ʀq##K][̌偎 Gԫcz7+^~q` f,&k3߿jM6IѶ<ؒJq*jc?k'pհex%@J{[EnmB(D*OPDa(aQj5RT"vڄJ*%:tcm)ï@[(/pnp2gwq>}vh@X<=H2QjP~7x k $?-}8TD|w },AD ӻ!rUt wJA5|{+° "X`jxL@ 4È#yֳ;4LĨ󉺇 +OTc9͵NA}\e7lTq-1K*$Wl6l`=hg~ GI$y*J*E, Iq`$2hk4x!.!BT!졀Fk9 ~"PZ/`!H+Ή:@9;`cp DP%WZ*eI<2k5)}TbȖ?,pzKK`2;6P" +]Bg6gLWw4C <(-IN=ɲ' ~{JI!Cz &/ްYg+!r8yIfw7#;i&cZ19,rp.d4 9gxސTٴTc"wV1 BTBr*dk$NI˘hgjC2+el'~(Y𱒵OU}*ո1%$;+UJEV$,Twm {.6ӝa0iSP#fvlQ^>9w,)$ l۹OnD9(xNCQQc*mv|P<|PQWpP{5@fkV!NYݲDŽ`CM(eH#t}b]ZTPS$P)vjYj3br"6+Y6CEJ+xZ@t3|uV>^4+׻MhK0 ~^|~Z:ǦXPNU9W(ޙu iݚ7.[nI>OK]-ϓ$OXX}p2>v0Y.YU}3K?‚|pصbjY >t]S_9CuwU̽<̢%|DBE S jMK.*U ?) )="/QJgZNJYOԷ;PsEJ_rpvKH~W9m9q4? (N8ipJ5-ozwɽzk\Dd{ =yVh]k]uWOi j1,Q7,\bJ)S L>[X)\lٽxF9$ Q{uAI}BR7R ϻVJxYb0()[#r1j [4 {־Ś$_6JW;oVDtd,0yYu;\T:71I5[q@qp%i FTrOؓ#x$xk? B2$řJ\w= ჏,1J$籉##Z3XRP#I8NAuL* s.'ɘpAlhmf*AI}dz[^PN:+vT}$95wPtgVzPn_Bw>Fso[W֏ZnQ^/(/b>@*9zc U,yW̟h`rV^H=|ӳ3s~_p;tSt]UN탮\W_^O5FTOK<96Sw'NJ7Xi5)D.^,8ﷵJ^O;cҪT-{U*rL~6F%I,]vmLXQsQruMWTrsT(vaW{j,)+S"V H"#%T4JL"LT h" MJcJHYm L@ ڙt},V^hF2k pMI*^{/Kat[Ryrl9q5Ba1 d `QQ2Rw*;Cuc-hejSPj α,ن;U Ƌ7ͼEqsQ2:HEZFGxLШLRj,;V݂ၔ77XwǷm,/Nz%M J#jX{j.v;ާo!xG7/1OS:9&i.LԤ-F^B.٨hl)ͲK-o=fEa7̪K!BkУu 9mZe8b͚6]S%?Ѓ; ;KӬvrD*l ,'$6N 1sEo1ǂ&Hhb7oT`^E!"( q)'@?.-)AL8X)S?ba=aq>)ʕ Fxd`ϕ<7pq:EqlȦH#N(M,2F"cE*Ht-vw.f~gaCϻOu5Ad!я~xD*1Ɣoχ4 2f4:}zD?z7[X |N'fRO_|0TP %Y.׉ؘI07#J(`^lc9FĘrpR4|1u *PLIK&8JW Vja*JƠQyLD_ik]TT q$ZR t=D+kp>' Rs4ÊF; y3Fu'ȵyHoJ _xx(()% {hP-| kUo=7vhKd.̕Yxk8m#lØc$REh))l'`Dp\5.x|lC؇Wb>_b3aVݼtxj6 -Xr~W.$CWP@K泒R"&ˎ_Clq D|4F V#A2X["6:2sa3m;J.<#X?k Ѕ7xJP-k`G`/P}{gdkS_H4 x%,Uf(" Ybbzvotu{(`ByB7:F@1yIsjc|$"B *i R6Ҧ2c zSC)OR- zPGoWìIǂ@omk A0 &RO"R+ n|2Kyy%>6xKҔ-zLj4<.N+^nSi1ͯr<м!;DgDgc$fO?*1OU33M'I.1eOD"Ei\?TzطH}F#EqT-i'NgԿⵝj}#f*vկd!dVwݳ$\F[zNJ;7ON//ǜfS F9h}Y}kOw`t =;(/w뜋t3,Yi^C&N'U"4P߾5h"$bw*CUl7~SimT.9JԇoH=GD^f/T&!/6F*Ytpj b RKL$TQIuB%bae 9 5z;9R#}P+LË́PS *dݴb}n49Xy'!ՙu /iݚ7.A2Uj:k<_źRJ>u Qźu}I;^Ӻ5!!o\DksgTsӧX#_ KEs+ O  ]~H_7NaMq~ngxpN+&?YzNWW^G|W]E7H,|fG S1Rzy =E1`NJ,=# jW0REF OJRJRr#R㶕ROI][o+FeUS49g7hd97`%nɒV[3ATwU}E q TIvE@%l I# }ܤZgRk-zMok[lBQy!YTsBk)ϴSNAq%x:)bW< _@H:U12$֯0;uu'7Zt_XyfѱGKM>^Ϡa%EI}'k ΅; ƭvM匿p,R*?0?1ba $ iY`ύB~(OSYB^֝?$tAA<\؟-8Hs:v8(E}DGG`ltҤ@uf$ 蹤Zc@,&7ui8Ď\+Y p;&D+7PWU(e* !2hh Fzv[֊<ĚTRr/`[2<6Jz*~'f!oIdDiWq,Ѹ(1(]z <@ lBTи88J !pf2F'$#!F>\W+wt`b=>GdzZ +"zew(H5׀F#yNpF!ɵg6T(ްl% F(e4ʃH qVp^huʨ Ntﳽdiok:dc@nBR[Z-~%_NN%Lyhi^,IRTa/ud:T-68"j&FajG"oh^zKL℠cR$g*^X"now1Gcڤ]uweҵQyҪ.iuI^\U_J9z,CP5=\l' ɫ(kmvQd)JQzBta'oY3嗗֣$ Krj&ɝo-~0N^w s؛uD+њHu~gOn4Z0~(Ȅd1!RkL@wU8^aҖ0 טy)ɼ3y4| 9W2o]h)>۠:i)3_L-U=kxW2LIjHVՑF-}D{ H#b01Y[FՐs+G1ljN+s U»8 -}MQұkR'I )хjLИIT?_w|DfZ -c0]fσ@Z.HsDjLR iUʙ;|{$}k~ƾkB |vs3.e- /3EևK?,J0AUR9Si\51yFiٸyh PȀ4bFHWs̈^pњơo6Z _4?'<7Q$KF?dd_ˇ\nLI&*dr]Ι:9"J'z aL)b'(*B"X`~CI PPd>Zd&KdPyϞPi^! dj)LAdٍ 9$G R3<$D`q4"tWI[Psۈ*3c)ðg% @ p60movF&`_ѫؼq587yIFfYH,kHTA'KVIV-86B( Ggg0G1o(NC5Ec[ߟ-0XSLM&D`lGSymL^5'Ce詐lF.e0UpR'7 ${$}i !3V[[=:$M2D:ZW 2WRmU2K e M:IRmC3t4F!Pǒ@uIuNR D$Ro_/fӫʇ͏:z\UlӀϑK|gnϔ'Ӗ}۸{zo@ cu79JR5pDQ#r@~OlnqА#8fc"dRR>,yuunoϫ/Ƕ}1^]lz}Leumo9^+%A x}}eޟ =_QĿZājt;1}+ -K)) gi>GrN1N{0`􁂡cDJqL=8;Q{18\'E5m.[JBG * v{t9W),5tS{$ZQN흑j^>~lt7@͒>Gsk~ TQ}brW8[K) ?\߿=Ϳs;~ 05~r&/ 2O{oclAw-}&M L_,G* *^TE&bRpS]B@cY%A :q7z%!Ԅ 7_Y{sX( 1pc )>1J !A7:EcdB7=I{ƛ>AF6PFm 䀆x.Y:Uaì("0;xIZ 4*b=ccJ"Z v $#G7OdնD2t6HL8G +^: :2ETYbK)K%XQI()O9({k46` T`%y)g8@Cs ŠmK6]6/S7~|L,USKކU:K8}}!3? 
ZMW[P]MDEjfC3 Sr{9:y"x096h@u JK Va2+TG)^IbW3{^}Z4 woddxI,SScX 8-&@H5!0fJ⍕e`A5t_9px<Й@^ _r_\'3#IWi{LǎyXw6\mur۩qQ-7F.m/66“= wn!9 IJjuzD(P&D(G69QFdNQurg" NHaI}F'%E=;D8.l3ZKE](~}|LjבEhW ^(At\JGP q@-qLroU3eo@)["~ǗV),bˍڻzNi[%)$|pNsE ;t(烼Ɛ teB}5ĖY!@aONP +r(!ޝfN#JKYw|uĦdoڔz=|duSe+S(kWx|5.<.j1Z)<Ũ )0 RW_uvȑ~\Pl"``D/cApXPz_}lr;zdTR_r`K qII\jFL'jAx -Ċ)F?J-(w5U=7sD,WGbJ^#oד׋yZtOẇ` \~.?}ǮЍtaVGZdz*?[)R^KK~Z29cΗ,쎗oC=0G.QYDZҐ\ECt.bu@Gb:勉 hw)wkKo[xֆ|*S<~q"ưb:勉 XN{‹Pߖ:fW/SzhwjXUp$d V|^3uܐ J&jy/`AAZlz y. ʉz6J$4C°,(CIX" E)kIWAa?tb|3,2E*t$iƬUlBg,MF#LV e*dF(D(0 `$5SOL P*2"2悅H]T˶V6S$eu9lm(8xv6,z rmIjRJkxu6,%`W@Wg KGȮTHJ;nGӚy~O#Vfp;Qɸz<画4!:WLNCRc&  bp]7z3[!q켧Fs{jOojMþ=5 1QQ;AuԨ@$} ;NH-)g瓖%*]'ZLՁy<1x۩5O&%˧-?&B8!!0#  =y& ET$8Xz:R@ XԂV 4ˤTc k/1cD)⚴WU0&ARO:nQf!QZ](Q8Ip$S&f# $TTcM(h)֙֊"LJ2ppƈ0uǼMZȤQjo.mc|}Jfqx6/.փ͐x_NDtuGE#R͇q?X|3ZO[<{0w<{b>?SC!7ѸQ/Uu6a.q[mؽ ,&]IuezŘ1/mc<+v6q?1c/ X|J>> o^3jKHأ~+A!`OF=~QcO;>#^kaps[dϿ?;?CyE[ȏ= mB^ Tp|d q^cͧwxwBh)[QF 4W{2|0VI7(¯cʢSi.cHaQ.wO$zB,+a4`];0,:8)afW!^#vNOۃgB"rG\ w~ 80{qYp~yk"…0t8@gT*ӡEIuRNQ"]gS>❢t.!םieuF;G@;Z3`fxH>pwr=[H0p$aP)q* *f % be$$$ʰX)LsL@yfB'(}6UtN9zm{o\NG+E'yBؚ5uER_B8"~2$UMk)a0HR3Dk;vSv yvWW5qʴ΄x#NK@?בy"ª<\l( VX)NZ"*t?f.rLmb,eLjHJ#>Lzn>>Ag$GX$"F|bͩ⩐ER20r4g^-2"٫Ws|m\ 1φf[ǵ;EiE;pr~;7Z)z\|ŴD)g: ah*YG^|Xu:%&,SZeW^\VRzhwv!rv."5(Z&JH _:/LtJ!pt^6 R~Ȧ;: LLŻF`_}*:b*n|9v8q O!ĀWAD2*=!:ՙkBc(Dq"Am H2f@(kb5^TRUA%V]҄1iK /TcASƱTdvuΈ<؏2lQX%ʠPA**;v9wp3Ae;h"@kR;Ε\G-1J9p wp>";[ԋ;S|ũDB1٩d;7ʒ[=_|t=7e(xtfP>mt*X)3!r*SB$Ww}HъI֛TӺU\g[^岥%|!. Tɞ{~bm}^Tv*δ  EccBEd2#0D!,v'JFG̟g%u{/3Z+̵ Gpòð 7͸QqnsxRPTP>gwɬcvp;I橛ɿX[$N_u[>~oTQ[\υ|Ϲ'6Tp pI9 1Y__#)@THj.dOa% F,aiƸmIIsg12p 2cI&hFckqb1qbS_yGD:(]w XC@{no˯o!4[mݑD;Ҋ7Mr}H+h厴Zr&ދN:'8l(oair"bBgv?K7v\ q./Ě +@\Azr! +!vJ[r*v}itoCne)A J2ed2gXhTpӴȼLӶi=.IXiE0 {6~>6ohC1o0$վޔbyĎ~;{tbjJgD/g!+cI$(1H1&haYf]wCj?% C $e+W׊o݌vT/S\'_fjXp&l+n%9O-Gȟw\pG~;yCs.-vƬ>߬\_}C [-pOi,s(\dلݣճ96/"{[N#NYeg=xI$pVJYb+ ;j=#>H; pB% Z濵PbCRJYc 5@p$)7Fi q"(%u0؍l)e~} 6AGKsSk:ZR;Rыq-U .[1+ <0YQBݾYn z禔KT/7|֗]0m's.¯<~5spgW1Zy"cmb-F 9nSS1bO];ݒ?P{)=&6_FV7 ~&aa߄BVV1*QkFڏzK{RSonk`%1󎲹{wv'z*1t59J/Yz'P5*OXGy>)>%Lb$yESNeTzEddё{'c̠^WwG`pq )J*Xt@8^G:E5RxQ5^G:ŋ5xMg8).̾h;o]DR2.hp9.>ui ]=$\|]nfsCQxw8 KW38:YT.hNv=7=FK̓abKBl<<Fr".n`?8Χ Ďv93zٯ#~W.SdYzSSbٿ8؜51ҩ!_gng-ƹd2qL[&{#li9 U>ަ]t&ɪ&'>r[%LDb47:yƒ3ڧRuRTgz'{0g8b+n}e ?<6)<ǧ7Fp9H'Hu.qǰjz: S11͸N-']`@<IG'gZ\X@EuJ6u两 ` ]%Ǚ9F=A$x k{O.kعgL9!eLUՂjͥ|e:R.=!TGKrL#لHS!QP-`q*ThM9y$94 )kiJK:RsjMF(7 lNCN:RzB(Ulݕ,֍ϡ1ja•}pt7GrBc?|z7˿w~K1-X}p?u)44pEBhR#O&. V~*DK_ӇQr‘vnorM@ש "0N7 yz| ::1 &| $?DY_(8BA7!c'o__HJBz*j" [WD1(MD+>{Tj"ciMw)a'$,p(j5d0J!m̻wC>"5\ǗyGQۀAWQ|0 G[ZK{o )ꩼp>U£qC#H $m]=v&o\*i +aQCܤcpO&=Xx/.6V1m, O-'RS+ǁipL/ >0Йu@} 8b5=WG1uk)W~1L9IlC$S;~Ͻ rI0rC罂S#T#)ڮ\hs? fIcvFXȩ YJ$IAPKK@0)f#5CdfĄhͤm<=J*b9@m r2ZcEHSPf ̦2,3GکW.Z33fdHL|{Y*=KL#>@ϑ(AR;)"p@^4Pp:R~-+{z/]M=8SzL3z}BuJ x$ P0rRnbfz?܆~J-=m+[*S:oΫO8kc)j: u;#<@x-*Z>Mi[8{w#wvږwkkOu/pb ,EsEmn>zngsZ1\^e(+}Q&o@\h=>r["P㝒~Wj =U*X)T%jzlWQFjW|椂iI׌"{H;~ͽ,tTzҭjT;U.QvKfIV. GYU_Чƹ }jS_?}'eR+%(rbBE+"Q`;qnzGqWg$U#t՜͹hS%h:uqaEDf&QL2L[RIH^f B+D5H'Hy<ʊJvqy)#WM+ wVR Nj^{<4-&ZUL̽7p{,n\|{g46!V8%T4:1#sBF(6+=%@􊭔}CHZJ-> sgV$].k>EsʒYOK{+-S-[R@Lڦcyh.5#^NA% ?@zqQPZGl4K :P B*f~I4LEJ1.[^2MI.NF٪eRwshG`H\#/B]8M3 0ԉJe(927o(I\ӌe4B;$U+U/xչڱH%_5S_cWV]~0. 6PԠ+Bӛ̸qSpl!ZV i+!Bb:YURp*ނi=D}%9 }^5BK8[ʹ c,w“ƣk4JS9>oH۫r;-/˯_jo/0"b[ME{OJ٧v*ƩI a~+(][;c1o8X1"&,q)P#r~1[?J1VNz++w-do+։YD#0]{9 d cZ!(*_lsuG\9+ Ł5$ӘPt&afH4sjGǴUżlmȃOt9~7X\`:zzbZHmǩS %и\<шax((!ç/NB⋱C#F]uO7ǵAސNj V:xZ uTRQvXbV;g²,HŃvx+*S "~wAE-R\FzJ*o E+۹(՝m6VoMTq*':NdN T2MijɤNsSCb'9YfTj!$P:% RW"՜Plj$JR$qLqDY)aW޶tvb.VRAg_}ܵ"B5QT3\d~i#rb$P3ꐢIy.xg\SIƑ]-bd!^Fve@C! 
e:(}8k(&Rчl5n"[&v@=v'#*L [M!'J?½p_'i[z=Bc{)y j^̣PK!"zO@o}~CHj$vD XL޴X^#K֔n?q{+WR W6 N8l=#{%9~ ãHoL.pwt_%C䨛j!efT3"Qm.DE=?D5|Yb(Z[E;ZQ<,FKڪ *%)9a̿ ?D%'J#BuO_aE%y9>SmTj!wg(eo;eѪ1a @jDV%jJ@F|Eeh|}9*FjI4M_חZPĕbq)kUE;RG ٕ 5R ~,RlNMoMccY?EW|f?m!2LC1},%H|4T{nݼd4ko˳[?ru9L,Q8N zZcgW/+ga7CѭJP3-Y;7F6E0tM+ .5A4FխGLouk^#a!D|Ĺy.5AclJwk_6FlւsݱZhYܩNf1o8ɬѬy`9AkMrMHn65" $(jŒh2fBTF:h52v^w&$5ܗLs9ql3`df+]_y4F@qZ]VzV"TMDJM D'JR0c*kI0yE;?ܡT,O)P2g\ Ydr>os+I+I7ѻO.g_"N=R=fOIjKKN~8JÍ۶viȄJ[c‰^Mf~`~Hr2 +IZ\w1fs!u̻JqRW?PL/?]ͧ63C=v}kV(ËRo YCN˨%KRX6?6'@q/w@ytGǾ׿k.tzE8Jg+ѫ,޲\tzEj❺V;0l`[i[i{JfiC/=-N߳E;cqHZ"5ɰ"cI\ P c;T|(§~w|O 3@yszW[YSgIkj]joztݩrdCN5IUVj߃N5%TGi1 jq,GtKB)mHJQ)Nrƾj 9vT7],U(Q<>~9l{K&v+U}lȽ@.54_4"k! k&3/JATe,m#ucF;FJ%@~>-cN9;+䅎xƃF%xw5#ry?t07߆ K8 l,}ɏfo}{f1U:ۻ۰?G>o~)N6 d.doO+W޲7WTatO` m诒缭a&M$Bχ_6 Di˃DlIn؎od`b ك߯gPU%[,~?LNQ7I1xHA<jԵw?lO\jg>~eޕ[5AwjƼ$}K~(Puލc.~c+gLrvzcWr mZIɌ[jG EABPo[4?\<=,Qp\*[GH28cP7(u"'8;M%KTMCx`¸z]&5iNګ.vuq(I<#Y0'@֘ʸFS6',P(+RSiJU`p\LKFeV 4.S +x31' ˸F}rJnW?{ܸ\ݥ(xeT]]n'5In얋2#VkMI E;d#t|%w/RU,Q'_UBRF?~qK:6p#Wg/4cfɭI}1Ʉ[A?}8|)w/I%R>{t{ n.ؘ08~Ftnyq_xyB!RRs6_ >3B D©l{7Kh&+5h7khjj ph60+M`4jjUdoJqdB ZKJQ}A5-rҷlc1.ј%bg#}on0#\&v6+S{A":.mXTA5XEhyXAM@C1dE0uȳI?;NfĦc;[D+9(k'u@q5u4 \S`ݠ(l8יK,2w MKX^()2XU&ZH6zqAKɡ'HHrQ i<馕nH% !4 :&4p)ҥ֦ 멄 iu6+R!$L'`4TKmAR@PesZ&|x$sLK4dLSY+YOS ȢʴMs+2Z׺pĂuB8"9Qjci*Qn領AaInt8P&"$#ݰX<Ɛ 1 s77Ҩ2^'KdZ Gaa{8\ ! aN3pΊjk^nX۫ﯢ$֢ۗ ͧ"X,Xٮh`ޫ|at?{)f9pOjI\Q$.ln,E+D ^Χ|s/'/߽i^:yڠZ|C"~#1K@_=e)LJs-!*42%yAI%[P&ׅ"֐\YTau*ؠn58jsnjjAdrzs >+uJI5.Y LfMK aזTkNoJ)R36h=lVJyz%{VT2U:+G&լ~1JߢB`ǧ~#~I5&OVT@ C5Ɲon/X pttn<ό"*83t$;/{y^\k}yygӅЕx_| r~SAy9 `kwbQƥN6xXChA0?s KCc@j%at%2l.+4/֓J5|1Z6TOn8]T 3Z?..կNIsv1ZpprA+݈濞Ɗ BB]\c˚}(ǜ Zjc)! 3 w+rAhɈ(RG#+/cݔ86JsȈ()d3sk,˭5Jt*t" jxĹ$C̈^$ꞣEG`AnCzHي)r '1mɑʑ:rILY1.)BA8|}sFu״ױ1 ˰m\8{i>U ;|j~V#_]C(ZG8T5 9uh!^NQK:h.C~~/uć ?U^]/U\.8F[j\ջ_OmNuf2/kƓo# ;5NP8^V8J%ɖJ6b Ӕ x5,͠Cܴ]kDNS[ 1"ѿ#LD, ȄS{&*3&%fr(}֍5X vYMhdJ>/Ykk*PV woWaH́cD+_) 8 C's"zb޿KعE?hmm8@m1!S[{>{Fŗ쭁FqQ=1vo]aF5t~khjxH8@){YyT]F<@]2y]! iJTŞ͒*qL1Rj_2Ț#6~ӽyf=hĉ62ZQTlC2NT mՇ hJSFV}Ak9V_=T/vI5gY\zk٥R-OX+\7]lվ-n~]qWi1[ vv o$?ľ9w.D&)nڙ,Tusri"IR5Pe=πğݢʪҼ# hMڱP#wD!xXN;x㭾PG HwB&eSB ~Ź݄Cn21w4nERnz.,On6mJd\OÎ0$pWmZ5C90"6ql$k4og֬:C~ \2}WWuM>e@D#9kpFHpg#(FǛNO_}OFX׷ WvGv&kD(*)1[Cj׵o~1YEf=jI]ϰ6(f*DT;ZjVZae̤R \uTp&2Dkw^:rܿ5 J3p =0H0ϩCÄY)mC_I Jj  -`5@mUja!4WZԍߵ  ҿr{!Ji2oJPޤr^O6rC*+.LrX8prW4ջoç&f9l1E<S_ )'x2vӓascSZ2d;bl^]tO6H6zV(cs1[F.`WLzޕ hfv*ݡM;xJz0bRa`|/W9xe*dOBU*E%vK1l]' 1J!cԀ,s r bJA$HT~px}{FuЋ/ae/ "n|Y8 /8]/Zh 7 E NY(%-k6jz2wG|$}\SoKd3ϬfSn] TVs3>rԵp暥5,|( %g)*G7r4TNJ)x;Berd dZ HA.pixVBj1|!f@?(Q'F,R(2"%hZ( 1h^h2q5!1d&b]G/T\x{a? =u8 eڦ _: WQGB1@@E-@O46W vԪI 0TPꮬMS=Y^1]gm٧wf4&eԂjoMn&A$O#OX\fm6|Ra}bbt!#ks- ^*Ӭ)!{xl{Z *{Q6O_G@ rAVa1#? 
j`戮(BD}ƓOW0ȌMbi4$F&F9)Kb#DCG^!`FTNPN4UUz<]%u>}^՞4J!,2IWjF?1bj"s UMQWFW) 4v9l] ս9.*9hbfrCRÏfNDL|2Ʃ62A9tJz EIUR:8OSK QAFB^ NIfEO:"nAȸuKAYK#*}:(&m=Cmwㆬd๻xJrʅY Y9j^5;3, =YI\0%"}DAГ'&70zdSEYWB PcZJVfa~F@9)ȄN,BQG~kMZz`a5'fIϥ %p{Lu;YqJn b S\t^>ÓVSqL:=l9V#u({KY3-%R=FIE@H 2\UgY==9w1J_~\8mG'g&^i]&v "Lc (q9&NK=YlT33'Vr#<}4593t{c*]oΤ")76\Zq'M>6?݊6ꇥ 80)~MH ߼~]"7$߲*r4ZҬ=RPcќ3˫ֿZR}<<Of&d  \OMmjmg=4 ۿ4r:p:Q%@nēl=s@d|ݤx&Z wOCCwlNs.l-#x&U^e{zA92㖸 }qkP[]-ScV^N5O1:X D'o0_3KY;<LN"F`Vsڊu@.n\5r_UOV Joz%I=~ԫT}ٖTĭVz-] 4|& Gh??oe j,U{Vs 힕<~!ۇo_._|tCN,O؝ɳR )-Zqޗ{Ki# =q7r:wgP tJ?GԞGykߋXpLb[1)jV0A[H6W\yfJ3I#f^%,+V)n$& ZzB~K* tms) mUߥ>d"!k/ޤ,pۅ_>m|b?Y_Aq}$C0_=f=`~Oa{r& e%o)jOE^@ژ3\$*aZQ]x4GRܚ›AD7rJv͜]F'X"{WrNf̑dQ93 ya-] D0 ᢪ##_HІɯGoY59(ӔHR A t4%8:Bvwk0ZP.鰚p|*;kz_$bTF Pc\p B$ ^ TŹ!Si0`b\=}!#UH\ujd&rAS[Z1I0&*,YɊ`=-ȕ"tr-A.ײzz]!O58d=[eE(Zq&W*t) bװ~9gW49ƿ-ixb!Ax ֙(~%9+)(PTQ8sZy(~;CY0C9%⢄Ŭp2&dEsS@/\2A c栺TX53b[G> A M՗FN3($Mv3_X?b{*zh_n]tICr4Յ7xϗ#~FnwctG!$+OO5x2{:DǼ?5&jf4L   49W ARn()3xB;iT5w4) JDP%Z$ MwϘfDK qʴ#VR.:gDi= >yE@szywW gW DXD+9j=*TL-2uC. 0ŧbvuHj\Gd=-"ch~5>zȤONZd#7vU8[.@45Zxv%Sv?ɢEtńzԃ`BɢE,3 )"5&Y @hO+ GZyh uGy d-Zፗk豯Bru5{N(Ncyusؗgcpxכk2u fpTMzO?^u {Q>I~r~7>ӆCm!u5>-GQFE,.MZ@51.;-} dJYaL RFI'Z:x:JS;Tᤏ4mV22VxV\QՂh9!6bYukKd,&`zR*2ó)Pz3Rk Ha6-"(I)eiNBJ)˓D lً2'L&)FC5HYK)7yRM7ӧKQ}ݥV ȓRI'!P4)=c)I r8Ӥ^-fƐϞ~<4brrQ6[;h޵#E>0.e& nf0m6g'ɗ?%٭djI2$bX$Ue:y,* !yTV*~&XgXJI•l>O1ZgV%{w=Tk-;sC yf55&78{k[DSF zϮxiub @AA Q"*%Xd\!@@L-`eBKwl:6)$qBN"МH;h,+-ÊKrQRrzǫ@+;s^%EAFJȨЉtt hj%TOiYӸٔ@L9R^]^ɜ0<^y(~*9C꼲ZN* !,@VM5UCIF3- Z%AkbB":W2wqB bX-#RoZ@RMS7=#d"LG.D_?0Mh7GXvKow+F$/;SoW7yqz587fI 7$*8xͨ&sRvFŠ3xҊ\]A-?3r0Z +fr6}kkт`辢"HG$Buz rTf͚j!.H3l팶7s[zn"-0 D!xk(Gԛ'LQӸ $L4j$bfy7i\\9yT>jDOY\M2,'A`'2xlqt yzͱŽ$l%?4]:ZmRIK)' 󴴢r~If4EFdy_dE6#zAN2HmqH AZns{3iI{,u=CE\ oWv}q,Wu&s{6t6' AJf༴YHi IUSIcT2L>”{`{76OHC4ÑU<-rǀHt猈!NMp{pYν¹=kp6ν"c{ȔA뭋ùi,-põM8G.(>,νLNS^NQsv@-2ù#XeT/!s A'ͬMkQ\1+c$ڵ[rTIBe(KOJ ;٠X8=]g3_<%lM>v^ThK~h ZVqϝV+~_wJ<H*9NFrۢOzE8e))i}a#*Ɋ=XgycTjȐ"J?߯?s#ꣿ|N͟_8 $o*:|ȏ7ww}'B Mz?Q0C.%W[3_y^xrM(T*ĩ_MW.>+Ml$D![S"yJ_MCCЪRcӦBq68JWW miŞ/M!^_3w]|:¥au7 2[fP{q Ӯ3`)nǻE!к%.m0*Ma&'57W5de ,Ǜb.0)߽6/'c53%S(XN?%9.r~ijG{}X Rp#SdñH'fDr]]|j5x__xymlQ%U)?WW8˲ɪ⪴R_w~n-K*bx/~uݼ 'O5_}spwM,*ƃ,)I<0C"-n3qH%9:npbRE@fI~Qq UN zaʖw1!?֗9mB$ZC{8L6|*K4=sۻat[_NwnEdr4?u"ӻ !߸uؕOqԲV8oYE'0k3 t&FɦF>aՔ6U|J5\ ´(~'_f7 ;דrv_=yy\`Y0i4vO3>EGB˔~hR*C{׃Rՠj?fdIJ GkNYE(QL2EH`.G@p&5@'9Ƙ߷e0=g15xoVQ!JRơ8}Tɕ1 4`VX 9 ԎZCC=!5a:hT)!)0J ǜ!z.v!KMLˣEY?pٗ^_֎/\2}SL]R[CAbdE5A)X*0 V{yPFh&B3y(c;ݵδTmgs%-vV5Z;[Qu鸒GW;M8%@kN-mA3XdeZgOQqwCF Q\XxyQiOEnа4W?aS"'?],qls\<<.[>c2~z򌉋$  AzUn&t54OVHYVɄyXڥ-,zt@NlV>3v]! .ލk^OrèKquy%4b,bp]r$GnLݘ g9z-D?8OY>vA"v%<{cyMB D*0O~P),2_q8'vN:.JT |.+4}ThUa5cL);j,zL Y Qs\D@ q xL{N\%LEwfjg^JD΢7} Mʆg7чr7+ӛۜ x`]!_h+\"JZ3uZYE닞vW>hJ qQ7 (Ȧlnu9 :%pALh1`L7st28 ZQJI\0DD靈ы)&ȃ0p#h+xDh42ȪB/v|D%'`BPPHhp 13%5=T\r}9))I<}_Elvݠ"ɪJL睾Wg0jހqm5e*iGOPrnߟb @)dLamo6(ݞ,<=Yn 3SU=8چj'S{ (R/y>K_{vɉɉԑy:eȧf@Rk[b*"Fkqaޞlݙ]9zU3>o_Ռ]Yo%7v+_ [G8ͬ2lVhrd:9uq=37X臽]3qzKHL`_)1!)cӳi1:haUg:' wRDV+ I6$,MhV料@gO;j/hfԀW]DHP*Oda6 `6&RsԽ%. 
14|| @n-åNŕf) z?HmH r 0%&3>OiR4#B^yVPf"U +t*Qݖ[wE[5JRnv!B^yVP*W Zf rB%m]Dio:Yff1"䕇~LAf*$Ύe*ΚfyÙj&ńD?@@ 7@06=1` t tY9 y˝`jy;VKweFa x5s9Cn̫Y`\yFϯDQiHPFY8krnQ.󆘏*zc$43l1yz9~ h0nO­v49g{g8{}Bū_n󋇳uuW&>$ g7'+g|93N8M/E Ek!%}~І&}9IM4osuO@ծhaƒ\z.2qJs *qapu)G mM!NdNyrp`zjv!ӝw~KMj56 ypɜҖ@c !FQ3͙5QBuJx{dL|I!7"0c NI2^$:HGswSkZ 悜GfQrlVVd HjQfǝAL02FJ0yәQU, '1YymZAVH5f41]aOӗjP(:KT㧧YTH @#@A$sӣ!"q>Ip2>rӷq}tM\?ּ^*!za̬VQJJKlX$p2_!W_C[*q iM vZ-vSr~Gي^*c' oD2YLbJDG Hcqt=:M=_%*l[@ 35JI|7U`3 3%n$/*Zz; \}%'  ,zW`01΁ҺD`@Nh.n|yiSiT4xH5=Q˘u AiK1!+%|T(<9#Arb_&R0 c˥iu0q n@iia 04A7?2YM4ǤHuak |c`2 B Aώ`\g9wy5&炭 |hM+G PWB3K8oY|CT Y^-j R[bDkr}̕ {zP>-HDi,YW(fV+! '~Y ׇ.X& h) *w=.3z cW_ .o,a~$M@ .f.T Z1紥V EdO>QQ]r>\jEBՏ}0Q*p(%lJB%p*wc@.!\%U8WHNSFQ°^_5(ﲩ vX`!HMVE\KޫW=yfS b}wU lmOqٹiSYKs(7zjݸ*8ۺۛ158.0t/,l$$~h[M  ^_r_Ne (p4wiV7ȃuD9egRٜϝPzD i.07 Q||37 >7#iZL%`X|1;8.Dz &}TP/ˁWr C޳mo>Ǧw>DݟI9>?C|xi;!scznYgvVah&f-;P=y_^x{t=Wo3jۇeW|FX}w99+֊x><ӎ|2ӕb,kaVHR^4k[mA!"(ޅ9Ek:lCۉsŰ&0 JjsƑ4ȸc%qXdD'W_o-../}{uj+B>KT*RZ&t^`឵/_79))qFo>Б7y>t$e=tc"D6os(EUqTL 8YbW'w[|D 3N*#6"iBryi0cL}aU1Tܭ 8kGe@v: )Ty|;?ђ4GxB,;z !`4492EWDкL_epgӳ{#5xT 6*$NES]fʌC#t2?Oj% ܳw>JܝJ $S6Є Ǿ".4^p^c$>R-OQT߽Fk #sz..u`O$I{ 2G %Mɧb#?cD+Ѫwۯt;n*[ rB%mLe6F>ZWٍk JdaV~u] 1Vw82!۲8J텗m%{~^N򇦳1_7?t{~ O]\(t0?.{3=ŧ]4>O?Ϳ?xwaxwJݥw/w_2Jyç܄~Vjm4g'3I>k`n 0U8K+v%\2\.r5.9WNE =jǥSq2 D$SW'q*[oT)"@OVWj_5H%q<(y0JlFݚu秊v~|;:h"2%।`9x'+I؜%Bx~J(ϻ%Y83irqӶ)ܴSh^0~owNb=;9h;7iC2͉wvs&W򃉀ajyX@afPς}1H/? S9?%`e@ƉOƃsjc1{rF&{QƆlӳEOL '' jy0EGY#1 /u;\aOio`uqy~ۻ"wG$U)'3۾drm3}~|(֟ߢl!w@%ed/n)DWwc5eߐ$ i-O4$^92 V N]+ ĝilIPK9#*XJ1 rDZLnMpG9+l}.Hs!Cݥ3smf;!-_E݊c?DP9qz X\ "e㏁Vb[8p$b>gJA0QY|:ˇ/3S|נּsJ^2,?<)SF l td2;~ 0`e j6ܵAP`zMd#N1w,\Q~򺵱nVAKɁ{# shϛxҥ$8D},sƥ^1)[\HȌ?M\jrX?_ߵ)G_Sض)!S s&%7acYjQ FYDc[)yy1)Rh%4gdhIvظPB;9 F|DR[R)C Yl9TiG_DcuL{꼲) %)JGY#HD6P>vSCRWk;AؒB؝;~w"{K/0Xz5/`tec?di[ވ8 tn4!"nGztm$DQyiP4L|ƷEEz|p@bcZt]EΏ ҈Y Ab2#Ln)Z_p `'p˼'=-Y2=& JarR Zsz)=k)''g$pifװM%NXJ 5M#ĩI D-@[Pk9Bs<8ץu,aj8B̷VxR65}Ry]> )%8LJIK4ޮr: )݄zXGMpnh/(I)^J '!P1c]zRR.r)XxǔRh}^JXJUǢ*AؓY%R?^X-GuK\6H-} UBGkꙻЪ^|f`%F8SSt7݆354`;`_E$-+NE B NHes1(z[#JjVC^KCkDPB=v =Gى7LxUHj!p1Az"!7)"Z3X=aV}D, kd$6DIL0ČB*o7PBJ67AԨkSL.Q͸Ҏif| WW%1J61bS7MI)=OQZunMκ@"ZIupD_'taP(a[t7GXvK$w ٝ7E2&L57ACCo 4(J*yA'KЛFQZr,s˥+&>BoZ'pwO6C 4{m-L`|Л@>s);VRq&҈XD:5.r"+R!70(8܅Yf:j_=KYx\Q1zR,X&ռL\ԻI)ޔ1\rPRȩ:s 9P,!B[ _n9铘[Tͭ[g= Rk"OBJ Rc$zGsRxy)\7P.}R,¤ /zL)"LJqi7-DI)~tbQ)E0)WuHwsv lOVO67IT;Z7yj7cXt~(X?b%UeQ=&qTM>zZ>TKh:}>/@ΫZ,qg-(2:e}2nv)8Ϻ5uyͅ!gɵiBB>q:2 (P2,}υZ*UzBE8^gn& 8C`r@֬}7nqhI01iΐ]d23/bX}g2XbY1]DO y:Y6~Mo48|~lucxbA>m.nQzϐP/eR-Sf߸nq͜4}}9{9`?16[|Xd7|Z;e/^eWw/7g*oK-\}uW?WedjKhvY6&+0|许"yă[m7,Hנ:2w)hl/ꗧGpI`Ǔ$|17Tb~2fEnjYP> ?8U!3[Sw{=g!,?Cܥ7Q5Jށf.:DfšXz9Jbŧ_vt,}. XٽUz2/7O1.s~h؀ P{x4G Py?߮^F@3wI1ﯦJV y+GF`/\$Ѩz] Y6/Ek~x V/n J $V~1^TП_Rr' Ę Ť7wЗغ<%8ĻS.1sUx%yvʔX*Pt/EjBAq[_) `mVvH$A0QP{CDpVWaͯw'9劢(ߝʋe0X(jZEfe;/P~Ҡ@y3p?xz،r[T=y9QVE54`ʟ^; G%j+#$dQ0仲B}s׾cgӋl!kE2ԛ2T/%,rJ눹DDqPX$ 2~v-_wxC@;>3ĭ>XG:Վ8NX#(D VpJ#@qJ>M c-N@s+9d!9`a*ao ϳ0pA|A#C`fI-=:j@9Uxu`AI4 JRFx<]ש1 f) XhnfhRe$V`mTR8VhA,4IDbtB4NX5 DrDi=p D2=b[PkAKiSD) 5 )Xw;p(`1vPvӺMY%negZarĿ6$V;fk8^ȣB=Ss !fF`Ⱥ>HE`$1U]܃=IWHkc@,F"Lw|'% 3,:ϥG,"D+VCGs~h*ފKJ[(:Ol|HtOARUSq'IMD$>M:QhVynܸCĪvMFRHœ#+JN:FE*uRd5>݌XzuDm>u FgIqtSfN%Q#cIRj%Tq!%1KY,bHHI-AF& PbT?{WحO7PV\ ) 3y \mt$̟dk*n#G8wOc"(jUx"ٜ{h A{KpdCp n91 aS.Jg5Bo>8C3i<ڝUsU{-Lw:;>LqCh-  CNE}f\4|9bMT+}Q;2/6٢DiX\&. ]Ѥ>\l^g}߱. 
wQiV}&Q)r"{,rٻjZ$!MaZ *E3oTTPheHCsS N~nVn}1h:}FV[ĆmYOe'oqnk7S돨bЄuv;aqi$ O<&dUmv\Y&#,Kw/M@*{B!lN>zec  !l nE-hCI[ Am9Q,bYkE^DD` VvܑUّsn((5zk, $L&-X& -&hL̑x_B$qʿ 8\愅ws9x9?"*H,DE' T -%`ɞwe_B*u;K}pWL{ 4ZqjӦm+N 'UM @^q8-i¹W$ O]nkwӘ{\m K:OYS(8XQH DI&I 4Q)~'8RKehAp|o"Ǯ幉 x#8jg̤,9씥xfzIlɁk^$=e4U [8BCaB)Uͪhc,J@ #FDCꥆd'Xj9MJ@DWvkҲ4jRb!Pn[-J@,]kPcQu`?ՐԤVrOP~-a_Viݦdm6Oy:%͌wVKۧ "sD *|oia[~|q=(]op"YL."Oui㧛kT*Br/W3]W7W4i)}Đ41[{0Mxz{{8J\3XM;s[,E aj'wCܘ9Wbz+9i ڊFRO<E 6ݡݽ6 h娍{8:t\/Vm0R1V诋c=F%ŭ8-)^y-%jt21Sv vl֥!^ڀp(W,/R x׾󿾴0%SZFHi%1 9 &{ޱvCXž#2z]gGAKe_CC&d[u2}Hcs2}Wrl/&^rjKDiNW8g0Mx-~CW;"IԪܹކܹEl 9H?jVv9*%%GQOWaFߒ(OZ`Ur}EˉH^x[h;t竻eXx ˛;;\, zizn=PrJ.&8֊ ˧bY+iK\V/aG0o;nƘZ}C)ZjZcдV+OGWҡtVi\Xq ABx܅0/rbz,~ c4$ 9LДڶ& dQ9j]GTo}(&y}=;`vTw/ja1Zp]` ]~65e1a;I$YgnatI3e;56+\p ¢Hq[0c:jv%xK-Y|zIq5J*[)n".^fZXe@ɑ$1_u߱wk|'WSu#[hoo7" p1k"E_%gDN%nQy1[QzuFdɅQCw<-DS^=0h+SBZ)*PWwꨢRHǚ|H{a`ԙE|^e44 jil؞\4Q9%hRbM~DAmh}b}0mPI7 ~>4IZE2M_:6fS~7Sg J~&8 Uعqԯ *e >=^ڡ]sLصbgcҪqL{tIQRnE+N0S0# n~89Wh:X+mR&f~S={WSo@^JzSo\v3}`7'Z_戚 6H$mz`ھZSNip{XT #hU\4Ad LۂG%6FE`I_<8 1Q.@[m8WSPSQdt+(x 'lG=D}hb> l{p"^ w+֌lB) " ӏՍZ~REfMFWI,w\QY^DH BVVhY`sϚмݑYȧQ8G4=ucKފ$1xHzptU"BiK+8%-H.|AW)?!;SNfeg `z,IA$Cbd jX-:mmHІSpv撁M"BDBYaB;3k)7qF0+uyrٰͬs`%7eͭNqoM$FIdBc#8j1~6s>HzW Xm=p%YH,զ|nTOvϖfpzto6 THJqoPIcwĉv K >Ӫy$**6MkS֏*NrAs3;_pop|BR֊bjҽR a!4b:S9K$%ɥ:~a[̝;SktlG;3Nk2*;A͎;}+-7>=STsCLW?'P!HajR U`$PwfP毪>k`~~] rbhdAtA3fqhW'F -GZ_ҊϭA46 [U3!y{i%Jetpλi0<}V:wW-'U8"LN.`Y:1[ A5[nEz'= :5効ev/qWΗ֢`&wŎ#WӁ$yuBxw>|2j*"_nrH=fDc@*dڐE9fp:Wɇպc 'V8MRtފҎnMM6l hc!Su9Dy o`K \ÆF26u\I~\׼YoX]{`M+ecD$թ:3jS't {ɿ97S>=ќw'#iK˺ߕ=],]p4'AGϔ)DyY y1qgႝ'DP2gL2縅ƛCDE_Y4&EL֒o]"j\JQCa?fChbg93^N&ɬqaye[k0: :}dt~!vFע]WFFg_5S ^j$J<4uT$~NmIBAd"X92eNEsy~Gc=ahnwpQr<V;(A:y`UkXgI@ͬUb 2 w,X4Ł2¥^'fk.p:I4ےi`KB ̱x@Y;=JV-tƗ?W·=K&?"2B2H02U۔e d$n 4;.k60g/ o0'^T1(uhfUhVlJsaW9k흦KHIfmJO:[K<XPɮis kQԆ೴IdY!2'22.݌qhw,qp$p !D1J mN%?Vn92sD,c aj7[6rp9130(f9u^F'}4wNq1}}i^*n^O!zW^l."LEz) t& . "2Yˀrvߨ>*~}C>s_&ܙ{rHyh؈P\e3, ǜQ0gSq;z 0SYQs(KY6>}m+:~508 4\|&BU 0W뿵A&O]x~zа=%Uj?'O+|πVө}IN [\I5K#wPyQ B6E $HUO:*HR E,۹јtF(~Jb}r>>ŭw!ռ%Z#(WX/V1i^`;w() !AxcapiYR0!aN5 O"Ie ̪njcٹmrג:~*|jHXPQ^ͅ4E@ۛ8 ms cBq5n `gWǽ lL!rJ(Hd Rb+/ӎ =I"䐶8S^ή*R]rY9k${uVړ|+ [ dRhQ0ۡAɪt|9Ʃ?CfOgn近!3=85^: 3X,`h&pPIRF\,Z0Ũqim(G#V F UW!Z~ߧٲ6X)˼ #C=jvе_8 z%3"!ljǸTt>^#nIe!=׊_g { aFϔB ՂZ*wj}ӟ<~Jyr{=vO6nr)]? 
tҢ3cLriP';݉s|Ȋ&嫪o LzR!|DϢLhEB撝,I؟ڨ!G@Q:AAXAOs{Lnʑ"4- 7[/f> %hnCjtF,]]f>+)y&B9^q@w$W׷ub 3Y9k8] =Wg= '2衣b:Z5 r IU+;oE$MĔ+#bgMb12$rr fpM"H(iDFi[W~7ZS@R}I75bXO.\7hݩ`vXA+b6}F]0t{ȸʰ dCe%|L;&YCR?p6v$.~,56hY +Fd2 \7RLBv"nUboW-jlq@UZWt@2F5hK}>q@{̼Y3܆^3d_\Ϊ.^A[ Ĩ=0gD9,gA$"H'yt9zD,[_rͽ+emkR}d> $Z<.+m5bL2@:ꋆO?Wkuգ&#y b{ϡX"sH#ޫ{43g'ЇP{CxFVٱspoH+xS"Eo K, mO"' :6GI9e^+tiS zc-# qkdƙQ$ZxjU[X|m`1@ l*]tjp߻TSd|m`y/\pikSjD1 YgHNPq9b.HsˊT7.Aݜ٦'3eBԞIgj1ҫ.9K:vR'GORȉjgz2njXhאB 9IIqW 7X#:k5_Cs@0f29Qm{"d&l/qM ɑ4^F%_;Hîhx 6l墛}#E<Dj3`gmK4osL%?O3@AɪIJ6ċ1xl$Qꎈe牷Hno<˥me){l5}r\]B+y)5 c 4Vޟk`/L<7_ws-~~~\`+kFUVozA~ݱj1o/ =rTflv1Ǫ)`s ^qJupܡ7tJx<-%OKږ !#}F y3%{9Y[͊BݎJt;TDŽ|F BIPGLpd kyθnigџUt1eK[S~XZC6r?R+NIk[uVW-"[h1qjNo_"΋,J L`XXR h+E\C9N=x;tt 7pB3UTus7?4Rڣcȉ١"/'جٻB ~y6[yIY ^_,s1tT]sAɃJԦX;ïa b}T&2b>g;*'r.G9oA2'8R̂ݮ^Y=G\plyK_b^د.QC܊Z\`Ԑ^3U|bdڮҕwߢ[t\z1UEw4aU/1![N؆m tmୱt%ǣ'J‡0[]~}-v0POMtc؄}kD/ PM<~|DB : ޢ=@c;g)%](v{R6!r2Z+Xʌm"]lyE2dI79kzl8hB-Bbx9#kVlN5[i >t T/m}t85tD!~<`bd@ d*$,{ /?@A3%f3E ;Hc y^9I$*D8iSn6oW_o8{܇ZjӒZ60>Z"F=mJfHƩX3{r19՟;Zǟ>7ןp?@jZQQsm=}9h[F#jl`OlM%]#i~39tFk08Z{Jb+Dgˏxf6pCK#~X\j&PIw5MI1KZ A2r"(s ] wfF%R{2Y8K_?:6o]l/.OR[z3hn)Cn52IhC)VqC{}1/TqDØm}Wc+IKWTr϶dӎ0y"`Aw!dPɅ R!.lf\ %7?Cf_hEĭ[{ 2'饻?ۑ8L˵.OmtkOH5#//#C ⯫[#f.E|&N?~w]<:Υ +CΜ ^ウ.k̄A.Yr'Oc4PSU2:Z}QtaT8i`%$e@ 4}Kyt ?H&BPUų6T~ߜV>_gW5Zjm4 YjB͠M+^?CtܫjL(p1)E>T='N?(pPG|3c;wlZrc0`sJϐf<"Z3&U#d u jX3gaVF?2{+b=PՐ)uF?Zn}NAn\DzIhPی#J=!s鋤jsϘO:@8'nRfr¶k31TG*RYfrTe-: r͑|9XJ6v<̃>Nʆ(qJ$~]83cnT`'NyE` $EШ M%?{;_ DL}Xcvw. Ҳ^MN|=ʵZFg+5VsKy0tőWܤ_l{?ɒMĬ1*DHm@ 2 Ð=:x_/5a܀76n(PFmx_uֱZ"aɕA8op89A%zDz]vg#]`.ֻlw7>A:t]9Oekk:*q.}_ՒBh" y$Sb[yA 3~Ƨo:R6**uΫjF#30T(k^oAM[T"c sa7ƈD KUfBe 1U:ܩ|{in,qP,ЭHJG7Fjg[? pk][Qc*F[ZFe%ܬxb0Nk7*rrc{(=U/V}PZMWi+k2ٺP&ŤIm%c '2dKc2kdO7CKq8j䴍_?>8iuGxhJe2*'2+3[U%7zq3*䤐ȶ_f?#G,=o~bʬ`vo- ]k{hŰ8REanCB3́0 H#Q>umѕs* e*L/$:¬)j J6WU+BQ,F]ZӹPFQyrc5 Wꡫ.iCm^ǸSW=TgBsF (SB@LVOeɩ 1/eKM gq_]?w'qxo J(~~R'RXjm^Fa&%Z:#(m'#s97!D競$w=Z..r,SI(hߤ*Vi\^܎JΰL4k2YY3Y6BPJi_e  9 !7Em]n-+!z >5#oR ?g{=@bOᶲ׎u?{bvRk3b_ l9$:Nm^j'*ڌl7VMX+ም B~Qct8"=As "u~iݱg_9[-Pď@ݢuD X@a1VCb6c2EyGh-w[׀3]/mOe=$._"z3J;J{ GtMZBne夅#ښxC;}U?ctDtF==Y}?H`?/cy/;$V%*OR>)eoSޯcӊCF@($QGpS^`-qj =̀&lԔ'YSӧ$M\ 0QQhc;U7KB[Eoқ%PLh"gDR_5i (MJ)IEF7*nQRfiKA+ # Ez@ɸtLִZyhRBjƒ@40"@MҫwjDJ/?nDDvD_=K*B el ֦r̽Y<$45y2hh8RY@;KKA(jZc Pzl}n8 50|A4Oz) =no2t6'AhTQUȬ2&E]qo k1[jMZ #\;5/[HNEڜRh'-T@PqVZl=nD!yb!γw7;`>a ȎY*[nK]*կ'*b@p#AA'︄s<~tlLJKI:@v;>`j|4R)A_MRى^WJ,(4%|gq]uX-RzbnM\t[!n;k ll55s|€3;;Ph7.Vea9D` i p_1W6-whxފO7Z՘`vD%m)[yBup~~uu} +IB5:\AȘjr JK$op\cUTQ HY*&Td™x&Ʋv7E: friR@y%6JW?ـQ\$XUY%AHjހ%xl+*vL0B)ߜhu %g#'f+׫ϝqoh|lTpN<$Ax#t( MN6yU Lɢv5+4hm\P~w'SHC9Cy8N L9V;u𡐴uAhH@"|^DS.IJcO|6Nfd4D0YSFr~] U_3!.~N)8G.`JL٩9ɋXf  _0ujv/HRSyVT-d'&*#;Υ_rb68g?uMCʔj&W;YxRIzmjZrz j վ;-z6.z!ƹQ8_Lm=$sP aLuȚK8(|fhB8扽J2czH_1a1 LyFFh,̼ !Om](I,Rź[a[r*Y_ƙqH0 伽 řbcEڋg&$Б*;O0nBnt>vK]aZz#(0 m͉ꌆL|KH"ZS)k3G"-]^ߛxG S]{) 3R'3f1g]]pZёCf&r >ڌTŲ<)_Y6hc7aEW y7Y $@MvKWʌ NN?~\RX rE0s>pxn WtbTSLKN~PXyjh|$A|΂*ҿq1TB`5P H FqXl-V|vY $mZX ,^iL!"|Oin DtPHySsHR/i: A0 =Rʍ^k9#C4s|d3We*΂9pOۖn=I"D%u pދʺM$m)==e@/WA$e'K[7aVyث/zR`(v!RKPCC9&n%#ްf IJXx }ضo ÷R3_/9NN,;F$@"?iqKMrg/n~;n>P؛l~uckܓ|Tⅆڨl#j_s>]tͯ3:$WA0 ?`d@SJ~'rz,ӬCs )}o_REhՙ‚KJ`m VK]O;ԎVD f'W UϏYV6M0,7M K.Hu~K /b.Xѥ Ñ00uZj#YmV0Y_Ԇh5\-zg|ZNGeu]6g,~k\%^2IB8>g=nAe'yrk߻q+%~|[uQ]Be}뒃qW 44~fy&?Z$XΰArEA?i~[":wC^4}Tݐ)ox,| gV=Ii!C޼^)\H xs=Q>HmFjYroH5t6Ym ͐s`š )~ԆVv[X`5ŢhLqdz3Uj#sҢFczRrZ֏C.2LEO X,d㚭66}Os>AyNpk>W%rHX[te dZA6=c?43}FfZE~6WtMDv.Nׇ l~iƲn颎vؠ>x6 3qV2/ASB!J1NL A$1'ۧH,JaצzgK{3LGxA㗨? ͖51OhlB{2D+xDUKӕ (ِ% {yb?j>;N o#_>? 
rPEf Ns'<4Y[Q"]|8,S+4:lk6x2ihmb` O$p ry ></++S("'%LKzgKL9`7s[LB.yz-XJskHr&^k3߰3qzk t=8hc;H찗[G6k\rF83fUKMP`yam3r*r U1'i :s z+Kde.MGkRݦ[ 8Tڴ0:u7!&H.%`f~mOEk _mHCWק(;FqRtw0Ѓɮ09KK_kתAV!D.}VBh> mmm٧aul쒁19mk; >7g~m3v 5g-dFncZŐ"+Fj z:샜biPǘFyjv(u.[nCٺWSs p[JdasCS6׎e[~RFm h.*S*,LXkϳ2lid-kmߵA;v,/bR"K[}Zp.'Ҩlv<  HMX[NH12#M9},<"$Íi9C_k]׊\b-mwڒ?[./0.+L:q GFKέDZ./onvSsq.QG+!E}! 7'ooǣ R$FWCقG/<&q録ł L憔qk2KU/_'= ڶm$]z]-1 ߕ݋ IJK"JO^(aB·8đ'@h '8Iª+J#XHڰrfaݨ!7u!7S_¬1gãtURH&W̛YpAAI/Ym=5Ԧ.F?Pۋm% ZO |ge^eebF#3ʭ`|TTx'>1ir5!Q 1AD`W̤@ô.95S>+)ecj89'@.2XdA%njl@C g2!|lo=9tǷcRcbZˑ@?ˌc"X)4_&Kd.ȔC2#{0-ԣK7Wk{!nT[\?נ>Za^^bv !}p1گ~HHZ}L:C שj=/? 1''{?5~WWsrg9]`x?/{f2v\!m:)F)p,L{z7 .E=bh2sy&LGM*&J6fmz* [bHzjbxV*ń:j9`Aϸr,6uhk #FE1RѠ֐.&z@bY!_kdBl|"V\+)A>GaurwұUeGv~ NsKO|y܊/_[X*LqIb`A8ʗ=Rs|?ގW#wzRN_XU/>`S]3:/aIKS6~Eu0j}μ{T.+!4& [в8ޣlĜC):8cSԳլC123_B2t:HAA9ynw!xMo|dč; Y{#LAÄE/b5Ď"*oAwncs4r4 6lA6I h/7 (4mm\zL AK31Xe~ zx،[¨I)%uj ;[kՀw1Ч^iwʽK{<"~!vS?[zVCí6MAH',T,% g?TBaFhu\-Gꘗo4`J@KΗ9-q>~`l{hQc۟cu PX-lޙ qP9"H9Zo,\ƭҐl#7P!dxu(Ey]Hǫ]u{Ugz9hx{z Gl$Ӟ ce<hGX tr͜?V_ۑږXH}w׻=ERxGoJ3 a)XPQ1:; Ss+wi @gRr,s5\JPr? 5kN/F/K弜jC׋ICbk)D¾6ڛnxOs7@H('ve'#ݘ)\xØJy|Al;#8Z_9!Q7`>,9vJ}u":t)}`:HwomW'}yq%ckR Fb@KUIh3uH,?&V,C+%acaM"ꒄYQE/ISpkuivD"-2"FV#$q?C/9hVmne cln|fv|2LRrjC޷H<"Z Y3N9p۴-`DpGH4ՔJ&u E홱 sjMX䓉t1y8,]Uˇyr%zbk:~%ے%H"oz6IHXT;&sOr pg@5P4q0' g{p̩2N=MA ^;` R9A,D!X-l"/(YLd(T~Sae JC33ô3],jXXHWc-@"E5҄NH#"g΍ I53A;/lE=dB'!ʨ^p?WJ_PWUW)R%#HUڌ+.͉6GbN9z̪Y*erY";mS}iEDiJ铘׽hL)X-^x Ra hc.aP^ (}i퉊iԟARx$I_Kg &M\qTfYtOSfuTO7"m[n H0> }62&a-) IQ3j>t:2.,S@e_P4Er Q@W:.bb>1x@a 13rHRL .)+o/^bIR.hlJ!Z iqPn7*Y1әsL\߿q?+>e ʩMY Սn+FN/b=v,>X],!(CEWGDVXSJ>B`ZLm6!$K&)W'd,Dm[rfJOl6  hA;Nժ*ѐ?nowM`X˿\}x75,ի}EoCeo1R?6d`(g?dlࣁȆ ȊAL`ԩx)2O)c<;0# ʊLnm\(R59W:[er: 䀱vX%3&XmL#C`094tW/(EͮdTĚIFlNH`%dŗJ׻ɱ:腧ٯQ'x7Nt ItшB8x^M3-6H8osӰ "G^!ܼ/OW iߔ./Cj^؍~w6PK, ni5p%VdK`cG/yJwg#amWpPa6"TڳKcLSFmzNJya3w51VU!b6a-,3{FDQONfy,rv@*;KLAg\(ld{shMV6pyBreEgt2A;E9Gh :Mhv > bK*r8OV.}V`$E_Vb X*'S%$SlJ>_#ˀCFUJlU8z5D1 p w2sk~2Vlcc(+|m5eh#o2R1(IwgOR^<_i|Iܚ @XЫЙM>@Wd3]n7sag*557[LǍMː'IZ)aZ 9et%)@禐\AYr8hS(U{H `Nz/x )4g 2TIh1ҘJEV`S]8vהP5=]D=`(6Y0Ox?/o^}G'7?{y ?>y%SUY g)VmYMKT^ҶoyTl1w~񰹏\w׻whpxgTO]8+_?0[Զ:GpAp;-wD}H4INQf˜$S QJSMɠ SV4?TqLj0$W?&x!K-tF0e_ΦysǴ5;>_C݊/_ *LqIb`A@a2ZG/ {8`ϟN7 /ׄ/=x\+'| NvZ &+_Uyl!˛ -ݰוNg--XUU IJ$UwDƝYy5*݇15Cn: wN-Z`ojV j;SdeP{:kwN']&Bu80QlCD6 <7@Ivz4zM`e{OuLC5bk|HwΦ T< φ9 wsnQ04,մڏo;x<)BVm8pa`2 wc>@\[1 ;HG#׎=D`Zid]/ș;6&IYǸn/1\껇Cs=. (>8ql ?:JI0{Y"(5YVpa%!=f3<S$׳r8vJA=cP'LOCQV`EjΑՇSv}RkYK׆Ww = nGWoBXa zӞX1f|7#"4Y.o2|%(+y4Te*gYX B-5DW"ܨ3+(׆B1W`jP^dEz i&f649WkTfQEKĻ&-6+6TB`4a(|&=dl1HB^! 
zpV7:d^OY[{͛;ࣝ\e:0=P pVK1B Ӥ?w [}N-> ] W 1ԟfyɸQAfN Y EgV!yfΉIqsxNtPr (:Mڐ\=NKn%1g)O(1$X# U!{_v/|J16aIdg#yBt\[o+3%zJdԂ"f`1Z<>!yIH6VJg1uc:ҤfsZ4*s.Y[{w4I=IԎZvtK4cF2zHZg%a =e5#EB4TґO߿pUHƌ&-$N03rβi1['CH^Pm98l&?o󁿧_4v-l\FxAEϞ <ϥc`RR%3e Қuo"NWMAD'ʢӃA|8В+ O7K B/k}snBЬare%/%ɻ:ѫ_[~<߭LG N//)—~/[#)iqQgҁNi!L) J3t6TYp>T"fE*ar$VՔeVRH'%$'N-~#OFndAsP< 4LõI(>nȠgT%ICM\׌J'7`~H+1DA{t%@N~~/uE\|S, h8uSD0yKf^7LCSfpdPH1Ő&Ō`FO7z|PI,ߺUN"=ך)o 1FEvKh)"3 Ev>v20 9poЮٟgn7z0 2:=:{@)y; qQw;J(9z:3 MEaq%}X߀/x>ٽmM7Vh1b{rP GIj Ftp^6%A1(lpkwh+uF۸wPr_}I :0ZNQԞ޻E@%zANRI&`S )q^!rΛZ&x!BNݬuonW5wt'/~KF⼆EG-Eli˙l;E旔{aEgTX~\Q.>p>S[yE Tɸ@ȝ&֜_i_iN<>.c?UZ_/ҩ&dpP.r9Q- 竬#OILZ3a\MC z 1_ԥhadRYNRj ^_,'[b0o5_ ?+֔ODU'H&S W)akJ@eq䪂WSD"-Wn1h ')AY)h h q,!xU ] ~>=PKJФa~qQ-oF^^_#[nG\k2fQ×{!{d $l)%c zB+Y~ּk@O5t>>pBLM9FQ͟]*h^{{TKyѾhġi:wBA+5&[mS{~y4֢&&/ \4O/ןåkuPXOKt ŨG-1nQʥe fLѳQUi.4yb/~o=פ0p;bsNߚ!B3A '#~k0 螖UZViXe`MZ"Pf&2Ca0QmwzqY=֢FIۨ2Mv!UV NX%8aMqҘX|@I{״ :F ,:KٵHͮB0֣̮{ VقXe b-5قZk(a N"Gk[k"f7&M篭|M2դmlGw"c&f)esn \v4 n2XIK,]6˽"vM?ICAFޏ QC4$u|uu[vR= MsJagڥ x;k䥋h}Q UvxuҾ}.rSt$ [U-^5l`C`A!5/N0kvɽ\VQs[XJ=Ede0k4㳺K*D)jҫNbG5n<V\*y0Ec)֑"8dBjӍd/<ܾH] Fbce @Pk#pnKRykDKtE r=tΖ#A:S=J^DQYC*N&O_vI$ ~p~q~(ERmIIv 4dgGo4rYW^`5| }*yk1*UfTsw$񱋭KjҎ"P Вq+Cr l=m*S=4DU' k'8:;˹nz=akiG ]4$ēN2TGe ӳ(rDO11ȵ Y#OSrI=AȘ]. 'ZZX`/VBljR5lT`@!.f|n"m|dQЀص}Y|XJ42L 3BZ+% 2%H\H1+3l>YS.=$<`$U,OL׮A > =5{ D@>j G Cm8֟ܿᜱnh@ y=,V*FK1 m+J8AnH2Sdr)Q%T9IXz dq; +09(]D-,@q$T2:>XUn;l"X0ͦONڽ}pX2gTbp;FF\)TZPd>%(*bg+y5oLtZQkrRbFMSaZW@޺ZSk'ynm>TCFa~O1J&>b&e`VՊs^x`ڕD"3NKEWr6*:` tElddc}݄`pdhh)hm候/0#U) ZaA4eB_jE8"@{_K:8 ZKiRb կህRfD U3̚?KIv[w}J4 sFe[]C2Ⅻ*QT#ySrU)Q}, %  fq"A$u:VoVq}7;l!b$м17XSښFRҖ .sY~8g+G]!P*ylnт4ZSQb#kIƊՒ!s Zne16g##-T]$ mjR!STaBe4MPϘdV3+q^C9 vHII]ֶ훏;XrACV"V#KXE%O*NIQ%hUV Yh! <W"|,[ h<,Avzyq~(/u*ksJzAZ4\.q1jA} ;zqfDRX})Jۘ\k>%89-R*M1{M~s ,9SI Xhlb '`!,`GD8qvQӿώq~D?7rqVk$Dm/U E#s LD) (R`gKP;XTF<}2 7CY9i?Xjcg0tŢo.NG>c fܺf];rNVyɳu*e1>Ml,4h~*#S [#{u3{l(d'.4n uZ} !K vtBb;c1Ūdd^v׳xjڨ>j-#iHcS8 ;0,6~ CnO#yDEE\V]bhq]!@wLxM(y`BBs"P$B Vxp$K[{T"Qf &O+@D孔h,$nl)hql^Bov$Mv,8Cڽ̎mt };JG7PcAJ)t\LI+ɱ*:dIXx}op}99m\~١lfm={kiZVt-=`~k;m'^OϷJ5c~]`Sgg%ђAR"9`PҲbȅNb L FTF9WV9O_^~Tu ٻzre:!z09a6aw 0̉gū7 fk=.K2s GxYB5' <.`CHY*ݩʮǀ)xIxB JsutI{zig}sJ^u6RuN-6No;ay hFrEG>Y\d+.*k 'Xt`R3Gq١zn3y \6<ؽ=cs؍itɱyqpUv@ꖧabՌn\ np 5 o?qD#rz9zZJkZaL8Ux[3_=-αo9g쯓Aج ·ЊKJ _Reil5}0HRdkqvqr4:ʧW,ZwҾ3Go9,kϲmF#;/gD}Em#[ԸWn $T> dYK*X>H)*՘HWN<[&k[7|xfn7H Y윣lC>lt[mL?ze[ưIGN6 xc_Խ]pKq^Mgl x l@G$o &dBN_9/__V`O_K*߭^e^Fox%4~ϷrSpϟ.wona<^n7W*]UOo/^~^;Ags\^#oX7͍QݟߗL*+6OWL5cf& >]fW\%e{(_\ON[*-iދ7ox?e'~OoQlWԿg|&i]}Y#ޭxUkVu3˕ m(tvˋ^z\x;_#/\Xr%z˻o|UJ}~*Ib$f(F$IbFH/񃳵^OL4>otS&5sdO @+/$0((hɑK4TmpбLJ:auccZFÓ}Br攄sdZz5t~\> ZH^vNG6nR} ̂tZ4CpE*"BgI9c,|VaƪzkJ X h:NJIeSU@e"3~^͉>փ|$@:^vѻX_u3f@" Uy YF6teT %i9Q:BVڲHIs0Ǫ@*GFr]kW\<=@. vCdJRZU2:i g+6Y"uXᰪvw\?O;F&Q2§Ob{1cU<'Z%ժd)(0fhy6V=,!P~6FPl픑 6Z<;JE_=H;ϖpĖ84Fy(!0w x=&6xScz#Si0zjzxP8{|jzh1ᬲGix>gl"fA×jy[TɁ)Bٸ'N8ƭ~;▢L^ gq6|Pdb_)2?G8Cʚ9Ϩ;QвwBܲL_^e$S9;SH2hIBMk(HiP;h$#C%>?Ne ;|Y{?閇deOΚ&xUX\rZל>PϷQSq{d#04Y|[UjP@I$4JDՆbxle=otd[i "z_%rhErW +:(KF~Y5Ed- ssRA0*B@ mDI9T#kNv u,tRS. eLI3dkHFR~o1vilnM!X? 
{GóP>j)*l] 4؃BPw j3OXS€ZaJCR.TuVECHFN5nЩ%RYWzīn;-$hQ+C1mOX83ӳ6M}q#iF`l_1A}@[KQ5rpZiYRhQfQ,f)DHbzd,JҮc-U,߆Yq抆XLR%h{1.\J"lj.Ɂ-* NT& E=J'3/r)އm*~MF!jk9Mcw<5c3z6c{w^ }UL%fOULS}pel%п/ouEYi2Ihn./mI8i䲞z떟|Uo-Jrqgo@ouf66P)d@Q_j[Q_:iZis=82cvryҖ],@K=}]=mqk3ܲ Z.sug[nEʼn5tv ͢3ܞr4Z)_MƸ,ygQ% 6=hϊ P)yJ,%ڋpƅQ21W;Fdu J ̃DH= =ZJ} ||k8`?OQx[fqrn ]Ce:*>J=r*xVȗ%yod=jC +Қ`Ecj\r>}sn}]{7o:s@3xسXcIw7KiFcz z^5]$15Տr$/WצPC I_F]ؐ2k*is0$cTB(VA/>YkXurwdiY*@tV!Dl5d|n &,I.Dݦ%cGCd_C&SsE,!e4sg{Y퇌\gMXQ犱|Gb2.Vw ѵ.+ْ-NƪEjJIOe^gcYjm:"wqq9otk} 1ΡTE1BQ)goQxXM}fơיPw, v&݄a[Np7"9WkapY w} S|I1䘹%,T^ڤR }QmzѤң2^)`fvAcU}+%Z FCC$I bX9(&mls^.ԎԾΤvDc5m8Tm|̥pv_gMzngywړ'dty5:DBr f$fې{S"U}pƂ)9F&4 A|։%$)bI])թHwk [y:puD5L\ٺ8HEV >0:;ԜCȎNiiQ'-^m:M` ᝵(2Yy(vi{;Xj}¦AYϿw?Tq5O^~ WU'K8éJ*|R(,榗e(* IF_Z#05egK*G5c{|n48@$BW,in֘-!I~< r.8WDk J>xҊyM||6lH{i(zIa3YYcRu–Jh5X]l*بe.[&gJ%Ԅ6ʗ_sGOc P};ɨ4V Q*!G <NԮZ=Jf!.=VFEMq6HFWšhオb% (JVy9JT]]S|_ }_WPe߼M H@R >:A^.VjdG8 aYk ޾f6dV<.*Jԥ'kb27Fmp5b<vb(a\[?.Zc[bı*(֩IcMЧ){ϲăvDh6'Lކ{p"⩃;TC>?n1ק^^*BBp^ G}9XύtrׂpzB-\BeL퓘lMn;#.n7~^1+ ',{,i^ɷ-R<OΚYMYPĄ7WsOkNzl?%ܬpC"QMY@`)ecnǛÏ_IWeSXZS ahӉZTlk25r`wKbkYRH56 y?rO6AH^ Q Y 8V%E0r8Zn0+/c =U裻=;fQelMp.ߝ>o3e8߭DcXe[!wUͺ hİ5J3bEq 4xپ4<Z1[sfb;f6.b^دs@0$/tՇmXe܈1鶮0.4}b8쎿GW܏Ho#W1bC(~QV лI#al*3NJY"\pC$T}+}L Hܺ#*&KdEVbȦHF@E!Pe3AȾ.!B+{D[q~ƙ9ooEgȶWVCĥ kc[&ݽz&mi# Vݻi}=Zwp~9 7/M8M'gCɘoQd]= PW'i<+| rF9} #8s}T7"ǯEU2z`ʗE_}M/Or`O<uԜ7=C81] @8&pfh+J}C]EI]ĖD{o]-0XCuf5\l@V¡]1\ 0oʜAz͝M0Il>Ԃsa6ƻ$$ĢV.&3v#+ a2Į8B8+vP /D2k0}CU0}UPs&.\BԾۊڨZa'${dZ'{T% rB'}|KГ>^[/rx1LARvƊI{)I?`V,p+2{ic NN﹚Nz2-Eqʙ M=%4z0,M(I(u,18V'N9Z/AWA&)[fճ??-믿"LOuAڷ3[1#6AW2b(:bŽH6=E4)S$TkM9~O i& 7kb8ƱRŠXƉ"Pyy#TH$Azi8X™S`kiI=\1ˬVĒ$dzQX%goz׃R띨ʯ5(OމqOމz;!^ ;X ^~6p]f~4Exs,Q0 YpEΏ-^8 'Ho@_Ѓhc8U4zy0,)tgK.} +,( =Ӥ;8M X42,jPuGi95ѥ}8%Ӆ'"D^ɷz"wJd6&` {>E`\W_0D쏫4#!H8aTPEf-b 2΍$ѵs\+c9v8Q:DU&1d1'׭J8"Y'9ŘF@fY A?Q;XP<$p$a&NlYSXdiR} v;;rw^^#phk,9Kw+=g*ĭ(PHIP%85VR>KCF!w[ *?׵JU^ZI]X!z5ހ_\=}p#oGћ`!~xqe?YoGߋ‚`GO>Dw,FIoD Y,!?DoEk?_Wl/fFG =|Ss8]xEHػRh>)cC͞u b8XGPA2%yy>zSE೰`]ѳ#`&gDF7/N'3&:9oGgu*:G=33궍3BS4aZJt_^lx9sWp#Q*NsnTbԽrmVF-a_<-77K#DI(M[Rk%&zSlTH!v2nq ,iʝ A+a9X2}I"ye+W?#8ox[V=]#JiEѤS!s,?%~ɛh %Ѩ`Z@В@ƌJpBb!m$r+j#uSol/ gZ`i)bwyPcOqIذ/0cio'%=P#8T.P >KWH"k4 Ux]~cKx=8o6rR+xj+F_!*t̩R??[j%`7;X+}kU0%UJ#)-uB\p= f/keQƛi7J4&^;| !R8,J81[~{[a\]$Ջ{-<*1 :CQu{/:A /.R.R(fج;rN"4LJHn,"QB*iguZ7_FJ`F м;׳eC ѭƩ;Q^H (1ւ^G珓|hr t]RcP*Sa*Ƿ:|woIj 2]pm`^=M3h!S :d%/aA Uk)Vtxqw8XP,cgw8p$l?=j;2'uY=%W3W@To[I`n;8Tβj%=0Ī%>z p~K0٘\}ՀM֊-k\48Yp =1MvbnyixXY4A9{bKۥ'4p7JP6*6&rؑ >ގ+|%un9e_]~R| )l纡FHRܒK˔\*H (b6q}*VLw> [Jb, j鄍oWV6lB4\|߃ F2&0vHm $2!9ú=Ɠ=)g OdR}U1Lq}D)#f>gc&2YA!*5cK`S7i]QqȄ&ܳI !` PsyA(O4ek@o4il*Bxsg%w`\+οǃcnC\ ĕH<"'1N K]8WN i΍u)1a``Vؘ d 1dQIY`?L4b.bgMV1IհK}*uZ` 0;}8Z9pcFe6jkDKVŪCł6\8jo3 f!TԐ?$e<>i|pQ;K.ĺ(%rO2ȪKo}Kʬ]ޞ^af exN\by@ !EbI1*CѾCȏ[(U8?#F4CxmfM: ً)31nns[[MDrGUZ^]c.h3HǍKBHct1ƩЭ+o#hd-n3(kN^O|m)o1\g[r wxo6>CBӵy;%~3iFN)=C'q0X_S%uWy[0wh:߻7ӻ'P]S,Viwlqi-<:5{g i>xJONEVKQEi[Cgڔ mqInض[අNQ N)F?yF}//o8ۻW{.3Wmb\[jR_ׇ9ĚDG6u(rll7=xH#-vנbPrmes30m ආz,p R.$R{~J+?39ڦ{]Q JFmv3EU'Ւ3lSPMX!̌33*Sdf=<< 7N]{I%FjIr`:aS L<JDB+moe;_xyk6tuӔ}ʄqGW[lLW0j%ֈ qf }<vwV%Lt$`a1k&h*sXK#0)p'LF`oh14!:%^l5`n<<;=MT85 [K;2X2b&Cwax%Vd ۫gbl*MPr/8 OFEa1˅D\wD$Ƃ=~{5I5ZlSCB6 AiAРd.[eSAU5Jcx5FAP*ylϢj\3ڀyj&g5Ll;ମ#dg: %"H;nϼQ̡ztnr/k3aPWd׍?@WYq-IZ|Δʮ%d`N /yl޾l룆B_giG9@%r<5b7 ^=K@th- _\\Th.SVƈ|M^,J:=L/vQ7t5M>dw4I-(1Z80,I?#I#zh?\>| V-sw?]mo#7+|=%" Eɗ`eFr$y^j@"Ÿ R5}D) R|]:fn^F䱖c_tzEiq?;Y˜'3X)1 *oO H&=XzƱ\3'^[u˾{7#}]F $x uX,1FK;BJ>QHtTy'R_JD\0cIdڑ:Nr 9Jo, CGPP8{h&0#TX"5*,Ր`R_y5KUb"c 'm>{*^\ oAz N\eB<u"\gy`.Q9fxEsNJC1G^%GKr T)rl2&+p+p2]zN.>F&qF`CX=1Z#B(XyR'J)-Nܡ[٨D+4Eh`c( :"3]M2&eJnmRtHs=he 
gJ?L;.rU:[龬Pkŵc>jaֈ8뙄$'6\$i膋$S\$Wwp\OZi׃zIu@P\Ǹw5y\ݍhkH4S9ߠhoNBT^>XaXnp?6RS'{{8f9)`H_F9'RR4aPd-X uq y,İtz0bG6)v2ݚ潖3{v*`[s5"s_诖˖LttTkchA,`[#( g%`vCT ){H&Fq.Jg:1}3Z`y(EIBܱ%uVob͛ŭ"DT7WYP:ܚ5~[;Ew/_S/%#M ,u}rWwغ [ϩpk(v# y*JJigumjY7%XNwԱnXUK֭=ruBC^)M{GMR3ZNwԱnEH`֭@S[Ut_h"%*Ec,=fZ- ;':yv4_~(>ǣ̓qh24l^Liay) ΄bgZ35cr֔$Q:/11=گlg/˕l|ن *6,&y5?WEO_o1́@IJ;]/V M̵lL0rR(0AeP Y1!$} 'slQH@brcbKI$WU4B13oұB"G 9?Na*cU%<{cH 8O ð+g7vaKA{dzuǝ9;,N"9ɽcY,3-vlx~sͥ3s71L^go2`R Vok_TzX&ǿ+SvTZj٧3dT8˖Y=(%N"(}O=~bb *ް񨊎n67?+ܛia1χ"JtXA3"G'NVA.!K9L㨰#7r{hmdK5 .ziR-2}W3҄enl#7JĂsˏ% ~dscQHn<l6qr?SgCɆ"v2lgD;A[s%󐃳zlp.fࠌRǨx!pZXmvU!S\ ΂vyX?+,Gq,31AhK5$)+rt6+B>buPFZJ>L蓻Y[Rx#l!GCHH%b$FJls~LVi͚#G8ՊX៫'F ٠UVg'gW[m3`L+.drKƂ"PŠT2 |;Sis;ۜXpӃ88^p}Z_`M~X[@waf ލHEsı 5sÚ9l8f|`^F֏? J ) )خ3x\qp.2hύdz0lUJ |)wYx0Z >PDÌ2zJsrMNY@sIu)" 3oY'f5u)n=#H8Qϼ=֪$bj kO+h,.-WSTx Z 0g´֛VF2|j_-fqE &jub`w5ݵplW)s atS59s"'sm,앋0}9C>XKʧ*ܞ24@&[%\=ƛG75ގx=x>-蝀1>g,1&Lpg,'&S>xw`l޽pi, p=sPP[xawf8{o&;9I4c^WٽnD a-EMZz-cx| }h I:Є=胑Lx@͔2&u3ΥC`~WE4s:l?sx(^Cņ' λk 7ѓTls)R0Mw:3Z *20εqkp~rV In5hČ@N3Bgue؎ծqDTTϾS,Ko~5I8YEU)_*PQz8kΑ9Y .{b+Xg2r.KF -nzQR -<#4G3&bp]j\E#ݣ5^[+%cJ%h7^- dY{9]n-EVQ\n^]=wduiMOը۽tň4i='{bpi^}iqDS 5>R3-CaMN^$N\{ 9!:k %B8POjokB3:Hi.+I.ƻ~zpRݍ֑(V^o Rŏ i=n%Ĩh\mN^K(:<=aGOkG]ZRT.99U5AxjB^z"+8OD4䊦&<_Lb( !+x)\;^JÃ;*W!HNU)ytsZǏ/> ,B$B}3]BD\H{7wa"556y)!.]0=bvϜ,^>=.)LZ"6uNRXA.4{Dڡa?tIOP\֏_u+c` EwէΡwWRz9Yo#S,f/`Kv{u5[?~ɯFf.?= !]*iZIJ3KՀ*U:&_͜B2㴑 .eR"` rDP!RŹfE9e8$>h QL5;gvňjzT@8h3c1ѹR0Ϸ\ %JC(_RUHe'bЗmهձKE3V^"#ܴ6IG).V [94!z31֟rczw'qtHS;<#q7PYD̂*,P XU*=|M+ * eԙz5#«R'subcIz3 g8`V}/eKȉEv O~mu>īc1JӻAC̑]Crqc:u#w2ZA~=BP&;ڔ"XJPʕL<)+aXo) Z1&َL0L&It3cd6'WOH'wiD'S!H ɰ6ka+@ݴF~|k~<*|o2Cs\W4/WبZ\M^3~J2LGA~zx "2Dzߡ7^B2~07& *uw lVʕE70ӻ?U BEby X WgT~w?³6=&(,f>bXгsgo$?t驪Rk`J,W**vQ(yoNfnJp\H.#3Si`:}5X'm{iH0ryyw0[jiͥK*xlzנz}^ P\tyE}j`oq]Qf񓁹^x3\<d唷v'CT%"OQ-l(%֛]a-dzgt> L8oD^sU/Vqb=̈0Kw[jjA=!h!V )~~UjON!C%T&&K8G**Cb4]o4YMbMV#4;1IN@k ;t~3@6  H $bP*'k[~LU[~P@#1eJ=Ǐ޸ǿ+ӪC4/&[ް=`Qc @9IXNg!ZF&rR}r-Uxű"ͼU0"XߖJY07ɕfě{б;{[(!tVUpuX}eg 껿=ي[ BIoë-ct,ԒUѰ7rXx7>C5w7[@Uk&E(˞m9d|jn/Pvns$j{VίeI@gU|;<` pdT3YAzW[qiAf㬞:Y&CzCIpo8~+9{jbB-~1{|bo=^^[ڱ#jPbVPW{= */X_ fh]7}^P^ T_5)Aũ[k:ř[Yĭ-)Rk|-1S+ kB5c{nU{i=0\3Y*TYɫ7ǽd|S~n2U S⽙\ 1&ݻPF 7րW>Nz+ u~iA+'gyt Bsgx(N7Q"Z˯ﶡg:|\1)WlsOJƚ5)v)sVX=)s5IR|&S^} ashjFb! VzŔ5j9ݍ VԵ&9b0Put1`|&ccIK B oܒ\Tr_pAxJTlKDqVԻ0ǵ/sHRb šP0ǭ̼f LU[(Tn`׊p~apb$,r`)xZm9ņ",u``($3I`V8iS,&2-$zRhTI& ڂ] g!ƥW4`9FG#9U^ F%raO -@azЁz# Az'i5P ^Bb9#\n &˅Z/A5 nyr mij ETͨ ;K*%9X/n0=—"(ٖw#|)ϨEn_kNr>`AL.]=SgunN-]Q%08A'K: a1,CA4A` Oq-%{9,|V x0Y-@KmFԼf5쳰!ú01 xN<l#dR?{vYx z/nCn9Sg]!Xq#KwΣovaY֙2v:߁3./.09Z]x3e͠wgǽ;.硺X!~]re#ȵ2q19!kXbLlL36橵('?K'kZB]̜s+~Xc' j1$-|7P)Z=2EF_ Ê8 }u[__掝9bXGci+XMa1`@Ơ8v!#yA{[' w6)YM"%t aer#VK4/ZX0n$u kNEK,ʍpT[ . *&aMzG՝&3)BkI"$ǂlتnU 58,cd F| }:U+ [R Yk&pRKi-h~NmɉOBQ' Q>DB`;dM 3"" dM^ ‰py ̹Pp`=E-AHkTL;EMT($VRc-f+*xPST+_3kԈ=TPrTP $f im3Ϟ!ɞ! gHb bѾR #s&n?YW"\sI571DeZ֢[#P-QKGO{$+L?amxO $i"YJ?c,fgê(xf#Y:V~J<^:nThrj+[41g8dXÚUyG!}eUDd3Ĵu|)^s)OĆK>OG8OSVDV@fFz.Dsf=W:hLVj5nE(εQJƸQyυwR0Pƕ,Py . 61bVh~Js̔Pq% ?.w5\1͠7.vg[$H,'%FH i41F149FH|_br1s+%QB`&Qq0Tkg$EZiMa=Ba@DB U`xSj3S+C< 11!WV6Bqh1W5r%F.#ifFrAFB'qLm)]R8JS Y]m_ZwV=1Ka ŕN+q@9Fq'VE`D hq;~޺4 1nĂm:5CAXL_Khq%Cdq95˟d4^q0ӛ^7ޗ?yw}}u7 KF?..ٷ{Ln{$7V&uݎ/i%8j_|Qgןq­{7.U=ƠcpZI%?e6c5JDNMn@_L0%0]&uFSFw늘_;2S8>f<ҴW Fss)sW-EuyOx»;S jwoyq~ fK?9·a{+AGz|Ar^>xb>"XqVC履oLׯ󨓾>bE!0+xؿ\.=(+5#3WQYY03_/?{ƍ_Q.~*WKS2cIʏx߯1CI#>D 1R]ݍF7L'< К d.F~? 
>ϕ?xubvNOT?秿Ç7 |;~ZS}ۋٹ-Qh0<-x3>ZߕO|-O=gׯW 6`nr0 W| .6ן| ϩGe ) T]]νiuQU^[|1OE*1[|H٫lLY!ܵA[@#p_T훝-+M* x?^gQ|ѓwV^`T>iaU4.j F~<1 2 eO?|tK=$hj̳я߀Vߣƣgu`>}]%?7OxWbU~Ijz\4ӺV;I)y( HCK]ɏ!6O9H1OudH&-`Ha`n,CO<0(g1sdv$zMF1 [s.!}X9K$9sGWHRŚ(Vn&:\!v\}?RDH %Nײsd|1"!GJgiH i_$W` +uba!$b6"!s]IԨB6JTJ΢Vo*~[j}oCTȒSʓQs0H\Jcɔ]Eeq=jevB΅>@"y9:O֢N[; Trښ"?:_ 9%B#~´Lx2RT!k_Jݨ4@uUv}- wpuW$"u-ʕ/e(]L8Όa.7`/ŋ\tP͉ 3f[(9NKȕ/jHpOڬ3 @49ts l!WYn QNŒEmad~N#Kɑ\S<8MT$u 칵k .R[#S\Prdg+B%U.ۑnv;`k+ mŀ>kL6/1Z+bGu$bhG JAޕxOߎxȠI*-u!NDkM vX36#ީhT94T"#"9z'#,<%G8PTd7# oF@C' #J1R0d^x$ 3nhUN7$VDiMFKB.3Ԍ1!X'-lohpI)!R:16R:1s _jהJ܂/bhk(w9 ㈽2Z|yUe=fJNqWmlx;bѬ1 ]4kYkx W ;&A};Z\׃JFRZC[}G;P&BSR+f}zv(:eR*Atpwn 6hIѣS*݊G#Pyhk`CQ^UL(r8s_w^L#hk%HNmF!6q7 y4sJZ*F|a! _Li+HkSV yTESj}]QI^WٓOnt/Oܾ~o~ʧܠiJJ97PEM NSew=-pTc"ZT7=N{=N\RQXh"զJ6d<$Piwr…M0nօcְ%ivI,,U*K*|U Aj1%kQ󥿄C_.r-_'v[0ê`I ;`^?򜎉jk,p-9HdG R(7HP؃L-J3Y õ Q Q߸hū&-| h9$ζ/$fM'Mג=_Rpp z@w\W<ثn^UEϧ័XsNJ(WBXp`,w@o2x~`r?0ܯ3y1CeYm<#9͈`Gq^^ÖZ%7qlBXd,`*N~T]v^H*^`N}IUv$Y4cK3C[[Ӄ";7gbMnW%wsb|TIDwHת]dFQc=2KCZ('ω&!3e¦X{\[R*=H\ [zB3!v6U77aU 2[LL]$>Mg57H½yHvYT`ܾȲI`#q3uÔ& et`5s]F[՘q~`bCq,mtkZ^3}T0QXh$ڷcFXhE~GqXp7 At$@U-PBM`$fTW45/ [G9,*N3` mB0`VƯfLIiI&E ljы3VOXJ3h{Dc X}g13l8cm\W- 2Ó,Y fb>Nam^ZN$kZ`̀ 40)d 1, K7w?J6:u[N~^hZ -VR8(*}:vGBzݥK_q{ռ0FMjLM*⟂A2&WCs|`oɉ/Oy07e>Y4Lά<IWK{h:[k;;;r13Y1obwƄe0T4 ~̱dbtنg\Zp[Q)~nA>A>A>uGZJ(wƢ pFV* ,.E_GEO}Pԇ,R0AI{ۑԤ%BgEq'R7ЇDґ\/0LQh)]prSK`҂xʉD,uF"xH!yB&bJ?#XY0yysX-FWֿ3eg%a?yz=(J4cs31yG=+giʯ^ .A ADdf/G,3^ 4<A_\ Nf6eq|l(2`;8G8UDv5\I1U<Q$Gն4vV"A/o,R CO&jvdR㮘5y7 6dV4%ARhwksD$C%.k>>O5v|Qa NVJ 9W1}&J5^{p]u#r(mqG/aiiDD&>إ '"S1Ҭ-\ $52L[,z#*`ÊbZN} Ȩ4c ٙ'KR13lH}cՃǏEUc3WL.M!1S}3_bk2I$ eV582 񥛇þ-˅3 ^y Ej 3s/bG"HȖ"vaH\;(JՇFqv%#Ȁ3cz> 4nZZL_Z\V0AxRH(l|n֬VlHd=Lb,_1K,H9; a p"n8RF(gn奛T`'e?ҖWS<TQtG!Z^d..c^,fD1KpElR#-[o4)Y ʘ.3kUyWhvtd;%&!=_](+ ^ F#<(qvi8Sח`c:o#G̥MugrwGVɓ;?{XJ׼"KTnꔯtt?ʁz&O̦gbs~-Oj_x|oW uІ/|_=q_~P6}Uۗn$MOmmt[m My+yj{R$o|%_k k^7K&F1$Q`2HȔh}!kν3??~)x/k~Y= /M| Ԩ4l2On3 G̵_l+nK\q!y;u){MoI 2ܕ6{1 آ"]J[PCdL%8~4 Abs L4²QsoC].JR[ XURE]}bVu;'Ig^~J+18*eB~ȭCNBͶZ1Thͬ-_PUoqI>@d!N"N jo疕?6EлAk KFI\mMeBӰ؅ѳ [mWBB"'3W^EJZ"`/.'0G NV9{.SU: 8Qmh =~ ,W#h%tF>zm.ŏ̿Nn$Y sonsܘvb喺WIŹT,X a&O˥7l&\`$%%1mE95Mil{Ǿڏvd t8jٟ{Q")7(% XЊ t@TkY;\@>ZyO#jQZ %E&FM?T E1|Vr\dK&'cJqIł!`3U ){ 9IXXuU*ŷU<_GHOm&D X^w U9-d{G,W{ZQ3 *6BWDYN൵R /ٚSrz=o l(EhZ i(ӠFp>;닩FfsGV.|I H Jfҥw_|;r*68wb}ey^A @̺RS(o5^x[ٓ}GZ]Pozb $ibʍJ|U!zr4Ġ9s'C;<qLxDt?s8;cWo"6St `dqߒ6$\F\|;j*.<[wNƻ7J 3sx5>}E#56&Ex/ UJz ަ3l{OOÞOMgaPI,E%=L!d/>ϋoGMq,uhػ2溳[sj%!!eSS*A8!4md!VgNgy{;G$M*ݭ{U<Q b/kowS5KpuBh*3mWg]{ tQ ;33c!O ϒ; *n7.5w LB)v2 J:t|l'Л WF|R ZlLFp;x-ް福)SYY*7&[:缻L-l1 )Z0jN̝ĕÉ^[a{{zgRR-=PL8Ql id.}]qgBLB6^GSkZMfG&{29FObSi½?M[xZZ3k^[{[h*ɩ-y*mq4_mW0X`Ck1P3B$`=~,eBnz\nԩ_͠͏|V#QS-W;SMEocokΜg7( sQUbOP.)Tϭ@bSC]!'-lS"ݪc ͇='JyrB#&; #*:ֻ7_G).(E̹_\Wr.>wI5M_wDXwރwp!M9N҅Rh9p%:Ď@f<|"I,g IᬐͳL/3vn=z=;8VWO5@;#9gu9 ZoXh֞χG3vҚ9X_~S&;j\m]֞iǸmϛMEDz GAs4jqS)czOE^mr*DSIelu϶9B -j?O vvl_:M@+7w/?o&d0D-IFlX,3w6Ss9>uWhG5߶?*1) d*yi>$Mf,~dtYhhE'W[ Zzϴ H:@,P[6TIDC[߯ iLPanͮs:PzY ]-8"/^EFf]yͿmG?MƟG^˻kw7=b+qv</7MC G5 _Scޱzihz*/ѻĒb)8USQ5ϊ8ag\E%JP/+Rl> -gTQX6N AH}ƤK\(`Q3Ld@~ni_A1H RW+S~ޮcI sP\bw*J\8%|cUW҃eXoJXU`B.R4ÜTp2u>JFZ}iC͡*jDSC-JK EX{Y ]h5?{pfr7_z}?_*FY CL|J9,tZO"\hM\_G MOZ"dKf`xY rC/ RqA߻;'Ȯ0;7XDј~]^;CFPlڟS{"w ٵ[Q[G(.KC=ףZC>FO#Xx2:~og2VKApkF}*H[nPt-t;J)}2kZ{pcWf%/CWFP NQt<9bolowdj=&.c%f!EޮܧU غGB*)e5F~GPbz`澰k/%GPnLY޵q,Be?d-!691}h$=:UCJ452eiWU]jg#9ł=X/(d/BM᷂q θFERE:5 *+"O*ei*_)X#|6rCΉTY#.mQF$V9\N6Zʇu~7c޲l1)arZӦ47p1RAfyslKM7^?C<82mA~*-+* 8O+ , s;!1a؞n5>WwkKr>?˪@>`v Dة\uz{[GSQ/S|Dt O![NN@~ibͻv$w{Okku5z:`R |4W*&=VK;ך|dJzỎ p`Pl{ U@}~THW`s9ܞW%g&URiSؙG. 
'4f 6_:÷ᡅ;C8k}BEFH-ύms(ZcEcZgўXt;(GeG2W ӂJ #@|/#2{BUƥf,9*CZt55)M1PmD]bw |Zz68h 0SCnj^)+rQjb4WR~NkF(]77r,jДvMYhͅTxB5f) ú ~`a4o{\ 8v5Ǖk6|YZR &Jt~wHˮ8b"FwuE~~֫~s| _ ~z??oͺ Ja+>@R~A{@́m/(L` 5m4Ch|E'N>t3WS`xwFN/ SHrIBf̾ՄGRҳ&Fü|)ij#?z5O?\AZ|w؝`APfs[s e؍{,ސLH@g> %yRɺhMV[`"pϓ\/,DžVuqn? $IMYgh_~|Oޖ(QO9U段Cu9WR^,d1Z`|@ _zqĥZj ^LlߘUlQ%d^)Lg0jB_$ax_t=IjP,p}eŤY߼R{/p!hUXND+񠸱?k[cs_Fti*/O@)Nfi=;j~FY>8H֮N.h` A,f> W/.# }KXЁIHX6l'uTަZVΧ41٬MVYVUۦcVM>.JeG~ϧLfMIT0%9.J{)DpH/dLfg+nzs/ 6&.|0i4]|\y@ &ec!xϢdT o%]Ŷ'h(rq:]TV$48,j4Pz?NXvAUxVnVĤ# F #q`/fJ P-CK0V"}iWE [x^h@z0^r^m7kݑnMC5)yݴT} JzStK!\dnR0˼.\ ĂboPۨ6hPr%׎R 2A,b֖ LU "kcu=x^ۨ "L,LJ9#2HGHw>6T-um;}uJ9j<0Zߺ2ΆDdmՇJBE 8%DSO(c]"nwshk[V@v wک;%9ɱ?jǮ::5lfU0E{Q>_UmCJc쌟hw Lx- x]+FZ9I*}h>}#e*39 3z!:KH 0c9H94vo9a6}z'_qeS8FR! Z!Tâ&JS\ca4:Y* ;NH/&Z( kMhڒt]{!f2f (|(L}؟4dxfdFS+/B#, :@P2+0efľs5?skPBx}4NU|i!Ӡ,KNQVC SD0:B!Պr=`=OS؅c=Շ-}ZK|Zv|M7%[Fv@mv؉$i N)@ï›܋V Ė8K ͪ`[n.e2$X ѨhIë#slq%K)4>qqI,H o(4 KA;N©0j#N8P>΄7*I %<LZPYCTrZd! ([J/ Zf@LtTxO:*Kf@q7Q!2LbX.DNg/RWϨtާ!A'A.+ MX7Tvy"ݐ;g}/3BE/Ex%aQ$囫ʎx!顆1 x&<\`d@4y @99ѧ+`[J:l H)P [󾸄1z3嬻KBd6\_/>/*ΠxS+"%`i ˪ͤ9e26Jd BH?r 񄥩?p+#H[TGTjq }Pў:M c.#O4p>Zj4Kuaq;#Geާ1viev;&AfxY썵Tuޓq$W}mˀ?^ }ڌ(a^դEC )ilFbÞ\>R-5J#QtL@amKqgp/.p[\t`e߆BO$G!xYc)MhHLqo#Py+1o(-UF̋N _zY]|9o M)/m @}:QDC^ZFr-&K-WlqtGt`zVx)>KN@JKLȊa(W+h>j -@+kQ5찬a\~jְâk[!Dt@5ҧąYSZp+0-f)aFEɴKRؚ\p!p̵|!m>|9$M!$`3r[Zm݃b[ѕ2{( -{X4mW-9m,|z ivHkVQ "sFBRL"lqz C­K0 )FBHwhGҚE2lOr-m9!nWMa.C mw: h!r/]0|KKZn/P;~X]zvO>u4 *X-1. dܓjֿt|Җ_J'oX\5ǶL6l75 %t)F~?re5>wtz:VKNð'1nl&"@ s$I9̤+Ќ~?bA0|dT2hiJ>}rg}hW-/ yuB0zk fYgR~Sx\JF!&H -x#"1]]o&7Ցb4.,p%ūӬ)jB jw4|m:5"D@SnX]$Qgя>- 0v;ewE Yg4L7:ԔHhYD1Iy9r2v^>M| |#XՍ;y2:_E a7žKa2k4 )LP5JOp&!#q?Fn.5͚Sp[٦of/\0H.F>`W_N?lbo`tewf7#f)4T٦ݎNQU 2dkW}%CT>u1Uy7̡{1P#2+&pN GO)Ӥ?8lV$.>ORϥW,M,O\i:%q7,4x1ѝ!]H| t0b&{LUa ,g.=@ pCE ʭuWDX=u>,G0^+4}ѳN9O;ʋ6.06ާK_/,SO<>2]9^U$~xv.q|t?9AQ%|r:eJO.q`By19΄sG"zoDUܕĔd琓"Q@ӶNlU:-Ę]Lg xS9*NېfCm&*6k:ǽr? { R/"80xƥzˋ/aEIcⱃtҋIZ|ZƁp 7/=֫`^~o/|g঳*Ϻ_/ 143۝pBMd`-j05jkJGT1/~XDHO_\d\pҷNԟ6!AM QA$IHQ1L6FgJ2%G*6rD8-1aY'BGi  .@]ʮ^~R\VM:k0m9ܰPn'5eJD2Zv7|\D𒮺)~PJM? +PjO;t3)T$}90}949۝^iRYɤ]{vxԎ)l~р}S$Z[?$T뺷5v$79 ןmxHY,z’AwN)撾y8\}뻝Wp-\~ dΡӨ];ܽ-lԃbl^VٸyIrju-[(ecEW Gq&t5W3?hH cSQ㠝yЈegİTʮ߷YXep7[MʆFH/"_O$%(ijtMO+Q?rQZYԢ]J/ǻFLGnT#lPFND\__HȚHH[Mj-x5=Ekp[ZUrCU:bwt=fUhb\;?؉J3J16YAJ'k^D}xeO:+{՜/J.}9lJo^>|[^mϓVJ`S`'fہsIg I- 14xY,b+:{#?*iOU~i:%~4:!2Z]\L2zFgHN1K7[nH"ʹM"e|+xKgeŸ֑ԓME'0 [KjSJȘɿzN'#ׄ14PBKd<"!RU5KDZ /%e`j3ITK`0VE)l0nc%)00${?r8% @pk#ol0(jܛyE+J3 )85U*U?VKj˞:v6/c({xoS]#USؼ'N  0x͝lNgMa llI(\8ZFnT6T%ژGo.l,56ZWZ3jA W:IS9=xqĒ_!k8 H){j*F2b8n*F+Ec7H0!9׽[~/D'7\PI2Ȅ:GK'2eSDhb.)E~?fU ^.$&|Fa|= /)-p{3\\¤P)#IQ#nG(OGQǗ]Qk^O2}N `]幧>K4 xoxN2Q צVe;{<89"z+Q29;YΙ%͙\zzY$4.K=Z"%QeLJDI%DIs@K98S☨aK]g MDL+V !`ZJ#XK:f%6@g%`IR$ (9l)pv(aTMVZ%=c6UalZbE-3z~lL9Uf_QhU2L}oc)pO]kT,3M{$ZIdh` F(!߬lL SsjmM z\ b${_?@\P+ $$I !ɞw{X Zggvo\\~ N7{LDv-A0LDqX梬6N9H0&j. KyjH6&,Tj}ګW94CR7rxIa>q`^%)fCCM f -d'0ڈsbA1RGkfnjQ`| 8lo(0]PpF-8׊*[QH3)C9SEA 4*]55h^PI^CL: f̕VBe[~[#-zы< cnYNWMV8m$Yg<\т8 sEQ)nL* ȈF~"LUr`~a1m!nF`c%oh7o@ؐ= lÊbJ92t;3Ri(_%wwzώ<  ԂZ&{_;)eyMG_3i AZyUo]l+g,IQ6p s j.C( .khW"ZxX`hf =Vm',nogkp+S; QY8AU/UrCayh`8t/翙P8=XBƨ爂ߪ8}`y]9XDDN4߻^pֺ9?xy\޿zzY?lXz8EFG#$j 块2ŕ+!+ ^]hZ!HRLVԩӼ b bMi_!) 
fbqڞZQ]%b Km8 rL%gw2#lw|eFК2C}B_ҥeOW}y`Bܘ a$޸<6FMS`oi{ôOD+y`/FX(]Ϲ ̿NϿQإcWeІ6d8wLbYs v fdObwYkZK բv A#>\Rja7n/a-<|x;xlVIbV_I|?JZhQ˻>/ز%{n2:KIi ߆+z&HfH 7ڐz?8OTX޷f4C$k͐^<2AΑcH~jVG#5x1-wyU*l9>qC~3opGMY,.s^}|yϓ8u(/?hr?S?x~fGd6?0 Pv鎲*WUdQ`}N^V͆n^eHC^ҩZ5cu~Z<Qn9^znZ.4䅫hbvzH۶nºAQpov<ֺpLցpmSz(E "VnB$WL{5A#M^330cWڱwI/*:87N bsa) :%c$wMGD @q^@, z\vۜ1>>6ˎaxԹPm~Зonu{TEXCIy ɥF4tʨ ?`.R/8q:p)(oN"$=>\ ُ v %J.Bc1xY4 ZW-Al#f9Z8] |9KYýa>ۘlޥ01kO|F2DgnMq8PJi6<6H3G<ZxX7-Bm.eZˌ:>BC^)G8p-yj(`QLDqh WV:GƷlnmy:mc݆vhH灭[뉆Z.4䅫hc2Ϧt`naRߟFq b8,^6>5/,/d8>8%A}(uM҇2 ŨqyM{1ՙ\M$v6MSt 3x$?6[ם}AVH.J/_s븮_s|jMwW꾔ii壷?½"&We<~)lt F0Gw%ڈ֮6IS!ZVI"Wz\%]A"ih-|ig~UєGcmdɠ&$R/ =ݥ7řbқpH)*[\%@ozs3oe Pxgb'tR|%~yAA}oL4S&ؘ$+K,G%{ޱB?jוW4pV@;2j&#!e.nl: 맾#4;S4`:FOJROl:W=O\1uH?џq[7I/レ"]\ߕw~\1 }ꐥ>eݕ+չR&\*z`]Q9N:9PRh1N9ь+G>)LX+r0B1EVp%R sQMC{(|CGn iGɔY/sk1~^$G*|z/1gk%c>j36.I3pO HC9TCW?Nvu{v I ~GoOFd:/EqQVQLoξXH@+.|=k ,5rtf[!5L~KˇAW5(`Rln3~i&tR$h1Z=EJbYylFfL2Q+Mv.>dfWT$:^8a{ߜ Bss&_h^^ 3˫A^hBsDv>Bl%#VύBt~b5!ãɵҟf.l=i=V@94|*azNKȈ XhdNݐ`H/ NO]H um$<4wxISOL3/!qZ\Ok6EؗYU _8A^r ^Ųu3\Z"=0]v*Wwj9a? l5F((tmN|o\qP[v 9a 8g W㥴G;"c`ݽmmPXBi[Aw8zn2}|gb3j/Tk/؃䨨ߊ$O]iKPU ݫ݅TyWuw=׋,6`uL'j'I̺3 ʀI̶IN޿l+{杔BBdpJIglmf2-jUA`4/i^$Ӽ !4r`b 82%P2k:r^@FZOeMݍCgHHp8'o)dV.˥udV]ͱsϠ%얷8#h EѢCc\2lp(L bw93bA0_RG1Q s֘݋-hZJJ jk,FD Fܐ`gVQ ֪[5S'2'!R x؃1` pA(@g+bߣ؆@d\2-D PrZ# 0^D\ڀ*S~,WAW0>knVFջe\oV_V.ogOQV9!l\t%! ?6 uo˥}ή#D_]l=AlʀyMo}>V_"蹒Z}ܬV_r|%sΧi{di盽z=pBjnjiy#8-3 mhg@g’6j2d&ř`$ŭ 6έ;_;++?ZYn2oܡǧ|G8>A:ywj~ܼb35- Qb{M>ٵxXhTSSs$s'0-G'4FIOuo =Uc̞cX7*W=(Snd5Fw;buڹ>w/ncX7*pw|حEwwLUʍF7nm*[nV 楞yA~A~βRLTkF꽠/dѹt^7nA_XdB,(*ա?b*^jycљI1Nxc@yc}o>6NѮ>KiWR!r,j":a(~[+/;ek%ޜ J:>FY߽@mp Hm3.VQ7ƍ$Xα^ET*$>@^jߊHpjm9ky+|ݏRDG_ۗ{B5V(}- ILv-WI+7Z ItPZuQ[TaC\@5Sٶe)- E }&G+ Њh6#i#V${L U:hz$Dv_6 LB|Pq" DOֆz8U(' -Wmp̂3Hk0ȝ @P= g<c6 :-_5sˈ)8*<t:6e:aPy4- &JwwާȂЀ>FB;j3:YX@ e+f@f s>3<`&|?_L]m+T :@cW BAWq >$ lgx cv,"l#RoMĚu}zafX"xi}+HkU*'0X 0GqS|vMݻ-蚯R1yM]3bjIucTW2X~+ '^+qD)jAը:V" ۾T/"tZݰ$gl4K#hh2˄zDj[}+$-,SjH\UU2Gp\ǿQgV1!hB}t6D#6qN6VƖn棹[4v.%-9%ūBi5Sfb $8O@U\h6 bT/w)g9\-c}ƦfW?l }kҍd_EH) K?8pULI^",onV*Bw|>F(U޺0F*[oiO$k aD|㡲%4;5&9;Xc2 t gA|04Y4ГChN\ T*;*UORu_$u_$u_$u_tmm-Z Z0hK Us9M3ucq3mvUx}Ϟ}( ,0Ƨm䵞f ~4nX4{{w#[h}+8 -!ÃŃaT PRS)Uҡm%%^ 2jpka'EvhFvg=)/m|P^IYCtpzMZ(ʉ+j`~jަC9YNZtWNrPLԺV-Sɋgj*_-EPT~g6OϮb*Ш\RPEn׌=˗@Dr$'wµپ|W}*5F<{]hЙGXd<ξB$d<ɔ(٫gfU@FZURtP)oW3` Z(ia%eՄåVJ>ީΤ3(w!$,b%H,zk>qOf'1,䍛hM<5zϻl,j11ox]Qlޭb5ӻa!oDl ɇލw~j11oxmyNA3MtǦ0Ӧ0=va^ZÜ7 `!X=ߧٝTʸ>3_it^)o~J'|K8Rh pD{iĆBK.~?om'l8ȴV%^PMUl S$0E }\Aďii=1>ؕ3Ս)$=Lƿvar/home/core/zuul-output/logs/kubelet.log0000644000000000000000006000433415153564677017720 0ustar rootrootMar 09 14:02:38 crc systemd[1]: Starting Kubernetes Kubelet... 
Mar 09 14:02:38 crc restorecon[4706]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 09 14:02:38
crc restorecon[4706]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 14:02:38 crc restorecon[4706]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 14:02:38 crc restorecon[4706]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 14:02:38 
crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 09 14:02:38 crc restorecon[4706]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 09 14:02:38 crc restorecon[4706]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 09 14:02:38 crc restorecon[4706]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 09 14:02:38 crc restorecon[4706]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 09 14:02:38 crc restorecon[4706]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:38 crc restorecon[4706]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 09 14:02:38 crc restorecon[4706]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 14:02:38 crc restorecon[4706]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 14:02:38 crc restorecon[4706]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 14:02:38 crc restorecon[4706]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 09 14:02:38 crc restorecon[4706]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 09 14:02:38 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 
14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 09 14:02:38 crc 
restorecon[4706]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 14:02:38 crc restorecon[4706]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 09 14:02:38 crc restorecon[4706]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Mar 09 14:02:39 crc restorecon[4706]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 09 14:02:39 crc restorecon[4706]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]:
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin
to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc 
restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 14:02:39 crc restorecon[4706]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 09 14:02:39 crc restorecon[4706]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 09 14:02:39 crc restorecon[4706]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 09 14:02:39 crc kubenswrapper[4722]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 09 14:02:39 crc kubenswrapper[4722]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 09 14:02:39 crc kubenswrapper[4722]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 09 14:02:39 crc kubenswrapper[4722]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 09 14:02:39 crc kubenswrapper[4722]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 09 14:02:39 crc kubenswrapper[4722]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.875094 4722 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.880058 4722 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.880093 4722 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.880102 4722 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.880112 4722 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.880123 4722 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.880131 4722 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.880139 4722 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.880147 4722 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.880155 4722 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.880164 4722 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.880172 4722 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.880179 4722 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.880187 4722 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.880195 4722 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.880230 4722 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.880243 4722 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.880254 4722 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.880263 4722 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.880271 4722 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.880287 4722 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.880296 4722 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.880304 4722 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.880311 4722 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.880319 4722 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.880326 4722 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.880338 4722 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.880348 4722 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.880358 4722 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.880366 4722 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.880375 4722 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.880383 4722 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.880391 4722 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.880399 4722 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.880407 4722 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.880415 4722 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.880425 4722 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.880432 4722 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.880441 4722 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.880452 4722 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.880462 4722 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.880470 4722 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.880480 4722 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.880487 4722 feature_gate.go:330] unrecognized feature gate: Example Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.880496 4722 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.880503 4722 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.880512 4722 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.880520 4722 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.880528 4722 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.880537 4722 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.880547 4722 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.880555 4722 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.880563 4722 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.880574 4722 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.880585 4722 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.880593 4722 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.880601 4722 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.880611 4722 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.880619 4722 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.880628 4722 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.880636 4722 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.880644 4722 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.880653 4722 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.880661 4722 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.880669 4722 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.880677 4722 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.880685 4722 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.880693 4722 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.880702 4722 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.880709 4722 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.880717 4722 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.880756 4722 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.880949 4722 flags.go:64] FLAG: --address="0.0.0.0" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.880968 4722 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.880985 4722 flags.go:64] FLAG: --anonymous-auth="true" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.880999 4722 flags.go:64] FLAG: --application-metrics-count-limit="100" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881012 4722 flags.go:64] FLAG: --authentication-token-webhook="false" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881023 4722 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881038 4722 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881052 4722 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881063 4722 flags.go:64] FLAG: 
--authorization-webhook-cache-unauthorized-ttl="30s" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881073 4722 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881091 4722 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881101 4722 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881110 4722 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881120 4722 flags.go:64] FLAG: --cgroup-root="" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881128 4722 flags.go:64] FLAG: --cgroups-per-qos="true" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881137 4722 flags.go:64] FLAG: --client-ca-file="" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881146 4722 flags.go:64] FLAG: --cloud-config="" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881155 4722 flags.go:64] FLAG: --cloud-provider="" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881163 4722 flags.go:64] FLAG: --cluster-dns="[]" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881174 4722 flags.go:64] FLAG: --cluster-domain="" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881182 4722 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881192 4722 flags.go:64] FLAG: --config-dir="" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881237 4722 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881253 4722 flags.go:64] FLAG: --container-log-max-files="5" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881266 4722 flags.go:64] FLAG: --container-log-max-size="10Mi" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881293 4722 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881302 4722 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881312 4722 flags.go:64] FLAG: --containerd-namespace="k8s.io" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881321 4722 flags.go:64] FLAG: --contention-profiling="false" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881330 4722 flags.go:64] FLAG: --cpu-cfs-quota="true" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881339 4722 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881348 4722 flags.go:64] FLAG: --cpu-manager-policy="none" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881359 4722 flags.go:64] FLAG: --cpu-manager-policy-options="" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881370 4722 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881379 4722 flags.go:64] FLAG: --enable-controller-attach-detach="true" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881388 4722 flags.go:64] FLAG: --enable-debugging-handlers="true" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881397 4722 flags.go:64] FLAG: --enable-load-reader="false" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881406 4722 flags.go:64] FLAG: --enable-server="true" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 
14:02:39.881415 4722 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881427 4722 flags.go:64] FLAG: --event-burst="100" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881436 4722 flags.go:64] FLAG: --event-qps="50" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881445 4722 flags.go:64] FLAG: --event-storage-age-limit="default=0" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881458 4722 flags.go:64] FLAG: --event-storage-event-limit="default=0" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881468 4722 flags.go:64] FLAG: --eviction-hard="" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881479 4722 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881487 4722 flags.go:64] FLAG: --eviction-minimum-reclaim="" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881496 4722 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881505 4722 flags.go:64] FLAG: --eviction-soft="" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881514 4722 flags.go:64] FLAG: --eviction-soft-grace-period="" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881523 4722 flags.go:64] FLAG: --exit-on-lock-contention="false" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881532 4722 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881541 4722 flags.go:64] FLAG: --experimental-mounter-path="" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881550 4722 flags.go:64] FLAG: --fail-cgroupv1="false" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881559 4722 flags.go:64] FLAG: --fail-swap-on="true" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881567 4722 flags.go:64] FLAG: --feature-gates="" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881578 4722 flags.go:64] FLAG: --file-check-frequency="20s" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881586 4722 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881595 4722 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881605 4722 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881614 4722 flags.go:64] FLAG: --healthz-port="10248" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881623 4722 flags.go:64] FLAG: --help="false" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881632 4722 flags.go:64] FLAG: --hostname-override="" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881640 4722 flags.go:64] FLAG: --housekeeping-interval="10s" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881649 4722 flags.go:64] FLAG: --http-check-frequency="20s" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881658 4722 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881667 4722 flags.go:64] FLAG: --image-credential-provider-config="" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881676 4722 flags.go:64] FLAG: --image-gc-high-threshold="85" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881685 4722 flags.go:64] FLAG: --image-gc-low-threshold="80" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881695 4722 flags.go:64] 
FLAG: --image-service-endpoint="" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881704 4722 flags.go:64] FLAG: --kernel-memcg-notification="false" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881713 4722 flags.go:64] FLAG: --kube-api-burst="100" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881722 4722 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881731 4722 flags.go:64] FLAG: --kube-api-qps="50" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881740 4722 flags.go:64] FLAG: --kube-reserved="" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881752 4722 flags.go:64] FLAG: --kube-reserved-cgroup="" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881761 4722 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881770 4722 flags.go:64] FLAG: --kubelet-cgroups="" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881778 4722 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881787 4722 flags.go:64] FLAG: --lock-file="" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881796 4722 flags.go:64] FLAG: --log-cadvisor-usage="false" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881806 4722 flags.go:64] FLAG: --log-flush-frequency="5s" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881816 4722 flags.go:64] FLAG: --log-json-info-buffer-size="0" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881829 4722 flags.go:64] FLAG: --log-json-split-stream="false" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881839 4722 flags.go:64] FLAG: --log-text-info-buffer-size="0" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881848 4722 flags.go:64] FLAG: --log-text-split-stream="false" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881857 4722 flags.go:64] FLAG: --logging-format="text" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881866 4722 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881876 4722 flags.go:64] FLAG: --make-iptables-util-chains="true" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881884 4722 flags.go:64] FLAG: --manifest-url="" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881893 4722 flags.go:64] FLAG: --manifest-url-header="" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881904 4722 flags.go:64] FLAG: --max-housekeeping-interval="15s" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881913 4722 flags.go:64] FLAG: --max-open-files="1000000" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881924 4722 flags.go:64] FLAG: --max-pods="110" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881934 4722 flags.go:64] FLAG: --maximum-dead-containers="-1" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881943 4722 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881952 4722 flags.go:64] FLAG: --memory-manager-policy="None" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881961 4722 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881970 4722 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881979 4722 flags.go:64] 
FLAG: --node-ip="192.168.126.11" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.881988 4722 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.882008 4722 flags.go:64] FLAG: --node-status-max-images="50" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.882017 4722 flags.go:64] FLAG: --node-status-update-frequency="10s" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.882026 4722 flags.go:64] FLAG: --oom-score-adj="-999" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.882035 4722 flags.go:64] FLAG: --pod-cidr="" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.882044 4722 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.882059 4722 flags.go:64] FLAG: --pod-manifest-path="" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.882068 4722 flags.go:64] FLAG: --pod-max-pids="-1" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.882079 4722 flags.go:64] FLAG: --pods-per-core="0" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.882088 4722 flags.go:64] FLAG: --port="10250" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.882097 4722 flags.go:64] FLAG: --protect-kernel-defaults="false" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.882106 4722 flags.go:64] FLAG: --provider-id="" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.882115 4722 flags.go:64] FLAG: --qos-reserved="" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.882124 4722 flags.go:64] FLAG: --read-only-port="10255" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.882132 4722 flags.go:64] FLAG: --register-node="true" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.882141 4722 flags.go:64] FLAG: --register-schedulable="true" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.882150 4722 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.882196 4722 flags.go:64] FLAG: --registry-burst="10" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.882231 4722 flags.go:64] FLAG: --registry-qps="5" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.882240 4722 flags.go:64] FLAG: --reserved-cpus="" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.882248 4722 flags.go:64] FLAG: --reserved-memory="" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.882259 4722 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.882268 4722 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.882277 4722 flags.go:64] FLAG: --rotate-certificates="false" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.882286 4722 flags.go:64] FLAG: --rotate-server-certificates="false" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.882294 4722 flags.go:64] FLAG: --runonce="false" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.882303 4722 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.882312 4722 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.882322 4722 flags.go:64] FLAG: --seccomp-default="false" 
Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.882332 4722 flags.go:64] FLAG: --serialize-image-pulls="true" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.882341 4722 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.882350 4722 flags.go:64] FLAG: --storage-driver-db="cadvisor" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.882360 4722 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.882369 4722 flags.go:64] FLAG: --storage-driver-password="root" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.882378 4722 flags.go:64] FLAG: --storage-driver-secure="false" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.882387 4722 flags.go:64] FLAG: --storage-driver-table="stats" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.882396 4722 flags.go:64] FLAG: --storage-driver-user="root" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.882405 4722 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.882414 4722 flags.go:64] FLAG: --sync-frequency="1m0s" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.882423 4722 flags.go:64] FLAG: --system-cgroups="" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.882435 4722 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.882457 4722 flags.go:64] FLAG: --system-reserved-cgroup="" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.882466 4722 flags.go:64] FLAG: --tls-cert-file="" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.882475 4722 flags.go:64] FLAG: --tls-cipher-suites="[]" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.882485 4722 flags.go:64] FLAG: --tls-min-version="" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.882494 4722 flags.go:64] FLAG: --tls-private-key-file="" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.882502 4722 flags.go:64] FLAG: --topology-manager-policy="none" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.882511 4722 flags.go:64] FLAG: --topology-manager-policy-options="" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.882520 4722 flags.go:64] FLAG: --topology-manager-scope="container" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.882529 4722 flags.go:64] FLAG: --v="2" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.882540 4722 flags.go:64] FLAG: --version="false" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.882551 4722 flags.go:64] FLAG: --vmodule="" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.882561 4722 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.882571 4722 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.882814 4722 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.882825 4722 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.882834 4722 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.882842 4722 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 
14:02:39.882851 4722 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.882858 4722 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.882866 4722 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.882874 4722 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.882882 4722 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.882890 4722 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.882897 4722 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.882905 4722 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.882913 4722 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.882921 4722 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.882930 4722 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.882939 4722 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.882947 4722 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.882956 4722 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.882967 4722 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.882979 4722 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.882989 4722 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.882999 4722 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.883007 4722 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.883016 4722 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.883025 4722 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.883035 4722 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.883043 4722 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.883052 4722 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.883062 4722 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.883071 4722 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.883081 4722 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.883090 4722 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.883099 4722 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.883107 4722 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.883115 4722 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.883124 4722 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.883131 4722 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.883139 4722 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.883147 4722 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.883155 4722 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.883162 4722 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.883172 4722 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.883181 4722 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.883190 4722 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.883198 4722 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.883231 4722 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.883240 4722 feature_gate.go:330] unrecognized feature gate: Example Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.883247 4722 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.883255 4722 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.883263 4722 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.883273 4722 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.883285 4722 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.883293 4722 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.883303 4722 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.883313 4722 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.883321 4722 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.883330 4722 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.883338 4722 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.883347 4722 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.883356 4722 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.883364 4722 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.883372 4722 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.883380 4722 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.883388 4722 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.883396 4722 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.883403 4722 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.883411 4722 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.883418 4722 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.883426 4722 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.883434 4722 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.883441 4722 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.883454 4722 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.897358 4722 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.897402 4722 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.897566 4722 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.897589 4722 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.897599 4722 feature_gate.go:330] unrecognized feature gate: 
MultiArchInstallAWS Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.897609 4722 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.897642 4722 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.897652 4722 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.897661 4722 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.897669 4722 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.897677 4722 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.897687 4722 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.897696 4722 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.897704 4722 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.897713 4722 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.897723 4722 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.897734 4722 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.897744 4722 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.897753 4722 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.897762 4722 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.897770 4722 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.897778 4722 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.897786 4722 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.897794 4722 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.897805 4722 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.897817 4722 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.897848 4722 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.897859 4722 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.897869 4722 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.897878 4722 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.897887 4722 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.897896 4722 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.897905 4722 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.897914 4722 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.897922 4722 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.897931 4722 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.897940 4722 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.897948 4722 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.897957 4722 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.897965 4722 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.897973 4722 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.897981 4722 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.897989 4722 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.897997 4722 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898005 4722 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898015 4722 feature_gate.go:330] unrecognized feature gate: Example
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898023 4722 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898030 4722 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898041 4722 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898051 4722 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898065 4722 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898074 4722 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898084 4722 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898093 4722 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898104 4722 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898113 4722 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898124 4722 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898133 4722 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898141 4722 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898150 4722 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898159 4722 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898168 4722 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898176 4722 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898185 4722 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898193 4722 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898230 4722 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898240 4722 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898248 4722 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898256 4722 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898263 4722 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898271 4722 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898279 4722 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898288 4722 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.898300 4722 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898526 4722 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898540 4722 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898549 4722 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898560 4722 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898571 4722 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898580 4722 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898589 4722 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898598 4722 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898607 4722 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898616 4722 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898627 4722 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898636 4722 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898645 4722 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898653 4722 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898663 4722 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898671 4722 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898679 4722 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898688 4722 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898696 4722 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898704 4722 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898712 4722 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898720 4722 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898728 4722 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898736 4722 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898744 4722 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898752 4722 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898759 4722 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898767 4722 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898776 4722 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898785 4722 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898793 4722 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898802 4722 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898811 4722 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898819 4722 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898828 4722 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898836 4722 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898844 4722 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898852 4722 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898859 4722 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898867 4722 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898875 4722 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898883 4722 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898891 4722 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898899 4722 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898909 4722 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898919 4722 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898928 4722 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898936 4722 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898945 4722 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898954 4722 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898962 4722 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898971 4722 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898979 4722 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898988 4722 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.898997 4722 feature_gate.go:330] unrecognized feature gate: Example
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.899005 4722 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.899013 4722 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.899020 4722 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.899028 4722 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.899036 4722 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.899044 4722 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.899051 4722 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.899061 4722 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.899070 4722 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.899079 4722 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.899087 4722 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.899095 4722 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.899104 4722 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.899112 4722 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.899120 4722 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 09 14:02:39 crc kubenswrapper[4722]: W0309 14:02:39.899130 4722 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.899142 4722 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.900183 4722 server.go:940] "Client rotation is on, will bootstrap in background"
Mar 09 14:02:39 crc kubenswrapper[4722]: E0309 14:02:39.905472 4722 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError"
Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.914692 4722 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.914822 4722 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.916550 4722 server.go:997] "Starting client certificate rotation"
Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.916577 4722 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.916819 4722 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.948440 4722 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.951980 4722 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 09 14:02:39 crc kubenswrapper[4722]: E0309 14:02:39.953009 4722 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.194:6443: connect: connection refused" logger="UnhandledError"
Mar 09 14:02:39 crc kubenswrapper[4722]: I0309 14:02:39.978272 4722 log.go:25] "Validated CRI v1 runtime API"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.019702 4722 log.go:25] "Validated CRI v1 image API"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.022844 4722 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.028299 4722 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-09-13-57-57-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.028352 4722 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:41 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:42 fsType:tmpfs blockSize:0}]
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.054728 4722 manager.go:217] Machine: {Timestamp:2026-03-09 14:02:40.052027255 +0000 UTC m=+0.607595851 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654120448 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:70de6d16-940c-46da-be51-a9b50262dda2 BootID:3321b793-006a-42f5-9c18-7f48ed9bad15 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827060224 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:41 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:42 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:8d:4a:b6 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:8d:4a:b6 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:c5:22:f9 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:fc:9b:7a Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:5d:1d:92 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:63:90:ca Speed:-1 Mtu:1496} {Name:eth10 MacAddress:3a:6e:8d:72:5d:f4 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:7a:91:e2:4f:c4:ca Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654120448 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.055012 4722 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.055436 4722 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.055956 4722 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.056162 4722 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.056228 4722 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.056527 4722 topology_manager.go:138] "Creating topology manager with none policy"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.056544 4722 container_manager_linux.go:303] "Creating device plugin manager"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.057102 4722 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.057145 4722 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.058172 4722 state_mem.go:36] "Initialized new in-memory state store"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.058294 4722 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.062421 4722 kubelet.go:418] "Attempting to sync node with API server"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.062450 4722 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.062482 4722 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.062500 4722 kubelet.go:324] "Adding apiserver pod source"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.062558 4722 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.067834 4722 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.069417 4722 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Mar 09 14:02:40 crc kubenswrapper[4722]: W0309 14:02:40.070765 4722 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused
Mar 09 14:02:40 crc kubenswrapper[4722]: E0309 14:02:40.070877 4722 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.194:6443: connect: connection refused" logger="UnhandledError"
Mar 09 14:02:40 crc kubenswrapper[4722]: W0309 14:02:40.070899 4722 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused
Mar 09 14:02:40 crc kubenswrapper[4722]: E0309 14:02:40.071065 4722 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.194:6443: connect: connection refused" logger="UnhandledError"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.072613 4722 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.074504 4722 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.074554 4722 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.074622 4722 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.074638 4722 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.074662 4722 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.074677 4722 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.074691 4722 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.074714 4722 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.074730 4722 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.074745 4722 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.074788 4722 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.074804 4722 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.075912 4722 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.076781 4722 server.go:1280] "Started kubelet"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.078353 4722 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.078342 4722 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.079193 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.079438 4722 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 09 14:02:40 crc systemd[1]: Started Kubernetes Kubelet.
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.081109 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.081177 4722 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.081540 4722 volume_manager.go:287] "The desired_state_of_world populator starts"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.081563 4722 volume_manager.go:289] "Starting Kubelet Volume Manager"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.081945 4722 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Mar 09 14:02:40 crc kubenswrapper[4722]: E0309 14:02:40.082000 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 14:02:40 crc kubenswrapper[4722]: W0309 14:02:40.082813 4722 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused
Mar 09 14:02:40 crc kubenswrapper[4722]: E0309 14:02:40.082962 4722 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.194:6443: connect: connection refused" logger="UnhandledError"
Mar 09 14:02:40 crc kubenswrapper[4722]: E0309 14:02:40.083376 4722 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" interval="200ms"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.084188 4722 factory.go:153] Registering CRI-O factory
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.084291 4722 factory.go:221] Registration of the crio container factory successfully
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.084471 4722 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.084494 4722 factory.go:55] Registering systemd factory
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.084511 4722 factory.go:221] Registration of the systemd container factory successfully
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.084554 4722 factory.go:103] Registering Raw factory
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.084588 4722 manager.go:1196] Started watching for new ooms in manager
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.090552 4722 manager.go:319] Starting recovery of all containers
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.092605 4722 server.go:460] "Adding debug handlers to kubelet server"
Mar 09 14:02:40 crc kubenswrapper[4722]: E0309 14:02:40.093460 4722 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.194:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189b312fe510cc8f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:40.076729487 +0000 UTC m=+0.632298103,LastTimestamp:2026-03-09 14:02:40.076729487 +0000 UTC m=+0.632298103,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.105539 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.106515 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.106613 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.106701 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.106784 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.106872 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.106962 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.107043 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.109270 4722 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.109398 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.109497 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.109579 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.109658 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.109733 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.109832 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.109915 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.109984 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.110064 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.110140 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.110231 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.110311 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.110381 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.110467 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.110543 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.110619 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.110698 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.110812 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.110988 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.111074 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.111159 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.111280 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.111360 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.111431 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.111523 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.111611 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.111693 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.111767 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.111848 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.111948 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.112101 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.112191 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.112303 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.112390 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.112484 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.112562 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.112668 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.112758 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.112839 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.113000 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.113059 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.113073 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.113088 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.113102 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.113123 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.113141 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.113157 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.113171 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.113187 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.113222 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.113236 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.113251 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.113266 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.113279 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.113323 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.113339 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.113351 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.113364 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.113377 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.113390 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.113403 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.113417 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.113430 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.113442 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.113456 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.113474 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.113490 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.113505 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.113519 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.113533 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.113549 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.113563 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.113584 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.113605 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.113621 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.113636 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.113651 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.113686 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod=""
podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.113699 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.113753 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.113769 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.113782 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.113800 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.113812 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.113824 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.113834 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.113851 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.113874 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.113892 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.113916 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.113936 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.113950 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.113962 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.113973 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.113985 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.113997 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.114017 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.114029 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.114044 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.114057 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" 
volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.114069 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.114083 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.114095 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.114137 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.114152 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.114163 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.114174 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.114185 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.114212 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.114224 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.114235 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.114252 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.114275 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.114327 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.114337 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.114350 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.114362 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.114375 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.114387 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.114398 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.114409 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.114420 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.114430 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.114442 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.114455 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.114468 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.114481 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.114519 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.114531 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.114546 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.114561 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.114580 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.114593 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" 
volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.114607 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.114623 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.114644 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.114658 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.114671 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.114683 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.114693 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.114706 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.114716 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.114728 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.114739 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" 
volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.114750 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.114761 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.114771 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.114782 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.114793 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.114804 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.114817 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.114828 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.114839 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.114851 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.114863 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" 
volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.114873 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.114885 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.114895 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.114909 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.114924 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.114943 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.114958 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.114968 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.114980 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.114991 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.115006 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.115018 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.115028 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.115040 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.115051 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.115062 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.115071 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.115083 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.115094 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.115106 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.115116 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.115126 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" 
volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.115142 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.115153 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.115162 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.115173 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.115185 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.115195 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.115224 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.115235 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.115245 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.115259 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.115276 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" 
volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.115291 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.115302 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.115315 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.115327 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.115340 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.115355 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.115367 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.115380 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.115391 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.115403 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.115415 4722 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" 
volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.115428 4722 reconstruct.go:97] "Volume reconstruction finished" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.115437 4722 reconciler.go:26] "Reconciler: start to sync state" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.122190 4722 manager.go:324] Recovery completed Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.132291 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.135468 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.135521 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.135535 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.137132 4722 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.137161 4722 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.137189 4722 state_mem.go:36] "Initialized new in-memory state store" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.144078 4722 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.147804 4722 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.147859 4722 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.147898 4722 kubelet.go:2335] "Starting kubelet main sync loop" Mar 09 14:02:40 crc kubenswrapper[4722]: E0309 14:02:40.147952 4722 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 09 14:02:40 crc kubenswrapper[4722]: W0309 14:02:40.150352 4722 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused Mar 09 14:02:40 crc kubenswrapper[4722]: E0309 14:02:40.150461 4722 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.194:6443: connect: connection refused" logger="UnhandledError" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.156846 4722 policy_none.go:49] "None policy: Start" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.158092 4722 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.158122 4722 state_mem.go:35] "Initializing new in-memory state store" Mar 09 14:02:40 crc kubenswrapper[4722]: E0309 14:02:40.183030 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.214820 4722 manager.go:334] "Starting Device Plugin manager" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.214892 4722 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.214908 4722 server.go:79] "Starting device plugin registration server" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.215576 4722 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.215597 4722 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.216460 4722 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.216677 4722 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.216710 4722 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 09 14:02:40 crc kubenswrapper[4722]: E0309 14:02:40.225487 4722 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.248189 4722 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc"] Mar 09 14:02:40 crc kubenswrapper[4722]: 
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.248373 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.249675 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.249887 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.250102 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.250551 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.250838 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.250932 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.252429 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.252469 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.252479 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.252678 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.252712 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.252727 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.252901 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.253028 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.253057 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.254916 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.254949 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.254961 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.254975 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.255038 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.255052 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.255325 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.255539 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.255645 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.256615 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.256658 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.256673 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.256896 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.257045 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.257091 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.257520 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.257547 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.257558 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.257738 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.257768 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.257779 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.258165 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.258226 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.258242 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.258467 4722 util.go:30] "No sandbox for pod can be found. 
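
Each "No sandbox for pod can be found" record above corresponds to one of the five static pods delivered by the file source in the "SyncLoop ADD" record: the node has just booted, the container runtime has no live sandbox for any of them, so the sync path must create fresh sandboxes before containers can start. A simplified, assumed sketch of that decision (the real kubelet issues a CRI RunPodSandbox call here):

    // ensure_sandbox.go — illustrative only; sandboxes is a stand-in for
    // runtime state and ensureSandbox for the kubelet's sync-path check.
    package main

    import "fmt"

    // sandboxes stands in for runtime state; empty right after node boot.
    var sandboxes = map[string]bool{}

    func ensureSandbox(pod string) {
        if !sandboxes[pod] {
            fmt.Printf("No sandbox for pod can be found. Need to start a new one pod=%q\n", pod)
            sandboxes[pod] = true // stand-in for a successful RunPodSandbox
        }
    }

    func main() {
        // The five file-source (static) pods from the SyncLoop ADD record.
        for _, pod := range []string{
            "openshift-kube-controller-manager/kube-controller-manager-crc",
            "openshift-kube-scheduler/openshift-kube-scheduler-crc",
            "openshift-machine-config-operator/kube-rbac-proxy-crio-crc",
            "openshift-etcd/etcd-crc",
            "openshift-kube-apiserver/kube-apiserver-crc",
        } {
            ensureSandbox(pod)
        }
    }

Because these pods come from manifest files rather than the apiserver, they can start even while every apiserver request above is still failing with "connection refused"; the apiserver itself is one of them.
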
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.258467 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.258505 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.259340 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.259363 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.259370 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:02:40 crc kubenswrapper[4722]: E0309 14:02:40.285220 4722 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" interval="400ms"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.315885 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.317907 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.317980 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.318023 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.318058 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.318061 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.318138 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.318101 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.318181 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.318252 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.318295 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.318308 4722 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.318338 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.318488 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.318536 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.318571 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.318616 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.318665 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
(UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.318717 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 14:02:40 crc kubenswrapper[4722]: E0309 14:02:40.319319 4722 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.194:6443: connect: connection refused" node="crc" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.419768 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.419822 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.419842 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.419865 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.419884 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.419903 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.419946 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.419969 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.419991 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.420012 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.420034 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.420052 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.420080 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.420113 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.420140 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.420282 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.420352 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.420418 
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.420418 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.420363 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.420380 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.420518 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.420429 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.420276 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.420511 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.420309 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.420554 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.420564 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.420579 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.420622 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.420622 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.519702 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.521562 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.521650 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.521670 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.521718 4722 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 09 14:02:40 crc kubenswrapper[4722]: E0309 14:02:40.522537 4722 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.194:6443: connect: connection refused" node="crc"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.595736 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.600958 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.616482 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.636322 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.638965 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 09 14:02:40 crc kubenswrapper[4722]: W0309 14:02:40.661879 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-3ab830645b1615ff847452053179c9c2652b91d157f0baedd702b93334f3379a WatchSource:0}: Error finding container 3ab830645b1615ff847452053179c9c2652b91d157f0baedd702b93334f3379a: Status 404 returned error can't find the container with id 3ab830645b1615ff847452053179c9c2652b91d157f0baedd702b93334f3379a
Mar 09 14:02:40 crc kubenswrapper[4722]: W0309 14:02:40.664280 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-0f9f91b314232ffff2208294117c591ad8aef85985c61cc34a601b74e1853d1d WatchSource:0}: Error finding container 0f9f91b314232ffff2208294117c591ad8aef85985c61cc34a601b74e1853d1d: Status 404 returned error can't find the container with id 0f9f91b314232ffff2208294117c591ad8aef85985c61cc34a601b74e1853d1d
Mar 09 14:02:40 crc kubenswrapper[4722]: E0309 14:02:40.685898 4722 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" interval="800ms"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.923570 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.926932 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.926991 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.927009 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:02:40 crc kubenswrapper[4722]: I0309 14:02:40.927047 4722 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 09 14:02:40 crc kubenswrapper[4722]: E0309 14:02:40.927695 4722 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.194:6443: connect: connection refused" node="crc"
Mar 09 14:02:41 crc kubenswrapper[4722]: I0309 14:02:41.080063 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused
Mar 09 14:02:41 crc kubenswrapper[4722]: I0309 14:02:41.153435 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"3ab830645b1615ff847452053179c9c2652b91d157f0baedd702b93334f3379a"}
Mar 09 14:02:41 crc kubenswrapper[4722]: I0309 14:02:41.157772 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"0f9f91b314232ffff2208294117c591ad8aef85985c61cc34a601b74e1853d1d"}
Mar 09 14:02:41 crc kubenswrapper[4722]: I0309 14:02:41.158821 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"fa5914715b9d3222fbf3fbe5791f8d466f90d130210d6e7aa1ca18b1643812e4"}
Mar 09 14:02:41 crc kubenswrapper[4722]: I0309 14:02:41.161774 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"0fba7e929c43fa362f8003203438e6b2bafe110546e9e8c93ab67e8711344eca"}
Mar 09 14:02:41 crc kubenswrapper[4722]: I0309 14:02:41.163433 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"95b8856546831303570dd377eac32465e328059fbf2839822c6d8cc3fabb90dc"}
Mar 09 14:02:41 crc kubenswrapper[4722]: E0309 14:02:41.225240 4722 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.194:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189b312fe510cc8f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:40.076729487 +0000 UTC m=+0.632298103,LastTimestamp:2026-03-09 14:02:40.076729487 +0000 UTC m=+0.632298103,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 14:02:41 crc kubenswrapper[4722]: W0309 14:02:41.385872 4722 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused
Mar 09 14:02:41 crc kubenswrapper[4722]: E0309 14:02:41.385980 4722 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.194:6443: connect: connection refused" logger="UnhandledError"
Mar 09 14:02:41 crc kubenswrapper[4722]: W0309 14:02:41.423355 4722 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused
Mar 09 14:02:41 crc kubenswrapper[4722]: E0309 14:02:41.423478 4722 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.194:6443: connect: connection refused" logger="UnhandledError"
interval="1.6s" Mar 09 14:02:41 crc kubenswrapper[4722]: W0309 14:02:41.583878 4722 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused Mar 09 14:02:41 crc kubenswrapper[4722]: E0309 14:02:41.584037 4722 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.194:6443: connect: connection refused" logger="UnhandledError" Mar 09 14:02:41 crc kubenswrapper[4722]: W0309 14:02:41.651668 4722 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused Mar 09 14:02:41 crc kubenswrapper[4722]: E0309 14:02:41.651816 4722 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.194:6443: connect: connection refused" logger="UnhandledError" Mar 09 14:02:41 crc kubenswrapper[4722]: I0309 14:02:41.728679 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 14:02:41 crc kubenswrapper[4722]: I0309 14:02:41.731622 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:02:41 crc kubenswrapper[4722]: I0309 14:02:41.731663 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:02:41 crc kubenswrapper[4722]: I0309 14:02:41.731677 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:02:41 crc kubenswrapper[4722]: I0309 14:02:41.731712 4722 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 14:02:41 crc kubenswrapper[4722]: E0309 14:02:41.732477 4722 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.194:6443: connect: connection refused" node="crc" Mar 09 14:02:42 crc kubenswrapper[4722]: I0309 14:02:42.059560 4722 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 09 14:02:42 crc kubenswrapper[4722]: E0309 14:02:42.061069 4722 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.194:6443: connect: connection refused" logger="UnhandledError" Mar 09 14:02:42 crc kubenswrapper[4722]: I0309 14:02:42.080448 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused Mar 09 14:02:42 crc kubenswrapper[4722]: I0309 
Mar 09 14:02:42 crc kubenswrapper[4722]: I0309 14:02:42.169670 4722 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f" exitCode=0
Mar 09 14:02:42 crc kubenswrapper[4722]: I0309 14:02:42.169745 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f"}
Mar 09 14:02:42 crc kubenswrapper[4722]: I0309 14:02:42.170019 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 14:02:42 crc kubenswrapper[4722]: I0309 14:02:42.171392 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:02:42 crc kubenswrapper[4722]: I0309 14:02:42.171479 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:02:42 crc kubenswrapper[4722]: I0309 14:02:42.171500 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:02:42 crc kubenswrapper[4722]: I0309 14:02:42.172272 4722 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="a97aabe30af8bc4ff72fbac7f4fbf7804aff61f8d2fe99753558e7e40f348867" exitCode=0
Mar 09 14:02:42 crc kubenswrapper[4722]: I0309 14:02:42.172416 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 14:02:42 crc kubenswrapper[4722]: I0309 14:02:42.172395 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"a97aabe30af8bc4ff72fbac7f4fbf7804aff61f8d2fe99753558e7e40f348867"}
Mar 09 14:02:42 crc kubenswrapper[4722]: I0309 14:02:42.173634 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:02:42 crc kubenswrapper[4722]: I0309 14:02:42.173691 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:02:42 crc kubenswrapper[4722]: I0309 14:02:42.173709 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:02:42 crc kubenswrapper[4722]: I0309 14:02:42.179155 4722 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="d21ce78d41277e674d92c74180541175dcbc953b11bd3ddc5d381cf93628cb3f" exitCode=0
Mar 09 14:02:42 crc kubenswrapper[4722]: I0309 14:02:42.179407 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 14:02:42 crc kubenswrapper[4722]: I0309 14:02:42.179406 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"d21ce78d41277e674d92c74180541175dcbc953b11bd3ddc5d381cf93628cb3f"}
Mar 09 14:02:42 crc kubenswrapper[4722]: I0309 14:02:42.180755 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:02:42 crc kubenswrapper[4722]: I0309 14:02:42.180850 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:02:42 crc kubenswrapper[4722]: I0309 14:02:42.180986 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:02:42 crc kubenswrapper[4722]: I0309 14:02:42.184987 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 14:02:42 crc kubenswrapper[4722]: I0309 14:02:42.185083 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"03fa5f2f3d3dc4ebaf6b5db02fe2036caaf7e0de7197cb5d3409dbbae676a338"}
Mar 09 14:02:42 crc kubenswrapper[4722]: I0309 14:02:42.185245 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"41c6395bf5f6006036c3d58919ea6ea976b56827cac8ae369b10dd4037d29153"}
Mar 09 14:02:42 crc kubenswrapper[4722]: I0309 14:02:42.185331 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"50ecab202a56b3927a1dcdeae44cf5397c4182f1776cd5c3203941d95b498020"}
Mar 09 14:02:42 crc kubenswrapper[4722]: I0309 14:02:42.185480 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ff66d43b6e872eed24aacaf3ea882c7148b6ef62bbe5b7bf10ede5d2691a3680"}
Mar 09 14:02:42 crc kubenswrapper[4722]: I0309 14:02:42.186178 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:02:42 crc kubenswrapper[4722]: I0309 14:02:42.186233 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:02:42 crc kubenswrapper[4722]: I0309 14:02:42.186248 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:02:42 crc kubenswrapper[4722]: I0309 14:02:42.189582 4722 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1" exitCode=0
Mar 09 14:02:42 crc kubenswrapper[4722]: I0309 14:02:42.189847 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1"}
Mar 09 14:02:42 crc kubenswrapper[4722]: I0309 14:02:42.189874 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 14:02:42 crc kubenswrapper[4722]: I0309 14:02:42.191997 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:02:42 crc kubenswrapper[4722]: I0309 14:02:42.192067 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:02:42 crc kubenswrapper[4722]: I0309 14:02:42.192085 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
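
The "Generic (PLEG): container finished" / "SyncLoop (PLEG): event for pod" pairs above show the pod lifecycle event generator noticing container state changes on relist and feeding them to the sync loop: each static pod's init container exits 0 (ContainerDied) before its main containers appear (ContainerStarted). A rough sketch of the event shape as it appears in these entries; the type names are illustrative, not the kubelet's own:

    package main

    import "fmt"

    // Illustrative mirror of the logged event shape, not kubelet's types.
    type PodLifecycleEvent struct {
    	PodID       string
    	Type        string // "ContainerStarted" or "ContainerDied"
    	ContainerID string
    }

    func handle(ev PodLifecycleEvent) {
    	switch ev.Type {
    	case "ContainerDied":
    		// An init container exiting 0 lands here; the sync loop then
    		// moves on to starting the next container in the pod.
    		fmt.Printf("pod %s: container %s finished\n", ev.PodID, ev.ContainerID)
    	case "ContainerStarted":
    		fmt.Printf("pod %s: container %s started\n", ev.PodID, ev.ContainerID)
    	}
    }

    func main() {
    	// IDs shortened from the log for readability.
    	handle(PodLifecycleEvent{"2139d3e2895fc6797b9c76a1b4c9886d", "ContainerDied", "928d88a2e78d"})
    	handle(PodLifecycleEvent{"2139d3e2895fc6797b9c76a1b4c9886d", "ContainerStarted", "09fa62e7293a"})
    }
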
Mar 09 14:02:42 crc kubenswrapper[4722]: I0309 14:02:42.198257 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 14:02:42 crc kubenswrapper[4722]: I0309 14:02:42.199743 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:02:42 crc kubenswrapper[4722]: I0309 14:02:42.199784 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:02:42 crc kubenswrapper[4722]: I0309 14:02:42.199799 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:02:43 crc kubenswrapper[4722]: I0309 14:02:43.080488 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused
Mar 09 14:02:43 crc kubenswrapper[4722]: E0309 14:02:43.088527 4722 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" interval="3.2s"
Mar 09 14:02:43 crc kubenswrapper[4722]: I0309 14:02:43.195676 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"5113a57854e81210223fb6e60aedab903bcc7d376c2953814c360fa6019ecd17"}
Mar 09 14:02:43 crc kubenswrapper[4722]: I0309 14:02:43.195849 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 14:02:43 crc kubenswrapper[4722]: I0309 14:02:43.197483 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:02:43 crc kubenswrapper[4722]: I0309 14:02:43.197564 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:02:43 crc kubenswrapper[4722]: I0309 14:02:43.197584 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:02:43 crc kubenswrapper[4722]: I0309 14:02:43.200618 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e2fbabc5ec2c100150ce886c06c3bd78cc5896c8a06ac9d262eeb552a8a8bb5d"}
Mar 09 14:02:43 crc kubenswrapper[4722]: I0309 14:02:43.200698 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"be3836660bf646052991965270cd69b8b8237b5c4bd698a73789b050d23e6877"}
Mar 09 14:02:43 crc kubenswrapper[4722]: I0309 14:02:43.200718 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3dbef38ff5e714c92fc8a86f1e3c64adf606cb198145f884a000f8dee200fcf2"}
Mar 09 14:02:43 crc kubenswrapper[4722]: I0309 14:02:43.200731 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"56c76a5e8718243a17d8d9fd9a8dcf6083c2ccb65b4816926dca08a89e465728"}
Mar 09 14:02:43 crc kubenswrapper[4722]: I0309 14:02:43.202837 4722 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="e6625c60c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e" exitCode=0
Mar 09 14:02:43 crc kubenswrapper[4722]: I0309 14:02:43.202959 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"e6625c60c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e"}
Mar 09 14:02:43 crc kubenswrapper[4722]: I0309 14:02:43.203004 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 14:02:43 crc kubenswrapper[4722]: I0309 14:02:43.211103 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:02:43 crc kubenswrapper[4722]: I0309 14:02:43.211226 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:02:43 crc kubenswrapper[4722]: I0309 14:02:43.211246 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:02:43 crc kubenswrapper[4722]: I0309 14:02:43.215036 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"a48ba8f8443935496fa8f3a7e262723b19515f3adc95847d64236f956f91cc20"}
Mar 09 14:02:43 crc kubenswrapper[4722]: I0309 14:02:43.215099 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"5f502c2ee0b77f66f4b32b7d3173eb71bd6d34c31724e8032a0200dac6fd62b5"}
Mar 09 14:02:43 crc kubenswrapper[4722]: I0309 14:02:43.215125 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"c5c374ffb7928a20af6f154bfca0c494d692f1f6f55eb8de5d4d5db79a355013"}
Mar 09 14:02:43 crc kubenswrapper[4722]: I0309 14:02:43.215099 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 14:02:43 crc kubenswrapper[4722]: I0309 14:02:43.215264 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 14:02:43 crc kubenswrapper[4722]: I0309 14:02:43.216771 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:02:43 crc kubenswrapper[4722]: I0309 14:02:43.216818 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:02:43 crc kubenswrapper[4722]: I0309 14:02:43.216835 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:02:43 crc kubenswrapper[4722]: I0309 14:02:43.218113 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:02:43 crc kubenswrapper[4722]: I0309 14:02:43.218152 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:02:43 crc kubenswrapper[4722]: I0309 14:02:43.218161 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
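
The lease controller's retry interval in the errors above doubles on each consecutive failure: 400ms, then 800ms, 1.6s, and now 3.2s. A minimal sketch of that doubling backoff follows; the cap shown is an assumption for illustration, since the log only shows the first four intervals:

    package main

    import (
    	"fmt"
    	"time"
    )

    func main() {
    	interval := 400 * time.Millisecond
    	maxInterval := 7 * time.Second // assumed cap, not taken from the log
    	for i := 0; i < 6; i++ {
    		fmt.Println("lease ensure failed, will retry in", interval)
    		interval *= 2 // double on each consecutive failure
    		if interval > maxInterval {
    			interval = maxInterval
    		}
    	}
    }
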
Mar 09 14:02:43 crc kubenswrapper[4722]: I0309 14:02:43.333634 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 14:02:43 crc kubenswrapper[4722]: I0309 14:02:43.334864 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:02:43 crc kubenswrapper[4722]: I0309 14:02:43.334917 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:02:43 crc kubenswrapper[4722]: I0309 14:02:43.334929 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:02:43 crc kubenswrapper[4722]: I0309 14:02:43.334963 4722 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 09 14:02:43 crc kubenswrapper[4722]: E0309 14:02:43.335965 4722 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.194:6443: connect: connection refused" node="crc"
Mar 09 14:02:43 crc kubenswrapper[4722]: W0309 14:02:43.363592 4722 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.194:6443: connect: connection refused
Mar 09 14:02:43 crc kubenswrapper[4722]: E0309 14:02:43.363718 4722 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.194:6443: connect: connection refused" logger="UnhandledError"
Mar 09 14:02:43 crc kubenswrapper[4722]: I0309 14:02:43.611348 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 09 14:02:43 crc kubenswrapper[4722]: I0309 14:02:43.686566 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 09 14:02:43 crc kubenswrapper[4722]: I0309 14:02:43.693264 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 09 14:02:44 crc kubenswrapper[4722]: I0309 14:02:44.223779 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"dc2755da6b63cf19319476368a715f50342e62ebab549e9cf8fc68b2274664d4"}
Mar 09 14:02:44 crc kubenswrapper[4722]: I0309 14:02:44.223857 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 14:02:44 crc kubenswrapper[4722]: I0309 14:02:44.225842 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:02:44 crc kubenswrapper[4722]: I0309 14:02:44.225896 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:02:44 crc kubenswrapper[4722]: I0309 14:02:44.225918 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:02:44 crc kubenswrapper[4722]: I0309 14:02:44.227850 4722 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b" exitCode=0
Mar 09 14:02:44 crc kubenswrapper[4722]: I0309 14:02:44.227973 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 14:02:44 crc kubenswrapper[4722]: I0309 14:02:44.228033 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 14:02:44 crc kubenswrapper[4722]: I0309 14:02:44.228042 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 14:02:44 crc kubenswrapper[4722]: I0309 14:02:44.228151 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 14:02:44 crc kubenswrapper[4722]: I0309 14:02:44.227960 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b"}
Mar 09 14:02:44 crc kubenswrapper[4722]: I0309 14:02:44.228430 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 09 14:02:44 crc kubenswrapper[4722]: I0309 14:02:44.229696 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:02:44 crc kubenswrapper[4722]: I0309 14:02:44.229731 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:02:44 crc kubenswrapper[4722]: I0309 14:02:44.229749 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:02:44 crc kubenswrapper[4722]: I0309 14:02:44.229777 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:02:44 crc kubenswrapper[4722]: I0309 14:02:44.229814 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:02:44 crc kubenswrapper[4722]: I0309 14:02:44.229832 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:02:44 crc kubenswrapper[4722]: I0309 14:02:44.230570 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:02:44 crc kubenswrapper[4722]: I0309 14:02:44.230613 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:02:44 crc kubenswrapper[4722]: I0309 14:02:44.230625 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:02:44 crc kubenswrapper[4722]: I0309 14:02:44.230641 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:02:44 crc kubenswrapper[4722]: I0309 14:02:44.230650 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:02:44 crc kubenswrapper[4722]: I0309 14:02:44.230674 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:02:45 crc kubenswrapper[4722]: I0309 14:02:45.237267 4722 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 09 14:02:45 crc kubenswrapper[4722]: I0309 14:02:45.237312 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 14:02:45 crc kubenswrapper[4722]: I0309 14:02:45.237375 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 14:02:45 crc kubenswrapper[4722]: I0309 14:02:45.237267 4722 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 09 14:02:45 crc kubenswrapper[4722]: I0309 14:02:45.237339 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"09fa62e7293a50fb3f6bdaf37061cbe5e95a830c8f7337660ce73138dbcaa641"}
Mar 09 14:02:45 crc kubenswrapper[4722]: I0309 14:02:45.237574 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 14:02:45 crc kubenswrapper[4722]: I0309 14:02:45.237590 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"04169d9b8359e26cc9807e27fb9a5d54ca0946aaade294c5bcc272a7d21c4db6"}
Mar 09 14:02:45 crc kubenswrapper[4722]: I0309 14:02:45.237636 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"63769aca40f721bec0716427145a3ceea0595836c03d31c98f6d45ff14de68be"}
Mar 09 14:02:45 crc kubenswrapper[4722]: I0309 14:02:45.243673 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:02:45 crc kubenswrapper[4722]: I0309 14:02:45.243825 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:02:45 crc kubenswrapper[4722]: I0309 14:02:45.243844 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:02:45 crc kubenswrapper[4722]: I0309 14:02:45.244283 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:02:45 crc kubenswrapper[4722]: I0309 14:02:45.245254 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:02:45 crc kubenswrapper[4722]: I0309 14:02:45.245401 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:02:45 crc kubenswrapper[4722]: I0309 14:02:45.246337 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:02:45 crc kubenswrapper[4722]: I0309 14:02:45.246418 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:02:45 crc kubenswrapper[4722]: I0309 14:02:45.246438 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:02:46 crc kubenswrapper[4722]: I0309 14:02:46.190943 4722 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 09 14:02:46 crc kubenswrapper[4722]: I0309 14:02:46.205005 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 09 14:02:46 crc kubenswrapper[4722]: I0309 14:02:46.245934 4722 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 09 14:02:46 crc kubenswrapper[4722]: I0309 14:02:46.245939 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"2dff721159b382bcfdde6f406a328ef2ee9264b35a8d8baeb7a3c6d31a60ff3b"}
Mar 09 14:02:46 crc kubenswrapper[4722]: I0309 14:02:46.245994 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 14:02:46 crc kubenswrapper[4722]: I0309 14:02:46.245995 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"31ba786781fcdb781f5fa45276e29e94756863d2398109d91e81a52f6f88bee2"}
Mar 09 14:02:46 crc kubenswrapper[4722]: I0309 14:02:46.246087 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 14:02:46 crc kubenswrapper[4722]: I0309 14:02:46.246961 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:02:46 crc kubenswrapper[4722]: I0309 14:02:46.246992 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:02:46 crc kubenswrapper[4722]: I0309 14:02:46.247001 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:02:46 crc kubenswrapper[4722]: I0309 14:02:46.247646 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:02:46 crc kubenswrapper[4722]: I0309 14:02:46.247671 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:02:46 crc kubenswrapper[4722]: I0309 14:02:46.247679 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:02:46 crc kubenswrapper[4722]: I0309 14:02:46.537102 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 14:02:46 crc kubenswrapper[4722]: I0309 14:02:46.538765 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:02:46 crc kubenswrapper[4722]: I0309 14:02:46.538817 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:02:46 crc kubenswrapper[4722]: I0309 14:02:46.538831 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:02:46 crc kubenswrapper[4722]: I0309 14:02:46.538859 4722 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 09 14:02:46 crc kubenswrapper[4722]: I0309 14:02:46.575958 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 09 14:02:46 crc kubenswrapper[4722]: I0309 14:02:46.613670 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc"
Mar 09 14:02:47 crc kubenswrapper[4722]: I0309 14:02:47.249588 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 14:02:47 crc kubenswrapper[4722]: I0309 14:02:47.249609 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
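
The "SyncLoop (probe)" entries track readiness and startup probe transitions for the static pods; a readiness status of "" here corresponds to not-ready, flipping once the containers answer, and the startup probes cycle through unhealthy/started as the control plane comes up. The kubelet's HTTPS probes do not validate the server certificate, so a hand-run equivalent of the cluster-policy-controller healthz check (whose failure output appears further below) needs to skip verification too. A sketch; the URL is from the log, while the 5-second timeout is an assumption:

    package main

    import (
    	"crypto/tls"
    	"fmt"
    	"net/http"
    	"time"
    )

    func main() {
    	client := &http.Client{
    		Timeout: 5 * time.Second, // assumed; the log shows a deadline exceeded, not its value
    		Transport: &http.Transport{
    			// Kubelet HTTPS probes skip certificate verification.
    			TLSClientConfig: &tls.Config{InsecureSkipVerify: true},
    		},
    	}
    	resp, err := client.Get("https://192.168.126.11:10357/healthz") // endpoint from the log
    	if err != nil {
    		fmt.Println("probe failed:", err)
    		return
    	}
    	defer resp.Body.Close()
    	fmt.Println("probe status:", resp.Status)
    }
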
event="NodeHasSufficientMemory" Mar 09 14:02:47 crc kubenswrapper[4722]: I0309 14:02:47.251298 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:02:47 crc kubenswrapper[4722]: I0309 14:02:47.251308 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:02:47 crc kubenswrapper[4722]: I0309 14:02:47.251402 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:02:47 crc kubenswrapper[4722]: I0309 14:02:47.251494 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:02:47 crc kubenswrapper[4722]: I0309 14:02:47.251512 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:02:47 crc kubenswrapper[4722]: I0309 14:02:47.608677 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 14:02:47 crc kubenswrapper[4722]: I0309 14:02:47.608887 4722 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 09 14:02:47 crc kubenswrapper[4722]: I0309 14:02:47.608929 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 14:02:47 crc kubenswrapper[4722]: I0309 14:02:47.610540 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:02:47 crc kubenswrapper[4722]: I0309 14:02:47.610625 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:02:47 crc kubenswrapper[4722]: I0309 14:02:47.610642 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:02:48 crc kubenswrapper[4722]: I0309 14:02:48.251841 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 14:02:48 crc kubenswrapper[4722]: I0309 14:02:48.253069 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:02:48 crc kubenswrapper[4722]: I0309 14:02:48.253144 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:02:48 crc kubenswrapper[4722]: I0309 14:02:48.253162 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:02:48 crc kubenswrapper[4722]: I0309 14:02:48.331496 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 14:02:48 crc kubenswrapper[4722]: I0309 14:02:48.331744 4722 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 09 14:02:48 crc kubenswrapper[4722]: I0309 14:02:48.331802 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 14:02:48 crc kubenswrapper[4722]: I0309 14:02:48.333870 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:02:48 crc kubenswrapper[4722]: I0309 14:02:48.333913 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:02:48 crc kubenswrapper[4722]: I0309 14:02:48.333927 4722 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 09 14:02:49 crc kubenswrapper[4722]: I0309 14:02:49.205327 4722 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:02:49 crc kubenswrapper[4722]: I0309 14:02:49.205418 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 14:02:49 crc kubenswrapper[4722]: I0309 14:02:49.350606 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 14:02:49 crc kubenswrapper[4722]: I0309 14:02:49.350822 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 14:02:49 crc kubenswrapper[4722]: I0309 14:02:49.352083 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:02:49 crc kubenswrapper[4722]: I0309 14:02:49.352122 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:02:49 crc kubenswrapper[4722]: I0309 14:02:49.352138 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:02:50 crc kubenswrapper[4722]: E0309 14:02:50.225705 4722 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 09 14:02:53 crc kubenswrapper[4722]: W0309 14:02:53.787731 4722 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:02:53Z is after 2026-02-23T05:33:13Z Mar 09 14:02:53 crc kubenswrapper[4722]: E0309 14:02:53.788216 4722 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:02:53Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 14:02:53 crc kubenswrapper[4722]: W0309 14:02:53.792323 4722 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:02:53Z is after 2026-02-23T05:33:13Z Mar 09 14:02:53 crc kubenswrapper[4722]: E0309 14:02:53.792419 4722 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get 
\"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:02:53Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 14:02:53 crc kubenswrapper[4722]: I0309 14:02:53.795394 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:02:53Z is after 2026-02-23T05:33:13Z Mar 09 14:02:53 crc kubenswrapper[4722]: W0309 14:02:53.799548 4722 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:02:53Z is after 2026-02-23T05:33:13Z Mar 09 14:02:53 crc kubenswrapper[4722]: E0309 14:02:53.799631 4722 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:02:53Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 14:02:53 crc kubenswrapper[4722]: W0309 14:02:53.805673 4722 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:02:53Z is after 2026-02-23T05:33:13Z Mar 09 14:02:53 crc kubenswrapper[4722]: E0309 14:02:53.805783 4722 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:02:53Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 14:02:53 crc kubenswrapper[4722]: E0309 14:02:53.808500 4722 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:02:53Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189b312fe510cc8f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:40.076729487 +0000 UTC m=+0.632298103,LastTimestamp:2026-03-09 14:02:40.076729487 +0000 UTC m=+0.632298103,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 14:02:53 crc kubenswrapper[4722]: E0309 14:02:53.812051 4722 certificate_manager.go:562] "Unhandled 
Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:02:53Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 14:02:53 crc kubenswrapper[4722]: E0309 14:02:53.812579 4722 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:02:53Z is after 2026-02-23T05:33:13Z" interval="6.4s" Mar 09 14:02:53 crc kubenswrapper[4722]: E0309 14:02:53.814663 4722 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:02:53Z is after 2026-02-23T05:33:13Z" node="crc" Mar 09 14:02:53 crc kubenswrapper[4722]: I0309 14:02:53.816313 4722 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 09 14:02:53 crc kubenswrapper[4722]: I0309 14:02:53.816397 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 09 14:02:53 crc kubenswrapper[4722]: I0309 14:02:53.823307 4722 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 09 14:02:53 crc kubenswrapper[4722]: [+]log ok Mar 09 14:02:53 crc kubenswrapper[4722]: [+]etcd ok Mar 09 14:02:53 crc kubenswrapper[4722]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 09 14:02:53 crc kubenswrapper[4722]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 09 14:02:53 crc kubenswrapper[4722]: [+]poststarthook/openshift.io-api-request-count-filter ok Mar 09 14:02:53 crc kubenswrapper[4722]: [+]poststarthook/openshift.io-startkubeinformers ok Mar 09 14:02:53 crc kubenswrapper[4722]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Mar 09 14:02:53 crc kubenswrapper[4722]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Mar 09 14:02:53 crc kubenswrapper[4722]: [+]poststarthook/generic-apiserver-start-informers ok Mar 09 14:02:53 crc kubenswrapper[4722]: [+]poststarthook/priority-and-fairness-config-consumer ok Mar 09 14:02:53 crc kubenswrapper[4722]: [+]poststarthook/priority-and-fairness-filter ok Mar 09 14:02:53 crc kubenswrapper[4722]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 09 14:02:53 crc kubenswrapper[4722]: [+]poststarthook/start-apiextensions-informers ok Mar 09 14:02:53 crc kubenswrapper[4722]: [-]poststarthook/start-apiextensions-controllers failed: 
reason withheld Mar 09 14:02:53 crc kubenswrapper[4722]: [-]poststarthook/crd-informer-synced failed: reason withheld Mar 09 14:02:53 crc kubenswrapper[4722]: [+]poststarthook/start-system-namespaces-controller ok Mar 09 14:02:53 crc kubenswrapper[4722]: [+]poststarthook/start-cluster-authentication-info-controller ok Mar 09 14:02:53 crc kubenswrapper[4722]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Mar 09 14:02:53 crc kubenswrapper[4722]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Mar 09 14:02:53 crc kubenswrapper[4722]: [+]poststarthook/start-legacy-token-tracking-controller ok Mar 09 14:02:53 crc kubenswrapper[4722]: [-]poststarthook/start-service-ip-repair-controllers failed: reason withheld Mar 09 14:02:53 crc kubenswrapper[4722]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Mar 09 14:02:53 crc kubenswrapper[4722]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Mar 09 14:02:53 crc kubenswrapper[4722]: [-]poststarthook/priority-and-fairness-config-producer failed: reason withheld Mar 09 14:02:53 crc kubenswrapper[4722]: [-]poststarthook/bootstrap-controller failed: reason withheld Mar 09 14:02:53 crc kubenswrapper[4722]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Mar 09 14:02:53 crc kubenswrapper[4722]: [+]poststarthook/start-kube-aggregator-informers ok Mar 09 14:02:53 crc kubenswrapper[4722]: [+]poststarthook/apiservice-status-local-available-controller ok Mar 09 14:02:53 crc kubenswrapper[4722]: [+]poststarthook/apiservice-status-remote-available-controller ok Mar 09 14:02:53 crc kubenswrapper[4722]: [-]poststarthook/apiservice-registration-controller failed: reason withheld Mar 09 14:02:53 crc kubenswrapper[4722]: [+]poststarthook/apiservice-wait-for-first-sync ok Mar 09 14:02:53 crc kubenswrapper[4722]: [-]poststarthook/apiservice-discovery-controller failed: reason withheld Mar 09 14:02:53 crc kubenswrapper[4722]: [+]poststarthook/kube-apiserver-autoregistration ok Mar 09 14:02:53 crc kubenswrapper[4722]: [-]autoregister-completion failed: reason withheld Mar 09 14:02:53 crc kubenswrapper[4722]: [+]poststarthook/apiservice-openapi-controller ok Mar 09 14:02:53 crc kubenswrapper[4722]: [+]poststarthook/apiservice-openapiv3-controller ok Mar 09 14:02:53 crc kubenswrapper[4722]: livez check failed Mar 09 14:02:53 crc kubenswrapper[4722]: I0309 14:02:53.823433 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 14:02:54 crc kubenswrapper[4722]: I0309 14:02:54.083150 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:02:54Z is after 2026-02-23T05:33:13Z Mar 09 14:02:54 crc kubenswrapper[4722]: I0309 14:02:54.274920 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 09 14:02:54 crc kubenswrapper[4722]: I0309 14:02:54.276969 4722 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="dc2755da6b63cf19319476368a715f50342e62ebab549e9cf8fc68b2274664d4" exitCode=255 Mar 09 14:02:54 crc kubenswrapper[4722]: I0309 14:02:54.277034 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"dc2755da6b63cf19319476368a715f50342e62ebab549e9cf8fc68b2274664d4"} Mar 09 14:02:54 crc kubenswrapper[4722]: I0309 14:02:54.277380 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 14:02:54 crc kubenswrapper[4722]: I0309 14:02:54.278419 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:02:54 crc kubenswrapper[4722]: I0309 14:02:54.278484 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:02:54 crc kubenswrapper[4722]: I0309 14:02:54.278505 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:02:54 crc kubenswrapper[4722]: I0309 14:02:54.279487 4722 scope.go:117] "RemoveContainer" containerID="dc2755da6b63cf19319476368a715f50342e62ebab549e9cf8fc68b2274664d4" Mar 09 14:02:55 crc kubenswrapper[4722]: I0309 14:02:55.082459 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:02:55Z is after 2026-02-23T05:33:13Z Mar 09 14:02:55 crc kubenswrapper[4722]: I0309 14:02:55.278089 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Mar 09 14:02:55 crc kubenswrapper[4722]: I0309 14:02:55.278278 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 14:02:55 crc kubenswrapper[4722]: I0309 14:02:55.282269 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:02:55 crc kubenswrapper[4722]: I0309 14:02:55.282324 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:02:55 crc kubenswrapper[4722]: I0309 14:02:55.282347 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:02:55 crc kubenswrapper[4722]: I0309 14:02:55.285113 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 09 14:02:55 crc kubenswrapper[4722]: I0309 14:02:55.285850 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 09 14:02:55 crc kubenswrapper[4722]: I0309 14:02:55.287908 4722 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ec26b85ea6ccc495996fb439fb57ce1e90c7f3ff7b0a63af107430b6ab455ea0" exitCode=255 Mar 09 14:02:55 crc kubenswrapper[4722]: I0309 14:02:55.287954 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"ec26b85ea6ccc495996fb439fb57ce1e90c7f3ff7b0a63af107430b6ab455ea0"} Mar 09 14:02:55 crc kubenswrapper[4722]: I0309 14:02:55.288011 4722 scope.go:117] "RemoveContainer" containerID="dc2755da6b63cf19319476368a715f50342e62ebab549e9cf8fc68b2274664d4" Mar 09 14:02:55 crc kubenswrapper[4722]: I0309 14:02:55.288197 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 14:02:55 crc kubenswrapper[4722]: I0309 14:02:55.289660 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:02:55 crc kubenswrapper[4722]: I0309 14:02:55.289713 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:02:55 crc kubenswrapper[4722]: I0309 14:02:55.289737 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:02:55 crc kubenswrapper[4722]: I0309 14:02:55.290783 4722 scope.go:117] "RemoveContainer" containerID="ec26b85ea6ccc495996fb439fb57ce1e90c7f3ff7b0a63af107430b6ab455ea0" Mar 09 14:02:55 crc kubenswrapper[4722]: E0309 14:02:55.291115 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 14:02:55 crc kubenswrapper[4722]: I0309 14:02:55.320852 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 09 14:02:56 crc kubenswrapper[4722]: I0309 14:02:56.086359 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:02:56Z is after 2026-02-23T05:33:13Z Mar 09 14:02:56 crc kubenswrapper[4722]: I0309 14:02:56.292654 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 09 14:02:56 crc kubenswrapper[4722]: I0309 14:02:56.294493 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 14:02:56 crc kubenswrapper[4722]: I0309 14:02:56.295554 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:02:56 crc kubenswrapper[4722]: I0309 14:02:56.295601 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:02:56 crc kubenswrapper[4722]: I0309 14:02:56.295613 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:02:56 crc kubenswrapper[4722]: I0309 14:02:56.311884 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 09 14:02:56 crc kubenswrapper[4722]: I0309 14:02:56.580601 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 14:02:56 crc kubenswrapper[4722]: 
I0309 14:02:56.580747 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 14:02:56 crc kubenswrapper[4722]: I0309 14:02:56.581875 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:02:56 crc kubenswrapper[4722]: I0309 14:02:56.581928 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:02:56 crc kubenswrapper[4722]: I0309 14:02:56.581943 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:02:57 crc kubenswrapper[4722]: I0309 14:02:57.083271 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:02:57Z is after 2026-02-23T05:33:13Z Mar 09 14:02:57 crc kubenswrapper[4722]: I0309 14:02:57.296392 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 14:02:57 crc kubenswrapper[4722]: I0309 14:02:57.297480 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:02:57 crc kubenswrapper[4722]: I0309 14:02:57.297505 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:02:57 crc kubenswrapper[4722]: I0309 14:02:57.297514 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:02:57 crc kubenswrapper[4722]: I0309 14:02:57.389840 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 14:02:57 crc kubenswrapper[4722]: I0309 14:02:57.390071 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 14:02:57 crc kubenswrapper[4722]: I0309 14:02:57.391468 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:02:57 crc kubenswrapper[4722]: I0309 14:02:57.391547 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:02:57 crc kubenswrapper[4722]: I0309 14:02:57.391576 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:02:57 crc kubenswrapper[4722]: I0309 14:02:57.392516 4722 scope.go:117] "RemoveContainer" containerID="ec26b85ea6ccc495996fb439fb57ce1e90c7f3ff7b0a63af107430b6ab455ea0" Mar 09 14:02:57 crc kubenswrapper[4722]: E0309 14:02:57.392857 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 14:02:58 crc kubenswrapper[4722]: I0309 14:02:58.085335 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is 
not yet valid: current time 2026-03-09T14:02:58Z is after 2026-02-23T05:33:13Z Mar 09 14:02:58 crc kubenswrapper[4722]: I0309 14:02:58.337415 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 14:02:58 crc kubenswrapper[4722]: I0309 14:02:58.337596 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 14:02:58 crc kubenswrapper[4722]: I0309 14:02:58.338913 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:02:58 crc kubenswrapper[4722]: I0309 14:02:58.338969 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:02:58 crc kubenswrapper[4722]: I0309 14:02:58.338982 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:02:58 crc kubenswrapper[4722]: I0309 14:02:58.342155 4722 scope.go:117] "RemoveContainer" containerID="ec26b85ea6ccc495996fb439fb57ce1e90c7f3ff7b0a63af107430b6ab455ea0" Mar 09 14:02:58 crc kubenswrapper[4722]: E0309 14:02:58.343539 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 14:02:58 crc kubenswrapper[4722]: I0309 14:02:58.345708 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 14:02:58 crc kubenswrapper[4722]: W0309 14:02:58.784551 4722 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:02:58Z is after 2026-02-23T05:33:13Z Mar 09 14:02:58 crc kubenswrapper[4722]: E0309 14:02:58.784630 4722 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:02:58Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 14:02:58 crc kubenswrapper[4722]: W0309 14:02:58.870621 4722 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:02:58Z is after 2026-02-23T05:33:13Z Mar 09 14:02:58 crc kubenswrapper[4722]: E0309 14:02:58.870711 4722 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-09T14:02:58Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 14:02:59 crc kubenswrapper[4722]: I0309 14:02:59.083573 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:02:59Z is after 2026-02-23T05:33:13Z Mar 09 14:02:59 crc kubenswrapper[4722]: I0309 14:02:59.206356 4722 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:02:59 crc kubenswrapper[4722]: I0309 14:02:59.206440 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 14:02:59 crc kubenswrapper[4722]: I0309 14:02:59.299758 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 14:02:59 crc kubenswrapper[4722]: I0309 14:02:59.301089 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:02:59 crc kubenswrapper[4722]: I0309 14:02:59.301124 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:02:59 crc kubenswrapper[4722]: I0309 14:02:59.301135 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:02:59 crc kubenswrapper[4722]: I0309 14:02:59.301763 4722 scope.go:117] "RemoveContainer" containerID="ec26b85ea6ccc495996fb439fb57ce1e90c7f3ff7b0a63af107430b6ab455ea0" Mar 09 14:02:59 crc kubenswrapper[4722]: E0309 14:02:59.301956 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 14:02:59 crc kubenswrapper[4722]: I0309 14:02:59.350999 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 14:02:59 crc kubenswrapper[4722]: W0309 14:02:59.619933 4722 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:02:59Z is after 2026-02-23T05:33:13Z Mar 09 14:02:59 crc kubenswrapper[4722]: E0309 14:02:59.620006 4722 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get 
\"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:02:59Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 14:03:00 crc kubenswrapper[4722]: I0309 14:03:00.084153 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:03:00Z is after 2026-02-23T05:33:13Z Mar 09 14:03:00 crc kubenswrapper[4722]: I0309 14:03:00.216026 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 14:03:00 crc kubenswrapper[4722]: E0309 14:03:00.216195 4722 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:03:00Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 09 14:03:00 crc kubenswrapper[4722]: I0309 14:03:00.217185 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:03:00 crc kubenswrapper[4722]: I0309 14:03:00.217244 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:03:00 crc kubenswrapper[4722]: I0309 14:03:00.217260 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:03:00 crc kubenswrapper[4722]: I0309 14:03:00.217287 4722 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 14:03:00 crc kubenswrapper[4722]: E0309 14:03:00.219941 4722 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:03:00Z is after 2026-02-23T05:33:13Z" node="crc" Mar 09 14:03:00 crc kubenswrapper[4722]: E0309 14:03:00.225998 4722 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 09 14:03:00 crc kubenswrapper[4722]: I0309 14:03:00.302444 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 14:03:00 crc kubenswrapper[4722]: I0309 14:03:00.303966 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:03:00 crc kubenswrapper[4722]: I0309 14:03:00.304029 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:03:00 crc kubenswrapper[4722]: I0309 14:03:00.304047 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:03:00 crc kubenswrapper[4722]: I0309 14:03:00.304980 4722 scope.go:117] "RemoveContainer" containerID="ec26b85ea6ccc495996fb439fb57ce1e90c7f3ff7b0a63af107430b6ab455ea0" Mar 09 14:03:00 crc kubenswrapper[4722]: E0309 14:03:00.305309 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s 
restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 14:03:01 crc kubenswrapper[4722]: I0309 14:03:01.083445 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:03:01Z is after 2026-02-23T05:33:13Z Mar 09 14:03:01 crc kubenswrapper[4722]: W0309 14:03:01.635199 4722 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:03:01Z is after 2026-02-23T05:33:13Z Mar 09 14:03:01 crc kubenswrapper[4722]: E0309 14:03:01.635353 4722 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:03:01Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 14:03:02 crc kubenswrapper[4722]: I0309 14:03:02.025488 4722 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 09 14:03:02 crc kubenswrapper[4722]: E0309 14:03:02.030631 4722 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:03:02Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 14:03:02 crc kubenswrapper[4722]: I0309 14:03:02.085168 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:03:02Z is after 2026-02-23T05:33:13Z Mar 09 14:03:03 crc kubenswrapper[4722]: I0309 14:03:03.084104 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:03:03Z is after 2026-02-23T05:33:13Z Mar 09 14:03:03 crc kubenswrapper[4722]: E0309 14:03:03.812688 4722 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:03:03Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189b312fe510cc8f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:40.076729487 +0000 UTC m=+0.632298103,LastTimestamp:2026-03-09 14:02:40.076729487 +0000 UTC m=+0.632298103,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 14:03:04 crc kubenswrapper[4722]: I0309 14:03:04.084448 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:03:04Z is after 2026-02-23T05:33:13Z Mar 09 14:03:05 crc kubenswrapper[4722]: I0309 14:03:05.084111 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:03:05Z is after 2026-02-23T05:33:13Z Mar 09 14:03:06 crc kubenswrapper[4722]: I0309 14:03:06.083612 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:03:06Z is after 2026-02-23T05:33:13Z Mar 09 14:03:07 crc kubenswrapper[4722]: I0309 14:03:07.085593 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:03:07Z is after 2026-02-23T05:33:13Z Mar 09 14:03:07 crc kubenswrapper[4722]: I0309 14:03:07.220595 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 14:03:07 crc kubenswrapper[4722]: E0309 14:03:07.222411 4722 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:03:07Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 09 14:03:07 crc kubenswrapper[4722]: I0309 14:03:07.222610 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:03:07 crc kubenswrapper[4722]: I0309 14:03:07.222661 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:03:07 crc kubenswrapper[4722]: I0309 14:03:07.222687 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:03:07 crc kubenswrapper[4722]: I0309 14:03:07.222722 4722 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 14:03:07 crc kubenswrapper[4722]: E0309 14:03:07.227584 4722 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-09T14:03:07Z is after 2026-02-23T05:33:13Z" node="crc" Mar 09 14:03:07 crc kubenswrapper[4722]: W0309 14:03:07.896644 4722 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:03:07Z is after 2026-02-23T05:33:13Z Mar 09 14:03:07 crc kubenswrapper[4722]: E0309 14:03:07.896751 4722 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:03:07Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 14:03:08 crc kubenswrapper[4722]: I0309 14:03:08.083142 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:03:08Z is after 2026-02-23T05:33:13Z Mar 09 14:03:08 crc kubenswrapper[4722]: W0309 14:03:08.475556 4722 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:03:08Z is after 2026-02-23T05:33:13Z Mar 09 14:03:08 crc kubenswrapper[4722]: E0309 14:03:08.475659 4722 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:03:08Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 14:03:09 crc kubenswrapper[4722]: I0309 14:03:09.083534 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:03:09Z is after 2026-02-23T05:33:13Z Mar 09 14:03:09 crc kubenswrapper[4722]: I0309 14:03:09.206004 4722 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:03:09 crc kubenswrapper[4722]: I0309 14:03:09.206115 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request 
canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 14:03:09 crc kubenswrapper[4722]: I0309 14:03:09.206236 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 14:03:09 crc kubenswrapper[4722]: I0309 14:03:09.206497 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 14:03:09 crc kubenswrapper[4722]: I0309 14:03:09.207939 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:03:09 crc kubenswrapper[4722]: I0309 14:03:09.208027 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:03:09 crc kubenswrapper[4722]: I0309 14:03:09.208049 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:03:09 crc kubenswrapper[4722]: I0309 14:03:09.208753 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"50ecab202a56b3927a1dcdeae44cf5397c4182f1776cd5c3203941d95b498020"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 09 14:03:09 crc kubenswrapper[4722]: I0309 14:03:09.209019 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://50ecab202a56b3927a1dcdeae44cf5397c4182f1776cd5c3203941d95b498020" gracePeriod=30 Mar 09 14:03:09 crc kubenswrapper[4722]: I0309 14:03:09.332252 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 09 14:03:09 crc kubenswrapper[4722]: I0309 14:03:09.332717 4722 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="50ecab202a56b3927a1dcdeae44cf5397c4182f1776cd5c3203941d95b498020" exitCode=255 Mar 09 14:03:09 crc kubenswrapper[4722]: I0309 14:03:09.332782 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"50ecab202a56b3927a1dcdeae44cf5397c4182f1776cd5c3203941d95b498020"} Mar 09 14:03:10 crc kubenswrapper[4722]: I0309 14:03:10.084981 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:03:10Z is after 2026-02-23T05:33:13Z Mar 09 14:03:10 crc kubenswrapper[4722]: E0309 14:03:10.226132 4722 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 09 14:03:10 crc kubenswrapper[4722]: I0309 14:03:10.339354 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 09 14:03:10 crc kubenswrapper[4722]: I0309 
14:03:10.340037 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"1a07bf846a5613ee835b70c1915ee46d1e45185f5ff81096ec2e4dd4238f35f3"} Mar 09 14:03:10 crc kubenswrapper[4722]: I0309 14:03:10.340182 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 14:03:10 crc kubenswrapper[4722]: I0309 14:03:10.341511 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:03:10 crc kubenswrapper[4722]: I0309 14:03:10.341585 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:03:10 crc kubenswrapper[4722]: I0309 14:03:10.341600 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:03:11 crc kubenswrapper[4722]: I0309 14:03:11.083736 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:03:11Z is after 2026-02-23T05:33:13Z Mar 09 14:03:11 crc kubenswrapper[4722]: I0309 14:03:11.342267 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 14:03:11 crc kubenswrapper[4722]: I0309 14:03:11.343472 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:03:11 crc kubenswrapper[4722]: I0309 14:03:11.343524 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:03:11 crc kubenswrapper[4722]: I0309 14:03:11.343544 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:03:11 crc kubenswrapper[4722]: W0309 14:03:11.344905 4722 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:03:11Z is after 2026-02-23T05:33:13Z Mar 09 14:03:11 crc kubenswrapper[4722]: E0309 14:03:11.345052 4722 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:03:11Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 14:03:12 crc kubenswrapper[4722]: I0309 14:03:12.085614 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:03:12Z is after 2026-02-23T05:33:13Z Mar 09 14:03:13 crc kubenswrapper[4722]: I0309 14:03:13.086254 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:03:13Z is after 2026-02-23T05:33:13Z Mar 09 14:03:13 crc kubenswrapper[4722]: I0309 14:03:13.611830 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 14:03:13 crc kubenswrapper[4722]: I0309 14:03:13.612032 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 14:03:13 crc kubenswrapper[4722]: I0309 14:03:13.613327 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:03:13 crc kubenswrapper[4722]: I0309 14:03:13.613364 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:03:13 crc kubenswrapper[4722]: I0309 14:03:13.613375 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:03:13 crc kubenswrapper[4722]: E0309 14:03:13.818601 4722 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:03:13Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189b312fe510cc8f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:40.076729487 +0000 UTC m=+0.632298103,LastTimestamp:2026-03-09 14:02:40.076729487 +0000 UTC m=+0.632298103,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 14:03:14 crc kubenswrapper[4722]: I0309 14:03:14.085151 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:03:14Z is after 2026-02-23T05:33:13Z Mar 09 14:03:14 crc kubenswrapper[4722]: E0309 14:03:14.227059 4722 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:03:14Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 09 14:03:14 crc kubenswrapper[4722]: I0309 14:03:14.228003 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 14:03:14 crc kubenswrapper[4722]: I0309 14:03:14.229789 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:03:14 crc kubenswrapper[4722]: I0309 14:03:14.229861 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:03:14 crc kubenswrapper[4722]: I0309 14:03:14.229881 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 09 14:03:14 crc kubenswrapper[4722]: I0309 14:03:14.229920 4722 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 14:03:14 crc kubenswrapper[4722]: E0309 14:03:14.232738 4722 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:03:14Z is after 2026-02-23T05:33:13Z" node="crc" Mar 09 14:03:15 crc kubenswrapper[4722]: I0309 14:03:15.085313 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:03:15Z is after 2026-02-23T05:33:13Z Mar 09 14:03:15 crc kubenswrapper[4722]: I0309 14:03:15.149228 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 14:03:15 crc kubenswrapper[4722]: I0309 14:03:15.150997 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:03:15 crc kubenswrapper[4722]: I0309 14:03:15.151094 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:03:15 crc kubenswrapper[4722]: I0309 14:03:15.151118 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:03:15 crc kubenswrapper[4722]: I0309 14:03:15.152146 4722 scope.go:117] "RemoveContainer" containerID="ec26b85ea6ccc495996fb439fb57ce1e90c7f3ff7b0a63af107430b6ab455ea0" Mar 09 14:03:16 crc kubenswrapper[4722]: I0309 14:03:16.084628 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:03:16Z is after 2026-02-23T05:33:13Z Mar 09 14:03:16 crc kubenswrapper[4722]: I0309 14:03:16.205325 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 14:03:16 crc kubenswrapper[4722]: I0309 14:03:16.205639 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 14:03:16 crc kubenswrapper[4722]: I0309 14:03:16.207918 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:03:16 crc kubenswrapper[4722]: I0309 14:03:16.207975 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:03:16 crc kubenswrapper[4722]: I0309 14:03:16.207998 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:03:16 crc kubenswrapper[4722]: I0309 14:03:16.359804 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 09 14:03:16 crc kubenswrapper[4722]: I0309 14:03:16.360676 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 
09 14:03:16 crc kubenswrapper[4722]: I0309 14:03:16.363536 4722 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="413f1bb85b3d9f04be64e92b097c9bc232a688f0b87a936bdbc7c9c369cb820c" exitCode=255 Mar 09 14:03:16 crc kubenswrapper[4722]: I0309 14:03:16.363609 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"413f1bb85b3d9f04be64e92b097c9bc232a688f0b87a936bdbc7c9c369cb820c"} Mar 09 14:03:16 crc kubenswrapper[4722]: I0309 14:03:16.363726 4722 scope.go:117] "RemoveContainer" containerID="ec26b85ea6ccc495996fb439fb57ce1e90c7f3ff7b0a63af107430b6ab455ea0" Mar 09 14:03:16 crc kubenswrapper[4722]: I0309 14:03:16.363910 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 14:03:16 crc kubenswrapper[4722]: I0309 14:03:16.370282 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:03:16 crc kubenswrapper[4722]: I0309 14:03:16.370371 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:03:16 crc kubenswrapper[4722]: I0309 14:03:16.370402 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:03:16 crc kubenswrapper[4722]: I0309 14:03:16.371360 4722 scope.go:117] "RemoveContainer" containerID="413f1bb85b3d9f04be64e92b097c9bc232a688f0b87a936bdbc7c9c369cb820c" Mar 09 14:03:16 crc kubenswrapper[4722]: E0309 14:03:16.371727 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 14:03:17 crc kubenswrapper[4722]: I0309 14:03:17.086199 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:03:17Z is after 2026-02-23T05:33:13Z Mar 09 14:03:17 crc kubenswrapper[4722]: I0309 14:03:17.369685 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 09 14:03:17 crc kubenswrapper[4722]: I0309 14:03:17.390188 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 14:03:17 crc kubenswrapper[4722]: I0309 14:03:17.390392 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 14:03:17 crc kubenswrapper[4722]: I0309 14:03:17.391812 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:03:17 crc kubenswrapper[4722]: I0309 14:03:17.391858 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:03:17 crc kubenswrapper[4722]: I0309 14:03:17.391872 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
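
The exitCode=255 and "back-off 20s" entries fit the kubelet's standard crash-loop handling: the restart delay starts at 10s and doubles up to a 5m cap, so 20s indicates the second restart. To see why kube-apiserver-check-endpoints exited, one could read the rotated container logs at the exact paths the kubelet reports above; a sketch, assuming root shell access on the node:

    # Files under /var/log/pods are plain text, one timestamped line per
    # container write; 2.log is the latest rotation per the entries above.
    sudo tail -n 50 "/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
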
event="NodeHasSufficientPID" Mar 09 14:03:17 crc kubenswrapper[4722]: I0309 14:03:17.392681 4722 scope.go:117] "RemoveContainer" containerID="413f1bb85b3d9f04be64e92b097c9bc232a688f0b87a936bdbc7c9c369cb820c" Mar 09 14:03:17 crc kubenswrapper[4722]: E0309 14:03:17.392917 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 14:03:18 crc kubenswrapper[4722]: I0309 14:03:18.056543 4722 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 09 14:03:18 crc kubenswrapper[4722]: E0309 14:03:18.061279 4722 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:03:18Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 14:03:18 crc kubenswrapper[4722]: E0309 14:03:18.062535 4722 certificate_manager.go:440] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Reached backoff limit, still unable to rotate certs: timed out waiting for the condition" logger="UnhandledError" Mar 09 14:03:18 crc kubenswrapper[4722]: I0309 14:03:18.084275 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:03:18Z is after 2026-02-23T05:33:13Z Mar 09 14:03:19 crc kubenswrapper[4722]: I0309 14:03:19.084682 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:03:19Z is after 2026-02-23T05:33:13Z Mar 09 14:03:19 crc kubenswrapper[4722]: W0309 14:03:19.127725 4722 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:03:19Z is after 2026-02-23T05:33:13Z Mar 09 14:03:19 crc kubenswrapper[4722]: E0309 14:03:19.127860 4722 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:03:19Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 09 14:03:19 crc kubenswrapper[4722]: I0309 14:03:19.205377 4722 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller 
namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:03:19 crc kubenswrapper[4722]: I0309 14:03:19.205592 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 14:03:19 crc kubenswrapper[4722]: I0309 14:03:19.351603 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 14:03:19 crc kubenswrapper[4722]: I0309 14:03:19.351959 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 14:03:19 crc kubenswrapper[4722]: I0309 14:03:19.354283 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:03:19 crc kubenswrapper[4722]: I0309 14:03:19.354510 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:03:19 crc kubenswrapper[4722]: I0309 14:03:19.354682 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:03:19 crc kubenswrapper[4722]: I0309 14:03:19.356061 4722 scope.go:117] "RemoveContainer" containerID="413f1bb85b3d9f04be64e92b097c9bc232a688f0b87a936bdbc7c9c369cb820c" Mar 09 14:03:19 crc kubenswrapper[4722]: E0309 14:03:19.356621 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 14:03:20 crc kubenswrapper[4722]: I0309 14:03:20.083750 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:03:20Z is after 2026-02-23T05:33:13Z Mar 09 14:03:20 crc kubenswrapper[4722]: E0309 14:03:20.226343 4722 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 09 14:03:21 crc kubenswrapper[4722]: I0309 14:03:21.083886 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:03:21Z is after 2026-02-23T05:33:13Z Mar 09 14:03:21 crc kubenswrapper[4722]: E0309 14:03:21.231034 4722 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
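
The certificate_manager entries show the resulting deadlock: the kubelet tries to renew its client certificate, but the CSR POST fails on the same expired serving certificate, so rotation hits its backoff limit. To check how stale the on-disk client certificate is, a sketch assuming the default kubelet PKI location (adjust if this node overrides --cert-dir):

    # Validity window of the kubelet's current client certificate.
    sudo openssl x509 -noout -dates \
        -in /var/lib/kubelet/pki/kubelet-client-current.pem
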
Mar 09 14:03:21 crc kubenswrapper[4722]: I0309 14:03:21.233190 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 14:03:21 crc kubenswrapper[4722]: I0309 14:03:21.234944 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:03:21 crc kubenswrapper[4722]: I0309 14:03:21.235015 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:03:21 crc kubenswrapper[4722]: I0309 14:03:21.235041 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:03:21 crc kubenswrapper[4722]: I0309 14:03:21.235088 4722 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 09 14:03:21 crc kubenswrapper[4722]: E0309 14:03:21.237986 4722 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:03:21Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 09 14:03:22 crc kubenswrapper[4722]: I0309 14:03:22.084893 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:03:22Z is after 2026-02-23T05:33:13Z
Mar 09 14:03:23 crc kubenswrapper[4722]: I0309 14:03:23.084813 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:03:23Z is after 2026-02-23T05:33:13Z
Mar 09 14:03:23 crc kubenswrapper[4722]: E0309 14:03:23.824044 4722 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:03:23Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189b312fe510cc8f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:40.076729487 +0000 UTC m=+0.632298103,LastTimestamp:2026-03-09 14:02:40.076729487 +0000 UTC m=+0.632298103,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 14:03:24 crc kubenswrapper[4722]: W0309 14:03:24.021628 4722 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:03:24Z is after 2026-02-23T05:33:13Z
Mar 09 14:03:24 crc kubenswrapper[4722]: E0309 14:03:24.021765 4722 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:03:24Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 09 14:03:24 crc kubenswrapper[4722]: I0309 14:03:24.085056 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:03:24Z is after 2026-02-23T05:33:13Z
Mar 09 14:03:24 crc kubenswrapper[4722]: W0309 14:03:24.476875 4722 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:03:24Z is after 2026-02-23T05:33:13Z
Mar 09 14:03:24 crc kubenswrapper[4722]: E0309 14:03:24.477015 4722 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:03:24Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 09 14:03:25 crc kubenswrapper[4722]: I0309 14:03:25.085496 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:03:25Z is after 2026-02-23T05:33:13Z
Mar 09 14:03:26 crc kubenswrapper[4722]: I0309 14:03:26.083337 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:03:26Z is after 2026-02-23T05:33:13Z
Mar 09 14:03:27 crc kubenswrapper[4722]: I0309 14:03:27.085741 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:03:27Z is after 2026-02-23T05:33:13Z
Mar 09 14:03:28 crc kubenswrapper[4722]: I0309 14:03:28.085547 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:03:28Z is after 2026-02-23T05:33:13Z
Mar 09 14:03:28 crc kubenswrapper[4722]: E0309 14:03:28.237251 4722 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:03:28Z is after 2026-02-23T05:33:13Z" interval="7s"
Mar 09 14:03:28 crc kubenswrapper[4722]: I0309 14:03:28.238440 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 14:03:28 crc kubenswrapper[4722]: I0309 14:03:28.239755 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:03:28 crc kubenswrapper[4722]: I0309 14:03:28.239801 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:03:28 crc kubenswrapper[4722]: I0309 14:03:28.239815 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:03:28 crc kubenswrapper[4722]: I0309 14:03:28.239843 4722 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 09 14:03:28 crc kubenswrapper[4722]: E0309 14:03:28.244535 4722 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:03:28Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 09 14:03:28 crc kubenswrapper[4722]: W0309 14:03:28.837455 4722 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:03:28Z is after 2026-02-23T05:33:13Z
Mar 09 14:03:28 crc kubenswrapper[4722]: E0309 14:03:28.837545 4722 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:03:28Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 09 14:03:29 crc kubenswrapper[4722]: I0309 14:03:29.085449 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:03:29Z is after 2026-02-23T05:33:13Z
Mar 09 14:03:29 crc kubenswrapper[4722]: I0309 14:03:29.206046 4722 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 14:03:29 crc kubenswrapper[4722]: I0309 14:03:29.206248 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
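
The cluster-policy-controller startup probe is timing out rather than being refused, which usually means the container is running but not serving yet. The kubelet's probe can be reproduced by hand from the node; the endpoint below is copied from the probe output above, and -k is needed because this listener also presents a cluster-signed certificate:

    # Same healthz endpoint the kubelet probes, with a 5s cap like a probe.
    curl -ks --max-time 5 https://192.168.126.11:10357/healthz; echo
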
Mar 09 14:03:30 crc kubenswrapper[4722]: I0309 14:03:30.083912 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:03:30Z is after 2026-02-23T05:33:13Z
Mar 09 14:03:30 crc kubenswrapper[4722]: E0309 14:03:30.226522 4722 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 09 14:03:31 crc kubenswrapper[4722]: I0309 14:03:31.086777 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 09 14:03:31 crc kubenswrapper[4722]: I0309 14:03:31.762754 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 09 14:03:31 crc kubenswrapper[4722]: I0309 14:03:31.763513 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 14:03:31 crc kubenswrapper[4722]: I0309 14:03:31.765243 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:03:31 crc kubenswrapper[4722]: I0309 14:03:31.765451 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:03:31 crc kubenswrapper[4722]: I0309 14:03:31.765605 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:03:32 crc kubenswrapper[4722]: I0309 14:03:32.086920 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 09 14:03:33 crc kubenswrapper[4722]: I0309 14:03:33.090442 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 09 14:03:33 crc kubenswrapper[4722]: E0309 14:03:33.832590 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b312fe510cc8f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:40.076729487 +0000 UTC m=+0.632298103,LastTimestamp:2026-03-09 14:02:40.076729487 +0000 UTC m=+0.632298103,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 14:03:33 crc kubenswrapper[4722]: E0309 14:03:33.840745 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b312fe891b82d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:40.135510061 +0000 UTC m=+0.691078647,LastTimestamp:2026-03-09 14:02:40.135510061 +0000 UTC m=+0.691078647,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 14:03:33 crc kubenswrapper[4722]: E0309 14:03:33.849573 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b312fe892090a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:40.135530762 +0000 UTC m=+0.691099348,LastTimestamp:2026-03-09 14:02:40.135530762 +0000 UTC m=+0.691099348,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 14:03:33 crc kubenswrapper[4722]: E0309 14:03:33.857175 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b312fe8923aa6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:40.135543462 +0000 UTC m=+0.691112048,LastTimestamp:2026-03-09 14:02:40.135543462 +0000 UTC m=+0.691112048,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 14:03:33 crc kubenswrapper[4722]: E0309 14:03:33.864498 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b312fed875a31 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:40.218716721 +0000 UTC m=+0.774285307,LastTimestamp:2026-03-09 14:02:40.218716721 +0000 UTC m=+0.774285307,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
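
From 14:03:31 the errors change shape: the API server is reachable again, but it maps the kubelet's expired client certificate to system:anonymous, so requests now fail RBAC checks instead of TLS verification. A quick way to confirm what that identity may do, assuming a working admin kubeconfig (the expected answer to both is "no"):

    # Impersonate the anonymous user the kubelet is falling back to.
    oc auth can-i get csinodes --as=system:anonymous
    oc auth can-i create events --as=system:anonymous -n default
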
Mar 09 14:03:33 crc kubenswrapper[4722]: E0309 14:03:33.871943 4722 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b312fe891b82d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b312fe891b82d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:40.135510061 +0000 UTC m=+0.691078647,LastTimestamp:2026-03-09 14:02:40.249850651 +0000 UTC m=+0.805419267,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 14:03:33 crc kubenswrapper[4722]: E0309 14:03:33.876277 4722 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b312fe892090a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b312fe892090a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:40.135530762 +0000 UTC m=+0.691099348,LastTimestamp:2026-03-09 14:02:40.250071857 +0000 UTC m=+0.805640493,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 14:03:33 crc kubenswrapper[4722]: E0309 14:03:33.880360 4722 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b312fe8923aa6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b312fe8923aa6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:40.135543462 +0000 UTC m=+0.691112048,LastTimestamp:2026-03-09 14:02:40.250268613 +0000 UTC m=+0.805837229,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 14:03:33 crc kubenswrapper[4722]: E0309 14:03:33.885447 4722 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b312fe891b82d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b312fe891b82d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:40.135510061 +0000 UTC m=+0.691078647,LastTimestamp:2026-03-09 14:02:40.252453773 +0000 UTC m=+0.808022349,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 14:03:33 crc kubenswrapper[4722]: E0309 14:03:33.893155 4722 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b312fe892090a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b312fe892090a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:40.135530762 +0000 UTC m=+0.691099348,LastTimestamp:2026-03-09 14:02:40.252475094 +0000 UTC m=+0.808043670,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 14:03:33 crc kubenswrapper[4722]: E0309 14:03:33.899992 4722 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b312fe8923aa6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b312fe8923aa6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:40.135543462 +0000 UTC m=+0.691112048,LastTimestamp:2026-03-09 14:02:40.252485984 +0000 UTC m=+0.808054560,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 14:03:33 crc kubenswrapper[4722]: E0309 14:03:33.906884 4722 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b312fe891b82d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b312fe891b82d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:40.135510061 +0000 UTC m=+0.691078647,LastTimestamp:2026-03-09 14:02:40.25270286 +0000 UTC m=+0.808271456,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 14:03:33 crc kubenswrapper[4722]: E0309 14:03:33.913824 4722 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b312fe892090a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b312fe892090a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:40.135530762 +0000 UTC m=+0.691099348,LastTimestamp:2026-03-09 14:02:40.252721071 +0000 UTC m=+0.808289667,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 14:03:33 crc kubenswrapper[4722]: E0309 14:03:33.920482 4722 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b312fe8923aa6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b312fe8923aa6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:40.135543462 +0000 UTC m=+0.691112048,LastTimestamp:2026-03-09 14:02:40.252734621 +0000 UTC m=+0.808303207,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 14:03:33 crc kubenswrapper[4722]: E0309 14:03:33.927142 4722 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b312fe891b82d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b312fe891b82d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:40.135510061 +0000 UTC m=+0.691078647,LastTimestamp:2026-03-09 14:02:40.254941442 +0000 UTC m=+0.810510028,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 14:03:33 crc kubenswrapper[4722]: E0309 14:03:33.934036 4722 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b312fe892090a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b312fe892090a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:40.135530762 +0000 UTC m=+0.691099348,LastTimestamp:2026-03-09 14:02:40.254956543 +0000 UTC m=+0.810525129,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 14:03:33 crc kubenswrapper[4722]: E0309 14:03:33.940810 4722 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b312fe8923aa6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b312fe8923aa6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:40.135543462 +0000 UTC m=+0.691112048,LastTimestamp:2026-03-09 14:02:40.254967913 +0000 UTC m=+0.810536509,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 14:03:33 crc kubenswrapper[4722]: E0309 14:03:33.947527 4722 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b312fe891b82d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b312fe891b82d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:40.135510061 +0000 UTC m=+0.691078647,LastTimestamp:2026-03-09 14:02:40.255024105 +0000 UTC m=+0.810592681,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 14:03:33 crc kubenswrapper[4722]: E0309 14:03:33.954647 4722 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b312fe892090a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b312fe892090a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:40.135530762 +0000 UTC m=+0.691099348,LastTimestamp:2026-03-09 14:02:40.255046345 +0000 UTC m=+0.810614921,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 14:03:33 crc kubenswrapper[4722]: E0309 14:03:33.965365 4722 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b312fe8923aa6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b312fe8923aa6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:40.135543462 +0000 UTC m=+0.691112048,LastTimestamp:2026-03-09 14:02:40.255059785 +0000 UTC m=+0.810628361,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 14:03:33 crc kubenswrapper[4722]: E0309 14:03:33.972166 4722 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b312fe891b82d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b312fe891b82d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:40.135510061 +0000 UTC m=+0.691078647,LastTimestamp:2026-03-09 14:02:40.256640059 +0000 UTC m=+0.812208645,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 14:03:33 crc kubenswrapper[4722]: E0309 14:03:33.977295 4722 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b312fe892090a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b312fe892090a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:40.135530762 +0000 UTC m=+0.691099348,LastTimestamp:2026-03-09 14:02:40.25666679 +0000 UTC m=+0.812235366,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 14:03:33 crc kubenswrapper[4722]: E0309 14:03:33.982236 4722 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b312fe8923aa6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b312fe8923aa6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:40.135543462 +0000 UTC m=+0.691112048,LastTimestamp:2026-03-09 14:02:40.2566819 +0000 UTC m=+0.812250476,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 14:03:33 crc kubenswrapper[4722]: E0309 14:03:33.988847 4722 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b312fe891b82d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b312fe891b82d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:40.135510061 +0000 UTC m=+0.691078647,LastTimestamp:2026-03-09 14:02:40.257542653 +0000 UTC m=+0.813111229,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 14:03:33 crc kubenswrapper[4722]: E0309 14:03:33.995851 4722 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189b312fe892090a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189b312fe892090a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:40.135530762 +0000 UTC m=+0.691099348,LastTimestamp:2026-03-09 14:02:40.257554324 +0000 UTC m=+0.813122900,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.005449 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b3130085f4fb9 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:40.669077433 +0000 UTC m=+1.224645999,LastTimestamp:2026-03-09 14:02:40.669077433 +0000 UTC m=+1.224645999,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.012923 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b31300875a841 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:40.670541889 +0000 UTC m=+1.226110485,LastTimestamp:2026-03-09 14:02:40.670541889 +0000 UTC m=+1.226110485,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
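
Each rejected patch above is the kubelet replaying the same three node-condition events with an incremented Count (2 through 8); since event.go logs "will not retry", those records survive only in this file. Once the kubelet can authenticate again, whatever did reach the API server can be listed, for example:

    # Events that were actually persisted, oldest first.
    oc get events -n default --sort-by=.lastTimestamp
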
Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.018473 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b313008d80c49 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:40.676990025 +0000 UTC m=+1.232558611,LastTimestamp:2026-03-09 14:02:40.676990025 +0000 UTC m=+1.232558611,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.026966 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b3130096132aa openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:40.685978282 +0000 UTC m=+1.241546898,LastTimestamp:2026-03-09 14:02:40.685978282 +0000 UTC m=+1.241546898,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.033115 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b3130096278b0 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:40.686061744 +0000 UTC m=+1.241630330,LastTimestamp:2026-03-09 14:02:40.686061744 +0000 UTC m=+1.241630330,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.040342 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b31302e2ba200 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:41.303224832 +0000 UTC m=+1.858793408,LastTimestamp:2026-03-09 14:02:41.303224832 +0000 UTC m=+1.858793408,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.059785 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b31302e42d7bf openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:41.304745919 +0000 UTC m=+1.860314495,LastTimestamp:2026-03-09 14:02:41.304745919 +0000 UTC m=+1.860314495,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.065776 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b31302e4b096b openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:41.305282923 +0000 UTC m=+1.860851499,LastTimestamp:2026-03-09 14:02:41.305282923 +0000 UTC m=+1.860851499,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.071473 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b31302e51deb7 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:41.305730743 +0000 UTC m=+1.861299319,LastTimestamp:2026-03-09 14:02:41.305730743 +0000 UTC m=+1.861299319,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.079099 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b31302e795378 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:41.308316536 +0000 UTC m=+1.863885112,LastTimestamp:2026-03-09 14:02:41.308316536 +0000 UTC m=+1.863885112,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.084807 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b31302ee52315 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:41.315382037 +0000 UTC m=+1.870950623,LastTimestamp:2026-03-09 14:02:41.315382037 +0000 UTC m=+1.870950623,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 14:03:34 crc kubenswrapper[4722]: I0309 14:03:34.085108 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.095413 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b31302efcb0b8 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:41.316925624 +0000 UTC m=+1.872494200,LastTimestamp:2026-03-09 14:02:41.316925624 +0000 UTC m=+1.872494200,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.103616 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b31302f0a3460 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:41.317811296 +0000 UTC m=+1.873379872,LastTimestamp:2026-03-09 14:02:41.317811296 +0000 UTC m=+1.873379872,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.110749 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b31302f1f127f openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:41.319178879 +0000 UTC m=+1.874747455,LastTimestamp:2026-03-09 14:02:41.319178879 +0000 UTC m=+1.874747455,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.117341 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b31302f2f347a openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:41.320236154 +0000 UTC m=+1.875804720,LastTimestamp:2026-03-09 14:02:41.320236154 +0000 UTC m=+1.875804720,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.124145 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b31302f886efd openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:41.326083837 +0000 UTC
m=+1.881652453,LastTimestamp:2026-03-09 14:02:41.326083837 +0000 UTC m=+1.881652453,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.129534 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b313041ec7ff5 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:41.634631669 +0000 UTC m=+2.190200245,LastTimestamp:2026-03-09 14:02:41.634631669 +0000 UTC m=+2.190200245,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.136000 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b3130428716ab openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:41.644762795 +0000 UTC m=+2.200331371,LastTimestamp:2026-03-09 14:02:41.644762795 +0000 UTC m=+2.200331371,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.142947 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b3130429c5230 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:41.646154288 +0000 UTC m=+2.201722864,LastTimestamp:2026-03-09 14:02:41.646154288 +0000 UTC m=+2.201722864,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 14:03:34 crc kubenswrapper[4722]: I0309 14:03:34.149034 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 14:03:34 crc kubenswrapper[4722]: I0309 14:03:34.150516 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:03:34 crc kubenswrapper[4722]: I0309 14:03:34.150565 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:03:34 crc kubenswrapper[4722]: I0309 14:03:34.150583 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.150874 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b3130504cc273 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:41.875821171 +0000 UTC m=+2.431389787,LastTimestamp:2026-03-09 14:02:41.875821171 +0000 UTC m=+2.431389787,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 14:03:34 crc kubenswrapper[4722]: I0309 14:03:34.151452 4722 scope.go:117] "RemoveContainer" containerID="413f1bb85b3d9f04be64e92b097c9bc232a688f0b87a936bdbc7c9c369cb820c" Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.151748 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.158461 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b31305162e1f2 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:41.894048242 +0000 UTC m=+2.449616858,LastTimestamp:2026-03-09 14:02:41.894048242 +0000 UTC m=+2.449616858,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.164704 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b3130517eb2b5 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:41.895871157 +0000 UTC m=+2.451439743,LastTimestamp:2026-03-09 14:02:41.895871157 +0000 UTC m=+2.451439743,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.170863 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b31305fa5eb20 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:42.133322528 +0000 UTC m=+2.688891144,LastTimestamp:2026-03-09 14:02:42.133322528 +0000 UTC m=+2.688891144,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.176473 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b313060628d51 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:42.145684817 +0000 UTC m=+2.701253403,LastTimestamp:2026-03-09 14:02:42.145684817 +0000 UTC m=+2.701253403,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.187117 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b3130620b41a1 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:42.173518241 +0000 UTC m=+2.729086857,LastTimestamp:2026-03-09 14:02:42.173518241 +0000 UTC m=+2.729086857,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.195312 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b31306260c339 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:42.179121977 +0000 UTC m=+2.734690563,LastTimestamp:2026-03-09 14:02:42.179121977 +0000 UTC m=+2.734690563,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.201605 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b313062980828 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:42.182744104 +0000 UTC m=+2.738312680,LastTimestamp:2026-03-09 14:02:42.182744104 +0000 UTC m=+2.738312680,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 
09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.211313 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b3130637ebf63 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:42.197864291 +0000 UTC m=+2.753432877,LastTimestamp:2026-03-09 14:02:42.197864291 +0000 UTC m=+2.753432877,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.219734 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b313070d2a8ab openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:42.421467307 +0000 UTC m=+2.977035893,LastTimestamp:2026-03-09 14:02:42.421467307 +0000 UTC m=+2.977035893,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.226111 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b313070ee2f3a openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:42.423271226 +0000 UTC m=+2.978839802,LastTimestamp:2026-03-09 14:02:42.423271226 +0000 UTC m=+2.978839802,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.235673 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-crc.189b313070f50277 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:42.423718519 +0000 UTC m=+2.979287095,LastTimestamp:2026-03-09 14:02:42.423718519 +0000 UTC m=+2.979287095,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.242615 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b313070f5f61b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:42.423780891 +0000 UTC m=+2.979349467,LastTimestamp:2026-03-09 14:02:42.423780891 +0000 UTC m=+2.979349467,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.251007 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b3130719db06d openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:42.434773101 +0000 UTC m=+2.990341677,LastTimestamp:2026-03-09 14:02:42.434773101 +0000 UTC m=+2.990341677,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.257952 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b313071af1da5 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:42.435915173 +0000 UTC m=+2.991483749,LastTimestamp:2026-03-09 14:02:42.435915173 +0000 UTC m=+2.991483749,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.292390 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189b31307221168a openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:42.443384458 +0000 UTC m=+2.998953034,LastTimestamp:2026-03-09 14:02:42.443384458 +0000 UTC m=+2.998953034,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.314456 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b3130729153e7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:42.450740199 +0000 UTC m=+3.006308775,LastTimestamp:2026-03-09 14:02:42.450740199 +0000 UTC m=+3.006308775,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.320938 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b313072a12d98 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:42.451778968 +0000 UTC m=+3.007347544,LastTimestamp:2026-03-09 
14:02:42.451778968 +0000 UTC m=+3.007347544,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.326663 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b3130800f5dd9 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:42.677104089 +0000 UTC m=+3.232672675,LastTimestamp:2026-03-09 14:02:42.677104089 +0000 UTC m=+3.232672675,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.333821 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b31308020d9d4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:42.67824994 +0000 UTC m=+3.233818526,LastTimestamp:2026-03-09 14:02:42.67824994 +0000 UTC m=+3.233818526,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.337915 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b313080b7bff2 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:42.68813925 +0000 UTC m=+3.243707826,LastTimestamp:2026-03-09 14:02:42.68813925 +0000 UTC m=+3.243707826,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.343349 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API 
group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b313080ca4bb2 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:42.689354674 +0000 UTC m=+3.244923240,LastTimestamp:2026-03-09 14:02:42.689354674 +0000 UTC m=+3.244923240,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.347639 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b313080f1bd79 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:42.691939705 +0000 UTC m=+3.247508281,LastTimestamp:2026-03-09 14:02:42.691939705 +0000 UTC m=+3.247508281,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.353015 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b3130813090e2 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:42.696057058 +0000 UTC m=+3.251625634,LastTimestamp:2026-03-09 14:02:42.696057058 +0000 UTC m=+3.251625634,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.356540 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b31308dc09250 
openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:42.9068212 +0000 UTC m=+3.462389786,LastTimestamp:2026-03-09 14:02:42.9068212 +0000 UTC m=+3.462389786,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.361192 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b31308ddc51ac openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:42.90863966 +0000 UTC m=+3.464208236,LastTimestamp:2026-03-09 14:02:42.90863966 +0000 UTC m=+3.464208236,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.366071 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189b31308ea5f7dd openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:42.921854941 +0000 UTC m=+3.477423517,LastTimestamp:2026-03-09 14:02:42.921854941 +0000 UTC m=+3.477423517,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.369701 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b31308fb5e984 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:42.93967706 +0000 UTC m=+3.495245636,LastTimestamp:2026-03-09 14:02:42.93967706 +0000 UTC m=+3.495245636,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.373735 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b31308fc55e9e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:42.940690078 +0000 UTC m=+3.496258654,LastTimestamp:2026-03-09 14:02:42.940690078 +0000 UTC m=+3.496258654,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.377870 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b313090af11a5 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:42.956005797 +0000 UTC m=+3.511574373,LastTimestamp:2026-03-09 14:02:42.956005797 +0000 UTC m=+3.511574373,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.381922 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b31309b4075b4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container 
kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:43.133306292 +0000 UTC m=+3.688874868,LastTimestamp:2026-03-09 14:02:43.133306292 +0000 UTC m=+3.688874868,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.385383 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b31309c3fa64a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:43.15003041 +0000 UTC m=+3.705598986,LastTimestamp:2026-03-09 14:02:43.15003041 +0000 UTC m=+3.705598986,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.389242 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b31309c4f8cd5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:43.151072469 +0000 UTC m=+3.706641045,LastTimestamp:2026-03-09 14:02:43.151072469 +0000 UTC m=+3.706641045,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.392898 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b3130a01b9ade openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:43.214777054 +0000 UTC m=+3.770345630,LastTimestamp:2026-03-09 14:02:43.214777054 +0000 UTC 
m=+3.770345630,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.397224 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b3130a9074ce0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:43.364441312 +0000 UTC m=+3.920009898,LastTimestamp:2026-03-09 14:02:43.364441312 +0000 UTC m=+3.920009898,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.401688 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b3130a9d32583 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:43.377800579 +0000 UTC m=+3.933369165,LastTimestamp:2026-03-09 14:02:43.377800579 +0000 UTC m=+3.933369165,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.405925 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b3130aca92aa8 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:43.425381032 +0000 UTC m=+3.980949628,LastTimestamp:2026-03-09 14:02:43.425381032 +0000 UTC m=+3.980949628,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.410540 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b3130ad6b960f 
openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:43.438122511 +0000 UTC m=+3.993691087,LastTimestamp:2026-03-09 14:02:43.438122511 +0000 UTC m=+3.993691087,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.416825 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b3130dcc627fb openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:44.232587259 +0000 UTC m=+4.788155865,LastTimestamp:2026-03-09 14:02:44.232587259 +0000 UTC m=+4.788155865,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.421670 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b3130ea89d157 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:44.463513943 +0000 UTC m=+5.019082529,LastTimestamp:2026-03-09 14:02:44.463513943 +0000 UTC m=+5.019082529,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.426247 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b3130eb1ed97b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:44.473280891 +0000 UTC m=+5.028849477,LastTimestamp:2026-03-09 14:02:44.473280891 +0000 UTC m=+5.028849477,Count:1,Type:Normal,EventTime:0001-01-01 
00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.432191 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b3130eb311ab2 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:44.474477234 +0000 UTC m=+5.030045820,LastTimestamp:2026-03-09 14:02:44.474477234 +0000 UTC m=+5.030045820,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.438381 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b3130f60ec31e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:44.656775966 +0000 UTC m=+5.212344582,LastTimestamp:2026-03-09 14:02:44.656775966 +0000 UTC m=+5.212344582,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.444509 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b3130f6aaa5a5 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:44.666992037 +0000 UTC m=+5.222560653,LastTimestamp:2026-03-09 14:02:44.666992037 +0000 UTC m=+5.222560653,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.450323 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b3130f6bcf640 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:44.66819232 +0000 UTC m=+5.223760936,LastTimestamp:2026-03-09 14:02:44.66819232 +0000 UTC m=+5.223760936,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.458059 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b31310900ea0b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:44.974635531 +0000 UTC m=+5.530204137,LastTimestamp:2026-03-09 14:02:44.974635531 +0000 UTC m=+5.530204137,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.465127 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b31311083ee1f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:45.100662303 +0000 UTC m=+5.656230919,LastTimestamp:2026-03-09 14:02:45.100662303 +0000 UTC m=+5.656230919,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.472347 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b313110a16543 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:45.102593347 +0000 UTC m=+5.658161963,LastTimestamp:2026-03-09 14:02:45.102593347 
+0000 UTC m=+5.658161963,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.479427 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b313126a4d29d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:45.471916701 +0000 UTC m=+6.027485317,LastTimestamp:2026-03-09 14:02:45.471916701 +0000 UTC m=+6.027485317,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.488527 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b313127a17866 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:45.488474214 +0000 UTC m=+6.044042800,LastTimestamp:2026-03-09 14:02:45.488474214 +0000 UTC m=+6.044042800,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.496132 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b313127b79f7e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:45.489926014 +0000 UTC m=+6.045494600,LastTimestamp:2026-03-09 14:02:45.489926014 +0000 UTC m=+6.045494600,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.497598 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b31313611c26a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] 
[] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:45.730714218 +0000 UTC m=+6.286282794,LastTimestamp:2026-03-09 14:02:45.730714218 +0000 UTC m=+6.286282794,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.502946 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189b313136edeb7e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:45.745142654 +0000 UTC m=+6.300711240,LastTimestamp:2026-03-09 14:02:45.745142654 +0000 UTC m=+6.300711240,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.514707 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 09 14:03:34 crc kubenswrapper[4722]: &Event{ObjectMeta:{kube-controller-manager-crc.189b3132052d1741 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 09 14:03:34 crc kubenswrapper[4722]: body: Mar 09 14:03:34 crc kubenswrapper[4722]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:49.205389121 +0000 UTC m=+9.760957707,LastTimestamp:2026-03-09 14:02:49.205389121 +0000 UTC m=+9.760957707,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 09 14:03:34 crc kubenswrapper[4722]: > Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.522032 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b3132052e1733 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:49.205454643 +0000 UTC m=+9.761023239,LastTimestamp:2026-03-09 14:02:49.205454643 +0000 UTC m=+9.761023239,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.528037 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 09 14:03:34 crc kubenswrapper[4722]: &Event{ObjectMeta:{kube-apiserver-crc.189b313318032195 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 09 14:03:34 crc kubenswrapper[4722]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 09 14:03:34 crc kubenswrapper[4722]: Mar 09 14:03:34 crc kubenswrapper[4722]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:53.816373653 +0000 UTC m=+14.371942249,LastTimestamp:2026-03-09 14:02:53.816373653 +0000 UTC m=+14.371942249,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 09 14:03:34 crc kubenswrapper[4722]: > Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.533322 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b3133180425e7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:53.816440295 +0000 UTC m=+14.372008891,LastTimestamp:2026-03-09 14:02:53.816440295 +0000 UTC m=+14.372008891,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.538270 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 09 14:03:34 crc 
kubenswrapper[4722]: &Event{ObjectMeta:{kube-apiserver-crc.189b3133186e420c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 500 Mar 09 14:03:34 crc kubenswrapper[4722]: body: [+]ping ok Mar 09 14:03:34 crc kubenswrapper[4722]: [+]log ok Mar 09 14:03:34 crc kubenswrapper[4722]: [+]etcd ok Mar 09 14:03:34 crc kubenswrapper[4722]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 09 14:03:34 crc kubenswrapper[4722]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 09 14:03:34 crc kubenswrapper[4722]: [+]poststarthook/openshift.io-api-request-count-filter ok Mar 09 14:03:34 crc kubenswrapper[4722]: [+]poststarthook/openshift.io-startkubeinformers ok Mar 09 14:03:34 crc kubenswrapper[4722]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Mar 09 14:03:34 crc kubenswrapper[4722]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Mar 09 14:03:34 crc kubenswrapper[4722]: [+]poststarthook/generic-apiserver-start-informers ok Mar 09 14:03:34 crc kubenswrapper[4722]: [+]poststarthook/priority-and-fairness-config-consumer ok Mar 09 14:03:34 crc kubenswrapper[4722]: [+]poststarthook/priority-and-fairness-filter ok Mar 09 14:03:34 crc kubenswrapper[4722]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 09 14:03:34 crc kubenswrapper[4722]: [+]poststarthook/start-apiextensions-informers ok Mar 09 14:03:34 crc kubenswrapper[4722]: [-]poststarthook/start-apiextensions-controllers failed: reason withheld Mar 09 14:03:34 crc kubenswrapper[4722]: [-]poststarthook/crd-informer-synced failed: reason withheld Mar 09 14:03:34 crc kubenswrapper[4722]: [+]poststarthook/start-system-namespaces-controller ok Mar 09 14:03:34 crc kubenswrapper[4722]: [+]poststarthook/start-cluster-authentication-info-controller ok Mar 09 14:03:34 crc kubenswrapper[4722]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Mar 09 14:03:34 crc kubenswrapper[4722]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Mar 09 14:03:34 crc kubenswrapper[4722]: [+]poststarthook/start-legacy-token-tracking-controller ok Mar 09 14:03:34 crc kubenswrapper[4722]: [-]poststarthook/start-service-ip-repair-controllers failed: reason withheld Mar 09 14:03:34 crc kubenswrapper[4722]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Mar 09 14:03:34 crc kubenswrapper[4722]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Mar 09 14:03:34 crc kubenswrapper[4722]: [-]poststarthook/priority-and-fairness-config-producer failed: reason withheld Mar 09 14:03:34 crc kubenswrapper[4722]: [-]poststarthook/bootstrap-controller failed: reason withheld Mar 09 14:03:34 crc kubenswrapper[4722]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Mar 09 14:03:34 crc kubenswrapper[4722]: [+]poststarthook/start-kube-aggregator-informers ok Mar 09 14:03:34 crc kubenswrapper[4722]: [+]poststarthook/apiservice-status-local-available-controller ok Mar 09 14:03:34 crc kubenswrapper[4722]: [+]poststarthook/apiservice-status-remote-available-controller ok Mar 09 14:03:34 crc kubenswrapper[4722]: [-]poststarthook/apiservice-registration-controller failed: reason withheld Mar 09 14:03:34 crc kubenswrapper[4722]: 
[+]poststarthook/apiservice-wait-for-first-sync ok Mar 09 14:03:34 crc kubenswrapper[4722]: [-]poststarthook/apiservice-discovery-controller failed: reason withheld Mar 09 14:03:34 crc kubenswrapper[4722]: [+]poststarthook/kube-apiserver-autoregistration ok Mar 09 14:03:34 crc kubenswrapper[4722]: [-]autoregister-completion failed: reason withheld Mar 09 14:03:34 crc kubenswrapper[4722]: [+]poststarthook/apiservice-openapi-controller ok Mar 09 14:03:34 crc kubenswrapper[4722]: [+]poststarthook/apiservice-openapiv3-controller ok Mar 09 14:03:34 crc kubenswrapper[4722]: livez check failed Mar 09 14:03:34 crc kubenswrapper[4722]: Mar 09 14:03:34 crc kubenswrapper[4722]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:53.823394316 +0000 UTC m=+14.378962912,LastTimestamp:2026-03-09 14:02:53.823394316 +0000 UTC m=+14.378962912,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 09 14:03:34 crc kubenswrapper[4722]: > Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.543308 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b3133186ffa2d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 500,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:53.823506989 +0000 UTC m=+14.379075575,LastTimestamp:2026-03-09 14:02:53.823506989 +0000 UTC m=+14.379075575,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.548183 4722 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189b31309c4f8cd5\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b31309c4f8cd5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:43.151072469 +0000 UTC m=+3.706641045,LastTimestamp:2026-03-09 14:02:54.280857604 +0000 UTC m=+14.836426180,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.552353 4722 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189b3130a9074ce0\" is forbidden: User \"system:anonymous\" cannot patch resource 
\"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b3130a9074ce0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:43.364441312 +0000 UTC m=+3.920009898,LastTimestamp:2026-03-09 14:02:54.455851207 +0000 UTC m=+15.011419793,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.556548 4722 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189b3130a9d32583\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189b3130a9d32583 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:43.377800579 +0000 UTC m=+3.933369165,LastTimestamp:2026-03-09 14:02:54.46507761 +0000 UTC m=+15.020646196,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.563395 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 09 14:03:34 crc kubenswrapper[4722]: &Event{ObjectMeta:{kube-controller-manager-crc.189b31345948ba8e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 09 14:03:34 crc kubenswrapper[4722]: body: Mar 09 14:03:34 crc kubenswrapper[4722]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:59.206421134 +0000 UTC m=+19.761989710,LastTimestamp:2026-03-09 14:02:59.206421134 +0000 UTC m=+19.761989710,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 09 14:03:34 crc kubenswrapper[4722]: > Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.566903 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the 
namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b313459496e7b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:59.206467195 +0000 UTC m=+19.762035771,LastTimestamp:2026-03-09 14:02:59.206467195 +0000 UTC m=+19.762035771,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.571986 4722 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b31345948ba8e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 09 14:03:34 crc kubenswrapper[4722]: &Event{ObjectMeta:{kube-controller-manager-crc.189b31345948ba8e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 09 14:03:34 crc kubenswrapper[4722]: body: Mar 09 14:03:34 crc kubenswrapper[4722]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:59.206421134 +0000 UTC m=+19.761989710,LastTimestamp:2026-03-09 14:03:09.206084439 +0000 UTC m=+29.761653065,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 09 14:03:34 crc kubenswrapper[4722]: > Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.575600 4722 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b313459496e7b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b313459496e7b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:59.206467195 +0000 UTC m=+19.762035771,LastTimestamp:2026-03-09 14:03:09.206159221 +0000 UTC 
m=+29.761727827,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.579415 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b3136ad7be903 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:03:09.208996099 +0000 UTC m=+29.764564715,LastTimestamp:2026-03-09 14:03:09.208996099 +0000 UTC m=+29.764564715,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.618832 4722 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b31302efcb0b8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b31302efcb0b8 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:41.316925624 +0000 UTC m=+1.872494200,LastTimestamp:2026-03-09 14:03:09.326504653 +0000 UTC m=+29.882073229,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.626035 4722 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b313041ec7ff5\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b313041ec7ff5 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:41.634631669 +0000 UTC m=+2.190200245,LastTimestamp:2026-03-09 14:03:09.551525644 +0000 UTC m=+30.107094230,Count:2,Type:Normal,EventTime:0001-01-01 
00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.631790 4722 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b3130428716ab\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b3130428716ab openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:41.644762795 +0000 UTC m=+2.200331371,LastTimestamp:2026-03-09 14:03:09.562195978 +0000 UTC m=+30.117764574,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.640941 4722 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b31345948ba8e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 09 14:03:34 crc kubenswrapper[4722]: &Event{ObjectMeta:{kube-controller-manager-crc.189b31345948ba8e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 09 14:03:34 crc kubenswrapper[4722]: body: Mar 09 14:03:34 crc kubenswrapper[4722]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:59.206421134 +0000 UTC m=+19.761989710,LastTimestamp:2026-03-09 14:03:19.205491234 +0000 UTC m=+39.761059850,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 09 14:03:34 crc kubenswrapper[4722]: > Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.647157 4722 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b313459496e7b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189b313459496e7b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection 
(Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:59.206467195 +0000 UTC m=+19.762035771,LastTimestamp:2026-03-09 14:03:19.205656568 +0000 UTC m=+39.761225184,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 09 14:03:34 crc kubenswrapper[4722]: E0309 14:03:34.655355 4722 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189b31345948ba8e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=<
Mar 09 14:03:34 crc kubenswrapper[4722]: &Event{ObjectMeta:{kube-controller-manager-crc.189b31345948ba8e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
Mar 09 14:03:34 crc kubenswrapper[4722]: body:
Mar 09 14:03:34 crc kubenswrapper[4722]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:02:59.206421134 +0000 UTC m=+19.761989710,LastTimestamp:2026-03-09 14:03:29.206127013 +0000 UTC m=+49.761695639,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 09 14:03:34 crc kubenswrapper[4722]: >
Mar 09 14:03:35 crc kubenswrapper[4722]: I0309 14:03:35.088045 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 09 14:03:35 crc kubenswrapper[4722]: E0309 14:03:35.244462 4722 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 09 14:03:35 crc kubenswrapper[4722]: I0309 14:03:35.245270 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 14:03:35 crc kubenswrapper[4722]: I0309 14:03:35.246688 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:03:35 crc kubenswrapper[4722]: I0309 14:03:35.246763 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:03:35 crc kubenswrapper[4722]: I0309 14:03:35.246784 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:03:35 crc kubenswrapper[4722]: I0309 14:03:35.246829 4722 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 09 14:03:35 crc kubenswrapper[4722]: E0309 14:03:35.254518 4722 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
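
Every rejection in the block above has the same root cause: the kubelet has not yet finished its TLS bootstrap, so its API requests carry no client certificate and the server attributes them to system:anonymous, an identity with no RBAC grants for events, nodes, leases, or csinodes. A minimal client-go sketch of the authorization check that is failing here; the kubeconfig path and namespace are illustrative assumptions, not values taken from this log:

    package main

    import (
        "context"
        "fmt"

        authv1 "k8s.io/api/authorization/v1"
        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/clientcmd"
    )

    func main() {
        // Assumed kubeconfig location; the kubelet's real bootstrap flow differs.
        cfg, err := clientcmd.BuildConfigFromFlags("", "/var/lib/kubelet/kubeconfig")
        if err != nil {
            panic(err)
        }
        cs, err := kubernetes.NewForConfig(cfg)
        if err != nil {
            panic(err)
        }
        // Ask the API server: may the current identity create Events in this namespace?
        sar := &authv1.SelfSubjectAccessReview{
            Spec: authv1.SelfSubjectAccessReviewSpec{
                ResourceAttributes: &authv1.ResourceAttributes{
                    Namespace: "openshift-etcd",
                    Verb:      "create",
                    Resource:  "events",
                },
            },
        }
        res, err := cs.AuthorizationV1().SelfSubjectAccessReviews().Create(context.TODO(), sar, metav1.CreateOptions{})
        if err != nil {
            panic(err)
        }
        // With anonymous credentials this reports allowed=false, mirroring the log.
        fmt.Printf("allowed=%v reason=%q\n", res.Status.Allowed, res.Status.Reason)
    }

Once the certificate signing request is approved at 14:03:55 further down, the same requests start succeeding and these rejections stop.
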
Mar 09 14:03:36 crc kubenswrapper[4722]: I0309 14:03:36.089056 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 09 14:03:37 crc kubenswrapper[4722]: I0309 14:03:37.084946 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 09 14:03:38 crc kubenswrapper[4722]: I0309 14:03:38.085023 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 09 14:03:39 crc kubenswrapper[4722]: I0309 14:03:39.087580 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 09 14:03:39 crc kubenswrapper[4722]: I0309 14:03:39.206919 4722 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 14:03:39 crc kubenswrapper[4722]: I0309 14:03:39.207025 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 09 14:03:39 crc kubenswrapper[4722]: I0309 14:03:39.207096 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 09 14:03:39 crc kubenswrapper[4722]: I0309 14:03:39.207318 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 14:03:39 crc kubenswrapper[4722]: I0309 14:03:39.208681 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:03:39 crc kubenswrapper[4722]: I0309 14:03:39.208716 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:03:39 crc kubenswrapper[4722]: I0309 14:03:39.208729 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:03:39 crc kubenswrapper[4722]: I0309 14:03:39.209473 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"1a07bf846a5613ee835b70c1915ee46d1e45185f5ff81096ec2e4dd4238f35f3"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted"
Mar 09 14:03:39 crc kubenswrapper[4722]: I0309 14:03:39.209629 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://1a07bf846a5613ee835b70c1915ee46d1e45185f5ff81096ec2e4dd4238f35f3" gracePeriod=30
pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://1a07bf846a5613ee835b70c1915ee46d1e45185f5ff81096ec2e4dd4238f35f3" gracePeriod=30 Mar 09 14:03:39 crc kubenswrapper[4722]: I0309 14:03:39.439732 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 09 14:03:39 crc kubenswrapper[4722]: I0309 14:03:39.442168 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 09 14:03:39 crc kubenswrapper[4722]: I0309 14:03:39.442965 4722 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="1a07bf846a5613ee835b70c1915ee46d1e45185f5ff81096ec2e4dd4238f35f3" exitCode=255 Mar 09 14:03:39 crc kubenswrapper[4722]: I0309 14:03:39.443062 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"1a07bf846a5613ee835b70c1915ee46d1e45185f5ff81096ec2e4dd4238f35f3"} Mar 09 14:03:39 crc kubenswrapper[4722]: I0309 14:03:39.443170 4722 scope.go:117] "RemoveContainer" containerID="50ecab202a56b3927a1dcdeae44cf5397c4182f1776cd5c3203941d95b498020" Mar 09 14:03:40 crc kubenswrapper[4722]: I0309 14:03:40.085314 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 14:03:40 crc kubenswrapper[4722]: E0309 14:03:40.226708 4722 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 09 14:03:40 crc kubenswrapper[4722]: I0309 14:03:40.448716 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 09 14:03:40 crc kubenswrapper[4722]: I0309 14:03:40.450187 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"68bead726a61890d6212098f420a58ca85c9d97bb2efe57b314aff215c26169c"} Mar 09 14:03:40 crc kubenswrapper[4722]: I0309 14:03:40.450314 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 14:03:40 crc kubenswrapper[4722]: I0309 14:03:40.451624 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:03:40 crc kubenswrapper[4722]: I0309 14:03:40.451664 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:03:40 crc kubenswrapper[4722]: I0309 14:03:40.451673 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:03:41 crc kubenswrapper[4722]: I0309 14:03:41.088126 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API 
group "storage.k8s.io" at the cluster scope Mar 09 14:03:41 crc kubenswrapper[4722]: I0309 14:03:41.453138 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 14:03:41 crc kubenswrapper[4722]: I0309 14:03:41.454888 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:03:41 crc kubenswrapper[4722]: I0309 14:03:41.454935 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:03:41 crc kubenswrapper[4722]: I0309 14:03:41.454947 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:03:42 crc kubenswrapper[4722]: I0309 14:03:42.086339 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 14:03:42 crc kubenswrapper[4722]: E0309 14:03:42.250379 4722 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 09 14:03:42 crc kubenswrapper[4722]: I0309 14:03:42.255591 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 14:03:42 crc kubenswrapper[4722]: I0309 14:03:42.256750 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:03:42 crc kubenswrapper[4722]: I0309 14:03:42.256909 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:03:42 crc kubenswrapper[4722]: I0309 14:03:42.257005 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:03:42 crc kubenswrapper[4722]: I0309 14:03:42.257117 4722 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 14:03:42 crc kubenswrapper[4722]: E0309 14:03:42.264035 4722 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 09 14:03:43 crc kubenswrapper[4722]: I0309 14:03:43.103392 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 14:03:43 crc kubenswrapper[4722]: I0309 14:03:43.611731 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 14:03:43 crc kubenswrapper[4722]: I0309 14:03:43.612027 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 14:03:43 crc kubenswrapper[4722]: I0309 14:03:43.613762 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:03:43 crc kubenswrapper[4722]: I0309 14:03:43.613809 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:03:43 crc kubenswrapper[4722]: I0309 14:03:43.613827 4722 
Mar 09 14:03:43 crc kubenswrapper[4722]: I0309 14:03:43.103392 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 09 14:03:43 crc kubenswrapper[4722]: I0309 14:03:43.611731 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 09 14:03:43 crc kubenswrapper[4722]: I0309 14:03:43.612027 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 14:03:43 crc kubenswrapper[4722]: I0309 14:03:43.613762 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:03:43 crc kubenswrapper[4722]: I0309 14:03:43.613809 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:03:43 crc kubenswrapper[4722]: I0309 14:03:43.613827 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:03:44 crc kubenswrapper[4722]: I0309 14:03:44.083794 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 09 14:03:45 crc kubenswrapper[4722]: I0309 14:03:45.083150 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 09 14:03:46 crc kubenswrapper[4722]: I0309 14:03:46.084785 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 09 14:03:46 crc kubenswrapper[4722]: I0309 14:03:46.148490 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 14:03:46 crc kubenswrapper[4722]: I0309 14:03:46.149752 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:03:46 crc kubenswrapper[4722]: I0309 14:03:46.149805 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:03:46 crc kubenswrapper[4722]: I0309 14:03:46.149819 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:03:46 crc kubenswrapper[4722]: I0309 14:03:46.150627 4722 scope.go:117] "RemoveContainer" containerID="413f1bb85b3d9f04be64e92b097c9bc232a688f0b87a936bdbc7c9c369cb820c"
Mar 09 14:03:46 crc kubenswrapper[4722]: I0309 14:03:46.204947 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 09 14:03:46 crc kubenswrapper[4722]: I0309 14:03:46.205118 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 14:03:46 crc kubenswrapper[4722]: I0309 14:03:46.206338 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:03:46 crc kubenswrapper[4722]: I0309 14:03:46.206383 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:03:46 crc kubenswrapper[4722]: I0309 14:03:46.206395 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:03:46 crc kubenswrapper[4722]: I0309 14:03:46.210078 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 09 14:03:46 crc kubenswrapper[4722]: I0309 14:03:46.467979 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 09 14:03:46 crc kubenswrapper[4722]: I0309 14:03:46.470456 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48"}
Mar 09 14:03:46 crc kubenswrapper[4722]: I0309 14:03:46.470537 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 14:03:46 crc kubenswrapper[4722]: I0309 14:03:46.470669 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 14:03:46 crc kubenswrapper[4722]: I0309 14:03:46.472031 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:03:46 crc kubenswrapper[4722]: I0309 14:03:46.472066 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:03:46 crc kubenswrapper[4722]: I0309 14:03:46.472095 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:03:46 crc kubenswrapper[4722]: I0309 14:03:46.472107 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:03:46 crc kubenswrapper[4722]: I0309 14:03:46.472073 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:03:46 crc kubenswrapper[4722]: I0309 14:03:46.472168 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:03:47 crc kubenswrapper[4722]: I0309 14:03:47.085605 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 09 14:03:47 crc kubenswrapper[4722]: I0309 14:03:47.473954 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log"
Mar 09 14:03:47 crc kubenswrapper[4722]: I0309 14:03:47.474405 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 09 14:03:47 crc kubenswrapper[4722]: I0309 14:03:47.476302 4722 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48" exitCode=255
Mar 09 14:03:47 crc kubenswrapper[4722]: I0309 14:03:47.476345 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48"}
Mar 09 14:03:47 crc kubenswrapper[4722]: I0309 14:03:47.476389 4722 scope.go:117] "RemoveContainer" containerID="413f1bb85b3d9f04be64e92b097c9bc232a688f0b87a936bdbc7c9c369cb820c"
Mar 09 14:03:47 crc kubenswrapper[4722]: I0309 14:03:47.476509 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 09 14:03:47 crc kubenswrapper[4722]: I0309 14:03:47.477517 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:03:47 crc kubenswrapper[4722]: I0309 14:03:47.477549 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:03:47 crc kubenswrapper[4722]: I0309 14:03:47.477561 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
event="NodeHasSufficientPID" Mar 09 14:03:47 crc kubenswrapper[4722]: I0309 14:03:47.478093 4722 scope.go:117] "RemoveContainer" containerID="b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48" Mar 09 14:03:47 crc kubenswrapper[4722]: E0309 14:03:47.478304 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 14:03:48 crc kubenswrapper[4722]: I0309 14:03:48.084938 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 14:03:48 crc kubenswrapper[4722]: I0309 14:03:48.480124 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 09 14:03:49 crc kubenswrapper[4722]: I0309 14:03:49.086539 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 14:03:49 crc kubenswrapper[4722]: E0309 14:03:49.255782 4722 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 09 14:03:49 crc kubenswrapper[4722]: I0309 14:03:49.264988 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 14:03:49 crc kubenswrapper[4722]: I0309 14:03:49.266172 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:03:49 crc kubenswrapper[4722]: I0309 14:03:49.266233 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:03:49 crc kubenswrapper[4722]: I0309 14:03:49.266243 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:03:49 crc kubenswrapper[4722]: I0309 14:03:49.266268 4722 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 14:03:49 crc kubenswrapper[4722]: E0309 14:03:49.269848 4722 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 09 14:03:49 crc kubenswrapper[4722]: I0309 14:03:49.351581 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 14:03:49 crc kubenswrapper[4722]: I0309 14:03:49.351793 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 14:03:49 crc kubenswrapper[4722]: I0309 14:03:49.353064 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:03:49 crc kubenswrapper[4722]: I0309 14:03:49.353106 4722 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:03:49 crc kubenswrapper[4722]: I0309 14:03:49.353117 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:03:49 crc kubenswrapper[4722]: I0309 14:03:49.353761 4722 scope.go:117] "RemoveContainer" containerID="b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48" Mar 09 14:03:49 crc kubenswrapper[4722]: E0309 14:03:49.353966 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 14:03:50 crc kubenswrapper[4722]: I0309 14:03:50.064662 4722 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 09 14:03:50 crc kubenswrapper[4722]: I0309 14:03:50.087097 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 14:03:50 crc kubenswrapper[4722]: I0309 14:03:50.112940 4722 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 09 14:03:50 crc kubenswrapper[4722]: E0309 14:03:50.227750 4722 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 09 14:03:51 crc kubenswrapper[4722]: I0309 14:03:51.086371 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 14:03:52 crc kubenswrapper[4722]: I0309 14:03:52.086638 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 14:03:53 crc kubenswrapper[4722]: I0309 14:03:53.084872 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 14:03:53 crc kubenswrapper[4722]: I0309 14:03:53.618036 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 14:03:53 crc kubenswrapper[4722]: I0309 14:03:53.618260 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 14:03:53 crc kubenswrapper[4722]: I0309 14:03:53.619545 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:03:53 crc kubenswrapper[4722]: I0309 14:03:53.619598 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:03:53 crc kubenswrapper[4722]: I0309 14:03:53.619615 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 09 14:03:54 crc kubenswrapper[4722]: I0309 14:03:54.085607 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 14:03:55 crc kubenswrapper[4722]: I0309 14:03:55.084996 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 09 14:03:55 crc kubenswrapper[4722]: I0309 14:03:55.201324 4722 csr.go:261] certificate signing request csr-t8f6p is approved, waiting to be issued Mar 09 14:03:55 crc kubenswrapper[4722]: I0309 14:03:55.209748 4722 csr.go:257] certificate signing request csr-t8f6p is issued Mar 09 14:03:55 crc kubenswrapper[4722]: I0309 14:03:55.280659 4722 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 09 14:03:55 crc kubenswrapper[4722]: I0309 14:03:55.916739 4722 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 09 14:03:56 crc kubenswrapper[4722]: I0309 14:03:56.148720 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 14:03:56 crc kubenswrapper[4722]: I0309 14:03:56.150302 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:03:56 crc kubenswrapper[4722]: I0309 14:03:56.150376 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:03:56 crc kubenswrapper[4722]: I0309 14:03:56.150395 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:03:56 crc kubenswrapper[4722]: I0309 14:03:56.211239 4722 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-16 08:55:10.475532667 +0000 UTC Mar 09 14:03:56 crc kubenswrapper[4722]: I0309 14:03:56.211304 4722 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6762h51m14.264234416s for next certificate rotation Mar 09 14:03:56 crc kubenswrapper[4722]: I0309 14:03:56.270874 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 14:03:56 crc kubenswrapper[4722]: I0309 14:03:56.272310 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:03:56 crc kubenswrapper[4722]: I0309 14:03:56.272351 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:03:56 crc kubenswrapper[4722]: I0309 14:03:56.272364 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:03:56 crc kubenswrapper[4722]: I0309 14:03:56.272473 4722 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 09 14:03:56 crc kubenswrapper[4722]: I0309 14:03:56.280014 4722 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 09 14:03:56 crc kubenswrapper[4722]: I0309 14:03:56.280492 4722 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 09 14:03:56 crc kubenswrapper[4722]: E0309 14:03:56.280544 4722 
kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 09 14:03:56 crc kubenswrapper[4722]: I0309 14:03:56.283467 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:03:56 crc kubenswrapper[4722]: I0309 14:03:56.283491 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:03:56 crc kubenswrapper[4722]: I0309 14:03:56.283499 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:03:56 crc kubenswrapper[4722]: I0309 14:03:56.283513 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:03:56 crc kubenswrapper[4722]: I0309 14:03:56.283523 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:03:56Z","lastTransitionTime":"2026-03-09T14:03:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:03:56 crc kubenswrapper[4722]: E0309 14:03:56.297697 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:03:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:03:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:03:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:03:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:03:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:03:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:03:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:03:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3321b793-006a-42f5-9c18-7f48ed9bad15\\\",\\\"systemUUID\\\":\\\"70de6d16-940c-46da-be51-a9b50262dda2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 14:03:56 crc kubenswrapper[4722]: I0309 14:03:56.311045 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:03:56 crc kubenswrapper[4722]: I0309 14:03:56.311093 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:03:56 crc kubenswrapper[4722]: I0309 14:03:56.311108 4722 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:03:56 crc kubenswrapper[4722]: I0309 14:03:56.311130 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:03:56 crc kubenswrapper[4722]: I0309 14:03:56.311145 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:03:56Z","lastTransitionTime":"2026-03-09T14:03:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:03:56 crc kubenswrapper[4722]: E0309 14:03:56.327816 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:03:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:03:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:03:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:03:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:03:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:03:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:03:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:03:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3321b793-006a-42f5-9c18-7f48ed9bad15\\\",\\\"systemUUID\\\":\\\"70de6d16-940c-46da-be51-a9b50262dda2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 14:03:56 crc kubenswrapper[4722]: I0309 14:03:56.337115 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:03:56 crc kubenswrapper[4722]: I0309 14:03:56.337175 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:03:56 crc kubenswrapper[4722]: I0309 14:03:56.337187 4722 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:03:56 crc kubenswrapper[4722]: I0309 14:03:56.337223 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:03:56 crc kubenswrapper[4722]: I0309 14:03:56.337236 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:03:56Z","lastTransitionTime":"2026-03-09T14:03:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:03:56 crc kubenswrapper[4722]: E0309 14:03:56.351982 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:03:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:03:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:03:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:03:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:03:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:03:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:03:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:03:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3321b793-006a-42f5-9c18-7f48ed9bad15\\\",\\\"systemUUID\\\":\\\"70de6d16-940c-46da-be51-a9b50262dda2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 14:03:56 crc kubenswrapper[4722]: I0309 14:03:56.362328 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:03:56 crc kubenswrapper[4722]: I0309 14:03:56.362395 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:03:56 crc kubenswrapper[4722]: I0309 14:03:56.362411 4722 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:03:56 crc kubenswrapper[4722]: I0309 14:03:56.362432 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:03:56 crc kubenswrapper[4722]: I0309 14:03:56.362446 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:03:56Z","lastTransitionTime":"2026-03-09T14:03:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:03:56 crc kubenswrapper[4722]: E0309 14:03:56.376020 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:03:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:03:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:03:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:03:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:03:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:03:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:03:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:03:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3321b793-006a-42f5-9c18-7f48ed9bad15\\\",\\\"systemUUID\\\":\\\"70de6d16-940c-46da-be51-a9b50262dda2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 14:03:56 crc kubenswrapper[4722]: E0309 14:03:56.376176 4722 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 14:03:56 crc kubenswrapper[4722]: E0309 14:03:56.376249 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:03:56 crc kubenswrapper[4722]: E0309 14:03:56.476700 4722 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:03:56 crc kubenswrapper[4722]: E0309 14:03:56.577082 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:03:56 crc kubenswrapper[4722]: E0309 14:03:56.678178 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:03:56 crc kubenswrapper[4722]: E0309 14:03:56.779411 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:03:56 crc kubenswrapper[4722]: E0309 14:03:56.880113 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:03:56 crc kubenswrapper[4722]: E0309 14:03:56.981192 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:03:56 crc kubenswrapper[4722]: I0309 14:03:56.994810 4722 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 09 14:03:57 crc kubenswrapper[4722]: E0309 14:03:57.082394 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:03:57 crc kubenswrapper[4722]: E0309 14:03:57.182838 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:03:57 crc kubenswrapper[4722]: E0309 14:03:57.283129 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:03:57 crc kubenswrapper[4722]: E0309 14:03:57.383481 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:03:57 crc kubenswrapper[4722]: I0309 14:03:57.389724 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 14:03:57 crc kubenswrapper[4722]: I0309 14:03:57.390037 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 09 14:03:57 crc kubenswrapper[4722]: I0309 14:03:57.391835 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:03:57 crc kubenswrapper[4722]: I0309 14:03:57.392030 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:03:57 crc kubenswrapper[4722]: I0309 14:03:57.392067 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:03:57 crc kubenswrapper[4722]: I0309 14:03:57.393185 4722 scope.go:117] "RemoveContainer" containerID="b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48" Mar 09 14:03:57 crc kubenswrapper[4722]: E0309 14:03:57.393611 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 14:03:57 crc kubenswrapper[4722]: E0309 14:03:57.483682 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:03:57 crc kubenswrapper[4722]: E0309 14:03:57.584239 4722 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:03:57 crc kubenswrapper[4722]: E0309 14:03:57.685412 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:03:57 crc kubenswrapper[4722]: E0309 14:03:57.786549 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:03:57 crc kubenswrapper[4722]: E0309 14:03:57.887473 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:03:57 crc kubenswrapper[4722]: E0309 14:03:57.988161 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:03:58 crc kubenswrapper[4722]: E0309 14:03:58.089179 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:03:58 crc kubenswrapper[4722]: E0309 14:03:58.189402 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:03:58 crc kubenswrapper[4722]: E0309 14:03:58.289962 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:03:58 crc kubenswrapper[4722]: E0309 14:03:58.390286 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:03:58 crc kubenswrapper[4722]: E0309 14:03:58.491326 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:03:58 crc kubenswrapper[4722]: E0309 14:03:58.591989 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:03:58 crc kubenswrapper[4722]: E0309 14:03:58.693255 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:03:58 crc kubenswrapper[4722]: E0309 14:03:58.794393 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:03:58 crc kubenswrapper[4722]: E0309 14:03:58.895382 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:03:58 crc kubenswrapper[4722]: E0309 14:03:58.995831 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:03:59 crc kubenswrapper[4722]: E0309 14:03:59.097035 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:03:59 crc kubenswrapper[4722]: E0309 14:03:59.197809 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:03:59 crc kubenswrapper[4722]: E0309 14:03:59.298995 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:03:59 crc kubenswrapper[4722]: E0309 14:03:59.399610 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:03:59 crc kubenswrapper[4722]: E0309 14:03:59.499774 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:03:59 crc kubenswrapper[4722]: E0309 14:03:59.600440 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:03:59 crc kubenswrapper[4722]: E0309 
14:03:59.701392 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:03:59 crc kubenswrapper[4722]: E0309 14:03:59.802303 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:03:59 crc kubenswrapper[4722]: E0309 14:03:59.903253 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:04:00 crc kubenswrapper[4722]: E0309 14:04:00.003478 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:04:00 crc kubenswrapper[4722]: E0309 14:04:00.103778 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:04:00 crc kubenswrapper[4722]: E0309 14:04:00.204636 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:04:00 crc kubenswrapper[4722]: E0309 14:04:00.228366 4722 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 09 14:04:00 crc kubenswrapper[4722]: E0309 14:04:00.305579 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:04:00 crc kubenswrapper[4722]: E0309 14:04:00.406185 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:04:00 crc kubenswrapper[4722]: E0309 14:04:00.506838 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:04:00 crc kubenswrapper[4722]: E0309 14:04:00.608004 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:04:00 crc kubenswrapper[4722]: E0309 14:04:00.709185 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:04:00 crc kubenswrapper[4722]: I0309 14:04:00.770666 4722 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 09 14:04:00 crc kubenswrapper[4722]: E0309 14:04:00.809842 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:04:00 crc kubenswrapper[4722]: E0309 14:04:00.910320 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:04:01 crc kubenswrapper[4722]: E0309 14:04:01.010979 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:04:01 crc kubenswrapper[4722]: E0309 14:04:01.112158 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:04:01 crc kubenswrapper[4722]: E0309 14:04:01.212337 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:04:01 crc kubenswrapper[4722]: E0309 14:04:01.313498 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:04:01 crc kubenswrapper[4722]: E0309 14:04:01.414424 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:04:01 crc kubenswrapper[4722]: E0309 14:04:01.515037 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 
09 14:04:01 crc kubenswrapper[4722]: E0309 14:04:01.615290 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:04:01 crc kubenswrapper[4722]: E0309 14:04:01.716192 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:04:01 crc kubenswrapper[4722]: E0309 14:04:01.816721 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:04:01 crc kubenswrapper[4722]: E0309 14:04:01.917874 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:04:02 crc kubenswrapper[4722]: E0309 14:04:02.018014 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:04:02 crc kubenswrapper[4722]: E0309 14:04:02.118345 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:04:02 crc kubenswrapper[4722]: E0309 14:04:02.218615 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:04:02 crc kubenswrapper[4722]: E0309 14:04:02.319711 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:04:02 crc kubenswrapper[4722]: E0309 14:04:02.420895 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:04:02 crc kubenswrapper[4722]: E0309 14:04:02.521767 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:04:02 crc kubenswrapper[4722]: E0309 14:04:02.622451 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:04:02 crc kubenswrapper[4722]: E0309 14:04:02.723162 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:04:02 crc kubenswrapper[4722]: E0309 14:04:02.824298 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:04:02 crc kubenswrapper[4722]: E0309 14:04:02.925037 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:04:03 crc kubenswrapper[4722]: E0309 14:04:03.025530 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:04:03 crc kubenswrapper[4722]: E0309 14:04:03.126550 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:04:03 crc kubenswrapper[4722]: E0309 14:04:03.226874 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:04:03 crc kubenswrapper[4722]: E0309 14:04:03.327712 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:04:03 crc kubenswrapper[4722]: E0309 14:04:03.428809 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:04:03 crc kubenswrapper[4722]: E0309 14:04:03.529690 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:04:03 crc kubenswrapper[4722]: E0309 14:04:03.630610 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" 
not found" Mar 09 14:04:03 crc kubenswrapper[4722]: E0309 14:04:03.730954 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:04:03 crc kubenswrapper[4722]: E0309 14:04:03.831303 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:04:03 crc kubenswrapper[4722]: E0309 14:04:03.932258 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:04:03 crc kubenswrapper[4722]: I0309 14:04:03.992630 4722 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 09 14:04:04 crc kubenswrapper[4722]: E0309 14:04:04.033384 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:04:04 crc kubenswrapper[4722]: E0309 14:04:04.134497 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:04:04 crc kubenswrapper[4722]: E0309 14:04:04.235263 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:04:04 crc kubenswrapper[4722]: E0309 14:04:04.335650 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:04:04 crc kubenswrapper[4722]: E0309 14:04:04.436058 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:04:04 crc kubenswrapper[4722]: E0309 14:04:04.536999 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:04:04 crc kubenswrapper[4722]: E0309 14:04:04.637549 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:04:04 crc kubenswrapper[4722]: E0309 14:04:04.738488 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:04:04 crc kubenswrapper[4722]: E0309 14:04:04.839630 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:04:04 crc kubenswrapper[4722]: E0309 14:04:04.940689 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:04:05 crc kubenswrapper[4722]: E0309 14:04:05.041192 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:04:05 crc kubenswrapper[4722]: E0309 14:04:05.141717 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:04:05 crc kubenswrapper[4722]: E0309 14:04:05.242806 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:04:05 crc kubenswrapper[4722]: E0309 14:04:05.343392 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:04:05 crc kubenswrapper[4722]: E0309 14:04:05.444693 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:04:05 crc kubenswrapper[4722]: E0309 14:04:05.544825 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:04:05 crc kubenswrapper[4722]: E0309 14:04:05.645980 4722 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 09 14:04:05 crc kubenswrapper[4722]: E0309 14:04:05.746639 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:04:05 crc kubenswrapper[4722]: E0309 14:04:05.847604 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:04:05 crc kubenswrapper[4722]: E0309 14:04:05.948622 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:04:06 crc kubenswrapper[4722]: E0309 14:04:06.049030 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:04:06 crc kubenswrapper[4722]: E0309 14:04:06.149190 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:04:06 crc kubenswrapper[4722]: E0309 14:04:06.249724 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:04:06 crc kubenswrapper[4722]: E0309 14:04:06.350378 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:04:06 crc kubenswrapper[4722]: E0309 14:04:06.450615 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:04:06 crc kubenswrapper[4722]: E0309 14:04:06.457782 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 09 14:04:06 crc kubenswrapper[4722]: I0309 14:04:06.462754 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:06 crc kubenswrapper[4722]: I0309 14:04:06.462787 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:06 crc kubenswrapper[4722]: I0309 14:04:06.462798 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:06 crc kubenswrapper[4722]: I0309 14:04:06.462816 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:06 crc kubenswrapper[4722]: I0309 14:04:06.462828 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:06Z","lastTransitionTime":"2026-03-09T14:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:06 crc kubenswrapper[4722]: E0309 14:04:06.476912 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3321b793-006a-42f5-9c18-7f48ed9bad15\\\",\\\"systemUUID\\\":\\\"70de6d16-940c-46da-be51-a9b50262dda2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 14:04:06 crc kubenswrapper[4722]: I0309 14:04:06.483258 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:06 crc kubenswrapper[4722]: I0309 14:04:06.483341 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:06 crc kubenswrapper[4722]: I0309 14:04:06.483354 4722 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:06 crc kubenswrapper[4722]: I0309 14:04:06.483396 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:06 crc kubenswrapper[4722]: I0309 14:04:06.483411 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:06Z","lastTransitionTime":"2026-03-09T14:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:06 crc kubenswrapper[4722]: E0309 14:04:06.497321 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3321b793-006a-42f5-9c18-7f48ed9bad15\\\",\\\"systemUUID\\\":\\\"70de6d16-940c-46da-be51-a9b50262dda2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 14:04:06 crc kubenswrapper[4722]: I0309 14:04:06.502951 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:06 crc kubenswrapper[4722]: I0309 14:04:06.503000 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:06 crc kubenswrapper[4722]: I0309 14:04:06.503010 4722 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:06 crc kubenswrapper[4722]: I0309 14:04:06.503028 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:06 crc kubenswrapper[4722]: I0309 14:04:06.503041 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:06Z","lastTransitionTime":"2026-03-09T14:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:06 crc kubenswrapper[4722]: E0309 14:04:06.517459 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3321b793-006a-42f5-9c18-7f48ed9bad15\\\",\\\"systemUUID\\\":\\\"70de6d16-940c-46da-be51-a9b50262dda2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 14:04:06 crc kubenswrapper[4722]: I0309 14:04:06.522818 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:06 crc kubenswrapper[4722]: I0309 14:04:06.522853 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:06 crc kubenswrapper[4722]: I0309 14:04:06.522863 4722 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:06 crc kubenswrapper[4722]: I0309 14:04:06.522881 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:06 crc kubenswrapper[4722]: I0309 14:04:06.522893 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:06Z","lastTransitionTime":"2026-03-09T14:04:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:06 crc kubenswrapper[4722]: E0309 14:04:06.535930 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3321b793-006a-42f5-9c18-7f48ed9bad15\\\",\\\"systemUUID\\\":\\\"70de6d16-940c-46da-be51-a9b50262dda2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 14:04:06 crc kubenswrapper[4722]: E0309 14:04:06.536083 4722 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 14:04:06 crc kubenswrapper[4722]: E0309 14:04:06.550731 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 09 14:04:06 crc kubenswrapper[4722]: E0309 14:04:06.651895 4722 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 14:04:06 crc kubenswrapper[4722]: E0309 14:04:06.752761 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 14:04:06 crc kubenswrapper[4722]: E0309 14:04:06.853013 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 14:04:06 crc kubenswrapper[4722]: E0309 14:04:06.954262 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 14:04:07 crc kubenswrapper[4722]: E0309 14:04:07.054434 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 14:04:07 crc kubenswrapper[4722]: E0309 14:04:07.155127 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 14:04:07 crc kubenswrapper[4722]: E0309 14:04:07.256328 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 14:04:07 crc kubenswrapper[4722]: E0309 14:04:07.357391 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 14:04:07 crc kubenswrapper[4722]: E0309 14:04:07.458057 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 14:04:07 crc kubenswrapper[4722]: E0309 14:04:07.558705 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 14:04:07 crc kubenswrapper[4722]: E0309 14:04:07.659631 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 09 14:04:07 crc kubenswrapper[4722]: I0309 14:04:07.705963 4722 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 09 14:04:07 crc kubenswrapper[4722]: I0309 14:04:07.762836 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:04:07 crc kubenswrapper[4722]: I0309 14:04:07.762898 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:04:07 crc kubenswrapper[4722]: I0309 14:04:07.762912 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:04:07 crc kubenswrapper[4722]: I0309 14:04:07.762930 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 14:04:07 crc kubenswrapper[4722]: I0309 14:04:07.762942 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:07Z","lastTransitionTime":"2026-03-09T14:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 14:04:07 crc kubenswrapper[4722]: I0309 14:04:07.866036 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:04:07 crc kubenswrapper[4722]: I0309 14:04:07.866102 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:04:07 crc kubenswrapper[4722]: I0309 14:04:07.866125 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:04:07 crc kubenswrapper[4722]: I0309 14:04:07.866158 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 14:04:07 crc kubenswrapper[4722]: I0309 14:04:07.866180 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:07Z","lastTransitionTime":"2026-03-09T14:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 14:04:07 crc kubenswrapper[4722]: I0309 14:04:07.969790 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:04:07 crc kubenswrapper[4722]: I0309 14:04:07.969863 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:04:07 crc kubenswrapper[4722]: I0309 14:04:07.969876 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:04:07 crc kubenswrapper[4722]: I0309 14:04:07.969897 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 14:04:07 crc kubenswrapper[4722]: I0309 14:04:07.969909 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:07Z","lastTransitionTime":"2026-03-09T14:04:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.072159 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.072253 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.072271 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.072293 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.072309 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:08Z","lastTransitionTime":"2026-03-09T14:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.112246 4722 apiserver.go:52] "Watching apiserver"
Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.120251 4722 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.120644 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"]
Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.121264 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.121314 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.121471 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.121587 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 09 14:04:08 crc kubenswrapper[4722]: E0309 14:04:08.121732 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 09 14:04:08 crc kubenswrapper[4722]: E0309 14:04:08.121834 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.122600 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.122818 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 14:04:08 crc kubenswrapper[4722]: E0309 14:04:08.123029 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.124325 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.124334 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.124744 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.124996 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.127881 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.127941 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.128057 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.128193 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.130954 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.166911 4722 scope.go:117] "RemoveContainer" containerID="b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48" Mar 09 14:04:08 crc kubenswrapper[4722]: E0309 14:04:08.167667 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.172153 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.173938 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.176494 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.176543 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.176559 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.176581 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.176597 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:08Z","lastTransitionTime":"2026-03-09T14:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.183645 4722 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.190397 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.206472 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.217216 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.229564 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.242634 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.254712 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.262949 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.263015 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.263134 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.263176 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.263323 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.263773 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.264481 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 09 
14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.264843 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.264964 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.265160 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.265547 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.265681 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.265790 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.265900 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.263639 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.263688 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.264189 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.264700 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.265094 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.265456 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.265943 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.266148 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.266709 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.266189 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.266850 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.266945 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.267009 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.267063 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.267088 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.267111 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.267135 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.267157 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.267179 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.267259 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.267425 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.267518 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.267699 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.267947 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.268020 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.268066 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.268104 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.268130 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.268166 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.268502 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.268527 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.268548 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.268569 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.268590 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.268612 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: 
\"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.268638 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.268659 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.268659 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.268681 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.268703 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.268723 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.268743 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.268763 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.268784 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.268806 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod 
\"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.268826 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.268847 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.268871 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.268894 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.268915 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.268937 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.268962 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.268985 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.269006 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.269029 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod 
\"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.269054 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.269075 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.269096 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.269118 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.269141 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.269163 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.269185 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.269226 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.269249 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.269270 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.269291 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.269311 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.269333 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.269355 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.269376 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.269397 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.269460 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.269481 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.269508 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.269539 4722 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.269567 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.269594 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.269620 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.269650 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.269673 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.269695 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.269718 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.269741 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.269763 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 
14:04:08.269785 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.269806 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.269827 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.269849 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.269872 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.269895 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.269918 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.269939 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.269960 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.269982 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 09 
14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.270005 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.270027 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.270049 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.270071 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.270093 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.270115 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.270138 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.270159 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.270182 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.270231 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: 
\"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.270254 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.270276 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.270297 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.270319 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.270341 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.270363 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.270385 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.270408 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.270525 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.270550 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: 
\"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.270604 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.270690 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.270991 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.271065 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.271095 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.271121 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.271144 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.271195 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.271166 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.271249 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.271272 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.271296 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.271319 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.271380 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.271404 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.271428 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.271517 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.271540 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod 
\"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.271564 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.271586 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.271608 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.271630 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.271657 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.271682 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.271706 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.271730 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.271754 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.271770 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.271778 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.271805 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.271827 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.271852 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.271876 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.271899 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.271922 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.271945 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.271968 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 
14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.271991 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.272016 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.272042 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.272065 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.272089 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.272111 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.272117 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.272134 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.272159 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.272183 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.272228 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.272227 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.272252 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.272279 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.272302 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.272325 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.272348 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.272370 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.272394 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.272427 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.272461 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.272484 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod 
\"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.272509 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.272533 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.272556 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.272580 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.272603 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.272626 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.272650 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.272675 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.272700 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.272723 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.272746 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.272770 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.272794 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.272816 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.272845 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.272867 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.272890 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.272914 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.272938 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.272961 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" 
(UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.272985 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.273010 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.273033 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.273055 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.273080 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.273106 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.273130 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.273154 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.273178 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.273219 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.273246 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.273269 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.273316 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.273344 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.273370 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.273400 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.273425 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.273450 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.273480 4722 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.273507 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.273532 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.273559 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.273583 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.273611 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.273637 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.273660 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.273712 4722 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc 
kubenswrapper[4722]: I0309 14:04:08.273730 4722 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.273745 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.273761 4722 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.273776 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.273790 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.273809 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.273822 4722 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.273835 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.273848 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.273862 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.273875 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.273890 4722 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.273903 4722 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 09 
14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.273918 4722 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.273931 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.273946 4722 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.273961 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.273976 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.273990 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.274004 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.274027 4722 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.274041 4722 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.274056 4722 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.272194 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.272565 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.272645 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.272663 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.272905 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.273108 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.273339 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.273417 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.273529 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.273573 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.275872 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.273759 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.273887 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.273983 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.274184 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.274414 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.274505 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). 
InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.274568 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.274752 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.274772 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.274833 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.275061 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.275311 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.275617 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.275659 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.276121 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.275910 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.276014 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.276438 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.276478 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.276846 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.276849 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.276883 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.277159 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.277882 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.277890 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.277893 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.278186 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: E0309 14:04:08.278273 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 14:04:08.778253612 +0000 UTC m=+89.333822188 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.278282 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.278294 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.278474 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.278684 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.278727 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.278746 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.278763 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). 
InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.278959 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.279248 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.279324 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.280133 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.280403 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.279272 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.281313 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.281400 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.281520 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.281617 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.281770 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.281796 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.281804 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.281985 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.282262 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.282363 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.282458 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.282610 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.282734 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.282975 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.283032 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.283560 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.283660 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.283719 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.283745 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.283826 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.284127 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.284136 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.284144 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.284164 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.284182 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.284219 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.284528 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.284627 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.285173 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.285281 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.285350 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.285473 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.285679 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.286444 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.286483 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.286596 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.286625 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.286610 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.286701 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). 
InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.286812 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.286849 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.287132 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.287265 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.287273 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.287528 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.287570 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.287585 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.287609 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.287625 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:08Z","lastTransitionTime":"2026-03-09T14:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.287786 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.287817 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 14:04:08 crc kubenswrapper[4722]: E0309 14:04:08.288575 4722 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 14:04:08 crc kubenswrapper[4722]: E0309 14:04:08.288655 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 14:04:08.788632378 +0000 UTC m=+89.344201144 (durationBeforeRetry 500ms). 
Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.287786 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.287817 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 09 14:04:08 crc kubenswrapper[4722]: E0309 14:04:08.288575 4722 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 09 14:04:08 crc kubenswrapper[4722]: E0309 14:04:08.288655 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 14:04:08.788632378 +0000 UTC m=+89.344201144 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 09 14:04:08 crc kubenswrapper[4722]: E0309 14:04:08.288703 4722 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 09 14:04:08 crc kubenswrapper[4722]: E0309 14:04:08.288741 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 14:04:08.788733421 +0000 UTC m=+89.344301997 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.289489 4722 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.291387 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.292915 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.293049 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.293278 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.304220 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.304721 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.304460 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 14:04:08 crc kubenswrapper[4722]: E0309 14:04:08.304717 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 09 14:04:08 crc kubenswrapper[4722]: E0309 14:04:08.304916 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 09 14:04:08 crc kubenswrapper[4722]: E0309 14:04:08.304941 4722 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 09 14:04:08 crc kubenswrapper[4722]: E0309 14:04:08.305024 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-09 14:04:08.80499818 +0000 UTC m=+89.360566756 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.305138 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.304708 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.305412 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.305856 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:04:08 crc kubenswrapper[4722]: E0309 14:04:08.306141 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 09 14:04:08 crc kubenswrapper[4722]: E0309 14:04:08.306162 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 09 14:04:08 crc kubenswrapper[4722]: E0309 14:04:08.306179 4722 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 09 14:04:08 crc kubenswrapper[4722]: E0309 14:04:08.306260 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-09 14:04:08.806238385 +0000 UTC m=+89.361806961 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.306084 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.306459 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.306562 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.306722 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.306750 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.306768 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.306908 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
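[annotation] Each failed SetUp above is parked by nestedpendingoperations and retried with exponential backoff: "durationBeforeRetry 500ms" is the first delay, and the "m=+89.36..." timestamps are the kubelet's monotonic uptime at which the retry unblocks. A sketch of the cadence, assuming the usual kubelet volume backoff of start 500ms, double per failure, cap around 2m2s; the constants are assumptions about this build, not read from the log:

```go
// backoff_sketch.go - illustrate the retry delays behind
// "durationBeforeRetry 500ms" in the nestedpendingoperations errors above.
package main

import (
	"fmt"
	"time"
)

const (
	initialDurationBeforeRetry = 500 * time.Millisecond // first delay, as logged
	maxDurationBeforeRetry     = 2*time.Minute + 2*time.Second // assumed cap
)

func main() {
	d := initialDurationBeforeRetry
	for i := 1; i <= 10; i++ {
		fmt.Printf("failure %2d -> wait %v before retry\n", i, d)
		d *= 2
		if d > maxDurationBeforeRetry {
			d = maxDurationBeforeRetry
		}
	}
}
```

Once the referenced ConfigMaps and Secrets become visible to the kubelet, a later retry succeeds and the pod's mounts proceed normally.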
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.306895 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.307000 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.307027 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.307335 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.307936 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.308147 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.308921 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.308554 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.308960 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.309620 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.310009 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.310240 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.310268 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.310744 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.310851 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.310990 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.311087 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.311181 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.311466 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.314521 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.315819 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.315836 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.316096 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.316291 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.321663 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.321882 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.322542 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.323725 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.323847 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.323985 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.324095 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.324125 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.324309 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.324485 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.324388 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.324743 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.324764 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.324847 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.324938 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.325176 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.325271 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.325297 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.325373 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.325436 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.325519 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.325572 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.325602 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.326042 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.326395 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.326987 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.327548 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.327598 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.327647 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.327953 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.327970 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.330703 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.330873 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.330960 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.331024 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.332072 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.342255 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.344565 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.352717 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.374687 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.374761 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.374822 4722 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.374837 4722 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.374852 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.374879 4722 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.374891 4722 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.374903 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.374916 4722 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.374927 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.374939 4722 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.374951 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.374945 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.375007 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.374963 4722 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.375067 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.375084 4722 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.375125 4722 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.375138 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on 
node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.375151 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.375163 4722 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.375174 4722 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.375187 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.375229 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.375242 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.375255 4722 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.375267 4722 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.375279 4722 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.375293 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.375305 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.375319 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.375332 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: 
\"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.375345 4722 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.375358 4722 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.375370 4722 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.375382 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.375395 4722 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.375407 4722 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.375420 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.375432 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.375444 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.375456 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.375468 4722 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.375481 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.375493 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.375507 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.375519 4722 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.375530 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.375542 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.375553 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.375564 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.375576 4722 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.375587 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.375599 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.375612 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.375624 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.375635 4722 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.375646 4722 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" 
DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.375658 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.375670 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.375682 4722 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.375694 4722 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.375706 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.375719 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.375731 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.375745 4722 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.375756 4722 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.375770 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.375782 4722 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.375794 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.375806 4722 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath 
\"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.375818 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.375830 4722 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.375842 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.375854 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.375866 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.375877 4722 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.375889 4722 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.375901 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.375912 4722 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.375925 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.375941 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.375952 4722 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.375964 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: 
\"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.375976 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.375990 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376002 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376013 4722 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376025 4722 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376036 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376047 4722 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376058 4722 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376070 4722 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376081 4722 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376093 4722 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376104 4722 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376116 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: 
\"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376128 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376140 4722 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376152 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376164 4722 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376175 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376187 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376215 4722 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376228 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376241 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376253 4722 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376265 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376277 4722 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376289 4722 reconciler_common.go:293] "Volume detached for 
volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376301 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376313 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376325 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376338 4722 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376349 4722 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376361 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376374 4722 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376387 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376399 4722 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376410 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376423 4722 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376435 4722 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376447 4722 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" 
(UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376457 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376469 4722 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376480 4722 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376492 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376504 4722 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376515 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376527 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376538 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376550 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376564 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376576 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376588 4722 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376600 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376611 4722 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376622 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376633 4722 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376645 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376657 4722 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376668 4722 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376680 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376691 4722 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376702 4722 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376713 4722 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376725 4722 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376736 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376747 4722 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376758 4722 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376770 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376781 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376791 4722 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376802 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376812 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376824 4722 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376835 4722 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376845 4722 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376857 4722 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376869 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376880 4722 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376891 4722 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc 
kubenswrapper[4722]: I0309 14:04:08.376902 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376913 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376925 4722 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376936 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376948 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376959 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376971 4722 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376982 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.376995 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.377008 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.377020 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.377032 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.390259 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:08 crc kubenswrapper[4722]: 
I0309 14:04:08.390300 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.390315 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.390335 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.390348 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:08Z","lastTransitionTime":"2026-03-09T14:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.441405 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.450621 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.461792 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 09 14:04:08 crc kubenswrapper[4722]: W0309 14:04:08.464416 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-34702099cd9c2660abe4cdd265c9fec5b6c55dab82d72829a28d4aa5bc710890 WatchSource:0}: Error finding container 34702099cd9c2660abe4cdd265c9fec5b6c55dab82d72829a28d4aa5bc710890: Status 404 returned error can't find the container with id 34702099cd9c2660abe4cdd265c9fec5b6c55dab82d72829a28d4aa5bc710890 Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.494555 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.494595 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.494605 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.494622 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.494633 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:08Z","lastTransitionTime":"2026-03-09T14:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.536193 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"34702099cd9c2660abe4cdd265c9fec5b6c55dab82d72829a28d4aa5bc710890"} Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.537108 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"3dbdc09e267289a228015d25962e1142eb3c3ed49b41a58d76a39c30837df31b"} Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.538653 4722 scope.go:117] "RemoveContainer" containerID="b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48" Mar 09 14:04:08 crc kubenswrapper[4722]: E0309 14:04:08.538891 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.539147 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"6b136c523cf3ecb6e43aff8fe54b7687b8f18effca2ac29b4f56dd6438996ed1"} Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.597835 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.597874 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.597887 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.597905 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.597918 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:08Z","lastTransitionTime":"2026-03-09T14:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.701131 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.701382 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.701391 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.701409 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.701420 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:08Z","lastTransitionTime":"2026-03-09T14:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.780869 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 14:04:08 crc kubenswrapper[4722]: E0309 14:04:08.781039 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 14:04:09.781023499 +0000 UTC m=+90.336592075 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.804247 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.804286 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.804296 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.804310 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.804322 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:08Z","lastTransitionTime":"2026-03-09T14:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.881259 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.881297 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.881316 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 14:04:08 crc kubenswrapper[4722]: E0309 14:04:08.881454 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 14:04:08 crc kubenswrapper[4722]: E0309 14:04:08.881456 4722 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.881493 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 14:04:08 crc kubenswrapper[4722]: E0309 14:04:08.881490 4722 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 14:04:08 crc kubenswrapper[4722]: E0309 14:04:08.881552 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 14:04:08 crc kubenswrapper[4722]: E0309 14:04:08.881578 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 14:04:08 crc kubenswrapper[4722]: E0309 14:04:08.881592 4722 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 14:04:08 crc kubenswrapper[4722]: E0309 14:04:08.881555 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 14:04:09.881535304 +0000 UTC m=+90.437103940 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 14:04:08 crc kubenswrapper[4722]: E0309 14:04:08.881650 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 14:04:09.881635636 +0000 UTC m=+90.437204232 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 14:04:08 crc kubenswrapper[4722]: E0309 14:04:08.881471 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 14:04:08 crc kubenswrapper[4722]: E0309 14:04:08.881701 4722 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 14:04:08 crc kubenswrapper[4722]: E0309 14:04:08.881665 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-09 14:04:09.881658197 +0000 UTC m=+90.437226773 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 14:04:08 crc kubenswrapper[4722]: E0309 14:04:08.881826 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-09 14:04:09.881796081 +0000 UTC m=+90.437364697 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.906447 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.906493 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.906503 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.906518 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:08 crc kubenswrapper[4722]: I0309 14:04:08.906528 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:08Z","lastTransitionTime":"2026-03-09T14:04:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:09 crc kubenswrapper[4722]: I0309 14:04:09.008569 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:09 crc kubenswrapper[4722]: I0309 14:04:09.008601 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:09 crc kubenswrapper[4722]: I0309 14:04:09.008610 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:09 crc kubenswrapper[4722]: I0309 14:04:09.008633 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:09 crc kubenswrapper[4722]: I0309 14:04:09.008642 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:09Z","lastTransitionTime":"2026-03-09T14:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:09 crc kubenswrapper[4722]: I0309 14:04:09.111025 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:09 crc kubenswrapper[4722]: I0309 14:04:09.111063 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:09 crc kubenswrapper[4722]: I0309 14:04:09.111074 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:09 crc kubenswrapper[4722]: I0309 14:04:09.111092 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:09 crc kubenswrapper[4722]: I0309 14:04:09.111104 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:09Z","lastTransitionTime":"2026-03-09T14:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:09 crc kubenswrapper[4722]: I0309 14:04:09.213915 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:09 crc kubenswrapper[4722]: I0309 14:04:09.213947 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:09 crc kubenswrapper[4722]: I0309 14:04:09.213956 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:09 crc kubenswrapper[4722]: I0309 14:04:09.213969 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:09 crc kubenswrapper[4722]: I0309 14:04:09.213978 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:09Z","lastTransitionTime":"2026-03-09T14:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:09 crc kubenswrapper[4722]: I0309 14:04:09.316153 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:09 crc kubenswrapper[4722]: I0309 14:04:09.316194 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:09 crc kubenswrapper[4722]: I0309 14:04:09.316229 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:09 crc kubenswrapper[4722]: I0309 14:04:09.316248 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:09 crc kubenswrapper[4722]: I0309 14:04:09.316258 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:09Z","lastTransitionTime":"2026-03-09T14:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:09 crc kubenswrapper[4722]: I0309 14:04:09.418970 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:09 crc kubenswrapper[4722]: I0309 14:04:09.419048 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:09 crc kubenswrapper[4722]: I0309 14:04:09.419066 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:09 crc kubenswrapper[4722]: I0309 14:04:09.419091 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:09 crc kubenswrapper[4722]: I0309 14:04:09.419108 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:09Z","lastTransitionTime":"2026-03-09T14:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:09 crc kubenswrapper[4722]: I0309 14:04:09.522274 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:09 crc kubenswrapper[4722]: I0309 14:04:09.522559 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:09 crc kubenswrapper[4722]: I0309 14:04:09.522708 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:09 crc kubenswrapper[4722]: I0309 14:04:09.522816 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:09 crc kubenswrapper[4722]: I0309 14:04:09.522918 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:09Z","lastTransitionTime":"2026-03-09T14:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:09 crc kubenswrapper[4722]: I0309 14:04:09.542033 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"d69359acfc7a4ee7698088a7e0062b3604a5ceb92910cd3b3d69b1a6291dfdcf"} Mar 09 14:04:09 crc kubenswrapper[4722]: I0309 14:04:09.542087 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"687646beb4c6dd3fb9b659a36449a573f4c389b3e98b07127b18516fdc4f2c93"} Mar 09 14:04:09 crc kubenswrapper[4722]: I0309 14:04:09.543012 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"ee2ed55248b177bc0873ef3c41517401fdb30fc5ce5a545a1ca8c925e4d4e9e8"} Mar 09 14:04:09 crc kubenswrapper[4722]: I0309 14:04:09.554814 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d69359acfc7a4ee7698088a7e0062b3604a5ceb92910cd3b3d69b1a6291dfdcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://687646beb4c6dd3fb9b659a36449a573f4c389b3e98b07127b18516fdc4f2c93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\
"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 14:04:09 crc kubenswrapper[4722]: I0309 14:04:09.567803 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 14:04:09 crc kubenswrapper[4722]: I0309 14:04:09.579855 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0965e24b-526e-4842-ac1f-eca7a765355d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56c76a5e8718243a17d8d9fd9a8dcf6083c2ccb65b4816926dca08a89e465728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3836660bf646052991965270cd69b8b8237b5c4bd698a73789b050d23e6877\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dbef38ff5e714c92fc8a86f1e3c64adf606cb198145f884a000f8dee200fcf2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T14:03:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 14:03:46.681607 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 14:03:46.681849 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 14:03:46.682597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2315154617/tls.crt::/tmp/serving-cert-2315154617/tls.key\\\\\\\"\\\\nI0309 14:03:46.866095 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 14:03:46.869876 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 14:03:46.869896 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 14:03:46.870280 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 14:03:46.870293 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 14:03:46.877096 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 14:03:46.877117 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 14:03:46.877121 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877135 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 14:03:46.877139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 14:03:46.877143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 14:03:46.877146 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 14:03:46.878419 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:03:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fbabc5ec2c100150ce886c06c3bd78cc5896c8a06ac9d262eeb552a8a8bb5d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 14:04:09 crc kubenswrapper[4722]: I0309 14:04:09.589003 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 14:04:09 crc kubenswrapper[4722]: I0309 14:04:09.599243 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 14:04:09 crc kubenswrapper[4722]: I0309 14:04:09.615502 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 09 14:04:09 crc kubenswrapper[4722]: I0309 14:04:09.624682 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:04:09 crc kubenswrapper[4722]: I0309 14:04:09.624744 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:04:09 crc kubenswrapper[4722]: I0309 14:04:09.624754 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:04:09 crc kubenswrapper[4722]: I0309 14:04:09.624769 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 14:04:09 crc kubenswrapper[4722]: I0309 14:04:09.624778 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:09Z","lastTransitionTime":"2026-03-09T14:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:09 crc kubenswrapper[4722]: I0309 14:04:09.627744 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 14:04:09 crc kubenswrapper[4722]: I0309 14:04:09.643188 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0965e24b-526e-4842-ac1f-eca7a765355d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56c76a5e8718243a17d8d9fd9a8dcf6083c2ccb65b4816926dca08a89e465728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3836660bf646052991965270cd69b8b8237b5c4bd698a73789b050d23e6877\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dbef38ff5e714c92fc8a86f1e3c64adf606cb198145f884a000f8dee200fcf2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T14:03:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 14:03:46.681607 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 14:03:46.681849 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 14:03:46.682597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2315154617/tls.crt::/tmp/serving-cert-2315154617/tls.key\\\\\\\"\\\\nI0309 14:03:46.866095 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 14:03:46.869876 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 14:03:46.869896 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 14:03:46.870280 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 14:03:46.870293 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 14:03:46.877096 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 14:03:46.877117 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 14:03:46.877121 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877135 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 14:03:46.877139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 14:03:46.877143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 14:03:46.877146 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 14:03:46.878419 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:03:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fbabc5ec2c100150ce886c06c3bd78cc5896c8a06ac9d262eeb552a8a8bb5d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 14:04:09 crc kubenswrapper[4722]: I0309 14:04:09.656810 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 14:04:09 crc kubenswrapper[4722]: I0309 14:04:09.671869 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d69359acfc7a4ee7698088a7e0062b3604a5ceb92910cd3b3d69b1a6291dfdcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://687646beb4c6dd3fb9b659a36449a573f4c389b3e98b07127b18516fdc4f2c93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b1
54edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 14:04:09 crc kubenswrapper[4722]: I0309 14:04:09.680954 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 14:04:09 crc kubenswrapper[4722]: I0309 14:04:09.690717 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee2ed55248b177bc0873ef3c41517401fdb30fc5ce5a545a1ca8c925e4d4e9e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 14:04:09 crc kubenswrapper[4722]: I0309 14:04:09.698844 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 14:04:09 crc kubenswrapper[4722]: I0309 14:04:09.707641 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 09 14:04:09 crc kubenswrapper[4722]: I0309 14:04:09.727904 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:09 crc kubenswrapper[4722]: I0309 14:04:09.727960 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:09 crc kubenswrapper[4722]: I0309 14:04:09.727973 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:09 crc kubenswrapper[4722]: I0309 14:04:09.727993 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:09 crc kubenswrapper[4722]: I0309 14:04:09.728006 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:09Z","lastTransitionTime":"2026-03-09T14:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:09 crc kubenswrapper[4722]: I0309 14:04:09.789556 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 14:04:09 crc kubenswrapper[4722]: E0309 14:04:09.789710 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 14:04:11.789684199 +0000 UTC m=+92.345252785 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 14:04:09 crc kubenswrapper[4722]: I0309 14:04:09.830748 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:09 crc kubenswrapper[4722]: I0309 14:04:09.830794 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:09 crc kubenswrapper[4722]: I0309 14:04:09.830806 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:09 crc kubenswrapper[4722]: I0309 14:04:09.830824 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:09 crc kubenswrapper[4722]: I0309 14:04:09.830836 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:09Z","lastTransitionTime":"2026-03-09T14:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:09 crc kubenswrapper[4722]: I0309 14:04:09.890685 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 14:04:09 crc kubenswrapper[4722]: I0309 14:04:09.890737 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 14:04:09 crc kubenswrapper[4722]: I0309 14:04:09.890764 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 14:04:09 crc kubenswrapper[4722]: I0309 14:04:09.890793 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 14:04:09 crc kubenswrapper[4722]: E0309 14:04:09.890885 4722 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Mar 09 14:04:09 crc kubenswrapper[4722]: E0309 14:04:09.890901 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 14:04:09 crc kubenswrapper[4722]: E0309 14:04:09.890922 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 14:04:09 crc kubenswrapper[4722]: E0309 14:04:09.890928 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 14:04:09 crc kubenswrapper[4722]: E0309 14:04:09.890985 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 14:04:09 crc kubenswrapper[4722]: E0309 14:04:09.891002 4722 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 14:04:09 crc kubenswrapper[4722]: E0309 14:04:09.891021 4722 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 14:04:09 crc kubenswrapper[4722]: E0309 14:04:09.891004 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 14:04:11.890958215 +0000 UTC m=+92.446526811 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 14:04:09 crc kubenswrapper[4722]: E0309 14:04:09.891142 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-09 14:04:11.891123679 +0000 UTC m=+92.446692275 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 14:04:09 crc kubenswrapper[4722]: E0309 14:04:09.890935 4722 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 14:04:09 crc kubenswrapper[4722]: E0309 14:04:09.891180 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 14:04:11.8911593 +0000 UTC m=+92.446727876 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 14:04:09 crc kubenswrapper[4722]: E0309 14:04:09.891261 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-09 14:04:11.891246893 +0000 UTC m=+92.446815559 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 14:04:09 crc kubenswrapper[4722]: I0309 14:04:09.933438 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:09 crc kubenswrapper[4722]: I0309 14:04:09.933502 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:09 crc kubenswrapper[4722]: I0309 14:04:09.933515 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:09 crc kubenswrapper[4722]: I0309 14:04:09.933532 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:09 crc kubenswrapper[4722]: I0309 14:04:09.933543 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:09Z","lastTransitionTime":"2026-03-09T14:04:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.036774 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.036819 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.036829 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.036844 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.036856 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:10Z","lastTransitionTime":"2026-03-09T14:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.139774 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.139877 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.139891 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.139912 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.139924 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:10Z","lastTransitionTime":"2026-03-09T14:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.148649 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 14:04:10 crc kubenswrapper[4722]: E0309 14:04:10.148835 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.148867 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 14:04:10 crc kubenswrapper[4722]: E0309 14:04:10.149044 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.150149 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 14:04:10 crc kubenswrapper[4722]: E0309 14:04:10.150295 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.154014 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.155362 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.158082 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.159315 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.160755 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.161448 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.162199 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.163903 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.164223 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:10Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.164772 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.166328 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.166959 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.168457 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.169162 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.169862 4722 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.171072 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.171785 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.173085 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.173666 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.174424 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.175898 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.176571 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.177851 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.178462 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.180035 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.180748 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.181695 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.182190 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:10Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.183157 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.183963 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.185383 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.186078 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.187193 4722 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" 
path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.187352 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.189536 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.190778 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.191496 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.193469 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.194400 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.195601 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.195682 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee2ed55248b177bc0873ef3c41517401fdb30fc5ce5a545a1ca8c925e4d4e9e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:10Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.196563 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.197895 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.198564 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.199836 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.200793 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.202062 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" 
path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.202783 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.203963 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.204728 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.206325 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.207036 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.208237 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.208932 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.209225 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:10Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.210281 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.211026 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.211808 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.212958 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.223608 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d69359acfc7a4ee7698088a7e0062b3604a5ceb92910cd3b3d69b1a6291dfdcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://687646beb4c6dd3fb9b659a36449a573f4c389b3e98b07127b18516fdc4f2c93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:10Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.238377 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:10Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.241459 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.241492 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.241504 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.241523 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.241538 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:10Z","lastTransitionTime":"2026-03-09T14:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.257867 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0965e24b-526e-4842-ac1f-eca7a765355d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56c76a5e8718243a17d8d9fd9a8dcf6083c2ccb65b4816926dca08a89e465728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3836660bf646052991965270cd69b8b8237b5c4bd698a73789b050d23e6877\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dbef38ff5e714c92fc8a86f1e3c64adf606cb198145f884a000f8dee200fcf2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T14:03:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 14:03:46.681607 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 14:03:46.681849 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 14:03:46.682597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2315154617/tls.crt::/tmp/serving-cert-2315154617/tls.key\\\\\\\"\\\\nI0309 14:03:46.866095 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 14:03:46.869876 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 14:03:46.869896 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 14:03:46.870280 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 14:03:46.870293 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 14:03:46.877096 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 14:03:46.877117 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 14:03:46.877121 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877135 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 14:03:46.877139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 14:03:46.877143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 14:03:46.877146 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 14:03:46.878419 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:03:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fbabc5ec2c100150ce886c06c3bd78cc5896c8a06ac9d262eeb552a8a8bb5d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:10Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.344818 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.345550 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.345689 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.345766 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.345824 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:10Z","lastTransitionTime":"2026-03-09T14:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.448392 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.448751 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.448840 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.448941 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.449013 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:10Z","lastTransitionTime":"2026-03-09T14:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.551167 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.551197 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.551222 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.551239 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.551250 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:10Z","lastTransitionTime":"2026-03-09T14:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.654640 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.654724 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.654743 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.654769 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.654788 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:10Z","lastTransitionTime":"2026-03-09T14:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.758422 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.758508 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.758531 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.758562 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.758584 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:10Z","lastTransitionTime":"2026-03-09T14:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.861522 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.861591 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.861606 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.861627 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.861640 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:10Z","lastTransitionTime":"2026-03-09T14:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.964742 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.964842 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.964859 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.964884 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:10 crc kubenswrapper[4722]: I0309 14:04:10.964902 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:10Z","lastTransitionTime":"2026-03-09T14:04:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:11 crc kubenswrapper[4722]: I0309 14:04:11.067503 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:11 crc kubenswrapper[4722]: I0309 14:04:11.067559 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:11 crc kubenswrapper[4722]: I0309 14:04:11.067577 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:11 crc kubenswrapper[4722]: I0309 14:04:11.067601 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:11 crc kubenswrapper[4722]: I0309 14:04:11.067618 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:11Z","lastTransitionTime":"2026-03-09T14:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:11 crc kubenswrapper[4722]: I0309 14:04:11.169747 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:11 crc kubenswrapper[4722]: I0309 14:04:11.169786 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:11 crc kubenswrapper[4722]: I0309 14:04:11.169797 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:11 crc kubenswrapper[4722]: I0309 14:04:11.169811 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:11 crc kubenswrapper[4722]: I0309 14:04:11.169859 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:11Z","lastTransitionTime":"2026-03-09T14:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:11 crc kubenswrapper[4722]: I0309 14:04:11.271627 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:11 crc kubenswrapper[4722]: I0309 14:04:11.271846 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:11 crc kubenswrapper[4722]: I0309 14:04:11.271907 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:11 crc kubenswrapper[4722]: I0309 14:04:11.271967 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:11 crc kubenswrapper[4722]: I0309 14:04:11.272053 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:11Z","lastTransitionTime":"2026-03-09T14:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:11 crc kubenswrapper[4722]: I0309 14:04:11.375328 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:11 crc kubenswrapper[4722]: I0309 14:04:11.375874 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:11 crc kubenswrapper[4722]: I0309 14:04:11.375952 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:11 crc kubenswrapper[4722]: I0309 14:04:11.376036 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:11 crc kubenswrapper[4722]: I0309 14:04:11.376109 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:11Z","lastTransitionTime":"2026-03-09T14:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:11 crc kubenswrapper[4722]: I0309 14:04:11.478013 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:11 crc kubenswrapper[4722]: I0309 14:04:11.478282 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:11 crc kubenswrapper[4722]: I0309 14:04:11.478370 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:11 crc kubenswrapper[4722]: I0309 14:04:11.478437 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:11 crc kubenswrapper[4722]: I0309 14:04:11.478494 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:11Z","lastTransitionTime":"2026-03-09T14:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:11 crc kubenswrapper[4722]: I0309 14:04:11.551373 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"69e74824fe242fd2d54bed9734f33823697d7651043d9c97bac6ea5490625a0a"} Mar 09 14:04:11 crc kubenswrapper[4722]: I0309 14:04:11.563464 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69e74824fe242fd2d54bed9734f33823697d7651043d9c97bac6ea5490625a0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:11Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:11 crc kubenswrapper[4722]: I0309 14:04:11.580819 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee2ed55248b177bc0873ef3c41517401fdb30fc5ce5a545a1ca8c925e4d4e9e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:11Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:11 crc kubenswrapper[4722]: I0309 14:04:11.581521 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:11 crc kubenswrapper[4722]: I0309 14:04:11.581575 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:11 crc kubenswrapper[4722]: I0309 14:04:11.581587 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:11 crc kubenswrapper[4722]: I0309 14:04:11.581603 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:11 crc kubenswrapper[4722]: I0309 14:04:11.581613 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:11Z","lastTransitionTime":"2026-03-09T14:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:11 crc kubenswrapper[4722]: I0309 14:04:11.591390 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:11Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:11 crc kubenswrapper[4722]: I0309 14:04:11.610991 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d69359acfc7a4ee7698088a7e0062b3604a5ceb92910cd3b3d69b1a6291dfdcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://687646beb4c6dd3fb9b659a36449a573f4c389b3e98b07127b18516fdc4f2c93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:11Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:11 crc kubenswrapper[4722]: I0309 14:04:11.624028 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:11Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:11 crc kubenswrapper[4722]: I0309 14:04:11.640594 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2267a4d7-87b4-43f7-80c7-7a7ffb0327fc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04169d9b8359e26cc9807e27fb9a5d54ca0946aaade294c5bcc272a7d21c4db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09fa62e7293a50fb3f6bdaf37061cbe5e95a830c8f7337660ce73138dbcaa641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31ba786781fcdb781f5fa45276e29e94756863d2398109d91e81a52f6f88bee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dff721159b382bcfdde6f406a328ef2ee9264b
35a8d8baeb7a3c6d31a60ff3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63769aca40f721bec0716427145a3ceea0595836c03d31c98f6d45ff14de68be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6625c60c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6625c60c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:11Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:11 crc kubenswrapper[4722]: I0309 14:04:11.652218 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0965e24b-526e-4842-ac1f-eca7a765355d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56c76a5e8718243a17d8d9fd9a8dcf6083c2ccb65b4816926dca08a89e465728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3836660bf646052991965270cd69b8b8237b5c4bd698a73789b050d23e6877\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dbef38ff5e714c92fc8a86f1e3c64adf606cb198145f884a000f8dee200fcf2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T14:03:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 14:03:46.681607 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 14:03:46.681849 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 14:03:46.682597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2315154617/tls.crt::/tmp/serving-cert-2315154617/tls.key\\\\\\\"\\\\nI0309 14:03:46.866095 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 14:03:46.869876 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 14:03:46.869896 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 14:03:46.870280 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 14:03:46.870293 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 14:03:46.877096 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 14:03:46.877117 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 14:03:46.877121 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877135 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 14:03:46.877139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 14:03:46.877143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 14:03:46.877146 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 14:03:46.878419 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:03:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fbabc5ec2c100150ce886c06c3bd78cc5896c8a06ac9d262eeb552a8a8bb5d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:11Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:11 crc kubenswrapper[4722]: I0309 14:04:11.663942 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:11Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:11 crc kubenswrapper[4722]: I0309 14:04:11.683634 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:11 crc kubenswrapper[4722]: I0309 14:04:11.683689 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:11 crc kubenswrapper[4722]: I0309 14:04:11.683701 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:11 crc kubenswrapper[4722]: I0309 14:04:11.683721 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:11 crc kubenswrapper[4722]: I0309 14:04:11.683737 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:11Z","lastTransitionTime":"2026-03-09T14:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:11 crc kubenswrapper[4722]: I0309 14:04:11.785860 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:11 crc kubenswrapper[4722]: I0309 14:04:11.785891 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:11 crc kubenswrapper[4722]: I0309 14:04:11.785899 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:11 crc kubenswrapper[4722]: I0309 14:04:11.785912 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:11 crc kubenswrapper[4722]: I0309 14:04:11.785921 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:11Z","lastTransitionTime":"2026-03-09T14:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:11 crc kubenswrapper[4722]: I0309 14:04:11.807712 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 14:04:11 crc kubenswrapper[4722]: E0309 14:04:11.807902 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 14:04:15.807873534 +0000 UTC m=+96.363442110 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 14:04:11 crc kubenswrapper[4722]: I0309 14:04:11.888822 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:04:11 crc kubenswrapper[4722]: I0309 14:04:11.888880 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:04:11 crc kubenswrapper[4722]: I0309 14:04:11.888891 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:04:11 crc kubenswrapper[4722]: I0309 14:04:11.888906 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 14:04:11 crc kubenswrapper[4722]: I0309 14:04:11.888916 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:11Z","lastTransitionTime":"2026-03-09T14:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 14:04:11 crc kubenswrapper[4722]: I0309 14:04:11.909317 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 09 14:04:11 crc kubenswrapper[4722]: I0309 14:04:11.909360 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 14:04:11 crc kubenswrapper[4722]: I0309 14:04:11.909378 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 14:04:11 crc kubenswrapper[4722]: I0309 14:04:11.909394 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 14:04:11 crc kubenswrapper[4722]: E0309 14:04:11.909503 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 09 14:04:11 crc kubenswrapper[4722]: E0309 14:04:11.909519 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 09 14:04:11 crc kubenswrapper[4722]: E0309 14:04:11.909530 4722 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 09 14:04:11 crc kubenswrapper[4722]: E0309 14:04:11.909576 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-09 14:04:15.90956264 +0000 UTC m=+96.465131226 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 09 14:04:11 crc kubenswrapper[4722]: E0309 14:04:11.909877 4722 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 09 14:04:11 crc kubenswrapper[4722]: E0309 14:04:11.909937 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 14:04:15.909929951 +0000 UTC m=+96.465498527 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 09 14:04:11 crc kubenswrapper[4722]: E0309 14:04:11.909963 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 09 14:04:11 crc kubenswrapper[4722]: E0309 14:04:11.909984 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 09 14:04:11 crc kubenswrapper[4722]: E0309 14:04:11.909995 4722 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 09 14:04:11 crc kubenswrapper[4722]: E0309 14:04:11.910040 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-09 14:04:15.910029173 +0000 UTC m=+96.465597749 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 09 14:04:11 crc kubenswrapper[4722]: E0309 14:04:11.910411 4722 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 09 14:04:11 crc kubenswrapper[4722]: E0309 14:04:11.910625 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 14:04:15.910595679 +0000 UTC m=+96.466164325 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 09 14:04:11 crc kubenswrapper[4722]: I0309 14:04:11.991320 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:04:11 crc kubenswrapper[4722]: I0309 14:04:11.991369 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:04:11 crc kubenswrapper[4722]: I0309 14:04:11.991382 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:04:11 crc kubenswrapper[4722]: I0309 14:04:11.991401 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 14:04:11 crc kubenswrapper[4722]: I0309 14:04:11.991414 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:11Z","lastTransitionTime":"2026-03-09T14:04:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 14:04:12 crc kubenswrapper[4722]: I0309 14:04:12.093718 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:04:12 crc kubenswrapper[4722]: I0309 14:04:12.094068 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:04:12 crc kubenswrapper[4722]: I0309 14:04:12.094341 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:04:12 crc kubenswrapper[4722]: I0309 14:04:12.094534 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 14:04:12 crc kubenswrapper[4722]: I0309 14:04:12.094684 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:12Z","lastTransitionTime":"2026-03-09T14:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 14:04:12 crc kubenswrapper[4722]: I0309 14:04:12.149150 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 09 14:04:12 crc kubenswrapper[4722]: I0309 14:04:12.149254 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 14:04:12 crc kubenswrapper[4722]: E0309 14:04:12.149376 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 09 14:04:12 crc kubenswrapper[4722]: I0309 14:04:12.149398 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 14:04:12 crc kubenswrapper[4722]: E0309 14:04:12.149582 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 09 14:04:12 crc kubenswrapper[4722]: E0309 14:04:12.149767 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 09 14:04:12 crc kubenswrapper[4722]: I0309 14:04:12.197671 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:04:12 crc kubenswrapper[4722]: I0309 14:04:12.198023 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:04:12 crc kubenswrapper[4722]: I0309 14:04:12.198172 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:04:12 crc kubenswrapper[4722]: I0309 14:04:12.198399 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 14:04:12 crc kubenswrapper[4722]: I0309 14:04:12.198566 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:12Z","lastTransitionTime":"2026-03-09T14:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 14:04:12 crc kubenswrapper[4722]: I0309 14:04:12.300945 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:04:12 crc kubenswrapper[4722]: I0309 14:04:12.300989 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:04:12 crc kubenswrapper[4722]: I0309 14:04:12.301003 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:04:12 crc kubenswrapper[4722]: I0309 14:04:12.301023 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 14:04:12 crc kubenswrapper[4722]: I0309 14:04:12.301038 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:12Z","lastTransitionTime":"2026-03-09T14:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 14:04:12 crc kubenswrapper[4722]: I0309 14:04:12.403546 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:04:12 crc kubenswrapper[4722]: I0309 14:04:12.403582 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:04:12 crc kubenswrapper[4722]: I0309 14:04:12.403591 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:04:12 crc kubenswrapper[4722]: I0309 14:04:12.403609 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 14:04:12 crc kubenswrapper[4722]: I0309 14:04:12.403619 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:12Z","lastTransitionTime":"2026-03-09T14:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 14:04:12 crc kubenswrapper[4722]: I0309 14:04:12.506827 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:04:12 crc kubenswrapper[4722]: I0309 14:04:12.506876 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:04:12 crc kubenswrapper[4722]: I0309 14:04:12.506911 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:04:12 crc kubenswrapper[4722]: I0309 14:04:12.506930 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 14:04:12 crc kubenswrapper[4722]: I0309 14:04:12.506943 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:12Z","lastTransitionTime":"2026-03-09T14:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 14:04:12 crc kubenswrapper[4722]: I0309 14:04:12.609925 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:04:12 crc kubenswrapper[4722]: I0309 14:04:12.609954 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:04:12 crc kubenswrapper[4722]: I0309 14:04:12.609962 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:04:12 crc kubenswrapper[4722]: I0309 14:04:12.609975 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 14:04:12 crc kubenswrapper[4722]: I0309 14:04:12.609983 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:12Z","lastTransitionTime":"2026-03-09T14:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 14:04:12 crc kubenswrapper[4722]: I0309 14:04:12.712643 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:04:12 crc kubenswrapper[4722]: I0309 14:04:12.712714 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:04:12 crc kubenswrapper[4722]: I0309 14:04:12.712724 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:04:12 crc kubenswrapper[4722]: I0309 14:04:12.712738 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 14:04:12 crc kubenswrapper[4722]: I0309 14:04:12.712750 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:12Z","lastTransitionTime":"2026-03-09T14:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 14:04:12 crc kubenswrapper[4722]: I0309 14:04:12.815577 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:04:12 crc kubenswrapper[4722]: I0309 14:04:12.815643 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:04:12 crc kubenswrapper[4722]: I0309 14:04:12.815661 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:04:12 crc kubenswrapper[4722]: I0309 14:04:12.815689 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 14:04:12 crc kubenswrapper[4722]: I0309 14:04:12.815707 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:12Z","lastTransitionTime":"2026-03-09T14:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 14:04:12 crc kubenswrapper[4722]: I0309 14:04:12.919478 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:04:12 crc kubenswrapper[4722]: I0309 14:04:12.919565 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:04:12 crc kubenswrapper[4722]: I0309 14:04:12.919582 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:04:12 crc kubenswrapper[4722]: I0309 14:04:12.919601 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 14:04:12 crc kubenswrapper[4722]: I0309 14:04:12.919615 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:12Z","lastTransitionTime":"2026-03-09T14:04:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 14:04:13 crc kubenswrapper[4722]: I0309 14:04:13.021825 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:04:13 crc kubenswrapper[4722]: I0309 14:04:13.021894 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:04:13 crc kubenswrapper[4722]: I0309 14:04:13.021911 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:04:13 crc kubenswrapper[4722]: I0309 14:04:13.021940 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 14:04:13 crc kubenswrapper[4722]: I0309 14:04:13.021958 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:13Z","lastTransitionTime":"2026-03-09T14:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 14:04:13 crc kubenswrapper[4722]: I0309 14:04:13.123819 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:04:13 crc kubenswrapper[4722]: I0309 14:04:13.123874 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:04:13 crc kubenswrapper[4722]: I0309 14:04:13.123885 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:04:13 crc kubenswrapper[4722]: I0309 14:04:13.123899 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 14:04:13 crc kubenswrapper[4722]: I0309 14:04:13.123908 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:13Z","lastTransitionTime":"2026-03-09T14:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 14:04:13 crc kubenswrapper[4722]: I0309 14:04:13.226164 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:04:13 crc kubenswrapper[4722]: I0309 14:04:13.226237 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:04:13 crc kubenswrapper[4722]: I0309 14:04:13.226250 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:04:13 crc kubenswrapper[4722]: I0309 14:04:13.226265 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 14:04:13 crc kubenswrapper[4722]: I0309 14:04:13.226275 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:13Z","lastTransitionTime":"2026-03-09T14:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 14:04:13 crc kubenswrapper[4722]: I0309 14:04:13.328477 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:04:13 crc kubenswrapper[4722]: I0309 14:04:13.328517 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:04:13 crc kubenswrapper[4722]: I0309 14:04:13.328526 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:04:13 crc kubenswrapper[4722]: I0309 14:04:13.328539 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 14:04:13 crc kubenswrapper[4722]: I0309 14:04:13.328550 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:13Z","lastTransitionTime":"2026-03-09T14:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 14:04:13 crc kubenswrapper[4722]: I0309 14:04:13.430438 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:04:13 crc kubenswrapper[4722]: I0309 14:04:13.430480 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:04:13 crc kubenswrapper[4722]: I0309 14:04:13.430491 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:04:13 crc kubenswrapper[4722]: I0309 14:04:13.430512 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 14:04:13 crc kubenswrapper[4722]: I0309 14:04:13.430523 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:13Z","lastTransitionTime":"2026-03-09T14:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 14:04:13 crc kubenswrapper[4722]: I0309 14:04:13.532836 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:04:13 crc kubenswrapper[4722]: I0309 14:04:13.532878 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:04:13 crc kubenswrapper[4722]: I0309 14:04:13.532887 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:04:13 crc kubenswrapper[4722]: I0309 14:04:13.532901 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 14:04:13 crc kubenswrapper[4722]: I0309 14:04:13.532909 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:13Z","lastTransitionTime":"2026-03-09T14:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 14:04:13 crc kubenswrapper[4722]: I0309 14:04:13.634487 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:04:13 crc kubenswrapper[4722]: I0309 14:04:13.634520 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:04:13 crc kubenswrapper[4722]: I0309 14:04:13.634529 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:04:13 crc kubenswrapper[4722]: I0309 14:04:13.634542 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 14:04:13 crc kubenswrapper[4722]: I0309 14:04:13.634552 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:13Z","lastTransitionTime":"2026-03-09T14:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 14:04:13 crc kubenswrapper[4722]: I0309 14:04:13.736604 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:04:13 crc kubenswrapper[4722]: I0309 14:04:13.736667 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:04:13 crc kubenswrapper[4722]: I0309 14:04:13.736677 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:04:13 crc kubenswrapper[4722]: I0309 14:04:13.736690 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 14:04:13 crc kubenswrapper[4722]: I0309 14:04:13.736720 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:13Z","lastTransitionTime":"2026-03-09T14:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 14:04:13 crc kubenswrapper[4722]: I0309 14:04:13.839073 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:04:13 crc kubenswrapper[4722]: I0309 14:04:13.839145 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:04:13 crc kubenswrapper[4722]: I0309 14:04:13.839168 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:04:13 crc kubenswrapper[4722]: I0309 14:04:13.839198 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 14:04:13 crc kubenswrapper[4722]: I0309 14:04:13.839274 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:13Z","lastTransitionTime":"2026-03-09T14:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 14:04:13 crc kubenswrapper[4722]: I0309 14:04:13.943236 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:04:13 crc kubenswrapper[4722]: I0309 14:04:13.943276 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:04:13 crc kubenswrapper[4722]: I0309 14:04:13.943287 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:04:13 crc kubenswrapper[4722]: I0309 14:04:13.943305 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 14:04:13 crc kubenswrapper[4722]: I0309 14:04:13.943319 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:13Z","lastTransitionTime":"2026-03-09T14:04:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 14:04:14 crc kubenswrapper[4722]: I0309 14:04:14.045933 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:04:14 crc kubenswrapper[4722]: I0309 14:04:14.045968 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:04:14 crc kubenswrapper[4722]: I0309 14:04:14.045979 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:04:14 crc kubenswrapper[4722]: I0309 14:04:14.045993 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 14:04:14 crc kubenswrapper[4722]: I0309 14:04:14.046009 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:14Z","lastTransitionTime":"2026-03-09T14:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 14:04:14 crc kubenswrapper[4722]: I0309 14:04:14.148124 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 14:04:14 crc kubenswrapper[4722]: I0309 14:04:14.148157 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 09 14:04:14 crc kubenswrapper[4722]: I0309 14:04:14.148133 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 14:04:14 crc kubenswrapper[4722]: I0309 14:04:14.148267 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:04:14 crc kubenswrapper[4722]: I0309 14:04:14.148289 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:04:14 crc kubenswrapper[4722]: I0309 14:04:14.148298 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:04:14 crc kubenswrapper[4722]: E0309 14:04:14.148297 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 09 14:04:14 crc kubenswrapper[4722]: I0309 14:04:14.148310 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 14:04:14 crc kubenswrapper[4722]: I0309 14:04:14.148354 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:14Z","lastTransitionTime":"2026-03-09T14:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 14:04:14 crc kubenswrapper[4722]: E0309 14:04:14.148410 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 09 14:04:14 crc kubenswrapper[4722]: E0309 14:04:14.148519 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 09 14:04:14 crc kubenswrapper[4722]: I0309 14:04:14.250971 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:04:14 crc kubenswrapper[4722]: I0309 14:04:14.251028 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:04:14 crc kubenswrapper[4722]: I0309 14:04:14.251043 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:04:14 crc kubenswrapper[4722]: I0309 14:04:14.251066 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 14:04:14 crc kubenswrapper[4722]: I0309 14:04:14.251081 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:14Z","lastTransitionTime":"2026-03-09T14:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 14:04:14 crc kubenswrapper[4722]: I0309 14:04:14.354333 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:04:14 crc kubenswrapper[4722]: I0309 14:04:14.354397 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:04:14 crc kubenswrapper[4722]: I0309 14:04:14.354408 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:04:14 crc kubenswrapper[4722]: I0309 14:04:14.354423 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 14:04:14 crc kubenswrapper[4722]: I0309 14:04:14.354437 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:14Z","lastTransitionTime":"2026-03-09T14:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 14:04:14 crc kubenswrapper[4722]: I0309 14:04:14.457027 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:04:14 crc kubenswrapper[4722]: I0309 14:04:14.457074 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:04:14 crc kubenswrapper[4722]: I0309 14:04:14.457083 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:04:14 crc kubenswrapper[4722]: I0309 14:04:14.457100 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 14:04:14 crc kubenswrapper[4722]: I0309 14:04:14.457111 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:14Z","lastTransitionTime":"2026-03-09T14:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 14:04:14 crc kubenswrapper[4722]: I0309 14:04:14.559599 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:04:14 crc kubenswrapper[4722]: I0309 14:04:14.559652 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:04:14 crc kubenswrapper[4722]: I0309 14:04:14.559673 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:04:14 crc kubenswrapper[4722]: I0309 14:04:14.559700 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 14:04:14 crc kubenswrapper[4722]: I0309 14:04:14.559720 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:14Z","lastTransitionTime":"2026-03-09T14:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 14:04:14 crc kubenswrapper[4722]: I0309 14:04:14.662873 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:04:14 crc kubenswrapper[4722]: I0309 14:04:14.662925 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:04:14 crc kubenswrapper[4722]: I0309 14:04:14.662939 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:04:14 crc kubenswrapper[4722]: I0309 14:04:14.662959 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 14:04:14 crc kubenswrapper[4722]: I0309 14:04:14.662973 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:14Z","lastTransitionTime":"2026-03-09T14:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 14:04:14 crc kubenswrapper[4722]: I0309 14:04:14.766075 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:04:14 crc kubenswrapper[4722]: I0309 14:04:14.766146 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:04:14 crc kubenswrapper[4722]: I0309 14:04:14.766186 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:04:14 crc kubenswrapper[4722]: I0309 14:04:14.766263 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 14:04:14 crc kubenswrapper[4722]: I0309 14:04:14.766288 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:14Z","lastTransitionTime":"2026-03-09T14:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 14:04:14 crc kubenswrapper[4722]: I0309 14:04:14.868861 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:04:14 crc kubenswrapper[4722]: I0309 14:04:14.868929 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:04:14 crc kubenswrapper[4722]: I0309 14:04:14.868953 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:04:14 crc kubenswrapper[4722]: I0309 14:04:14.868983 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 14:04:14 crc kubenswrapper[4722]: I0309 14:04:14.869008 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:14Z","lastTransitionTime":"2026-03-09T14:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 14:04:14 crc kubenswrapper[4722]: I0309 14:04:14.973859 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:04:14 crc kubenswrapper[4722]: I0309 14:04:14.973943 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:04:14 crc kubenswrapper[4722]: I0309 14:04:14.973954 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:04:14 crc kubenswrapper[4722]: I0309 14:04:14.973972 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 14:04:14 crc kubenswrapper[4722]: I0309 14:04:14.973982 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:14Z","lastTransitionTime":"2026-03-09T14:04:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 14:04:15 crc kubenswrapper[4722]: I0309 14:04:15.076946 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:04:15 crc kubenswrapper[4722]: I0309 14:04:15.076987 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:04:15 crc kubenswrapper[4722]: I0309 14:04:15.076999 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:04:15 crc kubenswrapper[4722]: I0309 14:04:15.077018 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 14:04:15 crc kubenswrapper[4722]: I0309 14:04:15.077030 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:15Z","lastTransitionTime":"2026-03-09T14:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 14:04:15 crc kubenswrapper[4722]: I0309 14:04:15.181060 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:04:15 crc kubenswrapper[4722]: I0309 14:04:15.181116 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:04:15 crc kubenswrapper[4722]: I0309 14:04:15.181137 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:04:15 crc kubenswrapper[4722]: I0309 14:04:15.181163 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 14:04:15 crc kubenswrapper[4722]: I0309 14:04:15.181181 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:15Z","lastTransitionTime":"2026-03-09T14:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 14:04:15 crc kubenswrapper[4722]: I0309 14:04:15.285266 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:04:15 crc kubenswrapper[4722]: I0309 14:04:15.285316 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:04:15 crc kubenswrapper[4722]: I0309 14:04:15.285333 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:04:15 crc kubenswrapper[4722]: I0309 14:04:15.285357 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 14:04:15 crc kubenswrapper[4722]: I0309 14:04:15.285374 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:15Z","lastTransitionTime":"2026-03-09T14:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 14:04:15 crc kubenswrapper[4722]: I0309 14:04:15.389131 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:04:15 crc kubenswrapper[4722]: I0309 14:04:15.389180 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:04:15 crc kubenswrapper[4722]: I0309 14:04:15.389189 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:04:15 crc kubenswrapper[4722]: I0309 14:04:15.389222 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 14:04:15 crc kubenswrapper[4722]: I0309 14:04:15.389234 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:15Z","lastTransitionTime":"2026-03-09T14:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 14:04:15 crc kubenswrapper[4722]: I0309 14:04:15.492751 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:04:15 crc kubenswrapper[4722]: I0309 14:04:15.492794 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:04:15 crc kubenswrapper[4722]: I0309 14:04:15.492803 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:04:15 crc kubenswrapper[4722]: I0309 14:04:15.492818 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 14:04:15 crc kubenswrapper[4722]: I0309 14:04:15.492828 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:15Z","lastTransitionTime":"2026-03-09T14:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 14:04:15 crc kubenswrapper[4722]: I0309 14:04:15.600643 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:04:15 crc kubenswrapper[4722]: I0309 14:04:15.600706 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:04:15 crc kubenswrapper[4722]: I0309 14:04:15.600728 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:04:15 crc kubenswrapper[4722]: I0309 14:04:15.600755 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 14:04:15 crc kubenswrapper[4722]: I0309 14:04:15.600776 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:15Z","lastTransitionTime":"2026-03-09T14:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 14:04:15 crc kubenswrapper[4722]: I0309 14:04:15.704399 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:04:15 crc kubenswrapper[4722]: I0309 14:04:15.704463 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:04:15 crc kubenswrapper[4722]: I0309 14:04:15.704474 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:04:15 crc kubenswrapper[4722]: I0309 14:04:15.704494 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 14:04:15 crc kubenswrapper[4722]: I0309 14:04:15.704508 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:15Z","lastTransitionTime":"2026-03-09T14:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 14:04:15 crc kubenswrapper[4722]: I0309 14:04:15.808133 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:04:15 crc kubenswrapper[4722]: I0309 14:04:15.808182 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:04:15 crc kubenswrapper[4722]: I0309 14:04:15.808191 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:04:15 crc kubenswrapper[4722]: I0309 14:04:15.808229 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 14:04:15 crc kubenswrapper[4722]: I0309 14:04:15.808243 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:15Z","lastTransitionTime":"2026-03-09T14:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 14:04:15 crc kubenswrapper[4722]: I0309 14:04:15.843865 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 14:04:15 crc kubenswrapper[4722]: E0309 14:04:15.844059 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 14:04:23.844028215 +0000 UTC m=+104.399596781 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 14:04:15 crc kubenswrapper[4722]: I0309 14:04:15.911295 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:04:15 crc kubenswrapper[4722]: I0309 14:04:15.911371 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:04:15 crc kubenswrapper[4722]: I0309 14:04:15.911384 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:04:15 crc kubenswrapper[4722]: I0309 14:04:15.911405 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 14:04:15 crc kubenswrapper[4722]: I0309 14:04:15.911440 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:15Z","lastTransitionTime":"2026-03-09T14:04:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 14:04:15 crc kubenswrapper[4722]: I0309 14:04:15.945131 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 09 14:04:15 crc kubenswrapper[4722]: I0309 14:04:15.945181 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 14:04:15 crc kubenswrapper[4722]: I0309 14:04:15.945228 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 14:04:15 crc kubenswrapper[4722]: I0309 14:04:15.945249 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 14:04:15 crc kubenswrapper[4722]: E0309 14:04:15.945398 4722 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 09 14:04:15 crc kubenswrapper[4722]: E0309 14:04:15.945439 4722 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 09 14:04:15 crc kubenswrapper[4722]: E0309 14:04:15.945496 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 14:04:23.945478216 +0000 UTC m=+104.501046792 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 09 14:04:15 crc kubenswrapper[4722]: E0309 14:04:15.945554 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 09 14:04:15 crc kubenswrapper[4722]: E0309 14:04:15.945634 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 09 14:04:15 crc kubenswrapper[4722]: E0309 14:04:15.945660 4722 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 09 14:04:15 crc kubenswrapper[4722]: E0309 14:04:15.945566 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 14:04:23.945533977 +0000 UTC m=+104.501102583 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 09 14:04:15 crc kubenswrapper[4722]: E0309 14:04:15.945853 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-09 14:04:23.945776454 +0000 UTC m=+104.501345030 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 09 14:04:15 crc kubenswrapper[4722]: E0309 14:04:15.945896 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 09 14:04:15 crc kubenswrapper[4722]: E0309 14:04:15.945911 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 09 14:04:15 crc kubenswrapper[4722]: E0309 14:04:15.945922 4722 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 09 14:04:15 crc kubenswrapper[4722]: E0309 14:04:15.945957 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-09 14:04:23.945948818 +0000 UTC m=+104.501517394 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 09 14:04:16 crc kubenswrapper[4722]: I0309 14:04:16.014446 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:04:16 crc kubenswrapper[4722]: I0309 14:04:16.014521 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:04:16 crc kubenswrapper[4722]: I0309 14:04:16.014537 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:04:16 crc kubenswrapper[4722]: I0309 14:04:16.014555 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 14:04:16 crc kubenswrapper[4722]: I0309 14:04:16.014577 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:16Z","lastTransitionTime":"2026-03-09T14:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 14:04:16 crc kubenswrapper[4722]: I0309 14:04:16.117594 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:04:16 crc kubenswrapper[4722]: I0309 14:04:16.117638 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:04:16 crc kubenswrapper[4722]: I0309 14:04:16.117650 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:04:16 crc kubenswrapper[4722]: I0309 14:04:16.117666 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 14:04:16 crc kubenswrapper[4722]: I0309 14:04:16.117677 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:16Z","lastTransitionTime":"2026-03-09T14:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 14:04:16 crc kubenswrapper[4722]: I0309 14:04:16.148597 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 09 14:04:16 crc kubenswrapper[4722]: I0309 14:04:16.148613 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 14:04:16 crc kubenswrapper[4722]: I0309 14:04:16.148741 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 14:04:16 crc kubenswrapper[4722]: E0309 14:04:16.148854 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 09 14:04:16 crc kubenswrapper[4722]: E0309 14:04:16.148968 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 09 14:04:16 crc kubenswrapper[4722]: E0309 14:04:16.149061 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 14:04:16 crc kubenswrapper[4722]: I0309 14:04:16.219649 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:16 crc kubenswrapper[4722]: I0309 14:04:16.219690 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:16 crc kubenswrapper[4722]: I0309 14:04:16.219702 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:16 crc kubenswrapper[4722]: I0309 14:04:16.219720 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:16 crc kubenswrapper[4722]: I0309 14:04:16.219733 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:16Z","lastTransitionTime":"2026-03-09T14:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:16 crc kubenswrapper[4722]: I0309 14:04:16.322326 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:16 crc kubenswrapper[4722]: I0309 14:04:16.322361 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:16 crc kubenswrapper[4722]: I0309 14:04:16.322379 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:16 crc kubenswrapper[4722]: I0309 14:04:16.322401 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:16 crc kubenswrapper[4722]: I0309 14:04:16.322414 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:16Z","lastTransitionTime":"2026-03-09T14:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:16 crc kubenswrapper[4722]: I0309 14:04:16.425502 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:16 crc kubenswrapper[4722]: I0309 14:04:16.425608 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:16 crc kubenswrapper[4722]: I0309 14:04:16.425996 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:16 crc kubenswrapper[4722]: I0309 14:04:16.426082 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:16 crc kubenswrapper[4722]: I0309 14:04:16.426376 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:16Z","lastTransitionTime":"2026-03-09T14:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:16 crc kubenswrapper[4722]: I0309 14:04:16.529194 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:16 crc kubenswrapper[4722]: I0309 14:04:16.529244 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:16 crc kubenswrapper[4722]: I0309 14:04:16.529253 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:16 crc kubenswrapper[4722]: I0309 14:04:16.529267 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:16 crc kubenswrapper[4722]: I0309 14:04:16.529277 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:16Z","lastTransitionTime":"2026-03-09T14:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:16 crc kubenswrapper[4722]: I0309 14:04:16.631295 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:16 crc kubenswrapper[4722]: I0309 14:04:16.631363 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:16 crc kubenswrapper[4722]: I0309 14:04:16.631375 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:16 crc kubenswrapper[4722]: I0309 14:04:16.631393 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:16 crc kubenswrapper[4722]: I0309 14:04:16.631404 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:16Z","lastTransitionTime":"2026-03-09T14:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:16 crc kubenswrapper[4722]: I0309 14:04:16.734108 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:16 crc kubenswrapper[4722]: I0309 14:04:16.734147 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:16 crc kubenswrapper[4722]: I0309 14:04:16.734157 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:16 crc kubenswrapper[4722]: I0309 14:04:16.734171 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:16 crc kubenswrapper[4722]: I0309 14:04:16.734180 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:16Z","lastTransitionTime":"2026-03-09T14:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:16 crc kubenswrapper[4722]: I0309 14:04:16.830029 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:16 crc kubenswrapper[4722]: I0309 14:04:16.830084 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:16 crc kubenswrapper[4722]: I0309 14:04:16.830098 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:16 crc kubenswrapper[4722]: I0309 14:04:16.830117 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:16 crc kubenswrapper[4722]: I0309 14:04:16.830130 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:16Z","lastTransitionTime":"2026-03-09T14:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:16 crc kubenswrapper[4722]: E0309 14:04:16.840195 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3321b793-006a-42f5-9c18-7f48ed9bad15\\\",\\\"systemUUID\\\":\\\"70de6d16-940c-46da-be51-a9b50262dda2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:16Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:16 crc kubenswrapper[4722]: I0309 14:04:16.843448 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:16 crc kubenswrapper[4722]: I0309 14:04:16.843481 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 09 14:04:16 crc kubenswrapper[4722]: I0309 14:04:16.843490 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:16 crc kubenswrapper[4722]: I0309 14:04:16.843500 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:16 crc kubenswrapper[4722]: I0309 14:04:16.843508 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:16Z","lastTransitionTime":"2026-03-09T14:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:16 crc kubenswrapper[4722]: E0309 14:04:16.859050 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3321b793-006a-42f5-9c18-7f48ed9bad15\\\",\\\"systemUUID\\\":\\\"70de6d16-940c-46da-be51-a9b50262dda2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:16Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:16 crc kubenswrapper[4722]: I0309 14:04:16.862978 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:16 crc kubenswrapper[4722]: I0309 14:04:16.863011 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 09 14:04:16 crc kubenswrapper[4722]: I0309 14:04:16.863021 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:16 crc kubenswrapper[4722]: I0309 14:04:16.863036 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:16 crc kubenswrapper[4722]: I0309 14:04:16.863048 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:16Z","lastTransitionTime":"2026-03-09T14:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:16 crc kubenswrapper[4722]: E0309 14:04:16.874311 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3321b793-006a-42f5-9c18-7f48ed9bad15\\\",\\\"systemUUID\\\":\\\"70de6d16-940c-46da-be51-a9b50262dda2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:16Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:16 crc kubenswrapper[4722]: I0309 14:04:16.878579 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:16 crc kubenswrapper[4722]: I0309 14:04:16.878611 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 09 14:04:16 crc kubenswrapper[4722]: I0309 14:04:16.878620 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:16 crc kubenswrapper[4722]: I0309 14:04:16.878637 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:16 crc kubenswrapper[4722]: I0309 14:04:16.878649 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:16Z","lastTransitionTime":"2026-03-09T14:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:16 crc kubenswrapper[4722]: E0309 14:04:16.895807 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3321b793-006a-42f5-9c18-7f48ed9bad15\\\",\\\"systemUUID\\\":\\\"70de6d16-940c-46da-be51-a9b50262dda2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:16Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:16 crc kubenswrapper[4722]: I0309 14:04:16.899625 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:16 crc kubenswrapper[4722]: I0309 14:04:16.899656 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 09 14:04:16 crc kubenswrapper[4722]: I0309 14:04:16.899664 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:16 crc kubenswrapper[4722]: I0309 14:04:16.899676 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:16 crc kubenswrapper[4722]: I0309 14:04:16.899684 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:16Z","lastTransitionTime":"2026-03-09T14:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:16 crc kubenswrapper[4722]: E0309 14:04:16.910866 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3321b793-006a-42f5-9c18-7f48ed9bad15\\\",\\\"systemUUID\\\":\\\"70de6d16-940c-46da-be51-a9b50262dda2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:16Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:16 crc kubenswrapper[4722]: E0309 14:04:16.911021 4722 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 14:04:16 crc kubenswrapper[4722]: I0309 14:04:16.912408 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 09 14:04:16 crc kubenswrapper[4722]: I0309 14:04:16.912443 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:16 crc kubenswrapper[4722]: I0309 14:04:16.912453 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:16 crc kubenswrapper[4722]: I0309 14:04:16.912467 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:16 crc kubenswrapper[4722]: I0309 14:04:16.912478 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:16Z","lastTransitionTime":"2026-03-09T14:04:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:17 crc kubenswrapper[4722]: I0309 14:04:17.015006 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:17 crc kubenswrapper[4722]: I0309 14:04:17.015062 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:17 crc kubenswrapper[4722]: I0309 14:04:17.015079 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:17 crc kubenswrapper[4722]: I0309 14:04:17.015101 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:17 crc kubenswrapper[4722]: I0309 14:04:17.015117 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:17Z","lastTransitionTime":"2026-03-09T14:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:17 crc kubenswrapper[4722]: I0309 14:04:17.116906 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:17 crc kubenswrapper[4722]: I0309 14:04:17.116958 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:17 crc kubenswrapper[4722]: I0309 14:04:17.116968 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:17 crc kubenswrapper[4722]: I0309 14:04:17.116981 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:17 crc kubenswrapper[4722]: I0309 14:04:17.116990 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:17Z","lastTransitionTime":"2026-03-09T14:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:17 crc kubenswrapper[4722]: I0309 14:04:17.219097 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:17 crc kubenswrapper[4722]: I0309 14:04:17.219156 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:17 crc kubenswrapper[4722]: I0309 14:04:17.219173 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:17 crc kubenswrapper[4722]: I0309 14:04:17.219217 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:17 crc kubenswrapper[4722]: I0309 14:04:17.219235 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:17Z","lastTransitionTime":"2026-03-09T14:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:17 crc kubenswrapper[4722]: I0309 14:04:17.321378 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:17 crc kubenswrapper[4722]: I0309 14:04:17.321443 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:17 crc kubenswrapper[4722]: I0309 14:04:17.321461 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:17 crc kubenswrapper[4722]: I0309 14:04:17.321491 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:17 crc kubenswrapper[4722]: I0309 14:04:17.321512 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:17Z","lastTransitionTime":"2026-03-09T14:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:17 crc kubenswrapper[4722]: I0309 14:04:17.423914 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:17 crc kubenswrapper[4722]: I0309 14:04:17.423979 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:17 crc kubenswrapper[4722]: I0309 14:04:17.423995 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:17 crc kubenswrapper[4722]: I0309 14:04:17.424016 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:17 crc kubenswrapper[4722]: I0309 14:04:17.424031 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:17Z","lastTransitionTime":"2026-03-09T14:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:17 crc kubenswrapper[4722]: I0309 14:04:17.526588 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:17 crc kubenswrapper[4722]: I0309 14:04:17.526623 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:17 crc kubenswrapper[4722]: I0309 14:04:17.526632 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:17 crc kubenswrapper[4722]: I0309 14:04:17.526645 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:17 crc kubenswrapper[4722]: I0309 14:04:17.526654 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:17Z","lastTransitionTime":"2026-03-09T14:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:17 crc kubenswrapper[4722]: I0309 14:04:17.629406 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:17 crc kubenswrapper[4722]: I0309 14:04:17.629449 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:17 crc kubenswrapper[4722]: I0309 14:04:17.629483 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:17 crc kubenswrapper[4722]: I0309 14:04:17.629501 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:17 crc kubenswrapper[4722]: I0309 14:04:17.629513 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:17Z","lastTransitionTime":"2026-03-09T14:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:17 crc kubenswrapper[4722]: I0309 14:04:17.732634 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:17 crc kubenswrapper[4722]: I0309 14:04:17.732679 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:17 crc kubenswrapper[4722]: I0309 14:04:17.732690 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:17 crc kubenswrapper[4722]: I0309 14:04:17.732707 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:17 crc kubenswrapper[4722]: I0309 14:04:17.732719 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:17Z","lastTransitionTime":"2026-03-09T14:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:17 crc kubenswrapper[4722]: I0309 14:04:17.836253 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:17 crc kubenswrapper[4722]: I0309 14:04:17.836349 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:17 crc kubenswrapper[4722]: I0309 14:04:17.836368 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:17 crc kubenswrapper[4722]: I0309 14:04:17.836384 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:17 crc kubenswrapper[4722]: I0309 14:04:17.836396 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:17Z","lastTransitionTime":"2026-03-09T14:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:17 crc kubenswrapper[4722]: I0309 14:04:17.939091 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:17 crc kubenswrapper[4722]: I0309 14:04:17.939149 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:17 crc kubenswrapper[4722]: I0309 14:04:17.939165 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:17 crc kubenswrapper[4722]: I0309 14:04:17.939186 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:17 crc kubenswrapper[4722]: I0309 14:04:17.939225 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:17Z","lastTransitionTime":"2026-03-09T14:04:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:18 crc kubenswrapper[4722]: I0309 14:04:18.042908 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:18 crc kubenswrapper[4722]: I0309 14:04:18.042954 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:18 crc kubenswrapper[4722]: I0309 14:04:18.042970 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:18 crc kubenswrapper[4722]: I0309 14:04:18.042992 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:18 crc kubenswrapper[4722]: I0309 14:04:18.043010 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:18Z","lastTransitionTime":"2026-03-09T14:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:18 crc kubenswrapper[4722]: I0309 14:04:18.146164 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:18 crc kubenswrapper[4722]: I0309 14:04:18.146239 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:18 crc kubenswrapper[4722]: I0309 14:04:18.146255 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:18 crc kubenswrapper[4722]: I0309 14:04:18.146279 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:18 crc kubenswrapper[4722]: I0309 14:04:18.146294 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:18Z","lastTransitionTime":"2026-03-09T14:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:18 crc kubenswrapper[4722]: I0309 14:04:18.148424 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 14:04:18 crc kubenswrapper[4722]: I0309 14:04:18.148514 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 14:04:18 crc kubenswrapper[4722]: E0309 14:04:18.148548 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 14:04:18 crc kubenswrapper[4722]: I0309 14:04:18.148623 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 14:04:18 crc kubenswrapper[4722]: E0309 14:04:18.148730 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 14:04:18 crc kubenswrapper[4722]: E0309 14:04:18.148839 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 14:04:18 crc kubenswrapper[4722]: I0309 14:04:18.248846 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:18 crc kubenswrapper[4722]: I0309 14:04:18.248944 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:18 crc kubenswrapper[4722]: I0309 14:04:18.248964 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:18 crc kubenswrapper[4722]: I0309 14:04:18.249018 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:18 crc kubenswrapper[4722]: I0309 14:04:18.249036 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:18Z","lastTransitionTime":"2026-03-09T14:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:18 crc kubenswrapper[4722]: I0309 14:04:18.352231 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:18 crc kubenswrapper[4722]: I0309 14:04:18.352284 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:18 crc kubenswrapper[4722]: I0309 14:04:18.352421 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:18 crc kubenswrapper[4722]: I0309 14:04:18.352442 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:18 crc kubenswrapper[4722]: I0309 14:04:18.352454 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:18Z","lastTransitionTime":"2026-03-09T14:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:18 crc kubenswrapper[4722]: I0309 14:04:18.455562 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:18 crc kubenswrapper[4722]: I0309 14:04:18.455640 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:18 crc kubenswrapper[4722]: I0309 14:04:18.455665 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:18 crc kubenswrapper[4722]: I0309 14:04:18.455695 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:18 crc kubenswrapper[4722]: I0309 14:04:18.455718 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:18Z","lastTransitionTime":"2026-03-09T14:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:18 crc kubenswrapper[4722]: I0309 14:04:18.558973 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:18 crc kubenswrapper[4722]: I0309 14:04:18.559051 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:18 crc kubenswrapper[4722]: I0309 14:04:18.559072 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:18 crc kubenswrapper[4722]: I0309 14:04:18.559100 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:18 crc kubenswrapper[4722]: I0309 14:04:18.559123 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:18Z","lastTransitionTime":"2026-03-09T14:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:18 crc kubenswrapper[4722]: I0309 14:04:18.662384 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:18 crc kubenswrapper[4722]: I0309 14:04:18.662449 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:18 crc kubenswrapper[4722]: I0309 14:04:18.662461 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:18 crc kubenswrapper[4722]: I0309 14:04:18.662481 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:18 crc kubenswrapper[4722]: I0309 14:04:18.662494 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:18Z","lastTransitionTime":"2026-03-09T14:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:18 crc kubenswrapper[4722]: I0309 14:04:18.765369 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:18 crc kubenswrapper[4722]: I0309 14:04:18.765458 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:18 crc kubenswrapper[4722]: I0309 14:04:18.765486 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:18 crc kubenswrapper[4722]: I0309 14:04:18.765519 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:18 crc kubenswrapper[4722]: I0309 14:04:18.765544 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:18Z","lastTransitionTime":"2026-03-09T14:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:18 crc kubenswrapper[4722]: I0309 14:04:18.869580 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:18 crc kubenswrapper[4722]: I0309 14:04:18.869663 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:18 crc kubenswrapper[4722]: I0309 14:04:18.869678 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:18 crc kubenswrapper[4722]: I0309 14:04:18.869703 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:18 crc kubenswrapper[4722]: I0309 14:04:18.869739 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:18Z","lastTransitionTime":"2026-03-09T14:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:18 crc kubenswrapper[4722]: I0309 14:04:18.972243 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:18 crc kubenswrapper[4722]: I0309 14:04:18.972291 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:18 crc kubenswrapper[4722]: I0309 14:04:18.972303 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:18 crc kubenswrapper[4722]: I0309 14:04:18.972323 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:18 crc kubenswrapper[4722]: I0309 14:04:18.972335 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:18Z","lastTransitionTime":"2026-03-09T14:04:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:19 crc kubenswrapper[4722]: I0309 14:04:19.074490 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:19 crc kubenswrapper[4722]: I0309 14:04:19.074535 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:19 crc kubenswrapper[4722]: I0309 14:04:19.074546 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:19 crc kubenswrapper[4722]: I0309 14:04:19.074561 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:19 crc kubenswrapper[4722]: I0309 14:04:19.074574 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:19Z","lastTransitionTime":"2026-03-09T14:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:19 crc kubenswrapper[4722]: I0309 14:04:19.149599 4722 scope.go:117] "RemoveContainer" containerID="b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48" Mar 09 14:04:19 crc kubenswrapper[4722]: E0309 14:04:19.149930 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 09 14:04:19 crc kubenswrapper[4722]: I0309 14:04:19.177015 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:19 crc kubenswrapper[4722]: I0309 14:04:19.177066 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:19 crc kubenswrapper[4722]: I0309 14:04:19.177083 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:19 crc kubenswrapper[4722]: I0309 14:04:19.177104 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:19 crc kubenswrapper[4722]: I0309 14:04:19.177121 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:19Z","lastTransitionTime":"2026-03-09T14:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:19 crc kubenswrapper[4722]: I0309 14:04:19.279807 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:19 crc kubenswrapper[4722]: I0309 14:04:19.279945 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:19 crc kubenswrapper[4722]: I0309 14:04:19.279962 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:19 crc kubenswrapper[4722]: I0309 14:04:19.279984 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:19 crc kubenswrapper[4722]: I0309 14:04:19.280001 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:19Z","lastTransitionTime":"2026-03-09T14:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:19 crc kubenswrapper[4722]: I0309 14:04:19.383190 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:19 crc kubenswrapper[4722]: I0309 14:04:19.383364 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:19 crc kubenswrapper[4722]: I0309 14:04:19.383491 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:19 crc kubenswrapper[4722]: I0309 14:04:19.383540 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:19 crc kubenswrapper[4722]: I0309 14:04:19.383607 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:19Z","lastTransitionTime":"2026-03-09T14:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:19 crc kubenswrapper[4722]: I0309 14:04:19.487235 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:19 crc kubenswrapper[4722]: I0309 14:04:19.487290 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:19 crc kubenswrapper[4722]: I0309 14:04:19.487307 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:19 crc kubenswrapper[4722]: I0309 14:04:19.487330 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:19 crc kubenswrapper[4722]: I0309 14:04:19.487348 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:19Z","lastTransitionTime":"2026-03-09T14:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:19 crc kubenswrapper[4722]: I0309 14:04:19.590385 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:19 crc kubenswrapper[4722]: I0309 14:04:19.590460 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:19 crc kubenswrapper[4722]: I0309 14:04:19.590480 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:19 crc kubenswrapper[4722]: I0309 14:04:19.590503 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:19 crc kubenswrapper[4722]: I0309 14:04:19.590520 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:19Z","lastTransitionTime":"2026-03-09T14:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:19 crc kubenswrapper[4722]: I0309 14:04:19.693300 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:19 crc kubenswrapper[4722]: I0309 14:04:19.693334 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:19 crc kubenswrapper[4722]: I0309 14:04:19.693345 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:19 crc kubenswrapper[4722]: I0309 14:04:19.693360 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:19 crc kubenswrapper[4722]: I0309 14:04:19.693370 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:19Z","lastTransitionTime":"2026-03-09T14:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:19 crc kubenswrapper[4722]: I0309 14:04:19.795568 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:19 crc kubenswrapper[4722]: I0309 14:04:19.795595 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:19 crc kubenswrapper[4722]: I0309 14:04:19.795603 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:19 crc kubenswrapper[4722]: I0309 14:04:19.795615 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:19 crc kubenswrapper[4722]: I0309 14:04:19.795624 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:19Z","lastTransitionTime":"2026-03-09T14:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:19 crc kubenswrapper[4722]: I0309 14:04:19.898653 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:19 crc kubenswrapper[4722]: I0309 14:04:19.898703 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:19 crc kubenswrapper[4722]: I0309 14:04:19.898720 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:19 crc kubenswrapper[4722]: I0309 14:04:19.898742 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:19 crc kubenswrapper[4722]: I0309 14:04:19.898756 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:19Z","lastTransitionTime":"2026-03-09T14:04:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:20 crc kubenswrapper[4722]: I0309 14:04:20.001687 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:20 crc kubenswrapper[4722]: I0309 14:04:20.001752 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:20 crc kubenswrapper[4722]: I0309 14:04:20.001761 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:20 crc kubenswrapper[4722]: I0309 14:04:20.001775 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:20 crc kubenswrapper[4722]: I0309 14:04:20.001785 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:20Z","lastTransitionTime":"2026-03-09T14:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:20 crc kubenswrapper[4722]: I0309 14:04:20.104734 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:20 crc kubenswrapper[4722]: I0309 14:04:20.104788 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:20 crc kubenswrapper[4722]: I0309 14:04:20.104803 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:20 crc kubenswrapper[4722]: I0309 14:04:20.104823 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:20 crc kubenswrapper[4722]: I0309 14:04:20.104840 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:20Z","lastTransitionTime":"2026-03-09T14:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:20 crc kubenswrapper[4722]: I0309 14:04:20.148687 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 14:04:20 crc kubenswrapper[4722]: E0309 14:04:20.148844 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 14:04:20 crc kubenswrapper[4722]: I0309 14:04:20.148909 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 14:04:20 crc kubenswrapper[4722]: I0309 14:04:20.148926 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 14:04:20 crc kubenswrapper[4722]: E0309 14:04:20.149094 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 14:04:20 crc kubenswrapper[4722]: E0309 14:04:20.149148 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 14:04:20 crc kubenswrapper[4722]: I0309 14:04:20.163402 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:20Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:20 crc kubenswrapper[4722]: I0309 14:04:20.183903 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2267a4d7-87b4-43f7-80c7-7a7ffb0327fc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04169d9b8359e26cc9807e27fb9a5d54ca0946aaade294c5bcc272a7d21c4db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09fa62e7293a50fb3f6bdaf37061cbe5e95a830c8f7337660ce73138dbcaa641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31ba786781fcdb781f5fa45276e29e94756863d2398109d91e81a52f6f88bee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dff721159b382bcfdde6f406a328ef2ee9264b
35a8d8baeb7a3c6d31a60ff3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63769aca40f721bec0716427145a3ceea0595836c03d31c98f6d45ff14de68be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6625c60c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6625c60c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:20Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:20 crc kubenswrapper[4722]: I0309 14:04:20.197373 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0965e24b-526e-4842-ac1f-eca7a765355d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56c76a5e8718243a17d8d9fd9a8dcf6083c2ccb65b4816926dca08a89e465728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3836660bf646052991965270cd69b8b8237b5c4bd698a73789b050d23e6877\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dbef38ff5e714c92fc8a86f1e3c64adf606cb198145f884a000f8dee200fcf2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T14:03:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 14:03:46.681607 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 14:03:46.681849 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 14:03:46.682597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2315154617/tls.crt::/tmp/serving-cert-2315154617/tls.key\\\\\\\"\\\\nI0309 14:03:46.866095 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 14:03:46.869876 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 14:03:46.869896 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 14:03:46.870280 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 14:03:46.870293 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 14:03:46.877096 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 14:03:46.877117 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 14:03:46.877121 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877135 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 14:03:46.877139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 14:03:46.877143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 14:03:46.877146 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 14:03:46.878419 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:03:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fbabc5ec2c100150ce886c06c3bd78cc5896c8a06ac9d262eeb552a8a8bb5d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:20Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:20 crc kubenswrapper[4722]: I0309 14:04:20.207303 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:20 crc kubenswrapper[4722]: I0309 14:04:20.207334 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:20 crc kubenswrapper[4722]: I0309 14:04:20.207343 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:20 crc kubenswrapper[4722]: I0309 14:04:20.207356 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:20 crc kubenswrapper[4722]: I0309 14:04:20.207364 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:20Z","lastTransitionTime":"2026-03-09T14:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:20 crc kubenswrapper[4722]: I0309 14:04:20.209807 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:20Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:20 crc kubenswrapper[4722]: I0309 14:04:20.220393 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d69359acfc7a4ee7698088a7e0062b3604a5ceb92910cd3b3d69b1a6291dfdcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://687646beb4c6dd3fb9b659a36449a573f4c389b3e98b07127b18516fdc4f2c93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:20Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:20 crc kubenswrapper[4722]: I0309 14:04:20.232003 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee2ed55248b177bc0873ef3c41517401fdb30fc5ce5a545a1ca8c925e4d4e9e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:20Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:20 crc kubenswrapper[4722]: I0309 14:04:20.246104 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:20Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:20 crc kubenswrapper[4722]: I0309 14:04:20.260240 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69e74824fe242fd2d54bed9734f33823697d7651043d9c97bac6ea5490625a0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:20Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:20 crc kubenswrapper[4722]: I0309 14:04:20.309021 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:20 crc kubenswrapper[4722]: I0309 14:04:20.309062 4722 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:20 crc kubenswrapper[4722]: I0309 14:04:20.309072 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:20 crc kubenswrapper[4722]: I0309 14:04:20.309087 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:20 crc kubenswrapper[4722]: I0309 14:04:20.309098 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:20Z","lastTransitionTime":"2026-03-09T14:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:20 crc kubenswrapper[4722]: I0309 14:04:20.411602 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:20 crc kubenswrapper[4722]: I0309 14:04:20.411653 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:20 crc kubenswrapper[4722]: I0309 14:04:20.411670 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:20 crc kubenswrapper[4722]: I0309 14:04:20.411692 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:20 crc kubenswrapper[4722]: I0309 14:04:20.411708 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:20Z","lastTransitionTime":"2026-03-09T14:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:20 crc kubenswrapper[4722]: I0309 14:04:20.515088 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:20 crc kubenswrapper[4722]: I0309 14:04:20.515161 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:20 crc kubenswrapper[4722]: I0309 14:04:20.515198 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:20 crc kubenswrapper[4722]: I0309 14:04:20.515274 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:20 crc kubenswrapper[4722]: I0309 14:04:20.515297 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:20Z","lastTransitionTime":"2026-03-09T14:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:20 crc kubenswrapper[4722]: I0309 14:04:20.619417 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:20 crc kubenswrapper[4722]: I0309 14:04:20.619488 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:20 crc kubenswrapper[4722]: I0309 14:04:20.619510 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:20 crc kubenswrapper[4722]: I0309 14:04:20.619540 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:20 crc kubenswrapper[4722]: I0309 14:04:20.619564 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:20Z","lastTransitionTime":"2026-03-09T14:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:20 crc kubenswrapper[4722]: I0309 14:04:20.722928 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:20 crc kubenswrapper[4722]: I0309 14:04:20.722989 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:20 crc kubenswrapper[4722]: I0309 14:04:20.723008 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:20 crc kubenswrapper[4722]: I0309 14:04:20.723031 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:20 crc kubenswrapper[4722]: I0309 14:04:20.723049 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:20Z","lastTransitionTime":"2026-03-09T14:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:20 crc kubenswrapper[4722]: I0309 14:04:20.794765 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-72xb4"] Mar 09 14:04:20 crc kubenswrapper[4722]: I0309 14:04:20.795139 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-72xb4" Mar 09 14:04:20 crc kubenswrapper[4722]: I0309 14:04:20.798235 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 09 14:04:20 crc kubenswrapper[4722]: I0309 14:04:20.798551 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 09 14:04:20 crc kubenswrapper[4722]: I0309 14:04:20.798754 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 09 14:04:20 crc kubenswrapper[4722]: I0309 14:04:20.822834 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2267a4d7-87b4-43f7-80c7-7a7ffb0327fc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04169d9b8359e26cc9807e27fb9a5d54ca0946aaade294c5bcc272a7d21c4db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09fa62e7293a50fb3f6bdaf37061cbe5e95a830c8f7337660ce73138dbcaa641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31ba786781fcdb781f5fa45276e29e94756863d2
398109d91e81a52f6f88bee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dff721159b382bcfdde6f406a328ef2ee9264b35a8d8baeb7a3c6d31a60ff3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63769aca40f721bec0716427145a3ceea0595836c03d31c98f6d45ff14de68be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6625c60c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6625c60c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:20Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:20 crc kubenswrapper[4722]: I0309 14:04:20.824946 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:20 crc kubenswrapper[4722]: I0309 14:04:20.824983 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:20 crc kubenswrapper[4722]: I0309 14:04:20.824992 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:20 crc kubenswrapper[4722]: I0309 14:04:20.825005 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:20 crc kubenswrapper[4722]: I0309 14:04:20.825024 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:20Z","lastTransitionTime":"2026-03-09T14:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:20 crc kubenswrapper[4722]: I0309 14:04:20.839830 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0965e24b-526e-4842-ac1f-eca7a765355d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56c76a5e8718243a17d8d9fd9a8dcf6083c2ccb65b4816926dca08a89e465728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3836660bf646052991965270cd69b8b8237b5c4bd698a73789b050d23e6877\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dbef38ff5e714c92fc8a86f1e3c64adf606cb198145f884a000f8dee200fcf2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T14:03:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 14:03:46.681607 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 14:03:46.681849 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 14:03:46.682597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2315154617/tls.crt::/tmp/serving-cert-2315154617/tls.key\\\\\\\"\\\\nI0309 14:03:46.866095 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 14:03:46.869876 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 14:03:46.869896 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 14:03:46.870280 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 14:03:46.870293 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 14:03:46.877096 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 14:03:46.877117 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 14:03:46.877121 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877135 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 14:03:46.877139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 14:03:46.877143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 14:03:46.877146 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 14:03:46.878419 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:03:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fbabc5ec2c100150ce886c06c3bd78cc5896c8a06ac9d262eeb552a8a8bb5d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:20Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:20 crc kubenswrapper[4722]: I0309 14:04:20.853615 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:20Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:20 crc kubenswrapper[4722]: I0309 14:04:20.867978 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d69359acfc7a4ee7698088a7e0062b3604a5ceb92910cd3b3d69b1a6291dfdcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://687646beb4c6dd3fb9b659a36449a573f4c389b3e98b07127b18516fdc4f2c93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:20Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:20 crc kubenswrapper[4722]: I0309 14:04:20.884740 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:20Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:20 crc kubenswrapper[4722]: I0309 14:04:20.891733 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfx7x\" (UniqueName: \"kubernetes.io/projected/43e3c934-6aa7-4c96-a3a6-378f7931fc2a-kube-api-access-lfx7x\") pod \"node-resolver-72xb4\" (UID: \"43e3c934-6aa7-4c96-a3a6-378f7931fc2a\") " pod="openshift-dns/node-resolver-72xb4" Mar 09 14:04:20 crc kubenswrapper[4722]: I0309 14:04:20.891813 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/43e3c934-6aa7-4c96-a3a6-378f7931fc2a-hosts-file\") pod \"node-resolver-72xb4\" (UID: \"43e3c934-6aa7-4c96-a3a6-378f7931fc2a\") " pod="openshift-dns/node-resolver-72xb4" Mar 09 14:04:20 crc kubenswrapper[4722]: I0309 14:04:20.899073 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-72xb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43e3c934-6aa7-4c96-a3a6-378f7931fc2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:20Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lfx7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-72xb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:20Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:20 crc kubenswrapper[4722]: I0309 14:04:20.917830 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee2ed55248b177bc0873ef3c41517401fdb30fc5ce5a545a1ca8c925e4d4e9e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:20Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:20 crc kubenswrapper[4722]: I0309 14:04:20.927685 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:20 crc kubenswrapper[4722]: I0309 14:04:20.927729 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:20 crc kubenswrapper[4722]: I0309 14:04:20.927740 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:20 crc kubenswrapper[4722]: I0309 14:04:20.927756 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:20 crc kubenswrapper[4722]: I0309 14:04:20.927769 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:20Z","lastTransitionTime":"2026-03-09T14:04:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:20 crc kubenswrapper[4722]: I0309 14:04:20.928826 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:20Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:20 crc kubenswrapper[4722]: I0309 14:04:20.938773 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69e74824fe242fd2d54bed9734f33823697d7651043d9c97bac6ea5490625a0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:20Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:20 crc kubenswrapper[4722]: I0309 14:04:20.993131 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfx7x\" (UniqueName: \"kubernetes.io/projected/43e3c934-6aa7-4c96-a3a6-378f7931fc2a-kube-api-access-lfx7x\") pod \"node-resolver-72xb4\" (UID: 
\"43e3c934-6aa7-4c96-a3a6-378f7931fc2a\") " pod="openshift-dns/node-resolver-72xb4" Mar 09 14:04:20 crc kubenswrapper[4722]: I0309 14:04:20.993246 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/43e3c934-6aa7-4c96-a3a6-378f7931fc2a-hosts-file\") pod \"node-resolver-72xb4\" (UID: \"43e3c934-6aa7-4c96-a3a6-378f7931fc2a\") " pod="openshift-dns/node-resolver-72xb4" Mar 09 14:04:20 crc kubenswrapper[4722]: I0309 14:04:20.993370 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/43e3c934-6aa7-4c96-a3a6-378f7931fc2a-hosts-file\") pod \"node-resolver-72xb4\" (UID: \"43e3c934-6aa7-4c96-a3a6-378f7931fc2a\") " pod="openshift-dns/node-resolver-72xb4" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.012054 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfx7x\" (UniqueName: \"kubernetes.io/projected/43e3c934-6aa7-4c96-a3a6-378f7931fc2a-kube-api-access-lfx7x\") pod \"node-resolver-72xb4\" (UID: \"43e3c934-6aa7-4c96-a3a6-378f7931fc2a\") " pod="openshift-dns/node-resolver-72xb4" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.030993 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.031049 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.031070 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.031096 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.031115 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:21Z","lastTransitionTime":"2026-03-09T14:04:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.117301 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-72xb4" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.134491 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.134553 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.134576 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.134604 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.134625 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:21Z","lastTransitionTime":"2026-03-09T14:04:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:21 crc kubenswrapper[4722]: W0309 14:04:21.138014 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43e3c934_6aa7_4c96_a3a6_378f7931fc2a.slice/crio-752a7ac46d4bb48a862b014dfcf2a81ff179320686ab7b6817faf42515fd9a0c WatchSource:0}: Error finding container 752a7ac46d4bb48a862b014dfcf2a81ff179320686ab7b6817faf42515fd9a0c: Status 404 returned error can't find the container with id 752a7ac46d4bb48a862b014dfcf2a81ff179320686ab7b6817faf42515fd9a0c Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.160011 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-hjrrb"] Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.160324 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-jv499"] Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.160740 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-h4zw5"] Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.160919 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-h4zw5" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.161164 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.161504 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-jv499" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.167559 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.167728 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 09 14:04:21 crc kubenswrapper[4722]: W0309 14:04:21.167819 4722 reflector.go:561] object-"openshift-multus"/"multus-daemon-config": failed to list *v1.ConfigMap: configmaps "multus-daemon-config" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Mar 09 14:04:21 crc kubenswrapper[4722]: E0309 14:04:21.167850 4722 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"multus-daemon-config\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"multus-daemon-config\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.167888 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.167982 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.168081 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.168224 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.168370 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.168515 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.168655 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 09 14:04:21 crc kubenswrapper[4722]: W0309 14:04:21.168749 4722 reflector.go:561] object-"openshift-multus"/"default-dockercfg-2q5b6": failed to list *v1.Secret: secrets "default-dockercfg-2q5b6" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-multus": no relationship found between node 'crc' and this object Mar 09 14:04:21 crc kubenswrapper[4722]: E0309 14:04:21.168769 4722 reflector.go:158] "Unhandled Error" err="object-\"openshift-multus\"/\"default-dockercfg-2q5b6\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"default-dockercfg-2q5b6\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-multus\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.168817 4722 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.194133 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d69359acfc7a4ee7698088a7e0062b3604a5ceb92910cd3b3d69b1a6291dfdcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://687646beb4c6dd3fb9b659a36449a573f4c389b3e98b07127b18516fdc4f2c93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:21Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.194683 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6b9e29bb-6e51-47ab-a543-b70117ab854d-multus-daemon-config\") pod \"multus-h4zw5\" (UID: \"6b9e29bb-6e51-47ab-a543-b70117ab854d\") " pod="openshift-multus/multus-h4zw5" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.194732 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6b9e29bb-6e51-47ab-a543-b70117ab854d-multus-socket-dir-parent\") pod \"multus-h4zw5\" (UID: \"6b9e29bb-6e51-47ab-a543-b70117ab854d\") " pod="openshift-multus/multus-h4zw5" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.194847 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6b9e29bb-6e51-47ab-a543-b70117ab854d-host-run-k8s-cni-cncf-io\") pod \"multus-h4zw5\" (UID: \"6b9e29bb-6e51-47ab-a543-b70117ab854d\") " pod="openshift-multus/multus-h4zw5" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.194895 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6b9e29bb-6e51-47ab-a543-b70117ab854d-host-var-lib-cni-multus\") pod \"multus-h4zw5\" (UID: \"6b9e29bb-6e51-47ab-a543-b70117ab854d\") " pod="openshift-multus/multus-h4zw5" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.194925 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6b9e29bb-6e51-47ab-a543-b70117ab854d-host-run-netns\") pod \"multus-h4zw5\" (UID: \"6b9e29bb-6e51-47ab-a543-b70117ab854d\") " pod="openshift-multus/multus-h4zw5" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.194959 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5q6db\" (UniqueName: \"kubernetes.io/projected/dac2aaf5-653b-4b2a-8efe-ed26bac8d648-kube-api-access-5q6db\") pod \"machine-config-daemon-hjrrb\" (UID: \"dac2aaf5-653b-4b2a-8efe-ed26bac8d648\") " pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.194990 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fd1357a3-36ef-4bc8-85bb-91f3c0f42994-os-release\") pod \"multus-additional-cni-plugins-jv499\" (UID: \"fd1357a3-36ef-4bc8-85bb-91f3c0f42994\") " pod="openshift-multus/multus-additional-cni-plugins-jv499" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.195035 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dac2aaf5-653b-4b2a-8efe-ed26bac8d648-mcd-auth-proxy-config\") pod \"machine-config-daemon-hjrrb\" (UID: \"dac2aaf5-653b-4b2a-8efe-ed26bac8d648\") " pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.195062 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fd1357a3-36ef-4bc8-85bb-91f3c0f42994-cni-binary-copy\") pod \"multus-additional-cni-plugins-jv499\" (UID: \"fd1357a3-36ef-4bc8-85bb-91f3c0f42994\") " 
pod="openshift-multus/multus-additional-cni-plugins-jv499" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.195089 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6b9e29bb-6e51-47ab-a543-b70117ab854d-cni-binary-copy\") pod \"multus-h4zw5\" (UID: \"6b9e29bb-6e51-47ab-a543-b70117ab854d\") " pod="openshift-multus/multus-h4zw5" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.195511 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6b9e29bb-6e51-47ab-a543-b70117ab854d-etc-kubernetes\") pod \"multus-h4zw5\" (UID: \"6b9e29bb-6e51-47ab-a543-b70117ab854d\") " pod="openshift-multus/multus-h4zw5" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.195559 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fd1357a3-36ef-4bc8-85bb-91f3c0f42994-cnibin\") pod \"multus-additional-cni-plugins-jv499\" (UID: \"fd1357a3-36ef-4bc8-85bb-91f3c0f42994\") " pod="openshift-multus/multus-additional-cni-plugins-jv499" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.196430 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6b9e29bb-6e51-47ab-a543-b70117ab854d-cnibin\") pod \"multus-h4zw5\" (UID: \"6b9e29bb-6e51-47ab-a543-b70117ab854d\") " pod="openshift-multus/multus-h4zw5" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.196538 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dac2aaf5-653b-4b2a-8efe-ed26bac8d648-proxy-tls\") pod \"machine-config-daemon-hjrrb\" (UID: \"dac2aaf5-653b-4b2a-8efe-ed26bac8d648\") " pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.196640 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fd1357a3-36ef-4bc8-85bb-91f3c0f42994-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jv499\" (UID: \"fd1357a3-36ef-4bc8-85bb-91f3c0f42994\") " pod="openshift-multus/multus-additional-cni-plugins-jv499" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.196732 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdw92\" (UniqueName: \"kubernetes.io/projected/6b9e29bb-6e51-47ab-a543-b70117ab854d-kube-api-access-tdw92\") pod \"multus-h4zw5\" (UID: \"6b9e29bb-6e51-47ab-a543-b70117ab854d\") " pod="openshift-multus/multus-h4zw5" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.196829 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/dac2aaf5-653b-4b2a-8efe-ed26bac8d648-rootfs\") pod \"machine-config-daemon-hjrrb\" (UID: \"dac2aaf5-653b-4b2a-8efe-ed26bac8d648\") " pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.196955 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6b9e29bb-6e51-47ab-a543-b70117ab854d-multus-conf-dir\") pod 
\"multus-h4zw5\" (UID: \"6b9e29bb-6e51-47ab-a543-b70117ab854d\") " pod="openshift-multus/multus-h4zw5" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.197041 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6b9e29bb-6e51-47ab-a543-b70117ab854d-system-cni-dir\") pod \"multus-h4zw5\" (UID: \"6b9e29bb-6e51-47ab-a543-b70117ab854d\") " pod="openshift-multus/multus-h4zw5" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.197118 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6b9e29bb-6e51-47ab-a543-b70117ab854d-host-var-lib-cni-bin\") pod \"multus-h4zw5\" (UID: \"6b9e29bb-6e51-47ab-a543-b70117ab854d\") " pod="openshift-multus/multus-h4zw5" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.197152 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fd1357a3-36ef-4bc8-85bb-91f3c0f42994-system-cni-dir\") pod \"multus-additional-cni-plugins-jv499\" (UID: \"fd1357a3-36ef-4bc8-85bb-91f3c0f42994\") " pod="openshift-multus/multus-additional-cni-plugins-jv499" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.197238 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6b9e29bb-6e51-47ab-a543-b70117ab854d-hostroot\") pod \"multus-h4zw5\" (UID: \"6b9e29bb-6e51-47ab-a543-b70117ab854d\") " pod="openshift-multus/multus-h4zw5" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.197308 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6b9e29bb-6e51-47ab-a543-b70117ab854d-host-run-multus-certs\") pod \"multus-h4zw5\" (UID: \"6b9e29bb-6e51-47ab-a543-b70117ab854d\") " pod="openshift-multus/multus-h4zw5" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.197343 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/fd1357a3-36ef-4bc8-85bb-91f3c0f42994-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jv499\" (UID: \"fd1357a3-36ef-4bc8-85bb-91f3c0f42994\") " pod="openshift-multus/multus-additional-cni-plugins-jv499" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.197420 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2lh2\" (UniqueName: \"kubernetes.io/projected/fd1357a3-36ef-4bc8-85bb-91f3c0f42994-kube-api-access-m2lh2\") pod \"multus-additional-cni-plugins-jv499\" (UID: \"fd1357a3-36ef-4bc8-85bb-91f3c0f42994\") " pod="openshift-multus/multus-additional-cni-plugins-jv499" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.197524 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6b9e29bb-6e51-47ab-a543-b70117ab854d-host-var-lib-kubelet\") pod \"multus-h4zw5\" (UID: \"6b9e29bb-6e51-47ab-a543-b70117ab854d\") " pod="openshift-multus/multus-h4zw5" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.197603 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/6b9e29bb-6e51-47ab-a543-b70117ab854d-multus-cni-dir\") pod \"multus-h4zw5\" (UID: \"6b9e29bb-6e51-47ab-a543-b70117ab854d\") " pod="openshift-multus/multus-h4zw5" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.197683 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6b9e29bb-6e51-47ab-a543-b70117ab854d-os-release\") pod \"multus-h4zw5\" (UID: \"6b9e29bb-6e51-47ab-a543-b70117ab854d\") " pod="openshift-multus/multus-h4zw5" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.221943 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h4zw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b9e29bb-6e51-47ab-a543-b70117ab854d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdw92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h4zw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:21Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.237694 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.237731 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.237743 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.237759 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.237771 4722 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:21Z","lastTransitionTime":"2026-03-09T14:04:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.241956 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee2ed55248b177bc0873ef3c41517401fdb30fc5ce5a545a1ca8c925e4d4e9e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:21Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.255291 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jv499" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd1357a3-36ef-4bc8-85bb-91f3c0f42994\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with 
incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd36
7c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jv499\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:21Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.265315 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dac2aaf5-653b-4b2a-8efe-ed26bac8d648\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q6db\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q6db\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hjrrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:21Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.283762 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2267a4d7-87b4-43f7-80c7-7a7ffb0327fc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04169d9b8359e26cc9807e27fb9a5d54ca0946aaade294c5bcc272a7d21c4db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09fa62e7293a50fb3f6bdaf37061cbe5e95a830c8f7337660ce73138dbcaa641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31ba786781fcdb781f5fa45276e29e94756863d2398109d91e81a52f6f88bee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\
\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dff721159b382bcfdde6f406a328ef2ee9264b35a8d8baeb7a3c6d31a60ff3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63769aca40f721bec0716427145a3ceea0595836c03d31c98f6d45ff14de68be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6625c60c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6625c60
c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:21Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.297348 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0965e24b-526e-4842-ac1f-eca7a765355d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56c76a5e8718243a17d8d9fd9a8dcf6083c2ccb65b4816926dca08a89e465728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3836660bf646052991965270cd69b8b8237b5c4bd698a73789b050d23e6877\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dbef38ff5e714c92fc8a86f1e3c64adf606cb198145f884a000f8dee200fcf2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T14:03:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 14:03:46.681607 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 14:03:46.681849 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 14:03:46.682597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2315154617/tls.crt::/tmp/serving-cert-2315154617/tls.key\\\\\\\"\\\\nI0309 14:03:46.866095 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 14:03:46.869876 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 14:03:46.869896 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 14:03:46.870280 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 14:03:46.870293 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 14:03:46.877096 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 14:03:46.877117 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 14:03:46.877121 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877135 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 14:03:46.877139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 14:03:46.877143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 14:03:46.877146 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 14:03:46.878419 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:03:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fbabc5ec2c100150ce886c06c3bd78cc5896c8a06ac9d262eeb552a8a8bb5d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:21Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.299307 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6b9e29bb-6e51-47ab-a543-b70117ab854d-host-var-lib-kubelet\") pod \"multus-h4zw5\" (UID: \"6b9e29bb-6e51-47ab-a543-b70117ab854d\") " pod="openshift-multus/multus-h4zw5" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.299531 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6b9e29bb-6e51-47ab-a543-b70117ab854d-os-release\") pod \"multus-h4zw5\" (UID: \"6b9e29bb-6e51-47ab-a543-b70117ab854d\") " pod="openshift-multus/multus-h4zw5" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.299566 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6b9e29bb-6e51-47ab-a543-b70117ab854d-multus-cni-dir\") pod \"multus-h4zw5\" (UID: 
\"6b9e29bb-6e51-47ab-a543-b70117ab854d\") " pod="openshift-multus/multus-h4zw5" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.299594 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6b9e29bb-6e51-47ab-a543-b70117ab854d-multus-daemon-config\") pod \"multus-h4zw5\" (UID: \"6b9e29bb-6e51-47ab-a543-b70117ab854d\") " pod="openshift-multus/multus-h4zw5" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.299621 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6b9e29bb-6e51-47ab-a543-b70117ab854d-host-run-k8s-cni-cncf-io\") pod \"multus-h4zw5\" (UID: \"6b9e29bb-6e51-47ab-a543-b70117ab854d\") " pod="openshift-multus/multus-h4zw5" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.299653 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6b9e29bb-6e51-47ab-a543-b70117ab854d-host-var-lib-cni-multus\") pod \"multus-h4zw5\" (UID: \"6b9e29bb-6e51-47ab-a543-b70117ab854d\") " pod="openshift-multus/multus-h4zw5" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.299682 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6b9e29bb-6e51-47ab-a543-b70117ab854d-multus-socket-dir-parent\") pod \"multus-h4zw5\" (UID: \"6b9e29bb-6e51-47ab-a543-b70117ab854d\") " pod="openshift-multus/multus-h4zw5" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.299424 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6b9e29bb-6e51-47ab-a543-b70117ab854d-host-var-lib-kubelet\") pod \"multus-h4zw5\" (UID: \"6b9e29bb-6e51-47ab-a543-b70117ab854d\") " pod="openshift-multus/multus-h4zw5" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.300017 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6b9e29bb-6e51-47ab-a543-b70117ab854d-host-run-k8s-cni-cncf-io\") pod \"multus-h4zw5\" (UID: \"6b9e29bb-6e51-47ab-a543-b70117ab854d\") " pod="openshift-multus/multus-h4zw5" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.300004 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fd1357a3-36ef-4bc8-85bb-91f3c0f42994-os-release\") pod \"multus-additional-cni-plugins-jv499\" (UID: \"fd1357a3-36ef-4bc8-85bb-91f3c0f42994\") " pod="openshift-multus/multus-additional-cni-plugins-jv499" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.300342 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6b9e29bb-6e51-47ab-a543-b70117ab854d-host-var-lib-cni-multus\") pod \"multus-h4zw5\" (UID: \"6b9e29bb-6e51-47ab-a543-b70117ab854d\") " pod="openshift-multus/multus-h4zw5" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.300382 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6b9e29bb-6e51-47ab-a543-b70117ab854d-os-release\") pod \"multus-h4zw5\" (UID: \"6b9e29bb-6e51-47ab-a543-b70117ab854d\") " pod="openshift-multus/multus-h4zw5" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.300457 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6b9e29bb-6e51-47ab-a543-b70117ab854d-host-run-netns\") pod \"multus-h4zw5\" (UID: \"6b9e29bb-6e51-47ab-a543-b70117ab854d\") " pod="openshift-multus/multus-h4zw5" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.300405 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6b9e29bb-6e51-47ab-a543-b70117ab854d-host-run-netns\") pod \"multus-h4zw5\" (UID: \"6b9e29bb-6e51-47ab-a543-b70117ab854d\") " pod="openshift-multus/multus-h4zw5" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.300517 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5q6db\" (UniqueName: \"kubernetes.io/projected/dac2aaf5-653b-4b2a-8efe-ed26bac8d648-kube-api-access-5q6db\") pod \"machine-config-daemon-hjrrb\" (UID: \"dac2aaf5-653b-4b2a-8efe-ed26bac8d648\") " pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.300520 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6b9e29bb-6e51-47ab-a543-b70117ab854d-multus-cni-dir\") pod \"multus-h4zw5\" (UID: \"6b9e29bb-6e51-47ab-a543-b70117ab854d\") " pod="openshift-multus/multus-h4zw5" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.300574 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/fd1357a3-36ef-4bc8-85bb-91f3c0f42994-os-release\") pod \"multus-additional-cni-plugins-jv499\" (UID: \"fd1357a3-36ef-4bc8-85bb-91f3c0f42994\") " pod="openshift-multus/multus-additional-cni-plugins-jv499" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.300465 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6b9e29bb-6e51-47ab-a543-b70117ab854d-multus-socket-dir-parent\") pod \"multus-h4zw5\" (UID: \"6b9e29bb-6e51-47ab-a543-b70117ab854d\") " pod="openshift-multus/multus-h4zw5" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.301277 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dac2aaf5-653b-4b2a-8efe-ed26bac8d648-mcd-auth-proxy-config\") pod \"machine-config-daemon-hjrrb\" (UID: \"dac2aaf5-653b-4b2a-8efe-ed26bac8d648\") " pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.301673 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fd1357a3-36ef-4bc8-85bb-91f3c0f42994-cni-binary-copy\") pod \"multus-additional-cni-plugins-jv499\" (UID: \"fd1357a3-36ef-4bc8-85bb-91f3c0f42994\") " pod="openshift-multus/multus-additional-cni-plugins-jv499" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.303406 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6b9e29bb-6e51-47ab-a543-b70117ab854d-cni-binary-copy\") pod \"multus-h4zw5\" (UID: \"6b9e29bb-6e51-47ab-a543-b70117ab854d\") " pod="openshift-multus/multus-h4zw5" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.303442 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6b9e29bb-6e51-47ab-a543-b70117ab854d-etc-kubernetes\") pod \"multus-h4zw5\" (UID: \"6b9e29bb-6e51-47ab-a543-b70117ab854d\") " pod="openshift-multus/multus-h4zw5" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.303462 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fd1357a3-36ef-4bc8-85bb-91f3c0f42994-cnibin\") pod \"multus-additional-cni-plugins-jv499\" (UID: \"fd1357a3-36ef-4bc8-85bb-91f3c0f42994\") " pod="openshift-multus/multus-additional-cni-plugins-jv499" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.303504 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6b9e29bb-6e51-47ab-a543-b70117ab854d-cnibin\") pod \"multus-h4zw5\" (UID: \"6b9e29bb-6e51-47ab-a543-b70117ab854d\") " pod="openshift-multus/multus-h4zw5" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.303525 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dac2aaf5-653b-4b2a-8efe-ed26bac8d648-proxy-tls\") pod \"machine-config-daemon-hjrrb\" (UID: \"dac2aaf5-653b-4b2a-8efe-ed26bac8d648\") " pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.303545 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fd1357a3-36ef-4bc8-85bb-91f3c0f42994-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jv499\" (UID: \"fd1357a3-36ef-4bc8-85bb-91f3c0f42994\") " pod="openshift-multus/multus-additional-cni-plugins-jv499" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.303577 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdw92\" (UniqueName: \"kubernetes.io/projected/6b9e29bb-6e51-47ab-a543-b70117ab854d-kube-api-access-tdw92\") pod \"multus-h4zw5\" (UID: \"6b9e29bb-6e51-47ab-a543-b70117ab854d\") " pod="openshift-multus/multus-h4zw5" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.303599 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/dac2aaf5-653b-4b2a-8efe-ed26bac8d648-rootfs\") pod \"machine-config-daemon-hjrrb\" (UID: \"dac2aaf5-653b-4b2a-8efe-ed26bac8d648\") " pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.303617 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6b9e29bb-6e51-47ab-a543-b70117ab854d-cnibin\") pod \"multus-h4zw5\" (UID: \"6b9e29bb-6e51-47ab-a543-b70117ab854d\") " pod="openshift-multus/multus-h4zw5" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.303622 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6b9e29bb-6e51-47ab-a543-b70117ab854d-multus-conf-dir\") pod \"multus-h4zw5\" (UID: \"6b9e29bb-6e51-47ab-a543-b70117ab854d\") " pod="openshift-multus/multus-h4zw5" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.303596 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6b9e29bb-6e51-47ab-a543-b70117ab854d-etc-kubernetes\") pod \"multus-h4zw5\" (UID: 
\"6b9e29bb-6e51-47ab-a543-b70117ab854d\") " pod="openshift-multus/multus-h4zw5" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.303681 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/fd1357a3-36ef-4bc8-85bb-91f3c0f42994-cnibin\") pod \"multus-additional-cni-plugins-jv499\" (UID: \"fd1357a3-36ef-4bc8-85bb-91f3c0f42994\") " pod="openshift-multus/multus-additional-cni-plugins-jv499" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.303787 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/dac2aaf5-653b-4b2a-8efe-ed26bac8d648-rootfs\") pod \"machine-config-daemon-hjrrb\" (UID: \"dac2aaf5-653b-4b2a-8efe-ed26bac8d648\") " pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.303865 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6b9e29bb-6e51-47ab-a543-b70117ab854d-system-cni-dir\") pod \"multus-h4zw5\" (UID: \"6b9e29bb-6e51-47ab-a543-b70117ab854d\") " pod="openshift-multus/multus-h4zw5" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.303888 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6b9e29bb-6e51-47ab-a543-b70117ab854d-host-var-lib-cni-bin\") pod \"multus-h4zw5\" (UID: \"6b9e29bb-6e51-47ab-a543-b70117ab854d\") " pod="openshift-multus/multus-h4zw5" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.303927 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fd1357a3-36ef-4bc8-85bb-91f3c0f42994-system-cni-dir\") pod \"multus-additional-cni-plugins-jv499\" (UID: \"fd1357a3-36ef-4bc8-85bb-91f3c0f42994\") " pod="openshift-multus/multus-additional-cni-plugins-jv499" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.303933 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6b9e29bb-6e51-47ab-a543-b70117ab854d-host-var-lib-cni-bin\") pod \"multus-h4zw5\" (UID: \"6b9e29bb-6e51-47ab-a543-b70117ab854d\") " pod="openshift-multus/multus-h4zw5" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.304613 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/fd1357a3-36ef-4bc8-85bb-91f3c0f42994-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jv499\" (UID: \"fd1357a3-36ef-4bc8-85bb-91f3c0f42994\") " pod="openshift-multus/multus-additional-cni-plugins-jv499" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.304634 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dac2aaf5-653b-4b2a-8efe-ed26bac8d648-mcd-auth-proxy-config\") pod \"machine-config-daemon-hjrrb\" (UID: \"dac2aaf5-653b-4b2a-8efe-ed26bac8d648\") " pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.304668 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/fd1357a3-36ef-4bc8-85bb-91f3c0f42994-system-cni-dir\") pod \"multus-additional-cni-plugins-jv499\" (UID: \"fd1357a3-36ef-4bc8-85bb-91f3c0f42994\") " 
pod="openshift-multus/multus-additional-cni-plugins-jv499" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.305000 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6b9e29bb-6e51-47ab-a543-b70117ab854d-multus-conf-dir\") pod \"multus-h4zw5\" (UID: \"6b9e29bb-6e51-47ab-a543-b70117ab854d\") " pod="openshift-multus/multus-h4zw5" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.305145 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/fd1357a3-36ef-4bc8-85bb-91f3c0f42994-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jv499\" (UID: \"fd1357a3-36ef-4bc8-85bb-91f3c0f42994\") " pod="openshift-multus/multus-additional-cni-plugins-jv499" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.305182 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6b9e29bb-6e51-47ab-a543-b70117ab854d-cni-binary-copy\") pod \"multus-h4zw5\" (UID: \"6b9e29bb-6e51-47ab-a543-b70117ab854d\") " pod="openshift-multus/multus-h4zw5" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.305477 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6b9e29bb-6e51-47ab-a543-b70117ab854d-system-cni-dir\") pod \"multus-h4zw5\" (UID: \"6b9e29bb-6e51-47ab-a543-b70117ab854d\") " pod="openshift-multus/multus-h4zw5" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.305627 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2lh2\" (UniqueName: \"kubernetes.io/projected/fd1357a3-36ef-4bc8-85bb-91f3c0f42994-kube-api-access-m2lh2\") pod \"multus-additional-cni-plugins-jv499\" (UID: \"fd1357a3-36ef-4bc8-85bb-91f3c0f42994\") " pod="openshift-multus/multus-additional-cni-plugins-jv499" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.306366 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/fd1357a3-36ef-4bc8-85bb-91f3c0f42994-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jv499\" (UID: \"fd1357a3-36ef-4bc8-85bb-91f3c0f42994\") " pod="openshift-multus/multus-additional-cni-plugins-jv499" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.306955 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6b9e29bb-6e51-47ab-a543-b70117ab854d-hostroot\") pod \"multus-h4zw5\" (UID: \"6b9e29bb-6e51-47ab-a543-b70117ab854d\") " pod="openshift-multus/multus-h4zw5" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.306997 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6b9e29bb-6e51-47ab-a543-b70117ab854d-host-run-multus-certs\") pod \"multus-h4zw5\" (UID: \"6b9e29bb-6e51-47ab-a543-b70117ab854d\") " pod="openshift-multus/multus-h4zw5" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.307062 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/fd1357a3-36ef-4bc8-85bb-91f3c0f42994-cni-binary-copy\") pod \"multus-additional-cni-plugins-jv499\" (UID: \"fd1357a3-36ef-4bc8-85bb-91f3c0f42994\") " pod="openshift-multus/multus-additional-cni-plugins-jv499" Mar 09 14:04:21 crc 
kubenswrapper[4722]: I0309 14:04:21.307451 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6b9e29bb-6e51-47ab-a543-b70117ab854d-hostroot\") pod \"multus-h4zw5\" (UID: \"6b9e29bb-6e51-47ab-a543-b70117ab854d\") " pod="openshift-multus/multus-h4zw5" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.307519 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6b9e29bb-6e51-47ab-a543-b70117ab854d-host-run-multus-certs\") pod \"multus-h4zw5\" (UID: \"6b9e29bb-6e51-47ab-a543-b70117ab854d\") " pod="openshift-multus/multus-h4zw5" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.312130 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dac2aaf5-653b-4b2a-8efe-ed26bac8d648-proxy-tls\") pod \"machine-config-daemon-hjrrb\" (UID: \"dac2aaf5-653b-4b2a-8efe-ed26bac8d648\") " pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.314836 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:21Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.319412 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5q6db\" (UniqueName: \"kubernetes.io/projected/dac2aaf5-653b-4b2a-8efe-ed26bac8d648-kube-api-access-5q6db\") pod \"machine-config-daemon-hjrrb\" (UID: \"dac2aaf5-653b-4b2a-8efe-ed26bac8d648\") " pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.325236 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdw92\" (UniqueName: \"kubernetes.io/projected/6b9e29bb-6e51-47ab-a543-b70117ab854d-kube-api-access-tdw92\") pod \"multus-h4zw5\" (UID: \"6b9e29bb-6e51-47ab-a543-b70117ab854d\") " pod="openshift-multus/multus-h4zw5" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.325017 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2lh2\" (UniqueName: \"kubernetes.io/projected/fd1357a3-36ef-4bc8-85bb-91f3c0f42994-kube-api-access-m2lh2\") pod \"multus-additional-cni-plugins-jv499\" (UID: \"fd1357a3-36ef-4bc8-85bb-91f3c0f42994\") " pod="openshift-multus/multus-additional-cni-plugins-jv499" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.326487 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:21Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.335110 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-72xb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43e3c934-6aa7-4c96-a3a6-378f7931fc2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:20Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:20Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lfx7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-72xb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:21Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.340465 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.340501 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.340511 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.340527 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.340537 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:21Z","lastTransitionTime":"2026-03-09T14:04:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.345579 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:21Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.357469 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69e74824fe242fd2d54bed9734f33823697d7651043d9c97bac6ea5490625a0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:21Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.370538 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:21Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.385582 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69e74824fe242fd2d54bed9734f33823697d7651043d9c97bac6ea5490625a0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:21Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.396779 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d69359acfc7a4ee7698088a7e0062b3604a5ceb92910cd3b3d69b1a6291dfdcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://687646beb4c6dd3fb9b659a36449a573f4c389b3e98b07127b18516fdc4f2c93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:21Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.407340 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h4zw5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b9e29bb-6e51-47ab-a543-b70117ab854d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdw92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h4zw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:21Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.418686 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee2ed55248b177bc0873ef3c41517401fdb30fc5ce5a545a1ca8c925e4d4e9e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:21Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.431873 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jv499" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd1357a3-36ef-4bc8-85bb-91f3c0f42994\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plu
gin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jv499\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T14:04:21Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.443655 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.443697 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.443707 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.443723 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.443734 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:21Z","lastTransitionTime":"2026-03-09T14:04:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.452458 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2267a4d7-87b4-43f7-80c7-7a7ffb0327fc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04169d9b8359e26cc9807e27fb9a5d54ca0946aaade294c5bcc272a7d21c4db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09fa62e7293a50fb3f6bdaf37061cbe5e95a830c8f7337660ce73138dbcaa641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31ba786781fcdb781f5fa45276e29e94756863d2398109d91e81a52f6f88bee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dff721159b382bcfdde6f406a328ef2ee9264b35a8d8baeb7a3c6d31a60ff3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63769aca40f721bec0716427145a3ceea0595836c03d31c98f6d45ff14de68be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6625c60c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6625c60c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:21Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.467732 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0965e24b-526e-4842-ac1f-eca7a765355d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56c76a5e8718243a17d8d9fd9a8dcf6083c2ccb65b4816926dca08a89e465728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3836660bf646052991965270cd69b8b8237b5c4bd698a73789b050d23e6877\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dbef38ff5e714c92fc8a86f1e3c64adf606cb198145f884a000f8dee200fcf2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T14:03:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 14:03:46.681607 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 14:03:46.681849 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 14:03:46.682597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2315154617/tls.crt::/tmp/serving-cert-2315154617/tls.key\\\\\\\"\\\\nI0309 14:03:46.866095 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 14:03:46.869876 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 14:03:46.869896 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 14:03:46.870280 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 14:03:46.870293 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 14:03:46.877096 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 14:03:46.877117 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 14:03:46.877121 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877135 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 14:03:46.877139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 14:03:46.877143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 14:03:46.877146 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 14:03:46.878419 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:03:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fbabc5ec2c100150ce886c06c3bd78cc5896c8a06ac9d262eeb552a8a8bb5d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:21Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.485031 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:21Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.502102 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:21Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.513566 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-72xb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43e3c934-6aa7-4c96-a3a6-378f7931fc2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:20Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:20Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lfx7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-72xb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:21Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.526919 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.529637 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dac2aaf5-653b-4b2a-8efe-ed26bac8d648\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q6db\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q6db\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-hjrrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:21Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.533981 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-jv499" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.554319 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.554406 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.554425 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.554453 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.554471 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:21Z","lastTransitionTime":"2026-03-09T14:04:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:21 crc kubenswrapper[4722]: W0309 14:04:21.560429 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddac2aaf5_653b_4b2a_8efe_ed26bac8d648.slice/crio-cbe070366422e5759badbe51bf4beb4e0f157f2149f66945bdecc13ab2d85f61 WatchSource:0}: Error finding container cbe070366422e5759badbe51bf4beb4e0f157f2149f66945bdecc13ab2d85f61: Status 404 returned error can't find the container with id cbe070366422e5759badbe51bf4beb4e0f157f2149f66945bdecc13ab2d85f61 Mar 09 14:04:21 crc kubenswrapper[4722]: W0309 14:04:21.561767 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd1357a3_36ef_4bc8_85bb_91f3c0f42994.slice/crio-61403f6f3b8b3177e29f6e0a7f7052e8dd01e107d5a31c52d7d20051a0c5618f WatchSource:0}: Error finding container 61403f6f3b8b3177e29f6e0a7f7052e8dd01e107d5a31c52d7d20051a0c5618f: Status 404 returned error can't find the container with id 61403f6f3b8b3177e29f6e0a7f7052e8dd01e107d5a31c52d7d20051a0c5618f Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.562664 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-5v7ng"] Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.563539 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng"
Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.565233 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.567587 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.567672 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.568139 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.569714 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.569849 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.570968 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.579631 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dac2aaf5-653b-4b2a-8efe-ed26bac8d648\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q6db\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q6db\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hjrrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:21Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.586404 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-72xb4" event={"ID":"43e3c934-6aa7-4c96-a3a6-378f7931fc2a","Type":"ContainerStarted","Data":"bcf5e49245fd6051b12465a36b27f657b4923a62f88d61fe0fac971c4a2c9197"} Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.586478 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-72xb4" event={"ID":"43e3c934-6aa7-4c96-a3a6-378f7931fc2a","Type":"ContainerStarted","Data":"752a7ac46d4bb48a862b014dfcf2a81ff179320686ab7b6817faf42515fd9a0c"} Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.589246 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jv499" event={"ID":"fd1357a3-36ef-4bc8-85bb-91f3c0f42994","Type":"ContainerStarted","Data":"61403f6f3b8b3177e29f6e0a7f7052e8dd01e107d5a31c52d7d20051a0c5618f"} Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.590345 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" event={"ID":"dac2aaf5-653b-4b2a-8efe-ed26bac8d648","Type":"ContainerStarted","Data":"cbe070366422e5759badbe51bf4beb4e0f157f2149f66945bdecc13ab2d85f61"} Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.610396 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-run-ovn\") pod \"ovnkube-node-5v7ng\" (UID: \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\") " pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.610452 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-host-cni-netd\") pod \"ovnkube-node-5v7ng\" (UID: \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\") " pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.610495 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-log-socket\") pod \"ovnkube-node-5v7ng\" (UID: \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\") " pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.610547 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-run-systemd\") pod \"ovnkube-node-5v7ng\" (UID: \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\") " pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.610719 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-host-slash\") pod \"ovnkube-node-5v7ng\" (UID: \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\") " pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.610800 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-host-kubelet\") pod \"ovnkube-node-5v7ng\" (UID: \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\") " pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.610827 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-host-cni-bin\") pod \"ovnkube-node-5v7ng\" (UID: \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\") " pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.610856 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-ovnkube-config\") pod \"ovnkube-node-5v7ng\" (UID: \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\") " pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.610897 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-etc-openvswitch\") pod \"ovnkube-node-5v7ng\" (UID: \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\") " pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.610925 
4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-ovnkube-script-lib\") pod \"ovnkube-node-5v7ng\" (UID: \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\") " pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.610967 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-host-run-ovn-kubernetes\") pod \"ovnkube-node-5v7ng\" (UID: \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\") " pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.611004 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-node-log\") pod \"ovnkube-node-5v7ng\" (UID: \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\") " pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.611027 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5v7ng\" (UID: \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\") " pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.611053 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6nlp\" (UniqueName: \"kubernetes.io/projected/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-kube-api-access-v6nlp\") pod \"ovnkube-node-5v7ng\" (UID: \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\") " pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.611186 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-run-openvswitch\") pod \"ovnkube-node-5v7ng\" (UID: \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\") " pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.611286 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-var-lib-openvswitch\") pod \"ovnkube-node-5v7ng\" (UID: \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\") " pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.611337 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-host-run-netns\") pod \"ovnkube-node-5v7ng\" (UID: \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\") " pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.611375 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-systemd-units\") pod \"ovnkube-node-5v7ng\" 
(UID: \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\") " pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.611419 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-env-overrides\") pod \"ovnkube-node-5v7ng\" (UID: \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\") " pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.611454 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-ovn-node-metrics-cert\") pod \"ovnkube-node-5v7ng\" (UID: \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\") " pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.614448 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2267a4d7-87b4-43f7-80c7-7a7ffb0327fc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04169d9b8359e26cc9807e27fb9a5d54ca0946aaade294c5bcc272a7d21c4db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09fa62e7293a50fb3f6bdaf37061cbe5e95a830c8f7337660ce73138dbcaa641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mount
Path\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31ba786781fcdb781f5fa45276e29e94756863d2398109d91e81a52f6f88bee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dff721159b382bcfdde6f406a328ef2ee9264b35a8d8baeb7a3c6d31a60ff3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63769aca40f721bec0716427145a3ceea0595836c03d31c98f6d45ff14de68be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026
-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6625c60c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6625c60c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:21Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.635767 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0965e24b-526e-4842-ac1f-eca7a765355d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56c76a5e8718243a17d8d9fd9a8dcf6083c2ccb65b4816926dca08a89e465728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3836660bf646052991965270cd69b8b8237b5c4bd698a73789b050d23e6877\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dbef38ff5e714c92fc8a86f1e3c64adf606cb198145f884a000f8dee200fcf2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T14:03:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 14:03:46.681607 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 14:03:46.681849 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 14:03:46.682597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2315154617/tls.crt::/tmp/serving-cert-2315154617/tls.key\\\\\\\"\\\\nI0309 14:03:46.866095 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 14:03:46.869876 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 14:03:46.869896 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 14:03:46.870280 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 14:03:46.870293 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 14:03:46.877096 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 14:03:46.877117 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 14:03:46.877121 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877135 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 14:03:46.877139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 14:03:46.877143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 14:03:46.877146 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 14:03:46.878419 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:03:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fbabc5ec2c100150ce886c06c3bd78cc5896c8a06ac9d262eeb552a8a8bb5d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:21Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.647622 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:21Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.662323 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.662392 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.662412 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.662439 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.662457 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:21Z","lastTransitionTime":"2026-03-09T14:04:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.664829 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:21Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.677739 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-72xb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43e3c934-6aa7-4c96-a3a6-378f7931fc2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:20Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:20Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lfx7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-72xb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:21Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.694074 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:21Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.706996 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69e74824fe242fd2d54bed9734f33823697d7651043d9c97bac6ea5490625a0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:21Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.712505 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-host-kubelet\") pod \"ovnkube-node-5v7ng\" (UID: 
\"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\") " pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.712557 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-host-cni-bin\") pod \"ovnkube-node-5v7ng\" (UID: \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\") " pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.712586 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-ovnkube-config\") pod \"ovnkube-node-5v7ng\" (UID: \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\") " pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.712608 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-ovnkube-script-lib\") pod \"ovnkube-node-5v7ng\" (UID: \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\") " pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.712637 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-etc-openvswitch\") pod \"ovnkube-node-5v7ng\" (UID: \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\") " pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.712653 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-host-cni-bin\") pod \"ovnkube-node-5v7ng\" (UID: \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\") " pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.712672 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-host-run-ovn-kubernetes\") pod \"ovnkube-node-5v7ng\" (UID: \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\") " pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.712695 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6nlp\" (UniqueName: \"kubernetes.io/projected/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-kube-api-access-v6nlp\") pod \"ovnkube-node-5v7ng\" (UID: \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\") " pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.712651 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-host-kubelet\") pod \"ovnkube-node-5v7ng\" (UID: \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\") " pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.712729 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-node-log\") pod \"ovnkube-node-5v7ng\" (UID: \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\") " pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" 
Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.712745 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-etc-openvswitch\") pod \"ovnkube-node-5v7ng\" (UID: \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\") " pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng"
Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.712753 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5v7ng\" (UID: \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\") " pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng"
Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.712782 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-run-openvswitch\") pod \"ovnkube-node-5v7ng\" (UID: \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\") " pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng"
Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.712808 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-var-lib-openvswitch\") pod \"ovnkube-node-5v7ng\" (UID: \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\") " pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng"
Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.712830 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-host-run-netns\") pod \"ovnkube-node-5v7ng\" (UID: \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\") " pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng"
Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.712853 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-systemd-units\") pod \"ovnkube-node-5v7ng\" (UID: \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\") " pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng"
Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.712875 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-env-overrides\") pod \"ovnkube-node-5v7ng\" (UID: \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\") " pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng"
Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.712896 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-ovn-node-metrics-cert\") pod \"ovnkube-node-5v7ng\" (UID: \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\") " pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng"
Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.712922 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-run-ovn\") pod \"ovnkube-node-5v7ng\" (UID: \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\") " pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng"
Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.712943 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-host-cni-netd\") pod \"ovnkube-node-5v7ng\" (UID: \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\") " pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng"
Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.712966 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-log-socket\") pod \"ovnkube-node-5v7ng\" (UID: \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\") " pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng"
Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.713004 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-run-systemd\") pod \"ovnkube-node-5v7ng\" (UID: \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\") " pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng"
Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.713054 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-host-slash\") pod \"ovnkube-node-5v7ng\" (UID: \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\") " pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng"
Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.713133 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-host-slash\") pod \"ovnkube-node-5v7ng\" (UID: \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\") " pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng"
Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.713131 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-host-run-ovn-kubernetes\") pod \"ovnkube-node-5v7ng\" (UID: \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\") " pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng"
Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.713274 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-ovnkube-script-lib\") pod \"ovnkube-node-5v7ng\" (UID: \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\") " pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng"
Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.713334 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-run-openvswitch\") pod \"ovnkube-node-5v7ng\" (UID: \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\") " pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng"
Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.713363 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-node-log\") pod \"ovnkube-node-5v7ng\" (UID: \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\") " pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng"
Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.713389 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5v7ng\" (UID: \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\") " pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng"
Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.713392 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-ovnkube-config\") pod \"ovnkube-node-5v7ng\" (UID: \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\") " pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng"
Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.713418 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-log-socket\") pod \"ovnkube-node-5v7ng\" (UID: \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\") " pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng"
Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.713444 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-run-ovn\") pod \"ovnkube-node-5v7ng\" (UID: \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\") " pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng"
Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.713465 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-host-cni-netd\") pod \"ovnkube-node-5v7ng\" (UID: \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\") " pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng"
Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.713485 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-run-systemd\") pod \"ovnkube-node-5v7ng\" (UID: \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\") " pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng"
Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.713489 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-host-run-netns\") pod \"ovnkube-node-5v7ng\" (UID: \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\") " pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng"
Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.713768 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-env-overrides\") pod \"ovnkube-node-5v7ng\" (UID: \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\") " pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng"
Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.713773 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-systemd-units\") pod \"ovnkube-node-5v7ng\" (UID: \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\") " pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng"
Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.713859 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-var-lib-openvswitch\") pod \"ovnkube-node-5v7ng\" (UID: \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.716081 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-ovn-node-metrics-cert\") pod \"ovnkube-node-5v7ng\" (UID: \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\") " pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.725553 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d69359acfc7a4ee7698088a7e0062b3604a5ceb92910cd3b3d69b1a6291dfdcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://687646beb4c6dd3fb9b659a36449a573f4c389b3e98b07127b18516fdc4f2c93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:21Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.731676 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6nlp\" (UniqueName: \"kubernetes.io/projected/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-kube-api-access-v6nlp\") pod \"ovnkube-node-5v7ng\" (UID: \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\") " pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.739271 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h4zw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b9e29bb-6e51-47ab-a543-b70117ab854d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdw92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h4zw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:21Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.757974 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5v7ng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:21Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.770360 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.770467 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.770487 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.770515 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.770533 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:21Z","lastTransitionTime":"2026-03-09T14:04:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.775032 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee2ed55248b177bc0873ef3c41517401fdb30fc5ce5a545a1ca8c925e4d4e9e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:21Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.789175 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jv499" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd1357a3-36ef-4bc8-85bb-91f3c0f42994\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plu
gin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jv499\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T14:04:21Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.802563 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69e74824fe242fd2d54bed9734f33823697d7651043d9c97bac6ea5490625a0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:21Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.816827 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:21Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.830238 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d69359acfc7a4ee7698088a7e0062b3604a5ceb92910cd3b3d69b1a6291dfdcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://687646beb4c6dd3fb9b659a36449a573f4c389b3e98b07127b18516fdc4f2c93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:21Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.842906 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h4zw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b9e29bb-6e51-47ab-a543-b70117ab854d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdw92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h4zw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:21Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.863407 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5v7ng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:21Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.873586 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.873633 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.873643 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.873660 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.873673 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:21Z","lastTransitionTime":"2026-03-09T14:04:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.876404 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee2ed55248b177bc0873ef3c41517401fdb30fc5ce5a545a1ca8c925e4d4e9e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:21Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.877662 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.889420 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jv499" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd1357a3-36ef-4bc8-85bb-91f3c0f42994\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\"
,\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\
\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jv499\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:21Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:21 crc kubenswrapper[4722]: W0309 14:04:21.889489 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e305619_b3a2_44c9_9e54_e1afa4f43dbf.slice/crio-d2f6057c8c3de0bebcb5e88e9ed6dd935f0fc52db361546f77994d3be96ebf04 WatchSource:0}: Error finding container d2f6057c8c3de0bebcb5e88e9ed6dd935f0fc52db361546f77994d3be96ebf04: Status 404 returned error can't find the container with id d2f6057c8c3de0bebcb5e88e9ed6dd935f0fc52db361546f77994d3be96ebf04 Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.902027 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:21Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.915443 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:21Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.925132 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-72xb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43e3c934-6aa7-4c96-a3a6-378f7931fc2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcf5e49245fd6051b12465a36b27f657b4923a62f88d61fe0fac971c4a2c9197\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lfx7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-72xb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:21Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.939885 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dac2aaf5-653b-4b2a-8efe-ed26bac8d648\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q6db\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q6db\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hjrrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:21Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.957071 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2267a4d7-87b4-43f7-80c7-7a7ffb0327fc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04169d9b8359e26cc9807e27fb9a5d54ca0946aaade294c5bcc272a7d21c4db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09fa62e7293a50fb3f6bdaf37061cbe5e95a830c8f7337660ce73138dbcaa641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31ba786781fcdb781f5fa45276e29e94756863d2398109d91e81a52f6f88bee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"start
edAt\\\":\\\"2026-03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dff721159b382bcfdde6f406a328ef2ee9264b35a8d8baeb7a3c6d31a60ff3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63769aca40f721bec0716427145a3ceea0595836c03d31c98f6d45ff14de68be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6625c60c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6625c60c717ffc2b85b7
0374bc50dfae41b04d6362db0c7daf387f31c25862e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:21Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.975267 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0965e24b-526e-4842-ac1f-eca7a765355d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56c76a5e8718243a17d8d9fd9a8dcf6083c2ccb65b4816926dca08a89e465728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3836660bf646052991965270cd69b8b8237b5c4bd698a73789b050d23e6877\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dbef38ff5e714c92fc8a86f1e3c64adf606cb198145f884a000f8dee200fcf2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T14:03:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 14:03:46.681607 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 14:03:46.681849 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 14:03:46.682597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2315154617/tls.crt::/tmp/serving-cert-2315154617/tls.key\\\\\\\"\\\\nI0309 14:03:46.866095 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 14:03:46.869876 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 14:03:46.869896 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 14:03:46.870280 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 14:03:46.870293 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 14:03:46.877096 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 14:03:46.877117 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 14:03:46.877121 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877135 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 14:03:46.877139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 14:03:46.877143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 14:03:46.877146 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 14:03:46.878419 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:03:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fbabc5ec2c100150ce886c06c3bd78cc5896c8a06ac9d262eeb552a8a8bb5d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:21Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.976786 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.976827 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.976842 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.976861 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:21 crc kubenswrapper[4722]: I0309 14:04:21.976872 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:21Z","lastTransitionTime":"2026-03-09T14:04:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.080008 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.080050 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.080061 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.080076 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.080088 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:22Z","lastTransitionTime":"2026-03-09T14:04:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.148514 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.148606 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.148627 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 14:04:22 crc kubenswrapper[4722]: E0309 14:04:22.148668 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 14:04:22 crc kubenswrapper[4722]: E0309 14:04:22.148769 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 14:04:22 crc kubenswrapper[4722]: E0309 14:04:22.148951 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.182565 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.182818 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.182845 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.182863 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.182883 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:22Z","lastTransitionTime":"2026-03-09T14:04:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.290600 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.290673 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.290696 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.290726 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.290750 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:22Z","lastTransitionTime":"2026-03-09T14:04:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:22 crc kubenswrapper[4722]: E0309 14:04:22.300903 4722 configmap.go:193] Couldn't get configMap openshift-multus/multus-daemon-config: failed to sync configmap cache: timed out waiting for the condition Mar 09 14:04:22 crc kubenswrapper[4722]: E0309 14:04:22.301019 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6b9e29bb-6e51-47ab-a543-b70117ab854d-multus-daemon-config podName:6b9e29bb-6e51-47ab-a543-b70117ab854d nodeName:}" failed. No retries permitted until 2026-03-09 14:04:22.800990105 +0000 UTC m=+103.356558701 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "multus-daemon-config" (UniqueName: "kubernetes.io/configmap/6b9e29bb-6e51-47ab-a543-b70117ab854d-multus-daemon-config") pod "multus-h4zw5" (UID: "6b9e29bb-6e51-47ab-a543-b70117ab854d") : failed to sync configmap cache: timed out waiting for the condition Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.393093 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.393363 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.393374 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.393388 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.393399 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:22Z","lastTransitionTime":"2026-03-09T14:04:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.496518 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.497125 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.497148 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.497181 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.497241 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:22Z","lastTransitionTime":"2026-03-09T14:04:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.594658 4722 generic.go:334] "Generic (PLEG): container finished" podID="2e305619-b3a2-44c9-9e54-e1afa4f43dbf" containerID="bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75" exitCode=0 Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.594738 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" event={"ID":"2e305619-b3a2-44c9-9e54-e1afa4f43dbf","Type":"ContainerDied","Data":"bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75"} Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.594770 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" event={"ID":"2e305619-b3a2-44c9-9e54-e1afa4f43dbf","Type":"ContainerStarted","Data":"d2f6057c8c3de0bebcb5e88e9ed6dd935f0fc52db361546f77994d3be96ebf04"} Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.597074 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" event={"ID":"dac2aaf5-653b-4b2a-8efe-ed26bac8d648","Type":"ContainerStarted","Data":"bf5b268d6a355ae3a2c40981769412124b85cdfd99364ed58e5163a744f5f83c"} Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.597108 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" event={"ID":"dac2aaf5-653b-4b2a-8efe-ed26bac8d648","Type":"ContainerStarted","Data":"9f451493fa6d86f12c1291c0fe262bcbe4b62cef45c8dda696bb48fa3ded2eeb"} Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.599437 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.599481 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.599499 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.599520 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.599537 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:22Z","lastTransitionTime":"2026-03-09T14:04:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.599798 4722 generic.go:334] "Generic (PLEG): container finished" podID="fd1357a3-36ef-4bc8-85bb-91f3c0f42994" containerID="4e56992f54f9fc4387130b74e3aa0f4652a5d6d960da1a5305b8668b8086fae2" exitCode=0 Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.599831 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jv499" event={"ID":"fd1357a3-36ef-4bc8-85bb-91f3c0f42994","Type":"ContainerDied","Data":"4e56992f54f9fc4387130b74e3aa0f4652a5d6d960da1a5305b8668b8086fae2"} Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.605115 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.613619 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h4zw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b9e29bb-6e51-47ab-a543-b70117ab854d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdw92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h4zw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:22Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.636382 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5v7ng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:22Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.650388 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d69359acfc7a4ee7698088a7e0062b3604a5ceb92910cd3b3d69b1a6291dfdcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://687646beb4c6dd3fb9b659a36449a573f4c389b3e98b07127b18516fdc4f2c93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webho
ok\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:22Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.670306 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jv499" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd1357a3-36ef-4bc8-85bb-91f3c0f42994\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jv499\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:22Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.683826 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee2ed55248b177bc0873ef3c41517401fdb30fc5ce5a545a1ca8c925e4d4e9e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:22Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.702261 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.702294 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.702308 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.702326 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.702337 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:22Z","lastTransitionTime":"2026-03-09T14:04:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.702235 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2267a4d7-87b4-43f7-80c7-7a7ffb0327fc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04169d9b8359e26cc9807e27fb9a5d54ca0946aaade294c5bcc272a7d21c4db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09fa62e7293a50fb3f6bdaf37061cbe5e95a830c8f7337660ce73138dbcaa641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31ba786781fcdb781f5fa45276e29e94756863d2398109d91e81a52f6f88bee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dff721159b382bcfdde6f406a328ef2ee9264b35a8d8baeb7a3c6d31a60ff3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63769aca40f721bec0716427145a3ceea0595836c03d31c98f6d45ff14de68be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6625c60c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6625c60c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:22Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.718539 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0965e24b-526e-4842-ac1f-eca7a765355d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56c76a5e8718243a17d8d9fd9a8dcf6083c2ccb65b4816926dca08a89e465728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3836660bf646052991965270cd69b8b8237b5c4bd698a73789b050d23e6877\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dbef38ff5e714c92fc8a86f1e3c64adf606cb198145f884a000f8dee200fcf2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T14:03:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 14:03:46.681607 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 14:03:46.681849 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 14:03:46.682597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2315154617/tls.crt::/tmp/serving-cert-2315154617/tls.key\\\\\\\"\\\\nI0309 14:03:46.866095 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 14:03:46.869876 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 14:03:46.869896 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 14:03:46.870280 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 14:03:46.870293 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 14:03:46.877096 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 14:03:46.877117 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 14:03:46.877121 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877135 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 14:03:46.877139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 14:03:46.877143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 14:03:46.877146 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 14:03:46.878419 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:03:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fbabc5ec2c100150ce886c06c3bd78cc5896c8a06ac9d262eeb552a8a8bb5d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:22Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.733323 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:22Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.744794 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.750124 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:22Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.760453 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-72xb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43e3c934-6aa7-4c96-a3a6-378f7931fc2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcf5e49245fd6051b12465a36b27f657b4923a62f88d61fe0fac971c4a2c9197\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lfx7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-72xb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:22Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.772519 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dac2aaf5-653b-4b2a-8efe-ed26bac8d648\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q6db\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q6db\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hjrrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:22Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.783891 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:22Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.796597 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69e74824fe242fd2d54bed9734f33823697d7651043d9c97bac6ea5490625a0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:22Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.804432 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.804459 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.804469 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.804484 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.804496 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:22Z","lastTransitionTime":"2026-03-09T14:04:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.809068 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:22Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.821285 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-72xb4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43e3c934-6aa7-4c96-a3a6-378f7931fc2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcf5e49245fd6051b12465a36b27f657b4923a62f88d61fe0fac971c4a2c9197\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lfx7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-72xb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:22Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.824732 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6b9e29bb-6e51-47ab-a543-b70117ab854d-multus-daemon-config\") pod \"multus-h4zw5\" (UID: \"6b9e29bb-6e51-47ab-a543-b70117ab854d\") " pod="openshift-multus/multus-h4zw5" Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.825600 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6b9e29bb-6e51-47ab-a543-b70117ab854d-multus-daemon-config\") pod \"multus-h4zw5\" (UID: \"6b9e29bb-6e51-47ab-a543-b70117ab854d\") " pod="openshift-multus/multus-h4zw5" Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.835698 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dac2aaf5-653b-4b2a-8efe-ed26bac8d648\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf5b268d6a355ae3a2c40981769412124b85cdfd99364ed58e5163a744f5f83c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q6db\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f451493fa6d86f12c1291c0fe262bcbe4b62cef45c8dda696bb48fa3ded2eeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q6db\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hjrrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:22Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.856467 4722 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2267a4d7-87b4-43f7-80c7-7a7ffb0327fc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04169d9b8359e26cc9807e27fb9a5d54ca0946aaade294c5bcc272a7d21c4db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09fa62e7293a50fb3f6bdaf37061cbe5e95a830c8f7337660ce73138dbcaa641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31ba786781fcdb781f5fa45276e29e94756863d2398109d91e81a52f6f88bee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-di
r\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dff721159b382bcfdde6f406a328ef2ee9264b35a8d8baeb7a3c6d31a60ff3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63769aca40f721bec0716427145a3ceea0595836c03d31c98f6d45ff14de68be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6625c60c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6625c60c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}}},{\\\"containerID\\\"
:\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:22Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.872866 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0965e24b-526e-4842-ac1f-eca7a765355d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56c76a5e8718243a17d8d9fd9a8dcf6083c2ccb65b4816926dca08a89e465728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3836660bf646052991965270cd69b8b8237b5c4bd698a73789b050d23e6877\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dbef38ff5e714c92fc8a86f1e3c64adf606cb198145f884a000f8dee200fcf2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T14:03:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 14:03:46.681607 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 14:03:46.681849 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 14:03:46.682597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2315154617/tls.crt::/tmp/serving-cert-2315154617/tls.key\\\\\\\"\\\\nI0309 14:03:46.866095 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 14:03:46.869876 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 14:03:46.869896 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 14:03:46.870280 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 14:03:46.870293 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 14:03:46.877096 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 14:03:46.877117 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 14:03:46.877121 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877135 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 14:03:46.877139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 14:03:46.877143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 14:03:46.877146 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 14:03:46.878419 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:03:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fbabc5ec2c100150ce886c06c3bd78cc5896c8a06ac9d262eeb552a8a8bb5d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:22Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.886714 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:22Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.900795 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:22Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.906145 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.906184 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.906212 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.906232 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.906247 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:22Z","lastTransitionTime":"2026-03-09T14:04:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.912365 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69e74824fe242fd2d54bed9734f33823697d7651043d9c97bac6ea5490625a0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:22Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.925466 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d69359acfc7a4ee7698088a7e0062b3604a5ceb92910cd3b3d69b1a6291dfdcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://687646beb4c6dd3fb9b659a36449a573f4c389b3e98b07127b18516fdc4f2c93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:22Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.938802 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h4zw5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b9e29bb-6e51-47ab-a543-b70117ab854d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdw92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h4zw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:22Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.968256 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d
1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPa
th\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"
192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5v7ng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:22Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:22 crc kubenswrapper[4722]: I0309 14:04:22.988600 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee2ed55248b177bc0873ef3c41517401fdb30fc5ce5a545a1ca8c925e4d4e9e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:22Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.003240 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-h4zw5" Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.009600 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.010170 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.010380 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.010543 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.010674 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:23Z","lastTransitionTime":"2026-03-09T14:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.009899 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jv499" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd1357a3-36ef-4bc8-85bb-91f3c0f42994\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e56992f54f9fc4387130b74e3aa0f4652a5d6d960da1a5305b8668b8086fae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e56992f54f9fc4387130b74e3aa0f4652a5d6d960da1a5305b8668b8086fae2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets
/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jv499\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:23Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:23 crc kubenswrapper[4722]: W0309 14:04:23.026724 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b9e29bb_6e51_47ab_a543_b70117ab854d.slice/crio-e53ae33397a1ccf1c138219043f73b655cbba90e500887baf26ec45b0e88dd2f WatchSource:0}: Error finding container e53ae33397a1ccf1c138219043f73b655cbba90e500887baf26ec45b0e88dd2f: Status 404 returned error can't find the container with id e53ae33397a1ccf1c138219043f73b655cbba90e500887baf26ec45b0e88dd2f Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.113591 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.113978 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.113992 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.114014 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.114030 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:23Z","lastTransitionTime":"2026-03-09T14:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.216801 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.216850 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.216862 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.216879 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.216889 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:23Z","lastTransitionTime":"2026-03-09T14:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.319370 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.319853 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.319872 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.319895 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.319914 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:23Z","lastTransitionTime":"2026-03-09T14:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.424527 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.424559 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.424568 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.424584 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.424593 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:23Z","lastTransitionTime":"2026-03-09T14:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.527809 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.527847 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.527858 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.527876 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.527887 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:23Z","lastTransitionTime":"2026-03-09T14:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.605122 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-h4zw5" event={"ID":"6b9e29bb-6e51-47ab-a543-b70117ab854d","Type":"ContainerStarted","Data":"17cd3d1e3981e2f2a3b66780f753425c78452e6319fb8eae9935d3bac4456820"} Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.605195 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-h4zw5" event={"ID":"6b9e29bb-6e51-47ab-a543-b70117ab854d","Type":"ContainerStarted","Data":"e53ae33397a1ccf1c138219043f73b655cbba90e500887baf26ec45b0e88dd2f"} Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.612422 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" event={"ID":"2e305619-b3a2-44c9-9e54-e1afa4f43dbf","Type":"ContainerStarted","Data":"ef1c5a3944e8693d8c68a4108c23851352407d0fac7fb7a03763d9084f117dfb"} Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.612509 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" event={"ID":"2e305619-b3a2-44c9-9e54-e1afa4f43dbf","Type":"ContainerStarted","Data":"f969085ae62d82dbe1f0f8f5d459c4a1bc7d1ae0a459da484f11dd6ae5fdd418"} Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.612526 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" event={"ID":"2e305619-b3a2-44c9-9e54-e1afa4f43dbf","Type":"ContainerStarted","Data":"d1fc80c43be93eeb2f9805bf314c8430f52062d0ce5d0fa1830e7e0c27e674e3"} Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.612541 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" event={"ID":"2e305619-b3a2-44c9-9e54-e1afa4f43dbf","Type":"ContainerStarted","Data":"a0106fa808804a96ca8247096399e3d04eb082b2691b1f39df3db9f6c630114c"} Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.612556 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" event={"ID":"2e305619-b3a2-44c9-9e54-e1afa4f43dbf","Type":"ContainerStarted","Data":"24e737417baac512fab2089f5da12e7c47ef06eae8df7803ad6b248505ff3d18"} Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.616190 4722 generic.go:334] "Generic (PLEG): container finished" 
podID="fd1357a3-36ef-4bc8-85bb-91f3c0f42994" containerID="6c00d804aec8d776f5f2ab9c7a81161c9942a94564d57afe47c633fb3dc2aa7d" exitCode=0 Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.616335 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jv499" event={"ID":"fd1357a3-36ef-4bc8-85bb-91f3c0f42994","Type":"ContainerDied","Data":"6c00d804aec8d776f5f2ab9c7a81161c9942a94564d57afe47c633fb3dc2aa7d"} Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.626037 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d69359acfc7a4ee7698088a7e0062b3604a5ceb92910cd3b3d69b1a6291dfdcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://687646beb4c6dd3fb9b659a36449a573f4c389b3e98b07127b18516fdc4f2c93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:23Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.631685 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.631713 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.631724 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.631739 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.631749 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:23Z","lastTransitionTime":"2026-03-09T14:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.640601 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h4zw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b9e29bb-6e51-47ab-a543-b70117ab854d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17cd3d1e3981e2f2a3b66780f753425c78452e6319fb8eae9935d3bac4456820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\
"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdw92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h4zw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:23Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.660871 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5v7ng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:23Z 
is after 2025-08-24T17:21:41Z" Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.680852 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee2ed55248b177bc0873ef3c41517401fdb30fc5ce5a545a1ca8c925e4d4e9e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:23Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.706326 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jv499" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd1357a3-36ef-4bc8-85bb-91f3c0f42994\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e56992f54f9fc4387130b74e3aa0f4652a5d6d960da1a5305b8668b8086fae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e56992f54f9fc4387130b74e3aa0f4652a5d6d960da1a5305b8668b8086fae2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jv499\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:23Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.719741 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-72xb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43e3c934-6aa7-4c96-a3a6-378f7931fc2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcf5e49245fd6051b12465a36b27f657b4923a62f88d61fe0fac971c4a2c9197\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lfx7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-72xb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:23Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.740956 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dac2aaf5-653b-4b2a-8efe-ed26bac8d648\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf5b268d6a355ae3a2c40981769412124b85cdfd99364ed58e5163a744f5f83c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q6db\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f451493fa6d86f12c1291c0fe262bcbe4b62cef45c8dda696bb48fa3ded2eeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q6db\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hjrrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:23Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.741128 4722 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.741166 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.741182 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.741234 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.741254 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:23Z","lastTransitionTime":"2026-03-09T14:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.770725 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2267a4d7-87b4-43f7-80c7-7a7ffb0327fc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04169d9b8359e26cc9807e27fb9a5d54ca0946aaade294c5bcc272a7d21c4db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09fa62e7293a50fb3f6bdaf37061cbe5e95a830c8f7337660ce73138dbcaa641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\
"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31ba786781fcdb781f5fa45276e29e94756863d2398109d91e81a52f6f88bee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dff721159b382bcfdde6f406a328ef2ee9264b35a8d8baeb7a3c6d31a60ff3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63769aca40f721bec0716427145a3ceea0595836c03d31c98f6d45ff14de68be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state
\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6625c60c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6625c60c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:23Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.790394 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0965e24b-526e-4842-ac1f-eca7a765355d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56c76a5e8718243a17d8d9fd9a8dcf6083c2ccb65b4816926dca08a89e465728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3836660bf646052991965270cd69b8b8237b5c4bd698a73789b050d23e6877\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dbef38ff5e714c92fc8a86f1e3c64adf606cb198145f884a000f8dee200fcf2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T14:03:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 14:03:46.681607 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 14:03:46.681849 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 14:03:46.682597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2315154617/tls.crt::/tmp/serving-cert-2315154617/tls.key\\\\\\\"\\\\nI0309 14:03:46.866095 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 14:03:46.869876 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 14:03:46.869896 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 14:03:46.870280 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 14:03:46.870293 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 14:03:46.877096 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 14:03:46.877117 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 14:03:46.877121 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877135 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 14:03:46.877139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 14:03:46.877143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 14:03:46.877146 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 14:03:46.878419 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:03:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fbabc5ec2c100150ce886c06c3bd78cc5896c8a06ac9d262eeb552a8a8bb5d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:23Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.805156 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:23Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.825178 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:23Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.845597 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.845629 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.845637 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.845650 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.845658 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:23Z","lastTransitionTime":"2026-03-09T14:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.847185 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:23Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.862725 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69e74824fe242fd2d54bed9734f33823697d7651043d9c97bac6ea5490625a0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:23Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.877495 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d69359acfc7a4ee7698088a7e0062b3604a5ceb92910cd3b3d69b1a6291dfdcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://687646beb4c6dd3fb9b659a36449a573f4c389b3e98b07127b18516fdc4f2c93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:23Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.890619 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h4zw5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b9e29bb-6e51-47ab-a543-b70117ab854d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17cd3d1e3981e2f2a3b66780f753425c78452e6319fb8eae9935d3bac4456820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdw92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h4zw5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:23Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.910168 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\"
,\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5v7ng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:23Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.927145 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee2ed55248b177bc0873ef3c41517401fdb30fc5ce5a545a1ca8c925e4d4e9e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:23Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.944518 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 14:04:23 crc kubenswrapper[4722]: E0309 14:04:23.944669 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 14:04:39.944646642 +0000 UTC m=+120.500215218 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.945574 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jv499" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd1357a3-36ef-4bc8-85bb-91f3c0f42994\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e56992f54f9fc4387130b74e3aa0f4652a5d6d960da1a5305b8668b8086fae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e56992f54f9fc4387130b74e3aa0f4652a5d6d960da1a5305b8668b8086fae2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c00d804aec8d776f5f2ab9c7a81161c9942a94564d57afe47c633fb3dc2aa7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c00d804aec8d776f5f2ab9c7a81161c9942a94564d57afe47c633fb3dc2aa7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-
09T14:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jv499\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:23Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.948389 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.948428 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.948440 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.948459 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.948470 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:23Z","lastTransitionTime":"2026-03-09T14:04:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.961509 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:23Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.976634 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:23Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:23 crc kubenswrapper[4722]: I0309 14:04:23.987743 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-72xb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43e3c934-6aa7-4c96-a3a6-378f7931fc2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcf5e49245fd6051b12465a36b27f657b4923a62f88d61fe0fac971c4a2c9197\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lfx7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-72xb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:23Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:24 crc kubenswrapper[4722]: I0309 14:04:24.001528 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dac2aaf5-653b-4b2a-8efe-ed26bac8d648\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf5b268d6a355ae3a2c40981769412124b85cdfd99364ed58e5163a744f5f83c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q6db\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f451493fa6d86f12c1291c0fe262bcbe4b62cef45c8dda696bb48fa3ded2eeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q6db\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hjrrb\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:23Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:24 crc kubenswrapper[4722]: I0309 14:04:24.021977 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2267a4d7-87b4-43f7-80c7-7a7ffb0327fc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04169d9b8359e26cc9807e27fb9a5d54ca0946aaade294c5bcc272a7d21c4db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09fa62e7293a50fb3f6bdaf37061cbe5e95a830c8f7337660ce73138dbcaa641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31ba786781fcdb781f5fa45276e29e94756863d2398109d91e81a52f6f88bee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702
f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dff721159b382bcfdde6f406a328ef2ee9264b35a8d8baeb7a3c6d31a60ff3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63769aca40f721bec0716427145a3ceea0595836c03d31c98f6d45ff14de68be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6625c60c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6625c60c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:24Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:24 crc kubenswrapper[4722]: I0309 14:04:24.034505 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0965e24b-526e-4842-ac1f-eca7a765355d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56c76a5e8718243a17d8d9fd9a8dcf6083c2ccb65b4816926dca08a89e465728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3836660bf646052991965270cd69b8b8237b5c4bd698a73789b050d23e6877\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dbef38ff5e714c92fc8a86f1e3c64adf606cb198145f884a000f8dee200fcf2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T14:03:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 14:03:46.681607 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 14:03:46.681849 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 14:03:46.682597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2315154617/tls.crt::/tmp/serving-cert-2315154617/tls.key\\\\\\\"\\\\nI0309 14:03:46.866095 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 14:03:46.869876 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 14:03:46.869896 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 14:03:46.870280 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 14:03:46.870293 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 14:03:46.877096 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 14:03:46.877117 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 14:03:46.877121 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877135 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 14:03:46.877139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 14:03:46.877143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 14:03:46.877146 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 14:03:46.878419 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:03:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fbabc5ec2c100150ce886c06c3bd78cc5896c8a06ac9d262eeb552a8a8bb5d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:24Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:24 crc kubenswrapper[4722]: I0309 14:04:24.045526 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 14:04:24 crc kubenswrapper[4722]: I0309 14:04:24.045569 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 14:04:24 crc kubenswrapper[4722]: I0309 14:04:24.045598 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 14:04:24 crc kubenswrapper[4722]: I0309 14:04:24.045623 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 14:04:24 crc kubenswrapper[4722]: E0309 14:04:24.045708 4722 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 14:04:24 crc kubenswrapper[4722]: E0309 14:04:24.045776 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 14:04:40.045757902 +0000 UTC m=+120.601326468 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 14:04:24 crc kubenswrapper[4722]: E0309 14:04:24.045778 4722 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 14:04:24 crc kubenswrapper[4722]: E0309 14:04:24.045800 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 14:04:24 crc kubenswrapper[4722]: E0309 14:04:24.045828 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 14:04:24 crc kubenswrapper[4722]: E0309 14:04:24.045860 4722 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 14:04:24 crc kubenswrapper[4722]: E0309 14:04:24.045856 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 14:04:24 crc kubenswrapper[4722]: E0309 14:04:24.045901 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-09 14:04:40.045892376 +0000 UTC m=+120.601460942 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 14:04:24 crc kubenswrapper[4722]: E0309 14:04:24.045942 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 14:04:40.045932827 +0000 UTC m=+120.601501403 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 14:04:24 crc kubenswrapper[4722]: E0309 14:04:24.045904 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 14:04:24 crc kubenswrapper[4722]: E0309 14:04:24.045960 4722 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 14:04:24 crc kubenswrapper[4722]: E0309 14:04:24.046005 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-09 14:04:40.045985708 +0000 UTC m=+120.601554284 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 14:04:24 crc kubenswrapper[4722]: I0309 14:04:24.049593 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69e74824fe242fd2d54bed9734f33823697d7651043d9c97bac6ea5490625a0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:24Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:24 crc kubenswrapper[4722]: I0309 14:04:24.051626 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:24 crc kubenswrapper[4722]: I0309 14:04:24.051715 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:24 crc kubenswrapper[4722]: I0309 14:04:24.051736 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:24 crc kubenswrapper[4722]: I0309 14:04:24.051767 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:24 crc kubenswrapper[4722]: I0309 14:04:24.051787 4722 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:24Z","lastTransitionTime":"2026-03-09T14:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:24 crc kubenswrapper[4722]: I0309 14:04:24.064492 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:24Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:24 crc kubenswrapper[4722]: I0309 14:04:24.149195 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 14:04:24 crc kubenswrapper[4722]: I0309 14:04:24.149366 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 14:04:24 crc kubenswrapper[4722]: E0309 14:04:24.149406 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 14:04:24 crc kubenswrapper[4722]: I0309 14:04:24.149436 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 14:04:24 crc kubenswrapper[4722]: E0309 14:04:24.149602 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 14:04:24 crc kubenswrapper[4722]: E0309 14:04:24.149731 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 14:04:24 crc kubenswrapper[4722]: I0309 14:04:24.155858 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:24 crc kubenswrapper[4722]: I0309 14:04:24.155914 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:24 crc kubenswrapper[4722]: I0309 14:04:24.155933 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:24 crc kubenswrapper[4722]: I0309 14:04:24.155958 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:24 crc kubenswrapper[4722]: I0309 14:04:24.155978 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:24Z","lastTransitionTime":"2026-03-09T14:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:24 crc kubenswrapper[4722]: I0309 14:04:24.259366 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:24 crc kubenswrapper[4722]: I0309 14:04:24.259437 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:24 crc kubenswrapper[4722]: I0309 14:04:24.259505 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:24 crc kubenswrapper[4722]: I0309 14:04:24.259534 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:24 crc kubenswrapper[4722]: I0309 14:04:24.259553 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:24Z","lastTransitionTime":"2026-03-09T14:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:24 crc kubenswrapper[4722]: I0309 14:04:24.362750 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:24 crc kubenswrapper[4722]: I0309 14:04:24.362814 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:24 crc kubenswrapper[4722]: I0309 14:04:24.362833 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:24 crc kubenswrapper[4722]: I0309 14:04:24.362858 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:24 crc kubenswrapper[4722]: I0309 14:04:24.362878 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:24Z","lastTransitionTime":"2026-03-09T14:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:24 crc kubenswrapper[4722]: I0309 14:04:24.466111 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:24 crc kubenswrapper[4722]: I0309 14:04:24.466154 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:24 crc kubenswrapper[4722]: I0309 14:04:24.466165 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:24 crc kubenswrapper[4722]: I0309 14:04:24.466181 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:24 crc kubenswrapper[4722]: I0309 14:04:24.466194 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:24Z","lastTransitionTime":"2026-03-09T14:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:24 crc kubenswrapper[4722]: I0309 14:04:24.568931 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:24 crc kubenswrapper[4722]: I0309 14:04:24.568964 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:24 crc kubenswrapper[4722]: I0309 14:04:24.568974 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:24 crc kubenswrapper[4722]: I0309 14:04:24.568987 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:24 crc kubenswrapper[4722]: I0309 14:04:24.568995 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:24Z","lastTransitionTime":"2026-03-09T14:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:24 crc kubenswrapper[4722]: I0309 14:04:24.624164 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" event={"ID":"2e305619-b3a2-44c9-9e54-e1afa4f43dbf","Type":"ContainerStarted","Data":"f5765145caac69454a9a8e8de34d05cb1da06924dd365f3e74f52461be5027ae"} Mar 09 14:04:24 crc kubenswrapper[4722]: I0309 14:04:24.627068 4722 generic.go:334] "Generic (PLEG): container finished" podID="fd1357a3-36ef-4bc8-85bb-91f3c0f42994" containerID="0491f9997970cb050ece19d88d2a7064d4799db3772132f58055e1dbcf2f3a82" exitCode=0 Mar 09 14:04:24 crc kubenswrapper[4722]: I0309 14:04:24.627173 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jv499" event={"ID":"fd1357a3-36ef-4bc8-85bb-91f3c0f42994","Type":"ContainerDied","Data":"0491f9997970cb050ece19d88d2a7064d4799db3772132f58055e1dbcf2f3a82"} Mar 09 14:04:24 crc kubenswrapper[4722]: I0309 14:04:24.640312 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dac2aaf5-653b-4b2a-8efe-ed26bac8d648\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf5b268d6a355ae3a2c40981769412124b85cdfd99364ed58e5163a744f5f83c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q6db\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f451493fa6d86f12c1291c0fe262bcbe4b62cef45c8dda696bb48fa3ded2eeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q6db\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hjrrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:24Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:24 crc kubenswrapper[4722]: I0309 14:04:24.660169 4722 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2267a4d7-87b4-43f7-80c7-7a7ffb0327fc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04169d9b8359e26cc9807e27fb9a5d54ca0946aaade294c5bcc272a7d21c4db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09fa62e7293a50fb3f6bdaf37061cbe5e95a830c8f7337660ce73138dbcaa641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31ba786781fcdb781f5fa45276e29e94756863d2398109d91e81a52f6f88bee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-di
r\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dff721159b382bcfdde6f406a328ef2ee9264b35a8d8baeb7a3c6d31a60ff3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63769aca40f721bec0716427145a3ceea0595836c03d31c98f6d45ff14de68be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6625c60c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6625c60c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}}},{\\\"containerID\\\"
:\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:24Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:24 crc kubenswrapper[4722]: I0309 14:04:24.672326 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:24 crc kubenswrapper[4722]: I0309 14:04:24.672361 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:24 crc kubenswrapper[4722]: I0309 14:04:24.672374 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:24 crc kubenswrapper[4722]: I0309 14:04:24.672391 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:24 crc kubenswrapper[4722]: I0309 14:04:24.672403 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:24Z","lastTransitionTime":"2026-03-09T14:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:24 crc kubenswrapper[4722]: I0309 14:04:24.673150 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0965e24b-526e-4842-ac1f-eca7a765355d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56c76a5e8718243a17d8d9fd9a8dcf6083c2ccb65b4816926dca08a89e465728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3836660bf646052991965270cd69b8b8237b5c4bd698a73789b050d23e6877\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dbef38ff5e714c92fc8a86f1e3c64adf606cb198145f884a000f8dee200fcf2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T14:03:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 14:03:46.681607 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 14:03:46.681849 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 14:03:46.682597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2315154617/tls.crt::/tmp/serving-cert-2315154617/tls.key\\\\\\\"\\\\nI0309 14:03:46.866095 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 14:03:46.869876 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 14:03:46.869896 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 14:03:46.870280 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 14:03:46.870293 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 14:03:46.877096 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 14:03:46.877117 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 14:03:46.877121 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877135 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 14:03:46.877139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 14:03:46.877143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 14:03:46.877146 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 14:03:46.878419 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:03:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fbabc5ec2c100150ce886c06c3bd78cc5896c8a06ac9d262eeb552a8a8bb5d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:24Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:24 crc kubenswrapper[4722]: I0309 14:04:24.686444 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:24Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:24 crc kubenswrapper[4722]: I0309 14:04:24.697985 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:24Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:24 crc kubenswrapper[4722]: I0309 14:04:24.708750 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-72xb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43e3c934-6aa7-4c96-a3a6-378f7931fc2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcf5e49245fd6051b12465a36b27f657b4923a62f88d61fe0fac971c4a2c9197\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lfx7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-72xb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:24Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:24 crc kubenswrapper[4722]: I0309 14:04:24.720794 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:24Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:24 crc kubenswrapper[4722]: I0309 14:04:24.733265 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69e74824fe242fd2d54bed9734f33823697d7651043d9c97bac6ea5490625a0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:24Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:24 crc kubenswrapper[4722]: I0309 14:04:24.744946 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d69359acfc7a4ee7698088a7e0062b3604a5ceb92910cd3b3d69b1a6291dfdcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://687646beb4c6dd3fb9b659a36449a573f4c389b3e98b07127b18516fdc4f2c93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:24Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:24 crc kubenswrapper[4722]: I0309 14:04:24.757139 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h4zw5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b9e29bb-6e51-47ab-a543-b70117ab854d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17cd3d1e3981e2f2a3b66780f753425c78452e6319fb8eae9935d3bac4456820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdw92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h4zw5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:24Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:24 crc kubenswrapper[4722]: I0309 14:04:24.775718 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\"
,\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5v7ng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:24Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:24 crc kubenswrapper[4722]: I0309 14:04:24.776833 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:24 crc kubenswrapper[4722]: I0309 14:04:24.776855 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:24 crc kubenswrapper[4722]: I0309 14:04:24.776863 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:24 crc kubenswrapper[4722]: I0309 14:04:24.776875 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:24 crc kubenswrapper[4722]: I0309 14:04:24.776883 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:24Z","lastTransitionTime":"2026-03-09T14:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:24 crc kubenswrapper[4722]: I0309 14:04:24.789309 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee2ed55248b177bc0873ef3c41517401fdb30fc5ce5a545a1ca8c925e4d4e9e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:24Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:24 crc kubenswrapper[4722]: I0309 14:04:24.803174 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jv499" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd1357a3-36ef-4bc8-85bb-91f3c0f42994\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e56992f54f9fc4387130b74e3aa0f4652a5d6d960da1a5305b8668b8086fae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e56992f54f9fc4387130b74e3aa0f4652a5d6d960da1a5305b8668b8086fae2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c00d804aec8d776f5f2ab9c7a81161c9942a94564d57afe47c633fb3dc2aa7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c00d804aec8d776f5f2ab9c7a81161c9942a94564d57afe47c633fb3dc2aa7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0491f9997970cb050ece19d88d2a7064d4799db3772132f58055e1dbcf2f3a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0491f9997970cb050ece19d88d2a7064d4799db3772132f58055e1dbcf2f3a82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",
\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jv499\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:24Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:24 crc kubenswrapper[4722]: I0309 14:04:24.878502 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:24 crc kubenswrapper[4722]: I0309 14:04:24.878552 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:24 crc kubenswrapper[4722]: I0309 14:04:24.878564 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:24 crc kubenswrapper[4722]: I0309 14:04:24.878582 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:24 crc kubenswrapper[4722]: I0309 14:04:24.878593 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:24Z","lastTransitionTime":"2026-03-09T14:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:24 crc kubenswrapper[4722]: I0309 14:04:24.980574 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:24 crc kubenswrapper[4722]: I0309 14:04:24.980614 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:24 crc kubenswrapper[4722]: I0309 14:04:24.980625 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:24 crc kubenswrapper[4722]: I0309 14:04:24.980640 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:24 crc kubenswrapper[4722]: I0309 14:04:24.980651 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:24Z","lastTransitionTime":"2026-03-09T14:04:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 09 14:04:25 crc kubenswrapper[4722]: I0309 14:04:25.083845 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:04:25 crc kubenswrapper[4722]: I0309 14:04:25.083949 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:04:25 crc kubenswrapper[4722]: I0309 14:04:25.083974 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:04:25 crc kubenswrapper[4722]: I0309 14:04:25.084003 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 14:04:25 crc kubenswrapper[4722]: I0309 14:04:25.084023 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:25Z","lastTransitionTime":"2026-03-09T14:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 14:04:25 crc kubenswrapper[4722]: I0309 14:04:25.187273 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:04:25 crc kubenswrapper[4722]: I0309 14:04:25.187349 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:04:25 crc kubenswrapper[4722]: I0309 14:04:25.187372 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:04:25 crc kubenswrapper[4722]: I0309 14:04:25.187396 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 14:04:25 crc kubenswrapper[4722]: I0309 14:04:25.187416 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:25Z","lastTransitionTime":"2026-03-09T14:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 14:04:25 crc kubenswrapper[4722]: I0309 14:04:25.293672 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:04:25 crc kubenswrapper[4722]: I0309 14:04:25.293733 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:04:25 crc kubenswrapper[4722]: I0309 14:04:25.293750 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:04:25 crc kubenswrapper[4722]: I0309 14:04:25.293773 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 14:04:25 crc kubenswrapper[4722]: I0309 14:04:25.293792 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:25Z","lastTransitionTime":"2026-03-09T14:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 14:04:25 crc kubenswrapper[4722]: I0309 14:04:25.396730 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:04:25 crc kubenswrapper[4722]: I0309 14:04:25.396810 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:04:25 crc kubenswrapper[4722]: I0309 14:04:25.396834 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:04:25 crc kubenswrapper[4722]: I0309 14:04:25.396862 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 14:04:25 crc kubenswrapper[4722]: I0309 14:04:25.396888 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:25Z","lastTransitionTime":"2026-03-09T14:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 14:04:25 crc kubenswrapper[4722]: I0309 14:04:25.500457 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:04:25 crc kubenswrapper[4722]: I0309 14:04:25.500535 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:04:25 crc kubenswrapper[4722]: I0309 14:04:25.500559 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:04:25 crc kubenswrapper[4722]: I0309 14:04:25.500592 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 14:04:25 crc kubenswrapper[4722]: I0309 14:04:25.500617 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:25Z","lastTransitionTime":"2026-03-09T14:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 14:04:25 crc kubenswrapper[4722]: I0309 14:04:25.603766 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:04:25 crc kubenswrapper[4722]: I0309 14:04:25.603842 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:04:25 crc kubenswrapper[4722]: I0309 14:04:25.603866 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:04:25 crc kubenswrapper[4722]: I0309 14:04:25.603898 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 14:04:25 crc kubenswrapper[4722]: I0309 14:04:25.603918 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:25Z","lastTransitionTime":"2026-03-09T14:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 14:04:25 crc kubenswrapper[4722]: I0309 14:04:25.637738 4722 generic.go:334] "Generic (PLEG): container finished" podID="fd1357a3-36ef-4bc8-85bb-91f3c0f42994" containerID="93c7d9a6b0dc8e76abab762ff6aae1975a31fbc80ee3a3cd32ec378298061748" exitCode=0
Mar 09 14:04:25 crc kubenswrapper[4722]: I0309 14:04:25.637903 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jv499" event={"ID":"fd1357a3-36ef-4bc8-85bb-91f3c0f42994","Type":"ContainerDied","Data":"93c7d9a6b0dc8e76abab762ff6aae1975a31fbc80ee3a3cd32ec378298061748"}
Mar 09 14:04:25 crc kubenswrapper[4722]: I0309 14:04:25.655805 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d69359acfc7a4ee7698088a7e0062b3604a5ceb92910cd3b3d69b1a6291dfdcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://687646beb4c6dd3fb9b659a36449a573f4c389b3e98b07127b18516fdc4f2c93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:25Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:25 crc kubenswrapper[4722]: I0309 14:04:25.676143 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h4zw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b9e29bb-6e51-47ab-a543-b70117ab854d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17cd3d1e3981e2f2a3b66780f753425c78452e6319fb8eae9935d3bac4456820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube
rnetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdw92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h4zw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:25Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:25 crc kubenswrapper[4722]: I0309 14:04:25.707253 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5v7ng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:25Z 
is after 2025-08-24T17:21:41Z"
Mar 09 14:04:25 crc kubenswrapper[4722]: I0309 14:04:25.708073 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:04:25 crc kubenswrapper[4722]: I0309 14:04:25.708137 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:04:25 crc kubenswrapper[4722]: I0309 14:04:25.708154 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:04:25 crc kubenswrapper[4722]: I0309 14:04:25.708178 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 14:04:25 crc kubenswrapper[4722]: I0309 14:04:25.708193 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:25Z","lastTransitionTime":"2026-03-09T14:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 14:04:25 crc kubenswrapper[4722]: I0309 14:04:25.727888 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee2ed55248b177bc0873ef3c41517401fdb30fc5ce5a545a1ca8c925e4d4e9e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:25Z is after 2025-08-24T17:21:41Z" Mar 09 
14:04:25 crc kubenswrapper[4722]: I0309 14:04:25.750331 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jv499" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd1357a3-36ef-4bc8-85bb-91f3c0f42994\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e56992f54f9fc4387130b74e3aa0f4652a5d6d960da1a5305b8668b8086fae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e56992f54f9fc4387130b74e3aa0f4652a5d6d960da1a5305b8668b8086fae2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\
\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c00d804aec8d776f5f2ab9c7a81161c9942a94564d57afe47c633fb3dc2aa7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c00d804aec8d776f5f2ab9c7a81161c9942a94564d57afe47c633fb3dc2aa7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0491f9997970cb050ece19d88d2a7064d4799db3772132f58055e1dbcf2f3a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0491f9997970cb050ece19d88d2a7064d4799db3772132f58055e1dbcf2f3a82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93c7d9a6b0dc8e76abab762ff6aae1975a31fbc80ee3a3cd32ec378298061748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93c7d9a6b0dc8e76abab762ff6aae1975a31fbc80ee3a3cd32ec378298061748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:24Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-03-09T14:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jv499\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:25Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:25 crc kubenswrapper[4722]: I0309 14:04:25.765767 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-72xb4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43e3c934-6aa7-4c96-a3a6-378f7931fc2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcf5e49245fd6051b12465a36b27f657b4923a62f88d61fe0fac971c4a2c9197\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lfx7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-72xb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:25Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:25 crc kubenswrapper[4722]: I0309 14:04:25.787368 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dac2aaf5-653b-4b2a-8efe-ed26bac8d648\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf5b268d6a355ae3a2c40981769412124b85cdfd99364ed58e5163a744f5f83c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q6db\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f451493fa6d86f12c1291c0fe262bcbe4b62cef45c8dda696bb48fa3ded2eeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q6db\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hjrrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:25Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:25 crc kubenswrapper[4722]: I0309 14:04:25.812104 4722 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:25 crc kubenswrapper[4722]: I0309 14:04:25.812153 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:25 crc kubenswrapper[4722]: I0309 14:04:25.812164 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:25 crc kubenswrapper[4722]: I0309 14:04:25.812187 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:25 crc kubenswrapper[4722]: I0309 14:04:25.812211 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:25Z","lastTransitionTime":"2026-03-09T14:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:25 crc kubenswrapper[4722]: I0309 14:04:25.817943 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2267a4d7-87b4-43f7-80c7-7a7ffb0327fc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04169d9b8359e26cc9807e27fb9a5d54ca0946aaade294c5bcc272a7d21c4db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09fa62e7293a50fb3f6bdaf37061cbe5e95a830c8f7337660ce73138dbcaa641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\
"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31ba786781fcdb781f5fa45276e29e94756863d2398109d91e81a52f6f88bee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dff721159b382bcfdde6f406a328ef2ee9264b35a8d8baeb7a3c6d31a60ff3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63769aca40f721bec0716427145a3ceea0595836c03d31c98f6d45ff14de68be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state
\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6625c60c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6625c60c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:25Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:25 crc kubenswrapper[4722]: I0309 14:04:25.840807 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0965e24b-526e-4842-ac1f-eca7a765355d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56c76a5e8718243a17d8d9fd9a8dcf6083c2ccb65b4816926dca08a89e465728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3836660bf646052991965270cd69b8b8237b5c4bd698a73789b050d23e6877\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dbef38ff5e714c92fc8a86f1e3c64adf606cb198145f884a000f8dee200fcf2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T14:03:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 14:03:46.681607 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 14:03:46.681849 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 14:03:46.682597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2315154617/tls.crt::/tmp/serving-cert-2315154617/tls.key\\\\\\\"\\\\nI0309 14:03:46.866095 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 14:03:46.869876 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 14:03:46.869896 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 14:03:46.870280 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 14:03:46.870293 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 14:03:46.877096 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 14:03:46.877117 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 14:03:46.877121 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877135 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 14:03:46.877139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 14:03:46.877143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 14:03:46.877146 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 14:03:46.878419 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:03:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fbabc5ec2c100150ce886c06c3bd78cc5896c8a06ac9d262eeb552a8a8bb5d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:25Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:25 crc kubenswrapper[4722]: I0309 14:04:25.856037 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:25Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:25 crc kubenswrapper[4722]: I0309 14:04:25.871605 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:25Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:25 crc kubenswrapper[4722]: I0309 14:04:25.894502 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:25Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:25 crc kubenswrapper[4722]: I0309 14:04:25.914503 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:25 crc kubenswrapper[4722]: I0309 14:04:25.914756 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:25 crc kubenswrapper[4722]: I0309 14:04:25.914851 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:25 crc kubenswrapper[4722]: I0309 14:04:25.914964 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:25 crc kubenswrapper[4722]: I0309 14:04:25.915053 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:25Z","lastTransitionTime":"2026-03-09T14:04:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:25 crc kubenswrapper[4722]: I0309 14:04:25.916171 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69e74824fe242fd2d54bed9734f33823697d7651043d9c97bac6ea5490625a0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:25Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:26 crc kubenswrapper[4722]: I0309 14:04:26.017796 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:26 crc kubenswrapper[4722]: I0309 14:04:26.017827 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:26 crc kubenswrapper[4722]: I0309 14:04:26.017837 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:26 crc kubenswrapper[4722]: I0309 14:04:26.017850 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:26 crc kubenswrapper[4722]: I0309 14:04:26.017859 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:26Z","lastTransitionTime":"2026-03-09T14:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:26 crc kubenswrapper[4722]: I0309 14:04:26.121130 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:26 crc kubenswrapper[4722]: I0309 14:04:26.121163 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:26 crc kubenswrapper[4722]: I0309 14:04:26.121183 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:26 crc kubenswrapper[4722]: I0309 14:04:26.121215 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:26 crc kubenswrapper[4722]: I0309 14:04:26.121228 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:26Z","lastTransitionTime":"2026-03-09T14:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:26 crc kubenswrapper[4722]: I0309 14:04:26.148630 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 14:04:26 crc kubenswrapper[4722]: I0309 14:04:26.148775 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 14:04:26 crc kubenswrapper[4722]: E0309 14:04:26.148895 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 14:04:26 crc kubenswrapper[4722]: I0309 14:04:26.148915 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 14:04:26 crc kubenswrapper[4722]: E0309 14:04:26.149059 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 14:04:26 crc kubenswrapper[4722]: E0309 14:04:26.149233 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 14:04:26 crc kubenswrapper[4722]: I0309 14:04:26.225105 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:26 crc kubenswrapper[4722]: I0309 14:04:26.225742 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:26 crc kubenswrapper[4722]: I0309 14:04:26.225757 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:26 crc kubenswrapper[4722]: I0309 14:04:26.225801 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:26 crc kubenswrapper[4722]: I0309 14:04:26.225817 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:26Z","lastTransitionTime":"2026-03-09T14:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:26 crc kubenswrapper[4722]: I0309 14:04:26.330066 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:26 crc kubenswrapper[4722]: I0309 14:04:26.330120 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:26 crc kubenswrapper[4722]: I0309 14:04:26.330138 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:26 crc kubenswrapper[4722]: I0309 14:04:26.330164 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:26 crc kubenswrapper[4722]: I0309 14:04:26.330183 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:26Z","lastTransitionTime":"2026-03-09T14:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:26 crc kubenswrapper[4722]: I0309 14:04:26.433879 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:26 crc kubenswrapper[4722]: I0309 14:04:26.433922 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:26 crc kubenswrapper[4722]: I0309 14:04:26.433938 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:26 crc kubenswrapper[4722]: I0309 14:04:26.433957 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:26 crc kubenswrapper[4722]: I0309 14:04:26.433970 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:26Z","lastTransitionTime":"2026-03-09T14:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:26 crc kubenswrapper[4722]: I0309 14:04:26.536766 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:26 crc kubenswrapper[4722]: I0309 14:04:26.536825 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:26 crc kubenswrapper[4722]: I0309 14:04:26.536846 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:26 crc kubenswrapper[4722]: I0309 14:04:26.536874 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:26 crc kubenswrapper[4722]: I0309 14:04:26.536896 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:26Z","lastTransitionTime":"2026-03-09T14:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:26 crc kubenswrapper[4722]: I0309 14:04:26.640475 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:26 crc kubenswrapper[4722]: I0309 14:04:26.640544 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:26 crc kubenswrapper[4722]: I0309 14:04:26.640572 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:26 crc kubenswrapper[4722]: I0309 14:04:26.640602 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:26 crc kubenswrapper[4722]: I0309 14:04:26.640626 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:26Z","lastTransitionTime":"2026-03-09T14:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:26 crc kubenswrapper[4722]: I0309 14:04:26.647455 4722 generic.go:334] "Generic (PLEG): container finished" podID="fd1357a3-36ef-4bc8-85bb-91f3c0f42994" containerID="9cc5ae61b0fa949ba4bf5aa60d09385399a557f19ef7c99dc43b21fc9db830e4" exitCode=0 Mar 09 14:04:26 crc kubenswrapper[4722]: I0309 14:04:26.648426 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jv499" event={"ID":"fd1357a3-36ef-4bc8-85bb-91f3c0f42994","Type":"ContainerDied","Data":"9cc5ae61b0fa949ba4bf5aa60d09385399a557f19ef7c99dc43b21fc9db830e4"} Mar 09 14:04:26 crc kubenswrapper[4722]: I0309 14:04:26.657897 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" event={"ID":"2e305619-b3a2-44c9-9e54-e1afa4f43dbf","Type":"ContainerStarted","Data":"8f5af077ae5b947194e5dba0542d053890a589582e415e1b41a932bdab6632e0"} Mar 09 14:04:26 crc kubenswrapper[4722]: I0309 14:04:26.686490 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2267a4d7-87b4-43f7-80c7-7a7ffb0327fc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04169d9b8359e26cc9807e27fb9a5d54ca0946aaade294c5bcc272a7d21c4db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09fa62e7293a50fb3f6bdaf37061cbe5e95a830c8f7337660ce73138dbcaa641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31ba786781fcdb781f5fa45276e29e94756863d2398109d91e81a52f6f88bee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dff721159b382bcfdde6f406a328ef2ee9264b35a8d8baeb7a3c6d31a60ff3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63769aca40f721bec0716427145a3ceea0595836c03d31c98f6d45ff14de68be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6625c60c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6625c60c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:26Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:26 crc kubenswrapper[4722]: I0309 14:04:26.710990 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0965e24b-526e-4842-ac1f-eca7a765355d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56c76a5e8718243a17d8d9fd9a8dcf6083c2ccb65b4816926dca08a89e465728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3836660bf646052991965270cd69b8b8237b5c4bd698a73789b050d23e6877\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dbef38ff5e714c92fc8a86f1e3c64adf606cb198145f884a000f8dee200fcf2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T14:03:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 14:03:46.681607 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 14:03:46.681849 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 14:03:46.682597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2315154617/tls.crt::/tmp/serving-cert-2315154617/tls.key\\\\\\\"\\\\nI0309 14:03:46.866095 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 14:03:46.869876 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 14:03:46.869896 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 14:03:46.870280 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 14:03:46.870293 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 14:03:46.877096 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 14:03:46.877117 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 14:03:46.877121 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877135 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 14:03:46.877139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 14:03:46.877143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 14:03:46.877146 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 14:03:46.878419 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:03:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fbabc5ec2c100150ce886c06c3bd78cc5896c8a06ac9d262eeb552a8a8bb5d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:26Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:26 crc kubenswrapper[4722]: I0309 14:04:26.730360 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:26Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:26 crc kubenswrapper[4722]: I0309 14:04:26.743896 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:26 crc kubenswrapper[4722]: I0309 14:04:26.743939 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:26 crc kubenswrapper[4722]: I0309 14:04:26.743948 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:26 crc kubenswrapper[4722]: I0309 14:04:26.743961 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:26 crc kubenswrapper[4722]: I0309 14:04:26.743971 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:26Z","lastTransitionTime":"2026-03-09T14:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:26 crc kubenswrapper[4722]: I0309 14:04:26.751107 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:26Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:26 crc kubenswrapper[4722]: I0309 14:04:26.763923 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-72xb4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43e3c934-6aa7-4c96-a3a6-378f7931fc2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcf5e49245fd6051b12465a36b27f657b4923a62f88d61fe0fac971c4a2c9197\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lfx7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-72xb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:26Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:26 crc kubenswrapper[4722]: I0309 14:04:26.776504 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dac2aaf5-653b-4b2a-8efe-ed26bac8d648\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf5b268d6a355ae3a2c40981769412124b85cdfd99364ed58e5163a744f5f83c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q6db\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f451493fa6d86f12c1291c0fe262bcbe4b62cef45c8dda696bb48fa3ded2eeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q6db\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hjrrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:26Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:26 crc kubenswrapper[4722]: I0309 14:04:26.793996 4722 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:26Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:26 crc kubenswrapper[4722]: I0309 14:04:26.809967 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69e74824fe242fd2d54bed9734f33823697d7651043d9c97bac6ea5490625a0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:26Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:26 crc kubenswrapper[4722]: I0309 14:04:26.826197 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d69359acfc7a4ee7698088a7e0062b3604a5ceb92910cd3b3d69b1a6291dfdcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://687646beb4c6dd3fb9b659a36449a573f4c389b3e98b07127b18516fdc4f2c93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:26Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:26 crc kubenswrapper[4722]: I0309 14:04:26.842475 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h4zw5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b9e29bb-6e51-47ab-a543-b70117ab854d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17cd3d1e3981e2f2a3b66780f753425c78452e6319fb8eae9935d3bac4456820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdw92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h4zw5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:26Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:26 crc kubenswrapper[4722]: I0309 14:04:26.846489 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:26 crc kubenswrapper[4722]: I0309 14:04:26.846517 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:26 crc kubenswrapper[4722]: I0309 14:04:26.846530 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:26 crc kubenswrapper[4722]: I0309 14:04:26.846545 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:26 crc kubenswrapper[4722]: I0309 14:04:26.846556 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:26Z","lastTransitionTime":"2026-03-09T14:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:26 crc kubenswrapper[4722]: I0309 14:04:26.872646 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5v7ng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:26Z 
is after 2025-08-24T17:21:41Z" Mar 09 14:04:26 crc kubenswrapper[4722]: I0309 14:04:26.893090 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee2ed55248b177bc0873ef3c41517401fdb30fc5ce5a545a1ca8c925e4d4e9e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:26Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:26 crc kubenswrapper[4722]: I0309 14:04:26.918681 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jv499" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd1357a3-36ef-4bc8-85bb-91f3c0f42994\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e56992f54f9fc4387130b74e3aa0f4652a5d6d960da1a5305b8668b8086fae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e56992f54f9fc4387130b74e3aa0f4652a5d6d960da1a5305b8668b8086fae2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c00d804aec8d776f5f2ab9c7a81161c9942a94564d57afe47c633fb3dc2aa7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c00d804aec8d776f5f2ab9c7a81161c9942a94564d57afe47c633fb3dc2aa7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0491f9997970cb050ece19d88d2a7064d4799db3772132f58055e1dbcf2f3a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0491f9997970cb050ece19d88d2a7064d4799db3772132f58055e1dbcf2f3a82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93c7d9a6b0dc8e76abab762ff6aae1975a31fbc80ee3a3cd32ec378298061748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93c7d9a6b0dc8e76abab762ff6aae1975a31fbc80ee3a3cd32ec378298061748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cc5ae61b0fa949ba4bf5aa60d09385399a557f19ef7c99dc43b21fc9db830e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9cc
5ae61b0fa949ba4bf5aa60d09385399a557f19ef7c99dc43b21fc9db830e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jv499\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:26Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:26 crc kubenswrapper[4722]: I0309 14:04:26.950140 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:26 crc kubenswrapper[4722]: I0309 14:04:26.950191 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:26 crc kubenswrapper[4722]: I0309 14:04:26.950249 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:26 crc kubenswrapper[4722]: I0309 14:04:26.950272 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:26 crc kubenswrapper[4722]: I0309 14:04:26.950312 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:26Z","lastTransitionTime":"2026-03-09T14:04:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.004088 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.004120 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.004129 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.004142 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.004154 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:27Z","lastTransitionTime":"2026-03-09T14:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:27 crc kubenswrapper[4722]: E0309 14:04:27.021578 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3321b793-006a-42f5-9c18-7f48ed9bad15\\\",\\\"systemUUID\\\":\\\"70de6d16-940c-46da-be51-a9b50262dda2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:27Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.025546 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.025747 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.025891 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.026035 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.026160 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:27Z","lastTransitionTime":"2026-03-09T14:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:27 crc kubenswrapper[4722]: E0309 14:04:27.043699 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3321b793-006a-42f5-9c18-7f48ed9bad15\\\",\\\"systemUUID\\\":\\\"70de6d16-940c-46da-be51-a9b50262dda2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:27Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.049620 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.049655 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.049666 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.049686 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.049697 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:27Z","lastTransitionTime":"2026-03-09T14:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:27 crc kubenswrapper[4722]: E0309 14:04:27.066495 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3321b793-006a-42f5-9c18-7f48ed9bad15\\\",\\\"systemUUID\\\":\\\"70de6d16-940c-46da-be51-a9b50262dda2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:27Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.071105 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.071186 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.071231 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.071263 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.071283 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:27Z","lastTransitionTime":"2026-03-09T14:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:27 crc kubenswrapper[4722]: E0309 14:04:27.096036 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3321b793-006a-42f5-9c18-7f48ed9bad15\\\",\\\"systemUUID\\\":\\\"70de6d16-940c-46da-be51-a9b50262dda2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:27Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.100906 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.100989 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.101000 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.101015 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.101025 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:27Z","lastTransitionTime":"2026-03-09T14:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:27 crc kubenswrapper[4722]: E0309 14:04:27.114190 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3321b793-006a-42f5-9c18-7f48ed9bad15\\\",\\\"systemUUID\\\":\\\"70de6d16-940c-46da-be51-a9b50262dda2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:27Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:27 crc kubenswrapper[4722]: E0309 14:04:27.114363 4722 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.124481 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.124783 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.124806 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.124823 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.124835 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:27Z","lastTransitionTime":"2026-03-09T14:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.227285 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.227667 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.227847 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.227992 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.228123 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:27Z","lastTransitionTime":"2026-03-09T14:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.330679 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.330744 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.330762 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.330789 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.330808 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:27Z","lastTransitionTime":"2026-03-09T14:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.433383 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.433441 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.433477 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.433505 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.433528 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:27Z","lastTransitionTime":"2026-03-09T14:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.532443 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-l5pwg"] Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.532980 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-l5pwg" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.534801 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.535502 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.535646 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.536032 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.536842 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.536877 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.536885 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.536897 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.536907 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:27Z","lastTransitionTime":"2026-03-09T14:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.553926 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d69359acfc7a4ee7698088a7e0062b3604a5ceb92910cd3b3d69b1a6291dfdcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://687646beb4c6dd3fb9b659a36449a573f4c389b3e98b07127b18516fdc4f2c93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:27Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.569307 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h4zw5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b9e29bb-6e51-47ab-a543-b70117ab854d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17cd3d1e3981e2f2a3b66780f753425c78452e6319fb8eae9935d3bac4456820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdw92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h4zw5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:27Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.588012 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\"
,\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5v7ng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:27Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.610510 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee2ed55248b177bc0873ef3c41517401fdb30fc5ce5a545a1ca8c925e4d4e9e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:27Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.626681 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jv499" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd1357a3-36ef-4bc8-85bb-91f3c0f42994\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e56992f54f9fc4387130b74e3aa0f4652a5d6d960da1a5305b8668b8086fae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e56992f54f9fc4387130b74e3aa0f4652a5d6d960da1a5305b8668b8086fae2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c00d804aec8d776f5f2ab9c7a81161c9942a94564d57afe47c633fb3dc2aa7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c00d804aec8d776f5f2ab9c7a81161c9942a94564d57afe47c633fb3dc2aa7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0491f9997970cb050ece19d88d2a7064d4799db3772132f58055e1dbcf2f3a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0491f9997970cb050ece19d88d2a7064d4799db3772132f58055e1dbcf2f3a82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93c7d9a6b0dc8e76abab762ff6aae1975a31fbc80ee3a3cd32ec378298061748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93c7d9a6b0dc8e76abab762ff6aae1975a31fbc80ee3a3cd32ec378298061748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cc5ae61b0fa949ba4bf5aa60d09385399a557f19ef7c99dc43b21fc9db830e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9cc5ae61b0fa949ba4bf5aa60d09385399a557f19ef7c99dc43b21fc9db830e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jv499\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:27Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.637102 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l5pwg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8424174-764a-470b-ab49-e7369a42de7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l5pwg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:27Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.639249 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.639294 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.639309 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.639331 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.639343 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:27Z","lastTransitionTime":"2026-03-09T14:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.650581 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:27Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.662261 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-72xb4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43e3c934-6aa7-4c96-a3a6-378f7931fc2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcf5e49245fd6051b12465a36b27f657b4923a62f88d61fe0fac971c4a2c9197\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lfx7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-72xb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:27Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.665005 4722 generic.go:334] "Generic (PLEG): container finished" podID="fd1357a3-36ef-4bc8-85bb-91f3c0f42994" containerID="a88cf1e6b90a3f13f44bbb08482773f007066c795b00cb0177f6cd2ed87cbb52" exitCode=0 Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.665054 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jv499" event={"ID":"fd1357a3-36ef-4bc8-85bb-91f3c0f42994","Type":"ContainerDied","Data":"a88cf1e6b90a3f13f44bbb08482773f007066c795b00cb0177f6cd2ed87cbb52"} Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.683519 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dac2aaf5-653b-4b2a-8efe-ed26bac8d648\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf5b268d6a355ae3a2c40981769412124b85cdfd99364ed58e5163a744f5f83c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q6db\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f451493fa6d86f12c1291c0fe262bcbe4b62cef45c8dda696bb48fa3ded2eeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q6db\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hjrrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:27Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.702156 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b8424174-764a-470b-ab49-e7369a42de7c-host\") pod \"node-ca-l5pwg\" (UID: \"b8424174-764a-470b-ab49-e7369a42de7c\") " pod="openshift-image-registry/node-ca-l5pwg" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.702299 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b8424174-764a-470b-ab49-e7369a42de7c-serviceca\") pod \"node-ca-l5pwg\" (UID: \"b8424174-764a-470b-ab49-e7369a42de7c\") " pod="openshift-image-registry/node-ca-l5pwg" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.702336 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-426rr\" (UniqueName: \"kubernetes.io/projected/b8424174-764a-470b-ab49-e7369a42de7c-kube-api-access-426rr\") pod \"node-ca-l5pwg\" (UID: \"b8424174-764a-470b-ab49-e7369a42de7c\") " pod="openshift-image-registry/node-ca-l5pwg" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.704404 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2267a4d7-87b4-43f7-80c7-7a7ffb0327fc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04169d9b8359e26cc9807e27fb9a5d54ca0946aaade294c5bcc272a7d21c4db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09fa62e7293a50fb3f6bdaf37061cbe5e95a830c8f7337660ce73138dbcaa641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\
"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31ba786781fcdb781f5fa45276e29e94756863d2398109d91e81a52f6f88bee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dff721159b382bcfdde6f406a328ef2ee9264b35a8d8baeb7a3c6d31a60ff3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63769aca40f721bec0716427145a3ceea0595836c03d31c98f6d45ff14de68be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6625c60c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6625c60c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:27Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.721387 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0965e24b-526e-4842-ac1f-eca7a765355d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56c76a5e8718243a17d8d9fd9a8dcf6083c2ccb65b4816926dca08a89e465728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3836660bf646052991965270cd69b8b8237b5c4bd698a73789b050d23e6877\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dbef38ff5e714c92fc8a86f1e3c64adf606cb198145f884a000f8dee200fcf2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T14:03:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 14:03:46.681607 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 14:03:46.681849 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 14:03:46.682597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2315154617/tls.crt::/tmp/serving-cert-2315154617/tls.key\\\\\\\"\\\\nI0309 14:03:46.866095 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 14:03:46.869876 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 14:03:46.869896 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 14:03:46.870280 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 14:03:46.870293 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 14:03:46.877096 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 14:03:46.877117 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 14:03:46.877121 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877135 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 14:03:46.877139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 14:03:46.877143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 14:03:46.877146 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 14:03:46.878419 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:03:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fbabc5ec2c100150ce886c06c3bd78cc5896c8a06ac9d262eeb552a8a8bb5d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:27Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.735329 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:27Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.743547 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.743587 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.743599 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.743617 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.743629 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:27Z","lastTransitionTime":"2026-03-09T14:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.751852 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:27Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.766532 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69e74824fe242fd2d54bed9734f33823697d7651043d9c97bac6ea5490625a0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:27Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.777923 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69e74824fe242fd2d54bed9734f33823697d7651043d9c97bac6ea5490625a0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:27Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.792140 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:27Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.803817 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b8424174-764a-470b-ab49-e7369a42de7c-host\") pod \"node-ca-l5pwg\" (UID: \"b8424174-764a-470b-ab49-e7369a42de7c\") " pod="openshift-image-registry/node-ca-l5pwg" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.803868 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b8424174-764a-470b-ab49-e7369a42de7c-serviceca\") pod \"node-ca-l5pwg\" (UID: \"b8424174-764a-470b-ab49-e7369a42de7c\") " pod="openshift-image-registry/node-ca-l5pwg" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.803888 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-426rr\" (UniqueName: \"kubernetes.io/projected/b8424174-764a-470b-ab49-e7369a42de7c-kube-api-access-426rr\") pod \"node-ca-l5pwg\" (UID: \"b8424174-764a-470b-ab49-e7369a42de7c\") " pod="openshift-image-registry/node-ca-l5pwg" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.804225 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b8424174-764a-470b-ab49-e7369a42de7c-host\") pod \"node-ca-l5pwg\" (UID: \"b8424174-764a-470b-ab49-e7369a42de7c\") " pod="openshift-image-registry/node-ca-l5pwg" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.805003 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b8424174-764a-470b-ab49-e7369a42de7c-serviceca\") pod \"node-ca-l5pwg\" (UID: \"b8424174-764a-470b-ab49-e7369a42de7c\") " pod="openshift-image-registry/node-ca-l5pwg" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.808603 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d69359acfc7a4ee7698088a7e0062b3604a5ceb92910cd3b3d69b1a6291dfdcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://687646beb4c6dd3fb9b659a36449a573f4c389b3e98b07127b18516fdc4f2c93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:27Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.823924 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h4zw5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b9e29bb-6e51-47ab-a543-b70117ab854d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17cd3d1e3981e2f2a3b66780f753425c78452e6319fb8eae9935d3bac4456820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdw92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h4zw5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:27Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.824646 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-426rr\" (UniqueName: \"kubernetes.io/projected/b8424174-764a-470b-ab49-e7369a42de7c-kube-api-access-426rr\") pod \"node-ca-l5pwg\" (UID: \"b8424174-764a-470b-ab49-e7369a42de7c\") " pod="openshift-image-registry/node-ca-l5pwg" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.843821 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5v7ng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:27Z 
is after 2025-08-24T17:21:41Z" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.845712 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.845739 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.845746 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.845760 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.845771 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:27Z","lastTransitionTime":"2026-03-09T14:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.850073 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-l5pwg" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.860901 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee2ed55248b177bc0873ef3c41517401fdb30fc5ce5a545a1ca8c925e4d4e9e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:27Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:27 crc kubenswrapper[4722]: W0309 14:04:27.867085 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8424174_764a_470b_ab49_e7369a42de7c.slice/crio-feff44a6864537318510b9a4a9ec89ef86e88b4053b4d96174ba0903d3c63d79 WatchSource:0}: Error finding container feff44a6864537318510b9a4a9ec89ef86e88b4053b4d96174ba0903d3c63d79: Status 404 returned error can't find the container with id feff44a6864537318510b9a4a9ec89ef86e88b4053b4d96174ba0903d3c63d79 Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.883891 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jv499" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd1357a3-36ef-4bc8-85bb-91f3c0f42994\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e56992f54f9fc4387130b74e3aa0f4652a5d6d960da1a5305b8668b8086fae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e56992f54f9fc4387130b74e3aa0f4652a5d6d960da1a5305b8668b8086fae2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c00d804aec8d776f5f2ab9c7a81161c9942a94564d57afe47c633fb3dc2aa7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c00d804aec8d776f5f2ab9c7a81161c9942a94564d57afe47c633fb3dc2aa7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0491f9997970cb050ece19d88d2a7064d4799db3772132f58055e1dbcf2f3a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0491f9997970cb050ece19d88d2a7064d4799db3772132f58055e1dbcf2f3a82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93c7d9a6b0dc8e76abab762ff6aae1975a31fbc80ee3a3cd32ec378298061748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93c7d9a6b0dc8e76abab762ff6aae1975a31fbc80ee3a3cd32ec378298061748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cc5ae61b0fa949ba4bf5aa60d09385399a557f19ef7c99dc43b21fc9db830e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9cc5ae61b0fa949ba4bf5aa60d09385399a557f19ef7c99dc43b21fc9db830e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a88cf1e6b90a3f13f44bbb08482773f007066c795b00cb0177f6cd2ed87cbb52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a88cf1e6b90a3f13f44bbb08482773f007066c795b00cb0177f6cd2ed87cbb52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jv499\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:27Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.898314 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l5pwg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8424174-764a-470b-ab49-e7369a42de7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l5pwg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:27Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.911693 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:27Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.926073 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:27Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.939371 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-72xb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43e3c934-6aa7-4c96-a3a6-378f7931fc2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcf5e49245fd6051b12465a36b27f657b4923a62f88d61fe0fac971c4a2c9197\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lfx7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-72xb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:27Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.948683 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.948713 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.948725 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.948742 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.948754 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:27Z","lastTransitionTime":"2026-03-09T14:04:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.952385 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dac2aaf5-653b-4b2a-8efe-ed26bac8d648\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf5b268d6a355ae3a2c40981769412124b85cdfd99364ed58e5163a744f5f83c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q6db\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri
-o://9f451493fa6d86f12c1291c0fe262bcbe4b62cef45c8dda696bb48fa3ded2eeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q6db\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hjrrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:27Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.976228 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2267a4d7-87b4-43f7-80c7-7a7ffb0327fc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04169d9b8359e26cc9807e27fb9a5d54ca0946aaade294c5bcc272a7d21c4db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09fa
62e7293a50fb3f6bdaf37061cbe5e95a830c8f7337660ce73138dbcaa641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31ba786781fcdb781f5fa45276e29e94756863d2398109d91e81a52f6f88bee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dff721159b382bcfdde6f406a328ef2ee9264b35a8d8baeb7a3c6d31a60ff3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63769aca40f721bec0716427145a3ceea0595836c03d31c98f6d45ff14de68be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6625c60c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6625c60c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:27Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:27 crc kubenswrapper[4722]: I0309 14:04:27.994792 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0965e24b-526e-4842-ac1f-eca7a765355d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56c76a5e8718243a17d8d9fd9a8dcf6083c2ccb65b4816926dca08a89e465728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3836660bf646052991965270cd69b8b8237b5c4bd698a73789b050d23e6877\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dbef38ff5e714c92fc8a86f1e3c64adf606cb198145f884a000f8dee200fcf2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T14:03:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 14:03:46.681607 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 14:03:46.681849 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 14:03:46.682597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2315154617/tls.crt::/tmp/serving-cert-2315154617/tls.key\\\\\\\"\\\\nI0309 14:03:46.866095 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 14:03:46.869876 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 14:03:46.869896 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 14:03:46.870280 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 14:03:46.870293 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 14:03:46.877096 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 14:03:46.877117 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 14:03:46.877121 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877135 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 14:03:46.877139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 14:03:46.877143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 14:03:46.877146 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 14:03:46.878419 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:03:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fbabc5ec2c100150ce886c06c3bd78cc5896c8a06ac9d262eeb552a8a8bb5d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:27Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:28 crc kubenswrapper[4722]: I0309 14:04:28.051684 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:28 crc kubenswrapper[4722]: I0309 14:04:28.051726 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:28 crc kubenswrapper[4722]: I0309 14:04:28.051739 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:28 crc kubenswrapper[4722]: I0309 14:04:28.051758 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:28 crc kubenswrapper[4722]: I0309 14:04:28.051772 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:28Z","lastTransitionTime":"2026-03-09T14:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:28 crc kubenswrapper[4722]: I0309 14:04:28.148600 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 14:04:28 crc kubenswrapper[4722]: E0309 14:04:28.148793 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 14:04:28 crc kubenswrapper[4722]: I0309 14:04:28.149637 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 14:04:28 crc kubenswrapper[4722]: E0309 14:04:28.149864 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 14:04:28 crc kubenswrapper[4722]: I0309 14:04:28.149976 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 14:04:28 crc kubenswrapper[4722]: E0309 14:04:28.150067 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 14:04:28 crc kubenswrapper[4722]: I0309 14:04:28.155960 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:28 crc kubenswrapper[4722]: I0309 14:04:28.155994 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:28 crc kubenswrapper[4722]: I0309 14:04:28.156003 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:28 crc kubenswrapper[4722]: I0309 14:04:28.156018 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:28 crc kubenswrapper[4722]: I0309 14:04:28.156028 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:28Z","lastTransitionTime":"2026-03-09T14:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:28 crc kubenswrapper[4722]: I0309 14:04:28.258789 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:28 crc kubenswrapper[4722]: I0309 14:04:28.258822 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:28 crc kubenswrapper[4722]: I0309 14:04:28.258834 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:28 crc kubenswrapper[4722]: I0309 14:04:28.258851 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:28 crc kubenswrapper[4722]: I0309 14:04:28.258862 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:28Z","lastTransitionTime":"2026-03-09T14:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:28 crc kubenswrapper[4722]: I0309 14:04:28.361846 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:28 crc kubenswrapper[4722]: I0309 14:04:28.361941 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:28 crc kubenswrapper[4722]: I0309 14:04:28.361959 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:28 crc kubenswrapper[4722]: I0309 14:04:28.361991 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:28 crc kubenswrapper[4722]: I0309 14:04:28.362016 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:28Z","lastTransitionTime":"2026-03-09T14:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:28 crc kubenswrapper[4722]: I0309 14:04:28.464534 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:28 crc kubenswrapper[4722]: I0309 14:04:28.464571 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:28 crc kubenswrapper[4722]: I0309 14:04:28.464580 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:28 crc kubenswrapper[4722]: I0309 14:04:28.464594 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:28 crc kubenswrapper[4722]: I0309 14:04:28.464603 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:28Z","lastTransitionTime":"2026-03-09T14:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:28 crc kubenswrapper[4722]: I0309 14:04:28.567778 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:28 crc kubenswrapper[4722]: I0309 14:04:28.567818 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:28 crc kubenswrapper[4722]: I0309 14:04:28.567830 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:28 crc kubenswrapper[4722]: I0309 14:04:28.567846 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:28 crc kubenswrapper[4722]: I0309 14:04:28.567856 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:28Z","lastTransitionTime":"2026-03-09T14:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:28 crc kubenswrapper[4722]: I0309 14:04:28.669846 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:28 crc kubenswrapper[4722]: I0309 14:04:28.669894 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:28 crc kubenswrapper[4722]: I0309 14:04:28.669905 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:28 crc kubenswrapper[4722]: I0309 14:04:28.669919 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:28 crc kubenswrapper[4722]: I0309 14:04:28.669929 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:28Z","lastTransitionTime":"2026-03-09T14:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:28 crc kubenswrapper[4722]: I0309 14:04:28.685893 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jv499" event={"ID":"fd1357a3-36ef-4bc8-85bb-91f3c0f42994","Type":"ContainerStarted","Data":"8a14e62f50fa8e5403349dc8696e9c7c45a5da1026227612186e7805b1be722f"} Mar 09 14:04:28 crc kubenswrapper[4722]: I0309 14:04:28.692259 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-l5pwg" event={"ID":"b8424174-764a-470b-ab49-e7369a42de7c","Type":"ContainerStarted","Data":"7d30648bba1b59fd7db5e75a61d6a004246668d148515da5738b77e5122358f6"} Mar 09 14:04:28 crc kubenswrapper[4722]: I0309 14:04:28.692356 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-l5pwg" event={"ID":"b8424174-764a-470b-ab49-e7369a42de7c","Type":"ContainerStarted","Data":"feff44a6864537318510b9a4a9ec89ef86e88b4053b4d96174ba0903d3c63d79"} Mar 09 14:04:28 crc kubenswrapper[4722]: I0309 14:04:28.698511 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" event={"ID":"2e305619-b3a2-44c9-9e54-e1afa4f43dbf","Type":"ContainerStarted","Data":"95b64a8b9951808b4bb84077c4173f26cc7ddba47598c21c2da2dd36baed95e2"} Mar 09 14:04:28 crc kubenswrapper[4722]: I0309 14:04:28.699093 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" Mar 09 14:04:28 crc kubenswrapper[4722]: I0309 14:04:28.711839 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h4zw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b9e29bb-6e51-47ab-a543-b70117ab854d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17cd3d1e3981e2f2a3b66780f753425c78452e6319fb8eae9935d3bac4456820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\
\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdw92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h4zw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:28Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:28 crc kubenswrapper[4722]: I0309 14:04:28.732840 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" Mar 09 14:04:28 crc kubenswrapper[4722]: I0309 14:04:28.743496 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd 
nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cn
i-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5v7ng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-09T14:04:28Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:28 crc kubenswrapper[4722]: I0309 14:04:28.760753 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d69359acfc7a4ee7698088a7e0062b3604a5ceb92910cd3b3d69b1a6291dfdcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://687646beb4c6dd3fb9b659a36449a573f4c389b3e98b07127b18516fdc4f2c93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:28Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:28 crc kubenswrapper[4722]: I0309 14:04:28.772652 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:28 crc 
kubenswrapper[4722]: I0309 14:04:28.772699 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:28 crc kubenswrapper[4722]: I0309 14:04:28.772715 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:28 crc kubenswrapper[4722]: I0309 14:04:28.772736 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:28 crc kubenswrapper[4722]: I0309 14:04:28.772810 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:28Z","lastTransitionTime":"2026-03-09T14:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:28 crc kubenswrapper[4722]: I0309 14:04:28.777621 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jv499" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd1357a3-36ef-4bc8-85bb-91f3c0f42994\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a14e62f50fa8e5403349dc8696e9c7c45a5da1026227612186e7805b1be722f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e56992f54f9fc4387130b74e3aa0f4652a5d6d960da1a5305b8668b8086fae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\
\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e56992f54f9fc4387130b74e3aa0f4652a5d6d960da1a5305b8668b8086fae2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c00d804aec8d776f5f2ab9c7a81161c9942a94564d57afe47c633fb3dc2aa7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c00d804aec8d776f5f2ab9c7a81161c9942a94564d57afe47c633fb3dc2aa7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0491f9997970cb050ece19d88d2a7064d4799db3772132f58055e1dbcf2f3a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0491f9997970cb050ece19d88d2a7064d4799db3772132f58055e1dbcf2f3a82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93c7d9a6b0dc8e76abab762ff6aae1975a31fbc80ee3a3cd32ec378298061748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93c7d9a6b0dc8e76abab762ff6aae1975a31fbc80ee3a3cd32ec378298061748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cc5ae61b0fa949ba4bf5aa60d09385399a557f19ef7c99dc43b21fc9db830e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9cc5ae61b0fa949ba4bf5aa60d09385399a557f19ef7c99dc43b21fc9db830e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a88cf1e6b90a3f13f44bbb08482773f007066c795b00cb0177f6cd2ed87cbb52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a88cf1e6b90a3f13f44bbb08482773f007066c795b00cb0177f6cd2ed87cbb52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mount
Path\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jv499\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:28Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:28 crc kubenswrapper[4722]: I0309 14:04:28.790549 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l5pwg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8424174-764a-470b-ab49-e7369a42de7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l5pwg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:28Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:28 crc 
kubenswrapper[4722]: I0309 14:04:28.807067 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee2ed55248b177bc0873ef3c41517401fdb30fc5ce5a545a1ca8c925e4d4e9e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:28Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:28 crc kubenswrapper[4722]: I0309 14:04:28.831023 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2267a4d7-87b4-43f7-80c7-7a7ffb0327fc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04169d9b8359e26cc9807e27fb9a5d54ca0946aaade294c5bcc272a7d21c4db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09fa62e7293a50fb3f6bdaf37061cbe5e95a830c8f7337660ce73138dbcaa641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31ba786781fcdb781f5fa45276e29e94756863d2398109d91e81a52f6f88bee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dff721159b382bcfdde6f406a328ef2ee9264b
35a8d8baeb7a3c6d31a60ff3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63769aca40f721bec0716427145a3ceea0595836c03d31c98f6d45ff14de68be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6625c60c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6625c60c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:28Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:28 crc kubenswrapper[4722]: I0309 14:04:28.845675 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0965e24b-526e-4842-ac1f-eca7a765355d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56c76a5e8718243a17d8d9fd9a8dcf6083c2ccb65b4816926dca08a89e465728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3836660bf646052991965270cd69b8b8237b5c4bd698a73789b050d23e6877\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dbef38ff5e714c92fc8a86f1e3c64adf606cb198145f884a000f8dee200fcf2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T14:03:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 14:03:46.681607 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 14:03:46.681849 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 14:03:46.682597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2315154617/tls.crt::/tmp/serving-cert-2315154617/tls.key\\\\\\\"\\\\nI0309 14:03:46.866095 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 14:03:46.869876 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 14:03:46.869896 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 14:03:46.870280 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 14:03:46.870293 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 14:03:46.877096 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 14:03:46.877117 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 14:03:46.877121 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877135 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 14:03:46.877139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 14:03:46.877143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 14:03:46.877146 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 14:03:46.878419 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:03:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fbabc5ec2c100150ce886c06c3bd78cc5896c8a06ac9d262eeb552a8a8bb5d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:28Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:28 crc kubenswrapper[4722]: I0309 14:04:28.861037 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:28Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:28 crc kubenswrapper[4722]: I0309 14:04:28.875008 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:28 crc kubenswrapper[4722]: I0309 14:04:28.875068 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:28 crc kubenswrapper[4722]: I0309 14:04:28.875084 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:28 crc kubenswrapper[4722]: I0309 14:04:28.875104 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:28 crc kubenswrapper[4722]: I0309 14:04:28.875120 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:28Z","lastTransitionTime":"2026-03-09T14:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:28 crc kubenswrapper[4722]: I0309 14:04:28.875661 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:28Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:28 crc kubenswrapper[4722]: I0309 14:04:28.889539 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-72xb4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43e3c934-6aa7-4c96-a3a6-378f7931fc2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcf5e49245fd6051b12465a36b27f657b4923a62f88d61fe0fac971c4a2c9197\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lfx7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-72xb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:28Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:28 crc kubenswrapper[4722]: I0309 14:04:28.903193 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dac2aaf5-653b-4b2a-8efe-ed26bac8d648\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf5b268d6a355ae3a2c40981769412124b85cdfd99364ed58e5163a744f5f83c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q6db\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f451493fa6d86f12c1291c0fe262bcbe4b62cef45c8dda696bb48fa3ded2eeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q6db\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hjrrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:28Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:28 crc kubenswrapper[4722]: I0309 14:04:28.917740 4722 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:28Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:28 crc kubenswrapper[4722]: I0309 14:04:28.930891 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69e74824fe242fd2d54bed9734f33823697d7651043d9c97bac6ea5490625a0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:28Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:28 crc kubenswrapper[4722]: I0309 14:04:28.951912 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee2ed55248b177bc0873ef3c41517401fdb30fc5ce5a545a1ca8c925e4d4e9e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:28Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:28 crc kubenswrapper[4722]: I0309 14:04:28.973177 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jv499" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd1357a3-36ef-4bc8-85bb-91f3c0f42994\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a14e62f50fa8e5403349dc8696e9c7c45a5da1026227612186e7805b1be722f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e56992f54f9fc4387130b74e3aa0f4652a5d6d960da1a5305b8668b8086fae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e56992f54f9fc4387130b74e3aa0f4652a5d6d960da1a5305b8668b8086fae2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c00d804aec8d776f5f2ab9c7a81161c9942a94564d57afe47c633fb3dc2aa7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c00d804aec8d776f5f2ab9c7a81161c9942a94564d57afe47c633fb3dc2aa7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0491f9997970cb050ece19d88d2a7064d4799db3772132f58055e1dbcf2f3a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0491f9997970cb050ece19d88d2a7064d4799db3772132f58055e1dbcf2f3a82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93c7d9a6b0dc8e76abab762ff6aae1975a31fbc80ee3a3cd32ec378298061748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93c7d9a6b0dc8e76abab762ff6aae1975a31fbc80ee3a3cd32ec378298061748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cc5ae61b0fa949ba4bf5aa60d09385399a557f19ef7c99dc43b21fc9db830e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9cc5ae61b0fa949ba4bf5aa60d09385399a557f19ef7c99dc43b21fc9db830e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a88cf1e6b90a3f13f44bbb08482773f007066c795b00cb0177f6cd2ed87cbb52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a88cf1e6b90a3f13f44bbb08482773f007066c795b00cb0177f6cd2ed87cbb52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jv499\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:28Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:28 crc kubenswrapper[4722]: I0309 14:04:28.978233 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:28 crc kubenswrapper[4722]: I0309 14:04:28.978258 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:28 crc 
kubenswrapper[4722]: I0309 14:04:28.978269 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:28 crc kubenswrapper[4722]: I0309 14:04:28.978285 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:28 crc kubenswrapper[4722]: I0309 14:04:28.978299 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:28Z","lastTransitionTime":"2026-03-09T14:04:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:28 crc kubenswrapper[4722]: I0309 14:04:28.987935 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l5pwg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8424174-764a-470b-ab49-e7369a42de7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d30648bba1b59fd7db5e75a61d6a004246668d148515da5738b77e5122358f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l5pwg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:28Z is after 2025-08-24T17:21:41Z" Mar 
09 14:04:29 crc kubenswrapper[4722]: I0309 14:04:29.002478 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:29Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:29 crc kubenswrapper[4722]: I0309 14:04:29.015598 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-72xb4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43e3c934-6aa7-4c96-a3a6-378f7931fc2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcf5e49245fd6051b12465a36b27f657b4923a62f88d61fe0fac971c4a2c9197\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lfx7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-72xb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:29Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:29 crc kubenswrapper[4722]: I0309 14:04:29.026569 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dac2aaf5-653b-4b2a-8efe-ed26bac8d648\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf5b268d6a355ae3a2c40981769412124b85cdfd99364ed58e5163a744f5f83c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q6db\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f451493fa6d86f12c1291c0fe262bcbe4b62cef45c8dda696bb48fa3ded2eeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q6db\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hjrrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:29Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:29 crc kubenswrapper[4722]: I0309 14:04:29.054615 4722 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2267a4d7-87b4-43f7-80c7-7a7ffb0327fc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04169d9b8359e26cc9807e27fb9a5d54ca0946aaade294c5bcc272a7d21c4db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09fa62e7293a50fb3f6bdaf37061cbe5e95a830c8f7337660ce73138dbcaa641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31ba786781fcdb781f5fa45276e29e94756863d2398109d91e81a52f6f88bee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-di
r\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dff721159b382bcfdde6f406a328ef2ee9264b35a8d8baeb7a3c6d31a60ff3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63769aca40f721bec0716427145a3ceea0595836c03d31c98f6d45ff14de68be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6625c60c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6625c60c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}}},{\\\"containerID\\\"
:\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:29Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:29 crc kubenswrapper[4722]: I0309 14:04:29.072787 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0965e24b-526e-4842-ac1f-eca7a765355d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56c76a5e8718243a17d8d9fd9a8dcf6083c2ccb65b4816926dca08a89e465728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3836660bf646052991965270cd69b8b8237b5c4bd698a73789b050d23e6877\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dbef38ff5e714c92fc8a86f1e3c64adf606cb198145f884a000f8dee200fcf2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T14:03:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 14:03:46.681607 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 14:03:46.681849 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 14:03:46.682597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2315154617/tls.crt::/tmp/serving-cert-2315154617/tls.key\\\\\\\"\\\\nI0309 14:03:46.866095 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 14:03:46.869876 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 14:03:46.869896 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 14:03:46.870280 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 14:03:46.870293 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 14:03:46.877096 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 14:03:46.877117 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 14:03:46.877121 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877135 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 14:03:46.877139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 14:03:46.877143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 14:03:46.877146 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 14:03:46.878419 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:03:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fbabc5ec2c100150ce886c06c3bd78cc5896c8a06ac9d262eeb552a8a8bb5d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:29Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:29 crc kubenswrapper[4722]: I0309 14:04:29.081629 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:29 crc kubenswrapper[4722]: I0309 14:04:29.081712 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:29 crc kubenswrapper[4722]: I0309 14:04:29.081725 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:29 crc kubenswrapper[4722]: I0309 14:04:29.081749 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:29 crc kubenswrapper[4722]: I0309 14:04:29.081764 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:29Z","lastTransitionTime":"2026-03-09T14:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:29 crc kubenswrapper[4722]: I0309 14:04:29.093484 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:29Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:29 crc kubenswrapper[4722]: I0309 14:04:29.112159 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:29Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:29 crc kubenswrapper[4722]: I0309 14:04:29.127481 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69e74824fe242fd2d54bed9734f33823697d7651043d9c97bac6ea5490625a0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:29Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:29 crc kubenswrapper[4722]: I0309 14:04:29.149121 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d69359acfc7a4ee7698088a7e0062b3604a5ceb92910cd3b3d69b1a6291dfdcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://687646beb4c6dd3fb9b659a36449a573f4c389b3e98b07127b18516fdc4f2c93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:29Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:29 crc kubenswrapper[4722]: I0309 14:04:29.166194 
4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h4zw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b9e29bb-6e51-47ab-a543-b70117ab854d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17cd3d1e3981e2f2a3b66780f753425c78452e6319fb8eae9935d3bac4456820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdw92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:
21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h4zw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:29Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:29 crc kubenswrapper[4722]: I0309 14:04:29.184299 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:29 crc kubenswrapper[4722]: I0309 14:04:29.184364 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:29 crc kubenswrapper[4722]: I0309 14:04:29.184379 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:29 crc kubenswrapper[4722]: I0309 14:04:29.184822 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:29 crc kubenswrapper[4722]: I0309 14:04:29.184873 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:29Z","lastTransitionTime":"2026-03-09T14:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:29 crc kubenswrapper[4722]: I0309 14:04:29.192808 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1fc80c43be93eeb2f9805bf314c8430f52062d0ce5d0fa1830e7e0c27e674e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f969085ae62d82dbe1f0f8f5d459c4a1bc7d1ae0a459da484f11dd6ae5fdd418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5765145caac69454a9a8e8de34d05cb1da06924dd365f3e74f52461be5027ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef1c5a3944e8693d8c68a4108c23851352407d0fac7fb7a03763d9084f117dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0106fa808804a96ca8247096399e3d04eb082b2691b1f39df3db9f6c630114c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e737417baac512fab2089f5da12e7c47ef06eae8df7803ad6b248505ff3d18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95b64a8b9951808b4bb84077c4173f26cc7ddba4
7598c21c2da2dd36baed95e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f5af077ae5b947194e5dba0542d053890a589582e415e1b41a932bdab6632e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5v7ng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:29Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:29 crc kubenswrapper[4722]: I0309 14:04:29.288699 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:29 crc kubenswrapper[4722]: I0309 14:04:29.288738 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:29 crc kubenswrapper[4722]: I0309 14:04:29.288749 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:29 crc kubenswrapper[4722]: I0309 14:04:29.288764 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:29 crc kubenswrapper[4722]: I0309 14:04:29.288776 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:29Z","lastTransitionTime":"2026-03-09T14:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:29 crc kubenswrapper[4722]: I0309 14:04:29.391922 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:29 crc kubenswrapper[4722]: I0309 14:04:29.391974 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:29 crc kubenswrapper[4722]: I0309 14:04:29.391987 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:29 crc kubenswrapper[4722]: I0309 14:04:29.392009 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:29 crc kubenswrapper[4722]: I0309 14:04:29.392024 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:29Z","lastTransitionTime":"2026-03-09T14:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:29 crc kubenswrapper[4722]: I0309 14:04:29.495730 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:29 crc kubenswrapper[4722]: I0309 14:04:29.495805 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:29 crc kubenswrapper[4722]: I0309 14:04:29.495824 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:29 crc kubenswrapper[4722]: I0309 14:04:29.495854 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:29 crc kubenswrapper[4722]: I0309 14:04:29.495873 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:29Z","lastTransitionTime":"2026-03-09T14:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:29 crc kubenswrapper[4722]: I0309 14:04:29.600514 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:29 crc kubenswrapper[4722]: I0309 14:04:29.600595 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:29 crc kubenswrapper[4722]: I0309 14:04:29.600614 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:29 crc kubenswrapper[4722]: I0309 14:04:29.600643 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:29 crc kubenswrapper[4722]: I0309 14:04:29.600663 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:29Z","lastTransitionTime":"2026-03-09T14:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:29 crc kubenswrapper[4722]: I0309 14:04:29.703951 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:29 crc kubenswrapper[4722]: I0309 14:04:29.704029 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:29 crc kubenswrapper[4722]: I0309 14:04:29.704051 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:29 crc kubenswrapper[4722]: I0309 14:04:29.704080 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:29 crc kubenswrapper[4722]: I0309 14:04:29.704100 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:29Z","lastTransitionTime":"2026-03-09T14:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:29 crc kubenswrapper[4722]: I0309 14:04:29.705585 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" Mar 09 14:04:29 crc kubenswrapper[4722]: I0309 14:04:29.705637 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" Mar 09 14:04:29 crc kubenswrapper[4722]: I0309 14:04:29.743983 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" Mar 09 14:04:29 crc kubenswrapper[4722]: I0309 14:04:29.765364 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dac2aaf5-653b-4b2a-8efe-ed26bac8d648\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf5b268d6a355ae3a2c40981769412124b85cdfd99364ed58e5163a744f5f83c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q6db\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f451493fa6d86f12c1291c0fe262bcbe4b62cef45c8dda696bb48fa3ded2eeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q6db\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hjrrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:29Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:29 crc kubenswrapper[4722]: I0309 14:04:29.800953 4722 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2267a4d7-87b4-43f7-80c7-7a7ffb0327fc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04169d9b8359e26cc9807e27fb9a5d54ca0946aaade294c5bcc272a7d21c4db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09fa62e7293a50fb3f6bdaf37061cbe5e95a830c8f7337660ce73138dbcaa641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31ba786781fcdb781f5fa45276e29e94756863d2398109d91e81a52f6f88bee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-di
r\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dff721159b382bcfdde6f406a328ef2ee9264b35a8d8baeb7a3c6d31a60ff3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63769aca40f721bec0716427145a3ceea0595836c03d31c98f6d45ff14de68be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6625c60c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6625c60c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}}},{\\\"containerID\\\"
:\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:29Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:29 crc kubenswrapper[4722]: I0309 14:04:29.806973 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:29 crc kubenswrapper[4722]: I0309 14:04:29.807022 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:29 crc kubenswrapper[4722]: I0309 14:04:29.807040 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:29 crc kubenswrapper[4722]: I0309 14:04:29.807071 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:29 crc kubenswrapper[4722]: I0309 14:04:29.807096 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:29Z","lastTransitionTime":"2026-03-09T14:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:29 crc kubenswrapper[4722]: I0309 14:04:29.824485 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0965e24b-526e-4842-ac1f-eca7a765355d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56c76a5e8718243a17d8d9fd9a8dcf6083c2ccb65b4816926dca08a89e465728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3836660bf646052991965270cd69b8b8237b5c4bd698a73789b050d23e6877\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dbef38ff5e714c92fc8a86f1e3c64adf606cb198145f884a000f8dee200fcf2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T14:03:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 14:03:46.681607 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 14:03:46.681849 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 14:03:46.682597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2315154617/tls.crt::/tmp/serving-cert-2315154617/tls.key\\\\\\\"\\\\nI0309 14:03:46.866095 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 14:03:46.869876 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 14:03:46.869896 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 14:03:46.870280 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 14:03:46.870293 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 14:03:46.877096 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 14:03:46.877117 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 14:03:46.877121 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877135 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 14:03:46.877139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 14:03:46.877143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 14:03:46.877146 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 14:03:46.878419 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:03:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fbabc5ec2c100150ce886c06c3bd78cc5896c8a06ac9d262eeb552a8a8bb5d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:29Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:29 crc kubenswrapper[4722]: I0309 14:04:29.848459 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:29Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:29 crc kubenswrapper[4722]: I0309 14:04:29.870990 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:29Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:29 crc kubenswrapper[4722]: I0309 14:04:29.889077 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-72xb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43e3c934-6aa7-4c96-a3a6-378f7931fc2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcf5e49245fd6051b12465a36b27f657b4923a62f88d61fe0fac971c4a2c9197\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lfx7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-72xb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:29Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:29 crc kubenswrapper[4722]: I0309 14:04:29.909211 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:29Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:29 crc kubenswrapper[4722]: I0309 14:04:29.911627 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:29 crc kubenswrapper[4722]: I0309 14:04:29.911736 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:29 crc kubenswrapper[4722]: I0309 14:04:29.911765 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:29 crc kubenswrapper[4722]: I0309 14:04:29.911799 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:29 crc kubenswrapper[4722]: I0309 14:04:29.911844 4722 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:29Z","lastTransitionTime":"2026-03-09T14:04:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:29 crc kubenswrapper[4722]: I0309 14:04:29.931549 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69e74824fe242fd2d54bed9734f33823697d7651043d9c97bac6ea5490625a0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:29Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:29 crc kubenswrapper[4722]: I0309 14:04:29.952653 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d69359acfc7a4ee7698088a7e0062b3604a5ceb92910cd3b3d69b1a6291dfdcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://687646beb4c6dd3fb9b659a36449a573f4c389b3e98b07127b18516fdc4f2c93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:29Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:29 crc kubenswrapper[4722]: I0309 14:04:29.979468 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h4zw5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b9e29bb-6e51-47ab-a543-b70117ab854d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17cd3d1e3981e2f2a3b66780f753425c78452e6319fb8eae9935d3bac4456820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdw92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h4zw5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:29Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:30 crc kubenswrapper[4722]: I0309 14:04:30.012771 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1fc80c43be93eeb2f9805bf314c8430f52062d0ce5d0fa1830e7e0c27e674e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f969085ae62d82dbe1f0f8f5d459c4a1bc7d1ae0a459da484f11dd6ae5fdd418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5765145caac69454a9a8e8de34d05cb1da06924dd365f3e74f52461be5027ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef1c5a3944e8693d8c68a4108c23851352407d0fac7fb7a03763d9084f117dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0106fa808804a96ca8247096399e3d04eb082b2691b1f39df3db9f6c630114c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e73
7417baac512fab2089f5da12e7c47ef06eae8df7803ad6b248505ff3d18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95b64a8b9951808b4bb84077c4173f26cc7ddba47598c21c2da2dd36baed95e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\
\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f5af077ae5b947194e5dba0542d053890a589582e415e1b41a932bdab6632e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5v7ng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:30Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:30 crc kubenswrapper[4722]: I0309 14:04:30.014669 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 
09 14:04:30 crc kubenswrapper[4722]: I0309 14:04:30.014728 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:30 crc kubenswrapper[4722]: I0309 14:04:30.014740 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:30 crc kubenswrapper[4722]: I0309 14:04:30.014762 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:30 crc kubenswrapper[4722]: I0309 14:04:30.014773 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:30Z","lastTransitionTime":"2026-03-09T14:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:30 crc kubenswrapper[4722]: I0309 14:04:30.033772 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee2ed55248b177bc0873ef3c41517401fdb30fc5ce5a545a1ca8c925e4d4e9e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:30Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:30 crc kubenswrapper[4722]: I0309 14:04:30.052055 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jv499" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd1357a3-36ef-4bc8-85bb-91f3c0f42994\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a14e62f50fa8e5403349dc8696e9c7c45a5da1026227612186e7805b1be722f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e56992f54f9fc4387130b74e3aa0f4652a5d6d960da1a5305b8668b8086fae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e56992f54f9fc4387130b74e3aa0f4652a5d6d960da1a5305b8668b8086fae2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c00d804aec8d776f5f2ab9c7a81161c9942a94564d57afe47c633fb3dc2aa7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c00d804aec8d776f5f2ab9c7a81161c9942a94564d57afe47c633fb3dc2aa7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0491f9997970cb050ece19d88d2a7064d4799db3772132f58055e1dbcf2f3a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0491f9997970cb050ece19d88d2a7064d4799db3772132f58055e1dbcf2f3a82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93c7d9a6b0dc8e76abab762ff6aae1975a31fbc80ee3a3cd32ec378298061748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93c7d9a6b0dc8e76abab762ff6aae1975a31fbc80ee3a3cd32ec378298061748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cc5ae61b0fa949ba4bf5aa60d09385399a557f19ef7c99dc43b21fc9db830e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9cc5ae61b0fa949ba4bf5aa60d09385399a557f19ef7c99dc43b21fc9db830e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a88cf1e6b90a3f13f44bbb08482773f007066c795b00cb0177f6cd2ed87cbb52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a88cf1e6b90a3f13f44bbb08482773f007066c795b00cb0177f6cd2ed87cbb52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jv499\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:30Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:30 crc kubenswrapper[4722]: I0309 14:04:30.068828 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l5pwg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8424174-764a-470b-ab49-e7369a42de7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d30648bba1b59fd7db5e75a61d6a004246668d148515da5738b77e5122358f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l5pwg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:30Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:30 crc kubenswrapper[4722]: I0309 14:04:30.118537 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:30 crc kubenswrapper[4722]: I0309 14:04:30.118616 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:30 crc kubenswrapper[4722]: I0309 14:04:30.118636 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:30 crc kubenswrapper[4722]: I0309 14:04:30.118669 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:30 crc kubenswrapper[4722]: I0309 14:04:30.118691 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:30Z","lastTransitionTime":"2026-03-09T14:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:30 crc kubenswrapper[4722]: I0309 14:04:30.148885 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 14:04:30 crc kubenswrapper[4722]: I0309 14:04:30.148954 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 14:04:30 crc kubenswrapper[4722]: E0309 14:04:30.149023 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 14:04:30 crc kubenswrapper[4722]: I0309 14:04:30.148974 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 14:04:30 crc kubenswrapper[4722]: E0309 14:04:30.149146 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 14:04:30 crc kubenswrapper[4722]: E0309 14:04:30.149448 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 14:04:30 crc kubenswrapper[4722]: I0309 14:04:30.166923 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:30Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:30 crc kubenswrapper[4722]: I0309 14:04:30.190427 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69e74824fe242fd2d54bed9734f33823697d7651043d9c97bac6ea5490625a0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:30Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:30 crc kubenswrapper[4722]: I0309 14:04:30.212126 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d69359acfc7a4ee7698088a7e0062b3604a5ceb92910cd3b3d69b1a6291dfdcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://687646beb4c6dd3fb9b659a36449a573f4c389b3e98b07127b18516fdc4f2c93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:30Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:30 crc kubenswrapper[4722]: I0309 14:04:30.221870 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:30 crc kubenswrapper[4722]: I0309 14:04:30.221992 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:30 crc kubenswrapper[4722]: I0309 14:04:30.222024 4722 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 09 14:04:30 crc kubenswrapper[4722]: I0309 14:04:30.222049 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:30 crc kubenswrapper[4722]: I0309 14:04:30.222065 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:30Z","lastTransitionTime":"2026-03-09T14:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:30 crc kubenswrapper[4722]: I0309 14:04:30.234805 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h4zw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b9e29bb-6e51-47ab-a543-b70117ab854d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17cd3d1e3981e2f2a3b66780f753425c78452e6319fb8eae9935d3bac4456820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"
},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdw92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h4zw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:30Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:30 crc kubenswrapper[4722]: I0309 14:04:30.272578 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1fc80c43be93eeb2f9805bf314c8430f52062d0ce5d0fa1830e7e0c27e674e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f969085ae62d82dbe1f0f8f5d459c4a1bc7d1ae0a459da484f11dd6ae5fdd418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5765145caac69454a9a8e8de34d05cb1da06924dd365f3e74f52461be5027ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef1c5a3944e8693d8c68a4108c23851352407d0fac7fb7a03763d9084f117dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0106fa808804a96ca8247096399e3d04eb082b2691b1f39df3db9f6c630114c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e737417baac512fab2089f5da12e7c47ef06eae8df7803ad6b248505ff3d18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95b64a8b9951808b4bb84077c4173f26cc7ddba4
7598c21c2da2dd36baed95e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f5af077ae5b947194e5dba0542d053890a589582e415e1b41a932bdab6632e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5v7ng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:30Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:30 crc kubenswrapper[4722]: I0309 14:04:30.294380 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee2ed55248b177bc0873ef3c41517401fdb30fc5ce5a545a1ca8c925e4d4e9e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:30Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:30 crc kubenswrapper[4722]: I0309 14:04:30.318256 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jv499" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd1357a3-36ef-4bc8-85bb-91f3c0f42994\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a14e62f50fa8e5403349dc8696e9c7c45a5da1026227612186e7805b1be722f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e56992f54f9fc4387130b74e3aa0f4652a5d6d960da1a5305b8668b8086fae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e56992f54f9fc4387130b74e3aa0f4652a5d6d960da1a5305b8668b8086fae2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c00d804aec8d776f5f2ab9c7a81161c9942a94564d57afe47c633fb3dc2aa7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c00d804aec8d776f5f2ab9c7a81161c9942a94564d57afe47c633fb3dc2aa7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0491f9997970cb050ece19d88d2a7064d4799db3772132f58055e1dbcf2f3a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0491f9997970cb050ece19d88d2a7064d4799db3772132f58055e1dbcf2f3a82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93c7d9a6b0dc8e76abab762ff6aae1975a31fbc80ee3a3cd32ec378298061748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93c7d9a6b0dc8e76abab762ff6aae1975a31fbc80ee3a3cd32ec378298061748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cc5ae61b0fa949ba4bf5aa60d09385399a557f19ef7c99dc43b21fc9db830e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9cc5ae61b0fa949ba4bf5aa60d09385399a557f19ef7c99dc43b21fc9db830e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a88cf1e6b90a3f13f44bbb08482773f007066c795b00cb0177f6cd2ed87cbb52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a88cf1e6b90a3f13f44bbb08482773f007066c795b00cb0177f6cd2ed87cbb52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jv499\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:30Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:30 crc kubenswrapper[4722]: I0309 14:04:30.325881 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:30 crc kubenswrapper[4722]: I0309 14:04:30.325930 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:30 crc 
kubenswrapper[4722]: I0309 14:04:30.325941 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:30 crc kubenswrapper[4722]: I0309 14:04:30.325961 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:30 crc kubenswrapper[4722]: I0309 14:04:30.325974 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:30Z","lastTransitionTime":"2026-03-09T14:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:30 crc kubenswrapper[4722]: I0309 14:04:30.339014 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l5pwg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8424174-764a-470b-ab49-e7369a42de7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d30648bba1b59fd7db5e75a61d6a004246668d148515da5738b77e5122358f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l5pwg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:30Z is after 2025-08-24T17:21:41Z" Mar 
09 14:04:30 crc kubenswrapper[4722]: I0309 14:04:30.357398 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-72xb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43e3c934-6aa7-4c96-a3a6-378f7931fc2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcf5e49245fd6051b12465a36b27f657b4923a62f88d61fe0fac971c4a2c9197\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lfx7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-72xb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:30Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:30 crc kubenswrapper[4722]: I0309 14:04:30.374406 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dac2aaf5-653b-4b2a-8efe-ed26bac8d648\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf5b268d6a355ae3a2c40981769412124b85cdfd99364ed58e5163a744f5f83c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q6db\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f451493fa6d86f12c1291c0fe262bcbe4b62cef45c8dda696bb48fa3ded2eeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q6db\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hjrrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:30Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:30 crc kubenswrapper[4722]: I0309 14:04:30.394586 4722 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2267a4d7-87b4-43f7-80c7-7a7ffb0327fc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04169d9b8359e26cc9807e27fb9a5d54ca0946aaade294c5bcc272a7d21c4db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09fa62e7293a50fb3f6bdaf37061cbe5e95a830c8f7337660ce73138dbcaa641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31ba786781fcdb781f5fa45276e29e94756863d2398109d91e81a52f6f88bee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-di
r\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dff721159b382bcfdde6f406a328ef2ee9264b35a8d8baeb7a3c6d31a60ff3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63769aca40f721bec0716427145a3ceea0595836c03d31c98f6d45ff14de68be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6625c60c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6625c60c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}}},{\\\"containerID\\\"
:\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:30Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:30 crc kubenswrapper[4722]: I0309 14:04:30.412548 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0965e24b-526e-4842-ac1f-eca7a765355d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56c76a5e8718243a17d8d9fd9a8dcf6083c2ccb65b4816926dca08a89e465728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3836660bf646052991965270cd69b8b8237b5c4bd698a73789b050d23e6877\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dbef38ff5e714c92fc8a86f1e3c64adf606cb198145f884a000f8dee200fcf2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T14:03:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 14:03:46.681607 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 14:03:46.681849 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 14:03:46.682597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2315154617/tls.crt::/tmp/serving-cert-2315154617/tls.key\\\\\\\"\\\\nI0309 14:03:46.866095 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 14:03:46.869876 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 14:03:46.869896 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 14:03:46.870280 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 14:03:46.870293 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 14:03:46.877096 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 14:03:46.877117 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 14:03:46.877121 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877135 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 14:03:46.877139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 14:03:46.877143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 14:03:46.877146 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 14:03:46.878419 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:03:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fbabc5ec2c100150ce886c06c3bd78cc5896c8a06ac9d262eeb552a8a8bb5d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:30Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:30 crc kubenswrapper[4722]: I0309 14:04:30.428195 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:30 crc kubenswrapper[4722]: I0309 14:04:30.428275 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:30 crc kubenswrapper[4722]: I0309 14:04:30.428287 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:30 crc kubenswrapper[4722]: I0309 14:04:30.428309 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:30 crc kubenswrapper[4722]: I0309 14:04:30.428327 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:30Z","lastTransitionTime":"2026-03-09T14:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:30 crc kubenswrapper[4722]: I0309 14:04:30.431322 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:30Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:30 crc kubenswrapper[4722]: I0309 14:04:30.445337 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:30Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:30 crc kubenswrapper[4722]: I0309 14:04:30.531296 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:30 crc kubenswrapper[4722]: I0309 14:04:30.531370 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:30 crc kubenswrapper[4722]: I0309 14:04:30.531386 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:30 crc kubenswrapper[4722]: I0309 14:04:30.531433 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:30 crc kubenswrapper[4722]: I0309 14:04:30.531447 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:30Z","lastTransitionTime":"2026-03-09T14:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 09 14:04:30 crc kubenswrapper[4722]: I0309 14:04:30.634192 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:04:30 crc kubenswrapper[4722]: I0309 14:04:30.634252 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:04:30 crc kubenswrapper[4722]: I0309 14:04:30.634263 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:04:30 crc kubenswrapper[4722]: I0309 14:04:30.634278 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 14:04:30 crc kubenswrapper[4722]: I0309 14:04:30.634288 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:30Z","lastTransitionTime":"2026-03-09T14:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 14:04:30 crc kubenswrapper[4722]: I0309 14:04:30.737543 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:04:30 crc kubenswrapper[4722]: I0309 14:04:30.737591 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:04:30 crc kubenswrapper[4722]: I0309 14:04:30.737601 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:04:30 crc kubenswrapper[4722]: I0309 14:04:30.737618 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 14:04:30 crc kubenswrapper[4722]: I0309 14:04:30.737631 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:30Z","lastTransitionTime":"2026-03-09T14:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 14:04:30 crc kubenswrapper[4722]: I0309 14:04:30.842577 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:04:30 crc kubenswrapper[4722]: I0309 14:04:30.842633 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:04:30 crc kubenswrapper[4722]: I0309 14:04:30.842653 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:04:30 crc kubenswrapper[4722]: I0309 14:04:30.842673 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 14:04:30 crc kubenswrapper[4722]: I0309 14:04:30.842692 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:30Z","lastTransitionTime":"2026-03-09T14:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 14:04:30 crc kubenswrapper[4722]: I0309 14:04:30.946986 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:04:30 crc kubenswrapper[4722]: I0309 14:04:30.947031 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:04:30 crc kubenswrapper[4722]: I0309 14:04:30.947045 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:04:30 crc kubenswrapper[4722]: I0309 14:04:30.947066 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 14:04:30 crc kubenswrapper[4722]: I0309 14:04:30.947078 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:30Z","lastTransitionTime":"2026-03-09T14:04:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 14:04:31 crc kubenswrapper[4722]: I0309 14:04:31.048934 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:04:31 crc kubenswrapper[4722]: I0309 14:04:31.048985 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:04:31 crc kubenswrapper[4722]: I0309 14:04:31.048999 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:04:31 crc kubenswrapper[4722]: I0309 14:04:31.049018 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 14:04:31 crc kubenswrapper[4722]: I0309 14:04:31.049030 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:31Z","lastTransitionTime":"2026-03-09T14:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 14:04:31 crc kubenswrapper[4722]: I0309 14:04:31.151731 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:04:31 crc kubenswrapper[4722]: I0309 14:04:31.151771 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:04:31 crc kubenswrapper[4722]: I0309 14:04:31.151782 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:04:31 crc kubenswrapper[4722]: I0309 14:04:31.151795 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 14:04:31 crc kubenswrapper[4722]: I0309 14:04:31.151807 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:31Z","lastTransitionTime":"2026-03-09T14:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 14:04:31 crc kubenswrapper[4722]: I0309 14:04:31.254528 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:04:31 crc kubenswrapper[4722]: I0309 14:04:31.254587 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:04:31 crc kubenswrapper[4722]: I0309 14:04:31.254603 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:04:31 crc kubenswrapper[4722]: I0309 14:04:31.254626 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 14:04:31 crc kubenswrapper[4722]: I0309 14:04:31.254642 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:31Z","lastTransitionTime":"2026-03-09T14:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 14:04:31 crc kubenswrapper[4722]: I0309 14:04:31.357790 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:04:31 crc kubenswrapper[4722]: I0309 14:04:31.357834 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:04:31 crc kubenswrapper[4722]: I0309 14:04:31.357843 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:04:31 crc kubenswrapper[4722]: I0309 14:04:31.357859 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 14:04:31 crc kubenswrapper[4722]: I0309 14:04:31.357870 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:31Z","lastTransitionTime":"2026-03-09T14:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 14:04:31 crc kubenswrapper[4722]: I0309 14:04:31.460146 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:04:31 crc kubenswrapper[4722]: I0309 14:04:31.460224 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:04:31 crc kubenswrapper[4722]: I0309 14:04:31.460243 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:04:31 crc kubenswrapper[4722]: I0309 14:04:31.460267 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 14:04:31 crc kubenswrapper[4722]: I0309 14:04:31.460283 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:31Z","lastTransitionTime":"2026-03-09T14:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 14:04:31 crc kubenswrapper[4722]: I0309 14:04:31.562833 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:04:31 crc kubenswrapper[4722]: I0309 14:04:31.562894 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:04:31 crc kubenswrapper[4722]: I0309 14:04:31.562913 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:04:31 crc kubenswrapper[4722]: I0309 14:04:31.562939 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 14:04:31 crc kubenswrapper[4722]: I0309 14:04:31.562960 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:31Z","lastTransitionTime":"2026-03-09T14:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 14:04:31 crc kubenswrapper[4722]: I0309 14:04:31.666673 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:04:31 crc kubenswrapper[4722]: I0309 14:04:31.666753 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:04:31 crc kubenswrapper[4722]: I0309 14:04:31.666777 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:04:31 crc kubenswrapper[4722]: I0309 14:04:31.666802 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 14:04:31 crc kubenswrapper[4722]: I0309 14:04:31.666819 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:31Z","lastTransitionTime":"2026-03-09T14:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:31 crc kubenswrapper[4722]: I0309 14:04:31.714703 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5v7ng_2e305619-b3a2-44c9-9e54-e1afa4f43dbf/ovnkube-controller/0.log" Mar 09 14:04:31 crc kubenswrapper[4722]: I0309 14:04:31.718198 4722 generic.go:334] "Generic (PLEG): container finished" podID="2e305619-b3a2-44c9-9e54-e1afa4f43dbf" containerID="95b64a8b9951808b4bb84077c4173f26cc7ddba47598c21c2da2dd36baed95e2" exitCode=1 Mar 09 14:04:31 crc kubenswrapper[4722]: I0309 14:04:31.718258 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" event={"ID":"2e305619-b3a2-44c9-9e54-e1afa4f43dbf","Type":"ContainerDied","Data":"95b64a8b9951808b4bb84077c4173f26cc7ddba47598c21c2da2dd36baed95e2"} Mar 09 14:04:31 crc kubenswrapper[4722]: I0309 14:04:31.719454 4722 scope.go:117] "RemoveContainer" containerID="95b64a8b9951808b4bb84077c4173f26cc7ddba47598c21c2da2dd36baed95e2" Mar 09 14:04:31 crc kubenswrapper[4722]: I0309 14:04:31.743412 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h4zw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b9e29bb-6e51-47ab-a543-b70117ab854d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17cd3d1e3981e2f2a3b66780f753425c78452e6319fb8eae9935d3bac4456820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\
":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdw92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h4zw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:31Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:31 crc kubenswrapper[4722]: I0309 14:04:31.770163 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:31 crc kubenswrapper[4722]: I0309 14:04:31.770230 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:31 crc kubenswrapper[4722]: I0309 14:04:31.770249 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:31 crc kubenswrapper[4722]: I0309 14:04:31.770271 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:31 crc kubenswrapper[4722]: I0309 14:04:31.770286 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:31Z","lastTransitionTime":"2026-03-09T14:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:31 crc kubenswrapper[4722]: I0309 14:04:31.771454 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1fc80c43be93eeb2f9805bf314c8430f52062d0ce5d0fa1830e7e0c27e674e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f969085ae62d82dbe1f0f8f5d459c4a1bc7d1ae0a459da484f11dd6ae5fdd418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://f5765145caac69454a9a8e8de34d05cb1da06924dd365f3e74f52461be5027ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef1c5a3944e8693d8c68a4108c23851352407d0fac7fb7a03763d9084f117dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0106fa808804a96ca8247096399e3d04eb082b2691b1f39df3db9f6c630114c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e737417baac512fab2089f5da12e7c47ef06eae8df7803ad6b248505ff3d18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://95b64a8b9951808b4bb84077c4173f26cc7ddba47598c21c2da2dd36baed95e2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95b64a8b9951808b4bb84077c4173f26cc7ddba47598c21c2da2dd36baed95e2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T14:04:30Z\\\",\\\"message\\\":\\\"lector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 14:04:30.922401 6611 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 14:04:30.922473 6611 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 14:04:30.922717 6611 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 14:04:30.923081 6611 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 14:04:30.923389 6611 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 14:04:30.923577 6611 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0309 14:04:30.923618 6611 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0309 14:04:30.923634 6611 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0309 14:04:30.923657 6611 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0309 14:04:30.923676 6611 factory.go:656] Stopping watch factory\\\\nI0309 14:04:30.923678 6611 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0309 14:04:30.923712 6611 handler.go:208] Removed *v1.EgressFirewall 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f5af077ae5b947194e5dba0542d053890a589582e415e1b41a932bdab6632e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20
99482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5v7ng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:31Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:31 crc kubenswrapper[4722]: I0309 14:04:31.789634 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d69359acfc7a4ee7698088a7e0062b3604a5ceb92910cd3b3d69b1a6291dfdcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://687646beb4c6dd3fb9b659a36449a573f4c389b3e98b07127b18516fdc4f2c93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:31Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:31 crc kubenswrapper[4722]: I0309 14:04:31.807001 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jv499" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd1357a3-36ef-4bc8-85bb-91f3c0f42994\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a14e62f50fa8e5403349dc8696e9c7c45a5da1026227612186e7805b1be722f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e56992f54f9fc4387130b74e3aa0f4652a5d6d960da1a5305b8668b8086fae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e56992f54f9fc4387130b74e3aa0f4652a5d6d960da1a5305b8668b8086fae2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c00d804aec8d776f5f2ab9c7a81161c9942a94564d57afe47c633fb3dc2aa7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c00d804aec8d776f5f2ab9c7a81161c9942a94564d57afe47c633fb3dc2aa7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0491f9997970cb050ece19d88d2a7064d4799db3772132f58055e1dbcf2f3a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0491f9997970cb050ece19d88d2a7064d4799db3772132f58055e1dbcf2f3a82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":tru
e,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93c7d9a6b0dc8e76abab762ff6aae1975a31fbc80ee3a3cd32ec378298061748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93c7d9a6b0dc8e76abab762ff6aae1975a31fbc80ee3a3cd32ec378298061748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cc5ae61b0fa949ba4bf5aa60d09385399a557f19ef7c99dc43b21fc9db830e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9cc5ae61b0fa949ba4bf5aa60d09385399a557f19ef7c99dc43b21fc9db830e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a88cf1e6b90a3f13f44bbb08482773f007066c795b00cb0177f6cd2ed87cbb52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a88cf1e6b90a3f13f44bbb08482773f007066c795b00cb0177f6cd2ed87cbb52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:26Z\\\",\\\"reason\\
\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jv499\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:31Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:31 crc kubenswrapper[4722]: I0309 14:04:31.820370 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l5pwg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8424174-764a-470b-ab49-e7369a42de7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d30648bba1b59fd7db5e75a61d6a004246668d148515da5738b77e5122358f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l5pwg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:31Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:31 crc kubenswrapper[4722]: I0309 14:04:31.836323 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee2ed55248b177bc0873ef3c41517401fdb30fc5ce5a545a1ca8c925e4d4e9e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:31Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:31 crc kubenswrapper[4722]: I0309 14:04:31.857433 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2267a4d7-87b4-43f7-80c7-7a7ffb0327fc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04169d9b8359e26cc9807e27fb9a5d54ca0946aaade294c5bcc272a7d21c4db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09fa62e7293a50fb3f6bdaf37061cbe5e95a830c8f7337660ce73138dbcaa641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31ba786781fcdb781f5fa45276e29e94756863d2398109d91e81a52f6f88bee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dff721159b382bcfdde6f406a328ef2ee9264b
35a8d8baeb7a3c6d31a60ff3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63769aca40f721bec0716427145a3ceea0595836c03d31c98f6d45ff14de68be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6625c60c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6625c60c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:31Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:31 crc kubenswrapper[4722]: I0309 14:04:31.871447 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0965e24b-526e-4842-ac1f-eca7a765355d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56c76a5e8718243a17d8d9fd9a8dcf6083c2ccb65b4816926dca08a89e465728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3836660bf646052991965270cd69b8b8237b5c4bd698a73789b050d23e6877\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dbef38ff5e714c92fc8a86f1e3c64adf606cb198145f884a000f8dee200fcf2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T14:03:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 14:03:46.681607 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 14:03:46.681849 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 14:03:46.682597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2315154617/tls.crt::/tmp/serving-cert-2315154617/tls.key\\\\\\\"\\\\nI0309 14:03:46.866095 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 14:03:46.869876 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 14:03:46.869896 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 14:03:46.870280 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 14:03:46.870293 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 14:03:46.877096 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 14:03:46.877117 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 14:03:46.877121 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877135 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 14:03:46.877139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 14:03:46.877143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 14:03:46.877146 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 14:03:46.878419 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:03:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fbabc5ec2c100150ce886c06c3bd78cc5896c8a06ac9d262eeb552a8a8bb5d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:31Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:31 crc kubenswrapper[4722]: I0309 14:04:31.873022 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:31 crc kubenswrapper[4722]: I0309 14:04:31.873049 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:31 crc kubenswrapper[4722]: I0309 14:04:31.873059 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:31 crc kubenswrapper[4722]: I0309 14:04:31.873075 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:31 crc kubenswrapper[4722]: I0309 14:04:31.873087 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:31Z","lastTransitionTime":"2026-03-09T14:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:31 crc kubenswrapper[4722]: I0309 14:04:31.885590 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:31Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:31 crc kubenswrapper[4722]: I0309 14:04:31.901702 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:31Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:31 crc kubenswrapper[4722]: I0309 14:04:31.917349 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-72xb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43e3c934-6aa7-4c96-a3a6-378f7931fc2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcf5e49245fd6051b12465a36b27f657b4923a62f88d61fe0fac971c4a2c9197\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lfx7x\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-72xb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:31Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:31 crc kubenswrapper[4722]: I0309 14:04:31.934342 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dac2aaf5-653b-4b2a-8efe-ed26bac8d648\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf5b268d6a355ae3a2c40981769412124b85cdfd99364ed58e5163a744f5f83c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q6db\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f451493fa6d86f12c1291c0fe262bcbe4b62cef45c8dda696bb48fa3ded2eeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q6db\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hjrrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:31Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:31 crc kubenswrapper[4722]: I0309 14:04:31.951725 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:31Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:31 crc kubenswrapper[4722]: I0309 14:04:31.967597 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69e74824fe242fd2d54bed9734f33823697d7651043d9c97bac6ea5490625a0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:31Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:31 crc kubenswrapper[4722]: I0309 14:04:31.974990 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:31 crc kubenswrapper[4722]: I0309 14:04:31.975011 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 09 14:04:31 crc kubenswrapper[4722]: I0309 14:04:31.975020 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:31 crc kubenswrapper[4722]: I0309 14:04:31.975032 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:31 crc kubenswrapper[4722]: I0309 14:04:31.975040 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:31Z","lastTransitionTime":"2026-03-09T14:04:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:32 crc kubenswrapper[4722]: I0309 14:04:32.078534 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:32 crc kubenswrapper[4722]: I0309 14:04:32.078583 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:32 crc kubenswrapper[4722]: I0309 14:04:32.078600 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:32 crc kubenswrapper[4722]: I0309 14:04:32.078622 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:32 crc kubenswrapper[4722]: I0309 14:04:32.078639 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:32Z","lastTransitionTime":"2026-03-09T14:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:32 crc kubenswrapper[4722]: I0309 14:04:32.150952 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 14:04:32 crc kubenswrapper[4722]: E0309 14:04:32.151068 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 14:04:32 crc kubenswrapper[4722]: I0309 14:04:32.151473 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 14:04:32 crc kubenswrapper[4722]: E0309 14:04:32.151553 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 14:04:32 crc kubenswrapper[4722]: I0309 14:04:32.151649 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 14:04:32 crc kubenswrapper[4722]: E0309 14:04:32.151717 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 14:04:32 crc kubenswrapper[4722]: I0309 14:04:32.181431 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:32 crc kubenswrapper[4722]: I0309 14:04:32.181462 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:32 crc kubenswrapper[4722]: I0309 14:04:32.181473 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:32 crc kubenswrapper[4722]: I0309 14:04:32.181488 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:32 crc kubenswrapper[4722]: I0309 14:04:32.181500 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:32Z","lastTransitionTime":"2026-03-09T14:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:32 crc kubenswrapper[4722]: I0309 14:04:32.284238 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:32 crc kubenswrapper[4722]: I0309 14:04:32.284310 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:32 crc kubenswrapper[4722]: I0309 14:04:32.284331 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:32 crc kubenswrapper[4722]: I0309 14:04:32.284355 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:32 crc kubenswrapper[4722]: I0309 14:04:32.284374 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:32Z","lastTransitionTime":"2026-03-09T14:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:32 crc kubenswrapper[4722]: I0309 14:04:32.388651 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:32 crc kubenswrapper[4722]: I0309 14:04:32.388720 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:32 crc kubenswrapper[4722]: I0309 14:04:32.388732 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:32 crc kubenswrapper[4722]: I0309 14:04:32.388755 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:32 crc kubenswrapper[4722]: I0309 14:04:32.388769 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:32Z","lastTransitionTime":"2026-03-09T14:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:32 crc kubenswrapper[4722]: I0309 14:04:32.491908 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:32 crc kubenswrapper[4722]: I0309 14:04:32.491972 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:32 crc kubenswrapper[4722]: I0309 14:04:32.491988 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:32 crc kubenswrapper[4722]: I0309 14:04:32.492009 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:32 crc kubenswrapper[4722]: I0309 14:04:32.492028 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:32Z","lastTransitionTime":"2026-03-09T14:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:32 crc kubenswrapper[4722]: I0309 14:04:32.595745 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:32 crc kubenswrapper[4722]: I0309 14:04:32.595799 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:32 crc kubenswrapper[4722]: I0309 14:04:32.595813 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:32 crc kubenswrapper[4722]: I0309 14:04:32.595835 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:32 crc kubenswrapper[4722]: I0309 14:04:32.595849 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:32Z","lastTransitionTime":"2026-03-09T14:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:32 crc kubenswrapper[4722]: I0309 14:04:32.699427 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:32 crc kubenswrapper[4722]: I0309 14:04:32.699478 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:32 crc kubenswrapper[4722]: I0309 14:04:32.699488 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:32 crc kubenswrapper[4722]: I0309 14:04:32.699508 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:32 crc kubenswrapper[4722]: I0309 14:04:32.699520 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:32Z","lastTransitionTime":"2026-03-09T14:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:32 crc kubenswrapper[4722]: I0309 14:04:32.723353 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5v7ng_2e305619-b3a2-44c9-9e54-e1afa4f43dbf/ovnkube-controller/1.log" Mar 09 14:04:32 crc kubenswrapper[4722]: I0309 14:04:32.724181 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5v7ng_2e305619-b3a2-44c9-9e54-e1afa4f43dbf/ovnkube-controller/0.log" Mar 09 14:04:32 crc kubenswrapper[4722]: I0309 14:04:32.728320 4722 generic.go:334] "Generic (PLEG): container finished" podID="2e305619-b3a2-44c9-9e54-e1afa4f43dbf" containerID="879b959fe7540c9b645b6ada02e192abb990fb3311a7346f0b1df41c47d62379" exitCode=1 Mar 09 14:04:32 crc kubenswrapper[4722]: I0309 14:04:32.728390 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" event={"ID":"2e305619-b3a2-44c9-9e54-e1afa4f43dbf","Type":"ContainerDied","Data":"879b959fe7540c9b645b6ada02e192abb990fb3311a7346f0b1df41c47d62379"} Mar 09 14:04:32 crc kubenswrapper[4722]: I0309 14:04:32.728488 4722 scope.go:117] "RemoveContainer" containerID="95b64a8b9951808b4bb84077c4173f26cc7ddba47598c21c2da2dd36baed95e2" Mar 09 14:04:32 crc kubenswrapper[4722]: I0309 14:04:32.729623 4722 scope.go:117] "RemoveContainer" containerID="879b959fe7540c9b645b6ada02e192abb990fb3311a7346f0b1df41c47d62379" Mar 09 14:04:32 crc kubenswrapper[4722]: E0309 14:04:32.730119 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-5v7ng_openshift-ovn-kubernetes(2e305619-b3a2-44c9-9e54-e1afa4f43dbf)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" podUID="2e305619-b3a2-44c9-9e54-e1afa4f43dbf" Mar 09 14:04:32 crc kubenswrapper[4722]: I0309 14:04:32.754062 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h4zw5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b9e29bb-6e51-47ab-a543-b70117ab854d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17cd3d1e3981e2f2a3b66780f753425c78452e6319fb8eae9935d3bac4456820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdw92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h4zw5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:32Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:32 crc kubenswrapper[4722]: I0309 14:04:32.780476 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1fc80c43be93eeb2f9805bf314c8430f52062d0ce5d0fa1830e7e0c27e674e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f969085ae62d82dbe1f0f8f5d459c4a1bc7d1ae0a459da484f11dd6ae5fdd418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5765145caac69454a9a8e8de34d05cb1da06924dd365f3e74f52461be5027ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef1c5a3944e8693d8c68a4108c23851352407d0fac7fb7a03763d9084f117dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0106fa808804a96ca8247096399e3d04eb082b2691b1f39df3db9f6c630114c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e73
7417baac512fab2089f5da12e7c47ef06eae8df7803ad6b248505ff3d18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://879b959fe7540c9b645b6ada02e192abb990fb3311a7346f0b1df41c47d62379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95b64a8b9951808b4bb84077c4173f26cc7ddba47598c21c2da2dd36baed95e2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T14:04:30Z\\\",\\\"message\\\":\\\"lector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 14:04:30.922401 6611 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 14:04:30.922473 6611 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 14:04:30.922717 6611 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 14:04:30.923081 6611 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 14:04:30.923389 6611 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 14:04:30.923577 6611 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0309 14:04:30.923618 6611 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0309 14:04:30.923634 6611 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0309 14:04:30.923657 6611 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0309 14:04:30.923676 6611 factory.go:656] Stopping watch factory\\\\nI0309 14:04:30.923678 6611 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0309 14:04:30.923712 6611 handler.go:208] Removed *v1.EgressFirewall 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://879b959fe7540c9b645b6ada02e192abb990fb3311a7346f0b1df41c47d62379\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T14:04:32Z\\\",\\\"message\\\":\\\"igs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0309 14:04:32.599371 6762 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0309 14:04:32.599438 6762 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 14:04:32.599509 6762 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0309 14:04:32.599664 6762 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0309 14:04:32.599744 6762 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0309 14:04:32.599757 6762 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0309 14:04:32.599783 6762 factory.go:656] Stopping watch factory\\\\nI0309 14:04:32.599796 6762 handler.go:208] Removed *v1.Node event handler 7\\\\nI0309 14:04:32.599804 6762 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0309 14:04:32.599813 6762 handler.go:208] Removed *v1.Node event handler 2\\\\nI0309 14:04:32.599922 6762 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 14:04:32.600179 6762 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountP
ath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f5af077ae5b947194e5dba0542d053890a589582e415e1b41a932bdab6632e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5v7ng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:32Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:32 crc kubenswrapper[4722]: I0309 14:04:32.796506 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d69359acfc7a4ee7698088a7e0062b3604a5ceb92910cd3b3d69b1a6291dfdcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://687646beb4c6dd3fb9b659a36449a573f4c389b3e98b07127b18516fdc4f2c93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:32Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:32 crc kubenswrapper[4722]: I0309 14:04:32.803673 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:32 crc kubenswrapper[4722]: I0309 14:04:32.803740 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:32 crc kubenswrapper[4722]: I0309 14:04:32.803766 4722 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 09 14:04:32 crc kubenswrapper[4722]: I0309 14:04:32.803799 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:32 crc kubenswrapper[4722]: I0309 14:04:32.803823 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:32Z","lastTransitionTime":"2026-03-09T14:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:32 crc kubenswrapper[4722]: I0309 14:04:32.815846 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jv499" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd1357a3-36ef-4bc8-85bb-91f3c0f42994\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a14e62f50fa8e5403349dc8696e9c7c45a5da1026227612186e7805b1be722f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e56992f54f9fc4387130b74e3aa0f4652a5d6d960da1a5305b8668b8086fae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e56992f54f9fc4387130b74e3aa0f4652a5d6d960da1a5305b8668b8086fae2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"reason\\\
":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c00d804aec8d776f5f2ab9c7a81161c9942a94564d57afe47c633fb3dc2aa7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c00d804aec8d776f5f2ab9c7a81161c9942a94564d57afe47c633fb3dc2aa7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0491f9997970cb050ece19d88d2a7064d4799db3772132f58055e1dbcf2f3a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0491f9997970cb050ece19d88d2a7064d4799db3772132f58055e1dbcf2f3a82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93c7d9a6b0dc8e76abab762ff6aae1975a31fbc80ee3a3cd32ec378298061748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d
0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93c7d9a6b0dc8e76abab762ff6aae1975a31fbc80ee3a3cd32ec378298061748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cc5ae61b0fa949ba4bf5aa60d09385399a557f19ef7c99dc43b21fc9db830e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9cc5ae61b0fa949ba4bf5aa60d09385399a557f19ef7c99dc43b21fc9db830e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a88cf1e6b90a3f13f44bbb08482773f007066c795b00cb0177f6cd2ed87cbb52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a88cf1e6b90a3f13f44bbb08482773f007066c795b00cb0177f6cd2ed87cbb52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Runnin
g\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jv499\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:32Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:32 crc kubenswrapper[4722]: I0309 14:04:32.832928 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l5pwg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8424174-764a-470b-ab49-e7369a42de7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d30648bba1b59fd7db5e75a61d6a004246668d148515da5738b77e5122358f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l5pwg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:32Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:32 crc kubenswrapper[4722]: I0309 14:04:32.850715 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee2ed55248b177bc0873ef3c41517401fdb30fc5ce5a545a1ca8c925e4d4e9e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:32Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:32 crc kubenswrapper[4722]: I0309 14:04:32.875324 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2267a4d7-87b4-43f7-80c7-7a7ffb0327fc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04169d9b8359e26cc9807e27fb9a5d54ca0946aaade294c5bcc272a7d21c4db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09fa62e7293a50fb3f6bdaf37061cbe5e95a830c8f7337660ce73138dbcaa641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31ba786781fcdb781f5fa45276e29e94756863d2398109d91e81a52f6f88bee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dff721159b382bcfdde6f406a328ef2ee9264b
35a8d8baeb7a3c6d31a60ff3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63769aca40f721bec0716427145a3ceea0595836c03d31c98f6d45ff14de68be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6625c60c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6625c60c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:32Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:32 crc kubenswrapper[4722]: I0309 14:04:32.893760 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0965e24b-526e-4842-ac1f-eca7a765355d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56c76a5e8718243a17d8d9fd9a8dcf6083c2ccb65b4816926dca08a89e465728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3836660bf646052991965270cd69b8b8237b5c4bd698a73789b050d23e6877\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dbef38ff5e714c92fc8a86f1e3c64adf606cb198145f884a000f8dee200fcf2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T14:03:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 14:03:46.681607 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 14:03:46.681849 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 14:03:46.682597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2315154617/tls.crt::/tmp/serving-cert-2315154617/tls.key\\\\\\\"\\\\nI0309 14:03:46.866095 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 14:03:46.869876 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 14:03:46.869896 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 14:03:46.870280 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 14:03:46.870293 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 14:03:46.877096 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 14:03:46.877117 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 14:03:46.877121 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877135 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 14:03:46.877139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 14:03:46.877143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 14:03:46.877146 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 14:03:46.878419 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:03:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fbabc5ec2c100150ce886c06c3bd78cc5896c8a06ac9d262eeb552a8a8bb5d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:32Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:32 crc kubenswrapper[4722]: I0309 14:04:32.907070 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:32 crc kubenswrapper[4722]: I0309 14:04:32.907303 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:32 crc kubenswrapper[4722]: I0309 14:04:32.907416 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:32 crc kubenswrapper[4722]: I0309 14:04:32.907508 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:32 crc kubenswrapper[4722]: I0309 14:04:32.907602 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:32Z","lastTransitionTime":"2026-03-09T14:04:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:32 crc kubenswrapper[4722]: I0309 14:04:32.910192 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:32Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:32 crc kubenswrapper[4722]: I0309 14:04:32.926647 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:32Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:32 crc kubenswrapper[4722]: I0309 14:04:32.942206 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-72xb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43e3c934-6aa7-4c96-a3a6-378f7931fc2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcf5e49245fd6051b12465a36b27f657b4923a62f88d61fe0fac971c4a2c9197\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lfx7x\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-72xb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:32Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:32 crc kubenswrapper[4722]: I0309 14:04:32.958759 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dac2aaf5-653b-4b2a-8efe-ed26bac8d648\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf5b268d6a355ae3a2c40981769412124b85cdfd99364ed58e5163a744f5f83c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q6db\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f451493fa6d86f12c1291c0fe262bcbe4b62cef45c8dda696bb48fa3ded2eeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q6db\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hjrrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:32Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:32 crc kubenswrapper[4722]: I0309 14:04:32.975258 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:32Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:32 crc kubenswrapper[4722]: I0309 14:04:32.988889 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69e74824fe242fd2d54bed9734f33823697d7651043d9c97bac6ea5490625a0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:32Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.011064 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.011112 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.011123 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.011139 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.011156 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:33Z","lastTransitionTime":"2026-03-09T14:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.114068 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.114144 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.114164 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.114188 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.114225 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:33Z","lastTransitionTime":"2026-03-09T14:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.217111 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.217179 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.217196 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.217241 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.217261 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:33Z","lastTransitionTime":"2026-03-09T14:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.322399 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.322455 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.322468 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.322549 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.322566 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:33Z","lastTransitionTime":"2026-03-09T14:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.424933 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.424973 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.424983 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.424997 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.425008 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:33Z","lastTransitionTime":"2026-03-09T14:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.528177 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.528255 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.528268 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.528287 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.528300 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:33Z","lastTransitionTime":"2026-03-09T14:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.631738 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.631782 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.631797 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.631818 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.631835 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:33Z","lastTransitionTime":"2026-03-09T14:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.637620 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xsft8"] Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.638126 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xsft8" Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.640560 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.641742 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.657102 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0965e24b-526e-4842-ac1f-eca7a765355d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56c76a5e8718243a17d8d9fd9a8dcf6083c2ccb65b4816926dca08a89e465728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3836660bf646052991965270cd69b8b8237b5c4bd698a73789b050d23e6877\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dbef38ff5e714c92fc8a86f1e3c64adf606cb198145f884a000f8dee200fcf2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T14:03:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 14:03:46.681607 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 14:03:46.681849 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 14:03:46.682597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2315154617/tls.crt::/tmp/serving-cert-2315154617/tls.key\\\\\\\"\\\\nI0309 14:03:46.866095 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 14:03:46.869876 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 14:03:46.869896 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 14:03:46.870280 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 14:03:46.870293 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 14:03:46.877096 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 14:03:46.877117 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 14:03:46.877121 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877135 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 14:03:46.877139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 14:03:46.877143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 14:03:46.877146 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 14:03:46.878419 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:03:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fbabc5ec2c100150ce886c06c3bd78cc5896c8a06ac9d262eeb552a8a8bb5d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:33Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.671196 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:33Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.690775 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:33Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.706883 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-72xb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43e3c934-6aa7-4c96-a3a6-378f7931fc2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcf5e49245fd6051b12465a36b27f657b4923a62f88d61fe0fac971c4a2c9197\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lfx7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-72xb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:33Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.721405 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dac2aaf5-653b-4b2a-8efe-ed26bac8d648\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf5b268d6a355ae3a2c40981769412124b85cdfd99364ed58e5163a744f5f83c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q6db\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f451493fa6d86f12c1291c0fe262bcbe4b62cef45c8dda696bb48fa3ded2eeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q6db\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hjrrb\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:33Z is after 2025-08-24T17:21:41Z"
Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.733535 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.733573 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.733585 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.733624 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.733638 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:33Z","lastTransitionTime":"2026-03-09T14:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.733885 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5v7ng_2e305619-b3a2-44c9-9e54-e1afa4f43dbf/ovnkube-controller/1.log"
Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.737501 4722 scope.go:117] "RemoveContainer" containerID="879b959fe7540c9b645b6ada02e192abb990fb3311a7346f0b1df41c47d62379"
Mar 09 14:04:33 crc kubenswrapper[4722]: E0309 14:04:33.737670 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-5v7ng_openshift-ovn-kubernetes(2e305619-b3a2-44c9-9e54-e1afa4f43dbf)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" podUID="2e305619-b3a2-44c9-9e54-e1afa4f43dbf"
Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.746795 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2267a4d7-87b4-43f7-80c7-7a7ffb0327fc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04169d9b8359e26cc9807e27fb9a5d54ca0946aaade294c5bcc272a7d21c4db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09fa62e7293a50fb3f6bdaf37061cbe5e95a830c8f7337660ce73138dbcaa641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31ba786781fcdb781f5fa45276e29e94756863d2398109d91e81a52f6f88bee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dff721159b382bcfdde6f406a328ef2ee9264b
35a8d8baeb7a3c6d31a60ff3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63769aca40f721bec0716427145a3ceea0595836c03d31c98f6d45ff14de68be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6625c60c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6625c60c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:33Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.761157 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:33Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.772297 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69e74824fe242fd2d54bed9734f33823697d7651043d9c97bac6ea5490625a0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:33Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.776319 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqpsw\" (UniqueName: \"kubernetes.io/projected/3e14085d-df3c-4218-aa43-0553786b8867-kube-api-access-sqpsw\") pod 
\"ovnkube-control-plane-749d76644c-xsft8\" (UID: \"3e14085d-df3c-4218-aa43-0553786b8867\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xsft8" Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.776368 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3e14085d-df3c-4218-aa43-0553786b8867-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-xsft8\" (UID: \"3e14085d-df3c-4218-aa43-0553786b8867\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xsft8" Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.776393 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3e14085d-df3c-4218-aa43-0553786b8867-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-xsft8\" (UID: \"3e14085d-df3c-4218-aa43-0553786b8867\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xsft8" Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.776525 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3e14085d-df3c-4218-aa43-0553786b8867-env-overrides\") pod \"ovnkube-control-plane-749d76644c-xsft8\" (UID: \"3e14085d-df3c-4218-aa43-0553786b8867\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xsft8" Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.791171 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1fc80c43be93eeb2f9805bf314c8430f52062d0ce5d0fa1830e7e0c27e674e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f969085ae62d82dbe1f0f8f5d459c4a1bc7d1ae0a459da484f11dd6ae5fdd418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5765145caac69454a9a8e8de34d05cb1da06924dd365f3e74f52461be5027ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef1c5a3944e8693d8c68a4108c23851352407d0fac7fb7a03763d9084f117dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0106fa808804a96ca8247096399e3d04eb082b2691b1f39df3db9f6c630114c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e737417baac512fab2089f5da12e7c47ef06eae8df7803ad6b248505ff3d18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://879b959fe7540c9b645b6ada02e192abb990fb33
11a7346f0b1df41c47d62379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://95b64a8b9951808b4bb84077c4173f26cc7ddba47598c21c2da2dd36baed95e2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T14:04:30Z\\\",\\\"message\\\":\\\"lector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 14:04:30.922401 6611 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 14:04:30.922473 6611 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 14:04:30.922717 6611 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 14:04:30.923081 6611 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 14:04:30.923389 6611 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 14:04:30.923577 6611 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0309 14:04:30.923618 6611 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0309 14:04:30.923634 6611 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0309 14:04:30.923657 6611 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0309 14:04:30.923676 6611 factory.go:656] Stopping watch factory\\\\nI0309 14:04:30.923678 6611 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0309 14:04:30.923712 6611 handler.go:208] Removed *v1.EgressFirewall ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:28Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://879b959fe7540c9b645b6ada02e192abb990fb3311a7346f0b1df41c47d62379\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T14:04:32Z\\\",\\\"message\\\":\\\"igs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0309 14:04:32.599371 6762 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0309 14:04:32.599438 6762 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 14:04:32.599509 6762 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0309 14:04:32.599664 6762 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0309 14:04:32.599744 6762 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0309 14:04:32.599757 6762 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0309 14:04:32.599783 6762 factory.go:656] Stopping watch factory\\\\nI0309 14:04:32.599796 6762 handler.go:208] Removed *v1.Node event handler 7\\\\nI0309 14:04:32.599804 6762 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0309 14:04:32.599813 6762 handler.go:208] Removed *v1.Node event handler 2\\\\nI0309 14:04:32.599922 6762 reflector.go:311] Stopping reflector *v1.Node (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0309 14:04:32.600179 6762 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f5af077ae5b947194e5dba0542d053890a589582e415e1b41a932bdab6632e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc
c35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5v7ng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:33Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.805019 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xsft8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e14085d-df3c-4218-aa43-0553786b8867\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqpsw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqpsw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xsft8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:33Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.818172 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d69359acfc7a4ee7698088a7e0062b3604a5ceb92910cd3b3d69b1a6291dfdcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://687646beb4c6dd3fb9b659a36449a573f4c389b3e98b07127b18516fdc4f2c93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:33Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.831361 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h4zw5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b9e29bb-6e51-47ab-a543-b70117ab854d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17cd3d1e3981e2f2a3b66780f753425c78452e6319fb8eae9935d3bac4456820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdw92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h4zw5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:33Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.837227 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.837261 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.837271 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.837289 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.837300 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:33Z","lastTransitionTime":"2026-03-09T14:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.843189 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l5pwg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8424174-764a-470b-ab49-e7369a42de7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d30648bba1b59fd7db5e75a61d6a004246668d148515da5738b77e5122358f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"h
ostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l5pwg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:33Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.856300 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee2ed55248b177bc0873ef3c41517401fdb30fc5ce5a545a1ca8c925e4d4e9e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:33Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.869596 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jv499" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd1357a3-36ef-4bc8-85bb-91f3c0f42994\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a14e62f50fa8e5403349dc8696e9c7c45a5da1026227612186e7805b1be722f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e56992f54f9fc4387130b74e3aa0f4652a5d6d960da1a5305b8668b8086fae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e56992f54f9fc4387130b74e3aa0f4652a5d6d960da1a5305b8668b8086fae2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c00d804aec8d776f5f2ab9c7a81161c9942a94564d57afe47c633fb3dc2aa7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c00d804aec8d776f5f2ab9c7a81161c9942a94564d57afe47c633fb3dc2aa7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0491f9997970cb050ece19d88d2a7064d4799db3772132f58055e1dbcf2f3a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0491f9997970cb050ece19d88d2a7064d4799db3772132f58055e1dbcf2f3a82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93c7d9a6b0dc8e76abab762ff6aae1975a31fbc80ee3a3cd32ec378298061748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93c7d9a6b0dc8e76abab762ff6aae1975a31fbc80ee3a3cd32ec378298061748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cc5ae61b0fa949ba4bf5aa60d09385399a557f19ef7c99dc43b21fc9db830e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9cc5ae61b0fa949ba4bf5aa60d09385399a557f19ef7c99dc43b21fc9db830e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a88cf1e6b90a3f13f44bbb08482773f007066c795b00cb0177f6cd2ed87cbb52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a88cf1e6b90a3f13f44bbb08482773f007066c795b00cb0177f6cd2ed87cbb52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jv499\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:33Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.877553 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqpsw\" (UniqueName: \"kubernetes.io/projected/3e14085d-df3c-4218-aa43-0553786b8867-kube-api-access-sqpsw\") pod \"ovnkube-control-plane-749d76644c-xsft8\" (UID: \"3e14085d-df3c-4218-aa43-0553786b8867\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xsft8" Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.877617 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3e14085d-df3c-4218-aa43-0553786b8867-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-xsft8\" (UID: \"3e14085d-df3c-4218-aa43-0553786b8867\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xsft8" Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.877657 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3e14085d-df3c-4218-aa43-0553786b8867-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-xsft8\" (UID: \"3e14085d-df3c-4218-aa43-0553786b8867\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xsft8" Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.877732 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3e14085d-df3c-4218-aa43-0553786b8867-env-overrides\") pod \"ovnkube-control-plane-749d76644c-xsft8\" (UID: \"3e14085d-df3c-4218-aa43-0553786b8867\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xsft8" Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.878362 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3e14085d-df3c-4218-aa43-0553786b8867-env-overrides\") pod \"ovnkube-control-plane-749d76644c-xsft8\" (UID: \"3e14085d-df3c-4218-aa43-0553786b8867\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xsft8" Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.879190 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3e14085d-df3c-4218-aa43-0553786b8867-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-xsft8\" (UID: \"3e14085d-df3c-4218-aa43-0553786b8867\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xsft8" Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.882125 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:33Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.888934 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3e14085d-df3c-4218-aa43-0553786b8867-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-xsft8\" (UID: \"3e14085d-df3c-4218-aa43-0553786b8867\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xsft8" Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.897913 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-72xb4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43e3c934-6aa7-4c96-a3a6-378f7931fc2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcf5e49245fd6051b12465a36b27f657b4923a62f88d61fe0fac971c4a2c9197\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lfx7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-72xb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:33Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.899770 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqpsw\" (UniqueName: \"kubernetes.io/projected/3e14085d-df3c-4218-aa43-0553786b8867-kube-api-access-sqpsw\") pod \"ovnkube-control-plane-749d76644c-xsft8\" (UID: \"3e14085d-df3c-4218-aa43-0553786b8867\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xsft8" Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.910489 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dac2aaf5-653b-4b2a-8efe-ed26bac8d648\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf5b268d6a355ae3a2c40981769412124b85cdfd99364ed58e5163a744f5f83c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q6db\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f451493fa6d86f12c1291c0fe262bcbe4b62cef45c8dda696bb48fa3ded2eeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q6db\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hjrrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:33Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.939432 4722 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2267a4d7-87b4-43f7-80c7-7a7ffb0327fc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04169d9b8359e26cc9807e27fb9a5d54ca0946aaade294c5bcc272a7d21c4db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09fa62e7293a50fb3f6bdaf37061cbe5e95a830c8f7337660ce73138dbcaa641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31ba786781fcdb781f5fa45276e29e94756863d2398109d91e81a52f6f88bee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-di
r\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dff721159b382bcfdde6f406a328ef2ee9264b35a8d8baeb7a3c6d31a60ff3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63769aca40f721bec0716427145a3ceea0595836c03d31c98f6d45ff14de68be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6625c60c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6625c60c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}}},{\\\"containerID\\\"
:\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:33Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.941283 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.941343 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.941357 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.941378 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.941387 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:33Z","lastTransitionTime":"2026-03-09T14:04:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.954508 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xsft8" Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.957617 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0965e24b-526e-4842-ac1f-eca7a765355d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56c76a5e8718243a17d8d9fd9a8dcf6083c2ccb65b4816926dca08a89e465728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3836660bf646052991965270cd69b8b8237b5c4bd698a73789b050d23e6877\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dbef38ff5e714c92fc8a86f1e3c64adf606cb198145f884a000f8dee200fcf2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube
-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T14:03:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 14:03:46.681607 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 14:03:46.681849 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 14:03:46.682597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2315154617/tls.crt::/tmp/serving-cert-2315154617/tls.key\\\\\\\"\\\\nI0309 14:03:46.866095 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 14:03:46.869876 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 14:03:46.869896 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 14:03:46.870280 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 14:03:46.870293 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 14:03:46.877096 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 14:03:46.877117 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 14:03:46.877121 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877135 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 14:03:46.877139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 14:03:46.877143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 14:03:46.877146 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 14:03:46.878419 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:03:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fbabc5ec2c100150ce886c06c3bd78cc5896c8a06ac9d262eeb552a8a8bb5d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:33Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:33 crc kubenswrapper[4722]: W0309 14:04:33.972020 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e14085d_df3c_4218_aa43_0553786b8867.slice/crio-dd17451c2e568a0423ecb45ccb22627a3bdcbf88e26058fc2221de970353614b WatchSource:0}: Error finding container dd17451c2e568a0423ecb45ccb22627a3bdcbf88e26058fc2221de970353614b: Status 404 returned error can't find the container with id dd17451c2e568a0423ecb45ccb22627a3bdcbf88e26058fc2221de970353614b Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.974475 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:33Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:33 crc kubenswrapper[4722]: I0309 14:04:33.993508 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:33Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.010025 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69e74824fe242fd2d54bed9734f33823697d7651043d9c97bac6ea5490625a0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:34Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.024171 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d69359acfc7a4ee7698088a7e0062b3604a5ceb92910cd3b3d69b1a6291dfdcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://687646beb4c6dd3fb9b659a36449a573f4c389b3e98b07127b18516fdc4f2c93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:34Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.041531 
4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h4zw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b9e29bb-6e51-47ab-a543-b70117ab854d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17cd3d1e3981e2f2a3b66780f753425c78452e6319fb8eae9935d3bac4456820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdw92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:
21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h4zw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:34Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.044830 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.044880 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.044893 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.044916 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.044934 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:34Z","lastTransitionTime":"2026-03-09T14:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.062295 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1fc80c43be93eeb2f9805bf314c8430f52062d0ce5d0fa1830e7e0c27e674e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f969085ae62d82dbe1f0f8f5d459c4a1bc7d1ae0a459da484f11dd6ae5fdd418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5765145caac69454a9a8e8de34d05cb1da06924dd365f3e74f52461be5027ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef1c5a3944e8693d8c68a4108c23851352407d0fac7fb7a03763d9084f117dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0106fa808804a96ca8247096399e3d04eb082b2691b1f39df3db9f6c630114c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e737417baac512fab2089f5da12e7c47ef06eae8df7803ad6b248505ff3d18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://879b959fe7540c9b645b6ada02e192abb990fb33
11a7346f0b1df41c47d62379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://879b959fe7540c9b645b6ada02e192abb990fb3311a7346f0b1df41c47d62379\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T14:04:32Z\\\",\\\"message\\\":\\\"igs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0309 14:04:32.599371 6762 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0309 14:04:32.599438 6762 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 14:04:32.599509 6762 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0309 14:04:32.599664 6762 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0309 14:04:32.599744 6762 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0309 14:04:32.599757 6762 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0309 14:04:32.599783 6762 factory.go:656] Stopping watch factory\\\\nI0309 14:04:32.599796 6762 handler.go:208] Removed *v1.Node event handler 7\\\\nI0309 14:04:32.599804 6762 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0309 14:04:32.599813 6762 handler.go:208] Removed *v1.Node event handler 2\\\\nI0309 14:04:32.599922 6762 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 14:04:32.600179 6762 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5v7ng_openshift-ovn-kubernetes(2e305619-b3a2-44c9-9e54-e1afa4f43dbf)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f5af077ae5b947194e5dba0542d053890a589582e415e1b41a932bdab6632e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5v7ng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:34Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.073465 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xsft8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e14085d-df3c-4218-aa43-0553786b8867\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqpsw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqpsw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xsft8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:34Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.091789 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee2ed55248b177bc0873ef3c41517401fdb30fc5ce5a545a1ca8c925e4d4e9e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:34Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.111357 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jv499" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd1357a3-36ef-4bc8-85bb-91f3c0f42994\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a14e62f50fa8e5403349dc8696e9c7c45a5da1026227612186e7805b1be722f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e56992f54f9fc4387130b74e3aa0f4652a5d6d960da1a5305b8668b8086fae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e56992f54f9fc4387130b74e3aa0f4652a5d6d960da1a5305b8668b8086fae2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c00d804aec8d776f5f2ab9c7a81161c9942a94564d57afe47c633fb3dc2aa7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c00d804aec8d776f5f2ab9c7a81161c9942a94564d57afe47c633fb3dc2aa7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0491f9997970cb050ece19d88d2a7064d4799db3772132f58055e1dbcf2f3a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0491f9997970cb050ece19d88d2a7064d4799db3772132f58055e1dbcf2f3a82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93c7d9a6b0dc8e76abab762ff6aae1975a31fbc80ee3a3cd32ec378298061748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93c7d9a6b0dc8e76abab762ff6aae1975a31fbc80ee3a3cd32ec378298061748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cc5ae61b0fa949ba4bf5aa60d09385399a557f19ef7c99dc43b21fc9db830e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9cc5ae61b0fa949ba4bf5aa60d09385399a557f19ef7c99dc43b21fc9db830e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a88cf1e6b90a3f13f44bbb08482773f007066c795b00cb0177f6cd2ed87cbb52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a88cf1e6b90a3f13f44bbb08482773f007066c795b00cb0177f6cd2ed87cbb52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jv499\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:34Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.128641 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l5pwg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8424174-764a-470b-ab49-e7369a42de7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d30648bba1b59fd7db5e75a61d6a004246668d148515da5738b77e5122358f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l5pwg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:34Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.148343 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 14:04:34 crc kubenswrapper[4722]: E0309 14:04:34.148538 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.149627 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 09 14:04:34 crc kubenswrapper[4722]: E0309 14:04:34.149730 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.150763 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 14:04:34 crc kubenswrapper[4722]: E0309 14:04:34.151361 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.157384 4722 scope.go:117] "RemoveContainer" containerID="b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48"
Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.157649 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.157755 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.157788 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.157819 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.157842 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:34Z","lastTransitionTime":"2026-03-09T14:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.261208 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.261295 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.261310 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.261331 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.261345 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:34Z","lastTransitionTime":"2026-03-09T14:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.364358 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.364416 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.364430 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.364448 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.364461 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:34Z","lastTransitionTime":"2026-03-09T14:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.388716 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-pvvhj"]
Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.389372 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pvvhj"
Mar 09 14:04:34 crc kubenswrapper[4722]: E0309 14:04:34.389474 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-pvvhj" podUID="f5dbc0be-527e-4f70-b185-3a10b1b11a75" Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.407106 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d69359acfc7a4ee7698088a7e0062b3604a5ceb92910cd3b3d69b1a6291dfdcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://687646beb4c6dd3fb9b659a36449a573f4c389b3e98b07127b18516fdc4f2c93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:34Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.426789 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h4zw5" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b9e29bb-6e51-47ab-a543-b70117ab854d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17cd3d1e3981e2f2a3b66780f753425c78452e6319fb8eae9935d3bac4456820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdw92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h4zw5\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:34Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.453886 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1fc80c43be93eeb2f9805bf314c8430f52062d0ce5d0fa1830e7e0c27e674e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f969085ae62d82dbe1f0f8f5d459c4a1bc7d1ae0a459da484f11dd6ae5fdd418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,
\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5765145caac69454a9a8e8de34d05cb1da06924dd365f3e74f52461be5027ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef1c5a3944e8693d8c68a4108c23851352407d0fac7fb7a03763d9084f117dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0106fa808804a96ca8247096399e3d04eb082b2691b1f39df3db9f6c630114c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://24e737417baac512fab2089f5da12e7c47ef06eae8df7803ad6b248505ff3d18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://879b959fe7540c9b645b6ada02e192abb990fb3311a7346f0b1df41c47d62379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://879b959fe7540c9b645b6ada02e192abb990fb3311a7346f0b1df41c47d62379\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T14:04:32Z\\\",\\\"message\\\":\\\"igs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0309 14:04:32.599371 6762 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0309 14:04:32.599438 6762 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 14:04:32.599509 6762 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0309 14:04:32.599664 6762 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0309 14:04:32.599744 6762 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0309 14:04:32.599757 6762 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0309 14:04:32.599783 6762 factory.go:656] Stopping watch factory\\\\nI0309 14:04:32.599796 6762 handler.go:208] Removed *v1.Node event handler 7\\\\nI0309 14:04:32.599804 6762 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0309 14:04:32.599813 6762 handler.go:208] Removed *v1.Node event handler 2\\\\nI0309 14:04:32.599922 6762 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 14:04:32.600179 6762 reflector.go:311] Stopping reflector *v1.Service (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-5v7ng_openshift-ovn-kubernetes(2e305619-b3a2-44c9-9e54-e1afa4f43dbf)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f5af077ae5b947194e5dba0542d053890a589582e415e1b41a932bdab6632e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5v7ng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:34Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.467324 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.467378 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.467393 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.467413 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.467427 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:34Z","lastTransitionTime":"2026-03-09T14:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.467835 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xsft8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e14085d-df3c-4218-aa43-0553786b8867\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:33Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:33Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqpsw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqpsw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xsft8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-03-09T14:04:34Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.487006 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f5dbc0be-527e-4f70-b185-3a10b1b11a75-metrics-certs\") pod \"network-metrics-daemon-pvvhj\" (UID: \"f5dbc0be-527e-4f70-b185-3a10b1b11a75\") " pod="openshift-multus/network-metrics-daemon-pvvhj" Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.487087 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vz4kf\" (UniqueName: \"kubernetes.io/projected/f5dbc0be-527e-4f70-b185-3a10b1b11a75-kube-api-access-vz4kf\") pod \"network-metrics-daemon-pvvhj\" (UID: \"f5dbc0be-527e-4f70-b185-3a10b1b11a75\") " pod="openshift-multus/network-metrics-daemon-pvvhj" Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.496551 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee2ed55248b177bc0873ef3c41517401fdb30fc5ce5a545a1ca8c925e4d4e9e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:34Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.516144 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jv499" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd1357a3-36ef-4bc8-85bb-91f3c0f42994\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a14e62f50fa8e5403349dc8696e9c7c45a5da1026227612186e7805b1be722f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e56992f54f9fc4387130b74e3aa0f4652a5d6d960da1a5305b8668b8086fae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e56992f54f9fc4387130b74e3aa0f4652a5d6d960da1a5305b8668b8086fae2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c00d804aec8d776f5f2ab9c7a81161c9942a94564d57afe47c633fb3dc2aa7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c00d804aec8d776f5f2ab9c7a81161c9942a94564d57afe47c633fb3dc2aa7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0491f9997970cb050ece19d88d2a7064d4799db3772132f58055e1dbcf2f3a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0491f9997970cb050ece19d88d2a7064d4799db3772132f58055e1dbcf2f3a82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93c7d9a6b0dc8e76abab762ff6aae1975a31fbc80ee3a3cd32ec378298061748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93c7d9a6b0dc8e76abab762ff6aae1975a31fbc80ee3a3cd32ec378298061748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cc5ae61b0fa949ba4bf5aa60d09385399a557f19ef7c99dc43b21fc9db830e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9cc5ae61b0fa949ba4bf5aa60d09385399a557f19ef7c99dc43b21fc9db830e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a88cf1e6b90a3f13f44bbb08482773f007066c795b00cb0177f6cd2ed87cbb52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a88cf1e6b90a3f13f44bbb08482773f007066c795b00cb0177f6cd2ed87cbb52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jv499\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:34Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.529495 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l5pwg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8424174-764a-470b-ab49-e7369a42de7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d30648bba1b59fd7db5e75a61d6a004246668d148515da5738b77e5122358f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l5pwg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:34Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.542034 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pvvhj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5dbc0be-527e-4f70-b185-3a10b1b11a75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pvvhj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:34Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.566716 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2267a4d7-87b4-43f7-80c7-7a7ffb0327fc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04169d9b8359e26cc9807e27fb9a5d54ca0946aaade294c5bcc272a7d21c4db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09fa62e7293a50fb3f6bdaf37061cbe5e95a830c8f7337660ce73138dbcaa641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31ba786781fcdb781f5fa45276e29e94756863d2398109d91e81a52f6f88bee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dff721159b382bcfdde6f406a328ef2ee9264b
35a8d8baeb7a3c6d31a60ff3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63769aca40f721bec0716427145a3ceea0595836c03d31c98f6d45ff14de68be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6625c60c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6625c60c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:34Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.569483 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.569541 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.569552 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.569569 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.569579 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:34Z","lastTransitionTime":"2026-03-09T14:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.584713 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0965e24b-526e-4842-ac1f-eca7a765355d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56c76a5e8718243a17d8d9fd9a8dcf6083c2ccb65b4816926dca08a89e465728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3836660bf646052991965270cd69b8b8237b5c4bd698a73789b050d23e6877\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dbef38ff5e714c92fc8a86f1e3c64adf606cb198145f884a000f8dee200fcf2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T14:03:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 14:03:46.681607 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 14:03:46.681849 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 14:03:46.682597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2315154617/tls.crt::/tmp/serving-cert-2315154617/tls.key\\\\\\\"\\\\nI0309 14:03:46.866095 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 14:03:46.869876 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 14:03:46.869896 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 14:03:46.870280 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 14:03:46.870293 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 14:03:46.877096 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 14:03:46.877117 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 14:03:46.877121 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877135 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 14:03:46.877139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 14:03:46.877143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 14:03:46.877146 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 14:03:46.878419 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:03:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fbabc5ec2c100150ce886c06c3bd78cc5896c8a06ac9d262eeb552a8a8bb5d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:34Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.588172 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f5dbc0be-527e-4f70-b185-3a10b1b11a75-metrics-certs\") pod \"network-metrics-daemon-pvvhj\" (UID: \"f5dbc0be-527e-4f70-b185-3a10b1b11a75\") " pod="openshift-multus/network-metrics-daemon-pvvhj" Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.588248 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vz4kf\" (UniqueName: \"kubernetes.io/projected/f5dbc0be-527e-4f70-b185-3a10b1b11a75-kube-api-access-vz4kf\") pod \"network-metrics-daemon-pvvhj\" (UID: \"f5dbc0be-527e-4f70-b185-3a10b1b11a75\") " pod="openshift-multus/network-metrics-daemon-pvvhj" Mar 09 14:04:34 crc kubenswrapper[4722]: E0309 14:04:34.588420 4722 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 
14:04:34 crc kubenswrapper[4722]: E0309 14:04:34.588548 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f5dbc0be-527e-4f70-b185-3a10b1b11a75-metrics-certs podName:f5dbc0be-527e-4f70-b185-3a10b1b11a75 nodeName:}" failed. No retries permitted until 2026-03-09 14:04:35.088509743 +0000 UTC m=+115.644078359 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f5dbc0be-527e-4f70-b185-3a10b1b11a75-metrics-certs") pod "network-metrics-daemon-pvvhj" (UID: "f5dbc0be-527e-4f70-b185-3a10b1b11a75") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.605739 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:34Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.620380 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:34Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.621468 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vz4kf\" (UniqueName: \"kubernetes.io/projected/f5dbc0be-527e-4f70-b185-3a10b1b11a75-kube-api-access-vz4kf\") pod \"network-metrics-daemon-pvvhj\" (UID: \"f5dbc0be-527e-4f70-b185-3a10b1b11a75\") " pod="openshift-multus/network-metrics-daemon-pvvhj" Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.635274 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-72xb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43e3c934-6aa7-4c96-a3a6-378f7931fc2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcf5e49245fd6051b12465a36b27f657b4923a62f88d61fe0fac971c4a2c9197\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lfx7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"
ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-72xb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:34Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.651138 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dac2aaf5-653b-4b2a-8efe-ed26bac8d648\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf5b268d6a355ae3a2c40981769412124b85cdfd99364ed58e5163a744f5f83c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q6db\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f451493fa6d86f12c1291c0fe262bcbe4b62cef45c8dda696bb48fa3ded2eeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q6db\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hjrrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:34Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.666759 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:34Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.671693 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.671733 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.671745 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.671766 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.671782 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:34Z","lastTransitionTime":"2026-03-09T14:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.680706 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69e74824fe242fd2d54bed9734f33823697d7651043d9c97bac6ea5490625a0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:34Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.743184 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xsft8" event={"ID":"3e14085d-df3c-4218-aa43-0553786b8867","Type":"ContainerStarted","Data":"435bcac0d223a0ff9431ac5bc5be017aa95d2e567d991a1f4edc0f79559904e6"} Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.743318 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xsft8" event={"ID":"3e14085d-df3c-4218-aa43-0553786b8867","Type":"ContainerStarted","Data":"8fb0f733e44fc7a699961d799267ca3c01e65a62b29d74e6ae682bccba9e5281"} Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.743344 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xsft8" event={"ID":"3e14085d-df3c-4218-aa43-0553786b8867","Type":"ContainerStarted","Data":"dd17451c2e568a0423ecb45ccb22627a3bdcbf88e26058fc2221de970353614b"} Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.746166 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.749159 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"28fcbb961b20cd0cbc1e8548492168a0fe9920b7b98c7bfeffe34727c299030e"} Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.749643 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.761528 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h4zw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b9e29bb-6e51-47ab-a543-b70117ab854d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17cd3d1e3981e2f2a3b66780f753425c78452e6319fb8eae9935d3bac4456820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"mult
us-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdw92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h4zw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:34Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.774466 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.774518 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.774531 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.774552 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.774562 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:34Z","lastTransitionTime":"2026-03-09T14:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.792815 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1fc80c43be93eeb2f9805bf314c8430f52062d0ce5d0fa1830e7e0c27e674e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f969085ae62d82dbe1f0f8f5d459c4a1bc7d1ae0a459da484f11dd6ae5fdd418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://f5765145caac69454a9a8e8de34d05cb1da06924dd365f3e74f52461be5027ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef1c5a3944e8693d8c68a4108c23851352407d0fac7fb7a03763d9084f117dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0106fa808804a96ca8247096399e3d04eb082b2691b1f39df3db9f6c630114c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e737417baac512fab2089f5da12e7c47ef06eae8df7803ad6b248505ff3d18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://879b959fe7540c9b645b6ada02e192abb990fb3311a7346f0b1df41c47d62379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://879b959fe7540c9b645b6ada02e192abb990fb3311a7346f0b1df41c47d62379\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T14:04:32Z\\\",\\\"message\\\":\\\"igs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0309 14:04:32.599371 6762 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0309 14:04:32.599438 6762 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 14:04:32.599509 6762 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0309 14:04:32.599664 6762 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0309 14:04:32.599744 6762 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0309 14:04:32.599757 6762 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0309 14:04:32.599783 6762 factory.go:656] Stopping watch factory\\\\nI0309 14:04:32.599796 6762 handler.go:208] Removed *v1.Node event handler 7\\\\nI0309 14:04:32.599804 6762 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0309 14:04:32.599813 6762 handler.go:208] Removed *v1.Node event handler 2\\\\nI0309 14:04:32.599922 6762 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 14:04:32.600179 6762 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5v7ng_openshift-ovn-kubernetes(2e305619-b3a2-44c9-9e54-e1afa4f43dbf)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f5af077ae5b947194e5dba0542d053890a589582e415e1b41a932bdab6632e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5v7ng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:34Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.808690 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xsft8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e14085d-df3c-4218-aa43-0553786b8867\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fb0f733e44fc7a699961d799267ca3c01e65a62b29d74e6ae682bccba9e5281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqpsw
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://435bcac0d223a0ff9431ac5bc5be017aa95d2e567d991a1f4edc0f79559904e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqpsw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xsft8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:34Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.824939 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d69359acfc7a4ee7698088a7e0062b3604a5ceb92910cd3b3d69b1a6291dfdcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://687646beb4c6dd3fb9b659a36449a573f4c389b3e98b07127b18516fdc4f2c93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:34Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.841552 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jv499" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd1357a3-36ef-4bc8-85bb-91f3c0f42994\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a14e62f50fa8e5403349dc8696e9c7c45a5da1026227612186e7805b1be722f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e56992f54f9fc4387130b74e3aa0f4652a5d6d960da1a5305b8668b8086fae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e56992f54f9fc4387130b74e3aa0f4652a5d6d960da1a5305b8668b8086fae2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c00d804aec8d776f5f2ab9c7a81161c9942a94564d57afe47c633fb3dc2aa7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c00d804aec8d776f5f2ab9c7a81161c9942a94564d57afe47c633fb3dc2aa7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0491f9997970cb050ece19d88d2a7064d4799db3772132f58055e1dbcf2f3a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0491f9997970cb050ece19d88d2a7064d4799db3772132f58055e1dbcf2f3a82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93c7d9a6b0dc8e76abab762ff6aae1975a31fbc80ee3a3cd32ec378298061748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93c7d9a6b0dc8e76abab762ff6aae1975a31fbc80ee3a3cd32ec378298061748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cc5ae61b0fa949ba4bf5aa60d09385399a557f19ef7c99dc43b21fc9db830e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9cc5ae61b0fa949ba4bf5aa60d09385399a557f19ef7c99dc43b21fc9db830e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a88cf1e6b90a3f13f44bbb08482773f007066c795b00cb0177f6cd2ed87cbb52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a88cf1e6b90a3f13f44bbb08482773f007066c795b00cb0177f6cd2ed87cbb52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jv499\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:34Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.857693 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l5pwg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8424174-764a-470b-ab49-e7369a42de7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d30648bba1b59fd7db5e75a61d6a004246668d148515da5738b77e5122358f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l5pwg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:34Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.877988 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.878050 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.878066 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.878095 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.878111 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:34Z","lastTransitionTime":"2026-03-09T14:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.890343 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee2ed55248b177bc0873ef3c41517401fdb30fc5ce5a545a1ca8c925e4d4e9e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:34Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.919900 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2267a4d7-87b4-43f7-80c7-7a7ffb0327fc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04169d9b8359e26cc9807e27fb9a5d54ca0946aaade294c5bcc272a7d21c4db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09fa62e7293a50fb3f6bdaf37061cbe5e95a830c8f7337660ce73138dbcaa641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31ba786781fcdb781f5fa45276e29e94756863d2398109d91e81a52f6f88bee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dff721159b382bcfdde6f406a328ef2ee9264b
35a8d8baeb7a3c6d31a60ff3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63769aca40f721bec0716427145a3ceea0595836c03d31c98f6d45ff14de68be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6625c60c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6625c60c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:34Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.940785 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0965e24b-526e-4842-ac1f-eca7a765355d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56c76a5e8718243a17d8d9fd9a8dcf6083c2ccb65b4816926dca08a89e465728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3836660bf646052991965270cd69b8b8237b5c4bd698a73789b050d23e6877\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dbef38ff5e714c92fc8a86f1e3c64adf606cb198145f884a000f8dee200fcf2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T14:03:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 14:03:46.681607 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 14:03:46.681849 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 14:03:46.682597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2315154617/tls.crt::/tmp/serving-cert-2315154617/tls.key\\\\\\\"\\\\nI0309 14:03:46.866095 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 14:03:46.869876 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 14:03:46.869896 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 14:03:46.870280 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 14:03:46.870293 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 14:03:46.877096 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 14:03:46.877117 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 14:03:46.877121 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877135 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 14:03:46.877139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 14:03:46.877143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 14:03:46.877146 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 14:03:46.878419 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:03:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fbabc5ec2c100150ce886c06c3bd78cc5896c8a06ac9d262eeb552a8a8bb5d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:34Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.957475 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:34Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.977476 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:34Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.980665 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.980691 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.980700 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.980716 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.980727 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:34Z","lastTransitionTime":"2026-03-09T14:04:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:34 crc kubenswrapper[4722]: I0309 14:04:34.998402 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-72xb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43e3c934-6aa7-4c96-a3a6-378f7931fc2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcf5e49245fd6051b12465a36b27f657b4923a62f88d61fe0fac971c4a2c9197\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lfx7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-72xb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:34Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:35 crc kubenswrapper[4722]: I0309 14:04:35.015904 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dac2aaf5-653b-4b2a-8efe-ed26bac8d648\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf5b268d6a355ae3a2c40981769412124b85cdfd99364ed58e5163a744f5f83c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q6db\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f451493fa6d86f12c1291c0fe262bcbe4b62cef45c8dda696bb48fa3ded2eeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q6db\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hjrrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:35Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:35 crc kubenswrapper[4722]: I0309 14:04:35.034504 4722 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/network-metrics-daemon-pvvhj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5dbc0be-527e-4f70-b185-3a10b1b11a75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pvvhj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:35Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:35 crc kubenswrapper[4722]: I0309 14:04:35.049629 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:35Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:35 crc kubenswrapper[4722]: I0309 14:04:35.065140 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69e74824fe242fd2d54bed9734f33823697d7651043d9c97bac6ea5490625a0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:35Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:35 crc kubenswrapper[4722]: I0309 14:04:35.083046 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:35 crc kubenswrapper[4722]: I0309 14:04:35.083089 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:35 crc kubenswrapper[4722]: I0309 14:04:35.083098 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:35 crc kubenswrapper[4722]: I0309 14:04:35.083434 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:35 crc kubenswrapper[4722]: I0309 14:04:35.083492 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:35Z","lastTransitionTime":"2026-03-09T14:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:35 crc kubenswrapper[4722]: I0309 14:04:35.083928 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:35Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:35 crc kubenswrapper[4722]: I0309 14:04:35.094448 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f5dbc0be-527e-4f70-b185-3a10b1b11a75-metrics-certs\") pod \"network-metrics-daemon-pvvhj\" (UID: \"f5dbc0be-527e-4f70-b185-3a10b1b11a75\") " pod="openshift-multus/network-metrics-daemon-pvvhj" Mar 09 14:04:35 crc kubenswrapper[4722]: E0309 14:04:35.094608 4722 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 14:04:35 crc kubenswrapper[4722]: E0309 14:04:35.094659 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f5dbc0be-527e-4f70-b185-3a10b1b11a75-metrics-certs podName:f5dbc0be-527e-4f70-b185-3a10b1b11a75 nodeName:}" failed. No retries permitted until 2026-03-09 14:04:36.094643153 +0000 UTC m=+116.650211729 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f5dbc0be-527e-4f70-b185-3a10b1b11a75-metrics-certs") pod "network-metrics-daemon-pvvhj" (UID: "f5dbc0be-527e-4f70-b185-3a10b1b11a75") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 14:04:35 crc kubenswrapper[4722]: I0309 14:04:35.097848 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69e74824fe242fd2d54bed9734f33823697d7651043d9c97bac6ea5490625a0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:35Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:35 crc kubenswrapper[4722]: I0309 14:04:35.112530 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d69359acfc7a4ee7698088a7e0062b3604a5ceb92910cd3b3d69b1a6291dfdcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://687646beb4c6dd3fb9b659a36449a573f4c389b3e98b07127b18516fdc4f2c93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:35Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:35 crc kubenswrapper[4722]: I0309 14:04:35.126701 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h4zw5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b9e29bb-6e51-47ab-a543-b70117ab854d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17cd3d1e3981e2f2a3b66780f753425c78452e6319fb8eae9935d3bac4456820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdw92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h4zw5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:35Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:35 crc kubenswrapper[4722]: I0309 14:04:35.151078 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1fc80c43be93eeb2f9805bf314c8430f52062d0ce5d0fa1830e7e0c27e674e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f969085ae62d82dbe1f0f8f5d459c4a1bc7d1ae0a459da484f11dd6ae5fdd418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5765145caac69454a9a8e8de34d05cb1da06924dd365f3e74f52461be5027ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef1c5a3944e8693d8c68a4108c23851352407d0fac7fb7a03763d9084f117dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0106fa808804a96ca8247096399e3d04eb082b2691b1f39df3db9f6c630114c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e73
7417baac512fab2089f5da12e7c47ef06eae8df7803ad6b248505ff3d18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://879b959fe7540c9b645b6ada02e192abb990fb3311a7346f0b1df41c47d62379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://879b959fe7540c9b645b6ada02e192abb990fb3311a7346f0b1df41c47d62379\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T14:04:32Z\\\",\\\"message\\\":\\\"igs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0309 14:04:32.599371 6762 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0309 14:04:32.599438 6762 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 14:04:32.599509 6762 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0309 14:04:32.599664 6762 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0309 14:04:32.599744 6762 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0309 14:04:32.599757 6762 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0309 14:04:32.599783 6762 factory.go:656] Stopping watch factory\\\\nI0309 14:04:32.599796 6762 handler.go:208] Removed *v1.Node event handler 7\\\\nI0309 14:04:32.599804 6762 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0309 14:04:32.599813 6762 handler.go:208] Removed *v1.Node event handler 2\\\\nI0309 14:04:32.599922 6762 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 14:04:32.600179 6762 reflector.go:311] Stopping reflector *v1.Service (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-5v7ng_openshift-ovn-kubernetes(2e305619-b3a2-44c9-9e54-e1afa4f43dbf)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f5af077ae5b947194e5dba0542d053890a589582e415e1b41a932bdab6632e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5v7ng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:35Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:35 crc kubenswrapper[4722]: I0309 14:04:35.164135 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xsft8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e14085d-df3c-4218-aa43-0553786b8867\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fb0f733e44fc7a699961d799267ca3c01e65a62b29d74e6ae682bccba9e5281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqpsw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://435bcac0d223a0ff9431ac5bc5be017aa95d2e567d991a1f4edc0f79559904e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqpsw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xsft8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:35Z is after 2025-08-24T17:21:41Z" Mar 09 
14:04:35 crc kubenswrapper[4722]: I0309 14:04:35.180570 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee2ed55248b177bc0873ef3c41517401fdb30fc5ce5a545a1ca8c925e4d4e9e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:35Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:35 crc kubenswrapper[4722]: I0309 14:04:35.186511 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:35 crc kubenswrapper[4722]: I0309 14:04:35.186554 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:35 crc kubenswrapper[4722]: I0309 14:04:35.186567 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:35 crc kubenswrapper[4722]: I0309 14:04:35.186583 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:35 crc kubenswrapper[4722]: I0309 14:04:35.186595 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:35Z","lastTransitionTime":"2026-03-09T14:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:35 crc kubenswrapper[4722]: I0309 14:04:35.213416 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jv499" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd1357a3-36ef-4bc8-85bb-91f3c0f42994\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a14e62f50fa8e5403349dc8696e9c7c45a5da1026227612186e7805b1be722f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e56992f54f9fc4387130b74e3aa0f4652a5d6d960da1a5305b8668b8086fae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e56992f54f9fc4387130b74e3aa0f4652a5d6d960da1a5305b8668b8086fae2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c00d804aec8d776f5f2ab9c7a81161c9942a94564d57afe47c633fb3dc2aa7d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c00d804aec8d776f5f2ab9c7a81161c9942a94564d57afe47c633fb3dc2aa7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0491f9997970cb050ece19d88d2a7064d4799db3772132f58055e1dbcf2f3a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0491f9997970cb050ece19d88d2a7064d4799db3772132f58055e1dbcf2f3a82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93c7d9a6b0dc8e76abab762ff6aae1975a31fbc80ee3a3cd32ec378298061748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93c7d9a6b0dc8e76abab762ff6aae1975a31fbc80ee3a3cd32ec378298061748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cc5ae61b0fa949ba4bf5aa60d09385399a557f19ef7c99dc43b21fc9db830e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9cc5ae61b0fa949ba4bf5aa60d09385399a557f19ef7c99dc43b21fc9db830e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a88cf1e6b90a3f13f44bbb08482773f007066c795b00cb0177f6cd2ed87cbb52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a88cf1e6b90a3f13f44bbb08482773f007066c795b00cb0177f6cd2ed87cbb52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jv499\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:35Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:35 crc kubenswrapper[4722]: I0309 14:04:35.230995 4722 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-l5pwg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8424174-764a-470b-ab49-e7369a42de7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d30648bba1b59fd7db5e75a61d6a004246668d148515da5738b77e5122358f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l5pwg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:35Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:35 crc kubenswrapper[4722]: I0309 14:04:35.255253 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pvvhj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5dbc0be-527e-4f70-b185-3a10b1b11a75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pvvhj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:35Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:35 crc kubenswrapper[4722]: I0309 14:04:35.276910 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2267a4d7-87b4-43f7-80c7-7a7ffb0327fc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04169d9b8359e26cc9807e27fb9a5d54ca0946aaade294c5bcc272a7d21c4db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09fa62e7293a50fb3f6bdaf37061cbe5e95a830c8f7337660ce73138dbcaa641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31ba786781fcdb781f5fa45276e29e94756863d2398109d91e81a52f6f88bee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dff721159b382bcfdde6f406a328ef2ee9264b
35a8d8baeb7a3c6d31a60ff3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63769aca40f721bec0716427145a3ceea0595836c03d31c98f6d45ff14de68be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6625c60c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6625c60c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:35Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:35 crc kubenswrapper[4722]: I0309 14:04:35.289211 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:35 crc kubenswrapper[4722]: I0309 14:04:35.289268 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:35 crc kubenswrapper[4722]: I0309 14:04:35.289280 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:35 crc kubenswrapper[4722]: I0309 14:04:35.289299 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:35 crc kubenswrapper[4722]: I0309 14:04:35.289311 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:35Z","lastTransitionTime":"2026-03-09T14:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:35 crc kubenswrapper[4722]: I0309 14:04:35.294773 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0965e24b-526e-4842-ac1f-eca7a765355d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56c76a5e8718243a17d8d9fd9a8dcf6083c2ccb65b4816926dca08a89e465728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3836660bf646052991965270cd69b8b8237b5c4bd698a73789b050d23e6877\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dbef38ff5e714c92fc8a86f1e3c64adf606cb198145f884a000f8dee200fcf2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28fcbb961b20cd0cbc1e8548492168a0fe9920b7b98c7bfeffe34727c299030e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T14:03:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 14:03:46.681607 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 14:03:46.681849 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 14:03:46.682597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2315154617/tls.crt::/tmp/serving-cert-2315154617/tls.key\\\\\\\"\\\\nI0309 14:03:46.866095 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 14:03:46.869876 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 14:03:46.869896 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 14:03:46.870280 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 14:03:46.870293 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 14:03:46.877096 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 14:03:46.877117 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 14:03:46.877121 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877135 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 14:03:46.877139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 14:03:46.877143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 14:03:46.877146 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 14:03:46.878419 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:03:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fbabc5ec2c100150ce886c06c3bd78cc5896c8a06ac9d262eeb552a8a8bb5d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:35Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:35 crc kubenswrapper[4722]: I0309 14:04:35.309900 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:35Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:35 crc kubenswrapper[4722]: I0309 14:04:35.326725 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:35Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:35 crc kubenswrapper[4722]: I0309 14:04:35.338565 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-72xb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43e3c934-6aa7-4c96-a3a6-378f7931fc2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcf5e49245fd6051b12465a36b27f657b4923a62f88d61fe0fac971c4a2c9197\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lfx7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-72xb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:35Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:35 crc kubenswrapper[4722]: I0309 14:04:35.352029 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dac2aaf5-653b-4b2a-8efe-ed26bac8d648\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf5b268d6a355ae3a2c40981769412124b85cdfd99364ed58e5163a744f5f83c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q6db\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f451493fa6d86f12c1291c0fe262bcbe4b62cef45c8dda696bb48fa3ded2eeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q6db\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hjrrb\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:35Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:35 crc kubenswrapper[4722]: I0309 14:04:35.392126 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:35 crc kubenswrapper[4722]: I0309 14:04:35.392166 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:35 crc kubenswrapper[4722]: I0309 14:04:35.392174 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:35 crc kubenswrapper[4722]: I0309 14:04:35.392188 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:35 crc kubenswrapper[4722]: I0309 14:04:35.392219 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:35Z","lastTransitionTime":"2026-03-09T14:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:35 crc kubenswrapper[4722]: I0309 14:04:35.495001 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:35 crc kubenswrapper[4722]: I0309 14:04:35.495376 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:35 crc kubenswrapper[4722]: I0309 14:04:35.495386 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:35 crc kubenswrapper[4722]: I0309 14:04:35.495400 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:35 crc kubenswrapper[4722]: I0309 14:04:35.495409 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:35Z","lastTransitionTime":"2026-03-09T14:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:35 crc kubenswrapper[4722]: I0309 14:04:35.599814 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:35 crc kubenswrapper[4722]: I0309 14:04:35.599889 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:35 crc kubenswrapper[4722]: I0309 14:04:35.599914 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:35 crc kubenswrapper[4722]: I0309 14:04:35.599946 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:35 crc kubenswrapper[4722]: I0309 14:04:35.599973 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:35Z","lastTransitionTime":"2026-03-09T14:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:35 crc kubenswrapper[4722]: I0309 14:04:35.704385 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:35 crc kubenswrapper[4722]: I0309 14:04:35.704466 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:35 crc kubenswrapper[4722]: I0309 14:04:35.704489 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:35 crc kubenswrapper[4722]: I0309 14:04:35.704525 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:35 crc kubenswrapper[4722]: I0309 14:04:35.704549 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:35Z","lastTransitionTime":"2026-03-09T14:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:35 crc kubenswrapper[4722]: I0309 14:04:35.809302 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:35 crc kubenswrapper[4722]: I0309 14:04:35.809575 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:35 crc kubenswrapper[4722]: I0309 14:04:35.809601 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:35 crc kubenswrapper[4722]: I0309 14:04:35.809628 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:35 crc kubenswrapper[4722]: I0309 14:04:35.809649 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:35Z","lastTransitionTime":"2026-03-09T14:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:35 crc kubenswrapper[4722]: I0309 14:04:35.914117 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:35 crc kubenswrapper[4722]: I0309 14:04:35.914236 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:35 crc kubenswrapper[4722]: I0309 14:04:35.914249 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:35 crc kubenswrapper[4722]: I0309 14:04:35.914269 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:35 crc kubenswrapper[4722]: I0309 14:04:35.914286 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:35Z","lastTransitionTime":"2026-03-09T14:04:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:36 crc kubenswrapper[4722]: I0309 14:04:36.018342 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:36 crc kubenswrapper[4722]: I0309 14:04:36.018412 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:36 crc kubenswrapper[4722]: I0309 14:04:36.018430 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:36 crc kubenswrapper[4722]: I0309 14:04:36.018454 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:36 crc kubenswrapper[4722]: I0309 14:04:36.018467 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:36Z","lastTransitionTime":"2026-03-09T14:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:36 crc kubenswrapper[4722]: I0309 14:04:36.105403 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f5dbc0be-527e-4f70-b185-3a10b1b11a75-metrics-certs\") pod \"network-metrics-daemon-pvvhj\" (UID: \"f5dbc0be-527e-4f70-b185-3a10b1b11a75\") " pod="openshift-multus/network-metrics-daemon-pvvhj" Mar 09 14:04:36 crc kubenswrapper[4722]: E0309 14:04:36.105797 4722 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 14:04:36 crc kubenswrapper[4722]: E0309 14:04:36.105999 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f5dbc0be-527e-4f70-b185-3a10b1b11a75-metrics-certs podName:f5dbc0be-527e-4f70-b185-3a10b1b11a75 nodeName:}" failed. No retries permitted until 2026-03-09 14:04:38.105953646 +0000 UTC m=+118.661522362 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f5dbc0be-527e-4f70-b185-3a10b1b11a75-metrics-certs") pod "network-metrics-daemon-pvvhj" (UID: "f5dbc0be-527e-4f70-b185-3a10b1b11a75") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 14:04:36 crc kubenswrapper[4722]: I0309 14:04:36.121564 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:36 crc kubenswrapper[4722]: I0309 14:04:36.121641 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:36 crc kubenswrapper[4722]: I0309 14:04:36.121659 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:36 crc kubenswrapper[4722]: I0309 14:04:36.121688 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:36 crc kubenswrapper[4722]: I0309 14:04:36.121707 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:36Z","lastTransitionTime":"2026-03-09T14:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:36 crc kubenswrapper[4722]: I0309 14:04:36.149333 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pvvhj" Mar 09 14:04:36 crc kubenswrapper[4722]: I0309 14:04:36.149391 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 14:04:36 crc kubenswrapper[4722]: E0309 14:04:36.149493 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pvvhj" podUID="f5dbc0be-527e-4f70-b185-3a10b1b11a75" Mar 09 14:04:36 crc kubenswrapper[4722]: E0309 14:04:36.149764 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 14:04:36 crc kubenswrapper[4722]: I0309 14:04:36.149958 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 14:04:36 crc kubenswrapper[4722]: E0309 14:04:36.150057 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
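
The "network is not ready" and "No sandbox for pod can be found" errors above all stem from one condition: CRI-O reports NetworkReady=false because no CNI configuration file exists yet in /etc/kubernetes/cni/net.d/ (the OVN-Kubernetes components that would write it are themselves stalled behind the expired webhook certificate). Below is a minimal Go sketch of the readiness test implied by the message — an illustration with a hypothetical networkReady helper, not CRI-O's actual code:

    package main

    import (
        "fmt"
        "os"
        "path/filepath"
    )

    // networkReady mimics the check implied by the log message: the runtime
    // treats the network plugin as ready only once at least one CNI
    // configuration file (*.conf, *.conflist, *.json) appears in confDir.
    func networkReady(confDir string) (bool, error) {
        for _, pat := range []string{"*.conf", "*.conflist", "*.json"} {
            matches, err := filepath.Glob(filepath.Join(confDir, pat))
            if err != nil {
                return false, err
            }
            if len(matches) > 0 {
                return true, nil
            }
        }
        return false, nil
    }

    func main() {
        ready, err := networkReady("/etc/kubernetes/cni/net.d")
        if err != nil {
            fmt.Fprintln(os.Stderr, err)
            os.Exit(1)
        }
        if !ready {
            // Mirrors the kubelet's NodeNotReady message seen throughout this log.
            fmt.Println("NetworkReady=false: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?")
        }
    }

Until a network plugin drops its configuration into that directory, the kubelet keeps the node's Ready condition False and declines to create new pod sandboxes, which is the loop the surrounding entries record at roughly 100ms intervals.
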
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 14:04:36 crc kubenswrapper[4722]: I0309 14:04:36.149966 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 14:04:36 crc kubenswrapper[4722]: E0309 14:04:36.150164 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 14:04:36 crc kubenswrapper[4722]: I0309 14:04:36.224563 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:36 crc kubenswrapper[4722]: I0309 14:04:36.224623 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:36 crc kubenswrapper[4722]: I0309 14:04:36.224640 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:36 crc kubenswrapper[4722]: I0309 14:04:36.224665 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:36 crc kubenswrapper[4722]: I0309 14:04:36.224681 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:36Z","lastTransitionTime":"2026-03-09T14:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:36 crc kubenswrapper[4722]: I0309 14:04:36.328046 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:36 crc kubenswrapper[4722]: I0309 14:04:36.328113 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:36 crc kubenswrapper[4722]: I0309 14:04:36.328126 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:36 crc kubenswrapper[4722]: I0309 14:04:36.328146 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:36 crc kubenswrapper[4722]: I0309 14:04:36.328159 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:36Z","lastTransitionTime":"2026-03-09T14:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:36 crc kubenswrapper[4722]: I0309 14:04:36.430447 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:36 crc kubenswrapper[4722]: I0309 14:04:36.430517 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:36 crc kubenswrapper[4722]: I0309 14:04:36.430532 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:36 crc kubenswrapper[4722]: I0309 14:04:36.430556 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:36 crc kubenswrapper[4722]: I0309 14:04:36.430571 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:36Z","lastTransitionTime":"2026-03-09T14:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:36 crc kubenswrapper[4722]: I0309 14:04:36.534317 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:36 crc kubenswrapper[4722]: I0309 14:04:36.534378 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:36 crc kubenswrapper[4722]: I0309 14:04:36.534392 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:36 crc kubenswrapper[4722]: I0309 14:04:36.534415 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:36 crc kubenswrapper[4722]: I0309 14:04:36.534431 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:36Z","lastTransitionTime":"2026-03-09T14:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:36 crc kubenswrapper[4722]: I0309 14:04:36.637781 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:36 crc kubenswrapper[4722]: I0309 14:04:36.637837 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:36 crc kubenswrapper[4722]: I0309 14:04:36.637851 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:36 crc kubenswrapper[4722]: I0309 14:04:36.637869 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:36 crc kubenswrapper[4722]: I0309 14:04:36.637882 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:36Z","lastTransitionTime":"2026-03-09T14:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:36 crc kubenswrapper[4722]: I0309 14:04:36.741282 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:36 crc kubenswrapper[4722]: I0309 14:04:36.741365 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:36 crc kubenswrapper[4722]: I0309 14:04:36.741388 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:36 crc kubenswrapper[4722]: I0309 14:04:36.741421 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:36 crc kubenswrapper[4722]: I0309 14:04:36.741445 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:36Z","lastTransitionTime":"2026-03-09T14:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:36 crc kubenswrapper[4722]: I0309 14:04:36.845064 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:36 crc kubenswrapper[4722]: I0309 14:04:36.845142 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:36 crc kubenswrapper[4722]: I0309 14:04:36.845175 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:36 crc kubenswrapper[4722]: I0309 14:04:36.845250 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:36 crc kubenswrapper[4722]: I0309 14:04:36.845284 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:36Z","lastTransitionTime":"2026-03-09T14:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:36 crc kubenswrapper[4722]: I0309 14:04:36.948245 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:36 crc kubenswrapper[4722]: I0309 14:04:36.948325 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:36 crc kubenswrapper[4722]: I0309 14:04:36.948351 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:36 crc kubenswrapper[4722]: I0309 14:04:36.948385 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:36 crc kubenswrapper[4722]: I0309 14:04:36.948407 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:36Z","lastTransitionTime":"2026-03-09T14:04:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:37 crc kubenswrapper[4722]: I0309 14:04:37.051669 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:37 crc kubenswrapper[4722]: I0309 14:04:37.051723 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:37 crc kubenswrapper[4722]: I0309 14:04:37.051736 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:37 crc kubenswrapper[4722]: I0309 14:04:37.051753 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:37 crc kubenswrapper[4722]: I0309 14:04:37.051762 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:37Z","lastTransitionTime":"2026-03-09T14:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:37 crc kubenswrapper[4722]: I0309 14:04:37.155270 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:37 crc kubenswrapper[4722]: I0309 14:04:37.155333 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:37 crc kubenswrapper[4722]: I0309 14:04:37.155352 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:37 crc kubenswrapper[4722]: I0309 14:04:37.155372 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:37 crc kubenswrapper[4722]: I0309 14:04:37.155394 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:37Z","lastTransitionTime":"2026-03-09T14:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:37 crc kubenswrapper[4722]: I0309 14:04:37.259064 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:37 crc kubenswrapper[4722]: I0309 14:04:37.259185 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:37 crc kubenswrapper[4722]: I0309 14:04:37.259251 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:37 crc kubenswrapper[4722]: I0309 14:04:37.259281 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:37 crc kubenswrapper[4722]: I0309 14:04:37.259305 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:37Z","lastTransitionTime":"2026-03-09T14:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:37 crc kubenswrapper[4722]: I0309 14:04:37.363247 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:37 crc kubenswrapper[4722]: I0309 14:04:37.363320 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:37 crc kubenswrapper[4722]: I0309 14:04:37.363337 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:37 crc kubenswrapper[4722]: I0309 14:04:37.363366 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:37 crc kubenswrapper[4722]: I0309 14:04:37.363385 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:37Z","lastTransitionTime":"2026-03-09T14:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:37 crc kubenswrapper[4722]: I0309 14:04:37.467366 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:37 crc kubenswrapper[4722]: I0309 14:04:37.467483 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:37 crc kubenswrapper[4722]: I0309 14:04:37.467517 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:37 crc kubenswrapper[4722]: I0309 14:04:37.467565 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:37 crc kubenswrapper[4722]: I0309 14:04:37.467710 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:37Z","lastTransitionTime":"2026-03-09T14:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:37 crc kubenswrapper[4722]: I0309 14:04:37.486511 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:37 crc kubenswrapper[4722]: I0309 14:04:37.486555 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:37 crc kubenswrapper[4722]: I0309 14:04:37.486569 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:37 crc kubenswrapper[4722]: I0309 14:04:37.486588 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:37 crc kubenswrapper[4722]: I0309 14:04:37.486601 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:37Z","lastTransitionTime":"2026-03-09T14:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:37 crc kubenswrapper[4722]: E0309 14:04:37.509075 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3321b793-006a-42f5-9c18-7f48ed9bad15\\\",\\\"systemUUID\\\":\\\"70de6d16-940c-46da-be51-a9b50262dda2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:37Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:37 crc kubenswrapper[4722]: I0309 14:04:37.515101 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:37 crc kubenswrapper[4722]: I0309 14:04:37.515160 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 09 14:04:37 crc kubenswrapper[4722]: I0309 14:04:37.515182 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:37 crc kubenswrapper[4722]: I0309 14:04:37.515241 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:37 crc kubenswrapper[4722]: I0309 14:04:37.515261 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:37Z","lastTransitionTime":"2026-03-09T14:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:37 crc kubenswrapper[4722]: E0309 14:04:37.536313 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3321b793-006a-42f5-9c18-7f48ed9bad15\\\",\\\"systemUUID\\\":\\\"70de6d16-940c-46da-be51-a9b50262dda2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:37Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:37 crc kubenswrapper[4722]: I0309 14:04:37.542088 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:37 crc kubenswrapper[4722]: I0309 14:04:37.542172 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 09 14:04:37 crc kubenswrapper[4722]: I0309 14:04:37.542197 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:37 crc kubenswrapper[4722]: I0309 14:04:37.542457 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:37 crc kubenswrapper[4722]: I0309 14:04:37.542497 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:37Z","lastTransitionTime":"2026-03-09T14:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:37 crc kubenswrapper[4722]: E0309 14:04:37.564763 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3321b793-006a-42f5-9c18-7f48ed9bad15\\\",\\\"systemUUID\\\":\\\"70de6d16-940c-46da-be51-a9b50262dda2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:37Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:37 crc kubenswrapper[4722]: I0309 14:04:37.570281 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:37 crc kubenswrapper[4722]: I0309 14:04:37.570331 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 09 14:04:37 crc kubenswrapper[4722]: I0309 14:04:37.570351 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:37 crc kubenswrapper[4722]: I0309 14:04:37.570378 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:37 crc kubenswrapper[4722]: I0309 14:04:37.570394 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:37Z","lastTransitionTime":"2026-03-09T14:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:37 crc kubenswrapper[4722]: E0309 14:04:37.589385 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3321b793-006a-42f5-9c18-7f48ed9bad15\\\",\\\"systemUUID\\\":\\\"70de6d16-940c-46da-be51-a9b50262dda2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:37Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:37 crc kubenswrapper[4722]: I0309 14:04:37.595615 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:37 crc kubenswrapper[4722]: I0309 14:04:37.595678 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 09 14:04:37 crc kubenswrapper[4722]: I0309 14:04:37.595697 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:37 crc kubenswrapper[4722]: I0309 14:04:37.595724 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:37 crc kubenswrapper[4722]: I0309 14:04:37.595748 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:37Z","lastTransitionTime":"2026-03-09T14:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:37 crc kubenswrapper[4722]: E0309 14:04:37.617081 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3321b793-006a-42f5-9c18-7f48ed9bad15\\\",\\\"systemUUID\\\":\\\"70de6d16-940c-46da-be51-a9b50262dda2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:37Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:37 crc kubenswrapper[4722]: E0309 14:04:37.617369 4722 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 14:04:37 crc kubenswrapper[4722]: I0309 14:04:37.619686 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 09 14:04:37 crc kubenswrapper[4722]: I0309 14:04:37.619775 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:37 crc kubenswrapper[4722]: I0309 14:04:37.619793 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:37 crc kubenswrapper[4722]: I0309 14:04:37.619823 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:37 crc kubenswrapper[4722]: I0309 14:04:37.619844 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:37Z","lastTransitionTime":"2026-03-09T14:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:37 crc kubenswrapper[4722]: I0309 14:04:37.723593 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:37 crc kubenswrapper[4722]: I0309 14:04:37.723671 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:37 crc kubenswrapper[4722]: I0309 14:04:37.723689 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:37 crc kubenswrapper[4722]: I0309 14:04:37.723717 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:37 crc kubenswrapper[4722]: I0309 14:04:37.723735 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:37Z","lastTransitionTime":"2026-03-09T14:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:37 crc kubenswrapper[4722]: I0309 14:04:37.826756 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:37 crc kubenswrapper[4722]: I0309 14:04:37.826831 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:37 crc kubenswrapper[4722]: I0309 14:04:37.826847 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:37 crc kubenswrapper[4722]: I0309 14:04:37.826867 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:37 crc kubenswrapper[4722]: I0309 14:04:37.826893 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:37Z","lastTransitionTime":"2026-03-09T14:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:37 crc kubenswrapper[4722]: I0309 14:04:37.929954 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:37 crc kubenswrapper[4722]: I0309 14:04:37.930017 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:37 crc kubenswrapper[4722]: I0309 14:04:37.930044 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:37 crc kubenswrapper[4722]: I0309 14:04:37.930068 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:37 crc kubenswrapper[4722]: I0309 14:04:37.930088 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:37Z","lastTransitionTime":"2026-03-09T14:04:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:38 crc kubenswrapper[4722]: I0309 14:04:38.034067 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:38 crc kubenswrapper[4722]: I0309 14:04:38.034138 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:38 crc kubenswrapper[4722]: I0309 14:04:38.034165 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:38 crc kubenswrapper[4722]: I0309 14:04:38.034241 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:38 crc kubenswrapper[4722]: I0309 14:04:38.034272 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:38Z","lastTransitionTime":"2026-03-09T14:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:38 crc kubenswrapper[4722]: I0309 14:04:38.129377 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f5dbc0be-527e-4f70-b185-3a10b1b11a75-metrics-certs\") pod \"network-metrics-daemon-pvvhj\" (UID: \"f5dbc0be-527e-4f70-b185-3a10b1b11a75\") " pod="openshift-multus/network-metrics-daemon-pvvhj" Mar 09 14:04:38 crc kubenswrapper[4722]: E0309 14:04:38.129759 4722 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 14:04:38 crc kubenswrapper[4722]: E0309 14:04:38.129936 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f5dbc0be-527e-4f70-b185-3a10b1b11a75-metrics-certs podName:f5dbc0be-527e-4f70-b185-3a10b1b11a75 nodeName:}" failed. No retries permitted until 2026-03-09 14:04:42.1298971 +0000 UTC m=+122.685465866 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f5dbc0be-527e-4f70-b185-3a10b1b11a75-metrics-certs") pod "network-metrics-daemon-pvvhj" (UID: "f5dbc0be-527e-4f70-b185-3a10b1b11a75") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 14:04:38 crc kubenswrapper[4722]: I0309 14:04:38.138127 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:38 crc kubenswrapper[4722]: I0309 14:04:38.138194 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:38 crc kubenswrapper[4722]: I0309 14:04:38.138255 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:38 crc kubenswrapper[4722]: I0309 14:04:38.138284 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:38 crc kubenswrapper[4722]: I0309 14:04:38.138317 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:38Z","lastTransitionTime":"2026-03-09T14:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:38 crc kubenswrapper[4722]: I0309 14:04:38.148521 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 14:04:38 crc kubenswrapper[4722]: I0309 14:04:38.148526 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pvvhj" Mar 09 14:04:38 crc kubenswrapper[4722]: I0309 14:04:38.148547 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 14:04:38 crc kubenswrapper[4722]: I0309 14:04:38.148665 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 14:04:38 crc kubenswrapper[4722]: E0309 14:04:38.148856 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 14:04:38 crc kubenswrapper[4722]: E0309 14:04:38.148986 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 14:04:38 crc kubenswrapper[4722]: E0309 14:04:38.149287 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pvvhj" podUID="f5dbc0be-527e-4f70-b185-3a10b1b11a75" Mar 09 14:04:38 crc kubenswrapper[4722]: E0309 14:04:38.149354 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 14:04:38 crc kubenswrapper[4722]: I0309 14:04:38.241665 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:38 crc kubenswrapper[4722]: I0309 14:04:38.241904 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:38 crc kubenswrapper[4722]: I0309 14:04:38.241916 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:38 crc kubenswrapper[4722]: I0309 14:04:38.241935 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:38 crc kubenswrapper[4722]: I0309 14:04:38.241946 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:38Z","lastTransitionTime":"2026-03-09T14:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:38 crc kubenswrapper[4722]: I0309 14:04:38.344541 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:38 crc kubenswrapper[4722]: I0309 14:04:38.344597 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:38 crc kubenswrapper[4722]: I0309 14:04:38.344613 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:38 crc kubenswrapper[4722]: I0309 14:04:38.344633 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:38 crc kubenswrapper[4722]: I0309 14:04:38.344649 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:38Z","lastTransitionTime":"2026-03-09T14:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:38 crc kubenswrapper[4722]: I0309 14:04:38.448230 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:38 crc kubenswrapper[4722]: I0309 14:04:38.448292 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:38 crc kubenswrapper[4722]: I0309 14:04:38.448306 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:38 crc kubenswrapper[4722]: I0309 14:04:38.448327 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:38 crc kubenswrapper[4722]: I0309 14:04:38.448339 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:38Z","lastTransitionTime":"2026-03-09T14:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:38 crc kubenswrapper[4722]: I0309 14:04:38.552550 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:38 crc kubenswrapper[4722]: I0309 14:04:38.552639 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:38 crc kubenswrapper[4722]: I0309 14:04:38.552667 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:38 crc kubenswrapper[4722]: I0309 14:04:38.552706 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:38 crc kubenswrapper[4722]: I0309 14:04:38.552730 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:38Z","lastTransitionTime":"2026-03-09T14:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:38 crc kubenswrapper[4722]: I0309 14:04:38.656988 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:38 crc kubenswrapper[4722]: I0309 14:04:38.657072 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:38 crc kubenswrapper[4722]: I0309 14:04:38.657096 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:38 crc kubenswrapper[4722]: I0309 14:04:38.657135 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:38 crc kubenswrapper[4722]: I0309 14:04:38.657180 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:38Z","lastTransitionTime":"2026-03-09T14:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:38 crc kubenswrapper[4722]: I0309 14:04:38.760413 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:38 crc kubenswrapper[4722]: I0309 14:04:38.761362 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:38 crc kubenswrapper[4722]: I0309 14:04:38.761405 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:38 crc kubenswrapper[4722]: I0309 14:04:38.761432 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:38 crc kubenswrapper[4722]: I0309 14:04:38.761451 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:38Z","lastTransitionTime":"2026-03-09T14:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:38 crc kubenswrapper[4722]: I0309 14:04:38.864891 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:38 crc kubenswrapper[4722]: I0309 14:04:38.864961 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:38 crc kubenswrapper[4722]: I0309 14:04:38.864979 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:38 crc kubenswrapper[4722]: I0309 14:04:38.865006 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:38 crc kubenswrapper[4722]: I0309 14:04:38.865024 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:38Z","lastTransitionTime":"2026-03-09T14:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:38 crc kubenswrapper[4722]: I0309 14:04:38.968714 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:38 crc kubenswrapper[4722]: I0309 14:04:38.968833 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:38 crc kubenswrapper[4722]: I0309 14:04:38.968857 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:38 crc kubenswrapper[4722]: I0309 14:04:38.968889 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:38 crc kubenswrapper[4722]: I0309 14:04:38.968908 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:38Z","lastTransitionTime":"2026-03-09T14:04:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:39 crc kubenswrapper[4722]: I0309 14:04:39.072864 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:39 crc kubenswrapper[4722]: I0309 14:04:39.072935 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:39 crc kubenswrapper[4722]: I0309 14:04:39.072948 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:39 crc kubenswrapper[4722]: I0309 14:04:39.072975 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:39 crc kubenswrapper[4722]: I0309 14:04:39.072988 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:39Z","lastTransitionTime":"2026-03-09T14:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:39 crc kubenswrapper[4722]: I0309 14:04:39.176633 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:39 crc kubenswrapper[4722]: I0309 14:04:39.176701 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:39 crc kubenswrapper[4722]: I0309 14:04:39.176718 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:39 crc kubenswrapper[4722]: I0309 14:04:39.176741 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:39 crc kubenswrapper[4722]: I0309 14:04:39.176760 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:39Z","lastTransitionTime":"2026-03-09T14:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:39 crc kubenswrapper[4722]: I0309 14:04:39.281039 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:39 crc kubenswrapper[4722]: I0309 14:04:39.281127 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:39 crc kubenswrapper[4722]: I0309 14:04:39.281151 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:39 crc kubenswrapper[4722]: I0309 14:04:39.281190 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:39 crc kubenswrapper[4722]: I0309 14:04:39.281245 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:39Z","lastTransitionTime":"2026-03-09T14:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:39 crc kubenswrapper[4722]: I0309 14:04:39.384414 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:39 crc kubenswrapper[4722]: I0309 14:04:39.384498 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:39 crc kubenswrapper[4722]: I0309 14:04:39.384520 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:39 crc kubenswrapper[4722]: I0309 14:04:39.384553 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:39 crc kubenswrapper[4722]: I0309 14:04:39.384573 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:39Z","lastTransitionTime":"2026-03-09T14:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:39 crc kubenswrapper[4722]: I0309 14:04:39.489035 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:39 crc kubenswrapper[4722]: I0309 14:04:39.489115 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:39 crc kubenswrapper[4722]: I0309 14:04:39.489133 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:39 crc kubenswrapper[4722]: I0309 14:04:39.489162 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:39 crc kubenswrapper[4722]: I0309 14:04:39.489186 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:39Z","lastTransitionTime":"2026-03-09T14:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:39 crc kubenswrapper[4722]: I0309 14:04:39.592599 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:39 crc kubenswrapper[4722]: I0309 14:04:39.592672 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:39 crc kubenswrapper[4722]: I0309 14:04:39.592694 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:39 crc kubenswrapper[4722]: I0309 14:04:39.592723 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:39 crc kubenswrapper[4722]: I0309 14:04:39.592743 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:39Z","lastTransitionTime":"2026-03-09T14:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:39 crc kubenswrapper[4722]: I0309 14:04:39.695700 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:39 crc kubenswrapper[4722]: I0309 14:04:39.695775 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:39 crc kubenswrapper[4722]: I0309 14:04:39.695787 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:39 crc kubenswrapper[4722]: I0309 14:04:39.695807 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:39 crc kubenswrapper[4722]: I0309 14:04:39.695820 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:39Z","lastTransitionTime":"2026-03-09T14:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:39 crc kubenswrapper[4722]: I0309 14:04:39.799273 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:39 crc kubenswrapper[4722]: I0309 14:04:39.799331 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:39 crc kubenswrapper[4722]: I0309 14:04:39.799350 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:39 crc kubenswrapper[4722]: I0309 14:04:39.799379 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:39 crc kubenswrapper[4722]: I0309 14:04:39.799397 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:39Z","lastTransitionTime":"2026-03-09T14:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:39 crc kubenswrapper[4722]: I0309 14:04:39.903325 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:39 crc kubenswrapper[4722]: I0309 14:04:39.903420 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:39 crc kubenswrapper[4722]: I0309 14:04:39.903446 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:39 crc kubenswrapper[4722]: I0309 14:04:39.903478 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:39 crc kubenswrapper[4722]: I0309 14:04:39.903497 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:39Z","lastTransitionTime":"2026-03-09T14:04:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:39 crc kubenswrapper[4722]: I0309 14:04:39.952440 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 14:04:39 crc kubenswrapper[4722]: E0309 14:04:39.952912 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 14:05:11.952846595 +0000 UTC m=+152.508415241 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 14:04:40 crc kubenswrapper[4722]: I0309 14:04:40.007064 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:40 crc kubenswrapper[4722]: I0309 14:04:40.007120 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:40 crc kubenswrapper[4722]: I0309 14:04:40.007131 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:40 crc kubenswrapper[4722]: I0309 14:04:40.007150 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:40 crc kubenswrapper[4722]: I0309 14:04:40.007162 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:40Z","lastTransitionTime":"2026-03-09T14:04:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:40 crc kubenswrapper[4722]: I0309 14:04:40.054611 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 14:04:40 crc kubenswrapper[4722]: I0309 14:04:40.054748 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 14:04:40 crc kubenswrapper[4722]: I0309 14:04:40.054819 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 14:04:40 crc kubenswrapper[4722]: E0309 14:04:40.054834 4722 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 14:04:40 crc kubenswrapper[4722]: I0309 14:04:40.054916 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 14:04:40 crc kubenswrapper[4722]: E0309 14:04:40.054993 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 14:05:12.054920272 +0000 UTC m=+152.610488868 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 09 14:04:40 crc kubenswrapper[4722]: E0309 14:04:40.055063 4722 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 14:04:40 crc kubenswrapper[4722]: E0309 14:04:40.055175 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 14:04:40 crc kubenswrapper[4722]: E0309 14:04:40.055259 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 14:04:40 crc kubenswrapper[4722]: E0309 14:04:40.055268 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 14:04:40 crc kubenswrapper[4722]: E0309 14:04:40.055352 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 14:04:40 crc kubenswrapper[4722]: E0309 14:04:40.055380 4722 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 14:04:40 crc kubenswrapper[4722]: E0309 14:04:40.055292 4722 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 14:04:40 crc kubenswrapper[4722]: E0309 14:04:40.055316 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 14:05:12.055273542 +0000 UTC m=+152.610842158 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 14:04:40 crc kubenswrapper[4722]: E0309 14:04:40.055597 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-09 14:05:12.055568751 +0000 UTC m=+152.611137367 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 14:04:40 crc kubenswrapper[4722]: E0309 14:04:40.055634 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-09 14:05:12.055621652 +0000 UTC m=+152.611190268 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 14:04:40 crc kubenswrapper[4722]: E0309 14:04:40.109716 4722 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 09 14:04:40 crc kubenswrapper[4722]: I0309 14:04:40.148546 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 14:04:40 crc kubenswrapper[4722]: I0309 14:04:40.148613 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pvvhj" Mar 09 14:04:40 crc kubenswrapper[4722]: I0309 14:04:40.148666 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 14:04:40 crc kubenswrapper[4722]: I0309 14:04:40.148862 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 14:04:40 crc kubenswrapper[4722]: E0309 14:04:40.148834 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 14:04:40 crc kubenswrapper[4722]: E0309 14:04:40.149052 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pvvhj" podUID="f5dbc0be-527e-4f70-b185-3a10b1b11a75" Mar 09 14:04:40 crc kubenswrapper[4722]: E0309 14:04:40.149140 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 14:04:40 crc kubenswrapper[4722]: E0309 14:04:40.149253 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 14:04:40 crc kubenswrapper[4722]: I0309 14:04:40.174763 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pvvhj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5dbc0be-527e-4f70-b185-3a10b1b11a75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pvvhj\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:40Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:40 crc kubenswrapper[4722]: I0309 14:04:40.200322 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2267a4d7-87b4-43f7-80c7-7a7ffb0327fc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04169d9b8359e26cc9807e27fb9a5d54ca0946aaade294c5bcc272a7d21c4db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09fa62e7293a50fb3f6bdaf37061cbe5e95a830c8f7337660ce73138dbcaa641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31ba786781fcdb781f5fa45276e29e94756863d2398109d91e81a52f6f88bee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702
f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dff721159b382bcfdde6f406a328ef2ee9264b35a8d8baeb7a3c6d31a60ff3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63769aca40f721bec0716427145a3ceea0595836c03d31c98f6d45ff14de68be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6625c60c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6625c60c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:40Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:40 crc kubenswrapper[4722]: I0309 14:04:40.221563 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0965e24b-526e-4842-ac1f-eca7a765355d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56c76a5e8718243a17d8d9fd9a8dcf6083c2ccb65b4816926dca08a89e465728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3836660bf646052991965270cd69b8b8237b5c4bd698a73789b050d23e6877\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dbef38ff5e714c92fc8a86f1e3c64adf606cb198145f884a000f8dee200fcf2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28fcbb961b20cd0cbc1e8548492168a0fe9920b7b98c7bfeffe34727c299030e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T14:03:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 14:03:46.681607 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 14:03:46.681849 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 14:03:46.682597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2315154617/tls.crt::/tmp/serving-cert-2315154617/tls.key\\\\\\\"\\\\nI0309 14:03:46.866095 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 14:03:46.869876 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 14:03:46.869896 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 14:03:46.870280 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 14:03:46.870293 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 14:03:46.877096 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 14:03:46.877117 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 14:03:46.877121 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877135 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 14:03:46.877139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 14:03:46.877143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 14:03:46.877146 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 14:03:46.878419 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:03:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fbabc5ec2c100150ce886c06c3bd78cc5896c8a06ac9d262eeb552a8a8bb5d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:40Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:40 crc kubenswrapper[4722]: I0309 14:04:40.242276 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:40Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:40 crc kubenswrapper[4722]: E0309 14:04:40.246567 4722 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 14:04:40 crc kubenswrapper[4722]: I0309 14:04:40.262056 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:40Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:40 crc kubenswrapper[4722]: I0309 14:04:40.278767 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-72xb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43e3c934-6aa7-4c96-a3a6-378f7931fc2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcf5e49245fd6051b12465a36b27f657b4923a62f88d61fe0fac971c4a2c9197\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lfx7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-72xb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:40Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:40 crc kubenswrapper[4722]: I0309 14:04:40.299068 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dac2aaf5-653b-4b2a-8efe-ed26bac8d648\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf5b268d6a355ae3a2c40981769412124b85cdfd99364ed58e5163a744f5f83c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q6db\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f451493fa6d86f12c1291c0fe262bcbe4b62cef45c8dda696bb48fa3ded2eeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q6db\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hjrrb\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:40Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:40 crc kubenswrapper[4722]: I0309 14:04:40.321624 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:40Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:40 crc kubenswrapper[4722]: I0309 14:04:40.339891 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69e74824fe242fd2d54bed9734f33823697d7651043d9c97bac6ea5490625a0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:40Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:40 crc kubenswrapper[4722]: I0309 14:04:40.359837 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d69359acfc7a4ee7698088a7e0062b3604a5ceb92910cd3b3d69b1a6291dfdcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://687646beb4c6dd3fb9b659a36449a573f4c389b3e98b07127b18516fdc4f2c93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:40Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:40 crc kubenswrapper[4722]: I0309 14:04:40.386742 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h4zw5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b9e29bb-6e51-47ab-a543-b70117ab854d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17cd3d1e3981e2f2a3b66780f753425c78452e6319fb8eae9935d3bac4456820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdw92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h4zw5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:40Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:40 crc kubenswrapper[4722]: I0309 14:04:40.412009 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1fc80c43be93eeb2f9805bf314c8430f52062d0ce5d0fa1830e7e0c27e674e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f969085ae62d82dbe1f0f8f5d459c4a1bc7d1ae0a459da484f11dd6ae5fdd418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5765145caac69454a9a8e8de34d05cb1da06924dd365f3e74f52461be5027ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef1c5a3944e8693d8c68a4108c23851352407d0fac7fb7a03763d9084f117dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0106fa808804a96ca8247096399e3d04eb082b2691b1f39df3db9f6c630114c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e73
7417baac512fab2089f5da12e7c47ef06eae8df7803ad6b248505ff3d18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://879b959fe7540c9b645b6ada02e192abb990fb3311a7346f0b1df41c47d62379\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://879b959fe7540c9b645b6ada02e192abb990fb3311a7346f0b1df41c47d62379\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T14:04:32Z\\\",\\\"message\\\":\\\"igs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0309 14:04:32.599371 6762 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0309 14:04:32.599438 6762 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 14:04:32.599509 6762 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0309 14:04:32.599664 6762 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0309 14:04:32.599744 6762 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0309 14:04:32.599757 6762 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0309 14:04:32.599783 6762 factory.go:656] Stopping watch factory\\\\nI0309 14:04:32.599796 6762 handler.go:208] Removed *v1.Node event handler 7\\\\nI0309 14:04:32.599804 6762 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0309 14:04:32.599813 6762 handler.go:208] Removed *v1.Node event handler 2\\\\nI0309 14:04:32.599922 6762 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 14:04:32.600179 6762 reflector.go:311] Stopping reflector *v1.Service (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-5v7ng_openshift-ovn-kubernetes(2e305619-b3a2-44c9-9e54-e1afa4f43dbf)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f5af077ae5b947194e5dba0542d053890a589582e415e1b41a932bdab6632e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5v7ng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:40Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:40 crc kubenswrapper[4722]: I0309 14:04:40.428758 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xsft8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e14085d-df3c-4218-aa43-0553786b8867\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fb0f733e44fc7a699961d799267ca3c01e65a62b29d74e6ae682bccba9e5281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqpsw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://435bcac0d223a0ff9431ac5bc5be017aa95d2e567d991a1f4edc0f79559904e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqpsw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xsft8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:40Z is after 2025-08-24T17:21:41Z" Mar 09 
14:04:40 crc kubenswrapper[4722]: I0309 14:04:40.443761 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee2ed55248b177bc0873ef3c41517401fdb30fc5ce5a545a1ca8c925e4d4e9e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:40Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:40 crc kubenswrapper[4722]: I0309 14:04:40.466564 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jv499" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd1357a3-36ef-4bc8-85bb-91f3c0f42994\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a14e62f50fa8e5403349dc8696e9c7c45a5da1026227612186e7805b1be722f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e56992f54f9fc4387130b74e3aa0f4652a5d6d960da1a5305b8668b8086fae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e56992f54f9fc4387130b74e3aa0f4652a5d6d960da1a5305b8668b8086fae2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c00d804aec8d776f5f2ab9c7a81161c9942a94564d57afe47c633fb3dc2aa7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c00d804aec8d776f5f2ab9c7a81161c9942a94564d57afe47c633fb3dc2aa7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0491f9997970cb050ece19d88d2a7064d4799db3772132f58055e1dbcf2f3a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0491f9997970cb050ece19d88d2a7064d4799db3772132f58055e1dbcf2f3a82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93c7d9a6b0dc8e76abab762ff6aae1975a31fbc80ee3a3cd32ec378298061748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93c7d9a6b0dc8e76abab762ff6aae1975a31fbc80ee3a3cd32ec378298061748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cc5ae61b0fa949ba4bf5aa60d09385399a557f19ef7c99dc43b21fc9db830e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9cc5ae61b0fa949ba4bf5aa60d09385399a557f19ef7c99dc43b21fc9db830e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a88cf1e6b90a3f13f44bbb08482773f007066c795b00cb0177f6cd2ed87cbb52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a88cf1e6b90a3f13f44bbb08482773f007066c795b00cb0177f6cd2ed87cbb52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jv499\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:40Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:40 crc kubenswrapper[4722]: I0309 14:04:40.481412 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l5pwg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8424174-764a-470b-ab49-e7369a42de7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d30648bba1b59fd7db5e75a61d6a004246668d148515da5738b77e5122358f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l5pwg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:40Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:42 crc kubenswrapper[4722]: I0309 14:04:42.148323 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 14:04:42 crc kubenswrapper[4722]: I0309 14:04:42.148337 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 14:04:42 crc kubenswrapper[4722]: E0309 14:04:42.149137 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 14:04:42 crc kubenswrapper[4722]: I0309 14:04:42.148507 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-pvvhj" Mar 09 14:04:42 crc kubenswrapper[4722]: I0309 14:04:42.148403 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 14:04:42 crc kubenswrapper[4722]: E0309 14:04:42.149306 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 14:04:42 crc kubenswrapper[4722]: E0309 14:04:42.149424 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 14:04:42 crc kubenswrapper[4722]: E0309 14:04:42.149517 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pvvhj" podUID="f5dbc0be-527e-4f70-b185-3a10b1b11a75" Mar 09 14:04:42 crc kubenswrapper[4722]: I0309 14:04:42.185699 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f5dbc0be-527e-4f70-b185-3a10b1b11a75-metrics-certs\") pod \"network-metrics-daemon-pvvhj\" (UID: \"f5dbc0be-527e-4f70-b185-3a10b1b11a75\") " pod="openshift-multus/network-metrics-daemon-pvvhj" Mar 09 14:04:42 crc kubenswrapper[4722]: E0309 14:04:42.185939 4722 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 14:04:42 crc kubenswrapper[4722]: E0309 14:04:42.186024 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f5dbc0be-527e-4f70-b185-3a10b1b11a75-metrics-certs podName:f5dbc0be-527e-4f70-b185-3a10b1b11a75 nodeName:}" failed. No retries permitted until 2026-03-09 14:04:50.185998553 +0000 UTC m=+130.741567129 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f5dbc0be-527e-4f70-b185-3a10b1b11a75-metrics-certs") pod "network-metrics-daemon-pvvhj" (UID: "f5dbc0be-527e-4f70-b185-3a10b1b11a75") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 14:04:44 crc kubenswrapper[4722]: I0309 14:04:44.148620 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pvvhj" Mar 09 14:04:44 crc kubenswrapper[4722]: I0309 14:04:44.148656 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 14:04:44 crc kubenswrapper[4722]: E0309 14:04:44.149544 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pvvhj" podUID="f5dbc0be-527e-4f70-b185-3a10b1b11a75" Mar 09 14:04:44 crc kubenswrapper[4722]: I0309 14:04:44.148656 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 14:04:44 crc kubenswrapper[4722]: I0309 14:04:44.148688 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 14:04:44 crc kubenswrapper[4722]: E0309 14:04:44.149829 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 14:04:44 crc kubenswrapper[4722]: E0309 14:04:44.149906 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 14:04:44 crc kubenswrapper[4722]: E0309 14:04:44.150067 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 14:04:45 crc kubenswrapper[4722]: I0309 14:04:45.161006 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 09 14:04:45 crc kubenswrapper[4722]: E0309 14:04:45.248173 4722 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 14:04:46 crc kubenswrapper[4722]: I0309 14:04:46.148747 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 14:04:46 crc kubenswrapper[4722]: I0309 14:04:46.148854 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 14:04:46 crc kubenswrapper[4722]: I0309 14:04:46.148944 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 14:04:46 crc kubenswrapper[4722]: E0309 14:04:46.148906 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 14:04:46 crc kubenswrapper[4722]: E0309 14:04:46.150492 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 14:04:46 crc kubenswrapper[4722]: I0309 14:04:46.150545 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pvvhj" Mar 09 14:04:46 crc kubenswrapper[4722]: E0309 14:04:46.150765 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 14:04:46 crc kubenswrapper[4722]: I0309 14:04:46.152128 4722 scope.go:117] "RemoveContainer" containerID="879b959fe7540c9b645b6ada02e192abb990fb3311a7346f0b1df41c47d62379" Mar 09 14:04:46 crc kubenswrapper[4722]: E0309 14:04:46.154927 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pvvhj" podUID="f5dbc0be-527e-4f70-b185-3a10b1b11a75" Mar 09 14:04:46 crc kubenswrapper[4722]: I0309 14:04:46.802911 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5v7ng_2e305619-b3a2-44c9-9e54-e1afa4f43dbf/ovnkube-controller/1.log" Mar 09 14:04:46 crc kubenswrapper[4722]: I0309 14:04:46.805274 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" event={"ID":"2e305619-b3a2-44c9-9e54-e1afa4f43dbf","Type":"ContainerStarted","Data":"10ef5c0af2e49f88ea7089e2d4bdc701ce1c068e25ef30012e0b668aa9d4e8cc"} Mar 09 14:04:46 crc kubenswrapper[4722]: I0309 14:04:46.806362 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" Mar 09 14:04:46 crc kubenswrapper[4722]: I0309 14:04:46.825580 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2267a4d7-87b4-43f7-80c7-7a7ffb0327fc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04169d9b8359e26cc9807e27fb9a5d54ca0946aaade294c5bcc272a7d21c4db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09fa62e7293a50fb3f6bdaf37061cbe5e95a830c8f7337660ce73138dbcaa641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-d
ir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31ba786781fcdb781f5fa45276e29e94756863d2398109d91e81a52f6f88bee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dff721159b382bcfdde6f406a328ef2ee9264b35a8d8baeb7a3c6d31a60ff3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63769aca40f721bec0716427145a3ceea0595836c03d31c98f6d45ff14de68be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-
03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6625c60c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6625c60c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:46Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:46 crc kubenswrapper[4722]: I0309 14:04:46.842579 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0965e24b-526e-4842-ac1f-eca7a765355d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56c76a5e8718243a17d8d9fd9a8dcf6083c2ccb65b4816926dca08a89e465728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3836660bf646052991965270cd69b8b8237b5c4bd698a73789b050d23e6877\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dbef38ff5e714c92fc8a86f1e3c64adf606cb198145f884a000f8dee200fcf2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28fcbb961b20cd0cbc1e8548492168a0fe9920b7b98c7bfeffe34727c299030e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T14:03:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 14:03:46.681607 1 builder.go:272] unable to get owner 
reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 14:03:46.681849 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 14:03:46.682597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2315154617/tls.crt::/tmp/serving-cert-2315154617/tls.key\\\\\\\"\\\\nI0309 14:03:46.866095 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 14:03:46.869876 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 14:03:46.869896 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 14:03:46.870280 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 14:03:46.870293 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 14:03:46.877096 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 14:03:46.877117 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 14:03:46.877121 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877135 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 14:03:46.877139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 14:03:46.877143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 14:03:46.877146 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 14:03:46.878419 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:03:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fbabc5ec2c100150ce886c06c3bd78cc5896c8a06ac9d262eeb552a8a8bb5d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:46Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:46 crc kubenswrapper[4722]: I0309 14:04:46.861113 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:46Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:46 crc kubenswrapper[4722]: I0309 14:04:46.872798 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:46Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:46 crc kubenswrapper[4722]: I0309 14:04:46.883851 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-72xb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43e3c934-6aa7-4c96-a3a6-378f7931fc2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcf5e49245fd6051b12465a36b27f657b4923a62f88d61fe0fac971c4a2c9197\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lfx7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-72xb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:46Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:46 crc kubenswrapper[4722]: I0309 14:04:46.895529 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dac2aaf5-653b-4b2a-8efe-ed26bac8d648\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf5b268d6a355ae3a2c40981769412124b85cdfd99364ed58e5163a744f5f83c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q6db\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f451493fa6d86f12c1291c0fe262bcbe4b62cef45c8dda696bb48fa3ded2eeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q6db\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hjrrb\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:46Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:46 crc kubenswrapper[4722]: I0309 14:04:46.904893 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pvvhj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5dbc0be-527e-4f70-b185-3a10b1b11a75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pvvhj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:46Z is 
after 2025-08-24T17:21:41Z" Mar 09 14:04:46 crc kubenswrapper[4722]: I0309 14:04:46.916126 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a834db9e-b389-4bac-bfe8-df204ff42b2e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:03:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:03:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5c374ffb7928a20af6f154bfca0c494d692f1f6f55eb8de5d4d5db79a355013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f502c2ee0b77f66f4b32b7d3173eb71bd6d34c31724e8032a0200dac6fd62b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a48ba8f8443935496fa8f3a7e262723b19515f3adc95847d64236f956f91cc20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\
\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97aabe30af8bc4ff72fbac7f4fbf7804aff61f8d2fe99753558e7e40f348867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a97aabe30af8bc4ff72fbac7f4fbf7804aff61f8d2fe99753558e7e40f348867\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:46Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:46 crc kubenswrapper[4722]: I0309 14:04:46.926287 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:46Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:46 crc kubenswrapper[4722]: I0309 14:04:46.936771 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69e74824fe242fd2d54bed9734f33823697d7651043d9c97bac6ea5490625a0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:46Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:46 crc kubenswrapper[4722]: I0309 14:04:46.949957 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d69359acfc7a4ee7698088a7e0062b3604a5ceb92910cd3b3d69b1a6291dfdcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://687646beb4c6dd3fb9b659a36449a573f4c389b3e98b07127b18516fdc4f2c93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:46Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:46 crc kubenswrapper[4722]: I0309 14:04:46.960549 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h4zw5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b9e29bb-6e51-47ab-a543-b70117ab854d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17cd3d1e3981e2f2a3b66780f753425c78452e6319fb8eae9935d3bac4456820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdw92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h4zw5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:46Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:46 crc kubenswrapper[4722]: I0309 14:04:46.978914 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1fc80c43be93eeb2f9805bf314c8430f52062d0ce5d0fa1830e7e0c27e674e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f969085ae62d82dbe1f0f8f5d459c4a1bc7d1ae0a459da484f11dd6ae5fdd418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5765145caac69454a9a8e8de34d05cb1da06924dd365f3e74f52461be5027ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef1c5a3944e8693d8c68a4108c23851352407d0fac7fb7a03763d9084f117dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0106fa808804a96ca8247096399e3d04eb082b2691b1f39df3db9f6c630114c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e73
7417baac512fab2089f5da12e7c47ef06eae8df7803ad6b248505ff3d18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10ef5c0af2e49f88ea7089e2d4bdc701ce1c068e25ef30012e0b668aa9d4e8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://879b959fe7540c9b645b6ada02e192abb990fb3311a7346f0b1df41c47d62379\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T14:04:32Z\\\",\\\"message\\\":\\\"igs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0309 14:04:32.599371 6762 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0309 14:04:32.599438 6762 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 14:04:32.599509 6762 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0309 14:04:32.599664 6762 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0309 14:04:32.599744 6762 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0309 14:04:32.599757 6762 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0309 14:04:32.599783 6762 factory.go:656] Stopping watch factory\\\\nI0309 14:04:32.599796 6762 handler.go:208] Removed *v1.Node event handler 7\\\\nI0309 14:04:32.599804 6762 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0309 14:04:32.599813 6762 handler.go:208] Removed *v1.Node event handler 2\\\\nI0309 14:04:32.599922 6762 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 14:04:32.600179 6762 reflector.go:311] Stopping reflector *v1.Service (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f5af077ae5b947194e5dba0542d053890a589582e415e1b41a932bdab6632e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initConta
inerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5v7ng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:46Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:46 crc kubenswrapper[4722]: I0309 14:04:46.989694 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xsft8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e14085d-df3c-4218-aa43-0553786b8867\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fb0f733e44fc7a699961d799267ca3c01e65a62b29d74e6ae682bccba9e5281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqpsw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://435bcac0d223a0ff9431ac5bc5be017aa95d2e567d991a1f4edc0f79559904e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqpsw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xsft8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:46Z is after 2025-08-24T17:21:41Z" Mar 09 
14:04:47 crc kubenswrapper[4722]: I0309 14:04:47.003453 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee2ed55248b177bc0873ef3c41517401fdb30fc5ce5a545a1ca8c925e4d4e9e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:47Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:47 crc kubenswrapper[4722]: I0309 14:04:47.017625 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jv499" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd1357a3-36ef-4bc8-85bb-91f3c0f42994\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a14e62f50fa8e5403349dc8696e9c7c45a5da1026227612186e7805b1be722f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e56992f54f9fc4387130b74e3aa0f4652a5d6d960da1a5305b8668b8086fae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e56992f54f9fc4387130b74e3aa0f4652a5d6d960da1a5305b8668b8086fae2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c00d804aec8d776f5f2ab9c7a81161c9942a94564d57afe47c633fb3dc2aa7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c00d804aec8d776f5f2ab9c7a81161c9942a94564d57afe47c633fb3dc2aa7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0491f9997970cb050ece19d88d2a7064d4799db3772132f58055e1dbcf2f3a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0491f9997970cb050ece19d88d2a7064d4799db3772132f58055e1dbcf2f3a82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93c7d9a6b0dc8e76abab762ff6aae1975a31fbc80ee3a3cd32ec378298061748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93c7d9a6b0dc8e76abab762ff6aae1975a31fbc80ee3a3cd32ec378298061748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cc5ae61b0fa949ba4bf5aa60d09385399a557f19ef7c99dc43b21fc9db830e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9cc5ae61b0fa949ba4bf5aa60d09385399a557f19ef7c99dc43b21fc9db830e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a88cf1e6b90a3f13f44bbb08482773f007066c795b00cb0177f6cd2ed87cbb52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a88cf1e6b90a3f13f44bbb08482773f007066c795b00cb0177f6cd2ed87cbb52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jv499\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:47Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:47 crc kubenswrapper[4722]: I0309 14:04:47.028734 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l5pwg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8424174-764a-470b-ab49-e7369a42de7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d30648bba1b59fd7db5e75a61d6a004246668d148515da5738b77e5122358f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l5pwg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:47Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:47 crc kubenswrapper[4722]: I0309 14:04:47.773426 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:47 crc kubenswrapper[4722]: I0309 14:04:47.773488 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:47 crc kubenswrapper[4722]: I0309 14:04:47.773505 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:47 crc kubenswrapper[4722]: I0309 14:04:47.773529 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:47 crc kubenswrapper[4722]: I0309 14:04:47.773546 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:47Z","lastTransitionTime":"2026-03-09T14:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:47 crc kubenswrapper[4722]: E0309 14:04:47.795583 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3321b793-006a-42f5-9c18-7f48ed9bad15\\\",\\\"systemUUID\\\":\\\"70de6d16-940c-46da-be51-a9b50262dda2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:47Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:47 crc kubenswrapper[4722]: I0309 14:04:47.801573 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:47 crc kubenswrapper[4722]: I0309 14:04:47.801627 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure"
Mar 09 14:04:47 crc kubenswrapper[4722]: I0309 14:04:47.801645 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:04:47 crc kubenswrapper[4722]: I0309 14:04:47.801668 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 14:04:47 crc kubenswrapper[4722]: I0309 14:04:47.801684 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:47Z","lastTransitionTime":"2026-03-09T14:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 09 14:04:47 crc kubenswrapper[4722]: I0309 14:04:47.811840 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5v7ng_2e305619-b3a2-44c9-9e54-e1afa4f43dbf/ovnkube-controller/2.log"
Mar 09 14:04:47 crc kubenswrapper[4722]: I0309 14:04:47.812937 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5v7ng_2e305619-b3a2-44c9-9e54-e1afa4f43dbf/ovnkube-controller/1.log"
Mar 09 14:04:47 crc kubenswrapper[4722]: I0309 14:04:47.819643 4722 generic.go:334] "Generic (PLEG): container finished" podID="2e305619-b3a2-44c9-9e54-e1afa4f43dbf" containerID="10ef5c0af2e49f88ea7089e2d4bdc701ce1c068e25ef30012e0b668aa9d4e8cc" exitCode=1
Mar 09 14:04:47 crc kubenswrapper[4722]: I0309 14:04:47.819691 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" event={"ID":"2e305619-b3a2-44c9-9e54-e1afa4f43dbf","Type":"ContainerDied","Data":"10ef5c0af2e49f88ea7089e2d4bdc701ce1c068e25ef30012e0b668aa9d4e8cc"}
Mar 09 14:04:47 crc kubenswrapper[4722]: I0309 14:04:47.819739 4722 scope.go:117] "RemoveContainer" containerID="879b959fe7540c9b645b6ada02e192abb990fb3311a7346f0b1df41c47d62379"
Mar 09 14:04:47 crc kubenswrapper[4722]: I0309 14:04:47.823065 4722 scope.go:117] "RemoveContainer" containerID="10ef5c0af2e49f88ea7089e2d4bdc701ce1c068e25ef30012e0b668aa9d4e8cc"
Mar 09 14:04:47 crc kubenswrapper[4722]: E0309 14:04:47.823521 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-5v7ng_openshift-ovn-kubernetes(2e305619-b3a2-44c9-9e54-e1afa4f43dbf)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" podUID="2e305619-b3a2-44c9-9e54-e1afa4f43dbf"
Mar 09 14:04:47 crc kubenswrapper[4722]: I0309 14:04:47.830709 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:04:47 crc kubenswrapper[4722]: I0309 14:04:47.830744 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 09 14:04:47 crc kubenswrapper[4722]: I0309 14:04:47.830760 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 09 14:04:47 crc kubenswrapper[4722]: I0309 14:04:47.830781 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 09 14:04:47 crc kubenswrapper[4722]: I0309 14:04:47.830793 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:47Z","lastTransitionTime":"2026-03-09T14:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:47 crc kubenswrapper[4722]: I0309 14:04:47.843604 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a834db9e-b389-4bac-bfe8-df204ff42b2e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:03:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:03:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5c374ffb7928a20af6f154bfca0c494d692f1f6f55eb8de5d4d5db79a355013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f502c2ee0b77f66f4b32b7d3173eb71bd6d34c31724e8032a0200dac6fd62b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a48ba8f8443935496fa8f3a7e262723b19515f3adc95847d64236f956f91cc20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"
cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97aabe30af8bc4ff72fbac7f4fbf7804aff61f8d2fe99753558e7e40f348867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a97aabe30af8bc4ff72fbac7f4fbf7804aff61f8d2fe99753558e7e40f348867\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:47Z is after 2025-08-24T17:21:41Z"
Mar 09 14:04:47 crc kubenswrapper[4722]: I0309 14:04:47.855774 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 09 14:04:47 crc kubenswrapper[4722]: I0309 14:04:47.855820 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 09 14:04:47 crc kubenswrapper[4722]: I0309 14:04:47.855832 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:47 crc kubenswrapper[4722]: I0309 14:04:47.855851 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:47 crc kubenswrapper[4722]: I0309 14:04:47.855866 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:47Z","lastTransitionTime":"2026-03-09T14:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:47 crc kubenswrapper[4722]: I0309 14:04:47.862266 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:47Z is after 2025-08-24T17:21:41Z"
Mar 09 14:04:47 crc kubenswrapper[4722]: I0309 14:04:47.878779 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69e74824fe242fd2d54bed9734f33823697d7651043d9c97bac6ea5490625a0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:47Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:47 crc kubenswrapper[4722]: I0309 14:04:47.880542 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:47 crc kubenswrapper[4722]: I0309 14:04:47.880587 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:47 crc kubenswrapper[4722]: I0309 14:04:47.880636 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:47 crc kubenswrapper[4722]: I0309 14:04:47.880659 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:47 crc kubenswrapper[4722]: I0309 14:04:47.880676 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:47Z","lastTransitionTime":"2026-03-09T14:04:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 09 14:04:47 crc kubenswrapper[4722]: I0309 14:04:47.895036 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d69359acfc7a4ee7698088a7e0062b3604a5ceb92910cd3b3d69b1a6291dfdcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://687646beb4c6dd3fb9b659a36449a573f4c389b3e98b07127b18516fdc4f2c93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:47Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:47 crc kubenswrapper[4722]: E0309 14:04:47.897976 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae
669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3321b793-006a-42f5-9c18-7f48ed9bad15\\\",\\\"systemUUID\\\":\\\"70de6d16-940c-46da-be51-a9b50262dda2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:47Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:47 crc kubenswrapper[4722]: E0309 14:04:47.898134 4722 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 14:04:47 crc kubenswrapper[4722]: I0309 14:04:47.907390 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h4zw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b9e29bb-6e51-47ab-a543-b70117ab854d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17cd3d1e3981e2f2a3b66780f753425c78452e6319fb8eae9935d3bac4456820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},
{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdw92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h4zw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:47Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:47 crc kubenswrapper[4722]: I0309 14:04:47.928408 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1fc80c43be93eeb2f9805bf314c8430f52062d0ce5d0fa1830e7e0c27e674e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f969085ae62d82dbe1f0f8f5d459c4a1bc7d1ae0a459da484f11dd6ae5fdd418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5765145caac69454a9a8e8de34d05cb1da06924dd365f3e74f52461be5027ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef1c5a3944e8693d8c68a4108c23851352407d0fac7fb7a03763d9084f117dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0106fa808804a96ca8247096399e3d04eb082b2691b1f39df3db9f6c630114c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e737417baac512fab2089f5da12e7c47ef06eae8df7803ad6b248505ff3d18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10ef5c0af2e49f88ea7089e2d4bdc701ce1c068e
25ef30012e0b668aa9d4e8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://879b959fe7540c9b645b6ada02e192abb990fb3311a7346f0b1df41c47d62379\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T14:04:32Z\\\",\\\"message\\\":\\\"igs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0309 14:04:32.599371 6762 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0309 14:04:32.599438 6762 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 14:04:32.599509 6762 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0309 14:04:32.599664 6762 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0309 14:04:32.599744 6762 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0309 14:04:32.599757 6762 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0309 14:04:32.599783 6762 factory.go:656] Stopping watch factory\\\\nI0309 14:04:32.599796 6762 handler.go:208] Removed *v1.Node event handler 7\\\\nI0309 14:04:32.599804 6762 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0309 14:04:32.599813 6762 handler.go:208] Removed *v1.Node event handler 2\\\\nI0309 14:04:32.599922 6762 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 14:04:32.600179 6762 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:31Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10ef5c0af2e49f88ea7089e2d4bdc701ce1c068e25ef30012e0b668aa9d4e8cc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T14:04:47Z\\\",\\\"message\\\":\\\"operator.openshift.io/daemonset-dns: default,},ClusterIP:10.217.4.10,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.10],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0309 14:04:47.142840 7031 ovnkube.go:599] Stopped ovnkube\\\\nI0309 14:04:47.141878 7031 services_controller.go:360] Finished syncing service metrics on namespace openshift-apiserver-operator for network=default : 1.907253ms\\\\nI0309 14:04:47.142928 7031 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0309 14:04:47.142980 7031 services_controller.go:356] Processing sync for service openshift-operator-lifecycle-manager/olm-operator-metrics for network=default\\\\nF0309 14:04:47.143018 7031 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default 
network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initia\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f5af077ae5b947194e5dba0542d053890a589582e415e1b41a932bdab6632e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o:
//bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5v7ng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:47Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:47 crc kubenswrapper[4722]: I0309 14:04:47.939746 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xsft8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e14085d-df3c-4218-aa43-0553786b8867\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fb0f733e44fc7a699961d799267ca3c01e65a62b29d74e6ae682bccba9e5281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqpsw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://435bcac0d223a0ff9431ac5bc5be017aa95d2e567d991a1f4edc0f79559904e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqpsw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xsft8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:47Z is after 2025-08-24T17:21:41Z" Mar 09 
14:04:47 crc kubenswrapper[4722]: I0309 14:04:47.955755 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee2ed55248b177bc0873ef3c41517401fdb30fc5ce5a545a1ca8c925e4d4e9e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:47Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:47 crc kubenswrapper[4722]: I0309 14:04:47.971996 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jv499" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd1357a3-36ef-4bc8-85bb-91f3c0f42994\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a14e62f50fa8e5403349dc8696e9c7c45a5da1026227612186e7805b1be722f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e56992f54f9fc4387130b74e3aa0f4652a5d6d960da1a5305b8668b8086fae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e56992f54f9fc4387130b74e3aa0f4652a5d6d960da1a5305b8668b8086fae2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c00d804aec8d776f5f2ab9c7a81161c9942a94564d57afe47c633fb3dc2aa7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c00d804aec8d776f5f2ab9c7a81161c9942a94564d57afe47c633fb3dc2aa7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0491f9997970cb050ece19d88d2a7064d4799db3772132f58055e1dbcf2f3a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0491f9997970cb050ece19d88d2a7064d4799db3772132f58055e1dbcf2f3a82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93c7d9a6b0dc8e76abab762ff6aae1975a31fbc80ee3a3cd32ec378298061748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93c7d9a6b0dc8e76abab762ff6aae1975a31fbc80ee3a3cd32ec378298061748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cc5ae61b0fa949ba4bf5aa60d09385399a557f19ef7c99dc43b21fc9db830e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9cc5ae61b0fa949ba4bf5aa60d09385399a557f19ef7c99dc43b21fc9db830e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a88cf1e6b90a3f13f44bbb08482773f007066c795b00cb0177f6cd2ed87cbb52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a88cf1e6b90a3f13f44bbb08482773f007066c795b00cb0177f6cd2ed87cbb52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jv499\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:47Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:47 crc kubenswrapper[4722]: I0309 14:04:47.983916 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l5pwg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8424174-764a-470b-ab49-e7369a42de7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d30648bba1b59fd7db5e75a61d6a004246668d148515da5738b77e5122358f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l5pwg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:47Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:47 crc kubenswrapper[4722]: I0309 14:04:47.995648 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pvvhj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5dbc0be-527e-4f70-b185-3a10b1b11a75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pvvhj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:47Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:48 crc kubenswrapper[4722]: I0309 14:04:48.015256 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2267a4d7-87b4-43f7-80c7-7a7ffb0327fc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04169d9b8359e26cc9807e27fb9a5d54ca0946aaade294c5bcc272a7d21c4db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09fa62e7293a50fb3f6bdaf37061cbe5e95a830c8f7337660ce73138dbcaa641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31ba786781fcdb781f5fa45276e29e94756863d2398109d91e81a52f6f88bee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dff721159b382bcfdde6f406a328ef2ee9264b
35a8d8baeb7a3c6d31a60ff3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63769aca40f721bec0716427145a3ceea0595836c03d31c98f6d45ff14de68be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6625c60c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6625c60c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:48Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:48 crc kubenswrapper[4722]: I0309 14:04:48.030387 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0965e24b-526e-4842-ac1f-eca7a765355d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56c76a5e8718243a17d8d9fd9a8dcf6083c2ccb65b4816926dca08a89e465728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3836660bf646052991965270cd69b8b8237b5c4bd698a73789b050d23e6877\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dbef38ff5e714c92fc8a86f1e3c64adf606cb198145f884a000f8dee200fcf2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28fcbb961b20cd0cbc1e8548492168a0fe9920b7b98c7bfeffe34727c299030e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T14:03:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 14:03:46.681607 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 14:03:46.681849 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 14:03:46.682597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2315154617/tls.crt::/tmp/serving-cert-2315154617/tls.key\\\\\\\"\\\\nI0309 14:03:46.866095 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 14:03:46.869876 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 14:03:46.869896 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 14:03:46.870280 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 14:03:46.870293 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 14:03:46.877096 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 14:03:46.877117 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 14:03:46.877121 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877135 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 14:03:46.877139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 14:03:46.877143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 14:03:46.877146 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 14:03:46.878419 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:03:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fbabc5ec2c100150ce886c06c3bd78cc5896c8a06ac9d262eeb552a8a8bb5d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:48Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:48 crc kubenswrapper[4722]: I0309 14:04:48.041627 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:48Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:48 crc kubenswrapper[4722]: I0309 14:04:48.053976 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:48Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:48 crc kubenswrapper[4722]: I0309 14:04:48.064331 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-72xb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43e3c934-6aa7-4c96-a3a6-378f7931fc2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcf5e49245fd6051b12465a36b27f657b4923a62f88d61fe0fac971c4a2c9197\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lfx7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-72xb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:48Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:48 crc kubenswrapper[4722]: I0309 14:04:48.080351 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dac2aaf5-653b-4b2a-8efe-ed26bac8d648\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf5b268d6a355ae3a2c40981769412124b85cdfd99364ed58e5163a744f5f83c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q6db\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f451493fa6d86f12c1291c0fe262bcbe4b62cef45c8dda696bb48fa3ded2eeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q6db\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hjrrb\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:48Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:48 crc kubenswrapper[4722]: I0309 14:04:48.149382 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pvvhj" Mar 09 14:04:48 crc kubenswrapper[4722]: I0309 14:04:48.149399 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 14:04:48 crc kubenswrapper[4722]: I0309 14:04:48.149440 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 14:04:48 crc kubenswrapper[4722]: I0309 14:04:48.149504 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 14:04:48 crc kubenswrapper[4722]: E0309 14:04:48.149617 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pvvhj" podUID="f5dbc0be-527e-4f70-b185-3a10b1b11a75" Mar 09 14:04:48 crc kubenswrapper[4722]: E0309 14:04:48.149705 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 14:04:48 crc kubenswrapper[4722]: E0309 14:04:48.149797 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 14:04:48 crc kubenswrapper[4722]: E0309 14:04:48.149873 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 14:04:48 crc kubenswrapper[4722]: I0309 14:04:48.826053 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5v7ng_2e305619-b3a2-44c9-9e54-e1afa4f43dbf/ovnkube-controller/2.log" Mar 09 14:04:48 crc kubenswrapper[4722]: I0309 14:04:48.832411 4722 scope.go:117] "RemoveContainer" containerID="10ef5c0af2e49f88ea7089e2d4bdc701ce1c068e25ef30012e0b668aa9d4e8cc" Mar 09 14:04:48 crc kubenswrapper[4722]: E0309 14:04:48.832831 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-5v7ng_openshift-ovn-kubernetes(2e305619-b3a2-44c9-9e54-e1afa4f43dbf)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" podUID="2e305619-b3a2-44c9-9e54-e1afa4f43dbf" Mar 09 14:04:48 crc kubenswrapper[4722]: I0309 14:04:48.850515 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l5pwg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8424174-764a-470b-ab49-e7369a42de7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d30648bba1b59fd7db5e75a61d6a004246668d148515da5738b77e5122358f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l5pwg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:48Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:48 crc kubenswrapper[4722]: I0309 14:04:48.872106 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee2ed55248b177bc0873ef3c41517401fdb30fc5ce5a545a1ca8c925e4d4e9e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:48Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:48 crc kubenswrapper[4722]: I0309 14:04:48.897817 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jv499" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd1357a3-36ef-4bc8-85bb-91f3c0f42994\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a14e62f50fa8e5403349dc8696e9c7c45a5da1026227612186e7805b1be722f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e56992f54f9fc4387130b74e3aa0f4652a5d6d960da1a5305b8668b8086fae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e56992f54f9fc4387130b74e3aa0f4652a5d6d960da1a5305b8668b8086fae2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c00d804aec8d776f5f2ab9c7a81161c9942a94564d57afe47c633fb3dc2aa7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c00d804aec8d776f5f2ab9c7a81161c9942a94564d57afe47c633fb3dc2aa7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0491f9997970cb050ece19d88d2a7064d4799db3772132f58055e1dbcf2f3a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0491f9997970cb050ece19d88d2a7064d4799db3772132f58055e1dbcf2f3a82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93c7d9a6b0dc8e76abab762ff6aae1975a31fbc80ee3a3cd32ec378298061748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93c7d9a6b0dc8e76abab762ff6aae1975a31fbc80ee3a3cd32ec378298061748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cc5ae61b0fa949ba4bf5aa60d09385399a557f19ef7c99dc43b21fc9db830e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9cc5ae61b0fa949ba4bf5aa60d09385399a557f19ef7c99dc43b21fc9db830e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a88cf1e6b90a3f13f44bbb08482773f007066c795b00cb0177f6cd2ed87cbb52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a88cf1e6b90a3f13f44bbb08482773f007066c795b00cb0177f6cd2ed87cbb52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jv499\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:48Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:48 crc kubenswrapper[4722]: I0309 14:04:48.919487 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0965e24b-526e-4842-ac1f-eca7a765355d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56c76a5e8718243a17d8d9fd9a8dcf6083c2ccb65b4816926dca08a89e465728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3836660bf646052991965270cd69b8b8237b5c4bd698a73789b050d23e6877\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dbef38ff5e714c92fc8a86f1e3c64adf606cb198145f884a000f8dee200fcf2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28fcbb961b20cd0cbc1e8548492168a0fe9920b7b98c7bfeffe34727c299030e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T14:03:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 14:03:46.681607 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 14:03:46.681849 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 14:03:46.682597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2315154617/tls.crt::/tmp/serving-cert-2315154617/tls.key\\\\\\\"\\\\nI0309 14:03:46.866095 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 14:03:46.869876 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 14:03:46.869896 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 14:03:46.870280 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 14:03:46.870293 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 14:03:46.877096 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 14:03:46.877117 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 14:03:46.877121 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877135 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 14:03:46.877139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 14:03:46.877143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 14:03:46.877146 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 14:03:46.878419 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:03:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fbabc5ec2c100150ce886c06c3bd78cc5896c8a06ac9d262eeb552a8a8bb5d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:48Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:48 crc kubenswrapper[4722]: I0309 14:04:48.938831 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:48Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:48 crc kubenswrapper[4722]: I0309 14:04:48.959145 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:48Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:48 crc kubenswrapper[4722]: I0309 14:04:48.977561 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-72xb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43e3c934-6aa7-4c96-a3a6-378f7931fc2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcf5e49245fd6051b12465a36b27f657b4923a62f88d61fe0fac971c4a2c9197\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lfx7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-72xb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:48Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:48 crc kubenswrapper[4722]: I0309 14:04:48.996539 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dac2aaf5-653b-4b2a-8efe-ed26bac8d648\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf5b268d6a355ae3a2c40981769412124b85cdfd99364ed58e5163a744f5f83c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q6db\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f451493fa6d86f12c1291c0fe262bcbe4b62cef45c8dda696bb48fa3ded2eeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q6db\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hjrrb\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:48Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:49 crc kubenswrapper[4722]: I0309 14:04:49.010575 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pvvhj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5dbc0be-527e-4f70-b185-3a10b1b11a75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pvvhj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:49Z is 
after 2025-08-24T17:21:41Z" Mar 09 14:04:49 crc kubenswrapper[4722]: I0309 14:04:49.038606 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2267a4d7-87b4-43f7-80c7-7a7ffb0327fc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04169d9b8359e26cc9807e27fb9a5d54ca0946aaade294c5bcc272a7d21c4db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09fa62e7293a50fb3f6bdaf37061cbe5e95a830c8f7337660ce73138dbcaa641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31ba786781fcdb781f5fa45276e29e94756863d2398109d91e81a52f6f88bee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/va
r/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dff721159b382bcfdde6f406a328ef2ee9264b35a8d8baeb7a3c6d31a60ff3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63769aca40f721bec0716427145a3ceea0595836c03d31c98f6d45ff14de68be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6625c60c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6625c60c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\
\"2026-03-09T14:02:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:49Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:49 crc kubenswrapper[4722]: I0309 14:04:49.055835 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:49Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:49 crc kubenswrapper[4722]: I0309 14:04:49.076613 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69e74824fe242fd2d54bed9734f33823697d7651043d9c97bac6ea5490625a0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:49Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:49 crc kubenswrapper[4722]: I0309 14:04:49.096010 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a834db9e-b389-4bac-bfe8-df204ff42b2e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:03:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:03:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5c374ffb7928a20af6f154bfca0c494d692f1f6f55eb8de5d4d5db79a355013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f502c2ee0b77f66f4b32b7d3173eb71bd6d34c31724e8032a0200dac6fd62b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a48ba8f8443935496fa8f3a7e262723b19515f3adc95847d64236f956f91cc20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97aabe30af8bc4ff72fbac7f4fbf7804aff61f8d2fe99753558e7e40f348867\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a97aabe30af8bc4ff72fbac7f4fbf7804aff61f8d2fe99753558e7e40f348867\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:49Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:49 crc kubenswrapper[4722]: I0309 14:04:49.129282 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1fc80c43be93eeb2f9805bf314c8430f52062d0ce5d0fa1830e7e0c27e674e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f969085ae62d82dbe1f0f8f5d459c4a1bc7d1ae0a459da484f11dd6ae5fdd418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5765145caac69454a9a8e8de34d05cb1da06924dd365f3e74f52461be5027ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef1c5a3944e8693d8c68a4108c23851352407d0fac7fb7a03763d9084f117dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0106fa808804a96ca8247096399e3d04eb082b2691b1f39df3db9f6c630114c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e737417baac512fab2089f5da12e7c47ef06eae8df7803ad6b248505ff3d18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10ef5c0af2e49f88ea7089e2d4bdc701ce1c068e
25ef30012e0b668aa9d4e8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10ef5c0af2e49f88ea7089e2d4bdc701ce1c068e25ef30012e0b668aa9d4e8cc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T14:04:47Z\\\",\\\"message\\\":\\\"operator.openshift.io/daemonset-dns: default,},ClusterIP:10.217.4.10,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.10],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0309 14:04:47.142840 7031 ovnkube.go:599] Stopped ovnkube\\\\nI0309 14:04:47.141878 7031 services_controller.go:360] Finished syncing service metrics on namespace openshift-apiserver-operator for network=default : 1.907253ms\\\\nI0309 14:04:47.142928 7031 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0309 14:04:47.142980 7031 services_controller.go:356] Processing sync for service openshift-operator-lifecycle-manager/olm-operator-metrics for network=default\\\\nF0309 14:04:47.143018 7031 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initia\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5v7ng_openshift-ovn-kubernetes(2e305619-b3a2-44c9-9e54-e1afa4f43dbf)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f5af077ae5b947194e5dba0542d053890a589582e415e1b41a932bdab6632e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5v7ng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:49Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:49 crc kubenswrapper[4722]: I0309 14:04:49.155472 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xsft8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e14085d-df3c-4218-aa43-0553786b8867\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fb0f733e44fc7a699961d799267ca3c01e65a62b29d74e6ae682bccba9e5281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqpsw
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://435bcac0d223a0ff9431ac5bc5be017aa95d2e567d991a1f4edc0f79559904e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqpsw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xsft8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:49Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:49 crc kubenswrapper[4722]: I0309 14:04:49.174860 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d69359acfc7a4ee7698088a7e0062b3604a5ceb92910cd3b3d69b1a6291dfdcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://687646beb4c6dd3fb9b659a36449a573f4c389b3e98b07127b18516fdc4f2c93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:49Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:49 crc kubenswrapper[4722]: I0309 14:04:49.195836 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h4zw5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b9e29bb-6e51-47ab-a543-b70117ab854d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17cd3d1e3981e2f2a3b66780f753425c78452e6319fb8eae9935d3bac4456820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdw92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h4zw5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:49Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:49 crc kubenswrapper[4722]: I0309 14:04:49.358013 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 14:04:49 crc kubenswrapper[4722]: I0309 14:04:49.383493 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d69359acfc7a4ee7698088a7e0062b3604a5ceb92910cd3b3d69b1a6291dfdcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://687646beb4c6dd3fb9b659a36449a573f4c389b3e98b07127b18516fdc4f2c93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:49Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:49 crc kubenswrapper[4722]: I0309 14:04:49.407041 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h4zw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b9e29bb-6e51-47ab-a543-b70117ab854d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17cd3d1e3981e2f2a3b66780f753425c78452e6319fb8eae9935d3bac4456820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdw92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h4zw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:49Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:49 crc kubenswrapper[4722]: I0309 14:04:49.443619 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1fc80c43be93eeb2f9805bf314c8430f52062d0ce5d0fa1830e7e0c27e674e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f969085ae62d82dbe1f0f8f5d459c4a1bc7d1ae0a459da484f11dd6ae5fdd418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5765145caac69454a9a8e8de34d05cb1da06924dd365f3e74f52461be5027ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef1c5a3944e8693d8c68a4108c23851352407d0fac7fb7a03763d9084f117dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0106fa808804a96ca8247096399e3d04eb082b2691b1f39df3db9f6c630114c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e737417baac512fab2089f5da12e7c47ef06eae8df7803ad6b248505ff3d18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10ef5c0af2e49f88ea7089e2d4bdc701ce1c068e
25ef30012e0b668aa9d4e8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10ef5c0af2e49f88ea7089e2d4bdc701ce1c068e25ef30012e0b668aa9d4e8cc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T14:04:47Z\\\",\\\"message\\\":\\\"operator.openshift.io/daemonset-dns: default,},ClusterIP:10.217.4.10,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.10],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0309 14:04:47.142840 7031 ovnkube.go:599] Stopped ovnkube\\\\nI0309 14:04:47.141878 7031 services_controller.go:360] Finished syncing service metrics on namespace openshift-apiserver-operator for network=default : 1.907253ms\\\\nI0309 14:04:47.142928 7031 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0309 14:04:47.142980 7031 services_controller.go:356] Processing sync for service openshift-operator-lifecycle-manager/olm-operator-metrics for network=default\\\\nF0309 14:04:47.143018 7031 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initia\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5v7ng_openshift-ovn-kubernetes(2e305619-b3a2-44c9-9e54-e1afa4f43dbf)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f5af077ae5b947194e5dba0542d053890a589582e415e1b41a932bdab6632e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5v7ng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:49Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:49 crc kubenswrapper[4722]: I0309 14:04:49.466557 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xsft8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e14085d-df3c-4218-aa43-0553786b8867\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fb0f733e44fc7a699961d799267ca3c01e65a62b29d74e6ae682bccba9e5281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqpsw
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://435bcac0d223a0ff9431ac5bc5be017aa95d2e567d991a1f4edc0f79559904e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqpsw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xsft8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:49Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:49 crc kubenswrapper[4722]: I0309 14:04:49.494532 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee2ed55248b177bc0873ef3c41517401fdb30fc5ce5a545a1ca8c925e4d4e9e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:49Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:49 crc kubenswrapper[4722]: I0309 14:04:49.518764 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jv499" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd1357a3-36ef-4bc8-85bb-91f3c0f42994\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a14e62f50fa8e5403349dc8696e9c7c45a5da1026227612186e7805b1be722f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e56992f54f9fc4387130b74e3aa0f4652a5d6d960da1a5305b8668b8086fae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e56992f54f9fc4387130b74e3aa0f4652a5d6d960da1a5305b8668b8086fae2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c00d804aec8d776f5f2ab9c7a81161c9942a94564d57afe47c633fb3dc2aa7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c00d804aec8d776f5f2ab9c7a81161c9942a94564d57afe47c633fb3dc2aa7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0491f9997970cb050ece19d88d2a7064d4799db3772132f58055e1dbcf2f3a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0491f9997970cb050ece19d88d2a7064d4799db3772132f58055e1dbcf2f3a82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93c7d9a6b0dc8e76abab762ff6aae1975a31fbc80ee3a3cd32ec378298061748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93c7d9a6b0dc8e76abab762ff6aae1975a31fbc80ee3a3cd32ec378298061748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cc5ae61b0fa949ba4bf5aa60d09385399a557f19ef7c99dc43b21fc9db830e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9cc5ae61b0fa949ba4bf5aa60d09385399a557f19ef7c99dc43b21fc9db830e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a88cf1e6b90a3f13f44bbb08482773f007066c795b00cb0177f6cd2ed87cbb52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a88cf1e6b90a3f13f44bbb08482773f007066c795b00cb0177f6cd2ed87cbb52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jv499\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:49Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:49 crc kubenswrapper[4722]: I0309 14:04:49.533896 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l5pwg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8424174-764a-470b-ab49-e7369a42de7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d30648bba1b59fd7db5e75a61d6a004246668d148515da5738b77e5122358f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l5pwg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:49Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:49 crc kubenswrapper[4722]: I0309 14:04:49.561006 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2267a4d7-87b4-43f7-80c7-7a7ffb0327fc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04169d9b8359e26cc9807e27fb9a5d54ca0946aaade294c5bcc272a7d21c4db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09fa62e7293a50fb3f6bdaf37061cbe5e95a830c8f7337660ce73138dbcaa641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31ba786781fcdb781f5fa45276e29e94756863d2398109d91e81a52f6f88bee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dff721159b382bcfdde6f406a328ef2ee9264b
35a8d8baeb7a3c6d31a60ff3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63769aca40f721bec0716427145a3ceea0595836c03d31c98f6d45ff14de68be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6625c60c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6625c60c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:49Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:49 crc kubenswrapper[4722]: I0309 14:04:49.579842 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0965e24b-526e-4842-ac1f-eca7a765355d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56c76a5e8718243a17d8d9fd9a8dcf6083c2ccb65b4816926dca08a89e465728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3836660bf646052991965270cd69b8
b8237b5c4bd698a73789b050d23e6877\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dbef38ff5e714c92fc8a86f1e3c64adf606cb198145f884a000f8dee200fcf2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28fcbb961b20cd0cbc1e8548492168a0fe9920b7b98c7bfeffe34727c299030e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T14:03:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 14:03:46.681607 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 14:03:46.681849 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 14:03:46.682597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2315154617/tls.crt::/tmp/serving-cert-2315154617/tls.key\\\\\\\"\\\\nI0309 14:03:46.866095 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 14:03:46.869876 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 14:03:46.869896 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 14:03:46.870280 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 14:03:46.870293 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 14:03:46.877096 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 14:03:46.877117 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 14:03:46.877121 1 
secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877135 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 14:03:46.877139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 14:03:46.877143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 14:03:46.877146 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 14:03:46.878419 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:03:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fbabc5ec2c100150ce886c06c3bd78cc5896c8a06ac9d262eeb552a8a8bb5d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:49Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:49 crc kubenswrapper[4722]: I0309 14:04:49.600916 4722 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:49Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:49 crc kubenswrapper[4722]: I0309 14:04:49.617313 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:49Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:49 crc kubenswrapper[4722]: I0309 14:04:49.638035 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-72xb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43e3c934-6aa7-4c96-a3a6-378f7931fc2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcf5e49245fd6051b12465a36b27f657b4923a62f88d61fe0fac971c4a2c9197\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lfx7x\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-72xb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:49Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:49 crc kubenswrapper[4722]: I0309 14:04:49.659166 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dac2aaf5-653b-4b2a-8efe-ed26bac8d648\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf5b268d6a355ae3a2c40981769412124b85cdfd99364ed58e5163a744f5f83c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q6db\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f451493fa6d86f12c1291c0fe262bcbe4b62cef45c8dda696bb48fa3ded2eeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q6db\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hjrrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:49Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:49 crc kubenswrapper[4722]: I0309 14:04:49.674296 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pvvhj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5dbc0be-527e-4f70-b185-3a10b1b11a75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pvvhj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:49Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:49 crc kubenswrapper[4722]: I0309 14:04:49.690016 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a834db9e-b389-4bac-bfe8-df204ff42b2e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:03:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:03:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5c374ffb7928a20af6f154bfca0c494d692f1f6f55eb8de5d4d5db79a355013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f502c2ee0b77f66f4b32b7d3173eb71bd6d34c31724e8032a0200dac6fd62b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a48ba8f8443935496fa8f3a7e262723b19515f3adc95847d64236f956f91cc20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97aabe30af8bc4ff72fbac7f4fbf7804aff61f8d2fe99753558e7e40f348867\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a97aabe30af8bc4ff72fbac7f4fbf7804aff61f8d2fe99753558e7e40f348867\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:49Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:49 crc kubenswrapper[4722]: I0309 14:04:49.709308 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:49Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:49 crc kubenswrapper[4722]: I0309 14:04:49.729298 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69e74824fe242fd2d54bed9734f33823697d7651043d9c97bac6ea5490625a0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:49Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:50 crc kubenswrapper[4722]: I0309 14:04:50.148485 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 14:04:50 crc kubenswrapper[4722]: I0309 14:04:50.148539 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 14:04:50 crc kubenswrapper[4722]: E0309 14:04:50.148635 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 14:04:50 crc kubenswrapper[4722]: E0309 14:04:50.148672 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 14:04:50 crc kubenswrapper[4722]: I0309 14:04:50.149453 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 14:04:50 crc kubenswrapper[4722]: E0309 14:04:50.149513 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 14:04:50 crc kubenswrapper[4722]: I0309 14:04:50.149442 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pvvhj" Mar 09 14:04:50 crc kubenswrapper[4722]: E0309 14:04:50.149782 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pvvhj" podUID="f5dbc0be-527e-4f70-b185-3a10b1b11a75" Mar 09 14:04:50 crc kubenswrapper[4722]: I0309 14:04:50.174271 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2267a4d7-87b4-43f7-80c7-7a7ffb0327fc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04169d9b8359e26cc9807e27fb9a5d54ca0946aaade294c5bcc272a7d21c4db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09fa62e7293a50fb3f6bdaf37061cbe5e95a830c8f7337660ce73138dbcaa641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31ba786781fcdb781f5fa45276e29e94756863d2398109d91e81a52f6f88bee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dff721159b382bcfdde6f406a328ef2ee9264b35a8d8baeb7a3c6d31a60ff3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63769aca40f721bec0716427145a3ceea0595836c03d31c98f6d45ff14de68be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6625c60c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6625c60c717ffc2b85b70374bc50dfae41b04d
6362db0c7daf387f31c25862e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:50Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:50 crc kubenswrapper[4722]: I0309 14:04:50.196261 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0965e24b-526e-4842-ac1f-eca7a765355d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56c76a5e8718243a17d8d9fd9a8dcf6083c2ccb65b4816926dca08a89e465728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3836660bf646052991965270cd69b8b8237b5c4bd698a73789b050d23e6877\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dbef38ff5e714c92fc8a86f1e3c64adf606cb198145f884a000f8dee200fcf2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28fcbb961b20cd0cbc1e8548492168a0fe9920b7b98c7bfeffe34727c299030e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T14:03:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 14:03:46.681607 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 14:03:46.681849 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 14:03:46.682597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2315154617/tls.crt::/tmp/serving-cert-2315154617/tls.key\\\\\\\"\\\\nI0309 14:03:46.866095 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 14:03:46.869876 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 14:03:46.869896 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 14:03:46.870280 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 14:03:46.870293 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 14:03:46.877096 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 14:03:46.877117 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 14:03:46.877121 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877135 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 14:03:46.877139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 14:03:46.877143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 14:03:46.877146 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 14:03:46.878419 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:03:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fbabc5ec2c100150ce886c06c3bd78cc5896c8a06ac9d262eeb552a8a8bb5d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:50Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:50 crc kubenswrapper[4722]: I0309 14:04:50.209928 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:50Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:50 crc kubenswrapper[4722]: I0309 14:04:50.226080 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:50Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:50 crc kubenswrapper[4722]: I0309 14:04:50.238337 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-72xb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43e3c934-6aa7-4c96-a3a6-378f7931fc2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcf5e49245fd6051b12465a36b27f657b4923a62f88d61fe0fac971c4a2c9197\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lfx7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-72xb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:50Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:50 crc kubenswrapper[4722]: E0309 14:04:50.249551 4722 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 14:04:50 crc kubenswrapper[4722]: I0309 14:04:50.252967 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dac2aaf5-653b-4b2a-8efe-ed26bac8d648\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf5b268d6a355ae3a2c40981769412124b85cdfd99364ed58e5163a744f5f83c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q6db\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f451493fa6d86f12c1291c0fe262bcbe4b62cef45c8dda696bb48fa3ded2eeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q6db\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.
11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hjrrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:50Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:50 crc kubenswrapper[4722]: I0309 14:04:50.267847 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pvvhj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5dbc0be-527e-4f70-b185-3a10b1b11a75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:34Z\\\"}}\" for pod 
\"openshift-multus\"/\"network-metrics-daemon-pvvhj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:50Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:50 crc kubenswrapper[4722]: I0309 14:04:50.278389 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f5dbc0be-527e-4f70-b185-3a10b1b11a75-metrics-certs\") pod \"network-metrics-daemon-pvvhj\" (UID: \"f5dbc0be-527e-4f70-b185-3a10b1b11a75\") " pod="openshift-multus/network-metrics-daemon-pvvhj" Mar 09 14:04:50 crc kubenswrapper[4722]: E0309 14:04:50.278569 4722 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 14:04:50 crc kubenswrapper[4722]: E0309 14:04:50.278624 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f5dbc0be-527e-4f70-b185-3a10b1b11a75-metrics-certs podName:f5dbc0be-527e-4f70-b185-3a10b1b11a75 nodeName:}" failed. No retries permitted until 2026-03-09 14:05:06.278606727 +0000 UTC m=+146.834175303 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f5dbc0be-527e-4f70-b185-3a10b1b11a75-metrics-certs") pod "network-metrics-daemon-pvvhj" (UID: "f5dbc0be-527e-4f70-b185-3a10b1b11a75") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 14:04:50 crc kubenswrapper[4722]: I0309 14:04:50.284552 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:50Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:50 crc kubenswrapper[4722]: I0309 14:04:50.297174 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69e74824fe242fd2d54bed9734f33823697d7651043d9c97bac6ea5490625a0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:50Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:50 crc kubenswrapper[4722]: I0309 14:04:50.306480 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a834db9e-b389-4bac-bfe8-df204ff42b2e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:03:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:03:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5c374ffb7928a20af6f154bfca0c494d692f1f6f55eb8de5d4d5db79a355013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f502c2ee0b77f66f4b32b7d3173eb71bd6d34c31724e8032a0200dac6fd62b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a48ba8f8443935496fa8f3a7e262723b19515f3adc95847d64236f956f91cc20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97aabe30af8bc4ff72fbac7f4fbf7804aff61f8d2fe99753558e7e40f348867\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a97aabe30af8bc4ff72fbac7f4fbf7804aff61f8d2fe99753558e7e40f348867\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:50Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:50 crc kubenswrapper[4722]: I0309 14:04:50.319785 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h4zw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b9e29bb-6e51-47ab-a543-b70117ab854d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17cd3d1e3981e2f2a3b66780f753425c78452e6319fb8eae9935d3bac4456820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\
"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdw92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h4zw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:50Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:50 crc kubenswrapper[4722]: I0309 14:04:50.337746 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1fc80c43be93eeb2f9805bf314c8430f52062d0ce5d0fa1830e7e0c27e674e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f969085ae62d82dbe1f0f8f5d459c4a1bc7d1ae0a459da484f11dd6ae5fdd418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5765145caac69454a9a8e8de34d05cb1da06924dd365f3e74f52461be5027ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef1c5a3944e8693d8c68a4108c23851352407d0fac7fb7a03763d9084f117dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0106fa808804a96ca8247096399e3d04eb082b2691b1f39df3db9f6c630114c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e737417baac512fab2089f5da12e7c47ef06eae8df7803ad6b248505ff3d18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10ef5c0af2e49f88ea7089e2d4bdc701ce1c068e
25ef30012e0b668aa9d4e8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10ef5c0af2e49f88ea7089e2d4bdc701ce1c068e25ef30012e0b668aa9d4e8cc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T14:04:47Z\\\",\\\"message\\\":\\\"operator.openshift.io/daemonset-dns: default,},ClusterIP:10.217.4.10,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.10],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0309 14:04:47.142840 7031 ovnkube.go:599] Stopped ovnkube\\\\nI0309 14:04:47.141878 7031 services_controller.go:360] Finished syncing service metrics on namespace openshift-apiserver-operator for network=default : 1.907253ms\\\\nI0309 14:04:47.142928 7031 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0309 14:04:47.142980 7031 services_controller.go:356] Processing sync for service openshift-operator-lifecycle-manager/olm-operator-metrics for network=default\\\\nF0309 14:04:47.143018 7031 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initia\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5v7ng_openshift-ovn-kubernetes(2e305619-b3a2-44c9-9e54-e1afa4f43dbf)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f5af077ae5b947194e5dba0542d053890a589582e415e1b41a932bdab6632e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5v7ng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:50Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:50 crc kubenswrapper[4722]: I0309 14:04:50.349512 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xsft8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e14085d-df3c-4218-aa43-0553786b8867\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fb0f733e44fc7a699961d799267ca3c01e65a62b29d74e6ae682bccba9e5281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqpsw
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://435bcac0d223a0ff9431ac5bc5be017aa95d2e567d991a1f4edc0f79559904e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqpsw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xsft8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:50Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:50 crc kubenswrapper[4722]: I0309 14:04:50.363386 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d69359acfc7a4ee7698088a7e0062b3604a5ceb92910cd3b3d69b1a6291dfdcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://687646beb4c6dd3fb9b659a36449a573f4c389b3e98b07127b18516fdc4f2c93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:50Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:50 crc kubenswrapper[4722]: I0309 14:04:50.378724 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jv499" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd1357a3-36ef-4bc8-85bb-91f3c0f42994\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a14e62f50fa8e5403349dc8696e9c7c45a5da1026227612186e7805b1be722f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e56992f54f9fc4387130b74e3aa0f4652a5d6d960da1a5305b8668b8086fae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e56992f54f9fc4387130b74e3aa0f4652a5d6d960da1a5305b8668b8086fae2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c00d804aec8d776f5f2ab9c7a81161c9942a94564d57afe47c633fb3dc2aa7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c00d804aec8d776f5f2ab9c7a81161c9942a94564d57afe47c633fb3dc2aa7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0491f9997970cb050ece19d88d2a7064d4799db3772132f58055e1dbcf2f3a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0491f9997970cb050ece19d88d2a7064d4799db3772132f58055e1dbcf2f3a82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93c7d9a6b0dc8e76abab762ff6aae1975a31fbc80ee3a3cd32ec378298061748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93c7d9a6b0dc8e76abab762ff6aae1975a31fbc80ee3a3cd32ec378298061748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cc5ae61b0fa949ba4bf5aa60d09385399a557f19ef7c99dc43b21fc9db830e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9cc5ae61b0fa949ba4bf5aa60d09385399a557f19ef7c99dc43b21fc9db830e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a88cf1e6b90a3f13f44bbb08482773f007066c795b00cb0177f6cd2ed87cbb52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a88cf1e6b90a3f13f44bbb08482773f007066c795b00cb0177f6cd2ed87cbb52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jv499\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:50Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:50 crc kubenswrapper[4722]: I0309 14:04:50.391795 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l5pwg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8424174-764a-470b-ab49-e7369a42de7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d30648bba1b59fd7db5e75a61d6a004246668d148515da5738b77e5122358f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l5pwg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:50Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:50 crc kubenswrapper[4722]: I0309 14:04:50.406363 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee2ed55248b177bc0873ef3c41517401fdb30fc5ce5a545a1ca8c925e4d4e9e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:50Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:52 crc kubenswrapper[4722]: I0309 14:04:52.148934 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 14:04:52 crc kubenswrapper[4722]: I0309 14:04:52.148988 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 14:04:52 crc kubenswrapper[4722]: E0309 14:04:52.149592 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 14:04:52 crc kubenswrapper[4722]: I0309 14:04:52.149076 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 14:04:52 crc kubenswrapper[4722]: E0309 14:04:52.149749 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 14:04:52 crc kubenswrapper[4722]: I0309 14:04:52.149044 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pvvhj" Mar 09 14:04:52 crc kubenswrapper[4722]: E0309 14:04:52.149902 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pvvhj" podUID="f5dbc0be-527e-4f70-b185-3a10b1b11a75" Mar 09 14:04:52 crc kubenswrapper[4722]: E0309 14:04:52.149982 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 14:04:54 crc kubenswrapper[4722]: I0309 14:04:54.148639 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pvvhj" Mar 09 14:04:54 crc kubenswrapper[4722]: I0309 14:04:54.148766 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 14:04:54 crc kubenswrapper[4722]: I0309 14:04:54.148855 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 14:04:54 crc kubenswrapper[4722]: E0309 14:04:54.148853 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pvvhj" podUID="f5dbc0be-527e-4f70-b185-3a10b1b11a75" Mar 09 14:04:54 crc kubenswrapper[4722]: E0309 14:04:54.149135 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 14:04:54 crc kubenswrapper[4722]: E0309 14:04:54.149262 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 14:04:54 crc kubenswrapper[4722]: I0309 14:04:54.149655 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 14:04:54 crc kubenswrapper[4722]: E0309 14:04:54.149775 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 14:04:55 crc kubenswrapper[4722]: E0309 14:04:55.250801 4722 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 14:04:56 crc kubenswrapper[4722]: I0309 14:04:56.148617 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 14:04:56 crc kubenswrapper[4722]: E0309 14:04:56.148758 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 14:04:56 crc kubenswrapper[4722]: I0309 14:04:56.148829 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 14:04:56 crc kubenswrapper[4722]: I0309 14:04:56.148955 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 14:04:56 crc kubenswrapper[4722]: E0309 14:04:56.148984 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 14:04:56 crc kubenswrapper[4722]: E0309 14:04:56.149013 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 14:04:56 crc kubenswrapper[4722]: I0309 14:04:56.148636 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pvvhj" Mar 09 14:04:56 crc kubenswrapper[4722]: E0309 14:04:56.149077 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pvvhj" podUID="f5dbc0be-527e-4f70-b185-3a10b1b11a75" Mar 09 14:04:58 crc kubenswrapper[4722]: I0309 14:04:58.067046 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:58 crc kubenswrapper[4722]: I0309 14:04:58.067097 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:04:58 crc kubenswrapper[4722]: I0309 14:04:58.067114 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:58 crc kubenswrapper[4722]: I0309 14:04:58.067131 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:58 crc kubenswrapper[4722]: I0309 14:04:58.067141 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:58Z","lastTransitionTime":"2026-03-09T14:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:58 crc kubenswrapper[4722]: E0309 14:04:58.078729 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3321b793-006a-42f5-9c18-7f48ed9bad15\\\",\\\"systemUUID\\\":\\\"70de6d16-940c-46da-be51-a9b50262dda2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:58Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:58 crc kubenswrapper[4722]: I0309 14:04:58.081950 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:58 crc kubenswrapper[4722]: I0309 14:04:58.081990 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 09 14:04:58 crc kubenswrapper[4722]: I0309 14:04:58.081998 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:58 crc kubenswrapper[4722]: I0309 14:04:58.082013 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:58 crc kubenswrapper[4722]: I0309 14:04:58.082022 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:58Z","lastTransitionTime":"2026-03-09T14:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:58 crc kubenswrapper[4722]: E0309 14:04:58.096792 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3321b793-006a-42f5-9c18-7f48ed9bad15\\\",\\\"systemUUID\\\":\\\"70de6d16-940c-46da-be51-a9b50262dda2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:58Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:58 crc kubenswrapper[4722]: I0309 14:04:58.100035 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:58 crc kubenswrapper[4722]: I0309 14:04:58.100071 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 09 14:04:58 crc kubenswrapper[4722]: I0309 14:04:58.100080 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:58 crc kubenswrapper[4722]: I0309 14:04:58.100093 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:58 crc kubenswrapper[4722]: I0309 14:04:58.100104 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:58Z","lastTransitionTime":"2026-03-09T14:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:58 crc kubenswrapper[4722]: E0309 14:04:58.111193 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3321b793-006a-42f5-9c18-7f48ed9bad15\\\",\\\"systemUUID\\\":\\\"70de6d16-940c-46da-be51-a9b50262dda2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:58Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:58 crc kubenswrapper[4722]: I0309 14:04:58.115290 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:58 crc kubenswrapper[4722]: I0309 14:04:58.115468 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 09 14:04:58 crc kubenswrapper[4722]: I0309 14:04:58.115548 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:58 crc kubenswrapper[4722]: I0309 14:04:58.115644 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:58 crc kubenswrapper[4722]: I0309 14:04:58.115718 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:58Z","lastTransitionTime":"2026-03-09T14:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:58 crc kubenswrapper[4722]: E0309 14:04:58.127435 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3321b793-006a-42f5-9c18-7f48ed9bad15\\\",\\\"systemUUID\\\":\\\"70de6d16-940c-46da-be51-a9b50262dda2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:58Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:58 crc kubenswrapper[4722]: I0309 14:04:58.130680 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:04:58 crc kubenswrapper[4722]: I0309 14:04:58.130751 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 09 14:04:58 crc kubenswrapper[4722]: I0309 14:04:58.130765 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:04:58 crc kubenswrapper[4722]: I0309 14:04:58.130782 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:04:58 crc kubenswrapper[4722]: I0309 14:04:58.130796 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:04:58Z","lastTransitionTime":"2026-03-09T14:04:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:04:58 crc kubenswrapper[4722]: E0309 14:04:58.143087 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:04:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3321b793-006a-42f5-9c18-7f48ed9bad15\\\",\\\"systemUUID\\\":\\\"70de6d16-940c-46da-be51-a9b50262dda2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:04:58Z is after 2025-08-24T17:21:41Z" Mar 09 14:04:58 crc kubenswrapper[4722]: E0309 14:04:58.143219 4722 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 14:04:58 crc kubenswrapper[4722]: I0309 14:04:58.148142 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 14:04:58 crc kubenswrapper[4722]: I0309 14:04:58.148142 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pvvhj" Mar 09 14:04:58 crc kubenswrapper[4722]: I0309 14:04:58.148181 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 14:04:58 crc kubenswrapper[4722]: I0309 14:04:58.148357 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 14:04:58 crc kubenswrapper[4722]: E0309 14:04:58.148517 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 14:04:58 crc kubenswrapper[4722]: E0309 14:04:58.148592 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 14:04:58 crc kubenswrapper[4722]: E0309 14:04:58.148658 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pvvhj" podUID="f5dbc0be-527e-4f70-b185-3a10b1b11a75" Mar 09 14:04:58 crc kubenswrapper[4722]: E0309 14:04:58.148709 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 14:05:00 crc kubenswrapper[4722]: I0309 14:05:00.148612 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 14:05:00 crc kubenswrapper[4722]: E0309 14:05:00.149076 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 14:05:00 crc kubenswrapper[4722]: I0309 14:05:00.148775 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 14:05:00 crc kubenswrapper[4722]: E0309 14:05:00.149155 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 14:05:00 crc kubenswrapper[4722]: I0309 14:05:00.148814 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pvvhj" Mar 09 14:05:00 crc kubenswrapper[4722]: I0309 14:05:00.148698 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 14:05:00 crc kubenswrapper[4722]: E0309 14:05:00.149274 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pvvhj" podUID="f5dbc0be-527e-4f70-b185-3a10b1b11a75" Mar 09 14:05:00 crc kubenswrapper[4722]: E0309 14:05:00.149340 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 14:05:00 crc kubenswrapper[4722]: I0309 14:05:00.162587 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69e74824fe242fd2d54bed9734f33823697d7651043d9c97bac6ea5490625a0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:00Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:00 crc kubenswrapper[4722]: I0309 14:05:00.175110 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a834db9e-b389-4bac-bfe8-df204ff42b2e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:03:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:03:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5c374ffb7928a20af6f154bfca0c494d692f1f6f55eb8de5d4d5db79a355013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f502c2ee0b77f66f4b32b7d3173eb71bd6d34c31724e8032a0200dac6fd62b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a48ba8f8443935496fa8f3a7e262723b19515f3adc95847d64236f956f91cc20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97aabe30af8bc4ff72fbac7f4fbf7804aff61f8d2fe99753558e7e40f348867\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a97aabe30af8bc4ff72fbac7f4fbf7804aff61f8d2fe99753558e7e40f348867\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:00Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:00 crc kubenswrapper[4722]: I0309 14:05:00.188476 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:00Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:00 crc kubenswrapper[4722]: I0309 14:05:00.199560 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xsft8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e14085d-df3c-4218-aa43-0553786b8867\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fb0f733e44fc7a699961d799267ca3c01e65a62b29d74e6ae682bccba9e5281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqpsw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://435bcac0d223a0ff9431ac5bc5be017aa95d2e567d991a1f4edc0f79559904e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqpsw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xsft8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:00Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:00 crc kubenswrapper[4722]: I0309 14:05:00.216338 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d69359acfc7a4ee7698088a7e0062b3604a5ceb92910cd3b3d69b1a6291dfdcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://687646beb4c6dd3fb9b659a36449a573f4c389b3e98b07127b18516fdc4f2c93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedA
t\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:00Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:00 crc kubenswrapper[4722]: I0309 14:05:00.233121 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h4zw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b9e29bb-6e51-47ab-a543-b70117ab854d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17cd3d1e3981e2f2a3b66780f753425c78452e6319fb8eae9935d3bac4456820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\
":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdw92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h4zw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:00Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:00 crc kubenswrapper[4722]: E0309 14:05:00.251506 4722 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 14:05:00 crc kubenswrapper[4722]: I0309 14:05:00.258194 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1fc80c43be93eeb2f9805bf314c8430f52062d0ce5d0fa1830e7e0c27e674e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f969085ae62d82dbe1f0f8f5d459c4a1bc7d1ae0a459da484f11dd6ae5fdd418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5765145caac69454a9a8e8de34d05cb1da06924dd365f3e74f52461be5027ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef1c5a3944e8693d8c68a4108c23851352407d0fac7fb7a03763d9084f117dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0106fa808804a96ca8247096399e3d04eb082b2691b1f39df3db9f6c630114c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e737417baac512fab2089f5da12e7c47ef06eae8df7803ad6b248505ff3d18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10ef5c0af2e49f88ea7089e2d4bdc701ce1c068e
25ef30012e0b668aa9d4e8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10ef5c0af2e49f88ea7089e2d4bdc701ce1c068e25ef30012e0b668aa9d4e8cc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T14:04:47Z\\\",\\\"message\\\":\\\"operator.openshift.io/daemonset-dns: default,},ClusterIP:10.217.4.10,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.10],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0309 14:04:47.142840 7031 ovnkube.go:599] Stopped ovnkube\\\\nI0309 14:04:47.141878 7031 services_controller.go:360] Finished syncing service metrics on namespace openshift-apiserver-operator for network=default : 1.907253ms\\\\nI0309 14:04:47.142928 7031 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0309 14:04:47.142980 7031 services_controller.go:356] Processing sync for service openshift-operator-lifecycle-manager/olm-operator-metrics for network=default\\\\nF0309 14:04:47.143018 7031 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initia\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5v7ng_openshift-ovn-kubernetes(2e305619-b3a2-44c9-9e54-e1afa4f43dbf)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f5af077ae5b947194e5dba0542d053890a589582e415e1b41a932bdab6632e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5v7ng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:00Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:00 crc kubenswrapper[4722]: I0309 14:05:00.270796 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee2ed55248b177bc0873ef3c41517401fdb30fc5ce5a545a1ca8c925e4d4e9e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:00Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:00 crc kubenswrapper[4722]: I0309 14:05:00.283503 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jv499" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd1357a3-36ef-4bc8-85bb-91f3c0f42994\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a14e62f50fa8e5403349dc8696e9c7c45a5da1026227612186e7805b1be722f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e56992f54f9fc4387130b74e3aa0f4652a5d6d960da1a5305b8668b8086fae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e56992f54f9fc4387130b74e3aa0f4652a5d6d960da1a5305b8668b8086fae2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c00d804aec8d776f5f2ab9c7a81161c9942a94564d57afe47c633fb3dc2aa7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c00d804aec8d776f5f2ab9c7a81161c9942a94564d57afe47c633fb3dc2aa7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0491f9997970cb050ece19d88d2a7064d4799db3772132f58055e1dbcf2f3a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0491f9997970cb050ece19d88d2a7064d4799db3772132f58055e1dbcf2f3a82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93c7d9a6b0dc8e76abab762ff6aae1975a31fbc80ee3a3cd32ec378298061748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93c7d9a6b0dc8e76abab762ff6aae1975a31fbc80ee3a3cd32ec378298061748\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2026-03-09T14:04:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cc5ae61b0fa949ba4bf5aa60d09385399a557f19ef7c99dc43b21fc9db830e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9cc5ae61b0fa949ba4bf5aa60d09385399a557f19ef7c99dc43b21fc9db830e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a88cf1e6b90a3f13f44bbb08482773f007066c795b00cb0177f6cd2ed87cbb52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a88cf1e6b90a3f13f44bbb08482773f007066c795b00cb0177f6cd2ed87cbb52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jv499\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T14:05:00Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:00 crc kubenswrapper[4722]: I0309 14:05:00.292003 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l5pwg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8424174-764a-470b-ab49-e7369a42de7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d30648bba1b59fd7db5e75a61d6a004246668d148515da5738b77e5122358f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l5pwg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:00Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:00 crc kubenswrapper[4722]: I0309 14:05:00.301657 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:00Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:00 crc kubenswrapper[4722]: I0309 14:05:00.313840 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:00Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:00 crc kubenswrapper[4722]: I0309 14:05:00.323371 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-72xb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43e3c934-6aa7-4c96-a3a6-378f7931fc2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcf5e49245fd6051b12465a36b27f657b4923a62f88d61fe0fac971c4a2c9197\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lfx7x\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-72xb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:00Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:00 crc kubenswrapper[4722]: I0309 14:05:00.332766 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dac2aaf5-653b-4b2a-8efe-ed26bac8d648\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf5b268d6a355ae3a2c40981769412124b85cdfd99364ed58e5163a744f5f83c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q6db\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f451493fa6d86f12c1291c0fe262bcbe4b62cef45c8dda696bb48fa3ded2eeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q6db\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hjrrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:00Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:00 crc kubenswrapper[4722]: I0309 14:05:00.341387 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pvvhj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5dbc0be-527e-4f70-b185-3a10b1b11a75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pvvhj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:00Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:00 crc kubenswrapper[4722]: I0309 14:05:00.356901 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2267a4d7-87b4-43f7-80c7-7a7ffb0327fc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04169d9b8359e26cc9807e27fb9a5d54ca0946aaade294c5bcc272a7d21c4db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09fa62e7293a50fb3f6bdaf37061cbe5e95a830c8f7337660ce73138dbcaa641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31ba786781fcdb781f5fa45276e29e94756863d2398109d91e81a52f6f88bee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dff721159b382bcfdde6f406a328ef2ee9264b
35a8d8baeb7a3c6d31a60ff3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63769aca40f721bec0716427145a3ceea0595836c03d31c98f6d45ff14de68be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6625c60c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6625c60c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:00Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:00 crc kubenswrapper[4722]: I0309 14:05:00.370645 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0965e24b-526e-4842-ac1f-eca7a765355d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56c76a5e8718243a17d8d9fd9a8dcf6083c2ccb65b4816926dca08a89e465728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3836660bf646052991965270cd69b8
b8237b5c4bd698a73789b050d23e6877\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dbef38ff5e714c92fc8a86f1e3c64adf606cb198145f884a000f8dee200fcf2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28fcbb961b20cd0cbc1e8548492168a0fe9920b7b98c7bfeffe34727c299030e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T14:03:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 14:03:46.681607 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 14:03:46.681849 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 14:03:46.682597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2315154617/tls.crt::/tmp/serving-cert-2315154617/tls.key\\\\\\\"\\\\nI0309 14:03:46.866095 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 14:03:46.869876 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 14:03:46.869896 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 14:03:46.870280 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 14:03:46.870293 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 14:03:46.877096 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 14:03:46.877117 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 14:03:46.877121 1 
secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877135 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 14:03:46.877139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 14:03:46.877143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 14:03:46.877146 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 14:03:46.878419 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:03:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fbabc5ec2c100150ce886c06c3bd78cc5896c8a06ac9d262eeb552a8a8bb5d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:00Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:02 crc kubenswrapper[4722]: I0309 14:05:02.148607 4722 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 14:05:02 crc kubenswrapper[4722]: I0309 14:05:02.148690 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 14:05:02 crc kubenswrapper[4722]: I0309 14:05:02.148780 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 14:05:02 crc kubenswrapper[4722]: E0309 14:05:02.148770 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 14:05:02 crc kubenswrapper[4722]: E0309 14:05:02.148896 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 14:05:02 crc kubenswrapper[4722]: I0309 14:05:02.148967 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pvvhj" Mar 09 14:05:02 crc kubenswrapper[4722]: E0309 14:05:02.149077 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 14:05:02 crc kubenswrapper[4722]: E0309 14:05:02.149180 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pvvhj" podUID="f5dbc0be-527e-4f70-b185-3a10b1b11a75" Mar 09 14:05:02 crc kubenswrapper[4722]: I0309 14:05:02.150589 4722 scope.go:117] "RemoveContainer" containerID="10ef5c0af2e49f88ea7089e2d4bdc701ce1c068e25ef30012e0b668aa9d4e8cc" Mar 09 14:05:02 crc kubenswrapper[4722]: E0309 14:05:02.151132 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-5v7ng_openshift-ovn-kubernetes(2e305619-b3a2-44c9-9e54-e1afa4f43dbf)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" podUID="2e305619-b3a2-44c9-9e54-e1afa4f43dbf" Mar 09 14:05:04 crc kubenswrapper[4722]: I0309 14:05:04.149123 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 14:05:04 crc kubenswrapper[4722]: I0309 14:05:04.149254 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pvvhj" Mar 09 14:05:04 crc kubenswrapper[4722]: I0309 14:05:04.149282 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 14:05:04 crc kubenswrapper[4722]: I0309 14:05:04.149419 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 14:05:04 crc kubenswrapper[4722]: E0309 14:05:04.149404 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 14:05:04 crc kubenswrapper[4722]: E0309 14:05:04.149622 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 14:05:04 crc kubenswrapper[4722]: E0309 14:05:04.149757 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 14:05:04 crc kubenswrapper[4722]: E0309 14:05:04.149808 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pvvhj" podUID="f5dbc0be-527e-4f70-b185-3a10b1b11a75" Mar 09 14:05:05 crc kubenswrapper[4722]: E0309 14:05:05.252517 4722 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 14:05:06 crc kubenswrapper[4722]: I0309 14:05:06.148102 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pvvhj" Mar 09 14:05:06 crc kubenswrapper[4722]: E0309 14:05:06.148362 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pvvhj" podUID="f5dbc0be-527e-4f70-b185-3a10b1b11a75" Mar 09 14:05:06 crc kubenswrapper[4722]: I0309 14:05:06.148376 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 14:05:06 crc kubenswrapper[4722]: E0309 14:05:06.148547 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 14:05:06 crc kubenswrapper[4722]: I0309 14:05:06.148376 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 14:05:06 crc kubenswrapper[4722]: E0309 14:05:06.148684 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 14:05:06 crc kubenswrapper[4722]: I0309 14:05:06.148727 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 14:05:06 crc kubenswrapper[4722]: E0309 14:05:06.148789 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 14:05:06 crc kubenswrapper[4722]: I0309 14:05:06.342946 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f5dbc0be-527e-4f70-b185-3a10b1b11a75-metrics-certs\") pod \"network-metrics-daemon-pvvhj\" (UID: \"f5dbc0be-527e-4f70-b185-3a10b1b11a75\") " pod="openshift-multus/network-metrics-daemon-pvvhj" Mar 09 14:05:06 crc kubenswrapper[4722]: E0309 14:05:06.343085 4722 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 14:05:06 crc kubenswrapper[4722]: E0309 14:05:06.343136 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f5dbc0be-527e-4f70-b185-3a10b1b11a75-metrics-certs podName:f5dbc0be-527e-4f70-b185-3a10b1b11a75 nodeName:}" failed. No retries permitted until 2026-03-09 14:05:38.343120037 +0000 UTC m=+178.898688613 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f5dbc0be-527e-4f70-b185-3a10b1b11a75-metrics-certs") pod "network-metrics-daemon-pvvhj" (UID: "f5dbc0be-527e-4f70-b185-3a10b1b11a75") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 14:05:08 crc kubenswrapper[4722]: I0309 14:05:08.148357 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 14:05:08 crc kubenswrapper[4722]: I0309 14:05:08.148449 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 14:05:08 crc kubenswrapper[4722]: E0309 14:05:08.148545 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 14:05:08 crc kubenswrapper[4722]: I0309 14:05:08.148585 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pvvhj" Mar 09 14:05:08 crc kubenswrapper[4722]: E0309 14:05:08.148746 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 14:05:08 crc kubenswrapper[4722]: I0309 14:05:08.148815 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 14:05:08 crc kubenswrapper[4722]: E0309 14:05:08.149035 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pvvhj" podUID="f5dbc0be-527e-4f70-b185-3a10b1b11a75" Mar 09 14:05:08 crc kubenswrapper[4722]: E0309 14:05:08.149083 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 14:05:08 crc kubenswrapper[4722]: I0309 14:05:08.159337 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 09 14:05:08 crc kubenswrapper[4722]: I0309 14:05:08.289808 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:05:08 crc kubenswrapper[4722]: I0309 14:05:08.289935 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:05:08 crc kubenswrapper[4722]: I0309 14:05:08.289949 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:05:08 crc kubenswrapper[4722]: I0309 14:05:08.289967 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:05:08 crc kubenswrapper[4722]: I0309 14:05:08.290332 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:05:08Z","lastTransitionTime":"2026-03-09T14:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:05:08 crc kubenswrapper[4722]: E0309 14:05:08.304420 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:05:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:05:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:05:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:05:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3321b793-006a-42f5-9c18-7f48ed9bad15\\\",\\\"systemUUID\\\":\\\"70de6d16-940c-46da-be51-a9b50262dda2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:08Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:08 crc kubenswrapper[4722]: I0309 14:05:08.309806 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:05:08 crc kubenswrapper[4722]: I0309 14:05:08.309852 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 09 14:05:08 crc kubenswrapper[4722]: I0309 14:05:08.309865 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:05:08 crc kubenswrapper[4722]: I0309 14:05:08.309882 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:05:08 crc kubenswrapper[4722]: I0309 14:05:08.309894 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:05:08Z","lastTransitionTime":"2026-03-09T14:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:05:08 crc kubenswrapper[4722]: E0309 14:05:08.327301 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:05:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:05:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:05:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:05:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3321b793-006a-42f5-9c18-7f48ed9bad15\\\",\\\"systemUUID\\\":\\\"70de6d16-940c-46da-be51-a9b50262dda2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:08Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:08 crc kubenswrapper[4722]: I0309 14:05:08.331259 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:05:08 crc kubenswrapper[4722]: I0309 14:05:08.331300 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 09 14:05:08 crc kubenswrapper[4722]: I0309 14:05:08.331315 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:05:08 crc kubenswrapper[4722]: I0309 14:05:08.331332 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:05:08 crc kubenswrapper[4722]: I0309 14:05:08.331346 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:05:08Z","lastTransitionTime":"2026-03-09T14:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:05:08 crc kubenswrapper[4722]: E0309 14:05:08.350441 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:05:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:05:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:05:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:05:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3321b793-006a-42f5-9c18-7f48ed9bad15\\\",\\\"systemUUID\\\":\\\"70de6d16-940c-46da-be51-a9b50262dda2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:08Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:08 crc kubenswrapper[4722]: I0309 14:05:08.355575 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:05:08 crc kubenswrapper[4722]: I0309 14:05:08.355632 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 09 14:05:08 crc kubenswrapper[4722]: I0309 14:05:08.355641 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:05:08 crc kubenswrapper[4722]: I0309 14:05:08.355657 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:05:08 crc kubenswrapper[4722]: I0309 14:05:08.355669 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:05:08Z","lastTransitionTime":"2026-03-09T14:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:05:08 crc kubenswrapper[4722]: E0309 14:05:08.385103 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:05:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:05:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:05:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:05:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3321b793-006a-42f5-9c18-7f48ed9bad15\\\",\\\"systemUUID\\\":\\\"70de6d16-940c-46da-be51-a9b50262dda2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:08Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:08 crc kubenswrapper[4722]: I0309 14:05:08.389220 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:05:08 crc kubenswrapper[4722]: I0309 14:05:08.389251 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 09 14:05:08 crc kubenswrapper[4722]: I0309 14:05:08.389260 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:05:08 crc kubenswrapper[4722]: I0309 14:05:08.389274 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:05:08 crc kubenswrapper[4722]: I0309 14:05:08.389300 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:05:08Z","lastTransitionTime":"2026-03-09T14:05:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:05:08 crc kubenswrapper[4722]: E0309 14:05:08.403083 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:05:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:05:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:05:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:05:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:05:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3321b793-006a-42f5-9c18-7f48ed9bad15\\\",\\\"systemUUID\\\":\\\"70de6d16-940c-46da-be51-a9b50262dda2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:08Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:08 crc kubenswrapper[4722]: E0309 14:05:08.403277 4722 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 14:05:09 crc kubenswrapper[4722]: I0309 14:05:09.920558 4722 log.go:25] "Finished parsing log file" 
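Every one of the five patch attempts above fails the same way: the node.network-node-identity.openshift.io webhook presents a serving certificate that expired on 2025-08-24T17:21:41Z, while the node clock reads 2026-03-09T14:05:08Z. What follows is a minimal Go sketch for confirming the expiry directly; it assumes the webhook endpoint taken from the Post URL in the error (127.0.0.1:9743) is reachable from where the sketch runs, and it deliberately skips chain verification so the handshake completes and the certificate's validity window can be printed rather than rejected.

    package main

    import (
        "crypto/tls"
        "fmt"
        "log"
        "time"
    )

    func main() {
        // Endpoint taken from the webhook Post URL in the log above (an
        // assumption: run this on the node itself, or adjust the address).
        const addr = "127.0.0.1:9743"

        // InsecureSkipVerify lets the handshake succeed even though the
        // chain fails verification (the certificate is expired); we only
        // want to inspect the peer certificate, not trust it.
        conn, err := tls.Dial("tcp", addr, &tls.Config{InsecureSkipVerify: true})
        if err != nil {
            log.Fatalf("TLS handshake with %s failed: %v", addr, err)
        }
        defer conn.Close()

        // After a completed handshake, PeerCertificates[0] is the leaf
        // serving certificate the webhook presented.
        cert := conn.ConnectionState().PeerCertificates[0]
        fmt.Printf("subject:    %s\n", cert.Subject)
        fmt.Printf("not before: %s\n", cert.NotBefore.Format(time.RFC3339))
        fmt.Printf("not after:  %s\n", cert.NotAfter.Format(time.RFC3339))
        if time.Now().After(cert.NotAfter) {
            fmt.Println("expired: matches the x509 error reported by the kubelet")
        }
    }

Note that the expiry only explains why each status patch bounces off the webhook; the NotReady condition itself is reported separately, for the missing CNI configuration in /etc/kubernetes/cni/net.d/.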
path="/var/log/pods/openshift-multus_multus-h4zw5_6b9e29bb-6e51-47ab-a543-b70117ab854d/kube-multus/0.log" Mar 09 14:05:09 crc kubenswrapper[4722]: I0309 14:05:09.921739 4722 generic.go:334] "Generic (PLEG): container finished" podID="6b9e29bb-6e51-47ab-a543-b70117ab854d" containerID="17cd3d1e3981e2f2a3b66780f753425c78452e6319fb8eae9935d3bac4456820" exitCode=1 Mar 09 14:05:09 crc kubenswrapper[4722]: I0309 14:05:09.921858 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-h4zw5" event={"ID":"6b9e29bb-6e51-47ab-a543-b70117ab854d","Type":"ContainerDied","Data":"17cd3d1e3981e2f2a3b66780f753425c78452e6319fb8eae9935d3bac4456820"} Mar 09 14:05:09 crc kubenswrapper[4722]: I0309 14:05:09.922807 4722 scope.go:117] "RemoveContainer" containerID="17cd3d1e3981e2f2a3b66780f753425c78452e6319fb8eae9935d3bac4456820" Mar 09 14:05:09 crc kubenswrapper[4722]: I0309 14:05:09.939090 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69e74824fe242fd2d54bed9734f33823697d7651043d9c97bac6ea5490625a0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:09Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:09 crc kubenswrapper[4722]: I0309 14:05:09.969772 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a834db9e-b389-4bac-bfe8-df204ff42b2e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:03:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:03:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5c374ffb7928a20af6f154bfca0c494d692f1f6f55eb8de5d4d5db79a355013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f502c2ee0b77f66f4b32b7d3173eb71bd6d34c31724e8032a0200dac6fd62b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a48ba8f8443935496fa8f3a7e262723b19515f3adc95847d64236f956f91cc20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97aabe30af8bc4ff72fbac7f4fbf7804aff61f8d2fe99753558e7e40f348867\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a97aabe30af8bc4ff72fbac7f4fbf7804aff61f8d2fe99753558e7e40f348867\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:09Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:10 crc kubenswrapper[4722]: I0309 14:05:10.016674 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:10Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:10 crc kubenswrapper[4722]: I0309 14:05:10.031683 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xsft8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e14085d-df3c-4218-aa43-0553786b8867\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fb0f733e44fc7a699961d799267ca3c01e65a62b29d74e6ae682bccba9e5281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqpsw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://435bcac0d223a0ff9431ac5bc5be017aa95d2e567d991a1f4edc0f79559904e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqpsw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xsft8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:10Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:10 crc kubenswrapper[4722]: I0309 14:05:10.044948 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d69359acfc7a4ee7698088a7e0062b3604a5ceb92910cd3b3d69b1a6291dfdcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://687646beb4c6dd3fb9b659a36449a573f4c389b3e98b07127b18516fdc4f2c93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedA
t\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:10Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:10 crc kubenswrapper[4722]: I0309 14:05:10.059504 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h4zw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b9e29bb-6e51-47ab-a543-b70117ab854d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:05:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:05:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17cd3d1e3981e2f2a3b66780f753425c78452e6319fb8eae9935d3bac4456820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17cd3d1e3981e2f2a3b66780f753425c78452e6319fb8eae9935d3bac4456820\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T14:05:09Z\\\",\\\"message\\\":\\\"2026-03-09T14:04:24+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9a389771-bfaa-4805-bbd7-772a0f07bd73\\\\n2026-03-09T14:04:24+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9a389771-bfaa-4805-bbd7-772a0f07bd73 to /host/opt/cni/bin/\\\\n2026-03-09T14:04:24Z [verbose] multus-daemon started\\\\n2026-03-09T14:04:24Z [verbose] Readiness Indicator file check\\\\n2026-03-09T14:05:09Z [error] have you checked that your default network is ready? 
still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdw92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h4zw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:10Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:10 crc kubenswrapper[4722]: I0309 14:05:10.077712 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1fc80c43be93eeb2f9805bf314c8430f52062d0ce5d0fa1830e7e0c27e674e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f969085ae62d82dbe1f0f8f5d459c4a1bc7d1ae0a459da484f11dd6ae5fdd418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5765145caac69454a9a8e8de34d05cb1da06924dd365f3e74f52461be5027ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef1c5a3944e8693d8c68a4108c23851352407d0fac7fb7a03763d9084f117dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0106fa808804a96ca8247096399e3d04eb082b2691b1f39df3db9f6c630114c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e737417baac512fab2089f5da12e7c47ef06eae8df7803ad6b248505ff3d18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10ef5c0af2e49f88ea7089e2d4bdc701ce1c068e25ef30012e0b668aa9d4e8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10ef5c0af2e49f88ea7089e2d4bdc701ce1c068e25ef30012e0b668aa9d4e8cc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T14:04:47Z\\\",\\\"message\\\":\\\"operator.openshift.io/daemonset-dns: default,},ClusterIP:10.217.4.10,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.10],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0309 14:04:47.142840 7031 ovnkube.go:599] Stopped ovnkube\\\\nI0309 14:04:47.141878 7031 services_controller.go:360] Finished syncing service metrics on namespace openshift-apiserver-operator for network=default : 1.907253ms\\\\nI0309 14:04:47.142928 7031 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0309 14:04:47.142980 7031 services_controller.go:356] Processing sync for service openshift-operator-lifecycle-manager/olm-operator-metrics for network=default\\\\nF0309 14:04:47.143018 7031 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initia\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5v7ng_openshift-ovn-kubernetes(2e305619-b3a2-44c9-9e54-e1afa4f43dbf)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f5af077ae5b947194e5dba0542d053890a589582e415e1b41a932bdab6632e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5v7ng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:10Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:10 crc kubenswrapper[4722]: I0309 14:05:10.091654 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee2ed55248b177bc0873ef3c41517401fdb30fc5ce5a545a1ca8c925e4d4e9e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:10Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:10 crc kubenswrapper[4722]: I0309 14:05:10.105553 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jv499" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd1357a3-36ef-4bc8-85bb-91f3c0f42994\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a14e62f50fa8e5403349dc8696e9c7c45a5da1026227612186e7805b1be722f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e56992f54f9fc4387130b74e3aa0f4652a5d6d960da1a5305b8668b8086fae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e56992f54f9fc4387130b74e3aa0f4652a5d6d960da1a5305b8668b8086fae2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c00d804aec8d776f5f2ab9c7a81161c9942a94564d57afe47c633fb3dc2aa7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c00d804aec8d776f5f2ab9c7a81161c9942a94564d57afe47c633fb3dc2aa7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0491f9997970cb050ece19d88d2a7064d4799db3772132f58055e1dbcf2f3a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0491f9997970cb050ece19d88d2a7064d4799db3772132f58055e1dbcf2f3a82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93c7d9a6b0dc8e76abab762ff6aae1975a31fbc80ee3a3cd32ec378298061748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93c7d9a6b0dc8e76abab762ff6aae1975a31fbc80ee3a3cd32ec378298061748\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2026-03-09T14:04:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cc5ae61b0fa949ba4bf5aa60d09385399a557f19ef7c99dc43b21fc9db830e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9cc5ae61b0fa949ba4bf5aa60d09385399a557f19ef7c99dc43b21fc9db830e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a88cf1e6b90a3f13f44bbb08482773f007066c795b00cb0177f6cd2ed87cbb52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a88cf1e6b90a3f13f44bbb08482773f007066c795b00cb0177f6cd2ed87cbb52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jv499\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T14:05:10Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:10 crc kubenswrapper[4722]: I0309 14:05:10.116724 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l5pwg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8424174-764a-470b-ab49-e7369a42de7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d30648bba1b59fd7db5e75a61d6a004246668d148515da5738b77e5122358f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l5pwg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:10Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:10 crc kubenswrapper[4722]: I0309 14:05:10.128156 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:10Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:10 crc kubenswrapper[4722]: I0309 14:05:10.140710 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:10Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:10 crc kubenswrapper[4722]: I0309 14:05:10.148461 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 14:05:10 crc kubenswrapper[4722]: I0309 14:05:10.148536 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 14:05:10 crc kubenswrapper[4722]: I0309 14:05:10.148492 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 14:05:10 crc kubenswrapper[4722]: E0309 14:05:10.148662 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 14:05:10 crc kubenswrapper[4722]: E0309 14:05:10.148798 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 14:05:10 crc kubenswrapper[4722]: E0309 14:05:10.148982 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 14:05:10 crc kubenswrapper[4722]: I0309 14:05:10.149319 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pvvhj" Mar 09 14:05:10 crc kubenswrapper[4722]: E0309 14:05:10.149486 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pvvhj" podUID="f5dbc0be-527e-4f70-b185-3a10b1b11a75" Mar 09 14:05:10 crc kubenswrapper[4722]: I0309 14:05:10.153599 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-72xb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43e3c934-6aa7-4c96-a3a6-378f7931fc2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcf5e49245fd6051b12465a36b27f657b4923a62f88d61fe0fac971c4a2c9197\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lfx7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-72xb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:10Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:10 crc kubenswrapper[4722]: I0309 14:05:10.168318 4722 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dac2aaf5-653b-4b2a-8efe-ed26bac8d648\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf5b268d6a355ae3a2c40981769412124b85cdfd99364ed58e5163a744f5f83c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q6db\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f451493fa6d86f12c1291c0fe262bcbe4b62cef45c8dda696bb48fa3ded2eeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q6db\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hjrrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:10Z is after 2025-08-24T17:21:41Z" Mar 09 
14:05:10 crc kubenswrapper[4722]: I0309 14:05:10.185432 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pvvhj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5dbc0be-527e-4f70-b185-3a10b1b11a75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pvvhj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:10Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:10 crc kubenswrapper[4722]: I0309 14:05:10.204390 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36d472dc-6063-4ddf-9b31-68a6a23f365c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:03:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:03:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68bead726a61890d6212098f420a58ca85c9d97bb2efe57b314aff215c26169c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a07bf846a5613ee835b70c1915ee46d1e45185f5ff81096ec2e4dd4238f35f3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T14:03:39Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0309 14:03:09.643545 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0309 14:03:09.644916 1 observer_polling.go:159] Starting file observer\\\\nI0309 14:03:09.646001 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 14:03:09.646843 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 14:03:35.207688 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0309 14:03:39.212600 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 14:03:39.212719 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:03:09Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff66d43b6e872eed24aacaf3ea882c7148b6ef62bbe5b7bf10ede5d2691a3680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41c6395bf5f6006036c3d58919ea6ea976b56827cac8ae369b10dd4037d29153\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03fa5f2f3d3dc4ebaf6b5db02fe2036caaf7e0de7197cb5d3409dbbae676a338\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-con
troller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:10Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:10 crc kubenswrapper[4722]: I0309 14:05:10.227341 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2267a4d7-87b4-43f7-80c7-7a7ffb0327fc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04169d9b8359e26cc9807e27fb9a5d54ca0946aaade294c5bcc272a7d21c4db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09fa62e7293a50fb3f6bdaf37061cbe5e95a830c8f7337660ce73138dbcaa641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCoun
t\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31ba786781fcdb781f5fa45276e29e94756863d2398109d91e81a52f6f88bee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dff721159b382bcfdde6f406a328ef2ee9264b35a8d8baeb7a3c6d31a60ff3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63769aca40f721bec0716427145a3ceea0595836c03d31c98f6d45ff14de68be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"c
ontainerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6625c60c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6625c60c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:10Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:10 crc kubenswrapper[4722]: I0309 14:05:10.243667 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0965e24b-526e-4842-ac1f-eca7a765355d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56c76a5e8718243a17d8d9fd9a8dcf6083c2ccb65b4816926dca08a89e465728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3836660bf646052991965270cd69b8b8237b5c4bd698a73789b050d23e6877\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dbef38ff5e714c92fc8a86f1e3c64adf606cb198145f884a000f8dee200fcf2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28fcbb961b20cd0cbc1e8548492168a0fe9920b7b98c7bfeffe34727c299030e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T14:03:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 14:03:46.681607 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 14:03:46.681849 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 14:03:46.682597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2315154617/tls.crt::/tmp/serving-cert-2315154617/tls.key\\\\\\\"\\\\nI0309 14:03:46.866095 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 14:03:46.869876 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 14:03:46.869896 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 14:03:46.870280 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 14:03:46.870293 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 14:03:46.877096 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 14:03:46.877117 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 14:03:46.877121 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877135 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 14:03:46.877139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 14:03:46.877143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 14:03:46.877146 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 14:03:46.878419 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:03:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fbabc5ec2c100150ce886c06c3bd78cc5896c8a06ac9d262eeb552a8a8bb5d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:10Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:10 crc kubenswrapper[4722]: E0309 14:05:10.253615 4722 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 09 14:05:10 crc kubenswrapper[4722]: I0309 14:05:10.259916 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pvvhj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5dbc0be-527e-4f70-b185-3a10b1b11a75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pvvhj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:10Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:10 crc kubenswrapper[4722]: I0309 14:05:10.273635 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36d472dc-6063-4ddf-9b31-68a6a23f365c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:03:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:03:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68bead726a61890d6212098f420a58ca85c9d97bb2efe57b314aff215c26169c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a07bf846a5613ee835b70c1915ee46d1e45185f5ff81096ec2e4dd4238f35f3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T14:03:39Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0309 14:03:09.643545 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0309 14:03:09.644916 1 observer_polling.go:159] Starting file observer\\\\nI0309 14:03:09.646001 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 14:03:09.646843 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 14:03:35.207688 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0309 14:03:39.212600 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 14:03:39.212719 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:03:09Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff66d43b6e872eed24aacaf3ea882c7148b6ef62bbe5b7bf10ede5d2691a3680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41c6395bf5f6006036c3d58919ea6ea976b56827cac8ae369b10dd4037d29153\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03fa5f2f3d3dc4ebaf6b5db02fe2036caaf7e0de7197cb5d3409dbbae676a338\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-con
troller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:10Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:10 crc kubenswrapper[4722]: I0309 14:05:10.292048 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2267a4d7-87b4-43f7-80c7-7a7ffb0327fc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04169d9b8359e26cc9807e27fb9a5d54ca0946aaade294c5bcc272a7d21c4db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09fa62e7293a50fb3f6bdaf37061cbe5e95a830c8f7337660ce73138dbcaa641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCoun
t\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31ba786781fcdb781f5fa45276e29e94756863d2398109d91e81a52f6f88bee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dff721159b382bcfdde6f406a328ef2ee9264b35a8d8baeb7a3c6d31a60ff3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63769aca40f721bec0716427145a3ceea0595836c03d31c98f6d45ff14de68be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"c
ontainerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6625c60c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6625c60c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:10Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:10 crc kubenswrapper[4722]: I0309 14:05:10.314497 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0965e24b-526e-4842-ac1f-eca7a765355d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56c76a5e8718243a17d8d9fd9a8dcf6083c2ccb65b4816926dca08a89e465728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3836660bf646052991965270cd69b8b8237b5c4bd698a73789b050d23e6877\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dbef38ff5e714c92fc8a86f1e3c64adf606cb198145f884a000f8dee200fcf2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28fcbb961b20cd0cbc1e8548492168a0fe9920b7b98c7bfeffe34727c299030e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T14:03:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 14:03:46.681607 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 14:03:46.681849 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 14:03:46.682597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2315154617/tls.crt::/tmp/serving-cert-2315154617/tls.key\\\\\\\"\\\\nI0309 14:03:46.866095 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 14:03:46.869876 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 14:03:46.869896 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 14:03:46.870280 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 14:03:46.870293 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 14:03:46.877096 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 14:03:46.877117 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 14:03:46.877121 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877135 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 14:03:46.877139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 14:03:46.877143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 14:03:46.877146 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 14:03:46.878419 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:03:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fbabc5ec2c100150ce886c06c3bd78cc5896c8a06ac9d262eeb552a8a8bb5d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:10Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:10 crc kubenswrapper[4722]: I0309 14:05:10.331305 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:10Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:10 crc kubenswrapper[4722]: I0309 14:05:10.344994 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:10Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:10 crc kubenswrapper[4722]: I0309 14:05:10.356385 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-72xb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43e3c934-6aa7-4c96-a3a6-378f7931fc2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcf5e49245fd6051b12465a36b27f657b4923a62f88d61fe0fac971c4a2c9197\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lfx7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-72xb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:10Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:10 crc kubenswrapper[4722]: I0309 14:05:10.372254 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dac2aaf5-653b-4b2a-8efe-ed26bac8d648\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf5b268d6a355ae3a2c40981769412124b85cdfd99364ed58e5163a744f5f83c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q6db\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f451493fa6d86f12c1291c0fe262bcbe4b62cef45c8dda696bb48fa3ded2eeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q6db\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hjrrb\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:10Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:10 crc kubenswrapper[4722]: I0309 14:05:10.386195 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a834db9e-b389-4bac-bfe8-df204ff42b2e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:03:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:03:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5c374ffb7928a20af6f154bfca0c494d692f1f6f55eb8de5d4d5db79a355013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f502c2ee0b77f66f4b32b7d3173eb71bd6d34c31724e8032a0200dac6fd62b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a48ba8f8443935496fa8f3a7e262723b19515f3adc95847d64236f956f91cc20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,
\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97aabe30af8bc4ff72fbac7f4fbf7804aff61f8d2fe99753558e7e40f348867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a97aabe30af8bc4ff72fbac7f4fbf7804aff61f8d2fe99753558e7e40f348867\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:10Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:10 crc kubenswrapper[4722]: I0309 14:05:10.416091 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:10Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:10 crc kubenswrapper[4722]: I0309 14:05:10.426693 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69e74824fe242fd2d54bed9734f33823697d7651043d9c97bac6ea5490625a0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:10Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:10 crc kubenswrapper[4722]: I0309 14:05:10.436574 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d69359acfc7a4ee7698088a7e0062b3604a5ceb92910cd3b3d69b1a6291dfdcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://687646beb4c6dd3fb9b659a36449a573f4c389b3e98b07127b18516fdc4f2c93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:10Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:10 crc kubenswrapper[4722]: I0309 14:05:10.447652 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h4zw5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b9e29bb-6e51-47ab-a543-b70117ab854d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:05:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:05:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17cd3d1e3981e2f2a3b66780f753425c78452e6319fb8eae9935d3bac4456820\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17cd3d1e3981e2f2a3b66780f753425c78452e6319fb8eae9935d3bac4456820\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T14:05:09Z\\\",\\\"message\\\":\\\"2026-03-09T14:04:24+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9a389771-bfaa-4805-bbd7-772a0f07bd73\\\\n2026-03-09T14:04:24+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9a389771-bfaa-4805-bbd7-772a0f07bd73 to /host/opt/cni/bin/\\\\n2026-03-09T14:04:24Z [verbose] multus-daemon started\\\\n2026-03-09T14:04:24Z [verbose] Readiness Indicator file check\\\\n2026-03-09T14:05:09Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdw92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h4zw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:10Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:10 crc kubenswrapper[4722]: I0309 14:05:10.481245 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1fc80c43be93eeb2f9805bf314c8430f52062d0ce5d0fa1830e7e0c27e674e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f969085ae62d82dbe1f0f8f5d459c4a1bc7d1ae0a459da484f11dd6ae5fdd418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5765145caac69454a9a8e8de34d05cb1da06924dd365f3e74f52461be5027ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef1c5a3944e8693d8c68a4108c23851352407d0fac7fb7a03763d9084f117dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0106fa808804a96ca8247096399e3d04eb082b2691b1f39df3db9f6c630114c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e737417baac512fab2089f5da12e7c47ef06eae8df7803ad6b248505ff3d18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10ef5c0af2e49f88ea7089e2d4bdc701ce1c068e25ef30012e0b668aa9d4e8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10ef5c0af2e49f88ea7089e2d4bdc701ce1c068e25ef30012e0b668aa9d4e8cc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T14:04:47Z\\\",\\\"message\\\":\\\"operator.openshift.io/daemonset-dns: default,},ClusterIP:10.217.4.10,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.10],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0309 14:04:47.142840 7031 ovnkube.go:599] Stopped ovnkube\\\\nI0309 14:04:47.141878 7031 services_controller.go:360] Finished syncing service metrics on namespace openshift-apiserver-operator for network=default : 1.907253ms\\\\nI0309 14:04:47.142928 7031 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0309 14:04:47.142980 7031 services_controller.go:356] Processing sync for service openshift-operator-lifecycle-manager/olm-operator-metrics for network=default\\\\nF0309 14:04:47.143018 7031 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initia\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5v7ng_openshift-ovn-kubernetes(2e305619-b3a2-44c9-9e54-e1afa4f43dbf)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f5af077ae5b947194e5dba0542d053890a589582e415e1b41a932bdab6632e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5v7ng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:10Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:10 crc kubenswrapper[4722]: I0309 14:05:10.496777 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xsft8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e14085d-df3c-4218-aa43-0553786b8867\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fb0f733e44fc7a699961d799267ca3c01e65a62b29d74e6ae682bccba9e5281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqpsw
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://435bcac0d223a0ff9431ac5bc5be017aa95d2e567d991a1f4edc0f79559904e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqpsw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xsft8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:10Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:10 crc kubenswrapper[4722]: I0309 14:05:10.512923 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee2ed55248b177bc0873ef3c41517401fdb30fc5ce5a545a1ca8c925e4d4e9e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:10Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:10 crc kubenswrapper[4722]: I0309 14:05:10.533255 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jv499" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd1357a3-36ef-4bc8-85bb-91f3c0f42994\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a14e62f50fa8e5403349dc8696e9c7c45a5da1026227612186e7805b1be722f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e56992f54f9fc4387130b74e3aa0f4652a5d6d960da1a5305b8668b8086fae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e56992f54f9fc4387130b74e3aa0f4652a5d6d960da1a5305b8668b8086fae2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c00d804aec8d776f5f2ab9c7a81161c9942a94564d57afe47c633fb3dc2aa7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c00d804aec8d776f5f2ab9c7a81161c9942a94564d57afe47c633fb3dc2aa7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0491f9997970cb050ece19d88d2a7064d4799db3772132f58055e1dbcf2f3a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0491f9997970cb050ece19d88d2a7064d4799db3772132f58055e1dbcf2f3a82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93c7d9a6b0dc8e76abab762ff6aae1975a31fbc80ee3a3cd32ec378298061748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93c7d9a6b0dc8e76abab762ff6aae1975a31fbc80ee3a3cd32ec378298061748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cc5ae61b0fa949ba4bf5aa60d09385399a557f19ef7c99dc43b21fc9db830e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9cc5ae61b0fa949ba4bf5aa60d09385399a557f19ef7c99dc43b21fc9db830e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a88cf1e6b90a3f13f44bbb08482773f007066c795b00cb0177f6cd2ed87cbb52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a88cf1e6b90a3f13f44bbb08482773f007066c795b00cb0177f6cd2ed87cbb52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jv499\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:10Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:10 crc kubenswrapper[4722]: I0309 14:05:10.548659 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l5pwg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8424174-764a-470b-ab49-e7369a42de7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d30648bba1b59fd7db5e75a61d6a004246668d148515da5738b77e5122358f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l5pwg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:10Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:10 crc kubenswrapper[4722]: I0309 14:05:10.929901 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-h4zw5_6b9e29bb-6e51-47ab-a543-b70117ab854d/kube-multus/0.log" Mar 09 14:05:10 crc kubenswrapper[4722]: I0309 14:05:10.929979 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-h4zw5" event={"ID":"6b9e29bb-6e51-47ab-a543-b70117ab854d","Type":"ContainerStarted","Data":"30f56826c4b193cbd284ce58320073c1b4dc43c9eba976445ba4d4bb7c089960"} Mar 09 14:05:10 crc kubenswrapper[4722]: I0309 14:05:10.947721 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee2ed55248b177bc0873ef3c41517401fdb30fc5ce5a545a1ca8c925e4d4e9e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:10Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:10 crc kubenswrapper[4722]: I0309 14:05:10.966560 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jv499" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd1357a3-36ef-4bc8-85bb-91f3c0f42994\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a14e62f50fa8e5403349dc8696e9c7c45a5da1026227612186e7805b1be722f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e56992f54f9fc4387130b74e3aa0f4652a5d6d960da1a5305b8668b8086fae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e56992f54f9fc4387130b74e3aa0f4652a5d6d960da1a5305b8668b8086fae2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c00d804aec8d776f5f2ab9c7a81161c9942a94564d57afe47c633fb3dc2aa7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c00d804aec8d776f5f2ab9c7a81161c9942a94564d57afe47c633fb3dc2aa7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0491f9997970cb050ece19d88d2a7064d4799db3772132f58055e1dbcf2f3a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0491f9997970cb050ece19d88d2a7064d4799db3772132f58055e1dbcf2f3a82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93c7d9a6b0dc8e76abab762ff6aae1975a31fbc80ee3a3cd32ec378298061748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93c7d9a6b0dc8e76abab762ff6aae1975a31fbc80ee3a3cd32ec378298061748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cc5ae61b0fa949ba4bf5aa60d09385399a557f19ef7c99dc43b21fc9db830e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9cc5ae61b0fa949ba4bf5aa60d09385399a557f19ef7c99dc43b21fc9db830e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a88cf1e6b90a3f13f44bbb08482773f007066c795b00cb0177f6cd2ed87cbb52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a88cf1e6b90a3f13f44bbb08482773f007066c795b00cb0177f6cd2ed87cbb52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jv499\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:10Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:10 crc kubenswrapper[4722]: I0309 14:05:10.980281 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l5pwg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8424174-764a-470b-ab49-e7369a42de7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d30648bba1b59fd7db5e75a61d6a004246668d148515da5738b77e5122358f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l5pwg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:10Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:10 crc kubenswrapper[4722]: I0309 14:05:10.998193 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:10Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:11 crc kubenswrapper[4722]: I0309 14:05:11.012607 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-72xb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43e3c934-6aa7-4c96-a3a6-378f7931fc2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcf5e49245fd6051b12465a36b27f657b4923a62f88d61fe0fac971c4a2c9197\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lfx7x\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-72xb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:11Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:11 crc kubenswrapper[4722]: I0309 14:05:11.027965 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dac2aaf5-653b-4b2a-8efe-ed26bac8d648\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf5b268d6a355ae3a2c40981769412124b85cdfd99364ed58e5163a744f5f83c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q6db\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f451493fa6d86f12c1291c0fe262bcbe4b62cef45c8dda696bb48fa3ded2eeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q6db\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hjrrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:11Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:11 crc kubenswrapper[4722]: I0309 14:05:11.043530 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pvvhj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5dbc0be-527e-4f70-b185-3a10b1b11a75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pvvhj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:11Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:11 crc kubenswrapper[4722]: I0309 14:05:11.060502 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36d472dc-6063-4ddf-9b31-68a6a23f365c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:03:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:03:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68bead726a61890d6212098f420a58ca85c9d97bb2efe57b314aff215c26169c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a07bf846a5613ee835b70c1915ee46d1e45185f5ff81096ec2e4dd4238f35f3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T14:03:39Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0309 14:03:09.643545 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0309 14:03:09.644916 1 observer_polling.go:159] Starting file observer\\\\nI0309 14:03:09.646001 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 14:03:09.646843 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 14:03:35.207688 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0309 14:03:39.212600 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 14:03:39.212719 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:03:09Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff66d43b6e872eed24aacaf3ea882c7148b6ef62bbe5b7bf10ede5d2691a3680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41c6395bf5f6006036c3d58919ea6ea976b56827cac8ae369b10dd4037d29153\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03fa5f2f3d3dc4ebaf6b5db02fe2036caaf7e0de7197cb5d3409dbbae676a338\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-con
troller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:11Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:11 crc kubenswrapper[4722]: I0309 14:05:11.086991 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2267a4d7-87b4-43f7-80c7-7a7ffb0327fc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04169d9b8359e26cc9807e27fb9a5d54ca0946aaade294c5bcc272a7d21c4db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09fa62e7293a50fb3f6bdaf37061cbe5e95a830c8f7337660ce73138dbcaa641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCoun
t\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31ba786781fcdb781f5fa45276e29e94756863d2398109d91e81a52f6f88bee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dff721159b382bcfdde6f406a328ef2ee9264b35a8d8baeb7a3c6d31a60ff3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63769aca40f721bec0716427145a3ceea0595836c03d31c98f6d45ff14de68be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"c
ontainerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6625c60c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6625c60c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:11Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:11 crc kubenswrapper[4722]: I0309 14:05:11.102230 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0965e24b-526e-4842-ac1f-eca7a765355d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56c76a5e8718243a17d8d9fd9a8dcf6083c2ccb65b4816926dca08a89e465728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3836660bf646052991965270cd69b8b8237b5c4bd698a73789b050d23e6877\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dbef38ff5e714c92fc8a86f1e3c64adf606cb198145f884a000f8dee200fcf2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28fcbb961b20cd0cbc1e8548492168a0fe9920b7b98c7bfeffe34727c299030e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T14:03:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 14:03:46.681607 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 14:03:46.681849 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 14:03:46.682597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2315154617/tls.crt::/tmp/serving-cert-2315154617/tls.key\\\\\\\"\\\\nI0309 14:03:46.866095 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 14:03:46.869876 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 14:03:46.869896 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 14:03:46.870280 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 14:03:46.870293 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 14:03:46.877096 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 14:03:46.877117 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 14:03:46.877121 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877135 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 14:03:46.877139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 14:03:46.877143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 14:03:46.877146 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 14:03:46.878419 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:03:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fbabc5ec2c100150ce886c06c3bd78cc5896c8a06ac9d262eeb552a8a8bb5d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:11Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:11 crc kubenswrapper[4722]: I0309 14:05:11.118466 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:11Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:11 crc kubenswrapper[4722]: I0309 14:05:11.131407 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a834db9e-b389-4bac-bfe8-df204ff42b2e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:03:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:03:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5c374ffb7928a20af6f154bfca0c494d692f1f6f55eb8de5d4d5db79a355013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f502c2ee0b77f66f4b32b7d3173eb71bd6d34c31724e8032a0200dac6fd62b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a48ba8f8443935496fa8f3a7e262723b19515f3adc95847d64236f956f91cc20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97aabe30af8bc4ff72fbac7f4fbf7804aff61f8d2fe99753558e7e40f348867\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a97aabe30af8bc4ff72fbac7f4fbf7804aff61f8d2fe99753558e7e40f348867\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:11Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:11 crc kubenswrapper[4722]: I0309 14:05:11.148466 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:11Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:11 crc kubenswrapper[4722]: I0309 14:05:11.163266 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69e74824fe242fd2d54bed9734f33823697d7651043d9c97bac6ea5490625a0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:11Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:11 crc kubenswrapper[4722]: I0309 14:05:11.178986 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d69359acfc7a4ee7698088a7e0062b3604a5ceb92910cd3b3d69b1a6291dfdcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://687646beb4c6dd3fb9b659a36449a573f4c389b3e98b07127b18516fdc4f2c93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:11Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:11 crc kubenswrapper[4722]: I0309 14:05:11.194845 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h4zw5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b9e29bb-6e51-47ab-a543-b70117ab854d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30f56826c4b193cbd284ce58320073c1b4dc43c9eba976445ba4d4bb7c089960\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17cd3d1e3981e2f2a3b66780f753425c78452e6319fb8eae9935d3bac4456820\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T14:05:09Z\\\",\\\"message\\\":\\\"2026-03-09T14:04:24+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9a389771-bfaa-4805-bbd7-772a0f07bd73\\\\n2026-03-09T14:04:24+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9a389771-bfaa-4805-bbd7-772a0f07bd73 to /host/opt/cni/bin/\\\\n2026-03-09T14:04:24Z [verbose] multus-daemon started\\\\n2026-03-09T14:04:24Z [verbose] Readiness Indicator file check\\\\n2026-03-09T14:05:09Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:05:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdw92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h4zw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:11Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:11 crc kubenswrapper[4722]: I0309 14:05:11.216782 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1fc80c43be93eeb2f9805bf314c8430f52062d0ce5d0fa1830e7e0c27e674e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f969085ae62d82dbe1f0f8f5d459c4a1bc7d1ae0a459da484f11dd6ae5fdd418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5765145caac69454a9a8e8de34d05cb1da06924dd365f3e74f52461be5027ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef1c5a3944e8693d8c68a4108c23851352407d0fac7fb7a03763d9084f117dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0106fa808804a96ca8247096399e3d04eb082b2691b1f39df3db9f6c630114c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e737417baac512fab2089f5da12e7c47ef06eae8df7803ad6b248505ff3d18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-s
ocket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://10ef5c0af2e49f88ea7089e2d4bdc701ce1c068e25ef30012e0b668aa9d4e8cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10ef5c0af2e49f88ea7089e2d4bdc701ce1c068e25ef30012e0b668aa9d4e8cc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T14:04:47Z\\\",\\\"message\\\":\\\"operator.openshift.io/daemonset-dns: default,},ClusterIP:10.217.4.10,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.10],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0309 14:04:47.142840 7031 ovnkube.go:599] Stopped ovnkube\\\\nI0309 14:04:47.141878 7031 services_controller.go:360] Finished syncing service metrics on namespace openshift-apiserver-operator for network=default : 1.907253ms\\\\nI0309 14:04:47.142928 7031 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0309 14:04:47.142980 7031 services_controller.go:356] Processing sync for service openshift-operator-lifecycle-manager/olm-operator-metrics for network=default\\\\nF0309 14:04:47.143018 7031 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initia\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5v7ng_openshift-ovn-kubernetes(2e305619-b3a2-44c9-9e54-e1afa4f43dbf)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f5af077ae5b947194e5dba0542d053890a589582e415e1b41a932bdab6632e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5v7ng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:11Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:11 crc kubenswrapper[4722]: I0309 14:05:11.229277 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xsft8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e14085d-df3c-4218-aa43-0553786b8867\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fb0f733e44fc7a699961d799267ca3c01e65a62b29d74e6ae682bccba9e5281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqpsw
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://435bcac0d223a0ff9431ac5bc5be017aa95d2e567d991a1f4edc0f79559904e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqpsw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xsft8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:11Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:12 crc kubenswrapper[4722]: I0309 14:05:12.006275 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 14:05:12 crc kubenswrapper[4722]: E0309 14:05:12.006605 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 14:06:16.006562148 +0000 UTC m=+216.562130724 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 14:05:12 crc kubenswrapper[4722]: I0309 14:05:12.107957 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 14:05:12 crc kubenswrapper[4722]: I0309 14:05:12.108046 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 14:05:12 crc kubenswrapper[4722]: I0309 14:05:12.108081 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 14:05:12 crc kubenswrapper[4722]: I0309 14:05:12.108149 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 14:05:12 crc kubenswrapper[4722]: E0309 14:05:12.108382 4722 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 14:05:12 crc kubenswrapper[4722]: E0309 14:05:12.108502 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 14:06:16.108479815 +0000 UTC m=+216.664048391 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 09 14:05:12 crc kubenswrapper[4722]: E0309 14:05:12.108389 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 14:05:12 crc kubenswrapper[4722]: E0309 14:05:12.108553 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 14:05:12 crc kubenswrapper[4722]: E0309 14:05:12.108577 4722 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 14:05:12 crc kubenswrapper[4722]: E0309 14:05:12.108616 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-09 14:06:16.108605359 +0000 UTC m=+216.664173935 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 14:05:12 crc kubenswrapper[4722]: E0309 14:05:12.108495 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 09 14:05:12 crc kubenswrapper[4722]: E0309 14:05:12.108686 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 09 14:05:12 crc kubenswrapper[4722]: E0309 14:05:12.108714 4722 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 09 14:05:12 crc kubenswrapper[4722]: E0309 14:05:12.108831 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-09 14:06:16.108801694 +0000 UTC m=+216.664370420 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 09 14:05:12 crc kubenswrapper[4722]: E0309 14:05:12.108890 4722 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 09 14:05:12 crc kubenswrapper[4722]: E0309 14:05:12.108942 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-09 14:06:16.108925997 +0000 UTC m=+216.664494813 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 09 14:05:12 crc kubenswrapper[4722]: I0309 14:05:12.149441 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 14:05:12 crc kubenswrapper[4722]: I0309 14:05:12.149608 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 14:05:12 crc kubenswrapper[4722]: I0309 14:05:12.149659 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 09 14:05:12 crc kubenswrapper[4722]: E0309 14:05:12.149647 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 09 14:05:12 crc kubenswrapper[4722]: I0309 14:05:12.149476 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pvvhj"
Mar 09 14:05:12 crc kubenswrapper[4722]: E0309 14:05:12.149871 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 09 14:05:12 crc kubenswrapper[4722]: E0309 14:05:12.150014 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 09 14:05:12 crc kubenswrapper[4722]: E0309 14:05:12.150269 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pvvhj" podUID="f5dbc0be-527e-4f70-b185-3a10b1b11a75"
Mar 09 14:05:14 crc kubenswrapper[4722]: I0309 14:05:14.148975 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 14:05:14 crc kubenswrapper[4722]: I0309 14:05:14.149038 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pvvhj"
Mar 09 14:05:14 crc kubenswrapper[4722]: I0309 14:05:14.149114 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 14:05:14 crc kubenswrapper[4722]: E0309 14:05:14.149108 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 09 14:05:14 crc kubenswrapper[4722]: E0309 14:05:14.149177 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pvvhj" podUID="f5dbc0be-527e-4f70-b185-3a10b1b11a75"
Mar 09 14:05:14 crc kubenswrapper[4722]: I0309 14:05:14.149244 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 09 14:05:14 crc kubenswrapper[4722]: E0309 14:05:14.149291 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 09 14:05:14 crc kubenswrapper[4722]: E0309 14:05:14.149330 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 09 14:05:15 crc kubenswrapper[4722]: E0309 14:05:15.254833 4722 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 09 14:05:16 crc kubenswrapper[4722]: I0309 14:05:16.148302 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 14:05:16 crc kubenswrapper[4722]: I0309 14:05:16.148465 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 14:05:16 crc kubenswrapper[4722]: E0309 14:05:16.148549 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 09 14:05:16 crc kubenswrapper[4722]: I0309 14:05:16.148572 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 09 14:05:16 crc kubenswrapper[4722]: I0309 14:05:16.148657 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pvvhj"
Mar 09 14:05:16 crc kubenswrapper[4722]: E0309 14:05:16.148722 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 09 14:05:16 crc kubenswrapper[4722]: E0309 14:05:16.148826 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 09 14:05:16 crc kubenswrapper[4722]: E0309 14:05:16.148896 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pvvhj" podUID="f5dbc0be-527e-4f70-b185-3a10b1b11a75"
Mar 09 14:05:16 crc kubenswrapper[4722]: I0309 14:05:16.149820 4722 scope.go:117] "RemoveContainer" containerID="10ef5c0af2e49f88ea7089e2d4bdc701ce1c068e25ef30012e0b668aa9d4e8cc"
Mar 09 14:05:16 crc kubenswrapper[4722]: I0309 14:05:16.952914 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5v7ng_2e305619-b3a2-44c9-9e54-e1afa4f43dbf/ovnkube-controller/2.log"
Mar 09 14:05:16 crc kubenswrapper[4722]: I0309 14:05:16.956229 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" event={"ID":"2e305619-b3a2-44c9-9e54-e1afa4f43dbf","Type":"ContainerStarted","Data":"647b164e0544f016e9b6adbb065e7b958f1f7e1446a15f120191cfb35d030220"}
Mar 09 14:05:16 crc kubenswrapper[4722]: I0309 14:05:16.956805 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng"
Mar 09 14:05:16 crc kubenswrapper[4722]: I0309 14:05:16.977365 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0965e24b-526e-4842-ac1f-eca7a765355d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56c76a5e8718243a17d8d9fd9a8dcf6083c2ccb65b4816926dca08a89e465728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3836660bf646052991965270cd69b8b8237b5c4bd698a73789b050d23e6877\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"st
ate\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dbef38ff5e714c92fc8a86f1e3c64adf606cb198145f884a000f8dee200fcf2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28fcbb961b20cd0cbc1e8548492168a0fe9920b7b98c7bfeffe34727c299030e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T14:03:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 14:03:46.681607 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 14:03:46.681849 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 14:03:46.682597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2315154617/tls.crt::/tmp/serving-cert-2315154617/tls.key\\\\\\\"\\\\nI0309 14:03:46.866095 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 14:03:46.869876 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 14:03:46.869896 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 14:03:46.870280 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 14:03:46.870293 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 14:03:46.877096 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 14:03:46.877117 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 14:03:46.877121 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877135 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 14:03:46.877139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 14:03:46.877143 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 14:03:46.877146 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 14:03:46.878419 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:03:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fbabc5ec2c100150ce886c06c3bd78cc5896c8a06ac9d262eeb552a8a8bb5d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:16Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:16 crc kubenswrapper[4722]: I0309 14:05:16.999167 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:16Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:17 crc kubenswrapper[4722]: I0309 14:05:17.018828 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:17Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:17 crc kubenswrapper[4722]: I0309 14:05:17.034294 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-72xb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43e3c934-6aa7-4c96-a3a6-378f7931fc2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcf5e49245fd6051b12465a36b27f657b4923a62f88d61fe0fac971c4a2c9197\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lfx7x\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-72xb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:17Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:17 crc kubenswrapper[4722]: I0309 14:05:17.050494 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dac2aaf5-653b-4b2a-8efe-ed26bac8d648\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf5b268d6a355ae3a2c40981769412124b85cdfd99364ed58e5163a744f5f83c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q6db\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f451493fa6d86f12c1291c0fe262bcbe4b62cef45c8dda696bb48fa3ded2eeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q6db\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hjrrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:17Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:17 crc kubenswrapper[4722]: I0309 14:05:17.063479 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pvvhj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5dbc0be-527e-4f70-b185-3a10b1b11a75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pvvhj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:17Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:17 crc kubenswrapper[4722]: I0309 14:05:17.080658 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36d472dc-6063-4ddf-9b31-68a6a23f365c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:03:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:03:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68bead726a61890d6212098f420a58ca85c9d97bb2efe57b314aff215c26169c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a07bf846a5613ee835b70c1915ee46d1e45185f5ff81096ec2e4dd4238f35f3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T14:03:39Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0309 14:03:09.643545 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0309 14:03:09.644916 1 observer_polling.go:159] Starting file observer\\\\nI0309 14:03:09.646001 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 14:03:09.646843 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 14:03:35.207688 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0309 14:03:39.212600 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 14:03:39.212719 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:03:09Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff66d43b6e872eed24aacaf3ea882c7148b6ef62bbe5b7bf10ede5d2691a3680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41c6395bf5f6006036c3d58919ea6ea976b56827cac8ae369b10dd4037d29153\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03fa5f2f3d3dc4ebaf6b5db02fe2036caaf7e0de7197cb5d3409dbbae676a338\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-con
troller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:17Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:17 crc kubenswrapper[4722]: I0309 14:05:17.101723 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2267a4d7-87b4-43f7-80c7-7a7ffb0327fc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04169d9b8359e26cc9807e27fb9a5d54ca0946aaade294c5bcc272a7d21c4db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09fa62e7293a50fb3f6bdaf37061cbe5e95a830c8f7337660ce73138dbcaa641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCoun
t\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31ba786781fcdb781f5fa45276e29e94756863d2398109d91e81a52f6f88bee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dff721159b382bcfdde6f406a328ef2ee9264b35a8d8baeb7a3c6d31a60ff3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63769aca40f721bec0716427145a3ceea0595836c03d31c98f6d45ff14de68be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"c
ontainerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6625c60c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6625c60c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:17Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:17 crc kubenswrapper[4722]: I0309 14:05:17.117051 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:17Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:17 crc kubenswrapper[4722]: I0309 14:05:17.131400 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69e74824fe242fd2d54bed9734f33823697d7651043d9c97bac6ea5490625a0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:17Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:17 crc kubenswrapper[4722]: I0309 14:05:17.145543 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a834db9e-b389-4bac-bfe8-df204ff42b2e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:03:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:03:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5c374ffb7928a20af6f154bfca0c494d692f1f6f55eb8de5d4d5db79a355013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f502c2ee0b77f66f4b32b7d3173eb71bd6d34c31724e8032a0200dac6fd62b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a48ba8f8443935496fa8f3a7e262723b19515f3adc95847d64236f956f91cc20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97aabe30af8bc4ff72fbac7f4fbf7804aff61f8d2fe99753558e7e40f348867\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a97aabe30af8bc4ff72fbac7f4fbf7804aff61f8d2fe99753558e7e40f348867\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:17Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:17 crc kubenswrapper[4722]: I0309 14:05:17.163782 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1fc80c43be93eeb2f9805bf314c8430f52062d0ce5d0fa1830e7e0c27e674e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f969085ae62d82dbe1f0f8f5d459c4a1bc7d1ae0a459da484f11dd6ae5fdd418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5765145caac69454a9a8e8de34d05cb1da06924dd365f3e74f52461be5027ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef1c5a3944e8693d8c68a4108c23851352407d0fac7fb7a03763d9084f117dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0106fa808804a96ca8247096399e3d04eb082b2691b1f39df3db9f6c630114c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e737417baac512fab2089f5da12e7c47ef06eae8df7803ad6b248505ff3d18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://647b164e0544f016e9b6adbb065e7b958f1f7e14
46a15f120191cfb35d030220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10ef5c0af2e49f88ea7089e2d4bdc701ce1c068e25ef30012e0b668aa9d4e8cc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T14:04:47Z\\\",\\\"message\\\":\\\"operator.openshift.io/daemonset-dns: default,},ClusterIP:10.217.4.10,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.10],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0309 14:04:47.142840 7031 ovnkube.go:599] Stopped ovnkube\\\\nI0309 14:04:47.141878 7031 services_controller.go:360] Finished syncing service metrics on namespace openshift-apiserver-operator for network=default : 1.907253ms\\\\nI0309 14:04:47.142928 7031 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0309 14:04:47.142980 7031 services_controller.go:356] Processing sync for service openshift-operator-lifecycle-manager/olm-operator-metrics for network=default\\\\nF0309 14:04:47.143018 7031 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller 
initia\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f5af077ae5b947194e5dba0542d053890a589582e415e1b41a932bdab6632e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\
"containerID\\\":\\\"cri-o://bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5v7ng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:17Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:17 crc kubenswrapper[4722]: I0309 14:05:17.175576 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xsft8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e14085d-df3c-4218-aa43-0553786b8867\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fb0f733e44fc7a699961d799267ca3c01e65a62b29d74e6ae682bccba9e5281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqpsw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://435bcac0d223a0ff9431ac5bc5be017aa95d2e567d991a1f4edc0f79559904e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqpsw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xsft8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:17Z is after 2025-08-24T17:21:41Z" Mar 09 
14:05:17 crc kubenswrapper[4722]: I0309 14:05:17.190028 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d69359acfc7a4ee7698088a7e0062b3604a5ceb92910cd3b3d69b1a6291dfdcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://687646beb4c6dd3fb9b659a36449a573f4c389b3e98b07127b18516fdc4f2c93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:17Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:17 crc kubenswrapper[4722]: I0309 14:05:17.205840 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h4zw5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b9e29bb-6e51-47ab-a543-b70117ab854d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30f56826c4b193cbd284ce58320073c1b4dc43c9eba976445ba4d4bb7c089960\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17cd3d1e3981e2f2a3b66780f753425c78452e6319fb8eae9935d3bac4456820\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T14:05:09Z\\\",\\\"message\\\":\\\"2026-03-09T14:04:24+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9a389771-bfaa-4805-bbd7-772a0f07bd73\\\\n2026-03-09T14:04:24+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9a389771-bfaa-4805-bbd7-772a0f07bd73 to /host/opt/cni/bin/\\\\n2026-03-09T14:04:24Z [verbose] multus-daemon started\\\\n2026-03-09T14:04:24Z [verbose] Readiness Indicator file check\\\\n2026-03-09T14:05:09Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:05:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdw92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h4zw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:17Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:17 crc kubenswrapper[4722]: I0309 14:05:17.218352 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l5pwg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8424174-764a-470b-ab49-e7369a42de7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d30648bba1b59fd7db5e75a61d6a004246668d148515da5738b77e5122358f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l5pwg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:17Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:17 crc kubenswrapper[4722]: I0309 14:05:17.237096 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee2ed55248b177bc0873ef3c41517401fdb30fc5ce5a545a1ca8c925e4d4e9e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:17Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:17 crc kubenswrapper[4722]: I0309 14:05:17.258500 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jv499" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd1357a3-36ef-4bc8-85bb-91f3c0f42994\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a14e62f50fa8e5403349dc8696e9c7c45a5da1026227612186e7805b1be722f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e56992f54f9fc4387130b74e3aa0f4652a5d6d960da1a5305b8668b8086fae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e56992f54f9fc4387130b74e3aa0f4652a5d6d960da1a5305b8668b8086fae2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c00d804aec8d776f5f2ab9c7a81161c9942a94564d57afe47c633fb3dc2aa7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c00d804aec8d776f5f2ab9c7a81161c9942a94564d57afe47c633fb3dc2aa7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0491f9997970cb050ece19d88d2a7064d4799db3772132f58055e1dbcf2f3a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0491f9997970cb050ece19d88d2a7064d4799db3772132f58055e1dbcf2f3a82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93c7d9a6b0dc8e76abab762ff6aae1975a31fbc80ee3a3cd32ec378298061748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93c7d9a6b0dc8e76abab762ff6aae1975a31fbc80ee3a3cd32ec378298061748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cc5ae61b0fa949ba4bf5aa60d09385399a557f19ef7c99dc43b21fc9db830e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9cc5ae61b0fa949ba4bf5aa60d09385399a557f19ef7c99dc43b21fc9db830e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a88cf1e6b90a3f13f44bbb08482773f007066c795b00cb0177f6cd2ed87cbb52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a88cf1e6b90a3f13f44bbb08482773f007066c795b00cb0177f6cd2ed87cbb52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jv499\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:17Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:17 crc kubenswrapper[4722]: I0309 14:05:17.963442 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5v7ng_2e305619-b3a2-44c9-9e54-e1afa4f43dbf/ovnkube-controller/3.log" Mar 09 14:05:17 crc kubenswrapper[4722]: I0309 14:05:17.965350 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5v7ng_2e305619-b3a2-44c9-9e54-e1afa4f43dbf/ovnkube-controller/2.log" Mar 09 14:05:17 crc kubenswrapper[4722]: I0309 14:05:17.972136 4722 generic.go:334] "Generic (PLEG): container finished" podID="2e305619-b3a2-44c9-9e54-e1afa4f43dbf" containerID="647b164e0544f016e9b6adbb065e7b958f1f7e1446a15f120191cfb35d030220" exitCode=1 Mar 09 14:05:17 crc kubenswrapper[4722]: I0309 14:05:17.972242 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" event={"ID":"2e305619-b3a2-44c9-9e54-e1afa4f43dbf","Type":"ContainerDied","Data":"647b164e0544f016e9b6adbb065e7b958f1f7e1446a15f120191cfb35d030220"} Mar 09 14:05:17 crc kubenswrapper[4722]: I0309 14:05:17.972333 4722 scope.go:117] "RemoveContainer" containerID="10ef5c0af2e49f88ea7089e2d4bdc701ce1c068e25ef30012e0b668aa9d4e8cc" Mar 09 14:05:17 crc kubenswrapper[4722]: I0309 14:05:17.982126 4722 scope.go:117] "RemoveContainer" containerID="647b164e0544f016e9b6adbb065e7b958f1f7e1446a15f120191cfb35d030220" Mar 09 14:05:17 crc kubenswrapper[4722]: E0309 14:05:17.984678 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-5v7ng_openshift-ovn-kubernetes(2e305619-b3a2-44c9-9e54-e1afa4f43dbf)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" podUID="2e305619-b3a2-44c9-9e54-e1afa4f43dbf" Mar 09 14:05:18 crc kubenswrapper[4722]: I0309 14:05:18.011172 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee2ed55248b177bc0873ef3c41517401fdb30fc5ce5a545a1ca8c925e4d4e9e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:18Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:18 crc kubenswrapper[4722]: I0309 14:05:18.040483 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jv499" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd1357a3-36ef-4bc8-85bb-91f3c0f42994\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a14e62f50fa8e5403349dc8696e9c7c45a5da1026227612186e7805b1be722f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e56992f54f9fc4387130b74e3aa0f4652a5d6d960da1a5305b8668b8086fae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e56992f54f9fc4387130b74e3aa0f4652a5d6d960da1a5305b8668b8086fae2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/sec
rets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c00d804aec8d776f5f2ab9c7a81161c9942a94564d57afe47c633fb3dc2aa7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c00d804aec8d776f5f2ab9c7a81161c9942a94564d57afe47c633fb3dc2aa7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0491f9997970cb050ece19d88d2a7064d4799db3772132f58055e1dbcf2f3a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0491f9997970cb050ece19d88d2a7064d4799db3772132f58055e1dbcf2f3a82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93c7d9a6b0dc8e76abab762ff6aae1975a31fbc80ee3a3cd32ec378298061748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93c7d9a6b0dc8e76abab762ff6aae1975a31fbc80ee3a3cd32ec378298061748\\\",\\\"ex
itCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cc5ae61b0fa949ba4bf5aa60d09385399a557f19ef7c99dc43b21fc9db830e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9cc5ae61b0fa949ba4bf5aa60d09385399a557f19ef7c99dc43b21fc9db830e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a88cf1e6b90a3f13f44bbb08482773f007066c795b00cb0177f6cd2ed87cbb52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a88cf1e6b90a3f13f44bbb08482773f007066c795b00cb0177f6cd2ed87cbb52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jv499\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-09T14:05:18Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:18 crc kubenswrapper[4722]: I0309 14:05:18.055655 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l5pwg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8424174-764a-470b-ab49-e7369a42de7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d30648bba1b59fd7db5e75a61d6a004246668d148515da5738b77e5122358f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l5pwg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:18Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:18 crc kubenswrapper[4722]: I0309 14:05:18.067401 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:18Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:18 crc kubenswrapper[4722]: I0309 14:05:18.080819 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-72xb4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"43e3c934-6aa7-4c96-a3a6-378f7931fc2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcf5e49245fd6051b12465a36b27f657b4923a62f88d61fe0fac971c4a2c9197\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lfx7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-72xb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:18Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:18 crc kubenswrapper[4722]: I0309 14:05:18.099761 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dac2aaf5-653b-4b2a-8efe-ed26bac8d648\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf5b268d6a355ae3a2c40981769412124b85cdfd99364ed58e5163a744f5f83c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q6db\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f451493fa6d86f12c1291c0fe262bcbe4b62cef45c8dda696bb48fa3ded2eeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q6db\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hjrrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:18Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:18 crc kubenswrapper[4722]: I0309 14:05:18.113287 4722 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/network-metrics-daemon-pvvhj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5dbc0be-527e-4f70-b185-3a10b1b11a75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pvvhj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:18Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:18 crc kubenswrapper[4722]: I0309 14:05:18.135904 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36d472dc-6063-4ddf-9b31-68a6a23f365c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:03:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:03:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68bead726a61890d6212098f420a58ca85c9d97bb2efe57b314aff215c26169c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a07bf846a5613ee835b70c1915ee46d1e45185f5ff81096ec2e4dd4238f35f3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T14:03:39Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0309 14:03:09.643545 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0309 14:03:09.644916 1 observer_polling.go:159] Starting file observer\\\\nI0309 14:03:09.646001 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 14:03:09.646843 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 14:03:35.207688 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0309 14:03:39.212600 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 14:03:39.212719 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:03:09Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff66d43b6e872eed24aacaf3ea882c7148b6ef62bbe5b7bf10ede5d2691a3680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41c6395bf5f6006036c3d58919ea6ea976b56827cac8ae369b10dd4037d29153\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03fa5f2f3d3dc4ebaf6b5db02fe2036caaf7e0de7197cb5d3409dbbae676a338\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-con
troller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:18Z is after 2025-08-24T17:21:41Z"
Mar 09 14:05:18 crc kubenswrapper[4722]: I0309 14:05:18.149175 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pvvhj"
Mar 09 14:05:18 crc kubenswrapper[4722]: I0309 14:05:18.149266 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 09 14:05:18 crc kubenswrapper[4722]: I0309 14:05:18.149321 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 09 14:05:18 crc kubenswrapper[4722]: I0309 14:05:18.149351 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 14:05:18 crc kubenswrapper[4722]: E0309 14:05:18.149456 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pvvhj" podUID="f5dbc0be-527e-4f70-b185-3a10b1b11a75"
Mar 09 14:05:18 crc kubenswrapper[4722]: E0309 14:05:18.149652 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 09 14:05:18 crc kubenswrapper[4722]: E0309 14:05:18.149762 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 14:05:18 crc kubenswrapper[4722]: E0309 14:05:18.149828 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 14:05:18 crc kubenswrapper[4722]: I0309 14:05:18.161847 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2267a4d7-87b4-43f7-80c7-7a7ffb0327fc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04169d9b8359e26cc9807e27fb9a5d54ca0946aaade294c5bcc272a7d21c4db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09fa62e7293a50fb3f6bdaf37061cbe5e95a830c8f7337660ce73138dbcaa641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31ba786781fcdb781f5fa45276e29e94756
863d2398109d91e81a52f6f88bee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dff721159b382bcfdde6f406a328ef2ee9264b35a8d8baeb7a3c6d31a60ff3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63769aca40f721bec0716427145a3ceea0595836c03d31c98f6d45ff14de68be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6625c60c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e\\\",\\\"image\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6625c60c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:18Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:18 crc kubenswrapper[4722]: I0309 14:05:18.183541 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0965e24b-526e-4842-ac1f-eca7a765355d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56c76a5e8718243a17d8d9fd9a8dcf6083c2ccb65b4816926dca08a89e465728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3836660bf646052991965270cd69b8b8237b5c4bd698a73789b050d23e6877\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dbef38ff5e714c92fc8a86f1e3c64adf606cb198145f884a000f8dee200fcf2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28fcbb961b20cd0cbc1e8548492168a0fe9920b7b98c7bfeffe34727c299030e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T14:03:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 14:03:46.681607 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 14:03:46.681849 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 14:03:46.682597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2315154617/tls.crt::/tmp/serving-cert-2315154617/tls.key\\\\\\\"\\\\nI0309 14:03:46.866095 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 14:03:46.869876 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 14:03:46.869896 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 14:03:46.870280 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 14:03:46.870293 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 14:03:46.877096 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 14:03:46.877117 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 14:03:46.877121 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877135 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 14:03:46.877139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 14:03:46.877143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 14:03:46.877146 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 14:03:46.878419 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:03:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fbabc5ec2c100150ce886c06c3bd78cc5896c8a06ac9d262eeb552a8a8bb5d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:18Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:18 crc kubenswrapper[4722]: I0309 14:05:18.202605 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:18Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:18 crc kubenswrapper[4722]: I0309 14:05:18.216622 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a834db9e-b389-4bac-bfe8-df204ff42b2e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:03:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:03:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5c374ffb7928a20af6f154bfca0c494d692f1f6f55eb8de5d4d5db79a355013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f502c2ee0b77f66f4b32b7d3173eb71bd6d34c31724e8032a0200dac6fd62b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a48ba8f8443935496fa8f3a7e262723b19515f3adc95847d64236f956f91cc20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97aabe30af8bc4ff72fbac7f4fbf7804aff61f8d2fe99753558e7e40f348867\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a97aabe30af8bc4ff72fbac7f4fbf7804aff61f8d2fe99753558e7e40f348867\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:18Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:18 crc kubenswrapper[4722]: I0309 14:05:18.230972 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:18Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:18 crc kubenswrapper[4722]: I0309 14:05:18.245195 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69e74824fe242fd2d54bed9734f33823697d7651043d9c97bac6ea5490625a0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:18Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:18 crc kubenswrapper[4722]: I0309 14:05:18.259178 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d69359acfc7a4ee7698088a7e0062b3604a5ceb92910cd3b3d69b1a6291dfdcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://687646beb4c6dd3fb9b659a36449a573f4c389b3e98b07127b18516fdc4f2c93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:18Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:18 crc kubenswrapper[4722]: I0309 14:05:18.280583 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h4zw5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b9e29bb-6e51-47ab-a543-b70117ab854d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30f56826c4b193cbd284ce58320073c1b4dc43c9eba976445ba4d4bb7c089960\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17cd3d1e3981e2f2a3b66780f753425c78452e6319fb8eae9935d3bac4456820\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T14:05:09Z\\\",\\\"message\\\":\\\"2026-03-09T14:04:24+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9a389771-bfaa-4805-bbd7-772a0f07bd73\\\\n2026-03-09T14:04:24+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9a389771-bfaa-4805-bbd7-772a0f07bd73 to /host/opt/cni/bin/\\\\n2026-03-09T14:04:24Z [verbose] multus-daemon started\\\\n2026-03-09T14:04:24Z [verbose] Readiness Indicator file check\\\\n2026-03-09T14:05:09Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:05:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdw92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h4zw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:18Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:18 crc kubenswrapper[4722]: I0309 14:05:18.304341 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1fc80c43be93eeb2f9805bf314c8430f52062d0ce5d0fa1830e7e0c27e674e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f969085ae62d82dbe1f0f8f5d459c4a1bc7d1ae0a459da484f11dd6ae5fdd418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5765145caac69454a9a8e8de34d05cb1da06924dd365f3e74f52461be5027ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef1c5a3944e8693d8c68a4108c23851352407d0fac7fb7a03763d9084f117dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0106fa808804a96ca8247096399e3d04eb082b2691b1f39df3db9f6c630114c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e737417baac512fab2089f5da12e7c47ef06eae8df7803ad6b248505ff3d18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-s
ocket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://647b164e0544f016e9b6adbb065e7b958f1f7e1446a15f120191cfb35d030220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10ef5c0af2e49f88ea7089e2d4bdc701ce1c068e25ef30012e0b668aa9d4e8cc\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T14:04:47Z\\\",\\\"message\\\":\\\"operator.openshift.io/daemonset-dns: default,},ClusterIP:10.217.4.10,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.10],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0309 14:04:47.142840 7031 ovnkube.go:599] Stopped ovnkube\\\\nI0309 14:04:47.141878 7031 services_controller.go:360] Finished syncing service metrics on namespace openshift-apiserver-operator for network=default : 1.907253ms\\\\nI0309 14:04:47.142928 7031 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0309 14:04:47.142980 7031 services_controller.go:356] Processing sync for service openshift-operator-lifecycle-manager/olm-operator-metrics for network=default\\\\nF0309 14:04:47.143018 7031 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initia\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:46Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://647b164e0544f016e9b6adbb065e7b958f1f7e1446a15f120191cfb35d030220\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T14:05:17Z\\\",\\\"message\\\":\\\"/client/informers/externalversions/factory.go:141\\\\nI0309 14:05:17.105639 7362 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 14:05:17.105690 7362 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0309 14:05:17.105785 7362 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 14:05:17.106267 7362 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0309 14:05:17.106318 7362 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0309 14:05:17.106413 7362 factory.go:656] Stopping watch factory\\\\nI0309 14:05:17.106445 7362 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0309 14:05:17.106456 7362 handler.go:208] Removed *v1.Node event handler 2\\\\nI0309 
14:05:17.119618 7362 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0309 14:05:17.119648 7362 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0309 14:05:17.119707 7362 ovnkube.go:599] Stopped ovnkube\\\\nI0309 14:05:17.119735 7362 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0309 14:05:17.119831 7362 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:05:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f5af077ae5b947194e5dba0542d053890a589582e415e1b41a932bdab6632e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/
serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5v7ng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:18Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:18 crc kubenswrapper[4722]: I0309 14:05:18.318376 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xsft8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e14085d-df3c-4218-aa43-0553786b8867\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fb0f733e44fc7a699961d799267ca3c01e65a62b29d74e6ae682bccba9e5281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqpsw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://435bcac0d223a0ff9431ac5bc5be017aa95d2e567d991a1f4edc0f79559904e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqpsw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xsft8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:18Z is after 2025-08-24T17:21:41Z" Mar 09 
14:05:18 crc kubenswrapper[4722]: I0309 14:05:18.729660 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:05:18 crc kubenswrapper[4722]: I0309 14:05:18.729741 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:05:18 crc kubenswrapper[4722]: I0309 14:05:18.729762 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:05:18 crc kubenswrapper[4722]: I0309 14:05:18.729793 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:05:18 crc kubenswrapper[4722]: I0309 14:05:18.729816 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:05:18Z","lastTransitionTime":"2026-03-09T14:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:05:18 crc kubenswrapper[4722]: E0309 14:05:18.777644 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:05:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:05:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:05:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:05:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3321b793-006a-42f5-9c18-7f48ed9bad15\\\",\\\"systemUUID\\\":\\\"70de6d16-940c-46da-be51-a9b50262dda2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:18Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:18 crc kubenswrapper[4722]: I0309 14:05:18.784011 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:05:18 crc kubenswrapper[4722]: I0309 14:05:18.784368 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 09 14:05:18 crc kubenswrapper[4722]: I0309 14:05:18.784746 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:05:18 crc kubenswrapper[4722]: I0309 14:05:18.784969 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:05:18 crc kubenswrapper[4722]: I0309 14:05:18.785146 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:05:18Z","lastTransitionTime":"2026-03-09T14:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:05:18 crc kubenswrapper[4722]: E0309 14:05:18.809880 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:05:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:05:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:05:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:05:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3321b793-006a-42f5-9c18-7f48ed9bad15\\\",\\\"systemUUID\\\":\\\"70de6d16-940c-46da-be51-a9b50262dda2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:18Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:18 crc kubenswrapper[4722]: I0309 14:05:18.821163 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:05:18 crc kubenswrapper[4722]: I0309 14:05:18.821476 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 09 14:05:18 crc kubenswrapper[4722]: I0309 14:05:18.821497 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:05:18 crc kubenswrapper[4722]: I0309 14:05:18.821522 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:05:18 crc kubenswrapper[4722]: I0309 14:05:18.821538 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:05:18Z","lastTransitionTime":"2026-03-09T14:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:05:18 crc kubenswrapper[4722]: E0309 14:05:18.841496 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:05:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:05:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:05:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:05:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3321b793-006a-42f5-9c18-7f48ed9bad15\\\",\\\"systemUUID\\\":\\\"70de6d16-940c-46da-be51-a9b50262dda2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:18Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:18 crc kubenswrapper[4722]: I0309 14:05:18.847915 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:05:18 crc kubenswrapper[4722]: I0309 14:05:18.847988 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 09 14:05:18 crc kubenswrapper[4722]: I0309 14:05:18.848007 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:05:18 crc kubenswrapper[4722]: I0309 14:05:18.848030 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:05:18 crc kubenswrapper[4722]: I0309 14:05:18.848046 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:05:18Z","lastTransitionTime":"2026-03-09T14:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:05:18 crc kubenswrapper[4722]: E0309 14:05:18.865927 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:05:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:05:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:05:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:05:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3321b793-006a-42f5-9c18-7f48ed9bad15\\\",\\\"systemUUID\\\":\\\"70de6d16-940c-46da-be51-a9b50262dda2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:18Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:18 crc kubenswrapper[4722]: I0309 14:05:18.872408 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:05:18 crc kubenswrapper[4722]: I0309 14:05:18.872481 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
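Both failed attempts above report the same root cause: the TLS handshake with the network-node-identity admission webhook on 127.0.0.1:9743 rejects the webhook's serving certificate because the node's clock (2026-03-09T14:05:18Z) is past the certificate's NotAfter date (2025-08-24T17:21:41Z). Below is a minimal Go sketch of the validity-window check that the x509 verifier performs during the handshake and that produces this class of error; the certificate file path is hypothetical and not taken from this log.

package main

// Minimal sketch of the x509 validity-window check behind the handshake
// error "x509: certificate has expired or is not yet valid".
// The file path below is hypothetical, not taken from the log.

import (
	"crypto/x509"
	"encoding/pem"
	"fmt"
	"log"
	"os"
	"time"
)

func main() {
	pemBytes, err := os.ReadFile("webhook-serving-cert.pem") // hypothetical path
	if err != nil {
		log.Fatal(err)
	}
	block, _ := pem.Decode(pemBytes)
	if block == nil {
		log.Fatal("no PEM block found")
	}
	cert, err := x509.ParseCertificate(block.Bytes)
	if err != nil {
		log.Fatal(err)
	}
	now := time.Now().UTC()
	switch {
	case now.After(cert.NotAfter):
		// Matches the log: "current time ... is after 2025-08-24T17:21:41Z"
		fmt.Printf("certificate has expired: current time %s is after %s\n",
			now.Format(time.RFC3339), cert.NotAfter.Format(time.RFC3339))
	case now.Before(cert.NotBefore):
		fmt.Printf("certificate is not yet valid: current time %s is before %s\n",
			now.Format(time.RFC3339), cert.NotBefore.Format(time.RFC3339))
	default:
		fmt.Println("certificate is within its validity window")
	}
}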
event="NodeHasNoDiskPressure" Mar 09 14:05:18 crc kubenswrapper[4722]: I0309 14:05:18.872509 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:05:18 crc kubenswrapper[4722]: I0309 14:05:18.872545 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:05:18 crc kubenswrapper[4722]: I0309 14:05:18.872568 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:05:18Z","lastTransitionTime":"2026-03-09T14:05:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:05:18 crc kubenswrapper[4722]: E0309 14:05:18.895129 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:05:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:05:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:05:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-09T14:05:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-09T14:05:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3321b793-006a-42f5-9c18-7f48ed9bad15\\\",\\\"systemUUID\\\":\\\"70de6d16-940c-46da-be51-a9b50262dda2\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:18Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:18 crc kubenswrapper[4722]: E0309 14:05:18.895466 4722 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 09 14:05:18 crc kubenswrapper[4722]: I0309 14:05:18.980287 4722 log.go:25] "Finished parsing log file" 
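At this point the retry budget is exhausted: the upstream kubelet caps status-patch attempts per sync with a small constant (nodeStatusUpdateRetry, historically 5) before emitting the terminal "exceeds retry count" error seen above, and the next sync period starts the cycle over. A rough Go sketch of that control flow follows, under the assumption of a fixed attempt budget; tryPatchNodeStatus is a hypothetical stand-in for the PATCH call against the API server, not the real implementation.

package main

// Rough sketch of the kubelet's node-status retry shape, not the actual
// implementation: a fixed attempt budget, then a terminal error matching
// the "exceeds retry count" log line above.

import (
	"errors"
	"fmt"
)

const nodeStatusUpdateRetry = 5 // assumed budget; upstream uses a constant of this name

// tryPatchNodeStatus is a hypothetical stand-in. In this log every attempt
// fails identically, because the webhook's serving certificate is expired.
func tryPatchNodeStatus(attempt int) error {
	return errors.New(`failed calling webhook "node.network-node-identity.openshift.io": ` +
		`x509: certificate has expired or is not yet valid`)
}

func updateNodeStatus() error {
	for i := 0; i < nodeStatusUpdateRetry; i++ {
		if err := tryPatchNodeStatus(i); err != nil {
			fmt.Printf("Error updating node status, will retry: %v\n", err)
			continue
		}
		return nil
	}
	return fmt.Errorf("update node status exceeds retry count")
}

func main() {
	if err := updateNodeStatus(); err != nil {
		fmt.Println("Unable to update node status:", err)
	}
}

Mar 09 14:05:18 crc kubenswrapper[4722]: I0309 14:05:18.980287 4722 log.go:25] "Finished parsing log file"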
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5v7ng_2e305619-b3a2-44c9-9e54-e1afa4f43dbf/ovnkube-controller/3.log" Mar 09 14:05:18 crc kubenswrapper[4722]: I0309 14:05:18.988176 4722 scope.go:117] "RemoveContainer" containerID="647b164e0544f016e9b6adbb065e7b958f1f7e1446a15f120191cfb35d030220" Mar 09 14:05:18 crc kubenswrapper[4722]: E0309 14:05:18.989994 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-5v7ng_openshift-ovn-kubernetes(2e305619-b3a2-44c9-9e54-e1afa4f43dbf)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" podUID="2e305619-b3a2-44c9-9e54-e1afa4f43dbf" Mar 09 14:05:19 crc kubenswrapper[4722]: I0309 14:05:19.009787 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a834db9e-b389-4bac-bfe8-df204ff42b2e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:03:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:03:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5c374ffb7928a20af6f154bfca0c494d692f1f6f55eb8de5d4d5db79a355013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f502c2ee0b77f66f4b32b7d3173eb71bd6d34c31724e8032a0200dac6fd62b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a48ba8f8443935496fa8f3a7e262723b19515f3adc95847d64236
f956f91cc20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97aabe30af8bc4ff72fbac7f4fbf7804aff61f8d2fe99753558e7e40f348867\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a97aabe30af8bc4ff72fbac7f4fbf7804aff61f8d2fe99753558e7e40f348867\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:19Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:19 crc kubenswrapper[4722]: I0309 14:05:19.028632 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:19Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:19 crc kubenswrapper[4722]: I0309 14:05:19.046170 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69e74824fe242fd2d54bed9734f33823697d7651043d9c97bac6ea5490625a0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:19Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:19 crc kubenswrapper[4722]: I0309 14:05:19.060163 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d69359acfc7a4ee7698088a7e0062b3604a5ceb92910cd3b3d69b1a6291dfdcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://687646beb4c6dd3fb9b659a36449a573f4c389b3e98b07127b18516fdc4f2c93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:19Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:19 crc kubenswrapper[4722]: I0309 14:05:19.075673 
4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h4zw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b9e29bb-6e51-47ab-a543-b70117ab854d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30f56826c4b193cbd284ce58320073c1b4dc43c9eba976445ba4d4bb7c089960\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17cd3d1e3981e2f2a3b66780f753425c78452e6319fb8eae9935d3bac4456820\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T14:05:09Z\\\",\\\"message\\\":\\\"2026-03-09T14:04:24+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9a389771-bfaa-4805-bbd7-772a0f07bd73\\\\n2026-03-09T14:04:24+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9a389771-bfaa-4805-bbd7-772a0f07bd73 to /host/opt/cni/bin/\\\\n2026-03-09T14:04:24Z [verbose] multus-daemon started\\\\n2026-03-09T14:04:24Z [verbose] Readiness Indicator file check\\\\n2026-03-09T14:05:09Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:05:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdw92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h4zw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:19Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:19 crc kubenswrapper[4722]: I0309 14:05:19.097353 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1fc80c43be93eeb2f9805bf314c8430f52062d0ce5d0fa1830e7e0c27e674e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f969085ae62d82dbe1f0f8f5d459c4a1bc7d1ae0a459da484f11dd6ae5fdd418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5765145caac69454a9a8e8de34d05cb1da06924dd365f3e74f52461be5027ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef1c5a3944e8693d8c68a4108c23851352407d0fac7fb7a03763d9084f117dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0106fa808804a96ca8247096399e3d04eb082b2691b1f39df3db9f6c630114c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e737417baac512fab2089f5da12e7c47ef06eae8df7803ad6b248505ff3d18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-s
ocket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://647b164e0544f016e9b6adbb065e7b958f1f7e1446a15f120191cfb35d030220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://647b164e0544f016e9b6adbb065e7b958f1f7e1446a15f120191cfb35d030220\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T14:05:17Z\\\",\\\"message\\\":\\\"/client/informers/externalversions/factory.go:141\\\\nI0309 14:05:17.105639 7362 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 14:05:17.105690 7362 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0309 14:05:17.105785 7362 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 14:05:17.106267 7362 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0309 14:05:17.106318 7362 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0309 14:05:17.106413 7362 factory.go:656] Stopping watch factory\\\\nI0309 14:05:17.106445 7362 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0309 14:05:17.106456 7362 handler.go:208] Removed *v1.Node event handler 2\\\\nI0309 14:05:17.119618 7362 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0309 14:05:17.119648 7362 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0309 14:05:17.119707 7362 ovnkube.go:599] Stopped ovnkube\\\\nI0309 14:05:17.119735 7362 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0309 14:05:17.119831 7362 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:05:16Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5v7ng_openshift-ovn-kubernetes(2e305619-b3a2-44c9-9e54-e1afa4f43dbf)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f5af077ae5b947194e5dba0542d053890a589582e415e1b41a932bdab6632e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5v7ng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:19Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:19 crc kubenswrapper[4722]: I0309 14:05:19.110754 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xsft8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e14085d-df3c-4218-aa43-0553786b8867\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fb0f733e44fc7a699961d799267ca3c01e65a62b29d74e6ae682bccba9e5281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqpsw
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://435bcac0d223a0ff9431ac5bc5be017aa95d2e567d991a1f4edc0f79559904e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqpsw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xsft8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:19Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:19 crc kubenswrapper[4722]: I0309 14:05:19.125384 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee2ed55248b177bc0873ef3c41517401fdb30fc5ce5a545a1ca8c925e4d4e9e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:19Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:19 crc kubenswrapper[4722]: I0309 14:05:19.147371 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jv499" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd1357a3-36ef-4bc8-85bb-91f3c0f42994\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a14e62f50fa8e5403349dc8696e9c7c45a5da1026227612186e7805b1be722f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e56992f54f9fc4387130b74e3aa0f4652a5d6d960da1a5305b8668b8086fae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e56992f54f9fc4387130b74e3aa0f4652a5d6d960da1a5305b8668b8086fae2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c00d804aec8d776f5f2ab9c7a81161c9942a94564d57afe47c633fb3dc2aa7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c00d804aec8d776f5f2ab9c7a81161c9942a94564d57afe47c633fb3dc2aa7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0491f9997970cb050ece19d88d2a7064d4799db3772132f58055e1dbcf2f3a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0491f9997970cb050ece19d88d2a7064d4799db3772132f58055e1dbcf2f3a82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93c7d9a6b0dc8e76abab762ff6aae1975a31fbc80ee3a3cd32ec378298061748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93c7d9a6b0dc8e76abab762ff6aae1975a31fbc80ee3a3cd32ec378298061748\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cc5ae61b0fa949ba4bf5aa60d09385399a557f19ef7c99dc43b21fc9db830e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9cc5ae61b0fa949ba4bf5aa60d09385399a557f19ef7c99dc43b21fc9db830e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a88cf1e6b90a3f13f44bbb08482773f007066c795b00cb0177f6cd2ed87cbb52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a88cf1e6b90a3f13f44bbb08482773f007066c795b00cb0177f6cd2ed87cbb52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jv499\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:19Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:19 crc kubenswrapper[4722]: I0309 14:05:19.159686 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l5pwg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8424174-764a-470b-ab49-e7369a42de7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d30648bba1b59fd7db5e75a61d6a004246668d148515da5738b77e5122358f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l5pwg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:19Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:19 crc kubenswrapper[4722]: I0309 14:05:19.174119 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:19Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:19 crc kubenswrapper[4722]: I0309 14:05:19.186833 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-72xb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43e3c934-6aa7-4c96-a3a6-378f7931fc2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcf5e49245fd6051b12465a36b27f657b4923a62f88d61fe0fac971c4a2c9197\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lfx7x\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-72xb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:19Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:19 crc kubenswrapper[4722]: I0309 14:05:19.207077 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dac2aaf5-653b-4b2a-8efe-ed26bac8d648\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf5b268d6a355ae3a2c40981769412124b85cdfd99364ed58e5163a744f5f83c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q6db\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f451493fa6d86f12c1291c0fe262bcbe4b62cef45c8dda696bb48fa3ded2eeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q6db\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hjrrb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:19Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:19 crc kubenswrapper[4722]: I0309 14:05:19.221310 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pvvhj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5dbc0be-527e-4f70-b185-3a10b1b11a75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:34Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-pvvhj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:19Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:19 crc kubenswrapper[4722]: I0309 14:05:19.236434 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"36d472dc-6063-4ddf-9b31-68a6a23f365c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:03:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:03:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68bead726a61890d6212098f420a58ca85c9d97bb2efe57b314aff215c26169c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a07bf846a5613ee835b70c1915ee46d1e45185f5ff81096ec2e4dd4238f35f3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T14:03:39Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0309 14:03:09.643545 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0309 14:03:09.644916 1 observer_polling.go:159] Starting file observer\\\\nI0309 14:03:09.646001 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 14:03:09.646843 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 14:03:35.207688 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0309 14:03:39.212600 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 14:03:39.212719 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:03:09Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff66d43b6e872eed24aacaf3ea882c7148b6ef62bbe5b7bf10ede5d2691a3680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41c6395bf5f6006036c3d58919ea6ea976b56827cac8ae369b10dd4037d29153\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03fa5f2f3d3dc4ebaf6b5db02fe2036caaf7e0de7197cb5d3409dbbae676a338\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-con
troller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:19Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:19 crc kubenswrapper[4722]: I0309 14:05:19.262513 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2267a4d7-87b4-43f7-80c7-7a7ffb0327fc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04169d9b8359e26cc9807e27fb9a5d54ca0946aaade294c5bcc272a7d21c4db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09fa62e7293a50fb3f6bdaf37061cbe5e95a830c8f7337660ce73138dbcaa641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCoun
t\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31ba786781fcdb781f5fa45276e29e94756863d2398109d91e81a52f6f88bee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dff721159b382bcfdde6f406a328ef2ee9264b35a8d8baeb7a3c6d31a60ff3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63769aca40f721bec0716427145a3ceea0595836c03d31c98f6d45ff14de68be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"c
ontainerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6625c60c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6625c60c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:19Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:19 crc kubenswrapper[4722]: I0309 14:05:19.288317 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0965e24b-526e-4842-ac1f-eca7a765355d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56c76a5e8718243a17d8d9fd9a8dcf6083c2ccb65b4816926dca08a89e465728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3836660bf646052991965270cd69b8b8237b5c4bd698a73789b050d23e6877\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dbef38ff5e714c92fc8a86f1e3c64adf606cb198145f884a000f8dee200fcf2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28fcbb961b20cd0cbc1e8548492168a0fe9920b7b98c7bfeffe34727c299030e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T14:03:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 14:03:46.681607 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 14:03:46.681849 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 14:03:46.682597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2315154617/tls.crt::/tmp/serving-cert-2315154617/tls.key\\\\\\\"\\\\nI0309 14:03:46.866095 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 14:03:46.869876 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 14:03:46.869896 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 14:03:46.870280 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 14:03:46.870293 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 14:03:46.877096 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 14:03:46.877117 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 14:03:46.877121 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877135 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 14:03:46.877139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 14:03:46.877143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 14:03:46.877146 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 14:03:46.878419 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:03:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fbabc5ec2c100150ce886c06c3bd78cc5896c8a06ac9d262eeb552a8a8bb5d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:19Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:19 crc kubenswrapper[4722]: I0309 14:05:19.311418 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:19Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:20 crc kubenswrapper[4722]: I0309 14:05:20.149063 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 14:05:20 crc kubenswrapper[4722]: I0309 14:05:20.149308 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pvvhj" Mar 09 14:05:20 crc kubenswrapper[4722]: I0309 14:05:20.149323 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 14:05:20 crc kubenswrapper[4722]: I0309 14:05:20.149535 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 14:05:20 crc kubenswrapper[4722]: E0309 14:05:20.150329 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pvvhj" podUID="f5dbc0be-527e-4f70-b185-3a10b1b11a75" Mar 09 14:05:20 crc kubenswrapper[4722]: E0309 14:05:20.150476 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 14:05:20 crc kubenswrapper[4722]: E0309 14:05:20.150640 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 14:05:20 crc kubenswrapper[4722]: E0309 14:05:20.150767 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 14:05:20 crc kubenswrapper[4722]: I0309 14:05:20.174482 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:20Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:20 crc kubenswrapper[4722]: I0309 14:05:20.202810 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:20Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:20 crc kubenswrapper[4722]: I0309 14:05:20.223645 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-72xb4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"43e3c934-6aa7-4c96-a3a6-378f7931fc2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcf5e49245fd6051b12465a36b27f657b4923a62f88d61fe0fac971c4a2c9197\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lfx7x\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:20Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-72xb4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:20Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:20 crc kubenswrapper[4722]: I0309 14:05:20.242815 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dac2aaf5-653b-4b2a-8efe-ed26bac8d648\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bf5b268d6a355ae3a2c40981769412124b85cdfd99364ed58e5163a744f5f83c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q6db\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f451493fa6d86f12c1291c0fe262bcbe4b62cef45c8dda696bb48fa3ded2eeb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5q6db\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-hjrrb\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:20Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:20 crc kubenswrapper[4722]: E0309 14:05:20.256300 4722 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 14:05:20 crc kubenswrapper[4722]: I0309 14:05:20.262988 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-pvvhj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f5dbc0be-527e-4f70-b185-3a10b1b11a75\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vz4kf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:34Z\\\"}}\" for pod 
\"openshift-multus\"/\"network-metrics-daemon-pvvhj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:20Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:20 crc kubenswrapper[4722]: I0309 14:05:20.282685 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"36d472dc-6063-4ddf-9b31-68a6a23f365c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:03:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:03:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68bead726a61890d6212098f420a58ca85c9d97bb2efe57b314aff215c26169c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a07bf846a5613ee835b70c1915ee46d1e45185f5ff81096ec2e4dd4238f35f3\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T14:03:39Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0309 14:03:09.643545 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0309 14:03:09.644916 1 observer_polling.go:159] Starting file observer\\\\nI0309 14:03:09.646001 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0309 14:03:09.646843 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0309 14:03:35.207688 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nI0309 14:03:39.212600 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0309 14:03:39.212719 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:03:09Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:03:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff66d43b6e872eed24aacaf3ea882c7148b6ef62bbe5b7bf10ede5d2691a3680\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://41c6395bf5f6006036c3d58919ea6ea976b56827cac8ae369b10dd4037d29153\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03fa5f2f3d3dc4ebaf6b5db02fe2036caaf7e0de7197cb5d3409dbbae676a338\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-con
troller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:20Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:20 crc kubenswrapper[4722]: I0309 14:05:20.319584 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2267a4d7-87b4-43f7-80c7-7a7ffb0327fc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://04169d9b8359e26cc9807e27fb9a5d54ca0946aaade294c5bcc272a7d21c4db6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09fa62e7293a50fb3f6bdaf37061cbe5e95a830c8f7337660ce73138dbcaa641\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCoun
t\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31ba786781fcdb781f5fa45276e29e94756863d2398109d91e81a52f6f88bee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2dff721159b382bcfdde6f406a328ef2ee9264b35a8d8baeb7a3c6d31a60ff3b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://63769aca40f721bec0716427145a3ceea0595836c03d31c98f6d45ff14de68be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"c
ontainerID\\\":\\\"cri-o://928d88a2e78d66b996b4b4f8c09878f78dd7fbdb000e2ab6d43ac77c999a748f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e6625c60c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e6625c60c717ffc2b85b70374bc50dfae41b04d6362db0c7daf387f31c25862e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21b62fa3b3144d1d24dbb3a9e5f203e4c692377a54fb2322ebe6b927aa67403b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:20Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:20 crc kubenswrapper[4722]: I0309 14:05:20.342756 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0965e24b-526e-4842-ac1f-eca7a765355d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56c76a5e8718243a17d8d9fd9a8dcf6083c2ccb65b4816926dca08a89e465728\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be3836660bf646052991965270cd69b8b8237b5c4bd698a73789b050d23e6877\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dbef38ff5e714c92fc8a86f1e3c64adf606cb198145f884a000f8dee200fcf2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28fcbb961b20cd0cbc1e8548492168a0fe9920b7b98c7bfeffe34727c299030e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-09T14:03:46Z\\\",\\\"message\\\":\\\"le observer\\\\nW0309 14:03:46.681607 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0309 14:03:46.681849 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0309 14:03:46.682597 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2315154617/tls.crt::/tmp/serving-cert-2315154617/tls.key\\\\\\\"\\\\nI0309 14:03:46.866095 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0309 14:03:46.869876 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0309 14:03:46.869896 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0309 14:03:46.870280 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0309 14:03:46.870293 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0309 14:03:46.877096 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0309 14:03:46.877117 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0309 14:03:46.877121 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877128 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0309 14:03:46.877135 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0309 14:03:46.877139 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0309 14:03:46.877143 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0309 14:03:46.877146 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0309 14:03:46.878419 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:03:46Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2fbabc5ec2c100150ce886c06c3bd78cc5896c8a06ac9d262eeb552a8a8bb5d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:20Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:20 crc kubenswrapper[4722]: I0309 14:05:20.357303 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://69e74824fe242fd2d54bed9734f33823697d7651043d9c97bac6ea5490625a0a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:20Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:20 crc kubenswrapper[4722]: I0309 14:05:20.375044 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a834db9e-b389-4bac-bfe8-df204ff42b2e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:03:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:03:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:02:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5c374ffb7928a20af6f154bfca0c494d692f1f6f55eb8de5d4d5db79a355013\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f502c2ee0b77f66f4b32b7d3173eb71bd6d34c31724e8032a0200dac6fd62b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a48ba8f8443935496fa8f3a7e262723b19515f3adc95847d64236f956f91cc20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:02:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a97aabe30af8bc4ff72fbac7f4fbf7804aff61f8d2fe99753558e7e40f348867\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a97aabe30af8bc4ff72fbac7f4fbf7804aff61f8d2fe99753558e7e40f348867\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:02:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:02:41Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:02:40Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:20Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:20 crc kubenswrapper[4722]: I0309 14:05:20.394520 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:08Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:20Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:20 crc kubenswrapper[4722]: I0309 14:05:20.413861 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xsft8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3e14085d-df3c-4218-aa43-0553786b8867\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8fb0f733e44fc7a699961d799267ca3c01e65a62b29d74e6ae682bccba9e5281\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqpsw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://435bcac0d223a0ff9431ac5bc5be017aa95d2e567d991a1f4edc0f79559904e6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sqpsw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:33Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-xsft8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:20Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:20 crc kubenswrapper[4722]: I0309 14:05:20.438244 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d69359acfc7a4ee7698088a7e0062b3604a5ceb92910cd3b3d69b1a6291dfdcf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://687646beb4c6dd3fb9b659a36449a573f4c389b3e98b07127b18516fdc4f2c93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedA
t\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:20Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:20 crc kubenswrapper[4722]: I0309 14:05:20.453906 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-h4zw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b9e29bb-6e51-47ab-a543-b70117ab854d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:05:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30f56826c4b193cbd284ce58320073c1b4dc43c9eba976445ba4d4bb7c089960\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://17cd3d1e3981e2f2a3b66780f753425c78452e6319fb8eae9935d3bac4456820\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T14:05:09Z\\\",\\\"message\\\":\\\"2026-03-09T14:04:24+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_9a389771-bfaa-4805-bbd7-772a0f07bd73\\\\n2026-03-09T14:04:24+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_9a389771-bfaa-4805-bbd7-772a0f07bd73 to /host/opt/cni/bin/\\\\n2026-03-09T14:04:24Z [verbose] multus-daemon started\\\\n2026-03-09T14:04:24Z [verbose] Readiness Indicator file check\\\\n2026-03-09T14:05:09Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:05:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tdw92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-h4zw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:20Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:20 crc kubenswrapper[4722]: I0309 14:05:20.474399 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1fc80c43be93eeb2f9805bf314c8430f52062d0ce5d0fa1830e7e0c27e674e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f969085ae62d82dbe1f0f8f5d459c4a1bc7d1ae0a459da484f11dd6ae5fdd418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5765145caac69454a9a8e8de34d05cb1da06924dd365f3e74f52461be5027ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",
\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef1c5a3944e8693d8c68a4108c23851352407d0fac7fb7a03763d9084f117dfb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0106fa808804a96ca8247096399e3d04eb082b2691b1f39df3db9f6c630114c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://24e737417baac512fab2089f5da12e7c47ef06eae8df7803ad6b248505ff3d18\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-s
ocket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://647b164e0544f016e9b6adbb065e7b958f1f7e1446a15f120191cfb35d030220\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://647b164e0544f016e9b6adbb065e7b958f1f7e1446a15f120191cfb35d030220\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-09T14:05:17Z\\\",\\\"message\\\":\\\"/client/informers/externalversions/factory.go:141\\\\nI0309 14:05:17.105639 7362 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 14:05:17.105690 7362 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0309 14:05:17.105785 7362 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0309 14:05:17.106267 7362 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0309 14:05:17.106318 7362 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0309 14:05:17.106413 7362 factory.go:656] Stopping watch factory\\\\nI0309 14:05:17.106445 7362 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0309 14:05:17.106456 7362 handler.go:208] Removed *v1.Node event handler 2\\\\nI0309 14:05:17.119618 7362 shared_informer.go:320] Caches are synced for node-tracker-controller\\\\nI0309 14:05:17.119648 7362 services_controller.go:204] Setting up event handlers for services for network=default\\\\nI0309 14:05:17.119707 7362 ovnkube.go:599] Stopped ovnkube\\\\nI0309 14:05:17.119735 7362 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0309 14:05:17.119831 7362 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-09T14:05:16Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-5v7ng_openshift-ovn-kubernetes(2e305619-b3a2-44c9-9e54-e1afa4f43dbf)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f5af077ae5b947194e5dba0542d053890a589582e415e1b41a932bdab6632e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v6nlp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-5v7ng\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:20Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:20 crc kubenswrapper[4722]: I0309 14:05:20.494150 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:09Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ee2ed55248b177bc0873ef3c41517401fdb30fc5ce5a545a1ca8c925e4d4e9e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:20Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:20 crc kubenswrapper[4722]: I0309 14:05:20.514836 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jv499" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd1357a3-36ef-4bc8-85bb-91f3c0f42994\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a14e62f50fa8e5403349dc8696e9c7c45a5da1026227612186e7805b1be722f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e56992f54f9fc4387130b74e3aa0f4652a5d6d960da1a5305b8668b8086fae2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e56992f54f9fc4387130b74e3aa0f4652a5d6d960da1a5305b8668b8086fae2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6c00d804aec8d776f5f2ab9c7a81161c9942a94564d57afe47c633fb3dc2aa7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c00d804aec8d776f5f2ab9c7a81161c9942a94564d57afe47c633fb3dc2aa7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0491f9997970cb050ece19d88d2a7064d4799db3772132f58055e1dbcf2f3a82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0491f9997970cb050ece19d88d2a7064d4799db3772132f58055e1dbcf2f3a82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://93c7d9a6b0dc8e76abab762ff6aae1975a31fbc80ee3a3cd32ec378298061748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93c7d9a6b0dc8e76abab762ff6aae1975a31fbc80ee3a3cd32ec378298061748\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2026-03-09T14:04:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9cc5ae61b0fa949ba4bf5aa60d09385399a557f19ef7c99dc43b21fc9db830e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9cc5ae61b0fa949ba4bf5aa60d09385399a557f19ef7c99dc43b21fc9db830e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a88cf1e6b90a3f13f44bbb08482773f007066c795b00cb0177f6cd2ed87cbb52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a88cf1e6b90a3f13f44bbb08482773f007066c795b00cb0177f6cd2ed87cbb52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-09T14:04:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-09T14:04:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m2lh2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:21Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jv499\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-09T14:05:20Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:20 crc kubenswrapper[4722]: I0309 14:05:20.527801 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-l5pwg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b8424174-764a-470b-ab49-e7369a42de7c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-09T14:04:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d30648bba1b59fd7db5e75a61d6a004246668d148515da5738b77e5122358f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-09T14:04:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-426rr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-09T14:04:27Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-l5pwg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-09T14:05:20Z is after 2025-08-24T17:21:41Z" Mar 09 14:05:21 crc kubenswrapper[4722]: I0309 14:05:21.163170 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 09 14:05:22 crc kubenswrapper[4722]: I0309 14:05:22.148274 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 14:05:22 crc kubenswrapper[4722]: I0309 14:05:22.148332 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pvvhj" Mar 09 14:05:22 crc kubenswrapper[4722]: I0309 14:05:22.148282 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 14:05:22 crc kubenswrapper[4722]: I0309 14:05:22.148279 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 14:05:22 crc kubenswrapper[4722]: E0309 14:05:22.148435 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 14:05:22 crc kubenswrapper[4722]: E0309 14:05:22.148551 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 14:05:22 crc kubenswrapper[4722]: E0309 14:05:22.148626 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pvvhj" podUID="f5dbc0be-527e-4f70-b185-3a10b1b11a75" Mar 09 14:05:22 crc kubenswrapper[4722]: E0309 14:05:22.148670 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 14:05:24 crc kubenswrapper[4722]: I0309 14:05:24.148326 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 14:05:24 crc kubenswrapper[4722]: I0309 14:05:24.148371 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 14:05:24 crc kubenswrapper[4722]: E0309 14:05:24.148918 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 14:05:24 crc kubenswrapper[4722]: I0309 14:05:24.148484 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 14:05:24 crc kubenswrapper[4722]: I0309 14:05:24.148428 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-pvvhj" Mar 09 14:05:24 crc kubenswrapper[4722]: E0309 14:05:24.149056 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 14:05:24 crc kubenswrapper[4722]: E0309 14:05:24.149173 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 14:05:24 crc kubenswrapper[4722]: E0309 14:05:24.149350 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pvvhj" podUID="f5dbc0be-527e-4f70-b185-3a10b1b11a75" Mar 09 14:05:25 crc kubenswrapper[4722]: E0309 14:05:25.257523 4722 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 14:05:26 crc kubenswrapper[4722]: I0309 14:05:26.148814 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 14:05:26 crc kubenswrapper[4722]: I0309 14:05:26.149303 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pvvhj" Mar 09 14:05:26 crc kubenswrapper[4722]: I0309 14:05:26.149157 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 14:05:26 crc kubenswrapper[4722]: E0309 14:05:26.149451 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 14:05:26 crc kubenswrapper[4722]: E0309 14:05:26.149562 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pvvhj" podUID="f5dbc0be-527e-4f70-b185-3a10b1b11a75" Mar 09 14:05:26 crc kubenswrapper[4722]: I0309 14:05:26.150416 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 14:05:26 crc kubenswrapper[4722]: E0309 14:05:26.150459 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 14:05:26 crc kubenswrapper[4722]: E0309 14:05:26.150600 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 14:05:28 crc kubenswrapper[4722]: I0309 14:05:28.148646 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 14:05:28 crc kubenswrapper[4722]: I0309 14:05:28.148738 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pvvhj" Mar 09 14:05:28 crc kubenswrapper[4722]: E0309 14:05:28.148905 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 14:05:28 crc kubenswrapper[4722]: I0309 14:05:28.148674 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 14:05:28 crc kubenswrapper[4722]: I0309 14:05:28.149045 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 14:05:28 crc kubenswrapper[4722]: E0309 14:05:28.149299 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pvvhj" podUID="f5dbc0be-527e-4f70-b185-3a10b1b11a75" Mar 09 14:05:28 crc kubenswrapper[4722]: E0309 14:05:28.149538 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 14:05:28 crc kubenswrapper[4722]: E0309 14:05:28.149948 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 14:05:28 crc kubenswrapper[4722]: I0309 14:05:28.996918 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 09 14:05:28 crc kubenswrapper[4722]: I0309 14:05:28.996979 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 09 14:05:28 crc kubenswrapper[4722]: I0309 14:05:28.996999 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 09 14:05:28 crc kubenswrapper[4722]: I0309 14:05:28.997021 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 09 14:05:28 crc kubenswrapper[4722]: I0309 14:05:28.997038 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-09T14:05:28Z","lastTransitionTime":"2026-03-09T14:05:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 09 14:05:29 crc kubenswrapper[4722]: I0309 14:05:29.073718 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-57l9g"] Mar 09 14:05:29 crc kubenswrapper[4722]: I0309 14:05:29.074367 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-57l9g" Mar 09 14:05:29 crc kubenswrapper[4722]: I0309 14:05:29.077055 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 09 14:05:29 crc kubenswrapper[4722]: I0309 14:05:29.079840 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 09 14:05:29 crc kubenswrapper[4722]: I0309 14:05:29.081140 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 09 14:05:29 crc kubenswrapper[4722]: I0309 14:05:29.081231 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 09 14:05:29 crc kubenswrapper[4722]: I0309 14:05:29.133101 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8c5ef78a-6d63-4e2f-84b8-20d6ad5d16c2-service-ca\") pod \"cluster-version-operator-5c965bbfc6-57l9g\" (UID: \"8c5ef78a-6d63-4e2f-84b8-20d6ad5d16c2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-57l9g" Mar 09 14:05:29 crc kubenswrapper[4722]: I0309 14:05:29.133181 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8c5ef78a-6d63-4e2f-84b8-20d6ad5d16c2-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-57l9g\" (UID: \"8c5ef78a-6d63-4e2f-84b8-20d6ad5d16c2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-57l9g" Mar 09 14:05:29 crc kubenswrapper[4722]: I0309 14:05:29.133341 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/8c5ef78a-6d63-4e2f-84b8-20d6ad5d16c2-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-57l9g\" (UID: \"8c5ef78a-6d63-4e2f-84b8-20d6ad5d16c2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-57l9g" Mar 09 14:05:29 crc kubenswrapper[4722]: I0309 14:05:29.133385 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/8c5ef78a-6d63-4e2f-84b8-20d6ad5d16c2-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-57l9g\" (UID: \"8c5ef78a-6d63-4e2f-84b8-20d6ad5d16c2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-57l9g" Mar 09 14:05:29 crc kubenswrapper[4722]: I0309 14:05:29.133421 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8c5ef78a-6d63-4e2f-84b8-20d6ad5d16c2-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-57l9g\" (UID: \"8c5ef78a-6d63-4e2f-84b8-20d6ad5d16c2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-57l9g" Mar 09 14:05:29 crc kubenswrapper[4722]: I0309 14:05:29.140004 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=44.139963176 podStartE2EDuration="44.139963176s" podCreationTimestamp="2026-03-09 14:04:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 
14:05:29.122879769 +0000 UTC m=+169.678448385" watchObservedRunningTime="2026-03-09 14:05:29.139963176 +0000 UTC m=+169.695531792" Mar 09 14:05:29 crc kubenswrapper[4722]: I0309 14:05:29.140342 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=8.140332486 podStartE2EDuration="8.140332486s" podCreationTimestamp="2026-03-09 14:05:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:05:29.139763671 +0000 UTC m=+169.695332337" watchObservedRunningTime="2026-03-09 14:05:29.140332486 +0000 UTC m=+169.695901102" Mar 09 14:05:29 crc kubenswrapper[4722]: I0309 14:05:29.189583 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 09 14:05:29 crc kubenswrapper[4722]: I0309 14:05:29.201634 4722 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 09 14:05:29 crc kubenswrapper[4722]: I0309 14:05:29.225720 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-h4zw5" podStartSLOduration=105.225686401 podStartE2EDuration="1m45.225686401s" podCreationTimestamp="2026-03-09 14:03:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:05:29.225463335 +0000 UTC m=+169.781031981" watchObservedRunningTime="2026-03-09 14:05:29.225686401 +0000 UTC m=+169.781254997" Mar 09 14:05:29 crc kubenswrapper[4722]: I0309 14:05:29.234191 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8c5ef78a-6d63-4e2f-84b8-20d6ad5d16c2-service-ca\") pod \"cluster-version-operator-5c965bbfc6-57l9g\" (UID: \"8c5ef78a-6d63-4e2f-84b8-20d6ad5d16c2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-57l9g" Mar 09 14:05:29 crc kubenswrapper[4722]: I0309 14:05:29.234338 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8c5ef78a-6d63-4e2f-84b8-20d6ad5d16c2-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-57l9g\" (UID: \"8c5ef78a-6d63-4e2f-84b8-20d6ad5d16c2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-57l9g" Mar 09 14:05:29 crc kubenswrapper[4722]: I0309 14:05:29.234440 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/8c5ef78a-6d63-4e2f-84b8-20d6ad5d16c2-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-57l9g\" (UID: \"8c5ef78a-6d63-4e2f-84b8-20d6ad5d16c2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-57l9g" Mar 09 14:05:29 crc kubenswrapper[4722]: I0309 14:05:29.234483 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/8c5ef78a-6d63-4e2f-84b8-20d6ad5d16c2-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-57l9g\" (UID: \"8c5ef78a-6d63-4e2f-84b8-20d6ad5d16c2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-57l9g" Mar 09 14:05:29 crc kubenswrapper[4722]: I0309 14:05:29.234518 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/8c5ef78a-6d63-4e2f-84b8-20d6ad5d16c2-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-57l9g\" (UID: \"8c5ef78a-6d63-4e2f-84b8-20d6ad5d16c2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-57l9g" Mar 09 14:05:29 crc kubenswrapper[4722]: I0309 14:05:29.234664 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/8c5ef78a-6d63-4e2f-84b8-20d6ad5d16c2-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-57l9g\" (UID: \"8c5ef78a-6d63-4e2f-84b8-20d6ad5d16c2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-57l9g" Mar 09 14:05:29 crc kubenswrapper[4722]: I0309 14:05:29.234667 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/8c5ef78a-6d63-4e2f-84b8-20d6ad5d16c2-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-57l9g\" (UID: \"8c5ef78a-6d63-4e2f-84b8-20d6ad5d16c2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-57l9g" Mar 09 14:05:29 crc kubenswrapper[4722]: I0309 14:05:29.235366 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8c5ef78a-6d63-4e2f-84b8-20d6ad5d16c2-service-ca\") pod \"cluster-version-operator-5c965bbfc6-57l9g\" (UID: \"8c5ef78a-6d63-4e2f-84b8-20d6ad5d16c2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-57l9g" Mar 09 14:05:29 crc kubenswrapper[4722]: I0309 14:05:29.244109 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8c5ef78a-6d63-4e2f-84b8-20d6ad5d16c2-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-57l9g\" (UID: \"8c5ef78a-6d63-4e2f-84b8-20d6ad5d16c2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-57l9g" Mar 09 14:05:29 crc kubenswrapper[4722]: I0309 14:05:29.257248 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8c5ef78a-6d63-4e2f-84b8-20d6ad5d16c2-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-57l9g\" (UID: \"8c5ef78a-6d63-4e2f-84b8-20d6ad5d16c2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-57l9g" Mar 09 14:05:29 crc kubenswrapper[4722]: I0309 14:05:29.280597 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-xsft8" podStartSLOduration=105.280569125 podStartE2EDuration="1m45.280569125s" podCreationTimestamp="2026-03-09 14:03:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:05:29.279874445 +0000 UTC m=+169.835443031" watchObservedRunningTime="2026-03-09 14:05:29.280569125 +0000 UTC m=+169.836137701" Mar 09 14:05:29 crc kubenswrapper[4722]: I0309 14:05:29.330539 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-jv499" podStartSLOduration=105.33051888 podStartE2EDuration="1m45.33051888s" podCreationTimestamp="2026-03-09 14:03:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:05:29.329700968 +0000 UTC m=+169.885269554" watchObservedRunningTime="2026-03-09 14:05:29.33051888 +0000 UTC m=+169.886087466" 
Mar 09 14:05:29 crc kubenswrapper[4722]: I0309 14:05:29.343817 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-l5pwg" podStartSLOduration=106.343788391 podStartE2EDuration="1m46.343788391s" podCreationTimestamp="2026-03-09 14:03:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:05:29.343017329 +0000 UTC m=+169.898585925" watchObservedRunningTime="2026-03-09 14:05:29.343788391 +0000 UTC m=+169.899356977" Mar 09 14:05:29 crc kubenswrapper[4722]: I0309 14:05:29.371406 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-72xb4" podStartSLOduration=106.371376441 podStartE2EDuration="1m46.371376441s" podCreationTimestamp="2026-03-09 14:03:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:05:29.371257948 +0000 UTC m=+169.926826534" watchObservedRunningTime="2026-03-09 14:05:29.371376441 +0000 UTC m=+169.926945027" Mar 09 14:05:29 crc kubenswrapper[4722]: I0309 14:05:29.393266 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-57l9g" Mar 09 14:05:29 crc kubenswrapper[4722]: I0309 14:05:29.402723 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podStartSLOduration=106.402699777 podStartE2EDuration="1m46.402699777s" podCreationTimestamp="2026-03-09 14:03:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:05:29.388004346 +0000 UTC m=+169.943572942" watchObservedRunningTime="2026-03-09 14:05:29.402699777 +0000 UTC m=+169.958268363" Mar 09 14:05:29 crc kubenswrapper[4722]: I0309 14:05:29.419553 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=21.419531757 podStartE2EDuration="21.419531757s" podCreationTimestamp="2026-03-09 14:05:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:05:29.418868598 +0000 UTC m=+169.974437204" watchObservedRunningTime="2026-03-09 14:05:29.419531757 +0000 UTC m=+169.975100333" Mar 09 14:05:29 crc kubenswrapper[4722]: I0309 14:05:29.469885 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=81.469856504 podStartE2EDuration="1m21.469856504s" podCreationTimestamp="2026-03-09 14:04:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:05:29.469117382 +0000 UTC m=+170.024685958" watchObservedRunningTime="2026-03-09 14:05:29.469856504 +0000 UTC m=+170.025425080" Mar 09 14:05:29 crc kubenswrapper[4722]: I0309 14:05:29.470120 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=79.470114851 podStartE2EDuration="1m19.470114851s" podCreationTimestamp="2026-03-09 14:04:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 
14:05:29.45005391 +0000 UTC m=+170.005622506" watchObservedRunningTime="2026-03-09 14:05:29.470114851 +0000 UTC m=+170.025683427" Mar 09 14:05:30 crc kubenswrapper[4722]: I0309 14:05:30.075296 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-57l9g" event={"ID":"8c5ef78a-6d63-4e2f-84b8-20d6ad5d16c2","Type":"ContainerStarted","Data":"97f1b84da92deb023fd3bb016041e0a26abd4f970d73489fbaf468edf59c2bcf"} Mar 09 14:05:30 crc kubenswrapper[4722]: I0309 14:05:30.075369 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-57l9g" event={"ID":"8c5ef78a-6d63-4e2f-84b8-20d6ad5d16c2","Type":"ContainerStarted","Data":"b0c5f2f1336711989912d279766c87d6370aa2499b3dbcd1b3c2db4c9befa192"} Mar 09 14:05:30 crc kubenswrapper[4722]: I0309 14:05:30.094344 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-57l9g" podStartSLOduration=107.094319161 podStartE2EDuration="1m47.094319161s" podCreationTimestamp="2026-03-09 14:03:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:05:30.093404315 +0000 UTC m=+170.648972901" watchObservedRunningTime="2026-03-09 14:05:30.094319161 +0000 UTC m=+170.649887737" Mar 09 14:05:30 crc kubenswrapper[4722]: I0309 14:05:30.149142 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 14:05:30 crc kubenswrapper[4722]: I0309 14:05:30.149190 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pvvhj" Mar 09 14:05:30 crc kubenswrapper[4722]: I0309 14:05:30.149148 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 14:05:30 crc kubenswrapper[4722]: I0309 14:05:30.149259 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 14:05:30 crc kubenswrapper[4722]: E0309 14:05:30.150710 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 14:05:30 crc kubenswrapper[4722]: E0309 14:05:30.150814 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pvvhj" podUID="f5dbc0be-527e-4f70-b185-3a10b1b11a75" Mar 09 14:05:30 crc kubenswrapper[4722]: E0309 14:05:30.150908 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 14:05:30 crc kubenswrapper[4722]: E0309 14:05:30.150967 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 14:05:30 crc kubenswrapper[4722]: E0309 14:05:30.259546 4722 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 14:05:32 crc kubenswrapper[4722]: I0309 14:05:32.152393 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 14:05:32 crc kubenswrapper[4722]: I0309 14:05:32.152423 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 14:05:32 crc kubenswrapper[4722]: I0309 14:05:32.152394 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 14:05:32 crc kubenswrapper[4722]: I0309 14:05:32.152674 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pvvhj" Mar 09 14:05:32 crc kubenswrapper[4722]: E0309 14:05:32.152798 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 14:05:32 crc kubenswrapper[4722]: E0309 14:05:32.152983 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 14:05:32 crc kubenswrapper[4722]: E0309 14:05:32.153168 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 14:05:32 crc kubenswrapper[4722]: E0309 14:05:32.153374 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pvvhj" podUID="f5dbc0be-527e-4f70-b185-3a10b1b11a75" Mar 09 14:05:34 crc kubenswrapper[4722]: I0309 14:05:34.149165 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 14:05:34 crc kubenswrapper[4722]: I0309 14:05:34.149289 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pvvhj" Mar 09 14:05:34 crc kubenswrapper[4722]: I0309 14:05:34.149311 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 14:05:34 crc kubenswrapper[4722]: I0309 14:05:34.149674 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 14:05:34 crc kubenswrapper[4722]: E0309 14:05:34.149798 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pvvhj" podUID="f5dbc0be-527e-4f70-b185-3a10b1b11a75" Mar 09 14:05:34 crc kubenswrapper[4722]: E0309 14:05:34.150080 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 14:05:34 crc kubenswrapper[4722]: I0309 14:05:34.150156 4722 scope.go:117] "RemoveContainer" containerID="647b164e0544f016e9b6adbb065e7b958f1f7e1446a15f120191cfb35d030220" Mar 09 14:05:34 crc kubenswrapper[4722]: E0309 14:05:34.150278 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 14:05:34 crc kubenswrapper[4722]: E0309 14:05:34.150450 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 14:05:34 crc kubenswrapper[4722]: E0309 14:05:34.150869 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-5v7ng_openshift-ovn-kubernetes(2e305619-b3a2-44c9-9e54-e1afa4f43dbf)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" podUID="2e305619-b3a2-44c9-9e54-e1afa4f43dbf" Mar 09 14:05:35 crc kubenswrapper[4722]: E0309 14:05:35.260774 4722 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 14:05:36 crc kubenswrapper[4722]: I0309 14:05:36.149248 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 14:05:36 crc kubenswrapper[4722]: I0309 14:05:36.149291 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 14:05:36 crc kubenswrapper[4722]: I0309 14:05:36.149263 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pvvhj" Mar 09 14:05:36 crc kubenswrapper[4722]: I0309 14:05:36.149407 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 14:05:36 crc kubenswrapper[4722]: E0309 14:05:36.149492 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 14:05:36 crc kubenswrapper[4722]: E0309 14:05:36.149641 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 14:05:36 crc kubenswrapper[4722]: E0309 14:05:36.149695 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 14:05:36 crc kubenswrapper[4722]: E0309 14:05:36.149984 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pvvhj" podUID="f5dbc0be-527e-4f70-b185-3a10b1b11a75" Mar 09 14:05:38 crc kubenswrapper[4722]: I0309 14:05:38.148948 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 14:05:38 crc kubenswrapper[4722]: I0309 14:05:38.148943 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 14:05:38 crc kubenswrapper[4722]: I0309 14:05:38.149038 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pvvhj" Mar 09 14:05:38 crc kubenswrapper[4722]: I0309 14:05:38.149109 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 14:05:38 crc kubenswrapper[4722]: E0309 14:05:38.149228 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 14:05:38 crc kubenswrapper[4722]: E0309 14:05:38.149333 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 14:05:38 crc kubenswrapper[4722]: E0309 14:05:38.149465 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 14:05:38 crc kubenswrapper[4722]: E0309 14:05:38.149517 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pvvhj" podUID="f5dbc0be-527e-4f70-b185-3a10b1b11a75" Mar 09 14:05:38 crc kubenswrapper[4722]: I0309 14:05:38.442370 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f5dbc0be-527e-4f70-b185-3a10b1b11a75-metrics-certs\") pod \"network-metrics-daemon-pvvhj\" (UID: \"f5dbc0be-527e-4f70-b185-3a10b1b11a75\") " pod="openshift-multus/network-metrics-daemon-pvvhj" Mar 09 14:05:38 crc kubenswrapper[4722]: E0309 14:05:38.442676 4722 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 14:05:38 crc kubenswrapper[4722]: E0309 14:05:38.442801 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f5dbc0be-527e-4f70-b185-3a10b1b11a75-metrics-certs podName:f5dbc0be-527e-4f70-b185-3a10b1b11a75 nodeName:}" failed. No retries permitted until 2026-03-09 14:06:42.442762267 +0000 UTC m=+242.998330873 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f5dbc0be-527e-4f70-b185-3a10b1b11a75-metrics-certs") pod "network-metrics-daemon-pvvhj" (UID: "f5dbc0be-527e-4f70-b185-3a10b1b11a75") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 09 14:05:40 crc kubenswrapper[4722]: I0309 14:05:40.148163 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 14:05:40 crc kubenswrapper[4722]: I0309 14:05:40.148172 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pvvhj" Mar 09 14:05:40 crc kubenswrapper[4722]: I0309 14:05:40.148325 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 14:05:40 crc kubenswrapper[4722]: I0309 14:05:40.148411 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 14:05:40 crc kubenswrapper[4722]: E0309 14:05:40.150756 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 14:05:40 crc kubenswrapper[4722]: E0309 14:05:40.150905 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 14:05:40 crc kubenswrapper[4722]: E0309 14:05:40.151047 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pvvhj" podUID="f5dbc0be-527e-4f70-b185-3a10b1b11a75" Mar 09 14:05:40 crc kubenswrapper[4722]: E0309 14:05:40.151131 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 14:05:40 crc kubenswrapper[4722]: E0309 14:05:40.262046 4722 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 14:05:42 crc kubenswrapper[4722]: I0309 14:05:42.148243 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pvvhj" Mar 09 14:05:42 crc kubenswrapper[4722]: I0309 14:05:42.148338 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 14:05:42 crc kubenswrapper[4722]: I0309 14:05:42.148302 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 14:05:42 crc kubenswrapper[4722]: I0309 14:05:42.148267 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 14:05:42 crc kubenswrapper[4722]: E0309 14:05:42.148412 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pvvhj" podUID="f5dbc0be-527e-4f70-b185-3a10b1b11a75" Mar 09 14:05:42 crc kubenswrapper[4722]: E0309 14:05:42.148516 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 14:05:42 crc kubenswrapper[4722]: E0309 14:05:42.148663 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 14:05:42 crc kubenswrapper[4722]: E0309 14:05:42.148730 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 14:05:44 crc kubenswrapper[4722]: I0309 14:05:44.149033 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 14:05:44 crc kubenswrapper[4722]: I0309 14:05:44.149141 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 14:05:44 crc kubenswrapper[4722]: E0309 14:05:44.149313 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 14:05:44 crc kubenswrapper[4722]: I0309 14:05:44.149422 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pvvhj" Mar 09 14:05:44 crc kubenswrapper[4722]: I0309 14:05:44.149438 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 14:05:44 crc kubenswrapper[4722]: E0309 14:05:44.149617 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 14:05:44 crc kubenswrapper[4722]: E0309 14:05:44.149747 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pvvhj" podUID="f5dbc0be-527e-4f70-b185-3a10b1b11a75" Mar 09 14:05:44 crc kubenswrapper[4722]: E0309 14:05:44.149907 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 14:05:45 crc kubenswrapper[4722]: E0309 14:05:45.263743 4722 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 14:05:46 crc kubenswrapper[4722]: I0309 14:05:46.148188 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 14:05:46 crc kubenswrapper[4722]: I0309 14:05:46.148259 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-pvvhj" Mar 09 14:05:46 crc kubenswrapper[4722]: I0309 14:05:46.148184 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 14:05:46 crc kubenswrapper[4722]: E0309 14:05:46.148364 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 14:05:46 crc kubenswrapper[4722]: I0309 14:05:46.148391 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 14:05:46 crc kubenswrapper[4722]: E0309 14:05:46.148476 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pvvhj" podUID="f5dbc0be-527e-4f70-b185-3a10b1b11a75" Mar 09 14:05:46 crc kubenswrapper[4722]: E0309 14:05:46.148625 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 14:05:46 crc kubenswrapper[4722]: E0309 14:05:46.148733 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 14:05:48 crc kubenswrapper[4722]: I0309 14:05:48.148598 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pvvhj" Mar 09 14:05:48 crc kubenswrapper[4722]: I0309 14:05:48.148624 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 14:05:48 crc kubenswrapper[4722]: E0309 14:05:48.148757 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pvvhj" podUID="f5dbc0be-527e-4f70-b185-3a10b1b11a75" Mar 09 14:05:48 crc kubenswrapper[4722]: I0309 14:05:48.148842 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 14:05:48 crc kubenswrapper[4722]: I0309 14:05:48.148972 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 14:05:48 crc kubenswrapper[4722]: E0309 14:05:48.148964 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 14:05:48 crc kubenswrapper[4722]: E0309 14:05:48.149028 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 14:05:48 crc kubenswrapper[4722]: E0309 14:05:48.149093 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 14:05:49 crc kubenswrapper[4722]: I0309 14:05:49.150498 4722 scope.go:117] "RemoveContainer" containerID="647b164e0544f016e9b6adbb065e7b958f1f7e1446a15f120191cfb35d030220" Mar 09 14:05:49 crc kubenswrapper[4722]: E0309 14:05:49.150840 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-5v7ng_openshift-ovn-kubernetes(2e305619-b3a2-44c9-9e54-e1afa4f43dbf)\"" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" podUID="2e305619-b3a2-44c9-9e54-e1afa4f43dbf" Mar 09 14:05:50 crc kubenswrapper[4722]: I0309 14:05:50.148725 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pvvhj" Mar 09 14:05:50 crc kubenswrapper[4722]: I0309 14:05:50.148781 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 14:05:50 crc kubenswrapper[4722]: E0309 14:05:50.148951 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pvvhj" podUID="f5dbc0be-527e-4f70-b185-3a10b1b11a75" Mar 09 14:05:50 crc kubenswrapper[4722]: I0309 14:05:50.149012 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 14:05:50 crc kubenswrapper[4722]: I0309 14:05:50.149046 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 14:05:50 crc kubenswrapper[4722]: E0309 14:05:50.152376 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 14:05:50 crc kubenswrapper[4722]: E0309 14:05:50.153294 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 14:05:50 crc kubenswrapper[4722]: E0309 14:05:50.153634 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 14:05:50 crc kubenswrapper[4722]: E0309 14:05:50.264816 4722 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 14:05:52 crc kubenswrapper[4722]: I0309 14:05:52.149118 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pvvhj" Mar 09 14:05:52 crc kubenswrapper[4722]: I0309 14:05:52.149235 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 14:05:52 crc kubenswrapper[4722]: E0309 14:05:52.149349 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pvvhj" podUID="f5dbc0be-527e-4f70-b185-3a10b1b11a75" Mar 09 14:05:52 crc kubenswrapper[4722]: I0309 14:05:52.149404 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 14:05:52 crc kubenswrapper[4722]: I0309 14:05:52.149129 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 14:05:52 crc kubenswrapper[4722]: E0309 14:05:52.149516 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 14:05:52 crc kubenswrapper[4722]: E0309 14:05:52.149599 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 14:05:52 crc kubenswrapper[4722]: E0309 14:05:52.149777 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 14:05:54 crc kubenswrapper[4722]: I0309 14:05:54.148906 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 14:05:54 crc kubenswrapper[4722]: I0309 14:05:54.148992 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pvvhj" Mar 09 14:05:54 crc kubenswrapper[4722]: E0309 14:05:54.149081 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 14:05:54 crc kubenswrapper[4722]: I0309 14:05:54.149122 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 14:05:54 crc kubenswrapper[4722]: I0309 14:05:54.148929 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 14:05:54 crc kubenswrapper[4722]: E0309 14:05:54.149193 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pvvhj" podUID="f5dbc0be-527e-4f70-b185-3a10b1b11a75" Mar 09 14:05:54 crc kubenswrapper[4722]: E0309 14:05:54.149415 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 14:05:54 crc kubenswrapper[4722]: E0309 14:05:54.149500 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 14:05:55 crc kubenswrapper[4722]: E0309 14:05:55.266523 4722 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 14:05:56 crc kubenswrapper[4722]: I0309 14:05:56.148526 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 14:05:56 crc kubenswrapper[4722]: I0309 14:05:56.148554 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 14:05:56 crc kubenswrapper[4722]: I0309 14:05:56.148584 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pvvhj" Mar 09 14:05:56 crc kubenswrapper[4722]: E0309 14:05:56.148678 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 14:05:56 crc kubenswrapper[4722]: I0309 14:05:56.148527 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 14:05:56 crc kubenswrapper[4722]: E0309 14:05:56.148755 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pvvhj" podUID="f5dbc0be-527e-4f70-b185-3a10b1b11a75" Mar 09 14:05:56 crc kubenswrapper[4722]: E0309 14:05:56.148879 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 14:05:56 crc kubenswrapper[4722]: E0309 14:05:56.149083 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 14:05:56 crc kubenswrapper[4722]: I0309 14:05:56.168897 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-h4zw5_6b9e29bb-6e51-47ab-a543-b70117ab854d/kube-multus/1.log" Mar 09 14:05:56 crc kubenswrapper[4722]: I0309 14:05:56.169500 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-h4zw5_6b9e29bb-6e51-47ab-a543-b70117ab854d/kube-multus/0.log" Mar 09 14:05:56 crc kubenswrapper[4722]: I0309 14:05:56.169549 4722 generic.go:334] "Generic (PLEG): container finished" podID="6b9e29bb-6e51-47ab-a543-b70117ab854d" containerID="30f56826c4b193cbd284ce58320073c1b4dc43c9eba976445ba4d4bb7c089960" exitCode=1 Mar 09 14:05:56 crc kubenswrapper[4722]: I0309 14:05:56.169581 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-h4zw5" event={"ID":"6b9e29bb-6e51-47ab-a543-b70117ab854d","Type":"ContainerDied","Data":"30f56826c4b193cbd284ce58320073c1b4dc43c9eba976445ba4d4bb7c089960"} Mar 09 14:05:56 crc kubenswrapper[4722]: I0309 14:05:56.169618 4722 scope.go:117] "RemoveContainer" containerID="17cd3d1e3981e2f2a3b66780f753425c78452e6319fb8eae9935d3bac4456820" Mar 09 14:05:56 crc kubenswrapper[4722]: I0309 14:05:56.170408 4722 scope.go:117] "RemoveContainer" containerID="30f56826c4b193cbd284ce58320073c1b4dc43c9eba976445ba4d4bb7c089960" Mar 09 14:05:56 crc kubenswrapper[4722]: E0309 14:05:56.170767 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-h4zw5_openshift-multus(6b9e29bb-6e51-47ab-a543-b70117ab854d)\"" pod="openshift-multus/multus-h4zw5" podUID="6b9e29bb-6e51-47ab-a543-b70117ab854d" Mar 09 14:05:57 crc kubenswrapper[4722]: I0309 14:05:57.174146 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-h4zw5_6b9e29bb-6e51-47ab-a543-b70117ab854d/kube-multus/1.log" Mar 09 14:05:58 crc kubenswrapper[4722]: I0309 14:05:58.148481 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pvvhj" Mar 09 14:05:58 crc kubenswrapper[4722]: I0309 14:05:58.148584 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 14:05:58 crc kubenswrapper[4722]: I0309 14:05:58.148627 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 14:05:58 crc kubenswrapper[4722]: E0309 14:05:58.148624 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pvvhj" podUID="f5dbc0be-527e-4f70-b185-3a10b1b11a75" Mar 09 14:05:58 crc kubenswrapper[4722]: I0309 14:05:58.148481 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 14:05:58 crc kubenswrapper[4722]: E0309 14:05:58.148716 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 14:05:58 crc kubenswrapper[4722]: E0309 14:05:58.148904 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 14:05:58 crc kubenswrapper[4722]: E0309 14:05:58.148936 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 14:06:00 crc kubenswrapper[4722]: I0309 14:06:00.148756 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 14:06:00 crc kubenswrapper[4722]: I0309 14:06:00.148788 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 14:06:00 crc kubenswrapper[4722]: I0309 14:06:00.148905 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pvvhj" Mar 09 14:06:00 crc kubenswrapper[4722]: E0309 14:06:00.150134 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 14:06:00 crc kubenswrapper[4722]: I0309 14:06:00.150162 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 14:06:00 crc kubenswrapper[4722]: E0309 14:06:00.150366 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 14:06:00 crc kubenswrapper[4722]: E0309 14:06:00.150497 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 14:06:00 crc kubenswrapper[4722]: E0309 14:06:00.150567 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pvvhj" podUID="f5dbc0be-527e-4f70-b185-3a10b1b11a75" Mar 09 14:06:00 crc kubenswrapper[4722]: E0309 14:06:00.267379 4722 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 14:06:01 crc kubenswrapper[4722]: I0309 14:06:01.149640 4722 scope.go:117] "RemoveContainer" containerID="647b164e0544f016e9b6adbb065e7b958f1f7e1446a15f120191cfb35d030220" Mar 09 14:06:01 crc kubenswrapper[4722]: I0309 14:06:01.940630 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-pvvhj"] Mar 09 14:06:01 crc kubenswrapper[4722]: I0309 14:06:01.941059 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pvvhj" Mar 09 14:06:01 crc kubenswrapper[4722]: E0309 14:06:01.941148 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pvvhj" podUID="f5dbc0be-527e-4f70-b185-3a10b1b11a75" Mar 09 14:06:02 crc kubenswrapper[4722]: I0309 14:06:02.148583 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 14:06:02 crc kubenswrapper[4722]: I0309 14:06:02.148618 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 14:06:02 crc kubenswrapper[4722]: I0309 14:06:02.148712 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 14:06:02 crc kubenswrapper[4722]: E0309 14:06:02.148853 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 14:06:02 crc kubenswrapper[4722]: E0309 14:06:02.148935 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 14:06:02 crc kubenswrapper[4722]: E0309 14:06:02.149013 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 14:06:02 crc kubenswrapper[4722]: I0309 14:06:02.191586 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5v7ng_2e305619-b3a2-44c9-9e54-e1afa4f43dbf/ovnkube-controller/3.log" Mar 09 14:06:02 crc kubenswrapper[4722]: I0309 14:06:02.193985 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" event={"ID":"2e305619-b3a2-44c9-9e54-e1afa4f43dbf","Type":"ContainerStarted","Data":"71c08361cf5ed90f65d201ffd82a9403cb9c01ae88ea91cbbf351f6003bfa427"} Mar 09 14:06:02 crc kubenswrapper[4722]: I0309 14:06:02.194412 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" Mar 09 14:06:02 crc kubenswrapper[4722]: I0309 14:06:02.217222 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" podStartSLOduration=138.217192599 podStartE2EDuration="2m18.217192599s" podCreationTimestamp="2026-03-09 14:03:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:06:02.216383285 +0000 UTC m=+202.771951861" watchObservedRunningTime="2026-03-09 14:06:02.217192599 +0000 UTC m=+202.772761175" Mar 09 14:06:04 crc kubenswrapper[4722]: I0309 14:06:04.148852 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 14:06:04 crc kubenswrapper[4722]: I0309 14:06:04.148968 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pvvhj" Mar 09 14:06:04 crc kubenswrapper[4722]: I0309 14:06:04.148968 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 14:06:04 crc kubenswrapper[4722]: E0309 14:06:04.149090 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 14:06:04 crc kubenswrapper[4722]: I0309 14:06:04.149257 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 14:06:04 crc kubenswrapper[4722]: E0309 14:06:04.149270 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pvvhj" podUID="f5dbc0be-527e-4f70-b185-3a10b1b11a75" Mar 09 14:06:04 crc kubenswrapper[4722]: E0309 14:06:04.149411 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 14:06:04 crc kubenswrapper[4722]: E0309 14:06:04.149505 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 14:06:05 crc kubenswrapper[4722]: E0309 14:06:05.269089 4722 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 14:06:06 crc kubenswrapper[4722]: I0309 14:06:06.148603 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pvvhj" Mar 09 14:06:06 crc kubenswrapper[4722]: I0309 14:06:06.148710 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 14:06:06 crc kubenswrapper[4722]: I0309 14:06:06.148706 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 14:06:06 crc kubenswrapper[4722]: I0309 14:06:06.148603 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 14:06:06 crc kubenswrapper[4722]: E0309 14:06:06.148924 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pvvhj" podUID="f5dbc0be-527e-4f70-b185-3a10b1b11a75" Mar 09 14:06:06 crc kubenswrapper[4722]: E0309 14:06:06.149116 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 14:06:06 crc kubenswrapper[4722]: E0309 14:06:06.149237 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 14:06:06 crc kubenswrapper[4722]: E0309 14:06:06.149280 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 14:06:08 crc kubenswrapper[4722]: I0309 14:06:08.148093 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pvvhj" Mar 09 14:06:08 crc kubenswrapper[4722]: I0309 14:06:08.148142 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 14:06:08 crc kubenswrapper[4722]: I0309 14:06:08.148285 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 14:06:08 crc kubenswrapper[4722]: E0309 14:06:08.148272 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-pvvhj" podUID="f5dbc0be-527e-4f70-b185-3a10b1b11a75" Mar 09 14:06:08 crc kubenswrapper[4722]: I0309 14:06:08.148347 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 14:06:08 crc kubenswrapper[4722]: E0309 14:06:08.148419 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 14:06:08 crc kubenswrapper[4722]: E0309 14:06:08.148494 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 14:06:08 crc kubenswrapper[4722]: E0309 14:06:08.148562 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 14:06:10 crc kubenswrapper[4722]: I0309 14:06:10.148770 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 14:06:10 crc kubenswrapper[4722]: I0309 14:06:10.148850 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 14:06:10 crc kubenswrapper[4722]: E0309 14:06:10.151373 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 14:06:10 crc kubenswrapper[4722]: I0309 14:06:10.151395 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pvvhj" Mar 09 14:06:10 crc kubenswrapper[4722]: I0309 14:06:10.151426 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 14:06:10 crc kubenswrapper[4722]: E0309 14:06:10.151751 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 14:06:10 crc kubenswrapper[4722]: E0309 14:06:10.151851 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pvvhj" podUID="f5dbc0be-527e-4f70-b185-3a10b1b11a75" Mar 09 14:06:10 crc kubenswrapper[4722]: E0309 14:06:10.151923 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 14:06:10 crc kubenswrapper[4722]: I0309 14:06:10.152442 4722 scope.go:117] "RemoveContainer" containerID="30f56826c4b193cbd284ce58320073c1b4dc43c9eba976445ba4d4bb7c089960" Mar 09 14:06:10 crc kubenswrapper[4722]: E0309 14:06:10.275473 4722 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 14:06:11 crc kubenswrapper[4722]: I0309 14:06:11.410702 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-h4zw5_6b9e29bb-6e51-47ab-a543-b70117ab854d/kube-multus/1.log" Mar 09 14:06:11 crc kubenswrapper[4722]: I0309 14:06:11.410772 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-h4zw5" event={"ID":"6b9e29bb-6e51-47ab-a543-b70117ab854d","Type":"ContainerStarted","Data":"d35e198363967793f8437918e78c17906e68a7a9bddca3be185af7534bf15d4f"} Mar 09 14:06:12 crc kubenswrapper[4722]: I0309 14:06:12.148719 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 14:06:12 crc kubenswrapper[4722]: I0309 14:06:12.148907 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pvvhj" Mar 09 14:06:12 crc kubenswrapper[4722]: I0309 14:06:12.148735 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 14:06:12 crc kubenswrapper[4722]: E0309 14:06:12.148994 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 14:06:12 crc kubenswrapper[4722]: I0309 14:06:12.149018 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 14:06:12 crc kubenswrapper[4722]: E0309 14:06:12.149163 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pvvhj" podUID="f5dbc0be-527e-4f70-b185-3a10b1b11a75" Mar 09 14:06:12 crc kubenswrapper[4722]: E0309 14:06:12.149433 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 14:06:12 crc kubenswrapper[4722]: E0309 14:06:12.149582 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 14:06:14 crc kubenswrapper[4722]: I0309 14:06:14.148818 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 14:06:14 crc kubenswrapper[4722]: I0309 14:06:14.148915 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 14:06:14 crc kubenswrapper[4722]: I0309 14:06:14.148915 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pvvhj" Mar 09 14:06:14 crc kubenswrapper[4722]: E0309 14:06:14.149059 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 09 14:06:14 crc kubenswrapper[4722]: I0309 14:06:14.149098 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 14:06:14 crc kubenswrapper[4722]: E0309 14:06:14.149428 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 09 14:06:14 crc kubenswrapper[4722]: E0309 14:06:14.149424 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-pvvhj" podUID="f5dbc0be-527e-4f70-b185-3a10b1b11a75" Mar 09 14:06:14 crc kubenswrapper[4722]: E0309 14:06:14.149518 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 09 14:06:16 crc kubenswrapper[4722]: I0309 14:06:16.072142 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 14:06:16 crc kubenswrapper[4722]: E0309 14:06:16.072529 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 14:08:18.07246648 +0000 UTC m=+338.628035096 (durationBeforeRetry 2m2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 14:06:16 crc kubenswrapper[4722]: I0309 14:06:16.148407 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pvvhj" Mar 09 14:06:16 crc kubenswrapper[4722]: I0309 14:06:16.148464 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 14:06:16 crc kubenswrapper[4722]: I0309 14:06:16.148407 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 14:06:16 crc kubenswrapper[4722]: I0309 14:06:16.148595 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 14:06:16 crc kubenswrapper[4722]: I0309 14:06:16.151938 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 09 14:06:16 crc kubenswrapper[4722]: I0309 14:06:16.152763 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 09 14:06:16 crc kubenswrapper[4722]: I0309 14:06:16.152902 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 09 14:06:16 crc kubenswrapper[4722]: I0309 14:06:16.152968 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 09 14:06:16 crc kubenswrapper[4722]: I0309 14:06:16.153850 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 09 14:06:16 crc kubenswrapper[4722]: I0309 14:06:16.156799 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 09 14:06:16 crc kubenswrapper[4722]: I0309 14:06:16.175023 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 14:06:16 crc kubenswrapper[4722]: I0309 14:06:16.175101 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 14:06:16 crc kubenswrapper[4722]: I0309 14:06:16.175155 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 14:06:16 crc kubenswrapper[4722]: I0309 14:06:16.175199 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 14:06:16 crc kubenswrapper[4722]: I0309 14:06:16.176982 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 14:06:16 crc kubenswrapper[4722]: I0309 14:06:16.186873 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 14:06:16 crc kubenswrapper[4722]: I0309 14:06:16.187178 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 14:06:16 crc kubenswrapper[4722]: I0309 14:06:16.188406 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 14:06:16 crc kubenswrapper[4722]: I0309 14:06:16.202944 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 09 14:06:16 crc kubenswrapper[4722]: I0309 14:06:16.211805 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 14:06:16 crc kubenswrapper[4722]: I0309 14:06:16.220356 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 09 14:06:16 crc kubenswrapper[4722]: W0309 14:06:16.457567 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-be6d47ad948734c056362556e484038a5b7e86528b7786ed9985362fdd9820cf WatchSource:0}: Error finding container be6d47ad948734c056362556e484038a5b7e86528b7786ed9985362fdd9820cf: Status 404 returned error can't find the container with id be6d47ad948734c056362556e484038a5b7e86528b7786ed9985362fdd9820cf Mar 09 14:06:16 crc kubenswrapper[4722]: W0309 14:06:16.490888 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-29a7448419d506dd5548896969861333495dfc7042c0988ef129bffa9bdfcf8c WatchSource:0}: Error finding container 29a7448419d506dd5548896969861333495dfc7042c0988ef129bffa9bdfcf8c: Status 404 returned error can't find the container with id 29a7448419d506dd5548896969861333495dfc7042c0988ef129bffa9bdfcf8c Mar 09 14:06:17 crc kubenswrapper[4722]: I0309 14:06:17.435539 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"4b7ebe7eb5aea86344767d0464f9a5dc605ced5ae8f86347cd5c5436a17cabe9"} Mar 09 14:06:17 crc kubenswrapper[4722]: I0309 14:06:17.435951 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"31afaf341568c74ed5e76431a885bb2c33fcf142e27ac7adf79194f6482b76db"} Mar 09 14:06:17 crc kubenswrapper[4722]: I0309 
14:06:17.436163 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 09 14:06:17 crc kubenswrapper[4722]: I0309 14:06:17.437351 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"b6c9cdb6e11a71cf519da6c04bd741571af6f66c30db12bd77c4fd2055d79406"} Mar 09 14:06:17 crc kubenswrapper[4722]: I0309 14:06:17.437397 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"be6d47ad948734c056362556e484038a5b7e86528b7786ed9985362fdd9820cf"} Mar 09 14:06:17 crc kubenswrapper[4722]: I0309 14:06:17.438643 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"8e38a54dba96a0fa3960bbff59b17995ebd28049a6ce28003d62dc1e2a5aaa26"} Mar 09 14:06:17 crc kubenswrapper[4722]: I0309 14:06:17.438676 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"29a7448419d506dd5548896969861333495dfc7042c0988ef129bffa9bdfcf8c"} Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.062032 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.106562 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-sk7lp"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.107107 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-sk7lp" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.108853 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-mgzj7"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.109304 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-mgzj7" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.111827 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-fnmcv"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.112297 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fnmcv" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.115855 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-xxz9x"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.116566 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-xxz9x" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.118083 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-8tkbn"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.118576 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-8tkbn" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.120541 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-2mf4l"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.121649 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2mf4l" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.127694 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-6s4fg"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.128650 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-6s4fg" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.130637 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-7jl6s"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.131391 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7jl6s" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.136789 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.136970 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.137646 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.137832 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.138669 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.138767 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.138853 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.138929 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.138987 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.139213 4722 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.139325 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.139363 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.139516 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.139624 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.140180 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.140345 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.140379 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.140432 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.140547 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.141982 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-jmwpv"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.142643 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-79q9f"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.143240 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-79q9f" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.143334 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-gh244"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.143628 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-jmwpv" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.144057 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-gh244" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.144784 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.144789 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.145766 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.147008 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-knrzp"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.147536 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-knrzp" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.147886 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.148100 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.152526 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.153536 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.157656 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.157696 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.157734 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.169638 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.169720 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.170025 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.170789 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-chrnr"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.171439 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-chrnr" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.172876 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.172914 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.172878 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.179822 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.180297 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.180466 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.189995 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.191668 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.192221 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.192301 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.192910 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.193378 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.193813 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.194678 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.195594 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.197450 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.197759 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.208069 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.208573 4722 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-route-controller-manager"/"serving-cert" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.208749 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.208921 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.209070 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.209186 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.211473 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.211765 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.212069 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-68k6r"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.212702 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-6b65x"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.213147 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-6b65x" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.213404 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-68k6r" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.211778 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.214770 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.215368 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.215479 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.215603 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.215801 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.222171 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.222639 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.222829 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.223135 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.223443 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.223703 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.223993 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.224177 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.224304 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.224438 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.224676 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.224936 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.225190 4722 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.225419 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.224008 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.225591 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.225839 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.225864 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.226559 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.226622 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.226874 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.227083 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.227247 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.227522 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.227588 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.227638 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.225607 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.227783 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.230109 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-sfch8"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.236149 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7q7bl"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.236540 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-v4-0-config-system-service-ca\") pod 
\"oauth-openshift-558db77b4-6s4fg\" (UID: \"9c83f63c-9a9e-4e33-b0c6-54396ce9f07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-6s4fg" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.236598 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78rj2\" (UniqueName: \"kubernetes.io/projected/8f915ef9-5d9a-43ee-a333-def8766e083d-kube-api-access-78rj2\") pod \"downloads-7954f5f757-knrzp\" (UID: \"8f915ef9-5d9a-43ee-a333-def8766e083d\") " pod="openshift-console/downloads-7954f5f757-knrzp" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.236643 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4084fbb0-8fae-4b7e-a3f6-ec9d723bb367-node-pullsecrets\") pod \"apiserver-76f77b778f-gh244\" (UID: \"4084fbb0-8fae-4b7e-a3f6-ec9d723bb367\") " pod="openshift-apiserver/apiserver-76f77b778f-gh244" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.236663 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5822aa06-923f-4ba0-bdf6-617c5a5eb617-config\") pod \"route-controller-manager-6576b87f9c-fnmcv\" (UID: \"5822aa06-923f-4ba0-bdf6-617c5a5eb617\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fnmcv" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.236693 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0194ee7-d343-4042-9c2b-08cc513ee43e-serving-cert\") pod \"controller-manager-879f6c89f-mgzj7\" (UID: \"e0194ee7-d343-4042-9c2b-08cc513ee43e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mgzj7" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.236711 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p88m8\" (UniqueName: \"kubernetes.io/projected/e0194ee7-d343-4042-9c2b-08cc513ee43e-kube-api-access-p88m8\") pod \"controller-manager-879f6c89f-mgzj7\" (UID: \"e0194ee7-d343-4042-9c2b-08cc513ee43e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mgzj7" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.236727 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/062db6f1-77ab-4eca-be53-6480160aff81-etcd-client\") pod \"apiserver-7bbb656c7d-2mf4l\" (UID: \"062db6f1-77ab-4eca-be53-6480160aff81\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2mf4l" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.236751 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69cpk\" (UniqueName: \"kubernetes.io/projected/9f2a2160-888c-4101-8b1c-63498753a2b7-kube-api-access-69cpk\") pod \"openshift-config-operator-7777fb866f-jmwpv\" (UID: \"9f2a2160-888c-4101-8b1c-63498753a2b7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-jmwpv" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.236771 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/062db6f1-77ab-4eca-be53-6480160aff81-serving-cert\") pod \"apiserver-7bbb656c7d-2mf4l\" (UID: \"062db6f1-77ab-4eca-be53-6480160aff81\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2mf4l" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.236786 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/062db6f1-77ab-4eca-be53-6480160aff81-audit-dir\") pod \"apiserver-7bbb656c7d-2mf4l\" (UID: \"062db6f1-77ab-4eca-be53-6480160aff81\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2mf4l" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.236806 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/4084fbb0-8fae-4b7e-a3f6-ec9d723bb367-encryption-config\") pod \"apiserver-76f77b778f-gh244\" (UID: \"4084fbb0-8fae-4b7e-a3f6-ec9d723bb367\") " pod="openshift-apiserver/apiserver-76f77b778f-gh244" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.236822 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-audit-dir\") pod \"oauth-openshift-558db77b4-6s4fg\" (UID: \"9c83f63c-9a9e-4e33-b0c6-54396ce9f07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-6s4fg" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.236842 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-6s4fg\" (UID: \"9c83f63c-9a9e-4e33-b0c6-54396ce9f07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-6s4fg" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.236860 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a0be9cc-2cda-4b22-838b-0036cfa4405c-config\") pod \"machine-api-operator-5694c8668f-xxz9x\" (UID: \"8a0be9cc-2cda-4b22-838b-0036cfa4405c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xxz9x" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.236881 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6f37779-2f40-4aed-a52b-ee2e2693f16c-config\") pod \"openshift-apiserver-operator-796bbdcf4f-sk7lp\" (UID: \"b6f37779-2f40-4aed-a52b-ee2e2693f16c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-sk7lp" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.236895 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/062db6f1-77ab-4eca-be53-6480160aff81-encryption-config\") pod \"apiserver-7bbb656c7d-2mf4l\" (UID: \"062db6f1-77ab-4eca-be53-6480160aff81\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2mf4l" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.236915 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-6s4fg\" (UID: \"9c83f63c-9a9e-4e33-b0c6-54396ce9f07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-6s4fg" Mar 09 14:06:20 crc 
kubenswrapper[4722]: I0309 14:06:20.236942 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/7c9f4b4e-a00a-4a59-bdc8-a76fe4d6d0d8-machine-approver-tls\") pod \"machine-approver-56656f9798-7jl6s\" (UID: \"7c9f4b4e-a00a-4a59-bdc8-a76fe4d6d0d8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7jl6s" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.236964 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6j4l\" (UniqueName: \"kubernetes.io/projected/e646e4a2-271c-4c66-ba95-67061b0323bf-kube-api-access-g6j4l\") pod \"cluster-samples-operator-665b6dd947-79q9f\" (UID: \"e646e4a2-271c-4c66-ba95-67061b0323bf\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-79q9f" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.236979 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7c9f4b4e-a00a-4a59-bdc8-a76fe4d6d0d8-auth-proxy-config\") pod \"machine-approver-56656f9798-7jl6s\" (UID: \"7c9f4b4e-a00a-4a59-bdc8-a76fe4d6d0d8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7jl6s" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.236998 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4084fbb0-8fae-4b7e-a3f6-ec9d723bb367-config\") pod \"apiserver-76f77b778f-gh244\" (UID: \"4084fbb0-8fae-4b7e-a3f6-ec9d723bb367\") " pod="openshift-apiserver/apiserver-76f77b778f-gh244" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.237015 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-audit-policies\") pod \"oauth-openshift-558db77b4-6s4fg\" (UID: \"9c83f63c-9a9e-4e33-b0c6-54396ce9f07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-6s4fg" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.237030 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/062db6f1-77ab-4eca-be53-6480160aff81-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-2mf4l\" (UID: \"062db6f1-77ab-4eca-be53-6480160aff81\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2mf4l" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.237050 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5822aa06-923f-4ba0-bdf6-617c5a5eb617-client-ca\") pod \"route-controller-manager-6576b87f9c-fnmcv\" (UID: \"5822aa06-923f-4ba0-bdf6-617c5a5eb617\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fnmcv" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.237068 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b919959a-1da1-4c74-9330-5bb8c33f5c26-serving-cert\") pod \"console-operator-58897d9998-chrnr\" (UID: \"b919959a-1da1-4c74-9330-5bb8c33f5c26\") " pod="openshift-console-operator/console-operator-58897d9998-chrnr" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.237088 4722 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b919959a-1da1-4c74-9330-5bb8c33f5c26-trusted-ca\") pod \"console-operator-58897d9998-chrnr\" (UID: \"b919959a-1da1-4c74-9330-5bb8c33f5c26\") " pod="openshift-console-operator/console-operator-58897d9998-chrnr" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.237106 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4084fbb0-8fae-4b7e-a3f6-ec9d723bb367-serving-cert\") pod \"apiserver-76f77b778f-gh244\" (UID: \"4084fbb0-8fae-4b7e-a3f6-ec9d723bb367\") " pod="openshift-apiserver/apiserver-76f77b778f-gh244" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.237152 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkvbb\" (UniqueName: \"kubernetes.io/projected/8a0be9cc-2cda-4b22-838b-0036cfa4405c-kube-api-access-wkvbb\") pod \"machine-api-operator-5694c8668f-xxz9x\" (UID: \"8a0be9cc-2cda-4b22-838b-0036cfa4405c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xxz9x" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.237183 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e646e4a2-271c-4c66-ba95-67061b0323bf-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-79q9f\" (UID: \"e646e4a2-271c-4c66-ba95-67061b0323bf\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-79q9f" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.237221 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/4084fbb0-8fae-4b7e-a3f6-ec9d723bb367-etcd-serving-ca\") pod \"apiserver-76f77b778f-gh244\" (UID: \"4084fbb0-8fae-4b7e-a3f6-ec9d723bb367\") " pod="openshift-apiserver/apiserver-76f77b778f-gh244" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.237241 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2qqm\" (UniqueName: \"kubernetes.io/projected/4084fbb0-8fae-4b7e-a3f6-ec9d723bb367-kube-api-access-t2qqm\") pod \"apiserver-76f77b778f-gh244\" (UID: \"4084fbb0-8fae-4b7e-a3f6-ec9d723bb367\") " pod="openshift-apiserver/apiserver-76f77b778f-gh244" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.237269 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7n2r\" (UniqueName: \"kubernetes.io/projected/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-kube-api-access-c7n2r\") pod \"oauth-openshift-558db77b4-6s4fg\" (UID: \"9c83f63c-9a9e-4e33-b0c6-54396ce9f07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-6s4fg" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.237292 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f2a2160-888c-4101-8b1c-63498753a2b7-serving-cert\") pod \"openshift-config-operator-7777fb866f-jmwpv\" (UID: \"9f2a2160-888c-4101-8b1c-63498753a2b7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-jmwpv" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.237316 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/8a0be9cc-2cda-4b22-838b-0036cfa4405c-images\") pod \"machine-api-operator-5694c8668f-xxz9x\" (UID: \"8a0be9cc-2cda-4b22-838b-0036cfa4405c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xxz9x" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.237335 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-6s4fg\" (UID: \"9c83f63c-9a9e-4e33-b0c6-54396ce9f07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-6s4fg" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.237354 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/8a0be9cc-2cda-4b22-838b-0036cfa4405c-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-xxz9x\" (UID: \"8a0be9cc-2cda-4b22-838b-0036cfa4405c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xxz9x" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.237370 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b919959a-1da1-4c74-9330-5bb8c33f5c26-config\") pod \"console-operator-58897d9998-chrnr\" (UID: \"b919959a-1da1-4c74-9330-5bb8c33f5c26\") " pod="openshift-console-operator/console-operator-58897d9998-chrnr" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.237389 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/427a4c04-99cd-4f53-ae98-20c1755d7658-service-ca-bundle\") pod \"authentication-operator-69f744f599-8tkbn\" (UID: \"427a4c04-99cd-4f53-ae98-20c1755d7658\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8tkbn" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.239301 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hczss\" (UniqueName: \"kubernetes.io/projected/7c9f4b4e-a00a-4a59-bdc8-a76fe4d6d0d8-kube-api-access-hczss\") pod \"machine-approver-56656f9798-7jl6s\" (UID: \"7c9f4b4e-a00a-4a59-bdc8-a76fe4d6d0d8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7jl6s" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.239334 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/4084fbb0-8fae-4b7e-a3f6-ec9d723bb367-image-import-ca\") pod \"apiserver-76f77b778f-gh244\" (UID: \"4084fbb0-8fae-4b7e-a3f6-ec9d723bb367\") " pod="openshift-apiserver/apiserver-76f77b778f-gh244" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.239353 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/4084fbb0-8fae-4b7e-a3f6-ec9d723bb367-audit\") pod \"apiserver-76f77b778f-gh244\" (UID: \"4084fbb0-8fae-4b7e-a3f6-ec9d723bb367\") " pod="openshift-apiserver/apiserver-76f77b778f-gh244" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.239402 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-vsnwn\" (UniqueName: \"kubernetes.io/projected/b6f37779-2f40-4aed-a52b-ee2e2693f16c-kube-api-access-vsnwn\") pod \"openshift-apiserver-operator-796bbdcf4f-sk7lp\" (UID: \"b6f37779-2f40-4aed-a52b-ee2e2693f16c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-sk7lp" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.239409 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9vnnw"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.239419 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/062db6f1-77ab-4eca-be53-6480160aff81-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-2mf4l\" (UID: \"062db6f1-77ab-4eca-be53-6480160aff81\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2mf4l" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.239722 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-6s4fg\" (UID: \"9c83f63c-9a9e-4e33-b0c6-54396ce9f07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-6s4fg" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.239744 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-6s4fg\" (UID: \"9c83f63c-9a9e-4e33-b0c6-54396ce9f07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-6s4fg" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.239784 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e0194ee7-d343-4042-9c2b-08cc513ee43e-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-mgzj7\" (UID: \"e0194ee7-d343-4042-9c2b-08cc513ee43e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mgzj7" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.230191 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.240096 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-sfch8" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.240224 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7q7bl" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.231326 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.232564 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.232906 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.233788 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.235320 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.236502 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.239816 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ww47w\" (UniqueName: \"kubernetes.io/projected/5822aa06-923f-4ba0-bdf6-617c5a5eb617-kube-api-access-ww47w\") pod \"route-controller-manager-6576b87f9c-fnmcv\" (UID: \"5822aa06-923f-4ba0-bdf6-617c5a5eb617\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fnmcv" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.241023 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r48jn\" (UniqueName: \"kubernetes.io/projected/b919959a-1da1-4c74-9330-5bb8c33f5c26-kube-api-access-r48jn\") pod \"console-operator-58897d9998-chrnr\" (UID: \"b919959a-1da1-4c74-9330-5bb8c33f5c26\") " pod="openshift-console-operator/console-operator-58897d9998-chrnr" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.241056 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4084fbb0-8fae-4b7e-a3f6-ec9d723bb367-audit-dir\") pod \"apiserver-76f77b778f-gh244\" (UID: \"4084fbb0-8fae-4b7e-a3f6-ec9d723bb367\") " pod="openshift-apiserver/apiserver-76f77b778f-gh244" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.241085 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e0194ee7-d343-4042-9c2b-08cc513ee43e-client-ca\") pod \"controller-manager-879f6c89f-mgzj7\" (UID: \"e0194ee7-d343-4042-9c2b-08cc513ee43e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mgzj7" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.241111 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/427a4c04-99cd-4f53-ae98-20c1755d7658-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-8tkbn\" (UID: \"427a4c04-99cd-4f53-ae98-20c1755d7658\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8tkbn" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 
14:06:20.241171 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0194ee7-d343-4042-9c2b-08cc513ee43e-config\") pod \"controller-manager-879f6c89f-mgzj7\" (UID: \"e0194ee7-d343-4042-9c2b-08cc513ee43e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mgzj7" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.241546 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c9f4b4e-a00a-4a59-bdc8-a76fe4d6d0d8-config\") pod \"machine-approver-56656f9798-7jl6s\" (UID: \"7c9f4b4e-a00a-4a59-bdc8-a76fe4d6d0d8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7jl6s" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.241573 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-6s4fg\" (UID: \"9c83f63c-9a9e-4e33-b0c6-54396ce9f07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-6s4fg" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.241593 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-6s4fg\" (UID: \"9c83f63c-9a9e-4e33-b0c6-54396ce9f07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-6s4fg" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.241623 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-6s4fg\" (UID: \"9c83f63c-9a9e-4e33-b0c6-54396ce9f07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-6s4fg" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.241647 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-6s4fg\" (UID: \"9c83f63c-9a9e-4e33-b0c6-54396ce9f07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-6s4fg" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.241674 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/9f2a2160-888c-4101-8b1c-63498753a2b7-available-featuregates\") pod \"openshift-config-operator-7777fb866f-jmwpv\" (UID: \"9f2a2160-888c-4101-8b1c-63498753a2b7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-jmwpv" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.242600 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ghkm\" (UniqueName: \"kubernetes.io/projected/062db6f1-77ab-4eca-be53-6480160aff81-kube-api-access-9ghkm\") pod \"apiserver-7bbb656c7d-2mf4l\" (UID: \"062db6f1-77ab-4eca-be53-6480160aff81\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2mf4l" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.242646 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4084fbb0-8fae-4b7e-a3f6-ec9d723bb367-etcd-client\") pod \"apiserver-76f77b778f-gh244\" (UID: \"4084fbb0-8fae-4b7e-a3f6-ec9d723bb367\") " pod="openshift-apiserver/apiserver-76f77b778f-gh244" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.242677 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/427a4c04-99cd-4f53-ae98-20c1755d7658-config\") pod \"authentication-operator-69f744f599-8tkbn\" (UID: \"427a4c04-99cd-4f53-ae98-20c1755d7658\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8tkbn" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.242734 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5822aa06-923f-4ba0-bdf6-617c5a5eb617-serving-cert\") pod \"route-controller-manager-6576b87f9c-fnmcv\" (UID: \"5822aa06-923f-4ba0-bdf6-617c5a5eb617\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fnmcv" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.242805 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/427a4c04-99cd-4f53-ae98-20c1755d7658-serving-cert\") pod \"authentication-operator-69f744f599-8tkbn\" (UID: \"427a4c04-99cd-4f53-ae98-20c1755d7658\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8tkbn" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.242837 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/062db6f1-77ab-4eca-be53-6480160aff81-audit-policies\") pod \"apiserver-7bbb656c7d-2mf4l\" (UID: \"062db6f1-77ab-4eca-be53-6480160aff81\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2mf4l" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.242858 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6f37779-2f40-4aed-a52b-ee2e2693f16c-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-sk7lp\" (UID: \"b6f37779-2f40-4aed-a52b-ee2e2693f16c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-sk7lp" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.242878 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4084fbb0-8fae-4b7e-a3f6-ec9d723bb367-trusted-ca-bundle\") pod \"apiserver-76f77b778f-gh244\" (UID: \"4084fbb0-8fae-4b7e-a3f6-ec9d723bb367\") " pod="openshift-apiserver/apiserver-76f77b778f-gh244" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.242897 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-6s4fg\" (UID: \"9c83f63c-9a9e-4e33-b0c6-54396ce9f07c\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-6s4fg" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.242925 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgnzc\" (UniqueName: \"kubernetes.io/projected/427a4c04-99cd-4f53-ae98-20c1755d7658-kube-api-access-wgnzc\") pod \"authentication-operator-69f744f599-8tkbn\" (UID: \"427a4c04-99cd-4f53-ae98-20c1755d7658\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8tkbn" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.243907 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9vnnw" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.244428 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.245264 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.255671 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.256658 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-hng8f"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.257478 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5j6rh"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.257806 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5j6rh" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.257967 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-hng8f" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.257961 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-tz6gh"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.258681 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.313961 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.314925 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.316113 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.317299 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kh6xb"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.317799 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kh6xb" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.318646 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.318675 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.318739 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.329373 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.331234 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-lh92k"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.331874 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lh92k" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.332116 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-b5jn4"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.332595 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-b5jn4" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.335659 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mrtb2"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.336123 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lnkwt"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.336503 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-dp8wn"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.336869 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-dp8wn" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.337173 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mrtb2" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.337269 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lnkwt" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.339585 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-tkhkf"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.340295 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tkhkf" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.344785 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5822aa06-923f-4ba0-bdf6-617c5a5eb617-config\") pod \"route-controller-manager-6576b87f9c-fnmcv\" (UID: \"5822aa06-923f-4ba0-bdf6-617c5a5eb617\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fnmcv" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.345256 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4084fbb0-8fae-4b7e-a3f6-ec9d723bb367-node-pullsecrets\") pod \"apiserver-76f77b778f-gh244\" (UID: \"4084fbb0-8fae-4b7e-a3f6-ec9d723bb367\") " pod="openshift-apiserver/apiserver-76f77b778f-gh244" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.345500 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0194ee7-d343-4042-9c2b-08cc513ee43e-serving-cert\") pod \"controller-manager-879f6c89f-mgzj7\" (UID: \"e0194ee7-d343-4042-9c2b-08cc513ee43e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mgzj7" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.345617 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p88m8\" (UniqueName: \"kubernetes.io/projected/e0194ee7-d343-4042-9c2b-08cc513ee43e-kube-api-access-p88m8\") pod \"controller-manager-879f6c89f-mgzj7\" (UID: \"e0194ee7-d343-4042-9c2b-08cc513ee43e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mgzj7" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.345722 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/062db6f1-77ab-4eca-be53-6480160aff81-etcd-client\") pod \"apiserver-7bbb656c7d-2mf4l\" (UID: \"062db6f1-77ab-4eca-be53-6480160aff81\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2mf4l" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.345815 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69cpk\" (UniqueName: \"kubernetes.io/projected/9f2a2160-888c-4101-8b1c-63498753a2b7-kube-api-access-69cpk\") pod \"openshift-config-operator-7777fb866f-jmwpv\" (UID: \"9f2a2160-888c-4101-8b1c-63498753a2b7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-jmwpv" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.345891 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/062db6f1-77ab-4eca-be53-6480160aff81-serving-cert\") pod \"apiserver-7bbb656c7d-2mf4l\" (UID: \"062db6f1-77ab-4eca-be53-6480160aff81\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2mf4l" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.345963 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/062db6f1-77ab-4eca-be53-6480160aff81-audit-dir\") pod \"apiserver-7bbb656c7d-2mf4l\" (UID: \"062db6f1-77ab-4eca-be53-6480160aff81\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2mf4l" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.346043 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"encryption-config\" (UniqueName: \"kubernetes.io/secret/4084fbb0-8fae-4b7e-a3f6-ec9d723bb367-encryption-config\") pod \"apiserver-76f77b778f-gh244\" (UID: \"4084fbb0-8fae-4b7e-a3f6-ec9d723bb367\") " pod="openshift-apiserver/apiserver-76f77b778f-gh244" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.346112 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-audit-dir\") pod \"oauth-openshift-558db77b4-6s4fg\" (UID: \"9c83f63c-9a9e-4e33-b0c6-54396ce9f07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-6s4fg" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.346222 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-6s4fg\" (UID: \"9c83f63c-9a9e-4e33-b0c6-54396ce9f07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-6s4fg" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.346375 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/4084fbb0-8fae-4b7e-a3f6-ec9d723bb367-node-pullsecrets\") pod \"apiserver-76f77b778f-gh244\" (UID: \"4084fbb0-8fae-4b7e-a3f6-ec9d723bb367\") " pod="openshift-apiserver/apiserver-76f77b778f-gh244" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.346497 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/062db6f1-77ab-4eca-be53-6480160aff81-encryption-config\") pod \"apiserver-7bbb656c7d-2mf4l\" (UID: \"062db6f1-77ab-4eca-be53-6480160aff81\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2mf4l" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.346956 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/062db6f1-77ab-4eca-be53-6480160aff81-audit-dir\") pod \"apiserver-7bbb656c7d-2mf4l\" (UID: \"062db6f1-77ab-4eca-be53-6480160aff81\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2mf4l" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.345912 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5822aa06-923f-4ba0-bdf6-617c5a5eb617-config\") pod \"route-controller-manager-6576b87f9c-fnmcv\" (UID: \"5822aa06-923f-4ba0-bdf6-617c5a5eb617\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fnmcv" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.347254 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-audit-dir\") pod \"oauth-openshift-558db77b4-6s4fg\" (UID: \"9c83f63c-9a9e-4e33-b0c6-54396ce9f07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-6s4fg" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.347462 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-6s4fg\" (UID: \"9c83f63c-9a9e-4e33-b0c6-54396ce9f07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-6s4fg" Mar 09 14:06:20 crc 
kubenswrapper[4722]: I0309 14:06:20.347600 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a0be9cc-2cda-4b22-838b-0036cfa4405c-config\") pod \"machine-api-operator-5694c8668f-xxz9x\" (UID: \"8a0be9cc-2cda-4b22-838b-0036cfa4405c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xxz9x" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.347701 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6f37779-2f40-4aed-a52b-ee2e2693f16c-config\") pod \"openshift-apiserver-operator-796bbdcf4f-sk7lp\" (UID: \"b6f37779-2f40-4aed-a52b-ee2e2693f16c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-sk7lp" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.349415 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-fnmcv"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.349459 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-mgzj7"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.350326 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/7c9f4b4e-a00a-4a59-bdc8-a76fe4d6d0d8-machine-approver-tls\") pod \"machine-approver-56656f9798-7jl6s\" (UID: \"7c9f4b4e-a00a-4a59-bdc8-a76fe4d6d0d8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7jl6s" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.350395 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6j4l\" (UniqueName: \"kubernetes.io/projected/e646e4a2-271c-4c66-ba95-67061b0323bf-kube-api-access-g6j4l\") pod \"cluster-samples-operator-665b6dd947-79q9f\" (UID: \"e646e4a2-271c-4c66-ba95-67061b0323bf\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-79q9f" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.350441 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7c9f4b4e-a00a-4a59-bdc8-a76fe4d6d0d8-auth-proxy-config\") pod \"machine-approver-56656f9798-7jl6s\" (UID: \"7c9f4b4e-a00a-4a59-bdc8-a76fe4d6d0d8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7jl6s" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.350463 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4084fbb0-8fae-4b7e-a3f6-ec9d723bb367-config\") pod \"apiserver-76f77b778f-gh244\" (UID: \"4084fbb0-8fae-4b7e-a3f6-ec9d723bb367\") " pod="openshift-apiserver/apiserver-76f77b778f-gh244" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.350486 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-audit-policies\") pod \"oauth-openshift-558db77b4-6s4fg\" (UID: \"9c83f63c-9a9e-4e33-b0c6-54396ce9f07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-6s4fg" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.350540 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/062db6f1-77ab-4eca-be53-6480160aff81-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-2mf4l\" (UID: \"062db6f1-77ab-4eca-be53-6480160aff81\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2mf4l" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.350563 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5822aa06-923f-4ba0-bdf6-617c5a5eb617-client-ca\") pod \"route-controller-manager-6576b87f9c-fnmcv\" (UID: \"5822aa06-923f-4ba0-bdf6-617c5a5eb617\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fnmcv" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.350607 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b919959a-1da1-4c74-9330-5bb8c33f5c26-serving-cert\") pod \"console-operator-58897d9998-chrnr\" (UID: \"b919959a-1da1-4c74-9330-5bb8c33f5c26\") " pod="openshift-console-operator/console-operator-58897d9998-chrnr" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.350628 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b919959a-1da1-4c74-9330-5bb8c33f5c26-trusted-ca\") pod \"console-operator-58897d9998-chrnr\" (UID: \"b919959a-1da1-4c74-9330-5bb8c33f5c26\") " pod="openshift-console-operator/console-operator-58897d9998-chrnr" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.350648 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4084fbb0-8fae-4b7e-a3f6-ec9d723bb367-serving-cert\") pod \"apiserver-76f77b778f-gh244\" (UID: \"4084fbb0-8fae-4b7e-a3f6-ec9d723bb367\") " pod="openshift-apiserver/apiserver-76f77b778f-gh244" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.350706 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e646e4a2-271c-4c66-ba95-67061b0323bf-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-79q9f\" (UID: \"e646e4a2-271c-4c66-ba95-67061b0323bf\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-79q9f" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.350727 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/4084fbb0-8fae-4b7e-a3f6-ec9d723bb367-etcd-serving-ca\") pod \"apiserver-76f77b778f-gh244\" (UID: \"4084fbb0-8fae-4b7e-a3f6-ec9d723bb367\") " pod="openshift-apiserver/apiserver-76f77b778f-gh244" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.350767 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2qqm\" (UniqueName: \"kubernetes.io/projected/4084fbb0-8fae-4b7e-a3f6-ec9d723bb367-kube-api-access-t2qqm\") pod \"apiserver-76f77b778f-gh244\" (UID: \"4084fbb0-8fae-4b7e-a3f6-ec9d723bb367\") " pod="openshift-apiserver/apiserver-76f77b778f-gh244" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.350789 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7n2r\" (UniqueName: \"kubernetes.io/projected/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-kube-api-access-c7n2r\") pod \"oauth-openshift-558db77b4-6s4fg\" (UID: \"9c83f63c-9a9e-4e33-b0c6-54396ce9f07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-6s4fg" Mar 09 
Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.350840 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkvbb\" (UniqueName: \"kubernetes.io/projected/8a0be9cc-2cda-4b22-838b-0036cfa4405c-kube-api-access-wkvbb\") pod \"machine-api-operator-5694c8668f-xxz9x\" (UID: \"8a0be9cc-2cda-4b22-838b-0036cfa4405c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xxz9x"
Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.350872 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f2a2160-888c-4101-8b1c-63498753a2b7-serving-cert\") pod \"openshift-config-operator-7777fb866f-jmwpv\" (UID: \"9f2a2160-888c-4101-8b1c-63498753a2b7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-jmwpv"
Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.351112 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a0be9cc-2cda-4b22-838b-0036cfa4405c-config\") pod \"machine-api-operator-5694c8668f-xxz9x\" (UID: \"8a0be9cc-2cda-4b22-838b-0036cfa4405c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xxz9x"
Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.351363 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-6s4fg\" (UID: \"9c83f63c-9a9e-4e33-b0c6-54396ce9f07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-6s4fg"
Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.351434 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/8a0be9cc-2cda-4b22-838b-0036cfa4405c-images\") pod \"machine-api-operator-5694c8668f-xxz9x\" (UID: \"8a0be9cc-2cda-4b22-838b-0036cfa4405c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xxz9x"
Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.351810 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/8a0be9cc-2cda-4b22-838b-0036cfa4405c-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-xxz9x\" (UID: \"8a0be9cc-2cda-4b22-838b-0036cfa4405c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xxz9x"
Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.351844 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b919959a-1da1-4c74-9330-5bb8c33f5c26-config\") pod \"console-operator-58897d9998-chrnr\" (UID: \"b919959a-1da1-4c74-9330-5bb8c33f5c26\") " pod="openshift-console-operator/console-operator-58897d9998-chrnr"
Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.351863 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/427a4c04-99cd-4f53-ae98-20c1755d7658-service-ca-bundle\") pod \"authentication-operator-69f744f599-8tkbn\" (UID: \"427a4c04-99cd-4f53-ae98-20c1755d7658\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8tkbn"
Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.351886 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/4084fbb0-8fae-4b7e-a3f6-ec9d723bb367-image-import-ca\") pod \"apiserver-76f77b778f-gh244\" (UID: \"4084fbb0-8fae-4b7e-a3f6-ec9d723bb367\") " pod="openshift-apiserver/apiserver-76f77b778f-gh244"
Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.351904 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hczss\" (UniqueName: \"kubernetes.io/projected/7c9f4b4e-a00a-4a59-bdc8-a76fe4d6d0d8-kube-api-access-hczss\") pod \"machine-approver-56656f9798-7jl6s\" (UID: \"7c9f4b4e-a00a-4a59-bdc8-a76fe4d6d0d8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7jl6s"
Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.351932 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/4084fbb0-8fae-4b7e-a3f6-ec9d723bb367-audit\") pod \"apiserver-76f77b778f-gh244\" (UID: \"4084fbb0-8fae-4b7e-a3f6-ec9d723bb367\") " pod="openshift-apiserver/apiserver-76f77b778f-gh244"
Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.351953 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-6s4fg\" (UID: \"9c83f63c-9a9e-4e33-b0c6-54396ce9f07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-6s4fg"
Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.351969 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-6s4fg\" (UID: \"9c83f63c-9a9e-4e33-b0c6-54396ce9f07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-6s4fg"
Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.351988 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsnwn\" (UniqueName: \"kubernetes.io/projected/b6f37779-2f40-4aed-a52b-ee2e2693f16c-kube-api-access-vsnwn\") pod \"openshift-apiserver-operator-796bbdcf4f-sk7lp\" (UID: \"b6f37779-2f40-4aed-a52b-ee2e2693f16c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-sk7lp"
Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.352006 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/062db6f1-77ab-4eca-be53-6480160aff81-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-2mf4l\" (UID: \"062db6f1-77ab-4eca-be53-6480160aff81\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2mf4l"
Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.352025 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e0194ee7-d343-4042-9c2b-08cc513ee43e-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-mgzj7\" (UID: \"e0194ee7-d343-4042-9c2b-08cc513ee43e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mgzj7"
Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.352043 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ww47w\" (UniqueName: \"kubernetes.io/projected/5822aa06-923f-4ba0-bdf6-617c5a5eb617-kube-api-access-ww47w\") pod \"route-controller-manager-6576b87f9c-fnmcv\" (UID: \"5822aa06-923f-4ba0-bdf6-617c5a5eb617\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fnmcv"
Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.352060 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r48jn\" (UniqueName: \"kubernetes.io/projected/b919959a-1da1-4c74-9330-5bb8c33f5c26-kube-api-access-r48jn\") pod \"console-operator-58897d9998-chrnr\" (UID: \"b919959a-1da1-4c74-9330-5bb8c33f5c26\") " pod="openshift-console-operator/console-operator-58897d9998-chrnr"
Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.352079 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4084fbb0-8fae-4b7e-a3f6-ec9d723bb367-audit-dir\") pod \"apiserver-76f77b778f-gh244\" (UID: \"4084fbb0-8fae-4b7e-a3f6-ec9d723bb367\") " pod="openshift-apiserver/apiserver-76f77b778f-gh244"
Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.352102 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65gll\" (UniqueName: \"kubernetes.io/projected/b642aeb0-15ca-4f07-a83d-ede389a04408-kube-api-access-65gll\") pod \"openshift-controller-manager-operator-756b6f6bc6-68k6r\" (UID: \"b642aeb0-15ca-4f07-a83d-ede389a04408\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-68k6r"
Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.352135 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e0194ee7-d343-4042-9c2b-08cc513ee43e-client-ca\") pod \"controller-manager-879f6c89f-mgzj7\" (UID: \"e0194ee7-d343-4042-9c2b-08cc513ee43e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mgzj7"
Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.352159 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/427a4c04-99cd-4f53-ae98-20c1755d7658-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-8tkbn\" (UID: \"427a4c04-99cd-4f53-ae98-20c1755d7658\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8tkbn"
Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.352182 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0194ee7-d343-4042-9c2b-08cc513ee43e-config\") pod \"controller-manager-879f6c89f-mgzj7\" (UID: \"e0194ee7-d343-4042-9c2b-08cc513ee43e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mgzj7"
Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.352221 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c9f4b4e-a00a-4a59-bdc8-a76fe4d6d0d8-config\") pod \"machine-approver-56656f9798-7jl6s\" (UID: \"7c9f4b4e-a00a-4a59-bdc8-a76fe4d6d0d8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7jl6s"
Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.352244 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-6s4fg\" (UID: \"9c83f63c-9a9e-4e33-b0c6-54396ce9f07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-6s4fg"
Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.352263 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-6s4fg\" (UID: \"9c83f63c-9a9e-4e33-b0c6-54396ce9f07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-6s4fg"
Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.352287 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-6s4fg\" (UID: \"9c83f63c-9a9e-4e33-b0c6-54396ce9f07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-6s4fg"
Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.352312 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-6s4fg\" (UID: \"9c83f63c-9a9e-4e33-b0c6-54396ce9f07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-6s4fg"
Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.352365 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b642aeb0-15ca-4f07-a83d-ede389a04408-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-68k6r\" (UID: \"b642aeb0-15ca-4f07-a83d-ede389a04408\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-68k6r"
Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.352391 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/9f2a2160-888c-4101-8b1c-63498753a2b7-available-featuregates\") pod \"openshift-config-operator-7777fb866f-jmwpv\" (UID: \"9f2a2160-888c-4101-8b1c-63498753a2b7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-jmwpv"
Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.352412 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ghkm\" (UniqueName: \"kubernetes.io/projected/062db6f1-77ab-4eca-be53-6480160aff81-kube-api-access-9ghkm\") pod \"apiserver-7bbb656c7d-2mf4l\" (UID: \"062db6f1-77ab-4eca-be53-6480160aff81\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2mf4l"
Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.352477 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4084fbb0-8fae-4b7e-a3f6-ec9d723bb367-etcd-client\") pod \"apiserver-76f77b778f-gh244\" (UID: \"4084fbb0-8fae-4b7e-a3f6-ec9d723bb367\") " pod="openshift-apiserver/apiserver-76f77b778f-gh244"
Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.352523 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/427a4c04-99cd-4f53-ae98-20c1755d7658-config\") pod \"authentication-operator-69f744f599-8tkbn\" (UID: \"427a4c04-99cd-4f53-ae98-20c1755d7658\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8tkbn"
Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.352543 
4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5822aa06-923f-4ba0-bdf6-617c5a5eb617-serving-cert\") pod \"route-controller-manager-6576b87f9c-fnmcv\" (UID: \"5822aa06-923f-4ba0-bdf6-617c5a5eb617\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fnmcv" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.352561 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/427a4c04-99cd-4f53-ae98-20c1755d7658-serving-cert\") pod \"authentication-operator-69f744f599-8tkbn\" (UID: \"427a4c04-99cd-4f53-ae98-20c1755d7658\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8tkbn" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.352619 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/062db6f1-77ab-4eca-be53-6480160aff81-audit-policies\") pod \"apiserver-7bbb656c7d-2mf4l\" (UID: \"062db6f1-77ab-4eca-be53-6480160aff81\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2mf4l" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.352640 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6f37779-2f40-4aed-a52b-ee2e2693f16c-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-sk7lp\" (UID: \"b6f37779-2f40-4aed-a52b-ee2e2693f16c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-sk7lp" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.352688 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4084fbb0-8fae-4b7e-a3f6-ec9d723bb367-trusted-ca-bundle\") pod \"apiserver-76f77b778f-gh244\" (UID: \"4084fbb0-8fae-4b7e-a3f6-ec9d723bb367\") " pod="openshift-apiserver/apiserver-76f77b778f-gh244" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.352710 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-6s4fg\" (UID: \"9c83f63c-9a9e-4e33-b0c6-54396ce9f07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-6s4fg" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.352738 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgnzc\" (UniqueName: \"kubernetes.io/projected/427a4c04-99cd-4f53-ae98-20c1755d7658-kube-api-access-wgnzc\") pod \"authentication-operator-69f744f599-8tkbn\" (UID: \"427a4c04-99cd-4f53-ae98-20c1755d7658\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8tkbn" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.352779 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b642aeb0-15ca-4f07-a83d-ede389a04408-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-68k6r\" (UID: \"b642aeb0-15ca-4f07-a83d-ede389a04408\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-68k6r" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.352812 4722 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-6s4fg\" (UID: \"9c83f63c-9a9e-4e33-b0c6-54396ce9f07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-6s4fg" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.352857 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78rj2\" (UniqueName: \"kubernetes.io/projected/8f915ef9-5d9a-43ee-a333-def8766e083d-kube-api-access-78rj2\") pod \"downloads-7954f5f757-knrzp\" (UID: \"8f915ef9-5d9a-43ee-a333-def8766e083d\") " pod="openshift-console/downloads-7954f5f757-knrzp" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.353118 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6f37779-2f40-4aed-a52b-ee2e2693f16c-config\") pod \"openshift-apiserver-operator-796bbdcf4f-sk7lp\" (UID: \"b6f37779-2f40-4aed-a52b-ee2e2693f16c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-sk7lp" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.356668 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/4084fbb0-8fae-4b7e-a3f6-ec9d723bb367-encryption-config\") pod \"apiserver-76f77b778f-gh244\" (UID: \"4084fbb0-8fae-4b7e-a3f6-ec9d723bb367\") " pod="openshift-apiserver/apiserver-76f77b778f-gh244" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.357442 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/7c9f4b4e-a00a-4a59-bdc8-a76fe4d6d0d8-machine-approver-tls\") pod \"machine-approver-56656f9798-7jl6s\" (UID: \"7c9f4b4e-a00a-4a59-bdc8-a76fe4d6d0d8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7jl6s" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.358116 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7c9f4b4e-a00a-4a59-bdc8-a76fe4d6d0d8-auth-proxy-config\") pod \"machine-approver-56656f9798-7jl6s\" (UID: \"7c9f4b4e-a00a-4a59-bdc8-a76fe4d6d0d8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7jl6s" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.358374 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.358708 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4084fbb0-8fae-4b7e-a3f6-ec9d723bb367-config\") pod \"apiserver-76f77b778f-gh244\" (UID: \"4084fbb0-8fae-4b7e-a3f6-ec9d723bb367\") " pod="openshift-apiserver/apiserver-76f77b778f-gh244" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.359194 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/062db6f1-77ab-4eca-be53-6480160aff81-etcd-client\") pod \"apiserver-7bbb656c7d-2mf4l\" (UID: \"062db6f1-77ab-4eca-be53-6480160aff81\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2mf4l" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.359584 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/062db6f1-77ab-4eca-be53-6480160aff81-encryption-config\") pod \"apiserver-7bbb656c7d-2mf4l\" (UID: \"062db6f1-77ab-4eca-be53-6480160aff81\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2mf4l" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.360249 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-audit-policies\") pod \"oauth-openshift-558db77b4-6s4fg\" (UID: \"9c83f63c-9a9e-4e33-b0c6-54396ce9f07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-6s4fg" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.360847 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/062db6f1-77ab-4eca-be53-6480160aff81-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-2mf4l\" (UID: \"062db6f1-77ab-4eca-be53-6480160aff81\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2mf4l" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.361387 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0194ee7-d343-4042-9c2b-08cc513ee43e-config\") pod \"controller-manager-879f6c89f-mgzj7\" (UID: \"e0194ee7-d343-4042-9c2b-08cc513ee43e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mgzj7" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.361395 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b919959a-1da1-4c74-9330-5bb8c33f5c26-trusted-ca\") pod \"console-operator-58897d9998-chrnr\" (UID: \"b919959a-1da1-4c74-9330-5bb8c33f5c26\") " pod="openshift-console-operator/console-operator-58897d9998-chrnr" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.361546 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/4084fbb0-8fae-4b7e-a3f6-ec9d723bb367-etcd-serving-ca\") pod \"apiserver-76f77b778f-gh244\" (UID: \"4084fbb0-8fae-4b7e-a3f6-ec9d723bb367\") " pod="openshift-apiserver/apiserver-76f77b778f-gh244" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.362187 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-6s4fg\" (UID: \"9c83f63c-9a9e-4e33-b0c6-54396ce9f07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-6s4fg" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.362354 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4084fbb0-8fae-4b7e-a3f6-ec9d723bb367-trusted-ca-bundle\") pod \"apiserver-76f77b778f-gh244\" (UID: \"4084fbb0-8fae-4b7e-a3f6-ec9d723bb367\") " pod="openshift-apiserver/apiserver-76f77b778f-gh244" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.362729 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/062db6f1-77ab-4eca-be53-6480160aff81-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-2mf4l\" (UID: \"062db6f1-77ab-4eca-be53-6480160aff81\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2mf4l" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.362846 4722 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b919959a-1da1-4c74-9330-5bb8c33f5c26-serving-cert\") pod \"console-operator-58897d9998-chrnr\" (UID: \"b919959a-1da1-4c74-9330-5bb8c33f5c26\") " pod="openshift-console-operator/console-operator-58897d9998-chrnr" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.363333 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-6s4fg\" (UID: \"9c83f63c-9a9e-4e33-b0c6-54396ce9f07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-6s4fg" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.364102 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f2a2160-888c-4101-8b1c-63498753a2b7-serving-cert\") pod \"openshift-config-operator-7777fb866f-jmwpv\" (UID: \"9f2a2160-888c-4101-8b1c-63498753a2b7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-jmwpv" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.364225 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/427a4c04-99cd-4f53-ae98-20c1755d7658-config\") pod \"authentication-operator-69f744f599-8tkbn\" (UID: \"427a4c04-99cd-4f53-ae98-20c1755d7658\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8tkbn" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.364309 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e0194ee7-d343-4042-9c2b-08cc513ee43e-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-mgzj7\" (UID: \"e0194ee7-d343-4042-9c2b-08cc513ee43e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mgzj7" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.364504 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4084fbb0-8fae-4b7e-a3f6-ec9d723bb367-audit-dir\") pod \"apiserver-76f77b778f-gh244\" (UID: \"4084fbb0-8fae-4b7e-a3f6-ec9d723bb367\") " pod="openshift-apiserver/apiserver-76f77b778f-gh244" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.364592 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/427a4c04-99cd-4f53-ae98-20c1755d7658-service-ca-bundle\") pod \"authentication-operator-69f744f599-8tkbn\" (UID: \"427a4c04-99cd-4f53-ae98-20c1755d7658\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8tkbn" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.365048 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c9f4b4e-a00a-4a59-bdc8-a76fe4d6d0d8-config\") pod \"machine-approver-56656f9798-7jl6s\" (UID: \"7c9f4b4e-a00a-4a59-bdc8-a76fe4d6d0d8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7jl6s" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.365161 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-6s4fg\" (UID: 
\"9c83f63c-9a9e-4e33-b0c6-54396ce9f07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-6s4fg" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.365297 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/e646e4a2-271c-4c66-ba95-67061b0323bf-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-79q9f\" (UID: \"e646e4a2-271c-4c66-ba95-67061b0323bf\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-79q9f" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.365496 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/062db6f1-77ab-4eca-be53-6480160aff81-audit-policies\") pod \"apiserver-7bbb656c7d-2mf4l\" (UID: \"062db6f1-77ab-4eca-be53-6480160aff81\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2mf4l" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.365534 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e0194ee7-d343-4042-9c2b-08cc513ee43e-client-ca\") pod \"controller-manager-879f6c89f-mgzj7\" (UID: \"e0194ee7-d343-4042-9c2b-08cc513ee43e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mgzj7" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.365597 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2xkrn"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.365847 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/8a0be9cc-2cda-4b22-838b-0036cfa4405c-images\") pod \"machine-api-operator-5694c8668f-xxz9x\" (UID: \"8a0be9cc-2cda-4b22-838b-0036cfa4405c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xxz9x" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.366503 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2xkrn" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.366522 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-nvtgb"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.366596 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5822aa06-923f-4ba0-bdf6-617c5a5eb617-serving-cert\") pod \"route-controller-manager-6576b87f9c-fnmcv\" (UID: \"5822aa06-923f-4ba0-bdf6-617c5a5eb617\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fnmcv" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.366812 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5822aa06-923f-4ba0-bdf6-617c5a5eb617-client-ca\") pod \"route-controller-manager-6576b87f9c-fnmcv\" (UID: \"5822aa06-923f-4ba0-bdf6-617c5a5eb617\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fnmcv" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.367165 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-nvtgb" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.367231 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/427a4c04-99cd-4f53-ae98-20c1755d7658-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-8tkbn\" (UID: \"427a4c04-99cd-4f53-ae98-20c1755d7658\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8tkbn" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.367398 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/9f2a2160-888c-4101-8b1c-63498753a2b7-available-featuregates\") pod \"openshift-config-operator-7777fb866f-jmwpv\" (UID: \"9f2a2160-888c-4101-8b1c-63498753a2b7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-jmwpv" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.367547 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-6s4fg\" (UID: \"9c83f63c-9a9e-4e33-b0c6-54396ce9f07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-6s4fg" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.368819 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4084fbb0-8fae-4b7e-a3f6-ec9d723bb367-serving-cert\") pod \"apiserver-76f77b778f-gh244\" (UID: \"4084fbb0-8fae-4b7e-a3f6-ec9d723bb367\") " pod="openshift-apiserver/apiserver-76f77b778f-gh244" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.369377 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-6s4fg\" (UID: \"9c83f63c-9a9e-4e33-b0c6-54396ce9f07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-6s4fg" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.369806 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b919959a-1da1-4c74-9330-5bb8c33f5c26-config\") pod \"console-operator-58897d9998-chrnr\" (UID: \"b919959a-1da1-4c74-9330-5bb8c33f5c26\") " pod="openshift-console-operator/console-operator-58897d9998-chrnr" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.370110 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-6s4fg\" (UID: \"9c83f63c-9a9e-4e33-b0c6-54396ce9f07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-6s4fg" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.370228 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/4084fbb0-8fae-4b7e-a3f6-ec9d723bb367-image-import-ca\") pod \"apiserver-76f77b778f-gh244\" (UID: \"4084fbb0-8fae-4b7e-a3f6-ec9d723bb367\") " pod="openshift-apiserver/apiserver-76f77b778f-gh244" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.370637 4722 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-6s4fg\" (UID: \"9c83f63c-9a9e-4e33-b0c6-54396ce9f07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-6s4fg" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.370750 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-6db7b"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.371916 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-6s4fg\" (UID: \"9c83f63c-9a9e-4e33-b0c6-54396ce9f07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-6s4fg" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.372484 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.372612 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4084fbb0-8fae-4b7e-a3f6-ec9d723bb367-etcd-client\") pod \"apiserver-76f77b778f-gh244\" (UID: \"4084fbb0-8fae-4b7e-a3f6-ec9d723bb367\") " pod="openshift-apiserver/apiserver-76f77b778f-gh244" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.372977 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/062db6f1-77ab-4eca-be53-6480160aff81-serving-cert\") pod \"apiserver-7bbb656c7d-2mf4l\" (UID: \"062db6f1-77ab-4eca-be53-6480160aff81\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2mf4l" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.373279 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-6s4fg\" (UID: \"9c83f63c-9a9e-4e33-b0c6-54396ce9f07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-6s4fg" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.377531 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/8a0be9cc-2cda-4b22-838b-0036cfa4405c-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-xxz9x\" (UID: \"8a0be9cc-2cda-4b22-838b-0036cfa4405c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xxz9x" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.377693 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-6s4fg\" (UID: \"9c83f63c-9a9e-4e33-b0c6-54396ce9f07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-6s4fg" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.378356 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-nk4z2"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.378796 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-6db7b" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.379022 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0194ee7-d343-4042-9c2b-08cc513ee43e-serving-cert\") pod \"controller-manager-879f6c89f-mgzj7\" (UID: \"e0194ee7-d343-4042-9c2b-08cc513ee43e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mgzj7" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.379098 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-8jf57"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.379297 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nk4z2" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.379388 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/4084fbb0-8fae-4b7e-a3f6-ec9d723bb367-audit\") pod \"apiserver-76f77b778f-gh244\" (UID: \"4084fbb0-8fae-4b7e-a3f6-ec9d723bb367\") " pod="openshift-apiserver/apiserver-76f77b778f-gh244" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.379748 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dgxnd"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.379899 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b6f37779-2f40-4aed-a52b-ee2e2693f16c-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-sk7lp\" (UID: \"b6f37779-2f40-4aed-a52b-ee2e2693f16c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-sk7lp" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.380086 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8jf57" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.380183 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fp2th"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.380476 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dgxnd" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.380720 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wz9f2"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.380875 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fp2th" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.382050 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-2mf4l"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.382081 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551080-mdzx2"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.382174 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-wz9f2" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.382313 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/427a4c04-99cd-4f53-ae98-20c1755d7658-serving-cert\") pod \"authentication-operator-69f744f599-8tkbn\" (UID: \"427a4c04-99cd-4f53-ae98-20c1755d7658\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8tkbn" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.382744 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-f8gtl"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.382879 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551080-mdzx2" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.383491 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551086-cvc28"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.383702 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-f8gtl" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.384273 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551086-cvc28" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.385438 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-6s4fg\" (UID: \"9c83f63c-9a9e-4e33-b0c6-54396ce9f07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-6s4fg" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.385535 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-8tkbn"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.386928 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-jmwpv"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.388473 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-chrnr"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.389814 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.390518 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-sk7lp"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.392250 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-6s4fg"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.394376 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-6b65x"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.395690 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-b5hps"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.398017 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lnkwt"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.398139 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-b5hps" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.400110 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-xxz9x"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.400176 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2xkrn"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.401546 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-nk4z2"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.402677 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7q7bl"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.416384 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.419774 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-hng8f"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.421715 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9vnnw"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.423186 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-79q9f"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.424275 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-tz6gh"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.425405 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-b5jn4"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.426469 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-sfch8"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.429613 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mrtb2"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.430363 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.431122 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-knrzp"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.432969 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-68k6r"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.434262 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-hdrrh"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.436005 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-nnjvd"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.436156 4722 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-hdrrh" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.436792 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-lh92k"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.436905 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-nnjvd" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.437378 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-nvtgb"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.438479 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-8jf57"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.439522 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-6db7b"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.441144 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5j6rh"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.442326 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kh6xb"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.443596 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wz9f2"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.444624 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-nnjvd"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.445708 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-tkhkf"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.446846 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551080-mdzx2"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.448403 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-gh244"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.450922 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.451086 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-f8gtl"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.452136 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-hdrrh"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.453211 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-b5hps"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.453553 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65gll\" (UniqueName: \"kubernetes.io/projected/b642aeb0-15ca-4f07-a83d-ede389a04408-kube-api-access-65gll\") pod \"openshift-controller-manager-operator-756b6f6bc6-68k6r\" (UID: \"b642aeb0-15ca-4f07-a83d-ede389a04408\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-68k6r" 
Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.453617 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b642aeb0-15ca-4f07-a83d-ede389a04408-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-68k6r\" (UID: \"b642aeb0-15ca-4f07-a83d-ede389a04408\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-68k6r" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.454239 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b642aeb0-15ca-4f07-a83d-ede389a04408-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-68k6r\" (UID: \"b642aeb0-15ca-4f07-a83d-ede389a04408\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-68k6r" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.454336 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-pjvfp"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.455032 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b642aeb0-15ca-4f07-a83d-ede389a04408-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-68k6r\" (UID: \"b642aeb0-15ca-4f07-a83d-ede389a04408\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-68k6r" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.455245 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-pjvfp" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.455460 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551086-cvc28"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.456343 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b642aeb0-15ca-4f07-a83d-ede389a04408-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-68k6r\" (UID: \"b642aeb0-15ca-4f07-a83d-ede389a04408\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-68k6r" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.456568 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fp2th"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.457703 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dgxnd"] Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.471377 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.491329 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.510145 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.530170 4722 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.550245 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.570254 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.591773 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.611665 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.629646 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.658575 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.670045 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.688948 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.710656 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.749380 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.769594 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.788680 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.809642 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.829290 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.876376 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.892091 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.911354 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.930155 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 
14:06:20.949983 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.970079 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 09 14:06:20 crc kubenswrapper[4722]: I0309 14:06:20.989756 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 09 14:06:21 crc kubenswrapper[4722]: I0309 14:06:21.010241 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 09 14:06:21 crc kubenswrapper[4722]: I0309 14:06:21.030559 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 09 14:06:21 crc kubenswrapper[4722]: I0309 14:06:21.050839 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 09 14:06:21 crc kubenswrapper[4722]: I0309 14:06:21.070287 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 09 14:06:21 crc kubenswrapper[4722]: I0309 14:06:21.090248 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 09 14:06:21 crc kubenswrapper[4722]: I0309 14:06:21.109788 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 09 14:06:21 crc kubenswrapper[4722]: I0309 14:06:21.131256 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 09 14:06:21 crc kubenswrapper[4722]: I0309 14:06:21.149493 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 09 14:06:21 crc kubenswrapper[4722]: I0309 14:06:21.170589 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 09 14:06:21 crc kubenswrapper[4722]: I0309 14:06:21.189042 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 09 14:06:21 crc kubenswrapper[4722]: I0309 14:06:21.209178 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 09 14:06:21 crc kubenswrapper[4722]: I0309 14:06:21.230473 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 09 14:06:21 crc kubenswrapper[4722]: I0309 14:06:21.249802 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 09 14:06:21 crc kubenswrapper[4722]: I0309 14:06:21.270186 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 09 14:06:21 crc kubenswrapper[4722]: I0309 14:06:21.290671 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 09 14:06:21 crc kubenswrapper[4722]: I0309 14:06:21.309936 4722 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 09 14:06:21 crc kubenswrapper[4722]: I0309 14:06:21.329852 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 09 14:06:21 crc kubenswrapper[4722]: I0309 14:06:21.348368 4722 request.go:700] Waited for 1.001986061s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-config-operator/serviceaccounts/openshift-config-operator/token Mar 09 14:06:21 crc kubenswrapper[4722]: I0309 14:06:21.366258 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69cpk\" (UniqueName: \"kubernetes.io/projected/9f2a2160-888c-4101-8b1c-63498753a2b7-kube-api-access-69cpk\") pod \"openshift-config-operator-7777fb866f-jmwpv\" (UID: \"9f2a2160-888c-4101-8b1c-63498753a2b7\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-jmwpv" Mar 09 14:06:21 crc kubenswrapper[4722]: I0309 14:06:21.390538 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p88m8\" (UniqueName: \"kubernetes.io/projected/e0194ee7-d343-4042-9c2b-08cc513ee43e-kube-api-access-p88m8\") pod \"controller-manager-879f6c89f-mgzj7\" (UID: \"e0194ee7-d343-4042-9c2b-08cc513ee43e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mgzj7" Mar 09 14:06:21 crc kubenswrapper[4722]: I0309 14:06:21.405958 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2qqm\" (UniqueName: \"kubernetes.io/projected/4084fbb0-8fae-4b7e-a3f6-ec9d723bb367-kube-api-access-t2qqm\") pod \"apiserver-76f77b778f-gh244\" (UID: \"4084fbb0-8fae-4b7e-a3f6-ec9d723bb367\") " pod="openshift-apiserver/apiserver-76f77b778f-gh244" Mar 09 14:06:21 crc kubenswrapper[4722]: I0309 14:06:21.424532 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7n2r\" (UniqueName: \"kubernetes.io/projected/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-kube-api-access-c7n2r\") pod \"oauth-openshift-558db77b4-6s4fg\" (UID: \"9c83f63c-9a9e-4e33-b0c6-54396ce9f07c\") " pod="openshift-authentication/oauth-openshift-558db77b4-6s4fg" Mar 09 14:06:21 crc kubenswrapper[4722]: I0309 14:06:21.445261 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkvbb\" (UniqueName: \"kubernetes.io/projected/8a0be9cc-2cda-4b22-838b-0036cfa4405c-kube-api-access-wkvbb\") pod \"machine-api-operator-5694c8668f-xxz9x\" (UID: \"8a0be9cc-2cda-4b22-838b-0036cfa4405c\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-xxz9x" Mar 09 14:06:21 crc kubenswrapper[4722]: I0309 14:06:21.489639 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6j4l\" (UniqueName: \"kubernetes.io/projected/e646e4a2-271c-4c66-ba95-67061b0323bf-kube-api-access-g6j4l\") pod \"cluster-samples-operator-665b6dd947-79q9f\" (UID: \"e646e4a2-271c-4c66-ba95-67061b0323bf\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-79q9f" Mar 09 14:06:21 crc kubenswrapper[4722]: I0309 14:06:21.509173 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ww47w\" (UniqueName: \"kubernetes.io/projected/5822aa06-923f-4ba0-bdf6-617c5a5eb617-kube-api-access-ww47w\") pod \"route-controller-manager-6576b87f9c-fnmcv\" (UID: \"5822aa06-923f-4ba0-bdf6-617c5a5eb617\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fnmcv" Mar 09 14:06:21 crc kubenswrapper[4722]: I0309 14:06:21.516436 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsnwn\" (UniqueName: \"kubernetes.io/projected/b6f37779-2f40-4aed-a52b-ee2e2693f16c-kube-api-access-vsnwn\") pod \"openshift-apiserver-operator-796bbdcf4f-sk7lp\" (UID: \"b6f37779-2f40-4aed-a52b-ee2e2693f16c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-sk7lp" Mar 09 14:06:21 crc kubenswrapper[4722]: I0309 14:06:21.517633 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-6s4fg" Mar 09 14:06:21 crc kubenswrapper[4722]: I0309 14:06:21.528327 4722 patch_prober.go:28] interesting pod/machine-config-daemon-hjrrb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 14:06:21 crc kubenswrapper[4722]: I0309 14:06:21.528431 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 14:06:21 crc kubenswrapper[4722]: I0309 14:06:21.534174 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r48jn\" (UniqueName: \"kubernetes.io/projected/b919959a-1da1-4c74-9330-5bb8c33f5c26-kube-api-access-r48jn\") pod \"console-operator-58897d9998-chrnr\" (UID: \"b919959a-1da1-4c74-9330-5bb8c33f5c26\") " pod="openshift-console-operator/console-operator-58897d9998-chrnr" Mar 09 14:06:21 crc kubenswrapper[4722]: I0309 14:06:21.540664 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-79q9f" Mar 09 14:06:21 crc kubenswrapper[4722]: I0309 14:06:21.545106 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78rj2\" (UniqueName: \"kubernetes.io/projected/8f915ef9-5d9a-43ee-a333-def8766e083d-kube-api-access-78rj2\") pod \"downloads-7954f5f757-knrzp\" (UID: \"8f915ef9-5d9a-43ee-a333-def8766e083d\") " pod="openshift-console/downloads-7954f5f757-knrzp" Mar 09 14:06:21 crc kubenswrapper[4722]: I0309 14:06:21.550745 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 09 14:06:21 crc kubenswrapper[4722]: I0309 14:06:21.570594 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 09 14:06:21 crc kubenswrapper[4722]: I0309 14:06:21.613898 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgnzc\" (UniqueName: \"kubernetes.io/projected/427a4c04-99cd-4f53-ae98-20c1755d7658-kube-api-access-wgnzc\") pod \"authentication-operator-69f744f599-8tkbn\" (UID: \"427a4c04-99cd-4f53-ae98-20c1755d7658\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-8tkbn" Mar 09 14:06:21 crc kubenswrapper[4722]: I0309 14:06:21.617386 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-jmwpv" Mar 09 14:06:21 crc kubenswrapper[4722]: I0309 14:06:21.624618 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-knrzp" Mar 09 14:06:21 crc kubenswrapper[4722]: I0309 14:06:21.627597 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-sk7lp" Mar 09 14:06:21 crc kubenswrapper[4722]: I0309 14:06:21.632275 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-gh244" Mar 09 14:06:21 crc kubenswrapper[4722]: I0309 14:06:21.632613 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ghkm\" (UniqueName: \"kubernetes.io/projected/062db6f1-77ab-4eca-be53-6480160aff81-kube-api-access-9ghkm\") pod \"apiserver-7bbb656c7d-2mf4l\" (UID: \"062db6f1-77ab-4eca-be53-6480160aff81\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2mf4l" Mar 09 14:06:21 crc kubenswrapper[4722]: I0309 14:06:21.632616 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 09 14:06:21 crc kubenswrapper[4722]: I0309 14:06:21.639012 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-chrnr" Mar 09 14:06:21 crc kubenswrapper[4722]: I0309 14:06:21.642265 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-mgzj7" Mar 09 14:06:21 crc kubenswrapper[4722]: I0309 14:06:21.650651 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 09 14:06:21 crc kubenswrapper[4722]: I0309 14:06:21.660040 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fnmcv" Mar 09 14:06:21 crc kubenswrapper[4722]: I0309 14:06:21.670495 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 09 14:06:21 crc kubenswrapper[4722]: I0309 14:06:21.676966 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-xxz9x" Mar 09 14:06:21 crc kubenswrapper[4722]: I0309 14:06:21.689854 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 09 14:06:21 crc kubenswrapper[4722]: I0309 14:06:21.712658 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 09 14:06:21 crc kubenswrapper[4722]: I0309 14:06:21.713572 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-8tkbn" Mar 09 14:06:21 crc kubenswrapper[4722]: I0309 14:06:21.739317 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2mf4l" Mar 09 14:06:21 crc kubenswrapper[4722]: I0309 14:06:21.741066 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-6s4fg"] Mar 09 14:06:21 crc kubenswrapper[4722]: I0309 14:06:21.750064 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 09 14:06:21 crc kubenswrapper[4722]: I0309 14:06:21.759375 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hczss\" (UniqueName: \"kubernetes.io/projected/7c9f4b4e-a00a-4a59-bdc8-a76fe4d6d0d8-kube-api-access-hczss\") pod \"machine-approver-56656f9798-7jl6s\" (UID: \"7c9f4b4e-a00a-4a59-bdc8-a76fe4d6d0d8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7jl6s" Mar 09 14:06:21 crc kubenswrapper[4722]: I0309 14:06:21.764506 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-79q9f"] Mar 09 14:06:21 crc kubenswrapper[4722]: I0309 14:06:21.770139 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 09 14:06:21 crc kubenswrapper[4722]: I0309 14:06:21.790759 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 09 14:06:21 crc kubenswrapper[4722]: I0309 14:06:21.810130 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 09 14:06:21 crc kubenswrapper[4722]: I0309 14:06:21.830630 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7jl6s" Mar 09 14:06:21 crc kubenswrapper[4722]: I0309 14:06:21.831175 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 09 14:06:21 crc kubenswrapper[4722]: I0309 14:06:21.851740 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 09 14:06:21 crc kubenswrapper[4722]: I0309 14:06:21.872633 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 09 14:06:21 crc kubenswrapper[4722]: I0309 14:06:21.890292 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 09 14:06:21 crc kubenswrapper[4722]: I0309 14:06:21.911977 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 09 14:06:21 crc kubenswrapper[4722]: I0309 14:06:21.931816 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" Mar 09 14:06:21 crc kubenswrapper[4722]: I0309 14:06:21.932155 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 09 14:06:21 crc kubenswrapper[4722]: I0309 14:06:21.952597 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 09 14:06:21 crc kubenswrapper[4722]: I0309 14:06:21.970324 4722 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 09 14:06:21 crc kubenswrapper[4722]: I0309 14:06:21.989737 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.011308 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.032628 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.049983 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.079723 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.089763 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.109751 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-sk7lp"] Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.114947 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.130076 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-jmwpv"] Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.130304 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.131416 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-knrzp"] Mar 09 14:06:22 crc kubenswrapper[4722]: W0309 14:06:22.135487 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f915ef9_5d9a_43ee_a333_def8766e083d.slice/crio-602e706b4e7a149023707e185f3b7f1ecf059c1c6c39b720309bec6e3973a16a WatchSource:0}: Error finding container 602e706b4e7a149023707e185f3b7f1ecf059c1c6c39b720309bec6e3973a16a: Status 404 returned error can't find the container with id 602e706b4e7a149023707e185f3b7f1ecf059c1c6c39b720309bec6e3973a16a Mar 09 14:06:22 crc kubenswrapper[4722]: W0309 14:06:22.136936 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f2a2160_888c_4101_8b1c_63498753a2b7.slice/crio-f462d4025811ae625ad2e1d5d811c4f415cb71d5a5bcad30ccc9f86d5b6e86bb WatchSource:0}: Error finding container f462d4025811ae625ad2e1d5d811c4f415cb71d5a5bcad30ccc9f86d5b6e86bb: Status 404 returned error can't find the container with id f462d4025811ae625ad2e1d5d811c4f415cb71d5a5bcad30ccc9f86d5b6e86bb Mar 09 14:06:22 crc kubenswrapper[4722]: W0309 14:06:22.137271 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6f37779_2f40_4aed_a52b_ee2e2693f16c.slice/crio-92d8201478492f3515f22fa1ff639bc3845da93ebd968b3826b3bf3a1942c322 
WatchSource:0}: Error finding container 92d8201478492f3515f22fa1ff639bc3845da93ebd968b3826b3bf3a1942c322: Status 404 returned error can't find the container with id 92d8201478492f3515f22fa1ff639bc3845da93ebd968b3826b3bf3a1942c322 Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.148773 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.170620 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.192429 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.196057 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-mgzj7"] Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.206482 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-chrnr"] Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.209843 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.230001 4722 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 09 14:06:22 crc kubenswrapper[4722]: W0309 14:06:22.236878 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb919959a_1da1_4c74_9330_5bb8c33f5c26.slice/crio-7b9fe57da4023f3a2a65f2f83c0f03556876a8a9f272ebcdd4ae38b775fba173 WatchSource:0}: Error finding container 7b9fe57da4023f3a2a65f2f83c0f03556876a8a9f272ebcdd4ae38b775fba173: Status 404 returned error can't find the container with id 7b9fe57da4023f3a2a65f2f83c0f03556876a8a9f272ebcdd4ae38b775fba173 Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.249805 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.269855 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.281560 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-fnmcv"] Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.289843 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.309795 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.309966 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-gh244"] Mar 09 14:06:22 crc kubenswrapper[4722]: W0309 14:06:22.317391 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5822aa06_923f_4ba0_bdf6_617c5a5eb617.slice/crio-e54670148e87c29178fb21b9b9f047c9a9313d6381ba2a040bff6585c615d2ee WatchSource:0}: Error finding container 
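The W-level manager.go:1169 warnings above pair off with later PLEG ContainerStarted events carrying the same container IDs (602e706b..., f462d402..., 7b9fe57d..., e54670...), which suggests a startup race between the cgroup watcher and the runtime rather than real failures. A sketch that cross-checks the two line types; the "benign if later started" reading is an assumption drawn from the pairing visible in this log, not a documented kubelet contract.

package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

func main() {
	warn := regexp.MustCompile(`can't find the container with id ([0-9a-f]{64})`)
	started := regexp.MustCompile(`"ContainerStarted","Data":"([0-9a-f]{64})"`)
	warned := map[string]bool{} // IDs from cadvisor 404 warnings
	seen := map[string]bool{}   // IDs that later appear as ContainerStarted
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024)
	for sc.Scan() {
		line := sc.Text()
		if m := warn.FindStringSubmatch(line); m != nil {
			warned[m[1]] = true
		}
		if m := started.FindStringSubmatch(line); m != nil {
			seen[m[1]] = true
		}
	}
	for id := range warned {
		fmt.Printf("%s... resolved=%v\n", id[:12], seen[id])
	}
}
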
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.329779 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.345695 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-xxz9x"]
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.346508 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-8tkbn"]
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.350064 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-2mf4l"]
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.350374 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Mar 09 14:06:22 crc kubenswrapper[4722]: W0309 14:06:22.350784 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a0be9cc_2cda_4b22_838b_0036cfa4405c.slice/crio-bde49ada009c0cb3b848923ff1019ed3bc84e06f78bea4bade094f1a141b31f7 WatchSource:0}: Error finding container bde49ada009c0cb3b848923ff1019ed3bc84e06f78bea4bade094f1a141b31f7: Status 404 returned error can't find the container with id bde49ada009c0cb3b848923ff1019ed3bc84e06f78bea4bade094f1a141b31f7
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.368439 4722 request.go:700] Waited for 1.931317185s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-dns/secrets?fieldSelector=metadata.name%3Ddns-default-metrics-tls&limit=500&resourceVersion=0
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.371399 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.390625 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.411147 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.449077 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65gll\" (UniqueName: \"kubernetes.io/projected/b642aeb0-15ca-4f07-a83d-ede389a04408-kube-api-access-65gll\") pod \"openshift-controller-manager-operator-756b6f6bc6-68k6r\" (UID: \"b642aeb0-15ca-4f07-a83d-ede389a04408\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-68k6r"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.451187 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.473310 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.477887 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-8tkbn" event={"ID":"427a4c04-99cd-4f53-ae98-20c1755d7658","Type":"ContainerStarted","Data":"fd7c1e694e3113176c9815d602431f11a38c7ecddb892ec77dec186d23a692cf"}
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.479911 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-gh244" event={"ID":"4084fbb0-8fae-4b7e-a3f6-ec9d723bb367","Type":"ContainerStarted","Data":"fc6ef4f2e87c2de99bf615c3d21a4b64483d9b710d6fee99a734f1f0ae2935dc"}
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.482192 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7jl6s" event={"ID":"7c9f4b4e-a00a-4a59-bdc8-a76fe4d6d0d8","Type":"ContainerStarted","Data":"0582d4dc3b2c5a7b232a3a4138e5b3556008e80e7e33b33545e8454290a399da"}
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.482262 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7jl6s" event={"ID":"7c9f4b4e-a00a-4a59-bdc8-a76fe4d6d0d8","Type":"ContainerStarted","Data":"1738e8c5b87951d626bfb82f7b6662560b75a6e8988f05b20023ac50898d9b37"}
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.483897 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-6s4fg" event={"ID":"9c83f63c-9a9e-4e33-b0c6-54396ce9f07c","Type":"ContainerStarted","Data":"92b2657738997e464f43cf92f37aa038c34a22f9e970c6730d11b569f7bca4f4"}
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.483925 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-6s4fg" event={"ID":"9c83f63c-9a9e-4e33-b0c6-54396ce9f07c","Type":"ContainerStarted","Data":"b1f12e433ab07cfd411f327393f1488e291557e710d61f705c41e17c11665161"}
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.484115 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-6s4fg"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.487309 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-79q9f" event={"ID":"e646e4a2-271c-4c66-ba95-67061b0323bf","Type":"ContainerStarted","Data":"b10ba06509b8ca2cc720f15b60ee3d8d835161a38d4024db082b30dfbfef87a9"}
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.487337 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-79q9f" event={"ID":"e646e4a2-271c-4c66-ba95-67061b0323bf","Type":"ContainerStarted","Data":"aa6950783ff313b34bd377073d29934faf82e2b37af8cdd9e235b93a678f6509"}
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.487350 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-79q9f" event={"ID":"e646e4a2-271c-4c66-ba95-67061b0323bf","Type":"ContainerStarted","Data":"ddf6d0313a86474e0ada54c07b154ef1892637dc006953877950aa1fce8c8f03"}
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.487755 4722 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-6s4fg container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.11:6443/healthz\": dial tcp 10.217.0.11:6443: connect: connection refused" start-of-body=
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.487799 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-6s4fg" podUID="9c83f63c-9a9e-4e33-b0c6-54396ce9f07c" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.11:6443/healthz\": dial tcp 10.217.0.11:6443: connect: connection refused"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.490492 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-chrnr" event={"ID":"b919959a-1da1-4c74-9330-5bb8c33f5c26","Type":"ContainerStarted","Data":"51927e5090e0b3c63faeac1e7ba440eff196cb147decf4416045e5adc2db287e"}
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.490554 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-chrnr" event={"ID":"b919959a-1da1-4c74-9330-5bb8c33f5c26","Type":"ContainerStarted","Data":"7b9fe57da4023f3a2a65f2f83c0f03556876a8a9f272ebcdd4ae38b775fba173"}
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.490681 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.490726 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-chrnr"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.492132 4722 patch_prober.go:28] interesting pod/console-operator-58897d9998-chrnr container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body=
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.492479 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-chrnr" podUID="b919959a-1da1-4c74-9330-5bb8c33f5c26" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.495308 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fnmcv" event={"ID":"5822aa06-923f-4ba0-bdf6-617c5a5eb617","Type":"ContainerStarted","Data":"e54670148e87c29178fb21b9b9f047c9a9313d6381ba2a040bff6585c615d2ee"}
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.498492 4722 generic.go:334] "Generic (PLEG): container finished" podID="9f2a2160-888c-4101-8b1c-63498753a2b7" containerID="37f4d53ce686406a0abe392bc1353b51f0a4723d826deb8edc540ecbfbd6d3f6" exitCode=0
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.498731 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-jmwpv" event={"ID":"9f2a2160-888c-4101-8b1c-63498753a2b7","Type":"ContainerDied","Data":"37f4d53ce686406a0abe392bc1353b51f0a4723d826deb8edc540ecbfbd6d3f6"}
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.498783 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-jmwpv" event={"ID":"9f2a2160-888c-4101-8b1c-63498753a2b7","Type":"ContainerStarted","Data":"f462d4025811ae625ad2e1d5d811c4f415cb71d5a5bcad30ccc9f86d5b6e86bb"}
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.501282 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2mf4l" event={"ID":"062db6f1-77ab-4eca-be53-6480160aff81","Type":"ContainerStarted","Data":"e1e401af463710908160ee7cd29703fb8519318f5dc0b8c03d372fc95f74ddc9"}
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2mf4l" event={"ID":"062db6f1-77ab-4eca-be53-6480160aff81","Type":"ContainerStarted","Data":"e1e401af463710908160ee7cd29703fb8519318f5dc0b8c03d372fc95f74ddc9"} Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.503143 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-xxz9x" event={"ID":"8a0be9cc-2cda-4b22-838b-0036cfa4405c","Type":"ContainerStarted","Data":"bde49ada009c0cb3b848923ff1019ed3bc84e06f78bea4bade094f1a141b31f7"} Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.507882 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-mgzj7" event={"ID":"e0194ee7-d343-4042-9c2b-08cc513ee43e","Type":"ContainerStarted","Data":"ef68012ce113c672326b92933b4c29a09a16cd7b6d5871cbfa1695f8f042b938"} Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.507956 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-mgzj7" event={"ID":"e0194ee7-d343-4042-9c2b-08cc513ee43e","Type":"ContainerStarted","Data":"1862afc05713de13623553d44ee6f258ae1ef1c3df42e081c0aace0ccc31617a"} Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.509238 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-mgzj7" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.511161 4722 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-mgzj7 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.511326 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-mgzj7" podUID="e0194ee7-d343-4042-9c2b-08cc513ee43e" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.513618 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-knrzp" event={"ID":"8f915ef9-5d9a-43ee-a333-def8766e083d","Type":"ContainerStarted","Data":"08b7729dd5722a2a83f8b986c99cc6315569c3cd2d38112dd77be8d55fd5d716"} Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.513683 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-knrzp" event={"ID":"8f915ef9-5d9a-43ee-a333-def8766e083d","Type":"ContainerStarted","Data":"602e706b4e7a149023707e185f3b7f1ecf059c1c6c39b720309bec6e3973a16a"} Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.513989 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-knrzp" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.518940 4722 patch_prober.go:28] interesting pod/downloads-7954f5f757-knrzp container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.519017 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-knrzp" 
podUID="8f915ef9-5d9a-43ee-a333-def8766e083d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.519529 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-sk7lp" event={"ID":"b6f37779-2f40-4aed-a52b-ee2e2693f16c","Type":"ContainerStarted","Data":"e46015a73236b15968ef83bf12b50bfc960ac0649966d3e026dab7d105ca41dd"} Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.519628 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-sk7lp" event={"ID":"b6f37779-2f40-4aed-a52b-ee2e2693f16c","Type":"ContainerStarted","Data":"92d8201478492f3515f22fa1ff639bc3845da93ebd968b3826b3bf3a1942c322"} Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.602500 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6d4c4768-195d-42fe-86a7-139c4ec0c86d-metrics-tls\") pod \"dns-operator-744455d44c-hng8f\" (UID: \"6d4c4768-195d-42fe-86a7-139c4ec0c86d\") " pod="openshift-dns-operator/dns-operator-744455d44c-hng8f" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.602585 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/ac1bc217-a917-4e71-8672-bab0cdaa1bbf-etcd-service-ca\") pod \"etcd-operator-b45778765-6b65x\" (UID: \"ac1bc217-a917-4e71-8672-bab0cdaa1bbf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6b65x" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.602608 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/704d2c33-a0ad-4dc1-a5e6-37aab398f9a0-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-5j6rh\" (UID: \"704d2c33-a0ad-4dc1-a5e6-37aab398f9a0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5j6rh" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.602658 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9d57536b-4f57-4098-b519-19fdc2559eda-registry-certificates\") pod \"image-registry-697d97f7c8-tz6gh\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.602907 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbdw2\" (UniqueName: \"kubernetes.io/projected/9d57536b-4f57-4098-b519-19fdc2559eda-kube-api-access-bbdw2\") pod \"image-registry-697d97f7c8-tz6gh\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.602925 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8a76c9b5-c226-4d93-8d7a-8e56210b572a-console-config\") pod \"console-f9d7485db-sfch8\" (UID: \"8a76c9b5-c226-4d93-8d7a-8e56210b572a\") " pod="openshift-console/console-f9d7485db-sfch8" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.602955 
4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8924\" (UniqueName: \"kubernetes.io/projected/3ba8e5ae-906d-47bd-862d-8e66e0defb9f-kube-api-access-m8924\") pod \"cluster-image-registry-operator-dc59b4c8b-7q7bl\" (UID: \"3ba8e5ae-906d-47bd-862d-8e66e0defb9f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7q7bl" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.602993 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8a76c9b5-c226-4d93-8d7a-8e56210b572a-oauth-serving-cert\") pod \"console-f9d7485db-sfch8\" (UID: \"8a76c9b5-c226-4d93-8d7a-8e56210b572a\") " pod="openshift-console/console-f9d7485db-sfch8" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.603087 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7tvf\" (UniqueName: \"kubernetes.io/projected/8a76c9b5-c226-4d93-8d7a-8e56210b572a-kube-api-access-k7tvf\") pod \"console-f9d7485db-sfch8\" (UID: \"8a76c9b5-c226-4d93-8d7a-8e56210b572a\") " pod="openshift-console/console-f9d7485db-sfch8" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.603148 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cl4st\" (UniqueName: \"kubernetes.io/projected/6d4c4768-195d-42fe-86a7-139c4ec0c86d-kube-api-access-cl4st\") pod \"dns-operator-744455d44c-hng8f\" (UID: \"6d4c4768-195d-42fe-86a7-139c4ec0c86d\") " pod="openshift-dns-operator/dns-operator-744455d44c-hng8f" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.603193 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4w7p8\" (UniqueName: \"kubernetes.io/projected/ac1bc217-a917-4e71-8672-bab0cdaa1bbf-kube-api-access-4w7p8\") pod \"etcd-operator-b45778765-6b65x\" (UID: \"ac1bc217-a917-4e71-8672-bab0cdaa1bbf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6b65x" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.603237 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/3ba8e5ae-906d-47bd-862d-8e66e0defb9f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-7q7bl\" (UID: \"3ba8e5ae-906d-47bd-862d-8e66e0defb9f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7q7bl" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.603259 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8a76c9b5-c226-4d93-8d7a-8e56210b572a-service-ca\") pod \"console-f9d7485db-sfch8\" (UID: \"8a76c9b5-c226-4d93-8d7a-8e56210b572a\") " pod="openshift-console/console-f9d7485db-sfch8" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.603276 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a76c9b5-c226-4d93-8d7a-8e56210b572a-trusted-ca-bundle\") pod \"console-f9d7485db-sfch8\" (UID: \"8a76c9b5-c226-4d93-8d7a-8e56210b572a\") " pod="openshift-console/console-f9d7485db-sfch8" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.603296 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vq66f\" (UniqueName: \"kubernetes.io/projected/b12ae595-2119-49c9-9bfd-33eec4b6df65-kube-api-access-vq66f\") pod \"kube-storage-version-migrator-operator-b67b599dd-kh6xb\" (UID: \"b12ae595-2119-49c9-9bfd-33eec4b6df65\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kh6xb" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.603313 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8a76c9b5-c226-4d93-8d7a-8e56210b572a-console-serving-cert\") pod \"console-f9d7485db-sfch8\" (UID: \"8a76c9b5-c226-4d93-8d7a-8e56210b572a\") " pod="openshift-console/console-f9d7485db-sfch8" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.603340 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8a76c9b5-c226-4d93-8d7a-8e56210b572a-console-oauth-config\") pod \"console-f9d7485db-sfch8\" (UID: \"8a76c9b5-c226-4d93-8d7a-8e56210b572a\") " pod="openshift-console/console-f9d7485db-sfch8" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.603359 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6635182d-e9ef-4294-8ff3-ae305c10feb4-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9vnnw\" (UID: \"6635182d-e9ef-4294-8ff3-ae305c10feb4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9vnnw" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.603408 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b12ae595-2119-49c9-9bfd-33eec4b6df65-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-kh6xb\" (UID: \"b12ae595-2119-49c9-9bfd-33eec4b6df65\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kh6xb" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.603462 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/ac1bc217-a917-4e71-8672-bab0cdaa1bbf-etcd-ca\") pod \"etcd-operator-b45778765-6b65x\" (UID: \"ac1bc217-a917-4e71-8672-bab0cdaa1bbf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6b65x" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.603480 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/704d2c33-a0ad-4dc1-a5e6-37aab398f9a0-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-5j6rh\" (UID: \"704d2c33-a0ad-4dc1-a5e6-37aab398f9a0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5j6rh" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.603499 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9d57536b-4f57-4098-b519-19fdc2559eda-ca-trust-extracted\") pod \"image-registry-697d97f7c8-tz6gh\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 
14:06:22.603544 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tz6gh\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.603565 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9d57536b-4f57-4098-b519-19fdc2559eda-installation-pull-secrets\") pod \"image-registry-697d97f7c8-tz6gh\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.603582 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/704d2c33-a0ad-4dc1-a5e6-37aab398f9a0-config\") pod \"kube-apiserver-operator-766d6c64bb-5j6rh\" (UID: \"704d2c33-a0ad-4dc1-a5e6-37aab398f9a0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5j6rh" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.603702 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9d57536b-4f57-4098-b519-19fdc2559eda-registry-tls\") pod \"image-registry-697d97f7c8-tz6gh\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.603927 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b12ae595-2119-49c9-9bfd-33eec4b6df65-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-kh6xb\" (UID: \"b12ae595-2119-49c9-9bfd-33eec4b6df65\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kh6xb" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.604001 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d57536b-4f57-4098-b519-19fdc2559eda-trusted-ca\") pod \"image-registry-697d97f7c8-tz6gh\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.604030 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6635182d-e9ef-4294-8ff3-ae305c10feb4-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9vnnw\" (UID: \"6635182d-e9ef-4294-8ff3-ae305c10feb4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9vnnw" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.604066 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9d57536b-4f57-4098-b519-19fdc2559eda-bound-sa-token\") pod \"image-registry-697d97f7c8-tz6gh\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh" Mar 09 14:06:22 crc 
kubenswrapper[4722]: I0309 14:06:22.604093 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac1bc217-a917-4e71-8672-bab0cdaa1bbf-serving-cert\") pod \"etcd-operator-b45778765-6b65x\" (UID: \"ac1bc217-a917-4e71-8672-bab0cdaa1bbf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6b65x" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.604139 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac1bc217-a917-4e71-8672-bab0cdaa1bbf-config\") pod \"etcd-operator-b45778765-6b65x\" (UID: \"ac1bc217-a917-4e71-8672-bab0cdaa1bbf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6b65x" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.604157 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6635182d-e9ef-4294-8ff3-ae305c10feb4-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9vnnw\" (UID: \"6635182d-e9ef-4294-8ff3-ae305c10feb4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9vnnw" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.604192 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ac1bc217-a917-4e71-8672-bab0cdaa1bbf-etcd-client\") pod \"etcd-operator-b45778765-6b65x\" (UID: \"ac1bc217-a917-4e71-8672-bab0cdaa1bbf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6b65x" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.604235 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3ba8e5ae-906d-47bd-862d-8e66e0defb9f-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-7q7bl\" (UID: \"3ba8e5ae-906d-47bd-862d-8e66e0defb9f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7q7bl" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.604254 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3ba8e5ae-906d-47bd-862d-8e66e0defb9f-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-7q7bl\" (UID: \"3ba8e5ae-906d-47bd-862d-8e66e0defb9f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7q7bl" Mar 09 14:06:22 crc kubenswrapper[4722]: E0309 14:06:22.607162 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 14:06:23.107138 +0000 UTC m=+223.662706746 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tz6gh" (UID: "9d57536b-4f57-4098-b519-19fdc2559eda") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.656968 4722 util.go:30] "No sandbox for pod can be found. 
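The E-level nestedpendingoperations.go:348 failure above happens because the kubevirt.io.hostpath-provisioner CSI driver has not yet registered with the kubelet; the operation is retried on the 500ms backoff printed as durationBeforeRetry. A sketch of waiting for that registration by watching for the driver's socket; the /var/lib/kubelet/plugins_registry path and socket naming follow the usual kubelet convention and are assumptions, not read from this log.

package main

import (
	"fmt"
	"os"
	"path/filepath"
	"time"
)

// waitForDriver polls the kubelet plugin registration directory until a
// socket for the named CSI driver appears, or the timeout elapses.
func waitForDriver(dir, driver string, timeout time.Duration) error {
	deadline := time.Now().Add(timeout)
	for time.Now().Before(deadline) {
		matches, _ := filepath.Glob(filepath.Join(dir, driver+"*.sock"))
		if len(matches) > 0 {
			return nil
		}
		time.Sleep(500 * time.Millisecond) // same cadence as durationBeforeRetry above
	}
	return fmt.Errorf("driver %s not registered within %s", driver, timeout)
}

func main() {
	err := waitForDriver("/var/lib/kubelet/plugins_registry", // conventional path (assumption)
		"kubevirt.io.hostpath-provisioner", 30*time.Second)
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	fmt.Println("driver registered")
}
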
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-68k6r" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.705117 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.705388 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cg7gs\" (UniqueName: \"kubernetes.io/projected/3cd02fb0-bac4-47b0-9846-a94399041f77-kube-api-access-cg7gs\") pod \"service-ca-operator-777779d784-nvtgb\" (UID: \"3cd02fb0-bac4-47b0-9846-a94399041f77\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nvtgb" Mar 09 14:06:22 crc kubenswrapper[4722]: E0309 14:06:22.705441 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 14:06:23.205396915 +0000 UTC m=+223.760965551 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.705506 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/996087ed-6480-4650-8632-c991e5d16c99-webhook-cert\") pod \"packageserver-d55dfcdfc-fp2th\" (UID: \"996087ed-6480-4650-8632-c991e5d16c99\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fp2th" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.705590 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4w7p8\" (UniqueName: \"kubernetes.io/projected/ac1bc217-a917-4e71-8672-bab0cdaa1bbf-kube-api-access-4w7p8\") pod \"etcd-operator-b45778765-6b65x\" (UID: \"ac1bc217-a917-4e71-8672-bab0cdaa1bbf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6b65x" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.705617 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8a76c9b5-c226-4d93-8d7a-8e56210b572a-service-ca\") pod \"console-f9d7485db-sfch8\" (UID: \"8a76c9b5-c226-4d93-8d7a-8e56210b572a\") " pod="openshift-console/console-f9d7485db-sfch8" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.705643 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/cc613bd5-c976-458c-aeda-d62a304f4103-signing-cabundle\") pod \"service-ca-9c57cc56f-6db7b\" (UID: \"cc613bd5-c976-458c-aeda-d62a304f4103\") " pod="openshift-service-ca/service-ca-9c57cc56f-6db7b" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.705670 4722 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sm4kb\" (UniqueName: \"kubernetes.io/projected/9fb5d9ef-bb16-45d0-a7c0-a9fb43edeb34-kube-api-access-sm4kb\") pod \"control-plane-machine-set-operator-78cbb6b69f-2xkrn\" (UID: \"9fb5d9ef-bb16-45d0-a7c0-a9fb43edeb34\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2xkrn" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.705693 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87vhb\" (UniqueName: \"kubernetes.io/projected/30f3c24f-2b49-4343-8c94-f6d56b43a35d-kube-api-access-87vhb\") pod \"machine-config-operator-74547568cd-8jf57\" (UID: \"30f3c24f-2b49-4343-8c94-f6d56b43a35d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8jf57" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.705721 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xnvj\" (UniqueName: \"kubernetes.io/projected/3d05c111-6b35-44c3-b587-12f470d584c3-kube-api-access-6xnvj\") pod \"machine-config-controller-84d6567774-tkhkf\" (UID: \"3d05c111-6b35-44c3-b587-12f470d584c3\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tkhkf" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.705749 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f1fc8bc8-fa6c-4648-9bc3-491cec75d584-bound-sa-token\") pod \"ingress-operator-5b745b69d9-lh92k\" (UID: \"f1fc8bc8-fa6c-4648-9bc3-491cec75d584\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lh92k" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.705776 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/240d1325-4400-475e-8bc7-9915294148d8-mountpoint-dir\") pod \"csi-hostpathplugin-b5hps\" (UID: \"240d1325-4400-475e-8bc7-9915294148d8\") " pod="hostpath-provisioner/csi-hostpathplugin-b5hps" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.705801 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/ac1bc217-a917-4e71-8672-bab0cdaa1bbf-etcd-ca\") pod \"etcd-operator-b45778765-6b65x\" (UID: \"ac1bc217-a917-4e71-8672-bab0cdaa1bbf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6b65x" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.705824 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/704d2c33-a0ad-4dc1-a5e6-37aab398f9a0-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-5j6rh\" (UID: \"704d2c33-a0ad-4dc1-a5e6-37aab398f9a0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5j6rh" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.705845 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/30f3c24f-2b49-4343-8c94-f6d56b43a35d-images\") pod \"machine-config-operator-74547568cd-8jf57\" (UID: \"30f3c24f-2b49-4343-8c94-f6d56b43a35d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8jf57" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.705869 4722 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25sgl\" (UniqueName: \"kubernetes.io/projected/3a9659ac-0c7d-41be-becc-5ec038244f00-kube-api-access-25sgl\") pod \"dns-default-nnjvd\" (UID: \"3a9659ac-0c7d-41be-becc-5ec038244f00\") " pod="openshift-dns/dns-default-nnjvd" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.705904 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tz6gh\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.705930 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/996087ed-6480-4650-8632-c991e5d16c99-tmpfs\") pod \"packageserver-d55dfcdfc-fp2th\" (UID: \"996087ed-6480-4650-8632-c991e5d16c99\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fp2th" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.705976 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nrnl\" (UniqueName: \"kubernetes.io/projected/ba0aa566-e854-473c-b6b6-2f9dfece6133-kube-api-access-5nrnl\") pod \"ingress-canary-hdrrh\" (UID: \"ba0aa566-e854-473c-b6b6-2f9dfece6133\") " pod="openshift-ingress-canary/ingress-canary-hdrrh" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.706005 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/39cc5cc7-fd80-4461-b4b1-adece1093703-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-b5jn4\" (UID: \"39cc5cc7-fd80-4461-b4b1-adece1093703\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-b5jn4" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.706041 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b12ae595-2119-49c9-9bfd-33eec4b6df65-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-kh6xb\" (UID: \"b12ae595-2119-49c9-9bfd-33eec4b6df65\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kh6xb" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.706069 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4jdd\" (UniqueName: \"kubernetes.io/projected/e2641b0e-aae4-49df-931f-95e38505812f-kube-api-access-p4jdd\") pod \"marketplace-operator-79b997595-wz9f2\" (UID: \"e2641b0e-aae4-49df-931f-95e38505812f\") " pod="openshift-marketplace/marketplace-operator-79b997595-wz9f2" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.706098 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9d57536b-4f57-4098-b519-19fdc2559eda-registry-tls\") pod \"image-registry-697d97f7c8-tz6gh\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.706122 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/317444ee-0620-47d2-869e-77578a367a87-service-ca-bundle\") pod \"router-default-5444994796-dp8wn\" (UID: \"317444ee-0620-47d2-869e-77578a367a87\") " pod="openshift-ingress/router-default-5444994796-dp8wn" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.706150 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/9fb5d9ef-bb16-45d0-a7c0-a9fb43edeb34-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-2xkrn\" (UID: \"9fb5d9ef-bb16-45d0-a7c0-a9fb43edeb34\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2xkrn" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.706180 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d57536b-4f57-4098-b519-19fdc2559eda-trusted-ca\") pod \"image-registry-697d97f7c8-tz6gh\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.706376 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6635182d-e9ef-4294-8ff3-ae305c10feb4-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9vnnw\" (UID: \"6635182d-e9ef-4294-8ff3-ae305c10feb4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9vnnw" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.706412 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkcsm\" (UniqueName: \"kubernetes.io/projected/317444ee-0620-47d2-869e-77578a367a87-kube-api-access-zkcsm\") pod \"router-default-5444994796-dp8wn\" (UID: \"317444ee-0620-47d2-869e-77578a367a87\") " pod="openshift-ingress/router-default-5444994796-dp8wn" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.706476 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac1bc217-a917-4e71-8672-bab0cdaa1bbf-serving-cert\") pod \"etcd-operator-b45778765-6b65x\" (UID: \"ac1bc217-a917-4e71-8672-bab0cdaa1bbf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6b65x" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.709741 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/240d1325-4400-475e-8bc7-9915294148d8-plugins-dir\") pod \"csi-hostpathplugin-b5hps\" (UID: \"240d1325-4400-475e-8bc7-9915294148d8\") " pod="hostpath-provisioner/csi-hostpathplugin-b5hps" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.709894 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3ba8e5ae-906d-47bd-862d-8e66e0defb9f-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-7q7bl\" (UID: \"3ba8e5ae-906d-47bd-862d-8e66e0defb9f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7q7bl" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.709941 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/228d39d8-b0bc-4491-be90-e473c090f412-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-mrtb2\" (UID: \"228d39d8-b0bc-4491-be90-e473c090f412\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mrtb2" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.710043 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/317444ee-0620-47d2-869e-77578a367a87-metrics-certs\") pod \"router-default-5444994796-dp8wn\" (UID: \"317444ee-0620-47d2-869e-77578a367a87\") " pod="openshift-ingress/router-default-5444994796-dp8wn" Mar 09 14:06:22 crc kubenswrapper[4722]: E0309 14:06:22.710062 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 14:06:23.210045894 +0000 UTC m=+223.765614550 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tz6gh" (UID: "9d57536b-4f57-4098-b519-19fdc2559eda") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.710101 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6d4c4768-195d-42fe-86a7-139c4ec0c86d-metrics-tls\") pod \"dns-operator-744455d44c-hng8f\" (UID: \"6d4c4768-195d-42fe-86a7-139c4ec0c86d\") " pod="openshift-dns-operator/dns-operator-744455d44c-hng8f" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.710136 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/317444ee-0620-47d2-869e-77578a367a87-default-certificate\") pod \"router-default-5444994796-dp8wn\" (UID: \"317444ee-0620-47d2-869e-77578a367a87\") " pod="openshift-ingress/router-default-5444994796-dp8wn" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.710174 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f8055a95-6b09-4e32-88b8-82ad36ca5029-srv-cert\") pod \"olm-operator-6b444d44fb-lnkwt\" (UID: \"f8055a95-6b09-4e32-88b8-82ad36ca5029\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lnkwt" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.710219 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/26cb738b-5dfa-4b97-8153-790ec6eb198b-node-bootstrap-token\") pod \"machine-config-server-pjvfp\" (UID: \"26cb738b-5dfa-4b97-8153-790ec6eb198b\") " pod="openshift-machine-config-operator/machine-config-server-pjvfp" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.710251 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/704d2c33-a0ad-4dc1-a5e6-37aab398f9a0-kube-api-access\") pod 
\"kube-apiserver-operator-766d6c64bb-5j6rh\" (UID: \"704d2c33-a0ad-4dc1-a5e6-37aab398f9a0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5j6rh" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.710272 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8a76c9b5-c226-4d93-8d7a-8e56210b572a-service-ca\") pod \"console-f9d7485db-sfch8\" (UID: \"8a76c9b5-c226-4d93-8d7a-8e56210b572a\") " pod="openshift-console/console-f9d7485db-sfch8" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.710284 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/ac1bc217-a917-4e71-8672-bab0cdaa1bbf-etcd-service-ca\") pod \"etcd-operator-b45778765-6b65x\" (UID: \"ac1bc217-a917-4e71-8672-bab0cdaa1bbf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6b65x" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.710366 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f8055a95-6b09-4e32-88b8-82ad36ca5029-profile-collector-cert\") pod \"olm-operator-6b444d44fb-lnkwt\" (UID: \"f8055a95-6b09-4e32-88b8-82ad36ca5029\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lnkwt" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.710406 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8a76c9b5-c226-4d93-8d7a-8e56210b572a-console-config\") pod \"console-f9d7485db-sfch8\" (UID: \"8a76c9b5-c226-4d93-8d7a-8e56210b572a\") " pod="openshift-console/console-f9d7485db-sfch8" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.710479 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8924\" (UniqueName: \"kubernetes.io/projected/3ba8e5ae-906d-47bd-862d-8e66e0defb9f-kube-api-access-m8924\") pod \"cluster-image-registry-operator-dc59b4c8b-7q7bl\" (UID: \"3ba8e5ae-906d-47bd-862d-8e66e0defb9f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7q7bl" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.710514 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnf6j\" (UniqueName: \"kubernetes.io/projected/abf2dbee-a467-4858-8d5a-d4d1bf1bb430-kube-api-access-gnf6j\") pod \"migrator-59844c95c7-nk4z2\" (UID: \"abf2dbee-a467-4858-8d5a-d4d1bf1bb430\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nk4z2" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.710541 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3a9659ac-0c7d-41be-becc-5ec038244f00-metrics-tls\") pod \"dns-default-nnjvd\" (UID: \"3a9659ac-0c7d-41be-becc-5ec038244f00\") " pod="openshift-dns/dns-default-nnjvd" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.710566 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3cd02fb0-bac4-47b0-9846-a94399041f77-serving-cert\") pod \"service-ca-operator-777779d784-nvtgb\" (UID: \"3cd02fb0-bac4-47b0-9846-a94399041f77\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nvtgb" Mar 09 14:06:22 crc kubenswrapper[4722]: 
I0309 14:06:22.710586 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nc6w4\" (UniqueName: \"kubernetes.io/projected/26cb738b-5dfa-4b97-8153-790ec6eb198b-kube-api-access-nc6w4\") pod \"machine-config-server-pjvfp\" (UID: \"26cb738b-5dfa-4b97-8153-790ec6eb198b\") " pod="openshift-machine-config-operator/machine-config-server-pjvfp" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.710611 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgsjv\" (UniqueName: \"kubernetes.io/projected/228d39d8-b0bc-4491-be90-e473c090f412-kube-api-access-kgsjv\") pod \"package-server-manager-789f6589d5-mrtb2\" (UID: \"228d39d8-b0bc-4491-be90-e473c090f412\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mrtb2" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.710632 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c219beb3-4ba5-43bd-b2ec-3855d19c2b57-srv-cert\") pod \"catalog-operator-68c6474976-dgxnd\" (UID: \"c219beb3-4ba5-43bd-b2ec-3855d19c2b57\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dgxnd" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.710653 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/240d1325-4400-475e-8bc7-9915294148d8-socket-dir\") pod \"csi-hostpathplugin-b5hps\" (UID: \"240d1325-4400-475e-8bc7-9915294148d8\") " pod="hostpath-provisioner/csi-hostpathplugin-b5hps" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.710680 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cl4st\" (UniqueName: \"kubernetes.io/projected/6d4c4768-195d-42fe-86a7-139c4ec0c86d-kube-api-access-cl4st\") pod \"dns-operator-744455d44c-hng8f\" (UID: \"6d4c4768-195d-42fe-86a7-139c4ec0c86d\") " pod="openshift-dns-operator/dns-operator-744455d44c-hng8f" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.710704 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wqrg\" (UniqueName: \"kubernetes.io/projected/f1fc8bc8-fa6c-4648-9bc3-491cec75d584-kube-api-access-5wqrg\") pod \"ingress-operator-5b745b69d9-lh92k\" (UID: \"f1fc8bc8-fa6c-4648-9bc3-491cec75d584\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lh92k" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.710731 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a76c9b5-c226-4d93-8d7a-8e56210b572a-trusted-ca-bundle\") pod \"console-f9d7485db-sfch8\" (UID: \"8a76c9b5-c226-4d93-8d7a-8e56210b572a\") " pod="openshift-console/console-f9d7485db-sfch8" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.710758 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vq66f\" (UniqueName: \"kubernetes.io/projected/b12ae595-2119-49c9-9bfd-33eec4b6df65-kube-api-access-vq66f\") pod \"kube-storage-version-migrator-operator-b67b599dd-kh6xb\" (UID: \"b12ae595-2119-49c9-9bfd-33eec4b6df65\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kh6xb" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.710786 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/3ba8e5ae-906d-47bd-862d-8e66e0defb9f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-7q7bl\" (UID: \"3ba8e5ae-906d-47bd-862d-8e66e0defb9f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7q7bl" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.710810 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7e1fdc12-aac5-4f72-9b22-0212c2f3988e-secret-volume\") pod \"collect-profiles-29551080-mdzx2\" (UID: \"7e1fdc12-aac5-4f72-9b22-0212c2f3988e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551080-mdzx2" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.710832 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csnbb\" (UniqueName: \"kubernetes.io/projected/40be416c-1b7b-4973-b9ed-25ae20cd660d-kube-api-access-csnbb\") pod \"auto-csr-approver-29551086-cvc28\" (UID: \"40be416c-1b7b-4973-b9ed-25ae20cd660d\") " pod="openshift-infra/auto-csr-approver-29551086-cvc28" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.710855 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8a76c9b5-c226-4d93-8d7a-8e56210b572a-console-serving-cert\") pod \"console-f9d7485db-sfch8\" (UID: \"8a76c9b5-c226-4d93-8d7a-8e56210b572a\") " pod="openshift-console/console-f9d7485db-sfch8" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.710876 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/240d1325-4400-475e-8bc7-9915294148d8-registration-dir\") pod \"csi-hostpathplugin-b5hps\" (UID: \"240d1325-4400-475e-8bc7-9915294148d8\") " pod="hostpath-provisioner/csi-hostpathplugin-b5hps" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.710898 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76545\" (UniqueName: \"kubernetes.io/projected/240d1325-4400-475e-8bc7-9915294148d8-kube-api-access-76545\") pod \"csi-hostpathplugin-b5hps\" (UID: \"240d1325-4400-475e-8bc7-9915294148d8\") " pod="hostpath-provisioner/csi-hostpathplugin-b5hps" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.710923 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8a76c9b5-c226-4d93-8d7a-8e56210b572a-console-oauth-config\") pod \"console-f9d7485db-sfch8\" (UID: \"8a76c9b5-c226-4d93-8d7a-8e56210b572a\") " pod="openshift-console/console-f9d7485db-sfch8" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.710986 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/ac1bc217-a917-4e71-8672-bab0cdaa1bbf-etcd-service-ca\") pod \"etcd-operator-b45778765-6b65x\" (UID: \"ac1bc217-a917-4e71-8672-bab0cdaa1bbf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6b65x" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.710996 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6635182d-e9ef-4294-8ff3-ae305c10feb4-kube-api-access\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-9vnnw\" (UID: \"6635182d-e9ef-4294-8ff3-ae305c10feb4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9vnnw" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.711021 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fzvj\" (UniqueName: \"kubernetes.io/projected/53e973bb-4b49-4815-b8b3-a6cd76e210bf-kube-api-access-5fzvj\") pod \"multus-admission-controller-857f4d67dd-f8gtl\" (UID: \"53e973bb-4b49-4815-b8b3-a6cd76e210bf\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-f8gtl" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.711058 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4j9cs\" (UniqueName: \"kubernetes.io/projected/c219beb3-4ba5-43bd-b2ec-3855d19c2b57-kube-api-access-4j9cs\") pod \"catalog-operator-68c6474976-dgxnd\" (UID: \"c219beb3-4ba5-43bd-b2ec-3855d19c2b57\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dgxnd" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.711079 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3d05c111-6b35-44c3-b587-12f470d584c3-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-tkhkf\" (UID: \"3d05c111-6b35-44c3-b587-12f470d584c3\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tkhkf" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.711103 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8a76c9b5-c226-4d93-8d7a-8e56210b572a-console-config\") pod \"console-f9d7485db-sfch8\" (UID: \"8a76c9b5-c226-4d93-8d7a-8e56210b572a\") " pod="openshift-console/console-f9d7485db-sfch8" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.711105 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/53e973bb-4b49-4815-b8b3-a6cd76e210bf-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-f8gtl\" (UID: \"53e973bb-4b49-4815-b8b3-a6cd76e210bf\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-f8gtl" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.711183 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b12ae595-2119-49c9-9bfd-33eec4b6df65-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-kh6xb\" (UID: \"b12ae595-2119-49c9-9bfd-33eec4b6df65\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kh6xb" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.711279 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/26cb738b-5dfa-4b97-8153-790ec6eb198b-certs\") pod \"machine-config-server-pjvfp\" (UID: \"26cb738b-5dfa-4b97-8153-790ec6eb198b\") " pod="openshift-machine-config-operator/machine-config-server-pjvfp" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.711324 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9d57536b-4f57-4098-b519-19fdc2559eda-ca-trust-extracted\") pod 
\"image-registry-697d97f7c8-tz6gh\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.711353 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9d57536b-4f57-4098-b519-19fdc2559eda-installation-pull-secrets\") pod \"image-registry-697d97f7c8-tz6gh\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.711424 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/704d2c33-a0ad-4dc1-a5e6-37aab398f9a0-config\") pod \"kube-apiserver-operator-766d6c64bb-5j6rh\" (UID: \"704d2c33-a0ad-4dc1-a5e6-37aab398f9a0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5j6rh" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.711449 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/39cc5cc7-fd80-4461-b4b1-adece1093703-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-b5jn4\" (UID: \"39cc5cc7-fd80-4461-b4b1-adece1093703\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-b5jn4" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.713800 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a76c9b5-c226-4d93-8d7a-8e56210b572a-trusted-ca-bundle\") pod \"console-f9d7485db-sfch8\" (UID: \"8a76c9b5-c226-4d93-8d7a-8e56210b572a\") " pod="openshift-console/console-f9d7485db-sfch8" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.714071 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac1bc217-a917-4e71-8672-bab0cdaa1bbf-serving-cert\") pod \"etcd-operator-b45778765-6b65x\" (UID: \"ac1bc217-a917-4e71-8672-bab0cdaa1bbf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6b65x" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.714238 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/ac1bc217-a917-4e71-8672-bab0cdaa1bbf-etcd-ca\") pod \"etcd-operator-b45778765-6b65x\" (UID: \"ac1bc217-a917-4e71-8672-bab0cdaa1bbf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6b65x" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.714882 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6d4c4768-195d-42fe-86a7-139c4ec0c86d-metrics-tls\") pod \"dns-operator-744455d44c-hng8f\" (UID: \"6d4c4768-195d-42fe-86a7-139c4ec0c86d\") " pod="openshift-dns-operator/dns-operator-744455d44c-hng8f" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.715569 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b12ae595-2119-49c9-9bfd-33eec4b6df65-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-kh6xb\" (UID: \"b12ae595-2119-49c9-9bfd-33eec4b6df65\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kh6xb" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.715937 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/704d2c33-a0ad-4dc1-a5e6-37aab398f9a0-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-5j6rh\" (UID: \"704d2c33-a0ad-4dc1-a5e6-37aab398f9a0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5j6rh" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.717505 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9d57536b-4f57-4098-b519-19fdc2559eda-installation-pull-secrets\") pod \"image-registry-697d97f7c8-tz6gh\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.717596 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3d05c111-6b35-44c3-b587-12f470d584c3-proxy-tls\") pod \"machine-config-controller-84d6567774-tkhkf\" (UID: \"3d05c111-6b35-44c3-b587-12f470d584c3\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tkhkf" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.717643 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/240d1325-4400-475e-8bc7-9915294148d8-csi-data-dir\") pod \"csi-hostpathplugin-b5hps\" (UID: \"240d1325-4400-475e-8bc7-9915294148d8\") " pod="hostpath-provisioner/csi-hostpathplugin-b5hps" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.717678 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9d57536b-4f57-4098-b519-19fdc2559eda-bound-sa-token\") pod \"image-registry-697d97f7c8-tz6gh\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.717793 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/3ba8e5ae-906d-47bd-862d-8e66e0defb9f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-7q7bl\" (UID: \"3ba8e5ae-906d-47bd-862d-8e66e0defb9f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7q7bl" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.717833 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3cd02fb0-bac4-47b0-9846-a94399041f77-config\") pod \"service-ca-operator-777779d784-nvtgb\" (UID: \"3cd02fb0-bac4-47b0-9846-a94399041f77\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nvtgb" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.717932 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3a9659ac-0c7d-41be-becc-5ec038244f00-config-volume\") pod \"dns-default-nnjvd\" (UID: \"3a9659ac-0c7d-41be-becc-5ec038244f00\") " pod="openshift-dns/dns-default-nnjvd" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.718045 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac1bc217-a917-4e71-8672-bab0cdaa1bbf-config\") 
pod \"etcd-operator-b45778765-6b65x\" (UID: \"ac1bc217-a917-4e71-8672-bab0cdaa1bbf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6b65x" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.718261 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6635182d-e9ef-4294-8ff3-ae305c10feb4-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9vnnw\" (UID: \"6635182d-e9ef-4294-8ff3-ae305c10feb4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9vnnw" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.718913 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9d57536b-4f57-4098-b519-19fdc2559eda-registry-tls\") pod \"image-registry-697d97f7c8-tz6gh\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.720220 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6635182d-e9ef-4294-8ff3-ae305c10feb4-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9vnnw\" (UID: \"6635182d-e9ef-4294-8ff3-ae305c10feb4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9vnnw" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.720939 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9d57536b-4f57-4098-b519-19fdc2559eda-ca-trust-extracted\") pod \"image-registry-697d97f7c8-tz6gh\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.721554 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac1bc217-a917-4e71-8672-bab0cdaa1bbf-config\") pod \"etcd-operator-b45778765-6b65x\" (UID: \"ac1bc217-a917-4e71-8672-bab0cdaa1bbf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6b65x" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.725023 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddxlc\" (UniqueName: \"kubernetes.io/projected/f8055a95-6b09-4e32-88b8-82ad36ca5029-kube-api-access-ddxlc\") pod \"olm-operator-6b444d44fb-lnkwt\" (UID: \"f8055a95-6b09-4e32-88b8-82ad36ca5029\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lnkwt" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.725110 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ac1bc217-a917-4e71-8672-bab0cdaa1bbf-etcd-client\") pod \"etcd-operator-b45778765-6b65x\" (UID: \"ac1bc217-a917-4e71-8672-bab0cdaa1bbf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6b65x" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.725141 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6fls\" (UniqueName: \"kubernetes.io/projected/7e1fdc12-aac5-4f72-9b22-0212c2f3988e-kube-api-access-h6fls\") pod \"collect-profiles-29551080-mdzx2\" (UID: \"7e1fdc12-aac5-4f72-9b22-0212c2f3988e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551080-mdzx2" Mar 09 14:06:22 crc kubenswrapper[4722]: 
I0309 14:06:22.725185 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3ba8e5ae-906d-47bd-862d-8e66e0defb9f-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-7q7bl\" (UID: \"3ba8e5ae-906d-47bd-862d-8e66e0defb9f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7q7bl" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.725236 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ba0aa566-e854-473c-b6b6-2f9dfece6133-cert\") pod \"ingress-canary-hdrrh\" (UID: \"ba0aa566-e854-473c-b6b6-2f9dfece6133\") " pod="openshift-ingress-canary/ingress-canary-hdrrh" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.725229 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d57536b-4f57-4098-b519-19fdc2559eda-trusted-ca\") pod \"image-registry-697d97f7c8-tz6gh\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.725262 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39cc5cc7-fd80-4461-b4b1-adece1093703-config\") pod \"kube-controller-manager-operator-78b949d7b-b5jn4\" (UID: \"39cc5cc7-fd80-4461-b4b1-adece1093703\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-b5jn4" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.725315 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f1fc8bc8-fa6c-4648-9bc3-491cec75d584-metrics-tls\") pod \"ingress-operator-5b745b69d9-lh92k\" (UID: \"f1fc8bc8-fa6c-4648-9bc3-491cec75d584\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lh92k" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.725351 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/704d2c33-a0ad-4dc1-a5e6-37aab398f9a0-config\") pod \"kube-apiserver-operator-766d6c64bb-5j6rh\" (UID: \"704d2c33-a0ad-4dc1-a5e6-37aab398f9a0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5j6rh" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.725356 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/30f3c24f-2b49-4343-8c94-f6d56b43a35d-auth-proxy-config\") pod \"machine-config-operator-74547568cd-8jf57\" (UID: \"30f3c24f-2b49-4343-8c94-f6d56b43a35d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8jf57" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.725407 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/996087ed-6480-4650-8632-c991e5d16c99-apiservice-cert\") pod \"packageserver-d55dfcdfc-fp2th\" (UID: \"996087ed-6480-4650-8632-c991e5d16c99\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fp2th" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.725439 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" 
(UniqueName: \"kubernetes.io/configmap/7e1fdc12-aac5-4f72-9b22-0212c2f3988e-config-volume\") pod \"collect-profiles-29551080-mdzx2\" (UID: \"7e1fdc12-aac5-4f72-9b22-0212c2f3988e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551080-mdzx2" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.725487 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xg2hf\" (UniqueName: \"kubernetes.io/projected/996087ed-6480-4650-8632-c991e5d16c99-kube-api-access-xg2hf\") pod \"packageserver-d55dfcdfc-fp2th\" (UID: \"996087ed-6480-4650-8632-c991e5d16c99\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fp2th" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.725512 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f1fc8bc8-fa6c-4648-9bc3-491cec75d584-trusted-ca\") pod \"ingress-operator-5b745b69d9-lh92k\" (UID: \"f1fc8bc8-fa6c-4648-9bc3-491cec75d584\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lh92k" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.725558 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e2641b0e-aae4-49df-931f-95e38505812f-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-wz9f2\" (UID: \"e2641b0e-aae4-49df-931f-95e38505812f\") " pod="openshift-marketplace/marketplace-operator-79b997595-wz9f2" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.725581 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e2641b0e-aae4-49df-931f-95e38505812f-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-wz9f2\" (UID: \"e2641b0e-aae4-49df-931f-95e38505812f\") " pod="openshift-marketplace/marketplace-operator-79b997595-wz9f2" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.725830 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b12ae595-2119-49c9-9bfd-33eec4b6df65-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-kh6xb\" (UID: \"b12ae595-2119-49c9-9bfd-33eec4b6df65\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kh6xb" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.726106 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9d57536b-4f57-4098-b519-19fdc2559eda-registry-certificates\") pod \"image-registry-697d97f7c8-tz6gh\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.726174 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbdw2\" (UniqueName: \"kubernetes.io/projected/9d57536b-4f57-4098-b519-19fdc2559eda-kube-api-access-bbdw2\") pod \"image-registry-697d97f7c8-tz6gh\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.726227 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/30f3c24f-2b49-4343-8c94-f6d56b43a35d-proxy-tls\") pod \"machine-config-operator-74547568cd-8jf57\" (UID: \"30f3c24f-2b49-4343-8c94-f6d56b43a35d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8jf57" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.726260 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8a76c9b5-c226-4d93-8d7a-8e56210b572a-oauth-serving-cert\") pod \"console-f9d7485db-sfch8\" (UID: \"8a76c9b5-c226-4d93-8d7a-8e56210b572a\") " pod="openshift-console/console-f9d7485db-sfch8" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.726305 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/317444ee-0620-47d2-869e-77578a367a87-stats-auth\") pod \"router-default-5444994796-dp8wn\" (UID: \"317444ee-0620-47d2-869e-77578a367a87\") " pod="openshift-ingress/router-default-5444994796-dp8wn" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.726335 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7tvf\" (UniqueName: \"kubernetes.io/projected/8a76c9b5-c226-4d93-8d7a-8e56210b572a-kube-api-access-k7tvf\") pod \"console-f9d7485db-sfch8\" (UID: \"8a76c9b5-c226-4d93-8d7a-8e56210b572a\") " pod="openshift-console/console-f9d7485db-sfch8" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.726229 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3ba8e5ae-906d-47bd-862d-8e66e0defb9f-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-7q7bl\" (UID: \"3ba8e5ae-906d-47bd-862d-8e66e0defb9f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7q7bl" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.726359 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/cc613bd5-c976-458c-aeda-d62a304f4103-signing-key\") pod \"service-ca-9c57cc56f-6db7b\" (UID: \"cc613bd5-c976-458c-aeda-d62a304f4103\") " pod="openshift-service-ca/service-ca-9c57cc56f-6db7b" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.726625 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c219beb3-4ba5-43bd-b2ec-3855d19c2b57-profile-collector-cert\") pod \"catalog-operator-68c6474976-dgxnd\" (UID: \"c219beb3-4ba5-43bd-b2ec-3855d19c2b57\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dgxnd" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.726688 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdg9n\" (UniqueName: \"kubernetes.io/projected/cc613bd5-c976-458c-aeda-d62a304f4103-kube-api-access-hdg9n\") pod \"service-ca-9c57cc56f-6db7b\" (UID: \"cc613bd5-c976-458c-aeda-d62a304f4103\") " pod="openshift-service-ca/service-ca-9c57cc56f-6db7b" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.727051 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8a76c9b5-c226-4d93-8d7a-8e56210b572a-oauth-serving-cert\") pod \"console-f9d7485db-sfch8\" (UID: \"8a76c9b5-c226-4d93-8d7a-8e56210b572a\") " 
pod="openshift-console/console-f9d7485db-sfch8" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.728587 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8a76c9b5-c226-4d93-8d7a-8e56210b572a-console-oauth-config\") pod \"console-f9d7485db-sfch8\" (UID: \"8a76c9b5-c226-4d93-8d7a-8e56210b572a\") " pod="openshift-console/console-f9d7485db-sfch8" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.729691 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8a76c9b5-c226-4d93-8d7a-8e56210b572a-console-serving-cert\") pod \"console-f9d7485db-sfch8\" (UID: \"8a76c9b5-c226-4d93-8d7a-8e56210b572a\") " pod="openshift-console/console-f9d7485db-sfch8" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.732864 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9d57536b-4f57-4098-b519-19fdc2559eda-registry-certificates\") pod \"image-registry-697d97f7c8-tz6gh\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.733645 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6635182d-e9ef-4294-8ff3-ae305c10feb4-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9vnnw\" (UID: \"6635182d-e9ef-4294-8ff3-ae305c10feb4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9vnnw" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.734511 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ac1bc217-a917-4e71-8672-bab0cdaa1bbf-etcd-client\") pod \"etcd-operator-b45778765-6b65x\" (UID: \"ac1bc217-a917-4e71-8672-bab0cdaa1bbf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6b65x" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.746909 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4w7p8\" (UniqueName: \"kubernetes.io/projected/ac1bc217-a917-4e71-8672-bab0cdaa1bbf-kube-api-access-4w7p8\") pod \"etcd-operator-b45778765-6b65x\" (UID: \"ac1bc217-a917-4e71-8672-bab0cdaa1bbf\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6b65x" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.769881 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3ba8e5ae-906d-47bd-862d-8e66e0defb9f-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-7q7bl\" (UID: \"3ba8e5ae-906d-47bd-862d-8e66e0defb9f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7q7bl" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.787692 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/704d2c33-a0ad-4dc1-a5e6-37aab398f9a0-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-5j6rh\" (UID: \"704d2c33-a0ad-4dc1-a5e6-37aab398f9a0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5j6rh" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.806841 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8924\" (UniqueName: 
\"kubernetes.io/projected/3ba8e5ae-906d-47bd-862d-8e66e0defb9f-kube-api-access-m8924\") pod \"cluster-image-registry-operator-dc59b4c8b-7q7bl\" (UID: \"3ba8e5ae-906d-47bd-862d-8e66e0defb9f\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7q7bl" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.835931 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 14:06:22 crc kubenswrapper[4722]: E0309 14:06:22.836040 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 14:06:23.336017032 +0000 UTC m=+223.891585608 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.836164 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/996087ed-6480-4650-8632-c991e5d16c99-webhook-cert\") pod \"packageserver-d55dfcdfc-fp2th\" (UID: \"996087ed-6480-4650-8632-c991e5d16c99\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fp2th" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.836213 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/cc613bd5-c976-458c-aeda-d62a304f4103-signing-cabundle\") pod \"service-ca-9c57cc56f-6db7b\" (UID: \"cc613bd5-c976-458c-aeda-d62a304f4103\") " pod="openshift-service-ca/service-ca-9c57cc56f-6db7b" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.836241 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sm4kb\" (UniqueName: \"kubernetes.io/projected/9fb5d9ef-bb16-45d0-a7c0-a9fb43edeb34-kube-api-access-sm4kb\") pod \"control-plane-machine-set-operator-78cbb6b69f-2xkrn\" (UID: \"9fb5d9ef-bb16-45d0-a7c0-a9fb43edeb34\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2xkrn" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.836267 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cg7gs\" (UniqueName: \"kubernetes.io/projected/3cd02fb0-bac4-47b0-9846-a94399041f77-kube-api-access-cg7gs\") pod \"service-ca-operator-777779d784-nvtgb\" (UID: \"3cd02fb0-bac4-47b0-9846-a94399041f77\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nvtgb" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.836291 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87vhb\" (UniqueName: \"kubernetes.io/projected/30f3c24f-2b49-4343-8c94-f6d56b43a35d-kube-api-access-87vhb\") pod \"machine-config-operator-74547568cd-8jf57\" (UID: 
\"30f3c24f-2b49-4343-8c94-f6d56b43a35d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8jf57" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.836315 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xnvj\" (UniqueName: \"kubernetes.io/projected/3d05c111-6b35-44c3-b587-12f470d584c3-kube-api-access-6xnvj\") pod \"machine-config-controller-84d6567774-tkhkf\" (UID: \"3d05c111-6b35-44c3-b587-12f470d584c3\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tkhkf" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.836337 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f1fc8bc8-fa6c-4648-9bc3-491cec75d584-bound-sa-token\") pod \"ingress-operator-5b745b69d9-lh92k\" (UID: \"f1fc8bc8-fa6c-4648-9bc3-491cec75d584\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lh92k" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.836358 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/240d1325-4400-475e-8bc7-9915294148d8-mountpoint-dir\") pod \"csi-hostpathplugin-b5hps\" (UID: \"240d1325-4400-475e-8bc7-9915294148d8\") " pod="hostpath-provisioner/csi-hostpathplugin-b5hps" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.836379 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/30f3c24f-2b49-4343-8c94-f6d56b43a35d-images\") pod \"machine-config-operator-74547568cd-8jf57\" (UID: \"30f3c24f-2b49-4343-8c94-f6d56b43a35d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8jf57" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.836402 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25sgl\" (UniqueName: \"kubernetes.io/projected/3a9659ac-0c7d-41be-becc-5ec038244f00-kube-api-access-25sgl\") pod \"dns-default-nnjvd\" (UID: \"3a9659ac-0c7d-41be-becc-5ec038244f00\") " pod="openshift-dns/dns-default-nnjvd" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.836437 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tz6gh\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.836461 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/996087ed-6480-4650-8632-c991e5d16c99-tmpfs\") pod \"packageserver-d55dfcdfc-fp2th\" (UID: \"996087ed-6480-4650-8632-c991e5d16c99\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fp2th" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.836485 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/39cc5cc7-fd80-4461-b4b1-adece1093703-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-b5jn4\" (UID: \"39cc5cc7-fd80-4461-b4b1-adece1093703\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-b5jn4" Mar 09 14:06:22 crc kubenswrapper[4722]: 
I0309 14:06:22.836512 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nrnl\" (UniqueName: \"kubernetes.io/projected/ba0aa566-e854-473c-b6b6-2f9dfece6133-kube-api-access-5nrnl\") pod \"ingress-canary-hdrrh\" (UID: \"ba0aa566-e854-473c-b6b6-2f9dfece6133\") " pod="openshift-ingress-canary/ingress-canary-hdrrh"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.836544 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4jdd\" (UniqueName: \"kubernetes.io/projected/e2641b0e-aae4-49df-931f-95e38505812f-kube-api-access-p4jdd\") pod \"marketplace-operator-79b997595-wz9f2\" (UID: \"e2641b0e-aae4-49df-931f-95e38505812f\") " pod="openshift-marketplace/marketplace-operator-79b997595-wz9f2"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.836577 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/317444ee-0620-47d2-869e-77578a367a87-service-ca-bundle\") pod \"router-default-5444994796-dp8wn\" (UID: \"317444ee-0620-47d2-869e-77578a367a87\") " pod="openshift-ingress/router-default-5444994796-dp8wn"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.836603 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/9fb5d9ef-bb16-45d0-a7c0-a9fb43edeb34-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-2xkrn\" (UID: \"9fb5d9ef-bb16-45d0-a7c0-a9fb43edeb34\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2xkrn"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.836641 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkcsm\" (UniqueName: \"kubernetes.io/projected/317444ee-0620-47d2-869e-77578a367a87-kube-api-access-zkcsm\") pod \"router-default-5444994796-dp8wn\" (UID: \"317444ee-0620-47d2-869e-77578a367a87\") " pod="openshift-ingress/router-default-5444994796-dp8wn"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.836671 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/240d1325-4400-475e-8bc7-9915294148d8-plugins-dir\") pod \"csi-hostpathplugin-b5hps\" (UID: \"240d1325-4400-475e-8bc7-9915294148d8\") " pod="hostpath-provisioner/csi-hostpathplugin-b5hps"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.836700 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/228d39d8-b0bc-4491-be90-e473c090f412-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-mrtb2\" (UID: \"228d39d8-b0bc-4491-be90-e473c090f412\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mrtb2"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.835932 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vq66f\" (UniqueName: \"kubernetes.io/projected/b12ae595-2119-49c9-9bfd-33eec4b6df65-kube-api-access-vq66f\") pod \"kube-storage-version-migrator-operator-b67b599dd-kh6xb\" (UID: \"b12ae595-2119-49c9-9bfd-33eec4b6df65\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kh6xb"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.836728 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/317444ee-0620-47d2-869e-77578a367a87-metrics-certs\") pod \"router-default-5444994796-dp8wn\" (UID: \"317444ee-0620-47d2-869e-77578a367a87\") " pod="openshift-ingress/router-default-5444994796-dp8wn"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.836811 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/317444ee-0620-47d2-869e-77578a367a87-default-certificate\") pod \"router-default-5444994796-dp8wn\" (UID: \"317444ee-0620-47d2-869e-77578a367a87\") " pod="openshift-ingress/router-default-5444994796-dp8wn"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.836853 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f8055a95-6b09-4e32-88b8-82ad36ca5029-srv-cert\") pod \"olm-operator-6b444d44fb-lnkwt\" (UID: \"f8055a95-6b09-4e32-88b8-82ad36ca5029\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lnkwt"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.836877 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/26cb738b-5dfa-4b97-8153-790ec6eb198b-node-bootstrap-token\") pod \"machine-config-server-pjvfp\" (UID: \"26cb738b-5dfa-4b97-8153-790ec6eb198b\") " pod="openshift-machine-config-operator/machine-config-server-pjvfp"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.836914 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f8055a95-6b09-4e32-88b8-82ad36ca5029-profile-collector-cert\") pod \"olm-operator-6b444d44fb-lnkwt\" (UID: \"f8055a95-6b09-4e32-88b8-82ad36ca5029\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lnkwt"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.836939 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnf6j\" (UniqueName: \"kubernetes.io/projected/abf2dbee-a467-4858-8d5a-d4d1bf1bb430-kube-api-access-gnf6j\") pod \"migrator-59844c95c7-nk4z2\" (UID: \"abf2dbee-a467-4858-8d5a-d4d1bf1bb430\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nk4z2"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.836970 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3a9659ac-0c7d-41be-becc-5ec038244f00-metrics-tls\") pod \"dns-default-nnjvd\" (UID: \"3a9659ac-0c7d-41be-becc-5ec038244f00\") " pod="openshift-dns/dns-default-nnjvd"
Mar 09 14:06:22 crc kubenswrapper[4722]: E0309 14:06:22.836990 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 14:06:23.336974039 +0000 UTC m=+223.892542615 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tz6gh" (UID: "9d57536b-4f57-4098-b519-19fdc2559eda") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.837034 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3cd02fb0-bac4-47b0-9846-a94399041f77-serving-cert\") pod \"service-ca-operator-777779d784-nvtgb\" (UID: \"3cd02fb0-bac4-47b0-9846-a94399041f77\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nvtgb"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.837064 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nc6w4\" (UniqueName: \"kubernetes.io/projected/26cb738b-5dfa-4b97-8153-790ec6eb198b-kube-api-access-nc6w4\") pod \"machine-config-server-pjvfp\" (UID: \"26cb738b-5dfa-4b97-8153-790ec6eb198b\") " pod="openshift-machine-config-operator/machine-config-server-pjvfp"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.837089 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgsjv\" (UniqueName: \"kubernetes.io/projected/228d39d8-b0bc-4491-be90-e473c090f412-kube-api-access-kgsjv\") pod \"package-server-manager-789f6589d5-mrtb2\" (UID: \"228d39d8-b0bc-4491-be90-e473c090f412\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mrtb2"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.837113 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c219beb3-4ba5-43bd-b2ec-3855d19c2b57-srv-cert\") pod \"catalog-operator-68c6474976-dgxnd\" (UID: \"c219beb3-4ba5-43bd-b2ec-3855d19c2b57\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dgxnd"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.837136 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/240d1325-4400-475e-8bc7-9915294148d8-socket-dir\") pod \"csi-hostpathplugin-b5hps\" (UID: \"240d1325-4400-475e-8bc7-9915294148d8\") " pod="hostpath-provisioner/csi-hostpathplugin-b5hps"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.837154 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/30f3c24f-2b49-4343-8c94-f6d56b43a35d-images\") pod \"machine-config-operator-74547568cd-8jf57\" (UID: \"30f3c24f-2b49-4343-8c94-f6d56b43a35d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8jf57"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.837166 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wqrg\" (UniqueName: \"kubernetes.io/projected/f1fc8bc8-fa6c-4648-9bc3-491cec75d584-kube-api-access-5wqrg\") pod \"ingress-operator-5b745b69d9-lh92k\" (UID: \"f1fc8bc8-fa6c-4648-9bc3-491cec75d584\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lh92k"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.837221 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7e1fdc12-aac5-4f72-9b22-0212c2f3988e-secret-volume\") pod \"collect-profiles-29551080-mdzx2\" (UID: \"7e1fdc12-aac5-4f72-9b22-0212c2f3988e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551080-mdzx2"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.837239 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csnbb\" (UniqueName: \"kubernetes.io/projected/40be416c-1b7b-4973-b9ed-25ae20cd660d-kube-api-access-csnbb\") pod \"auto-csr-approver-29551086-cvc28\" (UID: \"40be416c-1b7b-4973-b9ed-25ae20cd660d\") " pod="openshift-infra/auto-csr-approver-29551086-cvc28"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.837259 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/240d1325-4400-475e-8bc7-9915294148d8-registration-dir\") pod \"csi-hostpathplugin-b5hps\" (UID: \"240d1325-4400-475e-8bc7-9915294148d8\") " pod="hostpath-provisioner/csi-hostpathplugin-b5hps"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.837279 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76545\" (UniqueName: \"kubernetes.io/projected/240d1325-4400-475e-8bc7-9915294148d8-kube-api-access-76545\") pod \"csi-hostpathplugin-b5hps\" (UID: \"240d1325-4400-475e-8bc7-9915294148d8\") " pod="hostpath-provisioner/csi-hostpathplugin-b5hps"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.837325 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fzvj\" (UniqueName: \"kubernetes.io/projected/53e973bb-4b49-4815-b8b3-a6cd76e210bf-kube-api-access-5fzvj\") pod \"multus-admission-controller-857f4d67dd-f8gtl\" (UID: \"53e973bb-4b49-4815-b8b3-a6cd76e210bf\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-f8gtl"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.837344 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4j9cs\" (UniqueName: \"kubernetes.io/projected/c219beb3-4ba5-43bd-b2ec-3855d19c2b57-kube-api-access-4j9cs\") pod \"catalog-operator-68c6474976-dgxnd\" (UID: \"c219beb3-4ba5-43bd-b2ec-3855d19c2b57\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dgxnd"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.837363 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3d05c111-6b35-44c3-b587-12f470d584c3-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-tkhkf\" (UID: \"3d05c111-6b35-44c3-b587-12f470d584c3\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tkhkf"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.837364 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/240d1325-4400-475e-8bc7-9915294148d8-mountpoint-dir\") pod \"csi-hostpathplugin-b5hps\" (UID: \"240d1325-4400-475e-8bc7-9915294148d8\") " pod="hostpath-provisioner/csi-hostpathplugin-b5hps"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.837378 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/53e973bb-4b49-4815-b8b3-a6cd76e210bf-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-f8gtl\" (UID: \"53e973bb-4b49-4815-b8b3-a6cd76e210bf\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-f8gtl"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.837501 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/26cb738b-5dfa-4b97-8153-790ec6eb198b-certs\") pod \"machine-config-server-pjvfp\" (UID: \"26cb738b-5dfa-4b97-8153-790ec6eb198b\") " pod="openshift-machine-config-operator/machine-config-server-pjvfp"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.837557 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/39cc5cc7-fd80-4461-b4b1-adece1093703-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-b5jn4\" (UID: \"39cc5cc7-fd80-4461-b4b1-adece1093703\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-b5jn4"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.837566 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/cc613bd5-c976-458c-aeda-d62a304f4103-signing-cabundle\") pod \"service-ca-9c57cc56f-6db7b\" (UID: \"cc613bd5-c976-458c-aeda-d62a304f4103\") " pod="openshift-service-ca/service-ca-9c57cc56f-6db7b"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.837601 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3d05c111-6b35-44c3-b587-12f470d584c3-proxy-tls\") pod \"machine-config-controller-84d6567774-tkhkf\" (UID: \"3d05c111-6b35-44c3-b587-12f470d584c3\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tkhkf"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.837624 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/240d1325-4400-475e-8bc7-9915294148d8-csi-data-dir\") pod \"csi-hostpathplugin-b5hps\" (UID: \"240d1325-4400-475e-8bc7-9915294148d8\") " pod="hostpath-provisioner/csi-hostpathplugin-b5hps"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.837684 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3cd02fb0-bac4-47b0-9846-a94399041f77-config\") pod \"service-ca-operator-777779d784-nvtgb\" (UID: \"3cd02fb0-bac4-47b0-9846-a94399041f77\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nvtgb"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.837710 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3a9659ac-0c7d-41be-becc-5ec038244f00-config-volume\") pod \"dns-default-nnjvd\" (UID: \"3a9659ac-0c7d-41be-becc-5ec038244f00\") " pod="openshift-dns/dns-default-nnjvd"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.837744 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddxlc\" (UniqueName: \"kubernetes.io/projected/f8055a95-6b09-4e32-88b8-82ad36ca5029-kube-api-access-ddxlc\") pod \"olm-operator-6b444d44fb-lnkwt\" (UID: \"f8055a95-6b09-4e32-88b8-82ad36ca5029\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lnkwt"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.837773 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6fls\" (UniqueName: \"kubernetes.io/projected/7e1fdc12-aac5-4f72-9b22-0212c2f3988e-kube-api-access-h6fls\") pod \"collect-profiles-29551080-mdzx2\" (UID: \"7e1fdc12-aac5-4f72-9b22-0212c2f3988e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551080-mdzx2"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.837794 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ba0aa566-e854-473c-b6b6-2f9dfece6133-cert\") pod \"ingress-canary-hdrrh\" (UID: \"ba0aa566-e854-473c-b6b6-2f9dfece6133\") " pod="openshift-ingress-canary/ingress-canary-hdrrh"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.837812 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39cc5cc7-fd80-4461-b4b1-adece1093703-config\") pod \"kube-controller-manager-operator-78b949d7b-b5jn4\" (UID: \"39cc5cc7-fd80-4461-b4b1-adece1093703\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-b5jn4"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.837833 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f1fc8bc8-fa6c-4648-9bc3-491cec75d584-metrics-tls\") pod \"ingress-operator-5b745b69d9-lh92k\" (UID: \"f1fc8bc8-fa6c-4648-9bc3-491cec75d584\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lh92k"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.837866 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/30f3c24f-2b49-4343-8c94-f6d56b43a35d-auth-proxy-config\") pod \"machine-config-operator-74547568cd-8jf57\" (UID: \"30f3c24f-2b49-4343-8c94-f6d56b43a35d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8jf57"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.837890 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/996087ed-6480-4650-8632-c991e5d16c99-apiservice-cert\") pod \"packageserver-d55dfcdfc-fp2th\" (UID: \"996087ed-6480-4650-8632-c991e5d16c99\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fp2th"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.837913 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7e1fdc12-aac5-4f72-9b22-0212c2f3988e-config-volume\") pod \"collect-profiles-29551080-mdzx2\" (UID: \"7e1fdc12-aac5-4f72-9b22-0212c2f3988e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551080-mdzx2"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.837934 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xg2hf\" (UniqueName: \"kubernetes.io/projected/996087ed-6480-4650-8632-c991e5d16c99-kube-api-access-xg2hf\") pod \"packageserver-d55dfcdfc-fp2th\" (UID: \"996087ed-6480-4650-8632-c991e5d16c99\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fp2th"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.837986 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f1fc8bc8-fa6c-4648-9bc3-491cec75d584-trusted-ca\") pod \"ingress-operator-5b745b69d9-lh92k\" (UID: \"f1fc8bc8-fa6c-4648-9bc3-491cec75d584\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lh92k"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.838008 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e2641b0e-aae4-49df-931f-95e38505812f-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-wz9f2\" (UID: \"e2641b0e-aae4-49df-931f-95e38505812f\") " pod="openshift-marketplace/marketplace-operator-79b997595-wz9f2"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.838033 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e2641b0e-aae4-49df-931f-95e38505812f-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-wz9f2\" (UID: \"e2641b0e-aae4-49df-931f-95e38505812f\") " pod="openshift-marketplace/marketplace-operator-79b997595-wz9f2"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.838076 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/30f3c24f-2b49-4343-8c94-f6d56b43a35d-proxy-tls\") pod \"machine-config-operator-74547568cd-8jf57\" (UID: \"30f3c24f-2b49-4343-8c94-f6d56b43a35d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8jf57"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.838107 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/317444ee-0620-47d2-869e-77578a367a87-stats-auth\") pod \"router-default-5444994796-dp8wn\" (UID: \"317444ee-0620-47d2-869e-77578a367a87\") " pod="openshift-ingress/router-default-5444994796-dp8wn"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.838147 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/cc613bd5-c976-458c-aeda-d62a304f4103-signing-key\") pod \"service-ca-9c57cc56f-6db7b\" (UID: \"cc613bd5-c976-458c-aeda-d62a304f4103\") " pod="openshift-service-ca/service-ca-9c57cc56f-6db7b"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.838167 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c219beb3-4ba5-43bd-b2ec-3855d19c2b57-profile-collector-cert\") pod \"catalog-operator-68c6474976-dgxnd\" (UID: \"c219beb3-4ba5-43bd-b2ec-3855d19c2b57\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dgxnd"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.838195 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdg9n\" (UniqueName: \"kubernetes.io/projected/cc613bd5-c976-458c-aeda-d62a304f4103-kube-api-access-hdg9n\") pod \"service-ca-9c57cc56f-6db7b\" (UID: \"cc613bd5-c976-458c-aeda-d62a304f4103\") " pod="openshift-service-ca/service-ca-9c57cc56f-6db7b"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.838616 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/317444ee-0620-47d2-869e-77578a367a87-service-ca-bundle\") pod \"router-default-5444994796-dp8wn\" (UID: \"317444ee-0620-47d2-869e-77578a367a87\") " pod="openshift-ingress/router-default-5444994796-dp8wn"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.838869 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/996087ed-6480-4650-8632-c991e5d16c99-tmpfs\") pod \"packageserver-d55dfcdfc-fp2th\" (UID: \"996087ed-6480-4650-8632-c991e5d16c99\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fp2th"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.840640 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/240d1325-4400-475e-8bc7-9915294148d8-plugins-dir\") pod \"csi-hostpathplugin-b5hps\" (UID: \"240d1325-4400-475e-8bc7-9915294148d8\") " pod="hostpath-provisioner/csi-hostpathplugin-b5hps"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.841140 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3d05c111-6b35-44c3-b587-12f470d584c3-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-tkhkf\" (UID: \"3d05c111-6b35-44c3-b587-12f470d584c3\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tkhkf"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.842181 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3a9659ac-0c7d-41be-becc-5ec038244f00-metrics-tls\") pod \"dns-default-nnjvd\" (UID: \"3a9659ac-0c7d-41be-becc-5ec038244f00\") " pod="openshift-dns/dns-default-nnjvd"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.842316 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/240d1325-4400-475e-8bc7-9915294148d8-socket-dir\") pod \"csi-hostpathplugin-b5hps\" (UID: \"240d1325-4400-475e-8bc7-9915294148d8\") " pod="hostpath-provisioner/csi-hostpathplugin-b5hps"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.842581 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/53e973bb-4b49-4815-b8b3-a6cd76e210bf-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-f8gtl\" (UID: \"53e973bb-4b49-4815-b8b3-a6cd76e210bf\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-f8gtl"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.842706 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/240d1325-4400-475e-8bc7-9915294148d8-registration-dir\") pod \"csi-hostpathplugin-b5hps\" (UID: \"240d1325-4400-475e-8bc7-9915294148d8\") " pod="hostpath-provisioner/csi-hostpathplugin-b5hps"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.843182 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/240d1325-4400-475e-8bc7-9915294148d8-csi-data-dir\") pod \"csi-hostpathplugin-b5hps\" (UID: \"240d1325-4400-475e-8bc7-9915294148d8\") " pod="hostpath-provisioner/csi-hostpathplugin-b5hps"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.843342 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/317444ee-0620-47d2-869e-77578a367a87-default-certificate\") pod \"router-default-5444994796-dp8wn\" (UID: \"317444ee-0620-47d2-869e-77578a367a87\") " pod="openshift-ingress/router-default-5444994796-dp8wn"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.843587 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/26cb738b-5dfa-4b97-8153-790ec6eb198b-certs\") pod \"machine-config-server-pjvfp\" (UID: \"26cb738b-5dfa-4b97-8153-790ec6eb198b\") " pod="openshift-machine-config-operator/machine-config-server-pjvfp"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.843777 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/317444ee-0620-47d2-869e-77578a367a87-metrics-certs\") pod \"router-default-5444994796-dp8wn\" (UID: \"317444ee-0620-47d2-869e-77578a367a87\") " pod="openshift-ingress/router-default-5444994796-dp8wn"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.844081 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3cd02fb0-bac4-47b0-9846-a94399041f77-config\") pod \"service-ca-operator-777779d784-nvtgb\" (UID: \"3cd02fb0-bac4-47b0-9846-a94399041f77\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nvtgb"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.844697 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c219beb3-4ba5-43bd-b2ec-3855d19c2b57-srv-cert\") pod \"catalog-operator-68c6474976-dgxnd\" (UID: \"c219beb3-4ba5-43bd-b2ec-3855d19c2b57\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dgxnd"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.845132 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/30f3c24f-2b49-4343-8c94-f6d56b43a35d-auth-proxy-config\") pod \"machine-config-operator-74547568cd-8jf57\" (UID: \"30f3c24f-2b49-4343-8c94-f6d56b43a35d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8jf57"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.845624 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/9fb5d9ef-bb16-45d0-a7c0-a9fb43edeb34-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-2xkrn\" (UID: \"9fb5d9ef-bb16-45d0-a7c0-a9fb43edeb34\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2xkrn"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.846364 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7e1fdc12-aac5-4f72-9b22-0212c2f3988e-secret-volume\") pod \"collect-profiles-29551080-mdzx2\" (UID: \"7e1fdc12-aac5-4f72-9b22-0212c2f3988e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551080-mdzx2"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.848144 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7e1fdc12-aac5-4f72-9b22-0212c2f3988e-config-volume\") pod \"collect-profiles-29551080-mdzx2\" (UID: \"7e1fdc12-aac5-4f72-9b22-0212c2f3988e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551080-mdzx2"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.848711 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-6b65x"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.848828 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f1fc8bc8-fa6c-4648-9bc3-491cec75d584-trusted-ca\") pod \"ingress-operator-5b745b69d9-lh92k\" (UID: \"f1fc8bc8-fa6c-4648-9bc3-491cec75d584\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lh92k"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.849677 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/39cc5cc7-fd80-4461-b4b1-adece1093703-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-b5jn4\" (UID: \"39cc5cc7-fd80-4461-b4b1-adece1093703\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-b5jn4"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.850479 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3a9659ac-0c7d-41be-becc-5ec038244f00-config-volume\") pod \"dns-default-nnjvd\" (UID: \"3a9659ac-0c7d-41be-becc-5ec038244f00\") " pod="openshift-dns/dns-default-nnjvd"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.851102 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e2641b0e-aae4-49df-931f-95e38505812f-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-wz9f2\" (UID: \"e2641b0e-aae4-49df-931f-95e38505812f\") " pod="openshift-marketplace/marketplace-operator-79b997595-wz9f2"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.852496 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/996087ed-6480-4650-8632-c991e5d16c99-webhook-cert\") pod \"packageserver-d55dfcdfc-fp2th\" (UID: \"996087ed-6480-4650-8632-c991e5d16c99\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fp2th"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.853119 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3cd02fb0-bac4-47b0-9846-a94399041f77-serving-cert\") pod \"service-ca-operator-777779d784-nvtgb\" (UID: \"3cd02fb0-bac4-47b0-9846-a94399041f77\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nvtgb"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.855112 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f8055a95-6b09-4e32-88b8-82ad36ca5029-profile-collector-cert\") pod \"olm-operator-6b444d44fb-lnkwt\" (UID: \"f8055a95-6b09-4e32-88b8-82ad36ca5029\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lnkwt"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.855195 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f8055a95-6b09-4e32-88b8-82ad36ca5029-srv-cert\") pod \"olm-operator-6b444d44fb-lnkwt\" (UID: \"f8055a95-6b09-4e32-88b8-82ad36ca5029\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lnkwt"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.855316 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39cc5cc7-fd80-4461-b4b1-adece1093703-config\") pod \"kube-controller-manager-operator-78b949d7b-b5jn4\" (UID: \"39cc5cc7-fd80-4461-b4b1-adece1093703\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-b5jn4"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.855740 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cl4st\" (UniqueName: \"kubernetes.io/projected/6d4c4768-195d-42fe-86a7-139c4ec0c86d-kube-api-access-cl4st\") pod \"dns-operator-744455d44c-hng8f\" (UID: \"6d4c4768-195d-42fe-86a7-139c4ec0c86d\") " pod="openshift-dns-operator/dns-operator-744455d44c-hng8f"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.860243 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/317444ee-0620-47d2-869e-77578a367a87-stats-auth\") pod \"router-default-5444994796-dp8wn\" (UID: \"317444ee-0620-47d2-869e-77578a367a87\") " pod="openshift-ingress/router-default-5444994796-dp8wn"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.860261 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e2641b0e-aae4-49df-931f-95e38505812f-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-wz9f2\" (UID: \"e2641b0e-aae4-49df-931f-95e38505812f\") " pod="openshift-marketplace/marketplace-operator-79b997595-wz9f2"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.860518 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/996087ed-6480-4650-8632-c991e5d16c99-apiservice-cert\") pod \"packageserver-d55dfcdfc-fp2th\" (UID: \"996087ed-6480-4650-8632-c991e5d16c99\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fp2th"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.860921 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/26cb738b-5dfa-4b97-8153-790ec6eb198b-node-bootstrap-token\") pod \"machine-config-server-pjvfp\" (UID: \"26cb738b-5dfa-4b97-8153-790ec6eb198b\") " pod="openshift-machine-config-operator/machine-config-server-pjvfp"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.860917 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/228d39d8-b0bc-4491-be90-e473c090f412-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-mrtb2\" (UID: \"228d39d8-b0bc-4491-be90-e473c090f412\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mrtb2"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.860963 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f1fc8bc8-fa6c-4648-9bc3-491cec75d584-metrics-tls\") pod \"ingress-operator-5b745b69d9-lh92k\" (UID: \"f1fc8bc8-fa6c-4648-9bc3-491cec75d584\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lh92k"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.861039 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ba0aa566-e854-473c-b6b6-2f9dfece6133-cert\") pod \"ingress-canary-hdrrh\" (UID: \"ba0aa566-e854-473c-b6b6-2f9dfece6133\") " pod="openshift-ingress-canary/ingress-canary-hdrrh"
Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.861470 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3d05c111-6b35-44c3-b587-12f470d584c3-proxy-tls\") pod \"machine-config-controller-84d6567774-tkhkf\" (UID: \"3d05c111-6b35-44c3-b587-12f470d584c3\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tkhkf" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.861972 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/30f3c24f-2b49-4343-8c94-f6d56b43a35d-proxy-tls\") pod \"machine-config-operator-74547568cd-8jf57\" (UID: \"30f3c24f-2b49-4343-8c94-f6d56b43a35d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8jf57" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.864367 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c219beb3-4ba5-43bd-b2ec-3855d19c2b57-profile-collector-cert\") pod \"catalog-operator-68c6474976-dgxnd\" (UID: \"c219beb3-4ba5-43bd-b2ec-3855d19c2b57\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dgxnd" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.866567 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6635182d-e9ef-4294-8ff3-ae305c10feb4-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9vnnw\" (UID: \"6635182d-e9ef-4294-8ff3-ae305c10feb4\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9vnnw" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.868445 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/cc613bd5-c976-458c-aeda-d62a304f4103-signing-key\") pod \"service-ca-9c57cc56f-6db7b\" (UID: \"cc613bd5-c976-458c-aeda-d62a304f4103\") " pod="openshift-service-ca/service-ca-9c57cc56f-6db7b" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.879956 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7q7bl" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.886165 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9vnnw" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.890008 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9d57536b-4f57-4098-b519-19fdc2559eda-bound-sa-token\") pod \"image-registry-697d97f7c8-tz6gh\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.892835 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-68k6r"] Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.894904 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5j6rh" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.901714 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-hng8f" Mar 09 14:06:22 crc kubenswrapper[4722]: W0309 14:06:22.907752 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb642aeb0_15ca_4f07_a83d_ede389a04408.slice/crio-ea42107ab7c2122155bc2d79751b2d7892e4b31c169d80857eadf69d04cade9d WatchSource:0}: Error finding container ea42107ab7c2122155bc2d79751b2d7892e4b31c169d80857eadf69d04cade9d: Status 404 returned error can't find the container with id ea42107ab7c2122155bc2d79751b2d7892e4b31c169d80857eadf69d04cade9d Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.919717 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kh6xb" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.926951 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbdw2\" (UniqueName: \"kubernetes.io/projected/9d57536b-4f57-4098-b519-19fdc2559eda-kube-api-access-bbdw2\") pod \"image-registry-697d97f7c8-tz6gh\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.938922 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 14:06:22 crc kubenswrapper[4722]: E0309 14:06:22.939293 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 14:06:23.439256354 +0000 UTC m=+223.994824930 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.939549 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tz6gh\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh" Mar 09 14:06:22 crc kubenswrapper[4722]: E0309 14:06:22.940098 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 14:06:23.440085889 +0000 UTC m=+223.995654545 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tz6gh" (UID: "9d57536b-4f57-4098-b519-19fdc2559eda") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.950037 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7tvf\" (UniqueName: \"kubernetes.io/projected/8a76c9b5-c226-4d93-8d7a-8e56210b572a-kube-api-access-k7tvf\") pod \"console-f9d7485db-sfch8\" (UID: \"8a76c9b5-c226-4d93-8d7a-8e56210b572a\") " pod="openshift-console/console-f9d7485db-sfch8" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.971342 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87vhb\" (UniqueName: \"kubernetes.io/projected/30f3c24f-2b49-4343-8c94-f6d56b43a35d-kube-api-access-87vhb\") pod \"machine-config-operator-74547568cd-8jf57\" (UID: \"30f3c24f-2b49-4343-8c94-f6d56b43a35d\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8jf57" Mar 09 14:06:22 crc kubenswrapper[4722]: I0309 14:06:22.986770 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f1fc8bc8-fa6c-4648-9bc3-491cec75d584-bound-sa-token\") pod \"ingress-operator-5b745b69d9-lh92k\" (UID: \"f1fc8bc8-fa6c-4648-9bc3-491cec75d584\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lh92k" Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.005803 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8jf57" Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.008842 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cg7gs\" (UniqueName: \"kubernetes.io/projected/3cd02fb0-bac4-47b0-9846-a94399041f77-kube-api-access-cg7gs\") pod \"service-ca-operator-777779d784-nvtgb\" (UID: \"3cd02fb0-bac4-47b0-9846-a94399041f77\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-nvtgb" Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.034426 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sm4kb\" (UniqueName: \"kubernetes.io/projected/9fb5d9ef-bb16-45d0-a7c0-a9fb43edeb34-kube-api-access-sm4kb\") pod \"control-plane-machine-set-operator-78cbb6b69f-2xkrn\" (UID: \"9fb5d9ef-bb16-45d0-a7c0-a9fb43edeb34\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2xkrn" Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.042169 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 14:06:23 crc kubenswrapper[4722]: E0309 14:06:23.042578 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-09 14:06:23.542557419 +0000 UTC m=+224.098125995 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.047908 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xnvj\" (UniqueName: \"kubernetes.io/projected/3d05c111-6b35-44c3-b587-12f470d584c3-kube-api-access-6xnvj\") pod \"machine-config-controller-84d6567774-tkhkf\" (UID: \"3d05c111-6b35-44c3-b587-12f470d584c3\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tkhkf" Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.072111 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/39cc5cc7-fd80-4461-b4b1-adece1093703-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-b5jn4\" (UID: \"39cc5cc7-fd80-4461-b4b1-adece1093703\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-b5jn4" Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.091893 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4jdd\" (UniqueName: \"kubernetes.io/projected/e2641b0e-aae4-49df-931f-95e38505812f-kube-api-access-p4jdd\") pod \"marketplace-operator-79b997595-wz9f2\" (UID: \"e2641b0e-aae4-49df-931f-95e38505812f\") " pod="openshift-marketplace/marketplace-operator-79b997595-wz9f2" Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.128320 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nrnl\" (UniqueName: \"kubernetes.io/projected/ba0aa566-e854-473c-b6b6-2f9dfece6133-kube-api-access-5nrnl\") pod \"ingress-canary-hdrrh\" (UID: \"ba0aa566-e854-473c-b6b6-2f9dfece6133\") " pod="openshift-ingress-canary/ingress-canary-hdrrh" Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.129562 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25sgl\" (UniqueName: \"kubernetes.io/projected/3a9659ac-0c7d-41be-becc-5ec038244f00-kube-api-access-25sgl\") pod \"dns-default-nnjvd\" (UID: \"3a9659ac-0c7d-41be-becc-5ec038244f00\") " pod="openshift-dns/dns-default-nnjvd" Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.143094 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-nnjvd" Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.145741 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tz6gh\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh" Mar 09 14:06:23 crc kubenswrapper[4722]: E0309 14:06:23.146369 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-09 14:06:23.646351139 +0000 UTC m=+224.201919715 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tz6gh" (UID: "9d57536b-4f57-4098-b519-19fdc2559eda") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.156356 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-sfch8" Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.158296 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkcsm\" (UniqueName: \"kubernetes.io/projected/317444ee-0620-47d2-869e-77578a367a87-kube-api-access-zkcsm\") pod \"router-default-5444994796-dp8wn\" (UID: \"317444ee-0620-47d2-869e-77578a367a87\") " pod="openshift-ingress/router-default-5444994796-dp8wn" Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.163552 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-6b65x"] Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.169906 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdg9n\" (UniqueName: \"kubernetes.io/projected/cc613bd5-c976-458c-aeda-d62a304f4103-kube-api-access-hdg9n\") pod \"service-ca-9c57cc56f-6db7b\" (UID: \"cc613bd5-c976-458c-aeda-d62a304f4103\") " pod="openshift-service-ca/service-ca-9c57cc56f-6db7b" Mar 09 14:06:23 crc kubenswrapper[4722]: W0309 14:06:23.184426 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac1bc217_a917_4e71_8672_bab0cdaa1bbf.slice/crio-a25d26640eab77938b280c2780d1a9ccee03bd38bee89653fed85490c7d8af98 WatchSource:0}: Error finding container a25d26640eab77938b280c2780d1a9ccee03bd38bee89653fed85490c7d8af98: Status 404 returned error can't find the container with id a25d26640eab77938b280c2780d1a9ccee03bd38bee89653fed85490c7d8af98 Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.193705 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7q7bl"] Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.202478 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnf6j\" (UniqueName: \"kubernetes.io/projected/abf2dbee-a467-4858-8d5a-d4d1bf1bb430-kube-api-access-gnf6j\") pod \"migrator-59844c95c7-nk4z2\" (UID: \"abf2dbee-a467-4858-8d5a-d4d1bf1bb430\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nk4z2" Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.211709 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wqrg\" (UniqueName: \"kubernetes.io/projected/f1fc8bc8-fa6c-4648-9bc3-491cec75d584-kube-api-access-5wqrg\") pod \"ingress-operator-5b745b69d9-lh92k\" (UID: \"f1fc8bc8-fa6c-4648-9bc3-491cec75d584\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lh92k" Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.230999 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76545\" (UniqueName: 
\"kubernetes.io/projected/240d1325-4400-475e-8bc7-9915294148d8-kube-api-access-76545\") pod \"csi-hostpathplugin-b5hps\" (UID: \"240d1325-4400-475e-8bc7-9915294148d8\") " pod="hostpath-provisioner/csi-hostpathplugin-b5hps" Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.231817 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lh92k" Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.240279 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-b5jn4" Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.245276 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-dp8wn" Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.247536 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 14:06:23 crc kubenswrapper[4722]: E0309 14:06:23.247954 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 14:06:23.747936223 +0000 UTC m=+224.303504809 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.250450 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9vnnw"] Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.258018 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fzvj\" (UniqueName: \"kubernetes.io/projected/53e973bb-4b49-4815-b8b3-a6cd76e210bf-kube-api-access-5fzvj\") pod \"multus-admission-controller-857f4d67dd-f8gtl\" (UID: \"53e973bb-4b49-4815-b8b3-a6cd76e210bf\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-f8gtl" Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.266750 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tkhkf" Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.276284 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4j9cs\" (UniqueName: \"kubernetes.io/projected/c219beb3-4ba5-43bd-b2ec-3855d19c2b57-kube-api-access-4j9cs\") pod \"catalog-operator-68c6474976-dgxnd\" (UID: \"c219beb3-4ba5-43bd-b2ec-3855d19c2b57\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dgxnd" Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.278085 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2xkrn" Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.281818 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-nvtgb" Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.286005 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nc6w4\" (UniqueName: \"kubernetes.io/projected/26cb738b-5dfa-4b97-8153-790ec6eb198b-kube-api-access-nc6w4\") pod \"machine-config-server-pjvfp\" (UID: \"26cb738b-5dfa-4b97-8153-790ec6eb198b\") " pod="openshift-machine-config-operator/machine-config-server-pjvfp" Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.289896 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-6db7b" Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.299095 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nk4z2" Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.299665 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kh6xb"] Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.309273 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgsjv\" (UniqueName: \"kubernetes.io/projected/228d39d8-b0bc-4491-be90-e473c090f412-kube-api-access-kgsjv\") pod \"package-server-manager-789f6589d5-mrtb2\" (UID: \"228d39d8-b0bc-4491-be90-e473c090f412\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mrtb2" Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.318290 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dgxnd" Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.330026 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csnbb\" (UniqueName: \"kubernetes.io/projected/40be416c-1b7b-4973-b9ed-25ae20cd660d-kube-api-access-csnbb\") pod \"auto-csr-approver-29551086-cvc28\" (UID: \"40be416c-1b7b-4973-b9ed-25ae20cd660d\") " pod="openshift-infra/auto-csr-approver-29551086-cvc28" Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.333560 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-wz9f2" Mar 09 14:06:23 crc kubenswrapper[4722]: W0309 14:06:23.341184 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6635182d_e9ef_4294_8ff3_ae305c10feb4.slice/crio-0c19d7b2282f919afd847055a1ed13d6c78c2a0c67426e2a8778d38fc4574b78 WatchSource:0}: Error finding container 0c19d7b2282f919afd847055a1ed13d6c78c2a0c67426e2a8778d38fc4574b78: Status 404 returned error can't find the container with id 0c19d7b2282f919afd847055a1ed13d6c78c2a0c67426e2a8778d38fc4574b78 Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.349140 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-f8gtl" Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.350404 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tz6gh\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh" Mar 09 14:06:23 crc kubenswrapper[4722]: E0309 14:06:23.350750 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 14:06:23.850735943 +0000 UTC m=+224.406304519 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tz6gh" (UID: "9d57536b-4f57-4098-b519-19fdc2559eda") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.357119 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddxlc\" (UniqueName: \"kubernetes.io/projected/f8055a95-6b09-4e32-88b8-82ad36ca5029-kube-api-access-ddxlc\") pod \"olm-operator-6b444d44fb-lnkwt\" (UID: \"f8055a95-6b09-4e32-88b8-82ad36ca5029\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lnkwt" Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.377653 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551086-cvc28" Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.378667 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6fls\" (UniqueName: \"kubernetes.io/projected/7e1fdc12-aac5-4f72-9b22-0212c2f3988e-kube-api-access-h6fls\") pod \"collect-profiles-29551080-mdzx2\" (UID: \"7e1fdc12-aac5-4f72-9b22-0212c2f3988e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551080-mdzx2" Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.386875 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-b5hps" Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.401993 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xg2hf\" (UniqueName: \"kubernetes.io/projected/996087ed-6480-4650-8632-c991e5d16c99-kube-api-access-xg2hf\") pod \"packageserver-d55dfcdfc-fp2th\" (UID: \"996087ed-6480-4650-8632-c991e5d16c99\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fp2th" Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.405043 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-8jf57"] Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.427474 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-hdrrh" Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.434986 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-hng8f"] Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.455015 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 14:06:23 crc kubenswrapper[4722]: E0309 14:06:23.455804 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 14:06:23.95578707 +0000 UTC m=+224.511355646 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.460500 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-pjvfp" Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.460961 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5j6rh"] Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.473828 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-nnjvd"] Mar 09 14:06:23 crc kubenswrapper[4722]: W0309 14:06:23.500025 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30f3c24f_2b49_4343_8c94_f6d56b43a35d.slice/crio-0a7041cafe3cf1774ac1767685bf9f0c8abae3c77e6723dbeeae471c7eca348a WatchSource:0}: Error finding container 0a7041cafe3cf1774ac1767685bf9f0c8abae3c77e6723dbeeae471c7eca348a: Status 404 returned error can't find the container with id 0a7041cafe3cf1774ac1767685bf9f0c8abae3c77e6723dbeeae471c7eca348a Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.560765 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mrtb2" Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.563056 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lnkwt" Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.565862 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7q7bl" event={"ID":"3ba8e5ae-906d-47bd-862d-8e66e0defb9f","Type":"ContainerStarted","Data":"f7704bd23f40ef83ac418d22036da29daca51b0a399bbae7040a614ea63feb51"} Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.566262 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tz6gh\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh" Mar 09 14:06:23 crc kubenswrapper[4722]: E0309 14:06:23.566700 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 14:06:24.06668272 +0000 UTC m=+224.622251286 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tz6gh" (UID: "9d57536b-4f57-4098-b519-19fdc2559eda") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.577012 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-sfch8"] Mar 09 14:06:23 crc kubenswrapper[4722]: W0309 14:06:23.610241 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a9659ac_0c7d_41be_becc_5ec038244f00.slice/crio-9851d23744be70c8a9cc4f74a859f1651a00feb695c0e968de5f46f6f8049509 WatchSource:0}: Error finding container 9851d23744be70c8a9cc4f74a859f1651a00feb695c0e968de5f46f6f8049509: Status 404 returned error can't find the container with id 9851d23744be70c8a9cc4f74a859f1651a00feb695c0e968de5f46f6f8049509 Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.610834 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8jf57" event={"ID":"30f3c24f-2b49-4343-8c94-f6d56b43a35d","Type":"ContainerStarted","Data":"0a7041cafe3cf1774ac1767685bf9f0c8abae3c77e6723dbeeae471c7eca348a"} Mar 09 14:06:23 crc kubenswrapper[4722]: W0309 14:06:23.610896 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod704d2c33_a0ad_4dc1_a5e6_37aab398f9a0.slice/crio-ff541e8df421c6bf2b7d5956910733920ad72fd57603d09c40fa45e860dddebe WatchSource:0}: Error finding container ff541e8df421c6bf2b7d5956910733920ad72fd57603d09c40fa45e860dddebe: Status 404 returned error can't find the container with id ff541e8df421c6bf2b7d5956910733920ad72fd57603d09c40fa45e860dddebe Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.627683 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9vnnw" 
event={"ID":"6635182d-e9ef-4294-8ff3-ae305c10feb4","Type":"ContainerStarted","Data":"0c19d7b2282f919afd847055a1ed13d6c78c2a0c67426e2a8778d38fc4574b78"} Mar 09 14:06:23 crc kubenswrapper[4722]: W0309 14:06:23.637325 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod317444ee_0620_47d2_869e_77578a367a87.slice/crio-feb3a1769030d5dcca5220cfc04e993138a3ad1753946d5a015998220255bb8c WatchSource:0}: Error finding container feb3a1769030d5dcca5220cfc04e993138a3ad1753946d5a015998220255bb8c: Status 404 returned error can't find the container with id feb3a1769030d5dcca5220cfc04e993138a3ad1753946d5a015998220255bb8c Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.640926 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551080-mdzx2" Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.642828 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-8tkbn" event={"ID":"427a4c04-99cd-4f53-ae98-20c1755d7658","Type":"ContainerStarted","Data":"e614a94c2fbf38f966a561149647e547de6024a28362b5b063c83cb79f8c1d5e"} Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.657033 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fp2th" Mar 09 14:06:23 crc kubenswrapper[4722]: W0309 14:06:23.664984 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a76c9b5_c226_4d93_8d7a_8e56210b572a.slice/crio-6fee2f304e49495845c7a3f368cf91dcffba93fe423b49acd454c70f8fd5cb71 WatchSource:0}: Error finding container 6fee2f304e49495845c7a3f368cf91dcffba93fe423b49acd454c70f8fd5cb71: Status 404 returned error can't find the container with id 6fee2f304e49495845c7a3f368cf91dcffba93fe423b49acd454c70f8fd5cb71 Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.666988 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 14:06:23 crc kubenswrapper[4722]: E0309 14:06:23.667619 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 14:06:24.167575594 +0000 UTC m=+224.723144200 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.674375 4722 generic.go:334] "Generic (PLEG): container finished" podID="4084fbb0-8fae-4b7e-a3f6-ec9d723bb367" containerID="4e5caf996f7c2c772b0634e8bd89afe9b1e1afc248574fc1eb44830292f88b51" exitCode=0 Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.674568 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-gh244" event={"ID":"4084fbb0-8fae-4b7e-a3f6-ec9d723bb367","Type":"ContainerDied","Data":"4e5caf996f7c2c772b0634e8bd89afe9b1e1afc248574fc1eb44830292f88b51"} Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.683091 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-lh92k"] Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.686634 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-jmwpv" event={"ID":"9f2a2160-888c-4101-8b1c-63498753a2b7","Type":"ContainerStarted","Data":"85acb30f0b7edaa70a94e32978a937bc5732b62f2e5d01e71c5c4e5bc6878dfb"} Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.686810 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-jmwpv" Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.706238 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-68k6r" event={"ID":"b642aeb0-15ca-4f07-a83d-ede389a04408","Type":"ContainerStarted","Data":"343e76c820ba369d4a8eb3402f2485fe25e3a3e67bc0c71f87d2cc7eb6f42833"} Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.706281 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-68k6r" event={"ID":"b642aeb0-15ca-4f07-a83d-ede389a04408","Type":"ContainerStarted","Data":"ea42107ab7c2122155bc2d79751b2d7892e4b31c169d80857eadf69d04cade9d"} Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.723036 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7jl6s" event={"ID":"7c9f4b4e-a00a-4a59-bdc8-a76fe4d6d0d8","Type":"ContainerStarted","Data":"79916c6726dbfb002d28e78d11c07952f18996befe41677c7d20046f8af907ae"} Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.739804 4722 generic.go:334] "Generic (PLEG): container finished" podID="062db6f1-77ab-4eca-be53-6480160aff81" containerID="95502c47c0a8b9b2366c23815d060e75fb191d8c423f6e5af0a0ad8f33d29e56" exitCode=0 Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.739891 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2mf4l" event={"ID":"062db6f1-77ab-4eca-be53-6480160aff81","Type":"ContainerDied","Data":"95502c47c0a8b9b2366c23815d060e75fb191d8c423f6e5af0a0ad8f33d29e56"} Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.751856 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-api/machine-api-operator-5694c8668f-xxz9x" event={"ID":"8a0be9cc-2cda-4b22-838b-0036cfa4405c","Type":"ContainerStarted","Data":"e4041c088a8cab2386ba7e54d1400d0adb6fcc5948d032f6ce194864732d54ad"} Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.751918 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-xxz9x" event={"ID":"8a0be9cc-2cda-4b22-838b-0036cfa4405c","Type":"ContainerStarted","Data":"d9d90696164d03684d073fd2c313e2dae0ee84e925cb65c6c2dbb3d986d126ec"} Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.764974 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kh6xb" event={"ID":"b12ae595-2119-49c9-9bfd-33eec4b6df65","Type":"ContainerStarted","Data":"62bafddfbc9da4a8ad309fa4ab44e38fc6b7f60492f7d2ca49ea9a83b14db6de"} Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.769700 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tz6gh\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh" Mar 09 14:06:23 crc kubenswrapper[4722]: E0309 14:06:23.770024 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 14:06:24.270012504 +0000 UTC m=+224.825581080 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tz6gh" (UID: "9d57536b-4f57-4098-b519-19fdc2559eda") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.776115 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fnmcv" event={"ID":"5822aa06-923f-4ba0-bdf6-617c5a5eb617","Type":"ContainerStarted","Data":"da3b6bbffdb84559c44a3447c9e3299b1c7e01d39efa1a9a7f3dfab8784d16cc"} Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.776498 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fnmcv" Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.778979 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-6b65x" event={"ID":"ac1bc217-a917-4e71-8672-bab0cdaa1bbf","Type":"ContainerStarted","Data":"a25d26640eab77938b280c2780d1a9ccee03bd38bee89653fed85490c7d8af98"} Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.779971 4722 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-fnmcv container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.780591 4722 
prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fnmcv" podUID="5822aa06-923f-4ba0-bdf6-617c5a5eb617" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.780095 4722 patch_prober.go:28] interesting pod/downloads-7954f5f757-knrzp container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.780971 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-knrzp" podUID="8f915ef9-5d9a-43ee-a333-def8766e083d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.780500 4722 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-mgzj7 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.781121 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-mgzj7" podUID="e0194ee7-d343-4042-9c2b-08cc513ee43e" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Mar 09 14:06:23 crc kubenswrapper[4722]: W0309 14:06:23.790500 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1fc8bc8_fa6c_4648_9bc3_491cec75d584.slice/crio-d34eb7c7463b50fe7667df18c390d0744eb328294e9408e52173e85d51e4aeb3 WatchSource:0}: Error finding container d34eb7c7463b50fe7667df18c390d0744eb328294e9408e52173e85d51e4aeb3: Status 404 returned error can't find the container with id d34eb7c7463b50fe7667df18c390d0744eb328294e9408e52173e85d51e4aeb3 Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.873389 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 14:06:23 crc kubenswrapper[4722]: E0309 14:06:23.873643 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 14:06:24.373614458 +0000 UTC m=+224.929183034 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.874266 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tz6gh\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh" Mar 09 14:06:23 crc kubenswrapper[4722]: E0309 14:06:23.876075 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 14:06:24.3760679 +0000 UTC m=+224.931636476 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tz6gh" (UID: "9d57536b-4f57-4098-b519-19fdc2559eda") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.975732 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 14:06:23 crc kubenswrapper[4722]: E0309 14:06:23.976110 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 14:06:24.476072498 +0000 UTC m=+225.031641074 (durationBeforeRetry 500ms). 
Mar 09 14:06:23 crc kubenswrapper[4722]: I0309 14:06:23.984319 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-8tkbn" podStartSLOduration=160.984266431 podStartE2EDuration="2m40.984266431s" podCreationTimestamp="2026-03-09 14:03:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:06:23.974656386 +0000 UTC m=+224.530224962" watchObservedRunningTime="2026-03-09 14:06:23.984266431 +0000 UTC m=+224.539835007"
Mar 09 14:06:24 crc kubenswrapper[4722]: I0309 14:06:24.004419 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-knrzp" podStartSLOduration=160.004402058 podStartE2EDuration="2m40.004402058s" podCreationTimestamp="2026-03-09 14:03:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:06:23.999372379 +0000 UTC m=+224.554940955" watchObservedRunningTime="2026-03-09 14:06:24.004402058 +0000 UTC m=+224.559970634"
Mar 09 14:06:24 crc kubenswrapper[4722]: I0309 14:06:24.015736 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-nk4z2"]
Mar 09 14:06:24 crc kubenswrapper[4722]: I0309 14:06:24.080448 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-mgzj7" podStartSLOduration=160.080431214 podStartE2EDuration="2m40.080431214s" podCreationTimestamp="2026-03-09 14:03:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:06:24.07691164 +0000 UTC m=+224.632480206" watchObservedRunningTime="2026-03-09 14:06:24.080431214 +0000 UTC m=+224.635999790"
Mar 09 14:06:24 crc kubenswrapper[4722]: I0309 14:06:24.081635 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tz6gh\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh"
Mar 09 14:06:24 crc kubenswrapper[4722]: E0309 14:06:24.083470 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 14:06:24.583455604 +0000 UTC m=+225.139024180 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tz6gh" (UID: "9d57536b-4f57-4098-b519-19fdc2559eda") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 14:06:24 crc kubenswrapper[4722]: I0309 14:06:24.097680 4722 ???:1] "http: TLS handshake error from 192.168.126.11:54574: no serving certificate available for the kubelet"
Mar 09 14:06:24 crc kubenswrapper[4722]: I0309 14:06:24.169915 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-6s4fg" podStartSLOduration=161.169895279 podStartE2EDuration="2m41.169895279s" podCreationTimestamp="2026-03-09 14:03:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:06:24.167586941 +0000 UTC m=+224.723155527" watchObservedRunningTime="2026-03-09 14:06:24.169895279 +0000 UTC m=+224.725463855"
Mar 09 14:06:24 crc kubenswrapper[4722]: I0309 14:06:24.193910 4722 ???:1] "http: TLS handshake error from 192.168.126.11:54576: no serving certificate available for the kubelet"
Mar 09 14:06:24 crc kubenswrapper[4722]: I0309 14:06:24.194852 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 14:06:24 crc kubenswrapper[4722]: E0309 14:06:24.195177 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 14:06:24.695155028 +0000 UTC m=+225.250723604 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 14:06:24 crc kubenswrapper[4722]: I0309 14:06:24.230944 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-68k6r" podStartSLOduration=160.23091649 podStartE2EDuration="2m40.23091649s" podCreationTimestamp="2026-03-09 14:03:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:06:24.218887642 +0000 UTC m=+224.774456218" watchObservedRunningTime="2026-03-09 14:06:24.23091649 +0000 UTC m=+224.786485066"
Mar 09 14:06:24 crc kubenswrapper[4722]: I0309 14:06:24.273361 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-6s4fg"
Mar 09 14:06:24 crc kubenswrapper[4722]: I0309 14:06:24.288677 4722 ???:1] "http: TLS handshake error from 192.168.126.11:54586: no serving certificate available for the kubelet"
Mar 09 14:06:24 crc kubenswrapper[4722]: I0309 14:06:24.292559 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-6db7b"]
Mar 09 14:06:24 crc kubenswrapper[4722]: I0309 14:06:24.299649 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tz6gh\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh"
Mar 09 14:06:24 crc kubenswrapper[4722]: E0309 14:06:24.300394 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 14:06:24.8003629 +0000 UTC m=+225.355931466 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tz6gh" (UID: "9d57536b-4f57-4098-b519-19fdc2559eda") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 14:06:24 crc kubenswrapper[4722]: I0309 14:06:24.309028 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-b5jn4"]
Mar 09 14:06:24 crc kubenswrapper[4722]: I0309 14:06:24.325766 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2xkrn"]
Mar 09 14:06:24 crc kubenswrapper[4722]: I0309 14:06:24.325815 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-tkhkf"]
Mar 09 14:06:24 crc kubenswrapper[4722]: I0309 14:06:24.361649 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-chrnr"
Mar 09 14:06:24 crc kubenswrapper[4722]: I0309 14:06:24.383853 4722 ???:1] "http: TLS handshake error from 192.168.126.11:54592: no serving certificate available for the kubelet"
Mar 09 14:06:24 crc kubenswrapper[4722]: I0309 14:06:24.404377 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 14:06:24 crc kubenswrapper[4722]: E0309 14:06:24.404562 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 14:06:24.904530821 +0000 UTC m=+225.460099547 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 14:06:24 crc kubenswrapper[4722]: I0309 14:06:24.404782 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tz6gh\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh"
Mar 09 14:06:24 crc kubenswrapper[4722]: E0309 14:06:24.405157 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 14:06:24.905150059 +0000 UTC m=+225.460718635 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tz6gh" (UID: "9d57536b-4f57-4098-b519-19fdc2559eda") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 14:06:24 crc kubenswrapper[4722]: W0309 14:06:24.486779 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26cb738b_5dfa_4b97_8153_790ec6eb198b.slice/crio-5e3649a3cc20f4dc7245c19bbb8ac689c42e2ec4fd7d7eb9b21e61ff88bc468a WatchSource:0}: Error finding container 5e3649a3cc20f4dc7245c19bbb8ac689c42e2ec4fd7d7eb9b21e61ff88bc468a: Status 404 returned error can't find the container with id 5e3649a3cc20f4dc7245c19bbb8ac689c42e2ec4fd7d7eb9b21e61ff88bc468a
Mar 09 14:06:24 crc kubenswrapper[4722]: I0309 14:06:24.491731 4722 ???:1] "http: TLS handshake error from 192.168.126.11:54608: no serving certificate available for the kubelet"
Mar 09 14:06:24 crc kubenswrapper[4722]: I0309 14:06:24.499318 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-sk7lp" podStartSLOduration=161.499300262 podStartE2EDuration="2m41.499300262s" podCreationTimestamp="2026-03-09 14:03:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:06:24.498095167 +0000 UTC m=+225.053663743" watchObservedRunningTime="2026-03-09 14:06:24.499300262 +0000 UTC m=+225.054868838"
Mar 09 14:06:24 crc kubenswrapper[4722]: I0309 14:06:24.510996 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 14:06:24 crc kubenswrapper[4722]: E0309 14:06:24.511422 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 14:06:25.011374491 +0000 UTC m=+225.566943067 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 14:06:24 crc kubenswrapper[4722]: I0309 14:06:24.531160 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-chrnr" podStartSLOduration=161.531143527 podStartE2EDuration="2m41.531143527s" podCreationTimestamp="2026-03-09 14:03:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:06:24.529088667 +0000 UTC m=+225.084657243" watchObservedRunningTime="2026-03-09 14:06:24.531143527 +0000 UTC m=+225.086712103"
Mar 09 14:06:24 crc kubenswrapper[4722]: I0309 14:06:24.611595 4722 ???:1] "http: TLS handshake error from 192.168.126.11:54620: no serving certificate available for the kubelet"
Mar 09 14:06:24 crc kubenswrapper[4722]: I0309 14:06:24.618586 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tz6gh\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh"
Mar 09 14:06:24 crc kubenswrapper[4722]: E0309 14:06:24.618948 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 14:06:25.118935552 +0000 UTC m=+225.674504128 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tz6gh" (UID: "9d57536b-4f57-4098-b519-19fdc2559eda") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 14:06:24 crc kubenswrapper[4722]: W0309 14:06:24.633511 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39cc5cc7_fd80_4461_b4b1_adece1093703.slice/crio-45c7e1c98e09f93ea28458dfdc4ea948011f77cd60918432838fd23ed50fad3d WatchSource:0}: Error finding container 45c7e1c98e09f93ea28458dfdc4ea948011f77cd60918432838fd23ed50fad3d: Status 404 returned error can't find the container with id 45c7e1c98e09f93ea28458dfdc4ea948011f77cd60918432838fd23ed50fad3d
Mar 09 14:06:24 crc kubenswrapper[4722]: W0309 14:06:24.656943 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc613bd5_c976_458c_aeda_d62a304f4103.slice/crio-a631bda765d25876305a42771b5db259c318abcbc989f05799a46521ffcd6bf2 WatchSource:0}: Error finding container a631bda765d25876305a42771b5db259c318abcbc989f05799a46521ffcd6bf2: Status 404 returned error can't find the container with id a631bda765d25876305a42771b5db259c318abcbc989f05799a46521ffcd6bf2
Mar 09 14:06:24 crc kubenswrapper[4722]: I0309 14:06:24.717240 4722 ???:1] "http: TLS handshake error from 192.168.126.11:54636: no serving certificate available for the kubelet"
Mar 09 14:06:24 crc kubenswrapper[4722]: I0309 14:06:24.719997 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 14:06:24 crc kubenswrapper[4722]: E0309 14:06:24.720247 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 14:06:25.220229938 +0000 UTC m=+225.775798514 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 14:06:24 crc kubenswrapper[4722]: W0309 14:06:24.780515 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9fb5d9ef_bb16_45d0_a7c0_a9fb43edeb34.slice/crio-e28b39429728d0d46741084f745414d83aa3ee77bcfb0b8c1d923cc75468f517 WatchSource:0}: Error finding container e28b39429728d0d46741084f745414d83aa3ee77bcfb0b8c1d923cc75468f517: Status 404 returned error can't find the container with id e28b39429728d0d46741084f745414d83aa3ee77bcfb0b8c1d923cc75468f517
Mar 09 14:06:24 crc kubenswrapper[4722]: I0309 14:06:24.837110 4722 ???:1] "http: TLS handshake error from 192.168.126.11:54638: no serving certificate available for the kubelet"
Mar 09 14:06:24 crc kubenswrapper[4722]: I0309 14:06:24.840096 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tz6gh\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh"
Mar 09 14:06:24 crc kubenswrapper[4722]: E0309 14:06:24.840582 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 14:06:25.340566928 +0000 UTC m=+225.896135504 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tz6gh" (UID: "9d57536b-4f57-4098-b519-19fdc2559eda") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 14:06:24 crc kubenswrapper[4722]: I0309 14:06:24.886702 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lnkwt"]
Mar 09 14:06:24 crc kubenswrapper[4722]: I0309 14:06:24.929411 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5j6rh" event={"ID":"704d2c33-a0ad-4dc1-a5e6-37aab398f9a0","Type":"ContainerStarted","Data":"ff541e8df421c6bf2b7d5956910733920ad72fd57603d09c40fa45e860dddebe"}
Mar 09 14:06:24 crc kubenswrapper[4722]: I0309 14:06:24.941983 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 14:06:24 crc kubenswrapper[4722]: E0309 14:06:24.942357 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 14:06:25.442341708 +0000 UTC m=+225.997910284 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 14:06:25 crc kubenswrapper[4722]: I0309 14:06:25.045067 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7jl6s" podStartSLOduration=162.045049795 podStartE2EDuration="2m42.045049795s" podCreationTimestamp="2026-03-09 14:03:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:06:25.044351685 +0000 UTC m=+225.599920261" watchObservedRunningTime="2026-03-09 14:06:25.045049795 +0000 UTC m=+225.600618371"
Mar 09 14:06:25 crc kubenswrapper[4722]: E0309 14:06:25.045492 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 14:06:25.545472669 +0000 UTC m=+226.101041245 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tz6gh" (UID: "9d57536b-4f57-4098-b519-19fdc2559eda") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 14:06:25 crc kubenswrapper[4722]: I0309 14:06:25.045091 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tz6gh\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh"
Mar 09 14:06:25 crc kubenswrapper[4722]: I0309 14:06:25.046994 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-b5jn4" event={"ID":"39cc5cc7-fd80-4461-b4b1-adece1093703","Type":"ContainerStarted","Data":"45c7e1c98e09f93ea28458dfdc4ea948011f77cd60918432838fd23ed50fad3d"}
Mar 09 14:06:25 crc kubenswrapper[4722]: I0309 14:06:25.148508 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lh92k" event={"ID":"f1fc8bc8-fa6c-4648-9bc3-491cec75d584","Type":"ContainerStarted","Data":"d34eb7c7463b50fe7667df18c390d0744eb328294e9408e52173e85d51e4aeb3"}
Mar 09 14:06:25 crc kubenswrapper[4722]: I0309 14:06:25.151835 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 14:06:25 crc kubenswrapper[4722]: E0309 14:06:25.152391 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 14:06:25.65237412 +0000 UTC m=+226.207942696 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 14:06:25 crc kubenswrapper[4722]: I0309 14:06:25.171300 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-nvtgb"]
Mar 09 14:06:25 crc kubenswrapper[4722]: I0309 14:06:25.171712 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nk4z2" event={"ID":"abf2dbee-a467-4858-8d5a-d4d1bf1bb430","Type":"ContainerStarted","Data":"0eb183c194de5c8edb8de25aab77bde731d659e5e044a8120feaf0c770a5df0f"}
Mar 09 14:06:25 crc kubenswrapper[4722]: I0309 14:06:25.230014 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-6b65x" event={"ID":"ac1bc217-a917-4e71-8672-bab0cdaa1bbf","Type":"ContainerStarted","Data":"d69f3ac13dcd70b18da47b2ead66ab5f35d3352532f08678c39dbd2908fb7ed0"}
Mar 09 14:06:25 crc kubenswrapper[4722]: I0309 14:06:25.238787 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-dp8wn" event={"ID":"317444ee-0620-47d2-869e-77578a367a87","Type":"ContainerStarted","Data":"feb3a1769030d5dcca5220cfc04e993138a3ad1753946d5a015998220255bb8c"}
Mar 09 14:06:25 crc kubenswrapper[4722]: I0309 14:06:25.248835 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551086-cvc28"]
Mar 09 14:06:25 crc kubenswrapper[4722]: I0309 14:06:25.251041 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2xkrn" event={"ID":"9fb5d9ef-bb16-45d0-a7c0-a9fb43edeb34","Type":"ContainerStarted","Data":"e28b39429728d0d46741084f745414d83aa3ee77bcfb0b8c1d923cc75468f517"}
Mar 09 14:06:25 crc kubenswrapper[4722]: I0309 14:06:25.260841 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tz6gh\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh"
Mar 09 14:06:25 crc kubenswrapper[4722]: E0309 14:06:25.263392 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 14:06:25.763366983 +0000 UTC m=+226.318935559 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tz6gh" (UID: "9d57536b-4f57-4098-b519-19fdc2559eda") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 14:06:25 crc kubenswrapper[4722]: I0309 14:06:25.269586 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7q7bl" event={"ID":"3ba8e5ae-906d-47bd-862d-8e66e0defb9f","Type":"ContainerStarted","Data":"27d0f1fdee93bf8b2e1068d12ecf1a4f90a20d0e53488376ca27428462306647"}
Mar 09 14:06:25 crc kubenswrapper[4722]: I0309 14:06:25.299148 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9vnnw" event={"ID":"6635182d-e9ef-4294-8ff3-ae305c10feb4","Type":"ContainerStarted","Data":"3663dfcea25be7a0aef6f300cbe9f78ede7dc7f6593db2c33fa90378d612a5ad"}
Mar 09 14:06:25 crc kubenswrapper[4722]: I0309 14:06:25.315646 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-hng8f" event={"ID":"6d4c4768-195d-42fe-86a7-139c4ec0c86d","Type":"ContainerStarted","Data":"ed8dc23ed82bfa9cb4aa665dd612849039cc7ff87b481ae49b2fc597aaccd3b2"}
Mar 09 14:06:25 crc kubenswrapper[4722]: I0309 14:06:25.318266 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-nnjvd" event={"ID":"3a9659ac-0c7d-41be-becc-5ec038244f00","Type":"ContainerStarted","Data":"9851d23744be70c8a9cc4f74a859f1651a00feb695c0e968de5f46f6f8049509"}
Mar 09 14:06:25 crc kubenswrapper[4722]: I0309 14:06:25.321166 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-pjvfp" event={"ID":"26cb738b-5dfa-4b97-8153-790ec6eb198b","Type":"ContainerStarted","Data":"5e3649a3cc20f4dc7245c19bbb8ac689c42e2ec4fd7d7eb9b21e61ff88bc468a"}
Mar 09 14:06:25 crc kubenswrapper[4722]: I0309 14:06:25.353755 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-sfch8" event={"ID":"8a76c9b5-c226-4d93-8d7a-8e56210b572a","Type":"ContainerStarted","Data":"6fee2f304e49495845c7a3f368cf91dcffba93fe423b49acd454c70f8fd5cb71"}
Mar 09 14:06:25 crc kubenswrapper[4722]: I0309 14:06:25.361146 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-6db7b" event={"ID":"cc613bd5-c976-458c-aeda-d62a304f4103","Type":"ContainerStarted","Data":"a631bda765d25876305a42771b5db259c318abcbc989f05799a46521ffcd6bf2"}
Mar 09 14:06:25 crc kubenswrapper[4722]: I0309 14:06:25.368537 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 14:06:25 crc kubenswrapper[4722]: E0309 14:06:25.369449 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 14:06:25.86941216 +0000 UTC m=+226.424980736 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 14:06:25 crc kubenswrapper[4722]: I0309 14:06:25.369829 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tz6gh\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh"
Mar 09 14:06:25 crc kubenswrapper[4722]: E0309 14:06:25.372064 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 14:06:25.872033188 +0000 UTC m=+226.427601764 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tz6gh" (UID: "9d57536b-4f57-4098-b519-19fdc2559eda") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 14:06:25 crc kubenswrapper[4722]: I0309 14:06:25.407715 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tkhkf" event={"ID":"3d05c111-6b35-44c3-b587-12f470d584c3","Type":"ContainerStarted","Data":"cd98a54626e3f4794151edd056c60ec9ae17c010342020a3ec4c7725d79e1d74"}
Mar 09 14:06:25 crc kubenswrapper[4722]: I0309 14:06:25.433377 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-f8gtl"]
Mar 09 14:06:25 crc kubenswrapper[4722]: I0309 14:06:25.446414 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-b5hps"]
Mar 09 14:06:25 crc kubenswrapper[4722]: I0309 14:06:25.450534 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fnmcv"
Mar 09 14:06:25 crc kubenswrapper[4722]: I0309 14:06:25.451125 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-mgzj7"
Mar 09 14:06:25 crc kubenswrapper[4722]: I0309 14:06:25.469140 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wz9f2"]
Mar 09 14:06:25 crc kubenswrapper[4722]: I0309 14:06:25.510622 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-jmwpv"
Mar 09 14:06:25 crc kubenswrapper[4722]: I0309 14:06:25.522393 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 14:06:25 crc kubenswrapper[4722]: E0309 14:06:25.535861 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 14:06:26.035837668 +0000 UTC m=+226.591406244 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 14:06:25 crc kubenswrapper[4722]: I0309 14:06:25.542056 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-79q9f" podStartSLOduration=162.542017231 podStartE2EDuration="2m42.542017231s" podCreationTimestamp="2026-03-09 14:03:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:06:25.338938716 +0000 UTC m=+225.894507292" watchObservedRunningTime="2026-03-09 14:06:25.542017231 +0000 UTC m=+226.097585807"
Mar 09 14:06:25 crc kubenswrapper[4722]: I0309 14:06:25.561298 4722 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 09 14:06:25 crc kubenswrapper[4722]: I0309 14:06:25.562142 4722 ???:1] "http: TLS handshake error from 192.168.126.11:54650: no serving certificate available for the kubelet"
Mar 09 14:06:25 crc kubenswrapper[4722]: I0309 14:06:25.564739 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551080-mdzx2"]
Mar 09 14:06:25 crc kubenswrapper[4722]: I0309 14:06:25.598429 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-jmwpv" podStartSLOduration=162.598402414 podStartE2EDuration="2m42.598402414s" podCreationTimestamp="2026-03-09 14:03:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:06:25.375594263 +0000 UTC m=+225.931162839" watchObservedRunningTime="2026-03-09 14:06:25.598402414 +0000 UTC m=+226.153970980"
Mar 09 14:06:25 crc kubenswrapper[4722]: I0309 14:06:25.628542 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dgxnd"]
Mar 09 14:06:25 crc kubenswrapper[4722]: I0309 14:06:25.651074 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fp2th"]
Mar 09 14:06:25 crc kubenswrapper[4722]: I0309 14:06:25.665420 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tz6gh\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh"
Mar 09 14:06:25 crc kubenswrapper[4722]: E0309 14:06:25.667087 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 14:06:26.167072881 +0000 UTC m=+226.722641457 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tz6gh" (UID: "9d57536b-4f57-4098-b519-19fdc2559eda") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 14:06:25 crc kubenswrapper[4722]: I0309 14:06:25.700163 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fnmcv" podStartSLOduration=161.700129872 podStartE2EDuration="2m41.700129872s" podCreationTimestamp="2026-03-09 14:03:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:06:25.554017198 +0000 UTC m=+226.109585774" watchObservedRunningTime="2026-03-09 14:06:25.700129872 +0000 UTC m=+226.255698448"
Mar 09 14:06:25 crc kubenswrapper[4722]: I0309 14:06:25.704824 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-7q7bl" podStartSLOduration=161.704806341 podStartE2EDuration="2m41.704806341s" podCreationTimestamp="2026-03-09 14:03:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:06:25.573772004 +0000 UTC m=+226.129340580" watchObservedRunningTime="2026-03-09 14:06:25.704806341 +0000 UTC m=+226.260374917"
Mar 09 14:06:25 crc kubenswrapper[4722]: I0309 14:06:25.710793 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-hdrrh"]
Mar 09 14:06:25 crc kubenswrapper[4722]: I0309 14:06:25.712711 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-sfch8" podStartSLOduration=161.712690715 podStartE2EDuration="2m41.712690715s" podCreationTimestamp="2026-03-09 14:03:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:06:25.597638262 +0000 UTC m=+226.153206848" watchObservedRunningTime="2026-03-09 14:06:25.712690715 +0000 UTC m=+226.268259291"
Mar 09 14:06:25 crc kubenswrapper[4722]: I0309 14:06:25.734682 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9vnnw" podStartSLOduration=161.734661057 podStartE2EDuration="2m41.734661057s" podCreationTimestamp="2026-03-09 14:03:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:06:25.710451928 +0000 UTC m=+226.266020504" watchObservedRunningTime="2026-03-09 14:06:25.734661057 +0000 UTC m=+226.290229633"
Mar 09 14:06:25 crc kubenswrapper[4722]: I0309
14:06:25.768842 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 14:06:25 crc kubenswrapper[4722]: E0309 14:06:25.769249 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 14:06:26.269226172 +0000 UTC m=+226.824794748 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 14:06:25 crc kubenswrapper[4722]: I0309 14:06:25.816217 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-6b65x" podStartSLOduration=161.816173026 podStartE2EDuration="2m41.816173026s" podCreationTimestamp="2026-03-09 14:03:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:06:25.758217166 +0000 UTC m=+226.313785752" watchObservedRunningTime="2026-03-09 14:06:25.816173026 +0000 UTC m=+226.371741602" Mar 09 14:06:25 crc kubenswrapper[4722]: I0309 14:06:25.880166 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tz6gh\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh" Mar 09 14:06:25 crc kubenswrapper[4722]: E0309 14:06:25.880588 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 14:06:26.380573927 +0000 UTC m=+226.936142503 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tz6gh" (UID: "9d57536b-4f57-4098-b519-19fdc2559eda") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 14:06:25 crc kubenswrapper[4722]: I0309 14:06:25.943933 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-xxz9x" podStartSLOduration=161.943911766 podStartE2EDuration="2m41.943911766s" podCreationTimestamp="2026-03-09 14:03:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:06:25.917695898 +0000 UTC m=+226.473264474" watchObservedRunningTime="2026-03-09 14:06:25.943911766 +0000 UTC m=+226.499480342" Mar 09 14:06:25 crc kubenswrapper[4722]: I0309 14:06:25.983022 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 14:06:25 crc kubenswrapper[4722]: E0309 14:06:25.983299 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 14:06:26.483278174 +0000 UTC m=+227.038846750 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 14:06:25 crc kubenswrapper[4722]: I0309 14:06:25.983817 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tz6gh\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh" Mar 09 14:06:25 crc kubenswrapper[4722]: E0309 14:06:25.984195 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 14:06:26.484186271 +0000 UTC m=+227.039754847 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tz6gh" (UID: "9d57536b-4f57-4098-b519-19fdc2559eda") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 14:06:26 crc kubenswrapper[4722]: I0309 14:06:26.084996 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 14:06:26 crc kubenswrapper[4722]: E0309 14:06:26.087307 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 14:06:26.58726605 +0000 UTC m=+227.142834626 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 14:06:26 crc kubenswrapper[4722]: I0309 14:06:26.095783 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tz6gh\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh" Mar 09 14:06:26 crc kubenswrapper[4722]: E0309 14:06:26.096381 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 14:06:26.596365269 +0000 UTC m=+227.151933845 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tz6gh" (UID: "9d57536b-4f57-4098-b519-19fdc2559eda") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 14:06:26 crc kubenswrapper[4722]: I0309 14:06:26.107132 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mrtb2"] Mar 09 14:06:26 crc kubenswrapper[4722]: I0309 14:06:26.198011 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 14:06:26 crc kubenswrapper[4722]: E0309 14:06:26.198809 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 14:06:26.698782458 +0000 UTC m=+227.254351034 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 14:06:26 crc kubenswrapper[4722]: I0309 14:06:26.300052 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tz6gh\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh" Mar 09 14:06:26 crc kubenswrapper[4722]: E0309 14:06:26.300782 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 14:06:26.800761184 +0000 UTC m=+227.356329760 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tz6gh" (UID: "9d57536b-4f57-4098-b519-19fdc2559eda") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 14:06:26 crc kubenswrapper[4722]: I0309 14:06:26.410706 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 14:06:26 crc kubenswrapper[4722]: E0309 14:06:26.411608 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 14:06:26.911589802 +0000 UTC m=+227.467158378 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 14:06:26 crc kubenswrapper[4722]: I0309 14:06:26.437619 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-sfch8" event={"ID":"8a76c9b5-c226-4d93-8d7a-8e56210b572a","Type":"ContainerStarted","Data":"ff7a4d0a96e5e1f0cc0fdcc6585ec856f476a50b47ef4023cefba4393ef415b6"} Mar 09 14:06:26 crc kubenswrapper[4722]: I0309 14:06:26.451887 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dgxnd" event={"ID":"c219beb3-4ba5-43bd-b2ec-3855d19c2b57","Type":"ContainerStarted","Data":"d7d01c41a2826bd8010f913ec88d645989069c60c57d8e2e4b34fb7845ee973d"} Mar 09 14:06:26 crc kubenswrapper[4722]: I0309 14:06:26.458301 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-gh244" event={"ID":"4084fbb0-8fae-4b7e-a3f6-ec9d723bb367","Type":"ContainerStarted","Data":"3c66c65216c1240a306cc36b5c881fc5bea46a8ab7b465f5d351efcf5edeec7b"} Mar 09 14:06:26 crc kubenswrapper[4722]: I0309 14:06:26.461118 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lnkwt" event={"ID":"f8055a95-6b09-4e32-88b8-82ad36ca5029","Type":"ContainerStarted","Data":"e115d9b67e844376836c406237f41ac68dd2136f63714faef2f5b270864750b7"} Mar 09 14:06:26 crc kubenswrapper[4722]: I0309 14:06:26.494913 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-wz9f2" event={"ID":"e2641b0e-aae4-49df-931f-95e38505812f","Type":"ContainerStarted","Data":"34fca4843b9f5fee1507b016600c3bebb74b2540b7e1e14623371c52a8a5cf23"} Mar 09 14:06:26 crc kubenswrapper[4722]: I0309 14:06:26.504394 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-hdrrh" 
event={"ID":"ba0aa566-e854-473c-b6b6-2f9dfece6133","Type":"ContainerStarted","Data":"affb63e2c57e482c0acff82b7c3b97630f003ac032e27e57c03928f744b7d854"} Mar 09 14:06:26 crc kubenswrapper[4722]: I0309 14:06:26.515819 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tz6gh\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh" Mar 09 14:06:26 crc kubenswrapper[4722]: E0309 14:06:26.516120 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 14:06:27.016109254 +0000 UTC m=+227.571677830 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tz6gh" (UID: "9d57536b-4f57-4098-b519-19fdc2559eda") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 14:06:26 crc kubenswrapper[4722]: I0309 14:06:26.549174 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-6db7b" event={"ID":"cc613bd5-c976-458c-aeda-d62a304f4103","Type":"ContainerStarted","Data":"013e3cad31eae04fbf3d6634cf8c94ef31b175f34a940d3ae4d098ed39d5239f"} Mar 09 14:06:26 crc kubenswrapper[4722]: I0309 14:06:26.614518 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-6db7b" podStartSLOduration=162.614496553 podStartE2EDuration="2m42.614496553s" podCreationTimestamp="2026-03-09 14:03:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:06:26.613862584 +0000 UTC m=+227.169431160" watchObservedRunningTime="2026-03-09 14:06:26.614496553 +0000 UTC m=+227.170065129" Mar 09 14:06:26 crc kubenswrapper[4722]: I0309 14:06:26.618350 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 14:06:26 crc kubenswrapper[4722]: E0309 14:06:26.621962 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 14:06:27.121925593 +0000 UTC m=+227.677494169 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 14:06:26 crc kubenswrapper[4722]: I0309 14:06:26.666384 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tkhkf" event={"ID":"3d05c111-6b35-44c3-b587-12f470d584c3","Type":"ContainerStarted","Data":"d85f2d337aa66a055073e7fc7b697c4ec148ad4d7e928d5d1b1afa97dc2a1eca"} Mar 09 14:06:26 crc kubenswrapper[4722]: I0309 14:06:26.682573 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mrtb2" event={"ID":"228d39d8-b0bc-4491-be90-e473c090f412","Type":"ContainerStarted","Data":"b16d1beb93f1c200e53a2cac634b914fdf218558335da70dc2addd2859b26931"} Mar 09 14:06:26 crc kubenswrapper[4722]: I0309 14:06:26.703163 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fp2th" event={"ID":"996087ed-6480-4650-8632-c991e5d16c99","Type":"ContainerStarted","Data":"cd72a08c4855314e60d45fb4498ef6991bbf5a415c2c2a5eb0b078f4dac78d1a"} Mar 09 14:06:26 crc kubenswrapper[4722]: I0309 14:06:26.722370 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tz6gh\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh" Mar 09 14:06:26 crc kubenswrapper[4722]: E0309 14:06:26.722767 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 14:06:27.222754024 +0000 UTC m=+227.778322600 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tz6gh" (UID: "9d57536b-4f57-4098-b519-19fdc2559eda") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 14:06:26 crc kubenswrapper[4722]: I0309 14:06:26.748077 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kh6xb" event={"ID":"b12ae595-2119-49c9-9bfd-33eec4b6df65","Type":"ContainerStarted","Data":"b8eaafcc97a85014a6c70a6fff0a3969d0b29364b0cb0635e9928d508adf7b09"} Mar 09 14:06:26 crc kubenswrapper[4722]: I0309 14:06:26.825075 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 14:06:26 crc kubenswrapper[4722]: E0309 14:06:26.826215 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 14:06:27.326176713 +0000 UTC m=+227.881745289 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 14:06:26 crc kubenswrapper[4722]: I0309 14:06:26.851076 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5j6rh" event={"ID":"704d2c33-a0ad-4dc1-a5e6-37aab398f9a0","Type":"ContainerStarted","Data":"817c424663feb46c516bb1358cb2fd98f668093f75e211734e303bc21154a431"} Mar 09 14:06:26 crc kubenswrapper[4722]: I0309 14:06:26.868876 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551086-cvc28" event={"ID":"40be416c-1b7b-4973-b9ed-25ae20cd660d","Type":"ContainerStarted","Data":"00396f1583d3454ca2cb8f8ea0f6a14f1cba7e698f5924eb845c2864ed34ef09"} Mar 09 14:06:26 crc kubenswrapper[4722]: I0309 14:06:26.887412 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551080-mdzx2" event={"ID":"7e1fdc12-aac5-4f72-9b22-0212c2f3988e","Type":"ContainerStarted","Data":"0a8fd4eed82d6739ca1061292610cac0298147fe092602018bac3f6624c31bea"} Mar 09 14:06:26 crc kubenswrapper[4722]: I0309 14:06:26.920420 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-kh6xb" podStartSLOduration=162.92039373 podStartE2EDuration="2m42.92039373s" podCreationTimestamp="2026-03-09 14:03:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:06:26.77483196 +0000 UTC m=+227.330400536" watchObservedRunningTime="2026-03-09 14:06:26.92039373 +0000 UTC m=+227.475962326" Mar 09 14:06:26 crc kubenswrapper[4722]: I0309 14:06:26.920806 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-5j6rh" podStartSLOduration=162.920799951 podStartE2EDuration="2m42.920799951s" podCreationTimestamp="2026-03-09 14:03:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:06:26.912600178 +0000 UTC m=+227.468168754" watchObservedRunningTime="2026-03-09 14:06:26.920799951 +0000 UTC m=+227.476368527" Mar 09 14:06:26 crc kubenswrapper[4722]: I0309 14:06:26.930165 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-dp8wn" event={"ID":"317444ee-0620-47d2-869e-77578a367a87","Type":"ContainerStarted","Data":"10bc0de264d7ed06b3f68942e6e3c249950032a18e0566405054ad9fe8eaaee6"} Mar 09 14:06:26 crc kubenswrapper[4722]: I0309 14:06:26.930188 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tz6gh\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh" Mar 09 14:06:26 crc kubenswrapper[4722]: E0309 14:06:26.930430 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 14:06:27.430419897 +0000 UTC m=+227.985988473 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tz6gh" (UID: "9d57536b-4f57-4098-b519-19fdc2559eda") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 14:06:26 crc kubenswrapper[4722]: I0309 14:06:26.936755 4722 ???:1] "http: TLS handshake error from 192.168.126.11:54666: no serving certificate available for the kubelet" Mar 09 14:06:26 crc kubenswrapper[4722]: I0309 14:06:26.979829 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-dp8wn" podStartSLOduration=162.979812092 podStartE2EDuration="2m42.979812092s" podCreationTimestamp="2026-03-09 14:03:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:06:26.968767234 +0000 UTC m=+227.524335820" watchObservedRunningTime="2026-03-09 14:06:26.979812092 +0000 UTC m=+227.535380668" Mar 09 14:06:27 crc kubenswrapper[4722]: I0309 14:06:26.996130 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-nnjvd" event={"ID":"3a9659ac-0c7d-41be-becc-5ec038244f00","Type":"ContainerStarted","Data":"0641db3ceeca5e8c33a889e92e3c1e0b4a181f8256e8372a1bbe468b764a32a7"} Mar 09 14:06:27 crc kubenswrapper[4722]: I0309 14:06:27.003576 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-b5hps" event={"ID":"240d1325-4400-475e-8bc7-9915294148d8","Type":"ContainerStarted","Data":"027e721b54699dc58607549f50bcc09fdd643c2a4f47dc46cfde9dad77d8f25a"} Mar 09 14:06:27 crc kubenswrapper[4722]: I0309 14:06:27.030997 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 14:06:27 crc kubenswrapper[4722]: E0309 14:06:27.031452 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 14:06:27.531400433 +0000 UTC m=+228.086969019 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 14:06:27 crc kubenswrapper[4722]: I0309 14:06:27.032307 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-nvtgb" event={"ID":"3cd02fb0-bac4-47b0-9846-a94399041f77","Type":"ContainerStarted","Data":"750c8ba88956a25392e58769f5c90dc8bc49251da13dcac4cad458731786845d"} Mar 09 14:06:27 crc kubenswrapper[4722]: I0309 14:06:27.048088 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tz6gh\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh" Mar 09 14:06:27 crc kubenswrapper[4722]: E0309 14:06:27.050260 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 14:06:27.550247142 +0000 UTC m=+228.105815718 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tz6gh" (UID: "9d57536b-4f57-4098-b519-19fdc2559eda") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 14:06:27 crc kubenswrapper[4722]: I0309 14:06:27.088421 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-nvtgb" podStartSLOduration=163.088392954 podStartE2EDuration="2m43.088392954s" podCreationTimestamp="2026-03-09 14:03:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:06:27.086974412 +0000 UTC m=+227.642542988" watchObservedRunningTime="2026-03-09 14:06:27.088392954 +0000 UTC m=+227.643961530" Mar 09 14:06:27 crc kubenswrapper[4722]: I0309 14:06:27.093011 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-hng8f" event={"ID":"6d4c4768-195d-42fe-86a7-139c4ec0c86d","Type":"ContainerStarted","Data":"88e13f920f2ae416ba476babfeb590357e9cc534463bac640f6d6973ba2aeb7d"} Mar 09 14:06:27 crc kubenswrapper[4722]: I0309 14:06:27.103501 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8jf57" event={"ID":"30f3c24f-2b49-4343-8c94-f6d56b43a35d","Type":"ContainerStarted","Data":"f3638dbc49c230d4be5390379d44f273c52da73f466e3817e5c4947b73642ec5"} Mar 09 14:06:27 crc kubenswrapper[4722]: I0309 14:06:27.116982 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-f8gtl" 
event={"ID":"53e973bb-4b49-4815-b8b3-a6cd76e210bf","Type":"ContainerStarted","Data":"f1550825049488102c79c9c43e46001eed332b1af2da68febea6d340a8e6b803"} Mar 09 14:06:27 crc kubenswrapper[4722]: I0309 14:06:27.136492 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8jf57" podStartSLOduration=163.13647122 podStartE2EDuration="2m43.13647122s" podCreationTimestamp="2026-03-09 14:03:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:06:27.135866153 +0000 UTC m=+227.691434729" watchObservedRunningTime="2026-03-09 14:06:27.13647122 +0000 UTC m=+227.692039796" Mar 09 14:06:27 crc kubenswrapper[4722]: I0309 14:06:27.137155 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lh92k" event={"ID":"f1fc8bc8-fa6c-4648-9bc3-491cec75d584","Type":"ContainerStarted","Data":"3c5928408217a41209ae0f46ea6e71801fac52f79802bf65c8403b865e77cc4d"} Mar 09 14:06:27 crc kubenswrapper[4722]: E0309 14:06:27.208610 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 14:06:27.70858846 +0000 UTC m=+228.264157036 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 14:06:27 crc kubenswrapper[4722]: I0309 14:06:27.208651 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 14:06:27 crc kubenswrapper[4722]: I0309 14:06:27.217669 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tz6gh\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh" Mar 09 14:06:27 crc kubenswrapper[4722]: E0309 14:06:27.218094 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 14:06:27.718082142 +0000 UTC m=+228.273650718 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tz6gh" (UID: "9d57536b-4f57-4098-b519-19fdc2559eda") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 14:06:27 crc kubenswrapper[4722]: I0309 14:06:27.248067 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-dp8wn" Mar 09 14:06:27 crc kubenswrapper[4722]: I0309 14:06:27.252283 4722 patch_prober.go:28] interesting pod/router-default-5444994796-dp8wn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 14:06:27 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld Mar 09 14:06:27 crc kubenswrapper[4722]: [+]process-running ok Mar 09 14:06:27 crc kubenswrapper[4722]: healthz check failed Mar 09 14:06:27 crc kubenswrapper[4722]: I0309 14:06:27.252364 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dp8wn" podUID="317444ee-0620-47d2-869e-77578a367a87" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 14:06:27 crc kubenswrapper[4722]: I0309 14:06:27.318505 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 14:06:27 crc kubenswrapper[4722]: E0309 14:06:27.320095 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 14:06:27.820076858 +0000 UTC m=+228.375645434 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 14:06:27 crc kubenswrapper[4722]: I0309 14:06:27.387994 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-mgzj7"] Mar 09 14:06:27 crc kubenswrapper[4722]: I0309 14:06:27.422626 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tz6gh\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh" Mar 09 14:06:27 crc kubenswrapper[4722]: E0309 14:06:27.422968 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 14:06:27.922955251 +0000 UTC m=+228.478523817 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tz6gh" (UID: "9d57536b-4f57-4098-b519-19fdc2559eda") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 14:06:27 crc kubenswrapper[4722]: I0309 14:06:27.451258 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-fnmcv"] Mar 09 14:06:27 crc kubenswrapper[4722]: I0309 14:06:27.525130 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 14:06:27 crc kubenswrapper[4722]: E0309 14:06:27.525528 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 14:06:28.025498253 +0000 UTC m=+228.581066819 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 14:06:27 crc kubenswrapper[4722]: I0309 14:06:27.525584 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tz6gh\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh" Mar 09 14:06:27 crc kubenswrapper[4722]: E0309 14:06:27.525909 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 14:06:28.025895375 +0000 UTC m=+228.581463951 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tz6gh" (UID: "9d57536b-4f57-4098-b519-19fdc2559eda") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 14:06:27 crc kubenswrapper[4722]: I0309 14:06:27.627248 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 14:06:27 crc kubenswrapper[4722]: E0309 14:06:27.627417 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 14:06:28.127393577 +0000 UTC m=+228.682962153 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 14:06:27 crc kubenswrapper[4722]: I0309 14:06:27.627630 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tz6gh\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh" Mar 09 14:06:27 crc kubenswrapper[4722]: E0309 14:06:27.627954 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 14:06:28.127945243 +0000 UTC m=+228.683513819 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tz6gh" (UID: "9d57536b-4f57-4098-b519-19fdc2559eda") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 14:06:27 crc kubenswrapper[4722]: I0309 14:06:27.728745 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 14:06:27 crc kubenswrapper[4722]: E0309 14:06:27.728970 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 14:06:28.22894134 +0000 UTC m=+228.784509916 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 14:06:27 crc kubenswrapper[4722]: I0309 14:06:27.729196 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tz6gh\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh"
Mar 09 14:06:27 crc kubenswrapper[4722]: E0309 14:06:27.729503 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 14:06:28.229490925 +0000 UTC m=+228.785059501 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tz6gh" (UID: "9d57536b-4f57-4098-b519-19fdc2559eda") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 14:06:27 crc kubenswrapper[4722]: I0309 14:06:27.830835 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 14:06:27 crc kubenswrapper[4722]: E0309 14:06:27.831043 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 14:06:28.331010838 +0000 UTC m=+228.886579414 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 14:06:27 crc kubenswrapper[4722]: I0309 14:06:27.932331 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tz6gh\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh"
Mar 09 14:06:27 crc kubenswrapper[4722]: E0309 14:06:27.933140 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 14:06:28.432893351 +0000 UTC m=+228.988461927 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tz6gh" (UID: "9d57536b-4f57-4098-b519-19fdc2559eda") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 14:06:28 crc kubenswrapper[4722]: I0309 14:06:28.033158 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 14:06:28 crc kubenswrapper[4722]: E0309 14:06:28.033596 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 14:06:28.533578109 +0000 UTC m=+229.089146685 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 14:06:28 crc kubenswrapper[4722]: I0309 14:06:28.134448 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tz6gh\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh"
Mar 09 14:06:28 crc kubenswrapper[4722]: E0309 14:06:28.134772 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 14:06:28.634760291 +0000 UTC m=+229.190328867 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tz6gh" (UID: "9d57536b-4f57-4098-b519-19fdc2559eda") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 14:06:28 crc kubenswrapper[4722]: I0309 14:06:28.158615 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-pjvfp" event={"ID":"26cb738b-5dfa-4b97-8153-790ec6eb198b","Type":"ContainerStarted","Data":"5985c5fd7cb7fe98a458c3bf89b3c9fd1628238bb0a53ba3d0a545158bf6bb3a"}
Mar 09 14:06:28 crc kubenswrapper[4722]: I0309 14:06:28.163192 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2xkrn" event={"ID":"9fb5d9ef-bb16-45d0-a7c0-a9fb43edeb34","Type":"ContainerStarted","Data":"32054b6935ae5a5e4ebdce00bb1a7555a32bc440d3418b667b6555abb8296300"}
Mar 09 14:06:28 crc kubenswrapper[4722]: I0309 14:06:28.164981 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-nnjvd" event={"ID":"3a9659ac-0c7d-41be-becc-5ec038244f00","Type":"ContainerStarted","Data":"b5f199c2c71f6028a1467477efbb8b6ad363e9c8d7f88d595ae876f66f65663e"}
Mar 09 14:06:28 crc kubenswrapper[4722]: I0309 14:06:28.165420 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-nnjvd"
Mar 09 14:06:28 crc kubenswrapper[4722]: I0309 14:06:28.174419 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2mf4l" event={"ID":"062db6f1-77ab-4eca-be53-6480160aff81","Type":"ContainerStarted","Data":"5100f4f38b16f8030981ded6aacc8110167300d6988e51f2cd47357197b5b6c5"}
Mar 09 14:06:28 crc kubenswrapper[4722]: I0309 14:06:28.179081 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-pjvfp" podStartSLOduration=8.179069096 podStartE2EDuration="8.179069096s" podCreationTimestamp="2026-03-09 14:06:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:06:28.178615802 +0000 UTC m=+228.734184378" watchObservedRunningTime="2026-03-09 14:06:28.179069096 +0000 UTC m=+228.734637672"
Mar 09 14:06:28 crc kubenswrapper[4722]: I0309 14:06:28.182717 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-nvtgb" event={"ID":"3cd02fb0-bac4-47b0-9846-a94399041f77","Type":"ContainerStarted","Data":"b5b40028c278fa09e05b2540d7ea5fe920e0efe5daf6de47f61e65159ea19b86"}
Mar 09 14:06:28 crc kubenswrapper[4722]: I0309 14:06:28.188584 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dgxnd" event={"ID":"c219beb3-4ba5-43bd-b2ec-3855d19c2b57","Type":"ContainerStarted","Data":"b664453e53fd2ade7f0e7440ee555365a0657f568b5c0929fcced9082eb6ae61"}
Mar 09 14:06:28 crc kubenswrapper[4722]: I0309 14:06:28.188867 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dgxnd"
Mar 09 14:06:28 crc kubenswrapper[4722]: I0309 14:06:28.189927 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lnkwt" event={"ID":"f8055a95-6b09-4e32-88b8-82ad36ca5029","Type":"ContainerStarted","Data":"d708c56ace6a0794be5e9b44e9d196661f8745ae35cf30b479af812d413dbd2a"}
Mar 09 14:06:28 crc kubenswrapper[4722]: I0309 14:06:28.190557 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lnkwt"
Mar 09 14:06:28 crc kubenswrapper[4722]: I0309 14:06:28.190639 4722 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-dgxnd container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.42:8443/healthz\": dial tcp 10.217.0.42:8443: connect: connection refused" start-of-body=
Mar 09 14:06:28 crc kubenswrapper[4722]: I0309 14:06:28.190665 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dgxnd" podUID="c219beb3-4ba5-43bd-b2ec-3855d19c2b57" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.42:8443/healthz\": dial tcp 10.217.0.42:8443: connect: connection refused"
Mar 09 14:06:28 crc kubenswrapper[4722]: I0309 14:06:28.195342 4722 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-lnkwt container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" start-of-body=
Mar 09 14:06:28 crc kubenswrapper[4722]: I0309 14:06:28.195400 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lnkwt" podUID="f8055a95-6b09-4e32-88b8-82ad36ca5029" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused"
Mar 09 14:06:28 crc kubenswrapper[4722]: I0309 14:06:28.199877 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-8jf57" event={"ID":"30f3c24f-2b49-4343-8c94-f6d56b43a35d","Type":"ContainerStarted","Data":"caa4e342a96e040febf9ef827e53507b5c13e74dda106c9bbda1f0d1aa18b71e"}
Mar 09 14:06:28 crc kubenswrapper[4722]: I0309 14:06:28.211073 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-wz9f2" event={"ID":"e2641b0e-aae4-49df-931f-95e38505812f","Type":"ContainerStarted","Data":"64d164d2bc8fe4385d91a3b84212f0498b57af3e2a5c68489f924b444efff3c6"}
Mar 09 14:06:28 crc kubenswrapper[4722]: I0309 14:06:28.211378 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-wz9f2"
Mar 09 14:06:28 crc kubenswrapper[4722]: I0309 14:06:28.215044 4722 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-wz9f2 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.34:8080/healthz\": dial tcp 10.217.0.34:8080: connect: connection refused" start-of-body=
Mar 09 14:06:28 crc kubenswrapper[4722]: I0309 14:06:28.215088 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-wz9f2" podUID="e2641b0e-aae4-49df-931f-95e38505812f" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.34:8080/healthz\": dial tcp 10.217.0.34:8080: connect: connection refused"
Mar 09 14:06:28 crc kubenswrapper[4722]: I0309 14:06:28.216387 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2xkrn" podStartSLOduration=164.216373032 podStartE2EDuration="2m44.216373032s" podCreationTimestamp="2026-03-09 14:03:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:06:28.215336851 +0000 UTC m=+228.770905427" watchObservedRunningTime="2026-03-09 14:06:28.216373032 +0000 UTC m=+228.771941608"
Mar 09 14:06:28 crc kubenswrapper[4722]: I0309 14:06:28.227451 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tkhkf" event={"ID":"3d05c111-6b35-44c3-b587-12f470d584c3","Type":"ContainerStarted","Data":"b6667142d5c62a671e31bd006c20c8a9f4b1bf5d60885ff9ca5be87ad6331c48"}
Mar 09 14:06:28 crc kubenswrapper[4722]: I0309 14:06:28.231107 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-hdrrh" event={"ID":"ba0aa566-e854-473c-b6b6-2f9dfece6133","Type":"ContainerStarted","Data":"aacfc3efb4714c02118c8b822fb8c58283124bd29b195c629f337fcb30ae10e2"}
Mar 09 14:06:28 crc kubenswrapper[4722]: I0309 14:06:28.233083 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lh92k" event={"ID":"f1fc8bc8-fa6c-4648-9bc3-491cec75d584","Type":"ContainerStarted","Data":"f6ad7014044113a6859548c3f0ad06fad45acdc05debea34eff82e857a0539c2"}
Mar 09 14:06:28 crc kubenswrapper[4722]: I0309 14:06:28.234636 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mrtb2" event={"ID":"228d39d8-b0bc-4491-be90-e473c090f412","Type":"ContainerStarted","Data":"778991893ca7b914cdde2f5428e80ab6cfce3570917a24a4ccaaaff3928b14db"}
Mar 09 14:06:28 crc kubenswrapper[4722]: I0309 14:06:28.234654 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mrtb2" event={"ID":"228d39d8-b0bc-4491-be90-e473c090f412","Type":"ContainerStarted","Data":"c70e13eeb8017abd97cc4079107698175690f3a771f0bc49558361372e6a4464"}
Mar 09 14:06:28 crc kubenswrapper[4722]: I0309 14:06:28.234949 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mrtb2"
Mar 09 14:06:28 crc kubenswrapper[4722]: I0309 14:06:28.235071 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 14:06:28 crc kubenswrapper[4722]: E0309 14:06:28.235306 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 14:06:28.735283034 +0000 UTC m=+229.290851610 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 14:06:28 crc kubenswrapper[4722]: I0309 14:06:28.235398 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tz6gh\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh"
Mar 09 14:06:28 crc kubenswrapper[4722]: E0309 14:06:28.235837 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 14:06:28.735831189 +0000 UTC m=+229.291399755 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tz6gh" (UID: "9d57536b-4f57-4098-b519-19fdc2559eda") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 14:06:28 crc kubenswrapper[4722]: I0309 14:06:28.253960 4722 patch_prober.go:28] interesting pod/router-default-5444994796-dp8wn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 09 14:06:28 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld
Mar 09 14:06:28 crc kubenswrapper[4722]: [+]process-running ok
Mar 09 14:06:28 crc kubenswrapper[4722]: healthz check failed
Mar 09 14:06:28 crc kubenswrapper[4722]: I0309 14:06:28.254019 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dp8wn" podUID="317444ee-0620-47d2-869e-77578a367a87" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 09 14:06:28 crc kubenswrapper[4722]: I0309 14:06:28.257719 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fp2th" event={"ID":"996087ed-6480-4650-8632-c991e5d16c99","Type":"ContainerStarted","Data":"4b273112c5687a28adabe3b469c671154db487b3bd47cef0ffa59c7a855124c1"}
Mar 09 14:06:28 crc kubenswrapper[4722]: I0309 14:06:28.257752 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fp2th"
Mar 09 14:06:28 crc kubenswrapper[4722]: I0309 14:06:28.259019 4722 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-fp2th container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.44:5443/healthz\": dial tcp 10.217.0.44:5443: connect: connection refused" start-of-body=
Mar 09 14:06:28 crc kubenswrapper[4722]: I0309 14:06:28.259080 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fp2th" podUID="996087ed-6480-4650-8632-c991e5d16c99" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.44:5443/healthz\": dial tcp 10.217.0.44:5443: connect: connection refused"
Mar 09 14:06:28 crc kubenswrapper[4722]: I0309 14:06:28.271135 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-gh244" event={"ID":"4084fbb0-8fae-4b7e-a3f6-ec9d723bb367","Type":"ContainerStarted","Data":"68bd2833a2f6b5b995fcf53a1686a8a4cd2138deddda53728f8776d1e820dfd2"}
Mar 09 14:06:28 crc kubenswrapper[4722]: I0309 14:06:28.276143 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2mf4l" podStartSLOduration=164.276128775 podStartE2EDuration="2m44.276128775s" podCreationTimestamp="2026-03-09 14:03:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:06:28.273897839 +0000 UTC m=+228.829466405" watchObservedRunningTime="2026-03-09 14:06:28.276128775 +0000 UTC m=+228.831697351"
Mar 09 14:06:28 crc kubenswrapper[4722]: I0309 14:06:28.282054 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nk4z2" event={"ID":"abf2dbee-a467-4858-8d5a-d4d1bf1bb430","Type":"ContainerStarted","Data":"5b21ab5d287237e7a74daad6d0dc149f04e946b17275802cedb25e9bd1d40bc1"}
Mar 09 14:06:28 crc kubenswrapper[4722]: I0309 14:06:28.282103 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nk4z2" event={"ID":"abf2dbee-a467-4858-8d5a-d4d1bf1bb430","Type":"ContainerStarted","Data":"3034d557a0a84ec8319f62c04d50646409e95fc1c60b7841f684f2075ce1b33f"}
Mar 09 14:06:28 crc kubenswrapper[4722]: I0309 14:06:28.295907 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-f8gtl" event={"ID":"53e973bb-4b49-4815-b8b3-a6cd76e210bf","Type":"ContainerStarted","Data":"a245110b55e1a1a4bea2dd820346c8cb0a7a5954e672c4af1b7b9aca09fa4099"}
Mar 09 14:06:28 crc kubenswrapper[4722]: I0309 14:06:28.295967 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-f8gtl" event={"ID":"53e973bb-4b49-4815-b8b3-a6cd76e210bf","Type":"ContainerStarted","Data":"3d8d4fc99d9d7f382258640cb933788d738919891b9b43d0d40a71d38d1214a8"}
Mar 09 14:06:28 crc kubenswrapper[4722]: I0309 14:06:28.304775 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-nnjvd" podStartSLOduration=8.304754574 podStartE2EDuration="8.304754574s" podCreationTimestamp="2026-03-09 14:06:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:06:28.298616092 +0000 UTC m=+228.854184668" watchObservedRunningTime="2026-03-09 14:06:28.304754574 +0000 UTC m=+228.860323150"
Mar 09 14:06:28 crc kubenswrapper[4722]: I0309 14:06:28.310352 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551080-mdzx2" event={"ID":"7e1fdc12-aac5-4f72-9b22-0212c2f3988e","Type":"ContainerStarted","Data":"dc947cec8c36a2871a868e6175dd1a21b31edd8e69d4e4e07cedf44eaf8a8a73"}
Mar 09 14:06:28 crc kubenswrapper[4722]: I0309 14:06:28.332330 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-hng8f" event={"ID":"6d4c4768-195d-42fe-86a7-139c4ec0c86d","Type":"ContainerStarted","Data":"4b433ebc6f79f9bac6ff4e573f78d76a8893dfb20feaa5c66d35a4bb9d253b3e"}
Mar 09 14:06:28 crc kubenswrapper[4722]: I0309 14:06:28.336133 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 14:06:28 crc kubenswrapper[4722]: E0309 14:06:28.338454 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 14:06:28.838422823 +0000 UTC m=+229.393991389 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 14:06:28 crc kubenswrapper[4722]: I0309 14:06:28.340317 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-mgzj7" podUID="e0194ee7-d343-4042-9c2b-08cc513ee43e" containerName="controller-manager" containerID="cri-o://ef68012ce113c672326b92933b4c29a09a16cd7b6d5871cbfa1695f8f042b938" gracePeriod=30
Mar 09 14:06:28 crc kubenswrapper[4722]: I0309 14:06:28.340832 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-b5jn4" event={"ID":"39cc5cc7-fd80-4461-b4b1-adece1093703","Type":"ContainerStarted","Data":"e4a0701959979bfcf6182efa1a567bea22235bde73043b1d0ba7bd89e0fa4ede"}
Mar 09 14:06:28 crc kubenswrapper[4722]: I0309 14:06:28.341783 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fnmcv" podUID="5822aa06-923f-4ba0-bdf6-617c5a5eb617" containerName="route-controller-manager" containerID="cri-o://da3b6bbffdb84559c44a3447c9e3299b1c7e01d39efa1a9a7f3dfab8784d16cc" gracePeriod=30
Mar 09 14:06:28 crc kubenswrapper[4722]: I0309 14:06:28.347564 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-f8gtl" podStartSLOduration=164.347544554 podStartE2EDuration="2m44.347544554s" podCreationTimestamp="2026-03-09 14:03:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:06:28.34641148 +0000 UTC m=+228.901980056" watchObservedRunningTime="2026-03-09 14:06:28.347544554 +0000 UTC m=+228.903113130"
Mar 09 14:06:28 crc kubenswrapper[4722]: I0309 14:06:28.399007 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-wz9f2" podStartSLOduration=164.398988371 podStartE2EDuration="2m44.398988371s" podCreationTimestamp="2026-03-09 14:03:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:06:28.396179338 +0000 UTC m=+228.951747914" watchObservedRunningTime="2026-03-09 14:06:28.398988371 +0000 UTC m=+228.954556947"
Mar 09 14:06:28 crc kubenswrapper[4722]: I0309 14:06:28.439686 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tz6gh\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh"
Mar 09 14:06:28 crc kubenswrapper[4722]: E0309 14:06:28.451682 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 14:06:28.951665543 +0000 UTC m=+229.507234119 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tz6gh" (UID: "9d57536b-4f57-4098-b519-19fdc2559eda") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 14:06:28 crc kubenswrapper[4722]: I0309 14:06:28.490404 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lh92k" podStartSLOduration=164.490390223 podStartE2EDuration="2m44.490390223s" podCreationTimestamp="2026-03-09 14:03:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:06:28.487820676 +0000 UTC m=+229.043389252" watchObservedRunningTime="2026-03-09 14:06:28.490390223 +0000 UTC m=+229.045958799"
Mar 09 14:06:28 crc kubenswrapper[4722]: I0309 14:06:28.524399 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mrtb2" podStartSLOduration=164.524382721 podStartE2EDuration="2m44.524382721s" podCreationTimestamp="2026-03-09 14:03:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:06:28.522894387 +0000 UTC m=+229.078462963" watchObservedRunningTime="2026-03-09 14:06:28.524382721 +0000 UTC m=+229.079951297"
Mar 09 14:06:28 crc kubenswrapper[4722]: I0309 14:06:28.541836 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 14:06:28 crc kubenswrapper[4722]: E0309 14:06:28.542420 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 14:06:29.042400086 +0000 UTC m=+229.597968662 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 14:06:28 crc kubenswrapper[4722]: I0309 14:06:28.575428 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-nk4z2" podStartSLOduration=164.575407916 podStartE2EDuration="2m44.575407916s" podCreationTimestamp="2026-03-09 14:03:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:06:28.574624462 +0000 UTC m=+229.130193038" watchObservedRunningTime="2026-03-09 14:06:28.575407916 +0000 UTC m=+229.130976492"
Mar 09 14:06:28 crc kubenswrapper[4722]: I0309 14:06:28.629042 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dgxnd" podStartSLOduration=164.629017916 podStartE2EDuration="2m44.629017916s" podCreationTimestamp="2026-03-09 14:03:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:06:28.626628065 +0000 UTC m=+229.182196661" watchObservedRunningTime="2026-03-09 14:06:28.629017916 +0000 UTC m=+229.184586492"
Mar 09 14:06:28 crc kubenswrapper[4722]: I0309 14:06:28.643596 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tz6gh\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh"
Mar 09 14:06:28 crc kubenswrapper[4722]: E0309 14:06:28.644186 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 14:06:29.144171176 +0000 UTC m=+229.699739752 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tz6gh" (UID: "9d57536b-4f57-4098-b519-19fdc2559eda") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 14:06:28 crc kubenswrapper[4722]: I0309 14:06:28.719803 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-hdrrh" podStartSLOduration=8.719779519 podStartE2EDuration="8.719779519s" podCreationTimestamp="2026-03-09 14:06:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:06:28.680084051 +0000 UTC m=+229.235652627" watchObservedRunningTime="2026-03-09 14:06:28.719779519 +0000 UTC m=+229.275348105"
Mar 09 14:06:28 crc kubenswrapper[4722]: I0309 14:06:28.745107 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 14:06:28 crc kubenswrapper[4722]: E0309 14:06:28.745355 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 14:06:29.245321327 +0000 UTC m=+229.800889903 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 14:06:28 crc kubenswrapper[4722]: I0309 14:06:28.745524 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tz6gh\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh"
Mar 09 14:06:28 crc kubenswrapper[4722]: E0309 14:06:28.745995 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 14:06:29.245980046 +0000 UTC m=+229.801548622 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tz6gh" (UID: "9d57536b-4f57-4098-b519-19fdc2559eda") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 14:06:28 crc kubenswrapper[4722]: I0309 14:06:28.754885 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tkhkf" podStartSLOduration=164.75486218 podStartE2EDuration="2m44.75486218s" podCreationTimestamp="2026-03-09 14:03:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:06:28.750998785 +0000 UTC m=+229.306567361" watchObservedRunningTime="2026-03-09 14:06:28.75486218 +0000 UTC m=+229.310430756"
Mar 09 14:06:28 crc kubenswrapper[4722]: I0309 14:06:28.755593 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-gh244" podStartSLOduration=165.755588081 podStartE2EDuration="2m45.755588081s" podCreationTimestamp="2026-03-09 14:03:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:06:28.725746216 +0000 UTC m=+229.281314802" watchObservedRunningTime="2026-03-09 14:06:28.755588081 +0000 UTC m=+229.311156657"
Mar 09 14:06:28 crc kubenswrapper[4722]: I0309 14:06:28.806661 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lnkwt" podStartSLOduration=164.806642086 podStartE2EDuration="2m44.806642086s" podCreationTimestamp="2026-03-09 14:03:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:06:28.806572794 +0000 UTC m=+229.362141380" watchObservedRunningTime="2026-03-09 14:06:28.806642086 +0000 UTC m=+229.362210662"
Mar 09 14:06:28 crc kubenswrapper[4722]: I0309 14:06:28.807908 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fp2th" podStartSLOduration=164.807900364 podStartE2EDuration="2m44.807900364s" podCreationTimestamp="2026-03-09 14:03:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:06:28.785699765 +0000 UTC m=+229.341268341" watchObservedRunningTime="2026-03-09 14:06:28.807900364 +0000 UTC m=+229.363468940"
Mar 09 14:06:28 crc kubenswrapper[4722]: I0309 14:06:28.837367 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29551080-mdzx2" podStartSLOduration=165.837347828 podStartE2EDuration="2m45.837347828s" podCreationTimestamp="2026-03-09 14:03:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:06:28.835631267 +0000 UTC m=+229.391199853" watchObservedRunningTime="2026-03-09 14:06:28.837347828 +0000 UTC m=+229.392916404"
Mar 09 14:06:28 crc kubenswrapper[4722]: I0309 14:06:28.849586 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 14:06:28 crc kubenswrapper[4722]: E0309 14:06:28.849788 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 14:06:29.349759966 +0000 UTC m=+229.905328542 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 14:06:28 crc kubenswrapper[4722]: I0309 14:06:28.849860 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tz6gh\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh"
Mar 09 14:06:28 crc kubenswrapper[4722]: E0309 14:06:28.850308 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 14:06:29.350293131 +0000 UTC m=+229.905861707 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tz6gh" (UID: "9d57536b-4f57-4098-b519-19fdc2559eda") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 14:06:28 crc kubenswrapper[4722]: I0309 14:06:28.888711 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-b5jn4" podStartSLOduration=164.88869381 podStartE2EDuration="2m44.88869381s" podCreationTimestamp="2026-03-09 14:03:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:06:28.887842596 +0000 UTC m=+229.443411172" watchObservedRunningTime="2026-03-09 14:06:28.88869381 +0000 UTC m=+229.444262386"
Mar 09 14:06:28 crc kubenswrapper[4722]: I0309 14:06:28.951284 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 14:06:28 crc kubenswrapper[4722]: E0309 14:06:28.951552 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 14:06:29.451535625 +0000 UTC m=+230.007104191 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 14:06:29 crc kubenswrapper[4722]: I0309 14:06:29.054609 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tz6gh\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh"
Mar 09 14:06:29 crc kubenswrapper[4722]: E0309 14:06:29.054952 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 14:06:29.554939623 +0000 UTC m=+230.110508199 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tz6gh" (UID: "9d57536b-4f57-4098-b519-19fdc2559eda") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 14:06:29 crc kubenswrapper[4722]: I0309 14:06:29.143276 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fnmcv"
Mar 09 14:06:29 crc kubenswrapper[4722]: I0309 14:06:29.164067 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 14:06:29 crc kubenswrapper[4722]: E0309 14:06:29.164300 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 14:06:29.664266087 +0000 UTC m=+230.219834653 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 14:06:29 crc kubenswrapper[4722]: I0309 14:06:29.164451 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tz6gh\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh"
Mar 09 14:06:29 crc kubenswrapper[4722]: E0309 14:06:29.164821 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 14:06:29.664805954 +0000 UTC m=+230.220374560 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tz6gh" (UID: "9d57536b-4f57-4098-b519-19fdc2559eda") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 14:06:29 crc kubenswrapper[4722]: I0309 14:06:29.177403 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-hng8f" podStartSLOduration=165.177384767 podStartE2EDuration="2m45.177384767s" podCreationTimestamp="2026-03-09 14:03:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:06:28.915462956 +0000 UTC m=+229.471031532" watchObservedRunningTime="2026-03-09 14:06:29.177384767 +0000 UTC m=+229.732953343"
Mar 09 14:06:29 crc kubenswrapper[4722]: I0309 14:06:29.212352 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-mgzj7"
Mar 09 14:06:29 crc kubenswrapper[4722]: I0309 14:06:29.256445 4722 patch_prober.go:28] interesting pod/router-default-5444994796-dp8wn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 09 14:06:29 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld
Mar 09 14:06:29 crc kubenswrapper[4722]: [+]process-running ok
Mar 09 14:06:29 crc kubenswrapper[4722]: healthz check failed
Mar 09 14:06:29 crc kubenswrapper[4722]: I0309 14:06:29.256500 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dp8wn" podUID="317444ee-0620-47d2-869e-77578a367a87" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 09 14:06:29 crc kubenswrapper[4722]: I0309 14:06:29.265706 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ww47w\" (UniqueName: \"kubernetes.io/projected/5822aa06-923f-4ba0-bdf6-617c5a5eb617-kube-api-access-ww47w\") pod \"5822aa06-923f-4ba0-bdf6-617c5a5eb617\" (UID: \"5822aa06-923f-4ba0-bdf6-617c5a5eb617\") "
Mar 09 14:06:29 crc kubenswrapper[4722]: I0309 14:06:29.265757 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5822aa06-923f-4ba0-bdf6-617c5a5eb617-config\") pod \"5822aa06-923f-4ba0-bdf6-617c5a5eb617\" (UID: \"5822aa06-923f-4ba0-bdf6-617c5a5eb617\") "
Mar 09 14:06:29 crc kubenswrapper[4722]: I0309 14:06:29.266008 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 14:06:29 crc kubenswrapper[4722]: I0309 14:06:29.266056 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5822aa06-923f-4ba0-bdf6-617c5a5eb617-serving-cert\") pod \"5822aa06-923f-4ba0-bdf6-617c5a5eb617\" (UID: \"5822aa06-923f-4ba0-bdf6-617c5a5eb617\") "
Mar 09 14:06:29 crc kubenswrapper[4722]: I0309 14:06:29.266087 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5822aa06-923f-4ba0-bdf6-617c5a5eb617-client-ca\") pod \"5822aa06-923f-4ba0-bdf6-617c5a5eb617\" (UID: \"5822aa06-923f-4ba0-bdf6-617c5a5eb617\") "
Mar 09 14:06:29 crc kubenswrapper[4722]: I0309 14:06:29.267077 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5822aa06-923f-4ba0-bdf6-617c5a5eb617-client-ca" (OuterVolumeSpecName: "client-ca") pod "5822aa06-923f-4ba0-bdf6-617c5a5eb617" (UID: "5822aa06-923f-4ba0-bdf6-617c5a5eb617"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 14:06:29 crc kubenswrapper[4722]: I0309 14:06:29.267114 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5822aa06-923f-4ba0-bdf6-617c5a5eb617-config" (OuterVolumeSpecName: "config") pod "5822aa06-923f-4ba0-bdf6-617c5a5eb617" (UID: "5822aa06-923f-4ba0-bdf6-617c5a5eb617"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 14:06:29 crc kubenswrapper[4722]: E0309 14:06:29.267160 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 14:06:29.7671463 +0000 UTC m=+230.322714876 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 14:06:29 crc kubenswrapper[4722]: I0309 14:06:29.274713 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5822aa06-923f-4ba0-bdf6-617c5a5eb617-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5822aa06-923f-4ba0-bdf6-617c5a5eb617" (UID: "5822aa06-923f-4ba0-bdf6-617c5a5eb617"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:06:29 crc kubenswrapper[4722]: I0309 14:06:29.274951 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5822aa06-923f-4ba0-bdf6-617c5a5eb617-kube-api-access-ww47w" (OuterVolumeSpecName: "kube-api-access-ww47w") pod "5822aa06-923f-4ba0-bdf6-617c5a5eb617" (UID: "5822aa06-923f-4ba0-bdf6-617c5a5eb617"). InnerVolumeSpecName "kube-api-access-ww47w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:06:29 crc kubenswrapper[4722]: I0309 14:06:29.360966 4722 generic.go:334] "Generic (PLEG): container finished" podID="5822aa06-923f-4ba0-bdf6-617c5a5eb617" containerID="da3b6bbffdb84559c44a3447c9e3299b1c7e01d39efa1a9a7f3dfab8784d16cc" exitCode=0
Mar 09 14:06:29 crc kubenswrapper[4722]: I0309 14:06:29.361419 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fnmcv"
Mar 09 14:06:29 crc kubenswrapper[4722]: I0309 14:06:29.362309 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fnmcv" event={"ID":"5822aa06-923f-4ba0-bdf6-617c5a5eb617","Type":"ContainerDied","Data":"da3b6bbffdb84559c44a3447c9e3299b1c7e01d39efa1a9a7f3dfab8784d16cc"}
Mar 09 14:06:29 crc kubenswrapper[4722]: I0309 14:06:29.362356 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fnmcv" event={"ID":"5822aa06-923f-4ba0-bdf6-617c5a5eb617","Type":"ContainerDied","Data":"e54670148e87c29178fb21b9b9f047c9a9313d6381ba2a040bff6585c615d2ee"}
Mar 09 14:06:29 crc kubenswrapper[4722]: I0309 14:06:29.362378 4722 scope.go:117] "RemoveContainer" containerID="da3b6bbffdb84559c44a3447c9e3299b1c7e01d39efa1a9a7f3dfab8784d16cc"
Mar 09 14:06:29 crc kubenswrapper[4722]: I0309 14:06:29.366596 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e0194ee7-d343-4042-9c2b-08cc513ee43e-proxy-ca-bundles\") pod \"e0194ee7-d343-4042-9c2b-08cc513ee43e\" (UID: \"e0194ee7-d343-4042-9c2b-08cc513ee43e\") "
Mar 09 14:06:29 crc kubenswrapper[4722]: I0309 14:06:29.366668 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0194ee7-d343-4042-9c2b-08cc513ee43e-serving-cert\") pod \"e0194ee7-d343-4042-9c2b-08cc513ee43e\" (UID: \"e0194ee7-d343-4042-9c2b-08cc513ee43e\") "
Mar 09 14:06:29 crc kubenswrapper[4722]: I0309 14:06:29.366695 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e0194ee7-d343-4042-9c2b-08cc513ee43e-client-ca\") pod \"e0194ee7-d343-4042-9c2b-08cc513ee43e\" (UID: \"e0194ee7-d343-4042-9c2b-08cc513ee43e\") "
Mar 09 14:06:29 crc kubenswrapper[4722]: I0309 14:06:29.366757 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0194ee7-d343-4042-9c2b-08cc513ee43e-config\") pod \"e0194ee7-d343-4042-9c2b-08cc513ee43e\" (UID: \"e0194ee7-d343-4042-9c2b-08cc513ee43e\") "
Mar 09 14:06:29 crc kubenswrapper[4722]: I0309 14:06:29.366793 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p88m8\" (UniqueName: \"kubernetes.io/projected/e0194ee7-d343-4042-9c2b-08cc513ee43e-kube-api-access-p88m8\") pod \"e0194ee7-d343-4042-9c2b-08cc513ee43e\" (UID: \"e0194ee7-d343-4042-9c2b-08cc513ee43e\") "
Mar 09 14:06:29 crc kubenswrapper[4722]: I0309 14:06:29.367123 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tz6gh\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh"
Mar 09 14:06:29 crc kubenswrapper[4722]: I0309 14:06:29.367254 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5822aa06-923f-4ba0-bdf6-617c5a5eb617-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 09 14:06:29 crc kubenswrapper[4722]: I0309 14:06:29.367275 4722 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5822aa06-923f-4ba0-bdf6-617c5a5eb617-client-ca\") on node \"crc\" DevicePath \"\""
Mar 09 14:06:29 crc kubenswrapper[4722]: I0309 14:06:29.367288 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ww47w\" (UniqueName: \"kubernetes.io/projected/5822aa06-923f-4ba0-bdf6-617c5a5eb617-kube-api-access-ww47w\") on node \"crc\" DevicePath \"\""
Mar 09 14:06:29 crc kubenswrapper[4722]: I0309 14:06:29.367300 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5822aa06-923f-4ba0-bdf6-617c5a5eb617-config\") on node \"crc\" DevicePath \"\""
Mar 09 14:06:29 crc kubenswrapper[4722]: E0309 14:06:29.367590 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 14:06:29.867576079 +0000 UTC m=+230.423144655 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tz6gh" (UID: "9d57536b-4f57-4098-b519-19fdc2559eda") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 14:06:29 crc kubenswrapper[4722]: I0309 14:06:29.368605 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0194ee7-d343-4042-9c2b-08cc513ee43e-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "e0194ee7-d343-4042-9c2b-08cc513ee43e" (UID: "e0194ee7-d343-4042-9c2b-08cc513ee43e"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 14:06:29 crc kubenswrapper[4722]: I0309 14:06:29.369447 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0194ee7-d343-4042-9c2b-08cc513ee43e-config" (OuterVolumeSpecName: "config") pod "e0194ee7-d343-4042-9c2b-08cc513ee43e" (UID: "e0194ee7-d343-4042-9c2b-08cc513ee43e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 14:06:29 crc kubenswrapper[4722]: I0309 14:06:29.369729 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0194ee7-d343-4042-9c2b-08cc513ee43e-client-ca" (OuterVolumeSpecName: "client-ca") pod "e0194ee7-d343-4042-9c2b-08cc513ee43e" (UID: "e0194ee7-d343-4042-9c2b-08cc513ee43e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 14:06:29 crc kubenswrapper[4722]: I0309 14:06:29.381215 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-b5hps" event={"ID":"240d1325-4400-475e-8bc7-9915294148d8","Type":"ContainerStarted","Data":"61ae2766b812ce1452c02a5d02c6f21b1ec4689409491d18191d80fbd36bbb12"}
Mar 09 14:06:29 crc kubenswrapper[4722]: I0309 14:06:29.382522 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0194ee7-d343-4042-9c2b-08cc513ee43e-kube-api-access-p88m8" (OuterVolumeSpecName: "kube-api-access-p88m8") pod "e0194ee7-d343-4042-9c2b-08cc513ee43e" (UID: "e0194ee7-d343-4042-9c2b-08cc513ee43e"). InnerVolumeSpecName "kube-api-access-p88m8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:06:29 crc kubenswrapper[4722]: I0309 14:06:29.382669 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0194ee7-d343-4042-9c2b-08cc513ee43e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e0194ee7-d343-4042-9c2b-08cc513ee43e" (UID: "e0194ee7-d343-4042-9c2b-08cc513ee43e"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:06:29 crc kubenswrapper[4722]: I0309 14:06:29.395987 4722 generic.go:334] "Generic (PLEG): container finished" podID="e0194ee7-d343-4042-9c2b-08cc513ee43e" containerID="ef68012ce113c672326b92933b4c29a09a16cd7b6d5871cbfa1695f8f042b938" exitCode=0
Mar 09 14:06:29 crc kubenswrapper[4722]: I0309 14:06:29.396638 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-mgzj7"
Mar 09 14:06:29 crc kubenswrapper[4722]: I0309 14:06:29.406894 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-mgzj7" event={"ID":"e0194ee7-d343-4042-9c2b-08cc513ee43e","Type":"ContainerDied","Data":"ef68012ce113c672326b92933b4c29a09a16cd7b6d5871cbfa1695f8f042b938"}
Mar 09 14:06:29 crc kubenswrapper[4722]: I0309 14:06:29.406968 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-fnmcv"]
Mar 09 14:06:29 crc kubenswrapper[4722]: I0309 14:06:29.406996 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-mgzj7" event={"ID":"e0194ee7-d343-4042-9c2b-08cc513ee43e","Type":"ContainerDied","Data":"1862afc05713de13623553d44ee6f258ae1ef1c3df42e081c0aace0ccc31617a"}
Mar 09 14:06:29 crc kubenswrapper[4722]: I0309 14:06:29.429490 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-fnmcv"]
Mar 09 14:06:29 crc kubenswrapper[4722]: I0309 14:06:29.433598 4722 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-wz9f2 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.34:8080/healthz\": dial tcp 10.217.0.34:8080: connect: connection refused" start-of-body=
Mar 09 14:06:29 crc kubenswrapper[4722]: I0309 14:06:29.433691 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-wz9f2" podUID="e2641b0e-aae4-49df-931f-95e38505812f" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.34:8080/healthz\": dial tcp 10.217.0.34:8080: connect: connection refused"
Mar 09 14:06:29 crc kubenswrapper[4722]: I0309 14:06:29.437183 4722 scope.go:117] "RemoveContainer" containerID="da3b6bbffdb84559c44a3447c9e3299b1c7e01d39efa1a9a7f3dfab8784d16cc"
Mar 09 14:06:29 crc kubenswrapper[4722]: E0309 14:06:29.441237 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da3b6bbffdb84559c44a3447c9e3299b1c7e01d39efa1a9a7f3dfab8784d16cc\": container with ID starting with da3b6bbffdb84559c44a3447c9e3299b1c7e01d39efa1a9a7f3dfab8784d16cc not found: ID does not exist" containerID="da3b6bbffdb84559c44a3447c9e3299b1c7e01d39efa1a9a7f3dfab8784d16cc"
Mar 09 14:06:29 crc kubenswrapper[4722]: I0309 14:06:29.441292 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da3b6bbffdb84559c44a3447c9e3299b1c7e01d39efa1a9a7f3dfab8784d16cc"} err="failed to get container status \"da3b6bbffdb84559c44a3447c9e3299b1c7e01d39efa1a9a7f3dfab8784d16cc\": rpc error: code = NotFound desc = could not find container \"da3b6bbffdb84559c44a3447c9e3299b1c7e01d39efa1a9a7f3dfab8784d16cc\": container with ID starting with da3b6bbffdb84559c44a3447c9e3299b1c7e01d39efa1a9a7f3dfab8784d16cc not found: ID does not exist"
Mar 09 14:06:29 crc kubenswrapper[4722]: I0309 14:06:29.441317 4722 scope.go:117] "RemoveContainer" containerID="ef68012ce113c672326b92933b4c29a09a16cd7b6d5871cbfa1695f8f042b938"
Mar 09 14:06:29 crc kubenswrapper[4722]: I0309 14:06:29.443001 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lnkwt"
Mar 09 14:06:29 crc kubenswrapper[4722]: I0309 14:06:29.453014 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dgxnd"
Mar 09 14:06:29 crc kubenswrapper[4722]: I0309 14:06:29.473278 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 14:06:29 crc kubenswrapper[4722]: E0309 14:06:29.473813 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 14:06:29.973781651 +0000 UTC m=+230.529350237 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 14:06:29 crc kubenswrapper[4722]: I0309 14:06:29.474558 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0194ee7-d343-4042-9c2b-08cc513ee43e-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 09 14:06:29 crc kubenswrapper[4722]: I0309 14:06:29.474582 4722 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e0194ee7-d343-4042-9c2b-08cc513ee43e-client-ca\") on node \"crc\" DevicePath \"\""
Mar 09 14:06:29 crc kubenswrapper[4722]: I0309 14:06:29.474592 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0194ee7-d343-4042-9c2b-08cc513ee43e-config\") on node \"crc\" DevicePath \"\""
Mar 09 14:06:29 crc kubenswrapper[4722]: I0309 14:06:29.474602 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p88m8\" (UniqueName: \"kubernetes.io/projected/e0194ee7-d343-4042-9c2b-08cc513ee43e-kube-api-access-p88m8\") on node \"crc\" DevicePath \"\""
Mar 09 14:06:29 crc kubenswrapper[4722]: I0309 14:06:29.474613 4722 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e0194ee7-d343-4042-9c2b-08cc513ee43e-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 09 14:06:29 crc kubenswrapper[4722]: I0309 14:06:29.522154 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-mgzj7"]
Mar 09 14:06:29 crc kubenswrapper[4722]: I0309 14:06:29.528277 4722 scope.go:117] "RemoveContainer" containerID="ef68012ce113c672326b92933b4c29a09a16cd7b6d5871cbfa1695f8f042b938"
Mar 09 14:06:29 crc kubenswrapper[4722]: E0309 14:06:29.534184 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef68012ce113c672326b92933b4c29a09a16cd7b6d5871cbfa1695f8f042b938\": container with ID starting with ef68012ce113c672326b92933b4c29a09a16cd7b6d5871cbfa1695f8f042b938 not found: ID does not exist" containerID="ef68012ce113c672326b92933b4c29a09a16cd7b6d5871cbfa1695f8f042b938"
Mar 09 14:06:29 crc kubenswrapper[4722]: I0309 14:06:29.534251 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef68012ce113c672326b92933b4c29a09a16cd7b6d5871cbfa1695f8f042b938"} err="failed to get container status \"ef68012ce113c672326b92933b4c29a09a16cd7b6d5871cbfa1695f8f042b938\": rpc error: code = NotFound desc = could not find container \"ef68012ce113c672326b92933b4c29a09a16cd7b6d5871cbfa1695f8f042b938\": container with ID starting with ef68012ce113c672326b92933b4c29a09a16cd7b6d5871cbfa1695f8f042b938 not found: ID does not exist"
Mar 09 14:06:29 crc kubenswrapper[4722]: I0309 14:06:29.541521 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-mgzj7"]
Mar 09 14:06:29 crc kubenswrapper[4722]: I0309 14:06:29.580458 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tz6gh\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh" Mar 09 14:06:29 crc kubenswrapper[4722]: E0309 14:06:29.582576 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 14:06:30.082560439 +0000 UTC m=+230.638129015 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tz6gh" (UID: "9d57536b-4f57-4098-b519-19fdc2559eda") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 14:06:29 crc kubenswrapper[4722]: I0309 14:06:29.597627 4722 ???:1] "http: TLS handshake error from 192.168.126.11:54670: no serving certificate available for the kubelet" Mar 09 14:06:29 crc kubenswrapper[4722]: I0309 14:06:29.681868 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 14:06:29 crc kubenswrapper[4722]: E0309 14:06:29.682212 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 14:06:30.182176464 +0000 UTC m=+230.737745030 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 14:06:29 crc kubenswrapper[4722]: I0309 14:06:29.783721 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fp2th" Mar 09 14:06:29 crc kubenswrapper[4722]: I0309 14:06:29.783990 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tz6gh\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh" Mar 09 14:06:29 crc kubenswrapper[4722]: E0309 14:06:29.784374 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-09 14:06:30.284358786 +0000 UTC m=+230.839927362 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tz6gh" (UID: "9d57536b-4f57-4098-b519-19fdc2559eda") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 14:06:29 crc kubenswrapper[4722]: I0309 14:06:29.884973 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 14:06:29 crc kubenswrapper[4722]: E0309 14:06:29.885181 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 14:06:30.385146627 +0000 UTC m=+230.940715203 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 14:06:29 crc kubenswrapper[4722]: I0309 14:06:29.885288 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tz6gh\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh" Mar 09 14:06:29 crc kubenswrapper[4722]: E0309 14:06:29.885552 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 14:06:30.385544978 +0000 UTC m=+230.941113554 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tz6gh" (UID: "9d57536b-4f57-4098-b519-19fdc2559eda") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 14:06:29 crc kubenswrapper[4722]: I0309 14:06:29.986030 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 14:06:29 crc kubenswrapper[4722]: E0309 14:06:29.986319 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 14:06:30.486249186 +0000 UTC m=+231.041817762 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 14:06:29 crc kubenswrapper[4722]: I0309 14:06:29.986404 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tz6gh\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh" Mar 09 14:06:29 crc kubenswrapper[4722]: E0309 14:06:29.986760 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 14:06:30.486741371 +0000 UTC m=+231.042310137 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tz6gh" (UID: "9d57536b-4f57-4098-b519-19fdc2559eda") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.088041 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 14:06:30 crc kubenswrapper[4722]: E0309 14:06:30.088296 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 14:06:30.588256053 +0000 UTC m=+231.143824629 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.088925 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tz6gh\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh" Mar 09 14:06:30 crc kubenswrapper[4722]: E0309 14:06:30.089378 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 14:06:30.589360476 +0000 UTC m=+231.144929052 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tz6gh" (UID: "9d57536b-4f57-4098-b519-19fdc2559eda") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.124720 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5bbf48759-6bvgh"] Mar 09 14:06:30 crc kubenswrapper[4722]: E0309 14:06:30.125122 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0194ee7-d343-4042-9c2b-08cc513ee43e" containerName="controller-manager" Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.125143 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0194ee7-d343-4042-9c2b-08cc513ee43e" containerName="controller-manager" Mar 09 14:06:30 crc kubenswrapper[4722]: E0309 14:06:30.125160 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5822aa06-923f-4ba0-bdf6-617c5a5eb617" containerName="route-controller-manager" Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.125168 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="5822aa06-923f-4ba0-bdf6-617c5a5eb617" containerName="route-controller-manager" Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.125323 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0194ee7-d343-4042-9c2b-08cc513ee43e" containerName="controller-manager" Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.125373 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="5822aa06-923f-4ba0-bdf6-617c5a5eb617" containerName="route-controller-manager" Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.125938 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5bbf48759-6bvgh" Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.128424 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-68dc4b87b5-hhjmz"] Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.128571 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.128653 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.128726 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.128899 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.129454 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.129847 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.130302 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-68dc4b87b5-hhjmz" Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.133923 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.134099 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.135419 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.135636 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.136104 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.136103 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.142161 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5bbf48759-6bvgh"] Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.143123 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.166768 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5822aa06-923f-4ba0-bdf6-617c5a5eb617" path="/var/lib/kubelet/pods/5822aa06-923f-4ba0-bdf6-617c5a5eb617/volumes" Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.167523 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0194ee7-d343-4042-9c2b-08cc513ee43e" path="/var/lib/kubelet/pods/e0194ee7-d343-4042-9c2b-08cc513ee43e/volumes" Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.168191 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-68dc4b87b5-hhjmz"] Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.189908 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.190238 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13ca2ac7-d53e-437c-a6e2-b851e281ebf3-config\") pod \"route-controller-manager-68dc4b87b5-hhjmz\" (UID: \"13ca2ac7-d53e-437c-a6e2-b851e281ebf3\") " pod="openshift-route-controller-manager/route-controller-manager-68dc4b87b5-hhjmz" Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.190356 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e4e43d96-89d5-4ba8-8b68-5b3be43d23a9-client-ca\") pod \"controller-manager-5bbf48759-6bvgh\" (UID: \"e4e43d96-89d5-4ba8-8b68-5b3be43d23a9\") " 
pod="openshift-controller-manager/controller-manager-5bbf48759-6bvgh" Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.190443 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbdf5\" (UniqueName: \"kubernetes.io/projected/e4e43d96-89d5-4ba8-8b68-5b3be43d23a9-kube-api-access-jbdf5\") pod \"controller-manager-5bbf48759-6bvgh\" (UID: \"e4e43d96-89d5-4ba8-8b68-5b3be43d23a9\") " pod="openshift-controller-manager/controller-manager-5bbf48759-6bvgh" Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.190717 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4e43d96-89d5-4ba8-8b68-5b3be43d23a9-serving-cert\") pod \"controller-manager-5bbf48759-6bvgh\" (UID: \"e4e43d96-89d5-4ba8-8b68-5b3be43d23a9\") " pod="openshift-controller-manager/controller-manager-5bbf48759-6bvgh" Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.190771 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/13ca2ac7-d53e-437c-a6e2-b851e281ebf3-serving-cert\") pod \"route-controller-manager-68dc4b87b5-hhjmz\" (UID: \"13ca2ac7-d53e-437c-a6e2-b851e281ebf3\") " pod="openshift-route-controller-manager/route-controller-manager-68dc4b87b5-hhjmz" Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.190853 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e4e43d96-89d5-4ba8-8b68-5b3be43d23a9-proxy-ca-bundles\") pod \"controller-manager-5bbf48759-6bvgh\" (UID: \"e4e43d96-89d5-4ba8-8b68-5b3be43d23a9\") " pod="openshift-controller-manager/controller-manager-5bbf48759-6bvgh" Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.190883 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4e43d96-89d5-4ba8-8b68-5b3be43d23a9-config\") pod \"controller-manager-5bbf48759-6bvgh\" (UID: \"e4e43d96-89d5-4ba8-8b68-5b3be43d23a9\") " pod="openshift-controller-manager/controller-manager-5bbf48759-6bvgh" Mar 09 14:06:30 crc kubenswrapper[4722]: E0309 14:06:30.190903 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 14:06:30.690879708 +0000 UTC m=+231.246448284 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.191044 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/13ca2ac7-d53e-437c-a6e2-b851e281ebf3-client-ca\") pod \"route-controller-manager-68dc4b87b5-hhjmz\" (UID: \"13ca2ac7-d53e-437c-a6e2-b851e281ebf3\") " pod="openshift-route-controller-manager/route-controller-manager-68dc4b87b5-hhjmz" Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.191151 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wd872\" (UniqueName: \"kubernetes.io/projected/13ca2ac7-d53e-437c-a6e2-b851e281ebf3-kube-api-access-wd872\") pod \"route-controller-manager-68dc4b87b5-hhjmz\" (UID: \"13ca2ac7-d53e-437c-a6e2-b851e281ebf3\") " pod="openshift-route-controller-manager/route-controller-manager-68dc4b87b5-hhjmz" Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.249332 4722 patch_prober.go:28] interesting pod/router-default-5444994796-dp8wn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 14:06:30 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld Mar 09 14:06:30 crc kubenswrapper[4722]: [+]process-running ok Mar 09 14:06:30 crc kubenswrapper[4722]: healthz check failed Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.249376 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dp8wn" podUID="317444ee-0620-47d2-869e-77578a367a87" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.293215 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13ca2ac7-d53e-437c-a6e2-b851e281ebf3-config\") pod \"route-controller-manager-68dc4b87b5-hhjmz\" (UID: \"13ca2ac7-d53e-437c-a6e2-b851e281ebf3\") " pod="openshift-route-controller-manager/route-controller-manager-68dc4b87b5-hhjmz" Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.293306 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e4e43d96-89d5-4ba8-8b68-5b3be43d23a9-client-ca\") pod \"controller-manager-5bbf48759-6bvgh\" (UID: \"e4e43d96-89d5-4ba8-8b68-5b3be43d23a9\") " pod="openshift-controller-manager/controller-manager-5bbf48759-6bvgh" Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.293325 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbdf5\" (UniqueName: \"kubernetes.io/projected/e4e43d96-89d5-4ba8-8b68-5b3be43d23a9-kube-api-access-jbdf5\") pod \"controller-manager-5bbf48759-6bvgh\" (UID: \"e4e43d96-89d5-4ba8-8b68-5b3be43d23a9\") " pod="openshift-controller-manager/controller-manager-5bbf48759-6bvgh" Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.293346 4722 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4e43d96-89d5-4ba8-8b68-5b3be43d23a9-serving-cert\") pod \"controller-manager-5bbf48759-6bvgh\" (UID: \"e4e43d96-89d5-4ba8-8b68-5b3be43d23a9\") " pod="openshift-controller-manager/controller-manager-5bbf48759-6bvgh" Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.293361 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/13ca2ac7-d53e-437c-a6e2-b851e281ebf3-serving-cert\") pod \"route-controller-manager-68dc4b87b5-hhjmz\" (UID: \"13ca2ac7-d53e-437c-a6e2-b851e281ebf3\") " pod="openshift-route-controller-manager/route-controller-manager-68dc4b87b5-hhjmz" Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.293384 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tz6gh\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh" Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.293407 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e4e43d96-89d5-4ba8-8b68-5b3be43d23a9-proxy-ca-bundles\") pod \"controller-manager-5bbf48759-6bvgh\" (UID: \"e4e43d96-89d5-4ba8-8b68-5b3be43d23a9\") " pod="openshift-controller-manager/controller-manager-5bbf48759-6bvgh" Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.293423 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4e43d96-89d5-4ba8-8b68-5b3be43d23a9-config\") pod \"controller-manager-5bbf48759-6bvgh\" (UID: \"e4e43d96-89d5-4ba8-8b68-5b3be43d23a9\") " pod="openshift-controller-manager/controller-manager-5bbf48759-6bvgh" Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.293439 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/13ca2ac7-d53e-437c-a6e2-b851e281ebf3-client-ca\") pod \"route-controller-manager-68dc4b87b5-hhjmz\" (UID: \"13ca2ac7-d53e-437c-a6e2-b851e281ebf3\") " pod="openshift-route-controller-manager/route-controller-manager-68dc4b87b5-hhjmz" Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.293462 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wd872\" (UniqueName: \"kubernetes.io/projected/13ca2ac7-d53e-437c-a6e2-b851e281ebf3-kube-api-access-wd872\") pod \"route-controller-manager-68dc4b87b5-hhjmz\" (UID: \"13ca2ac7-d53e-437c-a6e2-b851e281ebf3\") " pod="openshift-route-controller-manager/route-controller-manager-68dc4b87b5-hhjmz" Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.294871 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13ca2ac7-d53e-437c-a6e2-b851e281ebf3-config\") pod \"route-controller-manager-68dc4b87b5-hhjmz\" (UID: \"13ca2ac7-d53e-437c-a6e2-b851e281ebf3\") " pod="openshift-route-controller-manager/route-controller-manager-68dc4b87b5-hhjmz" Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.295746 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/e4e43d96-89d5-4ba8-8b68-5b3be43d23a9-proxy-ca-bundles\") pod \"controller-manager-5bbf48759-6bvgh\" (UID: \"e4e43d96-89d5-4ba8-8b68-5b3be43d23a9\") " pod="openshift-controller-manager/controller-manager-5bbf48759-6bvgh" Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.295948 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e4e43d96-89d5-4ba8-8b68-5b3be43d23a9-client-ca\") pod \"controller-manager-5bbf48759-6bvgh\" (UID: \"e4e43d96-89d5-4ba8-8b68-5b3be43d23a9\") " pod="openshift-controller-manager/controller-manager-5bbf48759-6bvgh" Mar 09 14:06:30 crc kubenswrapper[4722]: E0309 14:06:30.296027 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 14:06:30.796012948 +0000 UTC m=+231.351581524 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tz6gh" (UID: "9d57536b-4f57-4098-b519-19fdc2559eda") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.296822 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4e43d96-89d5-4ba8-8b68-5b3be43d23a9-config\") pod \"controller-manager-5bbf48759-6bvgh\" (UID: \"e4e43d96-89d5-4ba8-8b68-5b3be43d23a9\") " pod="openshift-controller-manager/controller-manager-5bbf48759-6bvgh" Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.296967 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/13ca2ac7-d53e-437c-a6e2-b851e281ebf3-client-ca\") pod \"route-controller-manager-68dc4b87b5-hhjmz\" (UID: \"13ca2ac7-d53e-437c-a6e2-b851e281ebf3\") " pod="openshift-route-controller-manager/route-controller-manager-68dc4b87b5-hhjmz" Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.310957 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4e43d96-89d5-4ba8-8b68-5b3be43d23a9-serving-cert\") pod \"controller-manager-5bbf48759-6bvgh\" (UID: \"e4e43d96-89d5-4ba8-8b68-5b3be43d23a9\") " pod="openshift-controller-manager/controller-manager-5bbf48759-6bvgh" Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.319122 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/13ca2ac7-d53e-437c-a6e2-b851e281ebf3-serving-cert\") pod \"route-controller-manager-68dc4b87b5-hhjmz\" (UID: \"13ca2ac7-d53e-437c-a6e2-b851e281ebf3\") " pod="openshift-route-controller-manager/route-controller-manager-68dc4b87b5-hhjmz" Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.323601 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbdf5\" (UniqueName: \"kubernetes.io/projected/e4e43d96-89d5-4ba8-8b68-5b3be43d23a9-kube-api-access-jbdf5\") pod \"controller-manager-5bbf48759-6bvgh\" (UID: \"e4e43d96-89d5-4ba8-8b68-5b3be43d23a9\") " pod="openshift-controller-manager/controller-manager-5bbf48759-6bvgh" Mar 09 14:06:30 
crc kubenswrapper[4722]: I0309 14:06:30.328220 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wd872\" (UniqueName: \"kubernetes.io/projected/13ca2ac7-d53e-437c-a6e2-b851e281ebf3-kube-api-access-wd872\") pod \"route-controller-manager-68dc4b87b5-hhjmz\" (UID: \"13ca2ac7-d53e-437c-a6e2-b851e281ebf3\") " pod="openshift-route-controller-manager/route-controller-manager-68dc4b87b5-hhjmz" Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.358641 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-w95cf"] Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.359641 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w95cf" Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.361424 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.377662 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w95cf"] Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.393925 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 14:06:30 crc kubenswrapper[4722]: E0309 14:06:30.394245 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 14:06:30.894214131 +0000 UTC m=+231.449782707 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.419953 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-b5hps" event={"ID":"240d1325-4400-475e-8bc7-9915294148d8","Type":"ContainerStarted","Data":"c90c8a9d19a18a6f5dff498cf21a0bc095827845812788577b2066c8b5076d57"} Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.437006 4722 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.451841 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5bbf48759-6bvgh" Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.467487 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-68dc4b87b5-hhjmz" Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.495259 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tz6gh\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh" Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.495361 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65e3d647-8806-4c0c-b9aa-142739f2fbe0-utilities\") pod \"certified-operators-w95cf\" (UID: \"65e3d647-8806-4c0c-b9aa-142739f2fbe0\") " pod="openshift-marketplace/certified-operators-w95cf" Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.495528 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kq5r\" (UniqueName: \"kubernetes.io/projected/65e3d647-8806-4c0c-b9aa-142739f2fbe0-kube-api-access-2kq5r\") pod \"certified-operators-w95cf\" (UID: \"65e3d647-8806-4c0c-b9aa-142739f2fbe0\") " pod="openshift-marketplace/certified-operators-w95cf" Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.495678 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65e3d647-8806-4c0c-b9aa-142739f2fbe0-catalog-content\") pod \"certified-operators-w95cf\" (UID: \"65e3d647-8806-4c0c-b9aa-142739f2fbe0\") " pod="openshift-marketplace/certified-operators-w95cf" Mar 09 14:06:30 crc kubenswrapper[4722]: E0309 14:06:30.496559 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 14:06:30.996542838 +0000 UTC m=+231.552111414 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tz6gh" (UID: "9d57536b-4f57-4098-b519-19fdc2559eda") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.554767 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-pwxnx"] Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.556358 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pwxnx" Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.558865 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.573036 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pwxnx"] Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.596841 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 14:06:30 crc kubenswrapper[4722]: E0309 14:06:30.597487 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 14:06:31.097000418 +0000 UTC m=+231.652568994 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.597597 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tz6gh\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh" Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.597726 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65e3d647-8806-4c0c-b9aa-142739f2fbe0-utilities\") pod \"certified-operators-w95cf\" (UID: \"65e3d647-8806-4c0c-b9aa-142739f2fbe0\") " pod="openshift-marketplace/certified-operators-w95cf" Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.599383 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kq5r\" (UniqueName: \"kubernetes.io/projected/65e3d647-8806-4c0c-b9aa-142739f2fbe0-kube-api-access-2kq5r\") pod \"certified-operators-w95cf\" (UID: \"65e3d647-8806-4c0c-b9aa-142739f2fbe0\") " pod="openshift-marketplace/certified-operators-w95cf" Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.599493 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65e3d647-8806-4c0c-b9aa-142739f2fbe0-catalog-content\") pod \"certified-operators-w95cf\" (UID: \"65e3d647-8806-4c0c-b9aa-142739f2fbe0\") " pod="openshift-marketplace/certified-operators-w95cf" Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.599955 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/65e3d647-8806-4c0c-b9aa-142739f2fbe0-catalog-content\") pod \"certified-operators-w95cf\" (UID: \"65e3d647-8806-4c0c-b9aa-142739f2fbe0\") " pod="openshift-marketplace/certified-operators-w95cf" Mar 09 14:06:30 crc kubenswrapper[4722]: E0309 14:06:30.600350 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 14:06:31.100339697 +0000 UTC m=+231.655908283 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tz6gh" (UID: "9d57536b-4f57-4098-b519-19fdc2559eda") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.600726 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65e3d647-8806-4c0c-b9aa-142739f2fbe0-utilities\") pod \"certified-operators-w95cf\" (UID: \"65e3d647-8806-4c0c-b9aa-142739f2fbe0\") " pod="openshift-marketplace/certified-operators-w95cf" Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.647458 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kq5r\" (UniqueName: \"kubernetes.io/projected/65e3d647-8806-4c0c-b9aa-142739f2fbe0-kube-api-access-2kq5r\") pod \"certified-operators-w95cf\" (UID: \"65e3d647-8806-4c0c-b9aa-142739f2fbe0\") " pod="openshift-marketplace/certified-operators-w95cf" Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.693970 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w95cf" Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.700737 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 14:06:30 crc kubenswrapper[4722]: E0309 14:06:30.700899 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 14:06:31.20086632 +0000 UTC m=+231.756434896 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.701361 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0f74bde-752e-497e-ad82-ec7a1676bbd5-utilities\") pod \"community-operators-pwxnx\" (UID: \"c0f74bde-752e-497e-ad82-ec7a1676bbd5\") " pod="openshift-marketplace/community-operators-pwxnx" Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.701439 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0f74bde-752e-497e-ad82-ec7a1676bbd5-catalog-content\") pod \"community-operators-pwxnx\" (UID: \"c0f74bde-752e-497e-ad82-ec7a1676bbd5\") " pod="openshift-marketplace/community-operators-pwxnx" Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.701488 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tz6gh\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh" Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.701519 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9t5bt\" (UniqueName: \"kubernetes.io/projected/c0f74bde-752e-497e-ad82-ec7a1676bbd5-kube-api-access-9t5bt\") pod \"community-operators-pwxnx\" (UID: \"c0f74bde-752e-497e-ad82-ec7a1676bbd5\") " pod="openshift-marketplace/community-operators-pwxnx" Mar 09 14:06:30 crc kubenswrapper[4722]: E0309 14:06:30.701906 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 14:06:31.20188853 +0000 UTC m=+231.757457106 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tz6gh" (UID: "9d57536b-4f57-4098-b519-19fdc2559eda") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.748742 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5bbf48759-6bvgh"] Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.759945 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-t8pts"] Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.764239 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-t8pts"
Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.771178 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-t8pts"]
Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.804224 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.804736 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0f74bde-752e-497e-ad82-ec7a1676bbd5-catalog-content\") pod \"community-operators-pwxnx\" (UID: \"c0f74bde-752e-497e-ad82-ec7a1676bbd5\") " pod="openshift-marketplace/community-operators-pwxnx"
Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.804803 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9t5bt\" (UniqueName: \"kubernetes.io/projected/c0f74bde-752e-497e-ad82-ec7a1676bbd5-kube-api-access-9t5bt\") pod \"community-operators-pwxnx\" (UID: \"c0f74bde-752e-497e-ad82-ec7a1676bbd5\") " pod="openshift-marketplace/community-operators-pwxnx"
Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.804877 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0f74bde-752e-497e-ad82-ec7a1676bbd5-utilities\") pod \"community-operators-pwxnx\" (UID: \"c0f74bde-752e-497e-ad82-ec7a1676bbd5\") " pod="openshift-marketplace/community-operators-pwxnx"
Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.805299 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0f74bde-752e-497e-ad82-ec7a1676bbd5-utilities\") pod \"community-operators-pwxnx\" (UID: \"c0f74bde-752e-497e-ad82-ec7a1676bbd5\") " pod="openshift-marketplace/community-operators-pwxnx"
Mar 09 14:06:30 crc kubenswrapper[4722]: E0309 14:06:30.805609 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 14:06:31.305592987 +0000 UTC m=+231.861161563 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.805609 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0f74bde-752e-497e-ad82-ec7a1676bbd5-catalog-content\") pod \"community-operators-pwxnx\" (UID: \"c0f74bde-752e-497e-ad82-ec7a1676bbd5\") " pod="openshift-marketplace/community-operators-pwxnx"
Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.819721 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-68dc4b87b5-hhjmz"]
Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.824711 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9t5bt\" (UniqueName: \"kubernetes.io/projected/c0f74bde-752e-497e-ad82-ec7a1676bbd5-kube-api-access-9t5bt\") pod \"community-operators-pwxnx\" (UID: \"c0f74bde-752e-497e-ad82-ec7a1676bbd5\") " pod="openshift-marketplace/community-operators-pwxnx"
Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.872905 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pwxnx"
Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.905816 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0b6cdb1-050e-4ed3-b20e-d825e4db1edb-utilities\") pod \"certified-operators-t8pts\" (UID: \"b0b6cdb1-050e-4ed3-b20e-d825e4db1edb\") " pod="openshift-marketplace/certified-operators-t8pts"
Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.905911 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tk4lv\" (UniqueName: \"kubernetes.io/projected/b0b6cdb1-050e-4ed3-b20e-d825e4db1edb-kube-api-access-tk4lv\") pod \"certified-operators-t8pts\" (UID: \"b0b6cdb1-050e-4ed3-b20e-d825e4db1edb\") " pod="openshift-marketplace/certified-operators-t8pts"
Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.905984 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tz6gh\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh"
Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.906087 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0b6cdb1-050e-4ed3-b20e-d825e4db1edb-catalog-content\") pod \"certified-operators-t8pts\" (UID: \"b0b6cdb1-050e-4ed3-b20e-d825e4db1edb\") " pod="openshift-marketplace/certified-operators-t8pts"
Mar 09 14:06:30 crc kubenswrapper[4722]: E0309 14:06:30.906581 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 14:06:31.406564294 +0000 UTC m=+231.962132990 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tz6gh" (UID: "9d57536b-4f57-4098-b519-19fdc2559eda") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.976945 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zmkmp"]
Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.978492 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zmkmp"
Mar 09 14:06:30 crc kubenswrapper[4722]: I0309 14:06:30.981663 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zmkmp"]
Mar 09 14:06:31 crc kubenswrapper[4722]: I0309 14:06:31.009365 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 14:06:31 crc kubenswrapper[4722]: E0309 14:06:31.009624 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 14:06:31.50960149 +0000 UTC m=+232.065170086 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 14:06:31 crc kubenswrapper[4722]: I0309 14:06:31.010190 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tk4lv\" (UniqueName: \"kubernetes.io/projected/b0b6cdb1-050e-4ed3-b20e-d825e4db1edb-kube-api-access-tk4lv\") pod \"certified-operators-t8pts\" (UID: \"b0b6cdb1-050e-4ed3-b20e-d825e4db1edb\") " pod="openshift-marketplace/certified-operators-t8pts"
Mar 09 14:06:31 crc kubenswrapper[4722]: I0309 14:06:31.010715 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tz6gh\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh"
Mar 09 14:06:31 crc kubenswrapper[4722]: E0309 14:06:31.011042 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 14:06:31.511033093 +0000 UTC m=+232.066601669 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tz6gh" (UID: "9d57536b-4f57-4098-b519-19fdc2559eda") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 14:06:31 crc kubenswrapper[4722]: I0309 14:06:31.011192 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0b6cdb1-050e-4ed3-b20e-d825e4db1edb-catalog-content\") pod \"certified-operators-t8pts\" (UID: \"b0b6cdb1-050e-4ed3-b20e-d825e4db1edb\") " pod="openshift-marketplace/certified-operators-t8pts"
Mar 09 14:06:31 crc kubenswrapper[4722]: I0309 14:06:31.011272 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0b6cdb1-050e-4ed3-b20e-d825e4db1edb-utilities\") pod \"certified-operators-t8pts\" (UID: \"b0b6cdb1-050e-4ed3-b20e-d825e4db1edb\") " pod="openshift-marketplace/certified-operators-t8pts"
Mar 09 14:06:31 crc kubenswrapper[4722]: I0309 14:06:31.011447 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0b6cdb1-050e-4ed3-b20e-d825e4db1edb-catalog-content\") pod \"certified-operators-t8pts\" (UID: \"b0b6cdb1-050e-4ed3-b20e-d825e4db1edb\") " pod="openshift-marketplace/certified-operators-t8pts"
Mar 09 14:06:31 crc kubenswrapper[4722]: I0309 14:06:31.011970 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0b6cdb1-050e-4ed3-b20e-d825e4db1edb-utilities\") pod \"certified-operators-t8pts\" (UID: \"b0b6cdb1-050e-4ed3-b20e-d825e4db1edb\") " pod="openshift-marketplace/certified-operators-t8pts"
Mar 09 14:06:31 crc kubenswrapper[4722]: I0309 14:06:31.037868 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tk4lv\" (UniqueName: \"kubernetes.io/projected/b0b6cdb1-050e-4ed3-b20e-d825e4db1edb-kube-api-access-tk4lv\") pod \"certified-operators-t8pts\" (UID: \"b0b6cdb1-050e-4ed3-b20e-d825e4db1edb\") " pod="openshift-marketplace/certified-operators-t8pts"
Mar 09 14:06:31 crc kubenswrapper[4722]: I0309 14:06:31.084690 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w95cf"]
Mar 09 14:06:31 crc kubenswrapper[4722]: I0309 14:06:31.107038 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t8pts"
Mar 09 14:06:31 crc kubenswrapper[4722]: I0309 14:06:31.112368 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 14:06:31 crc kubenswrapper[4722]: I0309 14:06:31.112674 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqvdd\" (UniqueName: \"kubernetes.io/projected/4b4a7622-7bca-4fca-adb3-eec526b21b2b-kube-api-access-cqvdd\") pod \"community-operators-zmkmp\" (UID: \"4b4a7622-7bca-4fca-adb3-eec526b21b2b\") " pod="openshift-marketplace/community-operators-zmkmp"
Mar 09 14:06:31 crc kubenswrapper[4722]: I0309 14:06:31.112730 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b4a7622-7bca-4fca-adb3-eec526b21b2b-utilities\") pod \"community-operators-zmkmp\" (UID: \"4b4a7622-7bca-4fca-adb3-eec526b21b2b\") " pod="openshift-marketplace/community-operators-zmkmp"
Mar 09 14:06:31 crc kubenswrapper[4722]: I0309 14:06:31.112809 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b4a7622-7bca-4fca-adb3-eec526b21b2b-catalog-content\") pod \"community-operators-zmkmp\" (UID: \"4b4a7622-7bca-4fca-adb3-eec526b21b2b\") " pod="openshift-marketplace/community-operators-zmkmp"
Mar 09 14:06:31 crc kubenswrapper[4722]: E0309 14:06:31.118461 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 14:06:31.618422399 +0000 UTC m=+232.173990975 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 14:06:31 crc kubenswrapper[4722]: I0309 14:06:31.219889 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b4a7622-7bca-4fca-adb3-eec526b21b2b-catalog-content\") pod \"community-operators-zmkmp\" (UID: \"4b4a7622-7bca-4fca-adb3-eec526b21b2b\") " pod="openshift-marketplace/community-operators-zmkmp"
Mar 09 14:06:31 crc kubenswrapper[4722]: I0309 14:06:31.220336 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqvdd\" (UniqueName: \"kubernetes.io/projected/4b4a7622-7bca-4fca-adb3-eec526b21b2b-kube-api-access-cqvdd\") pod \"community-operators-zmkmp\" (UID: \"4b4a7622-7bca-4fca-adb3-eec526b21b2b\") " pod="openshift-marketplace/community-operators-zmkmp"
Mar 09 14:06:31 crc kubenswrapper[4722]: I0309 14:06:31.220394 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b4a7622-7bca-4fca-adb3-eec526b21b2b-utilities\") pod \"community-operators-zmkmp\" (UID: \"4b4a7622-7bca-4fca-adb3-eec526b21b2b\") " pod="openshift-marketplace/community-operators-zmkmp"
Mar 09 14:06:31 crc kubenswrapper[4722]: I0309 14:06:31.220443 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tz6gh\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh"
Mar 09 14:06:31 crc kubenswrapper[4722]: E0309 14:06:31.220971 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-09 14:06:31.720950591 +0000 UTC m=+232.276519227 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-tz6gh" (UID: "9d57536b-4f57-4098-b519-19fdc2559eda") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 14:06:31 crc kubenswrapper[4722]: I0309 14:06:31.221497 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b4a7622-7bca-4fca-adb3-eec526b21b2b-utilities\") pod \"community-operators-zmkmp\" (UID: \"4b4a7622-7bca-4fca-adb3-eec526b21b2b\") " pod="openshift-marketplace/community-operators-zmkmp"
Mar 09 14:06:31 crc kubenswrapper[4722]: I0309 14:06:31.221591 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b4a7622-7bca-4fca-adb3-eec526b21b2b-catalog-content\") pod \"community-operators-zmkmp\" (UID: \"4b4a7622-7bca-4fca-adb3-eec526b21b2b\") " pod="openshift-marketplace/community-operators-zmkmp"
Mar 09 14:06:31 crc kubenswrapper[4722]: I0309 14:06:31.250900 4722 patch_prober.go:28] interesting pod/router-default-5444994796-dp8wn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 09 14:06:31 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld
Mar 09 14:06:31 crc kubenswrapper[4722]: [+]process-running ok
Mar 09 14:06:31 crc kubenswrapper[4722]: healthz check failed
Mar 09 14:06:31 crc kubenswrapper[4722]: I0309 14:06:31.250985 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dp8wn" podUID="317444ee-0620-47d2-869e-77578a367a87" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 09 14:06:31 crc kubenswrapper[4722]: I0309 14:06:31.253107 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqvdd\" (UniqueName: \"kubernetes.io/projected/4b4a7622-7bca-4fca-adb3-eec526b21b2b-kube-api-access-cqvdd\") pod \"community-operators-zmkmp\" (UID: \"4b4a7622-7bca-4fca-adb3-eec526b21b2b\") " pod="openshift-marketplace/community-operators-zmkmp"
Mar 09 14:06:31 crc kubenswrapper[4722]: I0309 14:06:31.280438 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pwxnx"]
Mar 09 14:06:31 crc kubenswrapper[4722]: I0309 14:06:31.300536 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zmkmp"
Mar 09 14:06:31 crc kubenswrapper[4722]: I0309 14:06:31.321494 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 09 14:06:31 crc kubenswrapper[4722]: E0309 14:06:31.321815 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-09 14:06:31.821777243 +0000 UTC m=+232.377345819 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 09 14:06:31 crc kubenswrapper[4722]: I0309 14:06:31.390296 4722 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-09T14:06:30.437041732Z","Handler":null,"Name":""}
Mar 09 14:06:31 crc kubenswrapper[4722]: I0309 14:06:31.424657 4722 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0
Mar 09 14:06:31 crc kubenswrapper[4722]: I0309 14:06:31.424696 4722 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock
Mar 09 14:06:31 crc kubenswrapper[4722]: I0309 14:06:31.426078 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tz6gh\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh"
Mar 09 14:06:31 crc kubenswrapper[4722]: I0309 14:06:31.433507 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 09 14:06:31 crc kubenswrapper[4722]: I0309 14:06:31.433557 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tz6gh\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh" Mar 09 14:06:31 crc kubenswrapper[4722]: I0309 14:06:31.479664 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5bbf48759-6bvgh" event={"ID":"e4e43d96-89d5-4ba8-8b68-5b3be43d23a9","Type":"ContainerStarted","Data":"f955a81a3843f4555182e6149afe7b013e9969ddb1ca43d46c0169717bea5581"} Mar 09 14:06:31 crc kubenswrapper[4722]: I0309 14:06:31.479704 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5bbf48759-6bvgh" event={"ID":"e4e43d96-89d5-4ba8-8b68-5b3be43d23a9","Type":"ContainerStarted","Data":"1fa826a95fa1b912554a1349bb8d83bc43736bc11b2065ecfdedc90fe6eff981"} Mar 09 14:06:31 crc kubenswrapper[4722]: I0309 14:06:31.480672 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5bbf48759-6bvgh" Mar 09 14:06:31 crc kubenswrapper[4722]: I0309 14:06:31.480722 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-t8pts"] Mar 09 14:06:31 crc kubenswrapper[4722]: I0309 14:06:31.495395 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5bbf48759-6bvgh" Mar 09 14:06:31 crc kubenswrapper[4722]: I0309 14:06:31.496740 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-tz6gh\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh" Mar 09 14:06:31 crc kubenswrapper[4722]: I0309 14:06:31.512378 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w95cf" event={"ID":"65e3d647-8806-4c0c-b9aa-142739f2fbe0","Type":"ContainerStarted","Data":"5a8078b43ae67f49168afccf4629ab4efb623d4c7a3f8161b3074dee44703f2a"} Mar 09 14:06:31 crc kubenswrapper[4722]: I0309 14:06:31.516824 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-68dc4b87b5-hhjmz" event={"ID":"13ca2ac7-d53e-437c-a6e2-b851e281ebf3","Type":"ContainerStarted","Data":"b1b7ea33f844f9b44d9fa4569482bc9bcd47681316dd9954e9365ad41bb40792"} Mar 09 14:06:31 crc kubenswrapper[4722]: I0309 14:06:31.516869 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-68dc4b87b5-hhjmz" event={"ID":"13ca2ac7-d53e-437c-a6e2-b851e281ebf3","Type":"ContainerStarted","Data":"cf4fdd8de94c9649d442a9d96e79904c32cf7613c1874a8d2ebf25c0c3a51dfa"} Mar 09 14:06:31 crc kubenswrapper[4722]: I0309 14:06:31.517141 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-route-controller-manager/route-controller-manager-68dc4b87b5-hhjmz" Mar 09 14:06:31 crc kubenswrapper[4722]: I0309 14:06:31.518416 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pwxnx" event={"ID":"c0f74bde-752e-497e-ad82-ec7a1676bbd5","Type":"ContainerStarted","Data":"0fc91f10b5f35318045918b18351a0167656d308f21ff2e0f29b30aeee849892"} Mar 09 14:06:31 crc kubenswrapper[4722]: I0309 14:06:31.522131 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5bbf48759-6bvgh" podStartSLOduration=3.522103847 podStartE2EDuration="3.522103847s" podCreationTimestamp="2026-03-09 14:06:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:06:31.508587616 +0000 UTC m=+232.064156192" watchObservedRunningTime="2026-03-09 14:06:31.522103847 +0000 UTC m=+232.077672423" Mar 09 14:06:31 crc kubenswrapper[4722]: I0309 14:06:31.524417 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-b5hps" event={"ID":"240d1325-4400-475e-8bc7-9915294148d8","Type":"ContainerStarted","Data":"e5c54e0fc7bb2ceef92a5d432878c3ec318758c41429de5c890e83231384517a"} Mar 09 14:06:31 crc kubenswrapper[4722]: I0309 14:06:31.524454 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-b5hps" event={"ID":"240d1325-4400-475e-8bc7-9915294148d8","Type":"ContainerStarted","Data":"d2573fdb7d9c5bae5ae0a5fc8f8a7f1c9c1539668634f59af22313fde4891443"} Mar 09 14:06:31 crc kubenswrapper[4722]: I0309 14:06:31.529116 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 09 14:06:31 crc kubenswrapper[4722]: I0309 14:06:31.555618 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-b5hps" podStartSLOduration=11.555601241 podStartE2EDuration="11.555601241s" podCreationTimestamp="2026-03-09 14:06:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:06:31.55355515 +0000 UTC m=+232.109123736" watchObservedRunningTime="2026-03-09 14:06:31.555601241 +0000 UTC m=+232.111169817" Mar 09 14:06:31 crc kubenswrapper[4722]: I0309 14:06:31.569925 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 09 14:06:31 crc kubenswrapper[4722]: I0309 14:06:31.590083 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-68dc4b87b5-hhjmz" podStartSLOduration=3.590063054 podStartE2EDuration="3.590063054s" podCreationTimestamp="2026-03-09 14:06:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:06:31.588554249 +0000 UTC m=+232.144122825" watchObservedRunningTime="2026-03-09 14:06:31.590063054 +0000 UTC m=+232.145631630" Mar 09 14:06:31 crc kubenswrapper[4722]: I0309 14:06:31.609661 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh" Mar 09 14:06:31 crc kubenswrapper[4722]: I0309 14:06:31.633099 4722 patch_prober.go:28] interesting pod/downloads-7954f5f757-knrzp container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Mar 09 14:06:31 crc kubenswrapper[4722]: I0309 14:06:31.633504 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-knrzp" podUID="8f915ef9-5d9a-43ee-a333-def8766e083d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" Mar 09 14:06:31 crc kubenswrapper[4722]: I0309 14:06:31.633887 4722 patch_prober.go:28] interesting pod/downloads-7954f5f757-knrzp container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Mar 09 14:06:31 crc kubenswrapper[4722]: I0309 14:06:31.633906 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-knrzp" podUID="8f915ef9-5d9a-43ee-a333-def8766e083d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" Mar 09 14:06:31 crc kubenswrapper[4722]: I0309 14:06:31.634133 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-gh244" Mar 09 14:06:31 crc kubenswrapper[4722]: I0309 14:06:31.634161 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-gh244" Mar 09 14:06:31 crc kubenswrapper[4722]: I0309 14:06:31.640739 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zmkmp"] Mar 09 14:06:31 crc kubenswrapper[4722]: I0309 14:06:31.663069 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-gh244" Mar 09 14:06:31 crc kubenswrapper[4722]: W0309 14:06:31.705247 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b4a7622_7bca_4fca_adb3_eec526b21b2b.slice/crio-b38fa506870e412e0bd6b9856a391fb968c0813bb14774e8bf52a00c34b7de53 WatchSource:0}: Error finding container b38fa506870e412e0bd6b9856a391fb968c0813bb14774e8bf52a00c34b7de53: Status 404 returned error can't find the container with id b38fa506870e412e0bd6b9856a391fb968c0813bb14774e8bf52a00c34b7de53 Mar 09 14:06:31 crc 
kubenswrapper[4722]: I0309 14:06:31.740508 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2mf4l" Mar 09 14:06:31 crc kubenswrapper[4722]: I0309 14:06:31.740575 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2mf4l" Mar 09 14:06:31 crc kubenswrapper[4722]: I0309 14:06:31.768989 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2mf4l" Mar 09 14:06:31 crc kubenswrapper[4722]: I0309 14:06:31.935890 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-68dc4b87b5-hhjmz" Mar 09 14:06:32 crc kubenswrapper[4722]: I0309 14:06:32.105169 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-tz6gh"] Mar 09 14:06:32 crc kubenswrapper[4722]: W0309 14:06:32.128629 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d57536b_4f57_4098_b519_19fdc2559eda.slice/crio-743f4b506d0a7b71136d894ffc0bf3e17c1ccc6acd143da4697e5829a6a62cde WatchSource:0}: Error finding container 743f4b506d0a7b71136d894ffc0bf3e17c1ccc6acd143da4697e5829a6a62cde: Status 404 returned error can't find the container with id 743f4b506d0a7b71136d894ffc0bf3e17c1ccc6acd143da4697e5829a6a62cde Mar 09 14:06:32 crc kubenswrapper[4722]: I0309 14:06:32.207542 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 09 14:06:32 crc kubenswrapper[4722]: I0309 14:06:32.254140 4722 patch_prober.go:28] interesting pod/router-default-5444994796-dp8wn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 14:06:32 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld Mar 09 14:06:32 crc kubenswrapper[4722]: [+]process-running ok Mar 09 14:06:32 crc kubenswrapper[4722]: healthz check failed Mar 09 14:06:32 crc kubenswrapper[4722]: I0309 14:06:32.254314 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dp8wn" podUID="317444ee-0620-47d2-869e-77578a367a87" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 14:06:32 crc kubenswrapper[4722]: I0309 14:06:32.534286 4722 generic.go:334] "Generic (PLEG): container finished" podID="c0f74bde-752e-497e-ad82-ec7a1676bbd5" containerID="6f692c9353de894c8fa94990493a655c30b8374194f335a9126c69e3fb7b0f02" exitCode=0 Mar 09 14:06:32 crc kubenswrapper[4722]: I0309 14:06:32.534383 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pwxnx" event={"ID":"c0f74bde-752e-497e-ad82-ec7a1676bbd5","Type":"ContainerDied","Data":"6f692c9353de894c8fa94990493a655c30b8374194f335a9126c69e3fb7b0f02"} Mar 09 14:06:32 crc kubenswrapper[4722]: I0309 14:06:32.537133 4722 generic.go:334] "Generic (PLEG): container finished" podID="4b4a7622-7bca-4fca-adb3-eec526b21b2b" containerID="7a898e7f7bdd3689485d762720e4740c6b03c454dcdeda7b0c721d5f69630a6f" exitCode=0 Mar 09 14:06:32 crc kubenswrapper[4722]: I0309 14:06:32.537245 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-zmkmp" event={"ID":"4b4a7622-7bca-4fca-adb3-eec526b21b2b","Type":"ContainerDied","Data":"7a898e7f7bdd3689485d762720e4740c6b03c454dcdeda7b0c721d5f69630a6f"} Mar 09 14:06:32 crc kubenswrapper[4722]: I0309 14:06:32.537270 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zmkmp" event={"ID":"4b4a7622-7bca-4fca-adb3-eec526b21b2b","Type":"ContainerStarted","Data":"b38fa506870e412e0bd6b9856a391fb968c0813bb14774e8bf52a00c34b7de53"} Mar 09 14:06:32 crc kubenswrapper[4722]: I0309 14:06:32.540709 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh" event={"ID":"9d57536b-4f57-4098-b519-19fdc2559eda","Type":"ContainerStarted","Data":"adf82621040528d1293e4a06d3d17db8a5bb552b35348f0f50a943baf206c18a"} Mar 09 14:06:32 crc kubenswrapper[4722]: I0309 14:06:32.540766 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh" event={"ID":"9d57536b-4f57-4098-b519-19fdc2559eda","Type":"ContainerStarted","Data":"743f4b506d0a7b71136d894ffc0bf3e17c1ccc6acd143da4697e5829a6a62cde"} Mar 09 14:06:32 crc kubenswrapper[4722]: I0309 14:06:32.544739 4722 generic.go:334] "Generic (PLEG): container finished" podID="b0b6cdb1-050e-4ed3-b20e-d825e4db1edb" containerID="29f4b9168facd2fc15837261a19e4e6f77ad40fe8b207c41e3e731f44fac7a1f" exitCode=0 Mar 09 14:06:32 crc kubenswrapper[4722]: I0309 14:06:32.545072 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t8pts" event={"ID":"b0b6cdb1-050e-4ed3-b20e-d825e4db1edb","Type":"ContainerDied","Data":"29f4b9168facd2fc15837261a19e4e6f77ad40fe8b207c41e3e731f44fac7a1f"} Mar 09 14:06:32 crc kubenswrapper[4722]: I0309 14:06:32.545148 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t8pts" event={"ID":"b0b6cdb1-050e-4ed3-b20e-d825e4db1edb","Type":"ContainerStarted","Data":"528b1a83a1ffc324a8557f1c712e00a1df760a4b6c17bbf005fc50581e91a9ef"} Mar 09 14:06:32 crc kubenswrapper[4722]: I0309 14:06:32.558149 4722 generic.go:334] "Generic (PLEG): container finished" podID="7e1fdc12-aac5-4f72-9b22-0212c2f3988e" containerID="dc947cec8c36a2871a868e6175dd1a21b31edd8e69d4e4e07cedf44eaf8a8a73" exitCode=0 Mar 09 14:06:32 crc kubenswrapper[4722]: I0309 14:06:32.558256 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551080-mdzx2" event={"ID":"7e1fdc12-aac5-4f72-9b22-0212c2f3988e","Type":"ContainerDied","Data":"dc947cec8c36a2871a868e6175dd1a21b31edd8e69d4e4e07cedf44eaf8a8a73"} Mar 09 14:06:32 crc kubenswrapper[4722]: I0309 14:06:32.559256 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rdkdt"] Mar 09 14:06:32 crc kubenswrapper[4722]: I0309 14:06:32.561297 4722 generic.go:334] "Generic (PLEG): container finished" podID="65e3d647-8806-4c0c-b9aa-142739f2fbe0" containerID="aa46608cdb2f1782bf7194037f0cac40fdf8b5ac4aa277132cac8a987db67863" exitCode=0 Mar 09 14:06:32 crc kubenswrapper[4722]: I0309 14:06:32.561340 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rdkdt" Mar 09 14:06:32 crc kubenswrapper[4722]: I0309 14:06:32.564422 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 09 14:06:32 crc kubenswrapper[4722]: I0309 14:06:32.564513 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w95cf" event={"ID":"65e3d647-8806-4c0c-b9aa-142739f2fbe0","Type":"ContainerDied","Data":"aa46608cdb2f1782bf7194037f0cac40fdf8b5ac4aa277132cac8a987db67863"} Mar 09 14:06:32 crc kubenswrapper[4722]: I0309 14:06:32.572173 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2mf4l" Mar 09 14:06:32 crc kubenswrapper[4722]: I0309 14:06:32.572577 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-gh244" Mar 09 14:06:32 crc kubenswrapper[4722]: I0309 14:06:32.582130 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rdkdt"] Mar 09 14:06:32 crc kubenswrapper[4722]: I0309 14:06:32.650117 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cm6f7\" (UniqueName: \"kubernetes.io/projected/cfd6e90e-4eeb-4372-8465-136a383e95b2-kube-api-access-cm6f7\") pod \"redhat-marketplace-rdkdt\" (UID: \"cfd6e90e-4eeb-4372-8465-136a383e95b2\") " pod="openshift-marketplace/redhat-marketplace-rdkdt" Mar 09 14:06:32 crc kubenswrapper[4722]: I0309 14:06:32.650293 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfd6e90e-4eeb-4372-8465-136a383e95b2-utilities\") pod \"redhat-marketplace-rdkdt\" (UID: \"cfd6e90e-4eeb-4372-8465-136a383e95b2\") " pod="openshift-marketplace/redhat-marketplace-rdkdt" Mar 09 14:06:32 crc kubenswrapper[4722]: I0309 14:06:32.650426 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfd6e90e-4eeb-4372-8465-136a383e95b2-catalog-content\") pod \"redhat-marketplace-rdkdt\" (UID: \"cfd6e90e-4eeb-4372-8465-136a383e95b2\") " pod="openshift-marketplace/redhat-marketplace-rdkdt" Mar 09 14:06:32 crc kubenswrapper[4722]: I0309 14:06:32.751897 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfd6e90e-4eeb-4372-8465-136a383e95b2-utilities\") pod \"redhat-marketplace-rdkdt\" (UID: \"cfd6e90e-4eeb-4372-8465-136a383e95b2\") " pod="openshift-marketplace/redhat-marketplace-rdkdt" Mar 09 14:06:32 crc kubenswrapper[4722]: I0309 14:06:32.751952 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfd6e90e-4eeb-4372-8465-136a383e95b2-catalog-content\") pod \"redhat-marketplace-rdkdt\" (UID: \"cfd6e90e-4eeb-4372-8465-136a383e95b2\") " pod="openshift-marketplace/redhat-marketplace-rdkdt" Mar 09 14:06:32 crc kubenswrapper[4722]: I0309 14:06:32.752028 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cm6f7\" (UniqueName: \"kubernetes.io/projected/cfd6e90e-4eeb-4372-8465-136a383e95b2-kube-api-access-cm6f7\") pod \"redhat-marketplace-rdkdt\" (UID: \"cfd6e90e-4eeb-4372-8465-136a383e95b2\") " 
pod="openshift-marketplace/redhat-marketplace-rdkdt" Mar 09 14:06:32 crc kubenswrapper[4722]: I0309 14:06:32.752807 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfd6e90e-4eeb-4372-8465-136a383e95b2-utilities\") pod \"redhat-marketplace-rdkdt\" (UID: \"cfd6e90e-4eeb-4372-8465-136a383e95b2\") " pod="openshift-marketplace/redhat-marketplace-rdkdt" Mar 09 14:06:32 crc kubenswrapper[4722]: I0309 14:06:32.753021 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfd6e90e-4eeb-4372-8465-136a383e95b2-catalog-content\") pod \"redhat-marketplace-rdkdt\" (UID: \"cfd6e90e-4eeb-4372-8465-136a383e95b2\") " pod="openshift-marketplace/redhat-marketplace-rdkdt" Mar 09 14:06:32 crc kubenswrapper[4722]: I0309 14:06:32.795438 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cm6f7\" (UniqueName: \"kubernetes.io/projected/cfd6e90e-4eeb-4372-8465-136a383e95b2-kube-api-access-cm6f7\") pod \"redhat-marketplace-rdkdt\" (UID: \"cfd6e90e-4eeb-4372-8465-136a383e95b2\") " pod="openshift-marketplace/redhat-marketplace-rdkdt" Mar 09 14:06:32 crc kubenswrapper[4722]: I0309 14:06:32.891680 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rdkdt" Mar 09 14:06:32 crc kubenswrapper[4722]: I0309 14:06:32.958850 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-c96ws"] Mar 09 14:06:32 crc kubenswrapper[4722]: I0309 14:06:32.959971 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c96ws" Mar 09 14:06:32 crc kubenswrapper[4722]: I0309 14:06:32.980410 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-c96ws"] Mar 09 14:06:33 crc kubenswrapper[4722]: I0309 14:06:33.055714 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9665111-e6dc-49f2-803a-96eebcc4c78c-catalog-content\") pod \"redhat-marketplace-c96ws\" (UID: \"a9665111-e6dc-49f2-803a-96eebcc4c78c\") " pod="openshift-marketplace/redhat-marketplace-c96ws" Mar 09 14:06:33 crc kubenswrapper[4722]: I0309 14:06:33.056043 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58htf\" (UniqueName: \"kubernetes.io/projected/a9665111-e6dc-49f2-803a-96eebcc4c78c-kube-api-access-58htf\") pod \"redhat-marketplace-c96ws\" (UID: \"a9665111-e6dc-49f2-803a-96eebcc4c78c\") " pod="openshift-marketplace/redhat-marketplace-c96ws" Mar 09 14:06:33 crc kubenswrapper[4722]: I0309 14:06:33.056077 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9665111-e6dc-49f2-803a-96eebcc4c78c-utilities\") pod \"redhat-marketplace-c96ws\" (UID: \"a9665111-e6dc-49f2-803a-96eebcc4c78c\") " pod="openshift-marketplace/redhat-marketplace-c96ws" Mar 09 14:06:33 crc kubenswrapper[4722]: I0309 14:06:33.156550 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-sfch8" Mar 09 14:06:33 crc kubenswrapper[4722]: I0309 14:06:33.157041 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-sfch8" Mar 09 
14:06:33 crc kubenswrapper[4722]: I0309 14:06:33.157721 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9665111-e6dc-49f2-803a-96eebcc4c78c-catalog-content\") pod \"redhat-marketplace-c96ws\" (UID: \"a9665111-e6dc-49f2-803a-96eebcc4c78c\") " pod="openshift-marketplace/redhat-marketplace-c96ws" Mar 09 14:06:33 crc kubenswrapper[4722]: I0309 14:06:33.157765 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58htf\" (UniqueName: \"kubernetes.io/projected/a9665111-e6dc-49f2-803a-96eebcc4c78c-kube-api-access-58htf\") pod \"redhat-marketplace-c96ws\" (UID: \"a9665111-e6dc-49f2-803a-96eebcc4c78c\") " pod="openshift-marketplace/redhat-marketplace-c96ws" Mar 09 14:06:33 crc kubenswrapper[4722]: I0309 14:06:33.157782 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9665111-e6dc-49f2-803a-96eebcc4c78c-utilities\") pod \"redhat-marketplace-c96ws\" (UID: \"a9665111-e6dc-49f2-803a-96eebcc4c78c\") " pod="openshift-marketplace/redhat-marketplace-c96ws" Mar 09 14:06:33 crc kubenswrapper[4722]: I0309 14:06:33.158354 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9665111-e6dc-49f2-803a-96eebcc4c78c-utilities\") pod \"redhat-marketplace-c96ws\" (UID: \"a9665111-e6dc-49f2-803a-96eebcc4c78c\") " pod="openshift-marketplace/redhat-marketplace-c96ws" Mar 09 14:06:33 crc kubenswrapper[4722]: I0309 14:06:33.158567 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9665111-e6dc-49f2-803a-96eebcc4c78c-catalog-content\") pod \"redhat-marketplace-c96ws\" (UID: \"a9665111-e6dc-49f2-803a-96eebcc4c78c\") " pod="openshift-marketplace/redhat-marketplace-c96ws" Mar 09 14:06:33 crc kubenswrapper[4722]: I0309 14:06:33.161139 4722 patch_prober.go:28] interesting pod/console-f9d7485db-sfch8 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.25:8443/health\": dial tcp 10.217.0.25:8443: connect: connection refused" start-of-body= Mar 09 14:06:33 crc kubenswrapper[4722]: I0309 14:06:33.161188 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-sfch8" podUID="8a76c9b5-c226-4d93-8d7a-8e56210b572a" containerName="console" probeResult="failure" output="Get \"https://10.217.0.25:8443/health\": dial tcp 10.217.0.25:8443: connect: connection refused" Mar 09 14:06:33 crc kubenswrapper[4722]: I0309 14:06:33.179627 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58htf\" (UniqueName: \"kubernetes.io/projected/a9665111-e6dc-49f2-803a-96eebcc4c78c-kube-api-access-58htf\") pod \"redhat-marketplace-c96ws\" (UID: \"a9665111-e6dc-49f2-803a-96eebcc4c78c\") " pod="openshift-marketplace/redhat-marketplace-c96ws" Mar 09 14:06:33 crc kubenswrapper[4722]: I0309 14:06:33.219066 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rdkdt"] Mar 09 14:06:33 crc kubenswrapper[4722]: W0309 14:06:33.226819 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcfd6e90e_4eeb_4372_8465_136a383e95b2.slice/crio-fee2ac0315c036e2f0afaa1ffa5796c1dd4f70c39dedef5326921170502110c1 WatchSource:0}: Error finding container 
fee2ac0315c036e2f0afaa1ffa5796c1dd4f70c39dedef5326921170502110c1: Status 404 returned error can't find the container with id fee2ac0315c036e2f0afaa1ffa5796c1dd4f70c39dedef5326921170502110c1 Mar 09 14:06:33 crc kubenswrapper[4722]: I0309 14:06:33.246034 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-dp8wn" Mar 09 14:06:33 crc kubenswrapper[4722]: I0309 14:06:33.249183 4722 patch_prober.go:28] interesting pod/router-default-5444994796-dp8wn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 09 14:06:33 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld Mar 09 14:06:33 crc kubenswrapper[4722]: [+]process-running ok Mar 09 14:06:33 crc kubenswrapper[4722]: healthz check failed Mar 09 14:06:33 crc kubenswrapper[4722]: I0309 14:06:33.249257 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dp8wn" podUID="317444ee-0620-47d2-869e-77578a367a87" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 14:06:33 crc kubenswrapper[4722]: I0309 14:06:33.277736 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c96ws" Mar 09 14:06:33 crc kubenswrapper[4722]: I0309 14:06:33.286474 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 09 14:06:33 crc kubenswrapper[4722]: I0309 14:06:33.288417 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 14:06:33 crc kubenswrapper[4722]: I0309 14:06:33.291107 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 09 14:06:33 crc kubenswrapper[4722]: I0309 14:06:33.295791 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 09 14:06:33 crc kubenswrapper[4722]: I0309 14:06:33.296157 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 09 14:06:33 crc kubenswrapper[4722]: I0309 14:06:33.337813 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-wz9f2" Mar 09 14:06:33 crc kubenswrapper[4722]: I0309 14:06:33.362371 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a6e706d6-dfac-4286-8d73-4661950f260f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a6e706d6-dfac-4286-8d73-4661950f260f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 14:06:33 crc kubenswrapper[4722]: I0309 14:06:33.362425 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a6e706d6-dfac-4286-8d73-4661950f260f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"a6e706d6-dfac-4286-8d73-4661950f260f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 14:06:33 crc kubenswrapper[4722]: I0309 14:06:33.464571 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" 
(UniqueName: \"kubernetes.io/projected/a6e706d6-dfac-4286-8d73-4661950f260f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a6e706d6-dfac-4286-8d73-4661950f260f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 14:06:33 crc kubenswrapper[4722]: I0309 14:06:33.464906 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a6e706d6-dfac-4286-8d73-4661950f260f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"a6e706d6-dfac-4286-8d73-4661950f260f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 14:06:33 crc kubenswrapper[4722]: I0309 14:06:33.465030 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a6e706d6-dfac-4286-8d73-4661950f260f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"a6e706d6-dfac-4286-8d73-4661950f260f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 14:06:33 crc kubenswrapper[4722]: I0309 14:06:33.485174 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a6e706d6-dfac-4286-8d73-4661950f260f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a6e706d6-dfac-4286-8d73-4661950f260f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 14:06:33 crc kubenswrapper[4722]: I0309 14:06:33.570771 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-c2g4c"] Mar 09 14:06:33 crc kubenswrapper[4722]: I0309 14:06:33.571886 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c2g4c" Mar 09 14:06:33 crc kubenswrapper[4722]: I0309 14:06:33.577320 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 09 14:06:33 crc kubenswrapper[4722]: I0309 14:06:33.579671 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c2g4c"] Mar 09 14:06:33 crc kubenswrapper[4722]: I0309 14:06:33.584622 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rdkdt" event={"ID":"cfd6e90e-4eeb-4372-8465-136a383e95b2","Type":"ContainerStarted","Data":"fee2ac0315c036e2f0afaa1ffa5796c1dd4f70c39dedef5326921170502110c1"} Mar 09 14:06:33 crc kubenswrapper[4722]: I0309 14:06:33.585302 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh" Mar 09 14:06:33 crc kubenswrapper[4722]: I0309 14:06:33.599765 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-c96ws"] Mar 09 14:06:33 crc kubenswrapper[4722]: I0309 14:06:33.610034 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh" podStartSLOduration=169.610017747 podStartE2EDuration="2m49.610017747s" podCreationTimestamp="2026-03-09 14:03:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:06:33.605534684 +0000 UTC m=+234.161103260" watchObservedRunningTime="2026-03-09 14:06:33.610017747 +0000 UTC m=+234.165586323" Mar 09 14:06:33 crc kubenswrapper[4722]: W0309 14:06:33.621779 4722 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9665111_e6dc_49f2_803a_96eebcc4c78c.slice/crio-1abc54c1cf595773b8b9982364a5828c8941a10b4323ea411a90248e7a74f76d WatchSource:0}: Error finding container 1abc54c1cf595773b8b9982364a5828c8941a10b4323ea411a90248e7a74f76d: Status 404 returned error can't find the container with id 1abc54c1cf595773b8b9982364a5828c8941a10b4323ea411a90248e7a74f76d Mar 09 14:06:33 crc kubenswrapper[4722]: I0309 14:06:33.648661 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 09 14:06:33 crc kubenswrapper[4722]: I0309 14:06:33.670021 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7df01eab-424f-40b1-a40c-03b930a8fac6-catalog-content\") pod \"redhat-operators-c2g4c\" (UID: \"7df01eab-424f-40b1-a40c-03b930a8fac6\") " pod="openshift-marketplace/redhat-operators-c2g4c" Mar 09 14:06:33 crc kubenswrapper[4722]: I0309 14:06:33.670260 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnmzq\" (UniqueName: \"kubernetes.io/projected/7df01eab-424f-40b1-a40c-03b930a8fac6-kube-api-access-nnmzq\") pod \"redhat-operators-c2g4c\" (UID: \"7df01eab-424f-40b1-a40c-03b930a8fac6\") " pod="openshift-marketplace/redhat-operators-c2g4c" Mar 09 14:06:33 crc kubenswrapper[4722]: I0309 14:06:33.670364 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7df01eab-424f-40b1-a40c-03b930a8fac6-utilities\") pod \"redhat-operators-c2g4c\" (UID: \"7df01eab-424f-40b1-a40c-03b930a8fac6\") " pod="openshift-marketplace/redhat-operators-c2g4c" Mar 09 14:06:33 crc kubenswrapper[4722]: I0309 14:06:33.772690 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7df01eab-424f-40b1-a40c-03b930a8fac6-catalog-content\") pod \"redhat-operators-c2g4c\" (UID: \"7df01eab-424f-40b1-a40c-03b930a8fac6\") " pod="openshift-marketplace/redhat-operators-c2g4c" Mar 09 14:06:33 crc kubenswrapper[4722]: I0309 14:06:33.773015 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnmzq\" (UniqueName: \"kubernetes.io/projected/7df01eab-424f-40b1-a40c-03b930a8fac6-kube-api-access-nnmzq\") pod \"redhat-operators-c2g4c\" (UID: \"7df01eab-424f-40b1-a40c-03b930a8fac6\") " pod="openshift-marketplace/redhat-operators-c2g4c" Mar 09 14:06:33 crc kubenswrapper[4722]: I0309 14:06:33.773046 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7df01eab-424f-40b1-a40c-03b930a8fac6-utilities\") pod \"redhat-operators-c2g4c\" (UID: \"7df01eab-424f-40b1-a40c-03b930a8fac6\") " pod="openshift-marketplace/redhat-operators-c2g4c" Mar 09 14:06:33 crc kubenswrapper[4722]: I0309 14:06:33.773644 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7df01eab-424f-40b1-a40c-03b930a8fac6-catalog-content\") pod \"redhat-operators-c2g4c\" (UID: \"7df01eab-424f-40b1-a40c-03b930a8fac6\") " pod="openshift-marketplace/redhat-operators-c2g4c" Mar 09 14:06:33 crc kubenswrapper[4722]: I0309 14:06:33.774131 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/7df01eab-424f-40b1-a40c-03b930a8fac6-utilities\") pod \"redhat-operators-c2g4c\" (UID: \"7df01eab-424f-40b1-a40c-03b930a8fac6\") " pod="openshift-marketplace/redhat-operators-c2g4c" Mar 09 14:06:33 crc kubenswrapper[4722]: I0309 14:06:33.792441 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnmzq\" (UniqueName: \"kubernetes.io/projected/7df01eab-424f-40b1-a40c-03b930a8fac6-kube-api-access-nnmzq\") pod \"redhat-operators-c2g4c\" (UID: \"7df01eab-424f-40b1-a40c-03b930a8fac6\") " pod="openshift-marketplace/redhat-operators-c2g4c" Mar 09 14:06:33 crc kubenswrapper[4722]: I0309 14:06:33.836823 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551080-mdzx2" Mar 09 14:06:33 crc kubenswrapper[4722]: I0309 14:06:33.902730 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c2g4c" Mar 09 14:06:33 crc kubenswrapper[4722]: I0309 14:06:33.960556 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-r2hp9"] Mar 09 14:06:33 crc kubenswrapper[4722]: E0309 14:06:33.960834 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e1fdc12-aac5-4f72-9b22-0212c2f3988e" containerName="collect-profiles" Mar 09 14:06:33 crc kubenswrapper[4722]: I0309 14:06:33.960857 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e1fdc12-aac5-4f72-9b22-0212c2f3988e" containerName="collect-profiles" Mar 09 14:06:33 crc kubenswrapper[4722]: I0309 14:06:33.961002 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e1fdc12-aac5-4f72-9b22-0212c2f3988e" containerName="collect-profiles" Mar 09 14:06:33 crc kubenswrapper[4722]: I0309 14:06:33.962031 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-r2hp9" Mar 09 14:06:33 crc kubenswrapper[4722]: I0309 14:06:33.973038 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-r2hp9"] Mar 09 14:06:33 crc kubenswrapper[4722]: I0309 14:06:33.974705 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7e1fdc12-aac5-4f72-9b22-0212c2f3988e-config-volume\") pod \"7e1fdc12-aac5-4f72-9b22-0212c2f3988e\" (UID: \"7e1fdc12-aac5-4f72-9b22-0212c2f3988e\") " Mar 09 14:06:33 crc kubenswrapper[4722]: I0309 14:06:33.974784 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7e1fdc12-aac5-4f72-9b22-0212c2f3988e-secret-volume\") pod \"7e1fdc12-aac5-4f72-9b22-0212c2f3988e\" (UID: \"7e1fdc12-aac5-4f72-9b22-0212c2f3988e\") " Mar 09 14:06:33 crc kubenswrapper[4722]: I0309 14:06:33.974904 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6fls\" (UniqueName: \"kubernetes.io/projected/7e1fdc12-aac5-4f72-9b22-0212c2f3988e-kube-api-access-h6fls\") pod \"7e1fdc12-aac5-4f72-9b22-0212c2f3988e\" (UID: \"7e1fdc12-aac5-4f72-9b22-0212c2f3988e\") " Mar 09 14:06:33 crc kubenswrapper[4722]: I0309 14:06:33.976220 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e1fdc12-aac5-4f72-9b22-0212c2f3988e-config-volume" (OuterVolumeSpecName: "config-volume") pod "7e1fdc12-aac5-4f72-9b22-0212c2f3988e" (UID: "7e1fdc12-aac5-4f72-9b22-0212c2f3988e"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:06:33 crc kubenswrapper[4722]: I0309 14:06:33.988887 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e1fdc12-aac5-4f72-9b22-0212c2f3988e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7e1fdc12-aac5-4f72-9b22-0212c2f3988e" (UID: "7e1fdc12-aac5-4f72-9b22-0212c2f3988e"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:06:33 crc kubenswrapper[4722]: I0309 14:06:33.990888 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e1fdc12-aac5-4f72-9b22-0212c2f3988e-kube-api-access-h6fls" (OuterVolumeSpecName: "kube-api-access-h6fls") pod "7e1fdc12-aac5-4f72-9b22-0212c2f3988e" (UID: "7e1fdc12-aac5-4f72-9b22-0212c2f3988e"). InnerVolumeSpecName "kube-api-access-h6fls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:06:34 crc kubenswrapper[4722]: I0309 14:06:34.059724 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 09 14:06:34 crc kubenswrapper[4722]: I0309 14:06:34.060576 4722 util.go:30] "No sandbox for pod can be found. 
Mar 09 14:06:34 crc kubenswrapper[4722]: I0309 14:06:34.062757 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Mar 09 14:06:34 crc kubenswrapper[4722]: I0309 14:06:34.063458 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Mar 09 14:06:34 crc kubenswrapper[4722]: I0309 14:06:34.064146 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Mar 09 14:06:34 crc kubenswrapper[4722]: I0309 14:06:34.075830 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ba5c493-a951-4d61-acf3-5ee964dcfe60-utilities\") pod \"redhat-operators-r2hp9\" (UID: \"6ba5c493-a951-4d61-acf3-5ee964dcfe60\") " pod="openshift-marketplace/redhat-operators-r2hp9"
Mar 09 14:06:34 crc kubenswrapper[4722]: I0309 14:06:34.075886 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcwjz\" (UniqueName: \"kubernetes.io/projected/6ba5c493-a951-4d61-acf3-5ee964dcfe60-kube-api-access-bcwjz\") pod \"redhat-operators-r2hp9\" (UID: \"6ba5c493-a951-4d61-acf3-5ee964dcfe60\") " pod="openshift-marketplace/redhat-operators-r2hp9"
Mar 09 14:06:34 crc kubenswrapper[4722]: I0309 14:06:34.075959 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ba5c493-a951-4d61-acf3-5ee964dcfe60-catalog-content\") pod \"redhat-operators-r2hp9\" (UID: \"6ba5c493-a951-4d61-acf3-5ee964dcfe60\") " pod="openshift-marketplace/redhat-operators-r2hp9"
Mar 09 14:06:34 crc kubenswrapper[4722]: I0309 14:06:34.075991 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6fls\" (UniqueName: \"kubernetes.io/projected/7e1fdc12-aac5-4f72-9b22-0212c2f3988e-kube-api-access-h6fls\") on node \"crc\" DevicePath \"\""
Mar 09 14:06:34 crc kubenswrapper[4722]: I0309 14:06:34.076000 4722 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7e1fdc12-aac5-4f72-9b22-0212c2f3988e-config-volume\") on node \"crc\" DevicePath \"\""
Mar 09 14:06:34 crc kubenswrapper[4722]: I0309 14:06:34.076009 4722 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7e1fdc12-aac5-4f72-9b22-0212c2f3988e-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 09 14:06:34 crc kubenswrapper[4722]: I0309 14:06:34.115753 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Mar 09 14:06:34 crc kubenswrapper[4722]: W0309 14:06:34.118956 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poda6e706d6_dfac_4286_8d73_4661950f260f.slice/crio-8607cd8c4686dfa36057edf6b1887a1dbd907ec1f6a7ae0dce1b206994cbb954 WatchSource:0}: Error finding container 8607cd8c4686dfa36057edf6b1887a1dbd907ec1f6a7ae0dce1b206994cbb954: Status 404 returned error can't find the container with id 8607cd8c4686dfa36057edf6b1887a1dbd907ec1f6a7ae0dce1b206994cbb954
Mar 09 14:06:34 crc kubenswrapper[4722]: I0309 14:06:34.176612 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ba5c493-a951-4d61-acf3-5ee964dcfe60-catalog-content\") pod \"redhat-operators-r2hp9\" (UID: \"6ba5c493-a951-4d61-acf3-5ee964dcfe60\") " pod="openshift-marketplace/redhat-operators-r2hp9"
Mar 09 14:06:34 crc kubenswrapper[4722]: I0309 14:06:34.176673 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ba5c493-a951-4d61-acf3-5ee964dcfe60-utilities\") pod \"redhat-operators-r2hp9\" (UID: \"6ba5c493-a951-4d61-acf3-5ee964dcfe60\") " pod="openshift-marketplace/redhat-operators-r2hp9"
Mar 09 14:06:34 crc kubenswrapper[4722]: I0309 14:06:34.176704 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcwjz\" (UniqueName: \"kubernetes.io/projected/6ba5c493-a951-4d61-acf3-5ee964dcfe60-kube-api-access-bcwjz\") pod \"redhat-operators-r2hp9\" (UID: \"6ba5c493-a951-4d61-acf3-5ee964dcfe60\") " pod="openshift-marketplace/redhat-operators-r2hp9"
Mar 09 14:06:34 crc kubenswrapper[4722]: I0309 14:06:34.176735 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/696cc4a1-0075-43f4-ad48-16c96f37174e-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"696cc4a1-0075-43f4-ad48-16c96f37174e\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 09 14:06:34 crc kubenswrapper[4722]: I0309 14:06:34.176778 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/696cc4a1-0075-43f4-ad48-16c96f37174e-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"696cc4a1-0075-43f4-ad48-16c96f37174e\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 09 14:06:34 crc kubenswrapper[4722]: I0309 14:06:34.177311 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ba5c493-a951-4d61-acf3-5ee964dcfe60-catalog-content\") pod \"redhat-operators-r2hp9\" (UID: \"6ba5c493-a951-4d61-acf3-5ee964dcfe60\") " pod="openshift-marketplace/redhat-operators-r2hp9"
Mar 09 14:06:34 crc kubenswrapper[4722]: I0309 14:06:34.177416 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ba5c493-a951-4d61-acf3-5ee964dcfe60-utilities\") pod \"redhat-operators-r2hp9\" (UID: \"6ba5c493-a951-4d61-acf3-5ee964dcfe60\") " pod="openshift-marketplace/redhat-operators-r2hp9"
Mar 09 14:06:34 crc kubenswrapper[4722]: I0309 14:06:34.196027 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcwjz\" (UniqueName: \"kubernetes.io/projected/6ba5c493-a951-4d61-acf3-5ee964dcfe60-kube-api-access-bcwjz\") pod \"redhat-operators-r2hp9\" (UID: \"6ba5c493-a951-4d61-acf3-5ee964dcfe60\") " pod="openshift-marketplace/redhat-operators-r2hp9"
Mar 09 14:06:34 crc kubenswrapper[4722]: I0309 14:06:34.251894 4722 patch_prober.go:28] interesting pod/router-default-5444994796-dp8wn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 09 14:06:34 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld
Mar 09 14:06:34 crc kubenswrapper[4722]: [+]process-running ok
Mar 09 14:06:34 crc kubenswrapper[4722]: healthz check failed
Mar 09 14:06:34 crc kubenswrapper[4722]: I0309 14:06:34.251981 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dp8wn" podUID="317444ee-0620-47d2-869e-77578a367a87" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
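
The router entries above show a startup probe failing: the kubelet GETs the health endpoint, and any HTTP status outside 200-399 (here the 500 carrying the [-]backend-http / [-]has-synced detail) counts as a failure. Roughly how such a probe is declared with the k8s.io/api/core/v1 types; the port and thresholds below are illustrative guesses, not the router's real configuration:

package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
	"k8s.io/apimachinery/pkg/util/intstr"
)

// startupProbe builds an HTTP startup probe like the one failing in
// the log: the kubelet keeps probing until it succeeds or the
// failure budget (PeriodSeconds * FailureThreshold) is exhausted.
func startupProbe() *corev1.Probe {
	return &corev1.Probe{
		ProbeHandler: corev1.ProbeHandler{
			HTTPGet: &corev1.HTTPGetAction{
				Path: "/healthz",           // returns 500 until the router syncs
				Port: intstr.FromInt(1936), // assumed stats port
			},
		},
		PeriodSeconds:    1,
		FailureThreshold: 120,
	}
}

func main() {
	fmt.Printf("%+v\n", startupProbe())
}
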
Mar 09 14:06:34 crc kubenswrapper[4722]: I0309 14:06:34.278281 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/696cc4a1-0075-43f4-ad48-16c96f37174e-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"696cc4a1-0075-43f4-ad48-16c96f37174e\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 09 14:06:34 crc kubenswrapper[4722]: I0309 14:06:34.278354 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/696cc4a1-0075-43f4-ad48-16c96f37174e-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"696cc4a1-0075-43f4-ad48-16c96f37174e\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 09 14:06:34 crc kubenswrapper[4722]: I0309 14:06:34.278448 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/696cc4a1-0075-43f4-ad48-16c96f37174e-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"696cc4a1-0075-43f4-ad48-16c96f37174e\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 09 14:06:34 crc kubenswrapper[4722]: I0309 14:06:34.295753 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-r2hp9"
Mar 09 14:06:34 crc kubenswrapper[4722]: I0309 14:06:34.306568 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/696cc4a1-0075-43f4-ad48-16c96f37174e-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"696cc4a1-0075-43f4-ad48-16c96f37174e\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 09 14:06:34 crc kubenswrapper[4722]: I0309 14:06:34.320454 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c2g4c"]
Mar 09 14:06:34 crc kubenswrapper[4722]: W0309 14:06:34.327830 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7df01eab_424f_40b1_a40c_03b930a8fac6.slice/crio-1efad77f097ccbcc1966a4d440a83442acdf700b6e204ef056da07356327c341 WatchSource:0}: Error finding container 1efad77f097ccbcc1966a4d440a83442acdf700b6e204ef056da07356327c341: Status 404 returned error can't find the container with id 1efad77f097ccbcc1966a4d440a83442acdf700b6e204ef056da07356327c341
Mar 09 14:06:34 crc kubenswrapper[4722]: I0309 14:06:34.404904 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 09 14:06:34 crc kubenswrapper[4722]: I0309 14:06:34.595953 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c2g4c" event={"ID":"7df01eab-424f-40b1-a40c-03b930a8fac6","Type":"ContainerStarted","Data":"1efad77f097ccbcc1966a4d440a83442acdf700b6e204ef056da07356327c341"}
Mar 09 14:06:34 crc kubenswrapper[4722]: I0309 14:06:34.597984 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551080-mdzx2" event={"ID":"7e1fdc12-aac5-4f72-9b22-0212c2f3988e","Type":"ContainerDied","Data":"0a8fd4eed82d6739ca1061292610cac0298147fe092602018bac3f6624c31bea"}
Mar 09 14:06:34 crc kubenswrapper[4722]: I0309 14:06:34.598032 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a8fd4eed82d6739ca1061292610cac0298147fe092602018bac3f6624c31bea"
Mar 09 14:06:34 crc kubenswrapper[4722]: I0309 14:06:34.598110 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551080-mdzx2"
Mar 09 14:06:34 crc kubenswrapper[4722]: I0309 14:06:34.599611 4722 generic.go:334] "Generic (PLEG): container finished" podID="cfd6e90e-4eeb-4372-8465-136a383e95b2" containerID="c83797e6984a294a0485ee468a0577500f0b718e4e06020eefcaa5eff63fa0f9" exitCode=0
Mar 09 14:06:34 crc kubenswrapper[4722]: I0309 14:06:34.599671 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rdkdt" event={"ID":"cfd6e90e-4eeb-4372-8465-136a383e95b2","Type":"ContainerDied","Data":"c83797e6984a294a0485ee468a0577500f0b718e4e06020eefcaa5eff63fa0f9"}
Mar 09 14:06:34 crc kubenswrapper[4722]: I0309 14:06:34.601728 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"a6e706d6-dfac-4286-8d73-4661950f260f","Type":"ContainerStarted","Data":"8607cd8c4686dfa36057edf6b1887a1dbd907ec1f6a7ae0dce1b206994cbb954"}
Mar 09 14:06:34 crc kubenswrapper[4722]: I0309 14:06:34.603563 4722 generic.go:334] "Generic (PLEG): container finished" podID="a9665111-e6dc-49f2-803a-96eebcc4c78c" containerID="4f4e8d1a2b8c2dbf9b4e1f90d8043b6bee2cdde2a3273416469c66457f516df8" exitCode=0
Mar 09 14:06:34 crc kubenswrapper[4722]: I0309 14:06:34.603656 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c96ws" event={"ID":"a9665111-e6dc-49f2-803a-96eebcc4c78c","Type":"ContainerDied","Data":"4f4e8d1a2b8c2dbf9b4e1f90d8043b6bee2cdde2a3273416469c66457f516df8"}
Mar 09 14:06:34 crc kubenswrapper[4722]: I0309 14:06:34.603705 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c96ws" event={"ID":"a9665111-e6dc-49f2-803a-96eebcc4c78c","Type":"ContainerStarted","Data":"1abc54c1cf595773b8b9982364a5828c8941a10b4323ea411a90248e7a74f76d"}
Mar 09 14:06:34 crc kubenswrapper[4722]: I0309 14:06:34.638379 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Mar 09 14:06:34 crc kubenswrapper[4722]: I0309 14:06:34.737258 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-r2hp9"]
Mar 09 14:06:34 crc kubenswrapper[4722]: I0309 14:06:34.753921 4722 ???:1] "http: TLS handshake error from 192.168.126.11:53086: no serving certificate available for the kubelet"
Mar 09 14:06:35 crc kubenswrapper[4722]: I0309 14:06:35.146311 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-nnjvd"
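
The generic.go:334 "container finished" and "SyncLoop (PLEG)" entries above come from the pod lifecycle event generator: it periodically relists containers from CRI-O, compares each container's state with the previous relist, and feeds ContainerStarted/ContainerDied events into the sync loop. A compressed sketch of that diffing; the types are invented for the example, container IDs are abbreviated from the log, and the real PLEG tracks much more state:

package main

import "fmt"

type state string

const (
	running state = "running"
	exited  state = "exited"
)

type event struct {
	PodID, ContainerID, Type string
}

// relist diffs the container states seen by two consecutive relists
// and emits the transitions as PLEG events.
func relist(prev, cur map[string]state, podID string) []event {
	var evs []event
	for id, s := range cur {
		switch {
		case s == running && prev[id] != running:
			evs = append(evs, event{podID, id, "ContainerStarted"})
		case s == exited && prev[id] == running:
			evs = append(evs, event{podID, id, "ContainerDied"})
		}
	}
	return evs
}

func main() {
	prev := map[string]state{"c83797e6": running}
	cur := map[string]state{"c83797e6": exited}
	fmt.Println(relist(prev, cur, "cfd6e90e")) // [{cfd6e90e c83797e6 ContainerDied}]
}
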
Mar 09 14:06:35 crc kubenswrapper[4722]: I0309 14:06:35.279088 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-dp8wn"
Mar 09 14:06:35 crc kubenswrapper[4722]: I0309 14:06:35.283190 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-dp8wn"
Mar 09 14:06:35 crc kubenswrapper[4722]: I0309 14:06:35.627914 4722 generic.go:334] "Generic (PLEG): container finished" podID="7df01eab-424f-40b1-a40c-03b930a8fac6" containerID="7c497a08089b3d2432fefa7cff8373039c142fd35a2f682c0dd090a00a79b0c7" exitCode=0
Mar 09 14:06:35 crc kubenswrapper[4722]: I0309 14:06:35.628028 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c2g4c" event={"ID":"7df01eab-424f-40b1-a40c-03b930a8fac6","Type":"ContainerDied","Data":"7c497a08089b3d2432fefa7cff8373039c142fd35a2f682c0dd090a00a79b0c7"}
Mar 09 14:06:35 crc kubenswrapper[4722]: I0309 14:06:35.631032 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r2hp9" event={"ID":"6ba5c493-a951-4d61-acf3-5ee964dcfe60","Type":"ContainerStarted","Data":"103e0ee2ddfa2f25215d03fd15afc82ae1c8aad352747fff91d183a51d30c34f"}
Mar 09 14:06:35 crc kubenswrapper[4722]: I0309 14:06:35.631076 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r2hp9" event={"ID":"6ba5c493-a951-4d61-acf3-5ee964dcfe60","Type":"ContainerStarted","Data":"c0b77c7bc9d97f3661f2a57a288ede760a71b5527fb709c33e741d593844a945"}
Mar 09 14:06:35 crc kubenswrapper[4722]: I0309 14:06:35.633297 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"696cc4a1-0075-43f4-ad48-16c96f37174e","Type":"ContainerStarted","Data":"417a42483d78c245c870e6ee4bc401354d5ab51978093fb5f6c620ea842bfa7a"}
Mar 09 14:06:35 crc kubenswrapper[4722]: I0309 14:06:35.633339 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"696cc4a1-0075-43f4-ad48-16c96f37174e","Type":"ContainerStarted","Data":"0673248890fc1f6b159b513bcb4b46b26df87031bbe5a2b0d6113807d184215b"}
Mar 09 14:06:35 crc kubenswrapper[4722]: I0309 14:06:35.636036 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"a6e706d6-dfac-4286-8d73-4661950f260f","Type":"ContainerStarted","Data":"551adacd97e9295768aa294e32edc2c92758e2acd292d5bd2fbf9ab022068b44"}
Mar 09 14:06:35 crc kubenswrapper[4722]: I0309 14:06:35.637586 4722 ???:1] "http: TLS handshake error from 192.168.126.11:53094: no serving certificate available for the kubelet"
Mar 09 14:06:35 crc kubenswrapper[4722]: I0309 14:06:35.705608 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=1.705590435 podStartE2EDuration="1.705590435s" podCreationTimestamp="2026-03-09 14:06:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:06:35.684672215 +0000 UTC m=+236.240240781" watchObservedRunningTime="2026-03-09 14:06:35.705590435 +0000 UTC m=+236.261159011"
Mar 09 14:06:35 crc kubenswrapper[4722]: I0309 14:06:35.720078 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.720061154 podStartE2EDuration="2.720061154s" podCreationTimestamp="2026-03-09 14:06:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:06:35.718375294 +0000 UTC m=+236.273943870" watchObservedRunningTime="2026-03-09 14:06:35.720061154 +0000 UTC m=+236.275629730"
Mar 09 14:06:36 crc kubenswrapper[4722]: I0309 14:06:36.645860 4722 generic.go:334] "Generic (PLEG): container finished" podID="a6e706d6-dfac-4286-8d73-4661950f260f" containerID="551adacd97e9295768aa294e32edc2c92758e2acd292d5bd2fbf9ab022068b44" exitCode=0
Mar 09 14:06:36 crc kubenswrapper[4722]: I0309 14:06:36.645959 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"a6e706d6-dfac-4286-8d73-4661950f260f","Type":"ContainerDied","Data":"551adacd97e9295768aa294e32edc2c92758e2acd292d5bd2fbf9ab022068b44"}
Mar 09 14:06:36 crc kubenswrapper[4722]: I0309 14:06:36.648094 4722 generic.go:334] "Generic (PLEG): container finished" podID="6ba5c493-a951-4d61-acf3-5ee964dcfe60" containerID="103e0ee2ddfa2f25215d03fd15afc82ae1c8aad352747fff91d183a51d30c34f" exitCode=0
Mar 09 14:06:36 crc kubenswrapper[4722]: I0309 14:06:36.648176 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r2hp9" event={"ID":"6ba5c493-a951-4d61-acf3-5ee964dcfe60","Type":"ContainerDied","Data":"103e0ee2ddfa2f25215d03fd15afc82ae1c8aad352747fff91d183a51d30c34f"}
Mar 09 14:06:36 crc kubenswrapper[4722]: I0309 14:06:36.650775 4722 generic.go:334] "Generic (PLEG): container finished" podID="696cc4a1-0075-43f4-ad48-16c96f37174e" containerID="417a42483d78c245c870e6ee4bc401354d5ab51978093fb5f6c620ea842bfa7a" exitCode=0
Mar 09 14:06:36 crc kubenswrapper[4722]: I0309 14:06:36.650826 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"696cc4a1-0075-43f4-ad48-16c96f37174e","Type":"ContainerDied","Data":"417a42483d78c245c870e6ee4bc401354d5ab51978093fb5f6c620ea842bfa7a"}
Mar 09 14:06:41 crc kubenswrapper[4722]: I0309 14:06:41.625548 4722 patch_prober.go:28] interesting pod/downloads-7954f5f757-knrzp container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body=
Mar 09 14:06:41 crc kubenswrapper[4722]: I0309 14:06:41.626434 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-knrzp" podUID="8f915ef9-5d9a-43ee-a333-def8766e083d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused"
Mar 09 14:06:41 crc kubenswrapper[4722]: I0309 14:06:41.626167 4722 patch_prober.go:28] interesting pod/downloads-7954f5f757-knrzp container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body=
Mar 09 14:06:41 crc kubenswrapper[4722]: I0309 14:06:41.626547 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-knrzp" podUID="8f915ef9-5d9a-43ee-a333-def8766e083d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused"
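
The two "Observed pod startup duration" entries above are straightforward arithmetic: podStartSLOduration is observedRunningTime minus podCreationTimestamp, with image-pull time excluded when a pull actually happened (here both pull timestamps are the zero time, so nothing is subtracted). A quick check of the revision-pruner-8-crc numbers:

package main

import (
	"fmt"
	"time"
)

func main() {
	// Timestamps copied from the log entry for revision-pruner-8-crc.
	created, _ := time.Parse(time.RFC3339, "2026-03-09T14:06:34Z")
	running, _ := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST",
		"2026-03-09 14:06:35.705590435 +0000 UTC")
	fmt.Println(running.Sub(created).Seconds()) // 1.705590435, the logged SLO duration
}
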
Mar 09 14:06:42 crc kubenswrapper[4722]: I0309 14:06:42.475764 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f5dbc0be-527e-4f70-b185-3a10b1b11a75-metrics-certs\") pod \"network-metrics-daemon-pvvhj\" (UID: \"f5dbc0be-527e-4f70-b185-3a10b1b11a75\") " pod="openshift-multus/network-metrics-daemon-pvvhj"
Mar 09 14:06:42 crc kubenswrapper[4722]: I0309 14:06:42.478236 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 09 14:06:42 crc kubenswrapper[4722]: I0309 14:06:42.506331 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f5dbc0be-527e-4f70-b185-3a10b1b11a75-metrics-certs\") pod \"network-metrics-daemon-pvvhj\" (UID: \"f5dbc0be-527e-4f70-b185-3a10b1b11a75\") " pod="openshift-multus/network-metrics-daemon-pvvhj"
Mar 09 14:06:42 crc kubenswrapper[4722]: I0309 14:06:42.578077 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Mar 09 14:06:42 crc kubenswrapper[4722]: I0309 14:06:42.587514 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-pvvhj"
Mar 09 14:06:43 crc kubenswrapper[4722]: I0309 14:06:43.160852 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-sfch8"
Mar 09 14:06:43 crc kubenswrapper[4722]: I0309 14:06:43.164263 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-sfch8"
Mar 09 14:06:45 crc kubenswrapper[4722]: I0309 14:06:45.038702 4722 ???:1] "http: TLS handshake error from 192.168.126.11:47682: no serving certificate available for the kubelet"
Mar 09 14:06:46 crc kubenswrapper[4722]: I0309 14:06:46.233971 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 09 14:06:46 crc kubenswrapper[4722]: I0309 14:06:46.563607 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5bbf48759-6bvgh"]
Mar 09 14:06:46 crc kubenswrapper[4722]: I0309 14:06:46.563828 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5bbf48759-6bvgh" podUID="e4e43d96-89d5-4ba8-8b68-5b3be43d23a9" containerName="controller-manager" containerID="cri-o://f955a81a3843f4555182e6149afe7b013e9969ddb1ca43d46c0169717bea5581" gracePeriod=30
Mar 09 14:06:46 crc kubenswrapper[4722]: I0309 14:06:46.577064 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-68dc4b87b5-hhjmz"]
Mar 09 14:06:46 crc kubenswrapper[4722]: I0309 14:06:46.577337 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-68dc4b87b5-hhjmz" podUID="13ca2ac7-d53e-437c-a6e2-b851e281ebf3" containerName="route-controller-manager" containerID="cri-o://b1b7ea33f844f9b44d9fa4569482bc9bcd47681316dd9954e9365ad41bb40792" gracePeriod=30
Mar 09 14:06:48 crc kubenswrapper[4722]: I0309 14:06:48.717024 4722 generic.go:334] "Generic (PLEG): container finished" podID="e4e43d96-89d5-4ba8-8b68-5b3be43d23a9" containerID="f955a81a3843f4555182e6149afe7b013e9969ddb1ca43d46c0169717bea5581" exitCode=0
Mar 09 14:06:48 crc kubenswrapper[4722]: I0309 14:06:48.717112 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5bbf48759-6bvgh" event={"ID":"e4e43d96-89d5-4ba8-8b68-5b3be43d23a9","Type":"ContainerDied","Data":"f955a81a3843f4555182e6149afe7b013e9969ddb1ca43d46c0169717bea5581"}
Mar 09 14:06:48 crc kubenswrapper[4722]: I0309 14:06:48.718583 4722 generic.go:334] "Generic (PLEG): container finished" podID="13ca2ac7-d53e-437c-a6e2-b851e281ebf3" containerID="b1b7ea33f844f9b44d9fa4569482bc9bcd47681316dd9954e9365ad41bb40792" exitCode=0
Mar 09 14:06:48 crc kubenswrapper[4722]: I0309 14:06:48.718609 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-68dc4b87b5-hhjmz" event={"ID":"13ca2ac7-d53e-437c-a6e2-b851e281ebf3","Type":"ContainerDied","Data":"b1b7ea33f844f9b44d9fa4569482bc9bcd47681316dd9954e9365ad41bb40792"}
Mar 09 14:06:50 crc kubenswrapper[4722]: I0309 14:06:50.454189 4722 patch_prober.go:28] interesting pod/controller-manager-5bbf48759-6bvgh container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.45:8443/healthz\": dial tcp 10.217.0.45:8443: connect: connection refused" start-of-body=
Mar 09 14:06:50 crc kubenswrapper[4722]: I0309 14:06:50.454276 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-5bbf48759-6bvgh" podUID="e4e43d96-89d5-4ba8-8b68-5b3be43d23a9" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.45:8443/healthz\": dial tcp 10.217.0.45:8443: connect: connection refused"
Mar 09 14:06:50 crc kubenswrapper[4722]: I0309 14:06:50.471822 4722 patch_prober.go:28] interesting pod/route-controller-manager-68dc4b87b5-hhjmz container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.46:8443/healthz\": dial tcp 10.217.0.46:8443: connect: connection refused" start-of-body=
Mar 09 14:06:50 crc kubenswrapper[4722]: I0309 14:06:50.471896 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-68dc4b87b5-hhjmz" podUID="13ca2ac7-d53e-437c-a6e2-b851e281ebf3" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.46:8443/healthz\": dial tcp 10.217.0.46:8443: connect: connection refused"
Mar 09 14:06:51 crc kubenswrapper[4722]: I0309 14:06:51.328776 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 09 14:06:51 crc kubenswrapper[4722]: I0309 14:06:51.339573 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
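
The "SyncLoop DELETE" → "Killing container with a grace period ... gracePeriod=30" pairs above are the kubelet delivering SIGTERM and giving each container the pod's terminationGracePeriodSeconds (30s here) before a SIGKILL; both controller managers exit 0 about two seconds later. A server that wants to exit cleanly inside that window typically hooks SIGTERM along these lines (a generic sketch, not the controller-manager's code):

package main

import (
	"context"
	"net/http"
	"os/signal"
	"syscall"
	"time"
)

func main() {
	srv := &http.Server{Addr: ":8443"}
	go srv.ListenAndServe()

	// ctx is cancelled when the kubelet delivers SIGTERM.
	ctx, stop := signal.NotifyContext(context.Background(), syscall.SIGTERM)
	defer stop()
	<-ctx.Done() // the grace period starts now

	// Drain in-flight requests well before SIGKILL would arrive.
	shutdownCtx, cancel := context.WithTimeout(context.Background(), 25*time.Second)
	defer cancel()
	srv.Shutdown(shutdownCtx)
}
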
Mar 09 14:06:51 crc kubenswrapper[4722]: I0309 14:06:51.413153 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/696cc4a1-0075-43f4-ad48-16c96f37174e-kube-api-access\") pod \"696cc4a1-0075-43f4-ad48-16c96f37174e\" (UID: \"696cc4a1-0075-43f4-ad48-16c96f37174e\") "
Mar 09 14:06:51 crc kubenswrapper[4722]: I0309 14:06:51.413266 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a6e706d6-dfac-4286-8d73-4661950f260f-kube-api-access\") pod \"a6e706d6-dfac-4286-8d73-4661950f260f\" (UID: \"a6e706d6-dfac-4286-8d73-4661950f260f\") "
Mar 09 14:06:51 crc kubenswrapper[4722]: I0309 14:06:51.413322 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a6e706d6-dfac-4286-8d73-4661950f260f-kubelet-dir\") pod \"a6e706d6-dfac-4286-8d73-4661950f260f\" (UID: \"a6e706d6-dfac-4286-8d73-4661950f260f\") "
Mar 09 14:06:51 crc kubenswrapper[4722]: I0309 14:06:51.413362 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/696cc4a1-0075-43f4-ad48-16c96f37174e-kubelet-dir\") pod \"696cc4a1-0075-43f4-ad48-16c96f37174e\" (UID: \"696cc4a1-0075-43f4-ad48-16c96f37174e\") "
Mar 09 14:06:51 crc kubenswrapper[4722]: I0309 14:06:51.413518 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a6e706d6-dfac-4286-8d73-4661950f260f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "a6e706d6-dfac-4286-8d73-4661950f260f" (UID: "a6e706d6-dfac-4286-8d73-4661950f260f"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 09 14:06:51 crc kubenswrapper[4722]: I0309 14:06:51.413602 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/696cc4a1-0075-43f4-ad48-16c96f37174e-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "696cc4a1-0075-43f4-ad48-16c96f37174e" (UID: "696cc4a1-0075-43f4-ad48-16c96f37174e"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 09 14:06:51 crc kubenswrapper[4722]: I0309 14:06:51.421710 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6e706d6-dfac-4286-8d73-4661950f260f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "a6e706d6-dfac-4286-8d73-4661950f260f" (UID: "a6e706d6-dfac-4286-8d73-4661950f260f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:06:51 crc kubenswrapper[4722]: I0309 14:06:51.422070 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/696cc4a1-0075-43f4-ad48-16c96f37174e-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "696cc4a1-0075-43f4-ad48-16c96f37174e" (UID: "696cc4a1-0075-43f4-ad48-16c96f37174e"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:06:51 crc kubenswrapper[4722]: I0309 14:06:51.514082 4722 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a6e706d6-dfac-4286-8d73-4661950f260f-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 09 14:06:51 crc kubenswrapper[4722]: I0309 14:06:51.514121 4722 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/696cc4a1-0075-43f4-ad48-16c96f37174e-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 09 14:06:51 crc kubenswrapper[4722]: I0309 14:06:51.514131 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/696cc4a1-0075-43f4-ad48-16c96f37174e-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 09 14:06:51 crc kubenswrapper[4722]: I0309 14:06:51.514142 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a6e706d6-dfac-4286-8d73-4661950f260f-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 09 14:06:51 crc kubenswrapper[4722]: I0309 14:06:51.528286 4722 patch_prober.go:28] interesting pod/machine-config-daemon-hjrrb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 14:06:51 crc kubenswrapper[4722]: I0309 14:06:51.528380 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 14:06:51 crc kubenswrapper[4722]: I0309 14:06:51.621806 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh"
Mar 09 14:06:51 crc kubenswrapper[4722]: I0309 14:06:51.625140 4722 patch_prober.go:28] interesting pod/downloads-7954f5f757-knrzp container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body=
Mar 09 14:06:51 crc kubenswrapper[4722]: I0309 14:06:51.625145 4722 patch_prober.go:28] interesting pod/downloads-7954f5f757-knrzp container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body=
Mar 09 14:06:51 crc kubenswrapper[4722]: I0309 14:06:51.625220 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-knrzp" podUID="8f915ef9-5d9a-43ee-a333-def8766e083d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused"
Mar 09 14:06:51 crc kubenswrapper[4722]: I0309 14:06:51.625288 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-knrzp" podUID="8f915ef9-5d9a-43ee-a333-def8766e083d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused"
Mar 09 14:06:51 crc kubenswrapper[4722]: I0309 14:06:51.625597 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console/downloads-7954f5f757-knrzp"
Mar 09 14:06:51 crc kubenswrapper[4722]: I0309 14:06:51.626061 4722 patch_prober.go:28] interesting pod/downloads-7954f5f757-knrzp container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body=
Mar 09 14:06:51 crc kubenswrapper[4722]: I0309 14:06:51.626091 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-knrzp" podUID="8f915ef9-5d9a-43ee-a333-def8766e083d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused"
Mar 09 14:06:51 crc kubenswrapper[4722]: I0309 14:06:51.626400 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="download-server" containerStatusID={"Type":"cri-o","ID":"08b7729dd5722a2a83f8b986c99cc6315569c3cd2d38112dd77be8d55fd5d716"} pod="openshift-console/downloads-7954f5f757-knrzp" containerMessage="Container download-server failed liveness probe, will be restarted"
Mar 09 14:06:51 crc kubenswrapper[4722]: I0309 14:06:51.626441 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/downloads-7954f5f757-knrzp" podUID="8f915ef9-5d9a-43ee-a333-def8766e083d" containerName="download-server" containerID="cri-o://08b7729dd5722a2a83f8b986c99cc6315569c3cd2d38112dd77be8d55fd5d716" gracePeriod=2
Mar 09 14:06:51 crc kubenswrapper[4722]: E0309 14:06:51.712256 4722 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f915ef9_5d9a_43ee_a333_def8766e083d.slice/crio-08b7729dd5722a2a83f8b986c99cc6315569c3cd2d38112dd77be8d55fd5d716.scope\": RecentStats: unable to find data in memory cache]"
Mar 09 14:06:51 crc kubenswrapper[4722]: I0309 14:06:51.734369 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"696cc4a1-0075-43f4-ad48-16c96f37174e","Type":"ContainerDied","Data":"0673248890fc1f6b159b513bcb4b46b26df87031bbe5a2b0d6113807d184215b"}
Mar 09 14:06:51 crc kubenswrapper[4722]: I0309 14:06:51.734431 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0673248890fc1f6b159b513bcb4b46b26df87031bbe5a2b0d6113807d184215b"
Mar 09 14:06:51 crc kubenswrapper[4722]: I0309 14:06:51.734391 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 09 14:06:51 crc kubenswrapper[4722]: I0309 14:06:51.735884 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"a6e706d6-dfac-4286-8d73-4661950f260f","Type":"ContainerDied","Data":"8607cd8c4686dfa36057edf6b1887a1dbd907ec1f6a7ae0dce1b206994cbb954"}
Mar 09 14:06:51 crc kubenswrapper[4722]: I0309 14:06:51.735936 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8607cd8c4686dfa36057edf6b1887a1dbd907ec1f6a7ae0dce1b206994cbb954"
Mar 09 14:06:51 crc kubenswrapper[4722]: I0309 14:06:51.735907 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 09 14:06:52 crc kubenswrapper[4722]: I0309 14:06:52.741771 4722 generic.go:334] "Generic (PLEG): container finished" podID="8f915ef9-5d9a-43ee-a333-def8766e083d" containerID="08b7729dd5722a2a83f8b986c99cc6315569c3cd2d38112dd77be8d55fd5d716" exitCode=0
Mar 09 14:06:52 crc kubenswrapper[4722]: I0309 14:06:52.741840 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-knrzp" event={"ID":"8f915ef9-5d9a-43ee-a333-def8766e083d","Type":"ContainerDied","Data":"08b7729dd5722a2a83f8b986c99cc6315569c3cd2d38112dd77be8d55fd5d716"}
Mar 09 14:07:01 crc kubenswrapper[4722]: I0309 14:07:01.453915 4722 patch_prober.go:28] interesting pod/controller-manager-5bbf48759-6bvgh container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.45:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 14:07:01 crc kubenswrapper[4722]: I0309 14:07:01.454321 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-5bbf48759-6bvgh" podUID="e4e43d96-89d5-4ba8-8b68-5b3be43d23a9" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.45:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 09 14:07:01 crc kubenswrapper[4722]: I0309 14:07:01.468572 4722 patch_prober.go:28] interesting pod/route-controller-manager-68dc4b87b5-hhjmz container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.46:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 14:07:01 crc kubenswrapper[4722]: I0309 14:07:01.468641 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-68dc4b87b5-hhjmz" podUID="13ca2ac7-d53e-437c-a6e2-b851e281ebf3" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.46:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 09 14:07:01 crc kubenswrapper[4722]: I0309 14:07:01.627383 4722 patch_prober.go:28] interesting pod/downloads-7954f5f757-knrzp container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body=
Mar 09 14:07:01 crc kubenswrapper[4722]: I0309 14:07:01.627464 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-knrzp" podUID="8f915ef9-5d9a-43ee-a333-def8766e083d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused"
Mar 09 14:07:03 crc kubenswrapper[4722]: I0309 14:07:03.574734 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mrtb2"
Mar 09 14:07:04 crc kubenswrapper[4722]: I0309 14:07:04.851391 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Mar 09 14:07:04 crc kubenswrapper[4722]: E0309 14:07:04.851883 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="696cc4a1-0075-43f4-ad48-16c96f37174e" containerName="pruner"
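
Above, repeated liveness failures against 10.217.0.26:8080 flip download-server to status="unhealthy", the kubelet records "Container download-server failed liveness probe, will be restarted", and the kill uses the shorter gracePeriod=2. The decision itself amounts to a consecutive-failure counter; a toy version of that logic (real thresholds come from the container's liveness probe spec, and the kubelet's prober is considerably more involved):

package main

import "fmt"

// needsRestart reports whether a container should be killed and
// restarted: failureThreshold consecutive probe failures with no
// intervening success.
func needsRestart(check func() bool, failureThreshold int) bool {
	for i := 0; i < failureThreshold; i++ {
		if check() {
			return false // one success resets the verdict
		}
	}
	return true
}

func main() {
	refused := func() bool { return false } // connection refused every time
	fmt.Println(needsRestart(refused, 3))   // true -> kill with grace period
}
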
Mar 09 14:07:04 crc kubenswrapper[4722]: I0309 14:07:04.851905 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="696cc4a1-0075-43f4-ad48-16c96f37174e" containerName="pruner"
Mar 09 14:07:04 crc kubenswrapper[4722]: E0309 14:07:04.851936 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6e706d6-dfac-4286-8d73-4661950f260f" containerName="pruner"
Mar 09 14:07:04 crc kubenswrapper[4722]: I0309 14:07:04.851948 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6e706d6-dfac-4286-8d73-4661950f260f" containerName="pruner"
Mar 09 14:07:04 crc kubenswrapper[4722]: I0309 14:07:04.852111 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6e706d6-dfac-4286-8d73-4661950f260f" containerName="pruner"
Mar 09 14:07:04 crc kubenswrapper[4722]: I0309 14:07:04.852126 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="696cc4a1-0075-43f4-ad48-16c96f37174e" containerName="pruner"
Mar 09 14:07:04 crc kubenswrapper[4722]: I0309 14:07:04.852904 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 09 14:07:04 crc kubenswrapper[4722]: I0309 14:07:04.855284 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Mar 09 14:07:04 crc kubenswrapper[4722]: I0309 14:07:04.855759 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Mar 09 14:07:04 crc kubenswrapper[4722]: I0309 14:07:04.858172 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Mar 09 14:07:05 crc kubenswrapper[4722]: I0309 14:07:05.018848 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/812aaee7-17d0-4777-a2a0-f1ffcb8a5ad4-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"812aaee7-17d0-4777-a2a0-f1ffcb8a5ad4\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 09 14:07:05 crc kubenswrapper[4722]: I0309 14:07:05.018914 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/812aaee7-17d0-4777-a2a0-f1ffcb8a5ad4-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"812aaee7-17d0-4777-a2a0-f1ffcb8a5ad4\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 09 14:07:05 crc kubenswrapper[4722]: I0309 14:07:05.120684 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/812aaee7-17d0-4777-a2a0-f1ffcb8a5ad4-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"812aaee7-17d0-4777-a2a0-f1ffcb8a5ad4\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 09 14:07:05 crc kubenswrapper[4722]: I0309 14:07:05.120775 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/812aaee7-17d0-4777-a2a0-f1ffcb8a5ad4-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"812aaee7-17d0-4777-a2a0-f1ffcb8a5ad4\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 09 14:07:05 crc kubenswrapper[4722]: I0309 14:07:05.120839 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/812aaee7-17d0-4777-a2a0-f1ffcb8a5ad4-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"812aaee7-17d0-4777-a2a0-f1ffcb8a5ad4\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 09 14:07:05 crc kubenswrapper[4722]: I0309 14:07:05.145749 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/812aaee7-17d0-4777-a2a0-f1ffcb8a5ad4-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"812aaee7-17d0-4777-a2a0-f1ffcb8a5ad4\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 09 14:07:05 crc kubenswrapper[4722]: I0309 14:07:05.177165 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 09 14:07:05 crc kubenswrapper[4722]: E0309 14:07:05.520820 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest"
Mar 09 14:07:05 crc kubenswrapper[4722]: E0309 14:07:05.521108 4722 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 09 14:07:05 crc kubenswrapper[4722]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve
Mar 09 14:07:05 crc kubenswrapper[4722]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-csnbb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29551086-cvc28_openshift-infra(40be416c-1b7b-4973-b9ed-25ae20cd660d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled
Mar 09 14:07:05 crc kubenswrapper[4722]: > logger="UnhandledError"
Mar 09 14:07:05 crc kubenswrapper[4722]: E0309 14:07:05.522318 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29551086-cvc28" podUID="40be416c-1b7b-4973-b9ed-25ae20cd660d"
Mar 09 14:07:05 crc kubenswrapper[4722]: E0309 14:07:05.820982 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29551086-cvc28" podUID="40be416c-1b7b-4973-b9ed-25ae20cd660d"
Mar 09 14:07:07 crc kubenswrapper[4722]: I0309 14:07:07.200952 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5bbf48759-6bvgh"
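
The auto-csr-approver job above (stuck in ErrImagePull) wraps a shell one-liner: list CSRs with no status and pipe them to oc adm certificate approve. Until something approves those CSRs, the kubelet keeps logging the "no serving certificate available for the kubelet" TLS errors seen earlier. The same approval can be written against client-go; a sketch assuming in-cluster credentials and RBAC to update the certificatesigningrequests/approval subresource:

package main

import (
	"context"

	certv1 "k8s.io/api/certificates/v1"
	corev1 "k8s.io/api/core/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/rest"
)

func main() {
	cfg, err := rest.InClusterConfig()
	if err != nil {
		panic(err)
	}
	cs := kubernetes.NewForConfigOrDie(cfg)
	ctx := context.Background()

	csrs, err := cs.CertificatesV1().CertificateSigningRequests().
		List(ctx, metav1.ListOptions{})
	if err != nil {
		panic(err)
	}
	for i := range csrs.Items {
		csr := &csrs.Items[i]
		if len(csr.Status.Conditions) != 0 {
			continue // roughly the job's `{{if not .status}}` filter
		}
		csr.Status.Conditions = append(csr.Status.Conditions,
			certv1.CertificateSigningRequestCondition{
				Type:    certv1.CertificateApproved,
				Status:  corev1.ConditionTrue,
				Reason:  "AutoApproved",
				Message: "approved in place of the auto-csr-approver job",
			})
		// UpdateApproval writes to the CSR's /approval subresource.
		if _, err := cs.CertificatesV1().CertificateSigningRequests().
			UpdateApproval(ctx, csr.Name, csr, metav1.UpdateOptions{}); err != nil {
			panic(err)
		}
	}
}
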
Mar 09 14:07:07 crc kubenswrapper[4722]: I0309 14:07:07.238526 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-68dc4b87b5-hhjmz"
Mar 09 14:07:07 crc kubenswrapper[4722]: E0309 14:07:07.294850 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18"
Mar 09 14:07:07 crc kubenswrapper[4722]: E0309 14:07:07.295084 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9t5bt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-pwxnx_openshift-marketplace(c0f74bde-752e-497e-ad82-ec7a1676bbd5): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Mar 09 14:07:07 crc kubenswrapper[4722]: E0309 14:07:07.296554 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-pwxnx" podUID="c0f74bde-752e-497e-ad82-ec7a1676bbd5"
Mar 09 14:07:07 crc kubenswrapper[4722]: E0309 14:07:07.304749 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18"
Mar 09 14:07:07 crc kubenswrapper[4722]: E0309 14:07:07.304948 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cqvdd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-zmkmp_openshift-marketplace(4b4a7622-7bca-4fca-adb3-eec526b21b2b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Mar 09 14:07:07 crc kubenswrapper[4722]: E0309 14:07:07.306181 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-zmkmp" podUID="4b4a7622-7bca-4fca-adb3-eec526b21b2b"
Mar 09 14:07:07 crc kubenswrapper[4722]: I0309 14:07:07.356049 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e4e43d96-89d5-4ba8-8b68-5b3be43d23a9-client-ca\") pod \"e4e43d96-89d5-4ba8-8b68-5b3be43d23a9\" (UID: \"e4e43d96-89d5-4ba8-8b68-5b3be43d23a9\") "
Mar 09 14:07:07 crc kubenswrapper[4722]: I0309 14:07:07.356636 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbdf5\" (UniqueName: \"kubernetes.io/projected/e4e43d96-89d5-4ba8-8b68-5b3be43d23a9-kube-api-access-jbdf5\") pod \"e4e43d96-89d5-4ba8-8b68-5b3be43d23a9\" (UID: \"e4e43d96-89d5-4ba8-8b68-5b3be43d23a9\") "
Mar 09 14:07:07 crc kubenswrapper[4722]: I0309 14:07:07.356674 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wd872\" (UniqueName: \"kubernetes.io/projected/13ca2ac7-d53e-437c-a6e2-b851e281ebf3-kube-api-access-wd872\") pod \"13ca2ac7-d53e-437c-a6e2-b851e281ebf3\" (UID: \"13ca2ac7-d53e-437c-a6e2-b851e281ebf3\") "
Mar 09 14:07:07 crc kubenswrapper[4722]: I0309 14:07:07.356713 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4e43d96-89d5-4ba8-8b68-5b3be43d23a9-serving-cert\") pod \"e4e43d96-89d5-4ba8-8b68-5b3be43d23a9\" (UID: \"e4e43d96-89d5-4ba8-8b68-5b3be43d23a9\") "
Mar 09 14:07:07 crc kubenswrapper[4722]: I0309 14:07:07.356775 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e4e43d96-89d5-4ba8-8b68-5b3be43d23a9-proxy-ca-bundles\") pod \"e4e43d96-89d5-4ba8-8b68-5b3be43d23a9\" (UID: \"e4e43d96-89d5-4ba8-8b68-5b3be43d23a9\") "
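
Both community-operators pods above fail the same way: the extract-content init container's image pull is cancelled (ErrImagePull), after which the kubelet backs off further attempts (ImagePullBackOff). That state surfaces in the pods' initContainerStatuses; a small client-go check, assuming a local kubeconfig at the default path:

package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	cs := kubernetes.NewForConfigOrDie(cfg)

	pods, err := cs.CoreV1().Pods("openshift-marketplace").
		List(context.Background(), metav1.ListOptions{})
	if err != nil {
		panic(err)
	}
	for _, p := range pods.Items {
		for _, st := range p.Status.InitContainerStatuses {
			w := st.State.Waiting
			if w == nil {
				continue
			}
			// ErrImagePull on the failed attempt itself,
			// ImagePullBackOff while the kubelet waits to retry.
			if w.Reason == "ErrImagePull" || w.Reason == "ImagePullBackOff" {
				fmt.Printf("%s/%s: %s (%s)\n", p.Namespace, p.Name, w.Reason, st.Name)
			}
		}
	}
}
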
Mar 09 14:07:07 crc kubenswrapper[4722]: I0309 14:07:07.356813 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/13ca2ac7-d53e-437c-a6e2-b851e281ebf3-serving-cert\") pod \"13ca2ac7-d53e-437c-a6e2-b851e281ebf3\" (UID: \"13ca2ac7-d53e-437c-a6e2-b851e281ebf3\") "
Mar 09 14:07:07 crc kubenswrapper[4722]: I0309 14:07:07.356860 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4e43d96-89d5-4ba8-8b68-5b3be43d23a9-config\") pod \"e4e43d96-89d5-4ba8-8b68-5b3be43d23a9\" (UID: \"e4e43d96-89d5-4ba8-8b68-5b3be43d23a9\") "
Mar 09 14:07:07 crc kubenswrapper[4722]: I0309 14:07:07.356924 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13ca2ac7-d53e-437c-a6e2-b851e281ebf3-config\") pod \"13ca2ac7-d53e-437c-a6e2-b851e281ebf3\" (UID: \"13ca2ac7-d53e-437c-a6e2-b851e281ebf3\") "
Mar 09 14:07:07 crc kubenswrapper[4722]: I0309 14:07:07.356971 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/13ca2ac7-d53e-437c-a6e2-b851e281ebf3-client-ca\") pod \"13ca2ac7-d53e-437c-a6e2-b851e281ebf3\" (UID: \"13ca2ac7-d53e-437c-a6e2-b851e281ebf3\") "
Mar 09 14:07:07 crc kubenswrapper[4722]: I0309 14:07:07.357596 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4e43d96-89d5-4ba8-8b68-5b3be43d23a9-client-ca" (OuterVolumeSpecName: "client-ca") pod "e4e43d96-89d5-4ba8-8b68-5b3be43d23a9" (UID: "e4e43d96-89d5-4ba8-8b68-5b3be43d23a9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 14:07:07 crc kubenswrapper[4722]: I0309 14:07:07.358556 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4e43d96-89d5-4ba8-8b68-5b3be43d23a9-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "e4e43d96-89d5-4ba8-8b68-5b3be43d23a9" (UID: "e4e43d96-89d5-4ba8-8b68-5b3be43d23a9"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 14:07:07 crc kubenswrapper[4722]: I0309 14:07:07.358571 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13ca2ac7-d53e-437c-a6e2-b851e281ebf3-client-ca" (OuterVolumeSpecName: "client-ca") pod "13ca2ac7-d53e-437c-a6e2-b851e281ebf3" (UID: "13ca2ac7-d53e-437c-a6e2-b851e281ebf3"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 14:07:07 crc kubenswrapper[4722]: I0309 14:07:07.358600 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13ca2ac7-d53e-437c-a6e2-b851e281ebf3-config" (OuterVolumeSpecName: "config") pod "13ca2ac7-d53e-437c-a6e2-b851e281ebf3" (UID: "13ca2ac7-d53e-437c-a6e2-b851e281ebf3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 14:07:07 crc kubenswrapper[4722]: I0309 14:07:07.358625 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4e43d96-89d5-4ba8-8b68-5b3be43d23a9-config" (OuterVolumeSpecName: "config") pod "e4e43d96-89d5-4ba8-8b68-5b3be43d23a9" (UID: "e4e43d96-89d5-4ba8-8b68-5b3be43d23a9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 14:07:07 crc kubenswrapper[4722]: I0309 14:07:07.363218 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4e43d96-89d5-4ba8-8b68-5b3be43d23a9-kube-api-access-jbdf5" (OuterVolumeSpecName: "kube-api-access-jbdf5") pod "e4e43d96-89d5-4ba8-8b68-5b3be43d23a9" (UID: "e4e43d96-89d5-4ba8-8b68-5b3be43d23a9"). InnerVolumeSpecName "kube-api-access-jbdf5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:07:07 crc kubenswrapper[4722]: I0309 14:07:07.363263 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13ca2ac7-d53e-437c-a6e2-b851e281ebf3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "13ca2ac7-d53e-437c-a6e2-b851e281ebf3" (UID: "13ca2ac7-d53e-437c-a6e2-b851e281ebf3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:07:07 crc kubenswrapper[4722]: I0309 14:07:07.363374 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4e43d96-89d5-4ba8-8b68-5b3be43d23a9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e4e43d96-89d5-4ba8-8b68-5b3be43d23a9" (UID: "e4e43d96-89d5-4ba8-8b68-5b3be43d23a9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:07:07 crc kubenswrapper[4722]: I0309 14:07:07.367259 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13ca2ac7-d53e-437c-a6e2-b851e281ebf3-kube-api-access-wd872" (OuterVolumeSpecName: "kube-api-access-wd872") pod "13ca2ac7-d53e-437c-a6e2-b851e281ebf3" (UID: "13ca2ac7-d53e-437c-a6e2-b851e281ebf3"). InnerVolumeSpecName "kube-api-access-wd872". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:07:07 crc kubenswrapper[4722]: I0309 14:07:07.458886 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jbdf5\" (UniqueName: \"kubernetes.io/projected/e4e43d96-89d5-4ba8-8b68-5b3be43d23a9-kube-api-access-jbdf5\") on node \"crc\" DevicePath \"\""
Mar 09 14:07:07 crc kubenswrapper[4722]: I0309 14:07:07.458943 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wd872\" (UniqueName: \"kubernetes.io/projected/13ca2ac7-d53e-437c-a6e2-b851e281ebf3-kube-api-access-wd872\") on node \"crc\" DevicePath \"\""
Mar 09 14:07:07 crc kubenswrapper[4722]: I0309 14:07:07.458955 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e4e43d96-89d5-4ba8-8b68-5b3be43d23a9-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 09 14:07:07 crc kubenswrapper[4722]: I0309 14:07:07.458970 4722 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e4e43d96-89d5-4ba8-8b68-5b3be43d23a9-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 09 14:07:07 crc kubenswrapper[4722]: I0309 14:07:07.458982 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/13ca2ac7-d53e-437c-a6e2-b851e281ebf3-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 09 14:07:07 crc kubenswrapper[4722]: I0309 14:07:07.458994 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e4e43d96-89d5-4ba8-8b68-5b3be43d23a9-config\") on node \"crc\" DevicePath \"\""
Mar 09 14:07:07 crc kubenswrapper[4722]: I0309 14:07:07.459004 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13ca2ac7-d53e-437c-a6e2-b851e281ebf3-config\") on node \"crc\" DevicePath \"\""
Mar 09 14:07:07 crc kubenswrapper[4722]: I0309 14:07:07.459014 4722 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/13ca2ac7-d53e-437c-a6e2-b851e281ebf3-client-ca\") on node \"crc\" DevicePath \"\""
Mar 09 14:07:07 crc kubenswrapper[4722]: I0309 14:07:07.459025 4722 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e4e43d96-89d5-4ba8-8b68-5b3be43d23a9-client-ca\") on node \"crc\" DevicePath \"\""
Mar 09 14:07:07 crc kubenswrapper[4722]: I0309 14:07:07.831101 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5bbf48759-6bvgh" event={"ID":"e4e43d96-89d5-4ba8-8b68-5b3be43d23a9","Type":"ContainerDied","Data":"1fa826a95fa1b912554a1349bb8d83bc43736bc11b2065ecfdedc90fe6eff981"}
Mar 09 14:07:07 crc kubenswrapper[4722]: I0309 14:07:07.831149 4722 scope.go:117] "RemoveContainer" containerID="f955a81a3843f4555182e6149afe7b013e9969ddb1ca43d46c0169717bea5581"
Mar 09 14:07:07 crc kubenswrapper[4722]: I0309 14:07:07.831320 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5bbf48759-6bvgh" Mar 09 14:07:07 crc kubenswrapper[4722]: I0309 14:07:07.836279 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-68dc4b87b5-hhjmz" event={"ID":"13ca2ac7-d53e-437c-a6e2-b851e281ebf3","Type":"ContainerDied","Data":"cf4fdd8de94c9649d442a9d96e79904c32cf7613c1874a8d2ebf25c0c3a51dfa"} Mar 09 14:07:07 crc kubenswrapper[4722]: I0309 14:07:07.836314 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-68dc4b87b5-hhjmz" Mar 09 14:07:07 crc kubenswrapper[4722]: I0309 14:07:07.931766 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-68dc4b87b5-hhjmz"] Mar 09 14:07:07 crc kubenswrapper[4722]: I0309 14:07:07.952275 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-68dc4b87b5-hhjmz"] Mar 09 14:07:07 crc kubenswrapper[4722]: I0309 14:07:07.969489 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5bbf48759-6bvgh"] Mar 09 14:07:07 crc kubenswrapper[4722]: I0309 14:07:07.975597 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5bbf48759-6bvgh"] Mar 09 14:07:08 crc kubenswrapper[4722]: I0309 14:07:08.155053 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13ca2ac7-d53e-437c-a6e2-b851e281ebf3" path="/var/lib/kubelet/pods/13ca2ac7-d53e-437c-a6e2-b851e281ebf3/volumes" Mar 09 14:07:08 crc kubenswrapper[4722]: I0309 14:07:08.156294 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4e43d96-89d5-4ba8-8b68-5b3be43d23a9" path="/var/lib/kubelet/pods/e4e43d96-89d5-4ba8-8b68-5b3be43d23a9/volumes" Mar 09 14:07:08 crc kubenswrapper[4722]: I0309 14:07:08.539329 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c9b77b468-t9m8w"] Mar 09 14:07:08 crc kubenswrapper[4722]: E0309 14:07:08.539665 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13ca2ac7-d53e-437c-a6e2-b851e281ebf3" containerName="route-controller-manager" Mar 09 14:07:08 crc kubenswrapper[4722]: I0309 14:07:08.539683 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="13ca2ac7-d53e-437c-a6e2-b851e281ebf3" containerName="route-controller-manager" Mar 09 14:07:08 crc kubenswrapper[4722]: E0309 14:07:08.539695 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4e43d96-89d5-4ba8-8b68-5b3be43d23a9" containerName="controller-manager" Mar 09 14:07:08 crc kubenswrapper[4722]: I0309 14:07:08.539704 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4e43d96-89d5-4ba8-8b68-5b3be43d23a9" containerName="controller-manager" Mar 09 14:07:08 crc kubenswrapper[4722]: I0309 14:07:08.539843 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="13ca2ac7-d53e-437c-a6e2-b851e281ebf3" containerName="route-controller-manager" Mar 09 14:07:08 crc kubenswrapper[4722]: I0309 14:07:08.539858 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4e43d96-89d5-4ba8-8b68-5b3be43d23a9" containerName="controller-manager" Mar 09 14:07:08 crc kubenswrapper[4722]: I0309 14:07:08.540463 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7c9b77b468-t9m8w" Mar 09 14:07:08 crc kubenswrapper[4722]: I0309 14:07:08.545251 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 09 14:07:08 crc kubenswrapper[4722]: I0309 14:07:08.545436 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 09 14:07:08 crc kubenswrapper[4722]: I0309 14:07:08.545553 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 09 14:07:08 crc kubenswrapper[4722]: I0309 14:07:08.545739 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 09 14:07:08 crc kubenswrapper[4722]: I0309 14:07:08.545836 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 09 14:07:08 crc kubenswrapper[4722]: I0309 14:07:08.546289 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 09 14:07:08 crc kubenswrapper[4722]: I0309 14:07:08.562793 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5677d5654d-pmsss"] Mar 09 14:07:08 crc kubenswrapper[4722]: I0309 14:07:08.563657 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5677d5654d-pmsss" Mar 09 14:07:08 crc kubenswrapper[4722]: I0309 14:07:08.565912 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 09 14:07:08 crc kubenswrapper[4722]: I0309 14:07:08.567159 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 09 14:07:08 crc kubenswrapper[4722]: I0309 14:07:08.567455 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 09 14:07:08 crc kubenswrapper[4722]: I0309 14:07:08.567579 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 09 14:07:08 crc kubenswrapper[4722]: I0309 14:07:08.568016 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 09 14:07:08 crc kubenswrapper[4722]: I0309 14:07:08.568254 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 09 14:07:08 crc kubenswrapper[4722]: I0309 14:07:08.569595 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c9b77b468-t9m8w"] Mar 09 14:07:08 crc kubenswrapper[4722]: I0309 14:07:08.572452 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5677d5654d-pmsss"] Mar 09 14:07:08 crc kubenswrapper[4722]: I0309 14:07:08.582487 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 09 14:07:08 crc kubenswrapper[4722]: I0309 14:07:08.677774 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/11e24ad3-46ea-469c-bdec-73bd3eb10057-proxy-ca-bundles\") pod \"controller-manager-5677d5654d-pmsss\" (UID: \"11e24ad3-46ea-469c-bdec-73bd3eb10057\") " pod="openshift-controller-manager/controller-manager-5677d5654d-pmsss" Mar 09 14:07:08 crc kubenswrapper[4722]: I0309 14:07:08.677841 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hw2lc\" (UniqueName: \"kubernetes.io/projected/11e24ad3-46ea-469c-bdec-73bd3eb10057-kube-api-access-hw2lc\") pod \"controller-manager-5677d5654d-pmsss\" (UID: \"11e24ad3-46ea-469c-bdec-73bd3eb10057\") " pod="openshift-controller-manager/controller-manager-5677d5654d-pmsss" Mar 09 14:07:08 crc kubenswrapper[4722]: I0309 14:07:08.677879 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/11e24ad3-46ea-469c-bdec-73bd3eb10057-client-ca\") pod \"controller-manager-5677d5654d-pmsss\" (UID: \"11e24ad3-46ea-469c-bdec-73bd3eb10057\") " pod="openshift-controller-manager/controller-manager-5677d5654d-pmsss" Mar 09 14:07:08 crc kubenswrapper[4722]: I0309 14:07:08.678060 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7c0bd182-5e3e-4763-b517-2d6646ceddfa-serving-cert\") pod \"route-controller-manager-7c9b77b468-t9m8w\" (UID: \"7c0bd182-5e3e-4763-b517-2d6646ceddfa\") " pod="openshift-route-controller-manager/route-controller-manager-7c9b77b468-t9m8w" Mar 09 14:07:08 crc kubenswrapper[4722]: I0309 14:07:08.678124 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c0bd182-5e3e-4763-b517-2d6646ceddfa-config\") pod \"route-controller-manager-7c9b77b468-t9m8w\" (UID: \"7c0bd182-5e3e-4763-b517-2d6646ceddfa\") " pod="openshift-route-controller-manager/route-controller-manager-7c9b77b468-t9m8w" Mar 09 14:07:08 crc kubenswrapper[4722]: I0309 14:07:08.678157 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11e24ad3-46ea-469c-bdec-73bd3eb10057-serving-cert\") pod \"controller-manager-5677d5654d-pmsss\" (UID: \"11e24ad3-46ea-469c-bdec-73bd3eb10057\") " pod="openshift-controller-manager/controller-manager-5677d5654d-pmsss" Mar 09 14:07:08 crc kubenswrapper[4722]: I0309 14:07:08.678226 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11e24ad3-46ea-469c-bdec-73bd3eb10057-config\") pod \"controller-manager-5677d5654d-pmsss\" (UID: \"11e24ad3-46ea-469c-bdec-73bd3eb10057\") " pod="openshift-controller-manager/controller-manager-5677d5654d-pmsss" Mar 09 14:07:08 crc kubenswrapper[4722]: I0309 14:07:08.678348 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7c0bd182-5e3e-4763-b517-2d6646ceddfa-client-ca\") pod \"route-controller-manager-7c9b77b468-t9m8w\" (UID: \"7c0bd182-5e3e-4763-b517-2d6646ceddfa\") " pod="openshift-route-controller-manager/route-controller-manager-7c9b77b468-t9m8w" Mar 09 14:07:08 crc kubenswrapper[4722]: I0309 14:07:08.678395 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrtt2\" (UniqueName: 
\"kubernetes.io/projected/7c0bd182-5e3e-4763-b517-2d6646ceddfa-kube-api-access-nrtt2\") pod \"route-controller-manager-7c9b77b468-t9m8w\" (UID: \"7c0bd182-5e3e-4763-b517-2d6646ceddfa\") " pod="openshift-route-controller-manager/route-controller-manager-7c9b77b468-t9m8w" Mar 09 14:07:08 crc kubenswrapper[4722]: I0309 14:07:08.780046 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7c0bd182-5e3e-4763-b517-2d6646ceddfa-client-ca\") pod \"route-controller-manager-7c9b77b468-t9m8w\" (UID: \"7c0bd182-5e3e-4763-b517-2d6646ceddfa\") " pod="openshift-route-controller-manager/route-controller-manager-7c9b77b468-t9m8w" Mar 09 14:07:08 crc kubenswrapper[4722]: I0309 14:07:08.780094 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrtt2\" (UniqueName: \"kubernetes.io/projected/7c0bd182-5e3e-4763-b517-2d6646ceddfa-kube-api-access-nrtt2\") pod \"route-controller-manager-7c9b77b468-t9m8w\" (UID: \"7c0bd182-5e3e-4763-b517-2d6646ceddfa\") " pod="openshift-route-controller-manager/route-controller-manager-7c9b77b468-t9m8w" Mar 09 14:07:08 crc kubenswrapper[4722]: I0309 14:07:08.780121 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/11e24ad3-46ea-469c-bdec-73bd3eb10057-proxy-ca-bundles\") pod \"controller-manager-5677d5654d-pmsss\" (UID: \"11e24ad3-46ea-469c-bdec-73bd3eb10057\") " pod="openshift-controller-manager/controller-manager-5677d5654d-pmsss" Mar 09 14:07:08 crc kubenswrapper[4722]: I0309 14:07:08.780166 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hw2lc\" (UniqueName: \"kubernetes.io/projected/11e24ad3-46ea-469c-bdec-73bd3eb10057-kube-api-access-hw2lc\") pod \"controller-manager-5677d5654d-pmsss\" (UID: \"11e24ad3-46ea-469c-bdec-73bd3eb10057\") " pod="openshift-controller-manager/controller-manager-5677d5654d-pmsss" Mar 09 14:07:08 crc kubenswrapper[4722]: I0309 14:07:08.780195 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/11e24ad3-46ea-469c-bdec-73bd3eb10057-client-ca\") pod \"controller-manager-5677d5654d-pmsss\" (UID: \"11e24ad3-46ea-469c-bdec-73bd3eb10057\") " pod="openshift-controller-manager/controller-manager-5677d5654d-pmsss" Mar 09 14:07:08 crc kubenswrapper[4722]: I0309 14:07:08.780234 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7c0bd182-5e3e-4763-b517-2d6646ceddfa-serving-cert\") pod \"route-controller-manager-7c9b77b468-t9m8w\" (UID: \"7c0bd182-5e3e-4763-b517-2d6646ceddfa\") " pod="openshift-route-controller-manager/route-controller-manager-7c9b77b468-t9m8w" Mar 09 14:07:08 crc kubenswrapper[4722]: I0309 14:07:08.780255 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c0bd182-5e3e-4763-b517-2d6646ceddfa-config\") pod \"route-controller-manager-7c9b77b468-t9m8w\" (UID: \"7c0bd182-5e3e-4763-b517-2d6646ceddfa\") " pod="openshift-route-controller-manager/route-controller-manager-7c9b77b468-t9m8w" Mar 09 14:07:08 crc kubenswrapper[4722]: I0309 14:07:08.780271 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11e24ad3-46ea-469c-bdec-73bd3eb10057-serving-cert\") pod 
\"controller-manager-5677d5654d-pmsss\" (UID: \"11e24ad3-46ea-469c-bdec-73bd3eb10057\") " pod="openshift-controller-manager/controller-manager-5677d5654d-pmsss" Mar 09 14:07:08 crc kubenswrapper[4722]: I0309 14:07:08.780292 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11e24ad3-46ea-469c-bdec-73bd3eb10057-config\") pod \"controller-manager-5677d5654d-pmsss\" (UID: \"11e24ad3-46ea-469c-bdec-73bd3eb10057\") " pod="openshift-controller-manager/controller-manager-5677d5654d-pmsss" Mar 09 14:07:08 crc kubenswrapper[4722]: I0309 14:07:08.781176 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7c0bd182-5e3e-4763-b517-2d6646ceddfa-client-ca\") pod \"route-controller-manager-7c9b77b468-t9m8w\" (UID: \"7c0bd182-5e3e-4763-b517-2d6646ceddfa\") " pod="openshift-route-controller-manager/route-controller-manager-7c9b77b468-t9m8w" Mar 09 14:07:08 crc kubenswrapper[4722]: I0309 14:07:08.781532 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11e24ad3-46ea-469c-bdec-73bd3eb10057-config\") pod \"controller-manager-5677d5654d-pmsss\" (UID: \"11e24ad3-46ea-469c-bdec-73bd3eb10057\") " pod="openshift-controller-manager/controller-manager-5677d5654d-pmsss" Mar 09 14:07:08 crc kubenswrapper[4722]: I0309 14:07:08.782574 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/11e24ad3-46ea-469c-bdec-73bd3eb10057-client-ca\") pod \"controller-manager-5677d5654d-pmsss\" (UID: \"11e24ad3-46ea-469c-bdec-73bd3eb10057\") " pod="openshift-controller-manager/controller-manager-5677d5654d-pmsss" Mar 09 14:07:08 crc kubenswrapper[4722]: I0309 14:07:08.783090 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c0bd182-5e3e-4763-b517-2d6646ceddfa-config\") pod \"route-controller-manager-7c9b77b468-t9m8w\" (UID: \"7c0bd182-5e3e-4763-b517-2d6646ceddfa\") " pod="openshift-route-controller-manager/route-controller-manager-7c9b77b468-t9m8w" Mar 09 14:07:08 crc kubenswrapper[4722]: I0309 14:07:08.783795 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/11e24ad3-46ea-469c-bdec-73bd3eb10057-proxy-ca-bundles\") pod \"controller-manager-5677d5654d-pmsss\" (UID: \"11e24ad3-46ea-469c-bdec-73bd3eb10057\") " pod="openshift-controller-manager/controller-manager-5677d5654d-pmsss" Mar 09 14:07:08 crc kubenswrapper[4722]: I0309 14:07:08.791290 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7c0bd182-5e3e-4763-b517-2d6646ceddfa-serving-cert\") pod \"route-controller-manager-7c9b77b468-t9m8w\" (UID: \"7c0bd182-5e3e-4763-b517-2d6646ceddfa\") " pod="openshift-route-controller-manager/route-controller-manager-7c9b77b468-t9m8w" Mar 09 14:07:08 crc kubenswrapper[4722]: I0309 14:07:08.798656 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11e24ad3-46ea-469c-bdec-73bd3eb10057-serving-cert\") pod \"controller-manager-5677d5654d-pmsss\" (UID: \"11e24ad3-46ea-469c-bdec-73bd3eb10057\") " pod="openshift-controller-manager/controller-manager-5677d5654d-pmsss" Mar 09 14:07:08 crc kubenswrapper[4722]: I0309 14:07:08.799006 4722 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-nrtt2\" (UniqueName: \"kubernetes.io/projected/7c0bd182-5e3e-4763-b517-2d6646ceddfa-kube-api-access-nrtt2\") pod \"route-controller-manager-7c9b77b468-t9m8w\" (UID: \"7c0bd182-5e3e-4763-b517-2d6646ceddfa\") " pod="openshift-route-controller-manager/route-controller-manager-7c9b77b468-t9m8w" Mar 09 14:07:08 crc kubenswrapper[4722]: I0309 14:07:08.803172 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hw2lc\" (UniqueName: \"kubernetes.io/projected/11e24ad3-46ea-469c-bdec-73bd3eb10057-kube-api-access-hw2lc\") pod \"controller-manager-5677d5654d-pmsss\" (UID: \"11e24ad3-46ea-469c-bdec-73bd3eb10057\") " pod="openshift-controller-manager/controller-manager-5677d5654d-pmsss" Mar 09 14:07:08 crc kubenswrapper[4722]: I0309 14:07:08.868962 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7c9b77b468-t9m8w" Mar 09 14:07:08 crc kubenswrapper[4722]: I0309 14:07:08.881607 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5677d5654d-pmsss" Mar 09 14:07:09 crc kubenswrapper[4722]: E0309 14:07:09.369046 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-pwxnx" podUID="c0f74bde-752e-497e-ad82-ec7a1676bbd5" Mar 09 14:07:09 crc kubenswrapper[4722]: E0309 14:07:09.369259 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-zmkmp" podUID="4b4a7622-7bca-4fca-adb3-eec526b21b2b" Mar 09 14:07:09 crc kubenswrapper[4722]: E0309 14:07:09.415444 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 09 14:07:09 crc kubenswrapper[4722]: E0309 14:07:09.415617 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tk4lv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-t8pts_openshift-marketplace(b0b6cdb1-050e-4ed3-b20e-d825e4db1edb): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 09 14:07:09 crc kubenswrapper[4722]: E0309 14:07:09.418355 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-t8pts" podUID="b0b6cdb1-050e-4ed3-b20e-d825e4db1edb" Mar 09 14:07:09 crc kubenswrapper[4722]: E0309 14:07:09.438984 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 09 14:07:09 crc kubenswrapper[4722]: E0309 14:07:09.439766 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2kq5r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-w95cf_openshift-marketplace(65e3d647-8806-4c0c-b9aa-142739f2fbe0): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 09 14:07:09 crc kubenswrapper[4722]: E0309 14:07:09.440967 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-w95cf" podUID="65e3d647-8806-4c0c-b9aa-142739f2fbe0" Mar 09 14:07:10 crc kubenswrapper[4722]: I0309 14:07:10.243485 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 09 14:07:10 crc kubenswrapper[4722]: I0309 14:07:10.244148 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 09 14:07:10 crc kubenswrapper[4722]: I0309 14:07:10.249582 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 09 14:07:10 crc kubenswrapper[4722]: I0309 14:07:10.404868 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f8225796-29d7-45ff-a016-d19dbc155d1a-var-lock\") pod \"installer-9-crc\" (UID: \"f8225796-29d7-45ff-a016-d19dbc155d1a\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 09 14:07:10 crc kubenswrapper[4722]: I0309 14:07:10.404951 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f8225796-29d7-45ff-a016-d19dbc155d1a-kubelet-dir\") pod \"installer-9-crc\" (UID: \"f8225796-29d7-45ff-a016-d19dbc155d1a\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 09 14:07:10 crc kubenswrapper[4722]: I0309 14:07:10.405006 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f8225796-29d7-45ff-a016-d19dbc155d1a-kube-api-access\") pod \"installer-9-crc\" (UID: \"f8225796-29d7-45ff-a016-d19dbc155d1a\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 09 14:07:10 crc kubenswrapper[4722]: I0309 14:07:10.506534 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f8225796-29d7-45ff-a016-d19dbc155d1a-kube-api-access\") pod \"installer-9-crc\" (UID: \"f8225796-29d7-45ff-a016-d19dbc155d1a\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 09 14:07:10 crc kubenswrapper[4722]: I0309 14:07:10.507115 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f8225796-29d7-45ff-a016-d19dbc155d1a-var-lock\") pod \"installer-9-crc\" (UID: \"f8225796-29d7-45ff-a016-d19dbc155d1a\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 09 14:07:10 crc kubenswrapper[4722]: I0309 14:07:10.507166 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f8225796-29d7-45ff-a016-d19dbc155d1a-kubelet-dir\") pod \"installer-9-crc\" (UID: \"f8225796-29d7-45ff-a016-d19dbc155d1a\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 09 14:07:10 crc kubenswrapper[4722]: I0309 14:07:10.507262 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f8225796-29d7-45ff-a016-d19dbc155d1a-kubelet-dir\") pod \"installer-9-crc\" (UID: \"f8225796-29d7-45ff-a016-d19dbc155d1a\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 09 14:07:10 crc kubenswrapper[4722]: I0309 14:07:10.507268 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f8225796-29d7-45ff-a016-d19dbc155d1a-var-lock\") pod \"installer-9-crc\" (UID: \"f8225796-29d7-45ff-a016-d19dbc155d1a\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 09 14:07:10 crc kubenswrapper[4722]: I0309 14:07:10.525760 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f8225796-29d7-45ff-a016-d19dbc155d1a-kube-api-access\") pod \"installer-9-crc\" (UID: 
\"f8225796-29d7-45ff-a016-d19dbc155d1a\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 09 14:07:10 crc kubenswrapper[4722]: I0309 14:07:10.580826 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 09 14:07:11 crc kubenswrapper[4722]: E0309 14:07:11.147008 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-w95cf" podUID="65e3d647-8806-4c0c-b9aa-142739f2fbe0" Mar 09 14:07:11 crc kubenswrapper[4722]: E0309 14:07:11.147111 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-t8pts" podUID="b0b6cdb1-050e-4ed3-b20e-d825e4db1edb" Mar 09 14:07:11 crc kubenswrapper[4722]: E0309 14:07:11.165159 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 09 14:07:11 crc kubenswrapper[4722]: E0309 14:07:11.165399 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cm6f7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-rdkdt_openshift-marketplace(cfd6e90e-4eeb-4372-8465-136a383e95b2): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 09 14:07:11 crc kubenswrapper[4722]: E0309 14:07:11.166626 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: 
copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-rdkdt" podUID="cfd6e90e-4eeb-4372-8465-136a383e95b2" Mar 09 14:07:11 crc kubenswrapper[4722]: I0309 14:07:11.626997 4722 patch_prober.go:28] interesting pod/downloads-7954f5f757-knrzp container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Mar 09 14:07:11 crc kubenswrapper[4722]: I0309 14:07:11.627304 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-knrzp" podUID="8f915ef9-5d9a-43ee-a333-def8766e083d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" Mar 09 14:07:15 crc kubenswrapper[4722]: E0309 14:07:15.237586 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 09 14:07:15 crc kubenswrapper[4722]: E0309 14:07:15.238304 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nnmzq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-c2g4c_openshift-marketplace(7df01eab-424f-40b1-a40c-03b930a8fac6): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 09 14:07:15 crc kubenswrapper[4722]: E0309 14:07:15.239524 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-c2g4c" podUID="7df01eab-424f-40b1-a40c-03b930a8fac6" Mar 09 14:07:15 crc kubenswrapper[4722]: I0309 14:07:15.338522 4722 scope.go:117] "RemoveContainer" 
containerID="b1b7ea33f844f9b44d9fa4569482bc9bcd47681316dd9954e9365ad41bb40792" Mar 09 14:07:15 crc kubenswrapper[4722]: I0309 14:07:15.777644 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-pvvhj"] Mar 09 14:07:15 crc kubenswrapper[4722]: W0309 14:07:15.783122 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5dbc0be_527e_4f70_b185_3a10b1b11a75.slice/crio-f898c959273854325d0d3b60db97f6e7d908a614cd818241261b82d0531def88 WatchSource:0}: Error finding container f898c959273854325d0d3b60db97f6e7d908a614cd818241261b82d0531def88: Status 404 returned error can't find the container with id f898c959273854325d0d3b60db97f6e7d908a614cd818241261b82d0531def88 Mar 09 14:07:15 crc kubenswrapper[4722]: I0309 14:07:15.891846 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 09 14:07:15 crc kubenswrapper[4722]: W0309 14:07:15.901159 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod812aaee7_17d0_4777_a2a0_f1ffcb8a5ad4.slice/crio-51cf359c688150bee9b9155a31de4b6f2f8c3e94e754a494b0abc5671819a302 WatchSource:0}: Error finding container 51cf359c688150bee9b9155a31de4b6f2f8c3e94e754a494b0abc5671819a302: Status 404 returned error can't find the container with id 51cf359c688150bee9b9155a31de4b6f2f8c3e94e754a494b0abc5671819a302 Mar 09 14:07:15 crc kubenswrapper[4722]: I0309 14:07:15.902132 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-knrzp" event={"ID":"8f915ef9-5d9a-43ee-a333-def8766e083d","Type":"ContainerStarted","Data":"f8d9ea3d4835fca47029b25aee5a9010030145633946531140ef15bdd63439bc"} Mar 09 14:07:15 crc kubenswrapper[4722]: I0309 14:07:15.902923 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-knrzp" Mar 09 14:07:15 crc kubenswrapper[4722]: I0309 14:07:15.902997 4722 patch_prober.go:28] interesting pod/downloads-7954f5f757-knrzp container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Mar 09 14:07:15 crc kubenswrapper[4722]: I0309 14:07:15.903036 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-knrzp" podUID="8f915ef9-5d9a-43ee-a333-def8766e083d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" Mar 09 14:07:15 crc kubenswrapper[4722]: I0309 14:07:15.908007 4722 generic.go:334] "Generic (PLEG): container finished" podID="a9665111-e6dc-49f2-803a-96eebcc4c78c" containerID="c8f8238fec7fc479f9f9b20da03b87579e821be31b34435c1879e486656c2818" exitCode=0 Mar 09 14:07:15 crc kubenswrapper[4722]: I0309 14:07:15.908129 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c96ws" event={"ID":"a9665111-e6dc-49f2-803a-96eebcc4c78c","Type":"ContainerDied","Data":"c8f8238fec7fc479f9f9b20da03b87579e821be31b34435c1879e486656c2818"} Mar 09 14:07:15 crc kubenswrapper[4722]: I0309 14:07:15.927258 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5677d5654d-pmsss"] Mar 09 14:07:15 crc kubenswrapper[4722]: I0309 14:07:15.935822 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-r2hp9" event={"ID":"6ba5c493-a951-4d61-acf3-5ee964dcfe60","Type":"ContainerStarted","Data":"02554076861fbf645322f00023f0309b6094f4865c2951838c9821d2260303df"} Mar 09 14:07:15 crc kubenswrapper[4722]: I0309 14:07:15.944510 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-pvvhj" event={"ID":"f5dbc0be-527e-4f70-b185-3a10b1b11a75","Type":"ContainerStarted","Data":"f898c959273854325d0d3b60db97f6e7d908a614cd818241261b82d0531def88"} Mar 09 14:07:15 crc kubenswrapper[4722]: E0309 14:07:15.950674 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-c2g4c" podUID="7df01eab-424f-40b1-a40c-03b930a8fac6" Mar 09 14:07:15 crc kubenswrapper[4722]: W0309 14:07:15.963621 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11e24ad3_46ea_469c_bdec_73bd3eb10057.slice/crio-552ebd2bddaa9776fc1184b4e5d86cad79ae70a4da025d11cb797cc9c54b98b6 WatchSource:0}: Error finding container 552ebd2bddaa9776fc1184b4e5d86cad79ae70a4da025d11cb797cc9c54b98b6: Status 404 returned error can't find the container with id 552ebd2bddaa9776fc1184b4e5d86cad79ae70a4da025d11cb797cc9c54b98b6 Mar 09 14:07:15 crc kubenswrapper[4722]: I0309 14:07:15.971642 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 09 14:07:15 crc kubenswrapper[4722]: I0309 14:07:15.992604 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c9b77b468-t9m8w"] Mar 09 14:07:16 crc kubenswrapper[4722]: W0309 14:07:16.014993 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podf8225796_29d7_45ff_a016_d19dbc155d1a.slice/crio-04d643a62e4b41231fd58c7fe1dac9c20e6e7a78ca8eac80b18c0d2f9d1f13a4 WatchSource:0}: Error finding container 04d643a62e4b41231fd58c7fe1dac9c20e6e7a78ca8eac80b18c0d2f9d1f13a4: Status 404 returned error can't find the container with id 04d643a62e4b41231fd58c7fe1dac9c20e6e7a78ca8eac80b18c0d2f9d1f13a4 Mar 09 14:07:16 crc kubenswrapper[4722]: W0309 14:07:16.019704 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c0bd182_5e3e_4763_b517_2d6646ceddfa.slice/crio-ee5c3c563d33c7410cc4f1ca10690389ef75f43981c319a2d38920b6f28ab2f6 WatchSource:0}: Error finding container ee5c3c563d33c7410cc4f1ca10690389ef75f43981c319a2d38920b6f28ab2f6: Status 404 returned error can't find the container with id ee5c3c563d33c7410cc4f1ca10690389ef75f43981c319a2d38920b6f28ab2f6 Mar 09 14:07:16 crc kubenswrapper[4722]: I0309 14:07:16.952560 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7c9b77b468-t9m8w" event={"ID":"7c0bd182-5e3e-4763-b517-2d6646ceddfa","Type":"ContainerStarted","Data":"a154e9e25e8b5cdbeea7070f9c12442460bc3df257927d77c05bb8bce1524313"} Mar 09 14:07:16 crc kubenswrapper[4722]: I0309 14:07:16.954405 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7c9b77b468-t9m8w" 
event={"ID":"7c0bd182-5e3e-4763-b517-2d6646ceddfa","Type":"ContainerStarted","Data":"ee5c3c563d33c7410cc4f1ca10690389ef75f43981c319a2d38920b6f28ab2f6"} Mar 09 14:07:16 crc kubenswrapper[4722]: I0309 14:07:16.954506 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7c9b77b468-t9m8w" Mar 09 14:07:16 crc kubenswrapper[4722]: I0309 14:07:16.957723 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5677d5654d-pmsss" event={"ID":"11e24ad3-46ea-469c-bdec-73bd3eb10057","Type":"ContainerStarted","Data":"ff377b4b0bc205b0487c84a89ab21c742b21398ffe13bcf659b9592ea5a8a88c"} Mar 09 14:07:16 crc kubenswrapper[4722]: I0309 14:07:16.957826 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5677d5654d-pmsss" event={"ID":"11e24ad3-46ea-469c-bdec-73bd3eb10057","Type":"ContainerStarted","Data":"552ebd2bddaa9776fc1184b4e5d86cad79ae70a4da025d11cb797cc9c54b98b6"} Mar 09 14:07:16 crc kubenswrapper[4722]: I0309 14:07:16.958812 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5677d5654d-pmsss" Mar 09 14:07:16 crc kubenswrapper[4722]: I0309 14:07:16.964401 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5677d5654d-pmsss" Mar 09 14:07:16 crc kubenswrapper[4722]: I0309 14:07:16.966327 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c96ws" event={"ID":"a9665111-e6dc-49f2-803a-96eebcc4c78c","Type":"ContainerStarted","Data":"1f4781e8695c6b455d3a85fd01562c7656968cb7c8e952cce56e3516aceb016d"} Mar 09 14:07:16 crc kubenswrapper[4722]: I0309 14:07:16.966978 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7c9b77b468-t9m8w" Mar 09 14:07:16 crc kubenswrapper[4722]: I0309 14:07:16.968001 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"f8225796-29d7-45ff-a016-d19dbc155d1a","Type":"ContainerStarted","Data":"6a5e6490b93985ea1dc6129f8877ec4f217a1bd50f770b098de4590ca9e8df0e"} Mar 09 14:07:16 crc kubenswrapper[4722]: I0309 14:07:16.968095 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"f8225796-29d7-45ff-a016-d19dbc155d1a","Type":"ContainerStarted","Data":"04d643a62e4b41231fd58c7fe1dac9c20e6e7a78ca8eac80b18c0d2f9d1f13a4"} Mar 09 14:07:16 crc kubenswrapper[4722]: I0309 14:07:16.969759 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"812aaee7-17d0-4777-a2a0-f1ffcb8a5ad4","Type":"ContainerStarted","Data":"587063214d63954e384e5387687b40bb853e9118a7e551fb1a82fd623375b418"} Mar 09 14:07:16 crc kubenswrapper[4722]: I0309 14:07:16.969874 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"812aaee7-17d0-4777-a2a0-f1ffcb8a5ad4","Type":"ContainerStarted","Data":"51cf359c688150bee9b9155a31de4b6f2f8c3e94e754a494b0abc5671819a302"} Mar 09 14:07:16 crc kubenswrapper[4722]: I0309 14:07:16.973353 4722 generic.go:334] "Generic (PLEG): container finished" podID="6ba5c493-a951-4d61-acf3-5ee964dcfe60" containerID="02554076861fbf645322f00023f0309b6094f4865c2951838c9821d2260303df" exitCode=0 Mar 09 14:07:16 
crc kubenswrapper[4722]: I0309 14:07:16.973446 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r2hp9" event={"ID":"6ba5c493-a951-4d61-acf3-5ee964dcfe60","Type":"ContainerDied","Data":"02554076861fbf645322f00023f0309b6094f4865c2951838c9821d2260303df"} Mar 09 14:07:16 crc kubenswrapper[4722]: I0309 14:07:16.981826 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-pvvhj" event={"ID":"f5dbc0be-527e-4f70-b185-3a10b1b11a75","Type":"ContainerStarted","Data":"b6f022ddcbd80c83b9b690000b31cc3ba65ad88449f0b1adc884a25f66eab753"} Mar 09 14:07:16 crc kubenswrapper[4722]: I0309 14:07:16.982329 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-pvvhj" event={"ID":"f5dbc0be-527e-4f70-b185-3a10b1b11a75","Type":"ContainerStarted","Data":"543b02ba7911745ffe379970d996add8334b25b0306ea882d88194b1e93e25f8"} Mar 09 14:07:16 crc kubenswrapper[4722]: I0309 14:07:16.982458 4722 patch_prober.go:28] interesting pod/downloads-7954f5f757-knrzp container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Mar 09 14:07:16 crc kubenswrapper[4722]: I0309 14:07:16.982758 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-knrzp" podUID="8f915ef9-5d9a-43ee-a333-def8766e083d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" Mar 09 14:07:16 crc kubenswrapper[4722]: I0309 14:07:16.983729 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7c9b77b468-t9m8w" podStartSLOduration=10.983712374 podStartE2EDuration="10.983712374s" podCreationTimestamp="2026-03-09 14:07:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:07:16.978001355 +0000 UTC m=+277.533569931" watchObservedRunningTime="2026-03-09 14:07:16.983712374 +0000 UTC m=+277.539280950" Mar 09 14:07:17 crc kubenswrapper[4722]: I0309 14:07:17.002414 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=13.002387538 podStartE2EDuration="13.002387538s" podCreationTimestamp="2026-03-09 14:07:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:07:16.999747554 +0000 UTC m=+277.555316130" watchObservedRunningTime="2026-03-09 14:07:17.002387538 +0000 UTC m=+277.557956114" Mar 09 14:07:17 crc kubenswrapper[4722]: I0309 14:07:17.102344 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5677d5654d-pmsss" podStartSLOduration=11.102323885 podStartE2EDuration="11.102323885s" podCreationTimestamp="2026-03-09 14:07:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:07:17.073766266 +0000 UTC m=+277.629334842" watchObservedRunningTime="2026-03-09 14:07:17.102323885 +0000 UTC m=+277.657892461" Mar 09 14:07:17 crc kubenswrapper[4722]: I0309 14:07:17.995090 4722 generic.go:334] "Generic (PLEG): container finished" 
podID="812aaee7-17d0-4777-a2a0-f1ffcb8a5ad4" containerID="587063214d63954e384e5387687b40bb853e9118a7e551fb1a82fd623375b418" exitCode=0 Mar 09 14:07:17 crc kubenswrapper[4722]: I0309 14:07:17.995151 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"812aaee7-17d0-4777-a2a0-f1ffcb8a5ad4","Type":"ContainerDied","Data":"587063214d63954e384e5387687b40bb853e9118a7e551fb1a82fd623375b418"} Mar 09 14:07:18 crc kubenswrapper[4722]: I0309 14:07:18.068476 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-pvvhj" podStartSLOduration=214.06845552 podStartE2EDuration="3m34.06845552s" podCreationTimestamp="2026-03-09 14:03:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:07:18.046086964 +0000 UTC m=+278.601655560" watchObservedRunningTime="2026-03-09 14:07:18.06845552 +0000 UTC m=+278.624024096" Mar 09 14:07:18 crc kubenswrapper[4722]: I0309 14:07:18.082342 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-c96ws" podStartSLOduration=5.321735661 podStartE2EDuration="46.082321888s" podCreationTimestamp="2026-03-09 14:06:32 +0000 UTC" firstStartedPulling="2026-03-09 14:06:35.637516315 +0000 UTC m=+236.193084891" lastFinishedPulling="2026-03-09 14:07:16.398102542 +0000 UTC m=+276.953671118" observedRunningTime="2026-03-09 14:07:18.066437964 +0000 UTC m=+278.622006540" watchObservedRunningTime="2026-03-09 14:07:18.082321888 +0000 UTC m=+278.637890464" Mar 09 14:07:18 crc kubenswrapper[4722]: I0309 14:07:18.177037 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=8.176991719 podStartE2EDuration="8.176991719s" podCreationTimestamp="2026-03-09 14:07:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:07:18.082193335 +0000 UTC m=+278.637761911" watchObservedRunningTime="2026-03-09 14:07:18.176991719 +0000 UTC m=+278.732560295" Mar 09 14:07:21 crc kubenswrapper[4722]: I0309 14:07:19.004282 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r2hp9" event={"ID":"6ba5c493-a951-4d61-acf3-5ee964dcfe60","Type":"ContainerStarted","Data":"f78b83bee983b02a1487dfd5f2313e13f3c8fa4aba7e87056f1d7e1ad5f97ffe"} Mar 09 14:07:21 crc kubenswrapper[4722]: I0309 14:07:19.358270 4722 util.go:48] "No ready sandbox for pod can be found. 
Mar 09 14:07:21 crc kubenswrapper[4722]: I0309 14:07:19.478997 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/812aaee7-17d0-4777-a2a0-f1ffcb8a5ad4-kubelet-dir\") pod \"812aaee7-17d0-4777-a2a0-f1ffcb8a5ad4\" (UID: \"812aaee7-17d0-4777-a2a0-f1ffcb8a5ad4\") "
Mar 09 14:07:21 crc kubenswrapper[4722]: I0309 14:07:19.479160 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/812aaee7-17d0-4777-a2a0-f1ffcb8a5ad4-kube-api-access\") pod \"812aaee7-17d0-4777-a2a0-f1ffcb8a5ad4\" (UID: \"812aaee7-17d0-4777-a2a0-f1ffcb8a5ad4\") "
Mar 09 14:07:21 crc kubenswrapper[4722]: I0309 14:07:19.479179 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/812aaee7-17d0-4777-a2a0-f1ffcb8a5ad4-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "812aaee7-17d0-4777-a2a0-f1ffcb8a5ad4" (UID: "812aaee7-17d0-4777-a2a0-f1ffcb8a5ad4"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 09 14:07:21 crc kubenswrapper[4722]: I0309 14:07:19.479485 4722 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/812aaee7-17d0-4777-a2a0-f1ffcb8a5ad4-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 09 14:07:21 crc kubenswrapper[4722]: I0309 14:07:19.486190 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/812aaee7-17d0-4777-a2a0-f1ffcb8a5ad4-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "812aaee7-17d0-4777-a2a0-f1ffcb8a5ad4" (UID: "812aaee7-17d0-4777-a2a0-f1ffcb8a5ad4"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:07:21 crc kubenswrapper[4722]: I0309 14:07:19.581084 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/812aaee7-17d0-4777-a2a0-f1ffcb8a5ad4-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 09 14:07:21 crc kubenswrapper[4722]: I0309 14:07:20.016036 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"812aaee7-17d0-4777-a2a0-f1ffcb8a5ad4","Type":"ContainerDied","Data":"51cf359c688150bee9b9155a31de4b6f2f8c3e94e754a494b0abc5671819a302"}
Mar 09 14:07:21 crc kubenswrapper[4722]: I0309 14:07:20.016109 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="51cf359c688150bee9b9155a31de4b6f2f8c3e94e754a494b0abc5671819a302"
Mar 09 14:07:21 crc kubenswrapper[4722]: I0309 14:07:20.016067 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 09 14:07:21 crc kubenswrapper[4722]: I0309 14:07:20.051137 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-r2hp9" podStartSLOduration=11.881727906 podStartE2EDuration="47.0511171s" podCreationTimestamp="2026-03-09 14:06:33 +0000 UTC" firstStartedPulling="2026-03-09 14:06:43.477090687 +0000 UTC m=+244.032659263" lastFinishedPulling="2026-03-09 14:07:18.646479881 +0000 UTC m=+279.202048457" observedRunningTime="2026-03-09 14:07:20.046186903 +0000 UTC m=+280.601755479" watchObservedRunningTime="2026-03-09 14:07:20.0511171 +0000 UTC m=+280.606685666"
Mar 09 14:07:21 crc kubenswrapper[4722]: I0309 14:07:21.251704 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-6s4fg"]
Mar 09 14:07:21 crc kubenswrapper[4722]: I0309 14:07:21.528768 4722 patch_prober.go:28] interesting pod/machine-config-daemon-hjrrb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 14:07:21 crc kubenswrapper[4722]: I0309 14:07:21.529375 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 14:07:21 crc kubenswrapper[4722]: I0309 14:07:21.529437 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb"
Mar 09 14:07:21 crc kubenswrapper[4722]: I0309 14:07:21.530369 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9f451493fa6d86f12c1291c0fe262bcbe4b62cef45c8dda696bb48fa3ded2eeb"} pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 09 14:07:21 crc kubenswrapper[4722]: I0309 14:07:21.530433 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerName="machine-config-daemon" containerID="cri-o://9f451493fa6d86f12c1291c0fe262bcbe4b62cef45c8dda696bb48fa3ded2eeb" gracePeriod=600
Mar 09 14:07:21 crc kubenswrapper[4722]: I0309 14:07:21.625245 4722 patch_prober.go:28] interesting pod/downloads-7954f5f757-knrzp container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body=
Mar 09 14:07:21 crc kubenswrapper[4722]: I0309 14:07:21.625305 4722 patch_prober.go:28] interesting pod/downloads-7954f5f757-knrzp container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body=
Mar 09 14:07:21 crc kubenswrapper[4722]: I0309 14:07:21.625311 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-knrzp" podUID="8f915ef9-5d9a-43ee-a333-def8766e083d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused"
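Every probe failure above reports "connect: connection refused", meaning the TCP dial itself failed before any HTTP exchange took place; enough consecutive liveness failures push the container over its failureThreshold, which is what triggers the restart recorded just above. A hedged sketch of what a single HTTP probe attempt boils down to (simplified; kubelet's real prober also honors per-probe timeouts, thresholds, and custom headers):

package main

import (
	"fmt"
	"net/http"
	"time"
)

// probeOnce performs one HTTP GET probe. A transport error (such as
// "connect: connection refused") or a status outside 2xx/3xx is a failure.
func probeOnce(url string) error {
	client := &http.Client{Timeout: time.Second}
	resp, err := client.Get(url)
	if err != nil {
		return err
	}
	defer resp.Body.Close()
	if resp.StatusCode < 200 || resp.StatusCode >= 400 {
		return fmt.Errorf("unexpected status %d", resp.StatusCode)
	}
	return nil
}

func main() {
	if err := probeOnce("http://127.0.0.1:8798/health"); err != nil {
		fmt.Println("Probe failed:", err)
	}
}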
Mar 09 14:07:21 crc kubenswrapper[4722]: I0309 14:07:21.625361 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-knrzp" podUID="8f915ef9-5d9a-43ee-a333-def8766e083d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused"
Mar 09 14:07:22 crc kubenswrapper[4722]: I0309 14:07:22.042517 4722 generic.go:334] "Generic (PLEG): container finished" podID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerID="9f451493fa6d86f12c1291c0fe262bcbe4b62cef45c8dda696bb48fa3ded2eeb" exitCode=0
Mar 09 14:07:22 crc kubenswrapper[4722]: I0309 14:07:22.042560 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" event={"ID":"dac2aaf5-653b-4b2a-8efe-ed26bac8d648","Type":"ContainerDied","Data":"9f451493fa6d86f12c1291c0fe262bcbe4b62cef45c8dda696bb48fa3ded2eeb"}
Mar 09 14:07:23 crc kubenswrapper[4722]: I0309 14:07:23.278041 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-c96ws"
Mar 09 14:07:23 crc kubenswrapper[4722]: I0309 14:07:23.279027 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-c96ws"
Mar 09 14:07:24 crc kubenswrapper[4722]: I0309 14:07:24.060593 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551086-cvc28" event={"ID":"40be416c-1b7b-4973-b9ed-25ae20cd660d","Type":"ContainerStarted","Data":"d699b4ec9fd940584437cd321dd64fb6a995865e2cf274f73ca2f40410515cfa"}
Mar 09 14:07:24 crc kubenswrapper[4722]: I0309 14:07:24.064087 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" event={"ID":"dac2aaf5-653b-4b2a-8efe-ed26bac8d648","Type":"ContainerStarted","Data":"ee35f8e9e8041f3696b9c0177b52fd458e4382bc38a0838adc5aa0015cd1c0a8"}
Mar 09 14:07:24 crc kubenswrapper[4722]: I0309 14:07:24.078784 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551086-cvc28" podStartSLOduration=26.759866395 podStartE2EDuration="1m24.078755566s" podCreationTimestamp="2026-03-09 14:06:00 +0000 UTC" firstStartedPulling="2026-03-09 14:06:25.561247012 +0000 UTC m=+226.116815588" lastFinishedPulling="2026-03-09 14:07:22.880136183 +0000 UTC m=+283.435704759" observedRunningTime="2026-03-09 14:07:24.076952925 +0000 UTC m=+284.632521501" watchObservedRunningTime="2026-03-09 14:07:24.078755566 +0000 UTC m=+284.634324142"
Mar 09 14:07:24 crc kubenswrapper[4722]: I0309 14:07:24.102789 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-c96ws"
Mar 09 14:07:24 crc kubenswrapper[4722]: I0309 14:07:24.148442 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-c96ws"
Mar 09 14:07:24 crc kubenswrapper[4722]: I0309 14:07:24.296707 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-r2hp9"
Mar 09 14:07:24 crc kubenswrapper[4722]: I0309 14:07:24.296774 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-r2hp9"
Mar 09 14:07:24 crc kubenswrapper[4722]: I0309 14:07:24.298315 4722 csr.go:261] certificate signing request csr-zdx2s is approved, waiting to be issued
Mar 09 14:07:24 crc kubenswrapper[4722]: I0309 14:07:24.310044 4722 csr.go:257] certificate signing request csr-zdx2s is issued
Mar 09 14:07:24 crc kubenswrapper[4722]: I0309 14:07:24.955723 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-c96ws"]
Mar 09 14:07:25 crc kubenswrapper[4722]: I0309 14:07:25.074839 4722 generic.go:334] "Generic (PLEG): container finished" podID="40be416c-1b7b-4973-b9ed-25ae20cd660d" containerID="d699b4ec9fd940584437cd321dd64fb6a995865e2cf274f73ca2f40410515cfa" exitCode=0
Mar 09 14:07:25 crc kubenswrapper[4722]: I0309 14:07:25.074915 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551086-cvc28" event={"ID":"40be416c-1b7b-4973-b9ed-25ae20cd660d","Type":"ContainerDied","Data":"d699b4ec9fd940584437cd321dd64fb6a995865e2cf274f73ca2f40410515cfa"}
Mar 09 14:07:25 crc kubenswrapper[4722]: I0309 14:07:25.312155 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-07 11:52:06.630649892 +0000 UTC
Mar 09 14:07:25 crc kubenswrapper[4722]: I0309 14:07:25.314108 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7293h44m41.316553215s for next certificate rotation
Mar 09 14:07:25 crc kubenswrapper[4722]: I0309 14:07:25.339227 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-r2hp9" podUID="6ba5c493-a951-4d61-acf3-5ee964dcfe60" containerName="registry-server" probeResult="failure" output=<
Mar 09 14:07:25 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s
Mar 09 14:07:25 crc kubenswrapper[4722]: >
Mar 09 14:07:26 crc kubenswrapper[4722]: I0309 14:07:26.085352 4722 generic.go:334] "Generic (PLEG): container finished" podID="c0f74bde-752e-497e-ad82-ec7a1676bbd5" containerID="9fa19129974de9178fcafcea62deb4f00fce3b4a22230d33c0edc0c58e099fb8" exitCode=0
Mar 09 14:07:26 crc kubenswrapper[4722]: I0309 14:07:26.085423 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pwxnx" event={"ID":"c0f74bde-752e-497e-ad82-ec7a1676bbd5","Type":"ContainerDied","Data":"9fa19129974de9178fcafcea62deb4f00fce3b4a22230d33c0edc0c58e099fb8"}
Mar 09 14:07:26 crc kubenswrapper[4722]: I0309 14:07:26.088974 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rdkdt" event={"ID":"cfd6e90e-4eeb-4372-8465-136a383e95b2","Type":"ContainerStarted","Data":"c2805a424b564415cca37b53a3bc23bada46f95aafd332061d8ba671858b7e21"}
Mar 09 14:07:26 crc kubenswrapper[4722]: I0309 14:07:26.096302 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zmkmp" event={"ID":"4b4a7622-7bca-4fca-adb3-eec526b21b2b","Type":"ContainerStarted","Data":"dd8eb5ce92a018092bfa959f5c229f7dcc17c39ce356b0af1bb7a9f58984e026"}
Mar 09 14:07:26 crc kubenswrapper[4722]: I0309 14:07:26.098566 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t8pts" event={"ID":"b0b6cdb1-050e-4ed3-b20e-d825e4db1edb","Type":"ContainerStarted","Data":"574e56a24df0555271a4809f4200d0788d8603b8b1aae9a6cfbf9d5fca9f8829"}
Mar 09 14:07:26 crc kubenswrapper[4722]: I0309 14:07:26.125075 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w95cf" event={"ID":"65e3d647-8806-4c0c-b9aa-142739f2fbe0","Type":"ContainerDied","Data":"eaa712b3811f346415682a71aa51b3e96298af53fd6b21033a95e8bb66783011"}
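The two certificate_manager.go lines a few entries up are self-consistent: the quoted wait is exactly the rotation deadline minus the moment the line was logged, and upstream kubelet places that deadline at a randomly jittered fraction (roughly the 70-90% range) of the certificate's validity window, which is why it falls well before the 2027-02-24 expiration. The subtraction, with the deadline taken verbatim from the log (the "now" value is back-computed so the result matches to the nanosecond; the jitter range is an assumption about the upstream algorithm, not something this log states):

package main

import (
	"fmt"
	"time"
)

func main() {
	layout := "2006-01-02 15:04:05.999999999 -0700 MST"
	deadline, _ := time.Parse(layout, "2027-01-07 11:52:06.630649892 +0000 UTC")
	now, _ := time.Parse(layout, "2026-03-09 14:07:25.314096677 +0000 UTC")
	fmt.Println(deadline.Sub(now)) // 7293h44m41.316553215s, as logged
}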
pod="openshift-marketplace/certified-operators-w95cf" event={"ID":"65e3d647-8806-4c0c-b9aa-142739f2fbe0","Type":"ContainerDied","Data":"eaa712b3811f346415682a71aa51b3e96298af53fd6b21033a95e8bb66783011"} Mar 09 14:07:26 crc kubenswrapper[4722]: I0309 14:07:26.126350 4722 generic.go:334] "Generic (PLEG): container finished" podID="65e3d647-8806-4c0c-b9aa-142739f2fbe0" containerID="eaa712b3811f346415682a71aa51b3e96298af53fd6b21033a95e8bb66783011" exitCode=0 Mar 09 14:07:26 crc kubenswrapper[4722]: I0309 14:07:26.126761 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-c96ws" podUID="a9665111-e6dc-49f2-803a-96eebcc4c78c" containerName="registry-server" containerID="cri-o://1f4781e8695c6b455d3a85fd01562c7656968cb7c8e952cce56e3516aceb016d" gracePeriod=2 Mar 09 14:07:26 crc kubenswrapper[4722]: I0309 14:07:26.610578 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5677d5654d-pmsss"] Mar 09 14:07:26 crc kubenswrapper[4722]: I0309 14:07:26.612402 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5677d5654d-pmsss" podUID="11e24ad3-46ea-469c-bdec-73bd3eb10057" containerName="controller-manager" containerID="cri-o://ff377b4b0bc205b0487c84a89ab21c742b21398ffe13bcf659b9592ea5a8a88c" gracePeriod=30 Mar 09 14:07:26 crc kubenswrapper[4722]: I0309 14:07:26.626940 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551086-cvc28" Mar 09 14:07:26 crc kubenswrapper[4722]: I0309 14:07:26.646658 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c9b77b468-t9m8w"] Mar 09 14:07:26 crc kubenswrapper[4722]: I0309 14:07:26.647092 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7c9b77b468-t9m8w" podUID="7c0bd182-5e3e-4763-b517-2d6646ceddfa" containerName="route-controller-manager" containerID="cri-o://a154e9e25e8b5cdbeea7070f9c12442460bc3df257927d77c05bb8bce1524313" gracePeriod=30 Mar 09 14:07:26 crc kubenswrapper[4722]: I0309 14:07:26.805671 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-csnbb\" (UniqueName: \"kubernetes.io/projected/40be416c-1b7b-4973-b9ed-25ae20cd660d-kube-api-access-csnbb\") pod \"40be416c-1b7b-4973-b9ed-25ae20cd660d\" (UID: \"40be416c-1b7b-4973-b9ed-25ae20cd660d\") " Mar 09 14:07:26 crc kubenswrapper[4722]: I0309 14:07:26.814525 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40be416c-1b7b-4973-b9ed-25ae20cd660d-kube-api-access-csnbb" (OuterVolumeSpecName: "kube-api-access-csnbb") pod "40be416c-1b7b-4973-b9ed-25ae20cd660d" (UID: "40be416c-1b7b-4973-b9ed-25ae20cd660d"). InnerVolumeSpecName "kube-api-access-csnbb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:07:26 crc kubenswrapper[4722]: I0309 14:07:26.908013 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-csnbb\" (UniqueName: \"kubernetes.io/projected/40be416c-1b7b-4973-b9ed-25ae20cd660d-kube-api-access-csnbb\") on node \"crc\" DevicePath \"\"" Mar 09 14:07:27 crc kubenswrapper[4722]: I0309 14:07:27.142163 4722 generic.go:334] "Generic (PLEG): container finished" podID="cfd6e90e-4eeb-4372-8465-136a383e95b2" containerID="c2805a424b564415cca37b53a3bc23bada46f95aafd332061d8ba671858b7e21" exitCode=0 Mar 09 14:07:27 crc kubenswrapper[4722]: I0309 14:07:27.142299 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rdkdt" event={"ID":"cfd6e90e-4eeb-4372-8465-136a383e95b2","Type":"ContainerDied","Data":"c2805a424b564415cca37b53a3bc23bada46f95aafd332061d8ba671858b7e21"} Mar 09 14:07:27 crc kubenswrapper[4722]: I0309 14:07:27.154593 4722 generic.go:334] "Generic (PLEG): container finished" podID="a9665111-e6dc-49f2-803a-96eebcc4c78c" containerID="1f4781e8695c6b455d3a85fd01562c7656968cb7c8e952cce56e3516aceb016d" exitCode=0 Mar 09 14:07:27 crc kubenswrapper[4722]: I0309 14:07:27.154705 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c96ws" event={"ID":"a9665111-e6dc-49f2-803a-96eebcc4c78c","Type":"ContainerDied","Data":"1f4781e8695c6b455d3a85fd01562c7656968cb7c8e952cce56e3516aceb016d"} Mar 09 14:07:27 crc kubenswrapper[4722]: I0309 14:07:27.158140 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551086-cvc28" event={"ID":"40be416c-1b7b-4973-b9ed-25ae20cd660d","Type":"ContainerDied","Data":"00396f1583d3454ca2cb8f8ea0f6a14f1cba7e698f5924eb845c2864ed34ef09"} Mar 09 14:07:27 crc kubenswrapper[4722]: I0309 14:07:27.158183 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00396f1583d3454ca2cb8f8ea0f6a14f1cba7e698f5924eb845c2864ed34ef09" Mar 09 14:07:27 crc kubenswrapper[4722]: I0309 14:07:27.158181 4722 util.go:48] "No ready sandbox for pod can be found. 
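gracePeriod in the "Killing container with a grace period" entries is the SIGTERM-to-SIGKILL window: 2s for the short-lived marketplace registry container, 30s for the controller managers, and 600s for the machine-config daemon restarted earlier. A hedged process-level sketch of the pattern (illustrative only; kubelet actually delegates the kill to CRI-O over the CRI rather than signalling processes itself):

package main

import (
	"fmt"
	"os/exec"
	"syscall"
	"time"
)

// killWithGrace sends SIGTERM, waits up to grace for exit, then SIGKILLs.
func killWithGrace(cmd *exec.Cmd, grace time.Duration) {
	_ = cmd.Process.Signal(syscall.SIGTERM)
	done := make(chan error, 1)
	go func() { done <- cmd.Wait() }()
	select {
	case <-done:
		fmt.Println("exited within grace period")
	case <-time.After(grace):
		_ = cmd.Process.Kill() // SIGKILL
		<-done
		fmt.Println("force-killed after grace period")
	}
}

func main() {
	cmd := exec.Command("sleep", "60")
	if err := cmd.Start(); err != nil {
		return
	}
	killWithGrace(cmd, 2*time.Second) // cf. gracePeriod=2 above
}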
Mar 09 14:07:27 crc kubenswrapper[4722]: I0309 14:07:27.161916 4722 generic.go:334] "Generic (PLEG): container finished" podID="4b4a7622-7bca-4fca-adb3-eec526b21b2b" containerID="dd8eb5ce92a018092bfa959f5c229f7dcc17c39ce356b0af1bb7a9f58984e026" exitCode=0
Mar 09 14:07:27 crc kubenswrapper[4722]: I0309 14:07:27.161990 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zmkmp" event={"ID":"4b4a7622-7bca-4fca-adb3-eec526b21b2b","Type":"ContainerDied","Data":"dd8eb5ce92a018092bfa959f5c229f7dcc17c39ce356b0af1bb7a9f58984e026"}
Mar 09 14:07:27 crc kubenswrapper[4722]: I0309 14:07:27.167224 4722 generic.go:334] "Generic (PLEG): container finished" podID="b0b6cdb1-050e-4ed3-b20e-d825e4db1edb" containerID="574e56a24df0555271a4809f4200d0788d8603b8b1aae9a6cfbf9d5fca9f8829" exitCode=0
Mar 09 14:07:27 crc kubenswrapper[4722]: I0309 14:07:27.167361 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t8pts" event={"ID":"b0b6cdb1-050e-4ed3-b20e-d825e4db1edb","Type":"ContainerDied","Data":"574e56a24df0555271a4809f4200d0788d8603b8b1aae9a6cfbf9d5fca9f8829"}
Mar 09 14:07:27 crc kubenswrapper[4722]: I0309 14:07:27.174847 4722 generic.go:334] "Generic (PLEG): container finished" podID="7c0bd182-5e3e-4763-b517-2d6646ceddfa" containerID="a154e9e25e8b5cdbeea7070f9c12442460bc3df257927d77c05bb8bce1524313" exitCode=0
Mar 09 14:07:27 crc kubenswrapper[4722]: I0309 14:07:27.174978 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7c9b77b468-t9m8w" event={"ID":"7c0bd182-5e3e-4763-b517-2d6646ceddfa","Type":"ContainerDied","Data":"a154e9e25e8b5cdbeea7070f9c12442460bc3df257927d77c05bb8bce1524313"}
Mar 09 14:07:27 crc kubenswrapper[4722]: I0309 14:07:27.180119 4722 generic.go:334] "Generic (PLEG): container finished" podID="11e24ad3-46ea-469c-bdec-73bd3eb10057" containerID="ff377b4b0bc205b0487c84a89ab21c742b21398ffe13bcf659b9592ea5a8a88c" exitCode=0
Mar 09 14:07:27 crc kubenswrapper[4722]: I0309 14:07:27.180217 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5677d5654d-pmsss" event={"ID":"11e24ad3-46ea-469c-bdec-73bd3eb10057","Type":"ContainerDied","Data":"ff377b4b0bc205b0487c84a89ab21c742b21398ffe13bcf659b9592ea5a8a88c"}
Mar 09 14:07:27 crc kubenswrapper[4722]: I0309 14:07:27.596515 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c96ws"
Mar 09 14:07:27 crc kubenswrapper[4722]: I0309 14:07:27.725289 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9665111-e6dc-49f2-803a-96eebcc4c78c-catalog-content\") pod \"a9665111-e6dc-49f2-803a-96eebcc4c78c\" (UID: \"a9665111-e6dc-49f2-803a-96eebcc4c78c\") "
Mar 09 14:07:27 crc kubenswrapper[4722]: I0309 14:07:27.725352 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-58htf\" (UniqueName: \"kubernetes.io/projected/a9665111-e6dc-49f2-803a-96eebcc4c78c-kube-api-access-58htf\") pod \"a9665111-e6dc-49f2-803a-96eebcc4c78c\" (UID: \"a9665111-e6dc-49f2-803a-96eebcc4c78c\") "
Mar 09 14:07:27 crc kubenswrapper[4722]: I0309 14:07:27.725403 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9665111-e6dc-49f2-803a-96eebcc4c78c-utilities\") pod \"a9665111-e6dc-49f2-803a-96eebcc4c78c\" (UID: \"a9665111-e6dc-49f2-803a-96eebcc4c78c\") "
Mar 09 14:07:27 crc kubenswrapper[4722]: I0309 14:07:27.730500 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9665111-e6dc-49f2-803a-96eebcc4c78c-utilities" (OuterVolumeSpecName: "utilities") pod "a9665111-e6dc-49f2-803a-96eebcc4c78c" (UID: "a9665111-e6dc-49f2-803a-96eebcc4c78c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 14:07:27 crc kubenswrapper[4722]: I0309 14:07:27.736596 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9665111-e6dc-49f2-803a-96eebcc4c78c-kube-api-access-58htf" (OuterVolumeSpecName: "kube-api-access-58htf") pod "a9665111-e6dc-49f2-803a-96eebcc4c78c" (UID: "a9665111-e6dc-49f2-803a-96eebcc4c78c"). InnerVolumeSpecName "kube-api-access-58htf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:07:27 crc kubenswrapper[4722]: I0309 14:07:27.773863 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9665111-e6dc-49f2-803a-96eebcc4c78c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a9665111-e6dc-49f2-803a-96eebcc4c78c" (UID: "a9665111-e6dc-49f2-803a-96eebcc4c78c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 14:07:27 crc kubenswrapper[4722]: I0309 14:07:27.789823 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7c9b77b468-t9m8w"
Mar 09 14:07:27 crc kubenswrapper[4722]: I0309 14:07:27.826818 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9665111-e6dc-49f2-803a-96eebcc4c78c-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 09 14:07:27 crc kubenswrapper[4722]: I0309 14:07:27.827960 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-58htf\" (UniqueName: \"kubernetes.io/projected/a9665111-e6dc-49f2-803a-96eebcc4c78c-kube-api-access-58htf\") on node \"crc\" DevicePath \"\""
Mar 09 14:07:27 crc kubenswrapper[4722]: I0309 14:07:27.828027 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9665111-e6dc-49f2-803a-96eebcc4c78c-utilities\") on node \"crc\" DevicePath \"\""
Mar 09 14:07:27 crc kubenswrapper[4722]: I0309 14:07:27.929036 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrtt2\" (UniqueName: \"kubernetes.io/projected/7c0bd182-5e3e-4763-b517-2d6646ceddfa-kube-api-access-nrtt2\") pod \"7c0bd182-5e3e-4763-b517-2d6646ceddfa\" (UID: \"7c0bd182-5e3e-4763-b517-2d6646ceddfa\") "
Mar 09 14:07:27 crc kubenswrapper[4722]: I0309 14:07:27.929111 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7c0bd182-5e3e-4763-b517-2d6646ceddfa-client-ca\") pod \"7c0bd182-5e3e-4763-b517-2d6646ceddfa\" (UID: \"7c0bd182-5e3e-4763-b517-2d6646ceddfa\") "
Mar 09 14:07:27 crc kubenswrapper[4722]: I0309 14:07:27.929143 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c0bd182-5e3e-4763-b517-2d6646ceddfa-config\") pod \"7c0bd182-5e3e-4763-b517-2d6646ceddfa\" (UID: \"7c0bd182-5e3e-4763-b517-2d6646ceddfa\") "
Mar 09 14:07:27 crc kubenswrapper[4722]: I0309 14:07:27.929182 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7c0bd182-5e3e-4763-b517-2d6646ceddfa-serving-cert\") pod \"7c0bd182-5e3e-4763-b517-2d6646ceddfa\" (UID: \"7c0bd182-5e3e-4763-b517-2d6646ceddfa\") "
Mar 09 14:07:27 crc kubenswrapper[4722]: I0309 14:07:27.931213 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c0bd182-5e3e-4763-b517-2d6646ceddfa-config" (OuterVolumeSpecName: "config") pod "7c0bd182-5e3e-4763-b517-2d6646ceddfa" (UID: "7c0bd182-5e3e-4763-b517-2d6646ceddfa"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 14:07:27 crc kubenswrapper[4722]: I0309 14:07:27.933964 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c0bd182-5e3e-4763-b517-2d6646ceddfa-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7c0bd182-5e3e-4763-b517-2d6646ceddfa" (UID: "7c0bd182-5e3e-4763-b517-2d6646ceddfa"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:07:27 crc kubenswrapper[4722]: I0309 14:07:27.933965 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5677d5654d-pmsss"
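Each volume teardown above follows the same three-step progression: the reconciler logs "UnmountVolume started", the operation generator logs "UnmountVolume.TearDown succeeded", and "Volume detached" finally drops the volume from the actual state of the world. A skeletal sketch of that ordering (type and function names here are illustrative, not kubelet's):

package main

import "fmt"

type volume struct{ name, podUID string }

// reconcile unmounts every volume that is still mounted but no longer desired,
// mirroring the started -> TearDown succeeded -> detached sequence in the log.
func reconcile(mounted []volume, desired map[string]bool) {
	for _, v := range mounted {
		if desired[v.name] {
			continue
		}
		fmt.Printf("UnmountVolume started for volume %q pod %q\n", v.name, v.podUID)
		// plugin-specific TearDown would run here
		fmt.Printf("UnmountVolume.TearDown succeeded for volume %q\n", v.name)
		fmt.Printf("Volume detached for volume %q\n", v.name)
	}
}

func main() {
	reconcile([]volume{{"kube-api-access-nrtt2", "7c0bd182-5e3e-4763-b517-2d6646ceddfa"}}, map[string]bool{})
}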
Mar 09 14:07:27 crc kubenswrapper[4722]: I0309 14:07:27.934823 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c0bd182-5e3e-4763-b517-2d6646ceddfa-client-ca" (OuterVolumeSpecName: "client-ca") pod "7c0bd182-5e3e-4763-b517-2d6646ceddfa" (UID: "7c0bd182-5e3e-4763-b517-2d6646ceddfa"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 14:07:27 crc kubenswrapper[4722]: I0309 14:07:27.937583 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c0bd182-5e3e-4763-b517-2d6646ceddfa-kube-api-access-nrtt2" (OuterVolumeSpecName: "kube-api-access-nrtt2") pod "7c0bd182-5e3e-4763-b517-2d6646ceddfa" (UID: "7c0bd182-5e3e-4763-b517-2d6646ceddfa"). InnerVolumeSpecName "kube-api-access-nrtt2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.030135 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/11e24ad3-46ea-469c-bdec-73bd3eb10057-proxy-ca-bundles\") pod \"11e24ad3-46ea-469c-bdec-73bd3eb10057\" (UID: \"11e24ad3-46ea-469c-bdec-73bd3eb10057\") "
Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.030250 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11e24ad3-46ea-469c-bdec-73bd3eb10057-serving-cert\") pod \"11e24ad3-46ea-469c-bdec-73bd3eb10057\" (UID: \"11e24ad3-46ea-469c-bdec-73bd3eb10057\") "
Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.030341 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/11e24ad3-46ea-469c-bdec-73bd3eb10057-client-ca\") pod \"11e24ad3-46ea-469c-bdec-73bd3eb10057\" (UID: \"11e24ad3-46ea-469c-bdec-73bd3eb10057\") "
Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.030402 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hw2lc\" (UniqueName: \"kubernetes.io/projected/11e24ad3-46ea-469c-bdec-73bd3eb10057-kube-api-access-hw2lc\") pod \"11e24ad3-46ea-469c-bdec-73bd3eb10057\" (UID: \"11e24ad3-46ea-469c-bdec-73bd3eb10057\") "
Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.030460 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11e24ad3-46ea-469c-bdec-73bd3eb10057-config\") pod \"11e24ad3-46ea-469c-bdec-73bd3eb10057\" (UID: \"11e24ad3-46ea-469c-bdec-73bd3eb10057\") "
Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.030824 4722 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7c0bd182-5e3e-4763-b517-2d6646ceddfa-client-ca\") on node \"crc\" DevicePath \"\""
Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.030855 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c0bd182-5e3e-4763-b517-2d6646ceddfa-config\") on node \"crc\" DevicePath \"\""
Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.030868 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7c0bd182-5e3e-4763-b517-2d6646ceddfa-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.030882 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrtt2\" (UniqueName: \"kubernetes.io/projected/7c0bd182-5e3e-4763-b517-2d6646ceddfa-kube-api-access-nrtt2\") on node \"crc\" DevicePath \"\""
Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.032190 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11e24ad3-46ea-469c-bdec-73bd3eb10057-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "11e24ad3-46ea-469c-bdec-73bd3eb10057" (UID: "11e24ad3-46ea-469c-bdec-73bd3eb10057"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.032557 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11e24ad3-46ea-469c-bdec-73bd3eb10057-config" (OuterVolumeSpecName: "config") pod "11e24ad3-46ea-469c-bdec-73bd3eb10057" (UID: "11e24ad3-46ea-469c-bdec-73bd3eb10057"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.032627 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11e24ad3-46ea-469c-bdec-73bd3eb10057-client-ca" (OuterVolumeSpecName: "client-ca") pod "11e24ad3-46ea-469c-bdec-73bd3eb10057" (UID: "11e24ad3-46ea-469c-bdec-73bd3eb10057"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.036391 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11e24ad3-46ea-469c-bdec-73bd3eb10057-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "11e24ad3-46ea-469c-bdec-73bd3eb10057" (UID: "11e24ad3-46ea-469c-bdec-73bd3eb10057"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.039435 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11e24ad3-46ea-469c-bdec-73bd3eb10057-kube-api-access-hw2lc" (OuterVolumeSpecName: "kube-api-access-hw2lc") pod "11e24ad3-46ea-469c-bdec-73bd3eb10057" (UID: "11e24ad3-46ea-469c-bdec-73bd3eb10057"). InnerVolumeSpecName "kube-api-access-hw2lc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.131845 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11e24ad3-46ea-469c-bdec-73bd3eb10057-config\") on node \"crc\" DevicePath \"\""
Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.131890 4722 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/11e24ad3-46ea-469c-bdec-73bd3eb10057-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.131900 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11e24ad3-46ea-469c-bdec-73bd3eb10057-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.131908 4722 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/11e24ad3-46ea-469c-bdec-73bd3eb10057-client-ca\") on node \"crc\" DevicePath \"\""
Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.131919 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hw2lc\" (UniqueName: \"kubernetes.io/projected/11e24ad3-46ea-469c-bdec-73bd3eb10057-kube-api-access-hw2lc\") on node \"crc\" DevicePath \"\""
Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.213779 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c96ws" event={"ID":"a9665111-e6dc-49f2-803a-96eebcc4c78c","Type":"ContainerDied","Data":"1abc54c1cf595773b8b9982364a5828c8941a10b4323ea411a90248e7a74f76d"}
Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.213897 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c96ws"
Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.213986 4722 scope.go:117] "RemoveContainer" containerID="1f4781e8695c6b455d3a85fd01562c7656968cb7c8e952cce56e3516aceb016d"
Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.229037 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zmkmp" event={"ID":"4b4a7622-7bca-4fca-adb3-eec526b21b2b","Type":"ContainerStarted","Data":"b250fd3da865150b9501688128005be46b38134155b3c3c5f9375cf870b9042b"}
Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.236420 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t8pts" event={"ID":"b0b6cdb1-050e-4ed3-b20e-d825e4db1edb","Type":"ContainerStarted","Data":"e45aa95389ab189d5e3b21c6fc9b35ae0e60420f591a95fabb5a7e81051c4f2f"}
Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.261298 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-c96ws"]
Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.269130 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w95cf" event={"ID":"65e3d647-8806-4c0c-b9aa-142739f2fbe0","Type":"ContainerStarted","Data":"cf480bc5c8bb7328b096209debbb6ae3f3c95254a69760a97c4aaaab34c7f253"}
Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.272452 4722 scope.go:117] "RemoveContainer" containerID="c8f8238fec7fc479f9f9b20da03b87579e821be31b34435c1879e486656c2818"
Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.277506 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-c96ws"]
Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.277654 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pwxnx" event={"ID":"c0f74bde-752e-497e-ad82-ec7a1676bbd5","Type":"ContainerStarted","Data":"aa3e4f5b5b38ee676d8bc64dbcfe4f208f4ba83831c09629dbc541729ab9ea5c"}
Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.285980 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-t8pts" podStartSLOduration=3.154226256 podStartE2EDuration="58.285942527s" podCreationTimestamp="2026-03-09 14:06:30 +0000 UTC" firstStartedPulling="2026-03-09 14:06:32.54813407 +0000 UTC m=+233.103702646" lastFinishedPulling="2026-03-09 14:07:27.679850341 +0000 UTC m=+288.235418917" observedRunningTime="2026-03-09 14:07:28.282379177 +0000 UTC m=+288.837947773" watchObservedRunningTime="2026-03-09 14:07:28.285942527 +0000 UTC m=+288.841511113"
Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.290061 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7c9b77b468-t9m8w"
Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.290059 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7c9b77b468-t9m8w" event={"ID":"7c0bd182-5e3e-4763-b517-2d6646ceddfa","Type":"ContainerDied","Data":"ee5c3c563d33c7410cc4f1ca10690389ef75f43981c319a2d38920b6f28ab2f6"}
Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.295352 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5677d5654d-pmsss" event={"ID":"11e24ad3-46ea-469c-bdec-73bd3eb10057","Type":"ContainerDied","Data":"552ebd2bddaa9776fc1184b4e5d86cad79ae70a4da025d11cb797cc9c54b98b6"}
Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.295590 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5677d5654d-pmsss"
Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.299367 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rdkdt" event={"ID":"cfd6e90e-4eeb-4372-8465-136a383e95b2","Type":"ContainerStarted","Data":"78f5a393860e42b0934375b2f004056d71f44af329e9dd8bdc283424c1580305"}
Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.311187 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zmkmp" podStartSLOduration=3.131707052 podStartE2EDuration="58.311162463s" podCreationTimestamp="2026-03-09 14:06:30 +0000 UTC" firstStartedPulling="2026-03-09 14:06:32.539167555 +0000 UTC m=+233.094736141" lastFinishedPulling="2026-03-09 14:07:27.718622976 +0000 UTC m=+288.274191552" observedRunningTime="2026-03-09 14:07:28.305488515 +0000 UTC m=+288.861057101" watchObservedRunningTime="2026-03-09 14:07:28.311162463 +0000 UTC m=+288.866731069"
Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.315970 4722 scope.go:117] "RemoveContainer" containerID="4f4e8d1a2b8c2dbf9b4e1f90d8043b6bee2cdde2a3273416469c66457f516df8"
Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.333842 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-pwxnx" podStartSLOduration=3.521948001 podStartE2EDuration="58.333748325s" podCreationTimestamp="2026-03-09 14:06:30 +0000 UTC" firstStartedPulling="2026-03-09 14:06:32.537537785 +0000 UTC m=+233.093106361" lastFinishedPulling="2026-03-09 14:07:27.349338109 +0000 UTC m=+287.904906685" observedRunningTime="2026-03-09 14:07:28.32570985 +0000 UTC m=+288.881278426" watchObservedRunningTime="2026-03-09 14:07:28.333748325 +0000 UTC m=+288.889316911"
Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.339671 4722 scope.go:117] "RemoveContainer" containerID="a154e9e25e8b5cdbeea7070f9c12442460bc3df257927d77c05bb8bce1524313"
Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.364018 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-w95cf" podStartSLOduration=3.486098976 podStartE2EDuration="58.363979881s" podCreationTimestamp="2026-03-09 14:06:30 +0000 UTC" firstStartedPulling="2026-03-09 14:06:32.566023551 +0000 UTC m=+233.121592127" lastFinishedPulling="2026-03-09 14:07:27.443904456 +0000 UTC m=+287.999473032" observedRunningTime="2026-03-09 14:07:28.360841004 +0000 UTC m=+288.916409590" watchObservedRunningTime="2026-03-09 14:07:28.363979881 +0000 UTC m=+288.919548457"
Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.366253 4722 scope.go:117] "RemoveContainer" containerID="ff377b4b0bc205b0487c84a89ab21c742b21398ffe13bcf659b9592ea5a8a88c"
Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.396151 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rdkdt" podStartSLOduration=4.468894947 podStartE2EDuration="56.396110851s" podCreationTimestamp="2026-03-09 14:06:32 +0000 UTC" firstStartedPulling="2026-03-09 14:06:35.637142694 +0000 UTC m=+236.192711270" lastFinishedPulling="2026-03-09 14:07:27.564358598 +0000 UTC m=+288.119927174" observedRunningTime="2026-03-09 14:07:28.390968717 +0000 UTC m=+288.946537293" watchObservedRunningTime="2026-03-09 14:07:28.396110851 +0000 UTC m=+288.951679427"
Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.415879 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5677d5654d-pmsss"]
Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.424773 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5677d5654d-pmsss"]
Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.442661 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c9b77b468-t9m8w"]
Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.453927 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c9b77b468-t9m8w"]
Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.556553 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b98945b4f-krmvf"]
Mar 09 14:07:28 crc kubenswrapper[4722]: E0309 14:07:28.556873 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c0bd182-5e3e-4763-b517-2d6646ceddfa" containerName="route-controller-manager"
Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.556893 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c0bd182-5e3e-4763-b517-2d6646ceddfa" containerName="route-controller-manager"
Mar 09 14:07:28 crc kubenswrapper[4722]: E0309 14:07:28.556912 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11e24ad3-46ea-469c-bdec-73bd3eb10057" containerName="controller-manager"
Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.556921 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="11e24ad3-46ea-469c-bdec-73bd3eb10057" containerName="controller-manager"
Mar 09 14:07:28 crc kubenswrapper[4722]: E0309 14:07:28.556942 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9665111-e6dc-49f2-803a-96eebcc4c78c" containerName="extract-utilities"
Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.556949 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9665111-e6dc-49f2-803a-96eebcc4c78c" containerName="extract-utilities"
Mar 09 14:07:28 crc kubenswrapper[4722]: E0309 14:07:28.556958 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9665111-e6dc-49f2-803a-96eebcc4c78c" containerName="extract-content"
Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.556965 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9665111-e6dc-49f2-803a-96eebcc4c78c" containerName="extract-content"
Mar 09 14:07:28 crc kubenswrapper[4722]: E0309 14:07:28.556979 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="812aaee7-17d0-4777-a2a0-f1ffcb8a5ad4" containerName="pruner"
"RemoveStaleState: removing container" podUID="812aaee7-17d0-4777-a2a0-f1ffcb8a5ad4" containerName="pruner" Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.556988 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="812aaee7-17d0-4777-a2a0-f1ffcb8a5ad4" containerName="pruner" Mar 09 14:07:28 crc kubenswrapper[4722]: E0309 14:07:28.557000 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9665111-e6dc-49f2-803a-96eebcc4c78c" containerName="registry-server" Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.557007 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9665111-e6dc-49f2-803a-96eebcc4c78c" containerName="registry-server" Mar 09 14:07:28 crc kubenswrapper[4722]: E0309 14:07:28.557016 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40be416c-1b7b-4973-b9ed-25ae20cd660d" containerName="oc" Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.557022 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="40be416c-1b7b-4973-b9ed-25ae20cd660d" containerName="oc" Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.557147 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="11e24ad3-46ea-469c-bdec-73bd3eb10057" containerName="controller-manager" Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.557161 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9665111-e6dc-49f2-803a-96eebcc4c78c" containerName="registry-server" Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.557172 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c0bd182-5e3e-4763-b517-2d6646ceddfa" containerName="route-controller-manager" Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.557180 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="812aaee7-17d0-4777-a2a0-f1ffcb8a5ad4" containerName="pruner" Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.557191 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="40be416c-1b7b-4973-b9ed-25ae20cd660d" containerName="oc" Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.557753 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6b98945b4f-krmvf" Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.561737 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.561855 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.562671 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.563069 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.563480 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.563607 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.567569 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-8555d9c686-6xxgz"] Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.568545 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-8555d9c686-6xxgz" Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.575489 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.575494 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.575613 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b98945b4f-krmvf"] Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.576284 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.576493 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.578438 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.581415 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.584021 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-8555d9c686-6xxgz"] Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.584555 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.740195 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7t97\" (UniqueName: 
\"kubernetes.io/projected/d7afc4c0-854b-4491-8db5-a6dbdc888d1c-kube-api-access-b7t97\") pod \"route-controller-manager-6b98945b4f-krmvf\" (UID: \"d7afc4c0-854b-4491-8db5-a6dbdc888d1c\") " pod="openshift-route-controller-manager/route-controller-manager-6b98945b4f-krmvf" Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.740350 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d7afc4c0-854b-4491-8db5-a6dbdc888d1c-client-ca\") pod \"route-controller-manager-6b98945b4f-krmvf\" (UID: \"d7afc4c0-854b-4491-8db5-a6dbdc888d1c\") " pod="openshift-route-controller-manager/route-controller-manager-6b98945b4f-krmvf" Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.740399 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3bdc231e-2cd0-4323-a872-803e28ab5868-serving-cert\") pod \"controller-manager-8555d9c686-6xxgz\" (UID: \"3bdc231e-2cd0-4323-a872-803e28ab5868\") " pod="openshift-controller-manager/controller-manager-8555d9c686-6xxgz" Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.740427 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7afc4c0-854b-4491-8db5-a6dbdc888d1c-config\") pod \"route-controller-manager-6b98945b4f-krmvf\" (UID: \"d7afc4c0-854b-4491-8db5-a6dbdc888d1c\") " pod="openshift-route-controller-manager/route-controller-manager-6b98945b4f-krmvf" Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.740456 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bdc231e-2cd0-4323-a872-803e28ab5868-config\") pod \"controller-manager-8555d9c686-6xxgz\" (UID: \"3bdc231e-2cd0-4323-a872-803e28ab5868\") " pod="openshift-controller-manager/controller-manager-8555d9c686-6xxgz" Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.740476 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3bdc231e-2cd0-4323-a872-803e28ab5868-proxy-ca-bundles\") pod \"controller-manager-8555d9c686-6xxgz\" (UID: \"3bdc231e-2cd0-4323-a872-803e28ab5868\") " pod="openshift-controller-manager/controller-manager-8555d9c686-6xxgz" Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.740501 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7afc4c0-854b-4491-8db5-a6dbdc888d1c-serving-cert\") pod \"route-controller-manager-6b98945b4f-krmvf\" (UID: \"d7afc4c0-854b-4491-8db5-a6dbdc888d1c\") " pod="openshift-route-controller-manager/route-controller-manager-6b98945b4f-krmvf" Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.740528 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-td4zf\" (UniqueName: \"kubernetes.io/projected/3bdc231e-2cd0-4323-a872-803e28ab5868-kube-api-access-td4zf\") pod \"controller-manager-8555d9c686-6xxgz\" (UID: \"3bdc231e-2cd0-4323-a872-803e28ab5868\") " pod="openshift-controller-manager/controller-manager-8555d9c686-6xxgz" Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.740554 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" 
Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.842321 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3bdc231e-2cd0-4323-a872-803e28ab5868-serving-cert\") pod \"controller-manager-8555d9c686-6xxgz\" (UID: \"3bdc231e-2cd0-4323-a872-803e28ab5868\") " pod="openshift-controller-manager/controller-manager-8555d9c686-6xxgz"
Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.842407 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7afc4c0-854b-4491-8db5-a6dbdc888d1c-config\") pod \"route-controller-manager-6b98945b4f-krmvf\" (UID: \"d7afc4c0-854b-4491-8db5-a6dbdc888d1c\") " pod="openshift-route-controller-manager/route-controller-manager-6b98945b4f-krmvf"
Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.842451 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bdc231e-2cd0-4323-a872-803e28ab5868-config\") pod \"controller-manager-8555d9c686-6xxgz\" (UID: \"3bdc231e-2cd0-4323-a872-803e28ab5868\") " pod="openshift-controller-manager/controller-manager-8555d9c686-6xxgz"
Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.842477 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3bdc231e-2cd0-4323-a872-803e28ab5868-proxy-ca-bundles\") pod \"controller-manager-8555d9c686-6xxgz\" (UID: \"3bdc231e-2cd0-4323-a872-803e28ab5868\") " pod="openshift-controller-manager/controller-manager-8555d9c686-6xxgz"
Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.842506 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7afc4c0-854b-4491-8db5-a6dbdc888d1c-serving-cert\") pod \"route-controller-manager-6b98945b4f-krmvf\" (UID: \"d7afc4c0-854b-4491-8db5-a6dbdc888d1c\") " pod="openshift-route-controller-manager/route-controller-manager-6b98945b4f-krmvf"
Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.842538 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-td4zf\" (UniqueName: \"kubernetes.io/projected/3bdc231e-2cd0-4323-a872-803e28ab5868-kube-api-access-td4zf\") pod \"controller-manager-8555d9c686-6xxgz\" (UID: \"3bdc231e-2cd0-4323-a872-803e28ab5868\") " pod="openshift-controller-manager/controller-manager-8555d9c686-6xxgz"
Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.842570 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3bdc231e-2cd0-4323-a872-803e28ab5868-client-ca\") pod \"controller-manager-8555d9c686-6xxgz\" (UID: \"3bdc231e-2cd0-4323-a872-803e28ab5868\") " pod="openshift-controller-manager/controller-manager-8555d9c686-6xxgz"
Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.842613 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7t97\" (UniqueName: \"kubernetes.io/projected/d7afc4c0-854b-4491-8db5-a6dbdc888d1c-kube-api-access-b7t97\") pod \"route-controller-manager-6b98945b4f-krmvf\" (UID: \"d7afc4c0-854b-4491-8db5-a6dbdc888d1c\") " pod="openshift-route-controller-manager/route-controller-manager-6b98945b4f-krmvf"
Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.842642 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d7afc4c0-854b-4491-8db5-a6dbdc888d1c-client-ca\") pod \"route-controller-manager-6b98945b4f-krmvf\" (UID: \"d7afc4c0-854b-4491-8db5-a6dbdc888d1c\") " pod="openshift-route-controller-manager/route-controller-manager-6b98945b4f-krmvf"
Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.844056 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d7afc4c0-854b-4491-8db5-a6dbdc888d1c-client-ca\") pod \"route-controller-manager-6b98945b4f-krmvf\" (UID: \"d7afc4c0-854b-4491-8db5-a6dbdc888d1c\") " pod="openshift-route-controller-manager/route-controller-manager-6b98945b4f-krmvf"
Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.844407 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3bdc231e-2cd0-4323-a872-803e28ab5868-client-ca\") pod \"controller-manager-8555d9c686-6xxgz\" (UID: \"3bdc231e-2cd0-4323-a872-803e28ab5868\") " pod="openshift-controller-manager/controller-manager-8555d9c686-6xxgz"
Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.844682 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7afc4c0-854b-4491-8db5-a6dbdc888d1c-config\") pod \"route-controller-manager-6b98945b4f-krmvf\" (UID: \"d7afc4c0-854b-4491-8db5-a6dbdc888d1c\") " pod="openshift-route-controller-manager/route-controller-manager-6b98945b4f-krmvf"
Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.844987 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bdc231e-2cd0-4323-a872-803e28ab5868-config\") pod \"controller-manager-8555d9c686-6xxgz\" (UID: \"3bdc231e-2cd0-4323-a872-803e28ab5868\") " pod="openshift-controller-manager/controller-manager-8555d9c686-6xxgz"
Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.845682 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3bdc231e-2cd0-4323-a872-803e28ab5868-proxy-ca-bundles\") pod \"controller-manager-8555d9c686-6xxgz\" (UID: \"3bdc231e-2cd0-4323-a872-803e28ab5868\") " pod="openshift-controller-manager/controller-manager-8555d9c686-6xxgz"
Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.851577 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7afc4c0-854b-4491-8db5-a6dbdc888d1c-serving-cert\") pod \"route-controller-manager-6b98945b4f-krmvf\" (UID: \"d7afc4c0-854b-4491-8db5-a6dbdc888d1c\") " pod="openshift-route-controller-manager/route-controller-manager-6b98945b4f-krmvf"
Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.854009 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3bdc231e-2cd0-4323-a872-803e28ab5868-serving-cert\") pod \"controller-manager-8555d9c686-6xxgz\" (UID: \"3bdc231e-2cd0-4323-a872-803e28ab5868\") " pod="openshift-controller-manager/controller-manager-8555d9c686-6xxgz"
Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.872266 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-td4zf\" (UniqueName: \"kubernetes.io/projected/3bdc231e-2cd0-4323-a872-803e28ab5868-kube-api-access-td4zf\") pod \"controller-manager-8555d9c686-6xxgz\" (UID: \"3bdc231e-2cd0-4323-a872-803e28ab5868\") " pod="openshift-controller-manager/controller-manager-8555d9c686-6xxgz"
Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.876548 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7t97\" (UniqueName: \"kubernetes.io/projected/d7afc4c0-854b-4491-8db5-a6dbdc888d1c-kube-api-access-b7t97\") pod \"route-controller-manager-6b98945b4f-krmvf\" (UID: \"d7afc4c0-854b-4491-8db5-a6dbdc888d1c\") " pod="openshift-route-controller-manager/route-controller-manager-6b98945b4f-krmvf"
Mar 09 14:07:28 crc kubenswrapper[4722]: I0309 14:07:28.893993 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-8555d9c686-6xxgz"
Mar 09 14:07:29 crc kubenswrapper[4722]: I0309 14:07:29.176630 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6b98945b4f-krmvf"
Mar 09 14:07:29 crc kubenswrapper[4722]: I0309 14:07:29.317589 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-8555d9c686-6xxgz"]
Mar 09 14:07:29 crc kubenswrapper[4722]: I0309 14:07:29.370998 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c2g4c" event={"ID":"7df01eab-424f-40b1-a40c-03b930a8fac6","Type":"ContainerStarted","Data":"c4cf8c1a507fc5a3ab3987d6872bc9a199c343500986f97c7465d38446bb2443"}
Mar 09 14:07:29 crc kubenswrapper[4722]: I0309 14:07:29.758332 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b98945b4f-krmvf"]
Mar 09 14:07:29 crc kubenswrapper[4722]: W0309 14:07:29.762549 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7afc4c0_854b_4491_8db5_a6dbdc888d1c.slice/crio-639079ea050cb860daaa4addd8466963cc0fe384b1276e4d16ab6c97d1d6909d WatchSource:0}: Error finding container 639079ea050cb860daaa4addd8466963cc0fe384b1276e4d16ab6c97d1d6909d: Status 404 returned error can't find the container with id 639079ea050cb860daaa4addd8466963cc0fe384b1276e4d16ab6c97d1d6909d
Mar 09 14:07:30 crc kubenswrapper[4722]: I0309 14:07:30.159216 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11e24ad3-46ea-469c-bdec-73bd3eb10057" path="/var/lib/kubelet/pods/11e24ad3-46ea-469c-bdec-73bd3eb10057/volumes"
Mar 09 14:07:30 crc kubenswrapper[4722]: I0309 14:07:30.160439 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c0bd182-5e3e-4763-b517-2d6646ceddfa" path="/var/lib/kubelet/pods/7c0bd182-5e3e-4763-b517-2d6646ceddfa/volumes"
Mar 09 14:07:30 crc kubenswrapper[4722]: I0309 14:07:30.161055 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9665111-e6dc-49f2-803a-96eebcc4c78c" path="/var/lib/kubelet/pods/a9665111-e6dc-49f2-803a-96eebcc4c78c/volumes"
Mar 09 14:07:30 crc kubenswrapper[4722]: I0309 14:07:30.406330 4722 generic.go:334] "Generic (PLEG): container finished" podID="7df01eab-424f-40b1-a40c-03b930a8fac6" containerID="c4cf8c1a507fc5a3ab3987d6872bc9a199c343500986f97c7465d38446bb2443" exitCode=0
Mar 09 14:07:30 crc kubenswrapper[4722]: I0309 14:07:30.406420 4722 kubelet.go:2453]
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c2g4c" event={"ID":"7df01eab-424f-40b1-a40c-03b930a8fac6","Type":"ContainerDied","Data":"c4cf8c1a507fc5a3ab3987d6872bc9a199c343500986f97c7465d38446bb2443"} Mar 09 14:07:30 crc kubenswrapper[4722]: I0309 14:07:30.410337 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6b98945b4f-krmvf" event={"ID":"d7afc4c0-854b-4491-8db5-a6dbdc888d1c","Type":"ContainerStarted","Data":"4323b8690b994f2872b8977ceb081700d560bec7ee6f376993ce8f318c792782"} Mar 09 14:07:30 crc kubenswrapper[4722]: I0309 14:07:30.410374 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6b98945b4f-krmvf" event={"ID":"d7afc4c0-854b-4491-8db5-a6dbdc888d1c","Type":"ContainerStarted","Data":"639079ea050cb860daaa4addd8466963cc0fe384b1276e4d16ab6c97d1d6909d"} Mar 09 14:07:30 crc kubenswrapper[4722]: I0309 14:07:30.412439 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8555d9c686-6xxgz" event={"ID":"3bdc231e-2cd0-4323-a872-803e28ab5868","Type":"ContainerStarted","Data":"ab25961a4f544814b62328a9f21b27b3e7c6a24839aa4b36fcfcd6861c6d8f16"} Mar 09 14:07:30 crc kubenswrapper[4722]: I0309 14:07:30.412467 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8555d9c686-6xxgz" event={"ID":"3bdc231e-2cd0-4323-a872-803e28ab5868","Type":"ContainerStarted","Data":"4a9512104b74ce61513fb3e6023d91badad5cc782fbe5107a783ed4439a43f96"} Mar 09 14:07:30 crc kubenswrapper[4722]: I0309 14:07:30.412852 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-8555d9c686-6xxgz" Mar 09 14:07:30 crc kubenswrapper[4722]: I0309 14:07:30.421514 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-8555d9c686-6xxgz" Mar 09 14:07:30 crc kubenswrapper[4722]: I0309 14:07:30.514755 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-8555d9c686-6xxgz" podStartSLOduration=4.514719717 podStartE2EDuration="4.514719717s" podCreationTimestamp="2026-03-09 14:07:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:07:30.51088711 +0000 UTC m=+291.066455706" watchObservedRunningTime="2026-03-09 14:07:30.514719717 +0000 UTC m=+291.070288313" Mar 09 14:07:30 crc kubenswrapper[4722]: I0309 14:07:30.696846 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-w95cf" Mar 09 14:07:30 crc kubenswrapper[4722]: I0309 14:07:30.697784 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-w95cf" Mar 09 14:07:30 crc kubenswrapper[4722]: I0309 14:07:30.873893 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-pwxnx" Mar 09 14:07:30 crc kubenswrapper[4722]: I0309 14:07:30.875049 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-pwxnx" Mar 09 14:07:30 crc kubenswrapper[4722]: I0309 14:07:30.941173 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/community-operators-pwxnx" Mar 09 14:07:31 crc kubenswrapper[4722]: I0309 14:07:31.108776 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-t8pts" Mar 09 14:07:31 crc kubenswrapper[4722]: I0309 14:07:31.108840 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-t8pts" Mar 09 14:07:31 crc kubenswrapper[4722]: I0309 14:07:31.301455 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zmkmp" Mar 09 14:07:31 crc kubenswrapper[4722]: I0309 14:07:31.301507 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zmkmp" Mar 09 14:07:31 crc kubenswrapper[4722]: I0309 14:07:31.353635 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zmkmp" Mar 09 14:07:31 crc kubenswrapper[4722]: I0309 14:07:31.450012 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6b98945b4f-krmvf" podStartSLOduration=5.449987128 podStartE2EDuration="5.449987128s" podCreationTimestamp="2026-03-09 14:07:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:07:31.449179646 +0000 UTC m=+292.004748222" watchObservedRunningTime="2026-03-09 14:07:31.449987128 +0000 UTC m=+292.005555704" Mar 09 14:07:31 crc kubenswrapper[4722]: I0309 14:07:31.657330 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-knrzp" Mar 09 14:07:31 crc kubenswrapper[4722]: I0309 14:07:31.752036 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-w95cf" podUID="65e3d647-8806-4c0c-b9aa-142739f2fbe0" containerName="registry-server" probeResult="failure" output=< Mar 09 14:07:31 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s Mar 09 14:07:31 crc kubenswrapper[4722]: > Mar 09 14:07:32 crc kubenswrapper[4722]: I0309 14:07:32.159872 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-t8pts" podUID="b0b6cdb1-050e-4ed3-b20e-d825e4db1edb" containerName="registry-server" probeResult="failure" output=< Mar 09 14:07:32 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s Mar 09 14:07:32 crc kubenswrapper[4722]: > Mar 09 14:07:32 crc kubenswrapper[4722]: I0309 14:07:32.891919 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rdkdt" Mar 09 14:07:32 crc kubenswrapper[4722]: I0309 14:07:32.891980 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rdkdt" Mar 09 14:07:32 crc kubenswrapper[4722]: I0309 14:07:32.948450 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rdkdt" Mar 09 14:07:33 crc kubenswrapper[4722]: I0309 14:07:33.485954 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rdkdt" Mar 09 14:07:34 crc kubenswrapper[4722]: I0309 14:07:34.386632 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-operators-r2hp9" Mar 09 14:07:34 crc kubenswrapper[4722]: I0309 14:07:34.439121 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-r2hp9" Mar 09 14:07:35 crc kubenswrapper[4722]: I0309 14:07:35.461752 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c2g4c" event={"ID":"7df01eab-424f-40b1-a40c-03b930a8fac6","Type":"ContainerStarted","Data":"75288ffd906d8bf407d2dcb493b2b28933f498f7cd819bb126299f76207c3ac7"} Mar 09 14:07:35 crc kubenswrapper[4722]: I0309 14:07:35.488529 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-c2g4c" podStartSLOduration=3.833724688 podStartE2EDuration="1m2.488503008s" podCreationTimestamp="2026-03-09 14:06:33 +0000 UTC" firstStartedPulling="2026-03-09 14:06:35.629742725 +0000 UTC m=+236.185311301" lastFinishedPulling="2026-03-09 14:07:34.284521045 +0000 UTC m=+294.840089621" observedRunningTime="2026-03-09 14:07:35.487603023 +0000 UTC m=+296.043171639" watchObservedRunningTime="2026-03-09 14:07:35.488503008 +0000 UTC m=+296.044071584" Mar 09 14:07:36 crc kubenswrapper[4722]: I0309 14:07:36.351492 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-r2hp9"] Mar 09 14:07:36 crc kubenswrapper[4722]: I0309 14:07:36.351743 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-r2hp9" podUID="6ba5c493-a951-4d61-acf3-5ee964dcfe60" containerName="registry-server" containerID="cri-o://f78b83bee983b02a1487dfd5f2313e13f3c8fa4aba7e87056f1d7e1ad5f97ffe" gracePeriod=2 Mar 09 14:07:37 crc kubenswrapper[4722]: I0309 14:07:37.478798 4722 generic.go:334] "Generic (PLEG): container finished" podID="6ba5c493-a951-4d61-acf3-5ee964dcfe60" containerID="f78b83bee983b02a1487dfd5f2313e13f3c8fa4aba7e87056f1d7e1ad5f97ffe" exitCode=0 Mar 09 14:07:37 crc kubenswrapper[4722]: I0309 14:07:37.478904 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r2hp9" event={"ID":"6ba5c493-a951-4d61-acf3-5ee964dcfe60","Type":"ContainerDied","Data":"f78b83bee983b02a1487dfd5f2313e13f3c8fa4aba7e87056f1d7e1ad5f97ffe"} Mar 09 14:07:38 crc kubenswrapper[4722]: I0309 14:07:38.751655 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-r2hp9" Mar 09 14:07:38 crc kubenswrapper[4722]: I0309 14:07:38.928533 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bcwjz\" (UniqueName: \"kubernetes.io/projected/6ba5c493-a951-4d61-acf3-5ee964dcfe60-kube-api-access-bcwjz\") pod \"6ba5c493-a951-4d61-acf3-5ee964dcfe60\" (UID: \"6ba5c493-a951-4d61-acf3-5ee964dcfe60\") " Mar 09 14:07:38 crc kubenswrapper[4722]: I0309 14:07:38.929132 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ba5c493-a951-4d61-acf3-5ee964dcfe60-utilities\") pod \"6ba5c493-a951-4d61-acf3-5ee964dcfe60\" (UID: \"6ba5c493-a951-4d61-acf3-5ee964dcfe60\") " Mar 09 14:07:38 crc kubenswrapper[4722]: I0309 14:07:38.929197 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ba5c493-a951-4d61-acf3-5ee964dcfe60-catalog-content\") pod \"6ba5c493-a951-4d61-acf3-5ee964dcfe60\" (UID: \"6ba5c493-a951-4d61-acf3-5ee964dcfe60\") " Mar 09 14:07:38 crc kubenswrapper[4722]: I0309 14:07:38.930550 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ba5c493-a951-4d61-acf3-5ee964dcfe60-utilities" (OuterVolumeSpecName: "utilities") pod "6ba5c493-a951-4d61-acf3-5ee964dcfe60" (UID: "6ba5c493-a951-4d61-acf3-5ee964dcfe60"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:07:38 crc kubenswrapper[4722]: I0309 14:07:38.937881 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ba5c493-a951-4d61-acf3-5ee964dcfe60-kube-api-access-bcwjz" (OuterVolumeSpecName: "kube-api-access-bcwjz") pod "6ba5c493-a951-4d61-acf3-5ee964dcfe60" (UID: "6ba5c493-a951-4d61-acf3-5ee964dcfe60"). InnerVolumeSpecName "kube-api-access-bcwjz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:07:39 crc kubenswrapper[4722]: I0309 14:07:39.031159 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bcwjz\" (UniqueName: \"kubernetes.io/projected/6ba5c493-a951-4d61-acf3-5ee964dcfe60-kube-api-access-bcwjz\") on node \"crc\" DevicePath \"\"" Mar 09 14:07:39 crc kubenswrapper[4722]: I0309 14:07:39.031214 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ba5c493-a951-4d61-acf3-5ee964dcfe60-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 14:07:39 crc kubenswrapper[4722]: I0309 14:07:39.067518 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ba5c493-a951-4d61-acf3-5ee964dcfe60-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6ba5c493-a951-4d61-acf3-5ee964dcfe60" (UID: "6ba5c493-a951-4d61-acf3-5ee964dcfe60"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:07:39 crc kubenswrapper[4722]: I0309 14:07:39.133195 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ba5c493-a951-4d61-acf3-5ee964dcfe60-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 14:07:39 crc kubenswrapper[4722]: I0309 14:07:39.178315 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6b98945b4f-krmvf" Mar 09 14:07:39 crc kubenswrapper[4722]: I0309 14:07:39.185894 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6b98945b4f-krmvf" Mar 09 14:07:39 crc kubenswrapper[4722]: I0309 14:07:39.503327 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r2hp9" event={"ID":"6ba5c493-a951-4d61-acf3-5ee964dcfe60","Type":"ContainerDied","Data":"c0b77c7bc9d97f3661f2a57a288ede760a71b5527fb709c33e741d593844a945"} Mar 09 14:07:39 crc kubenswrapper[4722]: I0309 14:07:39.503438 4722 scope.go:117] "RemoveContainer" containerID="f78b83bee983b02a1487dfd5f2313e13f3c8fa4aba7e87056f1d7e1ad5f97ffe" Mar 09 14:07:39 crc kubenswrapper[4722]: I0309 14:07:39.503364 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-r2hp9" Mar 09 14:07:39 crc kubenswrapper[4722]: I0309 14:07:39.531825 4722 scope.go:117] "RemoveContainer" containerID="02554076861fbf645322f00023f0309b6094f4865c2951838c9821d2260303df" Mar 09 14:07:39 crc kubenswrapper[4722]: I0309 14:07:39.545041 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-r2hp9"] Mar 09 14:07:39 crc kubenswrapper[4722]: I0309 14:07:39.546104 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-r2hp9"] Mar 09 14:07:39 crc kubenswrapper[4722]: I0309 14:07:39.571800 4722 scope.go:117] "RemoveContainer" containerID="103e0ee2ddfa2f25215d03fd15afc82ae1c8aad352747fff91d183a51d30c34f" Mar 09 14:07:40 crc kubenswrapper[4722]: I0309 14:07:40.182043 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ba5c493-a951-4d61-acf3-5ee964dcfe60" path="/var/lib/kubelet/pods/6ba5c493-a951-4d61-acf3-5ee964dcfe60/volumes" Mar 09 14:07:40 crc kubenswrapper[4722]: I0309 14:07:40.743522 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-w95cf" Mar 09 14:07:40 crc kubenswrapper[4722]: I0309 14:07:40.783359 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-w95cf" Mar 09 14:07:40 crc kubenswrapper[4722]: I0309 14:07:40.916164 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-pwxnx" Mar 09 14:07:41 crc kubenswrapper[4722]: I0309 14:07:41.151269 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-t8pts" Mar 09 14:07:41 crc kubenswrapper[4722]: I0309 14:07:41.201476 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-t8pts" Mar 09 14:07:41 crc kubenswrapper[4722]: I0309 14:07:41.343010 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zmkmp" Mar 09 
14:07:42 crc kubenswrapper[4722]: I0309 14:07:42.954171 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-t8pts"] Mar 09 14:07:42 crc kubenswrapper[4722]: I0309 14:07:42.954883 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-t8pts" podUID="b0b6cdb1-050e-4ed3-b20e-d825e4db1edb" containerName="registry-server" containerID="cri-o://e45aa95389ab189d5e3b21c6fc9b35ae0e60420f591a95fabb5a7e81051c4f2f" gracePeriod=2 Mar 09 14:07:43 crc kubenswrapper[4722]: I0309 14:07:43.155990 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zmkmp"] Mar 09 14:07:43 crc kubenswrapper[4722]: I0309 14:07:43.156611 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zmkmp" podUID="4b4a7622-7bca-4fca-adb3-eec526b21b2b" containerName="registry-server" containerID="cri-o://b250fd3da865150b9501688128005be46b38134155b3c3c5f9375cf870b9042b" gracePeriod=2 Mar 09 14:07:43 crc kubenswrapper[4722]: I0309 14:07:43.903480 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-c2g4c" Mar 09 14:07:43 crc kubenswrapper[4722]: I0309 14:07:43.903960 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-c2g4c" Mar 09 14:07:43 crc kubenswrapper[4722]: I0309 14:07:43.951454 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-c2g4c" Mar 09 14:07:44 crc kubenswrapper[4722]: I0309 14:07:44.542679 4722 generic.go:334] "Generic (PLEG): container finished" podID="b0b6cdb1-050e-4ed3-b20e-d825e4db1edb" containerID="e45aa95389ab189d5e3b21c6fc9b35ae0e60420f591a95fabb5a7e81051c4f2f" exitCode=0 Mar 09 14:07:44 crc kubenswrapper[4722]: I0309 14:07:44.542722 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t8pts" event={"ID":"b0b6cdb1-050e-4ed3-b20e-d825e4db1edb","Type":"ContainerDied","Data":"e45aa95389ab189d5e3b21c6fc9b35ae0e60420f591a95fabb5a7e81051c4f2f"} Mar 09 14:07:44 crc kubenswrapper[4722]: I0309 14:07:44.547693 4722 generic.go:334] "Generic (PLEG): container finished" podID="4b4a7622-7bca-4fca-adb3-eec526b21b2b" containerID="b250fd3da865150b9501688128005be46b38134155b3c3c5f9375cf870b9042b" exitCode=0 Mar 09 14:07:44 crc kubenswrapper[4722]: I0309 14:07:44.547790 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zmkmp" event={"ID":"4b4a7622-7bca-4fca-adb3-eec526b21b2b","Type":"ContainerDied","Data":"b250fd3da865150b9501688128005be46b38134155b3c3c5f9375cf870b9042b"} Mar 09 14:07:44 crc kubenswrapper[4722]: I0309 14:07:44.595371 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-c2g4c" Mar 09 14:07:44 crc kubenswrapper[4722]: I0309 14:07:44.652744 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zmkmp" Mar 09 14:07:44 crc kubenswrapper[4722]: I0309 14:07:44.663266 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-t8pts" Mar 09 14:07:44 crc kubenswrapper[4722]: I0309 14:07:44.834101 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk4lv\" (UniqueName: \"kubernetes.io/projected/b0b6cdb1-050e-4ed3-b20e-d825e4db1edb-kube-api-access-tk4lv\") pod \"b0b6cdb1-050e-4ed3-b20e-d825e4db1edb\" (UID: \"b0b6cdb1-050e-4ed3-b20e-d825e4db1edb\") " Mar 09 14:07:44 crc kubenswrapper[4722]: I0309 14:07:44.834341 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b4a7622-7bca-4fca-adb3-eec526b21b2b-utilities\") pod \"4b4a7622-7bca-4fca-adb3-eec526b21b2b\" (UID: \"4b4a7622-7bca-4fca-adb3-eec526b21b2b\") " Mar 09 14:07:44 crc kubenswrapper[4722]: I0309 14:07:44.834402 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b4a7622-7bca-4fca-adb3-eec526b21b2b-catalog-content\") pod \"4b4a7622-7bca-4fca-adb3-eec526b21b2b\" (UID: \"4b4a7622-7bca-4fca-adb3-eec526b21b2b\") " Mar 09 14:07:44 crc kubenswrapper[4722]: I0309 14:07:44.834504 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0b6cdb1-050e-4ed3-b20e-d825e4db1edb-utilities\") pod \"b0b6cdb1-050e-4ed3-b20e-d825e4db1edb\" (UID: \"b0b6cdb1-050e-4ed3-b20e-d825e4db1edb\") " Mar 09 14:07:44 crc kubenswrapper[4722]: I0309 14:07:44.835304 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b4a7622-7bca-4fca-adb3-eec526b21b2b-utilities" (OuterVolumeSpecName: "utilities") pod "4b4a7622-7bca-4fca-adb3-eec526b21b2b" (UID: "4b4a7622-7bca-4fca-adb3-eec526b21b2b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:07:44 crc kubenswrapper[4722]: I0309 14:07:44.835646 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqvdd\" (UniqueName: \"kubernetes.io/projected/4b4a7622-7bca-4fca-adb3-eec526b21b2b-kube-api-access-cqvdd\") pod \"4b4a7622-7bca-4fca-adb3-eec526b21b2b\" (UID: \"4b4a7622-7bca-4fca-adb3-eec526b21b2b\") " Mar 09 14:07:44 crc kubenswrapper[4722]: I0309 14:07:44.835687 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0b6cdb1-050e-4ed3-b20e-d825e4db1edb-utilities" (OuterVolumeSpecName: "utilities") pod "b0b6cdb1-050e-4ed3-b20e-d825e4db1edb" (UID: "b0b6cdb1-050e-4ed3-b20e-d825e4db1edb"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:07:44 crc kubenswrapper[4722]: I0309 14:07:44.835875 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0b6cdb1-050e-4ed3-b20e-d825e4db1edb-catalog-content\") pod \"b0b6cdb1-050e-4ed3-b20e-d825e4db1edb\" (UID: \"b0b6cdb1-050e-4ed3-b20e-d825e4db1edb\") " Mar 09 14:07:44 crc kubenswrapper[4722]: I0309 14:07:44.836471 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b4a7622-7bca-4fca-adb3-eec526b21b2b-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 14:07:44 crc kubenswrapper[4722]: I0309 14:07:44.836500 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0b6cdb1-050e-4ed3-b20e-d825e4db1edb-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 14:07:44 crc kubenswrapper[4722]: I0309 14:07:44.843519 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b4a7622-7bca-4fca-adb3-eec526b21b2b-kube-api-access-cqvdd" (OuterVolumeSpecName: "kube-api-access-cqvdd") pod "4b4a7622-7bca-4fca-adb3-eec526b21b2b" (UID: "4b4a7622-7bca-4fca-adb3-eec526b21b2b"). InnerVolumeSpecName "kube-api-access-cqvdd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:07:44 crc kubenswrapper[4722]: I0309 14:07:44.843648 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0b6cdb1-050e-4ed3-b20e-d825e4db1edb-kube-api-access-tk4lv" (OuterVolumeSpecName: "kube-api-access-tk4lv") pod "b0b6cdb1-050e-4ed3-b20e-d825e4db1edb" (UID: "b0b6cdb1-050e-4ed3-b20e-d825e4db1edb"). InnerVolumeSpecName "kube-api-access-tk4lv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:07:44 crc kubenswrapper[4722]: I0309 14:07:44.888610 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b4a7622-7bca-4fca-adb3-eec526b21b2b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4b4a7622-7bca-4fca-adb3-eec526b21b2b" (UID: "4b4a7622-7bca-4fca-adb3-eec526b21b2b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:07:44 crc kubenswrapper[4722]: I0309 14:07:44.903654 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0b6cdb1-050e-4ed3-b20e-d825e4db1edb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b0b6cdb1-050e-4ed3-b20e-d825e4db1edb" (UID: "b0b6cdb1-050e-4ed3-b20e-d825e4db1edb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:07:44 crc kubenswrapper[4722]: I0309 14:07:44.937975 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b4a7622-7bca-4fca-adb3-eec526b21b2b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 14:07:44 crc kubenswrapper[4722]: I0309 14:07:44.938054 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqvdd\" (UniqueName: \"kubernetes.io/projected/4b4a7622-7bca-4fca-adb3-eec526b21b2b-kube-api-access-cqvdd\") on node \"crc\" DevicePath \"\"" Mar 09 14:07:44 crc kubenswrapper[4722]: I0309 14:07:44.938073 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0b6cdb1-050e-4ed3-b20e-d825e4db1edb-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 14:07:44 crc kubenswrapper[4722]: I0309 14:07:44.938093 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk4lv\" (UniqueName: \"kubernetes.io/projected/b0b6cdb1-050e-4ed3-b20e-d825e4db1edb-kube-api-access-tk4lv\") on node \"crc\" DevicePath \"\"" Mar 09 14:07:45 crc kubenswrapper[4722]: I0309 14:07:45.557314 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t8pts" event={"ID":"b0b6cdb1-050e-4ed3-b20e-d825e4db1edb","Type":"ContainerDied","Data":"528b1a83a1ffc324a8557f1c712e00a1df760a4b6c17bbf005fc50581e91a9ef"} Mar 09 14:07:45 crc kubenswrapper[4722]: I0309 14:07:45.557377 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t8pts" Mar 09 14:07:45 crc kubenswrapper[4722]: I0309 14:07:45.557798 4722 scope.go:117] "RemoveContainer" containerID="e45aa95389ab189d5e3b21c6fc9b35ae0e60420f591a95fabb5a7e81051c4f2f" Mar 09 14:07:45 crc kubenswrapper[4722]: I0309 14:07:45.562178 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zmkmp" Mar 09 14:07:45 crc kubenswrapper[4722]: I0309 14:07:45.562855 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zmkmp" event={"ID":"4b4a7622-7bca-4fca-adb3-eec526b21b2b","Type":"ContainerDied","Data":"b38fa506870e412e0bd6b9856a391fb968c0813bb14774e8bf52a00c34b7de53"} Mar 09 14:07:45 crc kubenswrapper[4722]: I0309 14:07:45.592290 4722 scope.go:117] "RemoveContainer" containerID="574e56a24df0555271a4809f4200d0788d8603b8b1aae9a6cfbf9d5fca9f8829" Mar 09 14:07:45 crc kubenswrapper[4722]: I0309 14:07:45.617737 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-t8pts"] Mar 09 14:07:45 crc kubenswrapper[4722]: I0309 14:07:45.624285 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-t8pts"] Mar 09 14:07:45 crc kubenswrapper[4722]: I0309 14:07:45.627093 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zmkmp"] Mar 09 14:07:45 crc kubenswrapper[4722]: I0309 14:07:45.633259 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zmkmp"] Mar 09 14:07:45 crc kubenswrapper[4722]: I0309 14:07:45.638818 4722 scope.go:117] "RemoveContainer" containerID="29f4b9168facd2fc15837261a19e4e6f77ad40fe8b207c41e3e731f44fac7a1f" Mar 09 14:07:45 crc kubenswrapper[4722]: I0309 14:07:45.662921 4722 scope.go:117] "RemoveContainer" containerID="b250fd3da865150b9501688128005be46b38134155b3c3c5f9375cf870b9042b" Mar 09 14:07:45 crc kubenswrapper[4722]: I0309 14:07:45.678812 4722 scope.go:117] "RemoveContainer" containerID="dd8eb5ce92a018092bfa959f5c229f7dcc17c39ce356b0af1bb7a9f58984e026" Mar 09 14:07:45 crc kubenswrapper[4722]: I0309 14:07:45.707597 4722 scope.go:117] "RemoveContainer" containerID="7a898e7f7bdd3689485d762720e4740c6b03c454dcdeda7b0c721d5f69630a6f" Mar 09 14:07:46 crc kubenswrapper[4722]: I0309 14:07:46.158081 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b4a7622-7bca-4fca-adb3-eec526b21b2b" path="/var/lib/kubelet/pods/4b4a7622-7bca-4fca-adb3-eec526b21b2b/volumes" Mar 09 14:07:46 crc kubenswrapper[4722]: I0309 14:07:46.159512 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0b6cdb1-050e-4ed3-b20e-d825e4db1edb" path="/var/lib/kubelet/pods/b0b6cdb1-050e-4ed3-b20e-d825e4db1edb/volumes" Mar 09 14:07:46 crc kubenswrapper[4722]: I0309 14:07:46.292326 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-6s4fg" podUID="9c83f63c-9a9e-4e33-b0c6-54396ce9f07c" containerName="oauth-openshift" containerID="cri-o://92b2657738997e464f43cf92f37aa038c34a22f9e970c6730d11b569f7bca4f4" gracePeriod=15 Mar 09 14:07:46 crc kubenswrapper[4722]: I0309 14:07:46.709440 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-8555d9c686-6xxgz"] Mar 09 14:07:46 crc kubenswrapper[4722]: I0309 14:07:46.709798 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-8555d9c686-6xxgz" podUID="3bdc231e-2cd0-4323-a872-803e28ab5868" containerName="controller-manager" containerID="cri-o://ab25961a4f544814b62328a9f21b27b3e7c6a24839aa4b36fcfcd6861c6d8f16" gracePeriod=30 Mar 09 14:07:46 crc kubenswrapper[4722]: I0309 14:07:46.713525 4722 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b98945b4f-krmvf"] Mar 09 14:07:46 crc kubenswrapper[4722]: I0309 14:07:46.713835 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6b98945b4f-krmvf" podUID="d7afc4c0-854b-4491-8db5-a6dbdc888d1c" containerName="route-controller-manager" containerID="cri-o://4323b8690b994f2872b8977ceb081700d560bec7ee6f376993ce8f318c792782" gracePeriod=30 Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.333866 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-6s4fg" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.392376 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-v4-0-config-user-template-login\") pod \"9c83f63c-9a9e-4e33-b0c6-54396ce9f07c\" (UID: \"9c83f63c-9a9e-4e33-b0c6-54396ce9f07c\") " Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.392954 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-v4-0-config-system-serving-cert\") pod \"9c83f63c-9a9e-4e33-b0c6-54396ce9f07c\" (UID: \"9c83f63c-9a9e-4e33-b0c6-54396ce9f07c\") " Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.393046 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-v4-0-config-system-service-ca\") pod \"9c83f63c-9a9e-4e33-b0c6-54396ce9f07c\" (UID: \"9c83f63c-9a9e-4e33-b0c6-54396ce9f07c\") " Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.393081 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-v4-0-config-system-session\") pod \"9c83f63c-9a9e-4e33-b0c6-54396ce9f07c\" (UID: \"9c83f63c-9a9e-4e33-b0c6-54396ce9f07c\") " Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.393117 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-audit-dir\") pod \"9c83f63c-9a9e-4e33-b0c6-54396ce9f07c\" (UID: \"9c83f63c-9a9e-4e33-b0c6-54396ce9f07c\") " Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.393179 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-v4-0-config-system-router-certs\") pod \"9c83f63c-9a9e-4e33-b0c6-54396ce9f07c\" (UID: \"9c83f63c-9a9e-4e33-b0c6-54396ce9f07c\") " Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.393246 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-audit-policies\") pod \"9c83f63c-9a9e-4e33-b0c6-54396ce9f07c\" (UID: \"9c83f63c-9a9e-4e33-b0c6-54396ce9f07c\") " Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.393278 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7n2r\" (UniqueName: 
\"kubernetes.io/projected/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-kube-api-access-c7n2r\") pod \"9c83f63c-9a9e-4e33-b0c6-54396ce9f07c\" (UID: \"9c83f63c-9a9e-4e33-b0c6-54396ce9f07c\") " Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.393303 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-v4-0-config-system-trusted-ca-bundle\") pod \"9c83f63c-9a9e-4e33-b0c6-54396ce9f07c\" (UID: \"9c83f63c-9a9e-4e33-b0c6-54396ce9f07c\") " Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.393370 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-v4-0-config-system-ocp-branding-template\") pod \"9c83f63c-9a9e-4e33-b0c6-54396ce9f07c\" (UID: \"9c83f63c-9a9e-4e33-b0c6-54396ce9f07c\") " Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.393413 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-v4-0-config-user-template-provider-selection\") pod \"9c83f63c-9a9e-4e33-b0c6-54396ce9f07c\" (UID: \"9c83f63c-9a9e-4e33-b0c6-54396ce9f07c\") " Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.393610 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-v4-0-config-system-cliconfig\") pod \"9c83f63c-9a9e-4e33-b0c6-54396ce9f07c\" (UID: \"9c83f63c-9a9e-4e33-b0c6-54396ce9f07c\") " Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.393661 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-v4-0-config-user-idp-0-file-data\") pod \"9c83f63c-9a9e-4e33-b0c6-54396ce9f07c\" (UID: \"9c83f63c-9a9e-4e33-b0c6-54396ce9f07c\") " Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.393688 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-v4-0-config-user-template-error\") pod \"9c83f63c-9a9e-4e33-b0c6-54396ce9f07c\" (UID: \"9c83f63c-9a9e-4e33-b0c6-54396ce9f07c\") " Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.395942 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "9c83f63c-9a9e-4e33-b0c6-54396ce9f07c" (UID: "9c83f63c-9a9e-4e33-b0c6-54396ce9f07c"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.395999 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "9c83f63c-9a9e-4e33-b0c6-54396ce9f07c" (UID: "9c83f63c-9a9e-4e33-b0c6-54396ce9f07c"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.396615 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "9c83f63c-9a9e-4e33-b0c6-54396ce9f07c" (UID: "9c83f63c-9a9e-4e33-b0c6-54396ce9f07c"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.397054 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "9c83f63c-9a9e-4e33-b0c6-54396ce9f07c" (UID: "9c83f63c-9a9e-4e33-b0c6-54396ce9f07c"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.398343 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "9c83f63c-9a9e-4e33-b0c6-54396ce9f07c" (UID: "9c83f63c-9a9e-4e33-b0c6-54396ce9f07c"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.404147 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "9c83f63c-9a9e-4e33-b0c6-54396ce9f07c" (UID: "9c83f63c-9a9e-4e33-b0c6-54396ce9f07c"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.404536 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "9c83f63c-9a9e-4e33-b0c6-54396ce9f07c" (UID: "9c83f63c-9a9e-4e33-b0c6-54396ce9f07c"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.405250 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "9c83f63c-9a9e-4e33-b0c6-54396ce9f07c" (UID: "9c83f63c-9a9e-4e33-b0c6-54396ce9f07c"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.406681 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-kube-api-access-c7n2r" (OuterVolumeSpecName: "kube-api-access-c7n2r") pod "9c83f63c-9a9e-4e33-b0c6-54396ce9f07c" (UID: "9c83f63c-9a9e-4e33-b0c6-54396ce9f07c"). InnerVolumeSpecName "kube-api-access-c7n2r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.420739 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "9c83f63c-9a9e-4e33-b0c6-54396ce9f07c" (UID: "9c83f63c-9a9e-4e33-b0c6-54396ce9f07c"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.421288 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "9c83f63c-9a9e-4e33-b0c6-54396ce9f07c" (UID: "9c83f63c-9a9e-4e33-b0c6-54396ce9f07c"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.423564 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "9c83f63c-9a9e-4e33-b0c6-54396ce9f07c" (UID: "9c83f63c-9a9e-4e33-b0c6-54396ce9f07c"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.423780 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "9c83f63c-9a9e-4e33-b0c6-54396ce9f07c" (UID: "9c83f63c-9a9e-4e33-b0c6-54396ce9f07c"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.424613 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "9c83f63c-9a9e-4e33-b0c6-54396ce9f07c" (UID: "9c83f63c-9a9e-4e33-b0c6-54396ce9f07c"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.496185 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.496247 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.496265 4722 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.496280 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.496291 4722 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.496302 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c7n2r\" (UniqueName: \"kubernetes.io/projected/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-kube-api-access-c7n2r\") on node \"crc\" DevicePath \"\"" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.496316 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.496327 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.496338 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.496351 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.496360 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.496372 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.496391 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.496401 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.575815 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-5db757fd5b-t57qc"] Mar 09 14:07:47 crc kubenswrapper[4722]: E0309 14:07:47.576093 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ba5c493-a951-4d61-acf3-5ee964dcfe60" containerName="extract-content" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.576111 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ba5c493-a951-4d61-acf3-5ee964dcfe60" containerName="extract-content" Mar 09 14:07:47 crc kubenswrapper[4722]: E0309 14:07:47.576126 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ba5c493-a951-4d61-acf3-5ee964dcfe60" containerName="registry-server" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.576134 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ba5c493-a951-4d61-acf3-5ee964dcfe60" containerName="registry-server" Mar 09 14:07:47 crc kubenswrapper[4722]: E0309 14:07:47.576149 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0b6cdb1-050e-4ed3-b20e-d825e4db1edb" containerName="extract-utilities" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.576158 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0b6cdb1-050e-4ed3-b20e-d825e4db1edb" containerName="extract-utilities" Mar 09 14:07:47 crc kubenswrapper[4722]: E0309 14:07:47.576174 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0b6cdb1-050e-4ed3-b20e-d825e4db1edb" containerName="registry-server" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.576182 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0b6cdb1-050e-4ed3-b20e-d825e4db1edb" containerName="registry-server" Mar 09 14:07:47 crc kubenswrapper[4722]: E0309 14:07:47.576196 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ba5c493-a951-4d61-acf3-5ee964dcfe60" containerName="extract-utilities" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.576225 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ba5c493-a951-4d61-acf3-5ee964dcfe60" containerName="extract-utilities" Mar 09 14:07:47 crc kubenswrapper[4722]: E0309 14:07:47.576237 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b4a7622-7bca-4fca-adb3-eec526b21b2b" containerName="extract-utilities" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.576245 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b4a7622-7bca-4fca-adb3-eec526b21b2b" containerName="extract-utilities" Mar 09 14:07:47 crc kubenswrapper[4722]: E0309 14:07:47.576255 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0b6cdb1-050e-4ed3-b20e-d825e4db1edb" containerName="extract-content" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 
14:07:47.576262 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0b6cdb1-050e-4ed3-b20e-d825e4db1edb" containerName="extract-content" Mar 09 14:07:47 crc kubenswrapper[4722]: E0309 14:07:47.576271 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c83f63c-9a9e-4e33-b0c6-54396ce9f07c" containerName="oauth-openshift" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.576279 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c83f63c-9a9e-4e33-b0c6-54396ce9f07c" containerName="oauth-openshift" Mar 09 14:07:47 crc kubenswrapper[4722]: E0309 14:07:47.576289 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b4a7622-7bca-4fca-adb3-eec526b21b2b" containerName="registry-server" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.576302 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b4a7622-7bca-4fca-adb3-eec526b21b2b" containerName="registry-server" Mar 09 14:07:47 crc kubenswrapper[4722]: E0309 14:07:47.576316 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b4a7622-7bca-4fca-adb3-eec526b21b2b" containerName="extract-content" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.576324 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b4a7622-7bca-4fca-adb3-eec526b21b2b" containerName="extract-content" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.576475 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0b6cdb1-050e-4ed3-b20e-d825e4db1edb" containerName="registry-server" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.576492 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ba5c493-a951-4d61-acf3-5ee964dcfe60" containerName="registry-server" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.576508 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c83f63c-9a9e-4e33-b0c6-54396ce9f07c" containerName="oauth-openshift" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.576518 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b4a7622-7bca-4fca-adb3-eec526b21b2b" containerName="registry-server" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.577079 4722 util.go:30] "No sandbox for pod can be found. 
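
The block above shows what happens when the kubelet admits a replacement pod: before computing new CPU and memory assignments, the cpu_manager and memory_manager purge state left behind by containers of pods that no longer exist (the catalog registry pods and the old oauth-openshift pod). Each purge is logged twice, at E level by cpu_manager.go:410 and as the state_mem.go:107 "Deleted CPUSet assignment" confirmation, with memory_manager.go:354 mirroring it for memory state. A minimal Go sketch for pulling the purged podUID/containerName pairs out of a kubelet.log; the regex is an assumption derived from the message format shown here, and it expects the usual one-entry-per-line log layout:

package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

// Matches both manager variants seen above:
//   cpu_manager.go:410]    "RemoveStaleState: removing container" podUID="..." containerName="..."
//   memory_manager.go:354] "RemoveStaleState removing state"      podUID="..." containerName="..."
var staleRe = regexp.MustCompile(`RemoveStaleState[^"]*" podUID="([^"]+)" containerName="([^"]+)"`)

func main() {
	seen := map[string]bool{}
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // kubelet lines can exceed the default 64 KiB
	for sc.Scan() {
		if m := staleRe.FindStringSubmatch(sc.Text()); m != nil {
			key := m[1] + "/" + m[2] // podUID/containerName
			if !seen[key] {
				seen[key] = true
				fmt.Println(key)
			}
		}
	}
}

Fed this excerpt on stdin, it would print ten unique pairs across the four stale pods (9c83f63c…, 6ba5c493…, b0b6cdb1…, 4b4a7622…).
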
Need to start a new one" pod="openshift-authentication/oauth-openshift-5db757fd5b-t57qc" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.595606 4722 generic.go:334] "Generic (PLEG): container finished" podID="9c83f63c-9a9e-4e33-b0c6-54396ce9f07c" containerID="92b2657738997e464f43cf92f37aa038c34a22f9e970c6730d11b569f7bca4f4" exitCode=0 Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.595703 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-6s4fg" event={"ID":"9c83f63c-9a9e-4e33-b0c6-54396ce9f07c","Type":"ContainerDied","Data":"92b2657738997e464f43cf92f37aa038c34a22f9e970c6730d11b569f7bca4f4"} Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.595745 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-6s4fg" event={"ID":"9c83f63c-9a9e-4e33-b0c6-54396ce9f07c","Type":"ContainerDied","Data":"b1f12e433ab07cfd411f327393f1488e291557e710d61f705c41e17c11665161"} Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.595769 4722 scope.go:117] "RemoveContainer" containerID="92b2657738997e464f43cf92f37aa038c34a22f9e970c6730d11b569f7bca4f4" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.595893 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-6s4fg" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.597397 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5db757fd5b-t57qc"] Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.640268 4722 generic.go:334] "Generic (PLEG): container finished" podID="d7afc4c0-854b-4491-8db5-a6dbdc888d1c" containerID="4323b8690b994f2872b8977ceb081700d560bec7ee6f376993ce8f318c792782" exitCode=0 Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.640434 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6b98945b4f-krmvf" event={"ID":"d7afc4c0-854b-4491-8db5-a6dbdc888d1c","Type":"ContainerDied","Data":"4323b8690b994f2872b8977ceb081700d560bec7ee6f376993ce8f318c792782"} Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.654402 4722 generic.go:334] "Generic (PLEG): container finished" podID="3bdc231e-2cd0-4323-a872-803e28ab5868" containerID="ab25961a4f544814b62328a9f21b27b3e7c6a24839aa4b36fcfcd6861c6d8f16" exitCode=0 Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.654460 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8555d9c686-6xxgz" event={"ID":"3bdc231e-2cd0-4323-a872-803e28ab5868","Type":"ContainerDied","Data":"ab25961a4f544814b62328a9f21b27b3e7c6a24839aa4b36fcfcd6861c6d8f16"} Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.662378 4722 scope.go:117] "RemoveContainer" containerID="92b2657738997e464f43cf92f37aa038c34a22f9e970c6730d11b569f7bca4f4" Mar 09 14:07:47 crc kubenswrapper[4722]: E0309 14:07:47.663141 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92b2657738997e464f43cf92f37aa038c34a22f9e970c6730d11b569f7bca4f4\": container with ID starting with 92b2657738997e464f43cf92f37aa038c34a22f9e970c6730d11b569f7bca4f4 not found: ID does not exist" containerID="92b2657738997e464f43cf92f37aa038c34a22f9e970c6730d11b569f7bca4f4" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.663174 4722 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"92b2657738997e464f43cf92f37aa038c34a22f9e970c6730d11b569f7bca4f4"} err="failed to get container status \"92b2657738997e464f43cf92f37aa038c34a22f9e970c6730d11b569f7bca4f4\": rpc error: code = NotFound desc = could not find container \"92b2657738997e464f43cf92f37aa038c34a22f9e970c6730d11b569f7bca4f4\": container with ID starting with 92b2657738997e464f43cf92f37aa038c34a22f9e970c6730d11b569f7bca4f4 not found: ID does not exist" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.675055 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-6s4fg"] Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.679345 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-6s4fg"] Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.699160 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ec3da47d-c782-4189-b195-d6b203bd7f7a-v4-0-config-system-router-certs\") pod \"oauth-openshift-5db757fd5b-t57qc\" (UID: \"ec3da47d-c782-4189-b195-d6b203bd7f7a\") " pod="openshift-authentication/oauth-openshift-5db757fd5b-t57qc" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.699240 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ec3da47d-c782-4189-b195-d6b203bd7f7a-v4-0-config-user-template-login\") pod \"oauth-openshift-5db757fd5b-t57qc\" (UID: \"ec3da47d-c782-4189-b195-d6b203bd7f7a\") " pod="openshift-authentication/oauth-openshift-5db757fd5b-t57qc" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.699288 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ec3da47d-c782-4189-b195-d6b203bd7f7a-v4-0-config-system-session\") pod \"oauth-openshift-5db757fd5b-t57qc\" (UID: \"ec3da47d-c782-4189-b195-d6b203bd7f7a\") " pod="openshift-authentication/oauth-openshift-5db757fd5b-t57qc" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.699355 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4n2xn\" (UniqueName: \"kubernetes.io/projected/ec3da47d-c782-4189-b195-d6b203bd7f7a-kube-api-access-4n2xn\") pod \"oauth-openshift-5db757fd5b-t57qc\" (UID: \"ec3da47d-c782-4189-b195-d6b203bd7f7a\") " pod="openshift-authentication/oauth-openshift-5db757fd5b-t57qc" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.699408 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ec3da47d-c782-4189-b195-d6b203bd7f7a-v4-0-config-user-template-error\") pod \"oauth-openshift-5db757fd5b-t57qc\" (UID: \"ec3da47d-c782-4189-b195-d6b203bd7f7a\") " pod="openshift-authentication/oauth-openshift-5db757fd5b-t57qc" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.699428 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ec3da47d-c782-4189-b195-d6b203bd7f7a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5db757fd5b-t57qc\" (UID: \"ec3da47d-c782-4189-b195-d6b203bd7f7a\") " 
pod="openshift-authentication/oauth-openshift-5db757fd5b-t57qc" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.699458 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ec3da47d-c782-4189-b195-d6b203bd7f7a-audit-dir\") pod \"oauth-openshift-5db757fd5b-t57qc\" (UID: \"ec3da47d-c782-4189-b195-d6b203bd7f7a\") " pod="openshift-authentication/oauth-openshift-5db757fd5b-t57qc" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.699477 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ec3da47d-c782-4189-b195-d6b203bd7f7a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5db757fd5b-t57qc\" (UID: \"ec3da47d-c782-4189-b195-d6b203bd7f7a\") " pod="openshift-authentication/oauth-openshift-5db757fd5b-t57qc" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.699494 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ec3da47d-c782-4189-b195-d6b203bd7f7a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5db757fd5b-t57qc\" (UID: \"ec3da47d-c782-4189-b195-d6b203bd7f7a\") " pod="openshift-authentication/oauth-openshift-5db757fd5b-t57qc" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.699525 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ec3da47d-c782-4189-b195-d6b203bd7f7a-v4-0-config-system-service-ca\") pod \"oauth-openshift-5db757fd5b-t57qc\" (UID: \"ec3da47d-c782-4189-b195-d6b203bd7f7a\") " pod="openshift-authentication/oauth-openshift-5db757fd5b-t57qc" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.699544 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ec3da47d-c782-4189-b195-d6b203bd7f7a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5db757fd5b-t57qc\" (UID: \"ec3da47d-c782-4189-b195-d6b203bd7f7a\") " pod="openshift-authentication/oauth-openshift-5db757fd5b-t57qc" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.699565 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ec3da47d-c782-4189-b195-d6b203bd7f7a-audit-policies\") pod \"oauth-openshift-5db757fd5b-t57qc\" (UID: \"ec3da47d-c782-4189-b195-d6b203bd7f7a\") " pod="openshift-authentication/oauth-openshift-5db757fd5b-t57qc" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.699583 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ec3da47d-c782-4189-b195-d6b203bd7f7a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5db757fd5b-t57qc\" (UID: \"ec3da47d-c782-4189-b195-d6b203bd7f7a\") " pod="openshift-authentication/oauth-openshift-5db757fd5b-t57qc" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.699627 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/ec3da47d-c782-4189-b195-d6b203bd7f7a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5db757fd5b-t57qc\" (UID: \"ec3da47d-c782-4189-b195-d6b203bd7f7a\") " pod="openshift-authentication/oauth-openshift-5db757fd5b-t57qc" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.801484 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ec3da47d-c782-4189-b195-d6b203bd7f7a-v4-0-config-user-template-login\") pod \"oauth-openshift-5db757fd5b-t57qc\" (UID: \"ec3da47d-c782-4189-b195-d6b203bd7f7a\") " pod="openshift-authentication/oauth-openshift-5db757fd5b-t57qc" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.801556 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ec3da47d-c782-4189-b195-d6b203bd7f7a-v4-0-config-system-router-certs\") pod \"oauth-openshift-5db757fd5b-t57qc\" (UID: \"ec3da47d-c782-4189-b195-d6b203bd7f7a\") " pod="openshift-authentication/oauth-openshift-5db757fd5b-t57qc" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.801619 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ec3da47d-c782-4189-b195-d6b203bd7f7a-v4-0-config-system-session\") pod \"oauth-openshift-5db757fd5b-t57qc\" (UID: \"ec3da47d-c782-4189-b195-d6b203bd7f7a\") " pod="openshift-authentication/oauth-openshift-5db757fd5b-t57qc" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.801666 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4n2xn\" (UniqueName: \"kubernetes.io/projected/ec3da47d-c782-4189-b195-d6b203bd7f7a-kube-api-access-4n2xn\") pod \"oauth-openshift-5db757fd5b-t57qc\" (UID: \"ec3da47d-c782-4189-b195-d6b203bd7f7a\") " pod="openshift-authentication/oauth-openshift-5db757fd5b-t57qc" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.801696 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ec3da47d-c782-4189-b195-d6b203bd7f7a-v4-0-config-user-template-error\") pod \"oauth-openshift-5db757fd5b-t57qc\" (UID: \"ec3da47d-c782-4189-b195-d6b203bd7f7a\") " pod="openshift-authentication/oauth-openshift-5db757fd5b-t57qc" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.801751 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ec3da47d-c782-4189-b195-d6b203bd7f7a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5db757fd5b-t57qc\" (UID: \"ec3da47d-c782-4189-b195-d6b203bd7f7a\") " pod="openshift-authentication/oauth-openshift-5db757fd5b-t57qc" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.801793 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ec3da47d-c782-4189-b195-d6b203bd7f7a-audit-dir\") pod \"oauth-openshift-5db757fd5b-t57qc\" (UID: \"ec3da47d-c782-4189-b195-d6b203bd7f7a\") " pod="openshift-authentication/oauth-openshift-5db757fd5b-t57qc" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.801822 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/ec3da47d-c782-4189-b195-d6b203bd7f7a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5db757fd5b-t57qc\" (UID: \"ec3da47d-c782-4189-b195-d6b203bd7f7a\") " pod="openshift-authentication/oauth-openshift-5db757fd5b-t57qc" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.801844 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ec3da47d-c782-4189-b195-d6b203bd7f7a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5db757fd5b-t57qc\" (UID: \"ec3da47d-c782-4189-b195-d6b203bd7f7a\") " pod="openshift-authentication/oauth-openshift-5db757fd5b-t57qc" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.801878 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ec3da47d-c782-4189-b195-d6b203bd7f7a-v4-0-config-system-service-ca\") pod \"oauth-openshift-5db757fd5b-t57qc\" (UID: \"ec3da47d-c782-4189-b195-d6b203bd7f7a\") " pod="openshift-authentication/oauth-openshift-5db757fd5b-t57qc" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.801906 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/ec3da47d-c782-4189-b195-d6b203bd7f7a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5db757fd5b-t57qc\" (UID: \"ec3da47d-c782-4189-b195-d6b203bd7f7a\") " pod="openshift-authentication/oauth-openshift-5db757fd5b-t57qc" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.801936 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ec3da47d-c782-4189-b195-d6b203bd7f7a-audit-policies\") pod \"oauth-openshift-5db757fd5b-t57qc\" (UID: \"ec3da47d-c782-4189-b195-d6b203bd7f7a\") " pod="openshift-authentication/oauth-openshift-5db757fd5b-t57qc" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.801963 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ec3da47d-c782-4189-b195-d6b203bd7f7a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5db757fd5b-t57qc\" (UID: \"ec3da47d-c782-4189-b195-d6b203bd7f7a\") " pod="openshift-authentication/oauth-openshift-5db757fd5b-t57qc" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.801985 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec3da47d-c782-4189-b195-d6b203bd7f7a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5db757fd5b-t57qc\" (UID: \"ec3da47d-c782-4189-b195-d6b203bd7f7a\") " pod="openshift-authentication/oauth-openshift-5db757fd5b-t57qc" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.803716 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/ec3da47d-c782-4189-b195-d6b203bd7f7a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5db757fd5b-t57qc\" (UID: \"ec3da47d-c782-4189-b195-d6b203bd7f7a\") " pod="openshift-authentication/oauth-openshift-5db757fd5b-t57qc" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.803809 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/ec3da47d-c782-4189-b195-d6b203bd7f7a-audit-dir\") pod \"oauth-openshift-5db757fd5b-t57qc\" (UID: \"ec3da47d-c782-4189-b195-d6b203bd7f7a\") " pod="openshift-authentication/oauth-openshift-5db757fd5b-t57qc" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.805646 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec3da47d-c782-4189-b195-d6b203bd7f7a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5db757fd5b-t57qc\" (UID: \"ec3da47d-c782-4189-b195-d6b203bd7f7a\") " pod="openshift-authentication/oauth-openshift-5db757fd5b-t57qc" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.807483 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/ec3da47d-c782-4189-b195-d6b203bd7f7a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5db757fd5b-t57qc\" (UID: \"ec3da47d-c782-4189-b195-d6b203bd7f7a\") " pod="openshift-authentication/oauth-openshift-5db757fd5b-t57qc" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.807484 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/ec3da47d-c782-4189-b195-d6b203bd7f7a-v4-0-config-user-template-error\") pod \"oauth-openshift-5db757fd5b-t57qc\" (UID: \"ec3da47d-c782-4189-b195-d6b203bd7f7a\") " pod="openshift-authentication/oauth-openshift-5db757fd5b-t57qc" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.808101 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/ec3da47d-c782-4189-b195-d6b203bd7f7a-v4-0-config-system-service-ca\") pod \"oauth-openshift-5db757fd5b-t57qc\" (UID: \"ec3da47d-c782-4189-b195-d6b203bd7f7a\") " pod="openshift-authentication/oauth-openshift-5db757fd5b-t57qc" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.808595 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/ec3da47d-c782-4189-b195-d6b203bd7f7a-v4-0-config-system-router-certs\") pod \"oauth-openshift-5db757fd5b-t57qc\" (UID: \"ec3da47d-c782-4189-b195-d6b203bd7f7a\") " pod="openshift-authentication/oauth-openshift-5db757fd5b-t57qc" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.810471 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/ec3da47d-c782-4189-b195-d6b203bd7f7a-v4-0-config-user-template-login\") pod \"oauth-openshift-5db757fd5b-t57qc\" (UID: \"ec3da47d-c782-4189-b195-d6b203bd7f7a\") " pod="openshift-authentication/oauth-openshift-5db757fd5b-t57qc" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.810990 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/ec3da47d-c782-4189-b195-d6b203bd7f7a-audit-policies\") pod \"oauth-openshift-5db757fd5b-t57qc\" (UID: \"ec3da47d-c782-4189-b195-d6b203bd7f7a\") " pod="openshift-authentication/oauth-openshift-5db757fd5b-t57qc" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.811478 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/ec3da47d-c782-4189-b195-d6b203bd7f7a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5db757fd5b-t57qc\" (UID: \"ec3da47d-c782-4189-b195-d6b203bd7f7a\") " pod="openshift-authentication/oauth-openshift-5db757fd5b-t57qc" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.812719 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/ec3da47d-c782-4189-b195-d6b203bd7f7a-v4-0-config-system-session\") pod \"oauth-openshift-5db757fd5b-t57qc\" (UID: \"ec3da47d-c782-4189-b195-d6b203bd7f7a\") " pod="openshift-authentication/oauth-openshift-5db757fd5b-t57qc" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.815555 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/ec3da47d-c782-4189-b195-d6b203bd7f7a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5db757fd5b-t57qc\" (UID: \"ec3da47d-c782-4189-b195-d6b203bd7f7a\") " pod="openshift-authentication/oauth-openshift-5db757fd5b-t57qc" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.815725 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/ec3da47d-c782-4189-b195-d6b203bd7f7a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5db757fd5b-t57qc\" (UID: \"ec3da47d-c782-4189-b195-d6b203bd7f7a\") " pod="openshift-authentication/oauth-openshift-5db757fd5b-t57qc" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.830832 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4n2xn\" (UniqueName: \"kubernetes.io/projected/ec3da47d-c782-4189-b195-d6b203bd7f7a-kube-api-access-4n2xn\") pod \"oauth-openshift-5db757fd5b-t57qc\" (UID: \"ec3da47d-c782-4189-b195-d6b203bd7f7a\") " pod="openshift-authentication/oauth-openshift-5db757fd5b-t57qc" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.875667 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6b98945b4f-krmvf" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.903119 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b7t97\" (UniqueName: \"kubernetes.io/projected/d7afc4c0-854b-4491-8db5-a6dbdc888d1c-kube-api-access-b7t97\") pod \"d7afc4c0-854b-4491-8db5-a6dbdc888d1c\" (UID: \"d7afc4c0-854b-4491-8db5-a6dbdc888d1c\") " Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.903272 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d7afc4c0-854b-4491-8db5-a6dbdc888d1c-client-ca\") pod \"d7afc4c0-854b-4491-8db5-a6dbdc888d1c\" (UID: \"d7afc4c0-854b-4491-8db5-a6dbdc888d1c\") " Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.903327 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7afc4c0-854b-4491-8db5-a6dbdc888d1c-serving-cert\") pod \"d7afc4c0-854b-4491-8db5-a6dbdc888d1c\" (UID: \"d7afc4c0-854b-4491-8db5-a6dbdc888d1c\") " Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.903360 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7afc4c0-854b-4491-8db5-a6dbdc888d1c-config\") pod \"d7afc4c0-854b-4491-8db5-a6dbdc888d1c\" (UID: \"d7afc4c0-854b-4491-8db5-a6dbdc888d1c\") " Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.904876 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7afc4c0-854b-4491-8db5-a6dbdc888d1c-config" (OuterVolumeSpecName: "config") pod "d7afc4c0-854b-4491-8db5-a6dbdc888d1c" (UID: "d7afc4c0-854b-4491-8db5-a6dbdc888d1c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.905412 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7afc4c0-854b-4491-8db5-a6dbdc888d1c-client-ca" (OuterVolumeSpecName: "client-ca") pod "d7afc4c0-854b-4491-8db5-a6dbdc888d1c" (UID: "d7afc4c0-854b-4491-8db5-a6dbdc888d1c"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.925067 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7afc4c0-854b-4491-8db5-a6dbdc888d1c-kube-api-access-b7t97" (OuterVolumeSpecName: "kube-api-access-b7t97") pod "d7afc4c0-854b-4491-8db5-a6dbdc888d1c" (UID: "d7afc4c0-854b-4491-8db5-a6dbdc888d1c"). InnerVolumeSpecName "kube-api-access-b7t97". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.925494 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7afc4c0-854b-4491-8db5-a6dbdc888d1c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d7afc4c0-854b-4491-8db5-a6dbdc888d1c" (UID: "d7afc4c0-854b-4491-8db5-a6dbdc888d1c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:07:47 crc kubenswrapper[4722]: I0309 14:07:47.959751 4722 util.go:30] "No sandbox for pod can be found. 
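
Teardown mirrors the mount path: UnmountVolume started (reconciler_common.go:159), then UnmountVolume.TearDown succeeded (operation_generator.go:803), and finally the "Volume detached ... DevicePath" records that follow below (reconciler_common.go:293). The detach record is what lets the reconciler forget the volume, so it should always trail the TearDown success. A sketch that checks this ordering invariant, with patterns assumed from the formats above; run it over the full log, since the TearDown entries for volumes detached early in this excerpt (the old oauth pod's) precede it:

package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

var (
	// TearDown successes are logged with plain quotes around the full plugin path.
	tearRe = regexp.MustCompile(`UnmountVolume\.TearDown succeeded for volume "([^"]+)"`)
	// Detach records carry the same path as the escaped UniqueName.
	detRe = regexp.MustCompile(`Volume detached for volume \\"[^\\"]+\\" \(UniqueName: \\"([^\\"]+)\\"\)`)
)

func main() {
	torn := map[string]bool{}
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024)
	for sc.Scan() {
		line := sc.Text()
		if m := tearRe.FindStringSubmatch(line); m != nil {
			torn[m[1]] = true
		}
		if m := detRe.FindStringSubmatch(line); m != nil && !torn[m[1]] {
			fmt.Println("detached before TearDown:", m[1])
		}
	}
}
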
Need to start a new one" pod="openshift-authentication/oauth-openshift-5db757fd5b-t57qc" Mar 09 14:07:48 crc kubenswrapper[4722]: I0309 14:07:48.005802 4722 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d7afc4c0-854b-4491-8db5-a6dbdc888d1c-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 14:07:48 crc kubenswrapper[4722]: I0309 14:07:48.005846 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7afc4c0-854b-4491-8db5-a6dbdc888d1c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 14:07:48 crc kubenswrapper[4722]: I0309 14:07:48.005859 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7afc4c0-854b-4491-8db5-a6dbdc888d1c-config\") on node \"crc\" DevicePath \"\"" Mar 09 14:07:48 crc kubenswrapper[4722]: I0309 14:07:48.005879 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b7t97\" (UniqueName: \"kubernetes.io/projected/d7afc4c0-854b-4491-8db5-a6dbdc888d1c-kube-api-access-b7t97\") on node \"crc\" DevicePath \"\"" Mar 09 14:07:48 crc kubenswrapper[4722]: I0309 14:07:48.064402 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-8555d9c686-6xxgz" Mar 09 14:07:48 crc kubenswrapper[4722]: I0309 14:07:48.111767 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3bdc231e-2cd0-4323-a872-803e28ab5868-serving-cert\") pod \"3bdc231e-2cd0-4323-a872-803e28ab5868\" (UID: \"3bdc231e-2cd0-4323-a872-803e28ab5868\") " Mar 09 14:07:48 crc kubenswrapper[4722]: I0309 14:07:48.112358 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3bdc231e-2cd0-4323-a872-803e28ab5868-client-ca\") pod \"3bdc231e-2cd0-4323-a872-803e28ab5868\" (UID: \"3bdc231e-2cd0-4323-a872-803e28ab5868\") " Mar 09 14:07:48 crc kubenswrapper[4722]: I0309 14:07:48.112497 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-td4zf\" (UniqueName: \"kubernetes.io/projected/3bdc231e-2cd0-4323-a872-803e28ab5868-kube-api-access-td4zf\") pod \"3bdc231e-2cd0-4323-a872-803e28ab5868\" (UID: \"3bdc231e-2cd0-4323-a872-803e28ab5868\") " Mar 09 14:07:48 crc kubenswrapper[4722]: I0309 14:07:48.112560 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3bdc231e-2cd0-4323-a872-803e28ab5868-proxy-ca-bundles\") pod \"3bdc231e-2cd0-4323-a872-803e28ab5868\" (UID: \"3bdc231e-2cd0-4323-a872-803e28ab5868\") " Mar 09 14:07:48 crc kubenswrapper[4722]: I0309 14:07:48.112605 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bdc231e-2cd0-4323-a872-803e28ab5868-config\") pod \"3bdc231e-2cd0-4323-a872-803e28ab5868\" (UID: \"3bdc231e-2cd0-4323-a872-803e28ab5868\") " Mar 09 14:07:48 crc kubenswrapper[4722]: I0309 14:07:48.114166 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3bdc231e-2cd0-4323-a872-803e28ab5868-config" (OuterVolumeSpecName: "config") pod "3bdc231e-2cd0-4323-a872-803e28ab5868" (UID: "3bdc231e-2cd0-4323-a872-803e28ab5868"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:07:48 crc kubenswrapper[4722]: I0309 14:07:48.115846 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3bdc231e-2cd0-4323-a872-803e28ab5868-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "3bdc231e-2cd0-4323-a872-803e28ab5868" (UID: "3bdc231e-2cd0-4323-a872-803e28ab5868"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:07:48 crc kubenswrapper[4722]: I0309 14:07:48.116219 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3bdc231e-2cd0-4323-a872-803e28ab5868-client-ca" (OuterVolumeSpecName: "client-ca") pod "3bdc231e-2cd0-4323-a872-803e28ab5868" (UID: "3bdc231e-2cd0-4323-a872-803e28ab5868"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:07:48 crc kubenswrapper[4722]: I0309 14:07:48.123434 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bdc231e-2cd0-4323-a872-803e28ab5868-kube-api-access-td4zf" (OuterVolumeSpecName: "kube-api-access-td4zf") pod "3bdc231e-2cd0-4323-a872-803e28ab5868" (UID: "3bdc231e-2cd0-4323-a872-803e28ab5868"). InnerVolumeSpecName "kube-api-access-td4zf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:07:48 crc kubenswrapper[4722]: I0309 14:07:48.129887 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bdc231e-2cd0-4323-a872-803e28ab5868-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3bdc231e-2cd0-4323-a872-803e28ab5868" (UID: "3bdc231e-2cd0-4323-a872-803e28ab5868"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:07:48 crc kubenswrapper[4722]: I0309 14:07:48.160259 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c83f63c-9a9e-4e33-b0c6-54396ce9f07c" path="/var/lib/kubelet/pods/9c83f63c-9a9e-4e33-b0c6-54396ce9f07c/volumes" Mar 09 14:07:48 crc kubenswrapper[4722]: I0309 14:07:48.216444 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3bdc231e-2cd0-4323-a872-803e28ab5868-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 14:07:48 crc kubenswrapper[4722]: I0309 14:07:48.216488 4722 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3bdc231e-2cd0-4323-a872-803e28ab5868-client-ca\") on node \"crc\" DevicePath \"\"" Mar 09 14:07:48 crc kubenswrapper[4722]: I0309 14:07:48.216502 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-td4zf\" (UniqueName: \"kubernetes.io/projected/3bdc231e-2cd0-4323-a872-803e28ab5868-kube-api-access-td4zf\") on node \"crc\" DevicePath \"\"" Mar 09 14:07:48 crc kubenswrapper[4722]: I0309 14:07:48.216514 4722 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3bdc231e-2cd0-4323-a872-803e28ab5868-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 09 14:07:48 crc kubenswrapper[4722]: I0309 14:07:48.216524 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bdc231e-2cd0-4323-a872-803e28ab5868-config\") on node \"crc\" DevicePath \"\"" Mar 09 14:07:48 crc kubenswrapper[4722]: I0309 14:07:48.274404 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-authentication/oauth-openshift-5db757fd5b-t57qc"] Mar 09 14:07:48 crc kubenswrapper[4722]: I0309 14:07:48.577490 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-8445c785c8-hdmgl"] Mar 09 14:07:48 crc kubenswrapper[4722]: E0309 14:07:48.577891 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bdc231e-2cd0-4323-a872-803e28ab5868" containerName="controller-manager" Mar 09 14:07:48 crc kubenswrapper[4722]: I0309 14:07:48.577921 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bdc231e-2cd0-4323-a872-803e28ab5868" containerName="controller-manager" Mar 09 14:07:48 crc kubenswrapper[4722]: E0309 14:07:48.577945 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7afc4c0-854b-4491-8db5-a6dbdc888d1c" containerName="route-controller-manager" Mar 09 14:07:48 crc kubenswrapper[4722]: I0309 14:07:48.577955 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7afc4c0-854b-4491-8db5-a6dbdc888d1c" containerName="route-controller-manager" Mar 09 14:07:48 crc kubenswrapper[4722]: I0309 14:07:48.578128 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bdc231e-2cd0-4323-a872-803e28ab5868" containerName="controller-manager" Mar 09 14:07:48 crc kubenswrapper[4722]: I0309 14:07:48.578149 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7afc4c0-854b-4491-8db5-a6dbdc888d1c" containerName="route-controller-manager" Mar 09 14:07:48 crc kubenswrapper[4722]: I0309 14:07:48.578810 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-8445c785c8-hdmgl" Mar 09 14:07:48 crc kubenswrapper[4722]: I0309 14:07:48.586966 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67579949ff-g69dw"] Mar 09 14:07:48 crc kubenswrapper[4722]: I0309 14:07:48.588116 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67579949ff-g69dw" Mar 09 14:07:48 crc kubenswrapper[4722]: I0309 14:07:48.599626 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-8445c785c8-hdmgl"] Mar 09 14:07:48 crc kubenswrapper[4722]: I0309 14:07:48.610087 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67579949ff-g69dw"] Mar 09 14:07:48 crc kubenswrapper[4722]: I0309 14:07:48.621943 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvrm7\" (UniqueName: \"kubernetes.io/projected/fd45c8c5-9cad-404b-b14a-9cbc710c8468-kube-api-access-mvrm7\") pod \"route-controller-manager-67579949ff-g69dw\" (UID: \"fd45c8c5-9cad-404b-b14a-9cbc710c8468\") " pod="openshift-route-controller-manager/route-controller-manager-67579949ff-g69dw" Mar 09 14:07:48 crc kubenswrapper[4722]: I0309 14:07:48.622004 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7139e62b-5e90-4545-a264-aa8138821a55-client-ca\") pod \"controller-manager-8445c785c8-hdmgl\" (UID: \"7139e62b-5e90-4545-a264-aa8138821a55\") " pod="openshift-controller-manager/controller-manager-8445c785c8-hdmgl" Mar 09 14:07:48 crc kubenswrapper[4722]: I0309 14:07:48.622029 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd45c8c5-9cad-404b-b14a-9cbc710c8468-serving-cert\") pod \"route-controller-manager-67579949ff-g69dw\" (UID: \"fd45c8c5-9cad-404b-b14a-9cbc710c8468\") " pod="openshift-route-controller-manager/route-controller-manager-67579949ff-g69dw" Mar 09 14:07:48 crc kubenswrapper[4722]: I0309 14:07:48.622052 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fldq4\" (UniqueName: \"kubernetes.io/projected/7139e62b-5e90-4545-a264-aa8138821a55-kube-api-access-fldq4\") pod \"controller-manager-8445c785c8-hdmgl\" (UID: \"7139e62b-5e90-4545-a264-aa8138821a55\") " pod="openshift-controller-manager/controller-manager-8445c785c8-hdmgl" Mar 09 14:07:48 crc kubenswrapper[4722]: I0309 14:07:48.622079 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fd45c8c5-9cad-404b-b14a-9cbc710c8468-client-ca\") pod \"route-controller-manager-67579949ff-g69dw\" (UID: \"fd45c8c5-9cad-404b-b14a-9cbc710c8468\") " pod="openshift-route-controller-manager/route-controller-manager-67579949ff-g69dw" Mar 09 14:07:48 crc kubenswrapper[4722]: I0309 14:07:48.622100 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd45c8c5-9cad-404b-b14a-9cbc710c8468-config\") pod \"route-controller-manager-67579949ff-g69dw\" (UID: \"fd45c8c5-9cad-404b-b14a-9cbc710c8468\") " pod="openshift-route-controller-manager/route-controller-manager-67579949ff-g69dw" Mar 09 14:07:48 crc kubenswrapper[4722]: I0309 14:07:48.622121 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7139e62b-5e90-4545-a264-aa8138821a55-proxy-ca-bundles\") pod \"controller-manager-8445c785c8-hdmgl\" (UID: 
\"7139e62b-5e90-4545-a264-aa8138821a55\") " pod="openshift-controller-manager/controller-manager-8445c785c8-hdmgl" Mar 09 14:07:48 crc kubenswrapper[4722]: I0309 14:07:48.622158 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7139e62b-5e90-4545-a264-aa8138821a55-serving-cert\") pod \"controller-manager-8445c785c8-hdmgl\" (UID: \"7139e62b-5e90-4545-a264-aa8138821a55\") " pod="openshift-controller-manager/controller-manager-8445c785c8-hdmgl" Mar 09 14:07:48 crc kubenswrapper[4722]: I0309 14:07:48.622189 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7139e62b-5e90-4545-a264-aa8138821a55-config\") pod \"controller-manager-8445c785c8-hdmgl\" (UID: \"7139e62b-5e90-4545-a264-aa8138821a55\") " pod="openshift-controller-manager/controller-manager-8445c785c8-hdmgl" Mar 09 14:07:48 crc kubenswrapper[4722]: I0309 14:07:48.662482 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5db757fd5b-t57qc" event={"ID":"ec3da47d-c782-4189-b195-d6b203bd7f7a","Type":"ContainerStarted","Data":"1bef1f950532e1f9264d862747456e60811d3e386e3459a18b18cc7dad6b8a21"} Mar 09 14:07:48 crc kubenswrapper[4722]: I0309 14:07:48.662552 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5db757fd5b-t57qc" event={"ID":"ec3da47d-c782-4189-b195-d6b203bd7f7a","Type":"ContainerStarted","Data":"03d10d2770ce74665b3390715ede562cf0129890b43863f13d1ab38e357023a4"} Mar 09 14:07:48 crc kubenswrapper[4722]: I0309 14:07:48.662893 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-5db757fd5b-t57qc" Mar 09 14:07:48 crc kubenswrapper[4722]: I0309 14:07:48.667674 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6b98945b4f-krmvf" event={"ID":"d7afc4c0-854b-4491-8db5-a6dbdc888d1c","Type":"ContainerDied","Data":"639079ea050cb860daaa4addd8466963cc0fe384b1276e4d16ab6c97d1d6909d"} Mar 09 14:07:48 crc kubenswrapper[4722]: I0309 14:07:48.667718 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6b98945b4f-krmvf" Mar 09 14:07:48 crc kubenswrapper[4722]: I0309 14:07:48.667733 4722 scope.go:117] "RemoveContainer" containerID="4323b8690b994f2872b8977ceb081700d560bec7ee6f376993ce8f318c792782" Mar 09 14:07:48 crc kubenswrapper[4722]: I0309 14:07:48.669570 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8555d9c686-6xxgz" event={"ID":"3bdc231e-2cd0-4323-a872-803e28ab5868","Type":"ContainerDied","Data":"4a9512104b74ce61513fb3e6023d91badad5cc782fbe5107a783ed4439a43f96"} Mar 09 14:07:48 crc kubenswrapper[4722]: I0309 14:07:48.669608 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-8555d9c686-6xxgz" Mar 09 14:07:48 crc kubenswrapper[4722]: I0309 14:07:48.685107 4722 scope.go:117] "RemoveContainer" containerID="ab25961a4f544814b62328a9f21b27b3e7c6a24839aa4b36fcfcd6861c6d8f16" Mar 09 14:07:48 crc kubenswrapper[4722]: I0309 14:07:48.686751 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-5db757fd5b-t57qc" podStartSLOduration=27.686734515 podStartE2EDuration="27.686734515s" podCreationTimestamp="2026-03-09 14:07:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:07:48.685313205 +0000 UTC m=+309.240881791" watchObservedRunningTime="2026-03-09 14:07:48.686734515 +0000 UTC m=+309.242303091" Mar 09 14:07:48 crc kubenswrapper[4722]: I0309 14:07:48.708215 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-8555d9c686-6xxgz"] Mar 09 14:07:48 crc kubenswrapper[4722]: I0309 14:07:48.710776 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-8555d9c686-6xxgz"] Mar 09 14:07:48 crc kubenswrapper[4722]: I0309 14:07:48.721418 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b98945b4f-krmvf"] Mar 09 14:07:48 crc kubenswrapper[4722]: I0309 14:07:48.723527 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fd45c8c5-9cad-404b-b14a-9cbc710c8468-client-ca\") pod \"route-controller-manager-67579949ff-g69dw\" (UID: \"fd45c8c5-9cad-404b-b14a-9cbc710c8468\") " pod="openshift-route-controller-manager/route-controller-manager-67579949ff-g69dw" Mar 09 14:07:48 crc kubenswrapper[4722]: I0309 14:07:48.723578 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd45c8c5-9cad-404b-b14a-9cbc710c8468-config\") pod \"route-controller-manager-67579949ff-g69dw\" (UID: \"fd45c8c5-9cad-404b-b14a-9cbc710c8468\") " pod="openshift-route-controller-manager/route-controller-manager-67579949ff-g69dw" Mar 09 14:07:48 crc kubenswrapper[4722]: I0309 14:07:48.723648 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7139e62b-5e90-4545-a264-aa8138821a55-proxy-ca-bundles\") pod \"controller-manager-8445c785c8-hdmgl\" (UID: \"7139e62b-5e90-4545-a264-aa8138821a55\") " pod="openshift-controller-manager/controller-manager-8445c785c8-hdmgl" Mar 09 14:07:48 crc kubenswrapper[4722]: I0309 14:07:48.723778 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7139e62b-5e90-4545-a264-aa8138821a55-serving-cert\") pod \"controller-manager-8445c785c8-hdmgl\" (UID: \"7139e62b-5e90-4545-a264-aa8138821a55\") " pod="openshift-controller-manager/controller-manager-8445c785c8-hdmgl" Mar 09 14:07:48 crc kubenswrapper[4722]: I0309 14:07:48.723832 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7139e62b-5e90-4545-a264-aa8138821a55-config\") pod \"controller-manager-8445c785c8-hdmgl\" (UID: \"7139e62b-5e90-4545-a264-aa8138821a55\") " pod="openshift-controller-manager/controller-manager-8445c785c8-hdmgl" Mar 09 
14:07:48 crc kubenswrapper[4722]: I0309 14:07:48.723921 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvrm7\" (UniqueName: \"kubernetes.io/projected/fd45c8c5-9cad-404b-b14a-9cbc710c8468-kube-api-access-mvrm7\") pod \"route-controller-manager-67579949ff-g69dw\" (UID: \"fd45c8c5-9cad-404b-b14a-9cbc710c8468\") " pod="openshift-route-controller-manager/route-controller-manager-67579949ff-g69dw" Mar 09 14:07:48 crc kubenswrapper[4722]: I0309 14:07:48.723972 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7139e62b-5e90-4545-a264-aa8138821a55-client-ca\") pod \"controller-manager-8445c785c8-hdmgl\" (UID: \"7139e62b-5e90-4545-a264-aa8138821a55\") " pod="openshift-controller-manager/controller-manager-8445c785c8-hdmgl" Mar 09 14:07:48 crc kubenswrapper[4722]: I0309 14:07:48.724012 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd45c8c5-9cad-404b-b14a-9cbc710c8468-serving-cert\") pod \"route-controller-manager-67579949ff-g69dw\" (UID: \"fd45c8c5-9cad-404b-b14a-9cbc710c8468\") " pod="openshift-route-controller-manager/route-controller-manager-67579949ff-g69dw" Mar 09 14:07:48 crc kubenswrapper[4722]: I0309 14:07:48.724038 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fldq4\" (UniqueName: \"kubernetes.io/projected/7139e62b-5e90-4545-a264-aa8138821a55-kube-api-access-fldq4\") pod \"controller-manager-8445c785c8-hdmgl\" (UID: \"7139e62b-5e90-4545-a264-aa8138821a55\") " pod="openshift-controller-manager/controller-manager-8445c785c8-hdmgl" Mar 09 14:07:48 crc kubenswrapper[4722]: I0309 14:07:48.724799 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fd45c8c5-9cad-404b-b14a-9cbc710c8468-client-ca\") pod \"route-controller-manager-67579949ff-g69dw\" (UID: \"fd45c8c5-9cad-404b-b14a-9cbc710c8468\") " pod="openshift-route-controller-manager/route-controller-manager-67579949ff-g69dw" Mar 09 14:07:48 crc kubenswrapper[4722]: I0309 14:07:48.726570 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7139e62b-5e90-4545-a264-aa8138821a55-client-ca\") pod \"controller-manager-8445c785c8-hdmgl\" (UID: \"7139e62b-5e90-4545-a264-aa8138821a55\") " pod="openshift-controller-manager/controller-manager-8445c785c8-hdmgl" Mar 09 14:07:48 crc kubenswrapper[4722]: I0309 14:07:48.726589 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd45c8c5-9cad-404b-b14a-9cbc710c8468-config\") pod \"route-controller-manager-67579949ff-g69dw\" (UID: \"fd45c8c5-9cad-404b-b14a-9cbc710c8468\") " pod="openshift-route-controller-manager/route-controller-manager-67579949ff-g69dw" Mar 09 14:07:48 crc kubenswrapper[4722]: I0309 14:07:48.727354 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7139e62b-5e90-4545-a264-aa8138821a55-proxy-ca-bundles\") pod \"controller-manager-8445c785c8-hdmgl\" (UID: \"7139e62b-5e90-4545-a264-aa8138821a55\") " pod="openshift-controller-manager/controller-manager-8445c785c8-hdmgl" Mar 09 14:07:48 crc kubenswrapper[4722]: I0309 14:07:48.728720 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7139e62b-5e90-4545-a264-aa8138821a55-config\") pod \"controller-manager-8445c785c8-hdmgl\" (UID: \"7139e62b-5e90-4545-a264-aa8138821a55\") " pod="openshift-controller-manager/controller-manager-8445c785c8-hdmgl" Mar 09 14:07:48 crc kubenswrapper[4722]: I0309 14:07:48.729027 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b98945b4f-krmvf"] Mar 09 14:07:48 crc kubenswrapper[4722]: I0309 14:07:48.729194 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7139e62b-5e90-4545-a264-aa8138821a55-serving-cert\") pod \"controller-manager-8445c785c8-hdmgl\" (UID: \"7139e62b-5e90-4545-a264-aa8138821a55\") " pod="openshift-controller-manager/controller-manager-8445c785c8-hdmgl" Mar 09 14:07:48 crc kubenswrapper[4722]: I0309 14:07:48.729390 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd45c8c5-9cad-404b-b14a-9cbc710c8468-serving-cert\") pod \"route-controller-manager-67579949ff-g69dw\" (UID: \"fd45c8c5-9cad-404b-b14a-9cbc710c8468\") " pod="openshift-route-controller-manager/route-controller-manager-67579949ff-g69dw" Mar 09 14:07:48 crc kubenswrapper[4722]: I0309 14:07:48.742014 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fldq4\" (UniqueName: \"kubernetes.io/projected/7139e62b-5e90-4545-a264-aa8138821a55-kube-api-access-fldq4\") pod \"controller-manager-8445c785c8-hdmgl\" (UID: \"7139e62b-5e90-4545-a264-aa8138821a55\") " pod="openshift-controller-manager/controller-manager-8445c785c8-hdmgl" Mar 09 14:07:48 crc kubenswrapper[4722]: I0309 14:07:48.748536 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvrm7\" (UniqueName: \"kubernetes.io/projected/fd45c8c5-9cad-404b-b14a-9cbc710c8468-kube-api-access-mvrm7\") pod \"route-controller-manager-67579949ff-g69dw\" (UID: \"fd45c8c5-9cad-404b-b14a-9cbc710c8468\") " pod="openshift-route-controller-manager/route-controller-manager-67579949ff-g69dw" Mar 09 14:07:48 crc kubenswrapper[4722]: I0309 14:07:48.904854 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-8445c785c8-hdmgl" Mar 09 14:07:48 crc kubenswrapper[4722]: I0309 14:07:48.912806 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67579949ff-g69dw" Mar 09 14:07:49 crc kubenswrapper[4722]: I0309 14:07:49.344789 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67579949ff-g69dw"] Mar 09 14:07:49 crc kubenswrapper[4722]: I0309 14:07:49.360096 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-5db757fd5b-t57qc" Mar 09 14:07:49 crc kubenswrapper[4722]: I0309 14:07:49.416867 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-8445c785c8-hdmgl"] Mar 09 14:07:49 crc kubenswrapper[4722]: I0309 14:07:49.692832 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8445c785c8-hdmgl" event={"ID":"7139e62b-5e90-4545-a264-aa8138821a55","Type":"ContainerStarted","Data":"d54182a987e41b0c8c762501f9a0b141411808f05dd2f80ed1b2f718fd2394ac"} Mar 09 14:07:49 crc kubenswrapper[4722]: I0309 14:07:49.694125 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67579949ff-g69dw" event={"ID":"fd45c8c5-9cad-404b-b14a-9cbc710c8468","Type":"ContainerStarted","Data":"8d332c1529ee620c493764c4282c29e9d3a50c516a3d7a9251c37d0a5bd01a83"} Mar 09 14:07:50 crc kubenswrapper[4722]: I0309 14:07:50.159561 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bdc231e-2cd0-4323-a872-803e28ab5868" path="/var/lib/kubelet/pods/3bdc231e-2cd0-4323-a872-803e28ab5868/volumes" Mar 09 14:07:50 crc kubenswrapper[4722]: I0309 14:07:50.160500 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7afc4c0-854b-4491-8db5-a6dbdc888d1c" path="/var/lib/kubelet/pods/d7afc4c0-854b-4491-8db5-a6dbdc888d1c/volumes" Mar 09 14:07:50 crc kubenswrapper[4722]: I0309 14:07:50.704772 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8445c785c8-hdmgl" event={"ID":"7139e62b-5e90-4545-a264-aa8138821a55","Type":"ContainerStarted","Data":"4e30d1f5da18c63b781ab16a8bddd74f6fef64c6f69f6d7e9a89ed24675fbedc"} Mar 09 14:07:50 crc kubenswrapper[4722]: I0309 14:07:50.706505 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-8445c785c8-hdmgl" Mar 09 14:07:50 crc kubenswrapper[4722]: I0309 14:07:50.709993 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67579949ff-g69dw" event={"ID":"fd45c8c5-9cad-404b-b14a-9cbc710c8468","Type":"ContainerStarted","Data":"cd7063f2eab111d8560c940917127f1dca125e473c409c23b68096bf29d14642"} Mar 09 14:07:50 crc kubenswrapper[4722]: I0309 14:07:50.710109 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-67579949ff-g69dw" Mar 09 14:07:50 crc kubenswrapper[4722]: I0309 14:07:50.714523 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-8445c785c8-hdmgl" Mar 09 14:07:50 crc kubenswrapper[4722]: I0309 14:07:50.716006 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-67579949ff-g69dw" Mar 09 14:07:50 crc kubenswrapper[4722]: I0309 14:07:50.734866 4722 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-controller-manager/controller-manager-8445c785c8-hdmgl" podStartSLOduration=4.734832197 podStartE2EDuration="4.734832197s" podCreationTimestamp="2026-03-09 14:07:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:07:50.733109659 +0000 UTC m=+311.288678245" watchObservedRunningTime="2026-03-09 14:07:50.734832197 +0000 UTC m=+311.290400783" Mar 09 14:07:50 crc kubenswrapper[4722]: I0309 14:07:50.802290 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-67579949ff-g69dw" podStartSLOduration=4.8022457339999995 podStartE2EDuration="4.802245734s" podCreationTimestamp="2026-03-09 14:07:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:07:50.796122283 +0000 UTC m=+311.351690889" watchObservedRunningTime="2026-03-09 14:07:50.802245734 +0000 UTC m=+311.357814350" Mar 09 14:07:54 crc kubenswrapper[4722]: I0309 14:07:54.488604 4722 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 09 14:07:54 crc kubenswrapper[4722]: I0309 14:07:54.490278 4722 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 09 14:07:54 crc kubenswrapper[4722]: I0309 14:07:54.490546 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 14:07:54 crc kubenswrapper[4722]: I0309 14:07:54.490979 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://3dbef38ff5e714c92fc8a86f1e3c64adf606cb198145f884a000f8dee200fcf2" gracePeriod=15 Mar 09 14:07:54 crc kubenswrapper[4722]: I0309 14:07:54.491005 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://28fcbb961b20cd0cbc1e8548492168a0fe9920b7b98c7bfeffe34727c299030e" gracePeriod=15 Mar 09 14:07:54 crc kubenswrapper[4722]: I0309 14:07:54.490998 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://be3836660bf646052991965270cd69b8b8237b5c4bd698a73789b050d23e6877" gracePeriod=15 Mar 09 14:07:54 crc kubenswrapper[4722]: I0309 14:07:54.490935 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://56c76a5e8718243a17d8d9fd9a8dcf6083c2ccb65b4816926dca08a89e465728" gracePeriod=15 Mar 09 14:07:54 crc kubenswrapper[4722]: I0309 14:07:54.491561 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://e2fbabc5ec2c100150ce886c06c3bd78cc5896c8a06ac9d262eeb552a8a8bb5d" gracePeriod=15 Mar 09 14:07:54 crc 
kubenswrapper[4722]: I0309 14:07:54.493076 4722 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 09 14:07:54 crc kubenswrapper[4722]: E0309 14:07:54.493588 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 09 14:07:54 crc kubenswrapper[4722]: I0309 14:07:54.493604 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 09 14:07:54 crc kubenswrapper[4722]: E0309 14:07:54.493617 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 09 14:07:54 crc kubenswrapper[4722]: I0309 14:07:54.493646 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 09 14:07:54 crc kubenswrapper[4722]: E0309 14:07:54.493658 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 14:07:54 crc kubenswrapper[4722]: I0309 14:07:54.493666 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 14:07:54 crc kubenswrapper[4722]: E0309 14:07:54.493703 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 14:07:54 crc kubenswrapper[4722]: I0309 14:07:54.493713 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 14:07:54 crc kubenswrapper[4722]: E0309 14:07:54.493723 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 09 14:07:54 crc kubenswrapper[4722]: I0309 14:07:54.493731 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 09 14:07:54 crc kubenswrapper[4722]: E0309 14:07:54.493741 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 09 14:07:54 crc kubenswrapper[4722]: I0309 14:07:54.493748 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 09 14:07:54 crc kubenswrapper[4722]: E0309 14:07:54.493780 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 14:07:54 crc kubenswrapper[4722]: I0309 14:07:54.493791 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 14:07:54 crc kubenswrapper[4722]: E0309 14:07:54.493801 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 09 14:07:54 crc kubenswrapper[4722]: I0309 14:07:54.493808 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 09 14:07:54 crc kubenswrapper[4722]: E0309 14:07:54.493823 4722 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 14:07:54 crc kubenswrapper[4722]: I0309 14:07:54.493831 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 14:07:54 crc kubenswrapper[4722]: I0309 14:07:54.494037 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 09 14:07:54 crc kubenswrapper[4722]: I0309 14:07:54.494051 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 14:07:54 crc kubenswrapper[4722]: I0309 14:07:54.494060 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 14:07:54 crc kubenswrapper[4722]: I0309 14:07:54.494070 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 09 14:07:54 crc kubenswrapper[4722]: I0309 14:07:54.494092 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 09 14:07:54 crc kubenswrapper[4722]: I0309 14:07:54.494102 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 09 14:07:54 crc kubenswrapper[4722]: I0309 14:07:54.494111 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 14:07:54 crc kubenswrapper[4722]: I0309 14:07:54.494120 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 14:07:54 crc kubenswrapper[4722]: E0309 14:07:54.494261 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 14:07:54 crc kubenswrapper[4722]: I0309 14:07:54.494270 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 14:07:54 crc kubenswrapper[4722]: I0309 14:07:54.494388 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 09 14:07:54 crc kubenswrapper[4722]: I0309 14:07:54.529006 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 14:07:54 crc kubenswrapper[4722]: I0309 14:07:54.529081 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 14:07:54 crc kubenswrapper[4722]: I0309 14:07:54.529153 4722 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 14:07:54 crc kubenswrapper[4722]: I0309 14:07:54.529177 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 14:07:54 crc kubenswrapper[4722]: I0309 14:07:54.529249 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 14:07:54 crc kubenswrapper[4722]: I0309 14:07:54.540776 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 09 14:07:54 crc kubenswrapper[4722]: I0309 14:07:54.631002 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 14:07:54 crc kubenswrapper[4722]: I0309 14:07:54.631531 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 14:07:54 crc kubenswrapper[4722]: I0309 14:07:54.631576 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 14:07:54 crc kubenswrapper[4722]: I0309 14:07:54.631624 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 14:07:54 crc kubenswrapper[4722]: I0309 14:07:54.631658 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 14:07:54 crc kubenswrapper[4722]: I0309 14:07:54.631694 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 14:07:54 crc kubenswrapper[4722]: I0309 14:07:54.631728 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 14:07:54 crc kubenswrapper[4722]: I0309 14:07:54.631749 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 14:07:54 crc kubenswrapper[4722]: I0309 14:07:54.631864 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 14:07:54 crc kubenswrapper[4722]: I0309 14:07:54.631910 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 14:07:54 crc kubenswrapper[4722]: I0309 14:07:54.631941 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 14:07:54 crc kubenswrapper[4722]: I0309 14:07:54.631971 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 14:07:54 crc kubenswrapper[4722]: I0309 14:07:54.632000 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 14:07:54 crc kubenswrapper[4722]: I0309 14:07:54.733471 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 14:07:54 crc kubenswrapper[4722]: I0309 14:07:54.733522 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 14:07:54 crc kubenswrapper[4722]: I0309 14:07:54.733606 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 14:07:54 crc kubenswrapper[4722]: I0309 14:07:54.733707 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 14:07:54 crc kubenswrapper[4722]: I0309 14:07:54.733766 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 14:07:54 crc kubenswrapper[4722]: I0309 14:07:54.733795 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 14:07:54 crc kubenswrapper[4722]: I0309 14:07:54.749484 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 09 14:07:54 crc kubenswrapper[4722]: I0309 14:07:54.750766 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 09 14:07:54 crc kubenswrapper[4722]: I0309 14:07:54.751586 4722 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="3dbef38ff5e714c92fc8a86f1e3c64adf606cb198145f884a000f8dee200fcf2" exitCode=2 Mar 09 14:07:54 crc kubenswrapper[4722]: I0309 14:07:54.834826 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 14:07:54 crc kubenswrapper[4722]: W0309 14:07:54.862349 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-66890f6c54988d3d2f502cf378814dfee7c9e240501cf4e375277cc3c7776e2b WatchSource:0}: Error finding container 66890f6c54988d3d2f502cf378814dfee7c9e240501cf4e375277cc3c7776e2b: Status 404 returned error can't find the container with id 66890f6c54988d3d2f502cf378814dfee7c9e240501cf4e375277cc3c7776e2b Mar 09 14:07:54 crc kubenswrapper[4722]: E0309 14:07:54.866415 4722 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.194:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189b31792ff26587 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:07:54.865640839 +0000 UTC m=+315.421209435,LastTimestamp:2026-03-09 14:07:54.865640839 +0000 UTC m=+315.421209435,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 14:07:55 crc kubenswrapper[4722]: I0309 14:07:55.762252 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"66890f6c54988d3d2f502cf378814dfee7c9e240501cf4e375277cc3c7776e2b"} Mar 09 14:07:55 crc kubenswrapper[4722]: I0309 14:07:55.764292 4722 generic.go:334] "Generic (PLEG): container finished" podID="f8225796-29d7-45ff-a016-d19dbc155d1a" containerID="6a5e6490b93985ea1dc6129f8877ec4f217a1bd50f770b098de4590ca9e8df0e" exitCode=0 Mar 09 14:07:55 crc kubenswrapper[4722]: I0309 14:07:55.764370 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"f8225796-29d7-45ff-a016-d19dbc155d1a","Type":"ContainerDied","Data":"6a5e6490b93985ea1dc6129f8877ec4f217a1bd50f770b098de4590ca9e8df0e"} Mar 09 14:07:55 crc kubenswrapper[4722]: I0309 14:07:55.765162 4722 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 09 14:07:55 crc kubenswrapper[4722]: I0309 14:07:55.765787 4722 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 09 
14:07:55 crc kubenswrapper[4722]: I0309 14:07:55.766120 4722 status_manager.go:851] "Failed to get status for pod" podUID="f8225796-29d7-45ff-a016-d19dbc155d1a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 09 14:07:55 crc kubenswrapper[4722]: I0309 14:07:55.767407 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 09 14:07:55 crc kubenswrapper[4722]: I0309 14:07:55.769145 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 09 14:07:55 crc kubenswrapper[4722]: I0309 14:07:55.769852 4722 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="28fcbb961b20cd0cbc1e8548492168a0fe9920b7b98c7bfeffe34727c299030e" exitCode=0 Mar 09 14:07:55 crc kubenswrapper[4722]: I0309 14:07:55.769875 4722 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e2fbabc5ec2c100150ce886c06c3bd78cc5896c8a06ac9d262eeb552a8a8bb5d" exitCode=0 Mar 09 14:07:55 crc kubenswrapper[4722]: I0309 14:07:55.769886 4722 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="be3836660bf646052991965270cd69b8b8237b5c4bd698a73789b050d23e6877" exitCode=0 Mar 09 14:07:55 crc kubenswrapper[4722]: I0309 14:07:55.769959 4722 scope.go:117] "RemoveContainer" containerID="b7d636e81bd62363d09f680bbc312f71cbc2ac89d9af44b3c6b9efdaf29ded48" Mar 09 14:07:56 crc kubenswrapper[4722]: I0309 14:07:56.777355 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"e3f35e9316e9d9f1180b2e1a91dfe331c0255d626caf07dba1d44d02aa4170e2"} Mar 09 14:07:56 crc kubenswrapper[4722]: I0309 14:07:56.779130 4722 status_manager.go:851] "Failed to get status for pod" podUID="f8225796-29d7-45ff-a016-d19dbc155d1a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 09 14:07:56 crc kubenswrapper[4722]: I0309 14:07:56.779424 4722 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 09 14:07:56 crc kubenswrapper[4722]: I0309 14:07:56.782462 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 09 14:07:57 crc kubenswrapper[4722]: I0309 14:07:57.002891 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 09 14:07:57 crc kubenswrapper[4722]: I0309 14:07:57.004024 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 14:07:57 crc kubenswrapper[4722]: I0309 14:07:57.005435 4722 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 09 14:07:57 crc kubenswrapper[4722]: I0309 14:07:57.005923 4722 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 09 14:07:57 crc kubenswrapper[4722]: I0309 14:07:57.006426 4722 status_manager.go:851] "Failed to get status for pod" podUID="f8225796-29d7-45ff-a016-d19dbc155d1a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 09 14:07:57 crc kubenswrapper[4722]: I0309 14:07:57.069171 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 09 14:07:57 crc kubenswrapper[4722]: I0309 14:07:57.069273 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 09 14:07:57 crc kubenswrapper[4722]: I0309 14:07:57.069401 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 09 14:07:57 crc kubenswrapper[4722]: I0309 14:07:57.069385 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 14:07:57 crc kubenswrapper[4722]: I0309 14:07:57.069439 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 14:07:57 crc kubenswrapper[4722]: I0309 14:07:57.069559 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 14:07:57 crc kubenswrapper[4722]: I0309 14:07:57.070127 4722 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 09 14:07:57 crc kubenswrapper[4722]: I0309 14:07:57.070157 4722 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 09 14:07:57 crc kubenswrapper[4722]: I0309 14:07:57.070173 4722 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 09 14:07:57 crc kubenswrapper[4722]: I0309 14:07:57.137175 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 09 14:07:57 crc kubenswrapper[4722]: I0309 14:07:57.137997 4722 status_manager.go:851] "Failed to get status for pod" podUID="f8225796-29d7-45ff-a016-d19dbc155d1a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 09 14:07:57 crc kubenswrapper[4722]: I0309 14:07:57.138683 4722 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 09 14:07:57 crc kubenswrapper[4722]: I0309 14:07:57.139279 4722 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 09 14:07:57 crc kubenswrapper[4722]: I0309 14:07:57.171841 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f8225796-29d7-45ff-a016-d19dbc155d1a-kube-api-access\") pod \"f8225796-29d7-45ff-a016-d19dbc155d1a\" (UID: \"f8225796-29d7-45ff-a016-d19dbc155d1a\") " Mar 09 14:07:57 crc kubenswrapper[4722]: I0309 14:07:57.172259 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f8225796-29d7-45ff-a016-d19dbc155d1a-kubelet-dir\") pod \"f8225796-29d7-45ff-a016-d19dbc155d1a\" (UID: \"f8225796-29d7-45ff-a016-d19dbc155d1a\") " Mar 09 14:07:57 crc kubenswrapper[4722]: I0309 14:07:57.172377 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f8225796-29d7-45ff-a016-d19dbc155d1a-var-lock\") pod \"f8225796-29d7-45ff-a016-d19dbc155d1a\" (UID: \"f8225796-29d7-45ff-a016-d19dbc155d1a\") " Mar 09 14:07:57 crc kubenswrapper[4722]: I0309 14:07:57.172655 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f8225796-29d7-45ff-a016-d19dbc155d1a-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "f8225796-29d7-45ff-a016-d19dbc155d1a" (UID: 
"f8225796-29d7-45ff-a016-d19dbc155d1a"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 14:07:57 crc kubenswrapper[4722]: I0309 14:07:57.172781 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f8225796-29d7-45ff-a016-d19dbc155d1a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f8225796-29d7-45ff-a016-d19dbc155d1a" (UID: "f8225796-29d7-45ff-a016-d19dbc155d1a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 14:07:57 crc kubenswrapper[4722]: I0309 14:07:57.172997 4722 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f8225796-29d7-45ff-a016-d19dbc155d1a-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 09 14:07:57 crc kubenswrapper[4722]: I0309 14:07:57.173686 4722 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f8225796-29d7-45ff-a016-d19dbc155d1a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 09 14:07:57 crc kubenswrapper[4722]: I0309 14:07:57.187692 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8225796-29d7-45ff-a016-d19dbc155d1a-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "f8225796-29d7-45ff-a016-d19dbc155d1a" (UID: "f8225796-29d7-45ff-a016-d19dbc155d1a"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:07:57 crc kubenswrapper[4722]: I0309 14:07:57.275392 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f8225796-29d7-45ff-a016-d19dbc155d1a-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 09 14:07:57 crc kubenswrapper[4722]: I0309 14:07:57.800393 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 09 14:07:57 crc kubenswrapper[4722]: I0309 14:07:57.800531 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"f8225796-29d7-45ff-a016-d19dbc155d1a","Type":"ContainerDied","Data":"04d643a62e4b41231fd58c7fe1dac9c20e6e7a78ca8eac80b18c0d2f9d1f13a4"} Mar 09 14:07:57 crc kubenswrapper[4722]: I0309 14:07:57.801598 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04d643a62e4b41231fd58c7fe1dac9c20e6e7a78ca8eac80b18c0d2f9d1f13a4" Mar 09 14:07:57 crc kubenswrapper[4722]: I0309 14:07:57.805240 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 09 14:07:57 crc kubenswrapper[4722]: I0309 14:07:57.805794 4722 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="56c76a5e8718243a17d8d9fd9a8dcf6083c2ccb65b4816926dca08a89e465728" exitCode=0 Mar 09 14:07:57 crc kubenswrapper[4722]: I0309 14:07:57.806474 4722 scope.go:117] "RemoveContainer" containerID="28fcbb961b20cd0cbc1e8548492168a0fe9920b7b98c7bfeffe34727c299030e" Mar 09 14:07:57 crc kubenswrapper[4722]: I0309 14:07:57.806617 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 14:07:57 crc kubenswrapper[4722]: I0309 14:07:57.818351 4722 status_manager.go:851] "Failed to get status for pod" podUID="f8225796-29d7-45ff-a016-d19dbc155d1a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 09 14:07:57 crc kubenswrapper[4722]: I0309 14:07:57.818728 4722 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 09 14:07:57 crc kubenswrapper[4722]: I0309 14:07:57.819081 4722 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 09 14:07:57 crc kubenswrapper[4722]: I0309 14:07:57.833856 4722 scope.go:117] "RemoveContainer" containerID="e2fbabc5ec2c100150ce886c06c3bd78cc5896c8a06ac9d262eeb552a8a8bb5d" Mar 09 14:07:57 crc kubenswrapper[4722]: I0309 14:07:57.838441 4722 status_manager.go:851] "Failed to get status for pod" podUID="f8225796-29d7-45ff-a016-d19dbc155d1a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 09 14:07:57 crc kubenswrapper[4722]: I0309 14:07:57.839584 4722 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 09 14:07:57 crc kubenswrapper[4722]: I0309 14:07:57.840481 4722 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 09 14:07:57 crc kubenswrapper[4722]: I0309 14:07:57.853288 4722 scope.go:117] "RemoveContainer" containerID="be3836660bf646052991965270cd69b8b8237b5c4bd698a73789b050d23e6877" Mar 09 14:07:57 crc kubenswrapper[4722]: I0309 14:07:57.872431 4722 scope.go:117] "RemoveContainer" containerID="3dbef38ff5e714c92fc8a86f1e3c64adf606cb198145f884a000f8dee200fcf2" Mar 09 14:07:57 crc kubenswrapper[4722]: I0309 14:07:57.888873 4722 scope.go:117] "RemoveContainer" containerID="56c76a5e8718243a17d8d9fd9a8dcf6083c2ccb65b4816926dca08a89e465728" Mar 09 14:07:57 crc kubenswrapper[4722]: I0309 14:07:57.908261 4722 scope.go:117] "RemoveContainer" containerID="0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1" Mar 09 14:07:57 crc kubenswrapper[4722]: I0309 14:07:57.946783 4722 scope.go:117] "RemoveContainer" containerID="28fcbb961b20cd0cbc1e8548492168a0fe9920b7b98c7bfeffe34727c299030e" Mar 09 14:07:57 crc 
kubenswrapper[4722]: E0309 14:07:57.947609 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28fcbb961b20cd0cbc1e8548492168a0fe9920b7b98c7bfeffe34727c299030e\": container with ID starting with 28fcbb961b20cd0cbc1e8548492168a0fe9920b7b98c7bfeffe34727c299030e not found: ID does not exist" containerID="28fcbb961b20cd0cbc1e8548492168a0fe9920b7b98c7bfeffe34727c299030e" Mar 09 14:07:57 crc kubenswrapper[4722]: I0309 14:07:57.947652 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28fcbb961b20cd0cbc1e8548492168a0fe9920b7b98c7bfeffe34727c299030e"} err="failed to get container status \"28fcbb961b20cd0cbc1e8548492168a0fe9920b7b98c7bfeffe34727c299030e\": rpc error: code = NotFound desc = could not find container \"28fcbb961b20cd0cbc1e8548492168a0fe9920b7b98c7bfeffe34727c299030e\": container with ID starting with 28fcbb961b20cd0cbc1e8548492168a0fe9920b7b98c7bfeffe34727c299030e not found: ID does not exist" Mar 09 14:07:57 crc kubenswrapper[4722]: I0309 14:07:57.947687 4722 scope.go:117] "RemoveContainer" containerID="e2fbabc5ec2c100150ce886c06c3bd78cc5896c8a06ac9d262eeb552a8a8bb5d" Mar 09 14:07:57 crc kubenswrapper[4722]: E0309 14:07:57.948256 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2fbabc5ec2c100150ce886c06c3bd78cc5896c8a06ac9d262eeb552a8a8bb5d\": container with ID starting with e2fbabc5ec2c100150ce886c06c3bd78cc5896c8a06ac9d262eeb552a8a8bb5d not found: ID does not exist" containerID="e2fbabc5ec2c100150ce886c06c3bd78cc5896c8a06ac9d262eeb552a8a8bb5d" Mar 09 14:07:57 crc kubenswrapper[4722]: I0309 14:07:57.948292 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2fbabc5ec2c100150ce886c06c3bd78cc5896c8a06ac9d262eeb552a8a8bb5d"} err="failed to get container status \"e2fbabc5ec2c100150ce886c06c3bd78cc5896c8a06ac9d262eeb552a8a8bb5d\": rpc error: code = NotFound desc = could not find container \"e2fbabc5ec2c100150ce886c06c3bd78cc5896c8a06ac9d262eeb552a8a8bb5d\": container with ID starting with e2fbabc5ec2c100150ce886c06c3bd78cc5896c8a06ac9d262eeb552a8a8bb5d not found: ID does not exist" Mar 09 14:07:57 crc kubenswrapper[4722]: I0309 14:07:57.948313 4722 scope.go:117] "RemoveContainer" containerID="be3836660bf646052991965270cd69b8b8237b5c4bd698a73789b050d23e6877" Mar 09 14:07:57 crc kubenswrapper[4722]: E0309 14:07:57.948814 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be3836660bf646052991965270cd69b8b8237b5c4bd698a73789b050d23e6877\": container with ID starting with be3836660bf646052991965270cd69b8b8237b5c4bd698a73789b050d23e6877 not found: ID does not exist" containerID="be3836660bf646052991965270cd69b8b8237b5c4bd698a73789b050d23e6877" Mar 09 14:07:57 crc kubenswrapper[4722]: I0309 14:07:57.948879 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be3836660bf646052991965270cd69b8b8237b5c4bd698a73789b050d23e6877"} err="failed to get container status \"be3836660bf646052991965270cd69b8b8237b5c4bd698a73789b050d23e6877\": rpc error: code = NotFound desc = could not find container \"be3836660bf646052991965270cd69b8b8237b5c4bd698a73789b050d23e6877\": container with ID starting with be3836660bf646052991965270cd69b8b8237b5c4bd698a73789b050d23e6877 not found: ID does not exist" Mar 09 14:07:57 crc kubenswrapper[4722]: 
I0309 14:07:57.948909 4722 scope.go:117] "RemoveContainer" containerID="3dbef38ff5e714c92fc8a86f1e3c64adf606cb198145f884a000f8dee200fcf2" Mar 09 14:07:57 crc kubenswrapper[4722]: E0309 14:07:57.949476 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3dbef38ff5e714c92fc8a86f1e3c64adf606cb198145f884a000f8dee200fcf2\": container with ID starting with 3dbef38ff5e714c92fc8a86f1e3c64adf606cb198145f884a000f8dee200fcf2 not found: ID does not exist" containerID="3dbef38ff5e714c92fc8a86f1e3c64adf606cb198145f884a000f8dee200fcf2" Mar 09 14:07:57 crc kubenswrapper[4722]: I0309 14:07:57.949535 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dbef38ff5e714c92fc8a86f1e3c64adf606cb198145f884a000f8dee200fcf2"} err="failed to get container status \"3dbef38ff5e714c92fc8a86f1e3c64adf606cb198145f884a000f8dee200fcf2\": rpc error: code = NotFound desc = could not find container \"3dbef38ff5e714c92fc8a86f1e3c64adf606cb198145f884a000f8dee200fcf2\": container with ID starting with 3dbef38ff5e714c92fc8a86f1e3c64adf606cb198145f884a000f8dee200fcf2 not found: ID does not exist" Mar 09 14:07:57 crc kubenswrapper[4722]: I0309 14:07:57.949557 4722 scope.go:117] "RemoveContainer" containerID="56c76a5e8718243a17d8d9fd9a8dcf6083c2ccb65b4816926dca08a89e465728" Mar 09 14:07:57 crc kubenswrapper[4722]: E0309 14:07:57.949903 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56c76a5e8718243a17d8d9fd9a8dcf6083c2ccb65b4816926dca08a89e465728\": container with ID starting with 56c76a5e8718243a17d8d9fd9a8dcf6083c2ccb65b4816926dca08a89e465728 not found: ID does not exist" containerID="56c76a5e8718243a17d8d9fd9a8dcf6083c2ccb65b4816926dca08a89e465728" Mar 09 14:07:57 crc kubenswrapper[4722]: I0309 14:07:57.949939 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56c76a5e8718243a17d8d9fd9a8dcf6083c2ccb65b4816926dca08a89e465728"} err="failed to get container status \"56c76a5e8718243a17d8d9fd9a8dcf6083c2ccb65b4816926dca08a89e465728\": rpc error: code = NotFound desc = could not find container \"56c76a5e8718243a17d8d9fd9a8dcf6083c2ccb65b4816926dca08a89e465728\": container with ID starting with 56c76a5e8718243a17d8d9fd9a8dcf6083c2ccb65b4816926dca08a89e465728 not found: ID does not exist" Mar 09 14:07:57 crc kubenswrapper[4722]: I0309 14:07:57.949961 4722 scope.go:117] "RemoveContainer" containerID="0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1" Mar 09 14:07:57 crc kubenswrapper[4722]: E0309 14:07:57.950400 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\": container with ID starting with 0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1 not found: ID does not exist" containerID="0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1" Mar 09 14:07:57 crc kubenswrapper[4722]: I0309 14:07:57.950428 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1"} err="failed to get container status \"0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\": rpc error: code = NotFound desc = could not find container \"0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1\": container 
with ID starting with 0ca69b7fbf10a266276bc6ea153aa970732c70eb2d39bc73ea8f6d9459f3a3c1 not found: ID does not exist" Mar 09 14:07:58 crc kubenswrapper[4722]: I0309 14:07:58.156526 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 09 14:08:00 crc kubenswrapper[4722]: I0309 14:08:00.153005 4722 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 09 14:08:00 crc kubenswrapper[4722]: I0309 14:08:00.154113 4722 status_manager.go:851] "Failed to get status for pod" podUID="f8225796-29d7-45ff-a016-d19dbc155d1a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 09 14:08:02 crc kubenswrapper[4722]: E0309 14:08:02.526332 4722 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 09 14:08:02 crc kubenswrapper[4722]: E0309 14:08:02.526632 4722 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 09 14:08:02 crc kubenswrapper[4722]: E0309 14:08:02.526883 4722 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 09 14:08:02 crc kubenswrapper[4722]: E0309 14:08:02.527099 4722 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 09 14:08:02 crc kubenswrapper[4722]: E0309 14:08:02.527505 4722 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 09 14:08:02 crc kubenswrapper[4722]: I0309 14:08:02.527543 4722 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 09 14:08:02 crc kubenswrapper[4722]: E0309 14:08:02.527802 4722 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" interval="200ms" Mar 09 14:08:02 crc kubenswrapper[4722]: E0309 14:08:02.729037 4722 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" interval="400ms" Mar 09 14:08:03 crc kubenswrapper[4722]: E0309 
14:08:03.130869 4722 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" interval="800ms" Mar 09 14:08:03 crc kubenswrapper[4722]: E0309 14:08:03.514034 4722 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.194:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189b31792ff26587 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-09 14:07:54.865640839 +0000 UTC m=+315.421209435,LastTimestamp:2026-03-09 14:07:54.865640839 +0000 UTC m=+315.421209435,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 09 14:08:03 crc kubenswrapper[4722]: E0309 14:08:03.933163 4722 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" interval="1.6s" Mar 09 14:08:05 crc kubenswrapper[4722]: E0309 14:08:05.533698 4722 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" interval="3.2s" Mar 09 14:08:06 crc kubenswrapper[4722]: I0309 14:08:06.871656 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 09 14:08:06 crc kubenswrapper[4722]: I0309 14:08:06.873649 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 09 14:08:06 crc kubenswrapper[4722]: I0309 14:08:06.873699 4722 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="ff66d43b6e872eed24aacaf3ea882c7148b6ef62bbe5b7bf10ede5d2691a3680" exitCode=1 Mar 09 14:08:06 crc kubenswrapper[4722]: I0309 14:08:06.873775 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"ff66d43b6e872eed24aacaf3ea882c7148b6ef62bbe5b7bf10ede5d2691a3680"} Mar 09 14:08:06 crc kubenswrapper[4722]: I0309 14:08:06.874769 4722 scope.go:117] "RemoveContainer" containerID="ff66d43b6e872eed24aacaf3ea882c7148b6ef62bbe5b7bf10ede5d2691a3680" Mar 09 14:08:06 crc kubenswrapper[4722]: I0309 14:08:06.875119 4722 status_manager.go:851] "Failed to get status for pod" 
podUID="f8225796-29d7-45ff-a016-d19dbc155d1a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 09 14:08:06 crc kubenswrapper[4722]: I0309 14:08:06.875746 4722 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 09 14:08:06 crc kubenswrapper[4722]: I0309 14:08:06.876250 4722 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 09 14:08:07 crc kubenswrapper[4722]: I0309 14:08:07.886102 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 09 14:08:07 crc kubenswrapper[4722]: I0309 14:08:07.886722 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 14:08:07 crc kubenswrapper[4722]: I0309 14:08:07.887604 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 09 14:08:07 crc kubenswrapper[4722]: I0309 14:08:07.887662 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"bcddc7eafb6c9687b9000e59dccf3f95a805e933ddcd42fb6f704c8b18dd5257"} Mar 09 14:08:07 crc kubenswrapper[4722]: I0309 14:08:07.889164 4722 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 09 14:08:07 crc kubenswrapper[4722]: I0309 14:08:07.889796 4722 status_manager.go:851] "Failed to get status for pod" podUID="f8225796-29d7-45ff-a016-d19dbc155d1a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 09 14:08:07 crc kubenswrapper[4722]: I0309 14:08:07.890145 4722 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 09 14:08:08 crc kubenswrapper[4722]: I0309 14:08:08.148682 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 14:08:08 crc kubenswrapper[4722]: I0309 14:08:08.149973 4722 status_manager.go:851] "Failed to get status for pod" podUID="f8225796-29d7-45ff-a016-d19dbc155d1a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 09 14:08:08 crc kubenswrapper[4722]: I0309 14:08:08.150927 4722 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 09 14:08:08 crc kubenswrapper[4722]: I0309 14:08:08.151189 4722 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" Mar 09 14:08:08 crc kubenswrapper[4722]: I0309 14:08:08.173353 4722 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="0965e24b-526e-4842-ac1f-eca7a765355d" Mar 09 14:08:08 crc kubenswrapper[4722]: I0309 14:08:08.173423 4722 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="0965e24b-526e-4842-ac1f-eca7a765355d" Mar 09 14:08:08 crc kubenswrapper[4722]: E0309 14:08:08.174263 4722 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 14:08:08 crc kubenswrapper[4722]: I0309 14:08:08.175053 4722 util.go:30] "No sandbox for pod can be found. 
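The mirror_client.go lines above show the kubelet replacing the API-server-side "mirror" of the static pod kube-apiserver-crc; the delete targets a specific podUID so a newer mirror is never removed by mistake, and it fails here only because the API server itself is still down. A hedged sketch of a UID-preconditioned delete using the public client-go API (not the kubelet's internal mirror client):

    // Sketch of deleting a mirror pod with a UID precondition, in the
    // spirit of the mirror_client.go entries above.
    package mirrorclient

    import (
        "context"

        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/apimachinery/pkg/types"
        "k8s.io/client-go/kubernetes"
    )

    func deleteMirrorPod(ctx context.Context, cs kubernetes.Interface, ns, name string, uid types.UID) error {
        // The UID precondition makes the delete fail with a conflict if
        // the API object was already replaced by a newer mirror pod.
        return cs.CoreV1().Pods(ns).Delete(ctx, name, metav1.DeleteOptions{
            Preconditions: &metav1.Preconditions{UID: &uid},
        })
    }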
Mar 09 14:08:08 crc kubenswrapper[4722]: I0309 14:08:08.175053 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 09 14:08:08 crc kubenswrapper[4722]: W0309 14:08:08.211193 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-fe228b7f20ff654a45f9fa9f9bad8a8fc1e75d540499b3aa0e487ba827857963 WatchSource:0}: Error finding container fe228b7f20ff654a45f9fa9f9bad8a8fc1e75d540499b3aa0e487ba827857963: Status 404 returned error can't find the container with id fe228b7f20ff654a45f9fa9f9bad8a8fc1e75d540499b3aa0e487ba827857963
Mar 09 14:08:08 crc kubenswrapper[4722]: E0309 14:08:08.735256 4722 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.194:6443: connect: connection refused" interval="6.4s"
Mar 09 14:08:08 crc kubenswrapper[4722]: I0309 14:08:08.899234 4722 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="25d23da9e43f6fe26c5c9eb92581102e5af2dab76d997d0ea6731fdc737ad965" exitCode=0
Mar 09 14:08:08 crc kubenswrapper[4722]: I0309 14:08:08.899346 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"25d23da9e43f6fe26c5c9eb92581102e5af2dab76d997d0ea6731fdc737ad965"}
Mar 09 14:08:08 crc kubenswrapper[4722]: I0309 14:08:08.899445 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"fe228b7f20ff654a45f9fa9f9bad8a8fc1e75d540499b3aa0e487ba827857963"}
Mar 09 14:08:08 crc kubenswrapper[4722]: I0309 14:08:08.900364 4722 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="0965e24b-526e-4842-ac1f-eca7a765355d"
Mar 09 14:08:08 crc kubenswrapper[4722]: I0309 14:08:08.900439 4722 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="0965e24b-526e-4842-ac1f-eca7a765355d"
Mar 09 14:08:08 crc kubenswrapper[4722]: I0309 14:08:08.901104 4722 status_manager.go:851] "Failed to get status for pod" podUID="f8225796-29d7-45ff-a016-d19dbc155d1a" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.194:6443: connect: connection refused"
Mar 09 14:08:08 crc kubenswrapper[4722]: E0309 14:08:08.901168 4722 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.194:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 09 14:08:08 crc kubenswrapper[4722]: I0309 14:08:08.902171 4722 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.194:6443: connect: connection refused"
Mar 09 14:08:08 crc kubenswrapper[4722]: I0309 14:08:08.902731 4722 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.194:6443: connect: connection refused"
Mar 09 14:08:09 crc kubenswrapper[4722]: I0309 14:08:09.917957 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"74bd44815ac7e071fe287018a9c53a3e35931287a37d7ae316da15351ab9094b"}
Mar 09 14:08:09 crc kubenswrapper[4722]: I0309 14:08:09.918507 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"fd1997d5ad30716b65ff31978b959d2ef13ba8cbb670c542e596c31c8d3f4f05"}
Mar 09 14:08:09 crc kubenswrapper[4722]: I0309 14:08:09.918525 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"145053c05ff8a68e912c2942e3fcf64b77c2130eeb281dbd83488918b5b01026"}
Mar 09 14:08:10 crc kubenswrapper[4722]: I0309 14:08:10.934677 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"14ee5665dadccb9c523b15ff2211478c5bd22400bdd7739e092880edb04b9d3d"}
Mar 09 14:08:10 crc kubenswrapper[4722]: I0309 14:08:10.934759 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"1cadc3127db29ae2a6fe5c335c33fc7b87f5d6c11299176785b718abc7b3bc9e"}
Mar 09 14:08:10 crc kubenswrapper[4722]: I0309 14:08:10.935091 4722 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="0965e24b-526e-4842-ac1f-eca7a765355d"
Mar 09 14:08:10 crc kubenswrapper[4722]: I0309 14:08:10.935110 4722 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="0965e24b-526e-4842-ac1f-eca7a765355d"
Mar 09 14:08:10 crc kubenswrapper[4722]: I0309 14:08:10.935466 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 09 14:08:13 crc kubenswrapper[4722]: I0309 14:08:13.175961 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 09 14:08:13 crc kubenswrapper[4722]: I0309 14:08:13.176516 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 09 14:08:13 crc kubenswrapper[4722]: I0309 14:08:13.181049 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 09 14:08:13 crc kubenswrapper[4722]: I0309 14:08:13.686849 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 09 14:08:13 crc kubenswrapper[4722]: I0309 14:08:13.696582 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 14:08:15 crc kubenswrapper[4722]: I0309 14:08:15.956402 4722 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 14:08:16 crc kubenswrapper[4722]: I0309 14:08:16.976895 4722 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="0965e24b-526e-4842-ac1f-eca7a765355d" Mar 09 14:08:16 crc kubenswrapper[4722]: I0309 14:08:16.976939 4722 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="0965e24b-526e-4842-ac1f-eca7a765355d" Mar 09 14:08:16 crc kubenswrapper[4722]: I0309 14:08:16.982137 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 14:08:16 crc kubenswrapper[4722]: I0309 14:08:16.985794 4722 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="d61eb825-6421-4c01-a15e-d421f4fc93fa" Mar 09 14:08:17 crc kubenswrapper[4722]: I0309 14:08:17.985554 4722 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="0965e24b-526e-4842-ac1f-eca7a765355d" Mar 09 14:08:17 crc kubenswrapper[4722]: I0309 14:08:17.985980 4722 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="0965e24b-526e-4842-ac1f-eca7a765355d" Mar 09 14:08:20 crc kubenswrapper[4722]: I0309 14:08:20.175366 4722 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="d61eb825-6421-4c01-a15e-d421f4fc93fa" Mar 09 14:08:25 crc kubenswrapper[4722]: I0309 14:08:25.311901 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 09 14:08:26 crc kubenswrapper[4722]: I0309 14:08:26.255911 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 09 14:08:26 crc kubenswrapper[4722]: I0309 14:08:26.581321 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 14:08:26 crc kubenswrapper[4722]: I0309 14:08:26.644968 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 09 14:08:26 crc kubenswrapper[4722]: I0309 14:08:26.708791 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 09 14:08:27 crc kubenswrapper[4722]: I0309 14:08:27.108018 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 09 14:08:27 crc kubenswrapper[4722]: I0309 14:08:27.212468 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 09 14:08:27 crc kubenswrapper[4722]: I0309 14:08:27.413659 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 09 14:08:27 crc kubenswrapper[4722]: I0309 14:08:27.440188 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" 
Mar 09 14:08:27 crc kubenswrapper[4722]: I0309 14:08:27.464148 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 09 14:08:27 crc kubenswrapper[4722]: I0309 14:08:27.615575 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Mar 09 14:08:27 crc kubenswrapper[4722]: I0309 14:08:27.856742 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Mar 09 14:08:27 crc kubenswrapper[4722]: I0309 14:08:27.883734 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 09 14:08:28 crc kubenswrapper[4722]: I0309 14:08:28.076569 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Mar 09 14:08:28 crc kubenswrapper[4722]: I0309 14:08:28.195029 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Mar 09 14:08:28 crc kubenswrapper[4722]: I0309 14:08:28.222335 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Mar 09 14:08:28 crc kubenswrapper[4722]: I0309 14:08:28.400629 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Mar 09 14:08:28 crc kubenswrapper[4722]: I0309 14:08:28.697170 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Mar 09 14:08:28 crc kubenswrapper[4722]: I0309 14:08:28.800628 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Mar 09 14:08:28 crc kubenswrapper[4722]: I0309 14:08:28.908039 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 09 14:08:28 crc kubenswrapper[4722]: I0309 14:08:28.930965 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 09 14:08:28 crc kubenswrapper[4722]: I0309 14:08:28.997304 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Mar 09 14:08:29 crc kubenswrapper[4722]: I0309 14:08:29.103543 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Mar 09 14:08:29 crc kubenswrapper[4722]: I0309 14:08:29.114250 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Mar 09 14:08:29 crc kubenswrapper[4722]: I0309 14:08:29.146468 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Mar 09 14:08:29 crc kubenswrapper[4722]: I0309 14:08:29.148746 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Mar 09 14:08:29 crc kubenswrapper[4722]: I0309 14:08:29.188672 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 09 14:08:29 crc kubenswrapper[4722]: I0309 14:08:29.263649 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Mar 09 14:08:29 crc kubenswrapper[4722]: I0309 14:08:29.362004 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Mar 09 14:08:29 crc kubenswrapper[4722]: I0309 14:08:29.535420 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Mar 09 14:08:29 crc kubenswrapper[4722]: I0309 14:08:29.683596 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 09 14:08:29 crc kubenswrapper[4722]: I0309 14:08:29.736676 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Mar 09 14:08:29 crc kubenswrapper[4722]: I0309 14:08:29.832837 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 09 14:08:29 crc kubenswrapper[4722]: I0309 14:08:29.955935 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 09 14:08:29 crc kubenswrapper[4722]: I0309 14:08:29.970942 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Mar 09 14:08:30 crc kubenswrapper[4722]: I0309 14:08:30.005308 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Mar 09 14:08:30 crc kubenswrapper[4722]: I0309 14:08:30.187042 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Mar 09 14:08:30 crc kubenswrapper[4722]: I0309 14:08:30.229083 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Mar 09 14:08:30 crc kubenswrapper[4722]: I0309 14:08:30.234948 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 09 14:08:30 crc kubenswrapper[4722]: I0309 14:08:30.253088 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Mar 09 14:08:30 crc kubenswrapper[4722]: I0309 14:08:30.283387 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Mar 09 14:08:30 crc kubenswrapper[4722]: I0309 14:08:30.291371 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 09 14:08:30 crc kubenswrapper[4722]: I0309 14:08:30.305575 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Mar 09 14:08:30 crc kubenswrapper[4722]: I0309 14:08:30.321568 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Mar 09 14:08:30 crc kubenswrapper[4722]: I0309 14:08:30.428785 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Mar 09 14:08:30 crc kubenswrapper[4722]: I0309 14:08:30.621480 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Mar 09 14:08:30 crc kubenswrapper[4722]: I0309 14:08:30.732109 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Mar 09 14:08:30 crc kubenswrapper[4722]: I0309 14:08:30.923949 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 09 14:08:30 crc kubenswrapper[4722]: I0309 14:08:30.990137 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Mar 09 14:08:31 crc kubenswrapper[4722]: I0309 14:08:31.006612 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Mar 09 14:08:31 crc kubenswrapper[4722]: I0309 14:08:31.056688 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Mar 09 14:08:31 crc kubenswrapper[4722]: I0309 14:08:31.169867 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 09 14:08:31 crc kubenswrapper[4722]: I0309 14:08:31.221905 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Mar 09 14:08:31 crc kubenswrapper[4722]: I0309 14:08:31.252410 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Mar 09 14:08:31 crc kubenswrapper[4722]: I0309 14:08:31.324153 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 09 14:08:31 crc kubenswrapper[4722]: I0309 14:08:31.343559 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Mar 09 14:08:31 crc kubenswrapper[4722]: I0309 14:08:31.444762 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 09 14:08:31 crc kubenswrapper[4722]: I0309 14:08:31.540892 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Mar 09 14:08:31 crc kubenswrapper[4722]: I0309 14:08:31.554151 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Mar 09 14:08:31 crc kubenswrapper[4722]: I0309 14:08:31.603806 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Mar 09 14:08:31 crc kubenswrapper[4722]: I0309 14:08:31.727238 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Mar 09 14:08:31 crc kubenswrapper[4722]: I0309 14:08:31.770932 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Mar 09 14:08:31 crc kubenswrapper[4722]: I0309 14:08:31.809813 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Mar 09 14:08:31 crc kubenswrapper[4722]: I0309 14:08:31.881630 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 09 14:08:31 crc kubenswrapper[4722]: I0309 14:08:31.897480 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Mar 09 14:08:31 crc kubenswrapper[4722]: I0309 14:08:31.924283 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Mar 09 14:08:31 crc kubenswrapper[4722]: I0309 14:08:31.962570 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 09 14:08:31 crc kubenswrapper[4722]: I0309 14:08:31.972385 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Mar 09 14:08:31 crc kubenswrapper[4722]: I0309 14:08:31.994716 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 09 14:08:32 crc kubenswrapper[4722]: I0309 14:08:32.135112 4722 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Mar 09 14:08:32 crc kubenswrapper[4722]: I0309 14:08:32.162052 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Mar 09 14:08:32 crc kubenswrapper[4722]: I0309 14:08:32.287039 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Mar 09 14:08:32 crc kubenswrapper[4722]: I0309 14:08:32.354257 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Mar 09 14:08:32 crc kubenswrapper[4722]: I0309 14:08:32.450083 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 09 14:08:32 crc kubenswrapper[4722]: I0309 14:08:32.470607 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Mar 09 14:08:32 crc kubenswrapper[4722]: I0309 14:08:32.528293 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 09 14:08:32 crc kubenswrapper[4722]: I0309 14:08:32.599884 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 09 14:08:32 crc kubenswrapper[4722]: I0309 14:08:32.608813 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 09 14:08:32 crc kubenswrapper[4722]: I0309 14:08:32.652863 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 09 14:08:32 crc kubenswrapper[4722]: I0309 14:08:32.663359 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Mar 09 14:08:32 crc kubenswrapper[4722]: I0309 14:08:32.722608 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Mar 09 14:08:32 crc kubenswrapper[4722]: I0309 14:08:32.746681 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 09 14:08:32 crc kubenswrapper[4722]: I0309 14:08:32.748343 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 09 14:08:32 crc kubenswrapper[4722]: I0309 14:08:32.762615 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Mar 09 14:08:32 crc kubenswrapper[4722]: I0309 14:08:32.783500 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Mar 09 14:08:32 crc kubenswrapper[4722]: I0309 14:08:32.854238 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 09 14:08:32 crc kubenswrapper[4722]: I0309 14:08:32.854359 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 09 14:08:32 crc kubenswrapper[4722]: I0309 14:08:32.881894 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Mar 09 14:08:32 crc kubenswrapper[4722]: I0309 14:08:32.894932 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 09 14:08:32 crc kubenswrapper[4722]: I0309 14:08:32.948631 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Mar 09 14:08:32 crc kubenswrapper[4722]: I0309 14:08:32.954159 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Mar 09 14:08:32 crc kubenswrapper[4722]: I0309 14:08:32.955088 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Mar 09 14:08:32 crc kubenswrapper[4722]: I0309 14:08:32.955316 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Mar 09 14:08:32 crc kubenswrapper[4722]: I0309 14:08:32.987984 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 09 14:08:32 crc kubenswrapper[4722]: I0309 14:08:32.988883 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Mar 09 14:08:32 crc kubenswrapper[4722]: I0309 14:08:32.998734 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 09 14:08:33 crc kubenswrapper[4722]: I0309 14:08:33.135133 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 09 14:08:33 crc kubenswrapper[4722]: I0309 14:08:33.151466 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Mar 09 14:08:33 crc kubenswrapper[4722]: I0309 14:08:33.182848 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Mar 09 14:08:33 crc kubenswrapper[4722]: I0309 14:08:33.219438 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Mar 09 14:08:33 crc kubenswrapper[4722]: I0309 14:08:33.261553 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 09 14:08:33 crc kubenswrapper[4722]: I0309 14:08:33.285751 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Mar 09 14:08:33 crc kubenswrapper[4722]: I0309 14:08:33.416451 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 09 14:08:33 crc kubenswrapper[4722]: I0309 14:08:33.416653 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 09 14:08:33 crc kubenswrapper[4722]: I0309 14:08:33.449868 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 09 14:08:33 crc kubenswrapper[4722]: I0309 14:08:33.566622 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Mar 09 14:08:33 crc kubenswrapper[4722]: I0309 14:08:33.566750 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Mar 09 14:08:33 crc kubenswrapper[4722]: I0309 14:08:33.584836 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Mar 09 14:08:33 crc kubenswrapper[4722]: I0309 14:08:33.627888 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Mar 09 14:08:33 crc kubenswrapper[4722]: I0309 14:08:33.633326 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Mar 09 14:08:33 crc kubenswrapper[4722]: I0309 14:08:33.697791 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Mar 09 14:08:33 crc kubenswrapper[4722]: I0309 14:08:33.788131 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Mar 09 14:08:33 crc kubenswrapper[4722]: I0309 14:08:33.824281 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Mar 09 14:08:33 crc kubenswrapper[4722]: I0309 14:08:33.833660 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 09 14:08:33 crc kubenswrapper[4722]: I0309 14:08:33.835157 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 09 14:08:33 crc kubenswrapper[4722]: I0309 14:08:33.894840 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 09 14:08:33 crc kubenswrapper[4722]: I0309 14:08:33.926947 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Mar 09 14:08:33 crc kubenswrapper[4722]: I0309 14:08:33.933388 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Mar 09 14:08:34 crc kubenswrapper[4722]: I0309 14:08:34.032811 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 09 14:08:34 crc kubenswrapper[4722]: I0309 14:08:34.064052 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 09 14:08:34 crc kubenswrapper[4722]: I0309 14:08:34.083268 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Mar 09 14:08:34 crc kubenswrapper[4722]: I0309 14:08:34.115572 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Mar 09 14:08:34 crc kubenswrapper[4722]: I0309 14:08:34.120989 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Mar 09 14:08:34 crc kubenswrapper[4722]: I0309 14:08:34.210507 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Mar 09 14:08:34 crc kubenswrapper[4722]: I0309 14:08:34.293404 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 09 14:08:34 crc kubenswrapper[4722]: I0309 14:08:34.397832 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Mar 09 14:08:34 crc kubenswrapper[4722]: I0309 14:08:34.401684 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 09 14:08:34 crc kubenswrapper[4722]: I0309 14:08:34.417901 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Mar 09 14:08:34 crc kubenswrapper[4722]: I0309 14:08:34.588176 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Mar 09 14:08:34 crc kubenswrapper[4722]: I0309 14:08:34.603981 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 09 14:08:34 crc kubenswrapper[4722]: I0309 14:08:34.668603 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Mar 09 14:08:34 crc kubenswrapper[4722]: I0309 14:08:34.679707 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Mar 09 14:08:34 crc kubenswrapper[4722]: I0309 14:08:34.714750 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Mar 09 14:08:34 crc kubenswrapper[4722]: I0309 14:08:34.730262 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Mar 09 14:08:34 crc kubenswrapper[4722]: I0309 14:08:34.764670 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Mar 09 14:08:34 crc kubenswrapper[4722]: I0309 14:08:34.891276 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Mar 09 14:08:34 crc kubenswrapper[4722]: I0309 14:08:34.928590 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 09 14:08:34 crc kubenswrapper[4722]: I0309 14:08:34.975875 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Mar 09 14:08:35 crc kubenswrapper[4722]: I0309 14:08:35.199369 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Mar 09 14:08:35 crc kubenswrapper[4722]: I0309 14:08:35.205218 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 09 14:08:35 crc kubenswrapper[4722]: I0309 14:08:35.222609 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 09 14:08:35 crc kubenswrapper[4722]: I0309 14:08:35.259483 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 09 14:08:35 crc kubenswrapper[4722]: I0309 14:08:35.316311 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 09 14:08:35 crc kubenswrapper[4722]: I0309 14:08:35.481749 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 09 14:08:35 crc kubenswrapper[4722]: I0309 14:08:35.509227 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 09 14:08:35 crc kubenswrapper[4722]: I0309 14:08:35.515222 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Mar 09 14:08:35 crc kubenswrapper[4722]: I0309 14:08:35.522312 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 09 14:08:35 crc kubenswrapper[4722]: I0309 14:08:35.531660 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Mar 09 14:08:35 crc kubenswrapper[4722]: I0309 14:08:35.621613 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Mar 09 14:08:35 crc kubenswrapper[4722]: I0309 14:08:35.628519 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Mar 09 14:08:35 crc kubenswrapper[4722]: I0309 14:08:35.674608 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Mar 09 14:08:35 crc kubenswrapper[4722]: I0309 14:08:35.744501 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Mar 09 14:08:35 crc kubenswrapper[4722]: I0309 14:08:35.769195 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 09 14:08:35 crc kubenswrapper[4722]: I0309 14:08:35.853897 4722 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Mar 09 14:08:35 crc kubenswrapper[4722]: I0309 14:08:35.866143 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 09 14:08:35 crc kubenswrapper[4722]: I0309 14:08:35.949823 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 09 14:08:36 crc kubenswrapper[4722]: I0309 14:08:36.075139 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 09 14:08:36 crc kubenswrapper[4722]: I0309 14:08:36.117322 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Mar 09 14:08:36 crc kubenswrapper[4722]: I0309 14:08:36.123845 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Mar 09 14:08:36 crc kubenswrapper[4722]: I0309 14:08:36.130410 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
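A few entries break the object-"ns"/"name" pattern: the *v1.RuntimeClass and *v1.CSIDriver caches above are listed "from k8s.io/client-go/informers/factory.go:160", i.e. cluster-scoped caches built by the shared informer factory rather than per-object watches (the same applies to the *v1.Service, *v1.Node and *v1.Pod lines further down). A hedged sketch of that factory-based setup, illustrative rather than the kubelet's actual wiring:

    // Sketch of shared-informer-factory watches like the
    // "*v1.RuntimeClass / *v1.CSIDriver from informers/factory.go" lines.
    package factorywatch

    import (
        "time"

        "k8s.io/client-go/informers"
        "k8s.io/client-go/kubernetes"
    )

    func startClusterScopedInformers(cs kubernetes.Interface, stop <-chan struct{}) {
        f := informers.NewSharedInformerFactory(cs, 0*time.Second)
        // Requesting the informers registers them with the factory.
        _ = f.Node().V1().RuntimeClasses().Informer()
        _ = f.Storage().V1().CSIDrivers().Informer()
        f.Start(stop)
        // Blocks until each cache's initial list completes, which is
        // roughly when a "Caches populated" line would be logged.
        f.WaitForCacheSync(stop)
    }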
object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 09 14:08:36 crc kubenswrapper[4722]: I0309 14:08:36.166443 4722 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 09 14:08:36 crc kubenswrapper[4722]: I0309 14:08:36.257160 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 09 14:08:36 crc kubenswrapper[4722]: I0309 14:08:36.265884 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 09 14:08:36 crc kubenswrapper[4722]: I0309 14:08:36.318895 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 09 14:08:36 crc kubenswrapper[4722]: I0309 14:08:36.343249 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 09 14:08:36 crc kubenswrapper[4722]: I0309 14:08:36.350662 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 09 14:08:36 crc kubenswrapper[4722]: I0309 14:08:36.392962 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 09 14:08:36 crc kubenswrapper[4722]: I0309 14:08:36.411598 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 09 14:08:36 crc kubenswrapper[4722]: I0309 14:08:36.429896 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 09 14:08:36 crc kubenswrapper[4722]: I0309 14:08:36.442488 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 09 14:08:36 crc kubenswrapper[4722]: I0309 14:08:36.523131 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 09 14:08:36 crc kubenswrapper[4722]: I0309 14:08:36.534685 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 09 14:08:36 crc kubenswrapper[4722]: I0309 14:08:36.578668 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 09 14:08:36 crc kubenswrapper[4722]: I0309 14:08:36.596093 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 09 14:08:36 crc kubenswrapper[4722]: I0309 14:08:36.596267 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 09 14:08:36 crc kubenswrapper[4722]: I0309 14:08:36.623245 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 09 14:08:36 crc kubenswrapper[4722]: I0309 14:08:36.660691 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 09 14:08:36 crc kubenswrapper[4722]: I0309 14:08:36.798615 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 09 14:08:36 crc 
kubenswrapper[4722]: I0309 14:08:36.826632 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 09 14:08:36 crc kubenswrapper[4722]: I0309 14:08:36.972761 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 09 14:08:37 crc kubenswrapper[4722]: I0309 14:08:37.019539 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 09 14:08:37 crc kubenswrapper[4722]: I0309 14:08:37.100140 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 09 14:08:37 crc kubenswrapper[4722]: I0309 14:08:37.109624 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 09 14:08:37 crc kubenswrapper[4722]: I0309 14:08:37.332580 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 09 14:08:37 crc kubenswrapper[4722]: I0309 14:08:37.350693 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 09 14:08:37 crc kubenswrapper[4722]: I0309 14:08:37.445264 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 09 14:08:37 crc kubenswrapper[4722]: I0309 14:08:37.466811 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 09 14:08:37 crc kubenswrapper[4722]: I0309 14:08:37.476923 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 09 14:08:37 crc kubenswrapper[4722]: I0309 14:08:37.571521 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 09 14:08:37 crc kubenswrapper[4722]: I0309 14:08:37.595188 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 09 14:08:37 crc kubenswrapper[4722]: I0309 14:08:37.643179 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 09 14:08:37 crc kubenswrapper[4722]: I0309 14:08:37.727307 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 09 14:08:37 crc kubenswrapper[4722]: I0309 14:08:37.767953 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 09 14:08:37 crc kubenswrapper[4722]: I0309 14:08:37.798136 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 09 14:08:37 crc kubenswrapper[4722]: I0309 14:08:37.817610 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 09 14:08:37 crc kubenswrapper[4722]: I0309 14:08:37.824496 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 09 14:08:37 crc kubenswrapper[4722]: I0309 14:08:37.824627 4722 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 09 14:08:37 crc kubenswrapper[4722]: I0309 14:08:37.876947 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 09 14:08:37 crc kubenswrapper[4722]: I0309 14:08:37.978492 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 09 14:08:38 crc kubenswrapper[4722]: I0309 14:08:38.005608 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 09 14:08:38 crc kubenswrapper[4722]: I0309 14:08:38.045063 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 09 14:08:38 crc kubenswrapper[4722]: I0309 14:08:38.080829 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 09 14:08:38 crc kubenswrapper[4722]: I0309 14:08:38.096937 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 09 14:08:38 crc kubenswrapper[4722]: I0309 14:08:38.126628 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 09 14:08:38 crc kubenswrapper[4722]: I0309 14:08:38.137514 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 09 14:08:38 crc kubenswrapper[4722]: I0309 14:08:38.219143 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 09 14:08:38 crc kubenswrapper[4722]: I0309 14:08:38.236796 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 09 14:08:38 crc kubenswrapper[4722]: I0309 14:08:38.322867 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 09 14:08:38 crc kubenswrapper[4722]: I0309 14:08:38.362934 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 09 14:08:38 crc kubenswrapper[4722]: I0309 14:08:38.371931 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 09 14:08:38 crc kubenswrapper[4722]: I0309 14:08:38.526550 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 09 14:08:38 crc kubenswrapper[4722]: I0309 14:08:38.537706 4722 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 09 14:08:38 crc kubenswrapper[4722]: I0309 14:08:38.538661 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=44.538635673 podStartE2EDuration="44.538635673s" podCreationTimestamp="2026-03-09 14:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:08:15.84117639 +0000 UTC m=+336.396745006" watchObservedRunningTime="2026-03-09 14:08:38.538635673 +0000 UTC m=+359.094204269" Mar 09 14:08:38 crc kubenswrapper[4722]: I0309 14:08:38.539695 4722 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 09 14:08:38 crc kubenswrapper[4722]: I0309 14:08:38.544323 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 09 14:08:38 crc kubenswrapper[4722]: I0309 14:08:38.544389 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-infra/auto-csr-approver-29551088-b4l4k"] Mar 09 14:08:38 crc kubenswrapper[4722]: E0309 14:08:38.544680 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8225796-29d7-45ff-a016-d19dbc155d1a" containerName="installer" Mar 09 14:08:38 crc kubenswrapper[4722]: I0309 14:08:38.544706 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8225796-29d7-45ff-a016-d19dbc155d1a" containerName="installer" Mar 09 14:08:38 crc kubenswrapper[4722]: I0309 14:08:38.544889 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8225796-29d7-45ff-a016-d19dbc155d1a" containerName="installer" Mar 09 14:08:38 crc kubenswrapper[4722]: I0309 14:08:38.545479 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551088-b4l4k" Mar 09 14:08:38 crc kubenswrapper[4722]: I0309 14:08:38.547514 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 14:08:38 crc kubenswrapper[4722]: I0309 14:08:38.548671 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 14:08:38 crc kubenswrapper[4722]: I0309 14:08:38.549560 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-dr2x6" Mar 09 14:08:38 crc kubenswrapper[4722]: I0309 14:08:38.550037 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 09 14:08:38 crc kubenswrapper[4722]: I0309 14:08:38.576166 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=23.576135793 podStartE2EDuration="23.576135793s" podCreationTimestamp="2026-03-09 14:08:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:08:38.570506515 +0000 UTC m=+359.126075091" watchObservedRunningTime="2026-03-09 14:08:38.576135793 +0000 UTC m=+359.131704369" Mar 09 14:08:38 crc kubenswrapper[4722]: I0309 14:08:38.579382 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 09 14:08:38 crc kubenswrapper[4722]: I0309 14:08:38.613533 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 09 14:08:38 crc kubenswrapper[4722]: I0309 14:08:38.622536 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 09 14:08:38 crc kubenswrapper[4722]: I0309 14:08:38.677080 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7c9g9\" (UniqueName: \"kubernetes.io/projected/1c0ec175-5b4c-4e8d-9382-49aa1d515423-kube-api-access-7c9g9\") pod \"auto-csr-approver-29551088-b4l4k\" (UID: \"1c0ec175-5b4c-4e8d-9382-49aa1d515423\") " pod="openshift-infra/auto-csr-approver-29551088-b4l4k" Mar 09 14:08:38 crc kubenswrapper[4722]: I0309 
Mar 09 14:08:38 crc kubenswrapper[4722]: I0309 14:08:38.714854 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 09 14:08:38 crc kubenswrapper[4722]: I0309 14:08:38.778840 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7c9g9\" (UniqueName: \"kubernetes.io/projected/1c0ec175-5b4c-4e8d-9382-49aa1d515423-kube-api-access-7c9g9\") pod \"auto-csr-approver-29551088-b4l4k\" (UID: \"1c0ec175-5b4c-4e8d-9382-49aa1d515423\") " pod="openshift-infra/auto-csr-approver-29551088-b4l4k"
Mar 09 14:08:38 crc kubenswrapper[4722]: I0309 14:08:38.802831 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7c9g9\" (UniqueName: \"kubernetes.io/projected/1c0ec175-5b4c-4e8d-9382-49aa1d515423-kube-api-access-7c9g9\") pod \"auto-csr-approver-29551088-b4l4k\" (UID: \"1c0ec175-5b4c-4e8d-9382-49aa1d515423\") " pod="openshift-infra/auto-csr-approver-29551088-b4l4k"
Mar 09 14:08:38 crc kubenswrapper[4722]: I0309 14:08:38.871910 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551088-b4l4k"
Mar 09 14:08:38 crc kubenswrapper[4722]: I0309 14:08:38.972545 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 09 14:08:38 crc kubenswrapper[4722]: I0309 14:08:38.974475 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 09 14:08:38 crc kubenswrapper[4722]: I0309 14:08:38.985988 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 09 14:08:39 crc kubenswrapper[4722]: I0309 14:08:39.008646 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Mar 09 14:08:39 crc kubenswrapper[4722]: I0309 14:08:39.056584 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Mar 09 14:08:39 crc kubenswrapper[4722]: I0309 14:08:39.099682 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Mar 09 14:08:39 crc kubenswrapper[4722]: I0309 14:08:39.243660 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 09 14:08:39 crc kubenswrapper[4722]: I0309 14:08:39.254020 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Mar 09 14:08:39 crc kubenswrapper[4722]: I0309 14:08:39.278826 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551088-b4l4k"]
Mar 09 14:08:39 crc kubenswrapper[4722]: I0309 14:08:39.347070 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Mar 09 14:08:39 crc kubenswrapper[4722]: I0309 14:08:39.417359 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Mar 09 14:08:39 crc kubenswrapper[4722]: I0309 14:08:39.454652 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Mar 09 14:08:39 crc kubenswrapper[4722]: I0309 14:08:39.590064 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 09 14:08:39 crc kubenswrapper[4722]: I0309 14:08:39.883440 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Mar 09 14:08:40 crc kubenswrapper[4722]: I0309 14:08:40.135433 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551088-b4l4k" event={"ID":"1c0ec175-5b4c-4e8d-9382-49aa1d515423","Type":"ContainerStarted","Data":"2667c5e213d8212d2535fdf0931977c21e149753dbb4569647f04d1a2efdd52e"}
Mar 09 14:08:40 crc kubenswrapper[4722]: I0309 14:08:40.362935 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 09 14:08:40 crc kubenswrapper[4722]: I0309 14:08:40.363916 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Mar 09 14:08:40 crc kubenswrapper[4722]: I0309 14:08:40.402093 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Mar 09 14:08:40 crc kubenswrapper[4722]: I0309 14:08:40.428666 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 09 14:08:40 crc kubenswrapper[4722]: I0309 14:08:40.466730 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 09 14:08:40 crc kubenswrapper[4722]: I0309 14:08:40.542301 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Mar 09 14:08:40 crc kubenswrapper[4722]: I0309 14:08:40.624279 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Mar 09 14:08:40 crc kubenswrapper[4722]: I0309 14:08:40.899977 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 09 14:08:40 crc kubenswrapper[4722]: I0309 14:08:40.989137 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 09 14:08:41 crc kubenswrapper[4722]: I0309 14:08:41.143820 4722 generic.go:334] "Generic (PLEG): container finished" podID="1c0ec175-5b4c-4e8d-9382-49aa1d515423" containerID="2dbbdbcc313bdd84115bd6391cec9f91e1e2040ca6d071a52179af65734959b0" exitCode=0
Mar 09 14:08:41 crc kubenswrapper[4722]: I0309 14:08:41.143944 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551088-b4l4k" event={"ID":"1c0ec175-5b4c-4e8d-9382-49aa1d515423","Type":"ContainerDied","Data":"2dbbdbcc313bdd84115bd6391cec9f91e1e2040ca6d071a52179af65734959b0"}
Mar 09 14:08:41 crc kubenswrapper[4722]: I0309 14:08:41.271869 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Mar 09 14:08:41 crc kubenswrapper[4722]: I0309 14:08:41.448886 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Mar 09 14:08:41 crc kubenswrapper[4722]: I0309 14:08:41.505493 4722 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Mar 09 14:08:41 crc kubenswrapper[4722]: I0309 14:08:41.535329 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 09 14:08:41 crc kubenswrapper[4722]: I0309 14:08:41.600866 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 09 14:08:41 crc kubenswrapper[4722]: I0309 14:08:41.843606 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 09 14:08:42 crc kubenswrapper[4722]: I0309 14:08:42.013914 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 09 14:08:42 crc kubenswrapper[4722]: I0309 14:08:42.164151 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 09 14:08:42 crc kubenswrapper[4722]: I0309 14:08:42.189987 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 09 14:08:42 crc kubenswrapper[4722]: I0309 14:08:42.338955 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 09 14:08:42 crc kubenswrapper[4722]: I0309 14:08:42.446040 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551088-b4l4k" Mar 09 14:08:42 crc kubenswrapper[4722]: I0309 14:08:42.493140 4722 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 09 14:08:42 crc kubenswrapper[4722]: I0309 14:08:42.547475 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c9g9\" (UniqueName: \"kubernetes.io/projected/1c0ec175-5b4c-4e8d-9382-49aa1d515423-kube-api-access-7c9g9\") pod \"1c0ec175-5b4c-4e8d-9382-49aa1d515423\" (UID: \"1c0ec175-5b4c-4e8d-9382-49aa1d515423\") " Mar 09 14:08:42 crc kubenswrapper[4722]: I0309 14:08:42.555774 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c0ec175-5b4c-4e8d-9382-49aa1d515423-kube-api-access-7c9g9" (OuterVolumeSpecName: "kube-api-access-7c9g9") pod "1c0ec175-5b4c-4e8d-9382-49aa1d515423" (UID: "1c0ec175-5b4c-4e8d-9382-49aa1d515423"). InnerVolumeSpecName "kube-api-access-7c9g9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:08:42 crc kubenswrapper[4722]: I0309 14:08:42.649216 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c9g9\" (UniqueName: \"kubernetes.io/projected/1c0ec175-5b4c-4e8d-9382-49aa1d515423-kube-api-access-7c9g9\") on node \"crc\" DevicePath \"\"" Mar 09 14:08:43 crc kubenswrapper[4722]: I0309 14:08:43.165855 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551088-b4l4k" event={"ID":"1c0ec175-5b4c-4e8d-9382-49aa1d515423","Type":"ContainerDied","Data":"2667c5e213d8212d2535fdf0931977c21e149753dbb4569647f04d1a2efdd52e"} Mar 09 14:08:43 crc kubenswrapper[4722]: I0309 14:08:43.165907 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2667c5e213d8212d2535fdf0931977c21e149753dbb4569647f04d1a2efdd52e" Mar 09 14:08:43 crc kubenswrapper[4722]: I0309 14:08:43.165950 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551088-b4l4k" Mar 09 14:08:43 crc kubenswrapper[4722]: E0309 14:08:43.285183 4722 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c0ec175_5b4c_4e8d_9382_49aa1d515423.slice\": RecentStats: unable to find data in memory cache]" Mar 09 14:08:43 crc kubenswrapper[4722]: I0309 14:08:43.597235 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 09 14:08:49 crc kubenswrapper[4722]: I0309 14:08:49.825800 4722 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 09 14:08:49 crc kubenswrapper[4722]: I0309 14:08:49.826455 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://e3f35e9316e9d9f1180b2e1a91dfe331c0255d626caf07dba1d44d02aa4170e2" gracePeriod=5 Mar 09 14:08:55 crc kubenswrapper[4722]: I0309 14:08:55.247544 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 09 14:08:55 crc kubenswrapper[4722]: I0309 14:08:55.248401 4722 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="e3f35e9316e9d9f1180b2e1a91dfe331c0255d626caf07dba1d44d02aa4170e2" exitCode=137 Mar 09 14:08:55 crc kubenswrapper[4722]: I0309 14:08:55.412206 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 09 14:08:55 crc kubenswrapper[4722]: I0309 14:08:55.412745 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 14:08:55 crc kubenswrapper[4722]: I0309 14:08:55.559289 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 09 14:08:55 crc kubenswrapper[4722]: I0309 14:08:55.559433 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 09 14:08:55 crc kubenswrapper[4722]: I0309 14:08:55.559455 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 09 14:08:55 crc kubenswrapper[4722]: I0309 14:08:55.559439 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 14:08:55 crc kubenswrapper[4722]: I0309 14:08:55.559487 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 09 14:08:55 crc kubenswrapper[4722]: I0309 14:08:55.559508 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 09 14:08:55 crc kubenswrapper[4722]: I0309 14:08:55.559537 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 14:08:55 crc kubenswrapper[4722]: I0309 14:08:55.559555 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 14:08:55 crc kubenswrapper[4722]: I0309 14:08:55.559575 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 14:08:55 crc kubenswrapper[4722]: I0309 14:08:55.561863 4722 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 09 14:08:55 crc kubenswrapper[4722]: I0309 14:08:55.561889 4722 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 09 14:08:55 crc kubenswrapper[4722]: I0309 14:08:55.561902 4722 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 09 14:08:55 crc kubenswrapper[4722]: I0309 14:08:55.561917 4722 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 09 14:08:55 crc kubenswrapper[4722]: I0309 14:08:55.572067 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 14:08:55 crc kubenswrapper[4722]: I0309 14:08:55.663695 4722 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 09 14:08:56 crc kubenswrapper[4722]: I0309 14:08:56.158458 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 09 14:08:56 crc kubenswrapper[4722]: I0309 14:08:56.158778 4722 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Mar 09 14:08:56 crc kubenswrapper[4722]: I0309 14:08:56.173934 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 09 14:08:56 crc kubenswrapper[4722]: I0309 14:08:56.173978 4722 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="3e68cbb9-d34b-485f-91eb-4097bd31cf6f" Mar 09 14:08:56 crc kubenswrapper[4722]: I0309 14:08:56.180459 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 09 14:08:56 crc kubenswrapper[4722]: I0309 14:08:56.180516 4722 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="3e68cbb9-d34b-485f-91eb-4097bd31cf6f" Mar 09 14:08:56 crc kubenswrapper[4722]: I0309 14:08:56.256348 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 09 14:08:56 crc kubenswrapper[4722]: I0309 14:08:56.256446 4722 scope.go:117] "RemoveContainer" containerID="e3f35e9316e9d9f1180b2e1a91dfe331c0255d626caf07dba1d44d02aa4170e2" Mar 09 14:08:56 crc kubenswrapper[4722]: I0309 14:08:56.256603 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 09 14:08:56 crc kubenswrapper[4722]: I0309 14:08:56.354782 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-w95cf"] Mar 09 14:08:56 crc kubenswrapper[4722]: I0309 14:08:56.355083 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-w95cf" podUID="65e3d647-8806-4c0c-b9aa-142739f2fbe0" containerName="registry-server" containerID="cri-o://cf480bc5c8bb7328b096209debbb6ae3f3c95254a69760a97c4aaaab34c7f253" gracePeriod=30 Mar 09 14:08:56 crc kubenswrapper[4722]: I0309 14:08:56.367275 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pwxnx"] Mar 09 14:08:56 crc kubenswrapper[4722]: I0309 14:08:56.367868 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-pwxnx" podUID="c0f74bde-752e-497e-ad82-ec7a1676bbd5" containerName="registry-server" containerID="cri-o://aa3e4f5b5b38ee676d8bc64dbcfe4f208f4ba83831c09629dbc541729ab9ea5c" gracePeriod=30 Mar 09 14:08:56 crc kubenswrapper[4722]: I0309 14:08:56.386200 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wz9f2"] Mar 09 14:08:56 crc kubenswrapper[4722]: I0309 14:08:56.386601 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-wz9f2" podUID="e2641b0e-aae4-49df-931f-95e38505812f" containerName="marketplace-operator" containerID="cri-o://64d164d2bc8fe4385d91a3b84212f0498b57af3e2a5c68489f924b444efff3c6" gracePeriod=30 Mar 09 14:08:56 crc kubenswrapper[4722]: I0309 14:08:56.393694 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rdkdt"] Mar 09 14:08:56 crc kubenswrapper[4722]: I0309 14:08:56.394005 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rdkdt" podUID="cfd6e90e-4eeb-4372-8465-136a383e95b2" containerName="registry-server" containerID="cri-o://78f5a393860e42b0934375b2f004056d71f44af329e9dd8bdc283424c1580305" gracePeriod=30 Mar 09 14:08:56 crc kubenswrapper[4722]: I0309 14:08:56.401883 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-c2g4c"] Mar 09 14:08:56 crc kubenswrapper[4722]: I0309 14:08:56.402235 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-c2g4c" podUID="7df01eab-424f-40b1-a40c-03b930a8fac6" containerName="registry-server" containerID="cri-o://75288ffd906d8bf407d2dcb493b2b28933f498f7cd819bb126299f76207c3ac7" gracePeriod=30 Mar 09 14:08:56 crc kubenswrapper[4722]: I0309 14:08:56.422681 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-svdsk"] Mar 09 14:08:56 crc kubenswrapper[4722]: E0309 14:08:56.422980 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c0ec175-5b4c-4e8d-9382-49aa1d515423" containerName="oc" Mar 09 14:08:56 crc kubenswrapper[4722]: I0309 14:08:56.423005 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c0ec175-5b4c-4e8d-9382-49aa1d515423" containerName="oc" Mar 09 14:08:56 crc kubenswrapper[4722]: E0309 14:08:56.423019 4722 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 09 14:08:56 crc kubenswrapper[4722]: I0309 14:08:56.423027 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 09 14:08:56 crc kubenswrapper[4722]: I0309 14:08:56.423256 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 09 14:08:56 crc kubenswrapper[4722]: I0309 14:08:56.423279 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c0ec175-5b4c-4e8d-9382-49aa1d515423" containerName="oc" Mar 09 14:08:56 crc kubenswrapper[4722]: I0309 14:08:56.424223 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-svdsk" Mar 09 14:08:56 crc kubenswrapper[4722]: I0309 14:08:56.441354 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-svdsk"] Mar 09 14:08:56 crc kubenswrapper[4722]: I0309 14:08:56.475748 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ea964ea5-3fad-4bd0-8ffe-d78f00229fbe-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-svdsk\" (UID: \"ea964ea5-3fad-4bd0-8ffe-d78f00229fbe\") " pod="openshift-marketplace/marketplace-operator-79b997595-svdsk" Mar 09 14:08:56 crc kubenswrapper[4722]: I0309 14:08:56.475873 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xv2bk\" (UniqueName: \"kubernetes.io/projected/ea964ea5-3fad-4bd0-8ffe-d78f00229fbe-kube-api-access-xv2bk\") pod \"marketplace-operator-79b997595-svdsk\" (UID: \"ea964ea5-3fad-4bd0-8ffe-d78f00229fbe\") " pod="openshift-marketplace/marketplace-operator-79b997595-svdsk" Mar 09 14:08:56 crc kubenswrapper[4722]: I0309 14:08:56.475903 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ea964ea5-3fad-4bd0-8ffe-d78f00229fbe-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-svdsk\" (UID: \"ea964ea5-3fad-4bd0-8ffe-d78f00229fbe\") " pod="openshift-marketplace/marketplace-operator-79b997595-svdsk" Mar 09 14:08:56 crc kubenswrapper[4722]: I0309 14:08:56.577014 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xv2bk\" (UniqueName: \"kubernetes.io/projected/ea964ea5-3fad-4bd0-8ffe-d78f00229fbe-kube-api-access-xv2bk\") pod \"marketplace-operator-79b997595-svdsk\" (UID: \"ea964ea5-3fad-4bd0-8ffe-d78f00229fbe\") " pod="openshift-marketplace/marketplace-operator-79b997595-svdsk" Mar 09 14:08:56 crc kubenswrapper[4722]: I0309 14:08:56.577407 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ea964ea5-3fad-4bd0-8ffe-d78f00229fbe-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-svdsk\" (UID: \"ea964ea5-3fad-4bd0-8ffe-d78f00229fbe\") " pod="openshift-marketplace/marketplace-operator-79b997595-svdsk" Mar 09 14:08:56 crc kubenswrapper[4722]: I0309 14:08:56.577439 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ea964ea5-3fad-4bd0-8ffe-d78f00229fbe-marketplace-trusted-ca\") pod 
\"marketplace-operator-79b997595-svdsk\" (UID: \"ea964ea5-3fad-4bd0-8ffe-d78f00229fbe\") " pod="openshift-marketplace/marketplace-operator-79b997595-svdsk" Mar 09 14:08:56 crc kubenswrapper[4722]: I0309 14:08:56.578697 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ea964ea5-3fad-4bd0-8ffe-d78f00229fbe-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-svdsk\" (UID: \"ea964ea5-3fad-4bd0-8ffe-d78f00229fbe\") " pod="openshift-marketplace/marketplace-operator-79b997595-svdsk" Mar 09 14:08:56 crc kubenswrapper[4722]: I0309 14:08:56.582882 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ea964ea5-3fad-4bd0-8ffe-d78f00229fbe-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-svdsk\" (UID: \"ea964ea5-3fad-4bd0-8ffe-d78f00229fbe\") " pod="openshift-marketplace/marketplace-operator-79b997595-svdsk" Mar 09 14:08:56 crc kubenswrapper[4722]: I0309 14:08:56.595355 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xv2bk\" (UniqueName: \"kubernetes.io/projected/ea964ea5-3fad-4bd0-8ffe-d78f00229fbe-kube-api-access-xv2bk\") pod \"marketplace-operator-79b997595-svdsk\" (UID: \"ea964ea5-3fad-4bd0-8ffe-d78f00229fbe\") " pod="openshift-marketplace/marketplace-operator-79b997595-svdsk" Mar 09 14:08:56 crc kubenswrapper[4722]: I0309 14:08:56.760608 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-svdsk" Mar 09 14:08:56 crc kubenswrapper[4722]: I0309 14:08:56.854191 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pwxnx" Mar 09 14:08:56 crc kubenswrapper[4722]: I0309 14:08:56.919516 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w95cf" Mar 09 14:08:56 crc kubenswrapper[4722]: I0309 14:08:56.958626 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rdkdt" Mar 09 14:08:56 crc kubenswrapper[4722]: I0309 14:08:56.965339 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c2g4c" Mar 09 14:08:56 crc kubenswrapper[4722]: I0309 14:08:56.976913 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-wz9f2" Mar 09 14:08:56 crc kubenswrapper[4722]: I0309 14:08:56.989939 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0f74bde-752e-497e-ad82-ec7a1676bbd5-utilities\") pod \"c0f74bde-752e-497e-ad82-ec7a1676bbd5\" (UID: \"c0f74bde-752e-497e-ad82-ec7a1676bbd5\") " Mar 09 14:08:56 crc kubenswrapper[4722]: I0309 14:08:56.989984 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9t5bt\" (UniqueName: \"kubernetes.io/projected/c0f74bde-752e-497e-ad82-ec7a1676bbd5-kube-api-access-9t5bt\") pod \"c0f74bde-752e-497e-ad82-ec7a1676bbd5\" (UID: \"c0f74bde-752e-497e-ad82-ec7a1676bbd5\") " Mar 09 14:08:56 crc kubenswrapper[4722]: I0309 14:08:56.990018 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65e3d647-8806-4c0c-b9aa-142739f2fbe0-catalog-content\") pod \"65e3d647-8806-4c0c-b9aa-142739f2fbe0\" (UID: \"65e3d647-8806-4c0c-b9aa-142739f2fbe0\") " Mar 09 14:08:56 crc kubenswrapper[4722]: I0309 14:08:56.990079 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2kq5r\" (UniqueName: \"kubernetes.io/projected/65e3d647-8806-4c0c-b9aa-142739f2fbe0-kube-api-access-2kq5r\") pod \"65e3d647-8806-4c0c-b9aa-142739f2fbe0\" (UID: \"65e3d647-8806-4c0c-b9aa-142739f2fbe0\") " Mar 09 14:08:56 crc kubenswrapper[4722]: I0309 14:08:56.990101 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0f74bde-752e-497e-ad82-ec7a1676bbd5-catalog-content\") pod \"c0f74bde-752e-497e-ad82-ec7a1676bbd5\" (UID: \"c0f74bde-752e-497e-ad82-ec7a1676bbd5\") " Mar 09 14:08:56 crc kubenswrapper[4722]: I0309 14:08:56.990132 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65e3d647-8806-4c0c-b9aa-142739f2fbe0-utilities\") pod \"65e3d647-8806-4c0c-b9aa-142739f2fbe0\" (UID: \"65e3d647-8806-4c0c-b9aa-142739f2fbe0\") " Mar 09 14:08:56 crc kubenswrapper[4722]: I0309 14:08:56.991102 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0f74bde-752e-497e-ad82-ec7a1676bbd5-utilities" (OuterVolumeSpecName: "utilities") pod "c0f74bde-752e-497e-ad82-ec7a1676bbd5" (UID: "c0f74bde-752e-497e-ad82-ec7a1676bbd5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:08:56 crc kubenswrapper[4722]: I0309 14:08:56.995074 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65e3d647-8806-4c0c-b9aa-142739f2fbe0-utilities" (OuterVolumeSpecName: "utilities") pod "65e3d647-8806-4c0c-b9aa-142739f2fbe0" (UID: "65e3d647-8806-4c0c-b9aa-142739f2fbe0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.000602 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0f74bde-752e-497e-ad82-ec7a1676bbd5-kube-api-access-9t5bt" (OuterVolumeSpecName: "kube-api-access-9t5bt") pod "c0f74bde-752e-497e-ad82-ec7a1676bbd5" (UID: "c0f74bde-752e-497e-ad82-ec7a1676bbd5"). InnerVolumeSpecName "kube-api-access-9t5bt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.004489 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65e3d647-8806-4c0c-b9aa-142739f2fbe0-kube-api-access-2kq5r" (OuterVolumeSpecName: "kube-api-access-2kq5r") pod "65e3d647-8806-4c0c-b9aa-142739f2fbe0" (UID: "65e3d647-8806-4c0c-b9aa-142739f2fbe0"). InnerVolumeSpecName "kube-api-access-2kq5r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.062146 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0f74bde-752e-497e-ad82-ec7a1676bbd5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c0f74bde-752e-497e-ad82-ec7a1676bbd5" (UID: "c0f74bde-752e-497e-ad82-ec7a1676bbd5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.073920 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65e3d647-8806-4c0c-b9aa-142739f2fbe0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "65e3d647-8806-4c0c-b9aa-142739f2fbe0" (UID: "65e3d647-8806-4c0c-b9aa-142739f2fbe0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.091141 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfd6e90e-4eeb-4372-8465-136a383e95b2-catalog-content\") pod \"cfd6e90e-4eeb-4372-8465-136a383e95b2\" (UID: \"cfd6e90e-4eeb-4372-8465-136a383e95b2\") " Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.091219 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4jdd\" (UniqueName: \"kubernetes.io/projected/e2641b0e-aae4-49df-931f-95e38505812f-kube-api-access-p4jdd\") pod \"e2641b0e-aae4-49df-931f-95e38505812f\" (UID: \"e2641b0e-aae4-49df-931f-95e38505812f\") " Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.091254 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfd6e90e-4eeb-4372-8465-136a383e95b2-utilities\") pod \"cfd6e90e-4eeb-4372-8465-136a383e95b2\" (UID: \"cfd6e90e-4eeb-4372-8465-136a383e95b2\") " Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.091276 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e2641b0e-aae4-49df-931f-95e38505812f-marketplace-trusted-ca\") pod \"e2641b0e-aae4-49df-931f-95e38505812f\" (UID: \"e2641b0e-aae4-49df-931f-95e38505812f\") " Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.091315 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nnmzq\" (UniqueName: \"kubernetes.io/projected/7df01eab-424f-40b1-a40c-03b930a8fac6-kube-api-access-nnmzq\") pod \"7df01eab-424f-40b1-a40c-03b930a8fac6\" (UID: \"7df01eab-424f-40b1-a40c-03b930a8fac6\") " Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.091360 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7df01eab-424f-40b1-a40c-03b930a8fac6-utilities\") pod \"7df01eab-424f-40b1-a40c-03b930a8fac6\" (UID: 
\"7df01eab-424f-40b1-a40c-03b930a8fac6\") " Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.091381 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e2641b0e-aae4-49df-931f-95e38505812f-marketplace-operator-metrics\") pod \"e2641b0e-aae4-49df-931f-95e38505812f\" (UID: \"e2641b0e-aae4-49df-931f-95e38505812f\") " Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.091402 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cm6f7\" (UniqueName: \"kubernetes.io/projected/cfd6e90e-4eeb-4372-8465-136a383e95b2-kube-api-access-cm6f7\") pod \"cfd6e90e-4eeb-4372-8465-136a383e95b2\" (UID: \"cfd6e90e-4eeb-4372-8465-136a383e95b2\") " Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.091467 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7df01eab-424f-40b1-a40c-03b930a8fac6-catalog-content\") pod \"7df01eab-424f-40b1-a40c-03b930a8fac6\" (UID: \"7df01eab-424f-40b1-a40c-03b930a8fac6\") " Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.091700 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c0f74bde-752e-497e-ad82-ec7a1676bbd5-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.091717 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9t5bt\" (UniqueName: \"kubernetes.io/projected/c0f74bde-752e-497e-ad82-ec7a1676bbd5-kube-api-access-9t5bt\") on node \"crc\" DevicePath \"\"" Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.091729 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65e3d647-8806-4c0c-b9aa-142739f2fbe0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.091737 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2kq5r\" (UniqueName: \"kubernetes.io/projected/65e3d647-8806-4c0c-b9aa-142739f2fbe0-kube-api-access-2kq5r\") on node \"crc\" DevicePath \"\"" Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.091746 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c0f74bde-752e-497e-ad82-ec7a1676bbd5-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.091754 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65e3d647-8806-4c0c-b9aa-142739f2fbe0-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.094321 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7df01eab-424f-40b1-a40c-03b930a8fac6-utilities" (OuterVolumeSpecName: "utilities") pod "7df01eab-424f-40b1-a40c-03b930a8fac6" (UID: "7df01eab-424f-40b1-a40c-03b930a8fac6"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.094888 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2641b0e-aae4-49df-931f-95e38505812f-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "e2641b0e-aae4-49df-931f-95e38505812f" (UID: "e2641b0e-aae4-49df-931f-95e38505812f"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.095864 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7df01eab-424f-40b1-a40c-03b930a8fac6-kube-api-access-nnmzq" (OuterVolumeSpecName: "kube-api-access-nnmzq") pod "7df01eab-424f-40b1-a40c-03b930a8fac6" (UID: "7df01eab-424f-40b1-a40c-03b930a8fac6"). InnerVolumeSpecName "kube-api-access-nnmzq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.096017 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfd6e90e-4eeb-4372-8465-136a383e95b2-kube-api-access-cm6f7" (OuterVolumeSpecName: "kube-api-access-cm6f7") pod "cfd6e90e-4eeb-4372-8465-136a383e95b2" (UID: "cfd6e90e-4eeb-4372-8465-136a383e95b2"). InnerVolumeSpecName "kube-api-access-cm6f7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.096400 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2641b0e-aae4-49df-931f-95e38505812f-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "e2641b0e-aae4-49df-931f-95e38505812f" (UID: "e2641b0e-aae4-49df-931f-95e38505812f"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.096434 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfd6e90e-4eeb-4372-8465-136a383e95b2-utilities" (OuterVolumeSpecName: "utilities") pod "cfd6e90e-4eeb-4372-8465-136a383e95b2" (UID: "cfd6e90e-4eeb-4372-8465-136a383e95b2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.096552 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2641b0e-aae4-49df-931f-95e38505812f-kube-api-access-p4jdd" (OuterVolumeSpecName: "kube-api-access-p4jdd") pod "e2641b0e-aae4-49df-931f-95e38505812f" (UID: "e2641b0e-aae4-49df-931f-95e38505812f"). InnerVolumeSpecName "kube-api-access-p4jdd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.120393 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfd6e90e-4eeb-4372-8465-136a383e95b2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cfd6e90e-4eeb-4372-8465-136a383e95b2" (UID: "cfd6e90e-4eeb-4372-8465-136a383e95b2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.192902 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nnmzq\" (UniqueName: \"kubernetes.io/projected/7df01eab-424f-40b1-a40c-03b930a8fac6-kube-api-access-nnmzq\") on node \"crc\" DevicePath \"\"" Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.192944 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7df01eab-424f-40b1-a40c-03b930a8fac6-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.192957 4722 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e2641b0e-aae4-49df-931f-95e38505812f-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.192968 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cm6f7\" (UniqueName: \"kubernetes.io/projected/cfd6e90e-4eeb-4372-8465-136a383e95b2-kube-api-access-cm6f7\") on node \"crc\" DevicePath \"\"" Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.192978 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfd6e90e-4eeb-4372-8465-136a383e95b2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.192988 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4jdd\" (UniqueName: \"kubernetes.io/projected/e2641b0e-aae4-49df-931f-95e38505812f-kube-api-access-p4jdd\") on node \"crc\" DevicePath \"\"" Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.192997 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfd6e90e-4eeb-4372-8465-136a383e95b2-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.193007 4722 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e2641b0e-aae4-49df-931f-95e38505812f-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.216842 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7df01eab-424f-40b1-a40c-03b930a8fac6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7df01eab-424f-40b1-a40c-03b930a8fac6" (UID: "7df01eab-424f-40b1-a40c-03b930a8fac6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.265029 4722 generic.go:334] "Generic (PLEG): container finished" podID="cfd6e90e-4eeb-4372-8465-136a383e95b2" containerID="78f5a393860e42b0934375b2f004056d71f44af329e9dd8bdc283424c1580305" exitCode=0 Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.265121 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rdkdt" event={"ID":"cfd6e90e-4eeb-4372-8465-136a383e95b2","Type":"ContainerDied","Data":"78f5a393860e42b0934375b2f004056d71f44af329e9dd8bdc283424c1580305"} Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.265159 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rdkdt" event={"ID":"cfd6e90e-4eeb-4372-8465-136a383e95b2","Type":"ContainerDied","Data":"fee2ac0315c036e2f0afaa1ffa5796c1dd4f70c39dedef5326921170502110c1"} Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.265158 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rdkdt" Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.265178 4722 scope.go:117] "RemoveContainer" containerID="78f5a393860e42b0934375b2f004056d71f44af329e9dd8bdc283424c1580305" Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.266997 4722 generic.go:334] "Generic (PLEG): container finished" podID="e2641b0e-aae4-49df-931f-95e38505812f" containerID="64d164d2bc8fe4385d91a3b84212f0498b57af3e2a5c68489f924b444efff3c6" exitCode=0 Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.267054 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-wz9f2" event={"ID":"e2641b0e-aae4-49df-931f-95e38505812f","Type":"ContainerDied","Data":"64d164d2bc8fe4385d91a3b84212f0498b57af3e2a5c68489f924b444efff3c6"} Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.267077 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-wz9f2" event={"ID":"e2641b0e-aae4-49df-931f-95e38505812f","Type":"ContainerDied","Data":"34fca4843b9f5fee1507b016600c3bebb74b2540b7e1e14623371c52a8a5cf23"} Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.267143 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-wz9f2" Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.271581 4722 generic.go:334] "Generic (PLEG): container finished" podID="7df01eab-424f-40b1-a40c-03b930a8fac6" containerID="75288ffd906d8bf407d2dcb493b2b28933f498f7cd819bb126299f76207c3ac7" exitCode=0 Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.271635 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c2g4c" event={"ID":"7df01eab-424f-40b1-a40c-03b930a8fac6","Type":"ContainerDied","Data":"75288ffd906d8bf407d2dcb493b2b28933f498f7cd819bb126299f76207c3ac7"} Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.271660 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c2g4c" event={"ID":"7df01eab-424f-40b1-a40c-03b930a8fac6","Type":"ContainerDied","Data":"1efad77f097ccbcc1966a4d440a83442acdf700b6e204ef056da07356327c341"} Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.271724 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-c2g4c" Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.282149 4722 generic.go:334] "Generic (PLEG): container finished" podID="65e3d647-8806-4c0c-b9aa-142739f2fbe0" containerID="cf480bc5c8bb7328b096209debbb6ae3f3c95254a69760a97c4aaaab34c7f253" exitCode=0 Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.282845 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w95cf" event={"ID":"65e3d647-8806-4c0c-b9aa-142739f2fbe0","Type":"ContainerDied","Data":"cf480bc5c8bb7328b096209debbb6ae3f3c95254a69760a97c4aaaab34c7f253"} Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.282910 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w95cf" event={"ID":"65e3d647-8806-4c0c-b9aa-142739f2fbe0","Type":"ContainerDied","Data":"5a8078b43ae67f49168afccf4629ab4efb623d4c7a3f8161b3074dee44703f2a"} Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.284307 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w95cf" Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.284468 4722 scope.go:117] "RemoveContainer" containerID="c2805a424b564415cca37b53a3bc23bada46f95aafd332061d8ba671858b7e21" Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.291888 4722 generic.go:334] "Generic (PLEG): container finished" podID="c0f74bde-752e-497e-ad82-ec7a1676bbd5" containerID="aa3e4f5b5b38ee676d8bc64dbcfe4f208f4ba83831c09629dbc541729ab9ea5c" exitCode=0 Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.291933 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pwxnx" event={"ID":"c0f74bde-752e-497e-ad82-ec7a1676bbd5","Type":"ContainerDied","Data":"aa3e4f5b5b38ee676d8bc64dbcfe4f208f4ba83831c09629dbc541729ab9ea5c"} Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.291967 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pwxnx" event={"ID":"c0f74bde-752e-497e-ad82-ec7a1676bbd5","Type":"ContainerDied","Data":"0fc91f10b5f35318045918b18351a0167656d308f21ff2e0f29b30aeee849892"} Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.292058 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pwxnx" Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.295251 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7df01eab-424f-40b1-a40c-03b930a8fac6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.299105 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-svdsk"] Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.315693 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-c2g4c"] Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.319500 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-c2g4c"] Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.326763 4722 scope.go:117] "RemoveContainer" containerID="c83797e6984a294a0485ee468a0577500f0b718e4e06020eefcaa5eff63fa0f9" Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.330884 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rdkdt"] Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.334906 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rdkdt"] Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.348151 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pwxnx"] Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.354568 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-pwxnx"] Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.359754 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-w95cf"] Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.366004 4722 scope.go:117] "RemoveContainer" containerID="78f5a393860e42b0934375b2f004056d71f44af329e9dd8bdc283424c1580305" Mar 09 14:08:57 crc kubenswrapper[4722]: E0309 14:08:57.366853 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78f5a393860e42b0934375b2f004056d71f44af329e9dd8bdc283424c1580305\": container with ID starting with 78f5a393860e42b0934375b2f004056d71f44af329e9dd8bdc283424c1580305 not found: ID does not exist" containerID="78f5a393860e42b0934375b2f004056d71f44af329e9dd8bdc283424c1580305" Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.366913 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78f5a393860e42b0934375b2f004056d71f44af329e9dd8bdc283424c1580305"} err="failed to get container status \"78f5a393860e42b0934375b2f004056d71f44af329e9dd8bdc283424c1580305\": rpc error: code = NotFound desc = could not find container \"78f5a393860e42b0934375b2f004056d71f44af329e9dd8bdc283424c1580305\": container with ID starting with 78f5a393860e42b0934375b2f004056d71f44af329e9dd8bdc283424c1580305 not found: ID does not exist" Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.366951 4722 scope.go:117] "RemoveContainer" containerID="c2805a424b564415cca37b53a3bc23bada46f95aafd332061d8ba671858b7e21" Mar 09 14:08:57 crc kubenswrapper[4722]: E0309 14:08:57.370667 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"c2805a424b564415cca37b53a3bc23bada46f95aafd332061d8ba671858b7e21\": container with ID starting with c2805a424b564415cca37b53a3bc23bada46f95aafd332061d8ba671858b7e21 not found: ID does not exist" containerID="c2805a424b564415cca37b53a3bc23bada46f95aafd332061d8ba671858b7e21" Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.370735 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2805a424b564415cca37b53a3bc23bada46f95aafd332061d8ba671858b7e21"} err="failed to get container status \"c2805a424b564415cca37b53a3bc23bada46f95aafd332061d8ba671858b7e21\": rpc error: code = NotFound desc = could not find container \"c2805a424b564415cca37b53a3bc23bada46f95aafd332061d8ba671858b7e21\": container with ID starting with c2805a424b564415cca37b53a3bc23bada46f95aafd332061d8ba671858b7e21 not found: ID does not exist" Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.370781 4722 scope.go:117] "RemoveContainer" containerID="c83797e6984a294a0485ee468a0577500f0b718e4e06020eefcaa5eff63fa0f9" Mar 09 14:08:57 crc kubenswrapper[4722]: E0309 14:08:57.373951 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c83797e6984a294a0485ee468a0577500f0b718e4e06020eefcaa5eff63fa0f9\": container with ID starting with c83797e6984a294a0485ee468a0577500f0b718e4e06020eefcaa5eff63fa0f9 not found: ID does not exist" containerID="c83797e6984a294a0485ee468a0577500f0b718e4e06020eefcaa5eff63fa0f9" Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.374014 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c83797e6984a294a0485ee468a0577500f0b718e4e06020eefcaa5eff63fa0f9"} err="failed to get container status \"c83797e6984a294a0485ee468a0577500f0b718e4e06020eefcaa5eff63fa0f9\": rpc error: code = NotFound desc = could not find container \"c83797e6984a294a0485ee468a0577500f0b718e4e06020eefcaa5eff63fa0f9\": container with ID starting with c83797e6984a294a0485ee468a0577500f0b718e4e06020eefcaa5eff63fa0f9 not found: ID does not exist" Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.374045 4722 scope.go:117] "RemoveContainer" containerID="64d164d2bc8fe4385d91a3b84212f0498b57af3e2a5c68489f924b444efff3c6" Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.390334 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-w95cf"] Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.403682 4722 scope.go:117] "RemoveContainer" containerID="64d164d2bc8fe4385d91a3b84212f0498b57af3e2a5c68489f924b444efff3c6" Mar 09 14:08:57 crc kubenswrapper[4722]: E0309 14:08:57.404413 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64d164d2bc8fe4385d91a3b84212f0498b57af3e2a5c68489f924b444efff3c6\": container with ID starting with 64d164d2bc8fe4385d91a3b84212f0498b57af3e2a5c68489f924b444efff3c6 not found: ID does not exist" containerID="64d164d2bc8fe4385d91a3b84212f0498b57af3e2a5c68489f924b444efff3c6" Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.404455 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64d164d2bc8fe4385d91a3b84212f0498b57af3e2a5c68489f924b444efff3c6"} err="failed to get container status \"64d164d2bc8fe4385d91a3b84212f0498b57af3e2a5c68489f924b444efff3c6\": rpc error: code = NotFound desc = could not find container 
\"64d164d2bc8fe4385d91a3b84212f0498b57af3e2a5c68489f924b444efff3c6\": container with ID starting with 64d164d2bc8fe4385d91a3b84212f0498b57af3e2a5c68489f924b444efff3c6 not found: ID does not exist" Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.404559 4722 scope.go:117] "RemoveContainer" containerID="75288ffd906d8bf407d2dcb493b2b28933f498f7cd819bb126299f76207c3ac7" Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.407361 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wz9f2"] Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.411730 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wz9f2"] Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.431452 4722 scope.go:117] "RemoveContainer" containerID="c4cf8c1a507fc5a3ab3987d6872bc9a199c343500986f97c7465d38446bb2443" Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.448572 4722 scope.go:117] "RemoveContainer" containerID="7c497a08089b3d2432fefa7cff8373039c142fd35a2f682c0dd090a00a79b0c7" Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.466822 4722 scope.go:117] "RemoveContainer" containerID="75288ffd906d8bf407d2dcb493b2b28933f498f7cd819bb126299f76207c3ac7" Mar 09 14:08:57 crc kubenswrapper[4722]: E0309 14:08:57.467331 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75288ffd906d8bf407d2dcb493b2b28933f498f7cd819bb126299f76207c3ac7\": container with ID starting with 75288ffd906d8bf407d2dcb493b2b28933f498f7cd819bb126299f76207c3ac7 not found: ID does not exist" containerID="75288ffd906d8bf407d2dcb493b2b28933f498f7cd819bb126299f76207c3ac7" Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.467379 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75288ffd906d8bf407d2dcb493b2b28933f498f7cd819bb126299f76207c3ac7"} err="failed to get container status \"75288ffd906d8bf407d2dcb493b2b28933f498f7cd819bb126299f76207c3ac7\": rpc error: code = NotFound desc = could not find container \"75288ffd906d8bf407d2dcb493b2b28933f498f7cd819bb126299f76207c3ac7\": container with ID starting with 75288ffd906d8bf407d2dcb493b2b28933f498f7cd819bb126299f76207c3ac7 not found: ID does not exist" Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.467412 4722 scope.go:117] "RemoveContainer" containerID="c4cf8c1a507fc5a3ab3987d6872bc9a199c343500986f97c7465d38446bb2443" Mar 09 14:08:57 crc kubenswrapper[4722]: E0309 14:08:57.467766 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4cf8c1a507fc5a3ab3987d6872bc9a199c343500986f97c7465d38446bb2443\": container with ID starting with c4cf8c1a507fc5a3ab3987d6872bc9a199c343500986f97c7465d38446bb2443 not found: ID does not exist" containerID="c4cf8c1a507fc5a3ab3987d6872bc9a199c343500986f97c7465d38446bb2443" Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.467793 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4cf8c1a507fc5a3ab3987d6872bc9a199c343500986f97c7465d38446bb2443"} err="failed to get container status \"c4cf8c1a507fc5a3ab3987d6872bc9a199c343500986f97c7465d38446bb2443\": rpc error: code = NotFound desc = could not find container \"c4cf8c1a507fc5a3ab3987d6872bc9a199c343500986f97c7465d38446bb2443\": container with ID starting with c4cf8c1a507fc5a3ab3987d6872bc9a199c343500986f97c7465d38446bb2443 not 
found: ID does not exist" Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.467815 4722 scope.go:117] "RemoveContainer" containerID="7c497a08089b3d2432fefa7cff8373039c142fd35a2f682c0dd090a00a79b0c7" Mar 09 14:08:57 crc kubenswrapper[4722]: E0309 14:08:57.468056 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c497a08089b3d2432fefa7cff8373039c142fd35a2f682c0dd090a00a79b0c7\": container with ID starting with 7c497a08089b3d2432fefa7cff8373039c142fd35a2f682c0dd090a00a79b0c7 not found: ID does not exist" containerID="7c497a08089b3d2432fefa7cff8373039c142fd35a2f682c0dd090a00a79b0c7" Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.468079 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c497a08089b3d2432fefa7cff8373039c142fd35a2f682c0dd090a00a79b0c7"} err="failed to get container status \"7c497a08089b3d2432fefa7cff8373039c142fd35a2f682c0dd090a00a79b0c7\": rpc error: code = NotFound desc = could not find container \"7c497a08089b3d2432fefa7cff8373039c142fd35a2f682c0dd090a00a79b0c7\": container with ID starting with 7c497a08089b3d2432fefa7cff8373039c142fd35a2f682c0dd090a00a79b0c7 not found: ID does not exist" Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.468094 4722 scope.go:117] "RemoveContainer" containerID="cf480bc5c8bb7328b096209debbb6ae3f3c95254a69760a97c4aaaab34c7f253" Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.481117 4722 scope.go:117] "RemoveContainer" containerID="eaa712b3811f346415682a71aa51b3e96298af53fd6b21033a95e8bb66783011" Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.495108 4722 scope.go:117] "RemoveContainer" containerID="aa46608cdb2f1782bf7194037f0cac40fdf8b5ac4aa277132cac8a987db67863" Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.508608 4722 scope.go:117] "RemoveContainer" containerID="cf480bc5c8bb7328b096209debbb6ae3f3c95254a69760a97c4aaaab34c7f253" Mar 09 14:08:57 crc kubenswrapper[4722]: E0309 14:08:57.509131 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf480bc5c8bb7328b096209debbb6ae3f3c95254a69760a97c4aaaab34c7f253\": container with ID starting with cf480bc5c8bb7328b096209debbb6ae3f3c95254a69760a97c4aaaab34c7f253 not found: ID does not exist" containerID="cf480bc5c8bb7328b096209debbb6ae3f3c95254a69760a97c4aaaab34c7f253" Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.509168 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf480bc5c8bb7328b096209debbb6ae3f3c95254a69760a97c4aaaab34c7f253"} err="failed to get container status \"cf480bc5c8bb7328b096209debbb6ae3f3c95254a69760a97c4aaaab34c7f253\": rpc error: code = NotFound desc = could not find container \"cf480bc5c8bb7328b096209debbb6ae3f3c95254a69760a97c4aaaab34c7f253\": container with ID starting with cf480bc5c8bb7328b096209debbb6ae3f3c95254a69760a97c4aaaab34c7f253 not found: ID does not exist" Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.509192 4722 scope.go:117] "RemoveContainer" containerID="eaa712b3811f346415682a71aa51b3e96298af53fd6b21033a95e8bb66783011" Mar 09 14:08:57 crc kubenswrapper[4722]: E0309 14:08:57.509818 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eaa712b3811f346415682a71aa51b3e96298af53fd6b21033a95e8bb66783011\": container with ID starting with 
eaa712b3811f346415682a71aa51b3e96298af53fd6b21033a95e8bb66783011 not found: ID does not exist" containerID="eaa712b3811f346415682a71aa51b3e96298af53fd6b21033a95e8bb66783011" Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.509863 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eaa712b3811f346415682a71aa51b3e96298af53fd6b21033a95e8bb66783011"} err="failed to get container status \"eaa712b3811f346415682a71aa51b3e96298af53fd6b21033a95e8bb66783011\": rpc error: code = NotFound desc = could not find container \"eaa712b3811f346415682a71aa51b3e96298af53fd6b21033a95e8bb66783011\": container with ID starting with eaa712b3811f346415682a71aa51b3e96298af53fd6b21033a95e8bb66783011 not found: ID does not exist" Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.509877 4722 scope.go:117] "RemoveContainer" containerID="aa46608cdb2f1782bf7194037f0cac40fdf8b5ac4aa277132cac8a987db67863" Mar 09 14:08:57 crc kubenswrapper[4722]: E0309 14:08:57.510264 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa46608cdb2f1782bf7194037f0cac40fdf8b5ac4aa277132cac8a987db67863\": container with ID starting with aa46608cdb2f1782bf7194037f0cac40fdf8b5ac4aa277132cac8a987db67863 not found: ID does not exist" containerID="aa46608cdb2f1782bf7194037f0cac40fdf8b5ac4aa277132cac8a987db67863" Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.510325 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa46608cdb2f1782bf7194037f0cac40fdf8b5ac4aa277132cac8a987db67863"} err="failed to get container status \"aa46608cdb2f1782bf7194037f0cac40fdf8b5ac4aa277132cac8a987db67863\": rpc error: code = NotFound desc = could not find container \"aa46608cdb2f1782bf7194037f0cac40fdf8b5ac4aa277132cac8a987db67863\": container with ID starting with aa46608cdb2f1782bf7194037f0cac40fdf8b5ac4aa277132cac8a987db67863 not found: ID does not exist" Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.510367 4722 scope.go:117] "RemoveContainer" containerID="aa3e4f5b5b38ee676d8bc64dbcfe4f208f4ba83831c09629dbc541729ab9ea5c" Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.559148 4722 scope.go:117] "RemoveContainer" containerID="9fa19129974de9178fcafcea62deb4f00fce3b4a22230d33c0edc0c58e099fb8" Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.576139 4722 scope.go:117] "RemoveContainer" containerID="6f692c9353de894c8fa94990493a655c30b8374194f335a9126c69e3fb7b0f02" Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.591974 4722 scope.go:117] "RemoveContainer" containerID="aa3e4f5b5b38ee676d8bc64dbcfe4f208f4ba83831c09629dbc541729ab9ea5c" Mar 09 14:08:57 crc kubenswrapper[4722]: E0309 14:08:57.592628 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa3e4f5b5b38ee676d8bc64dbcfe4f208f4ba83831c09629dbc541729ab9ea5c\": container with ID starting with aa3e4f5b5b38ee676d8bc64dbcfe4f208f4ba83831c09629dbc541729ab9ea5c not found: ID does not exist" containerID="aa3e4f5b5b38ee676d8bc64dbcfe4f208f4ba83831c09629dbc541729ab9ea5c" Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.592684 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa3e4f5b5b38ee676d8bc64dbcfe4f208f4ba83831c09629dbc541729ab9ea5c"} err="failed to get container status \"aa3e4f5b5b38ee676d8bc64dbcfe4f208f4ba83831c09629dbc541729ab9ea5c\": rpc error: code = NotFound desc 
= could not find container \"aa3e4f5b5b38ee676d8bc64dbcfe4f208f4ba83831c09629dbc541729ab9ea5c\": container with ID starting with aa3e4f5b5b38ee676d8bc64dbcfe4f208f4ba83831c09629dbc541729ab9ea5c not found: ID does not exist" Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.592719 4722 scope.go:117] "RemoveContainer" containerID="9fa19129974de9178fcafcea62deb4f00fce3b4a22230d33c0edc0c58e099fb8" Mar 09 14:08:57 crc kubenswrapper[4722]: E0309 14:08:57.593094 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9fa19129974de9178fcafcea62deb4f00fce3b4a22230d33c0edc0c58e099fb8\": container with ID starting with 9fa19129974de9178fcafcea62deb4f00fce3b4a22230d33c0edc0c58e099fb8 not found: ID does not exist" containerID="9fa19129974de9178fcafcea62deb4f00fce3b4a22230d33c0edc0c58e099fb8" Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.593116 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fa19129974de9178fcafcea62deb4f00fce3b4a22230d33c0edc0c58e099fb8"} err="failed to get container status \"9fa19129974de9178fcafcea62deb4f00fce3b4a22230d33c0edc0c58e099fb8\": rpc error: code = NotFound desc = could not find container \"9fa19129974de9178fcafcea62deb4f00fce3b4a22230d33c0edc0c58e099fb8\": container with ID starting with 9fa19129974de9178fcafcea62deb4f00fce3b4a22230d33c0edc0c58e099fb8 not found: ID does not exist" Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.593129 4722 scope.go:117] "RemoveContainer" containerID="6f692c9353de894c8fa94990493a655c30b8374194f335a9126c69e3fb7b0f02" Mar 09 14:08:57 crc kubenswrapper[4722]: E0309 14:08:57.593496 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f692c9353de894c8fa94990493a655c30b8374194f335a9126c69e3fb7b0f02\": container with ID starting with 6f692c9353de894c8fa94990493a655c30b8374194f335a9126c69e3fb7b0f02 not found: ID does not exist" containerID="6f692c9353de894c8fa94990493a655c30b8374194f335a9126c69e3fb7b0f02" Mar 09 14:08:57 crc kubenswrapper[4722]: I0309 14:08:57.593542 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f692c9353de894c8fa94990493a655c30b8374194f335a9126c69e3fb7b0f02"} err="failed to get container status \"6f692c9353de894c8fa94990493a655c30b8374194f335a9126c69e3fb7b0f02\": rpc error: code = NotFound desc = could not find container \"6f692c9353de894c8fa94990493a655c30b8374194f335a9126c69e3fb7b0f02\": container with ID starting with 6f692c9353de894c8fa94990493a655c30b8374194f335a9126c69e3fb7b0f02 not found: ID does not exist" Mar 09 14:08:58 crc kubenswrapper[4722]: I0309 14:08:58.156853 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65e3d647-8806-4c0c-b9aa-142739f2fbe0" path="/var/lib/kubelet/pods/65e3d647-8806-4c0c-b9aa-142739f2fbe0/volumes" Mar 09 14:08:58 crc kubenswrapper[4722]: I0309 14:08:58.157678 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7df01eab-424f-40b1-a40c-03b930a8fac6" path="/var/lib/kubelet/pods/7df01eab-424f-40b1-a40c-03b930a8fac6/volumes" Mar 09 14:08:58 crc kubenswrapper[4722]: I0309 14:08:58.158497 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0f74bde-752e-497e-ad82-ec7a1676bbd5" path="/var/lib/kubelet/pods/c0f74bde-752e-497e-ad82-ec7a1676bbd5/volumes" Mar 09 14:08:58 crc kubenswrapper[4722]: I0309 14:08:58.160026 4722 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="cfd6e90e-4eeb-4372-8465-136a383e95b2" path="/var/lib/kubelet/pods/cfd6e90e-4eeb-4372-8465-136a383e95b2/volumes" Mar 09 14:08:58 crc kubenswrapper[4722]: I0309 14:08:58.160684 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2641b0e-aae4-49df-931f-95e38505812f" path="/var/lib/kubelet/pods/e2641b0e-aae4-49df-931f-95e38505812f/volumes" Mar 09 14:08:58 crc kubenswrapper[4722]: I0309 14:08:58.302546 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-svdsk" event={"ID":"ea964ea5-3fad-4bd0-8ffe-d78f00229fbe","Type":"ContainerStarted","Data":"0e7bcd13d38603f59893c735e0d0150697877082d578b1fa9c28265c8006dd62"} Mar 09 14:08:58 crc kubenswrapper[4722]: I0309 14:08:58.302626 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-svdsk" Mar 09 14:08:58 crc kubenswrapper[4722]: I0309 14:08:58.302637 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-svdsk" event={"ID":"ea964ea5-3fad-4bd0-8ffe-d78f00229fbe","Type":"ContainerStarted","Data":"d6fa7820afedf760f613e6d3bab1a492efe8f62d10116e3526bed53ff600aaa2"} Mar 09 14:08:58 crc kubenswrapper[4722]: I0309 14:08:58.304829 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-svdsk" Mar 09 14:08:58 crc kubenswrapper[4722]: I0309 14:08:58.326282 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-svdsk" podStartSLOduration=2.326260115 podStartE2EDuration="2.326260115s" podCreationTimestamp="2026-03-09 14:08:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:08:58.319875787 +0000 UTC m=+378.875444363" watchObservedRunningTime="2026-03-09 14:08:58.326260115 +0000 UTC m=+378.881828691" Mar 09 14:09:24 crc kubenswrapper[4722]: I0309 14:09:24.067663 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-c2mmw"] Mar 09 14:09:24 crc kubenswrapper[4722]: E0309 14:09:24.068841 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2641b0e-aae4-49df-931f-95e38505812f" containerName="marketplace-operator" Mar 09 14:09:24 crc kubenswrapper[4722]: I0309 14:09:24.068865 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2641b0e-aae4-49df-931f-95e38505812f" containerName="marketplace-operator" Mar 09 14:09:24 crc kubenswrapper[4722]: E0309 14:09:24.068890 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfd6e90e-4eeb-4372-8465-136a383e95b2" containerName="extract-utilities" Mar 09 14:09:24 crc kubenswrapper[4722]: I0309 14:09:24.068902 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfd6e90e-4eeb-4372-8465-136a383e95b2" containerName="extract-utilities" Mar 09 14:09:24 crc kubenswrapper[4722]: E0309 14:09:24.068923 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7df01eab-424f-40b1-a40c-03b930a8fac6" containerName="registry-server" Mar 09 14:09:24 crc kubenswrapper[4722]: I0309 14:09:24.068936 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="7df01eab-424f-40b1-a40c-03b930a8fac6" containerName="registry-server" Mar 09 14:09:24 crc kubenswrapper[4722]: E0309 14:09:24.068956 4722 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c0f74bde-752e-497e-ad82-ec7a1676bbd5" containerName="extract-content" Mar 09 14:09:24 crc kubenswrapper[4722]: I0309 14:09:24.068969 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0f74bde-752e-497e-ad82-ec7a1676bbd5" containerName="extract-content" Mar 09 14:09:24 crc kubenswrapper[4722]: E0309 14:09:24.068992 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7df01eab-424f-40b1-a40c-03b930a8fac6" containerName="extract-content" Mar 09 14:09:24 crc kubenswrapper[4722]: I0309 14:09:24.069004 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="7df01eab-424f-40b1-a40c-03b930a8fac6" containerName="extract-content" Mar 09 14:09:24 crc kubenswrapper[4722]: E0309 14:09:24.069024 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65e3d647-8806-4c0c-b9aa-142739f2fbe0" containerName="extract-content" Mar 09 14:09:24 crc kubenswrapper[4722]: I0309 14:09:24.069036 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="65e3d647-8806-4c0c-b9aa-142739f2fbe0" containerName="extract-content" Mar 09 14:09:24 crc kubenswrapper[4722]: E0309 14:09:24.069051 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65e3d647-8806-4c0c-b9aa-142739f2fbe0" containerName="extract-utilities" Mar 09 14:09:24 crc kubenswrapper[4722]: I0309 14:09:24.069063 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="65e3d647-8806-4c0c-b9aa-142739f2fbe0" containerName="extract-utilities" Mar 09 14:09:24 crc kubenswrapper[4722]: E0309 14:09:24.069081 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfd6e90e-4eeb-4372-8465-136a383e95b2" containerName="registry-server" Mar 09 14:09:24 crc kubenswrapper[4722]: I0309 14:09:24.069093 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfd6e90e-4eeb-4372-8465-136a383e95b2" containerName="registry-server" Mar 09 14:09:24 crc kubenswrapper[4722]: E0309 14:09:24.069115 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0f74bde-752e-497e-ad82-ec7a1676bbd5" containerName="registry-server" Mar 09 14:09:24 crc kubenswrapper[4722]: I0309 14:09:24.069127 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0f74bde-752e-497e-ad82-ec7a1676bbd5" containerName="registry-server" Mar 09 14:09:24 crc kubenswrapper[4722]: E0309 14:09:24.069147 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7df01eab-424f-40b1-a40c-03b930a8fac6" containerName="extract-utilities" Mar 09 14:09:24 crc kubenswrapper[4722]: I0309 14:09:24.069159 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="7df01eab-424f-40b1-a40c-03b930a8fac6" containerName="extract-utilities" Mar 09 14:09:24 crc kubenswrapper[4722]: E0309 14:09:24.069181 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfd6e90e-4eeb-4372-8465-136a383e95b2" containerName="extract-content" Mar 09 14:09:24 crc kubenswrapper[4722]: I0309 14:09:24.069193 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfd6e90e-4eeb-4372-8465-136a383e95b2" containerName="extract-content" Mar 09 14:09:24 crc kubenswrapper[4722]: E0309 14:09:24.069240 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65e3d647-8806-4c0c-b9aa-142739f2fbe0" containerName="registry-server" Mar 09 14:09:24 crc kubenswrapper[4722]: I0309 14:09:24.069253 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="65e3d647-8806-4c0c-b9aa-142739f2fbe0" containerName="registry-server" Mar 09 14:09:24 crc kubenswrapper[4722]: E0309 14:09:24.069272 4722 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="c0f74bde-752e-497e-ad82-ec7a1676bbd5" containerName="extract-utilities" Mar 09 14:09:24 crc kubenswrapper[4722]: I0309 14:09:24.069284 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0f74bde-752e-497e-ad82-ec7a1676bbd5" containerName="extract-utilities" Mar 09 14:09:24 crc kubenswrapper[4722]: I0309 14:09:24.069467 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="7df01eab-424f-40b1-a40c-03b930a8fac6" containerName="registry-server" Mar 09 14:09:24 crc kubenswrapper[4722]: I0309 14:09:24.069484 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2641b0e-aae4-49df-931f-95e38505812f" containerName="marketplace-operator" Mar 09 14:09:24 crc kubenswrapper[4722]: I0309 14:09:24.069504 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfd6e90e-4eeb-4372-8465-136a383e95b2" containerName="registry-server" Mar 09 14:09:24 crc kubenswrapper[4722]: I0309 14:09:24.069519 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="65e3d647-8806-4c0c-b9aa-142739f2fbe0" containerName="registry-server" Mar 09 14:09:24 crc kubenswrapper[4722]: I0309 14:09:24.069533 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0f74bde-752e-497e-ad82-ec7a1676bbd5" containerName="registry-server" Mar 09 14:09:24 crc kubenswrapper[4722]: I0309 14:09:24.070850 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c2mmw" Mar 09 14:09:24 crc kubenswrapper[4722]: I0309 14:09:24.073164 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 09 14:09:24 crc kubenswrapper[4722]: I0309 14:09:24.090575 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-c2mmw"] Mar 09 14:09:24 crc kubenswrapper[4722]: I0309 14:09:24.185406 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/690e5ab0-3719-40ac-aba6-9278480ecb44-utilities\") pod \"redhat-marketplace-c2mmw\" (UID: \"690e5ab0-3719-40ac-aba6-9278480ecb44\") " pod="openshift-marketplace/redhat-marketplace-c2mmw" Mar 09 14:09:24 crc kubenswrapper[4722]: I0309 14:09:24.185476 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/690e5ab0-3719-40ac-aba6-9278480ecb44-catalog-content\") pod \"redhat-marketplace-c2mmw\" (UID: \"690e5ab0-3719-40ac-aba6-9278480ecb44\") " pod="openshift-marketplace/redhat-marketplace-c2mmw" Mar 09 14:09:24 crc kubenswrapper[4722]: I0309 14:09:24.185518 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-975wq\" (UniqueName: \"kubernetes.io/projected/690e5ab0-3719-40ac-aba6-9278480ecb44-kube-api-access-975wq\") pod \"redhat-marketplace-c2mmw\" (UID: \"690e5ab0-3719-40ac-aba6-9278480ecb44\") " pod="openshift-marketplace/redhat-marketplace-c2mmw" Mar 09 14:09:24 crc kubenswrapper[4722]: I0309 14:09:24.263312 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-v57f2"] Mar 09 14:09:24 crc kubenswrapper[4722]: I0309 14:09:24.265706 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-v57f2" Mar 09 14:09:24 crc kubenswrapper[4722]: I0309 14:09:24.268467 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 09 14:09:24 crc kubenswrapper[4722]: I0309 14:09:24.275829 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v57f2"] Mar 09 14:09:24 crc kubenswrapper[4722]: I0309 14:09:24.288181 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/690e5ab0-3719-40ac-aba6-9278480ecb44-utilities\") pod \"redhat-marketplace-c2mmw\" (UID: \"690e5ab0-3719-40ac-aba6-9278480ecb44\") " pod="openshift-marketplace/redhat-marketplace-c2mmw" Mar 09 14:09:24 crc kubenswrapper[4722]: I0309 14:09:24.288290 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/690e5ab0-3719-40ac-aba6-9278480ecb44-catalog-content\") pod \"redhat-marketplace-c2mmw\" (UID: \"690e5ab0-3719-40ac-aba6-9278480ecb44\") " pod="openshift-marketplace/redhat-marketplace-c2mmw" Mar 09 14:09:24 crc kubenswrapper[4722]: I0309 14:09:24.288326 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-975wq\" (UniqueName: \"kubernetes.io/projected/690e5ab0-3719-40ac-aba6-9278480ecb44-kube-api-access-975wq\") pod \"redhat-marketplace-c2mmw\" (UID: \"690e5ab0-3719-40ac-aba6-9278480ecb44\") " pod="openshift-marketplace/redhat-marketplace-c2mmw" Mar 09 14:09:24 crc kubenswrapper[4722]: I0309 14:09:24.288741 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/690e5ab0-3719-40ac-aba6-9278480ecb44-utilities\") pod \"redhat-marketplace-c2mmw\" (UID: \"690e5ab0-3719-40ac-aba6-9278480ecb44\") " pod="openshift-marketplace/redhat-marketplace-c2mmw" Mar 09 14:09:24 crc kubenswrapper[4722]: I0309 14:09:24.288792 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/690e5ab0-3719-40ac-aba6-9278480ecb44-catalog-content\") pod \"redhat-marketplace-c2mmw\" (UID: \"690e5ab0-3719-40ac-aba6-9278480ecb44\") " pod="openshift-marketplace/redhat-marketplace-c2mmw" Mar 09 14:09:24 crc kubenswrapper[4722]: I0309 14:09:24.316530 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-975wq\" (UniqueName: \"kubernetes.io/projected/690e5ab0-3719-40ac-aba6-9278480ecb44-kube-api-access-975wq\") pod \"redhat-marketplace-c2mmw\" (UID: \"690e5ab0-3719-40ac-aba6-9278480ecb44\") " pod="openshift-marketplace/redhat-marketplace-c2mmw" Mar 09 14:09:24 crc kubenswrapper[4722]: I0309 14:09:24.390104 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e27c5a4-8cba-4119-8006-f9841d6121dc-catalog-content\") pod \"redhat-operators-v57f2\" (UID: \"4e27c5a4-8cba-4119-8006-f9841d6121dc\") " pod="openshift-marketplace/redhat-operators-v57f2" Mar 09 14:09:24 crc kubenswrapper[4722]: I0309 14:09:24.390505 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e27c5a4-8cba-4119-8006-f9841d6121dc-utilities\") pod \"redhat-operators-v57f2\" (UID: \"4e27c5a4-8cba-4119-8006-f9841d6121dc\") " 
pod="openshift-marketplace/redhat-operators-v57f2" Mar 09 14:09:24 crc kubenswrapper[4722]: I0309 14:09:24.390723 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cknrf\" (UniqueName: \"kubernetes.io/projected/4e27c5a4-8cba-4119-8006-f9841d6121dc-kube-api-access-cknrf\") pod \"redhat-operators-v57f2\" (UID: \"4e27c5a4-8cba-4119-8006-f9841d6121dc\") " pod="openshift-marketplace/redhat-operators-v57f2" Mar 09 14:09:24 crc kubenswrapper[4722]: I0309 14:09:24.392783 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c2mmw" Mar 09 14:09:24 crc kubenswrapper[4722]: I0309 14:09:24.492166 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cknrf\" (UniqueName: \"kubernetes.io/projected/4e27c5a4-8cba-4119-8006-f9841d6121dc-kube-api-access-cknrf\") pod \"redhat-operators-v57f2\" (UID: \"4e27c5a4-8cba-4119-8006-f9841d6121dc\") " pod="openshift-marketplace/redhat-operators-v57f2" Mar 09 14:09:24 crc kubenswrapper[4722]: I0309 14:09:24.492718 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e27c5a4-8cba-4119-8006-f9841d6121dc-catalog-content\") pod \"redhat-operators-v57f2\" (UID: \"4e27c5a4-8cba-4119-8006-f9841d6121dc\") " pod="openshift-marketplace/redhat-operators-v57f2" Mar 09 14:09:24 crc kubenswrapper[4722]: I0309 14:09:24.492767 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e27c5a4-8cba-4119-8006-f9841d6121dc-utilities\") pod \"redhat-operators-v57f2\" (UID: \"4e27c5a4-8cba-4119-8006-f9841d6121dc\") " pod="openshift-marketplace/redhat-operators-v57f2" Mar 09 14:09:24 crc kubenswrapper[4722]: I0309 14:09:24.493497 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e27c5a4-8cba-4119-8006-f9841d6121dc-utilities\") pod \"redhat-operators-v57f2\" (UID: \"4e27c5a4-8cba-4119-8006-f9841d6121dc\") " pod="openshift-marketplace/redhat-operators-v57f2" Mar 09 14:09:24 crc kubenswrapper[4722]: I0309 14:09:24.493690 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e27c5a4-8cba-4119-8006-f9841d6121dc-catalog-content\") pod \"redhat-operators-v57f2\" (UID: \"4e27c5a4-8cba-4119-8006-f9841d6121dc\") " pod="openshift-marketplace/redhat-operators-v57f2" Mar 09 14:09:24 crc kubenswrapper[4722]: I0309 14:09:24.518886 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cknrf\" (UniqueName: \"kubernetes.io/projected/4e27c5a4-8cba-4119-8006-f9841d6121dc-kube-api-access-cknrf\") pod \"redhat-operators-v57f2\" (UID: \"4e27c5a4-8cba-4119-8006-f9841d6121dc\") " pod="openshift-marketplace/redhat-operators-v57f2" Mar 09 14:09:24 crc kubenswrapper[4722]: I0309 14:09:24.584264 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-v57f2" Mar 09 14:09:24 crc kubenswrapper[4722]: I0309 14:09:24.828755 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-c2mmw"] Mar 09 14:09:25 crc kubenswrapper[4722]: I0309 14:09:25.058177 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v57f2"] Mar 09 14:09:25 crc kubenswrapper[4722]: W0309 14:09:25.110901 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e27c5a4_8cba_4119_8006_f9841d6121dc.slice/crio-9e5b010a95457e90bf0c347a2f0ad86e10f2124a2bb10cec97569437e8f92754 WatchSource:0}: Error finding container 9e5b010a95457e90bf0c347a2f0ad86e10f2124a2bb10cec97569437e8f92754: Status 404 returned error can't find the container with id 9e5b010a95457e90bf0c347a2f0ad86e10f2124a2bb10cec97569437e8f92754 Mar 09 14:09:25 crc kubenswrapper[4722]: I0309 14:09:25.493529 4722 generic.go:334] "Generic (PLEG): container finished" podID="4e27c5a4-8cba-4119-8006-f9841d6121dc" containerID="cc57ce74db96aacab14437761cde5109350e616b6c6635159b59cfbd77998f4b" exitCode=0 Mar 09 14:09:25 crc kubenswrapper[4722]: I0309 14:09:25.493626 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v57f2" event={"ID":"4e27c5a4-8cba-4119-8006-f9841d6121dc","Type":"ContainerDied","Data":"cc57ce74db96aacab14437761cde5109350e616b6c6635159b59cfbd77998f4b"} Mar 09 14:09:25 crc kubenswrapper[4722]: I0309 14:09:25.493665 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v57f2" event={"ID":"4e27c5a4-8cba-4119-8006-f9841d6121dc","Type":"ContainerStarted","Data":"9e5b010a95457e90bf0c347a2f0ad86e10f2124a2bb10cec97569437e8f92754"} Mar 09 14:09:25 crc kubenswrapper[4722]: I0309 14:09:25.496254 4722 generic.go:334] "Generic (PLEG): container finished" podID="690e5ab0-3719-40ac-aba6-9278480ecb44" containerID="31dbfa1273afeeb00f519a43772b94f833df7317d4adf70b5ad9c23e7c42ca05" exitCode=0 Mar 09 14:09:25 crc kubenswrapper[4722]: I0309 14:09:25.496352 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c2mmw" event={"ID":"690e5ab0-3719-40ac-aba6-9278480ecb44","Type":"ContainerDied","Data":"31dbfa1273afeeb00f519a43772b94f833df7317d4adf70b5ad9c23e7c42ca05"} Mar 09 14:09:25 crc kubenswrapper[4722]: I0309 14:09:25.496418 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c2mmw" event={"ID":"690e5ab0-3719-40ac-aba6-9278480ecb44","Type":"ContainerStarted","Data":"3edd8706457377b85df22258f006793d71309c8160caf8bf101c065c7ebbc989"} Mar 09 14:09:26 crc kubenswrapper[4722]: I0309 14:09:26.471928 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wkdkx"] Mar 09 14:09:26 crc kubenswrapper[4722]: I0309 14:09:26.473830 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wkdkx" Mar 09 14:09:26 crc kubenswrapper[4722]: I0309 14:09:26.476404 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 09 14:09:26 crc kubenswrapper[4722]: I0309 14:09:26.484984 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wkdkx"] Mar 09 14:09:26 crc kubenswrapper[4722]: I0309 14:09:26.511177 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c2mmw" event={"ID":"690e5ab0-3719-40ac-aba6-9278480ecb44","Type":"ContainerStarted","Data":"f4d81cb13e30698c38667374a5afe605b7ec5b61fa2e1e38fadf1c7094c23f2d"} Mar 09 14:09:26 crc kubenswrapper[4722]: I0309 14:09:26.647404 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69a0d1f2-276f-4062-91d9-8af8048a8d8f-utilities\") pod \"certified-operators-wkdkx\" (UID: \"69a0d1f2-276f-4062-91d9-8af8048a8d8f\") " pod="openshift-marketplace/certified-operators-wkdkx" Mar 09 14:09:26 crc kubenswrapper[4722]: I0309 14:09:26.647470 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69a0d1f2-276f-4062-91d9-8af8048a8d8f-catalog-content\") pod \"certified-operators-wkdkx\" (UID: \"69a0d1f2-276f-4062-91d9-8af8048a8d8f\") " pod="openshift-marketplace/certified-operators-wkdkx" Mar 09 14:09:26 crc kubenswrapper[4722]: I0309 14:09:26.647510 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhg2v\" (UniqueName: \"kubernetes.io/projected/69a0d1f2-276f-4062-91d9-8af8048a8d8f-kube-api-access-dhg2v\") pod \"certified-operators-wkdkx\" (UID: \"69a0d1f2-276f-4062-91d9-8af8048a8d8f\") " pod="openshift-marketplace/certified-operators-wkdkx" Mar 09 14:09:26 crc kubenswrapper[4722]: I0309 14:09:26.660874 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hrrr6"] Mar 09 14:09:26 crc kubenswrapper[4722]: I0309 14:09:26.662085 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hrrr6" Mar 09 14:09:26 crc kubenswrapper[4722]: I0309 14:09:26.664816 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 09 14:09:26 crc kubenswrapper[4722]: I0309 14:09:26.673246 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hrrr6"] Mar 09 14:09:26 crc kubenswrapper[4722]: I0309 14:09:26.749281 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69a0d1f2-276f-4062-91d9-8af8048a8d8f-catalog-content\") pod \"certified-operators-wkdkx\" (UID: \"69a0d1f2-276f-4062-91d9-8af8048a8d8f\") " pod="openshift-marketplace/certified-operators-wkdkx" Mar 09 14:09:26 crc kubenswrapper[4722]: I0309 14:09:26.749830 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhg2v\" (UniqueName: \"kubernetes.io/projected/69a0d1f2-276f-4062-91d9-8af8048a8d8f-kube-api-access-dhg2v\") pod \"certified-operators-wkdkx\" (UID: \"69a0d1f2-276f-4062-91d9-8af8048a8d8f\") " pod="openshift-marketplace/certified-operators-wkdkx" Mar 09 14:09:26 crc kubenswrapper[4722]: I0309 14:09:26.749915 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69a0d1f2-276f-4062-91d9-8af8048a8d8f-utilities\") pod \"certified-operators-wkdkx\" (UID: \"69a0d1f2-276f-4062-91d9-8af8048a8d8f\") " pod="openshift-marketplace/certified-operators-wkdkx" Mar 09 14:09:26 crc kubenswrapper[4722]: I0309 14:09:26.749966 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69a0d1f2-276f-4062-91d9-8af8048a8d8f-catalog-content\") pod \"certified-operators-wkdkx\" (UID: \"69a0d1f2-276f-4062-91d9-8af8048a8d8f\") " pod="openshift-marketplace/certified-operators-wkdkx" Mar 09 14:09:26 crc kubenswrapper[4722]: I0309 14:09:26.750289 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69a0d1f2-276f-4062-91d9-8af8048a8d8f-utilities\") pod \"certified-operators-wkdkx\" (UID: \"69a0d1f2-276f-4062-91d9-8af8048a8d8f\") " pod="openshift-marketplace/certified-operators-wkdkx" Mar 09 14:09:26 crc kubenswrapper[4722]: I0309 14:09:26.781941 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhg2v\" (UniqueName: \"kubernetes.io/projected/69a0d1f2-276f-4062-91d9-8af8048a8d8f-kube-api-access-dhg2v\") pod \"certified-operators-wkdkx\" (UID: \"69a0d1f2-276f-4062-91d9-8af8048a8d8f\") " pod="openshift-marketplace/certified-operators-wkdkx" Mar 09 14:09:26 crc kubenswrapper[4722]: I0309 14:09:26.813662 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wkdkx" Mar 09 14:09:26 crc kubenswrapper[4722]: I0309 14:09:26.851701 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c85eb92e-6d30-4e52-9176-70140b518ce9-catalog-content\") pod \"community-operators-hrrr6\" (UID: \"c85eb92e-6d30-4e52-9176-70140b518ce9\") " pod="openshift-marketplace/community-operators-hrrr6" Mar 09 14:09:26 crc kubenswrapper[4722]: I0309 14:09:26.851800 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mnhk\" (UniqueName: \"kubernetes.io/projected/c85eb92e-6d30-4e52-9176-70140b518ce9-kube-api-access-4mnhk\") pod \"community-operators-hrrr6\" (UID: \"c85eb92e-6d30-4e52-9176-70140b518ce9\") " pod="openshift-marketplace/community-operators-hrrr6" Mar 09 14:09:26 crc kubenswrapper[4722]: I0309 14:09:26.851859 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c85eb92e-6d30-4e52-9176-70140b518ce9-utilities\") pod \"community-operators-hrrr6\" (UID: \"c85eb92e-6d30-4e52-9176-70140b518ce9\") " pod="openshift-marketplace/community-operators-hrrr6" Mar 09 14:09:26 crc kubenswrapper[4722]: I0309 14:09:26.952918 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mnhk\" (UniqueName: \"kubernetes.io/projected/c85eb92e-6d30-4e52-9176-70140b518ce9-kube-api-access-4mnhk\") pod \"community-operators-hrrr6\" (UID: \"c85eb92e-6d30-4e52-9176-70140b518ce9\") " pod="openshift-marketplace/community-operators-hrrr6" Mar 09 14:09:26 crc kubenswrapper[4722]: I0309 14:09:26.953021 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c85eb92e-6d30-4e52-9176-70140b518ce9-utilities\") pod \"community-operators-hrrr6\" (UID: \"c85eb92e-6d30-4e52-9176-70140b518ce9\") " pod="openshift-marketplace/community-operators-hrrr6" Mar 09 14:09:26 crc kubenswrapper[4722]: I0309 14:09:26.953068 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c85eb92e-6d30-4e52-9176-70140b518ce9-catalog-content\") pod \"community-operators-hrrr6\" (UID: \"c85eb92e-6d30-4e52-9176-70140b518ce9\") " pod="openshift-marketplace/community-operators-hrrr6" Mar 09 14:09:26 crc kubenswrapper[4722]: I0309 14:09:26.953646 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c85eb92e-6d30-4e52-9176-70140b518ce9-catalog-content\") pod \"community-operators-hrrr6\" (UID: \"c85eb92e-6d30-4e52-9176-70140b518ce9\") " pod="openshift-marketplace/community-operators-hrrr6" Mar 09 14:09:26 crc kubenswrapper[4722]: I0309 14:09:26.954571 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c85eb92e-6d30-4e52-9176-70140b518ce9-utilities\") pod \"community-operators-hrrr6\" (UID: \"c85eb92e-6d30-4e52-9176-70140b518ce9\") " pod="openshift-marketplace/community-operators-hrrr6" Mar 09 14:09:26 crc kubenswrapper[4722]: I0309 14:09:26.974120 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mnhk\" (UniqueName: \"kubernetes.io/projected/c85eb92e-6d30-4e52-9176-70140b518ce9-kube-api-access-4mnhk\") pod 
\"community-operators-hrrr6\" (UID: \"c85eb92e-6d30-4e52-9176-70140b518ce9\") " pod="openshift-marketplace/community-operators-hrrr6" Mar 09 14:09:26 crc kubenswrapper[4722]: I0309 14:09:26.979949 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hrrr6" Mar 09 14:09:27 crc kubenswrapper[4722]: I0309 14:09:27.239605 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wkdkx"] Mar 09 14:09:27 crc kubenswrapper[4722]: W0309 14:09:27.253457 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69a0d1f2_276f_4062_91d9_8af8048a8d8f.slice/crio-eeb3e5fceedb0bade4ee38f83fe3011b3eb4f41fdf4f7407939e975fb31e54e9 WatchSource:0}: Error finding container eeb3e5fceedb0bade4ee38f83fe3011b3eb4f41fdf4f7407939e975fb31e54e9: Status 404 returned error can't find the container with id eeb3e5fceedb0bade4ee38f83fe3011b3eb4f41fdf4f7407939e975fb31e54e9 Mar 09 14:09:27 crc kubenswrapper[4722]: I0309 14:09:27.416145 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hrrr6"] Mar 09 14:09:27 crc kubenswrapper[4722]: W0309 14:09:27.424139 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc85eb92e_6d30_4e52_9176_70140b518ce9.slice/crio-efb2586f3d856f1ee84b0d6379cd480c55766ad7bd456633efe29691c223582b WatchSource:0}: Error finding container efb2586f3d856f1ee84b0d6379cd480c55766ad7bd456633efe29691c223582b: Status 404 returned error can't find the container with id efb2586f3d856f1ee84b0d6379cd480c55766ad7bd456633efe29691c223582b Mar 09 14:09:27 crc kubenswrapper[4722]: I0309 14:09:27.529175 4722 generic.go:334] "Generic (PLEG): container finished" podID="4e27c5a4-8cba-4119-8006-f9841d6121dc" containerID="b5e1393e6b5f861b6ac4e6ad52e652c101f9c0f0ecf68b5e251a8253fc4c0a7e" exitCode=0 Mar 09 14:09:27 crc kubenswrapper[4722]: I0309 14:09:27.529742 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v57f2" event={"ID":"4e27c5a4-8cba-4119-8006-f9841d6121dc","Type":"ContainerDied","Data":"b5e1393e6b5f861b6ac4e6ad52e652c101f9c0f0ecf68b5e251a8253fc4c0a7e"} Mar 09 14:09:27 crc kubenswrapper[4722]: I0309 14:09:27.535587 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hrrr6" event={"ID":"c85eb92e-6d30-4e52-9176-70140b518ce9","Type":"ContainerStarted","Data":"efb2586f3d856f1ee84b0d6379cd480c55766ad7bd456633efe29691c223582b"} Mar 09 14:09:27 crc kubenswrapper[4722]: I0309 14:09:27.542143 4722 generic.go:334] "Generic (PLEG): container finished" podID="690e5ab0-3719-40ac-aba6-9278480ecb44" containerID="f4d81cb13e30698c38667374a5afe605b7ec5b61fa2e1e38fadf1c7094c23f2d" exitCode=0 Mar 09 14:09:27 crc kubenswrapper[4722]: I0309 14:09:27.542221 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c2mmw" event={"ID":"690e5ab0-3719-40ac-aba6-9278480ecb44","Type":"ContainerDied","Data":"f4d81cb13e30698c38667374a5afe605b7ec5b61fa2e1e38fadf1c7094c23f2d"} Mar 09 14:09:27 crc kubenswrapper[4722]: I0309 14:09:27.550833 4722 generic.go:334] "Generic (PLEG): container finished" podID="69a0d1f2-276f-4062-91d9-8af8048a8d8f" containerID="0efa5b8193b3f00deaad69970df1d953fdfa07ca482f8978a9ae4e2e437233c1" exitCode=0 Mar 09 14:09:27 crc kubenswrapper[4722]: I0309 14:09:27.550906 4722 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wkdkx" event={"ID":"69a0d1f2-276f-4062-91d9-8af8048a8d8f","Type":"ContainerDied","Data":"0efa5b8193b3f00deaad69970df1d953fdfa07ca482f8978a9ae4e2e437233c1"} Mar 09 14:09:27 crc kubenswrapper[4722]: I0309 14:09:27.550950 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wkdkx" event={"ID":"69a0d1f2-276f-4062-91d9-8af8048a8d8f","Type":"ContainerStarted","Data":"eeb3e5fceedb0bade4ee38f83fe3011b3eb4f41fdf4f7407939e975fb31e54e9"} Mar 09 14:09:28 crc kubenswrapper[4722]: I0309 14:09:28.560170 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wkdkx" event={"ID":"69a0d1f2-276f-4062-91d9-8af8048a8d8f","Type":"ContainerStarted","Data":"84c3eab135f468e638c4dd01d61c1d66b42efb32c53cde18cd7cdda58ed10aa5"} Mar 09 14:09:28 crc kubenswrapper[4722]: I0309 14:09:28.564437 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v57f2" event={"ID":"4e27c5a4-8cba-4119-8006-f9841d6121dc","Type":"ContainerStarted","Data":"39debea8587f6e6f170a7177c41011980494388f1c8da8fb895edab231f9cc8f"} Mar 09 14:09:28 crc kubenswrapper[4722]: I0309 14:09:28.567059 4722 generic.go:334] "Generic (PLEG): container finished" podID="c85eb92e-6d30-4e52-9176-70140b518ce9" containerID="974026ef92eb9be1a82222b8f60ee7e5839cd035daee5e85fb5ebb63c9f5c990" exitCode=0 Mar 09 14:09:28 crc kubenswrapper[4722]: I0309 14:09:28.567575 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hrrr6" event={"ID":"c85eb92e-6d30-4e52-9176-70140b518ce9","Type":"ContainerDied","Data":"974026ef92eb9be1a82222b8f60ee7e5839cd035daee5e85fb5ebb63c9f5c990"} Mar 09 14:09:28 crc kubenswrapper[4722]: I0309 14:09:28.570372 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c2mmw" event={"ID":"690e5ab0-3719-40ac-aba6-9278480ecb44","Type":"ContainerStarted","Data":"90d71c8e051db0d4adfc07841fdf4a0886e86003f3b562c362dc6cd509599188"} Mar 09 14:09:28 crc kubenswrapper[4722]: I0309 14:09:28.648912 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-v57f2" podStartSLOduration=2.221296706 podStartE2EDuration="4.648881679s" podCreationTimestamp="2026-03-09 14:09:24 +0000 UTC" firstStartedPulling="2026-03-09 14:09:25.495503587 +0000 UTC m=+406.051072163" lastFinishedPulling="2026-03-09 14:09:27.92308856 +0000 UTC m=+408.478657136" observedRunningTime="2026-03-09 14:09:28.624024025 +0000 UTC m=+409.179592601" watchObservedRunningTime="2026-03-09 14:09:28.648881679 +0000 UTC m=+409.204450255" Mar 09 14:09:28 crc kubenswrapper[4722]: I0309 14:09:28.650470 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-c2mmw" podStartSLOduration=2.162550909 podStartE2EDuration="4.650457669s" podCreationTimestamp="2026-03-09 14:09:24 +0000 UTC" firstStartedPulling="2026-03-09 14:09:25.499880808 +0000 UTC m=+406.055449384" lastFinishedPulling="2026-03-09 14:09:27.987787558 +0000 UTC m=+408.543356144" observedRunningTime="2026-03-09 14:09:28.646225401 +0000 UTC m=+409.201793977" watchObservedRunningTime="2026-03-09 14:09:28.650457669 +0000 UTC m=+409.206026255" Mar 09 14:09:29 crc kubenswrapper[4722]: I0309 14:09:29.578343 4722 generic.go:334] "Generic (PLEG): container finished" podID="69a0d1f2-276f-4062-91d9-8af8048a8d8f" 
containerID="84c3eab135f468e638c4dd01d61c1d66b42efb32c53cde18cd7cdda58ed10aa5" exitCode=0 Mar 09 14:09:29 crc kubenswrapper[4722]: I0309 14:09:29.578442 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wkdkx" event={"ID":"69a0d1f2-276f-4062-91d9-8af8048a8d8f","Type":"ContainerDied","Data":"84c3eab135f468e638c4dd01d61c1d66b42efb32c53cde18cd7cdda58ed10aa5"} Mar 09 14:09:29 crc kubenswrapper[4722]: I0309 14:09:29.582304 4722 generic.go:334] "Generic (PLEG): container finished" podID="c85eb92e-6d30-4e52-9176-70140b518ce9" containerID="b72d6c9c9baff3642f192cd7fe21a0d800fe78ebbb0e9106e4c2ff7389e60b81" exitCode=0 Mar 09 14:09:29 crc kubenswrapper[4722]: I0309 14:09:29.583674 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hrrr6" event={"ID":"c85eb92e-6d30-4e52-9176-70140b518ce9","Type":"ContainerDied","Data":"b72d6c9c9baff3642f192cd7fe21a0d800fe78ebbb0e9106e4c2ff7389e60b81"} Mar 09 14:09:30 crc kubenswrapper[4722]: I0309 14:09:30.589786 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hrrr6" event={"ID":"c85eb92e-6d30-4e52-9176-70140b518ce9","Type":"ContainerStarted","Data":"0b9fd0ff003c995530f76971084a2f2fb868bbb21a5f5dc686aa7c069616d9a4"} Mar 09 14:09:30 crc kubenswrapper[4722]: I0309 14:09:30.591788 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wkdkx" event={"ID":"69a0d1f2-276f-4062-91d9-8af8048a8d8f","Type":"ContainerStarted","Data":"5d115099f7394083ebc67ce7b10fe170fd35f9fb5c674236210d09be322b199c"} Mar 09 14:09:30 crc kubenswrapper[4722]: I0309 14:09:30.613079 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hrrr6" podStartSLOduration=3.14629182 podStartE2EDuration="4.613057116s" podCreationTimestamp="2026-03-09 14:09:26 +0000 UTC" firstStartedPulling="2026-03-09 14:09:28.568917482 +0000 UTC m=+409.124486048" lastFinishedPulling="2026-03-09 14:09:30.035682758 +0000 UTC m=+410.591251344" observedRunningTime="2026-03-09 14:09:30.610366738 +0000 UTC m=+411.165935314" watchObservedRunningTime="2026-03-09 14:09:30.613057116 +0000 UTC m=+411.168625692" Mar 09 14:09:34 crc kubenswrapper[4722]: I0309 14:09:34.393736 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-c2mmw" Mar 09 14:09:34 crc kubenswrapper[4722]: I0309 14:09:34.394855 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-c2mmw" Mar 09 14:09:34 crc kubenswrapper[4722]: I0309 14:09:34.458591 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-c2mmw" Mar 09 14:09:34 crc kubenswrapper[4722]: I0309 14:09:34.483718 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wkdkx" podStartSLOduration=6.03583707 podStartE2EDuration="8.48369539s" podCreationTimestamp="2026-03-09 14:09:26 +0000 UTC" firstStartedPulling="2026-03-09 14:09:27.554107229 +0000 UTC m=+408.109675805" lastFinishedPulling="2026-03-09 14:09:30.001965549 +0000 UTC m=+410.557534125" observedRunningTime="2026-03-09 14:09:30.641625374 +0000 UTC m=+411.197193950" watchObservedRunningTime="2026-03-09 14:09:34.48369539 +0000 UTC m=+415.039263966" Mar 09 14:09:34 crc kubenswrapper[4722]: I0309 14:09:34.584505 4722 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-v57f2" Mar 09 14:09:34 crc kubenswrapper[4722]: I0309 14:09:34.584591 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-v57f2" Mar 09 14:09:34 crc kubenswrapper[4722]: I0309 14:09:34.627853 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-v57f2" Mar 09 14:09:34 crc kubenswrapper[4722]: I0309 14:09:34.659603 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-c2mmw" Mar 09 14:09:34 crc kubenswrapper[4722]: I0309 14:09:34.676281 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-v57f2" Mar 09 14:09:36 crc kubenswrapper[4722]: I0309 14:09:36.814080 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wkdkx" Mar 09 14:09:36 crc kubenswrapper[4722]: I0309 14:09:36.814709 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wkdkx" Mar 09 14:09:36 crc kubenswrapper[4722]: I0309 14:09:36.861341 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wkdkx" Mar 09 14:09:36 crc kubenswrapper[4722]: I0309 14:09:36.980622 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hrrr6" Mar 09 14:09:36 crc kubenswrapper[4722]: I0309 14:09:36.980786 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hrrr6" Mar 09 14:09:37 crc kubenswrapper[4722]: I0309 14:09:37.025173 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hrrr6" Mar 09 14:09:37 crc kubenswrapper[4722]: I0309 14:09:37.688097 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wkdkx" Mar 09 14:09:37 crc kubenswrapper[4722]: I0309 14:09:37.697833 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hrrr6" Mar 09 14:09:50 crc kubenswrapper[4722]: I0309 14:09:50.951621 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6d5b84845-p57h4"] Mar 09 14:09:50 crc kubenswrapper[4722]: I0309 14:09:50.953639 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-p57h4" Mar 09 14:09:50 crc kubenswrapper[4722]: I0309 14:09:50.961360 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt" Mar 09 14:09:50 crc kubenswrapper[4722]: I0309 14:09:50.961392 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt" Mar 09 14:09:50 crc kubenswrapper[4722]: I0309 14:09:50.961753 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config" Mar 09 14:09:50 crc kubenswrapper[4722]: I0309 14:09:50.962336 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls" Mar 09 14:09:50 crc kubenswrapper[4722]: I0309 14:09:50.962695 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-dockercfg-wwt9l" Mar 09 14:09:50 crc kubenswrapper[4722]: I0309 14:09:50.976016 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6d5b84845-p57h4"] Mar 09 14:09:51 crc kubenswrapper[4722]: I0309 14:09:51.121517 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ad3c79de-9ee6-46a3-9163-1d1f4b1e43d5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6d5b84845-p57h4\" (UID: \"ad3c79de-9ee6-46a3-9163-1d1f4b1e43d5\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-p57h4" Mar 09 14:09:51 crc kubenswrapper[4722]: I0309 14:09:51.121663 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/ad3c79de-9ee6-46a3-9163-1d1f4b1e43d5-telemetry-config\") pod \"cluster-monitoring-operator-6d5b84845-p57h4\" (UID: \"ad3c79de-9ee6-46a3-9163-1d1f4b1e43d5\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-p57h4" Mar 09 14:09:51 crc kubenswrapper[4722]: I0309 14:09:51.121720 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45ff8\" (UniqueName: \"kubernetes.io/projected/ad3c79de-9ee6-46a3-9163-1d1f4b1e43d5-kube-api-access-45ff8\") pod \"cluster-monitoring-operator-6d5b84845-p57h4\" (UID: \"ad3c79de-9ee6-46a3-9163-1d1f4b1e43d5\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-p57h4" Mar 09 14:09:51 crc kubenswrapper[4722]: I0309 14:09:51.222923 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/ad3c79de-9ee6-46a3-9163-1d1f4b1e43d5-telemetry-config\") pod \"cluster-monitoring-operator-6d5b84845-p57h4\" (UID: \"ad3c79de-9ee6-46a3-9163-1d1f4b1e43d5\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-p57h4" Mar 09 14:09:51 crc kubenswrapper[4722]: I0309 14:09:51.223033 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45ff8\" (UniqueName: \"kubernetes.io/projected/ad3c79de-9ee6-46a3-9163-1d1f4b1e43d5-kube-api-access-45ff8\") pod \"cluster-monitoring-operator-6d5b84845-p57h4\" (UID: \"ad3c79de-9ee6-46a3-9163-1d1f4b1e43d5\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-p57h4" Mar 09 14:09:51 crc kubenswrapper[4722]: I0309 14:09:51.223160 4722 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ad3c79de-9ee6-46a3-9163-1d1f4b1e43d5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6d5b84845-p57h4\" (UID: \"ad3c79de-9ee6-46a3-9163-1d1f4b1e43d5\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-p57h4" Mar 09 14:09:51 crc kubenswrapper[4722]: I0309 14:09:51.226062 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/ad3c79de-9ee6-46a3-9163-1d1f4b1e43d5-telemetry-config\") pod \"cluster-monitoring-operator-6d5b84845-p57h4\" (UID: \"ad3c79de-9ee6-46a3-9163-1d1f4b1e43d5\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-p57h4" Mar 09 14:09:51 crc kubenswrapper[4722]: I0309 14:09:51.237293 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ad3c79de-9ee6-46a3-9163-1d1f4b1e43d5-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6d5b84845-p57h4\" (UID: \"ad3c79de-9ee6-46a3-9163-1d1f4b1e43d5\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-p57h4" Mar 09 14:09:51 crc kubenswrapper[4722]: I0309 14:09:51.245392 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45ff8\" (UniqueName: \"kubernetes.io/projected/ad3c79de-9ee6-46a3-9163-1d1f4b1e43d5-kube-api-access-45ff8\") pod \"cluster-monitoring-operator-6d5b84845-p57h4\" (UID: \"ad3c79de-9ee6-46a3-9163-1d1f4b1e43d5\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-p57h4" Mar 09 14:09:51 crc kubenswrapper[4722]: I0309 14:09:51.281078 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-p57h4" Mar 09 14:09:51 crc kubenswrapper[4722]: I0309 14:09:51.527774 4722 patch_prober.go:28] interesting pod/machine-config-daemon-hjrrb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 14:09:51 crc kubenswrapper[4722]: I0309 14:09:51.528401 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 14:09:51 crc kubenswrapper[4722]: I0309 14:09:51.744236 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6d5b84845-p57h4"] Mar 09 14:09:51 crc kubenswrapper[4722]: W0309 14:09:51.759529 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad3c79de_9ee6_46a3_9163_1d1f4b1e43d5.slice/crio-726e9b66d51ffda3eaac217871099f5bf22e25f854bc31db09834e5a45b0cf69 WatchSource:0}: Error finding container 726e9b66d51ffda3eaac217871099f5bf22e25f854bc31db09834e5a45b0cf69: Status 404 returned error can't find the container with id 726e9b66d51ffda3eaac217871099f5bf22e25f854bc31db09834e5a45b0cf69 Mar 09 14:09:52 crc kubenswrapper[4722]: I0309 14:09:52.727373 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-p57h4" event={"ID":"ad3c79de-9ee6-46a3-9163-1d1f4b1e43d5","Type":"ContainerStarted","Data":"726e9b66d51ffda3eaac217871099f5bf22e25f854bc31db09834e5a45b0cf69"} Mar 09 14:09:54 crc kubenswrapper[4722]: I0309 14:09:54.537776 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-dch29"] Mar 09 14:09:54 crc kubenswrapper[4722]: I0309 14:09:54.538914 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-dch29" Mar 09 14:09:54 crc kubenswrapper[4722]: I0309 14:09:54.569479 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-dch29"] Mar 09 14:09:54 crc kubenswrapper[4722]: I0309 14:09:54.669205 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f352fba2-f1d8-46bb-b4e6-d68f6c9b4fe7-ca-trust-extracted\") pod \"image-registry-66df7c8f76-dch29\" (UID: \"f352fba2-f1d8-46bb-b4e6-d68f6c9b4fe7\") " pod="openshift-image-registry/image-registry-66df7c8f76-dch29" Mar 09 14:09:54 crc kubenswrapper[4722]: I0309 14:09:54.669317 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f352fba2-f1d8-46bb-b4e6-d68f6c9b4fe7-registry-tls\") pod \"image-registry-66df7c8f76-dch29\" (UID: \"f352fba2-f1d8-46bb-b4e6-d68f6c9b4fe7\") " pod="openshift-image-registry/image-registry-66df7c8f76-dch29" Mar 09 14:09:54 crc kubenswrapper[4722]: I0309 14:09:54.669353 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kn5s\" (UniqueName: \"kubernetes.io/projected/f352fba2-f1d8-46bb-b4e6-d68f6c9b4fe7-kube-api-access-7kn5s\") pod \"image-registry-66df7c8f76-dch29\" (UID: \"f352fba2-f1d8-46bb-b4e6-d68f6c9b4fe7\") " pod="openshift-image-registry/image-registry-66df7c8f76-dch29" Mar 09 14:09:54 crc kubenswrapper[4722]: I0309 14:09:54.669391 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f352fba2-f1d8-46bb-b4e6-d68f6c9b4fe7-installation-pull-secrets\") pod \"image-registry-66df7c8f76-dch29\" (UID: \"f352fba2-f1d8-46bb-b4e6-d68f6c9b4fe7\") " pod="openshift-image-registry/image-registry-66df7c8f76-dch29" Mar 09 14:09:54 crc kubenswrapper[4722]: I0309 14:09:54.669449 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-dch29\" (UID: \"f352fba2-f1d8-46bb-b4e6-d68f6c9b4fe7\") " pod="openshift-image-registry/image-registry-66df7c8f76-dch29" Mar 09 14:09:54 crc kubenswrapper[4722]: I0309 14:09:54.669475 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f352fba2-f1d8-46bb-b4e6-d68f6c9b4fe7-registry-certificates\") pod \"image-registry-66df7c8f76-dch29\" (UID: \"f352fba2-f1d8-46bb-b4e6-d68f6c9b4fe7\") " pod="openshift-image-registry/image-registry-66df7c8f76-dch29" Mar 09 14:09:54 crc kubenswrapper[4722]: I0309 14:09:54.669507 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f352fba2-f1d8-46bb-b4e6-d68f6c9b4fe7-trusted-ca\") pod \"image-registry-66df7c8f76-dch29\" (UID: \"f352fba2-f1d8-46bb-b4e6-d68f6c9b4fe7\") " pod="openshift-image-registry/image-registry-66df7c8f76-dch29" Mar 09 14:09:54 crc kubenswrapper[4722]: I0309 14:09:54.669538 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/f352fba2-f1d8-46bb-b4e6-d68f6c9b4fe7-bound-sa-token\") pod \"image-registry-66df7c8f76-dch29\" (UID: \"f352fba2-f1d8-46bb-b4e6-d68f6c9b4fe7\") " pod="openshift-image-registry/image-registry-66df7c8f76-dch29" Mar 09 14:09:54 crc kubenswrapper[4722]: I0309 14:09:54.707124 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-dch29\" (UID: \"f352fba2-f1d8-46bb-b4e6-d68f6c9b4fe7\") " pod="openshift-image-registry/image-registry-66df7c8f76-dch29" Mar 09 14:09:54 crc kubenswrapper[4722]: I0309 14:09:54.739580 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-p57h4" event={"ID":"ad3c79de-9ee6-46a3-9163-1d1f4b1e43d5","Type":"ContainerStarted","Data":"e83355ce8608b339c0556d823dc4fea0d84f9f5b230dff24f472601349323ee9"} Mar 09 14:09:54 crc kubenswrapper[4722]: I0309 14:09:54.745668 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-8xdjl"] Mar 09 14:09:54 crc kubenswrapper[4722]: I0309 14:09:54.746698 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-8xdjl" Mar 09 14:09:54 crc kubenswrapper[4722]: I0309 14:09:54.749062 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls" Mar 09 14:09:54 crc kubenswrapper[4722]: I0309 14:09:54.754633 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-dockercfg-ggj5r" Mar 09 14:09:54 crc kubenswrapper[4722]: I0309 14:09:54.765222 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-p57h4" podStartSLOduration=2.557927018 podStartE2EDuration="4.765189147s" podCreationTimestamp="2026-03-09 14:09:50 +0000 UTC" firstStartedPulling="2026-03-09 14:09:51.762017062 +0000 UTC m=+432.317585638" lastFinishedPulling="2026-03-09 14:09:53.969279201 +0000 UTC m=+434.524847767" observedRunningTime="2026-03-09 14:09:54.756691101 +0000 UTC m=+435.312259677" watchObservedRunningTime="2026-03-09 14:09:54.765189147 +0000 UTC m=+435.320757723" Mar 09 14:09:54 crc kubenswrapper[4722]: I0309 14:09:54.771255 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f352fba2-f1d8-46bb-b4e6-d68f6c9b4fe7-trusted-ca\") pod \"image-registry-66df7c8f76-dch29\" (UID: \"f352fba2-f1d8-46bb-b4e6-d68f6c9b4fe7\") " pod="openshift-image-registry/image-registry-66df7c8f76-dch29" Mar 09 14:09:54 crc kubenswrapper[4722]: I0309 14:09:54.771312 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f352fba2-f1d8-46bb-b4e6-d68f6c9b4fe7-bound-sa-token\") pod \"image-registry-66df7c8f76-dch29\" (UID: \"f352fba2-f1d8-46bb-b4e6-d68f6c9b4fe7\") " pod="openshift-image-registry/image-registry-66df7c8f76-dch29" Mar 09 14:09:54 crc kubenswrapper[4722]: I0309 14:09:54.771353 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/f352fba2-f1d8-46bb-b4e6-d68f6c9b4fe7-ca-trust-extracted\") pod \"image-registry-66df7c8f76-dch29\" (UID: \"f352fba2-f1d8-46bb-b4e6-d68f6c9b4fe7\") " pod="openshift-image-registry/image-registry-66df7c8f76-dch29" Mar 09 14:09:54 crc kubenswrapper[4722]: I0309 14:09:54.771378 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f352fba2-f1d8-46bb-b4e6-d68f6c9b4fe7-registry-tls\") pod \"image-registry-66df7c8f76-dch29\" (UID: \"f352fba2-f1d8-46bb-b4e6-d68f6c9b4fe7\") " pod="openshift-image-registry/image-registry-66df7c8f76-dch29" Mar 09 14:09:54 crc kubenswrapper[4722]: I0309 14:09:54.771395 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7kn5s\" (UniqueName: \"kubernetes.io/projected/f352fba2-f1d8-46bb-b4e6-d68f6c9b4fe7-kube-api-access-7kn5s\") pod \"image-registry-66df7c8f76-dch29\" (UID: \"f352fba2-f1d8-46bb-b4e6-d68f6c9b4fe7\") " pod="openshift-image-registry/image-registry-66df7c8f76-dch29" Mar 09 14:09:54 crc kubenswrapper[4722]: I0309 14:09:54.771422 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f352fba2-f1d8-46bb-b4e6-d68f6c9b4fe7-installation-pull-secrets\") pod \"image-registry-66df7c8f76-dch29\" (UID: \"f352fba2-f1d8-46bb-b4e6-d68f6c9b4fe7\") " pod="openshift-image-registry/image-registry-66df7c8f76-dch29" Mar 09 14:09:54 crc kubenswrapper[4722]: I0309 14:09:54.771460 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f352fba2-f1d8-46bb-b4e6-d68f6c9b4fe7-registry-certificates\") pod \"image-registry-66df7c8f76-dch29\" (UID: \"f352fba2-f1d8-46bb-b4e6-d68f6c9b4fe7\") " pod="openshift-image-registry/image-registry-66df7c8f76-dch29" Mar 09 14:09:54 crc kubenswrapper[4722]: I0309 14:09:54.772660 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f352fba2-f1d8-46bb-b4e6-d68f6c9b4fe7-registry-certificates\") pod \"image-registry-66df7c8f76-dch29\" (UID: \"f352fba2-f1d8-46bb-b4e6-d68f6c9b4fe7\") " pod="openshift-image-registry/image-registry-66df7c8f76-dch29" Mar 09 14:09:54 crc kubenswrapper[4722]: I0309 14:09:54.774296 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f352fba2-f1d8-46bb-b4e6-d68f6c9b4fe7-trusted-ca\") pod \"image-registry-66df7c8f76-dch29\" (UID: \"f352fba2-f1d8-46bb-b4e6-d68f6c9b4fe7\") " pod="openshift-image-registry/image-registry-66df7c8f76-dch29" Mar 09 14:09:54 crc kubenswrapper[4722]: I0309 14:09:54.774812 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f352fba2-f1d8-46bb-b4e6-d68f6c9b4fe7-ca-trust-extracted\") pod \"image-registry-66df7c8f76-dch29\" (UID: \"f352fba2-f1d8-46bb-b4e6-d68f6c9b4fe7\") " pod="openshift-image-registry/image-registry-66df7c8f76-dch29" Mar 09 14:09:54 crc kubenswrapper[4722]: I0309 14:09:54.776962 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-8xdjl"] Mar 09 14:09:54 crc kubenswrapper[4722]: I0309 14:09:54.781117 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/f352fba2-f1d8-46bb-b4e6-d68f6c9b4fe7-installation-pull-secrets\") pod \"image-registry-66df7c8f76-dch29\" (UID: \"f352fba2-f1d8-46bb-b4e6-d68f6c9b4fe7\") " pod="openshift-image-registry/image-registry-66df7c8f76-dch29" Mar 09 14:09:54 crc kubenswrapper[4722]: I0309 14:09:54.785758 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f352fba2-f1d8-46bb-b4e6-d68f6c9b4fe7-registry-tls\") pod \"image-registry-66df7c8f76-dch29\" (UID: \"f352fba2-f1d8-46bb-b4e6-d68f6c9b4fe7\") " pod="openshift-image-registry/image-registry-66df7c8f76-dch29" Mar 09 14:09:54 crc kubenswrapper[4722]: I0309 14:09:54.796470 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f352fba2-f1d8-46bb-b4e6-d68f6c9b4fe7-bound-sa-token\") pod \"image-registry-66df7c8f76-dch29\" (UID: \"f352fba2-f1d8-46bb-b4e6-d68f6c9b4fe7\") " pod="openshift-image-registry/image-registry-66df7c8f76-dch29" Mar 09 14:09:54 crc kubenswrapper[4722]: I0309 14:09:54.800356 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kn5s\" (UniqueName: \"kubernetes.io/projected/f352fba2-f1d8-46bb-b4e6-d68f6c9b4fe7-kube-api-access-7kn5s\") pod \"image-registry-66df7c8f76-dch29\" (UID: \"f352fba2-f1d8-46bb-b4e6-d68f6c9b4fe7\") " pod="openshift-image-registry/image-registry-66df7c8f76-dch29" Mar 09 14:09:54 crc kubenswrapper[4722]: I0309 14:09:54.857954 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-dch29" Mar 09 14:09:54 crc kubenswrapper[4722]: I0309 14:09:54.872999 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/8b35adf7-a305-4f94-a5c9-02fbc3fca46f-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-8xdjl\" (UID: \"8b35adf7-a305-4f94-a5c9-02fbc3fca46f\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-8xdjl" Mar 09 14:09:54 crc kubenswrapper[4722]: I0309 14:09:54.976122 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/8b35adf7-a305-4f94-a5c9-02fbc3fca46f-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-8xdjl\" (UID: \"8b35adf7-a305-4f94-a5c9-02fbc3fca46f\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-8xdjl" Mar 09 14:09:54 crc kubenswrapper[4722]: I0309 14:09:54.980561 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/8b35adf7-a305-4f94-a5c9-02fbc3fca46f-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-8xdjl\" (UID: \"8b35adf7-a305-4f94-a5c9-02fbc3fca46f\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-8xdjl" Mar 09 14:09:55 crc kubenswrapper[4722]: I0309 14:09:55.062954 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-8xdjl" Mar 09 14:09:55 crc kubenswrapper[4722]: I0309 14:09:55.245711 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-8xdjl"] Mar 09 14:09:55 crc kubenswrapper[4722]: I0309 14:09:55.275434 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-dch29"] Mar 09 14:09:55 crc kubenswrapper[4722]: W0309 14:09:55.278957 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf352fba2_f1d8_46bb_b4e6_d68f6c9b4fe7.slice/crio-aa7ec5462828c91920d891d15f162144b2f9a16be404371fa8cf95e74093a7f0 WatchSource:0}: Error finding container aa7ec5462828c91920d891d15f162144b2f9a16be404371fa8cf95e74093a7f0: Status 404 returned error can't find the container with id aa7ec5462828c91920d891d15f162144b2f9a16be404371fa8cf95e74093a7f0 Mar 09 14:09:55 crc kubenswrapper[4722]: I0309 14:09:55.746880 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-dch29" event={"ID":"f352fba2-f1d8-46bb-b4e6-d68f6c9b4fe7","Type":"ContainerStarted","Data":"835b193105371b2a10761f8099ba04bff201446b7be8e8808aaca1da67172a05"} Mar 09 14:09:55 crc kubenswrapper[4722]: I0309 14:09:55.747016 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-dch29" event={"ID":"f352fba2-f1d8-46bb-b4e6-d68f6c9b4fe7","Type":"ContainerStarted","Data":"aa7ec5462828c91920d891d15f162144b2f9a16be404371fa8cf95e74093a7f0"} Mar 09 14:09:55 crc kubenswrapper[4722]: I0309 14:09:55.747226 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-dch29" Mar 09 14:09:55 crc kubenswrapper[4722]: I0309 14:09:55.748438 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-8xdjl" event={"ID":"8b35adf7-a305-4f94-a5c9-02fbc3fca46f","Type":"ContainerStarted","Data":"52083116864c475e9210ec9a7f6c270c70eb3203b325862020b85a82e38a1a70"} Mar 09 14:09:56 crc kubenswrapper[4722]: I0309 14:09:56.756145 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-8xdjl" event={"ID":"8b35adf7-a305-4f94-a5c9-02fbc3fca46f","Type":"ContainerStarted","Data":"24504e3f082f14957d38d16a3a1fe7333e36577d396d752821c030b29c7c33f8"} Mar 09 14:09:56 crc kubenswrapper[4722]: I0309 14:09:56.774524 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-8xdjl" podStartSLOduration=1.492363082 podStartE2EDuration="2.774499404s" podCreationTimestamp="2026-03-09 14:09:54 +0000 UTC" firstStartedPulling="2026-03-09 14:09:55.252365138 +0000 UTC m=+435.807933724" lastFinishedPulling="2026-03-09 14:09:56.53450147 +0000 UTC m=+437.090070046" observedRunningTime="2026-03-09 14:09:56.770286907 +0000 UTC m=+437.325855493" watchObservedRunningTime="2026-03-09 14:09:56.774499404 +0000 UTC m=+437.330067980" Mar 09 14:09:56 crc kubenswrapper[4722]: I0309 14:09:56.774785 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-dch29" podStartSLOduration=2.774775131 podStartE2EDuration="2.774775131s" podCreationTimestamp="2026-03-09 14:09:54 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:09:55.770683162 +0000 UTC m=+436.326251758" watchObservedRunningTime="2026-03-09 14:09:56.774775131 +0000 UTC m=+437.330343727" Mar 09 14:09:57 crc kubenswrapper[4722]: I0309 14:09:57.761609 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-8xdjl" Mar 09 14:09:57 crc kubenswrapper[4722]: I0309 14:09:57.767729 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-8xdjl" Mar 09 14:09:58 crc kubenswrapper[4722]: I0309 14:09:58.794555 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-db54df47d-48gxv"] Mar 09 14:09:58 crc kubenswrapper[4722]: I0309 14:09:58.795814 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-db54df47d-48gxv" Mar 09 14:09:58 crc kubenswrapper[4722]: I0309 14:09:58.799489 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-dockercfg-d9hf4" Mar 09 14:09:58 crc kubenswrapper[4722]: I0309 14:09:58.799708 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-tls" Mar 09 14:09:58 crc kubenswrapper[4722]: I0309 14:09:58.800135 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-kube-rbac-proxy-config" Mar 09 14:09:58 crc kubenswrapper[4722]: I0309 14:09:58.800315 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-client-ca" Mar 09 14:09:58 crc kubenswrapper[4722]: I0309 14:09:58.804250 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-db54df47d-48gxv"] Mar 09 14:09:58 crc kubenswrapper[4722]: I0309 14:09:58.934542 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/095deb64-7d79-4197-b6ef-8d45508d2c43-prometheus-operator-tls\") pod \"prometheus-operator-db54df47d-48gxv\" (UID: \"095deb64-7d79-4197-b6ef-8d45508d2c43\") " pod="openshift-monitoring/prometheus-operator-db54df47d-48gxv" Mar 09 14:09:58 crc kubenswrapper[4722]: I0309 14:09:58.934826 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgrpg\" (UniqueName: \"kubernetes.io/projected/095deb64-7d79-4197-b6ef-8d45508d2c43-kube-api-access-kgrpg\") pod \"prometheus-operator-db54df47d-48gxv\" (UID: \"095deb64-7d79-4197-b6ef-8d45508d2c43\") " pod="openshift-monitoring/prometheus-operator-db54df47d-48gxv" Mar 09 14:09:58 crc kubenswrapper[4722]: I0309 14:09:58.934971 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/095deb64-7d79-4197-b6ef-8d45508d2c43-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-db54df47d-48gxv\" (UID: \"095deb64-7d79-4197-b6ef-8d45508d2c43\") " pod="openshift-monitoring/prometheus-operator-db54df47d-48gxv" Mar 09 14:09:58 crc kubenswrapper[4722]: I0309 14:09:58.935060 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/095deb64-7d79-4197-b6ef-8d45508d2c43-metrics-client-ca\") pod \"prometheus-operator-db54df47d-48gxv\" (UID: \"095deb64-7d79-4197-b6ef-8d45508d2c43\") " pod="openshift-monitoring/prometheus-operator-db54df47d-48gxv" Mar 09 14:09:59 crc kubenswrapper[4722]: I0309 14:09:59.036362 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/095deb64-7d79-4197-b6ef-8d45508d2c43-prometheus-operator-tls\") pod \"prometheus-operator-db54df47d-48gxv\" (UID: \"095deb64-7d79-4197-b6ef-8d45508d2c43\") " pod="openshift-monitoring/prometheus-operator-db54df47d-48gxv" Mar 09 14:09:59 crc kubenswrapper[4722]: I0309 14:09:59.036686 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgrpg\" (UniqueName: \"kubernetes.io/projected/095deb64-7d79-4197-b6ef-8d45508d2c43-kube-api-access-kgrpg\") pod \"prometheus-operator-db54df47d-48gxv\" (UID: \"095deb64-7d79-4197-b6ef-8d45508d2c43\") " pod="openshift-monitoring/prometheus-operator-db54df47d-48gxv" Mar 09 14:09:59 crc kubenswrapper[4722]: I0309 14:09:59.036875 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/095deb64-7d79-4197-b6ef-8d45508d2c43-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-db54df47d-48gxv\" (UID: \"095deb64-7d79-4197-b6ef-8d45508d2c43\") " pod="openshift-monitoring/prometheus-operator-db54df47d-48gxv" Mar 09 14:09:59 crc kubenswrapper[4722]: I0309 14:09:59.037021 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/095deb64-7d79-4197-b6ef-8d45508d2c43-metrics-client-ca\") pod \"prometheus-operator-db54df47d-48gxv\" (UID: \"095deb64-7d79-4197-b6ef-8d45508d2c43\") " pod="openshift-monitoring/prometheus-operator-db54df47d-48gxv" Mar 09 14:09:59 crc kubenswrapper[4722]: I0309 14:09:59.038504 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/095deb64-7d79-4197-b6ef-8d45508d2c43-metrics-client-ca\") pod \"prometheus-operator-db54df47d-48gxv\" (UID: \"095deb64-7d79-4197-b6ef-8d45508d2c43\") " pod="openshift-monitoring/prometheus-operator-db54df47d-48gxv" Mar 09 14:09:59 crc kubenswrapper[4722]: I0309 14:09:59.044624 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/095deb64-7d79-4197-b6ef-8d45508d2c43-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-db54df47d-48gxv\" (UID: \"095deb64-7d79-4197-b6ef-8d45508d2c43\") " pod="openshift-monitoring/prometheus-operator-db54df47d-48gxv" Mar 09 14:09:59 crc kubenswrapper[4722]: I0309 14:09:59.047788 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/095deb64-7d79-4197-b6ef-8d45508d2c43-prometheus-operator-tls\") pod \"prometheus-operator-db54df47d-48gxv\" (UID: \"095deb64-7d79-4197-b6ef-8d45508d2c43\") " pod="openshift-monitoring/prometheus-operator-db54df47d-48gxv" Mar 09 14:09:59 crc kubenswrapper[4722]: I0309 14:09:59.056849 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgrpg\" (UniqueName: 
\"kubernetes.io/projected/095deb64-7d79-4197-b6ef-8d45508d2c43-kube-api-access-kgrpg\") pod \"prometheus-operator-db54df47d-48gxv\" (UID: \"095deb64-7d79-4197-b6ef-8d45508d2c43\") " pod="openshift-monitoring/prometheus-operator-db54df47d-48gxv" Mar 09 14:09:59 crc kubenswrapper[4722]: I0309 14:09:59.128036 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-db54df47d-48gxv" Mar 09 14:09:59 crc kubenswrapper[4722]: I0309 14:09:59.406565 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-db54df47d-48gxv"] Mar 09 14:09:59 crc kubenswrapper[4722]: W0309 14:09:59.419891 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod095deb64_7d79_4197_b6ef_8d45508d2c43.slice/crio-1b2ce297b144c4a9ddccaa44e71a562191d071ae12d7c3fc9ff09ac70883fa35 WatchSource:0}: Error finding container 1b2ce297b144c4a9ddccaa44e71a562191d071ae12d7c3fc9ff09ac70883fa35: Status 404 returned error can't find the container with id 1b2ce297b144c4a9ddccaa44e71a562191d071ae12d7c3fc9ff09ac70883fa35 Mar 09 14:09:59 crc kubenswrapper[4722]: I0309 14:09:59.784198 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-db54df47d-48gxv" event={"ID":"095deb64-7d79-4197-b6ef-8d45508d2c43","Type":"ContainerStarted","Data":"1b2ce297b144c4a9ddccaa44e71a562191d071ae12d7c3fc9ff09ac70883fa35"} Mar 09 14:10:00 crc kubenswrapper[4722]: I0309 14:10:00.127125 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551090-trppz"] Mar 09 14:10:00 crc kubenswrapper[4722]: I0309 14:10:00.127984 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551090-trppz" Mar 09 14:10:00 crc kubenswrapper[4722]: I0309 14:10:00.130057 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 14:10:00 crc kubenswrapper[4722]: I0309 14:10:00.130343 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 14:10:00 crc kubenswrapper[4722]: I0309 14:10:00.130637 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-dr2x6" Mar 09 14:10:00 crc kubenswrapper[4722]: I0309 14:10:00.138947 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551090-trppz"] Mar 09 14:10:00 crc kubenswrapper[4722]: I0309 14:10:00.254750 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msqk7\" (UniqueName: \"kubernetes.io/projected/a1b96455-88a6-4b99-9cf3-8c7332822aae-kube-api-access-msqk7\") pod \"auto-csr-approver-29551090-trppz\" (UID: \"a1b96455-88a6-4b99-9cf3-8c7332822aae\") " pod="openshift-infra/auto-csr-approver-29551090-trppz" Mar 09 14:10:00 crc kubenswrapper[4722]: I0309 14:10:00.356452 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msqk7\" (UniqueName: \"kubernetes.io/projected/a1b96455-88a6-4b99-9cf3-8c7332822aae-kube-api-access-msqk7\") pod \"auto-csr-approver-29551090-trppz\" (UID: \"a1b96455-88a6-4b99-9cf3-8c7332822aae\") " pod="openshift-infra/auto-csr-approver-29551090-trppz" Mar 09 14:10:00 crc kubenswrapper[4722]: I0309 14:10:00.377378 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msqk7\" (UniqueName: \"kubernetes.io/projected/a1b96455-88a6-4b99-9cf3-8c7332822aae-kube-api-access-msqk7\") pod \"auto-csr-approver-29551090-trppz\" (UID: \"a1b96455-88a6-4b99-9cf3-8c7332822aae\") " pod="openshift-infra/auto-csr-approver-29551090-trppz" Mar 09 14:10:00 crc kubenswrapper[4722]: I0309 14:10:00.454774 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551090-trppz" Mar 09 14:10:00 crc kubenswrapper[4722]: I0309 14:10:00.709817 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551090-trppz"] Mar 09 14:10:01 crc kubenswrapper[4722]: I0309 14:10:01.809199 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-db54df47d-48gxv" event={"ID":"095deb64-7d79-4197-b6ef-8d45508d2c43","Type":"ContainerStarted","Data":"02f8929971ea50149aa11ebed18a98b1e721e8481e948bb3815d1f986c42ad6a"} Mar 09 14:10:01 crc kubenswrapper[4722]: I0309 14:10:01.809752 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-db54df47d-48gxv" event={"ID":"095deb64-7d79-4197-b6ef-8d45508d2c43","Type":"ContainerStarted","Data":"e8dd100335ebaeacae65d81c4c79ced76e9cc58f443f5821c28bd34e24ecd2f5"} Mar 09 14:10:01 crc kubenswrapper[4722]: I0309 14:10:01.812896 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551090-trppz" event={"ID":"a1b96455-88a6-4b99-9cf3-8c7332822aae","Type":"ContainerStarted","Data":"9c8e804e49a6fc4cc358f32647183d4cb8c3cc812cda49159c88533988c337fb"} Mar 09 14:10:01 crc kubenswrapper[4722]: I0309 14:10:01.827029 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-db54df47d-48gxv" podStartSLOduration=2.2325356259999998 podStartE2EDuration="3.826999236s" podCreationTimestamp="2026-03-09 14:09:58 +0000 UTC" firstStartedPulling="2026-03-09 14:09:59.422653245 +0000 UTC m=+439.978221811" lastFinishedPulling="2026-03-09 14:10:01.017116845 +0000 UTC m=+441.572685421" observedRunningTime="2026-03-09 14:10:01.823338783 +0000 UTC m=+442.378907359" watchObservedRunningTime="2026-03-09 14:10:01.826999236 +0000 UTC m=+442.382567802" Mar 09 14:10:02 crc kubenswrapper[4722]: I0309 14:10:02.819067 4722 generic.go:334] "Generic (PLEG): container finished" podID="a1b96455-88a6-4b99-9cf3-8c7332822aae" containerID="189c0576d6df4503226c658c0938baa9365f608dcc338c56a8d434a267bc0b50" exitCode=0 Mar 09 14:10:02 crc kubenswrapper[4722]: I0309 14:10:02.819286 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551090-trppz" event={"ID":"a1b96455-88a6-4b99-9cf3-8c7332822aae","Type":"ContainerDied","Data":"189c0576d6df4503226c658c0938baa9365f608dcc338c56a8d434a267bc0b50"} Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.109972 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-566fddb674-wg4mz"] Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.111449 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-566fddb674-wg4mz" Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.118562 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-kube-rbac-proxy-config" Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.118863 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-dockercfg-zjsx6" Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.122743 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-tls" Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.127625 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-566fddb674-wg4mz"] Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.226401 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/0f5ed552-309f-4cc0-8646-484b1e4be7d9-openshift-state-metrics-tls\") pod \"openshift-state-metrics-566fddb674-wg4mz\" (UID: \"0f5ed552-309f-4cc0-8646-484b1e4be7d9\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-wg4mz" Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.226739 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dk9w5\" (UniqueName: \"kubernetes.io/projected/0f5ed552-309f-4cc0-8646-484b1e4be7d9-kube-api-access-dk9w5\") pod \"openshift-state-metrics-566fddb674-wg4mz\" (UID: \"0f5ed552-309f-4cc0-8646-484b1e4be7d9\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-wg4mz" Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.226765 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0f5ed552-309f-4cc0-8646-484b1e4be7d9-metrics-client-ca\") pod \"openshift-state-metrics-566fddb674-wg4mz\" (UID: \"0f5ed552-309f-4cc0-8646-484b1e4be7d9\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-wg4mz" Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.226831 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0f5ed552-309f-4cc0-8646-484b1e4be7d9-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-566fddb674-wg4mz\" (UID: \"0f5ed552-309f-4cc0-8646-484b1e4be7d9\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-wg4mz" Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.231816 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-p5xzd"] Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.233158 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-p5xzd" Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.244732 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-kube-rbac-proxy-config" Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.251898 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-tls" Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.252824 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-dockercfg-rg9hp" Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.274423 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-777cb5bd5d-2qtgt"] Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.275892 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-2qtgt" Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.279670 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-kube-rbac-proxy-config" Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.280293 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-state-metrics-custom-resource-state-configmap" Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.280813 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-dockercfg-frlnq" Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.280999 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-tls" Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.293245 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-777cb5bd5d-2qtgt"] Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.309884 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551090-trppz" Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.334245 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dk9w5\" (UniqueName: \"kubernetes.io/projected/0f5ed552-309f-4cc0-8646-484b1e4be7d9-kube-api-access-dk9w5\") pod \"openshift-state-metrics-566fddb674-wg4mz\" (UID: \"0f5ed552-309f-4cc0-8646-484b1e4be7d9\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-wg4mz" Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.334293 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/29dfec3b-120d-4a56-8ab6-7155b7ec5f43-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-777cb5bd5d-2qtgt\" (UID: \"29dfec3b-120d-4a56-8ab6-7155b7ec5f43\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-2qtgt" Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.334318 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/29dfec3b-120d-4a56-8ab6-7155b7ec5f43-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-777cb5bd5d-2qtgt\" (UID: \"29dfec3b-120d-4a56-8ab6-7155b7ec5f43\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-2qtgt" Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.334351 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0f5ed552-309f-4cc0-8646-484b1e4be7d9-metrics-client-ca\") pod \"openshift-state-metrics-566fddb674-wg4mz\" (UID: \"0f5ed552-309f-4cc0-8646-484b1e4be7d9\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-wg4mz" Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.334388 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/29dfec3b-120d-4a56-8ab6-7155b7ec5f43-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-2qtgt\" (UID: \"29dfec3b-120d-4a56-8ab6-7155b7ec5f43\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-2qtgt" Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.334434 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lplbm\" (UniqueName: \"kubernetes.io/projected/29dfec3b-120d-4a56-8ab6-7155b7ec5f43-kube-api-access-lplbm\") pod \"kube-state-metrics-777cb5bd5d-2qtgt\" (UID: \"29dfec3b-120d-4a56-8ab6-7155b7ec5f43\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-2qtgt" Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.334458 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/29dfec3b-120d-4a56-8ab6-7155b7ec5f43-volume-directive-shadow\") pod \"kube-state-metrics-777cb5bd5d-2qtgt\" (UID: \"29dfec3b-120d-4a56-8ab6-7155b7ec5f43\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-2qtgt" Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.334527 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/0f5ed552-309f-4cc0-8646-484b1e4be7d9-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-566fddb674-wg4mz\" (UID: \"0f5ed552-309f-4cc0-8646-484b1e4be7d9\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-wg4mz" Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.334547 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/29dfec3b-120d-4a56-8ab6-7155b7ec5f43-metrics-client-ca\") pod \"kube-state-metrics-777cb5bd5d-2qtgt\" (UID: \"29dfec3b-120d-4a56-8ab6-7155b7ec5f43\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-2qtgt" Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.334607 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/0f5ed552-309f-4cc0-8646-484b1e4be7d9-openshift-state-metrics-tls\") pod \"openshift-state-metrics-566fddb674-wg4mz\" (UID: \"0f5ed552-309f-4cc0-8646-484b1e4be7d9\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-wg4mz" Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.336802 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0f5ed552-309f-4cc0-8646-484b1e4be7d9-metrics-client-ca\") pod \"openshift-state-metrics-566fddb674-wg4mz\" (UID: \"0f5ed552-309f-4cc0-8646-484b1e4be7d9\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-wg4mz" Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.358020 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0f5ed552-309f-4cc0-8646-484b1e4be7d9-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-566fddb674-wg4mz\" (UID: \"0f5ed552-309f-4cc0-8646-484b1e4be7d9\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-wg4mz" Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.360422 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/0f5ed552-309f-4cc0-8646-484b1e4be7d9-openshift-state-metrics-tls\") pod \"openshift-state-metrics-566fddb674-wg4mz\" (UID: \"0f5ed552-309f-4cc0-8646-484b1e4be7d9\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-wg4mz" Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.362133 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dk9w5\" (UniqueName: \"kubernetes.io/projected/0f5ed552-309f-4cc0-8646-484b1e4be7d9-kube-api-access-dk9w5\") pod \"openshift-state-metrics-566fddb674-wg4mz\" (UID: \"0f5ed552-309f-4cc0-8646-484b1e4be7d9\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-wg4mz" Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.439780 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-msqk7\" (UniqueName: \"kubernetes.io/projected/a1b96455-88a6-4b99-9cf3-8c7332822aae-kube-api-access-msqk7\") pod \"a1b96455-88a6-4b99-9cf3-8c7332822aae\" (UID: \"a1b96455-88a6-4b99-9cf3-8c7332822aae\") " Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.439990 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c5a8910e-86fc-4909-ab14-cee4a4002f88-sys\") 
pod \"node-exporter-p5xzd\" (UID: \"c5a8910e-86fc-4909-ab14-cee4a4002f88\") " pod="openshift-monitoring/node-exporter-p5xzd" Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.440030 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/29dfec3b-120d-4a56-8ab6-7155b7ec5f43-volume-directive-shadow\") pod \"kube-state-metrics-777cb5bd5d-2qtgt\" (UID: \"29dfec3b-120d-4a56-8ab6-7155b7ec5f43\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-2qtgt" Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.445512 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/29dfec3b-120d-4a56-8ab6-7155b7ec5f43-volume-directive-shadow\") pod \"kube-state-metrics-777cb5bd5d-2qtgt\" (UID: \"29dfec3b-120d-4a56-8ab6-7155b7ec5f43\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-2qtgt" Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.446096 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c5a8910e-86fc-4909-ab14-cee4a4002f88-node-exporter-tls\") pod \"node-exporter-p5xzd\" (UID: \"c5a8910e-86fc-4909-ab14-cee4a4002f88\") " pod="openshift-monitoring/node-exporter-p5xzd" Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.446127 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c5a8910e-86fc-4909-ab14-cee4a4002f88-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-p5xzd\" (UID: \"c5a8910e-86fc-4909-ab14-cee4a4002f88\") " pod="openshift-monitoring/node-exporter-p5xzd" Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.446174 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfxvm\" (UniqueName: \"kubernetes.io/projected/c5a8910e-86fc-4909-ab14-cee4a4002f88-kube-api-access-qfxvm\") pod \"node-exporter-p5xzd\" (UID: \"c5a8910e-86fc-4909-ab14-cee4a4002f88\") " pod="openshift-monitoring/node-exporter-p5xzd" Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.446238 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c5a8910e-86fc-4909-ab14-cee4a4002f88-node-exporter-textfile\") pod \"node-exporter-p5xzd\" (UID: \"c5a8910e-86fc-4909-ab14-cee4a4002f88\") " pod="openshift-monitoring/node-exporter-p5xzd" Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.446267 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c5a8910e-86fc-4909-ab14-cee4a4002f88-node-exporter-wtmp\") pod \"node-exporter-p5xzd\" (UID: \"c5a8910e-86fc-4909-ab14-cee4a4002f88\") " pod="openshift-monitoring/node-exporter-p5xzd" Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.446307 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/29dfec3b-120d-4a56-8ab6-7155b7ec5f43-metrics-client-ca\") pod \"kube-state-metrics-777cb5bd5d-2qtgt\" (UID: \"29dfec3b-120d-4a56-8ab6-7155b7ec5f43\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-2qtgt" Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.446374 
Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.446408 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/29dfec3b-120d-4a56-8ab6-7155b7ec5f43-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-777cb5bd5d-2qtgt\" (UID: \"29dfec3b-120d-4a56-8ab6-7155b7ec5f43\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-2qtgt"
Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.446441 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/29dfec3b-120d-4a56-8ab6-7155b7ec5f43-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-777cb5bd5d-2qtgt\" (UID: \"29dfec3b-120d-4a56-8ab6-7155b7ec5f43\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-2qtgt"
Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.446479 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c5a8910e-86fc-4909-ab14-cee4a4002f88-metrics-client-ca\") pod \"node-exporter-p5xzd\" (UID: \"c5a8910e-86fc-4909-ab14-cee4a4002f88\") " pod="openshift-monitoring/node-exporter-p5xzd"
Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.446515 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/29dfec3b-120d-4a56-8ab6-7155b7ec5f43-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-2qtgt\" (UID: \"29dfec3b-120d-4a56-8ab6-7155b7ec5f43\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-2qtgt"
Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.446556 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lplbm\" (UniqueName: \"kubernetes.io/projected/29dfec3b-120d-4a56-8ab6-7155b7ec5f43-kube-api-access-lplbm\") pod \"kube-state-metrics-777cb5bd5d-2qtgt\" (UID: \"29dfec3b-120d-4a56-8ab6-7155b7ec5f43\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-2qtgt"
Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.447745 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/29dfec3b-120d-4a56-8ab6-7155b7ec5f43-metrics-client-ca\") pod \"kube-state-metrics-777cb5bd5d-2qtgt\" (UID: \"29dfec3b-120d-4a56-8ab6-7155b7ec5f43\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-2qtgt"
Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.449120 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/29dfec3b-120d-4a56-8ab6-7155b7ec5f43-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-777cb5bd5d-2qtgt\" (UID: \"29dfec3b-120d-4a56-8ab6-7155b7ec5f43\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-2qtgt"
Mar 09 14:10:04 crc kubenswrapper[4722]: E0309 14:10:04.449523 4722 secret.go:188] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found
Mar 09 14:10:04 crc kubenswrapper[4722]: E0309 14:10:04.449580 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/29dfec3b-120d-4a56-8ab6-7155b7ec5f43-kube-state-metrics-tls podName:29dfec3b-120d-4a56-8ab6-7155b7ec5f43 nodeName:}" failed. No retries permitted until 2026-03-09 14:10:04.949560825 +0000 UTC m=+445.505129401 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/29dfec3b-120d-4a56-8ab6-7155b7ec5f43-kube-state-metrics-tls") pod "kube-state-metrics-777cb5bd5d-2qtgt" (UID: "29dfec3b-120d-4a56-8ab6-7155b7ec5f43") : secret "kube-state-metrics-tls" not found
Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.453879 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1b96455-88a6-4b99-9cf3-8c7332822aae-kube-api-access-msqk7" (OuterVolumeSpecName: "kube-api-access-msqk7") pod "a1b96455-88a6-4b99-9cf3-8c7332822aae" (UID: "a1b96455-88a6-4b99-9cf3-8c7332822aae"). InnerVolumeSpecName "kube-api-access-msqk7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.453986 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/29dfec3b-120d-4a56-8ab6-7155b7ec5f43-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-777cb5bd5d-2qtgt\" (UID: \"29dfec3b-120d-4a56-8ab6-7155b7ec5f43\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-2qtgt"
Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.464430 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lplbm\" (UniqueName: \"kubernetes.io/projected/29dfec3b-120d-4a56-8ab6-7155b7ec5f43-kube-api-access-lplbm\") pod \"kube-state-metrics-777cb5bd5d-2qtgt\" (UID: \"29dfec3b-120d-4a56-8ab6-7155b7ec5f43\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-2qtgt"
Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.477824 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-566fddb674-wg4mz"
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-566fddb674-wg4mz" Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.547233 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c5a8910e-86fc-4909-ab14-cee4a4002f88-sys\") pod \"node-exporter-p5xzd\" (UID: \"c5a8910e-86fc-4909-ab14-cee4a4002f88\") " pod="openshift-monitoring/node-exporter-p5xzd" Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.547641 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c5a8910e-86fc-4909-ab14-cee4a4002f88-node-exporter-tls\") pod \"node-exporter-p5xzd\" (UID: \"c5a8910e-86fc-4909-ab14-cee4a4002f88\") " pod="openshift-monitoring/node-exporter-p5xzd" Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.547667 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c5a8910e-86fc-4909-ab14-cee4a4002f88-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-p5xzd\" (UID: \"c5a8910e-86fc-4909-ab14-cee4a4002f88\") " pod="openshift-monitoring/node-exporter-p5xzd" Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.547393 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c5a8910e-86fc-4909-ab14-cee4a4002f88-sys\") pod \"node-exporter-p5xzd\" (UID: \"c5a8910e-86fc-4909-ab14-cee4a4002f88\") " pod="openshift-monitoring/node-exporter-p5xzd" Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.547705 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfxvm\" (UniqueName: \"kubernetes.io/projected/c5a8910e-86fc-4909-ab14-cee4a4002f88-kube-api-access-qfxvm\") pod \"node-exporter-p5xzd\" (UID: \"c5a8910e-86fc-4909-ab14-cee4a4002f88\") " pod="openshift-monitoring/node-exporter-p5xzd" Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.547794 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c5a8910e-86fc-4909-ab14-cee4a4002f88-node-exporter-textfile\") pod \"node-exporter-p5xzd\" (UID: \"c5a8910e-86fc-4909-ab14-cee4a4002f88\") " pod="openshift-monitoring/node-exporter-p5xzd" Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.547826 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c5a8910e-86fc-4909-ab14-cee4a4002f88-node-exporter-wtmp\") pod \"node-exporter-p5xzd\" (UID: \"c5a8910e-86fc-4909-ab14-cee4a4002f88\") " pod="openshift-monitoring/node-exporter-p5xzd" Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.547940 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c5a8910e-86fc-4909-ab14-cee4a4002f88-root\") pod \"node-exporter-p5xzd\" (UID: \"c5a8910e-86fc-4909-ab14-cee4a4002f88\") " pod="openshift-monitoring/node-exporter-p5xzd" Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.547997 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c5a8910e-86fc-4909-ab14-cee4a4002f88-metrics-client-ca\") pod \"node-exporter-p5xzd\" (UID: \"c5a8910e-86fc-4909-ab14-cee4a4002f88\") " pod="openshift-monitoring/node-exporter-p5xzd" Mar 09 
14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.548087 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-msqk7\" (UniqueName: \"kubernetes.io/projected/a1b96455-88a6-4b99-9cf3-8c7332822aae-kube-api-access-msqk7\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.548757 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c5a8910e-86fc-4909-ab14-cee4a4002f88-metrics-client-ca\") pod \"node-exporter-p5xzd\" (UID: \"c5a8910e-86fc-4909-ab14-cee4a4002f88\") " pod="openshift-monitoring/node-exporter-p5xzd" Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.548765 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c5a8910e-86fc-4909-ab14-cee4a4002f88-node-exporter-wtmp\") pod \"node-exporter-p5xzd\" (UID: \"c5a8910e-86fc-4909-ab14-cee4a4002f88\") " pod="openshift-monitoring/node-exporter-p5xzd" Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.548797 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c5a8910e-86fc-4909-ab14-cee4a4002f88-root\") pod \"node-exporter-p5xzd\" (UID: \"c5a8910e-86fc-4909-ab14-cee4a4002f88\") " pod="openshift-monitoring/node-exporter-p5xzd" Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.549003 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c5a8910e-86fc-4909-ab14-cee4a4002f88-node-exporter-textfile\") pod \"node-exporter-p5xzd\" (UID: \"c5a8910e-86fc-4909-ab14-cee4a4002f88\") " pod="openshift-monitoring/node-exporter-p5xzd" Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.555544 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c5a8910e-86fc-4909-ab14-cee4a4002f88-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-p5xzd\" (UID: \"c5a8910e-86fc-4909-ab14-cee4a4002f88\") " pod="openshift-monitoring/node-exporter-p5xzd" Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.556962 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c5a8910e-86fc-4909-ab14-cee4a4002f88-node-exporter-tls\") pod \"node-exporter-p5xzd\" (UID: \"c5a8910e-86fc-4909-ab14-cee4a4002f88\") " pod="openshift-monitoring/node-exporter-p5xzd" Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.575759 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfxvm\" (UniqueName: \"kubernetes.io/projected/c5a8910e-86fc-4909-ab14-cee4a4002f88-kube-api-access-qfxvm\") pod \"node-exporter-p5xzd\" (UID: \"c5a8910e-86fc-4909-ab14-cee4a4002f88\") " pod="openshift-monitoring/node-exporter-p5xzd" Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.642559 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-p5xzd" Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.849100 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551090-trppz" event={"ID":"a1b96455-88a6-4b99-9cf3-8c7332822aae","Type":"ContainerDied","Data":"9c8e804e49a6fc4cc358f32647183d4cb8c3cc812cda49159c88533988c337fb"} Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.849150 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551090-trppz" Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.849160 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c8e804e49a6fc4cc358f32647183d4cb8c3cc812cda49159c88533988c337fb" Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.853601 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-p5xzd" event={"ID":"c5a8910e-86fc-4909-ab14-cee4a4002f88","Type":"ContainerStarted","Data":"b92336bc2fa8f57fe6094920eac434c9d6a9b6e56898396830f6353299a8c443"} Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.954864 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/29dfec3b-120d-4a56-8ab6-7155b7ec5f43-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-2qtgt\" (UID: \"29dfec3b-120d-4a56-8ab6-7155b7ec5f43\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-2qtgt" Mar 09 14:10:04 crc kubenswrapper[4722]: I0309 14:10:04.958926 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/29dfec3b-120d-4a56-8ab6-7155b7ec5f43-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-2qtgt\" (UID: \"29dfec3b-120d-4a56-8ab6-7155b7ec5f43\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-2qtgt" Mar 09 14:10:05 crc kubenswrapper[4722]: I0309 14:10:05.017769 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-2qtgt" Mar 09 14:10:05 crc kubenswrapper[4722]: I0309 14:10:05.231183 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-777cb5bd5d-2qtgt"] Mar 09 14:10:05 crc kubenswrapper[4722]: I0309 14:10:05.288500 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-566fddb674-wg4mz"] Mar 09 14:10:05 crc kubenswrapper[4722]: I0309 14:10:05.361568 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 09 14:10:05 crc kubenswrapper[4722]: E0309 14:10:05.361938 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1b96455-88a6-4b99-9cf3-8c7332822aae" containerName="oc" Mar 09 14:10:05 crc kubenswrapper[4722]: I0309 14:10:05.361958 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1b96455-88a6-4b99-9cf3-8c7332822aae" containerName="oc" Mar 09 14:10:05 crc kubenswrapper[4722]: I0309 14:10:05.362178 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1b96455-88a6-4b99-9cf3-8c7332822aae" containerName="oc" Mar 09 14:10:05 crc kubenswrapper[4722]: I0309 14:10:05.366812 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 09 14:10:05 crc kubenswrapper[4722]: I0309 14:10:05.373987 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-web" Mar 09 14:10:05 crc kubenswrapper[4722]: I0309 14:10:05.374178 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-metric" Mar 09 14:10:05 crc kubenswrapper[4722]: I0309 14:10:05.374313 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls-assets-0" Mar 09 14:10:05 crc kubenswrapper[4722]: I0309 14:10:05.374409 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-generated" Mar 09 14:10:05 crc kubenswrapper[4722]: I0309 14:10:05.374517 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy" Mar 09 14:10:05 crc kubenswrapper[4722]: I0309 14:10:05.374634 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-web-config" Mar 09 14:10:05 crc kubenswrapper[4722]: I0309 14:10:05.374746 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls" Mar 09 14:10:05 crc kubenswrapper[4722]: I0309 14:10:05.375817 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-dockercfg-84hlv" Mar 09 14:10:05 crc kubenswrapper[4722]: I0309 14:10:05.378847 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"alertmanager-trusted-ca-bundle" Mar 09 14:10:05 crc kubenswrapper[4722]: I0309 14:10:05.383874 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 09 14:10:05 crc kubenswrapper[4722]: I0309 14:10:05.563494 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/dba33bb4-644d-4534-9d39-0d3da7d18d0c-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"dba33bb4-644d-4534-9d39-0d3da7d18d0c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 09 14:10:05 crc kubenswrapper[4722]: I0309 14:10:05.563782 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5b9zk\" (UniqueName: \"kubernetes.io/projected/dba33bb4-644d-4534-9d39-0d3da7d18d0c-kube-api-access-5b9zk\") pod \"alertmanager-main-0\" (UID: \"dba33bb4-644d-4534-9d39-0d3da7d18d0c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 09 14:10:05 crc kubenswrapper[4722]: I0309 14:10:05.563834 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/dba33bb4-644d-4534-9d39-0d3da7d18d0c-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"dba33bb4-644d-4534-9d39-0d3da7d18d0c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 09 14:10:05 crc kubenswrapper[4722]: I0309 14:10:05.563856 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/dba33bb4-644d-4534-9d39-0d3da7d18d0c-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: 
\"dba33bb4-644d-4534-9d39-0d3da7d18d0c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 09 14:10:05 crc kubenswrapper[4722]: I0309 14:10:05.563870 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dba33bb4-644d-4534-9d39-0d3da7d18d0c-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"dba33bb4-644d-4534-9d39-0d3da7d18d0c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 09 14:10:05 crc kubenswrapper[4722]: I0309 14:10:05.563892 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/dba33bb4-644d-4534-9d39-0d3da7d18d0c-tls-assets\") pod \"alertmanager-main-0\" (UID: \"dba33bb4-644d-4534-9d39-0d3da7d18d0c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 09 14:10:05 crc kubenswrapper[4722]: I0309 14:10:05.563906 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/dba33bb4-644d-4534-9d39-0d3da7d18d0c-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"dba33bb4-644d-4534-9d39-0d3da7d18d0c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 09 14:10:05 crc kubenswrapper[4722]: I0309 14:10:05.563984 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/dba33bb4-644d-4534-9d39-0d3da7d18d0c-config-out\") pod \"alertmanager-main-0\" (UID: \"dba33bb4-644d-4534-9d39-0d3da7d18d0c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 09 14:10:05 crc kubenswrapper[4722]: I0309 14:10:05.564156 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/dba33bb4-644d-4534-9d39-0d3da7d18d0c-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"dba33bb4-644d-4534-9d39-0d3da7d18d0c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 09 14:10:05 crc kubenswrapper[4722]: I0309 14:10:05.564260 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dba33bb4-644d-4534-9d39-0d3da7d18d0c-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"dba33bb4-644d-4534-9d39-0d3da7d18d0c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 09 14:10:05 crc kubenswrapper[4722]: I0309 14:10:05.564295 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/dba33bb4-644d-4534-9d39-0d3da7d18d0c-config-volume\") pod \"alertmanager-main-0\" (UID: \"dba33bb4-644d-4534-9d39-0d3da7d18d0c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 09 14:10:05 crc kubenswrapper[4722]: I0309 14:10:05.564341 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/dba33bb4-644d-4534-9d39-0d3da7d18d0c-web-config\") pod \"alertmanager-main-0\" (UID: \"dba33bb4-644d-4534-9d39-0d3da7d18d0c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 09 14:10:05 crc kubenswrapper[4722]: I0309 14:10:05.665270 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/dba33bb4-644d-4534-9d39-0d3da7d18d0c-web-config\") pod \"alertmanager-main-0\" (UID: \"dba33bb4-644d-4534-9d39-0d3da7d18d0c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 09 14:10:05 crc kubenswrapper[4722]: I0309 14:10:05.665332 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/dba33bb4-644d-4534-9d39-0d3da7d18d0c-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"dba33bb4-644d-4534-9d39-0d3da7d18d0c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 09 14:10:05 crc kubenswrapper[4722]: I0309 14:10:05.665363 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5b9zk\" (UniqueName: \"kubernetes.io/projected/dba33bb4-644d-4534-9d39-0d3da7d18d0c-kube-api-access-5b9zk\") pod \"alertmanager-main-0\" (UID: \"dba33bb4-644d-4534-9d39-0d3da7d18d0c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 09 14:10:05 crc kubenswrapper[4722]: I0309 14:10:05.665424 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/dba33bb4-644d-4534-9d39-0d3da7d18d0c-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"dba33bb4-644d-4534-9d39-0d3da7d18d0c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 09 14:10:05 crc kubenswrapper[4722]: I0309 14:10:05.665452 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/dba33bb4-644d-4534-9d39-0d3da7d18d0c-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"dba33bb4-644d-4534-9d39-0d3da7d18d0c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 09 14:10:05 crc kubenswrapper[4722]: I0309 14:10:05.665473 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dba33bb4-644d-4534-9d39-0d3da7d18d0c-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"dba33bb4-644d-4534-9d39-0d3da7d18d0c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 09 14:10:05 crc kubenswrapper[4722]: I0309 14:10:05.665500 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/dba33bb4-644d-4534-9d39-0d3da7d18d0c-tls-assets\") pod \"alertmanager-main-0\" (UID: \"dba33bb4-644d-4534-9d39-0d3da7d18d0c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 09 14:10:05 crc kubenswrapper[4722]: I0309 14:10:05.665549 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/dba33bb4-644d-4534-9d39-0d3da7d18d0c-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"dba33bb4-644d-4534-9d39-0d3da7d18d0c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 09 14:10:05 crc kubenswrapper[4722]: I0309 14:10:05.665585 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/dba33bb4-644d-4534-9d39-0d3da7d18d0c-config-out\") pod \"alertmanager-main-0\" (UID: \"dba33bb4-644d-4534-9d39-0d3da7d18d0c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 09 14:10:05 crc kubenswrapper[4722]: I0309 14:10:05.665621 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/dba33bb4-644d-4534-9d39-0d3da7d18d0c-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"dba33bb4-644d-4534-9d39-0d3da7d18d0c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 09 14:10:05 crc kubenswrapper[4722]: I0309 14:10:05.665663 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dba33bb4-644d-4534-9d39-0d3da7d18d0c-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"dba33bb4-644d-4534-9d39-0d3da7d18d0c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 09 14:10:05 crc kubenswrapper[4722]: I0309 14:10:05.665688 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/dba33bb4-644d-4534-9d39-0d3da7d18d0c-config-volume\") pod \"alertmanager-main-0\" (UID: \"dba33bb4-644d-4534-9d39-0d3da7d18d0c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 09 14:10:05 crc kubenswrapper[4722]: I0309 14:10:05.666772 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dba33bb4-644d-4534-9d39-0d3da7d18d0c-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"dba33bb4-644d-4534-9d39-0d3da7d18d0c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 09 14:10:05 crc kubenswrapper[4722]: I0309 14:10:05.668707 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/dba33bb4-644d-4534-9d39-0d3da7d18d0c-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"dba33bb4-644d-4534-9d39-0d3da7d18d0c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 09 14:10:05 crc kubenswrapper[4722]: I0309 14:10:05.674810 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/dba33bb4-644d-4534-9d39-0d3da7d18d0c-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"dba33bb4-644d-4534-9d39-0d3da7d18d0c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 09 14:10:05 crc kubenswrapper[4722]: I0309 14:10:05.674811 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/dba33bb4-644d-4534-9d39-0d3da7d18d0c-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"dba33bb4-644d-4534-9d39-0d3da7d18d0c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 09 14:10:05 crc kubenswrapper[4722]: I0309 14:10:05.674810 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/dba33bb4-644d-4534-9d39-0d3da7d18d0c-config-volume\") pod \"alertmanager-main-0\" (UID: \"dba33bb4-644d-4534-9d39-0d3da7d18d0c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 09 14:10:05 crc kubenswrapper[4722]: I0309 14:10:05.675309 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/dba33bb4-644d-4534-9d39-0d3da7d18d0c-tls-assets\") pod \"alertmanager-main-0\" (UID: \"dba33bb4-644d-4534-9d39-0d3da7d18d0c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 09 14:10:05 crc kubenswrapper[4722]: I0309 14:10:05.675704 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/dba33bb4-644d-4534-9d39-0d3da7d18d0c-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"dba33bb4-644d-4534-9d39-0d3da7d18d0c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 09 14:10:05 crc kubenswrapper[4722]: I0309 14:10:05.675755 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/dba33bb4-644d-4534-9d39-0d3da7d18d0c-web-config\") pod \"alertmanager-main-0\" (UID: \"dba33bb4-644d-4534-9d39-0d3da7d18d0c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 09 14:10:05 crc kubenswrapper[4722]: I0309 14:10:05.676662 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/dba33bb4-644d-4534-9d39-0d3da7d18d0c-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"dba33bb4-644d-4534-9d39-0d3da7d18d0c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 09 14:10:05 crc kubenswrapper[4722]: I0309 14:10:05.677373 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/dba33bb4-644d-4534-9d39-0d3da7d18d0c-config-out\") pod \"alertmanager-main-0\" (UID: \"dba33bb4-644d-4534-9d39-0d3da7d18d0c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 09 14:10:05 crc kubenswrapper[4722]: I0309 14:10:05.680669 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/dba33bb4-644d-4534-9d39-0d3da7d18d0c-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"dba33bb4-644d-4534-9d39-0d3da7d18d0c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 09 14:10:05 crc kubenswrapper[4722]: I0309 14:10:05.683893 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5b9zk\" (UniqueName: \"kubernetes.io/projected/dba33bb4-644d-4534-9d39-0d3da7d18d0c-kube-api-access-5b9zk\") pod \"alertmanager-main-0\" (UID: \"dba33bb4-644d-4534-9d39-0d3da7d18d0c\") " pod="openshift-monitoring/alertmanager-main-0" Mar 09 14:10:05 crc kubenswrapper[4722]: I0309 14:10:05.704567 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 09 14:10:05 crc kubenswrapper[4722]: I0309 14:10:05.861040 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-2qtgt" event={"ID":"29dfec3b-120d-4a56-8ab6-7155b7ec5f43","Type":"ContainerStarted","Data":"3a9fdfd097e956e9cbc607be7b4d6b2a685612cadd7af92c847e463dac07fb58"} Mar 09 14:10:05 crc kubenswrapper[4722]: I0309 14:10:05.863485 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-wg4mz" event={"ID":"0f5ed552-309f-4cc0-8646-484b1e4be7d9","Type":"ContainerStarted","Data":"68a5e66007445160fdf91a5f203ceff088c6e9302c2b7610d221401387f1f802"} Mar 09 14:10:05 crc kubenswrapper[4722]: I0309 14:10:05.863564 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-wg4mz" event={"ID":"0f5ed552-309f-4cc0-8646-484b1e4be7d9","Type":"ContainerStarted","Data":"4d4176ebe2ee9c36385aee901a8004f1685fabf96bbed7a71b2a8c41bb1bd5a9"} Mar 09 14:10:05 crc kubenswrapper[4722]: I0309 14:10:05.863582 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-wg4mz" event={"ID":"0f5ed552-309f-4cc0-8646-484b1e4be7d9","Type":"ContainerStarted","Data":"9bcd5678bc227041b9e766671d65268ca3fd144e7cb23d40fb83d135840ac83f"} Mar 09 14:10:06 crc kubenswrapper[4722]: I0309 14:10:06.228110 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-5f6c868b98-s8fg2"] Mar 09 14:10:06 crc kubenswrapper[4722]: I0309 14:10:06.234064 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-5f6c868b98-s8fg2" Mar 09 14:10:06 crc kubenswrapper[4722]: I0309 14:10:06.236404 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-rules" Mar 09 14:10:06 crc kubenswrapper[4722]: I0309 14:10:06.236637 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy" Mar 09 14:10:06 crc kubenswrapper[4722]: I0309 14:10:06.238679 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-dockercfg-z9clk" Mar 09 14:10:06 crc kubenswrapper[4722]: I0309 14:10:06.238919 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-grpc-tls-eev02qd9h025t" Mar 09 14:10:06 crc kubenswrapper[4722]: I0309 14:10:06.239766 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-web" Mar 09 14:10:06 crc kubenswrapper[4722]: I0309 14:10:06.239851 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-tls" Mar 09 14:10:06 crc kubenswrapper[4722]: I0309 14:10:06.239979 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-metrics" Mar 09 14:10:06 crc kubenswrapper[4722]: I0309 14:10:06.252846 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5f6c868b98-s8fg2"] Mar 09 14:10:06 crc kubenswrapper[4722]: I0309 14:10:06.273146 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/1eb00f12-c24e-46dc-8346-c096826564f5-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5f6c868b98-s8fg2\" (UID: \"1eb00f12-c24e-46dc-8346-c096826564f5\") " pod="openshift-monitoring/thanos-querier-5f6c868b98-s8fg2" Mar 09 14:10:06 crc kubenswrapper[4722]: I0309 14:10:06.273231 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1eb00f12-c24e-46dc-8346-c096826564f5-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5f6c868b98-s8fg2\" (UID: \"1eb00f12-c24e-46dc-8346-c096826564f5\") " pod="openshift-monitoring/thanos-querier-5f6c868b98-s8fg2" Mar 09 14:10:06 crc kubenswrapper[4722]: I0309 14:10:06.273273 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1eb00f12-c24e-46dc-8346-c096826564f5-metrics-client-ca\") pod \"thanos-querier-5f6c868b98-s8fg2\" (UID: \"1eb00f12-c24e-46dc-8346-c096826564f5\") " pod="openshift-monitoring/thanos-querier-5f6c868b98-s8fg2" Mar 09 14:10:06 crc kubenswrapper[4722]: I0309 14:10:06.273295 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzrlp\" (UniqueName: \"kubernetes.io/projected/1eb00f12-c24e-46dc-8346-c096826564f5-kube-api-access-zzrlp\") pod \"thanos-querier-5f6c868b98-s8fg2\" (UID: \"1eb00f12-c24e-46dc-8346-c096826564f5\") " pod="openshift-monitoring/thanos-querier-5f6c868b98-s8fg2" Mar 09 14:10:06 crc kubenswrapper[4722]: I0309 14:10:06.273320 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/1eb00f12-c24e-46dc-8346-c096826564f5-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5f6c868b98-s8fg2\" (UID: \"1eb00f12-c24e-46dc-8346-c096826564f5\") " pod="openshift-monitoring/thanos-querier-5f6c868b98-s8fg2" Mar 09 14:10:06 crc kubenswrapper[4722]: I0309 14:10:06.273339 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/1eb00f12-c24e-46dc-8346-c096826564f5-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5f6c868b98-s8fg2\" (UID: \"1eb00f12-c24e-46dc-8346-c096826564f5\") " pod="openshift-monitoring/thanos-querier-5f6c868b98-s8fg2" Mar 09 14:10:06 crc kubenswrapper[4722]: I0309 14:10:06.273360 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1eb00f12-c24e-46dc-8346-c096826564f5-secret-grpc-tls\") pod \"thanos-querier-5f6c868b98-s8fg2\" (UID: \"1eb00f12-c24e-46dc-8346-c096826564f5\") " pod="openshift-monitoring/thanos-querier-5f6c868b98-s8fg2" Mar 09 14:10:06 crc kubenswrapper[4722]: I0309 14:10:06.273389 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/1eb00f12-c24e-46dc-8346-c096826564f5-secret-thanos-querier-tls\") pod \"thanos-querier-5f6c868b98-s8fg2\" (UID: \"1eb00f12-c24e-46dc-8346-c096826564f5\") " pod="openshift-monitoring/thanos-querier-5f6c868b98-s8fg2" Mar 09 14:10:06 crc kubenswrapper[4722]: I0309 14:10:06.319052 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-monitoring/alertmanager-main-0"] Mar 09 14:10:06 crc kubenswrapper[4722]: I0309 14:10:06.374750 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1eb00f12-c24e-46dc-8346-c096826564f5-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5f6c868b98-s8fg2\" (UID: \"1eb00f12-c24e-46dc-8346-c096826564f5\") " pod="openshift-monitoring/thanos-querier-5f6c868b98-s8fg2" Mar 09 14:10:06 crc kubenswrapper[4722]: I0309 14:10:06.375079 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1eb00f12-c24e-46dc-8346-c096826564f5-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5f6c868b98-s8fg2\" (UID: \"1eb00f12-c24e-46dc-8346-c096826564f5\") " pod="openshift-monitoring/thanos-querier-5f6c868b98-s8fg2" Mar 09 14:10:06 crc kubenswrapper[4722]: I0309 14:10:06.375178 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1eb00f12-c24e-46dc-8346-c096826564f5-metrics-client-ca\") pod \"thanos-querier-5f6c868b98-s8fg2\" (UID: \"1eb00f12-c24e-46dc-8346-c096826564f5\") " pod="openshift-monitoring/thanos-querier-5f6c868b98-s8fg2" Mar 09 14:10:06 crc kubenswrapper[4722]: I0309 14:10:06.375220 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzrlp\" (UniqueName: \"kubernetes.io/projected/1eb00f12-c24e-46dc-8346-c096826564f5-kube-api-access-zzrlp\") pod \"thanos-querier-5f6c868b98-s8fg2\" (UID: \"1eb00f12-c24e-46dc-8346-c096826564f5\") " pod="openshift-monitoring/thanos-querier-5f6c868b98-s8fg2" Mar 09 14:10:06 crc kubenswrapper[4722]: I0309 14:10:06.375259 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/1eb00f12-c24e-46dc-8346-c096826564f5-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5f6c868b98-s8fg2\" (UID: \"1eb00f12-c24e-46dc-8346-c096826564f5\") " pod="openshift-monitoring/thanos-querier-5f6c868b98-s8fg2" Mar 09 14:10:06 crc kubenswrapper[4722]: I0309 14:10:06.375286 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/1eb00f12-c24e-46dc-8346-c096826564f5-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5f6c868b98-s8fg2\" (UID: \"1eb00f12-c24e-46dc-8346-c096826564f5\") " pod="openshift-monitoring/thanos-querier-5f6c868b98-s8fg2" Mar 09 14:10:06 crc kubenswrapper[4722]: I0309 14:10:06.375316 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1eb00f12-c24e-46dc-8346-c096826564f5-secret-grpc-tls\") pod \"thanos-querier-5f6c868b98-s8fg2\" (UID: \"1eb00f12-c24e-46dc-8346-c096826564f5\") " pod="openshift-monitoring/thanos-querier-5f6c868b98-s8fg2" Mar 09 14:10:06 crc kubenswrapper[4722]: I0309 14:10:06.375372 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/1eb00f12-c24e-46dc-8346-c096826564f5-secret-thanos-querier-tls\") pod \"thanos-querier-5f6c868b98-s8fg2\" (UID: \"1eb00f12-c24e-46dc-8346-c096826564f5\") " pod="openshift-monitoring/thanos-querier-5f6c868b98-s8fg2" Mar 09 
14:10:06 crc kubenswrapper[4722]: I0309 14:10:06.378749 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1eb00f12-c24e-46dc-8346-c096826564f5-metrics-client-ca\") pod \"thanos-querier-5f6c868b98-s8fg2\" (UID: \"1eb00f12-c24e-46dc-8346-c096826564f5\") " pod="openshift-monitoring/thanos-querier-5f6c868b98-s8fg2" Mar 09 14:10:06 crc kubenswrapper[4722]: I0309 14:10:06.391187 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1eb00f12-c24e-46dc-8346-c096826564f5-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5f6c868b98-s8fg2\" (UID: \"1eb00f12-c24e-46dc-8346-c096826564f5\") " pod="openshift-monitoring/thanos-querier-5f6c868b98-s8fg2" Mar 09 14:10:06 crc kubenswrapper[4722]: I0309 14:10:06.391341 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1eb00f12-c24e-46dc-8346-c096826564f5-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5f6c868b98-s8fg2\" (UID: \"1eb00f12-c24e-46dc-8346-c096826564f5\") " pod="openshift-monitoring/thanos-querier-5f6c868b98-s8fg2" Mar 09 14:10:06 crc kubenswrapper[4722]: I0309 14:10:06.391366 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/1eb00f12-c24e-46dc-8346-c096826564f5-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5f6c868b98-s8fg2\" (UID: \"1eb00f12-c24e-46dc-8346-c096826564f5\") " pod="openshift-monitoring/thanos-querier-5f6c868b98-s8fg2" Mar 09 14:10:06 crc kubenswrapper[4722]: I0309 14:10:06.391766 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1eb00f12-c24e-46dc-8346-c096826564f5-secret-grpc-tls\") pod \"thanos-querier-5f6c868b98-s8fg2\" (UID: \"1eb00f12-c24e-46dc-8346-c096826564f5\") " pod="openshift-monitoring/thanos-querier-5f6c868b98-s8fg2" Mar 09 14:10:06 crc kubenswrapper[4722]: I0309 14:10:06.391926 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/1eb00f12-c24e-46dc-8346-c096826564f5-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5f6c868b98-s8fg2\" (UID: \"1eb00f12-c24e-46dc-8346-c096826564f5\") " pod="openshift-monitoring/thanos-querier-5f6c868b98-s8fg2" Mar 09 14:10:06 crc kubenswrapper[4722]: I0309 14:10:06.392029 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/1eb00f12-c24e-46dc-8346-c096826564f5-secret-thanos-querier-tls\") pod \"thanos-querier-5f6c868b98-s8fg2\" (UID: \"1eb00f12-c24e-46dc-8346-c096826564f5\") " pod="openshift-monitoring/thanos-querier-5f6c868b98-s8fg2" Mar 09 14:10:06 crc kubenswrapper[4722]: I0309 14:10:06.394537 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzrlp\" (UniqueName: \"kubernetes.io/projected/1eb00f12-c24e-46dc-8346-c096826564f5-kube-api-access-zzrlp\") pod \"thanos-querier-5f6c868b98-s8fg2\" (UID: \"1eb00f12-c24e-46dc-8346-c096826564f5\") " pod="openshift-monitoring/thanos-querier-5f6c868b98-s8fg2" Mar 09 14:10:06 crc kubenswrapper[4722]: I0309 14:10:06.607455 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-5f6c868b98-s8fg2" Mar 09 14:10:06 crc kubenswrapper[4722]: I0309 14:10:06.872799 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"dba33bb4-644d-4534-9d39-0d3da7d18d0c","Type":"ContainerStarted","Data":"c7fe1012ccbdaeb97ee384088d055d1b9229ec85a30f6d3fe041748aeb1165f0"} Mar 09 14:10:06 crc kubenswrapper[4722]: I0309 14:10:06.877312 4722 generic.go:334] "Generic (PLEG): container finished" podID="c5a8910e-86fc-4909-ab14-cee4a4002f88" containerID="ca24dee5b35d603fa003105ac4e8de9d57cb516b3f6c6a0de64f42e3c0a652e5" exitCode=0 Mar 09 14:10:06 crc kubenswrapper[4722]: I0309 14:10:06.877362 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-p5xzd" event={"ID":"c5a8910e-86fc-4909-ab14-cee4a4002f88","Type":"ContainerDied","Data":"ca24dee5b35d603fa003105ac4e8de9d57cb516b3f6c6a0de64f42e3c0a652e5"} Mar 09 14:10:07 crc kubenswrapper[4722]: I0309 14:10:07.886783 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-2qtgt" event={"ID":"29dfec3b-120d-4a56-8ab6-7155b7ec5f43","Type":"ContainerStarted","Data":"fab04d41a11f91963d132c9aba17c63d3ec68917911c8f89a54a1a4a3fe16885"} Mar 09 14:10:07 crc kubenswrapper[4722]: I0309 14:10:07.887474 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-2qtgt" event={"ID":"29dfec3b-120d-4a56-8ab6-7155b7ec5f43","Type":"ContainerStarted","Data":"72ea5024b801d473c8310d577c0866cfd8ddacb1cdc9796b2b512cb472b57dd8"} Mar 09 14:10:07 crc kubenswrapper[4722]: I0309 14:10:07.892834 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-p5xzd" event={"ID":"c5a8910e-86fc-4909-ab14-cee4a4002f88","Type":"ContainerStarted","Data":"a2bd8a492aae0302139452915a0d0f34c5e6c6e71c516ae0bb98e14fa3a6eed1"} Mar 09 14:10:07 crc kubenswrapper[4722]: I0309 14:10:07.892873 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-p5xzd" event={"ID":"c5a8910e-86fc-4909-ab14-cee4a4002f88","Type":"ContainerStarted","Data":"762dc7d1438151ee1223195cb76bc3855678e35aa007659d13b29102f240c5bc"} Mar 09 14:10:07 crc kubenswrapper[4722]: I0309 14:10:07.897907 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-wg4mz" event={"ID":"0f5ed552-309f-4cc0-8646-484b1e4be7d9","Type":"ContainerStarted","Data":"e6fbcf74d78a678597c1505c1d53907013501d38899f36d4e6a6b367f6f62880"} Mar 09 14:10:07 crc kubenswrapper[4722]: I0309 14:10:07.915644 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5f6c868b98-s8fg2"] Mar 09 14:10:07 crc kubenswrapper[4722]: I0309 14:10:07.920307 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-p5xzd" podStartSLOduration=2.6636704289999997 podStartE2EDuration="3.920286222s" podCreationTimestamp="2026-03-09 14:10:04 +0000 UTC" firstStartedPulling="2026-03-09 14:10:04.672125585 +0000 UTC m=+445.227694161" lastFinishedPulling="2026-03-09 14:10:05.928741378 +0000 UTC m=+446.484309954" observedRunningTime="2026-03-09 14:10:07.917388288 +0000 UTC m=+448.472956864" watchObservedRunningTime="2026-03-09 14:10:07.920286222 +0000 UTC m=+448.475854798" Mar 09 14:10:07 crc kubenswrapper[4722]: I0309 14:10:07.938024 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/openshift-state-metrics-566fddb674-wg4mz" podStartSLOduration=2.044276231 podStartE2EDuration="3.937989303s" podCreationTimestamp="2026-03-09 14:10:04 +0000 UTC" firstStartedPulling="2026-03-09 14:10:05.589074015 +0000 UTC m=+446.144642591" lastFinishedPulling="2026-03-09 14:10:07.482787087 +0000 UTC m=+448.038355663" observedRunningTime="2026-03-09 14:10:07.935752636 +0000 UTC m=+448.491321212" watchObservedRunningTime="2026-03-09 14:10:07.937989303 +0000 UTC m=+448.493557899" Mar 09 14:10:08 crc kubenswrapper[4722]: I0309 14:10:08.922192 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5f6c868b98-s8fg2" event={"ID":"1eb00f12-c24e-46dc-8346-c096826564f5","Type":"ContainerStarted","Data":"2c83631be4528fc4022dd3dd4e2d827df3bd7d2b2acf713593aa913b73571817"} Mar 09 14:10:08 crc kubenswrapper[4722]: I0309 14:10:08.932517 4722 generic.go:334] "Generic (PLEG): container finished" podID="dba33bb4-644d-4534-9d39-0d3da7d18d0c" containerID="ef9b81639dedde17cc8286139d2606a42c95162fbd1e90f0fc5510ae47be9c69" exitCode=0 Mar 09 14:10:08 crc kubenswrapper[4722]: I0309 14:10:08.932621 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"dba33bb4-644d-4534-9d39-0d3da7d18d0c","Type":"ContainerDied","Data":"ef9b81639dedde17cc8286139d2606a42c95162fbd1e90f0fc5510ae47be9c69"} Mar 09 14:10:08 crc kubenswrapper[4722]: I0309 14:10:08.961525 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-7b675946d5-fzz6p"] Mar 09 14:10:08 crc kubenswrapper[4722]: I0309 14:10:08.962553 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-2qtgt" event={"ID":"29dfec3b-120d-4a56-8ab6-7155b7ec5f43","Type":"ContainerStarted","Data":"0d41d9c11343fa5cc88d51797a84c14a52a076d13fde1387ad2db8ad2b8980d7"} Mar 09 14:10:08 crc kubenswrapper[4722]: I0309 14:10:08.962655 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7b675946d5-fzz6p" Mar 09 14:10:08 crc kubenswrapper[4722]: I0309 14:10:08.979634 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7b675946d5-fzz6p"] Mar 09 14:10:09 crc kubenswrapper[4722]: I0309 14:10:09.013485 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/23e0bff9-aca3-41b7-88d9-e67af2d00319-console-oauth-config\") pod \"console-7b675946d5-fzz6p\" (UID: \"23e0bff9-aca3-41b7-88d9-e67af2d00319\") " pod="openshift-console/console-7b675946d5-fzz6p" Mar 09 14:10:09 crc kubenswrapper[4722]: I0309 14:10:09.013629 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/23e0bff9-aca3-41b7-88d9-e67af2d00319-service-ca\") pod \"console-7b675946d5-fzz6p\" (UID: \"23e0bff9-aca3-41b7-88d9-e67af2d00319\") " pod="openshift-console/console-7b675946d5-fzz6p" Mar 09 14:10:09 crc kubenswrapper[4722]: I0309 14:10:09.013806 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/23e0bff9-aca3-41b7-88d9-e67af2d00319-console-serving-cert\") pod \"console-7b675946d5-fzz6p\" (UID: \"23e0bff9-aca3-41b7-88d9-e67af2d00319\") " pod="openshift-console/console-7b675946d5-fzz6p" Mar 09 14:10:09 crc kubenswrapper[4722]: I0309 14:10:09.013856 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/23e0bff9-aca3-41b7-88d9-e67af2d00319-console-config\") pod \"console-7b675946d5-fzz6p\" (UID: \"23e0bff9-aca3-41b7-88d9-e67af2d00319\") " pod="openshift-console/console-7b675946d5-fzz6p" Mar 09 14:10:09 crc kubenswrapper[4722]: I0309 14:10:09.015024 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23e0bff9-aca3-41b7-88d9-e67af2d00319-trusted-ca-bundle\") pod \"console-7b675946d5-fzz6p\" (UID: \"23e0bff9-aca3-41b7-88d9-e67af2d00319\") " pod="openshift-console/console-7b675946d5-fzz6p" Mar 09 14:10:09 crc kubenswrapper[4722]: I0309 14:10:09.015122 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/23e0bff9-aca3-41b7-88d9-e67af2d00319-oauth-serving-cert\") pod \"console-7b675946d5-fzz6p\" (UID: \"23e0bff9-aca3-41b7-88d9-e67af2d00319\") " pod="openshift-console/console-7b675946d5-fzz6p" Mar 09 14:10:09 crc kubenswrapper[4722]: I0309 14:10:09.015305 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5xnc\" (UniqueName: \"kubernetes.io/projected/23e0bff9-aca3-41b7-88d9-e67af2d00319-kube-api-access-g5xnc\") pod \"console-7b675946d5-fzz6p\" (UID: \"23e0bff9-aca3-41b7-88d9-e67af2d00319\") " pod="openshift-console/console-7b675946d5-fzz6p" Mar 09 14:10:09 crc kubenswrapper[4722]: I0309 14:10:09.039335 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-2qtgt" podStartSLOduration=2.8035268540000002 podStartE2EDuration="5.03930422s" podCreationTimestamp="2026-03-09 14:10:04 +0000 UTC" firstStartedPulling="2026-03-09 14:10:05.242176378 +0000 UTC m=+445.797744944" 
lastFinishedPulling="2026-03-09 14:10:07.477953734 +0000 UTC m=+448.033522310" observedRunningTime="2026-03-09 14:10:09.032696551 +0000 UTC m=+449.588265127" watchObservedRunningTime="2026-03-09 14:10:09.03930422 +0000 UTC m=+449.594872806" Mar 09 14:10:09 crc kubenswrapper[4722]: I0309 14:10:09.116818 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/23e0bff9-aca3-41b7-88d9-e67af2d00319-service-ca\") pod \"console-7b675946d5-fzz6p\" (UID: \"23e0bff9-aca3-41b7-88d9-e67af2d00319\") " pod="openshift-console/console-7b675946d5-fzz6p" Mar 09 14:10:09 crc kubenswrapper[4722]: I0309 14:10:09.116910 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/23e0bff9-aca3-41b7-88d9-e67af2d00319-console-serving-cert\") pod \"console-7b675946d5-fzz6p\" (UID: \"23e0bff9-aca3-41b7-88d9-e67af2d00319\") " pod="openshift-console/console-7b675946d5-fzz6p" Mar 09 14:10:09 crc kubenswrapper[4722]: I0309 14:10:09.116944 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23e0bff9-aca3-41b7-88d9-e67af2d00319-trusted-ca-bundle\") pod \"console-7b675946d5-fzz6p\" (UID: \"23e0bff9-aca3-41b7-88d9-e67af2d00319\") " pod="openshift-console/console-7b675946d5-fzz6p" Mar 09 14:10:09 crc kubenswrapper[4722]: I0309 14:10:09.116962 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/23e0bff9-aca3-41b7-88d9-e67af2d00319-console-config\") pod \"console-7b675946d5-fzz6p\" (UID: \"23e0bff9-aca3-41b7-88d9-e67af2d00319\") " pod="openshift-console/console-7b675946d5-fzz6p" Mar 09 14:10:09 crc kubenswrapper[4722]: I0309 14:10:09.116985 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/23e0bff9-aca3-41b7-88d9-e67af2d00319-oauth-serving-cert\") pod \"console-7b675946d5-fzz6p\" (UID: \"23e0bff9-aca3-41b7-88d9-e67af2d00319\") " pod="openshift-console/console-7b675946d5-fzz6p" Mar 09 14:10:09 crc kubenswrapper[4722]: I0309 14:10:09.117014 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5xnc\" (UniqueName: \"kubernetes.io/projected/23e0bff9-aca3-41b7-88d9-e67af2d00319-kube-api-access-g5xnc\") pod \"console-7b675946d5-fzz6p\" (UID: \"23e0bff9-aca3-41b7-88d9-e67af2d00319\") " pod="openshift-console/console-7b675946d5-fzz6p" Mar 09 14:10:09 crc kubenswrapper[4722]: I0309 14:10:09.117050 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/23e0bff9-aca3-41b7-88d9-e67af2d00319-console-oauth-config\") pod \"console-7b675946d5-fzz6p\" (UID: \"23e0bff9-aca3-41b7-88d9-e67af2d00319\") " pod="openshift-console/console-7b675946d5-fzz6p" Mar 09 14:10:09 crc kubenswrapper[4722]: I0309 14:10:09.118044 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/23e0bff9-aca3-41b7-88d9-e67af2d00319-console-config\") pod \"console-7b675946d5-fzz6p\" (UID: \"23e0bff9-aca3-41b7-88d9-e67af2d00319\") " pod="openshift-console/console-7b675946d5-fzz6p" Mar 09 14:10:09 crc kubenswrapper[4722]: I0309 14:10:09.118322 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/23e0bff9-aca3-41b7-88d9-e67af2d00319-service-ca\") pod \"console-7b675946d5-fzz6p\" (UID: \"23e0bff9-aca3-41b7-88d9-e67af2d00319\") " pod="openshift-console/console-7b675946d5-fzz6p" Mar 09 14:10:09 crc kubenswrapper[4722]: I0309 14:10:09.118094 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/23e0bff9-aca3-41b7-88d9-e67af2d00319-oauth-serving-cert\") pod \"console-7b675946d5-fzz6p\" (UID: \"23e0bff9-aca3-41b7-88d9-e67af2d00319\") " pod="openshift-console/console-7b675946d5-fzz6p" Mar 09 14:10:09 crc kubenswrapper[4722]: I0309 14:10:09.120715 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23e0bff9-aca3-41b7-88d9-e67af2d00319-trusted-ca-bundle\") pod \"console-7b675946d5-fzz6p\" (UID: \"23e0bff9-aca3-41b7-88d9-e67af2d00319\") " pod="openshift-console/console-7b675946d5-fzz6p" Mar 09 14:10:09 crc kubenswrapper[4722]: I0309 14:10:09.121311 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/23e0bff9-aca3-41b7-88d9-e67af2d00319-console-serving-cert\") pod \"console-7b675946d5-fzz6p\" (UID: \"23e0bff9-aca3-41b7-88d9-e67af2d00319\") " pod="openshift-console/console-7b675946d5-fzz6p" Mar 09 14:10:09 crc kubenswrapper[4722]: I0309 14:10:09.131649 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/23e0bff9-aca3-41b7-88d9-e67af2d00319-console-oauth-config\") pod \"console-7b675946d5-fzz6p\" (UID: \"23e0bff9-aca3-41b7-88d9-e67af2d00319\") " pod="openshift-console/console-7b675946d5-fzz6p" Mar 09 14:10:09 crc kubenswrapper[4722]: I0309 14:10:09.134290 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5xnc\" (UniqueName: \"kubernetes.io/projected/23e0bff9-aca3-41b7-88d9-e67af2d00319-kube-api-access-g5xnc\") pod \"console-7b675946d5-fzz6p\" (UID: \"23e0bff9-aca3-41b7-88d9-e67af2d00319\") " pod="openshift-console/console-7b675946d5-fzz6p" Mar 09 14:10:09 crc kubenswrapper[4722]: I0309 14:10:09.299481 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7b675946d5-fzz6p" Mar 09 14:10:09 crc kubenswrapper[4722]: I0309 14:10:09.576635 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-bcd6d9dd6-5rp7g"] Mar 09 14:10:09 crc kubenswrapper[4722]: I0309 14:10:09.577944 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-bcd6d9dd6-5rp7g" Mar 09 14:10:09 crc kubenswrapper[4722]: I0309 14:10:09.581525 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-server-audit-profiles" Mar 09 14:10:09 crc kubenswrapper[4722]: I0309 14:10:09.581601 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-2lc8nbr7cg5od" Mar 09 14:10:09 crc kubenswrapper[4722]: I0309 14:10:09.584057 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-dockercfg-qvqjf" Mar 09 14:10:09 crc kubenswrapper[4722]: I0309 14:10:09.588953 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-bcd6d9dd6-5rp7g"] Mar 09 14:10:09 crc kubenswrapper[4722]: I0309 14:10:09.591140 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kubelet-serving-ca-bundle" Mar 09 14:10:09 crc kubenswrapper[4722]: I0309 14:10:09.591777 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-tls" Mar 09 14:10:09 crc kubenswrapper[4722]: I0309 14:10:09.592052 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-client-certs" Mar 09 14:10:09 crc kubenswrapper[4722]: I0309 14:10:09.730422 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/f8a65e9f-5e0a-47d0-b251-aa4e52e2f581-audit-log\") pod \"metrics-server-bcd6d9dd6-5rp7g\" (UID: \"f8a65e9f-5e0a-47d0-b251-aa4e52e2f581\") " pod="openshift-monitoring/metrics-server-bcd6d9dd6-5rp7g" Mar 09 14:10:09 crc kubenswrapper[4722]: I0309 14:10:09.730502 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8a65e9f-5e0a-47d0-b251-aa4e52e2f581-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-bcd6d9dd6-5rp7g\" (UID: \"f8a65e9f-5e0a-47d0-b251-aa4e52e2f581\") " pod="openshift-monitoring/metrics-server-bcd6d9dd6-5rp7g" Mar 09 14:10:09 crc kubenswrapper[4722]: I0309 14:10:09.730542 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8a65e9f-5e0a-47d0-b251-aa4e52e2f581-client-ca-bundle\") pod \"metrics-server-bcd6d9dd6-5rp7g\" (UID: \"f8a65e9f-5e0a-47d0-b251-aa4e52e2f581\") " pod="openshift-monitoring/metrics-server-bcd6d9dd6-5rp7g" Mar 09 14:10:09 crc kubenswrapper[4722]: I0309 14:10:09.730561 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nklsm\" (UniqueName: \"kubernetes.io/projected/f8a65e9f-5e0a-47d0-b251-aa4e52e2f581-kube-api-access-nklsm\") pod \"metrics-server-bcd6d9dd6-5rp7g\" (UID: \"f8a65e9f-5e0a-47d0-b251-aa4e52e2f581\") " pod="openshift-monitoring/metrics-server-bcd6d9dd6-5rp7g" Mar 09 14:10:09 crc kubenswrapper[4722]: I0309 14:10:09.730595 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f8a65e9f-5e0a-47d0-b251-aa4e52e2f581-secret-metrics-client-certs\") pod \"metrics-server-bcd6d9dd6-5rp7g\" (UID: \"f8a65e9f-5e0a-47d0-b251-aa4e52e2f581\") " pod="openshift-monitoring/metrics-server-bcd6d9dd6-5rp7g" Mar 09 14:10:09 
crc kubenswrapper[4722]: I0309 14:10:09.730615 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/f8a65e9f-5e0a-47d0-b251-aa4e52e2f581-metrics-server-audit-profiles\") pod \"metrics-server-bcd6d9dd6-5rp7g\" (UID: \"f8a65e9f-5e0a-47d0-b251-aa4e52e2f581\") " pod="openshift-monitoring/metrics-server-bcd6d9dd6-5rp7g" Mar 09 14:10:09 crc kubenswrapper[4722]: I0309 14:10:09.730637 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/f8a65e9f-5e0a-47d0-b251-aa4e52e2f581-secret-metrics-server-tls\") pod \"metrics-server-bcd6d9dd6-5rp7g\" (UID: \"f8a65e9f-5e0a-47d0-b251-aa4e52e2f581\") " pod="openshift-monitoring/metrics-server-bcd6d9dd6-5rp7g" Mar 09 14:10:09 crc kubenswrapper[4722]: I0309 14:10:09.832478 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8a65e9f-5e0a-47d0-b251-aa4e52e2f581-client-ca-bundle\") pod \"metrics-server-bcd6d9dd6-5rp7g\" (UID: \"f8a65e9f-5e0a-47d0-b251-aa4e52e2f581\") " pod="openshift-monitoring/metrics-server-bcd6d9dd6-5rp7g" Mar 09 14:10:09 crc kubenswrapper[4722]: I0309 14:10:09.832540 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nklsm\" (UniqueName: \"kubernetes.io/projected/f8a65e9f-5e0a-47d0-b251-aa4e52e2f581-kube-api-access-nklsm\") pod \"metrics-server-bcd6d9dd6-5rp7g\" (UID: \"f8a65e9f-5e0a-47d0-b251-aa4e52e2f581\") " pod="openshift-monitoring/metrics-server-bcd6d9dd6-5rp7g" Mar 09 14:10:09 crc kubenswrapper[4722]: I0309 14:10:09.832596 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f8a65e9f-5e0a-47d0-b251-aa4e52e2f581-secret-metrics-client-certs\") pod \"metrics-server-bcd6d9dd6-5rp7g\" (UID: \"f8a65e9f-5e0a-47d0-b251-aa4e52e2f581\") " pod="openshift-monitoring/metrics-server-bcd6d9dd6-5rp7g" Mar 09 14:10:09 crc kubenswrapper[4722]: I0309 14:10:09.832620 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/f8a65e9f-5e0a-47d0-b251-aa4e52e2f581-metrics-server-audit-profiles\") pod \"metrics-server-bcd6d9dd6-5rp7g\" (UID: \"f8a65e9f-5e0a-47d0-b251-aa4e52e2f581\") " pod="openshift-monitoring/metrics-server-bcd6d9dd6-5rp7g" Mar 09 14:10:09 crc kubenswrapper[4722]: I0309 14:10:09.832645 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/f8a65e9f-5e0a-47d0-b251-aa4e52e2f581-secret-metrics-server-tls\") pod \"metrics-server-bcd6d9dd6-5rp7g\" (UID: \"f8a65e9f-5e0a-47d0-b251-aa4e52e2f581\") " pod="openshift-monitoring/metrics-server-bcd6d9dd6-5rp7g" Mar 09 14:10:09 crc kubenswrapper[4722]: I0309 14:10:09.832701 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/f8a65e9f-5e0a-47d0-b251-aa4e52e2f581-audit-log\") pod \"metrics-server-bcd6d9dd6-5rp7g\" (UID: \"f8a65e9f-5e0a-47d0-b251-aa4e52e2f581\") " pod="openshift-monitoring/metrics-server-bcd6d9dd6-5rp7g" Mar 09 14:10:09 crc kubenswrapper[4722]: I0309 14:10:09.832757 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8a65e9f-5e0a-47d0-b251-aa4e52e2f581-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-bcd6d9dd6-5rp7g\" (UID: \"f8a65e9f-5e0a-47d0-b251-aa4e52e2f581\") " pod="openshift-monitoring/metrics-server-bcd6d9dd6-5rp7g" Mar 09 14:10:09 crc kubenswrapper[4722]: I0309 14:10:09.833813 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8a65e9f-5e0a-47d0-b251-aa4e52e2f581-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-bcd6d9dd6-5rp7g\" (UID: \"f8a65e9f-5e0a-47d0-b251-aa4e52e2f581\") " pod="openshift-monitoring/metrics-server-bcd6d9dd6-5rp7g" Mar 09 14:10:09 crc kubenswrapper[4722]: I0309 14:10:09.836073 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/f8a65e9f-5e0a-47d0-b251-aa4e52e2f581-audit-log\") pod \"metrics-server-bcd6d9dd6-5rp7g\" (UID: \"f8a65e9f-5e0a-47d0-b251-aa4e52e2f581\") " pod="openshift-monitoring/metrics-server-bcd6d9dd6-5rp7g" Mar 09 14:10:09 crc kubenswrapper[4722]: I0309 14:10:09.836358 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/f8a65e9f-5e0a-47d0-b251-aa4e52e2f581-metrics-server-audit-profiles\") pod \"metrics-server-bcd6d9dd6-5rp7g\" (UID: \"f8a65e9f-5e0a-47d0-b251-aa4e52e2f581\") " pod="openshift-monitoring/metrics-server-bcd6d9dd6-5rp7g" Mar 09 14:10:09 crc kubenswrapper[4722]: I0309 14:10:09.839784 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/f8a65e9f-5e0a-47d0-b251-aa4e52e2f581-secret-metrics-server-tls\") pod \"metrics-server-bcd6d9dd6-5rp7g\" (UID: \"f8a65e9f-5e0a-47d0-b251-aa4e52e2f581\") " pod="openshift-monitoring/metrics-server-bcd6d9dd6-5rp7g" Mar 09 14:10:09 crc kubenswrapper[4722]: I0309 14:10:09.840028 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8a65e9f-5e0a-47d0-b251-aa4e52e2f581-client-ca-bundle\") pod \"metrics-server-bcd6d9dd6-5rp7g\" (UID: \"f8a65e9f-5e0a-47d0-b251-aa4e52e2f581\") " pod="openshift-monitoring/metrics-server-bcd6d9dd6-5rp7g" Mar 09 14:10:09 crc kubenswrapper[4722]: I0309 14:10:09.840115 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f8a65e9f-5e0a-47d0-b251-aa4e52e2f581-secret-metrics-client-certs\") pod \"metrics-server-bcd6d9dd6-5rp7g\" (UID: \"f8a65e9f-5e0a-47d0-b251-aa4e52e2f581\") " pod="openshift-monitoring/metrics-server-bcd6d9dd6-5rp7g" Mar 09 14:10:09 crc kubenswrapper[4722]: I0309 14:10:09.849014 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nklsm\" (UniqueName: \"kubernetes.io/projected/f8a65e9f-5e0a-47d0-b251-aa4e52e2f581-kube-api-access-nklsm\") pod \"metrics-server-bcd6d9dd6-5rp7g\" (UID: \"f8a65e9f-5e0a-47d0-b251-aa4e52e2f581\") " pod="openshift-monitoring/metrics-server-bcd6d9dd6-5rp7g" Mar 09 14:10:09 crc kubenswrapper[4722]: I0309 14:10:09.899818 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-bcd6d9dd6-5rp7g" Mar 09 14:10:09 crc kubenswrapper[4722]: I0309 14:10:09.924485 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-65599947bd-42bk4"] Mar 09 14:10:09 crc kubenswrapper[4722]: I0309 14:10:09.925728 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-65599947bd-42bk4" Mar 09 14:10:09 crc kubenswrapper[4722]: I0309 14:10:09.929283 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"default-dockercfg-6tstp" Mar 09 14:10:09 crc kubenswrapper[4722]: I0309 14:10:09.934934 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-65599947bd-42bk4"] Mar 09 14:10:09 crc kubenswrapper[4722]: I0309 14:10:09.936707 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"monitoring-plugin-cert" Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.035045 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/85f5a76e-3679-44b3-8932-f5245c49b481-monitoring-plugin-cert\") pod \"monitoring-plugin-65599947bd-42bk4\" (UID: \"85f5a76e-3679-44b3-8932-f5245c49b481\") " pod="openshift-monitoring/monitoring-plugin-65599947bd-42bk4" Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.144508 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/85f5a76e-3679-44b3-8932-f5245c49b481-monitoring-plugin-cert\") pod \"monitoring-plugin-65599947bd-42bk4\" (UID: \"85f5a76e-3679-44b3-8932-f5245c49b481\") " pod="openshift-monitoring/monitoring-plugin-65599947bd-42bk4" Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.164819 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/85f5a76e-3679-44b3-8932-f5245c49b481-monitoring-plugin-cert\") pod \"monitoring-plugin-65599947bd-42bk4\" (UID: \"85f5a76e-3679-44b3-8932-f5245c49b481\") " pod="openshift-monitoring/monitoring-plugin-65599947bd-42bk4" Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.251859 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-65599947bd-42bk4" Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.624402 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.627320 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.632285 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-prometheus-http-client-file" Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.632492 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-rbac-proxy" Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.632593 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-kube-rbac-proxy-web" Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.632841 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls-assets-0" Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.632964 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s" Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.633038 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-dockercfg-lj8dj" Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.633946 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-sidecar-tls" Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.634177 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"serving-certs-ca-bundle" Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.634190 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-grpc-tls-6oeiieod8robd" Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.632191 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-web-config" Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.648584 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls" Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.649343 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-k8s-rulefiles-0" Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.650264 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-trusted-ca-bundle" Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.656698 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.736886 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7b675946d5-fzz6p"] Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.741672 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-bcd6d9dd6-5rp7g"] Mar 09 14:10:10 crc kubenswrapper[4722]: W0309 14:10:10.747001 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8a65e9f_5e0a_47d0_b251_aa4e52e2f581.slice/crio-24b941d4a1c679d869909d0a886abf77640bc2e14945dcb7c0a9984f6f01512c WatchSource:0}: Error finding container 24b941d4a1c679d869909d0a886abf77640bc2e14945dcb7c0a9984f6f01512c: Status 404 returned error can't find the container with id 24b941d4a1c679d869909d0a886abf77640bc2e14945dcb7c0a9984f6f01512c Mar 09 14:10:10 crc kubenswrapper[4722]: 
I0309 14:10:10.751663 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hn9b8\" (UniqueName: \"kubernetes.io/projected/df07ee71-98db-42a1-9df4-6f8707504f08-kube-api-access-hn9b8\") pod \"prometheus-k8s-0\" (UID: \"df07ee71-98db-42a1-9df4-6f8707504f08\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.751723 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/df07ee71-98db-42a1-9df4-6f8707504f08-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"df07ee71-98db-42a1-9df4-6f8707504f08\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.751759 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/df07ee71-98db-42a1-9df4-6f8707504f08-config-out\") pod \"prometheus-k8s-0\" (UID: \"df07ee71-98db-42a1-9df4-6f8707504f08\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.751788 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/df07ee71-98db-42a1-9df4-6f8707504f08-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"df07ee71-98db-42a1-9df4-6f8707504f08\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.751808 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/df07ee71-98db-42a1-9df4-6f8707504f08-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"df07ee71-98db-42a1-9df4-6f8707504f08\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.751826 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/df07ee71-98db-42a1-9df4-6f8707504f08-config\") pod \"prometheus-k8s-0\" (UID: \"df07ee71-98db-42a1-9df4-6f8707504f08\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.751845 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/df07ee71-98db-42a1-9df4-6f8707504f08-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"df07ee71-98db-42a1-9df4-6f8707504f08\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.751870 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/df07ee71-98db-42a1-9df4-6f8707504f08-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"df07ee71-98db-42a1-9df4-6f8707504f08\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.751891 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/df07ee71-98db-42a1-9df4-6f8707504f08-web-config\") pod \"prometheus-k8s-0\" (UID: \"df07ee71-98db-42a1-9df4-6f8707504f08\") " 
pod="openshift-monitoring/prometheus-k8s-0" Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.751915 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/df07ee71-98db-42a1-9df4-6f8707504f08-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"df07ee71-98db-42a1-9df4-6f8707504f08\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.751940 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/df07ee71-98db-42a1-9df4-6f8707504f08-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"df07ee71-98db-42a1-9df4-6f8707504f08\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.752253 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/df07ee71-98db-42a1-9df4-6f8707504f08-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"df07ee71-98db-42a1-9df4-6f8707504f08\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.752404 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/df07ee71-98db-42a1-9df4-6f8707504f08-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"df07ee71-98db-42a1-9df4-6f8707504f08\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.752475 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/df07ee71-98db-42a1-9df4-6f8707504f08-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"df07ee71-98db-42a1-9df4-6f8707504f08\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.752511 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/df07ee71-98db-42a1-9df4-6f8707504f08-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"df07ee71-98db-42a1-9df4-6f8707504f08\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.752557 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/df07ee71-98db-42a1-9df4-6f8707504f08-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"df07ee71-98db-42a1-9df4-6f8707504f08\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.752584 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/df07ee71-98db-42a1-9df4-6f8707504f08-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"df07ee71-98db-42a1-9df4-6f8707504f08\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.752611 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/df07ee71-98db-42a1-9df4-6f8707504f08-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"df07ee71-98db-42a1-9df4-6f8707504f08\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.813093 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-65599947bd-42bk4"] Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.854007 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/df07ee71-98db-42a1-9df4-6f8707504f08-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"df07ee71-98db-42a1-9df4-6f8707504f08\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.854071 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hn9b8\" (UniqueName: \"kubernetes.io/projected/df07ee71-98db-42a1-9df4-6f8707504f08-kube-api-access-hn9b8\") pod \"prometheus-k8s-0\" (UID: \"df07ee71-98db-42a1-9df4-6f8707504f08\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.854112 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/df07ee71-98db-42a1-9df4-6f8707504f08-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"df07ee71-98db-42a1-9df4-6f8707504f08\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.854147 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/df07ee71-98db-42a1-9df4-6f8707504f08-config-out\") pod \"prometheus-k8s-0\" (UID: \"df07ee71-98db-42a1-9df4-6f8707504f08\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.854176 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/df07ee71-98db-42a1-9df4-6f8707504f08-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"df07ee71-98db-42a1-9df4-6f8707504f08\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.854225 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/df07ee71-98db-42a1-9df4-6f8707504f08-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"df07ee71-98db-42a1-9df4-6f8707504f08\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.854247 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/df07ee71-98db-42a1-9df4-6f8707504f08-config\") pod \"prometheus-k8s-0\" (UID: \"df07ee71-98db-42a1-9df4-6f8707504f08\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.854272 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/df07ee71-98db-42a1-9df4-6f8707504f08-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"df07ee71-98db-42a1-9df4-6f8707504f08\") " 
pod="openshift-monitoring/prometheus-k8s-0" Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.854300 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/df07ee71-98db-42a1-9df4-6f8707504f08-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"df07ee71-98db-42a1-9df4-6f8707504f08\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.854322 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/df07ee71-98db-42a1-9df4-6f8707504f08-web-config\") pod \"prometheus-k8s-0\" (UID: \"df07ee71-98db-42a1-9df4-6f8707504f08\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.854351 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/df07ee71-98db-42a1-9df4-6f8707504f08-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"df07ee71-98db-42a1-9df4-6f8707504f08\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.854381 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/df07ee71-98db-42a1-9df4-6f8707504f08-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"df07ee71-98db-42a1-9df4-6f8707504f08\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.854408 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/df07ee71-98db-42a1-9df4-6f8707504f08-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"df07ee71-98db-42a1-9df4-6f8707504f08\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.854470 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/df07ee71-98db-42a1-9df4-6f8707504f08-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"df07ee71-98db-42a1-9df4-6f8707504f08\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.854496 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/df07ee71-98db-42a1-9df4-6f8707504f08-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"df07ee71-98db-42a1-9df4-6f8707504f08\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.854669 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/df07ee71-98db-42a1-9df4-6f8707504f08-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"df07ee71-98db-42a1-9df4-6f8707504f08\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.854897 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/df07ee71-98db-42a1-9df4-6f8707504f08-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"df07ee71-98db-42a1-9df4-6f8707504f08\") " 
pod="openshift-monitoring/prometheus-k8s-0" Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.854963 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/df07ee71-98db-42a1-9df4-6f8707504f08-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"df07ee71-98db-42a1-9df4-6f8707504f08\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.855058 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/df07ee71-98db-42a1-9df4-6f8707504f08-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"df07ee71-98db-42a1-9df4-6f8707504f08\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.856188 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/df07ee71-98db-42a1-9df4-6f8707504f08-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"df07ee71-98db-42a1-9df4-6f8707504f08\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.856440 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/df07ee71-98db-42a1-9df4-6f8707504f08-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"df07ee71-98db-42a1-9df4-6f8707504f08\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.859391 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/df07ee71-98db-42a1-9df4-6f8707504f08-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"df07ee71-98db-42a1-9df4-6f8707504f08\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.859997 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/df07ee71-98db-42a1-9df4-6f8707504f08-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"df07ee71-98db-42a1-9df4-6f8707504f08\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.863362 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/df07ee71-98db-42a1-9df4-6f8707504f08-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"df07ee71-98db-42a1-9df4-6f8707504f08\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.864084 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/df07ee71-98db-42a1-9df4-6f8707504f08-config\") pod \"prometheus-k8s-0\" (UID: \"df07ee71-98db-42a1-9df4-6f8707504f08\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.864134 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/df07ee71-98db-42a1-9df4-6f8707504f08-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: 
\"df07ee71-98db-42a1-9df4-6f8707504f08\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.864758 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/df07ee71-98db-42a1-9df4-6f8707504f08-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"df07ee71-98db-42a1-9df4-6f8707504f08\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.864905 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/df07ee71-98db-42a1-9df4-6f8707504f08-web-config\") pod \"prometheus-k8s-0\" (UID: \"df07ee71-98db-42a1-9df4-6f8707504f08\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.865663 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/df07ee71-98db-42a1-9df4-6f8707504f08-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"df07ee71-98db-42a1-9df4-6f8707504f08\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.865716 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/df07ee71-98db-42a1-9df4-6f8707504f08-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"df07ee71-98db-42a1-9df4-6f8707504f08\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.865811 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/df07ee71-98db-42a1-9df4-6f8707504f08-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"df07ee71-98db-42a1-9df4-6f8707504f08\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.867219 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/df07ee71-98db-42a1-9df4-6f8707504f08-config-out\") pod \"prometheus-k8s-0\" (UID: \"df07ee71-98db-42a1-9df4-6f8707504f08\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.877958 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/df07ee71-98db-42a1-9df4-6f8707504f08-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"df07ee71-98db-42a1-9df4-6f8707504f08\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.878519 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/df07ee71-98db-42a1-9df4-6f8707504f08-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"df07ee71-98db-42a1-9df4-6f8707504f08\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.881879 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hn9b8\" (UniqueName: \"kubernetes.io/projected/df07ee71-98db-42a1-9df4-6f8707504f08-kube-api-access-hn9b8\") pod \"prometheus-k8s-0\" (UID: \"df07ee71-98db-42a1-9df4-6f8707504f08\") " 
pod="openshift-monitoring/prometheus-k8s-0" Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.883123 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/df07ee71-98db-42a1-9df4-6f8707504f08-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"df07ee71-98db-42a1-9df4-6f8707504f08\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.964258 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.974371 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-bcd6d9dd6-5rp7g" event={"ID":"f8a65e9f-5e0a-47d0-b251-aa4e52e2f581","Type":"ContainerStarted","Data":"24b941d4a1c679d869909d0a886abf77640bc2e14945dcb7c0a9984f6f01512c"} Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.981532 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-65599947bd-42bk4" event={"ID":"85f5a76e-3679-44b3-8932-f5245c49b481","Type":"ContainerStarted","Data":"a6d6122e66de8a30dbb68f28218a6609ce223e32067c8ebfb68474507aaa37a2"} Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.985357 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5f6c868b98-s8fg2" event={"ID":"1eb00f12-c24e-46dc-8346-c096826564f5","Type":"ContainerStarted","Data":"76bba07a28aaa7fc0955e07e814a80d9d8540f00e8e7358fa7c60da5f0d10206"} Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.985422 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5f6c868b98-s8fg2" event={"ID":"1eb00f12-c24e-46dc-8346-c096826564f5","Type":"ContainerStarted","Data":"04d6505b05ea67e1943915cdfab96e94bf1131835258822b560b518191cbab3f"} Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.985443 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5f6c868b98-s8fg2" event={"ID":"1eb00f12-c24e-46dc-8346-c096826564f5","Type":"ContainerStarted","Data":"165556eb8378c0e43338968f9d52ef7848132c0f5bd4028ba4f93bfacdbc87dc"} Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.988848 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b675946d5-fzz6p" event={"ID":"23e0bff9-aca3-41b7-88d9-e67af2d00319","Type":"ContainerStarted","Data":"769483573e2da12bf63a74127b02961d9bf5d8f969e268e3610cd466360933ee"} Mar 09 14:10:10 crc kubenswrapper[4722]: I0309 14:10:10.988900 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b675946d5-fzz6p" event={"ID":"23e0bff9-aca3-41b7-88d9-e67af2d00319","Type":"ContainerStarted","Data":"56901853a9d0663664fda4dfa4defb72d36613607f805f67dece3a89bb8e4a20"} Mar 09 14:10:11 crc kubenswrapper[4722]: I0309 14:10:11.017398 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7b675946d5-fzz6p" podStartSLOduration=3.0173789 podStartE2EDuration="3.0173789s" podCreationTimestamp="2026-03-09 14:10:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:10:11.015242887 +0000 UTC m=+451.570811483" watchObservedRunningTime="2026-03-09 14:10:11.0173789 +0000 UTC m=+451.572947476" Mar 09 14:10:11 crc kubenswrapper[4722]: I0309 14:10:11.212042 4722 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 09 14:10:12 crc kubenswrapper[4722]: I0309 14:10:11.999167 4722 generic.go:334] "Generic (PLEG): container finished" podID="df07ee71-98db-42a1-9df4-6f8707504f08" containerID="0343c3d385fc848dabd966432d253ca497684d69a30bbce4c1639b3d5957f4a1" exitCode=0 Mar 09 14:10:12 crc kubenswrapper[4722]: I0309 14:10:11.999349 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"df07ee71-98db-42a1-9df4-6f8707504f08","Type":"ContainerDied","Data":"0343c3d385fc848dabd966432d253ca497684d69a30bbce4c1639b3d5957f4a1"} Mar 09 14:10:12 crc kubenswrapper[4722]: I0309 14:10:11.999734 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"df07ee71-98db-42a1-9df4-6f8707504f08","Type":"ContainerStarted","Data":"1dcf9f86631dbc102976b10415ac88e6ec57d4bb9d5c0610048b0da1fbee9cba"} Mar 09 14:10:14 crc kubenswrapper[4722]: I0309 14:10:14.018029 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"dba33bb4-644d-4534-9d39-0d3da7d18d0c","Type":"ContainerStarted","Data":"45a2bf5bd197aaf82341f1ab7704abd8a5978893790c1a27879a9e506a0866f0"} Mar 09 14:10:14 crc kubenswrapper[4722]: I0309 14:10:14.018580 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"dba33bb4-644d-4534-9d39-0d3da7d18d0c","Type":"ContainerStarted","Data":"267b1f4b7fba07ab0e2d47811773a4c98fe13fc7dc02db30972870645de6b8c5"} Mar 09 14:10:14 crc kubenswrapper[4722]: I0309 14:10:14.018594 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"dba33bb4-644d-4534-9d39-0d3da7d18d0c","Type":"ContainerStarted","Data":"7d7ff9ee22c3c83199d97fb095ed2f35468c0cabb8328460f0a4d4f9f3b9945f"} Mar 09 14:10:14 crc kubenswrapper[4722]: I0309 14:10:14.018604 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"dba33bb4-644d-4534-9d39-0d3da7d18d0c","Type":"ContainerStarted","Data":"bd30d3bba2ee68fcd95cd3cd3867ff2120ceab179b42c6fcfab2aa59835565fd"} Mar 09 14:10:14 crc kubenswrapper[4722]: I0309 14:10:14.018614 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"dba33bb4-644d-4534-9d39-0d3da7d18d0c","Type":"ContainerStarted","Data":"2d904c5190f272ce5647fd9605728a36df8b3f3f286aafa37b7f2a222ba0217b"} Mar 09 14:10:14 crc kubenswrapper[4722]: I0309 14:10:14.018624 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"dba33bb4-644d-4534-9d39-0d3da7d18d0c","Type":"ContainerStarted","Data":"2366a991a5bf2c69a88178b9d3637a06c019a8bf3bf80c30f7d1ad618c326432"} Mar 09 14:10:14 crc kubenswrapper[4722]: I0309 14:10:14.024216 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5f6c868b98-s8fg2" event={"ID":"1eb00f12-c24e-46dc-8346-c096826564f5","Type":"ContainerStarted","Data":"3dd7fb882abda13b8699b5e01ffa4c10d9f51e72587011e0456ef1db257a84ee"} Mar 09 14:10:14 crc kubenswrapper[4722]: I0309 14:10:14.024268 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5f6c868b98-s8fg2" event={"ID":"1eb00f12-c24e-46dc-8346-c096826564f5","Type":"ContainerStarted","Data":"48865d2f3e2a6b6d60bf95b5e993ea5132bd3129ec05dc78e0c27e9c259dba58"} Mar 09 14:10:14 crc kubenswrapper[4722]: I0309 
14:10:14.024281 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5f6c868b98-s8fg2" event={"ID":"1eb00f12-c24e-46dc-8346-c096826564f5","Type":"ContainerStarted","Data":"8d19276b98597558ff594db27fe7f608856a1be3d94ab501ce260541d0a6e712"} Mar 09 14:10:14 crc kubenswrapper[4722]: I0309 14:10:14.024446 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/thanos-querier-5f6c868b98-s8fg2" Mar 09 14:10:14 crc kubenswrapper[4722]: I0309 14:10:14.026224 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-bcd6d9dd6-5rp7g" event={"ID":"f8a65e9f-5e0a-47d0-b251-aa4e52e2f581","Type":"ContainerStarted","Data":"accf60ca5a63b2963a1bf6cb44c81ffff1c61e6d3e9b03e803b0af50dfcdb881"} Mar 09 14:10:14 crc kubenswrapper[4722]: I0309 14:10:14.027751 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-65599947bd-42bk4" event={"ID":"85f5a76e-3679-44b3-8932-f5245c49b481","Type":"ContainerStarted","Data":"0343712ab35bfa33bfde47ad44ee97ca71aa0ab667f76d286f47cb79dbac51af"} Mar 09 14:10:14 crc kubenswrapper[4722]: I0309 14:10:14.028881 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/monitoring-plugin-65599947bd-42bk4" Mar 09 14:10:14 crc kubenswrapper[4722]: I0309 14:10:14.036754 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-65599947bd-42bk4" Mar 09 14:10:14 crc kubenswrapper[4722]: I0309 14:10:14.056656 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.452515076 podStartE2EDuration="9.056592495s" podCreationTimestamp="2026-03-09 14:10:05 +0000 UTC" firstStartedPulling="2026-03-09 14:10:06.332139414 +0000 UTC m=+446.887707990" lastFinishedPulling="2026-03-09 14:10:12.936216833 +0000 UTC m=+453.491785409" observedRunningTime="2026-03-09 14:10:14.046662032 +0000 UTC m=+454.602230608" watchObservedRunningTime="2026-03-09 14:10:14.056592495 +0000 UTC m=+454.612161071" Mar 09 14:10:14 crc kubenswrapper[4722]: I0309 14:10:14.071607 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-65599947bd-42bk4" podStartSLOduration=2.959868971 podStartE2EDuration="5.071573297s" podCreationTimestamp="2026-03-09 14:10:09 +0000 UTC" firstStartedPulling="2026-03-09 14:10:10.828809377 +0000 UTC m=+451.384377953" lastFinishedPulling="2026-03-09 14:10:12.940513703 +0000 UTC m=+453.496082279" observedRunningTime="2026-03-09 14:10:14.065688076 +0000 UTC m=+454.621256652" watchObservedRunningTime="2026-03-09 14:10:14.071573297 +0000 UTC m=+454.627141873" Mar 09 14:10:14 crc kubenswrapper[4722]: I0309 14:10:14.100098 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-5f6c868b98-s8fg2" podStartSLOduration=3.094721581 podStartE2EDuration="8.100078163s" podCreationTimestamp="2026-03-09 14:10:06 +0000 UTC" firstStartedPulling="2026-03-09 14:10:07.93160842 +0000 UTC m=+448.487176996" lastFinishedPulling="2026-03-09 14:10:12.936965002 +0000 UTC m=+453.492533578" observedRunningTime="2026-03-09 14:10:14.096984054 +0000 UTC m=+454.652552630" watchObservedRunningTime="2026-03-09 14:10:14.100078163 +0000 UTC m=+454.655646739" Mar 09 14:10:14 crc kubenswrapper[4722]: I0309 14:10:14.117544 4722 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-monitoring/metrics-server-bcd6d9dd6-5rp7g" podStartSLOduration=2.923823322 podStartE2EDuration="5.117522026s" podCreationTimestamp="2026-03-09 14:10:09 +0000 UTC" firstStartedPulling="2026-03-09 14:10:10.750597465 +0000 UTC m=+451.306166041" lastFinishedPulling="2026-03-09 14:10:12.944296169 +0000 UTC m=+453.499864745" observedRunningTime="2026-03-09 14:10:14.11685611 +0000 UTC m=+454.672424686" watchObservedRunningTime="2026-03-09 14:10:14.117522026 +0000 UTC m=+454.673090602" Mar 09 14:10:14 crc kubenswrapper[4722]: I0309 14:10:14.866335 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-dch29" Mar 09 14:10:14 crc kubenswrapper[4722]: I0309 14:10:14.930527 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-tz6gh"] Mar 09 14:10:15 crc kubenswrapper[4722]: I0309 14:10:15.061065 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-5f6c868b98-s8fg2" Mar 09 14:10:17 crc kubenswrapper[4722]: I0309 14:10:17.052869 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"df07ee71-98db-42a1-9df4-6f8707504f08","Type":"ContainerStarted","Data":"3b00225f411a3eaefe25ccad70409fa7bdc99f61a5ef690b2e5ec2ddb4218a3a"} Mar 09 14:10:17 crc kubenswrapper[4722]: I0309 14:10:17.053241 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"df07ee71-98db-42a1-9df4-6f8707504f08","Type":"ContainerStarted","Data":"f4b3c89e47fc44ffc4721a43a17d482afe9de05244aff5537528a2090b30e242"} Mar 09 14:10:17 crc kubenswrapper[4722]: I0309 14:10:17.053252 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"df07ee71-98db-42a1-9df4-6f8707504f08","Type":"ContainerStarted","Data":"5b4b88f0162c757e3548c6bc88800f5a7d5e6ce2f20ddb671681147ed3d847c0"} Mar 09 14:10:17 crc kubenswrapper[4722]: I0309 14:10:17.053260 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"df07ee71-98db-42a1-9df4-6f8707504f08","Type":"ContainerStarted","Data":"315c2945d39ab31c74ff1761ff56b771e9206f01ef1f3ced917b374b0595f12c"} Mar 09 14:10:17 crc kubenswrapper[4722]: I0309 14:10:17.053269 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"df07ee71-98db-42a1-9df4-6f8707504f08","Type":"ContainerStarted","Data":"0d41a21a91ac17dbc6775ed65aed03dc4f667c73a33a1ec842f6226feb11c51c"} Mar 09 14:10:17 crc kubenswrapper[4722]: I0309 14:10:17.053278 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"df07ee71-98db-42a1-9df4-6f8707504f08","Type":"ContainerStarted","Data":"5c6af4f5b1390e0fa350198e4b30efc579a7139fde74125479e6797423853804"} Mar 09 14:10:17 crc kubenswrapper[4722]: I0309 14:10:17.100695 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.901094919 podStartE2EDuration="7.100676952s" podCreationTimestamp="2026-03-09 14:10:10 +0000 UTC" firstStartedPulling="2026-03-09 14:10:12.00066687 +0000 UTC m=+452.556235476" lastFinishedPulling="2026-03-09 14:10:16.200248933 +0000 UTC m=+456.755817509" observedRunningTime="2026-03-09 14:10:17.091972271 +0000 UTC m=+457.647540887" watchObservedRunningTime="2026-03-09 
14:10:17.100676952 +0000 UTC m=+457.656245518" Mar 09 14:10:19 crc kubenswrapper[4722]: I0309 14:10:19.300396 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-7b675946d5-fzz6p" Mar 09 14:10:19 crc kubenswrapper[4722]: I0309 14:10:19.300819 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7b675946d5-fzz6p" Mar 09 14:10:19 crc kubenswrapper[4722]: I0309 14:10:19.312927 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7b675946d5-fzz6p" Mar 09 14:10:20 crc kubenswrapper[4722]: I0309 14:10:20.081655 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7b675946d5-fzz6p" Mar 09 14:10:20 crc kubenswrapper[4722]: I0309 14:10:20.165287 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-sfch8"] Mar 09 14:10:20 crc kubenswrapper[4722]: I0309 14:10:20.965466 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-k8s-0" Mar 09 14:10:21 crc kubenswrapper[4722]: I0309 14:10:21.528970 4722 patch_prober.go:28] interesting pod/machine-config-daemon-hjrrb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 14:10:21 crc kubenswrapper[4722]: I0309 14:10:21.529056 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 14:10:29 crc kubenswrapper[4722]: I0309 14:10:29.900652 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/metrics-server-bcd6d9dd6-5rp7g" Mar 09 14:10:29 crc kubenswrapper[4722]: I0309 14:10:29.901192 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-bcd6d9dd6-5rp7g" Mar 09 14:10:39 crc kubenswrapper[4722]: I0309 14:10:39.979046 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh" podUID="9d57536b-4f57-4098-b519-19fdc2559eda" containerName="registry" containerID="cri-o://adf82621040528d1293e4a06d3d17db8a5bb552b35348f0f50a943baf206c18a" gracePeriod=30 Mar 09 14:10:40 crc kubenswrapper[4722]: I0309 14:10:40.261483 4722 generic.go:334] "Generic (PLEG): container finished" podID="9d57536b-4f57-4098-b519-19fdc2559eda" containerID="adf82621040528d1293e4a06d3d17db8a5bb552b35348f0f50a943baf206c18a" exitCode=0 Mar 09 14:10:40 crc kubenswrapper[4722]: I0309 14:10:40.261570 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh" event={"ID":"9d57536b-4f57-4098-b519-19fdc2559eda","Type":"ContainerDied","Data":"adf82621040528d1293e4a06d3d17db8a5bb552b35348f0f50a943baf206c18a"} Mar 09 14:10:41 crc kubenswrapper[4722]: I0309 14:10:41.042860 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh" Mar 09 14:10:41 crc kubenswrapper[4722]: I0309 14:10:41.222889 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9d57536b-4f57-4098-b519-19fdc2559eda-installation-pull-secrets\") pod \"9d57536b-4f57-4098-b519-19fdc2559eda\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " Mar 09 14:10:41 crc kubenswrapper[4722]: I0309 14:10:41.223126 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbdw2\" (UniqueName: \"kubernetes.io/projected/9d57536b-4f57-4098-b519-19fdc2559eda-kube-api-access-bbdw2\") pod \"9d57536b-4f57-4098-b519-19fdc2559eda\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " Mar 09 14:10:41 crc kubenswrapper[4722]: I0309 14:10:41.223457 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"9d57536b-4f57-4098-b519-19fdc2559eda\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " Mar 09 14:10:41 crc kubenswrapper[4722]: I0309 14:10:41.223628 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d57536b-4f57-4098-b519-19fdc2559eda-trusted-ca\") pod \"9d57536b-4f57-4098-b519-19fdc2559eda\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " Mar 09 14:10:41 crc kubenswrapper[4722]: I0309 14:10:41.224628 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9d57536b-4f57-4098-b519-19fdc2559eda-ca-trust-extracted\") pod \"9d57536b-4f57-4098-b519-19fdc2559eda\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " Mar 09 14:10:41 crc kubenswrapper[4722]: I0309 14:10:41.224698 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9d57536b-4f57-4098-b519-19fdc2559eda-registry-certificates\") pod \"9d57536b-4f57-4098-b519-19fdc2559eda\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " Mar 09 14:10:41 crc kubenswrapper[4722]: I0309 14:10:41.224752 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9d57536b-4f57-4098-b519-19fdc2559eda-registry-tls\") pod \"9d57536b-4f57-4098-b519-19fdc2559eda\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " Mar 09 14:10:41 crc kubenswrapper[4722]: I0309 14:10:41.224778 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d57536b-4f57-4098-b519-19fdc2559eda-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d57536b-4f57-4098-b519-19fdc2559eda" (UID: "9d57536b-4f57-4098-b519-19fdc2559eda"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:10:41 crc kubenswrapper[4722]: I0309 14:10:41.224893 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9d57536b-4f57-4098-b519-19fdc2559eda-bound-sa-token\") pod \"9d57536b-4f57-4098-b519-19fdc2559eda\" (UID: \"9d57536b-4f57-4098-b519-19fdc2559eda\") " Mar 09 14:10:41 crc kubenswrapper[4722]: I0309 14:10:41.225609 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d57536b-4f57-4098-b519-19fdc2559eda-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "9d57536b-4f57-4098-b519-19fdc2559eda" (UID: "9d57536b-4f57-4098-b519-19fdc2559eda"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:10:41 crc kubenswrapper[4722]: I0309 14:10:41.225877 4722 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d57536b-4f57-4098-b519-19fdc2559eda-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:41 crc kubenswrapper[4722]: I0309 14:10:41.225911 4722 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9d57536b-4f57-4098-b519-19fdc2559eda-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:41 crc kubenswrapper[4722]: I0309 14:10:41.235772 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d57536b-4f57-4098-b519-19fdc2559eda-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "9d57536b-4f57-4098-b519-19fdc2559eda" (UID: "9d57536b-4f57-4098-b519-19fdc2559eda"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:10:41 crc kubenswrapper[4722]: I0309 14:10:41.236017 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d57536b-4f57-4098-b519-19fdc2559eda-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "9d57536b-4f57-4098-b519-19fdc2559eda" (UID: "9d57536b-4f57-4098-b519-19fdc2559eda"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:10:41 crc kubenswrapper[4722]: I0309 14:10:41.236465 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d57536b-4f57-4098-b519-19fdc2559eda-kube-api-access-bbdw2" (OuterVolumeSpecName: "kube-api-access-bbdw2") pod "9d57536b-4f57-4098-b519-19fdc2559eda" (UID: "9d57536b-4f57-4098-b519-19fdc2559eda"). InnerVolumeSpecName "kube-api-access-bbdw2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:10:41 crc kubenswrapper[4722]: I0309 14:10:41.237531 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "9d57536b-4f57-4098-b519-19fdc2559eda" (UID: "9d57536b-4f57-4098-b519-19fdc2559eda"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 09 14:10:41 crc kubenswrapper[4722]: I0309 14:10:41.237997 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d57536b-4f57-4098-b519-19fdc2559eda-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "9d57536b-4f57-4098-b519-19fdc2559eda" (UID: "9d57536b-4f57-4098-b519-19fdc2559eda"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:10:41 crc kubenswrapper[4722]: I0309 14:10:41.248389 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d57536b-4f57-4098-b519-19fdc2559eda-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "9d57536b-4f57-4098-b519-19fdc2559eda" (UID: "9d57536b-4f57-4098-b519-19fdc2559eda"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:10:41 crc kubenswrapper[4722]: I0309 14:10:41.272657 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh" event={"ID":"9d57536b-4f57-4098-b519-19fdc2559eda","Type":"ContainerDied","Data":"743f4b506d0a7b71136d894ffc0bf3e17c1ccc6acd143da4697e5829a6a62cde"} Mar 09 14:10:41 crc kubenswrapper[4722]: I0309 14:10:41.272704 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-tz6gh" Mar 09 14:10:41 crc kubenswrapper[4722]: I0309 14:10:41.272741 4722 scope.go:117] "RemoveContainer" containerID="adf82621040528d1293e4a06d3d17db8a5bb552b35348f0f50a943baf206c18a" Mar 09 14:10:41 crc kubenswrapper[4722]: I0309 14:10:41.308370 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-tz6gh"] Mar 09 14:10:41 crc kubenswrapper[4722]: I0309 14:10:41.313756 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-tz6gh"] Mar 09 14:10:41 crc kubenswrapper[4722]: I0309 14:10:41.327413 4722 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9d57536b-4f57-4098-b519-19fdc2559eda-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:41 crc kubenswrapper[4722]: I0309 14:10:41.327464 4722 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9d57536b-4f57-4098-b519-19fdc2559eda-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:41 crc kubenswrapper[4722]: I0309 14:10:41.327477 4722 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9d57536b-4f57-4098-b519-19fdc2559eda-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:41 crc kubenswrapper[4722]: I0309 14:10:41.327491 4722 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9d57536b-4f57-4098-b519-19fdc2559eda-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:41 crc kubenswrapper[4722]: I0309 14:10:41.327508 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bbdw2\" (UniqueName: \"kubernetes.io/projected/9d57536b-4f57-4098-b519-19fdc2559eda-kube-api-access-bbdw2\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:42 crc kubenswrapper[4722]: I0309 14:10:42.161053 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="9d57536b-4f57-4098-b519-19fdc2559eda" path="/var/lib/kubelet/pods/9d57536b-4f57-4098-b519-19fdc2559eda/volumes" Mar 09 14:10:45 crc kubenswrapper[4722]: I0309 14:10:45.234005 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-sfch8" podUID="8a76c9b5-c226-4d93-8d7a-8e56210b572a" containerName="console" containerID="cri-o://ff7a4d0a96e5e1f0cc0fdcc6585ec856f476a50b47ef4023cefba4393ef415b6" gracePeriod=15 Mar 09 14:10:45 crc kubenswrapper[4722]: I0309 14:10:45.614891 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-sfch8_8a76c9b5-c226-4d93-8d7a-8e56210b572a/console/0.log" Mar 09 14:10:45 crc kubenswrapper[4722]: I0309 14:10:45.614983 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-sfch8" Mar 09 14:10:45 crc kubenswrapper[4722]: I0309 14:10:45.786472 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7tvf\" (UniqueName: \"kubernetes.io/projected/8a76c9b5-c226-4d93-8d7a-8e56210b572a-kube-api-access-k7tvf\") pod \"8a76c9b5-c226-4d93-8d7a-8e56210b572a\" (UID: \"8a76c9b5-c226-4d93-8d7a-8e56210b572a\") " Mar 09 14:10:45 crc kubenswrapper[4722]: I0309 14:10:45.786542 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8a76c9b5-c226-4d93-8d7a-8e56210b572a-oauth-serving-cert\") pod \"8a76c9b5-c226-4d93-8d7a-8e56210b572a\" (UID: \"8a76c9b5-c226-4d93-8d7a-8e56210b572a\") " Mar 09 14:10:45 crc kubenswrapper[4722]: I0309 14:10:45.786598 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a76c9b5-c226-4d93-8d7a-8e56210b572a-trusted-ca-bundle\") pod \"8a76c9b5-c226-4d93-8d7a-8e56210b572a\" (UID: \"8a76c9b5-c226-4d93-8d7a-8e56210b572a\") " Mar 09 14:10:45 crc kubenswrapper[4722]: I0309 14:10:45.786644 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8a76c9b5-c226-4d93-8d7a-8e56210b572a-service-ca\") pod \"8a76c9b5-c226-4d93-8d7a-8e56210b572a\" (UID: \"8a76c9b5-c226-4d93-8d7a-8e56210b572a\") " Mar 09 14:10:45 crc kubenswrapper[4722]: I0309 14:10:45.786685 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8a76c9b5-c226-4d93-8d7a-8e56210b572a-console-serving-cert\") pod \"8a76c9b5-c226-4d93-8d7a-8e56210b572a\" (UID: \"8a76c9b5-c226-4d93-8d7a-8e56210b572a\") " Mar 09 14:10:45 crc kubenswrapper[4722]: I0309 14:10:45.786773 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8a76c9b5-c226-4d93-8d7a-8e56210b572a-console-config\") pod \"8a76c9b5-c226-4d93-8d7a-8e56210b572a\" (UID: \"8a76c9b5-c226-4d93-8d7a-8e56210b572a\") " Mar 09 14:10:45 crc kubenswrapper[4722]: I0309 14:10:45.786822 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8a76c9b5-c226-4d93-8d7a-8e56210b572a-console-oauth-config\") pod \"8a76c9b5-c226-4d93-8d7a-8e56210b572a\" (UID: \"8a76c9b5-c226-4d93-8d7a-8e56210b572a\") " Mar 09 14:10:45 crc kubenswrapper[4722]: I0309 14:10:45.787243 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/8a76c9b5-c226-4d93-8d7a-8e56210b572a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "8a76c9b5-c226-4d93-8d7a-8e56210b572a" (UID: "8a76c9b5-c226-4d93-8d7a-8e56210b572a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:10:45 crc kubenswrapper[4722]: I0309 14:10:45.787389 4722 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a76c9b5-c226-4d93-8d7a-8e56210b572a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:45 crc kubenswrapper[4722]: I0309 14:10:45.787479 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a76c9b5-c226-4d93-8d7a-8e56210b572a-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "8a76c9b5-c226-4d93-8d7a-8e56210b572a" (UID: "8a76c9b5-c226-4d93-8d7a-8e56210b572a"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:10:45 crc kubenswrapper[4722]: I0309 14:10:45.787739 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a76c9b5-c226-4d93-8d7a-8e56210b572a-console-config" (OuterVolumeSpecName: "console-config") pod "8a76c9b5-c226-4d93-8d7a-8e56210b572a" (UID: "8a76c9b5-c226-4d93-8d7a-8e56210b572a"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:10:45 crc kubenswrapper[4722]: I0309 14:10:45.788059 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a76c9b5-c226-4d93-8d7a-8e56210b572a-service-ca" (OuterVolumeSpecName: "service-ca") pod "8a76c9b5-c226-4d93-8d7a-8e56210b572a" (UID: "8a76c9b5-c226-4d93-8d7a-8e56210b572a"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:10:45 crc kubenswrapper[4722]: I0309 14:10:45.794637 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a76c9b5-c226-4d93-8d7a-8e56210b572a-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "8a76c9b5-c226-4d93-8d7a-8e56210b572a" (UID: "8a76c9b5-c226-4d93-8d7a-8e56210b572a"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:10:45 crc kubenswrapper[4722]: I0309 14:10:45.794893 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a76c9b5-c226-4d93-8d7a-8e56210b572a-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "8a76c9b5-c226-4d93-8d7a-8e56210b572a" (UID: "8a76c9b5-c226-4d93-8d7a-8e56210b572a"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:10:45 crc kubenswrapper[4722]: I0309 14:10:45.804772 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a76c9b5-c226-4d93-8d7a-8e56210b572a-kube-api-access-k7tvf" (OuterVolumeSpecName: "kube-api-access-k7tvf") pod "8a76c9b5-c226-4d93-8d7a-8e56210b572a" (UID: "8a76c9b5-c226-4d93-8d7a-8e56210b572a"). InnerVolumeSpecName "kube-api-access-k7tvf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:10:45 crc kubenswrapper[4722]: I0309 14:10:45.888882 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7tvf\" (UniqueName: \"kubernetes.io/projected/8a76c9b5-c226-4d93-8d7a-8e56210b572a-kube-api-access-k7tvf\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:45 crc kubenswrapper[4722]: I0309 14:10:45.889188 4722 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/8a76c9b5-c226-4d93-8d7a-8e56210b572a-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:45 crc kubenswrapper[4722]: I0309 14:10:45.889211 4722 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8a76c9b5-c226-4d93-8d7a-8e56210b572a-service-ca\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:45 crc kubenswrapper[4722]: I0309 14:10:45.889222 4722 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/8a76c9b5-c226-4d93-8d7a-8e56210b572a-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:45 crc kubenswrapper[4722]: I0309 14:10:45.889232 4722 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/8a76c9b5-c226-4d93-8d7a-8e56210b572a-console-config\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:45 crc kubenswrapper[4722]: I0309 14:10:45.889240 4722 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/8a76c9b5-c226-4d93-8d7a-8e56210b572a-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 09 14:10:46 crc kubenswrapper[4722]: I0309 14:10:46.317761 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-sfch8_8a76c9b5-c226-4d93-8d7a-8e56210b572a/console/0.log" Mar 09 14:10:46 crc kubenswrapper[4722]: I0309 14:10:46.317809 4722 generic.go:334] "Generic (PLEG): container finished" podID="8a76c9b5-c226-4d93-8d7a-8e56210b572a" containerID="ff7a4d0a96e5e1f0cc0fdcc6585ec856f476a50b47ef4023cefba4393ef415b6" exitCode=2 Mar 09 14:10:46 crc kubenswrapper[4722]: I0309 14:10:46.317838 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-sfch8" event={"ID":"8a76c9b5-c226-4d93-8d7a-8e56210b572a","Type":"ContainerDied","Data":"ff7a4d0a96e5e1f0cc0fdcc6585ec856f476a50b47ef4023cefba4393ef415b6"} Mar 09 14:10:46 crc kubenswrapper[4722]: I0309 14:10:46.317865 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-sfch8" event={"ID":"8a76c9b5-c226-4d93-8d7a-8e56210b572a","Type":"ContainerDied","Data":"6fee2f304e49495845c7a3f368cf91dcffba93fe423b49acd454c70f8fd5cb71"} Mar 09 14:10:46 crc kubenswrapper[4722]: I0309 14:10:46.317881 4722 scope.go:117] "RemoveContainer" containerID="ff7a4d0a96e5e1f0cc0fdcc6585ec856f476a50b47ef4023cefba4393ef415b6" Mar 09 14:10:46 crc kubenswrapper[4722]: I0309 14:10:46.317950 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-sfch8" Mar 09 14:10:46 crc kubenswrapper[4722]: I0309 14:10:46.345368 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-sfch8"] Mar 09 14:10:46 crc kubenswrapper[4722]: I0309 14:10:46.349196 4722 scope.go:117] "RemoveContainer" containerID="ff7a4d0a96e5e1f0cc0fdcc6585ec856f476a50b47ef4023cefba4393ef415b6" Mar 09 14:10:46 crc kubenswrapper[4722]: E0309 14:10:46.350167 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff7a4d0a96e5e1f0cc0fdcc6585ec856f476a50b47ef4023cefba4393ef415b6\": container with ID starting with ff7a4d0a96e5e1f0cc0fdcc6585ec856f476a50b47ef4023cefba4393ef415b6 not found: ID does not exist" containerID="ff7a4d0a96e5e1f0cc0fdcc6585ec856f476a50b47ef4023cefba4393ef415b6" Mar 09 14:10:46 crc kubenswrapper[4722]: I0309 14:10:46.350290 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff7a4d0a96e5e1f0cc0fdcc6585ec856f476a50b47ef4023cefba4393ef415b6"} err="failed to get container status \"ff7a4d0a96e5e1f0cc0fdcc6585ec856f476a50b47ef4023cefba4393ef415b6\": rpc error: code = NotFound desc = could not find container \"ff7a4d0a96e5e1f0cc0fdcc6585ec856f476a50b47ef4023cefba4393ef415b6\": container with ID starting with ff7a4d0a96e5e1f0cc0fdcc6585ec856f476a50b47ef4023cefba4393ef415b6 not found: ID does not exist" Mar 09 14:10:46 crc kubenswrapper[4722]: I0309 14:10:46.352758 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-sfch8"] Mar 09 14:10:48 crc kubenswrapper[4722]: I0309 14:10:48.167085 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a76c9b5-c226-4d93-8d7a-8e56210b572a" path="/var/lib/kubelet/pods/8a76c9b5-c226-4d93-8d7a-8e56210b572a/volumes" Mar 09 14:10:49 crc kubenswrapper[4722]: I0309 14:10:49.908240 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-bcd6d9dd6-5rp7g" Mar 09 14:10:49 crc kubenswrapper[4722]: I0309 14:10:49.914233 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-bcd6d9dd6-5rp7g" Mar 09 14:10:51 crc kubenswrapper[4722]: I0309 14:10:51.528014 4722 patch_prober.go:28] interesting pod/machine-config-daemon-hjrrb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 14:10:51 crc kubenswrapper[4722]: I0309 14:10:51.528580 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 14:10:51 crc kubenswrapper[4722]: I0309 14:10:51.528627 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" Mar 09 14:10:51 crc kubenswrapper[4722]: I0309 14:10:51.529270 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ee35f8e9e8041f3696b9c0177b52fd458e4382bc38a0838adc5aa0015cd1c0a8"} 
pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 14:10:51 crc kubenswrapper[4722]: I0309 14:10:51.529323 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerName="machine-config-daemon" containerID="cri-o://ee35f8e9e8041f3696b9c0177b52fd458e4382bc38a0838adc5aa0015cd1c0a8" gracePeriod=600 Mar 09 14:10:52 crc kubenswrapper[4722]: I0309 14:10:52.364020 4722 generic.go:334] "Generic (PLEG): container finished" podID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerID="ee35f8e9e8041f3696b9c0177b52fd458e4382bc38a0838adc5aa0015cd1c0a8" exitCode=0 Mar 09 14:10:52 crc kubenswrapper[4722]: I0309 14:10:52.364122 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" event={"ID":"dac2aaf5-653b-4b2a-8efe-ed26bac8d648","Type":"ContainerDied","Data":"ee35f8e9e8041f3696b9c0177b52fd458e4382bc38a0838adc5aa0015cd1c0a8"} Mar 09 14:10:52 crc kubenswrapper[4722]: I0309 14:10:52.364844 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" event={"ID":"dac2aaf5-653b-4b2a-8efe-ed26bac8d648","Type":"ContainerStarted","Data":"bab3031afd45b73f21ee1828aeb90282e64688adb04b134404f0cab923bb0351"} Mar 09 14:10:52 crc kubenswrapper[4722]: I0309 14:10:52.364877 4722 scope.go:117] "RemoveContainer" containerID="9f451493fa6d86f12c1291c0fe262bcbe4b62cef45c8dda696bb48fa3ded2eeb" Mar 09 14:11:10 crc kubenswrapper[4722]: I0309 14:11:10.965664 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Mar 09 14:11:11 crc kubenswrapper[4722]: I0309 14:11:11.001846 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Mar 09 14:11:11 crc kubenswrapper[4722]: I0309 14:11:11.544416 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Mar 09 14:11:45 crc kubenswrapper[4722]: I0309 14:11:45.030488 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-5d9c759fb9-nkb8p"] Mar 09 14:11:45 crc kubenswrapper[4722]: E0309 14:11:45.031751 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a76c9b5-c226-4d93-8d7a-8e56210b572a" containerName="console" Mar 09 14:11:45 crc kubenswrapper[4722]: I0309 14:11:45.031772 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a76c9b5-c226-4d93-8d7a-8e56210b572a" containerName="console" Mar 09 14:11:45 crc kubenswrapper[4722]: E0309 14:11:45.031803 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d57536b-4f57-4098-b519-19fdc2559eda" containerName="registry" Mar 09 14:11:45 crc kubenswrapper[4722]: I0309 14:11:45.031812 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d57536b-4f57-4098-b519-19fdc2559eda" containerName="registry" Mar 09 14:11:45 crc kubenswrapper[4722]: I0309 14:11:45.031967 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d57536b-4f57-4098-b519-19fdc2559eda" containerName="registry" Mar 09 14:11:45 crc kubenswrapper[4722]: I0309 14:11:45.031989 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a76c9b5-c226-4d93-8d7a-8e56210b572a" containerName="console" Mar 09 14:11:45 crc 
Mar 09 14:11:10 crc kubenswrapper[4722]: I0309 14:11:10.965664 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Mar 09 14:11:11 crc kubenswrapper[4722]: I0309 14:11:11.001846 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Mar 09 14:11:11 crc kubenswrapper[4722]: I0309 14:11:11.544416 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Mar 09 14:11:45 crc kubenswrapper[4722]: I0309 14:11:45.030488 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-5d9c759fb9-nkb8p"]
Mar 09 14:11:45 crc kubenswrapper[4722]: E0309 14:11:45.031751 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a76c9b5-c226-4d93-8d7a-8e56210b572a" containerName="console"
Mar 09 14:11:45 crc kubenswrapper[4722]: I0309 14:11:45.031772 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a76c9b5-c226-4d93-8d7a-8e56210b572a" containerName="console"
Mar 09 14:11:45 crc kubenswrapper[4722]: E0309 14:11:45.031803 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d57536b-4f57-4098-b519-19fdc2559eda" containerName="registry"
Mar 09 14:11:45 crc kubenswrapper[4722]: I0309 14:11:45.031812 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d57536b-4f57-4098-b519-19fdc2559eda" containerName="registry"
Mar 09 14:11:45 crc kubenswrapper[4722]: I0309 14:11:45.031967 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d57536b-4f57-4098-b519-19fdc2559eda" containerName="registry"
Mar 09 14:11:45 crc kubenswrapper[4722]: I0309 14:11:45.031989 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a76c9b5-c226-4d93-8d7a-8e56210b572a" containerName="console"
Mar 09 14:11:45 crc kubenswrapper[4722]: I0309 14:11:45.032712 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5d9c759fb9-nkb8p"
Mar 09 14:11:45 crc kubenswrapper[4722]: I0309 14:11:45.048095 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5d9c759fb9-nkb8p"]
Mar 09 14:11:45 crc kubenswrapper[4722]: I0309 14:11:45.171398 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/720bc29d-ca91-4ee2-9a0c-5f32659e650e-console-serving-cert\") pod \"console-5d9c759fb9-nkb8p\" (UID: \"720bc29d-ca91-4ee2-9a0c-5f32659e650e\") " pod="openshift-console/console-5d9c759fb9-nkb8p"
Mar 09 14:11:45 crc kubenswrapper[4722]: I0309 14:11:45.171476 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/720bc29d-ca91-4ee2-9a0c-5f32659e650e-console-oauth-config\") pod \"console-5d9c759fb9-nkb8p\" (UID: \"720bc29d-ca91-4ee2-9a0c-5f32659e650e\") " pod="openshift-console/console-5d9c759fb9-nkb8p"
Mar 09 14:11:45 crc kubenswrapper[4722]: I0309 14:11:45.171514 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/720bc29d-ca91-4ee2-9a0c-5f32659e650e-console-config\") pod \"console-5d9c759fb9-nkb8p\" (UID: \"720bc29d-ca91-4ee2-9a0c-5f32659e650e\") " pod="openshift-console/console-5d9c759fb9-nkb8p"
Mar 09 14:11:45 crc kubenswrapper[4722]: I0309 14:11:45.171903 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/720bc29d-ca91-4ee2-9a0c-5f32659e650e-trusted-ca-bundle\") pod \"console-5d9c759fb9-nkb8p\" (UID: \"720bc29d-ca91-4ee2-9a0c-5f32659e650e\") " pod="openshift-console/console-5d9c759fb9-nkb8p"
Mar 09 14:11:45 crc kubenswrapper[4722]: I0309 14:11:45.172237 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/720bc29d-ca91-4ee2-9a0c-5f32659e650e-oauth-serving-cert\") pod \"console-5d9c759fb9-nkb8p\" (UID: \"720bc29d-ca91-4ee2-9a0c-5f32659e650e\") " pod="openshift-console/console-5d9c759fb9-nkb8p"
Mar 09 14:11:45 crc kubenswrapper[4722]: I0309 14:11:45.172385 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x52zr\" (UniqueName: \"kubernetes.io/projected/720bc29d-ca91-4ee2-9a0c-5f32659e650e-kube-api-access-x52zr\") pod \"console-5d9c759fb9-nkb8p\" (UID: \"720bc29d-ca91-4ee2-9a0c-5f32659e650e\") " pod="openshift-console/console-5d9c759fb9-nkb8p"
Mar 09 14:11:45 crc kubenswrapper[4722]: I0309 14:11:45.172512 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/720bc29d-ca91-4ee2-9a0c-5f32659e650e-service-ca\") pod \"console-5d9c759fb9-nkb8p\" (UID: \"720bc29d-ca91-4ee2-9a0c-5f32659e650e\") " pod="openshift-console/console-5d9c759fb9-nkb8p"
Mar 09 14:11:45 crc kubenswrapper[4722]: I0309 14:11:45.274433 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/720bc29d-ca91-4ee2-9a0c-5f32659e650e-service-ca\") pod \"console-5d9c759fb9-nkb8p\" (UID: \"720bc29d-ca91-4ee2-9a0c-5f32659e650e\") " pod="openshift-console/console-5d9c759fb9-nkb8p"
Mar 09 14:11:45 crc kubenswrapper[4722]: I0309 14:11:45.274591 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/720bc29d-ca91-4ee2-9a0c-5f32659e650e-console-serving-cert\") pod \"console-5d9c759fb9-nkb8p\" (UID: \"720bc29d-ca91-4ee2-9a0c-5f32659e650e\") " pod="openshift-console/console-5d9c759fb9-nkb8p"
Mar 09 14:11:45 crc kubenswrapper[4722]: I0309 14:11:45.274645 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/720bc29d-ca91-4ee2-9a0c-5f32659e650e-console-oauth-config\") pod \"console-5d9c759fb9-nkb8p\" (UID: \"720bc29d-ca91-4ee2-9a0c-5f32659e650e\") " pod="openshift-console/console-5d9c759fb9-nkb8p"
Mar 09 14:11:45 crc kubenswrapper[4722]: I0309 14:11:45.274693 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/720bc29d-ca91-4ee2-9a0c-5f32659e650e-console-config\") pod \"console-5d9c759fb9-nkb8p\" (UID: \"720bc29d-ca91-4ee2-9a0c-5f32659e650e\") " pod="openshift-console/console-5d9c759fb9-nkb8p"
Mar 09 14:11:45 crc kubenswrapper[4722]: I0309 14:11:45.274732 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/720bc29d-ca91-4ee2-9a0c-5f32659e650e-trusted-ca-bundle\") pod \"console-5d9c759fb9-nkb8p\" (UID: \"720bc29d-ca91-4ee2-9a0c-5f32659e650e\") " pod="openshift-console/console-5d9c759fb9-nkb8p"
Mar 09 14:11:45 crc kubenswrapper[4722]: I0309 14:11:45.274830 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/720bc29d-ca91-4ee2-9a0c-5f32659e650e-oauth-serving-cert\") pod \"console-5d9c759fb9-nkb8p\" (UID: \"720bc29d-ca91-4ee2-9a0c-5f32659e650e\") " pod="openshift-console/console-5d9c759fb9-nkb8p"
Mar 09 14:11:45 crc kubenswrapper[4722]: I0309 14:11:45.274879 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x52zr\" (UniqueName: \"kubernetes.io/projected/720bc29d-ca91-4ee2-9a0c-5f32659e650e-kube-api-access-x52zr\") pod \"console-5d9c759fb9-nkb8p\" (UID: \"720bc29d-ca91-4ee2-9a0c-5f32659e650e\") " pod="openshift-console/console-5d9c759fb9-nkb8p"
Mar 09 14:11:45 crc kubenswrapper[4722]: I0309 14:11:45.275912 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/720bc29d-ca91-4ee2-9a0c-5f32659e650e-service-ca\") pod \"console-5d9c759fb9-nkb8p\" (UID: \"720bc29d-ca91-4ee2-9a0c-5f32659e650e\") " pod="openshift-console/console-5d9c759fb9-nkb8p"
Mar 09 14:11:45 crc kubenswrapper[4722]: I0309 14:11:45.275913 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/720bc29d-ca91-4ee2-9a0c-5f32659e650e-console-config\") pod \"console-5d9c759fb9-nkb8p\" (UID: \"720bc29d-ca91-4ee2-9a0c-5f32659e650e\") " pod="openshift-console/console-5d9c759fb9-nkb8p"
Mar 09 14:11:45 crc kubenswrapper[4722]: I0309 14:11:45.277088 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/720bc29d-ca91-4ee2-9a0c-5f32659e650e-trusted-ca-bundle\") pod \"console-5d9c759fb9-nkb8p\" (UID: \"720bc29d-ca91-4ee2-9a0c-5f32659e650e\") " pod="openshift-console/console-5d9c759fb9-nkb8p"
Mar 09 14:11:45 crc kubenswrapper[4722]: I0309 14:11:45.277277 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/720bc29d-ca91-4ee2-9a0c-5f32659e650e-oauth-serving-cert\") pod \"console-5d9c759fb9-nkb8p\" (UID: \"720bc29d-ca91-4ee2-9a0c-5f32659e650e\") " pod="openshift-console/console-5d9c759fb9-nkb8p"
Mar 09 14:11:45 crc kubenswrapper[4722]: I0309 14:11:45.285490 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/720bc29d-ca91-4ee2-9a0c-5f32659e650e-console-serving-cert\") pod \"console-5d9c759fb9-nkb8p\" (UID: \"720bc29d-ca91-4ee2-9a0c-5f32659e650e\") " pod="openshift-console/console-5d9c759fb9-nkb8p"
Mar 09 14:11:45 crc kubenswrapper[4722]: I0309 14:11:45.285773 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/720bc29d-ca91-4ee2-9a0c-5f32659e650e-console-oauth-config\") pod \"console-5d9c759fb9-nkb8p\" (UID: \"720bc29d-ca91-4ee2-9a0c-5f32659e650e\") " pod="openshift-console/console-5d9c759fb9-nkb8p"
Mar 09 14:11:45 crc kubenswrapper[4722]: I0309 14:11:45.298979 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x52zr\" (UniqueName: \"kubernetes.io/projected/720bc29d-ca91-4ee2-9a0c-5f32659e650e-kube-api-access-x52zr\") pod \"console-5d9c759fb9-nkb8p\" (UID: \"720bc29d-ca91-4ee2-9a0c-5f32659e650e\") " pod="openshift-console/console-5d9c759fb9-nkb8p"
Mar 09 14:11:45 crc kubenswrapper[4722]: I0309 14:11:45.353860 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5d9c759fb9-nkb8p"
Mar 09 14:11:45 crc kubenswrapper[4722]: I0309 14:11:45.609628 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5d9c759fb9-nkb8p"]
Mar 09 14:11:45 crc kubenswrapper[4722]: I0309 14:11:45.771336 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d9c759fb9-nkb8p" event={"ID":"720bc29d-ca91-4ee2-9a0c-5f32659e650e","Type":"ContainerStarted","Data":"e4390ee405b9ff45c1807c4e21e61fad235bb5c948893f0bf7107ea13c6f322e"}
Mar 09 14:11:45 crc kubenswrapper[4722]: I0309 14:11:45.771386 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d9c759fb9-nkb8p" event={"ID":"720bc29d-ca91-4ee2-9a0c-5f32659e650e","Type":"ContainerStarted","Data":"ce63b766ed219542b8ed0002471573ffa3dd7c5c1c3a56804a39e71ae295b4b1"}
Mar 09 14:11:45 crc kubenswrapper[4722]: I0309 14:11:45.794865 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5d9c759fb9-nkb8p" podStartSLOduration=0.794840693 podStartE2EDuration="794.840693ms" podCreationTimestamp="2026-03-09 14:11:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:11:45.792435136 +0000 UTC m=+546.348003712" watchObservedRunningTime="2026-03-09 14:11:45.794840693 +0000 UTC m=+546.350409279"
pod="openshift-console/console-5d9c759fb9-nkb8p" Mar 09 14:11:55 crc kubenswrapper[4722]: I0309 14:11:55.355057 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5d9c759fb9-nkb8p" Mar 09 14:11:55 crc kubenswrapper[4722]: I0309 14:11:55.359383 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5d9c759fb9-nkb8p" Mar 09 14:11:56 crc kubenswrapper[4722]: I0309 14:11:56.263321 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5d9c759fb9-nkb8p" Mar 09 14:11:56 crc kubenswrapper[4722]: I0309 14:11:56.346593 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7b675946d5-fzz6p"] Mar 09 14:12:00 crc kubenswrapper[4722]: I0309 14:12:00.146240 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551092-5pnkn"] Mar 09 14:12:00 crc kubenswrapper[4722]: I0309 14:12:00.147861 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551092-5pnkn" Mar 09 14:12:00 crc kubenswrapper[4722]: I0309 14:12:00.153651 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-dr2x6" Mar 09 14:12:00 crc kubenswrapper[4722]: I0309 14:12:00.153742 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 14:12:00 crc kubenswrapper[4722]: I0309 14:12:00.153651 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 14:12:00 crc kubenswrapper[4722]: I0309 14:12:00.159960 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551092-5pnkn"] Mar 09 14:12:00 crc kubenswrapper[4722]: I0309 14:12:00.242032 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbfhq\" (UniqueName: \"kubernetes.io/projected/f18133e6-a08d-48be-82a2-c77e4c05e170-kube-api-access-hbfhq\") pod \"auto-csr-approver-29551092-5pnkn\" (UID: \"f18133e6-a08d-48be-82a2-c77e4c05e170\") " pod="openshift-infra/auto-csr-approver-29551092-5pnkn" Mar 09 14:12:00 crc kubenswrapper[4722]: I0309 14:12:00.344106 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbfhq\" (UniqueName: \"kubernetes.io/projected/f18133e6-a08d-48be-82a2-c77e4c05e170-kube-api-access-hbfhq\") pod \"auto-csr-approver-29551092-5pnkn\" (UID: \"f18133e6-a08d-48be-82a2-c77e4c05e170\") " pod="openshift-infra/auto-csr-approver-29551092-5pnkn" Mar 09 14:12:00 crc kubenswrapper[4722]: I0309 14:12:00.367336 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbfhq\" (UniqueName: \"kubernetes.io/projected/f18133e6-a08d-48be-82a2-c77e4c05e170-kube-api-access-hbfhq\") pod \"auto-csr-approver-29551092-5pnkn\" (UID: \"f18133e6-a08d-48be-82a2-c77e4c05e170\") " pod="openshift-infra/auto-csr-approver-29551092-5pnkn" Mar 09 14:12:00 crc kubenswrapper[4722]: I0309 14:12:00.471782 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551092-5pnkn" Mar 09 14:12:00 crc kubenswrapper[4722]: I0309 14:12:00.686863 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551092-5pnkn"] Mar 09 14:12:00 crc kubenswrapper[4722]: I0309 14:12:00.693984 4722 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 14:12:01 crc kubenswrapper[4722]: I0309 14:12:01.292317 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551092-5pnkn" event={"ID":"f18133e6-a08d-48be-82a2-c77e4c05e170","Type":"ContainerStarted","Data":"a7d6708c65d19ffd075b811e6f6ad9fd9243fc7c95d1147db542cea3e847a307"} Mar 09 14:12:02 crc kubenswrapper[4722]: I0309 14:12:02.302230 4722 generic.go:334] "Generic (PLEG): container finished" podID="f18133e6-a08d-48be-82a2-c77e4c05e170" containerID="2902dcdfebbe6acd840510f06d63043c66ae19a0cc75f47007ba4184460ae59c" exitCode=0 Mar 09 14:12:02 crc kubenswrapper[4722]: I0309 14:12:02.302375 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551092-5pnkn" event={"ID":"f18133e6-a08d-48be-82a2-c77e4c05e170","Type":"ContainerDied","Data":"2902dcdfebbe6acd840510f06d63043c66ae19a0cc75f47007ba4184460ae59c"} Mar 09 14:12:03 crc kubenswrapper[4722]: I0309 14:12:03.570179 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551092-5pnkn" Mar 09 14:12:03 crc kubenswrapper[4722]: I0309 14:12:03.590138 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hbfhq\" (UniqueName: \"kubernetes.io/projected/f18133e6-a08d-48be-82a2-c77e4c05e170-kube-api-access-hbfhq\") pod \"f18133e6-a08d-48be-82a2-c77e4c05e170\" (UID: \"f18133e6-a08d-48be-82a2-c77e4c05e170\") " Mar 09 14:12:03 crc kubenswrapper[4722]: I0309 14:12:03.595757 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f18133e6-a08d-48be-82a2-c77e4c05e170-kube-api-access-hbfhq" (OuterVolumeSpecName: "kube-api-access-hbfhq") pod "f18133e6-a08d-48be-82a2-c77e4c05e170" (UID: "f18133e6-a08d-48be-82a2-c77e4c05e170"). InnerVolumeSpecName "kube-api-access-hbfhq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:12:03 crc kubenswrapper[4722]: I0309 14:12:03.691910 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hbfhq\" (UniqueName: \"kubernetes.io/projected/f18133e6-a08d-48be-82a2-c77e4c05e170-kube-api-access-hbfhq\") on node \"crc\" DevicePath \"\"" Mar 09 14:12:04 crc kubenswrapper[4722]: I0309 14:12:04.318480 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551092-5pnkn" event={"ID":"f18133e6-a08d-48be-82a2-c77e4c05e170","Type":"ContainerDied","Data":"a7d6708c65d19ffd075b811e6f6ad9fd9243fc7c95d1147db542cea3e847a307"} Mar 09 14:12:04 crc kubenswrapper[4722]: I0309 14:12:04.318525 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7d6708c65d19ffd075b811e6f6ad9fd9243fc7c95d1147db542cea3e847a307" Mar 09 14:12:04 crc kubenswrapper[4722]: I0309 14:12:04.318546 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551092-5pnkn" Mar 09 14:12:04 crc kubenswrapper[4722]: I0309 14:12:04.640128 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551086-cvc28"] Mar 09 14:12:04 crc kubenswrapper[4722]: I0309 14:12:04.648904 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551086-cvc28"] Mar 09 14:12:06 crc kubenswrapper[4722]: I0309 14:12:06.158676 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40be416c-1b7b-4973-b9ed-25ae20cd660d" path="/var/lib/kubelet/pods/40be416c-1b7b-4973-b9ed-25ae20cd660d/volumes" Mar 09 14:12:21 crc kubenswrapper[4722]: I0309 14:12:21.402098 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-7b675946d5-fzz6p" podUID="23e0bff9-aca3-41b7-88d9-e67af2d00319" containerName="console" containerID="cri-o://769483573e2da12bf63a74127b02961d9bf5d8f969e268e3610cd466360933ee" gracePeriod=15 Mar 09 14:12:21 crc kubenswrapper[4722]: I0309 14:12:21.792419 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7b675946d5-fzz6p_23e0bff9-aca3-41b7-88d9-e67af2d00319/console/0.log" Mar 09 14:12:21 crc kubenswrapper[4722]: I0309 14:12:21.792871 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7b675946d5-fzz6p" Mar 09 14:12:21 crc kubenswrapper[4722]: I0309 14:12:21.884935 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/23e0bff9-aca3-41b7-88d9-e67af2d00319-console-oauth-config\") pod \"23e0bff9-aca3-41b7-88d9-e67af2d00319\" (UID: \"23e0bff9-aca3-41b7-88d9-e67af2d00319\") " Mar 09 14:12:21 crc kubenswrapper[4722]: I0309 14:12:21.885065 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23e0bff9-aca3-41b7-88d9-e67af2d00319-trusted-ca-bundle\") pod \"23e0bff9-aca3-41b7-88d9-e67af2d00319\" (UID: \"23e0bff9-aca3-41b7-88d9-e67af2d00319\") " Mar 09 14:12:21 crc kubenswrapper[4722]: I0309 14:12:21.885120 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5xnc\" (UniqueName: \"kubernetes.io/projected/23e0bff9-aca3-41b7-88d9-e67af2d00319-kube-api-access-g5xnc\") pod \"23e0bff9-aca3-41b7-88d9-e67af2d00319\" (UID: \"23e0bff9-aca3-41b7-88d9-e67af2d00319\") " Mar 09 14:12:21 crc kubenswrapper[4722]: I0309 14:12:21.885182 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/23e0bff9-aca3-41b7-88d9-e67af2d00319-console-config\") pod \"23e0bff9-aca3-41b7-88d9-e67af2d00319\" (UID: \"23e0bff9-aca3-41b7-88d9-e67af2d00319\") " Mar 09 14:12:21 crc kubenswrapper[4722]: I0309 14:12:21.885299 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/23e0bff9-aca3-41b7-88d9-e67af2d00319-oauth-serving-cert\") pod \"23e0bff9-aca3-41b7-88d9-e67af2d00319\" (UID: \"23e0bff9-aca3-41b7-88d9-e67af2d00319\") " Mar 09 14:12:21 crc kubenswrapper[4722]: I0309 14:12:21.885424 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/23e0bff9-aca3-41b7-88d9-e67af2d00319-console-serving-cert\") pod 
\"23e0bff9-aca3-41b7-88d9-e67af2d00319\" (UID: \"23e0bff9-aca3-41b7-88d9-e67af2d00319\") " Mar 09 14:12:21 crc kubenswrapper[4722]: I0309 14:12:21.885488 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/23e0bff9-aca3-41b7-88d9-e67af2d00319-service-ca\") pod \"23e0bff9-aca3-41b7-88d9-e67af2d00319\" (UID: \"23e0bff9-aca3-41b7-88d9-e67af2d00319\") " Mar 09 14:12:21 crc kubenswrapper[4722]: I0309 14:12:21.886629 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23e0bff9-aca3-41b7-88d9-e67af2d00319-console-config" (OuterVolumeSpecName: "console-config") pod "23e0bff9-aca3-41b7-88d9-e67af2d00319" (UID: "23e0bff9-aca3-41b7-88d9-e67af2d00319"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:12:21 crc kubenswrapper[4722]: I0309 14:12:21.886700 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23e0bff9-aca3-41b7-88d9-e67af2d00319-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "23e0bff9-aca3-41b7-88d9-e67af2d00319" (UID: "23e0bff9-aca3-41b7-88d9-e67af2d00319"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:12:21 crc kubenswrapper[4722]: I0309 14:12:21.886822 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23e0bff9-aca3-41b7-88d9-e67af2d00319-service-ca" (OuterVolumeSpecName: "service-ca") pod "23e0bff9-aca3-41b7-88d9-e67af2d00319" (UID: "23e0bff9-aca3-41b7-88d9-e67af2d00319"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:12:21 crc kubenswrapper[4722]: I0309 14:12:21.886949 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23e0bff9-aca3-41b7-88d9-e67af2d00319-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "23e0bff9-aca3-41b7-88d9-e67af2d00319" (UID: "23e0bff9-aca3-41b7-88d9-e67af2d00319"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:12:21 crc kubenswrapper[4722]: I0309 14:12:21.892763 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23e0bff9-aca3-41b7-88d9-e67af2d00319-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "23e0bff9-aca3-41b7-88d9-e67af2d00319" (UID: "23e0bff9-aca3-41b7-88d9-e67af2d00319"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:12:21 crc kubenswrapper[4722]: I0309 14:12:21.893517 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23e0bff9-aca3-41b7-88d9-e67af2d00319-kube-api-access-g5xnc" (OuterVolumeSpecName: "kube-api-access-g5xnc") pod "23e0bff9-aca3-41b7-88d9-e67af2d00319" (UID: "23e0bff9-aca3-41b7-88d9-e67af2d00319"). InnerVolumeSpecName "kube-api-access-g5xnc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:12:21 crc kubenswrapper[4722]: I0309 14:12:21.893779 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23e0bff9-aca3-41b7-88d9-e67af2d00319-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "23e0bff9-aca3-41b7-88d9-e67af2d00319" (UID: "23e0bff9-aca3-41b7-88d9-e67af2d00319"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:12:21 crc kubenswrapper[4722]: I0309 14:12:21.987726 4722 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/23e0bff9-aca3-41b7-88d9-e67af2d00319-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 14:12:21 crc kubenswrapper[4722]: I0309 14:12:21.987779 4722 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/23e0bff9-aca3-41b7-88d9-e67af2d00319-service-ca\") on node \"crc\" DevicePath \"\"" Mar 09 14:12:21 crc kubenswrapper[4722]: I0309 14:12:21.987791 4722 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/23e0bff9-aca3-41b7-88d9-e67af2d00319-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 09 14:12:21 crc kubenswrapper[4722]: I0309 14:12:21.987804 4722 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23e0bff9-aca3-41b7-88d9-e67af2d00319-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:12:21 crc kubenswrapper[4722]: I0309 14:12:21.987817 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5xnc\" (UniqueName: \"kubernetes.io/projected/23e0bff9-aca3-41b7-88d9-e67af2d00319-kube-api-access-g5xnc\") on node \"crc\" DevicePath \"\"" Mar 09 14:12:21 crc kubenswrapper[4722]: I0309 14:12:21.987831 4722 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/23e0bff9-aca3-41b7-88d9-e67af2d00319-console-config\") on node \"crc\" DevicePath \"\"" Mar 09 14:12:21 crc kubenswrapper[4722]: I0309 14:12:21.987843 4722 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/23e0bff9-aca3-41b7-88d9-e67af2d00319-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 14:12:22 crc kubenswrapper[4722]: I0309 14:12:22.452602 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7b675946d5-fzz6p_23e0bff9-aca3-41b7-88d9-e67af2d00319/console/0.log" Mar 09 14:12:22 crc kubenswrapper[4722]: I0309 14:12:22.452677 4722 generic.go:334] "Generic (PLEG): container finished" podID="23e0bff9-aca3-41b7-88d9-e67af2d00319" containerID="769483573e2da12bf63a74127b02961d9bf5d8f969e268e3610cd466360933ee" exitCode=2 Mar 09 14:12:22 crc kubenswrapper[4722]: I0309 14:12:22.452793 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7b675946d5-fzz6p" Mar 09 14:12:22 crc kubenswrapper[4722]: I0309 14:12:22.453593 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b675946d5-fzz6p" event={"ID":"23e0bff9-aca3-41b7-88d9-e67af2d00319","Type":"ContainerDied","Data":"769483573e2da12bf63a74127b02961d9bf5d8f969e268e3610cd466360933ee"} Mar 09 14:12:22 crc kubenswrapper[4722]: I0309 14:12:22.453633 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b675946d5-fzz6p" event={"ID":"23e0bff9-aca3-41b7-88d9-e67af2d00319","Type":"ContainerDied","Data":"56901853a9d0663664fda4dfa4defb72d36613607f805f67dece3a89bb8e4a20"} Mar 09 14:12:22 crc kubenswrapper[4722]: I0309 14:12:22.453663 4722 scope.go:117] "RemoveContainer" containerID="769483573e2da12bf63a74127b02961d9bf5d8f969e268e3610cd466360933ee" Mar 09 14:12:22 crc kubenswrapper[4722]: I0309 14:12:22.485765 4722 scope.go:117] "RemoveContainer" containerID="769483573e2da12bf63a74127b02961d9bf5d8f969e268e3610cd466360933ee" Mar 09 14:12:22 crc kubenswrapper[4722]: E0309 14:12:22.486652 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"769483573e2da12bf63a74127b02961d9bf5d8f969e268e3610cd466360933ee\": container with ID starting with 769483573e2da12bf63a74127b02961d9bf5d8f969e268e3610cd466360933ee not found: ID does not exist" containerID="769483573e2da12bf63a74127b02961d9bf5d8f969e268e3610cd466360933ee" Mar 09 14:12:22 crc kubenswrapper[4722]: I0309 14:12:22.486751 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"769483573e2da12bf63a74127b02961d9bf5d8f969e268e3610cd466360933ee"} err="failed to get container status \"769483573e2da12bf63a74127b02961d9bf5d8f969e268e3610cd466360933ee\": rpc error: code = NotFound desc = could not find container \"769483573e2da12bf63a74127b02961d9bf5d8f969e268e3610cd466360933ee\": container with ID starting with 769483573e2da12bf63a74127b02961d9bf5d8f969e268e3610cd466360933ee not found: ID does not exist" Mar 09 14:12:22 crc kubenswrapper[4722]: I0309 14:12:22.493555 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7b675946d5-fzz6p"] Mar 09 14:12:22 crc kubenswrapper[4722]: I0309 14:12:22.498226 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7b675946d5-fzz6p"] Mar 09 14:12:24 crc kubenswrapper[4722]: I0309 14:12:24.166835 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23e0bff9-aca3-41b7-88d9-e67af2d00319" path="/var/lib/kubelet/pods/23e0bff9-aca3-41b7-88d9-e67af2d00319/volumes" Mar 09 14:12:51 crc kubenswrapper[4722]: I0309 14:12:51.529192 4722 patch_prober.go:28] interesting pod/machine-config-daemon-hjrrb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 14:12:51 crc kubenswrapper[4722]: I0309 14:12:51.529921 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 14:13:21 crc kubenswrapper[4722]: I0309 14:13:21.528683 4722 patch_prober.go:28] interesting 
pod/machine-config-daemon-hjrrb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 14:13:21 crc kubenswrapper[4722]: I0309 14:13:21.529737 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 14:13:51 crc kubenswrapper[4722]: I0309 14:13:51.528701 4722 patch_prober.go:28] interesting pod/machine-config-daemon-hjrrb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 14:13:51 crc kubenswrapper[4722]: I0309 14:13:51.529419 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 14:13:51 crc kubenswrapper[4722]: I0309 14:13:51.529535 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" Mar 09 14:13:51 crc kubenswrapper[4722]: I0309 14:13:51.530640 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bab3031afd45b73f21ee1828aeb90282e64688adb04b134404f0cab923bb0351"} pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 14:13:51 crc kubenswrapper[4722]: I0309 14:13:51.530767 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerName="machine-config-daemon" containerID="cri-o://bab3031afd45b73f21ee1828aeb90282e64688adb04b134404f0cab923bb0351" gracePeriod=600 Mar 09 14:13:52 crc kubenswrapper[4722]: I0309 14:13:52.183078 4722 generic.go:334] "Generic (PLEG): container finished" podID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerID="bab3031afd45b73f21ee1828aeb90282e64688adb04b134404f0cab923bb0351" exitCode=0 Mar 09 14:13:52 crc kubenswrapper[4722]: I0309 14:13:52.183101 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" event={"ID":"dac2aaf5-653b-4b2a-8efe-ed26bac8d648","Type":"ContainerDied","Data":"bab3031afd45b73f21ee1828aeb90282e64688adb04b134404f0cab923bb0351"} Mar 09 14:13:52 crc kubenswrapper[4722]: I0309 14:13:52.183543 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" event={"ID":"dac2aaf5-653b-4b2a-8efe-ed26bac8d648","Type":"ContainerStarted","Data":"cc45a812c78ad6bdbc54dbec7789e158b5ae14665e6cafed5462e27caf19d00d"} Mar 09 14:13:52 crc kubenswrapper[4722]: I0309 14:13:52.183580 4722 scope.go:117] "RemoveContainer" containerID="ee35f8e9e8041f3696b9c0177b52fd458e4382bc38a0838adc5aa0015cd1c0a8" Mar 09 14:14:00 
crc kubenswrapper[4722]: I0309 14:14:00.169703 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551094-7t8bv"] Mar 09 14:14:00 crc kubenswrapper[4722]: E0309 14:14:00.172537 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f18133e6-a08d-48be-82a2-c77e4c05e170" containerName="oc" Mar 09 14:14:00 crc kubenswrapper[4722]: I0309 14:14:00.172706 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f18133e6-a08d-48be-82a2-c77e4c05e170" containerName="oc" Mar 09 14:14:00 crc kubenswrapper[4722]: E0309 14:14:00.172845 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23e0bff9-aca3-41b7-88d9-e67af2d00319" containerName="console" Mar 09 14:14:00 crc kubenswrapper[4722]: I0309 14:14:00.172958 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="23e0bff9-aca3-41b7-88d9-e67af2d00319" containerName="console" Mar 09 14:14:00 crc kubenswrapper[4722]: I0309 14:14:00.173386 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="23e0bff9-aca3-41b7-88d9-e67af2d00319" containerName="console" Mar 09 14:14:00 crc kubenswrapper[4722]: I0309 14:14:00.173576 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f18133e6-a08d-48be-82a2-c77e4c05e170" containerName="oc" Mar 09 14:14:00 crc kubenswrapper[4722]: I0309 14:14:00.174662 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551094-7t8bv"] Mar 09 14:14:00 crc kubenswrapper[4722]: I0309 14:14:00.174984 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551094-7t8bv" Mar 09 14:14:00 crc kubenswrapper[4722]: I0309 14:14:00.177636 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-dr2x6" Mar 09 14:14:00 crc kubenswrapper[4722]: I0309 14:14:00.178566 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 14:14:00 crc kubenswrapper[4722]: I0309 14:14:00.179117 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 14:14:00 crc kubenswrapper[4722]: I0309 14:14:00.326960 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2rwg\" (UniqueName: \"kubernetes.io/projected/0fcf0635-09ce-4b6b-b899-b51db22e1b37-kube-api-access-k2rwg\") pod \"auto-csr-approver-29551094-7t8bv\" (UID: \"0fcf0635-09ce-4b6b-b899-b51db22e1b37\") " pod="openshift-infra/auto-csr-approver-29551094-7t8bv" Mar 09 14:14:00 crc kubenswrapper[4722]: I0309 14:14:00.428726 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2rwg\" (UniqueName: \"kubernetes.io/projected/0fcf0635-09ce-4b6b-b899-b51db22e1b37-kube-api-access-k2rwg\") pod \"auto-csr-approver-29551094-7t8bv\" (UID: \"0fcf0635-09ce-4b6b-b899-b51db22e1b37\") " pod="openshift-infra/auto-csr-approver-29551094-7t8bv" Mar 09 14:14:00 crc kubenswrapper[4722]: I0309 14:14:00.467180 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2rwg\" (UniqueName: \"kubernetes.io/projected/0fcf0635-09ce-4b6b-b899-b51db22e1b37-kube-api-access-k2rwg\") pod \"auto-csr-approver-29551094-7t8bv\" (UID: \"0fcf0635-09ce-4b6b-b899-b51db22e1b37\") " pod="openshift-infra/auto-csr-approver-29551094-7t8bv" Mar 09 14:14:00 crc kubenswrapper[4722]: I0309 14:14:00.502983 4722 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551094-7t8bv" Mar 09 14:14:00 crc kubenswrapper[4722]: I0309 14:14:00.740816 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551094-7t8bv"] Mar 09 14:14:01 crc kubenswrapper[4722]: I0309 14:14:01.253530 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551094-7t8bv" event={"ID":"0fcf0635-09ce-4b6b-b899-b51db22e1b37","Type":"ContainerStarted","Data":"cace54f127846d5205dc1171e8314be2069be216eee1310bef7d6b7a73cc1164"} Mar 09 14:14:02 crc kubenswrapper[4722]: I0309 14:14:02.262085 4722 generic.go:334] "Generic (PLEG): container finished" podID="0fcf0635-09ce-4b6b-b899-b51db22e1b37" containerID="84c5a23e2b2328d9ab22aebec1965a907d8d90028ccd5b6209b26045f99b6725" exitCode=0 Mar 09 14:14:02 crc kubenswrapper[4722]: I0309 14:14:02.262167 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551094-7t8bv" event={"ID":"0fcf0635-09ce-4b6b-b899-b51db22e1b37","Type":"ContainerDied","Data":"84c5a23e2b2328d9ab22aebec1965a907d8d90028ccd5b6209b26045f99b6725"} Mar 09 14:14:03 crc kubenswrapper[4722]: I0309 14:14:03.545956 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551094-7t8bv" Mar 09 14:14:03 crc kubenswrapper[4722]: I0309 14:14:03.692312 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k2rwg\" (UniqueName: \"kubernetes.io/projected/0fcf0635-09ce-4b6b-b899-b51db22e1b37-kube-api-access-k2rwg\") pod \"0fcf0635-09ce-4b6b-b899-b51db22e1b37\" (UID: \"0fcf0635-09ce-4b6b-b899-b51db22e1b37\") " Mar 09 14:14:03 crc kubenswrapper[4722]: I0309 14:14:03.704492 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fcf0635-09ce-4b6b-b899-b51db22e1b37-kube-api-access-k2rwg" (OuterVolumeSpecName: "kube-api-access-k2rwg") pod "0fcf0635-09ce-4b6b-b899-b51db22e1b37" (UID: "0fcf0635-09ce-4b6b-b899-b51db22e1b37"). InnerVolumeSpecName "kube-api-access-k2rwg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:14:03 crc kubenswrapper[4722]: I0309 14:14:03.794331 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k2rwg\" (UniqueName: \"kubernetes.io/projected/0fcf0635-09ce-4b6b-b899-b51db22e1b37-kube-api-access-k2rwg\") on node \"crc\" DevicePath \"\"" Mar 09 14:14:04 crc kubenswrapper[4722]: I0309 14:14:04.284188 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551094-7t8bv" event={"ID":"0fcf0635-09ce-4b6b-b899-b51db22e1b37","Type":"ContainerDied","Data":"cace54f127846d5205dc1171e8314be2069be216eee1310bef7d6b7a73cc1164"} Mar 09 14:14:04 crc kubenswrapper[4722]: I0309 14:14:04.284331 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551094-7t8bv" Mar 09 14:14:04 crc kubenswrapper[4722]: I0309 14:14:04.284348 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cace54f127846d5205dc1171e8314be2069be216eee1310bef7d6b7a73cc1164" Mar 09 14:14:04 crc kubenswrapper[4722]: I0309 14:14:04.620481 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551088-b4l4k"] Mar 09 14:14:04 crc kubenswrapper[4722]: I0309 14:14:04.628437 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551088-b4l4k"] Mar 09 14:14:06 crc kubenswrapper[4722]: I0309 14:14:06.163532 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c0ec175-5b4c-4e8d-9382-49aa1d515423" path="/var/lib/kubelet/pods/1c0ec175-5b4c-4e8d-9382-49aa1d515423/volumes" Mar 09 14:14:15 crc kubenswrapper[4722]: I0309 14:14:15.626015 4722 scope.go:117] "RemoveContainer" containerID="d699b4ec9fd940584437cd321dd64fb6a995865e2cf274f73ca2f40410515cfa" Mar 09 14:14:33 crc kubenswrapper[4722]: I0309 14:14:33.664473 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08pcbpj"] Mar 09 14:14:33 crc kubenswrapper[4722]: E0309 14:14:33.665437 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fcf0635-09ce-4b6b-b899-b51db22e1b37" containerName="oc" Mar 09 14:14:33 crc kubenswrapper[4722]: I0309 14:14:33.665461 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fcf0635-09ce-4b6b-b899-b51db22e1b37" containerName="oc" Mar 09 14:14:33 crc kubenswrapper[4722]: I0309 14:14:33.665694 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fcf0635-09ce-4b6b-b899-b51db22e1b37" containerName="oc" Mar 09 14:14:33 crc kubenswrapper[4722]: I0309 14:14:33.667193 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08pcbpj" Mar 09 14:14:33 crc kubenswrapper[4722]: I0309 14:14:33.669971 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 09 14:14:33 crc kubenswrapper[4722]: I0309 14:14:33.691518 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08pcbpj"] Mar 09 14:14:33 crc kubenswrapper[4722]: I0309 14:14:33.735130 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1c250d34-2965-46ae-81ec-c73a372d0380-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08pcbpj\" (UID: \"1c250d34-2965-46ae-81ec-c73a372d0380\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08pcbpj" Mar 09 14:14:33 crc kubenswrapper[4722]: I0309 14:14:33.735224 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69kkq\" (UniqueName: \"kubernetes.io/projected/1c250d34-2965-46ae-81ec-c73a372d0380-kube-api-access-69kkq\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08pcbpj\" (UID: \"1c250d34-2965-46ae-81ec-c73a372d0380\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08pcbpj" Mar 09 14:14:33 crc kubenswrapper[4722]: I0309 14:14:33.735338 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1c250d34-2965-46ae-81ec-c73a372d0380-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08pcbpj\" (UID: \"1c250d34-2965-46ae-81ec-c73a372d0380\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08pcbpj" Mar 09 14:14:33 crc kubenswrapper[4722]: I0309 14:14:33.836925 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1c250d34-2965-46ae-81ec-c73a372d0380-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08pcbpj\" (UID: \"1c250d34-2965-46ae-81ec-c73a372d0380\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08pcbpj" Mar 09 14:14:33 crc kubenswrapper[4722]: I0309 14:14:33.837548 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69kkq\" (UniqueName: \"kubernetes.io/projected/1c250d34-2965-46ae-81ec-c73a372d0380-kube-api-access-69kkq\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08pcbpj\" (UID: \"1c250d34-2965-46ae-81ec-c73a372d0380\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08pcbpj" Mar 09 14:14:33 crc kubenswrapper[4722]: I0309 14:14:33.837647 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1c250d34-2965-46ae-81ec-c73a372d0380-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08pcbpj\" (UID: \"1c250d34-2965-46ae-81ec-c73a372d0380\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08pcbpj" Mar 09 14:14:33 crc kubenswrapper[4722]: I0309 14:14:33.837955 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/1c250d34-2965-46ae-81ec-c73a372d0380-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08pcbpj\" (UID: \"1c250d34-2965-46ae-81ec-c73a372d0380\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08pcbpj" Mar 09 14:14:33 crc kubenswrapper[4722]: I0309 14:14:33.838569 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1c250d34-2965-46ae-81ec-c73a372d0380-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08pcbpj\" (UID: \"1c250d34-2965-46ae-81ec-c73a372d0380\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08pcbpj" Mar 09 14:14:33 crc kubenswrapper[4722]: I0309 14:14:33.877102 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69kkq\" (UniqueName: \"kubernetes.io/projected/1c250d34-2965-46ae-81ec-c73a372d0380-kube-api-access-69kkq\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08pcbpj\" (UID: \"1c250d34-2965-46ae-81ec-c73a372d0380\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08pcbpj" Mar 09 14:14:33 crc kubenswrapper[4722]: I0309 14:14:33.996355 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08pcbpj" Mar 09 14:14:34 crc kubenswrapper[4722]: I0309 14:14:34.232097 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08pcbpj"] Mar 09 14:14:34 crc kubenswrapper[4722]: I0309 14:14:34.511350 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08pcbpj" event={"ID":"1c250d34-2965-46ae-81ec-c73a372d0380","Type":"ContainerStarted","Data":"db413d8bfc402b59d4ccf127dde818ee7de768bbc47914080aa3a02404a76891"} Mar 09 14:14:34 crc kubenswrapper[4722]: I0309 14:14:34.511968 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08pcbpj" event={"ID":"1c250d34-2965-46ae-81ec-c73a372d0380","Type":"ContainerStarted","Data":"260e64b9d15be2021e65ae4af4384d6f9c55c1d636ad19be4e805142b840f4b4"} Mar 09 14:14:35 crc kubenswrapper[4722]: I0309 14:14:35.519526 4722 generic.go:334] "Generic (PLEG): container finished" podID="1c250d34-2965-46ae-81ec-c73a372d0380" containerID="db413d8bfc402b59d4ccf127dde818ee7de768bbc47914080aa3a02404a76891" exitCode=0 Mar 09 14:14:35 crc kubenswrapper[4722]: I0309 14:14:35.519587 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08pcbpj" event={"ID":"1c250d34-2965-46ae-81ec-c73a372d0380","Type":"ContainerDied","Data":"db413d8bfc402b59d4ccf127dde818ee7de768bbc47914080aa3a02404a76891"} Mar 09 14:14:37 crc kubenswrapper[4722]: I0309 14:14:37.540100 4722 generic.go:334] "Generic (PLEG): container finished" podID="1c250d34-2965-46ae-81ec-c73a372d0380" containerID="69b3f70224f35531f8ab8d21cb02b27dfbbaa27f8c41141e3616f1d71eec1df9" exitCode=0 Mar 09 14:14:37 crc kubenswrapper[4722]: I0309 14:14:37.540278 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08pcbpj" 
event={"ID":"1c250d34-2965-46ae-81ec-c73a372d0380","Type":"ContainerDied","Data":"69b3f70224f35531f8ab8d21cb02b27dfbbaa27f8c41141e3616f1d71eec1df9"} Mar 09 14:14:38 crc kubenswrapper[4722]: I0309 14:14:38.553642 4722 generic.go:334] "Generic (PLEG): container finished" podID="1c250d34-2965-46ae-81ec-c73a372d0380" containerID="995fd6821f400f40f667029e5035dc6b8676b91bb4bbd8ddde098abd5900c4ed" exitCode=0 Mar 09 14:14:38 crc kubenswrapper[4722]: I0309 14:14:38.553669 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08pcbpj" event={"ID":"1c250d34-2965-46ae-81ec-c73a372d0380","Type":"ContainerDied","Data":"995fd6821f400f40f667029e5035dc6b8676b91bb4bbd8ddde098abd5900c4ed"} Mar 09 14:14:39 crc kubenswrapper[4722]: I0309 14:14:39.860152 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08pcbpj" Mar 09 14:14:39 crc kubenswrapper[4722]: I0309 14:14:39.927394 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1c250d34-2965-46ae-81ec-c73a372d0380-util\") pod \"1c250d34-2965-46ae-81ec-c73a372d0380\" (UID: \"1c250d34-2965-46ae-81ec-c73a372d0380\") " Mar 09 14:14:39 crc kubenswrapper[4722]: I0309 14:14:39.927442 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1c250d34-2965-46ae-81ec-c73a372d0380-bundle\") pod \"1c250d34-2965-46ae-81ec-c73a372d0380\" (UID: \"1c250d34-2965-46ae-81ec-c73a372d0380\") " Mar 09 14:14:39 crc kubenswrapper[4722]: I0309 14:14:39.927488 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-69kkq\" (UniqueName: \"kubernetes.io/projected/1c250d34-2965-46ae-81ec-c73a372d0380-kube-api-access-69kkq\") pod \"1c250d34-2965-46ae-81ec-c73a372d0380\" (UID: \"1c250d34-2965-46ae-81ec-c73a372d0380\") " Mar 09 14:14:39 crc kubenswrapper[4722]: I0309 14:14:39.929723 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c250d34-2965-46ae-81ec-c73a372d0380-bundle" (OuterVolumeSpecName: "bundle") pod "1c250d34-2965-46ae-81ec-c73a372d0380" (UID: "1c250d34-2965-46ae-81ec-c73a372d0380"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:14:39 crc kubenswrapper[4722]: I0309 14:14:39.936840 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c250d34-2965-46ae-81ec-c73a372d0380-kube-api-access-69kkq" (OuterVolumeSpecName: "kube-api-access-69kkq") pod "1c250d34-2965-46ae-81ec-c73a372d0380" (UID: "1c250d34-2965-46ae-81ec-c73a372d0380"). InnerVolumeSpecName "kube-api-access-69kkq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:14:39 crc kubenswrapper[4722]: I0309 14:14:39.941160 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c250d34-2965-46ae-81ec-c73a372d0380-util" (OuterVolumeSpecName: "util") pod "1c250d34-2965-46ae-81ec-c73a372d0380" (UID: "1c250d34-2965-46ae-81ec-c73a372d0380"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:14:40 crc kubenswrapper[4722]: I0309 14:14:40.028856 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-69kkq\" (UniqueName: \"kubernetes.io/projected/1c250d34-2965-46ae-81ec-c73a372d0380-kube-api-access-69kkq\") on node \"crc\" DevicePath \"\"" Mar 09 14:14:40 crc kubenswrapper[4722]: I0309 14:14:40.028915 4722 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1c250d34-2965-46ae-81ec-c73a372d0380-util\") on node \"crc\" DevicePath \"\"" Mar 09 14:14:40 crc kubenswrapper[4722]: I0309 14:14:40.028940 4722 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1c250d34-2965-46ae-81ec-c73a372d0380-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:14:40 crc kubenswrapper[4722]: I0309 14:14:40.571611 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08pcbpj" event={"ID":"1c250d34-2965-46ae-81ec-c73a372d0380","Type":"ContainerDied","Data":"260e64b9d15be2021e65ae4af4384d6f9c55c1d636ad19be4e805142b840f4b4"} Mar 09 14:14:40 crc kubenswrapper[4722]: I0309 14:14:40.571991 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="260e64b9d15be2021e65ae4af4384d6f9c55c1d636ad19be4e805142b840f4b4" Mar 09 14:14:40 crc kubenswrapper[4722]: I0309 14:14:40.571683 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08pcbpj" Mar 09 14:14:45 crc kubenswrapper[4722]: I0309 14:14:45.030660 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-5v7ng"] Mar 09 14:14:45 crc kubenswrapper[4722]: I0309 14:14:45.031635 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" podUID="2e305619-b3a2-44c9-9e54-e1afa4f43dbf" containerName="ovn-controller" containerID="cri-o://24e737417baac512fab2089f5da12e7c47ef06eae8df7803ad6b248505ff3d18" gracePeriod=30 Mar 09 14:14:45 crc kubenswrapper[4722]: I0309 14:14:45.031729 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" podUID="2e305619-b3a2-44c9-9e54-e1afa4f43dbf" containerName="sbdb" containerID="cri-o://8f5af077ae5b947194e5dba0542d053890a589582e415e1b41a932bdab6632e0" gracePeriod=30 Mar 09 14:14:45 crc kubenswrapper[4722]: I0309 14:14:45.031828 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" podUID="2e305619-b3a2-44c9-9e54-e1afa4f43dbf" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://f969085ae62d82dbe1f0f8f5d459c4a1bc7d1ae0a459da484f11dd6ae5fdd418" gracePeriod=30 Mar 09 14:14:45 crc kubenswrapper[4722]: I0309 14:14:45.031801 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" podUID="2e305619-b3a2-44c9-9e54-e1afa4f43dbf" containerName="northd" containerID="cri-o://ef1c5a3944e8693d8c68a4108c23851352407d0fac7fb7a03763d9084f117dfb" gracePeriod=30 Mar 09 14:14:45 crc kubenswrapper[4722]: I0309 14:14:45.031910 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" podUID="2e305619-b3a2-44c9-9e54-e1afa4f43dbf" containerName="kube-rbac-proxy-node" 
containerID="cri-o://d1fc80c43be93eeb2f9805bf314c8430f52062d0ce5d0fa1830e7e0c27e674e3" gracePeriod=30 Mar 09 14:14:45 crc kubenswrapper[4722]: I0309 14:14:45.031952 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" podUID="2e305619-b3a2-44c9-9e54-e1afa4f43dbf" containerName="ovn-acl-logging" containerID="cri-o://a0106fa808804a96ca8247096399e3d04eb082b2691b1f39df3db9f6c630114c" gracePeriod=30 Mar 09 14:14:45 crc kubenswrapper[4722]: I0309 14:14:45.031780 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" podUID="2e305619-b3a2-44c9-9e54-e1afa4f43dbf" containerName="nbdb" containerID="cri-o://f5765145caac69454a9a8e8de34d05cb1da06924dd365f3e74f52461be5027ae" gracePeriod=30 Mar 09 14:14:45 crc kubenswrapper[4722]: I0309 14:14:45.077380 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" podUID="2e305619-b3a2-44c9-9e54-e1afa4f43dbf" containerName="ovnkube-controller" containerID="cri-o://71c08361cf5ed90f65d201ffd82a9403cb9c01ae88ea91cbbf351f6003bfa427" gracePeriod=30 Mar 09 14:14:45 crc kubenswrapper[4722]: I0309 14:14:45.607324 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5v7ng_2e305619-b3a2-44c9-9e54-e1afa4f43dbf/ovnkube-controller/3.log" Mar 09 14:14:45 crc kubenswrapper[4722]: I0309 14:14:45.610797 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5v7ng_2e305619-b3a2-44c9-9e54-e1afa4f43dbf/ovn-acl-logging/0.log" Mar 09 14:14:45 crc kubenswrapper[4722]: I0309 14:14:45.611317 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5v7ng_2e305619-b3a2-44c9-9e54-e1afa4f43dbf/ovn-controller/0.log" Mar 09 14:14:45 crc kubenswrapper[4722]: I0309 14:14:45.611718 4722 generic.go:334] "Generic (PLEG): container finished" podID="2e305619-b3a2-44c9-9e54-e1afa4f43dbf" containerID="71c08361cf5ed90f65d201ffd82a9403cb9c01ae88ea91cbbf351f6003bfa427" exitCode=0 Mar 09 14:14:45 crc kubenswrapper[4722]: I0309 14:14:45.611748 4722 generic.go:334] "Generic (PLEG): container finished" podID="2e305619-b3a2-44c9-9e54-e1afa4f43dbf" containerID="8f5af077ae5b947194e5dba0542d053890a589582e415e1b41a932bdab6632e0" exitCode=0 Mar 09 14:14:45 crc kubenswrapper[4722]: I0309 14:14:45.611758 4722 generic.go:334] "Generic (PLEG): container finished" podID="2e305619-b3a2-44c9-9e54-e1afa4f43dbf" containerID="f5765145caac69454a9a8e8de34d05cb1da06924dd365f3e74f52461be5027ae" exitCode=0 Mar 09 14:14:45 crc kubenswrapper[4722]: I0309 14:14:45.611766 4722 generic.go:334] "Generic (PLEG): container finished" podID="2e305619-b3a2-44c9-9e54-e1afa4f43dbf" containerID="ef1c5a3944e8693d8c68a4108c23851352407d0fac7fb7a03763d9084f117dfb" exitCode=0 Mar 09 14:14:45 crc kubenswrapper[4722]: I0309 14:14:45.611772 4722 generic.go:334] "Generic (PLEG): container finished" podID="2e305619-b3a2-44c9-9e54-e1afa4f43dbf" containerID="a0106fa808804a96ca8247096399e3d04eb082b2691b1f39df3db9f6c630114c" exitCode=143 Mar 09 14:14:45 crc kubenswrapper[4722]: I0309 14:14:45.611779 4722 generic.go:334] "Generic (PLEG): container finished" podID="2e305619-b3a2-44c9-9e54-e1afa4f43dbf" containerID="24e737417baac512fab2089f5da12e7c47ef06eae8df7803ad6b248505ff3d18" exitCode=143 Mar 09 14:14:45 crc kubenswrapper[4722]: I0309 14:14:45.611804 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" event={"ID":"2e305619-b3a2-44c9-9e54-e1afa4f43dbf","Type":"ContainerDied","Data":"71c08361cf5ed90f65d201ffd82a9403cb9c01ae88ea91cbbf351f6003bfa427"} Mar 09 14:14:45 crc kubenswrapper[4722]: I0309 14:14:45.611876 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" event={"ID":"2e305619-b3a2-44c9-9e54-e1afa4f43dbf","Type":"ContainerDied","Data":"8f5af077ae5b947194e5dba0542d053890a589582e415e1b41a932bdab6632e0"} Mar 09 14:14:45 crc kubenswrapper[4722]: I0309 14:14:45.611891 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" event={"ID":"2e305619-b3a2-44c9-9e54-e1afa4f43dbf","Type":"ContainerDied","Data":"f5765145caac69454a9a8e8de34d05cb1da06924dd365f3e74f52461be5027ae"} Mar 09 14:14:45 crc kubenswrapper[4722]: I0309 14:14:45.611904 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" event={"ID":"2e305619-b3a2-44c9-9e54-e1afa4f43dbf","Type":"ContainerDied","Data":"ef1c5a3944e8693d8c68a4108c23851352407d0fac7fb7a03763d9084f117dfb"} Mar 09 14:14:45 crc kubenswrapper[4722]: I0309 14:14:45.611919 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" event={"ID":"2e305619-b3a2-44c9-9e54-e1afa4f43dbf","Type":"ContainerDied","Data":"a0106fa808804a96ca8247096399e3d04eb082b2691b1f39df3db9f6c630114c"} Mar 09 14:14:45 crc kubenswrapper[4722]: I0309 14:14:45.611930 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" event={"ID":"2e305619-b3a2-44c9-9e54-e1afa4f43dbf","Type":"ContainerDied","Data":"24e737417baac512fab2089f5da12e7c47ef06eae8df7803ad6b248505ff3d18"} Mar 09 14:14:45 crc kubenswrapper[4722]: I0309 14:14:45.611953 4722 scope.go:117] "RemoveContainer" containerID="647b164e0544f016e9b6adbb065e7b958f1f7e1446a15f120191cfb35d030220" Mar 09 14:14:45 crc kubenswrapper[4722]: I0309 14:14:45.613847 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-h4zw5_6b9e29bb-6e51-47ab-a543-b70117ab854d/kube-multus/2.log" Mar 09 14:14:45 crc kubenswrapper[4722]: I0309 14:14:45.614392 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-h4zw5_6b9e29bb-6e51-47ab-a543-b70117ab854d/kube-multus/1.log" Mar 09 14:14:45 crc kubenswrapper[4722]: I0309 14:14:45.614444 4722 generic.go:334] "Generic (PLEG): container finished" podID="6b9e29bb-6e51-47ab-a543-b70117ab854d" containerID="d35e198363967793f8437918e78c17906e68a7a9bddca3be185af7534bf15d4f" exitCode=2 Mar 09 14:14:45 crc kubenswrapper[4722]: I0309 14:14:45.614474 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-h4zw5" event={"ID":"6b9e29bb-6e51-47ab-a543-b70117ab854d","Type":"ContainerDied","Data":"d35e198363967793f8437918e78c17906e68a7a9bddca3be185af7534bf15d4f"} Mar 09 14:14:45 crc kubenswrapper[4722]: I0309 14:14:45.614992 4722 scope.go:117] "RemoveContainer" containerID="d35e198363967793f8437918e78c17906e68a7a9bddca3be185af7534bf15d4f" Mar 09 14:14:45 crc kubenswrapper[4722]: E0309 14:14:45.615174 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-h4zw5_openshift-multus(6b9e29bb-6e51-47ab-a543-b70117ab854d)\"" pod="openshift-multus/multus-h4zw5" podUID="6b9e29bb-6e51-47ab-a543-b70117ab854d" Mar 09 14:14:45 crc 
kubenswrapper[4722]: I0309 14:14:45.640111 4722 scope.go:117] "RemoveContainer" containerID="30f56826c4b193cbd284ce58320073c1b4dc43c9eba976445ba4d4bb7c089960" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.329670 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5v7ng_2e305619-b3a2-44c9-9e54-e1afa4f43dbf/ovn-acl-logging/0.log" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.330773 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5v7ng_2e305619-b3a2-44c9-9e54-e1afa4f43dbf/ovn-controller/0.log" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.332398 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.430240 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-fmhcx"] Mar 09 14:14:46 crc kubenswrapper[4722]: E0309 14:14:46.430573 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e305619-b3a2-44c9-9e54-e1afa4f43dbf" containerName="ovnkube-controller" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.430597 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e305619-b3a2-44c9-9e54-e1afa4f43dbf" containerName="ovnkube-controller" Mar 09 14:14:46 crc kubenswrapper[4722]: E0309 14:14:46.430613 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e305619-b3a2-44c9-9e54-e1afa4f43dbf" containerName="ovnkube-controller" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.430621 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e305619-b3a2-44c9-9e54-e1afa4f43dbf" containerName="ovnkube-controller" Mar 09 14:14:46 crc kubenswrapper[4722]: E0309 14:14:46.430632 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c250d34-2965-46ae-81ec-c73a372d0380" containerName="extract" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.430640 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c250d34-2965-46ae-81ec-c73a372d0380" containerName="extract" Mar 09 14:14:46 crc kubenswrapper[4722]: E0309 14:14:46.430649 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e305619-b3a2-44c9-9e54-e1afa4f43dbf" containerName="ovnkube-controller" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.430655 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e305619-b3a2-44c9-9e54-e1afa4f43dbf" containerName="ovnkube-controller" Mar 09 14:14:46 crc kubenswrapper[4722]: E0309 14:14:46.430665 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e305619-b3a2-44c9-9e54-e1afa4f43dbf" containerName="sbdb" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.430672 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e305619-b3a2-44c9-9e54-e1afa4f43dbf" containerName="sbdb" Mar 09 14:14:46 crc kubenswrapper[4722]: E0309 14:14:46.430686 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e305619-b3a2-44c9-9e54-e1afa4f43dbf" containerName="ovn-acl-logging" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.430693 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e305619-b3a2-44c9-9e54-e1afa4f43dbf" containerName="ovn-acl-logging" Mar 09 14:14:46 crc kubenswrapper[4722]: E0309 14:14:46.430708 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e305619-b3a2-44c9-9e54-e1afa4f43dbf" containerName="ovn-controller" Mar 09 
14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.430715 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e305619-b3a2-44c9-9e54-e1afa4f43dbf" containerName="ovn-controller" Mar 09 14:14:46 crc kubenswrapper[4722]: E0309 14:14:46.430727 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c250d34-2965-46ae-81ec-c73a372d0380" containerName="util" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.430733 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c250d34-2965-46ae-81ec-c73a372d0380" containerName="util" Mar 09 14:14:46 crc kubenswrapper[4722]: E0309 14:14:46.430742 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e305619-b3a2-44c9-9e54-e1afa4f43dbf" containerName="kube-rbac-proxy-ovn-metrics" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.430748 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e305619-b3a2-44c9-9e54-e1afa4f43dbf" containerName="kube-rbac-proxy-ovn-metrics" Mar 09 14:14:46 crc kubenswrapper[4722]: E0309 14:14:46.430761 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e305619-b3a2-44c9-9e54-e1afa4f43dbf" containerName="northd" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.430768 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e305619-b3a2-44c9-9e54-e1afa4f43dbf" containerName="northd" Mar 09 14:14:46 crc kubenswrapper[4722]: E0309 14:14:46.430778 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e305619-b3a2-44c9-9e54-e1afa4f43dbf" containerName="kubecfg-setup" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.430785 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e305619-b3a2-44c9-9e54-e1afa4f43dbf" containerName="kubecfg-setup" Mar 09 14:14:46 crc kubenswrapper[4722]: E0309 14:14:46.430797 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e305619-b3a2-44c9-9e54-e1afa4f43dbf" containerName="kube-rbac-proxy-node" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.430804 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e305619-b3a2-44c9-9e54-e1afa4f43dbf" containerName="kube-rbac-proxy-node" Mar 09 14:14:46 crc kubenswrapper[4722]: E0309 14:14:46.430813 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c250d34-2965-46ae-81ec-c73a372d0380" containerName="pull" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.430821 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c250d34-2965-46ae-81ec-c73a372d0380" containerName="pull" Mar 09 14:14:46 crc kubenswrapper[4722]: E0309 14:14:46.430852 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e305619-b3a2-44c9-9e54-e1afa4f43dbf" containerName="nbdb" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.430860 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e305619-b3a2-44c9-9e54-e1afa4f43dbf" containerName="nbdb" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.430993 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e305619-b3a2-44c9-9e54-e1afa4f43dbf" containerName="nbdb" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.431009 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e305619-b3a2-44c9-9e54-e1afa4f43dbf" containerName="ovn-controller" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.431017 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e305619-b3a2-44c9-9e54-e1afa4f43dbf" containerName="sbdb" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 
14:14:46.431027 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e305619-b3a2-44c9-9e54-e1afa4f43dbf" containerName="ovnkube-controller" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.431036 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e305619-b3a2-44c9-9e54-e1afa4f43dbf" containerName="ovnkube-controller" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.431044 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e305619-b3a2-44c9-9e54-e1afa4f43dbf" containerName="northd" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.431057 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e305619-b3a2-44c9-9e54-e1afa4f43dbf" containerName="kube-rbac-proxy-node" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.431068 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e305619-b3a2-44c9-9e54-e1afa4f43dbf" containerName="ovnkube-controller" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.431077 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e305619-b3a2-44c9-9e54-e1afa4f43dbf" containerName="ovnkube-controller" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.431086 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e305619-b3a2-44c9-9e54-e1afa4f43dbf" containerName="ovn-acl-logging" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.431098 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e305619-b3a2-44c9-9e54-e1afa4f43dbf" containerName="kube-rbac-proxy-ovn-metrics" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.431107 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c250d34-2965-46ae-81ec-c73a372d0380" containerName="extract" Mar 09 14:14:46 crc kubenswrapper[4722]: E0309 14:14:46.431247 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e305619-b3a2-44c9-9e54-e1afa4f43dbf" containerName="ovnkube-controller" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.431258 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e305619-b3a2-44c9-9e54-e1afa4f43dbf" containerName="ovnkube-controller" Mar 09 14:14:46 crc kubenswrapper[4722]: E0309 14:14:46.431271 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e305619-b3a2-44c9-9e54-e1afa4f43dbf" containerName="ovnkube-controller" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.431279 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e305619-b3a2-44c9-9e54-e1afa4f43dbf" containerName="ovnkube-controller" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.431409 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e305619-b3a2-44c9-9e54-e1afa4f43dbf" containerName="ovnkube-controller" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.433178 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-ovnkube-config\") pod \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\" (UID: \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\") " Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.433241 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-host-run-ovn-kubernetes\") pod \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\" (UID: \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\") " Mar 09 14:14:46 crc 
kubenswrapper[4722]: I0309 14:14:46.433277 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-run-ovn\") pod \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\" (UID: \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\") " Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.433313 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-ovn-node-metrics-cert\") pod \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\" (UID: \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\") " Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.433347 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-host-slash\") pod \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\" (UID: \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\") " Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.433375 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-var-lib-openvswitch\") pod \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\" (UID: \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\") " Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.433397 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-ovnkube-script-lib\") pod \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\" (UID: \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\") " Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.433416 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-run-openvswitch\") pod \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\" (UID: \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\") " Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.433440 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-run-systemd\") pod \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\" (UID: \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\") " Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.433455 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-env-overrides\") pod \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\" (UID: \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\") " Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.433497 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-host-kubelet\") pod \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\" (UID: \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\") " Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.433528 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-node-log\") pod \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\" (UID: \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\") " Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 
14:14:46.433548 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-log-socket\") pod \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\" (UID: \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\") " Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.433565 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-host-cni-bin\") pod \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\" (UID: \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\") " Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.433591 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6nlp\" (UniqueName: \"kubernetes.io/projected/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-kube-api-access-v6nlp\") pod \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\" (UID: \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\") " Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.433613 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "2e305619-b3a2-44c9-9e54-e1afa4f43dbf" (UID: "2e305619-b3a2-44c9-9e54-e1afa4f43dbf"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.433628 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-etc-openvswitch\") pod \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\" (UID: \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\") " Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.433676 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "2e305619-b3a2-44c9-9e54-e1afa4f43dbf" (UID: "2e305619-b3a2-44c9-9e54-e1afa4f43dbf"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.433687 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-systemd-units\") pod \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\" (UID: \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\") " Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.433708 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "2e305619-b3a2-44c9-9e54-e1afa4f43dbf" (UID: "2e305619-b3a2-44c9-9e54-e1afa4f43dbf"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.433712 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fmhcx" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.433725 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-host-run-netns\") pod \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\" (UID: \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\") " Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.433727 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "2e305619-b3a2-44c9-9e54-e1afa4f43dbf" (UID: "2e305619-b3a2-44c9-9e54-e1afa4f43dbf"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.433751 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-host-var-lib-cni-networks-ovn-kubernetes\") pod \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\" (UID: \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\") " Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.433777 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-host-cni-netd\") pod \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\" (UID: \"2e305619-b3a2-44c9-9e54-e1afa4f43dbf\") " Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.434162 4722 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.434181 4722 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.434193 4722 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.434218 4722 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.434217 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "2e305619-b3a2-44c9-9e54-e1afa4f43dbf" (UID: "2e305619-b3a2-44c9-9e54-e1afa4f43dbf"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.434247 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "2e305619-b3a2-44c9-9e54-e1afa4f43dbf" (UID: "2e305619-b3a2-44c9-9e54-e1afa4f43dbf"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.434238 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "2e305619-b3a2-44c9-9e54-e1afa4f43dbf" (UID: "2e305619-b3a2-44c9-9e54-e1afa4f43dbf"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.434271 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "2e305619-b3a2-44c9-9e54-e1afa4f43dbf" (UID: "2e305619-b3a2-44c9-9e54-e1afa4f43dbf"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.434270 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "2e305619-b3a2-44c9-9e54-e1afa4f43dbf" (UID: "2e305619-b3a2-44c9-9e54-e1afa4f43dbf"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.434300 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "2e305619-b3a2-44c9-9e54-e1afa4f43dbf" (UID: "2e305619-b3a2-44c9-9e54-e1afa4f43dbf"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.434300 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "2e305619-b3a2-44c9-9e54-e1afa4f43dbf" (UID: "2e305619-b3a2-44c9-9e54-e1afa4f43dbf"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.434332 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-host-slash" (OuterVolumeSpecName: "host-slash") pod "2e305619-b3a2-44c9-9e54-e1afa4f43dbf" (UID: "2e305619-b3a2-44c9-9e54-e1afa4f43dbf"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.434362 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "2e305619-b3a2-44c9-9e54-e1afa4f43dbf" (UID: "2e305619-b3a2-44c9-9e54-e1afa4f43dbf"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.434379 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-log-socket" (OuterVolumeSpecName: "log-socket") pod "2e305619-b3a2-44c9-9e54-e1afa4f43dbf" (UID: "2e305619-b3a2-44c9-9e54-e1afa4f43dbf"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.434405 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-node-log" (OuterVolumeSpecName: "node-log") pod "2e305619-b3a2-44c9-9e54-e1afa4f43dbf" (UID: "2e305619-b3a2-44c9-9e54-e1afa4f43dbf"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.434675 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "2e305619-b3a2-44c9-9e54-e1afa4f43dbf" (UID: "2e305619-b3a2-44c9-9e54-e1afa4f43dbf"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.434746 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "2e305619-b3a2-44c9-9e54-e1afa4f43dbf" (UID: "2e305619-b3a2-44c9-9e54-e1afa4f43dbf"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.444600 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "2e305619-b3a2-44c9-9e54-e1afa4f43dbf" (UID: "2e305619-b3a2-44c9-9e54-e1afa4f43dbf"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.445496 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-kube-api-access-v6nlp" (OuterVolumeSpecName: "kube-api-access-v6nlp") pod "2e305619-b3a2-44c9-9e54-e1afa4f43dbf" (UID: "2e305619-b3a2-44c9-9e54-e1afa4f43dbf"). InnerVolumeSpecName "kube-api-access-v6nlp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.465460 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "2e305619-b3a2-44c9-9e54-e1afa4f43dbf" (UID: "2e305619-b3a2-44c9-9e54-e1afa4f43dbf"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.535881 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/731c31a2-ded2-452f-b330-0cf118ab1e84-host-run-netns\") pod \"ovnkube-node-fmhcx\" (UID: \"731c31a2-ded2-452f-b330-0cf118ab1e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmhcx" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.535931 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/731c31a2-ded2-452f-b330-0cf118ab1e84-ovnkube-config\") pod \"ovnkube-node-fmhcx\" (UID: \"731c31a2-ded2-452f-b330-0cf118ab1e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmhcx" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.535957 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccmnd\" (UniqueName: \"kubernetes.io/projected/731c31a2-ded2-452f-b330-0cf118ab1e84-kube-api-access-ccmnd\") pod \"ovnkube-node-fmhcx\" (UID: \"731c31a2-ded2-452f-b330-0cf118ab1e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmhcx" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.535974 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/731c31a2-ded2-452f-b330-0cf118ab1e84-var-lib-openvswitch\") pod \"ovnkube-node-fmhcx\" (UID: \"731c31a2-ded2-452f-b330-0cf118ab1e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmhcx" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.535997 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/731c31a2-ded2-452f-b330-0cf118ab1e84-host-run-ovn-kubernetes\") pod \"ovnkube-node-fmhcx\" (UID: \"731c31a2-ded2-452f-b330-0cf118ab1e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmhcx" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.536012 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/731c31a2-ded2-452f-b330-0cf118ab1e84-ovn-node-metrics-cert\") pod \"ovnkube-node-fmhcx\" (UID: \"731c31a2-ded2-452f-b330-0cf118ab1e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmhcx" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.536029 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/731c31a2-ded2-452f-b330-0cf118ab1e84-run-openvswitch\") pod \"ovnkube-node-fmhcx\" (UID: \"731c31a2-ded2-452f-b330-0cf118ab1e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmhcx" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.536043 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/731c31a2-ded2-452f-b330-0cf118ab1e84-etc-openvswitch\") pod \"ovnkube-node-fmhcx\" (UID: \"731c31a2-ded2-452f-b330-0cf118ab1e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmhcx" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.536062 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/731c31a2-ded2-452f-b330-0cf118ab1e84-run-systemd\") pod \"ovnkube-node-fmhcx\" (UID: \"731c31a2-ded2-452f-b330-0cf118ab1e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmhcx" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.536080 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/731c31a2-ded2-452f-b330-0cf118ab1e84-host-slash\") pod \"ovnkube-node-fmhcx\" (UID: \"731c31a2-ded2-452f-b330-0cf118ab1e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmhcx" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.536111 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/731c31a2-ded2-452f-b330-0cf118ab1e84-log-socket\") pod \"ovnkube-node-fmhcx\" (UID: \"731c31a2-ded2-452f-b330-0cf118ab1e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmhcx" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.536129 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/731c31a2-ded2-452f-b330-0cf118ab1e84-node-log\") pod \"ovnkube-node-fmhcx\" (UID: \"731c31a2-ded2-452f-b330-0cf118ab1e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmhcx" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.536145 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/731c31a2-ded2-452f-b330-0cf118ab1e84-ovnkube-script-lib\") pod \"ovnkube-node-fmhcx\" (UID: \"731c31a2-ded2-452f-b330-0cf118ab1e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmhcx" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.536164 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/731c31a2-ded2-452f-b330-0cf118ab1e84-host-cni-bin\") pod \"ovnkube-node-fmhcx\" (UID: \"731c31a2-ded2-452f-b330-0cf118ab1e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmhcx" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.536177 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/731c31a2-ded2-452f-b330-0cf118ab1e84-host-cni-netd\") pod \"ovnkube-node-fmhcx\" (UID: \"731c31a2-ded2-452f-b330-0cf118ab1e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmhcx" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.536214 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/731c31a2-ded2-452f-b330-0cf118ab1e84-run-ovn\") pod \"ovnkube-node-fmhcx\" (UID: \"731c31a2-ded2-452f-b330-0cf118ab1e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmhcx" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.536233 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/731c31a2-ded2-452f-b330-0cf118ab1e84-env-overrides\") pod \"ovnkube-node-fmhcx\" (UID: \"731c31a2-ded2-452f-b330-0cf118ab1e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmhcx" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.536252 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/731c31a2-ded2-452f-b330-0cf118ab1e84-host-kubelet\") pod \"ovnkube-node-fmhcx\" (UID: \"731c31a2-ded2-452f-b330-0cf118ab1e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmhcx" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.536279 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/731c31a2-ded2-452f-b330-0cf118ab1e84-systemd-units\") pod \"ovnkube-node-fmhcx\" (UID: \"731c31a2-ded2-452f-b330-0cf118ab1e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmhcx" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.536298 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/731c31a2-ded2-452f-b330-0cf118ab1e84-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fmhcx\" (UID: \"731c31a2-ded2-452f-b330-0cf118ab1e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmhcx" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.536337 4722 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.536348 4722 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.536358 4722 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.536368 4722 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.536378 4722 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.536386 4722 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-host-slash\") on node \"crc\" DevicePath \"\"" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.536395 4722 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.536404 4722 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.536413 4722 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-run-openvswitch\") on node 
\"crc\" DevicePath \"\"" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.536421 4722 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.536430 4722 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.536438 4722 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.536447 4722 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-node-log\") on node \"crc\" DevicePath \"\"" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.536456 4722 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-log-socket\") on node \"crc\" DevicePath \"\"" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.536465 4722 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.536473 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v6nlp\" (UniqueName: \"kubernetes.io/projected/2e305619-b3a2-44c9-9e54-e1afa4f43dbf-kube-api-access-v6nlp\") on node \"crc\" DevicePath \"\"" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.625181 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5v7ng_2e305619-b3a2-44c9-9e54-e1afa4f43dbf/ovn-acl-logging/0.log" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.626253 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5v7ng_2e305619-b3a2-44c9-9e54-e1afa4f43dbf/ovn-controller/0.log" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.626708 4722 generic.go:334] "Generic (PLEG): container finished" podID="2e305619-b3a2-44c9-9e54-e1afa4f43dbf" containerID="f969085ae62d82dbe1f0f8f5d459c4a1bc7d1ae0a459da484f11dd6ae5fdd418" exitCode=0 Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.626744 4722 generic.go:334] "Generic (PLEG): container finished" podID="2e305619-b3a2-44c9-9e54-e1afa4f43dbf" containerID="d1fc80c43be93eeb2f9805bf314c8430f52062d0ce5d0fa1830e7e0c27e674e3" exitCode=0 Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.627023 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.627016 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" event={"ID":"2e305619-b3a2-44c9-9e54-e1afa4f43dbf","Type":"ContainerDied","Data":"f969085ae62d82dbe1f0f8f5d459c4a1bc7d1ae0a459da484f11dd6ae5fdd418"} Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.627096 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" event={"ID":"2e305619-b3a2-44c9-9e54-e1afa4f43dbf","Type":"ContainerDied","Data":"d1fc80c43be93eeb2f9805bf314c8430f52062d0ce5d0fa1830e7e0c27e674e3"} Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.627129 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5v7ng" event={"ID":"2e305619-b3a2-44c9-9e54-e1afa4f43dbf","Type":"ContainerDied","Data":"d2f6057c8c3de0bebcb5e88e9ed6dd935f0fc52db361546f77994d3be96ebf04"} Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.627159 4722 scope.go:117] "RemoveContainer" containerID="71c08361cf5ed90f65d201ffd82a9403cb9c01ae88ea91cbbf351f6003bfa427" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.629695 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-h4zw5_6b9e29bb-6e51-47ab-a543-b70117ab854d/kube-multus/2.log" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.637927 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/731c31a2-ded2-452f-b330-0cf118ab1e84-ovnkube-script-lib\") pod \"ovnkube-node-fmhcx\" (UID: \"731c31a2-ded2-452f-b330-0cf118ab1e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmhcx" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.637966 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/731c31a2-ded2-452f-b330-0cf118ab1e84-host-cni-bin\") pod \"ovnkube-node-fmhcx\" (UID: \"731c31a2-ded2-452f-b330-0cf118ab1e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmhcx" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.637987 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/731c31a2-ded2-452f-b330-0cf118ab1e84-host-cni-netd\") pod \"ovnkube-node-fmhcx\" (UID: \"731c31a2-ded2-452f-b330-0cf118ab1e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmhcx" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.638005 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/731c31a2-ded2-452f-b330-0cf118ab1e84-run-ovn\") pod \"ovnkube-node-fmhcx\" (UID: \"731c31a2-ded2-452f-b330-0cf118ab1e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmhcx" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.638026 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/731c31a2-ded2-452f-b330-0cf118ab1e84-env-overrides\") pod \"ovnkube-node-fmhcx\" (UID: \"731c31a2-ded2-452f-b330-0cf118ab1e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmhcx" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.638046 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/731c31a2-ded2-452f-b330-0cf118ab1e84-host-kubelet\") pod \"ovnkube-node-fmhcx\" (UID: \"731c31a2-ded2-452f-b330-0cf118ab1e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmhcx" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.638074 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/731c31a2-ded2-452f-b330-0cf118ab1e84-systemd-units\") pod \"ovnkube-node-fmhcx\" (UID: \"731c31a2-ded2-452f-b330-0cf118ab1e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmhcx" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.638089 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/731c31a2-ded2-452f-b330-0cf118ab1e84-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fmhcx\" (UID: \"731c31a2-ded2-452f-b330-0cf118ab1e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmhcx" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.638116 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/731c31a2-ded2-452f-b330-0cf118ab1e84-host-run-netns\") pod \"ovnkube-node-fmhcx\" (UID: \"731c31a2-ded2-452f-b330-0cf118ab1e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmhcx" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.638131 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/731c31a2-ded2-452f-b330-0cf118ab1e84-ovnkube-config\") pod \"ovnkube-node-fmhcx\" (UID: \"731c31a2-ded2-452f-b330-0cf118ab1e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmhcx" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.638157 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccmnd\" (UniqueName: \"kubernetes.io/projected/731c31a2-ded2-452f-b330-0cf118ab1e84-kube-api-access-ccmnd\") pod \"ovnkube-node-fmhcx\" (UID: \"731c31a2-ded2-452f-b330-0cf118ab1e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmhcx" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.638176 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/731c31a2-ded2-452f-b330-0cf118ab1e84-var-lib-openvswitch\") pod \"ovnkube-node-fmhcx\" (UID: \"731c31a2-ded2-452f-b330-0cf118ab1e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmhcx" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.638194 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/731c31a2-ded2-452f-b330-0cf118ab1e84-ovn-node-metrics-cert\") pod \"ovnkube-node-fmhcx\" (UID: \"731c31a2-ded2-452f-b330-0cf118ab1e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmhcx" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.638226 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/731c31a2-ded2-452f-b330-0cf118ab1e84-host-run-ovn-kubernetes\") pod \"ovnkube-node-fmhcx\" (UID: \"731c31a2-ded2-452f-b330-0cf118ab1e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmhcx" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.638243 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/731c31a2-ded2-452f-b330-0cf118ab1e84-etc-openvswitch\") pod \"ovnkube-node-fmhcx\" (UID: \"731c31a2-ded2-452f-b330-0cf118ab1e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmhcx" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.638259 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/731c31a2-ded2-452f-b330-0cf118ab1e84-run-openvswitch\") pod \"ovnkube-node-fmhcx\" (UID: \"731c31a2-ded2-452f-b330-0cf118ab1e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmhcx" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.638277 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/731c31a2-ded2-452f-b330-0cf118ab1e84-run-systemd\") pod \"ovnkube-node-fmhcx\" (UID: \"731c31a2-ded2-452f-b330-0cf118ab1e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmhcx" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.638292 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/731c31a2-ded2-452f-b330-0cf118ab1e84-host-slash\") pod \"ovnkube-node-fmhcx\" (UID: \"731c31a2-ded2-452f-b330-0cf118ab1e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmhcx" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.638318 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/731c31a2-ded2-452f-b330-0cf118ab1e84-log-socket\") pod \"ovnkube-node-fmhcx\" (UID: \"731c31a2-ded2-452f-b330-0cf118ab1e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmhcx" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.638335 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/731c31a2-ded2-452f-b330-0cf118ab1e84-node-log\") pod \"ovnkube-node-fmhcx\" (UID: \"731c31a2-ded2-452f-b330-0cf118ab1e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmhcx" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.638403 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/731c31a2-ded2-452f-b330-0cf118ab1e84-node-log\") pod \"ovnkube-node-fmhcx\" (UID: \"731c31a2-ded2-452f-b330-0cf118ab1e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmhcx" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.639114 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/731c31a2-ded2-452f-b330-0cf118ab1e84-etc-openvswitch\") pod \"ovnkube-node-fmhcx\" (UID: \"731c31a2-ded2-452f-b330-0cf118ab1e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmhcx" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.639137 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/731c31a2-ded2-452f-b330-0cf118ab1e84-run-systemd\") pod \"ovnkube-node-fmhcx\" (UID: \"731c31a2-ded2-452f-b330-0cf118ab1e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmhcx" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.639132 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/731c31a2-ded2-452f-b330-0cf118ab1e84-ovnkube-script-lib\") pod \"ovnkube-node-fmhcx\" (UID: \"731c31a2-ded2-452f-b330-0cf118ab1e84\") 
" pod="openshift-ovn-kubernetes/ovnkube-node-fmhcx" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.639155 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/731c31a2-ded2-452f-b330-0cf118ab1e84-host-slash\") pod \"ovnkube-node-fmhcx\" (UID: \"731c31a2-ded2-452f-b330-0cf118ab1e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmhcx" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.639424 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/731c31a2-ded2-452f-b330-0cf118ab1e84-env-overrides\") pod \"ovnkube-node-fmhcx\" (UID: \"731c31a2-ded2-452f-b330-0cf118ab1e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmhcx" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.639986 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/731c31a2-ded2-452f-b330-0cf118ab1e84-host-cni-netd\") pod \"ovnkube-node-fmhcx\" (UID: \"731c31a2-ded2-452f-b330-0cf118ab1e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmhcx" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.640036 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/731c31a2-ded2-452f-b330-0cf118ab1e84-host-cni-bin\") pod \"ovnkube-node-fmhcx\" (UID: \"731c31a2-ded2-452f-b330-0cf118ab1e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmhcx" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.640024 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/731c31a2-ded2-452f-b330-0cf118ab1e84-systemd-units\") pod \"ovnkube-node-fmhcx\" (UID: \"731c31a2-ded2-452f-b330-0cf118ab1e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmhcx" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.640058 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/731c31a2-ded2-452f-b330-0cf118ab1e84-host-run-ovn-kubernetes\") pod \"ovnkube-node-fmhcx\" (UID: \"731c31a2-ded2-452f-b330-0cf118ab1e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmhcx" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.640068 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/731c31a2-ded2-452f-b330-0cf118ab1e84-host-kubelet\") pod \"ovnkube-node-fmhcx\" (UID: \"731c31a2-ded2-452f-b330-0cf118ab1e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmhcx" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.640061 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/731c31a2-ded2-452f-b330-0cf118ab1e84-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fmhcx\" (UID: \"731c31a2-ded2-452f-b330-0cf118ab1e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmhcx" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.640070 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/731c31a2-ded2-452f-b330-0cf118ab1e84-run-ovn\") pod \"ovnkube-node-fmhcx\" (UID: \"731c31a2-ded2-452f-b330-0cf118ab1e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmhcx" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.640105 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/731c31a2-ded2-452f-b330-0cf118ab1e84-run-openvswitch\") pod \"ovnkube-node-fmhcx\" (UID: \"731c31a2-ded2-452f-b330-0cf118ab1e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmhcx" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.640089 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/731c31a2-ded2-452f-b330-0cf118ab1e84-var-lib-openvswitch\") pod \"ovnkube-node-fmhcx\" (UID: \"731c31a2-ded2-452f-b330-0cf118ab1e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmhcx" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.640076 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/731c31a2-ded2-452f-b330-0cf118ab1e84-host-run-netns\") pod \"ovnkube-node-fmhcx\" (UID: \"731c31a2-ded2-452f-b330-0cf118ab1e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmhcx" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.640626 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/731c31a2-ded2-452f-b330-0cf118ab1e84-log-socket\") pod \"ovnkube-node-fmhcx\" (UID: \"731c31a2-ded2-452f-b330-0cf118ab1e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmhcx" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.642569 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/731c31a2-ded2-452f-b330-0cf118ab1e84-ovnkube-config\") pod \"ovnkube-node-fmhcx\" (UID: \"731c31a2-ded2-452f-b330-0cf118ab1e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmhcx" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.645742 4722 scope.go:117] "RemoveContainer" containerID="8f5af077ae5b947194e5dba0542d053890a589582e415e1b41a932bdab6632e0" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.646563 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/731c31a2-ded2-452f-b330-0cf118ab1e84-ovn-node-metrics-cert\") pod \"ovnkube-node-fmhcx\" (UID: \"731c31a2-ded2-452f-b330-0cf118ab1e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmhcx" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.657770 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccmnd\" (UniqueName: \"kubernetes.io/projected/731c31a2-ded2-452f-b330-0cf118ab1e84-kube-api-access-ccmnd\") pod \"ovnkube-node-fmhcx\" (UID: \"731c31a2-ded2-452f-b330-0cf118ab1e84\") " pod="openshift-ovn-kubernetes/ovnkube-node-fmhcx" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.660516 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-5v7ng"] Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.665354 4722 scope.go:117] "RemoveContainer" containerID="f5765145caac69454a9a8e8de34d05cb1da06924dd365f3e74f52461be5027ae" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.665755 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-5v7ng"] Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.678953 4722 scope.go:117] "RemoveContainer" containerID="ef1c5a3944e8693d8c68a4108c23851352407d0fac7fb7a03763d9084f117dfb" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.692072 4722 scope.go:117] 
"RemoveContainer" containerID="f969085ae62d82dbe1f0f8f5d459c4a1bc7d1ae0a459da484f11dd6ae5fdd418" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.707683 4722 scope.go:117] "RemoveContainer" containerID="d1fc80c43be93eeb2f9805bf314c8430f52062d0ce5d0fa1830e7e0c27e674e3" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.730314 4722 scope.go:117] "RemoveContainer" containerID="a0106fa808804a96ca8247096399e3d04eb082b2691b1f39df3db9f6c630114c" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.754757 4722 scope.go:117] "RemoveContainer" containerID="24e737417baac512fab2089f5da12e7c47ef06eae8df7803ad6b248505ff3d18" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.775839 4722 scope.go:117] "RemoveContainer" containerID="bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.777391 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fmhcx" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.803055 4722 scope.go:117] "RemoveContainer" containerID="71c08361cf5ed90f65d201ffd82a9403cb9c01ae88ea91cbbf351f6003bfa427" Mar 09 14:14:46 crc kubenswrapper[4722]: E0309 14:14:46.803586 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71c08361cf5ed90f65d201ffd82a9403cb9c01ae88ea91cbbf351f6003bfa427\": container with ID starting with 71c08361cf5ed90f65d201ffd82a9403cb9c01ae88ea91cbbf351f6003bfa427 not found: ID does not exist" containerID="71c08361cf5ed90f65d201ffd82a9403cb9c01ae88ea91cbbf351f6003bfa427" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.803642 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71c08361cf5ed90f65d201ffd82a9403cb9c01ae88ea91cbbf351f6003bfa427"} err="failed to get container status \"71c08361cf5ed90f65d201ffd82a9403cb9c01ae88ea91cbbf351f6003bfa427\": rpc error: code = NotFound desc = could not find container \"71c08361cf5ed90f65d201ffd82a9403cb9c01ae88ea91cbbf351f6003bfa427\": container with ID starting with 71c08361cf5ed90f65d201ffd82a9403cb9c01ae88ea91cbbf351f6003bfa427 not found: ID does not exist" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.803684 4722 scope.go:117] "RemoveContainer" containerID="8f5af077ae5b947194e5dba0542d053890a589582e415e1b41a932bdab6632e0" Mar 09 14:14:46 crc kubenswrapper[4722]: E0309 14:14:46.804048 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f5af077ae5b947194e5dba0542d053890a589582e415e1b41a932bdab6632e0\": container with ID starting with 8f5af077ae5b947194e5dba0542d053890a589582e415e1b41a932bdab6632e0 not found: ID does not exist" containerID="8f5af077ae5b947194e5dba0542d053890a589582e415e1b41a932bdab6632e0" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.804084 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f5af077ae5b947194e5dba0542d053890a589582e415e1b41a932bdab6632e0"} err="failed to get container status \"8f5af077ae5b947194e5dba0542d053890a589582e415e1b41a932bdab6632e0\": rpc error: code = NotFound desc = could not find container \"8f5af077ae5b947194e5dba0542d053890a589582e415e1b41a932bdab6632e0\": container with ID starting with 8f5af077ae5b947194e5dba0542d053890a589582e415e1b41a932bdab6632e0 not found: ID does not exist" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.804110 4722 
scope.go:117] "RemoveContainer" containerID="f5765145caac69454a9a8e8de34d05cb1da06924dd365f3e74f52461be5027ae" Mar 09 14:14:46 crc kubenswrapper[4722]: E0309 14:14:46.804493 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5765145caac69454a9a8e8de34d05cb1da06924dd365f3e74f52461be5027ae\": container with ID starting with f5765145caac69454a9a8e8de34d05cb1da06924dd365f3e74f52461be5027ae not found: ID does not exist" containerID="f5765145caac69454a9a8e8de34d05cb1da06924dd365f3e74f52461be5027ae" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.804526 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5765145caac69454a9a8e8de34d05cb1da06924dd365f3e74f52461be5027ae"} err="failed to get container status \"f5765145caac69454a9a8e8de34d05cb1da06924dd365f3e74f52461be5027ae\": rpc error: code = NotFound desc = could not find container \"f5765145caac69454a9a8e8de34d05cb1da06924dd365f3e74f52461be5027ae\": container with ID starting with f5765145caac69454a9a8e8de34d05cb1da06924dd365f3e74f52461be5027ae not found: ID does not exist" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.804546 4722 scope.go:117] "RemoveContainer" containerID="ef1c5a3944e8693d8c68a4108c23851352407d0fac7fb7a03763d9084f117dfb" Mar 09 14:14:46 crc kubenswrapper[4722]: E0309 14:14:46.805069 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef1c5a3944e8693d8c68a4108c23851352407d0fac7fb7a03763d9084f117dfb\": container with ID starting with ef1c5a3944e8693d8c68a4108c23851352407d0fac7fb7a03763d9084f117dfb not found: ID does not exist" containerID="ef1c5a3944e8693d8c68a4108c23851352407d0fac7fb7a03763d9084f117dfb" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.805120 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef1c5a3944e8693d8c68a4108c23851352407d0fac7fb7a03763d9084f117dfb"} err="failed to get container status \"ef1c5a3944e8693d8c68a4108c23851352407d0fac7fb7a03763d9084f117dfb\": rpc error: code = NotFound desc = could not find container \"ef1c5a3944e8693d8c68a4108c23851352407d0fac7fb7a03763d9084f117dfb\": container with ID starting with ef1c5a3944e8693d8c68a4108c23851352407d0fac7fb7a03763d9084f117dfb not found: ID does not exist" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.805145 4722 scope.go:117] "RemoveContainer" containerID="f969085ae62d82dbe1f0f8f5d459c4a1bc7d1ae0a459da484f11dd6ae5fdd418" Mar 09 14:14:46 crc kubenswrapper[4722]: E0309 14:14:46.805452 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f969085ae62d82dbe1f0f8f5d459c4a1bc7d1ae0a459da484f11dd6ae5fdd418\": container with ID starting with f969085ae62d82dbe1f0f8f5d459c4a1bc7d1ae0a459da484f11dd6ae5fdd418 not found: ID does not exist" containerID="f969085ae62d82dbe1f0f8f5d459c4a1bc7d1ae0a459da484f11dd6ae5fdd418" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.805483 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f969085ae62d82dbe1f0f8f5d459c4a1bc7d1ae0a459da484f11dd6ae5fdd418"} err="failed to get container status \"f969085ae62d82dbe1f0f8f5d459c4a1bc7d1ae0a459da484f11dd6ae5fdd418\": rpc error: code = NotFound desc = could not find container \"f969085ae62d82dbe1f0f8f5d459c4a1bc7d1ae0a459da484f11dd6ae5fdd418\": container with ID starting with 
f969085ae62d82dbe1f0f8f5d459c4a1bc7d1ae0a459da484f11dd6ae5fdd418 not found: ID does not exist" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.805502 4722 scope.go:117] "RemoveContainer" containerID="d1fc80c43be93eeb2f9805bf314c8430f52062d0ce5d0fa1830e7e0c27e674e3" Mar 09 14:14:46 crc kubenswrapper[4722]: E0309 14:14:46.805803 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1fc80c43be93eeb2f9805bf314c8430f52062d0ce5d0fa1830e7e0c27e674e3\": container with ID starting with d1fc80c43be93eeb2f9805bf314c8430f52062d0ce5d0fa1830e7e0c27e674e3 not found: ID does not exist" containerID="d1fc80c43be93eeb2f9805bf314c8430f52062d0ce5d0fa1830e7e0c27e674e3" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.805838 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1fc80c43be93eeb2f9805bf314c8430f52062d0ce5d0fa1830e7e0c27e674e3"} err="failed to get container status \"d1fc80c43be93eeb2f9805bf314c8430f52062d0ce5d0fa1830e7e0c27e674e3\": rpc error: code = NotFound desc = could not find container \"d1fc80c43be93eeb2f9805bf314c8430f52062d0ce5d0fa1830e7e0c27e674e3\": container with ID starting with d1fc80c43be93eeb2f9805bf314c8430f52062d0ce5d0fa1830e7e0c27e674e3 not found: ID does not exist" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.805858 4722 scope.go:117] "RemoveContainer" containerID="a0106fa808804a96ca8247096399e3d04eb082b2691b1f39df3db9f6c630114c" Mar 09 14:14:46 crc kubenswrapper[4722]: E0309 14:14:46.806238 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0106fa808804a96ca8247096399e3d04eb082b2691b1f39df3db9f6c630114c\": container with ID starting with a0106fa808804a96ca8247096399e3d04eb082b2691b1f39df3db9f6c630114c not found: ID does not exist" containerID="a0106fa808804a96ca8247096399e3d04eb082b2691b1f39df3db9f6c630114c" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.806296 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0106fa808804a96ca8247096399e3d04eb082b2691b1f39df3db9f6c630114c"} err="failed to get container status \"a0106fa808804a96ca8247096399e3d04eb082b2691b1f39df3db9f6c630114c\": rpc error: code = NotFound desc = could not find container \"a0106fa808804a96ca8247096399e3d04eb082b2691b1f39df3db9f6c630114c\": container with ID starting with a0106fa808804a96ca8247096399e3d04eb082b2691b1f39df3db9f6c630114c not found: ID does not exist" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.806329 4722 scope.go:117] "RemoveContainer" containerID="24e737417baac512fab2089f5da12e7c47ef06eae8df7803ad6b248505ff3d18" Mar 09 14:14:46 crc kubenswrapper[4722]: E0309 14:14:46.806678 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24e737417baac512fab2089f5da12e7c47ef06eae8df7803ad6b248505ff3d18\": container with ID starting with 24e737417baac512fab2089f5da12e7c47ef06eae8df7803ad6b248505ff3d18 not found: ID does not exist" containerID="24e737417baac512fab2089f5da12e7c47ef06eae8df7803ad6b248505ff3d18" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.806711 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24e737417baac512fab2089f5da12e7c47ef06eae8df7803ad6b248505ff3d18"} err="failed to get container status \"24e737417baac512fab2089f5da12e7c47ef06eae8df7803ad6b248505ff3d18\": rpc 
error: code = NotFound desc = could not find container \"24e737417baac512fab2089f5da12e7c47ef06eae8df7803ad6b248505ff3d18\": container with ID starting with 24e737417baac512fab2089f5da12e7c47ef06eae8df7803ad6b248505ff3d18 not found: ID does not exist" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.806731 4722 scope.go:117] "RemoveContainer" containerID="bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75" Mar 09 14:14:46 crc kubenswrapper[4722]: E0309 14:14:46.807034 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75\": container with ID starting with bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75 not found: ID does not exist" containerID="bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.807078 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75"} err="failed to get container status \"bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75\": rpc error: code = NotFound desc = could not find container \"bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75\": container with ID starting with bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75 not found: ID does not exist" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.807101 4722 scope.go:117] "RemoveContainer" containerID="71c08361cf5ed90f65d201ffd82a9403cb9c01ae88ea91cbbf351f6003bfa427" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.807392 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71c08361cf5ed90f65d201ffd82a9403cb9c01ae88ea91cbbf351f6003bfa427"} err="failed to get container status \"71c08361cf5ed90f65d201ffd82a9403cb9c01ae88ea91cbbf351f6003bfa427\": rpc error: code = NotFound desc = could not find container \"71c08361cf5ed90f65d201ffd82a9403cb9c01ae88ea91cbbf351f6003bfa427\": container with ID starting with 71c08361cf5ed90f65d201ffd82a9403cb9c01ae88ea91cbbf351f6003bfa427 not found: ID does not exist" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.807417 4722 scope.go:117] "RemoveContainer" containerID="8f5af077ae5b947194e5dba0542d053890a589582e415e1b41a932bdab6632e0" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.807692 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f5af077ae5b947194e5dba0542d053890a589582e415e1b41a932bdab6632e0"} err="failed to get container status \"8f5af077ae5b947194e5dba0542d053890a589582e415e1b41a932bdab6632e0\": rpc error: code = NotFound desc = could not find container \"8f5af077ae5b947194e5dba0542d053890a589582e415e1b41a932bdab6632e0\": container with ID starting with 8f5af077ae5b947194e5dba0542d053890a589582e415e1b41a932bdab6632e0 not found: ID does not exist" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.807720 4722 scope.go:117] "RemoveContainer" containerID="f5765145caac69454a9a8e8de34d05cb1da06924dd365f3e74f52461be5027ae" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.807994 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5765145caac69454a9a8e8de34d05cb1da06924dd365f3e74f52461be5027ae"} err="failed to get container status \"f5765145caac69454a9a8e8de34d05cb1da06924dd365f3e74f52461be5027ae\": rpc 
error: code = NotFound desc = could not find container \"f5765145caac69454a9a8e8de34d05cb1da06924dd365f3e74f52461be5027ae\": container with ID starting with f5765145caac69454a9a8e8de34d05cb1da06924dd365f3e74f52461be5027ae not found: ID does not exist" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.808022 4722 scope.go:117] "RemoveContainer" containerID="ef1c5a3944e8693d8c68a4108c23851352407d0fac7fb7a03763d9084f117dfb" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.808310 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef1c5a3944e8693d8c68a4108c23851352407d0fac7fb7a03763d9084f117dfb"} err="failed to get container status \"ef1c5a3944e8693d8c68a4108c23851352407d0fac7fb7a03763d9084f117dfb\": rpc error: code = NotFound desc = could not find container \"ef1c5a3944e8693d8c68a4108c23851352407d0fac7fb7a03763d9084f117dfb\": container with ID starting with ef1c5a3944e8693d8c68a4108c23851352407d0fac7fb7a03763d9084f117dfb not found: ID does not exist" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.808338 4722 scope.go:117] "RemoveContainer" containerID="f969085ae62d82dbe1f0f8f5d459c4a1bc7d1ae0a459da484f11dd6ae5fdd418" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.808611 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f969085ae62d82dbe1f0f8f5d459c4a1bc7d1ae0a459da484f11dd6ae5fdd418"} err="failed to get container status \"f969085ae62d82dbe1f0f8f5d459c4a1bc7d1ae0a459da484f11dd6ae5fdd418\": rpc error: code = NotFound desc = could not find container \"f969085ae62d82dbe1f0f8f5d459c4a1bc7d1ae0a459da484f11dd6ae5fdd418\": container with ID starting with f969085ae62d82dbe1f0f8f5d459c4a1bc7d1ae0a459da484f11dd6ae5fdd418 not found: ID does not exist" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.808645 4722 scope.go:117] "RemoveContainer" containerID="d1fc80c43be93eeb2f9805bf314c8430f52062d0ce5d0fa1830e7e0c27e674e3" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.808951 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1fc80c43be93eeb2f9805bf314c8430f52062d0ce5d0fa1830e7e0c27e674e3"} err="failed to get container status \"d1fc80c43be93eeb2f9805bf314c8430f52062d0ce5d0fa1830e7e0c27e674e3\": rpc error: code = NotFound desc = could not find container \"d1fc80c43be93eeb2f9805bf314c8430f52062d0ce5d0fa1830e7e0c27e674e3\": container with ID starting with d1fc80c43be93eeb2f9805bf314c8430f52062d0ce5d0fa1830e7e0c27e674e3 not found: ID does not exist" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.808977 4722 scope.go:117] "RemoveContainer" containerID="a0106fa808804a96ca8247096399e3d04eb082b2691b1f39df3db9f6c630114c" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.809259 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0106fa808804a96ca8247096399e3d04eb082b2691b1f39df3db9f6c630114c"} err="failed to get container status \"a0106fa808804a96ca8247096399e3d04eb082b2691b1f39df3db9f6c630114c\": rpc error: code = NotFound desc = could not find container \"a0106fa808804a96ca8247096399e3d04eb082b2691b1f39df3db9f6c630114c\": container with ID starting with a0106fa808804a96ca8247096399e3d04eb082b2691b1f39df3db9f6c630114c not found: ID does not exist" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.809283 4722 scope.go:117] "RemoveContainer" containerID="24e737417baac512fab2089f5da12e7c47ef06eae8df7803ad6b248505ff3d18" Mar 09 14:14:46 crc 
kubenswrapper[4722]: I0309 14:14:46.812877 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24e737417baac512fab2089f5da12e7c47ef06eae8df7803ad6b248505ff3d18"} err="failed to get container status \"24e737417baac512fab2089f5da12e7c47ef06eae8df7803ad6b248505ff3d18\": rpc error: code = NotFound desc = could not find container \"24e737417baac512fab2089f5da12e7c47ef06eae8df7803ad6b248505ff3d18\": container with ID starting with 24e737417baac512fab2089f5da12e7c47ef06eae8df7803ad6b248505ff3d18 not found: ID does not exist" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.812928 4722 scope.go:117] "RemoveContainer" containerID="bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75" Mar 09 14:14:46 crc kubenswrapper[4722]: I0309 14:14:46.813290 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75"} err="failed to get container status \"bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75\": rpc error: code = NotFound desc = could not find container \"bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75\": container with ID starting with bcc35a356ac2421f1f5a4e5460ac9461063a1fb85aaefce257f9f4e7aed4dc75 not found: ID does not exist" Mar 09 14:14:47 crc kubenswrapper[4722]: I0309 14:14:47.639300 4722 generic.go:334] "Generic (PLEG): container finished" podID="731c31a2-ded2-452f-b330-0cf118ab1e84" containerID="0bf7c024824824a929c1e661d65baf17ab07597a43f169b1d82b7a865d0a8b9d" exitCode=0 Mar 09 14:14:47 crc kubenswrapper[4722]: I0309 14:14:47.639405 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fmhcx" event={"ID":"731c31a2-ded2-452f-b330-0cf118ab1e84","Type":"ContainerDied","Data":"0bf7c024824824a929c1e661d65baf17ab07597a43f169b1d82b7a865d0a8b9d"} Mar 09 14:14:47 crc kubenswrapper[4722]: I0309 14:14:47.639824 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fmhcx" event={"ID":"731c31a2-ded2-452f-b330-0cf118ab1e84","Type":"ContainerStarted","Data":"d1b9be4c373f5f85ab91ea06e2c3ca2157cf75027e00992ce591869d5ea0b445"} Mar 09 14:14:48 crc kubenswrapper[4722]: I0309 14:14:48.162537 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e305619-b3a2-44c9-9e54-e1afa4f43dbf" path="/var/lib/kubelet/pods/2e305619-b3a2-44c9-9e54-e1afa4f43dbf/volumes" Mar 09 14:14:48 crc kubenswrapper[4722]: I0309 14:14:48.685659 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fmhcx" event={"ID":"731c31a2-ded2-452f-b330-0cf118ab1e84","Type":"ContainerStarted","Data":"2fa5d2dc6cbc2508ba2473a8f7c26f0cc162b5e958c2e00697e72f202715b2df"} Mar 09 14:14:48 crc kubenswrapper[4722]: I0309 14:14:48.686092 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fmhcx" event={"ID":"731c31a2-ded2-452f-b330-0cf118ab1e84","Type":"ContainerStarted","Data":"37a2f8a592dcfd6d6187edbc214b384bbd820fabbd8626014e9c13c62ad9d22d"} Mar 09 14:14:48 crc kubenswrapper[4722]: I0309 14:14:48.686106 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fmhcx" event={"ID":"731c31a2-ded2-452f-b330-0cf118ab1e84","Type":"ContainerStarted","Data":"0c75c84267033e5abdf1ae55975cfe2b235142a2a330872863c3f9f1420d4279"} Mar 09 14:14:48 crc kubenswrapper[4722]: I0309 14:14:48.686119 4722 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fmhcx" event={"ID":"731c31a2-ded2-452f-b330-0cf118ab1e84","Type":"ContainerStarted","Data":"6a6b8d54593d7caef7417ed7fc1e46494180fd7335885550277d4b605dbed839"} Mar 09 14:14:48 crc kubenswrapper[4722]: I0309 14:14:48.686128 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fmhcx" event={"ID":"731c31a2-ded2-452f-b330-0cf118ab1e84","Type":"ContainerStarted","Data":"483d73ffd36dfd9ee1302ad5b3888a2cea9e70bfe06fe904ece1b6c184d00d15"} Mar 09 14:14:49 crc kubenswrapper[4722]: I0309 14:14:49.694469 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fmhcx" event={"ID":"731c31a2-ded2-452f-b330-0cf118ab1e84","Type":"ContainerStarted","Data":"81d8bcb6794db79876f5887aa5edeb55a46152952b4c6047105da803a09073fa"} Mar 09 14:14:50 crc kubenswrapper[4722]: I0309 14:14:50.773822 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-pw2x9"] Mar 09 14:14:50 crc kubenswrapper[4722]: I0309 14:14:50.775570 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-pw2x9" Mar 09 14:14:50 crc kubenswrapper[4722]: I0309 14:14:50.778763 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-wtfbb" Mar 09 14:14:50 crc kubenswrapper[4722]: I0309 14:14:50.779239 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Mar 09 14:14:50 crc kubenswrapper[4722]: I0309 14:14:50.782662 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Mar 09 14:14:50 crc kubenswrapper[4722]: I0309 14:14:50.898434 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zr5sg\" (UniqueName: \"kubernetes.io/projected/f612329a-8162-4440-aae8-a5467e713976-kube-api-access-zr5sg\") pod \"obo-prometheus-operator-68bc856cb9-pw2x9\" (UID: \"f612329a-8162-4440-aae8-a5467e713976\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-pw2x9" Mar 09 14:14:50 crc kubenswrapper[4722]: I0309 14:14:50.901347 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5d974998df-jzrsl"] Mar 09 14:14:50 crc kubenswrapper[4722]: I0309 14:14:50.902140 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d974998df-jzrsl" Mar 09 14:14:50 crc kubenswrapper[4722]: I0309 14:14:50.904771 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Mar 09 14:14:50 crc kubenswrapper[4722]: I0309 14:14:50.906899 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-z56mz" Mar 09 14:14:50 crc kubenswrapper[4722]: I0309 14:14:50.911249 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5d974998df-p2w5j"] Mar 09 14:14:50 crc kubenswrapper[4722]: I0309 14:14:50.912188 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d974998df-p2w5j" Mar 09 14:14:51 crc kubenswrapper[4722]: I0309 14:14:51.000320 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a58a5d27-2898-4346-b0c0-08507cf2eb44-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5d974998df-p2w5j\" (UID: \"a58a5d27-2898-4346-b0c0-08507cf2eb44\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d974998df-p2w5j" Mar 09 14:14:51 crc kubenswrapper[4722]: I0309 14:14:51.000801 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/85953f91-dda3-42f4-b308-7b553054dad6-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5d974998df-jzrsl\" (UID: \"85953f91-dda3-42f4-b308-7b553054dad6\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d974998df-jzrsl" Mar 09 14:14:51 crc kubenswrapper[4722]: I0309 14:14:51.000882 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a58a5d27-2898-4346-b0c0-08507cf2eb44-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5d974998df-p2w5j\" (UID: \"a58a5d27-2898-4346-b0c0-08507cf2eb44\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d974998df-p2w5j" Mar 09 14:14:51 crc kubenswrapper[4722]: I0309 14:14:51.000981 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zr5sg\" (UniqueName: \"kubernetes.io/projected/f612329a-8162-4440-aae8-a5467e713976-kube-api-access-zr5sg\") pod \"obo-prometheus-operator-68bc856cb9-pw2x9\" (UID: \"f612329a-8162-4440-aae8-a5467e713976\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-pw2x9" Mar 09 14:14:51 crc kubenswrapper[4722]: I0309 14:14:51.001010 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/85953f91-dda3-42f4-b308-7b553054dad6-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5d974998df-jzrsl\" (UID: \"85953f91-dda3-42f4-b308-7b553054dad6\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d974998df-jzrsl" Mar 09 14:14:51 crc kubenswrapper[4722]: I0309 14:14:51.020397 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zr5sg\" (UniqueName: \"kubernetes.io/projected/f612329a-8162-4440-aae8-a5467e713976-kube-api-access-zr5sg\") pod \"obo-prometheus-operator-68bc856cb9-pw2x9\" (UID: \"f612329a-8162-4440-aae8-a5467e713976\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-pw2x9" Mar 09 14:14:51 crc kubenswrapper[4722]: I0309 14:14:51.093720 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-lc5zn"] Mar 09 14:14:51 crc kubenswrapper[4722]: I0309 14:14:51.094531 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-lc5zn" Mar 09 14:14:51 crc kubenswrapper[4722]: I0309 14:14:51.095113 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-pw2x9" Mar 09 14:14:51 crc kubenswrapper[4722]: I0309 14:14:51.096696 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-bwrjf" Mar 09 14:14:51 crc kubenswrapper[4722]: I0309 14:14:51.096876 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Mar 09 14:14:51 crc kubenswrapper[4722]: I0309 14:14:51.103729 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/85953f91-dda3-42f4-b308-7b553054dad6-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5d974998df-jzrsl\" (UID: \"85953f91-dda3-42f4-b308-7b553054dad6\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d974998df-jzrsl" Mar 09 14:14:51 crc kubenswrapper[4722]: I0309 14:14:51.103785 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a58a5d27-2898-4346-b0c0-08507cf2eb44-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5d974998df-p2w5j\" (UID: \"a58a5d27-2898-4346-b0c0-08507cf2eb44\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d974998df-p2w5j" Mar 09 14:14:51 crc kubenswrapper[4722]: I0309 14:14:51.103809 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/85953f91-dda3-42f4-b308-7b553054dad6-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5d974998df-jzrsl\" (UID: \"85953f91-dda3-42f4-b308-7b553054dad6\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d974998df-jzrsl" Mar 09 14:14:51 crc kubenswrapper[4722]: I0309 14:14:51.103853 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a58a5d27-2898-4346-b0c0-08507cf2eb44-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5d974998df-p2w5j\" (UID: \"a58a5d27-2898-4346-b0c0-08507cf2eb44\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d974998df-p2w5j" Mar 09 14:14:51 crc kubenswrapper[4722]: I0309 14:14:51.106928 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a58a5d27-2898-4346-b0c0-08507cf2eb44-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5d974998df-p2w5j\" (UID: \"a58a5d27-2898-4346-b0c0-08507cf2eb44\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d974998df-p2w5j" Mar 09 14:14:51 crc kubenswrapper[4722]: I0309 14:14:51.108456 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/85953f91-dda3-42f4-b308-7b553054dad6-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5d974998df-jzrsl\" (UID: \"85953f91-dda3-42f4-b308-7b553054dad6\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d974998df-jzrsl" Mar 09 14:14:51 crc kubenswrapper[4722]: I0309 14:14:51.108833 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a58a5d27-2898-4346-b0c0-08507cf2eb44-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5d974998df-p2w5j\" (UID: \"a58a5d27-2898-4346-b0c0-08507cf2eb44\") " 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d974998df-p2w5j" Mar 09 14:14:51 crc kubenswrapper[4722]: I0309 14:14:51.110786 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/85953f91-dda3-42f4-b308-7b553054dad6-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5d974998df-jzrsl\" (UID: \"85953f91-dda3-42f4-b308-7b553054dad6\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d974998df-jzrsl" Mar 09 14:14:51 crc kubenswrapper[4722]: E0309 14:14:51.132227 4722 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-pw2x9_openshift-operators_f612329a-8162-4440-aae8-a5467e713976_0(f9e6551b83e4cb91bee9bd7f9f58cdaf76cff8e100a6e2ca6158ba8da0edb324): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 14:14:51 crc kubenswrapper[4722]: E0309 14:14:51.132307 4722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-pw2x9_openshift-operators_f612329a-8162-4440-aae8-a5467e713976_0(f9e6551b83e4cb91bee9bd7f9f58cdaf76cff8e100a6e2ca6158ba8da0edb324): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-pw2x9" Mar 09 14:14:51 crc kubenswrapper[4722]: E0309 14:14:51.132327 4722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-pw2x9_openshift-operators_f612329a-8162-4440-aae8-a5467e713976_0(f9e6551b83e4cb91bee9bd7f9f58cdaf76cff8e100a6e2ca6158ba8da0edb324): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-pw2x9" Mar 09 14:14:51 crc kubenswrapper[4722]: E0309 14:14:51.132377 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-68bc856cb9-pw2x9_openshift-operators(f612329a-8162-4440-aae8-a5467e713976)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-68bc856cb9-pw2x9_openshift-operators(f612329a-8162-4440-aae8-a5467e713976)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-pw2x9_openshift-operators_f612329a-8162-4440-aae8-a5467e713976_0(f9e6551b83e4cb91bee9bd7f9f58cdaf76cff8e100a6e2ca6158ba8da0edb324): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-pw2x9" podUID="f612329a-8162-4440-aae8-a5467e713976" Mar 09 14:14:51 crc kubenswrapper[4722]: I0309 14:14:51.192130 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-2rhld"] Mar 09 14:14:51 crc kubenswrapper[4722]: I0309 14:14:51.192929 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-2rhld" Mar 09 14:14:51 crc kubenswrapper[4722]: I0309 14:14:51.195474 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-fmktd" Mar 09 14:14:51 crc kubenswrapper[4722]: I0309 14:14:51.205191 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d56qg\" (UniqueName: \"kubernetes.io/projected/14655c3d-02fe-4215-b566-0c4008fd34a0-kube-api-access-d56qg\") pod \"observability-operator-59bdc8b94-lc5zn\" (UID: \"14655c3d-02fe-4215-b566-0c4008fd34a0\") " pod="openshift-operators/observability-operator-59bdc8b94-lc5zn" Mar 09 14:14:51 crc kubenswrapper[4722]: I0309 14:14:51.205271 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/14655c3d-02fe-4215-b566-0c4008fd34a0-observability-operator-tls\") pod \"observability-operator-59bdc8b94-lc5zn\" (UID: \"14655c3d-02fe-4215-b566-0c4008fd34a0\") " pod="openshift-operators/observability-operator-59bdc8b94-lc5zn" Mar 09 14:14:51 crc kubenswrapper[4722]: I0309 14:14:51.223492 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d974998df-jzrsl" Mar 09 14:14:51 crc kubenswrapper[4722]: I0309 14:14:51.237677 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d974998df-p2w5j" Mar 09 14:14:51 crc kubenswrapper[4722]: E0309 14:14:51.248005 4722 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5d974998df-jzrsl_openshift-operators_85953f91-dda3-42f4-b308-7b553054dad6_0(c8c9932bef4c2829f0e5c0a4a40a261bac523719aee736ae78e8ecac5933b059): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 14:14:51 crc kubenswrapper[4722]: E0309 14:14:51.248090 4722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5d974998df-jzrsl_openshift-operators_85953f91-dda3-42f4-b308-7b553054dad6_0(c8c9932bef4c2829f0e5c0a4a40a261bac523719aee736ae78e8ecac5933b059): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d974998df-jzrsl" Mar 09 14:14:51 crc kubenswrapper[4722]: E0309 14:14:51.248112 4722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5d974998df-jzrsl_openshift-operators_85953f91-dda3-42f4-b308-7b553054dad6_0(c8c9932bef4c2829f0e5c0a4a40a261bac523719aee736ae78e8ecac5933b059): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d974998df-jzrsl" Mar 09 14:14:51 crc kubenswrapper[4722]: E0309 14:14:51.248154 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-5d974998df-jzrsl_openshift-operators(85953f91-dda3-42f4-b308-7b553054dad6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-5d974998df-jzrsl_openshift-operators(85953f91-dda3-42f4-b308-7b553054dad6)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5d974998df-jzrsl_openshift-operators_85953f91-dda3-42f4-b308-7b553054dad6_0(c8c9932bef4c2829f0e5c0a4a40a261bac523719aee736ae78e8ecac5933b059): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d974998df-jzrsl" podUID="85953f91-dda3-42f4-b308-7b553054dad6" Mar 09 14:14:51 crc kubenswrapper[4722]: E0309 14:14:51.272700 4722 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5d974998df-p2w5j_openshift-operators_a58a5d27-2898-4346-b0c0-08507cf2eb44_0(c8d3d63658d32a1bac807bfc7ef21c4da9ee6d18f92223f01d62d85d6a969b8c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 14:14:51 crc kubenswrapper[4722]: E0309 14:14:51.272799 4722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5d974998df-p2w5j_openshift-operators_a58a5d27-2898-4346-b0c0-08507cf2eb44_0(c8d3d63658d32a1bac807bfc7ef21c4da9ee6d18f92223f01d62d85d6a969b8c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d974998df-p2w5j" Mar 09 14:14:51 crc kubenswrapper[4722]: E0309 14:14:51.272847 4722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5d974998df-p2w5j_openshift-operators_a58a5d27-2898-4346-b0c0-08507cf2eb44_0(c8d3d63658d32a1bac807bfc7ef21c4da9ee6d18f92223f01d62d85d6a969b8c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d974998df-p2w5j" Mar 09 14:14:51 crc kubenswrapper[4722]: E0309 14:14:51.272896 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-5d974998df-p2w5j_openshift-operators(a58a5d27-2898-4346-b0c0-08507cf2eb44)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-5d974998df-p2w5j_openshift-operators(a58a5d27-2898-4346-b0c0-08507cf2eb44)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5d974998df-p2w5j_openshift-operators_a58a5d27-2898-4346-b0c0-08507cf2eb44_0(c8d3d63658d32a1bac807bfc7ef21c4da9ee6d18f92223f01d62d85d6a969b8c): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d974998df-p2w5j" podUID="a58a5d27-2898-4346-b0c0-08507cf2eb44" Mar 09 14:14:51 crc kubenswrapper[4722]: I0309 14:14:51.309115 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmrp4\" (UniqueName: \"kubernetes.io/projected/c123a767-e0e0-4432-b34f-cbe0b581d938-kube-api-access-xmrp4\") pod \"perses-operator-5bf474d74f-2rhld\" (UID: \"c123a767-e0e0-4432-b34f-cbe0b581d938\") " pod="openshift-operators/perses-operator-5bf474d74f-2rhld" Mar 09 14:14:51 crc kubenswrapper[4722]: I0309 14:14:51.309182 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d56qg\" (UniqueName: \"kubernetes.io/projected/14655c3d-02fe-4215-b566-0c4008fd34a0-kube-api-access-d56qg\") pod \"observability-operator-59bdc8b94-lc5zn\" (UID: \"14655c3d-02fe-4215-b566-0c4008fd34a0\") " pod="openshift-operators/observability-operator-59bdc8b94-lc5zn" Mar 09 14:14:51 crc kubenswrapper[4722]: I0309 14:14:51.309259 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/14655c3d-02fe-4215-b566-0c4008fd34a0-observability-operator-tls\") pod \"observability-operator-59bdc8b94-lc5zn\" (UID: \"14655c3d-02fe-4215-b566-0c4008fd34a0\") " pod="openshift-operators/observability-operator-59bdc8b94-lc5zn" Mar 09 14:14:51 crc kubenswrapper[4722]: I0309 14:14:51.309301 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/c123a767-e0e0-4432-b34f-cbe0b581d938-openshift-service-ca\") pod \"perses-operator-5bf474d74f-2rhld\" (UID: \"c123a767-e0e0-4432-b34f-cbe0b581d938\") " pod="openshift-operators/perses-operator-5bf474d74f-2rhld" Mar 09 14:14:51 crc kubenswrapper[4722]: I0309 14:14:51.313854 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/14655c3d-02fe-4215-b566-0c4008fd34a0-observability-operator-tls\") pod \"observability-operator-59bdc8b94-lc5zn\" (UID: \"14655c3d-02fe-4215-b566-0c4008fd34a0\") " pod="openshift-operators/observability-operator-59bdc8b94-lc5zn" Mar 09 14:14:51 crc kubenswrapper[4722]: I0309 14:14:51.327437 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d56qg\" (UniqueName: \"kubernetes.io/projected/14655c3d-02fe-4215-b566-0c4008fd34a0-kube-api-access-d56qg\") pod \"observability-operator-59bdc8b94-lc5zn\" (UID: \"14655c3d-02fe-4215-b566-0c4008fd34a0\") " pod="openshift-operators/observability-operator-59bdc8b94-lc5zn" Mar 09 14:14:51 crc kubenswrapper[4722]: I0309 14:14:51.410719 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/c123a767-e0e0-4432-b34f-cbe0b581d938-openshift-service-ca\") pod \"perses-operator-5bf474d74f-2rhld\" (UID: \"c123a767-e0e0-4432-b34f-cbe0b581d938\") " pod="openshift-operators/perses-operator-5bf474d74f-2rhld" Mar 09 14:14:51 crc kubenswrapper[4722]: I0309 14:14:51.410868 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmrp4\" (UniqueName: \"kubernetes.io/projected/c123a767-e0e0-4432-b34f-cbe0b581d938-kube-api-access-xmrp4\") pod \"perses-operator-5bf474d74f-2rhld\" (UID: 
\"c123a767-e0e0-4432-b34f-cbe0b581d938\") " pod="openshift-operators/perses-operator-5bf474d74f-2rhld" Mar 09 14:14:51 crc kubenswrapper[4722]: I0309 14:14:51.411832 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/c123a767-e0e0-4432-b34f-cbe0b581d938-openshift-service-ca\") pod \"perses-operator-5bf474d74f-2rhld\" (UID: \"c123a767-e0e0-4432-b34f-cbe0b581d938\") " pod="openshift-operators/perses-operator-5bf474d74f-2rhld" Mar 09 14:14:51 crc kubenswrapper[4722]: I0309 14:14:51.431678 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmrp4\" (UniqueName: \"kubernetes.io/projected/c123a767-e0e0-4432-b34f-cbe0b581d938-kube-api-access-xmrp4\") pod \"perses-operator-5bf474d74f-2rhld\" (UID: \"c123a767-e0e0-4432-b34f-cbe0b581d938\") " pod="openshift-operators/perses-operator-5bf474d74f-2rhld" Mar 09 14:14:51 crc kubenswrapper[4722]: I0309 14:14:51.469990 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-lc5zn" Mar 09 14:14:51 crc kubenswrapper[4722]: E0309 14:14:51.489900 4722 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-lc5zn_openshift-operators_14655c3d-02fe-4215-b566-0c4008fd34a0_0(18570b96c39b47a4a06ae73278b98503a508c0bb8c690977455e6693840ff80e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 14:14:51 crc kubenswrapper[4722]: E0309 14:14:51.489969 4722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-lc5zn_openshift-operators_14655c3d-02fe-4215-b566-0c4008fd34a0_0(18570b96c39b47a4a06ae73278b98503a508c0bb8c690977455e6693840ff80e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-lc5zn" Mar 09 14:14:51 crc kubenswrapper[4722]: E0309 14:14:51.489997 4722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-lc5zn_openshift-operators_14655c3d-02fe-4215-b566-0c4008fd34a0_0(18570b96c39b47a4a06ae73278b98503a508c0bb8c690977455e6693840ff80e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-lc5zn" Mar 09 14:14:51 crc kubenswrapper[4722]: E0309 14:14:51.490043 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-59bdc8b94-lc5zn_openshift-operators(14655c3d-02fe-4215-b566-0c4008fd34a0)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-59bdc8b94-lc5zn_openshift-operators(14655c3d-02fe-4215-b566-0c4008fd34a0)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-lc5zn_openshift-operators_14655c3d-02fe-4215-b566-0c4008fd34a0_0(18570b96c39b47a4a06ae73278b98503a508c0bb8c690977455e6693840ff80e): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/observability-operator-59bdc8b94-lc5zn" podUID="14655c3d-02fe-4215-b566-0c4008fd34a0" Mar 09 14:14:51 crc kubenswrapper[4722]: I0309 14:14:51.509709 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-2rhld" Mar 09 14:14:51 crc kubenswrapper[4722]: E0309 14:14:51.537217 4722 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-2rhld_openshift-operators_c123a767-e0e0-4432-b34f-cbe0b581d938_0(ca44da2f06603da14a5a50ab577016348ec99a65a8a19625273700a4876f704a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 14:14:51 crc kubenswrapper[4722]: E0309 14:14:51.537294 4722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-2rhld_openshift-operators_c123a767-e0e0-4432-b34f-cbe0b581d938_0(ca44da2f06603da14a5a50ab577016348ec99a65a8a19625273700a4876f704a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-2rhld" Mar 09 14:14:51 crc kubenswrapper[4722]: E0309 14:14:51.537324 4722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-2rhld_openshift-operators_c123a767-e0e0-4432-b34f-cbe0b581d938_0(ca44da2f06603da14a5a50ab577016348ec99a65a8a19625273700a4876f704a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-2rhld" Mar 09 14:14:51 crc kubenswrapper[4722]: E0309 14:14:51.537391 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5bf474d74f-2rhld_openshift-operators(c123a767-e0e0-4432-b34f-cbe0b581d938)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5bf474d74f-2rhld_openshift-operators(c123a767-e0e0-4432-b34f-cbe0b581d938)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-2rhld_openshift-operators_c123a767-e0e0-4432-b34f-cbe0b581d938_0(ca44da2f06603da14a5a50ab577016348ec99a65a8a19625273700a4876f704a): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/perses-operator-5bf474d74f-2rhld" podUID="c123a767-e0e0-4432-b34f-cbe0b581d938" Mar 09 14:14:51 crc kubenswrapper[4722]: I0309 14:14:51.711945 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fmhcx" event={"ID":"731c31a2-ded2-452f-b330-0cf118ab1e84","Type":"ContainerStarted","Data":"cd99938117b89c3c39f1dd95f0a0116139052b7301845015343d0e28ec969c92"} Mar 09 14:14:53 crc kubenswrapper[4722]: I0309 14:14:53.727366 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fmhcx" event={"ID":"731c31a2-ded2-452f-b330-0cf118ab1e84","Type":"ContainerStarted","Data":"f703b451fd4de17825f1838eb6d91f5cc96bc713c8404b15fc4d759cbf151205"} Mar 09 14:14:53 crc kubenswrapper[4722]: I0309 14:14:53.728028 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fmhcx" Mar 09 14:14:53 crc kubenswrapper[4722]: I0309 14:14:53.728063 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fmhcx" Mar 09 14:14:53 crc kubenswrapper[4722]: I0309 14:14:53.728079 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fmhcx" Mar 09 14:14:53 crc kubenswrapper[4722]: I0309 14:14:53.754268 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fmhcx" Mar 09 14:14:53 crc kubenswrapper[4722]: I0309 14:14:53.754894 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fmhcx" Mar 09 14:14:53 crc kubenswrapper[4722]: I0309 14:14:53.761034 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-fmhcx" podStartSLOduration=7.761015872 podStartE2EDuration="7.761015872s" podCreationTimestamp="2026-03-09 14:14:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:14:53.756099804 +0000 UTC m=+734.311668380" watchObservedRunningTime="2026-03-09 14:14:53.761015872 +0000 UTC m=+734.316584448" Mar 09 14:14:54 crc kubenswrapper[4722]: I0309 14:14:54.138272 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-pw2x9"] Mar 09 14:14:54 crc kubenswrapper[4722]: I0309 14:14:54.138402 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-pw2x9" Mar 09 14:14:54 crc kubenswrapper[4722]: I0309 14:14:54.138904 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-pw2x9" Mar 09 14:14:54 crc kubenswrapper[4722]: I0309 14:14:54.142576 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5d974998df-jzrsl"] Mar 09 14:14:54 crc kubenswrapper[4722]: I0309 14:14:54.142706 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d974998df-jzrsl" Mar 09 14:14:54 crc kubenswrapper[4722]: I0309 14:14:54.143409 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d974998df-jzrsl" Mar 09 14:14:54 crc kubenswrapper[4722]: I0309 14:14:54.161336 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-2rhld"] Mar 09 14:14:54 crc kubenswrapper[4722]: I0309 14:14:54.161505 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-2rhld" Mar 09 14:14:54 crc kubenswrapper[4722]: I0309 14:14:54.162098 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-2rhld" Mar 09 14:14:54 crc kubenswrapper[4722]: I0309 14:14:54.179797 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5d974998df-p2w5j"] Mar 09 14:14:54 crc kubenswrapper[4722]: I0309 14:14:54.179973 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d974998df-p2w5j" Mar 09 14:14:54 crc kubenswrapper[4722]: I0309 14:14:54.180541 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d974998df-p2w5j" Mar 09 14:14:54 crc kubenswrapper[4722]: I0309 14:14:54.183659 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-lc5zn"] Mar 09 14:14:54 crc kubenswrapper[4722]: I0309 14:14:54.183784 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-lc5zn" Mar 09 14:14:54 crc kubenswrapper[4722]: I0309 14:14:54.184233 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-lc5zn" Mar 09 14:14:54 crc kubenswrapper[4722]: E0309 14:14:54.235162 4722 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-pw2x9_openshift-operators_f612329a-8162-4440-aae8-a5467e713976_0(a57acc2c51fa295d4022bffa23389c1b4e361855e7387fc5c014386c2b02c4ea): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 14:14:54 crc kubenswrapper[4722]: E0309 14:14:54.235259 4722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-pw2x9_openshift-operators_f612329a-8162-4440-aae8-a5467e713976_0(a57acc2c51fa295d4022bffa23389c1b4e361855e7387fc5c014386c2b02c4ea): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-pw2x9" Mar 09 14:14:54 crc kubenswrapper[4722]: E0309 14:14:54.235291 4722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-pw2x9_openshift-operators_f612329a-8162-4440-aae8-a5467e713976_0(a57acc2c51fa295d4022bffa23389c1b4e361855e7387fc5c014386c2b02c4ea): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-68bc856cb9-pw2x9" Mar 09 14:14:54 crc kubenswrapper[4722]: E0309 14:14:54.235356 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-68bc856cb9-pw2x9_openshift-operators(f612329a-8162-4440-aae8-a5467e713976)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-68bc856cb9-pw2x9_openshift-operators(f612329a-8162-4440-aae8-a5467e713976)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-pw2x9_openshift-operators_f612329a-8162-4440-aae8-a5467e713976_0(a57acc2c51fa295d4022bffa23389c1b4e361855e7387fc5c014386c2b02c4ea): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-pw2x9" podUID="f612329a-8162-4440-aae8-a5467e713976" Mar 09 14:14:54 crc kubenswrapper[4722]: E0309 14:14:54.236800 4722 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-2rhld_openshift-operators_c123a767-e0e0-4432-b34f-cbe0b581d938_0(1da260886247e325f51dd64a7672460d6e5a92117ab6c8d971989794ba5a392b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 14:14:54 crc kubenswrapper[4722]: E0309 14:14:54.236875 4722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-2rhld_openshift-operators_c123a767-e0e0-4432-b34f-cbe0b581d938_0(1da260886247e325f51dd64a7672460d6e5a92117ab6c8d971989794ba5a392b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-2rhld" Mar 09 14:14:54 crc kubenswrapper[4722]: E0309 14:14:54.236897 4722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-2rhld_openshift-operators_c123a767-e0e0-4432-b34f-cbe0b581d938_0(1da260886247e325f51dd64a7672460d6e5a92117ab6c8d971989794ba5a392b): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-2rhld" Mar 09 14:14:54 crc kubenswrapper[4722]: E0309 14:14:54.236939 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5bf474d74f-2rhld_openshift-operators(c123a767-e0e0-4432-b34f-cbe0b581d938)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5bf474d74f-2rhld_openshift-operators(c123a767-e0e0-4432-b34f-cbe0b581d938)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-2rhld_openshift-operators_c123a767-e0e0-4432-b34f-cbe0b581d938_0(1da260886247e325f51dd64a7672460d6e5a92117ab6c8d971989794ba5a392b): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/perses-operator-5bf474d74f-2rhld" podUID="c123a767-e0e0-4432-b34f-cbe0b581d938" Mar 09 14:14:54 crc kubenswrapper[4722]: E0309 14:14:54.253030 4722 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5d974998df-jzrsl_openshift-operators_85953f91-dda3-42f4-b308-7b553054dad6_0(d7c64d9001b82ee28134cc8ee225bbd5aadd8eb6f6e339bb0cbc33273039c3c3): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 14:14:54 crc kubenswrapper[4722]: E0309 14:14:54.253083 4722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5d974998df-jzrsl_openshift-operators_85953f91-dda3-42f4-b308-7b553054dad6_0(d7c64d9001b82ee28134cc8ee225bbd5aadd8eb6f6e339bb0cbc33273039c3c3): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d974998df-jzrsl" Mar 09 14:14:54 crc kubenswrapper[4722]: E0309 14:14:54.253104 4722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5d974998df-jzrsl_openshift-operators_85953f91-dda3-42f4-b308-7b553054dad6_0(d7c64d9001b82ee28134cc8ee225bbd5aadd8eb6f6e339bb0cbc33273039c3c3): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d974998df-jzrsl" Mar 09 14:14:54 crc kubenswrapper[4722]: E0309 14:14:54.253162 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-5d974998df-jzrsl_openshift-operators(85953f91-dda3-42f4-b308-7b553054dad6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-5d974998df-jzrsl_openshift-operators(85953f91-dda3-42f4-b308-7b553054dad6)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5d974998df-jzrsl_openshift-operators_85953f91-dda3-42f4-b308-7b553054dad6_0(d7c64d9001b82ee28134cc8ee225bbd5aadd8eb6f6e339bb0cbc33273039c3c3): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d974998df-jzrsl" podUID="85953f91-dda3-42f4-b308-7b553054dad6" Mar 09 14:14:54 crc kubenswrapper[4722]: E0309 14:14:54.265178 4722 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-lc5zn_openshift-operators_14655c3d-02fe-4215-b566-0c4008fd34a0_0(6724b99c174e2690b21860c4c7581defb81df2ab1b7d7230feb70c73d3ffefab): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
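
The repeated "no CNI configuration file in /etc/kubernetes/cni/net.d/" failures above all trip the same readiness gate: the container runtime refuses to create any pod sandbox until the network provider (here, multus) has written a CNI configuration file into that directory. A minimal sketch of that check follows, assuming the runtime simply globs the directory for *.conf/*.conflist/*.json files; the exact patterns are an assumption for illustration, not CRI-O source.

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// cniReady reports whether any CNI network configuration file exists
// in confDir. The glob patterns below are an assumption about what a
// CNI-aware runtime looks for, not the runtime's actual source code.
func cniReady(confDir string) (bool, error) {
	for _, pattern := range []string{"*.conf", "*.conflist", "*.json"} {
		matches, err := filepath.Glob(filepath.Join(confDir, pattern))
		if err != nil {
			return false, err
		}
		if len(matches) > 0 {
			return true, nil
		}
	}
	return false, nil
}

func main() {
	ok, err := cniReady("/etc/kubernetes/cni/net.d")
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	if !ok {
		fmt.Println("no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?")
	}
}

Once the network provider writes its config, sandbox creation starts succeeding, as seen later in the log at 14:15:18 when obo-prometheus-operator-admission-webhook-5d974998df-p2w5j finally gets a sandbox.
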
Mar 09 14:14:54 crc kubenswrapper[4722]: E0309 14:14:54.265254 4722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-lc5zn_openshift-operators_14655c3d-02fe-4215-b566-0c4008fd34a0_0(6724b99c174e2690b21860c4c7581defb81df2ab1b7d7230feb70c73d3ffefab): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-lc5zn" Mar 09 14:14:54 crc kubenswrapper[4722]: E0309 14:14:54.265275 4722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-lc5zn_openshift-operators_14655c3d-02fe-4215-b566-0c4008fd34a0_0(6724b99c174e2690b21860c4c7581defb81df2ab1b7d7230feb70c73d3ffefab): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-lc5zn" Mar 09 14:14:54 crc kubenswrapper[4722]: E0309 14:14:54.265325 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-59bdc8b94-lc5zn_openshift-operators(14655c3d-02fe-4215-b566-0c4008fd34a0)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-59bdc8b94-lc5zn_openshift-operators(14655c3d-02fe-4215-b566-0c4008fd34a0)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-lc5zn_openshift-operators_14655c3d-02fe-4215-b566-0c4008fd34a0_0(6724b99c174e2690b21860c4c7581defb81df2ab1b7d7230feb70c73d3ffefab): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-59bdc8b94-lc5zn" podUID="14655c3d-02fe-4215-b566-0c4008fd34a0" Mar 09 14:14:54 crc kubenswrapper[4722]: E0309 14:14:54.270661 4722 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5d974998df-p2w5j_openshift-operators_a58a5d27-2898-4346-b0c0-08507cf2eb44_0(6359081a28b4b6113cec986c0314f9e0ececf28a12c5f8e6c9fd65117a2446f6): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 14:14:54 crc kubenswrapper[4722]: E0309 14:14:54.270698 4722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5d974998df-p2w5j_openshift-operators_a58a5d27-2898-4346-b0c0-08507cf2eb44_0(6359081a28b4b6113cec986c0314f9e0ececf28a12c5f8e6c9fd65117a2446f6): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d974998df-p2w5j" Mar 09 14:14:54 crc kubenswrapper[4722]: E0309 14:14:54.270716 4722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5d974998df-p2w5j_openshift-operators_a58a5d27-2898-4346-b0c0-08507cf2eb44_0(6359081a28b4b6113cec986c0314f9e0ececf28a12c5f8e6c9fd65117a2446f6): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d974998df-p2w5j" Mar 09 14:14:54 crc kubenswrapper[4722]: E0309 14:14:54.270751 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-5d974998df-p2w5j_openshift-operators(a58a5d27-2898-4346-b0c0-08507cf2eb44)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-5d974998df-p2w5j_openshift-operators(a58a5d27-2898-4346-b0c0-08507cf2eb44)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5d974998df-p2w5j_openshift-operators_a58a5d27-2898-4346-b0c0-08507cf2eb44_0(6359081a28b4b6113cec986c0314f9e0ececf28a12c5f8e6c9fd65117a2446f6): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d974998df-p2w5j" podUID="a58a5d27-2898-4346-b0c0-08507cf2eb44" Mar 09 14:14:58 crc kubenswrapper[4722]: I0309 14:14:58.153923 4722 scope.go:117] "RemoveContainer" containerID="d35e198363967793f8437918e78c17906e68a7a9bddca3be185af7534bf15d4f" Mar 09 14:14:58 crc kubenswrapper[4722]: E0309 14:14:58.157500 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-h4zw5_openshift-multus(6b9e29bb-6e51-47ab-a543-b70117ab854d)\"" pod="openshift-multus/multus-h4zw5" podUID="6b9e29bb-6e51-47ab-a543-b70117ab854d" Mar 09 14:15:00 crc kubenswrapper[4722]: I0309 14:15:00.130860 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551095-h8w2n"] Mar 09 14:15:00 crc kubenswrapper[4722]: I0309 14:15:00.132782 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-h8w2n" Mar 09 14:15:00 crc kubenswrapper[4722]: I0309 14:15:00.135750 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 09 14:15:00 crc kubenswrapper[4722]: I0309 14:15:00.139865 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 09 14:15:00 crc kubenswrapper[4722]: I0309 14:15:00.153481 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551095-h8w2n"] Mar 09 14:15:00 crc kubenswrapper[4722]: I0309 14:15:00.238052 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/00a50aa1-a705-42cb-a3f7-e90cb7212a19-secret-volume\") pod \"collect-profiles-29551095-h8w2n\" (UID: \"00a50aa1-a705-42cb-a3f7-e90cb7212a19\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-h8w2n" Mar 09 14:15:00 crc kubenswrapper[4722]: I0309 14:15:00.238120 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pczzm\" (UniqueName: \"kubernetes.io/projected/00a50aa1-a705-42cb-a3f7-e90cb7212a19-kube-api-access-pczzm\") pod \"collect-profiles-29551095-h8w2n\" (UID: \"00a50aa1-a705-42cb-a3f7-e90cb7212a19\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-h8w2n" Mar 09 14:15:00 crc kubenswrapper[4722]: I0309 14:15:00.238149 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/00a50aa1-a705-42cb-a3f7-e90cb7212a19-config-volume\") pod \"collect-profiles-29551095-h8w2n\" (UID: \"00a50aa1-a705-42cb-a3f7-e90cb7212a19\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-h8w2n" Mar 09 14:15:00 crc kubenswrapper[4722]: I0309 14:15:00.339872 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/00a50aa1-a705-42cb-a3f7-e90cb7212a19-secret-volume\") pod \"collect-profiles-29551095-h8w2n\" (UID: \"00a50aa1-a705-42cb-a3f7-e90cb7212a19\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-h8w2n" Mar 09 14:15:00 crc kubenswrapper[4722]: I0309 14:15:00.339943 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pczzm\" (UniqueName: \"kubernetes.io/projected/00a50aa1-a705-42cb-a3f7-e90cb7212a19-kube-api-access-pczzm\") pod \"collect-profiles-29551095-h8w2n\" (UID: \"00a50aa1-a705-42cb-a3f7-e90cb7212a19\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-h8w2n" Mar 09 14:15:00 crc kubenswrapper[4722]: I0309 14:15:00.339978 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/00a50aa1-a705-42cb-a3f7-e90cb7212a19-config-volume\") pod \"collect-profiles-29551095-h8w2n\" (UID: \"00a50aa1-a705-42cb-a3f7-e90cb7212a19\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-h8w2n" Mar 09 14:15:00 crc kubenswrapper[4722]: I0309 14:15:00.341129 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/00a50aa1-a705-42cb-a3f7-e90cb7212a19-config-volume\") pod 
\"collect-profiles-29551095-h8w2n\" (UID: \"00a50aa1-a705-42cb-a3f7-e90cb7212a19\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-h8w2n" Mar 09 14:15:00 crc kubenswrapper[4722]: I0309 14:15:00.349951 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/00a50aa1-a705-42cb-a3f7-e90cb7212a19-secret-volume\") pod \"collect-profiles-29551095-h8w2n\" (UID: \"00a50aa1-a705-42cb-a3f7-e90cb7212a19\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-h8w2n" Mar 09 14:15:00 crc kubenswrapper[4722]: I0309 14:15:00.362987 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pczzm\" (UniqueName: \"kubernetes.io/projected/00a50aa1-a705-42cb-a3f7-e90cb7212a19-kube-api-access-pczzm\") pod \"collect-profiles-29551095-h8w2n\" (UID: \"00a50aa1-a705-42cb-a3f7-e90cb7212a19\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-h8w2n" Mar 09 14:15:00 crc kubenswrapper[4722]: I0309 14:15:00.466103 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-h8w2n" Mar 09 14:15:00 crc kubenswrapper[4722]: E0309 14:15:00.491630 4722 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29551095-h8w2n_openshift-operator-lifecycle-manager_00a50aa1-a705-42cb-a3f7-e90cb7212a19_0(77f2a70cecda58abfaa25988873c43d8828fb82bf93813f719272e83bb481483): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 14:15:00 crc kubenswrapper[4722]: E0309 14:15:00.491711 4722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29551095-h8w2n_openshift-operator-lifecycle-manager_00a50aa1-a705-42cb-a3f7-e90cb7212a19_0(77f2a70cecda58abfaa25988873c43d8828fb82bf93813f719272e83bb481483): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-h8w2n" Mar 09 14:15:00 crc kubenswrapper[4722]: E0309 14:15:00.491737 4722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29551095-h8w2n_openshift-operator-lifecycle-manager_00a50aa1-a705-42cb-a3f7-e90cb7212a19_0(77f2a70cecda58abfaa25988873c43d8828fb82bf93813f719272e83bb481483): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-h8w2n" Mar 09 14:15:00 crc kubenswrapper[4722]: E0309 14:15:00.491799 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"collect-profiles-29551095-h8w2n_openshift-operator-lifecycle-manager(00a50aa1-a705-42cb-a3f7-e90cb7212a19)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"collect-profiles-29551095-h8w2n_openshift-operator-lifecycle-manager(00a50aa1-a705-42cb-a3f7-e90cb7212a19)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29551095-h8w2n_openshift-operator-lifecycle-manager_00a50aa1-a705-42cb-a3f7-e90cb7212a19_0(77f2a70cecda58abfaa25988873c43d8828fb82bf93813f719272e83bb481483): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-h8w2n" podUID="00a50aa1-a705-42cb-a3f7-e90cb7212a19" Mar 09 14:15:00 crc kubenswrapper[4722]: I0309 14:15:00.769243 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-h8w2n" Mar 09 14:15:00 crc kubenswrapper[4722]: I0309 14:15:00.769731 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-h8w2n" Mar 09 14:15:00 crc kubenswrapper[4722]: E0309 14:15:00.798531 4722 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29551095-h8w2n_openshift-operator-lifecycle-manager_00a50aa1-a705-42cb-a3f7-e90cb7212a19_0(ec92da6ce12e2cf83cbc09ae8cf1ad079f42b0178d4df7652b548af52bde7df8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 14:15:00 crc kubenswrapper[4722]: E0309 14:15:00.798605 4722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29551095-h8w2n_openshift-operator-lifecycle-manager_00a50aa1-a705-42cb-a3f7-e90cb7212a19_0(ec92da6ce12e2cf83cbc09ae8cf1ad079f42b0178d4df7652b548af52bde7df8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-h8w2n" Mar 09 14:15:00 crc kubenswrapper[4722]: E0309 14:15:00.798638 4722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29551095-h8w2n_openshift-operator-lifecycle-manager_00a50aa1-a705-42cb-a3f7-e90cb7212a19_0(ec92da6ce12e2cf83cbc09ae8cf1ad079f42b0178d4df7652b548af52bde7df8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-h8w2n" Mar 09 14:15:00 crc kubenswrapper[4722]: E0309 14:15:00.798690 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"collect-profiles-29551095-h8w2n_openshift-operator-lifecycle-manager(00a50aa1-a705-42cb-a3f7-e90cb7212a19)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"collect-profiles-29551095-h8w2n_openshift-operator-lifecycle-manager(00a50aa1-a705-42cb-a3f7-e90cb7212a19)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29551095-h8w2n_openshift-operator-lifecycle-manager_00a50aa1-a705-42cb-a3f7-e90cb7212a19_0(ec92da6ce12e2cf83cbc09ae8cf1ad079f42b0178d4df7652b548af52bde7df8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-h8w2n" podUID="00a50aa1-a705-42cb-a3f7-e90cb7212a19" Mar 09 14:15:05 crc kubenswrapper[4722]: I0309 14:15:05.149560 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d974998df-p2w5j" Mar 09 14:15:05 crc kubenswrapper[4722]: I0309 14:15:05.151937 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d974998df-p2w5j" Mar 09 14:15:05 crc kubenswrapper[4722]: E0309 14:15:05.180964 4722 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5d974998df-p2w5j_openshift-operators_a58a5d27-2898-4346-b0c0-08507cf2eb44_0(8a8f013031a760da84408a2c4a580ee8b02ed3b6f710a2f080eba24d2fe341b9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 14:15:05 crc kubenswrapper[4722]: E0309 14:15:05.181707 4722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5d974998df-p2w5j_openshift-operators_a58a5d27-2898-4346-b0c0-08507cf2eb44_0(8a8f013031a760da84408a2c4a580ee8b02ed3b6f710a2f080eba24d2fe341b9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d974998df-p2w5j" Mar 09 14:15:05 crc kubenswrapper[4722]: E0309 14:15:05.181853 4722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5d974998df-p2w5j_openshift-operators_a58a5d27-2898-4346-b0c0-08507cf2eb44_0(8a8f013031a760da84408a2c4a580ee8b02ed3b6f710a2f080eba24d2fe341b9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d974998df-p2w5j" Mar 09 14:15:05 crc kubenswrapper[4722]: E0309 14:15:05.182049 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-5d974998df-p2w5j_openshift-operators(a58a5d27-2898-4346-b0c0-08507cf2eb44)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-5d974998df-p2w5j_openshift-operators(a58a5d27-2898-4346-b0c0-08507cf2eb44)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5d974998df-p2w5j_openshift-operators_a58a5d27-2898-4346-b0c0-08507cf2eb44_0(8a8f013031a760da84408a2c4a580ee8b02ed3b6f710a2f080eba24d2fe341b9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d974998df-p2w5j" podUID="a58a5d27-2898-4346-b0c0-08507cf2eb44" Mar 09 14:15:06 crc kubenswrapper[4722]: I0309 14:15:06.149187 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-lc5zn" Mar 09 14:15:06 crc kubenswrapper[4722]: I0309 14:15:06.149407 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-pw2x9" Mar 09 14:15:06 crc kubenswrapper[4722]: I0309 14:15:06.149952 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d974998df-jzrsl" Mar 09 14:15:06 crc kubenswrapper[4722]: I0309 14:15:06.149996 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-lc5zn" Mar 09 14:15:06 crc kubenswrapper[4722]: I0309 14:15:06.150184 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-pw2x9" Mar 09 14:15:06 crc kubenswrapper[4722]: I0309 14:15:06.150246 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d974998df-jzrsl" Mar 09 14:15:06 crc kubenswrapper[4722]: E0309 14:15:06.201568 4722 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-lc5zn_openshift-operators_14655c3d-02fe-4215-b566-0c4008fd34a0_0(cd2912eb7850888c9d3f80d56d1456a5d298308678fab29c97a9d6eb77796e29): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 14:15:06 crc kubenswrapper[4722]: E0309 14:15:06.201651 4722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-lc5zn_openshift-operators_14655c3d-02fe-4215-b566-0c4008fd34a0_0(cd2912eb7850888c9d3f80d56d1456a5d298308678fab29c97a9d6eb77796e29): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-lc5zn" Mar 09 14:15:06 crc kubenswrapper[4722]: E0309 14:15:06.201680 4722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-lc5zn_openshift-operators_14655c3d-02fe-4215-b566-0c4008fd34a0_0(cd2912eb7850888c9d3f80d56d1456a5d298308678fab29c97a9d6eb77796e29): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-lc5zn" Mar 09 14:15:06 crc kubenswrapper[4722]: E0309 14:15:06.201729 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-59bdc8b94-lc5zn_openshift-operators(14655c3d-02fe-4215-b566-0c4008fd34a0)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-59bdc8b94-lc5zn_openshift-operators(14655c3d-02fe-4215-b566-0c4008fd34a0)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-lc5zn_openshift-operators_14655c3d-02fe-4215-b566-0c4008fd34a0_0(cd2912eb7850888c9d3f80d56d1456a5d298308678fab29c97a9d6eb77796e29): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-59bdc8b94-lc5zn" podUID="14655c3d-02fe-4215-b566-0c4008fd34a0" Mar 09 14:15:06 crc kubenswrapper[4722]: E0309 14:15:06.214078 4722 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5d974998df-jzrsl_openshift-operators_85953f91-dda3-42f4-b308-7b553054dad6_0(bebdbfd19e2fca56c012b0ca66a4ce46cdaeab6fb092d77b8477a36f9bb203a5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 09 14:15:06 crc kubenswrapper[4722]: E0309 14:15:06.214126 4722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5d974998df-jzrsl_openshift-operators_85953f91-dda3-42f4-b308-7b553054dad6_0(bebdbfd19e2fca56c012b0ca66a4ce46cdaeab6fb092d77b8477a36f9bb203a5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d974998df-jzrsl" Mar 09 14:15:06 crc kubenswrapper[4722]: E0309 14:15:06.214147 4722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5d974998df-jzrsl_openshift-operators_85953f91-dda3-42f4-b308-7b553054dad6_0(bebdbfd19e2fca56c012b0ca66a4ce46cdaeab6fb092d77b8477a36f9bb203a5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d974998df-jzrsl" Mar 09 14:15:06 crc kubenswrapper[4722]: E0309 14:15:06.214194 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-5d974998df-jzrsl_openshift-operators(85953f91-dda3-42f4-b308-7b553054dad6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-5d974998df-jzrsl_openshift-operators(85953f91-dda3-42f4-b308-7b553054dad6)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-5d974998df-jzrsl_openshift-operators_85953f91-dda3-42f4-b308-7b553054dad6_0(bebdbfd19e2fca56c012b0ca66a4ce46cdaeab6fb092d77b8477a36f9bb203a5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d974998df-jzrsl" podUID="85953f91-dda3-42f4-b308-7b553054dad6" Mar 09 14:15:06 crc kubenswrapper[4722]: E0309 14:15:06.222559 4722 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-pw2x9_openshift-operators_f612329a-8162-4440-aae8-a5467e713976_0(2443c593e543bf5181ba27975afd17b7fdb50d8371783b8545d92c3e5c063883): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 14:15:06 crc kubenswrapper[4722]: E0309 14:15:06.222611 4722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-pw2x9_openshift-operators_f612329a-8162-4440-aae8-a5467e713976_0(2443c593e543bf5181ba27975afd17b7fdb50d8371783b8545d92c3e5c063883): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-pw2x9" Mar 09 14:15:06 crc kubenswrapper[4722]: E0309 14:15:06.222628 4722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-pw2x9_openshift-operators_f612329a-8162-4440-aae8-a5467e713976_0(2443c593e543bf5181ba27975afd17b7fdb50d8371783b8545d92c3e5c063883): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-pw2x9" Mar 09 14:15:06 crc kubenswrapper[4722]: E0309 14:15:06.222673 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-68bc856cb9-pw2x9_openshift-operators(f612329a-8162-4440-aae8-a5467e713976)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-68bc856cb9-pw2x9_openshift-operators(f612329a-8162-4440-aae8-a5467e713976)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-pw2x9_openshift-operators_f612329a-8162-4440-aae8-a5467e713976_0(2443c593e543bf5181ba27975afd17b7fdb50d8371783b8545d92c3e5c063883): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-pw2x9" podUID="f612329a-8162-4440-aae8-a5467e713976" Mar 09 14:15:08 crc kubenswrapper[4722]: I0309 14:15:08.149262 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-2rhld" Mar 09 14:15:08 crc kubenswrapper[4722]: I0309 14:15:08.150392 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-2rhld" Mar 09 14:15:08 crc kubenswrapper[4722]: E0309 14:15:08.200247 4722 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-2rhld_openshift-operators_c123a767-e0e0-4432-b34f-cbe0b581d938_0(33b65adad7ac98971d9c5cadb9b23ab7d66b8c0350cdaced749b5a23354a2abd): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 14:15:08 crc kubenswrapper[4722]: E0309 14:15:08.200329 4722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-2rhld_openshift-operators_c123a767-e0e0-4432-b34f-cbe0b581d938_0(33b65adad7ac98971d9c5cadb9b23ab7d66b8c0350cdaced749b5a23354a2abd): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-2rhld" Mar 09 14:15:08 crc kubenswrapper[4722]: E0309 14:15:08.200354 4722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-2rhld_openshift-operators_c123a767-e0e0-4432-b34f-cbe0b581d938_0(33b65adad7ac98971d9c5cadb9b23ab7d66b8c0350cdaced749b5a23354a2abd): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/perses-operator-5bf474d74f-2rhld" Mar 09 14:15:08 crc kubenswrapper[4722]: E0309 14:15:08.200403 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5bf474d74f-2rhld_openshift-operators(c123a767-e0e0-4432-b34f-cbe0b581d938)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5bf474d74f-2rhld_openshift-operators(c123a767-e0e0-4432-b34f-cbe0b581d938)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-2rhld_openshift-operators_c123a767-e0e0-4432-b34f-cbe0b581d938_0(33b65adad7ac98971d9c5cadb9b23ab7d66b8c0350cdaced749b5a23354a2abd): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-5bf474d74f-2rhld" podUID="c123a767-e0e0-4432-b34f-cbe0b581d938" Mar 09 14:15:11 crc kubenswrapper[4722]: I0309 14:15:11.149551 4722 scope.go:117] "RemoveContainer" containerID="d35e198363967793f8437918e78c17906e68a7a9bddca3be185af7534bf15d4f" Mar 09 14:15:11 crc kubenswrapper[4722]: I0309 14:15:11.837889 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-h4zw5_6b9e29bb-6e51-47ab-a543-b70117ab854d/kube-multus/2.log" Mar 09 14:15:11 crc kubenswrapper[4722]: I0309 14:15:11.838291 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-h4zw5" event={"ID":"6b9e29bb-6e51-47ab-a543-b70117ab854d","Type":"ContainerStarted","Data":"5d170cb88c3710b57581f054c9720df2347179bc48e6aa6976633cad1f890ed0"} Mar 09 14:15:13 crc kubenswrapper[4722]: I0309 14:15:13.148950 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-h8w2n" Mar 09 14:15:13 crc kubenswrapper[4722]: I0309 14:15:13.149974 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-h8w2n" Mar 09 14:15:13 crc kubenswrapper[4722]: E0309 14:15:13.178314 4722 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29551095-h8w2n_openshift-operator-lifecycle-manager_00a50aa1-a705-42cb-a3f7-e90cb7212a19_0(8ec3e5c8f6b3210e73cd6d979d6302a0449c1edb30012076c292a7f06a353a2a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 09 14:15:13 crc kubenswrapper[4722]: E0309 14:15:13.179014 4722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29551095-h8w2n_openshift-operator-lifecycle-manager_00a50aa1-a705-42cb-a3f7-e90cb7212a19_0(8ec3e5c8f6b3210e73cd6d979d6302a0449c1edb30012076c292a7f06a353a2a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-h8w2n" Mar 09 14:15:13 crc kubenswrapper[4722]: E0309 14:15:13.179055 4722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29551095-h8w2n_openshift-operator-lifecycle-manager_00a50aa1-a705-42cb-a3f7-e90cb7212a19_0(8ec3e5c8f6b3210e73cd6d979d6302a0449c1edb30012076c292a7f06a353a2a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-h8w2n" Mar 09 14:15:13 crc kubenswrapper[4722]: E0309 14:15:13.179120 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"collect-profiles-29551095-h8w2n_openshift-operator-lifecycle-manager(00a50aa1-a705-42cb-a3f7-e90cb7212a19)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"collect-profiles-29551095-h8w2n_openshift-operator-lifecycle-manager(00a50aa1-a705-42cb-a3f7-e90cb7212a19)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_collect-profiles-29551095-h8w2n_openshift-operator-lifecycle-manager_00a50aa1-a705-42cb-a3f7-e90cb7212a19_0(8ec3e5c8f6b3210e73cd6d979d6302a0449c1edb30012076c292a7f06a353a2a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-h8w2n" podUID="00a50aa1-a705-42cb-a3f7-e90cb7212a19" Mar 09 14:15:15 crc kubenswrapper[4722]: I0309 14:15:15.709157 4722 scope.go:117] "RemoveContainer" containerID="2dbbdbcc313bdd84115bd6391cec9f91e1e2040ca6d071a52179af65734959b0" Mar 09 14:15:16 crc kubenswrapper[4722]: I0309 14:15:16.841648 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fmhcx" Mar 09 14:15:18 crc kubenswrapper[4722]: I0309 14:15:18.148696 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d974998df-p2w5j" Mar 09 14:15:18 crc kubenswrapper[4722]: I0309 14:15:18.150584 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d974998df-p2w5j" Mar 09 14:15:18 crc kubenswrapper[4722]: I0309 14:15:18.608426 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5d974998df-p2w5j"] Mar 09 14:15:18 crc kubenswrapper[4722]: I0309 14:15:18.889104 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d974998df-p2w5j" event={"ID":"a58a5d27-2898-4346-b0c0-08507cf2eb44","Type":"ContainerStarted","Data":"bfb0379b41d81f331cfd7a0ebdd08b4649b266e8bb2cee794028d4bb790016bd"} Mar 09 14:15:19 crc kubenswrapper[4722]: I0309 14:15:19.148110 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-pw2x9" Mar 09 14:15:19 crc kubenswrapper[4722]: I0309 14:15:19.148633 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-pw2x9" Mar 09 14:15:19 crc kubenswrapper[4722]: W0309 14:15:19.354547 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf612329a_8162_4440_aae8_a5467e713976.slice/crio-5b40e7d7d40c8695ddd10888280fd06480c82ec3107bb3a5ea1a8162f1abdcb9 WatchSource:0}: Error finding container 5b40e7d7d40c8695ddd10888280fd06480c82ec3107bb3a5ea1a8162f1abdcb9: Status 404 returned error can't find the container with id 5b40e7d7d40c8695ddd10888280fd06480c82ec3107bb3a5ea1a8162f1abdcb9 Mar 09 14:15:19 crc kubenswrapper[4722]: I0309 14:15:19.356496 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-pw2x9"] Mar 09 14:15:19 crc kubenswrapper[4722]: I0309 14:15:19.898596 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-pw2x9" event={"ID":"f612329a-8162-4440-aae8-a5467e713976","Type":"ContainerStarted","Data":"5b40e7d7d40c8695ddd10888280fd06480c82ec3107bb3a5ea1a8162f1abdcb9"} Mar 09 14:15:20 crc kubenswrapper[4722]: I0309 14:15:20.148257 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-lc5zn" Mar 09 14:15:20 crc kubenswrapper[4722]: I0309 14:15:20.148268 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-2rhld" Mar 09 14:15:20 crc kubenswrapper[4722]: I0309 14:15:20.148277 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d974998df-jzrsl" Mar 09 14:15:20 crc kubenswrapper[4722]: I0309 14:15:20.156877 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-lc5zn" Mar 09 14:15:20 crc kubenswrapper[4722]: I0309 14:15:20.156894 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-2rhld" Mar 09 14:15:20 crc kubenswrapper[4722]: I0309 14:15:20.157271 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d974998df-jzrsl" Mar 09 14:15:20 crc kubenswrapper[4722]: I0309 14:15:20.651350 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-2rhld"] Mar 09 14:15:20 crc kubenswrapper[4722]: I0309 14:15:20.664917 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-lc5zn"] Mar 09 14:15:20 crc kubenswrapper[4722]: I0309 14:15:20.705675 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5d974998df-jzrsl"] Mar 09 14:15:20 crc kubenswrapper[4722]: W0309 14:15:20.724245 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85953f91_dda3_42f4_b308_7b553054dad6.slice/crio-ef4d46fd03a6ed24e9257220019a6d0c5c2606fc17c29ff5b1c8ba6ccd10240d WatchSource:0}: Error finding container ef4d46fd03a6ed24e9257220019a6d0c5c2606fc17c29ff5b1c8ba6ccd10240d: Status 404 returned error can't find the container with id ef4d46fd03a6ed24e9257220019a6d0c5c2606fc17c29ff5b1c8ba6ccd10240d Mar 09 14:15:20 crc kubenswrapper[4722]: I0309 14:15:20.906068 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-lc5zn" event={"ID":"14655c3d-02fe-4215-b566-0c4008fd34a0","Type":"ContainerStarted","Data":"da90930b0e53d42a3bd18d44c6ec90cd0a3cc14c642bfbad2eb8b23f7933447f"} Mar 09 14:15:20 crc kubenswrapper[4722]: I0309 14:15:20.907936 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d974998df-jzrsl" event={"ID":"85953f91-dda3-42f4-b308-7b553054dad6","Type":"ContainerStarted","Data":"ef4d46fd03a6ed24e9257220019a6d0c5c2606fc17c29ff5b1c8ba6ccd10240d"} Mar 09 14:15:20 crc kubenswrapper[4722]: I0309 14:15:20.909311 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-2rhld" event={"ID":"c123a767-e0e0-4432-b34f-cbe0b581d938","Type":"ContainerStarted","Data":"0b426d94377d4a5fb2a3eb7f28ccabd15fc7d7d60c3604c3ae4cbdf069500175"} Mar 09 14:15:25 crc kubenswrapper[4722]: I0309 14:15:25.148988 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-h8w2n" Mar 09 14:15:25 crc kubenswrapper[4722]: I0309 14:15:25.149967 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-h8w2n" Mar 09 14:15:27 crc kubenswrapper[4722]: I0309 14:15:27.214704 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551095-h8w2n"] Mar 09 14:15:27 crc kubenswrapper[4722]: W0309 14:15:27.282713 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00a50aa1_a705_42cb_a3f7_e90cb7212a19.slice/crio-d69d97ea824b38007a9581d0ac8f061f977b84bfc68510c464fad26d7f176f0a WatchSource:0}: Error finding container d69d97ea824b38007a9581d0ac8f061f977b84bfc68510c464fad26d7f176f0a: Status 404 returned error can't find the container with id d69d97ea824b38007a9581d0ac8f061f977b84bfc68510c464fad26d7f176f0a Mar 09 14:15:27 crc kubenswrapper[4722]: I0309 14:15:27.997614 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-lc5zn" event={"ID":"14655c3d-02fe-4215-b566-0c4008fd34a0","Type":"ContainerStarted","Data":"620d6d1f26b1e3ccb96ee8067fcb299cd682674fff256e53875db302966d4f26"} Mar 09 14:15:27 crc kubenswrapper[4722]: I0309 14:15:27.998004 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-lc5zn" Mar 09 14:15:27 crc kubenswrapper[4722]: I0309 14:15:27.999663 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d974998df-jzrsl" event={"ID":"85953f91-dda3-42f4-b308-7b553054dad6","Type":"ContainerStarted","Data":"0eaabac982f62fe853113d1073968b2241160fb143136180389b2d0be8c4eb68"} Mar 09 14:15:28 crc kubenswrapper[4722]: I0309 14:15:28.002226 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-pw2x9" event={"ID":"f612329a-8162-4440-aae8-a5467e713976","Type":"ContainerStarted","Data":"3bc8d38c07b23cadf006d3efaeab03a98e268df967bdc80807ee8618d1f81d56"} Mar 09 14:15:28 crc kubenswrapper[4722]: I0309 14:15:28.004228 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-2rhld" event={"ID":"c123a767-e0e0-4432-b34f-cbe0b581d938","Type":"ContainerStarted","Data":"ee2c3cd53b92c20f586b10fb179a8f7f9e0dec300dfd25de6acb3afa9caf440e"} Mar 09 14:15:28 crc kubenswrapper[4722]: I0309 14:15:28.004445 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-2rhld" Mar 09 14:15:28 crc kubenswrapper[4722]: I0309 14:15:28.005762 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d974998df-p2w5j" event={"ID":"a58a5d27-2898-4346-b0c0-08507cf2eb44","Type":"ContainerStarted","Data":"e01fac44c0e97866096ecb58f9d6ba9d3b0981632a108215c075326940b199a6"} Mar 09 14:15:28 crc kubenswrapper[4722]: I0309 14:15:28.006780 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-lc5zn" Mar 09 14:15:28 crc kubenswrapper[4722]: I0309 14:15:28.007574 4722 generic.go:334] "Generic (PLEG): container finished" podID="00a50aa1-a705-42cb-a3f7-e90cb7212a19" containerID="0649f7b51acfb6bb022ac48b78a5bcee767b4f61a5aebbe40c11419852d44da7" exitCode=0 Mar 09 14:15:28 crc kubenswrapper[4722]: I0309 14:15:28.007625 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-h8w2n" event={"ID":"00a50aa1-a705-42cb-a3f7-e90cb7212a19","Type":"ContainerDied","Data":"0649f7b51acfb6bb022ac48b78a5bcee767b4f61a5aebbe40c11419852d44da7"} Mar 09 14:15:28 crc kubenswrapper[4722]: I0309 14:15:28.007661 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-h8w2n" event={"ID":"00a50aa1-a705-42cb-a3f7-e90cb7212a19","Type":"ContainerStarted","Data":"d69d97ea824b38007a9581d0ac8f061f977b84bfc68510c464fad26d7f176f0a"} Mar 09 14:15:28 crc kubenswrapper[4722]: I0309 14:15:28.019476 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-lc5zn" podStartSLOduration=30.851841454 podStartE2EDuration="37.019455679s" podCreationTimestamp="2026-03-09 14:14:51 +0000 UTC" firstStartedPulling="2026-03-09 14:15:20.653940532 +0000 UTC m=+761.209509108" lastFinishedPulling="2026-03-09 14:15:26.821554757 +0000 UTC m=+767.377123333" observedRunningTime="2026-03-09 14:15:28.016243419 +0000 UTC m=+768.571811995" watchObservedRunningTime="2026-03-09 14:15:28.019455679 +0000 UTC m=+768.575024255" Mar 09 14:15:28 crc kubenswrapper[4722]: I0309 14:15:28.052791 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-2rhld" podStartSLOduration=30.866273488 podStartE2EDuration="37.052769072s" podCreationTimestamp="2026-03-09 14:14:51 +0000 UTC" firstStartedPulling="2026-03-09 14:15:20.656812623 +0000 UTC m=+761.212381199" lastFinishedPulling="2026-03-09 14:15:26.843308207 +0000 UTC m=+767.398876783" observedRunningTime="2026-03-09 14:15:28.051351783 +0000 UTC m=+768.606920359" watchObservedRunningTime="2026-03-09 14:15:28.052769072 +0000 UTC m=+768.608337648" Mar 09 14:15:28 crc kubenswrapper[4722]: I0309 14:15:28.073887 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d974998df-jzrsl" podStartSLOduration=31.972309418000002 podStartE2EDuration="38.073868153s" podCreationTimestamp="2026-03-09 14:14:50 +0000 UTC" firstStartedPulling="2026-03-09 14:15:20.727257205 +0000 UTC m=+761.282825791" lastFinishedPulling="2026-03-09 14:15:26.82881595 +0000 UTC m=+767.384384526" observedRunningTime="2026-03-09 14:15:28.069577453 +0000 UTC m=+768.625146029" watchObservedRunningTime="2026-03-09 14:15:28.073868153 +0000 UTC m=+768.629436719" Mar 09 14:15:28 crc kubenswrapper[4722]: I0309 14:15:28.109353 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-pw2x9" podStartSLOduration=30.644983463 podStartE2EDuration="38.109324145s" podCreationTimestamp="2026-03-09 14:14:50 +0000 UTC" firstStartedPulling="2026-03-09 14:15:19.357292037 +0000 UTC m=+759.912860653" lastFinishedPulling="2026-03-09 14:15:26.821632759 +0000 UTC m=+767.377201335" observedRunningTime="2026-03-09 14:15:28.105163238 +0000 UTC m=+768.660731814" watchObservedRunningTime="2026-03-09 14:15:28.109324145 +0000 UTC m=+768.664892721" Mar 09 14:15:28 crc kubenswrapper[4722]: I0309 14:15:28.141351 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5d974998df-p2w5j" podStartSLOduration=29.937239642 podStartE2EDuration="38.141323971s" podCreationTimestamp="2026-03-09 14:14:50 +0000 UTC" firstStartedPulling="2026-03-09 
Mar 09 14:15:29 crc kubenswrapper[4722]: I0309 14:15:29.292030 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-h8w2n"
Mar 09 14:15:29 crc kubenswrapper[4722]: I0309 14:15:29.409157 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pczzm\" (UniqueName: \"kubernetes.io/projected/00a50aa1-a705-42cb-a3f7-e90cb7212a19-kube-api-access-pczzm\") pod \"00a50aa1-a705-42cb-a3f7-e90cb7212a19\" (UID: \"00a50aa1-a705-42cb-a3f7-e90cb7212a19\") "
Mar 09 14:15:29 crc kubenswrapper[4722]: I0309 14:15:29.409263 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/00a50aa1-a705-42cb-a3f7-e90cb7212a19-config-volume\") pod \"00a50aa1-a705-42cb-a3f7-e90cb7212a19\" (UID: \"00a50aa1-a705-42cb-a3f7-e90cb7212a19\") "
Mar 09 14:15:29 crc kubenswrapper[4722]: I0309 14:15:29.409328 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/00a50aa1-a705-42cb-a3f7-e90cb7212a19-secret-volume\") pod \"00a50aa1-a705-42cb-a3f7-e90cb7212a19\" (UID: \"00a50aa1-a705-42cb-a3f7-e90cb7212a19\") "
Mar 09 14:15:29 crc kubenswrapper[4722]: I0309 14:15:29.409712 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00a50aa1-a705-42cb-a3f7-e90cb7212a19-config-volume" (OuterVolumeSpecName: "config-volume") pod "00a50aa1-a705-42cb-a3f7-e90cb7212a19" (UID: "00a50aa1-a705-42cb-a3f7-e90cb7212a19"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 14:15:29 crc kubenswrapper[4722]: I0309 14:15:29.416111 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00a50aa1-a705-42cb-a3f7-e90cb7212a19-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "00a50aa1-a705-42cb-a3f7-e90cb7212a19" (UID: "00a50aa1-a705-42cb-a3f7-e90cb7212a19"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:15:29 crc kubenswrapper[4722]: I0309 14:15:29.416423 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00a50aa1-a705-42cb-a3f7-e90cb7212a19-kube-api-access-pczzm" (OuterVolumeSpecName: "kube-api-access-pczzm") pod "00a50aa1-a705-42cb-a3f7-e90cb7212a19" (UID: "00a50aa1-a705-42cb-a3f7-e90cb7212a19"). InnerVolumeSpecName "kube-api-access-pczzm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:15:29 crc kubenswrapper[4722]: I0309 14:15:29.511013 4722 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/00a50aa1-a705-42cb-a3f7-e90cb7212a19-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 09 14:15:29 crc kubenswrapper[4722]: I0309 14:15:29.511071 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pczzm\" (UniqueName: \"kubernetes.io/projected/00a50aa1-a705-42cb-a3f7-e90cb7212a19-kube-api-access-pczzm\") on node \"crc\" DevicePath \"\""
Mar 09 14:15:29 crc kubenswrapper[4722]: I0309 14:15:29.511088 4722 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/00a50aa1-a705-42cb-a3f7-e90cb7212a19-config-volume\") on node \"crc\" DevicePath \"\""
Mar 09 14:15:30 crc kubenswrapper[4722]: I0309 14:15:30.028501 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-h8w2n"
Mar 09 14:15:30 crc kubenswrapper[4722]: I0309 14:15:30.028468 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551095-h8w2n" event={"ID":"00a50aa1-a705-42cb-a3f7-e90cb7212a19","Type":"ContainerDied","Data":"d69d97ea824b38007a9581d0ac8f061f977b84bfc68510c464fad26d7f176f0a"}
Mar 09 14:15:30 crc kubenswrapper[4722]: I0309 14:15:30.028570 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d69d97ea824b38007a9581d0ac8f061f977b84bfc68510c464fad26d7f176f0a"
Mar 09 14:15:37 crc kubenswrapper[4722]: I0309 14:15:37.740376 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-kxd7l"]
Mar 09 14:15:37 crc kubenswrapper[4722]: E0309 14:15:37.744731 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00a50aa1-a705-42cb-a3f7-e90cb7212a19" containerName="collect-profiles"
Mar 09 14:15:37 crc kubenswrapper[4722]: I0309 14:15:37.744769 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="00a50aa1-a705-42cb-a3f7-e90cb7212a19" containerName="collect-profiles"
Mar 09 14:15:37 crc kubenswrapper[4722]: I0309 14:15:37.744940 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="00a50aa1-a705-42cb-a3f7-e90cb7212a19" containerName="collect-profiles"
Mar 09 14:15:37 crc kubenswrapper[4722]: I0309 14:15:37.745515 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-kxd7l"
Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-kxd7l" Mar 09 14:15:37 crc kubenswrapper[4722]: I0309 14:15:37.753028 4722 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-4b68w" Mar 09 14:15:37 crc kubenswrapper[4722]: I0309 14:15:37.754877 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 09 14:15:37 crc kubenswrapper[4722]: I0309 14:15:37.757590 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-kxd7l"] Mar 09 14:15:37 crc kubenswrapper[4722]: I0309 14:15:37.758142 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 09 14:15:37 crc kubenswrapper[4722]: I0309 14:15:37.766613 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-rrczb"] Mar 09 14:15:37 crc kubenswrapper[4722]: I0309 14:15:37.767986 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-rrczb" Mar 09 14:15:37 crc kubenswrapper[4722]: I0309 14:15:37.771874 4722 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-l45zp" Mar 09 14:15:37 crc kubenswrapper[4722]: I0309 14:15:37.808487 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-wb6rj"] Mar 09 14:15:37 crc kubenswrapper[4722]: I0309 14:15:37.809565 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-wb6rj" Mar 09 14:15:37 crc kubenswrapper[4722]: I0309 14:15:37.812350 4722 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-vlszl" Mar 09 14:15:37 crc kubenswrapper[4722]: I0309 14:15:37.815876 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-rrczb"] Mar 09 14:15:37 crc kubenswrapper[4722]: I0309 14:15:37.822820 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-wb6rj"] Mar 09 14:15:37 crc kubenswrapper[4722]: I0309 14:15:37.840901 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w45v7\" (UniqueName: \"kubernetes.io/projected/5548bcb9-3490-4e2b-982f-adc9ff86db62-kube-api-access-w45v7\") pod \"cert-manager-cainjector-cf98fcc89-kxd7l\" (UID: \"5548bcb9-3490-4e2b-982f-adc9ff86db62\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-kxd7l" Mar 09 14:15:37 crc kubenswrapper[4722]: I0309 14:15:37.840991 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7jl9\" (UniqueName: \"kubernetes.io/projected/ac6d6e52-6a89-4f96-8894-8ed2c71cdcbc-kube-api-access-x7jl9\") pod \"cert-manager-webhook-687f57d79b-wb6rj\" (UID: \"ac6d6e52-6a89-4f96-8894-8ed2c71cdcbc\") " pod="cert-manager/cert-manager-webhook-687f57d79b-wb6rj" Mar 09 14:15:37 crc kubenswrapper[4722]: I0309 14:15:37.841027 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bn5bb\" (UniqueName: \"kubernetes.io/projected/344178ce-f6d3-47f4-ab3c-69c394e2f677-kube-api-access-bn5bb\") pod \"cert-manager-858654f9db-rrczb\" (UID: \"344178ce-f6d3-47f4-ab3c-69c394e2f677\") " pod="cert-manager/cert-manager-858654f9db-rrczb" Mar 09 14:15:37 crc 
kubenswrapper[4722]: I0309 14:15:37.942701 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7jl9\" (UniqueName: \"kubernetes.io/projected/ac6d6e52-6a89-4f96-8894-8ed2c71cdcbc-kube-api-access-x7jl9\") pod \"cert-manager-webhook-687f57d79b-wb6rj\" (UID: \"ac6d6e52-6a89-4f96-8894-8ed2c71cdcbc\") " pod="cert-manager/cert-manager-webhook-687f57d79b-wb6rj" Mar 09 14:15:37 crc kubenswrapper[4722]: I0309 14:15:37.942765 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bn5bb\" (UniqueName: \"kubernetes.io/projected/344178ce-f6d3-47f4-ab3c-69c394e2f677-kube-api-access-bn5bb\") pod \"cert-manager-858654f9db-rrczb\" (UID: \"344178ce-f6d3-47f4-ab3c-69c394e2f677\") " pod="cert-manager/cert-manager-858654f9db-rrczb" Mar 09 14:15:37 crc kubenswrapper[4722]: I0309 14:15:37.943091 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w45v7\" (UniqueName: \"kubernetes.io/projected/5548bcb9-3490-4e2b-982f-adc9ff86db62-kube-api-access-w45v7\") pod \"cert-manager-cainjector-cf98fcc89-kxd7l\" (UID: \"5548bcb9-3490-4e2b-982f-adc9ff86db62\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-kxd7l" Mar 09 14:15:37 crc kubenswrapper[4722]: I0309 14:15:37.967247 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bn5bb\" (UniqueName: \"kubernetes.io/projected/344178ce-f6d3-47f4-ab3c-69c394e2f677-kube-api-access-bn5bb\") pod \"cert-manager-858654f9db-rrczb\" (UID: \"344178ce-f6d3-47f4-ab3c-69c394e2f677\") " pod="cert-manager/cert-manager-858654f9db-rrczb" Mar 09 14:15:37 crc kubenswrapper[4722]: I0309 14:15:37.967247 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w45v7\" (UniqueName: \"kubernetes.io/projected/5548bcb9-3490-4e2b-982f-adc9ff86db62-kube-api-access-w45v7\") pod \"cert-manager-cainjector-cf98fcc89-kxd7l\" (UID: \"5548bcb9-3490-4e2b-982f-adc9ff86db62\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-kxd7l" Mar 09 14:15:37 crc kubenswrapper[4722]: I0309 14:15:37.970447 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7jl9\" (UniqueName: \"kubernetes.io/projected/ac6d6e52-6a89-4f96-8894-8ed2c71cdcbc-kube-api-access-x7jl9\") pod \"cert-manager-webhook-687f57d79b-wb6rj\" (UID: \"ac6d6e52-6a89-4f96-8894-8ed2c71cdcbc\") " pod="cert-manager/cert-manager-webhook-687f57d79b-wb6rj" Mar 09 14:15:38 crc kubenswrapper[4722]: I0309 14:15:38.066282 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-kxd7l" Mar 09 14:15:38 crc kubenswrapper[4722]: I0309 14:15:38.089904 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-rrczb" Mar 09 14:15:38 crc kubenswrapper[4722]: I0309 14:15:38.125857 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-wb6rj" Mar 09 14:15:38 crc kubenswrapper[4722]: I0309 14:15:38.310515 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-rrczb"] Mar 09 14:15:38 crc kubenswrapper[4722]: I0309 14:15:38.575772 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-kxd7l"] Mar 09 14:15:38 crc kubenswrapper[4722]: W0309 14:15:38.580950 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5548bcb9_3490_4e2b_982f_adc9ff86db62.slice/crio-8a0f2987d74183772b460a1e14f334daf2bb9a5f7b3ae93af56daac334e12a2a WatchSource:0}: Error finding container 8a0f2987d74183772b460a1e14f334daf2bb9a5f7b3ae93af56daac334e12a2a: Status 404 returned error can't find the container with id 8a0f2987d74183772b460a1e14f334daf2bb9a5f7b3ae93af56daac334e12a2a Mar 09 14:15:38 crc kubenswrapper[4722]: I0309 14:15:38.622618 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-wb6rj"] Mar 09 14:15:38 crc kubenswrapper[4722]: W0309 14:15:38.624021 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac6d6e52_6a89_4f96_8894_8ed2c71cdcbc.slice/crio-aec09eda59524d96ba50978c4c22ef81352da4227c6cbd5241d88df5f6dd2d5b WatchSource:0}: Error finding container aec09eda59524d96ba50978c4c22ef81352da4227c6cbd5241d88df5f6dd2d5b: Status 404 returned error can't find the container with id aec09eda59524d96ba50978c4c22ef81352da4227c6cbd5241d88df5f6dd2d5b Mar 09 14:15:39 crc kubenswrapper[4722]: I0309 14:15:39.091599 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-rrczb" event={"ID":"344178ce-f6d3-47f4-ab3c-69c394e2f677","Type":"ContainerStarted","Data":"f798951e503da6d743face24c297faf3f653dafb19ed380afa4ca54d74cce42e"} Mar 09 14:15:39 crc kubenswrapper[4722]: I0309 14:15:39.093366 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-kxd7l" event={"ID":"5548bcb9-3490-4e2b-982f-adc9ff86db62","Type":"ContainerStarted","Data":"8a0f2987d74183772b460a1e14f334daf2bb9a5f7b3ae93af56daac334e12a2a"} Mar 09 14:15:39 crc kubenswrapper[4722]: I0309 14:15:39.094703 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-wb6rj" event={"ID":"ac6d6e52-6a89-4f96-8894-8ed2c71cdcbc","Type":"ContainerStarted","Data":"aec09eda59524d96ba50978c4c22ef81352da4227c6cbd5241d88df5f6dd2d5b"} Mar 09 14:15:41 crc kubenswrapper[4722]: I0309 14:15:41.512508 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-2rhld" Mar 09 14:15:43 crc kubenswrapper[4722]: I0309 14:15:43.127989 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-rrczb" event={"ID":"344178ce-f6d3-47f4-ab3c-69c394e2f677","Type":"ContainerStarted","Data":"f701de29365e5d131b0eac49daf179fc9ceae5b9415f76aab7b00a906a086930"} Mar 09 14:15:43 crc kubenswrapper[4722]: I0309 14:15:43.129786 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-kxd7l" event={"ID":"5548bcb9-3490-4e2b-982f-adc9ff86db62","Type":"ContainerStarted","Data":"3530fa4d7127713e763e232d5e4d8392ed6a89a99e5c2d9d7d3fdc3e34315ffd"} Mar 09 14:15:43 crc kubenswrapper[4722]: I0309 
14:15:43.131297 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-wb6rj" event={"ID":"ac6d6e52-6a89-4f96-8894-8ed2c71cdcbc","Type":"ContainerStarted","Data":"e078f699513d8998dff6dee2534f1b9472aa7eaaa9227e09a2c4ea6b6ba71de6"} Mar 09 14:15:43 crc kubenswrapper[4722]: I0309 14:15:43.131417 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-wb6rj" Mar 09 14:15:43 crc kubenswrapper[4722]: I0309 14:15:43.146340 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-rrczb" podStartSLOduration=1.902503427 podStartE2EDuration="6.146316283s" podCreationTimestamp="2026-03-09 14:15:37 +0000 UTC" firstStartedPulling="2026-03-09 14:15:38.32343835 +0000 UTC m=+778.879006926" lastFinishedPulling="2026-03-09 14:15:42.567251206 +0000 UTC m=+783.122819782" observedRunningTime="2026-03-09 14:15:43.145008948 +0000 UTC m=+783.700577524" watchObservedRunningTime="2026-03-09 14:15:43.146316283 +0000 UTC m=+783.701884859" Mar 09 14:15:43 crc kubenswrapper[4722]: I0309 14:15:43.279733 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-wb6rj" podStartSLOduration=2.256144999 podStartE2EDuration="6.279706811s" podCreationTimestamp="2026-03-09 14:15:37 +0000 UTC" firstStartedPulling="2026-03-09 14:15:38.626771891 +0000 UTC m=+779.182340467" lastFinishedPulling="2026-03-09 14:15:42.650333693 +0000 UTC m=+783.205902279" observedRunningTime="2026-03-09 14:15:43.278571469 +0000 UTC m=+783.834140045" watchObservedRunningTime="2026-03-09 14:15:43.279706811 +0000 UTC m=+783.835275387" Mar 09 14:15:43 crc kubenswrapper[4722]: I0309 14:15:43.284023 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-kxd7l" podStartSLOduration=2.301028975 podStartE2EDuration="6.28400965s" podCreationTimestamp="2026-03-09 14:15:37 +0000 UTC" firstStartedPulling="2026-03-09 14:15:38.58351173 +0000 UTC m=+779.139080316" lastFinishedPulling="2026-03-09 14:15:42.566492415 +0000 UTC m=+783.122060991" observedRunningTime="2026-03-09 14:15:43.188737856 +0000 UTC m=+783.744306422" watchObservedRunningTime="2026-03-09 14:15:43.28400965 +0000 UTC m=+783.839578226" Mar 09 14:15:48 crc kubenswrapper[4722]: I0309 14:15:48.129655 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-wb6rj" Mar 09 14:15:51 crc kubenswrapper[4722]: I0309 14:15:51.527834 4722 patch_prober.go:28] interesting pod/machine-config-daemon-hjrrb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 14:15:51 crc kubenswrapper[4722]: I0309 14:15:51.528439 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 14:16:00 crc kubenswrapper[4722]: I0309 14:16:00.163973 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551096-hkrz5"] Mar 09 14:16:00 crc kubenswrapper[4722]: I0309 14:16:00.165750 4722 util.go:30] 
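The liveness-probe failure recorded at 14:15:51 above is an HTTP GET against the machine-config-daemon's health endpoint being refused while that container restarts. A sketch of the equivalent check, assuming nothing beyond the URL in the log line (the one-second timeout is an assumed value; kubelet treats any 2xx/3xx status as success):

// probe.go - the same HTTP liveness check the prober entries above record:
// GET http://127.0.0.1:8798/health, failing on connection refused or a
// non-2xx/3xx status.
package main

import (
	"fmt"
	"net/http"
	"os"
	"time"
)

func main() {
	client := &http.Client{Timeout: 1 * time.Second}
	resp, err := client.Get("http://127.0.0.1:8798/health")
	if err != nil {
		// e.g. "dial tcp 127.0.0.1:8798: connect: connection refused",
		// the exact failure logged at 14:15:51 above.
		fmt.Fprintln(os.Stderr, "probe failed:", err)
		os.Exit(1)
	}
	defer resp.Body.Close()
	if resp.StatusCode < 200 || resp.StatusCode >= 400 {
		fmt.Fprintln(os.Stderr, "probe failed: status", resp.Status)
		os.Exit(1)
	}
	fmt.Println("probe ok:", resp.Status)
}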
"No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551096-hkrz5" Mar 09 14:16:00 crc kubenswrapper[4722]: I0309 14:16:00.167914 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-dr2x6" Mar 09 14:16:00 crc kubenswrapper[4722]: I0309 14:16:00.168112 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 14:16:00 crc kubenswrapper[4722]: I0309 14:16:00.168694 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 14:16:00 crc kubenswrapper[4722]: I0309 14:16:00.169826 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551096-hkrz5"] Mar 09 14:16:00 crc kubenswrapper[4722]: I0309 14:16:00.204597 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tphws\" (UniqueName: \"kubernetes.io/projected/0741506d-5c21-4ce8-be9e-98caaea80864-kube-api-access-tphws\") pod \"auto-csr-approver-29551096-hkrz5\" (UID: \"0741506d-5c21-4ce8-be9e-98caaea80864\") " pod="openshift-infra/auto-csr-approver-29551096-hkrz5" Mar 09 14:16:00 crc kubenswrapper[4722]: I0309 14:16:00.305989 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tphws\" (UniqueName: \"kubernetes.io/projected/0741506d-5c21-4ce8-be9e-98caaea80864-kube-api-access-tphws\") pod \"auto-csr-approver-29551096-hkrz5\" (UID: \"0741506d-5c21-4ce8-be9e-98caaea80864\") " pod="openshift-infra/auto-csr-approver-29551096-hkrz5" Mar 09 14:16:00 crc kubenswrapper[4722]: I0309 14:16:00.330532 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tphws\" (UniqueName: \"kubernetes.io/projected/0741506d-5c21-4ce8-be9e-98caaea80864-kube-api-access-tphws\") pod \"auto-csr-approver-29551096-hkrz5\" (UID: \"0741506d-5c21-4ce8-be9e-98caaea80864\") " pod="openshift-infra/auto-csr-approver-29551096-hkrz5" Mar 09 14:16:00 crc kubenswrapper[4722]: I0309 14:16:00.488020 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551096-hkrz5" Mar 09 14:16:00 crc kubenswrapper[4722]: I0309 14:16:00.942294 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551096-hkrz5"] Mar 09 14:16:01 crc kubenswrapper[4722]: I0309 14:16:01.260565 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551096-hkrz5" event={"ID":"0741506d-5c21-4ce8-be9e-98caaea80864","Type":"ContainerStarted","Data":"7735f2b31dbfcf96255350f45593a3083e92ab25002f127d4795ec79ed3e041a"} Mar 09 14:16:03 crc kubenswrapper[4722]: I0309 14:16:03.282646 4722 generic.go:334] "Generic (PLEG): container finished" podID="0741506d-5c21-4ce8-be9e-98caaea80864" containerID="070b01014ac1c78dfc5ec279e65efc22b02c251ada626c1d2a07b3697b21f4d9" exitCode=0 Mar 09 14:16:03 crc kubenswrapper[4722]: I0309 14:16:03.282746 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551096-hkrz5" event={"ID":"0741506d-5c21-4ce8-be9e-98caaea80864","Type":"ContainerDied","Data":"070b01014ac1c78dfc5ec279e65efc22b02c251ada626c1d2a07b3697b21f4d9"} Mar 09 14:16:04 crc kubenswrapper[4722]: I0309 14:16:04.650642 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551096-hkrz5" Mar 09 14:16:04 crc kubenswrapper[4722]: I0309 14:16:04.674578 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tphws\" (UniqueName: \"kubernetes.io/projected/0741506d-5c21-4ce8-be9e-98caaea80864-kube-api-access-tphws\") pod \"0741506d-5c21-4ce8-be9e-98caaea80864\" (UID: \"0741506d-5c21-4ce8-be9e-98caaea80864\") " Mar 09 14:16:04 crc kubenswrapper[4722]: I0309 14:16:04.687369 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0741506d-5c21-4ce8-be9e-98caaea80864-kube-api-access-tphws" (OuterVolumeSpecName: "kube-api-access-tphws") pod "0741506d-5c21-4ce8-be9e-98caaea80864" (UID: "0741506d-5c21-4ce8-be9e-98caaea80864"). InnerVolumeSpecName "kube-api-access-tphws". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:16:04 crc kubenswrapper[4722]: I0309 14:16:04.776698 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tphws\" (UniqueName: \"kubernetes.io/projected/0741506d-5c21-4ce8-be9e-98caaea80864-kube-api-access-tphws\") on node \"crc\" DevicePath \"\"" Mar 09 14:16:05 crc kubenswrapper[4722]: I0309 14:16:05.301346 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551096-hkrz5" event={"ID":"0741506d-5c21-4ce8-be9e-98caaea80864","Type":"ContainerDied","Data":"7735f2b31dbfcf96255350f45593a3083e92ab25002f127d4795ec79ed3e041a"} Mar 09 14:16:05 crc kubenswrapper[4722]: I0309 14:16:05.301396 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7735f2b31dbfcf96255350f45593a3083e92ab25002f127d4795ec79ed3e041a" Mar 09 14:16:05 crc kubenswrapper[4722]: I0309 14:16:05.301471 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551096-hkrz5" Mar 09 14:16:05 crc kubenswrapper[4722]: I0309 14:16:05.719117 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551090-trppz"] Mar 09 14:16:05 crc kubenswrapper[4722]: I0309 14:16:05.729486 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551090-trppz"] Mar 09 14:16:06 crc kubenswrapper[4722]: I0309 14:16:06.158982 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1b96455-88a6-4b99-9cf3-8c7332822aae" path="/var/lib/kubelet/pods/a1b96455-88a6-4b99-9cf3-8c7332822aae/volumes" Mar 09 14:16:11 crc kubenswrapper[4722]: I0309 14:16:11.920293 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e19k78qz"] Mar 09 14:16:11 crc kubenswrapper[4722]: E0309 14:16:11.921106 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0741506d-5c21-4ce8-be9e-98caaea80864" containerName="oc" Mar 09 14:16:11 crc kubenswrapper[4722]: I0309 14:16:11.921118 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="0741506d-5c21-4ce8-be9e-98caaea80864" containerName="oc" Mar 09 14:16:11 crc kubenswrapper[4722]: I0309 14:16:11.921282 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="0741506d-5c21-4ce8-be9e-98caaea80864" containerName="oc" Mar 09 14:16:11 crc kubenswrapper[4722]: I0309 14:16:11.922174 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e19k78qz" Mar 09 14:16:11 crc kubenswrapper[4722]: I0309 14:16:11.926783 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 09 14:16:11 crc kubenswrapper[4722]: I0309 14:16:11.935148 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e19k78qz"] Mar 09 14:16:12 crc kubenswrapper[4722]: I0309 14:16:12.002858 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/af1c9e69-9bc2-4c93-8e62-132b4470617d-bundle\") pod \"371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e19k78qz\" (UID: \"af1c9e69-9bc2-4c93-8e62-132b4470617d\") " pod="openshift-marketplace/371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e19k78qz" Mar 09 14:16:12 crc kubenswrapper[4722]: I0309 14:16:12.003075 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/af1c9e69-9bc2-4c93-8e62-132b4470617d-util\") pod \"371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e19k78qz\" (UID: \"af1c9e69-9bc2-4c93-8e62-132b4470617d\") " pod="openshift-marketplace/371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e19k78qz" Mar 09 14:16:12 crc kubenswrapper[4722]: I0309 14:16:12.003228 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cswgm\" (UniqueName: \"kubernetes.io/projected/af1c9e69-9bc2-4c93-8e62-132b4470617d-kube-api-access-cswgm\") pod \"371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e19k78qz\" (UID: \"af1c9e69-9bc2-4c93-8e62-132b4470617d\") " pod="openshift-marketplace/371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e19k78qz" Mar 09 14:16:12 crc kubenswrapper[4722]: I0309 14:16:12.104055 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/af1c9e69-9bc2-4c93-8e62-132b4470617d-util\") pod \"371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e19k78qz\" (UID: \"af1c9e69-9bc2-4c93-8e62-132b4470617d\") " pod="openshift-marketplace/371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e19k78qz" Mar 09 14:16:12 crc kubenswrapper[4722]: I0309 14:16:12.104120 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cswgm\" (UniqueName: \"kubernetes.io/projected/af1c9e69-9bc2-4c93-8e62-132b4470617d-kube-api-access-cswgm\") pod \"371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e19k78qz\" (UID: \"af1c9e69-9bc2-4c93-8e62-132b4470617d\") " pod="openshift-marketplace/371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e19k78qz" Mar 09 14:16:12 crc kubenswrapper[4722]: I0309 14:16:12.104592 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/af1c9e69-9bc2-4c93-8e62-132b4470617d-bundle\") pod \"371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e19k78qz\" (UID: \"af1c9e69-9bc2-4c93-8e62-132b4470617d\") " pod="openshift-marketplace/371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e19k78qz" Mar 09 14:16:12 crc kubenswrapper[4722]: I0309 14:16:12.104621 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/af1c9e69-9bc2-4c93-8e62-132b4470617d-util\") pod \"371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e19k78qz\" (UID: \"af1c9e69-9bc2-4c93-8e62-132b4470617d\") " pod="openshift-marketplace/371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e19k78qz" Mar 09 14:16:12 crc kubenswrapper[4722]: I0309 14:16:12.104988 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/af1c9e69-9bc2-4c93-8e62-132b4470617d-bundle\") pod \"371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e19k78qz\" (UID: \"af1c9e69-9bc2-4c93-8e62-132b4470617d\") " pod="openshift-marketplace/371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e19k78qz" Mar 09 14:16:12 crc kubenswrapper[4722]: I0309 14:16:12.125307 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cswgm\" (UniqueName: \"kubernetes.io/projected/af1c9e69-9bc2-4c93-8e62-132b4470617d-kube-api-access-cswgm\") pod \"371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e19k78qz\" (UID: \"af1c9e69-9bc2-4c93-8e62-132b4470617d\") " pod="openshift-marketplace/371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e19k78qz" Mar 09 14:16:12 crc kubenswrapper[4722]: I0309 14:16:12.240521 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e19k78qz" Mar 09 14:16:12 crc kubenswrapper[4722]: I0309 14:16:12.505139 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e19k78qz"] Mar 09 14:16:13 crc kubenswrapper[4722]: I0309 14:16:13.365588 4722 generic.go:334] "Generic (PLEG): container finished" podID="af1c9e69-9bc2-4c93-8e62-132b4470617d" containerID="df95b1cf35d7c78cf01d8cab9beb1c61eb884e511804456a373f17a21ee15e14" exitCode=0 Mar 09 14:16:13 crc kubenswrapper[4722]: I0309 14:16:13.365645 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e19k78qz" event={"ID":"af1c9e69-9bc2-4c93-8e62-132b4470617d","Type":"ContainerDied","Data":"df95b1cf35d7c78cf01d8cab9beb1c61eb884e511804456a373f17a21ee15e14"} Mar 09 14:16:13 crc kubenswrapper[4722]: I0309 14:16:13.365836 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e19k78qz" event={"ID":"af1c9e69-9bc2-4c93-8e62-132b4470617d","Type":"ContainerStarted","Data":"1e53d82a4e656110478d2326cdc0f4986c934e19abe3a1b013081cec965c8b58"} Mar 09 14:16:14 crc kubenswrapper[4722]: I0309 14:16:14.717093 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e08989dhs6v"] Mar 09 14:16:14 crc kubenswrapper[4722]: I0309 14:16:14.719478 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e08989dhs6v" Mar 09 14:16:14 crc kubenswrapper[4722]: I0309 14:16:14.775775 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e08989dhs6v"] Mar 09 14:16:14 crc kubenswrapper[4722]: I0309 14:16:14.776627 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qj7h6\" (UniqueName: \"kubernetes.io/projected/adf5a27c-0e62-496e-a0a2-a0d2d98f898f-kube-api-access-qj7h6\") pod \"e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e08989dhs6v\" (UID: \"adf5a27c-0e62-496e-a0a2-a0d2d98f898f\") " pod="openshift-marketplace/e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e08989dhs6v" Mar 09 14:16:14 crc kubenswrapper[4722]: I0309 14:16:14.776749 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/adf5a27c-0e62-496e-a0a2-a0d2d98f898f-util\") pod \"e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e08989dhs6v\" (UID: \"adf5a27c-0e62-496e-a0a2-a0d2d98f898f\") " pod="openshift-marketplace/e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e08989dhs6v" Mar 09 14:16:14 crc kubenswrapper[4722]: I0309 14:16:14.776824 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/adf5a27c-0e62-496e-a0a2-a0d2d98f898f-bundle\") pod \"e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e08989dhs6v\" (UID: \"adf5a27c-0e62-496e-a0a2-a0d2d98f898f\") " pod="openshift-marketplace/e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e08989dhs6v" Mar 09 14:16:14 crc kubenswrapper[4722]: I0309 14:16:14.878293 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qj7h6\" (UniqueName: \"kubernetes.io/projected/adf5a27c-0e62-496e-a0a2-a0d2d98f898f-kube-api-access-qj7h6\") pod \"e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e08989dhs6v\" (UID: \"adf5a27c-0e62-496e-a0a2-a0d2d98f898f\") " pod="openshift-marketplace/e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e08989dhs6v" Mar 09 14:16:14 crc kubenswrapper[4722]: I0309 14:16:14.878362 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/adf5a27c-0e62-496e-a0a2-a0d2d98f898f-util\") pod \"e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e08989dhs6v\" (UID: \"adf5a27c-0e62-496e-a0a2-a0d2d98f898f\") " pod="openshift-marketplace/e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e08989dhs6v" Mar 09 14:16:14 crc kubenswrapper[4722]: I0309 14:16:14.878404 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/adf5a27c-0e62-496e-a0a2-a0d2d98f898f-bundle\") pod \"e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e08989dhs6v\" (UID: \"adf5a27c-0e62-496e-a0a2-a0d2d98f898f\") " pod="openshift-marketplace/e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e08989dhs6v" Mar 09 14:16:14 crc kubenswrapper[4722]: I0309 14:16:14.879095 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/adf5a27c-0e62-496e-a0a2-a0d2d98f898f-util\") pod \"e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e08989dhs6v\" (UID: \"adf5a27c-0e62-496e-a0a2-a0d2d98f898f\") " 
pod="openshift-marketplace/e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e08989dhs6v" Mar 09 14:16:14 crc kubenswrapper[4722]: I0309 14:16:14.879143 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/adf5a27c-0e62-496e-a0a2-a0d2d98f898f-bundle\") pod \"e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e08989dhs6v\" (UID: \"adf5a27c-0e62-496e-a0a2-a0d2d98f898f\") " pod="openshift-marketplace/e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e08989dhs6v" Mar 09 14:16:14 crc kubenswrapper[4722]: I0309 14:16:14.902832 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qj7h6\" (UniqueName: \"kubernetes.io/projected/adf5a27c-0e62-496e-a0a2-a0d2d98f898f-kube-api-access-qj7h6\") pod \"e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e08989dhs6v\" (UID: \"adf5a27c-0e62-496e-a0a2-a0d2d98f898f\") " pod="openshift-marketplace/e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e08989dhs6v" Mar 09 14:16:15 crc kubenswrapper[4722]: I0309 14:16:15.039445 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e08989dhs6v" Mar 09 14:16:15 crc kubenswrapper[4722]: I0309 14:16:15.481948 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e08989dhs6v"] Mar 09 14:16:15 crc kubenswrapper[4722]: W0309 14:16:15.507766 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podadf5a27c_0e62_496e_a0a2_a0d2d98f898f.slice/crio-96872c318f1824b405f13293ea4c8f3a8063bd57b636a044b78ced329e769efa WatchSource:0}: Error finding container 96872c318f1824b405f13293ea4c8f3a8063bd57b636a044b78ced329e769efa: Status 404 returned error can't find the container with id 96872c318f1824b405f13293ea4c8f3a8063bd57b636a044b78ced329e769efa Mar 09 14:16:15 crc kubenswrapper[4722]: I0309 14:16:15.786784 4722 scope.go:117] "RemoveContainer" containerID="189c0576d6df4503226c658c0938baa9365f608dcc338c56a8d434a267bc0b50" Mar 09 14:16:16 crc kubenswrapper[4722]: I0309 14:16:16.390350 4722 generic.go:334] "Generic (PLEG): container finished" podID="adf5a27c-0e62-496e-a0a2-a0d2d98f898f" containerID="5dc40871aa032801c71dfb604c8990d96e8b9e4801c9300c746ec93b09f768ef" exitCode=0 Mar 09 14:16:16 crc kubenswrapper[4722]: I0309 14:16:16.390469 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e08989dhs6v" event={"ID":"adf5a27c-0e62-496e-a0a2-a0d2d98f898f","Type":"ContainerDied","Data":"5dc40871aa032801c71dfb604c8990d96e8b9e4801c9300c746ec93b09f768ef"} Mar 09 14:16:16 crc kubenswrapper[4722]: I0309 14:16:16.390740 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e08989dhs6v" event={"ID":"adf5a27c-0e62-496e-a0a2-a0d2d98f898f","Type":"ContainerStarted","Data":"96872c318f1824b405f13293ea4c8f3a8063bd57b636a044b78ced329e769efa"} Mar 09 14:16:16 crc kubenswrapper[4722]: I0309 14:16:16.392993 4722 generic.go:334] "Generic (PLEG): container finished" podID="af1c9e69-9bc2-4c93-8e62-132b4470617d" containerID="cb09a20ec075ed701bcd7c80fbcad79b45d8714a8302a5c1dca52f6a2e4497d5" exitCode=0 Mar 09 14:16:16 crc kubenswrapper[4722]: I0309 14:16:16.393034 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e19k78qz" event={"ID":"af1c9e69-9bc2-4c93-8e62-132b4470617d","Type":"ContainerDied","Data":"cb09a20ec075ed701bcd7c80fbcad79b45d8714a8302a5c1dca52f6a2e4497d5"} Mar 09 14:16:17 crc kubenswrapper[4722]: I0309 14:16:17.400614 4722 generic.go:334] "Generic (PLEG): container finished" podID="af1c9e69-9bc2-4c93-8e62-132b4470617d" containerID="7a9ed7c7ebc0912fa5e4b1f08708b8851f45c2720865b45d4a564bc25e0a317e" exitCode=0 Mar 09 14:16:17 crc kubenswrapper[4722]: I0309 14:16:17.400743 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e19k78qz" event={"ID":"af1c9e69-9bc2-4c93-8e62-132b4470617d","Type":"ContainerDied","Data":"7a9ed7c7ebc0912fa5e4b1f08708b8851f45c2720865b45d4a564bc25e0a317e"} Mar 09 14:16:18 crc kubenswrapper[4722]: I0309 14:16:18.410628 4722 generic.go:334] "Generic (PLEG): container finished" podID="adf5a27c-0e62-496e-a0a2-a0d2d98f898f" containerID="48bffc387213a14f95a62caa4be7dd3b8fe2e437dab65a00c35eaa400760e2a2" exitCode=0 Mar 09 14:16:18 crc kubenswrapper[4722]: I0309 14:16:18.411565 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e08989dhs6v" event={"ID":"adf5a27c-0e62-496e-a0a2-a0d2d98f898f","Type":"ContainerDied","Data":"48bffc387213a14f95a62caa4be7dd3b8fe2e437dab65a00c35eaa400760e2a2"} Mar 09 14:16:18 crc kubenswrapper[4722]: I0309 14:16:18.845585 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e19k78qz" Mar 09 14:16:18 crc kubenswrapper[4722]: I0309 14:16:18.951865 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/af1c9e69-9bc2-4c93-8e62-132b4470617d-util\") pod \"af1c9e69-9bc2-4c93-8e62-132b4470617d\" (UID: \"af1c9e69-9bc2-4c93-8e62-132b4470617d\") " Mar 09 14:16:18 crc kubenswrapper[4722]: I0309 14:16:18.951949 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cswgm\" (UniqueName: \"kubernetes.io/projected/af1c9e69-9bc2-4c93-8e62-132b4470617d-kube-api-access-cswgm\") pod \"af1c9e69-9bc2-4c93-8e62-132b4470617d\" (UID: \"af1c9e69-9bc2-4c93-8e62-132b4470617d\") " Mar 09 14:16:18 crc kubenswrapper[4722]: I0309 14:16:18.952037 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/af1c9e69-9bc2-4c93-8e62-132b4470617d-bundle\") pod \"af1c9e69-9bc2-4c93-8e62-132b4470617d\" (UID: \"af1c9e69-9bc2-4c93-8e62-132b4470617d\") " Mar 09 14:16:18 crc kubenswrapper[4722]: I0309 14:16:18.953290 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af1c9e69-9bc2-4c93-8e62-132b4470617d-bundle" (OuterVolumeSpecName: "bundle") pod "af1c9e69-9bc2-4c93-8e62-132b4470617d" (UID: "af1c9e69-9bc2-4c93-8e62-132b4470617d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:16:18 crc kubenswrapper[4722]: I0309 14:16:18.958669 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af1c9e69-9bc2-4c93-8e62-132b4470617d-kube-api-access-cswgm" (OuterVolumeSpecName: "kube-api-access-cswgm") pod "af1c9e69-9bc2-4c93-8e62-132b4470617d" (UID: "af1c9e69-9bc2-4c93-8e62-132b4470617d"). 
InnerVolumeSpecName "kube-api-access-cswgm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:16:18 crc kubenswrapper[4722]: I0309 14:16:18.967573 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af1c9e69-9bc2-4c93-8e62-132b4470617d-util" (OuterVolumeSpecName: "util") pod "af1c9e69-9bc2-4c93-8e62-132b4470617d" (UID: "af1c9e69-9bc2-4c93-8e62-132b4470617d"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:16:19 crc kubenswrapper[4722]: I0309 14:16:19.053886 4722 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/af1c9e69-9bc2-4c93-8e62-132b4470617d-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:16:19 crc kubenswrapper[4722]: I0309 14:16:19.053923 4722 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/af1c9e69-9bc2-4c93-8e62-132b4470617d-util\") on node \"crc\" DevicePath \"\"" Mar 09 14:16:19 crc kubenswrapper[4722]: I0309 14:16:19.053933 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cswgm\" (UniqueName: \"kubernetes.io/projected/af1c9e69-9bc2-4c93-8e62-132b4470617d-kube-api-access-cswgm\") on node \"crc\" DevicePath \"\"" Mar 09 14:16:19 crc kubenswrapper[4722]: I0309 14:16:19.424376 4722 generic.go:334] "Generic (PLEG): container finished" podID="adf5a27c-0e62-496e-a0a2-a0d2d98f898f" containerID="22fc69a8f39dbfe85ec6f125257867286544e0b4ad0184056431b43996b8007d" exitCode=0 Mar 09 14:16:19 crc kubenswrapper[4722]: I0309 14:16:19.424437 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e08989dhs6v" event={"ID":"adf5a27c-0e62-496e-a0a2-a0d2d98f898f","Type":"ContainerDied","Data":"22fc69a8f39dbfe85ec6f125257867286544e0b4ad0184056431b43996b8007d"} Mar 09 14:16:19 crc kubenswrapper[4722]: I0309 14:16:19.439425 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e19k78qz" event={"ID":"af1c9e69-9bc2-4c93-8e62-132b4470617d","Type":"ContainerDied","Data":"1e53d82a4e656110478d2326cdc0f4986c934e19abe3a1b013081cec965c8b58"} Mar 09 14:16:19 crc kubenswrapper[4722]: I0309 14:16:19.439723 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e53d82a4e656110478d2326cdc0f4986c934e19abe3a1b013081cec965c8b58" Mar 09 14:16:19 crc kubenswrapper[4722]: I0309 14:16:19.439723 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e19k78qz" Mar 09 14:16:20 crc kubenswrapper[4722]: I0309 14:16:20.747966 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e08989dhs6v" Mar 09 14:16:20 crc kubenswrapper[4722]: I0309 14:16:20.882577 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qj7h6\" (UniqueName: \"kubernetes.io/projected/adf5a27c-0e62-496e-a0a2-a0d2d98f898f-kube-api-access-qj7h6\") pod \"adf5a27c-0e62-496e-a0a2-a0d2d98f898f\" (UID: \"adf5a27c-0e62-496e-a0a2-a0d2d98f898f\") " Mar 09 14:16:20 crc kubenswrapper[4722]: I0309 14:16:20.882657 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/adf5a27c-0e62-496e-a0a2-a0d2d98f898f-bundle\") pod \"adf5a27c-0e62-496e-a0a2-a0d2d98f898f\" (UID: \"adf5a27c-0e62-496e-a0a2-a0d2d98f898f\") " Mar 09 14:16:20 crc kubenswrapper[4722]: I0309 14:16:20.882769 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/adf5a27c-0e62-496e-a0a2-a0d2d98f898f-util\") pod \"adf5a27c-0e62-496e-a0a2-a0d2d98f898f\" (UID: \"adf5a27c-0e62-496e-a0a2-a0d2d98f898f\") " Mar 09 14:16:20 crc kubenswrapper[4722]: I0309 14:16:20.891650 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/adf5a27c-0e62-496e-a0a2-a0d2d98f898f-bundle" (OuterVolumeSpecName: "bundle") pod "adf5a27c-0e62-496e-a0a2-a0d2d98f898f" (UID: "adf5a27c-0e62-496e-a0a2-a0d2d98f898f"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:16:20 crc kubenswrapper[4722]: I0309 14:16:20.892415 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/adf5a27c-0e62-496e-a0a2-a0d2d98f898f-kube-api-access-qj7h6" (OuterVolumeSpecName: "kube-api-access-qj7h6") pod "adf5a27c-0e62-496e-a0a2-a0d2d98f898f" (UID: "adf5a27c-0e62-496e-a0a2-a0d2d98f898f"). InnerVolumeSpecName "kube-api-access-qj7h6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:16:20 crc kubenswrapper[4722]: I0309 14:16:20.901880 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/adf5a27c-0e62-496e-a0a2-a0d2d98f898f-util" (OuterVolumeSpecName: "util") pod "adf5a27c-0e62-496e-a0a2-a0d2d98f898f" (UID: "adf5a27c-0e62-496e-a0a2-a0d2d98f898f"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:16:20 crc kubenswrapper[4722]: I0309 14:16:20.985003 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qj7h6\" (UniqueName: \"kubernetes.io/projected/adf5a27c-0e62-496e-a0a2-a0d2d98f898f-kube-api-access-qj7h6\") on node \"crc\" DevicePath \"\"" Mar 09 14:16:20 crc kubenswrapper[4722]: I0309 14:16:20.985039 4722 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/adf5a27c-0e62-496e-a0a2-a0d2d98f898f-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:16:20 crc kubenswrapper[4722]: I0309 14:16:20.985050 4722 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/adf5a27c-0e62-496e-a0a2-a0d2d98f898f-util\") on node \"crc\" DevicePath \"\"" Mar 09 14:16:21 crc kubenswrapper[4722]: I0309 14:16:21.456827 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e08989dhs6v" event={"ID":"adf5a27c-0e62-496e-a0a2-a0d2d98f898f","Type":"ContainerDied","Data":"96872c318f1824b405f13293ea4c8f3a8063bd57b636a044b78ced329e769efa"} Mar 09 14:16:21 crc kubenswrapper[4722]: I0309 14:16:21.456871 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="96872c318f1824b405f13293ea4c8f3a8063bd57b636a044b78ced329e769efa" Mar 09 14:16:21 crc kubenswrapper[4722]: I0309 14:16:21.456947 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e08989dhs6v" Mar 09 14:16:21 crc kubenswrapper[4722]: I0309 14:16:21.528061 4722 patch_prober.go:28] interesting pod/machine-config-daemon-hjrrb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 14:16:21 crc kubenswrapper[4722]: I0309 14:16:21.528123 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 14:16:24 crc kubenswrapper[4722]: I0309 14:16:24.264922 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/cluster-logging-operator-c769fd969-gjzjg"] Mar 09 14:16:24 crc kubenswrapper[4722]: E0309 14:16:24.267701 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adf5a27c-0e62-496e-a0a2-a0d2d98f898f" containerName="extract" Mar 09 14:16:24 crc kubenswrapper[4722]: I0309 14:16:24.267719 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="adf5a27c-0e62-496e-a0a2-a0d2d98f898f" containerName="extract" Mar 09 14:16:24 crc kubenswrapper[4722]: E0309 14:16:24.267728 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af1c9e69-9bc2-4c93-8e62-132b4470617d" containerName="extract" Mar 09 14:16:24 crc kubenswrapper[4722]: I0309 14:16:24.267733 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="af1c9e69-9bc2-4c93-8e62-132b4470617d" containerName="extract" Mar 09 14:16:24 crc kubenswrapper[4722]: E0309 14:16:24.267745 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af1c9e69-9bc2-4c93-8e62-132b4470617d" containerName="pull" Mar 09 14:16:24 
crc kubenswrapper[4722]: I0309 14:16:24.267752 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="af1c9e69-9bc2-4c93-8e62-132b4470617d" containerName="pull" Mar 09 14:16:24 crc kubenswrapper[4722]: E0309 14:16:24.267766 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adf5a27c-0e62-496e-a0a2-a0d2d98f898f" containerName="pull" Mar 09 14:16:24 crc kubenswrapper[4722]: I0309 14:16:24.267772 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="adf5a27c-0e62-496e-a0a2-a0d2d98f898f" containerName="pull" Mar 09 14:16:24 crc kubenswrapper[4722]: E0309 14:16:24.267783 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af1c9e69-9bc2-4c93-8e62-132b4470617d" containerName="util" Mar 09 14:16:24 crc kubenswrapper[4722]: I0309 14:16:24.267789 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="af1c9e69-9bc2-4c93-8e62-132b4470617d" containerName="util" Mar 09 14:16:24 crc kubenswrapper[4722]: E0309 14:16:24.267800 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adf5a27c-0e62-496e-a0a2-a0d2d98f898f" containerName="util" Mar 09 14:16:24 crc kubenswrapper[4722]: I0309 14:16:24.267807 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="adf5a27c-0e62-496e-a0a2-a0d2d98f898f" containerName="util" Mar 09 14:16:24 crc kubenswrapper[4722]: I0309 14:16:24.267912 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="adf5a27c-0e62-496e-a0a2-a0d2d98f898f" containerName="extract" Mar 09 14:16:24 crc kubenswrapper[4722]: I0309 14:16:24.267930 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="af1c9e69-9bc2-4c93-8e62-132b4470617d" containerName="extract" Mar 09 14:16:24 crc kubenswrapper[4722]: I0309 14:16:24.268465 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/cluster-logging-operator-c769fd969-gjzjg" Mar 09 14:16:24 crc kubenswrapper[4722]: I0309 14:16:24.271551 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"cluster-logging-operator-dockercfg-cmvvq" Mar 09 14:16:24 crc kubenswrapper[4722]: I0309 14:16:24.272007 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"kube-root-ca.crt" Mar 09 14:16:24 crc kubenswrapper[4722]: I0309 14:16:24.281161 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"openshift-service-ca.crt" Mar 09 14:16:24 crc kubenswrapper[4722]: I0309 14:16:24.283475 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/cluster-logging-operator-c769fd969-gjzjg"] Mar 09 14:16:24 crc kubenswrapper[4722]: I0309 14:16:24.433942 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s64kh\" (UniqueName: \"kubernetes.io/projected/b08e9124-838c-47b3-9452-62c3388a66e0-kube-api-access-s64kh\") pod \"cluster-logging-operator-c769fd969-gjzjg\" (UID: \"b08e9124-838c-47b3-9452-62c3388a66e0\") " pod="openshift-logging/cluster-logging-operator-c769fd969-gjzjg" Mar 09 14:16:24 crc kubenswrapper[4722]: I0309 14:16:24.536181 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s64kh\" (UniqueName: \"kubernetes.io/projected/b08e9124-838c-47b3-9452-62c3388a66e0-kube-api-access-s64kh\") pod \"cluster-logging-operator-c769fd969-gjzjg\" (UID: \"b08e9124-838c-47b3-9452-62c3388a66e0\") " pod="openshift-logging/cluster-logging-operator-c769fd969-gjzjg" Mar 09 14:16:24 crc kubenswrapper[4722]: I0309 14:16:24.555406 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s64kh\" (UniqueName: \"kubernetes.io/projected/b08e9124-838c-47b3-9452-62c3388a66e0-kube-api-access-s64kh\") pod \"cluster-logging-operator-c769fd969-gjzjg\" (UID: \"b08e9124-838c-47b3-9452-62c3388a66e0\") " pod="openshift-logging/cluster-logging-operator-c769fd969-gjzjg" Mar 09 14:16:24 crc kubenswrapper[4722]: I0309 14:16:24.595925 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/cluster-logging-operator-c769fd969-gjzjg" Mar 09 14:16:25 crc kubenswrapper[4722]: I0309 14:16:25.097782 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/cluster-logging-operator-c769fd969-gjzjg"] Mar 09 14:16:25 crc kubenswrapper[4722]: I0309 14:16:25.484039 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/cluster-logging-operator-c769fd969-gjzjg" event={"ID":"b08e9124-838c-47b3-9452-62c3388a66e0","Type":"ContainerStarted","Data":"bd7eeffeb7d07c24277babe36ec5221bdb0a567c0ecd7c8dd2dd555922c8562c"} Mar 09 14:16:30 crc kubenswrapper[4722]: I0309 14:16:30.524942 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/cluster-logging-operator-c769fd969-gjzjg" event={"ID":"b08e9124-838c-47b3-9452-62c3388a66e0","Type":"ContainerStarted","Data":"38f074ceebee52234fcf2f9567d8c1d4ff38ca806f5c57cb8cbe6142c6dc23ba"} Mar 09 14:16:30 crc kubenswrapper[4722]: I0309 14:16:30.544289 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/cluster-logging-operator-c769fd969-gjzjg" podStartSLOduration=1.6060010409999999 podStartE2EDuration="6.544271329s" podCreationTimestamp="2026-03-09 14:16:24 +0000 UTC" firstStartedPulling="2026-03-09 14:16:25.109413453 +0000 UTC m=+825.664982039" lastFinishedPulling="2026-03-09 14:16:30.047683761 +0000 UTC m=+830.603252327" observedRunningTime="2026-03-09 14:16:30.543959099 +0000 UTC m=+831.099527675" watchObservedRunningTime="2026-03-09 14:16:30.544271329 +0000 UTC m=+831.099839915" Mar 09 14:16:35 crc kubenswrapper[4722]: I0309 14:16:35.327343 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-7d6d6698bd-4r85k"] Mar 09 14:16:35 crc kubenswrapper[4722]: I0309 14:16:35.328799 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-7d6d6698bd-4r85k" Mar 09 14:16:35 crc kubenswrapper[4722]: I0309 14:16:35.335497 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-service-cert" Mar 09 14:16:35 crc kubenswrapper[4722]: I0309 14:16:35.336730 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"kube-root-ca.crt" Mar 09 14:16:35 crc kubenswrapper[4722]: I0309 14:16:35.336946 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"loki-operator-manager-config" Mar 09 14:16:35 crc kubenswrapper[4722]: I0309 14:16:35.337895 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"openshift-service-ca.crt" Mar 09 14:16:35 crc kubenswrapper[4722]: I0309 14:16:35.338125 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-dockercfg-phk67" Mar 09 14:16:35 crc kubenswrapper[4722]: I0309 14:16:35.346689 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-metrics" Mar 09 14:16:35 crc kubenswrapper[4722]: I0309 14:16:35.372796 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-7d6d6698bd-4r85k"] Mar 09 14:16:35 crc kubenswrapper[4722]: I0309 14:16:35.439647 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjbkc\" (UniqueName: \"kubernetes.io/projected/497a07fc-9649-4620-9432-855aa3fdc327-kube-api-access-rjbkc\") pod \"loki-operator-controller-manager-7d6d6698bd-4r85k\" (UID: \"497a07fc-9649-4620-9432-855aa3fdc327\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7d6d6698bd-4r85k" Mar 09 14:16:35 crc kubenswrapper[4722]: I0309 14:16:35.439718 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/497a07fc-9649-4620-9432-855aa3fdc327-manager-config\") pod \"loki-operator-controller-manager-7d6d6698bd-4r85k\" (UID: \"497a07fc-9649-4620-9432-855aa3fdc327\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7d6d6698bd-4r85k" Mar 09 14:16:35 crc kubenswrapper[4722]: I0309 14:16:35.439958 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/497a07fc-9649-4620-9432-855aa3fdc327-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-7d6d6698bd-4r85k\" (UID: \"497a07fc-9649-4620-9432-855aa3fdc327\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7d6d6698bd-4r85k" Mar 09 14:16:35 crc kubenswrapper[4722]: I0309 14:16:35.440031 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/497a07fc-9649-4620-9432-855aa3fdc327-apiservice-cert\") pod \"loki-operator-controller-manager-7d6d6698bd-4r85k\" (UID: \"497a07fc-9649-4620-9432-855aa3fdc327\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7d6d6698bd-4r85k" Mar 09 14:16:35 crc kubenswrapper[4722]: I0309 14:16:35.440092 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" 
(UniqueName: \"kubernetes.io/secret/497a07fc-9649-4620-9432-855aa3fdc327-webhook-cert\") pod \"loki-operator-controller-manager-7d6d6698bd-4r85k\" (UID: \"497a07fc-9649-4620-9432-855aa3fdc327\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7d6d6698bd-4r85k" Mar 09 14:16:35 crc kubenswrapper[4722]: I0309 14:16:35.540969 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjbkc\" (UniqueName: \"kubernetes.io/projected/497a07fc-9649-4620-9432-855aa3fdc327-kube-api-access-rjbkc\") pod \"loki-operator-controller-manager-7d6d6698bd-4r85k\" (UID: \"497a07fc-9649-4620-9432-855aa3fdc327\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7d6d6698bd-4r85k" Mar 09 14:16:35 crc kubenswrapper[4722]: I0309 14:16:35.541038 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/497a07fc-9649-4620-9432-855aa3fdc327-manager-config\") pod \"loki-operator-controller-manager-7d6d6698bd-4r85k\" (UID: \"497a07fc-9649-4620-9432-855aa3fdc327\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7d6d6698bd-4r85k" Mar 09 14:16:35 crc kubenswrapper[4722]: I0309 14:16:35.541089 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/497a07fc-9649-4620-9432-855aa3fdc327-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-7d6d6698bd-4r85k\" (UID: \"497a07fc-9649-4620-9432-855aa3fdc327\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7d6d6698bd-4r85k" Mar 09 14:16:35 crc kubenswrapper[4722]: I0309 14:16:35.541106 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/497a07fc-9649-4620-9432-855aa3fdc327-apiservice-cert\") pod \"loki-operator-controller-manager-7d6d6698bd-4r85k\" (UID: \"497a07fc-9649-4620-9432-855aa3fdc327\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7d6d6698bd-4r85k" Mar 09 14:16:35 crc kubenswrapper[4722]: I0309 14:16:35.541136 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/497a07fc-9649-4620-9432-855aa3fdc327-webhook-cert\") pod \"loki-operator-controller-manager-7d6d6698bd-4r85k\" (UID: \"497a07fc-9649-4620-9432-855aa3fdc327\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7d6d6698bd-4r85k" Mar 09 14:16:35 crc kubenswrapper[4722]: I0309 14:16:35.541973 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/497a07fc-9649-4620-9432-855aa3fdc327-manager-config\") pod \"loki-operator-controller-manager-7d6d6698bd-4r85k\" (UID: \"497a07fc-9649-4620-9432-855aa3fdc327\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7d6d6698bd-4r85k" Mar 09 14:16:35 crc kubenswrapper[4722]: I0309 14:16:35.547079 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/497a07fc-9649-4620-9432-855aa3fdc327-apiservice-cert\") pod \"loki-operator-controller-manager-7d6d6698bd-4r85k\" (UID: \"497a07fc-9649-4620-9432-855aa3fdc327\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7d6d6698bd-4r85k" Mar 09 14:16:35 crc kubenswrapper[4722]: I0309 14:16:35.555144 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/497a07fc-9649-4620-9432-855aa3fdc327-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-7d6d6698bd-4r85k\" (UID: \"497a07fc-9649-4620-9432-855aa3fdc327\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7d6d6698bd-4r85k" Mar 09 14:16:35 crc kubenswrapper[4722]: I0309 14:16:35.557961 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjbkc\" (UniqueName: \"kubernetes.io/projected/497a07fc-9649-4620-9432-855aa3fdc327-kube-api-access-rjbkc\") pod \"loki-operator-controller-manager-7d6d6698bd-4r85k\" (UID: \"497a07fc-9649-4620-9432-855aa3fdc327\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7d6d6698bd-4r85k" Mar 09 14:16:35 crc kubenswrapper[4722]: I0309 14:16:35.558119 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/497a07fc-9649-4620-9432-855aa3fdc327-webhook-cert\") pod \"loki-operator-controller-manager-7d6d6698bd-4r85k\" (UID: \"497a07fc-9649-4620-9432-855aa3fdc327\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7d6d6698bd-4r85k" Mar 09 14:16:35 crc kubenswrapper[4722]: I0309 14:16:35.657117 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-7d6d6698bd-4r85k" Mar 09 14:16:36 crc kubenswrapper[4722]: I0309 14:16:36.142632 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-7d6d6698bd-4r85k"] Mar 09 14:16:36 crc kubenswrapper[4722]: I0309 14:16:36.568604 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-7d6d6698bd-4r85k" event={"ID":"497a07fc-9649-4620-9432-855aa3fdc327","Type":"ContainerStarted","Data":"878dc01dcce02a7208013f25486ecb1cb2d27cf742e23f5a96799dfb94c7d055"} Mar 09 14:16:36 crc kubenswrapper[4722]: I0309 14:16:36.893138 4722 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 09 14:16:39 crc kubenswrapper[4722]: I0309 14:16:39.589266 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-7d6d6698bd-4r85k" event={"ID":"497a07fc-9649-4620-9432-855aa3fdc327","Type":"ContainerStarted","Data":"c59716ff78d5c2092732f1060e720e1a4bc901135d08a83828a4032b1d2c6101"} Mar 09 14:16:45 crc kubenswrapper[4722]: I0309 14:16:45.641553 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-7d6d6698bd-4r85k" event={"ID":"497a07fc-9649-4620-9432-855aa3fdc327","Type":"ContainerStarted","Data":"e0657e40f6bb4142daa552156aa9a95856202927bd725236cf77ebd12188842c"} Mar 09 14:16:45 crc kubenswrapper[4722]: I0309 14:16:45.642124 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators-redhat/loki-operator-controller-manager-7d6d6698bd-4r85k" Mar 09 14:16:45 crc kubenswrapper[4722]: I0309 14:16:45.644678 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators-redhat/loki-operator-controller-manager-7d6d6698bd-4r85k" Mar 09 14:16:45 crc kubenswrapper[4722]: I0309 14:16:45.671891 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators-redhat/loki-operator-controller-manager-7d6d6698bd-4r85k" 
podStartSLOduration=1.614902887 podStartE2EDuration="10.671863568s" podCreationTimestamp="2026-03-09 14:16:35 +0000 UTC" firstStartedPulling="2026-03-09 14:16:36.17327372 +0000 UTC m=+836.728842296" lastFinishedPulling="2026-03-09 14:16:45.230234401 +0000 UTC m=+845.785802977" observedRunningTime="2026-03-09 14:16:45.665217754 +0000 UTC m=+846.220786330" watchObservedRunningTime="2026-03-09 14:16:45.671863568 +0000 UTC m=+846.227432164" Mar 09 14:16:50 crc kubenswrapper[4722]: I0309 14:16:50.475110 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["minio-dev/minio"] Mar 09 14:16:50 crc kubenswrapper[4722]: I0309 14:16:50.477064 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="minio-dev/minio" Mar 09 14:16:50 crc kubenswrapper[4722]: I0309 14:16:50.482668 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"kube-root-ca.crt" Mar 09 14:16:50 crc kubenswrapper[4722]: I0309 14:16:50.482810 4722 reflector.go:368] Caches populated for *v1.Secret from object-"minio-dev"/"default-dockercfg-f5tjf" Mar 09 14:16:50 crc kubenswrapper[4722]: I0309 14:16:50.484565 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"openshift-service-ca.crt" Mar 09 14:16:50 crc kubenswrapper[4722]: I0309 14:16:50.486189 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Mar 09 14:16:50 crc kubenswrapper[4722]: I0309 14:16:50.578998 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-f20470b6-f364-456b-a4d1-dee3676a1409\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f20470b6-f364-456b-a4d1-dee3676a1409\") pod \"minio\" (UID: \"d16831be-9cbf-4e6b-92a7-e72e1197d02a\") " pod="minio-dev/minio" Mar 09 14:16:50 crc kubenswrapper[4722]: I0309 14:16:50.579083 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nf28t\" (UniqueName: \"kubernetes.io/projected/d16831be-9cbf-4e6b-92a7-e72e1197d02a-kube-api-access-nf28t\") pod \"minio\" (UID: \"d16831be-9cbf-4e6b-92a7-e72e1197d02a\") " pod="minio-dev/minio" Mar 09 14:16:50 crc kubenswrapper[4722]: I0309 14:16:50.679929 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nf28t\" (UniqueName: \"kubernetes.io/projected/d16831be-9cbf-4e6b-92a7-e72e1197d02a-kube-api-access-nf28t\") pod \"minio\" (UID: \"d16831be-9cbf-4e6b-92a7-e72e1197d02a\") " pod="minio-dev/minio" Mar 09 14:16:50 crc kubenswrapper[4722]: I0309 14:16:50.680037 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-f20470b6-f364-456b-a4d1-dee3676a1409\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f20470b6-f364-456b-a4d1-dee3676a1409\") pod \"minio\" (UID: \"d16831be-9cbf-4e6b-92a7-e72e1197d02a\") " pod="minio-dev/minio" Mar 09 14:16:50 crc kubenswrapper[4722]: I0309 14:16:50.682988 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
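
The pod_startup_latency_tracker entry above for loki-operator-controller-manager-7d6d6698bd-4r85k reports two durations whose relationship can be checked from its own timestamps: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration appears to be that figure minus the image-pull window (lastFinishedPulling minus firstStartedPulling). A small Go check under that assumption, with the timestamps copied verbatim from the entry; the formula is inferred from the numbers, not quoted from kubelet documentation.

    package main

    import (
    	"fmt"
    	"time"
    )

    func main() {
    	// Layout matching the "2026-03-09 14:16:35 +0000 UTC" form in the log.
    	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
    	parse := func(s string) time.Time {
    		t, err := time.Parse(layout, s)
    		if err != nil {
    			panic(err)
    		}
    		return t
    	}

    	created := parse("2026-03-09 14:16:35 +0000 UTC")           // podCreationTimestamp
    	firstPull := parse("2026-03-09 14:16:36.17327372 +0000 UTC") // firstStartedPulling
    	lastPull := parse("2026-03-09 14:16:45.230234401 +0000 UTC") // lastFinishedPulling
    	running := parse("2026-03-09 14:16:45.671863568 +0000 UTC")  // watchObservedRunningTime

    	e2e := running.Sub(created)          // 10.671863568s, matches podStartE2EDuration
    	slo := e2e - lastPull.Sub(firstPull) // 1.614902887s, matches podStartSLOduration
    	fmt.Println(e2e, slo)
    }

The same arithmetic reproduces the cluster-logging-operator entry earlier in the log (6.544271329s end-to-end, ~1.606s with the pull window removed), which supports the inferred formula.
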
Mar 09 14:16:50 crc kubenswrapper[4722]: I0309 14:16:50.683025 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-f20470b6-f364-456b-a4d1-dee3676a1409\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f20470b6-f364-456b-a4d1-dee3676a1409\") pod \"minio\" (UID: \"d16831be-9cbf-4e6b-92a7-e72e1197d02a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/af9d6bd5b1825d8b3f2e1d69867c78717abee1b70133b1a6d4cb65a074a8300d/globalmount\"" pod="minio-dev/minio"
Mar 09 14:16:50 crc kubenswrapper[4722]: I0309 14:16:50.699450 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nf28t\" (UniqueName: \"kubernetes.io/projected/d16831be-9cbf-4e6b-92a7-e72e1197d02a-kube-api-access-nf28t\") pod \"minio\" (UID: \"d16831be-9cbf-4e6b-92a7-e72e1197d02a\") " pod="minio-dev/minio"
Mar 09 14:16:50 crc kubenswrapper[4722]: I0309 14:16:50.713643 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-f20470b6-f364-456b-a4d1-dee3676a1409\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-f20470b6-f364-456b-a4d1-dee3676a1409\") pod \"minio\" (UID: \"d16831be-9cbf-4e6b-92a7-e72e1197d02a\") " pod="minio-dev/minio"
Mar 09 14:16:50 crc kubenswrapper[4722]: I0309 14:16:50.802566 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="minio-dev/minio"
Mar 09 14:16:51 crc kubenswrapper[4722]: I0309 14:16:51.051762 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"]
Mar 09 14:16:51 crc kubenswrapper[4722]: W0309 14:16:51.062728 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd16831be_9cbf_4e6b_92a7_e72e1197d02a.slice/crio-3d94812ad68f323108ac0b06118ad8b67a4a133d88328bfafe3be36c18e60e8c WatchSource:0}: Error finding container 3d94812ad68f323108ac0b06118ad8b67a4a133d88328bfafe3be36c18e60e8c: Status 404 returned error can't find the container with id 3d94812ad68f323108ac0b06118ad8b67a4a133d88328bfafe3be36c18e60e8c
Mar 09 14:16:51 crc kubenswrapper[4722]: I0309 14:16:51.528190 4722 patch_prober.go:28] interesting pod/machine-config-daemon-hjrrb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 14:16:51 crc kubenswrapper[4722]: I0309 14:16:51.528533 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 14:16:51 crc kubenswrapper[4722]: I0309 14:16:51.528581 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb"
Mar 09 14:16:51 crc kubenswrapper[4722]: I0309 14:16:51.529257 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cc45a812c78ad6bdbc54dbec7789e158b5ae14665e6cafed5462e27caf19d00d"} pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 09 14:16:51 crc kubenswrapper[4722]: I0309 14:16:51.529342 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerName="machine-config-daemon" containerID="cri-o://cc45a812c78ad6bdbc54dbec7789e158b5ae14665e6cafed5462e27caf19d00d" gracePeriod=600
Mar 09 14:16:51 crc kubenswrapper[4722]: I0309 14:16:51.693571 4722 generic.go:334] "Generic (PLEG): container finished" podID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerID="cc45a812c78ad6bdbc54dbec7789e158b5ae14665e6cafed5462e27caf19d00d" exitCode=0
Mar 09 14:16:51 crc kubenswrapper[4722]: I0309 14:16:51.693730 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" event={"ID":"dac2aaf5-653b-4b2a-8efe-ed26bac8d648","Type":"ContainerDied","Data":"cc45a812c78ad6bdbc54dbec7789e158b5ae14665e6cafed5462e27caf19d00d"}
Mar 09 14:16:51 crc kubenswrapper[4722]: I0309 14:16:51.693780 4722 scope.go:117] "RemoveContainer" containerID="bab3031afd45b73f21ee1828aeb90282e64688adb04b134404f0cab923bb0351"
Mar 09 14:16:51 crc kubenswrapper[4722]: I0309 14:16:51.694726 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"d16831be-9cbf-4e6b-92a7-e72e1197d02a","Type":"ContainerStarted","Data":"3d94812ad68f323108ac0b06118ad8b67a4a133d88328bfafe3be36c18e60e8c"}
Mar 09 14:16:52 crc kubenswrapper[4722]: I0309 14:16:52.721192 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" event={"ID":"dac2aaf5-653b-4b2a-8efe-ed26bac8d648","Type":"ContainerStarted","Data":"50a94b1e196b515b7f1ddd4cb650f99db9da76851a6a18093dd50246aaec5007"}
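
The failure-and-restart cycle above (liveness probe failures at 14:16:21 and 14:16:51, then "Killing container with a grace period" and a new container ID at 14:16:52) is driven by a liveness probe against http://127.0.0.1:8798/health on machine-config-daemon. A sketch of a probe definition consistent with those entries, written in Go against the k8s.io/api types; only the host, port, and path are taken from the log, the 30 s period matches the spacing of the two failure entries, and the failure threshold is an assumption the log cannot confirm.

    package main

    import (
    	"fmt"

    	corev1 "k8s.io/api/core/v1"
    	"k8s.io/apimachinery/pkg/util/intstr"
    )

    func main() {
    	// Probe consistent with the prober entries above: the kubelet issues
    	// GET http://127.0.0.1:8798/health and restarts the container when the
    	// probe fails often enough.
    	probe := &corev1.Probe{
    		ProbeHandler: corev1.ProbeHandler{
    			HTTPGet: &corev1.HTTPGetAction{
    				Host: "127.0.0.1",
    				Path: "/health",
    				Port: intstr.FromInt(8798),
    			},
    		},
    		PeriodSeconds:    30, // matches the ~30 s spacing of the failures
    		FailureThreshold: 3,  // assumption; not recoverable from the log
    	}
    	fmt.Printf("%+v\n", probe)
    }

Note also that the kill honored gracePeriod=600 but the container exited immediately (exitCode=0 within the same second), so the replacement container was running about a second later.
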
Need to start a new one" pod="openshift-logging/logging-loki-distributor-5d5548c9f5-r6x4b" Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.377966 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-config" Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.378977 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-ca-bundle" Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.379071 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-distributor-grpc" Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.380068 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-dockercfg-7gshb" Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.380145 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-distributor-http" Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.398465 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-distributor-5d5548c9f5-r6x4b"] Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.539696 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/822bc43f-dfed-4440-be35-1bf58f50456b-logging-loki-ca-bundle\") pod \"logging-loki-distributor-5d5548c9f5-r6x4b\" (UID: \"822bc43f-dfed-4440-be35-1bf58f50456b\") " pod="openshift-logging/logging-loki-distributor-5d5548c9f5-r6x4b" Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.540048 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/822bc43f-dfed-4440-be35-1bf58f50456b-config\") pod \"logging-loki-distributor-5d5548c9f5-r6x4b\" (UID: \"822bc43f-dfed-4440-be35-1bf58f50456b\") " pod="openshift-logging/logging-loki-distributor-5d5548c9f5-r6x4b" Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.540070 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/822bc43f-dfed-4440-be35-1bf58f50456b-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-5d5548c9f5-r6x4b\" (UID: \"822bc43f-dfed-4440-be35-1bf58f50456b\") " pod="openshift-logging/logging-loki-distributor-5d5548c9f5-r6x4b" Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.540096 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/822bc43f-dfed-4440-be35-1bf58f50456b-logging-loki-distributor-http\") pod \"logging-loki-distributor-5d5548c9f5-r6x4b\" (UID: \"822bc43f-dfed-4440-be35-1bf58f50456b\") " pod="openshift-logging/logging-loki-distributor-5d5548c9f5-r6x4b" Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.540135 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fb74g\" (UniqueName: \"kubernetes.io/projected/822bc43f-dfed-4440-be35-1bf58f50456b-kube-api-access-fb74g\") pod \"logging-loki-distributor-5d5548c9f5-r6x4b\" (UID: \"822bc43f-dfed-4440-be35-1bf58f50456b\") " pod="openshift-logging/logging-loki-distributor-5d5548c9f5-r6x4b" Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.585537 4722 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-logging/logging-loki-querier-76bf7b6d45-fv4dh"] Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.586326 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-querier-76bf7b6d45-fv4dh" Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.593376 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-querier-http" Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.593618 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-querier-grpc" Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.593761 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-s3" Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.613148 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-querier-76bf7b6d45-fv4dh"] Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.642133 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/822bc43f-dfed-4440-be35-1bf58f50456b-logging-loki-ca-bundle\") pod \"logging-loki-distributor-5d5548c9f5-r6x4b\" (UID: \"822bc43f-dfed-4440-be35-1bf58f50456b\") " pod="openshift-logging/logging-loki-distributor-5d5548c9f5-r6x4b" Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.642174 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/822bc43f-dfed-4440-be35-1bf58f50456b-config\") pod \"logging-loki-distributor-5d5548c9f5-r6x4b\" (UID: \"822bc43f-dfed-4440-be35-1bf58f50456b\") " pod="openshift-logging/logging-loki-distributor-5d5548c9f5-r6x4b" Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.642195 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/822bc43f-dfed-4440-be35-1bf58f50456b-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-5d5548c9f5-r6x4b\" (UID: \"822bc43f-dfed-4440-be35-1bf58f50456b\") " pod="openshift-logging/logging-loki-distributor-5d5548c9f5-r6x4b" Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.642234 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/822bc43f-dfed-4440-be35-1bf58f50456b-logging-loki-distributor-http\") pod \"logging-loki-distributor-5d5548c9f5-r6x4b\" (UID: \"822bc43f-dfed-4440-be35-1bf58f50456b\") " pod="openshift-logging/logging-loki-distributor-5d5548c9f5-r6x4b" Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.642273 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fb74g\" (UniqueName: \"kubernetes.io/projected/822bc43f-dfed-4440-be35-1bf58f50456b-kube-api-access-fb74g\") pod \"logging-loki-distributor-5d5548c9f5-r6x4b\" (UID: \"822bc43f-dfed-4440-be35-1bf58f50456b\") " pod="openshift-logging/logging-loki-distributor-5d5548c9f5-r6x4b" Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.643432 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/822bc43f-dfed-4440-be35-1bf58f50456b-logging-loki-ca-bundle\") pod \"logging-loki-distributor-5d5548c9f5-r6x4b\" (UID: \"822bc43f-dfed-4440-be35-1bf58f50456b\") " 
pod="openshift-logging/logging-loki-distributor-5d5548c9f5-r6x4b" Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.644037 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/822bc43f-dfed-4440-be35-1bf58f50456b-config\") pod \"logging-loki-distributor-5d5548c9f5-r6x4b\" (UID: \"822bc43f-dfed-4440-be35-1bf58f50456b\") " pod="openshift-logging/logging-loki-distributor-5d5548c9f5-r6x4b" Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.649590 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/822bc43f-dfed-4440-be35-1bf58f50456b-logging-loki-distributor-http\") pod \"logging-loki-distributor-5d5548c9f5-r6x4b\" (UID: \"822bc43f-dfed-4440-be35-1bf58f50456b\") " pod="openshift-logging/logging-loki-distributor-5d5548c9f5-r6x4b" Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.650748 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/822bc43f-dfed-4440-be35-1bf58f50456b-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-5d5548c9f5-r6x4b\" (UID: \"822bc43f-dfed-4440-be35-1bf58f50456b\") " pod="openshift-logging/logging-loki-distributor-5d5548c9f5-r6x4b" Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.662341 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fb74g\" (UniqueName: \"kubernetes.io/projected/822bc43f-dfed-4440-be35-1bf58f50456b-kube-api-access-fb74g\") pod \"logging-loki-distributor-5d5548c9f5-r6x4b\" (UID: \"822bc43f-dfed-4440-be35-1bf58f50456b\") " pod="openshift-logging/logging-loki-distributor-5d5548c9f5-r6x4b" Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.697070 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-distributor-5d5548c9f5-r6x4b" Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.714225 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-query-frontend-6d6859c548-5b72h"] Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.717915 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-query-frontend-6d6859c548-5b72h" Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.725858 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-query-frontend-http" Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.726139 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-query-frontend-grpc" Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.734613 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-query-frontend-6d6859c548-5b72h"] Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.743907 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba829d53-02a8-4003-a5ee-b9b36d8404e3-config\") pod \"logging-loki-querier-76bf7b6d45-fv4dh\" (UID: \"ba829d53-02a8-4003-a5ee-b9b36d8404e3\") " pod="openshift-logging/logging-loki-querier-76bf7b6d45-fv4dh" Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.744022 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mstz9\" (UniqueName: \"kubernetes.io/projected/ba829d53-02a8-4003-a5ee-b9b36d8404e3-kube-api-access-mstz9\") pod \"logging-loki-querier-76bf7b6d45-fv4dh\" (UID: \"ba829d53-02a8-4003-a5ee-b9b36d8404e3\") " pod="openshift-logging/logging-loki-querier-76bf7b6d45-fv4dh" Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.744057 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/ba829d53-02a8-4003-a5ee-b9b36d8404e3-logging-loki-querier-http\") pod \"logging-loki-querier-76bf7b6d45-fv4dh\" (UID: \"ba829d53-02a8-4003-a5ee-b9b36d8404e3\") " pod="openshift-logging/logging-loki-querier-76bf7b6d45-fv4dh" Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.744081 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/ba829d53-02a8-4003-a5ee-b9b36d8404e3-logging-loki-querier-grpc\") pod \"logging-loki-querier-76bf7b6d45-fv4dh\" (UID: \"ba829d53-02a8-4003-a5ee-b9b36d8404e3\") " pod="openshift-logging/logging-loki-querier-76bf7b6d45-fv4dh" Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.744104 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/ba829d53-02a8-4003-a5ee-b9b36d8404e3-logging-loki-s3\") pod \"logging-loki-querier-76bf7b6d45-fv4dh\" (UID: \"ba829d53-02a8-4003-a5ee-b9b36d8404e3\") " pod="openshift-logging/logging-loki-querier-76bf7b6d45-fv4dh" Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.744126 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba829d53-02a8-4003-a5ee-b9b36d8404e3-logging-loki-ca-bundle\") pod \"logging-loki-querier-76bf7b6d45-fv4dh\" (UID: \"ba829d53-02a8-4003-a5ee-b9b36d8404e3\") " pod="openshift-logging/logging-loki-querier-76bf7b6d45-fv4dh" Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.836294 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-gateway-6c5ff86c56-n5jdr"] Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.838091 4722 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-gateway-6c5ff86c56-n5jdr"
Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.841706 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-gateway"
Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.841899 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-gateway-ca-bundle"
Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.842038 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway"
Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.842225 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-http"
Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.842514 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-client-http"
Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.851020 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gs2w9\" (UniqueName: \"kubernetes.io/projected/5ccc948e-2185-44fd-90c4-3ae3228f6224-kube-api-access-gs2w9\") pod \"logging-loki-query-frontend-6d6859c548-5b72h\" (UID: \"5ccc948e-2185-44fd-90c4-3ae3228f6224\") " pod="openshift-logging/logging-loki-query-frontend-6d6859c548-5b72h"
Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.851070 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/5ccc948e-2185-44fd-90c4-3ae3228f6224-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-6d6859c548-5b72h\" (UID: \"5ccc948e-2185-44fd-90c4-3ae3228f6224\") " pod="openshift-logging/logging-loki-query-frontend-6d6859c548-5b72h"
Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.851085 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-gateway-6c5ff86c56-dvps5"]
Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.851113 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba829d53-02a8-4003-a5ee-b9b36d8404e3-config\") pod \"logging-loki-querier-76bf7b6d45-fv4dh\" (UID: \"ba829d53-02a8-4003-a5ee-b9b36d8404e3\") " pod="openshift-logging/logging-loki-querier-76bf7b6d45-fv4dh"
Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.851143 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ccc948e-2185-44fd-90c4-3ae3228f6224-config\") pod \"logging-loki-query-frontend-6d6859c548-5b72h\" (UID: \"5ccc948e-2185-44fd-90c4-3ae3228f6224\") " pod="openshift-logging/logging-loki-query-frontend-6d6859c548-5b72h"
Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.851167 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/5ccc948e-2185-44fd-90c4-3ae3228f6224-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-6d6859c548-5b72h\" (UID: \"5ccc948e-2185-44fd-90c4-3ae3228f6224\") " pod="openshift-logging/logging-loki-query-frontend-6d6859c548-5b72h"
Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.851190 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mstz9\" (UniqueName: \"kubernetes.io/projected/ba829d53-02a8-4003-a5ee-b9b36d8404e3-kube-api-access-mstz9\") pod \"logging-loki-querier-76bf7b6d45-fv4dh\" (UID: \"ba829d53-02a8-4003-a5ee-b9b36d8404e3\") " pod="openshift-logging/logging-loki-querier-76bf7b6d45-fv4dh"
Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.851234 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/ba829d53-02a8-4003-a5ee-b9b36d8404e3-logging-loki-querier-http\") pod \"logging-loki-querier-76bf7b6d45-fv4dh\" (UID: \"ba829d53-02a8-4003-a5ee-b9b36d8404e3\") " pod="openshift-logging/logging-loki-querier-76bf7b6d45-fv4dh"
Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.851254 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5ccc948e-2185-44fd-90c4-3ae3228f6224-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-6d6859c548-5b72h\" (UID: \"5ccc948e-2185-44fd-90c4-3ae3228f6224\") " pod="openshift-logging/logging-loki-query-frontend-6d6859c548-5b72h"
Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.851279 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/ba829d53-02a8-4003-a5ee-b9b36d8404e3-logging-loki-querier-grpc\") pod \"logging-loki-querier-76bf7b6d45-fv4dh\" (UID: \"ba829d53-02a8-4003-a5ee-b9b36d8404e3\") " pod="openshift-logging/logging-loki-querier-76bf7b6d45-fv4dh"
Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.851298 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/ba829d53-02a8-4003-a5ee-b9b36d8404e3-logging-loki-s3\") pod \"logging-loki-querier-76bf7b6d45-fv4dh\" (UID: \"ba829d53-02a8-4003-a5ee-b9b36d8404e3\") " pod="openshift-logging/logging-loki-querier-76bf7b6d45-fv4dh"
Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.851318 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba829d53-02a8-4003-a5ee-b9b36d8404e3-logging-loki-ca-bundle\") pod \"logging-loki-querier-76bf7b6d45-fv4dh\" (UID: \"ba829d53-02a8-4003-a5ee-b9b36d8404e3\") " pod="openshift-logging/logging-loki-querier-76bf7b6d45-fv4dh"
Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.852018 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba829d53-02a8-4003-a5ee-b9b36d8404e3-logging-loki-ca-bundle\") pod \"logging-loki-querier-76bf7b6d45-fv4dh\" (UID: \"ba829d53-02a8-4003-a5ee-b9b36d8404e3\") " pod="openshift-logging/logging-loki-querier-76bf7b6d45-fv4dh"
Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.852496 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba829d53-02a8-4003-a5ee-b9b36d8404e3-config\") pod \"logging-loki-querier-76bf7b6d45-fv4dh\" (UID: \"ba829d53-02a8-4003-a5ee-b9b36d8404e3\") " pod="openshift-logging/logging-loki-querier-76bf7b6d45-fv4dh"
Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.856989 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/ba829d53-02a8-4003-a5ee-b9b36d8404e3-logging-loki-s3\") pod \"logging-loki-querier-76bf7b6d45-fv4dh\" (UID: \"ba829d53-02a8-4003-a5ee-b9b36d8404e3\") " pod="openshift-logging/logging-loki-querier-76bf7b6d45-fv4dh"
Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.860539 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/ba829d53-02a8-4003-a5ee-b9b36d8404e3-logging-loki-querier-http\") pod \"logging-loki-querier-76bf7b6d45-fv4dh\" (UID: \"ba829d53-02a8-4003-a5ee-b9b36d8404e3\") " pod="openshift-logging/logging-loki-querier-76bf7b6d45-fv4dh"
Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.861279 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-gateway-6c5ff86c56-dvps5"
Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.864372 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-dockercfg-vt9w9"
Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.867250 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-6c5ff86c56-n5jdr"]
Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.877002 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/ba829d53-02a8-4003-a5ee-b9b36d8404e3-logging-loki-querier-grpc\") pod \"logging-loki-querier-76bf7b6d45-fv4dh\" (UID: \"ba829d53-02a8-4003-a5ee-b9b36d8404e3\") " pod="openshift-logging/logging-loki-querier-76bf7b6d45-fv4dh"
Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.888256 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-6c5ff86c56-dvps5"]
Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.896959 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mstz9\" (UniqueName: \"kubernetes.io/projected/ba829d53-02a8-4003-a5ee-b9b36d8404e3-kube-api-access-mstz9\") pod \"logging-loki-querier-76bf7b6d45-fv4dh\" (UID: \"ba829d53-02a8-4003-a5ee-b9b36d8404e3\") " pod="openshift-logging/logging-loki-querier-76bf7b6d45-fv4dh"
Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.903600 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-querier-76bf7b6d45-fv4dh" Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.964233 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/8becd072-3095-4717-a83d-e56cf0d0f816-tenants\") pod \"logging-loki-gateway-6c5ff86c56-n5jdr\" (UID: \"8becd072-3095-4717-a83d-e56cf0d0f816\") " pod="openshift-logging/logging-loki-gateway-6c5ff86c56-n5jdr" Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.964286 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ccc948e-2185-44fd-90c4-3ae3228f6224-config\") pod \"logging-loki-query-frontend-6d6859c548-5b72h\" (UID: \"5ccc948e-2185-44fd-90c4-3ae3228f6224\") " pod="openshift-logging/logging-loki-query-frontend-6d6859c548-5b72h" Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.964311 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/5ccc948e-2185-44fd-90c4-3ae3228f6224-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-6d6859c548-5b72h\" (UID: \"5ccc948e-2185-44fd-90c4-3ae3228f6224\") " pod="openshift-logging/logging-loki-query-frontend-6d6859c548-5b72h" Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.964329 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/e265fe14-7154-4fbb-a7c3-33557166f71d-rbac\") pod \"logging-loki-gateway-6c5ff86c56-dvps5\" (UID: \"e265fe14-7154-4fbb-a7c3-33557166f71d\") " pod="openshift-logging/logging-loki-gateway-6c5ff86c56-dvps5" Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.964344 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47dtg\" (UniqueName: \"kubernetes.io/projected/8becd072-3095-4717-a83d-e56cf0d0f816-kube-api-access-47dtg\") pod \"logging-loki-gateway-6c5ff86c56-n5jdr\" (UID: \"8becd072-3095-4717-a83d-e56cf0d0f816\") " pod="openshift-logging/logging-loki-gateway-6c5ff86c56-n5jdr" Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.964380 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/8becd072-3095-4717-a83d-e56cf0d0f816-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-6c5ff86c56-n5jdr\" (UID: \"8becd072-3095-4717-a83d-e56cf0d0f816\") " pod="openshift-logging/logging-loki-gateway-6c5ff86c56-n5jdr" Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.964403 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e265fe14-7154-4fbb-a7c3-33557166f71d-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-6c5ff86c56-dvps5\" (UID: \"e265fe14-7154-4fbb-a7c3-33557166f71d\") " pod="openshift-logging/logging-loki-gateway-6c5ff86c56-dvps5" Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.964419 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/8becd072-3095-4717-a83d-e56cf0d0f816-rbac\") pod \"logging-loki-gateway-6c5ff86c56-n5jdr\" (UID: \"8becd072-3095-4717-a83d-e56cf0d0f816\") " 
pod="openshift-logging/logging-loki-gateway-6c5ff86c56-n5jdr" Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.964437 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5ccc948e-2185-44fd-90c4-3ae3228f6224-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-6d6859c548-5b72h\" (UID: \"5ccc948e-2185-44fd-90c4-3ae3228f6224\") " pod="openshift-logging/logging-loki-query-frontend-6d6859c548-5b72h" Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.964464 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/e265fe14-7154-4fbb-a7c3-33557166f71d-tenants\") pod \"logging-loki-gateway-6c5ff86c56-dvps5\" (UID: \"e265fe14-7154-4fbb-a7c3-33557166f71d\") " pod="openshift-logging/logging-loki-gateway-6c5ff86c56-dvps5" Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.964480 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wf4p\" (UniqueName: \"kubernetes.io/projected/e265fe14-7154-4fbb-a7c3-33557166f71d-kube-api-access-4wf4p\") pod \"logging-loki-gateway-6c5ff86c56-dvps5\" (UID: \"e265fe14-7154-4fbb-a7c3-33557166f71d\") " pod="openshift-logging/logging-loki-gateway-6c5ff86c56-dvps5" Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.964508 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8becd072-3095-4717-a83d-e56cf0d0f816-logging-loki-ca-bundle\") pod \"logging-loki-gateway-6c5ff86c56-n5jdr\" (UID: \"8becd072-3095-4717-a83d-e56cf0d0f816\") " pod="openshift-logging/logging-loki-gateway-6c5ff86c56-n5jdr" Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.964522 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gs2w9\" (UniqueName: \"kubernetes.io/projected/5ccc948e-2185-44fd-90c4-3ae3228f6224-kube-api-access-gs2w9\") pod \"logging-loki-query-frontend-6d6859c548-5b72h\" (UID: \"5ccc948e-2185-44fd-90c4-3ae3228f6224\") " pod="openshift-logging/logging-loki-query-frontend-6d6859c548-5b72h" Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.964539 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e265fe14-7154-4fbb-a7c3-33557166f71d-logging-loki-ca-bundle\") pod \"logging-loki-gateway-6c5ff86c56-dvps5\" (UID: \"e265fe14-7154-4fbb-a7c3-33557166f71d\") " pod="openshift-logging/logging-loki-gateway-6c5ff86c56-dvps5" Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.964559 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/8becd072-3095-4717-a83d-e56cf0d0f816-tls-secret\") pod \"logging-loki-gateway-6c5ff86c56-n5jdr\" (UID: \"8becd072-3095-4717-a83d-e56cf0d0f816\") " pod="openshift-logging/logging-loki-gateway-6c5ff86c56-n5jdr" Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.964574 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/8becd072-3095-4717-a83d-e56cf0d0f816-lokistack-gateway\") pod \"logging-loki-gateway-6c5ff86c56-n5jdr\" (UID: \"8becd072-3095-4717-a83d-e56cf0d0f816\") " 
pod="openshift-logging/logging-loki-gateway-6c5ff86c56-n5jdr" Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.964589 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/e265fe14-7154-4fbb-a7c3-33557166f71d-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-6c5ff86c56-dvps5\" (UID: \"e265fe14-7154-4fbb-a7c3-33557166f71d\") " pod="openshift-logging/logging-loki-gateway-6c5ff86c56-dvps5" Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.964604 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/e265fe14-7154-4fbb-a7c3-33557166f71d-lokistack-gateway\") pod \"logging-loki-gateway-6c5ff86c56-dvps5\" (UID: \"e265fe14-7154-4fbb-a7c3-33557166f71d\") " pod="openshift-logging/logging-loki-gateway-6c5ff86c56-dvps5" Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.964624 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/5ccc948e-2185-44fd-90c4-3ae3228f6224-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-6d6859c548-5b72h\" (UID: \"5ccc948e-2185-44fd-90c4-3ae3228f6224\") " pod="openshift-logging/logging-loki-query-frontend-6d6859c548-5b72h" Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.964641 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8becd072-3095-4717-a83d-e56cf0d0f816-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-6c5ff86c56-n5jdr\" (UID: \"8becd072-3095-4717-a83d-e56cf0d0f816\") " pod="openshift-logging/logging-loki-gateway-6c5ff86c56-n5jdr" Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.964657 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/e265fe14-7154-4fbb-a7c3-33557166f71d-tls-secret\") pod \"logging-loki-gateway-6c5ff86c56-dvps5\" (UID: \"e265fe14-7154-4fbb-a7c3-33557166f71d\") " pod="openshift-logging/logging-loki-gateway-6c5ff86c56-dvps5" Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.977378 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5ccc948e-2185-44fd-90c4-3ae3228f6224-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-6d6859c548-5b72h\" (UID: \"5ccc948e-2185-44fd-90c4-3ae3228f6224\") " pod="openshift-logging/logging-loki-query-frontend-6d6859c548-5b72h" Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.982192 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/5ccc948e-2185-44fd-90c4-3ae3228f6224-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-6d6859c548-5b72h\" (UID: \"5ccc948e-2185-44fd-90c4-3ae3228f6224\") " pod="openshift-logging/logging-loki-query-frontend-6d6859c548-5b72h" Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.983412 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gs2w9\" (UniqueName: \"kubernetes.io/projected/5ccc948e-2185-44fd-90c4-3ae3228f6224-kube-api-access-gs2w9\") pod \"logging-loki-query-frontend-6d6859c548-5b72h\" (UID: 
\"5ccc948e-2185-44fd-90c4-3ae3228f6224\") " pod="openshift-logging/logging-loki-query-frontend-6d6859c548-5b72h" Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.983769 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/5ccc948e-2185-44fd-90c4-3ae3228f6224-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-6d6859c548-5b72h\" (UID: \"5ccc948e-2185-44fd-90c4-3ae3228f6224\") " pod="openshift-logging/logging-loki-query-frontend-6d6859c548-5b72h" Mar 09 14:17:00 crc kubenswrapper[4722]: I0309 14:17:00.984762 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ccc948e-2185-44fd-90c4-3ae3228f6224-config\") pod \"logging-loki-query-frontend-6d6859c548-5b72h\" (UID: \"5ccc948e-2185-44fd-90c4-3ae3228f6224\") " pod="openshift-logging/logging-loki-query-frontend-6d6859c548-5b72h" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.066501 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e265fe14-7154-4fbb-a7c3-33557166f71d-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-6c5ff86c56-dvps5\" (UID: \"e265fe14-7154-4fbb-a7c3-33557166f71d\") " pod="openshift-logging/logging-loki-gateway-6c5ff86c56-dvps5" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.066734 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/8becd072-3095-4717-a83d-e56cf0d0f816-rbac\") pod \"logging-loki-gateway-6c5ff86c56-n5jdr\" (UID: \"8becd072-3095-4717-a83d-e56cf0d0f816\") " pod="openshift-logging/logging-loki-gateway-6c5ff86c56-n5jdr" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.066766 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/e265fe14-7154-4fbb-a7c3-33557166f71d-tenants\") pod \"logging-loki-gateway-6c5ff86c56-dvps5\" (UID: \"e265fe14-7154-4fbb-a7c3-33557166f71d\") " pod="openshift-logging/logging-loki-gateway-6c5ff86c56-dvps5" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.066782 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wf4p\" (UniqueName: \"kubernetes.io/projected/e265fe14-7154-4fbb-a7c3-33557166f71d-kube-api-access-4wf4p\") pod \"logging-loki-gateway-6c5ff86c56-dvps5\" (UID: \"e265fe14-7154-4fbb-a7c3-33557166f71d\") " pod="openshift-logging/logging-loki-gateway-6c5ff86c56-dvps5" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.066810 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8becd072-3095-4717-a83d-e56cf0d0f816-logging-loki-ca-bundle\") pod \"logging-loki-gateway-6c5ff86c56-n5jdr\" (UID: \"8becd072-3095-4717-a83d-e56cf0d0f816\") " pod="openshift-logging/logging-loki-gateway-6c5ff86c56-n5jdr" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.066831 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e265fe14-7154-4fbb-a7c3-33557166f71d-logging-loki-ca-bundle\") pod \"logging-loki-gateway-6c5ff86c56-dvps5\" (UID: \"e265fe14-7154-4fbb-a7c3-33557166f71d\") " pod="openshift-logging/logging-loki-gateway-6c5ff86c56-dvps5" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 
14:17:01.066856 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/8becd072-3095-4717-a83d-e56cf0d0f816-tls-secret\") pod \"logging-loki-gateway-6c5ff86c56-n5jdr\" (UID: \"8becd072-3095-4717-a83d-e56cf0d0f816\") " pod="openshift-logging/logging-loki-gateway-6c5ff86c56-n5jdr" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.066873 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/8becd072-3095-4717-a83d-e56cf0d0f816-lokistack-gateway\") pod \"logging-loki-gateway-6c5ff86c56-n5jdr\" (UID: \"8becd072-3095-4717-a83d-e56cf0d0f816\") " pod="openshift-logging/logging-loki-gateway-6c5ff86c56-n5jdr" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.066891 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/e265fe14-7154-4fbb-a7c3-33557166f71d-lokistack-gateway\") pod \"logging-loki-gateway-6c5ff86c56-dvps5\" (UID: \"e265fe14-7154-4fbb-a7c3-33557166f71d\") " pod="openshift-logging/logging-loki-gateway-6c5ff86c56-dvps5" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.066908 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/e265fe14-7154-4fbb-a7c3-33557166f71d-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-6c5ff86c56-dvps5\" (UID: \"e265fe14-7154-4fbb-a7c3-33557166f71d\") " pod="openshift-logging/logging-loki-gateway-6c5ff86c56-dvps5" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.066931 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8becd072-3095-4717-a83d-e56cf0d0f816-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-6c5ff86c56-n5jdr\" (UID: \"8becd072-3095-4717-a83d-e56cf0d0f816\") " pod="openshift-logging/logging-loki-gateway-6c5ff86c56-n5jdr" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.066947 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/e265fe14-7154-4fbb-a7c3-33557166f71d-tls-secret\") pod \"logging-loki-gateway-6c5ff86c56-dvps5\" (UID: \"e265fe14-7154-4fbb-a7c3-33557166f71d\") " pod="openshift-logging/logging-loki-gateway-6c5ff86c56-dvps5" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.066985 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/8becd072-3095-4717-a83d-e56cf0d0f816-tenants\") pod \"logging-loki-gateway-6c5ff86c56-n5jdr\" (UID: \"8becd072-3095-4717-a83d-e56cf0d0f816\") " pod="openshift-logging/logging-loki-gateway-6c5ff86c56-n5jdr" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.067010 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/e265fe14-7154-4fbb-a7c3-33557166f71d-rbac\") pod \"logging-loki-gateway-6c5ff86c56-dvps5\" (UID: \"e265fe14-7154-4fbb-a7c3-33557166f71d\") " pod="openshift-logging/logging-loki-gateway-6c5ff86c56-dvps5" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.067025 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47dtg\" (UniqueName: 
\"kubernetes.io/projected/8becd072-3095-4717-a83d-e56cf0d0f816-kube-api-access-47dtg\") pod \"logging-loki-gateway-6c5ff86c56-n5jdr\" (UID: \"8becd072-3095-4717-a83d-e56cf0d0f816\") " pod="openshift-logging/logging-loki-gateway-6c5ff86c56-n5jdr" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.067054 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/8becd072-3095-4717-a83d-e56cf0d0f816-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-6c5ff86c56-n5jdr\" (UID: \"8becd072-3095-4717-a83d-e56cf0d0f816\") " pod="openshift-logging/logging-loki-gateway-6c5ff86c56-n5jdr" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.067600 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e265fe14-7154-4fbb-a7c3-33557166f71d-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-6c5ff86c56-dvps5\" (UID: \"e265fe14-7154-4fbb-a7c3-33557166f71d\") " pod="openshift-logging/logging-loki-gateway-6c5ff86c56-dvps5" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.068731 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8becd072-3095-4717-a83d-e56cf0d0f816-logging-loki-ca-bundle\") pod \"logging-loki-gateway-6c5ff86c56-n5jdr\" (UID: \"8becd072-3095-4717-a83d-e56cf0d0f816\") " pod="openshift-logging/logging-loki-gateway-6c5ff86c56-n5jdr" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.068753 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/8becd072-3095-4717-a83d-e56cf0d0f816-rbac\") pod \"logging-loki-gateway-6c5ff86c56-n5jdr\" (UID: \"8becd072-3095-4717-a83d-e56cf0d0f816\") " pod="openshift-logging/logging-loki-gateway-6c5ff86c56-n5jdr" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.068754 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8becd072-3095-4717-a83d-e56cf0d0f816-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-6c5ff86c56-n5jdr\" (UID: \"8becd072-3095-4717-a83d-e56cf0d0f816\") " pod="openshift-logging/logging-loki-gateway-6c5ff86c56-n5jdr" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.068885 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e265fe14-7154-4fbb-a7c3-33557166f71d-logging-loki-ca-bundle\") pod \"logging-loki-gateway-6c5ff86c56-dvps5\" (UID: \"e265fe14-7154-4fbb-a7c3-33557166f71d\") " pod="openshift-logging/logging-loki-gateway-6c5ff86c56-dvps5" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.068915 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/e265fe14-7154-4fbb-a7c3-33557166f71d-lokistack-gateway\") pod \"logging-loki-gateway-6c5ff86c56-dvps5\" (UID: \"e265fe14-7154-4fbb-a7c3-33557166f71d\") " pod="openshift-logging/logging-loki-gateway-6c5ff86c56-dvps5" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.069607 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/8becd072-3095-4717-a83d-e56cf0d0f816-lokistack-gateway\") pod \"logging-loki-gateway-6c5ff86c56-n5jdr\" (UID: 
\"8becd072-3095-4717-a83d-e56cf0d0f816\") " pod="openshift-logging/logging-loki-gateway-6c5ff86c56-n5jdr" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.071765 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/e265fe14-7154-4fbb-a7c3-33557166f71d-tls-secret\") pod \"logging-loki-gateway-6c5ff86c56-dvps5\" (UID: \"e265fe14-7154-4fbb-a7c3-33557166f71d\") " pod="openshift-logging/logging-loki-gateway-6c5ff86c56-dvps5" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.073017 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/8becd072-3095-4717-a83d-e56cf0d0f816-tls-secret\") pod \"logging-loki-gateway-6c5ff86c56-n5jdr\" (UID: \"8becd072-3095-4717-a83d-e56cf0d0f816\") " pod="openshift-logging/logging-loki-gateway-6c5ff86c56-n5jdr" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.073100 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/e265fe14-7154-4fbb-a7c3-33557166f71d-rbac\") pod \"logging-loki-gateway-6c5ff86c56-dvps5\" (UID: \"e265fe14-7154-4fbb-a7c3-33557166f71d\") " pod="openshift-logging/logging-loki-gateway-6c5ff86c56-dvps5" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.073228 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/8becd072-3095-4717-a83d-e56cf0d0f816-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-6c5ff86c56-n5jdr\" (UID: \"8becd072-3095-4717-a83d-e56cf0d0f816\") " pod="openshift-logging/logging-loki-gateway-6c5ff86c56-n5jdr" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.073304 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/e265fe14-7154-4fbb-a7c3-33557166f71d-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-6c5ff86c56-dvps5\" (UID: \"e265fe14-7154-4fbb-a7c3-33557166f71d\") " pod="openshift-logging/logging-loki-gateway-6c5ff86c56-dvps5" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.074320 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/e265fe14-7154-4fbb-a7c3-33557166f71d-tenants\") pod \"logging-loki-gateway-6c5ff86c56-dvps5\" (UID: \"e265fe14-7154-4fbb-a7c3-33557166f71d\") " pod="openshift-logging/logging-loki-gateway-6c5ff86c56-dvps5" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.086127 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-query-frontend-6d6859c548-5b72h"
Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.086878 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/8becd072-3095-4717-a83d-e56cf0d0f816-tenants\") pod \"logging-loki-gateway-6c5ff86c56-n5jdr\" (UID: \"8becd072-3095-4717-a83d-e56cf0d0f816\") " pod="openshift-logging/logging-loki-gateway-6c5ff86c56-n5jdr"
Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.089282 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47dtg\" (UniqueName: \"kubernetes.io/projected/8becd072-3095-4717-a83d-e56cf0d0f816-kube-api-access-47dtg\") pod \"logging-loki-gateway-6c5ff86c56-n5jdr\" (UID: \"8becd072-3095-4717-a83d-e56cf0d0f816\") " pod="openshift-logging/logging-loki-gateway-6c5ff86c56-n5jdr"
Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.090279 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wf4p\" (UniqueName: \"kubernetes.io/projected/e265fe14-7154-4fbb-a7c3-33557166f71d-kube-api-access-4wf4p\") pod \"logging-loki-gateway-6c5ff86c56-dvps5\" (UID: \"e265fe14-7154-4fbb-a7c3-33557166f71d\") " pod="openshift-logging/logging-loki-gateway-6c5ff86c56-dvps5"
Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.206658 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-gateway-6c5ff86c56-n5jdr"
Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.211488 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-querier-76bf7b6d45-fv4dh"]
Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.221612 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-gateway-6c5ff86c56-dvps5"
Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.244655 4722 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.323039 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-distributor-5d5548c9f5-r6x4b"]
Mar 09 14:17:01 crc kubenswrapper[4722]: W0309 14:17:01.346464 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod822bc43f_dfed_4440_be35_1bf58f50456b.slice/crio-48d370c14045dc395aa4ebbee1de9001bdeea1f74391dab25fc314105a7db1ba WatchSource:0}: Error finding container 48d370c14045dc395aa4ebbee1de9001bdeea1f74391dab25fc314105a7db1ba: Status 404 returned error can't find the container with id 48d370c14045dc395aa4ebbee1de9001bdeea1f74391dab25fc314105a7db1ba
Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.547508 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-ingester-0"]
Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.548393 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-ingester-0"
Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.552350 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-ingester-http"
Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.552669 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-ingester-grpc"
Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.554461 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-ingester-0"]
Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.602110 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-query-frontend-6d6859c548-5b72h"]
Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.684782 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6a23db8b-8a30-47b8-bf39-6f193899fcee-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"6a23db8b-8a30-47b8-bf39-6f193899fcee\") " pod="openshift-logging/logging-loki-ingester-0"
Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.684853 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-51be6952-fbf5-40ed-9e12-249680b3fcd2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-51be6952-fbf5-40ed-9e12-249680b3fcd2\") pod \"logging-loki-ingester-0\" (UID: \"6a23db8b-8a30-47b8-bf39-6f193899fcee\") " pod="openshift-logging/logging-loki-ingester-0"
Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.685070 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-35f7632b-3813-4ebd-b24f-a76781f8a24d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-35f7632b-3813-4ebd-b24f-a76781f8a24d\") pod \"logging-loki-ingester-0\" (UID: \"6a23db8b-8a30-47b8-bf39-6f193899fcee\") " pod="openshift-logging/logging-loki-ingester-0"
Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.685142 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pj4vm\" (UniqueName: \"kubernetes.io/projected/6a23db8b-8a30-47b8-bf39-6f193899fcee-kube-api-access-pj4vm\") pod \"logging-loki-ingester-0\" (UID: \"6a23db8b-8a30-47b8-bf39-6f193899fcee\") " pod="openshift-logging/logging-loki-ingester-0"
Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.685183 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/6a23db8b-8a30-47b8-bf39-6f193899fcee-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"6a23db8b-8a30-47b8-bf39-6f193899fcee\") " pod="openshift-logging/logging-loki-ingester-0"
Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.685243 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/6a23db8b-8a30-47b8-bf39-6f193899fcee-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"6a23db8b-8a30-47b8-bf39-6f193899fcee\") " pod="openshift-logging/logging-loki-ingester-0"
Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.685284 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/6a23db8b-8a30-47b8-bf39-6f193899fcee-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"6a23db8b-8a30-47b8-bf39-6f193899fcee\") " pod="openshift-logging/logging-loki-ingester-0"
Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.685334 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a23db8b-8a30-47b8-bf39-6f193899fcee-config\") pod \"logging-loki-ingester-0\" (UID: \"6a23db8b-8a30-47b8-bf39-6f193899fcee\") " pod="openshift-logging/logging-loki-ingester-0"
Mar 09 14:17:01 crc kubenswrapper[4722]: W0309 14:17:01.695411 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8becd072_3095_4717_a83d_e56cf0d0f816.slice/crio-4f479cc6091364bfa758488825d2d32507b82c4aab3970b03ea3f9c96fc4fcdb WatchSource:0}: Error finding container 4f479cc6091364bfa758488825d2d32507b82c4aab3970b03ea3f9c96fc4fcdb: Status 404 returned error can't find the container with id 4f479cc6091364bfa758488825d2d32507b82c4aab3970b03ea3f9c96fc4fcdb
Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.696306 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-6c5ff86c56-n5jdr"]
Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.709341 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-compactor-0"]
Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.710218 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-compactor-0"
Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.712793 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-compactor-http"
Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.713004 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-compactor-grpc"
Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.723120 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-compactor-0"]
Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.756646 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-6c5ff86c56-dvps5"]
Mar 09 14:17:01 crc kubenswrapper[4722]: W0309 14:17:01.759004 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode265fe14_7154_4fbb_a7c3_33557166f71d.slice/crio-a57edd71e566cf765bf70f343258228d0800405e17d375449fb51a3bc742f6e4 WatchSource:0}: Error finding container a57edd71e566cf765bf70f343258228d0800405e17d375449fb51a3bc742f6e4: Status 404 returned error can't find the container with id a57edd71e566cf765bf70f343258228d0800405e17d375449fb51a3bc742f6e4
Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.778470 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"]
Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.780050 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-index-gateway-0" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.782066 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-index-gateway-grpc" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.782233 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-index-gateway-http" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.788670 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pj4vm\" (UniqueName: \"kubernetes.io/projected/6a23db8b-8a30-47b8-bf39-6f193899fcee-kube-api-access-pj4vm\") pod \"logging-loki-ingester-0\" (UID: \"6a23db8b-8a30-47b8-bf39-6f193899fcee\") " pod="openshift-logging/logging-loki-ingester-0" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.788738 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/6a23db8b-8a30-47b8-bf39-6f193899fcee-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"6a23db8b-8a30-47b8-bf39-6f193899fcee\") " pod="openshift-logging/logging-loki-ingester-0" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.788775 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/6a23db8b-8a30-47b8-bf39-6f193899fcee-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"6a23db8b-8a30-47b8-bf39-6f193899fcee\") " pod="openshift-logging/logging-loki-ingester-0" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.788801 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/6a23db8b-8a30-47b8-bf39-6f193899fcee-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"6a23db8b-8a30-47b8-bf39-6f193899fcee\") " pod="openshift-logging/logging-loki-ingester-0" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.788823 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a23db8b-8a30-47b8-bf39-6f193899fcee-config\") pod \"logging-loki-ingester-0\" (UID: \"6a23db8b-8a30-47b8-bf39-6f193899fcee\") " pod="openshift-logging/logging-loki-ingester-0" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.788947 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6a23db8b-8a30-47b8-bf39-6f193899fcee-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"6a23db8b-8a30-47b8-bf39-6f193899fcee\") " pod="openshift-logging/logging-loki-ingester-0" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.789007 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-51be6952-fbf5-40ed-9e12-249680b3fcd2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-51be6952-fbf5-40ed-9e12-249680b3fcd2\") pod \"logging-loki-ingester-0\" (UID: \"6a23db8b-8a30-47b8-bf39-6f193899fcee\") " pod="openshift-logging/logging-loki-ingester-0" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.789056 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-35f7632b-3813-4ebd-b24f-a76781f8a24d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-35f7632b-3813-4ebd-b24f-a76781f8a24d\") pod 
\"logging-loki-ingester-0\" (UID: \"6a23db8b-8a30-47b8-bf39-6f193899fcee\") " pod="openshift-logging/logging-loki-ingester-0" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.789853 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"] Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.790628 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6a23db8b-8a30-47b8-bf39-6f193899fcee-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"6a23db8b-8a30-47b8-bf39-6f193899fcee\") " pod="openshift-logging/logging-loki-ingester-0" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.791422 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a23db8b-8a30-47b8-bf39-6f193899fcee-config\") pod \"logging-loki-ingester-0\" (UID: \"6a23db8b-8a30-47b8-bf39-6f193899fcee\") " pod="openshift-logging/logging-loki-ingester-0" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.795136 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.795187 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-51be6952-fbf5-40ed-9e12-249680b3fcd2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-51be6952-fbf5-40ed-9e12-249680b3fcd2\") pod \"logging-loki-ingester-0\" (UID: \"6a23db8b-8a30-47b8-bf39-6f193899fcee\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/ba54c34738142487fe9465cd72d391f1b817a70d234201443595ed0e97b3eee4/globalmount\"" pod="openshift-logging/logging-loki-ingester-0" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.795304 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.795367 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-35f7632b-3813-4ebd-b24f-a76781f8a24d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-35f7632b-3813-4ebd-b24f-a76781f8a24d\") pod \"logging-loki-ingester-0\" (UID: \"6a23db8b-8a30-47b8-bf39-6f193899fcee\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/ffc78ffa09dc3deac543e540d4e726874b9d28c7c13f185c7dae0a961219ed98/globalmount\"" pod="openshift-logging/logging-loki-ingester-0" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.795453 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/6a23db8b-8a30-47b8-bf39-6f193899fcee-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"6a23db8b-8a30-47b8-bf39-6f193899fcee\") " pod="openshift-logging/logging-loki-ingester-0" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.796680 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/6a23db8b-8a30-47b8-bf39-6f193899fcee-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"6a23db8b-8a30-47b8-bf39-6f193899fcee\") " pod="openshift-logging/logging-loki-ingester-0" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.797750 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/6a23db8b-8a30-47b8-bf39-6f193899fcee-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"6a23db8b-8a30-47b8-bf39-6f193899fcee\") " pod="openshift-logging/logging-loki-ingester-0" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.807554 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pj4vm\" (UniqueName: \"kubernetes.io/projected/6a23db8b-8a30-47b8-bf39-6f193899fcee-kube-api-access-pj4vm\") pod \"logging-loki-ingester-0\" (UID: \"6a23db8b-8a30-47b8-bf39-6f193899fcee\") " pod="openshift-logging/logging-loki-ingester-0" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.809642 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-6c5ff86c56-dvps5" event={"ID":"e265fe14-7154-4fbb-a7c3-33557166f71d","Type":"ContainerStarted","Data":"a57edd71e566cf765bf70f343258228d0800405e17d375449fb51a3bc742f6e4"} Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.810575 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-distributor-5d5548c9f5-r6x4b" event={"ID":"822bc43f-dfed-4440-be35-1bf58f50456b","Type":"ContainerStarted","Data":"48d370c14045dc395aa4ebbee1de9001bdeea1f74391dab25fc314105a7db1ba"} Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.811581 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-6c5ff86c56-n5jdr" event={"ID":"8becd072-3095-4717-a83d-e56cf0d0f816","Type":"ContainerStarted","Data":"4f479cc6091364bfa758488825d2d32507b82c4aab3970b03ea3f9c96fc4fcdb"} Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.812766 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-query-frontend-6d6859c548-5b72h" event={"ID":"5ccc948e-2185-44fd-90c4-3ae3228f6224","Type":"ContainerStarted","Data":"05052f326c0e1b50724c1df88ce8e581cb51f5e2abb6ef33f49256cf27cff797"} Mar 09 14:17:01 crc kubenswrapper[4722]: 
I0309 14:17:01.813729 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-querier-76bf7b6d45-fv4dh" event={"ID":"ba829d53-02a8-4003-a5ee-b9b36d8404e3","Type":"ContainerStarted","Data":"1957b7324b2d64ab0be6d485edbd979319442c2e39fea6a3df99ec0f91df7653"} Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.834861 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-51be6952-fbf5-40ed-9e12-249680b3fcd2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-51be6952-fbf5-40ed-9e12-249680b3fcd2\") pod \"logging-loki-ingester-0\" (UID: \"6a23db8b-8a30-47b8-bf39-6f193899fcee\") " pod="openshift-logging/logging-loki-ingester-0" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.835960 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-35f7632b-3813-4ebd-b24f-a76781f8a24d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-35f7632b-3813-4ebd-b24f-a76781f8a24d\") pod \"logging-loki-ingester-0\" (UID: \"6a23db8b-8a30-47b8-bf39-6f193899fcee\") " pod="openshift-logging/logging-loki-ingester-0" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.866890 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-ingester-0" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.889988 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2v6h\" (UniqueName: \"kubernetes.io/projected/bce49c11-10b4-4c30-a1a4-16cf32cb42fd-kube-api-access-g2v6h\") pod \"logging-loki-index-gateway-0\" (UID: \"bce49c11-10b4-4c30-a1a4-16cf32cb42fd\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.890031 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a437917e-04a4-4399-ab70-4993b7e8d6df\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a437917e-04a4-4399-ab70-4993b7e8d6df\") pod \"logging-loki-index-gateway-0\" (UID: \"bce49c11-10b4-4c30-a1a4-16cf32cb42fd\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.890055 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/75ed49e3-dc17-45c0-96ec-1db69670395b-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"75ed49e3-dc17-45c0-96ec-1db69670395b\") " pod="openshift-logging/logging-loki-compactor-0" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.890077 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75ed49e3-dc17-45c0-96ec-1db69670395b-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"75ed49e3-dc17-45c0-96ec-1db69670395b\") " pod="openshift-logging/logging-loki-compactor-0" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.890133 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/75ed49e3-dc17-45c0-96ec-1db69670395b-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"75ed49e3-dc17-45c0-96ec-1db69670395b\") " pod="openshift-logging/logging-loki-compactor-0" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.890158 4722 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/75ed49e3-dc17-45c0-96ec-1db69670395b-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"75ed49e3-dc17-45c0-96ec-1db69670395b\") " pod="openshift-logging/logging-loki-compactor-0" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.890179 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bce49c11-10b4-4c30-a1a4-16cf32cb42fd-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"bce49c11-10b4-4c30-a1a4-16cf32cb42fd\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.890223 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/bce49c11-10b4-4c30-a1a4-16cf32cb42fd-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"bce49c11-10b4-4c30-a1a4-16cf32cb42fd\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.890245 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-cbc3c0a6-f0b9-41dd-af1a-b3c519567652\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cbc3c0a6-f0b9-41dd-af1a-b3c519567652\") pod \"logging-loki-compactor-0\" (UID: \"75ed49e3-dc17-45c0-96ec-1db69670395b\") " pod="openshift-logging/logging-loki-compactor-0" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.890267 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/bce49c11-10b4-4c30-a1a4-16cf32cb42fd-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"bce49c11-10b4-4c30-a1a4-16cf32cb42fd\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.890375 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqhl9\" (UniqueName: \"kubernetes.io/projected/75ed49e3-dc17-45c0-96ec-1db69670395b-kube-api-access-jqhl9\") pod \"logging-loki-compactor-0\" (UID: \"75ed49e3-dc17-45c0-96ec-1db69670395b\") " pod="openshift-logging/logging-loki-compactor-0" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.890433 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bce49c11-10b4-4c30-a1a4-16cf32cb42fd-config\") pod \"logging-loki-index-gateway-0\" (UID: \"bce49c11-10b4-4c30-a1a4-16cf32cb42fd\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.890462 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/bce49c11-10b4-4c30-a1a4-16cf32cb42fd-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"bce49c11-10b4-4c30-a1a4-16cf32cb42fd\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.890487 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/75ed49e3-dc17-45c0-96ec-1db69670395b-config\") pod \"logging-loki-compactor-0\" (UID: \"75ed49e3-dc17-45c0-96ec-1db69670395b\") " pod="openshift-logging/logging-loki-compactor-0" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.992530 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/bce49c11-10b4-4c30-a1a4-16cf32cb42fd-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"bce49c11-10b4-4c30-a1a4-16cf32cb42fd\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.992624 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqhl9\" (UniqueName: \"kubernetes.io/projected/75ed49e3-dc17-45c0-96ec-1db69670395b-kube-api-access-jqhl9\") pod \"logging-loki-compactor-0\" (UID: \"75ed49e3-dc17-45c0-96ec-1db69670395b\") " pod="openshift-logging/logging-loki-compactor-0" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.992664 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bce49c11-10b4-4c30-a1a4-16cf32cb42fd-config\") pod \"logging-loki-index-gateway-0\" (UID: \"bce49c11-10b4-4c30-a1a4-16cf32cb42fd\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.992699 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/bce49c11-10b4-4c30-a1a4-16cf32cb42fd-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"bce49c11-10b4-4c30-a1a4-16cf32cb42fd\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.992741 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75ed49e3-dc17-45c0-96ec-1db69670395b-config\") pod \"logging-loki-compactor-0\" (UID: \"75ed49e3-dc17-45c0-96ec-1db69670395b\") " pod="openshift-logging/logging-loki-compactor-0" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.992892 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2v6h\" (UniqueName: \"kubernetes.io/projected/bce49c11-10b4-4c30-a1a4-16cf32cb42fd-kube-api-access-g2v6h\") pod \"logging-loki-index-gateway-0\" (UID: \"bce49c11-10b4-4c30-a1a4-16cf32cb42fd\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.992961 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a437917e-04a4-4399-ab70-4993b7e8d6df\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a437917e-04a4-4399-ab70-4993b7e8d6df\") pod \"logging-loki-index-gateway-0\" (UID: \"bce49c11-10b4-4c30-a1a4-16cf32cb42fd\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.993025 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/75ed49e3-dc17-45c0-96ec-1db69670395b-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"75ed49e3-dc17-45c0-96ec-1db69670395b\") " pod="openshift-logging/logging-loki-compactor-0" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.993073 4722 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75ed49e3-dc17-45c0-96ec-1db69670395b-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"75ed49e3-dc17-45c0-96ec-1db69670395b\") " pod="openshift-logging/logging-loki-compactor-0" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.993138 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/75ed49e3-dc17-45c0-96ec-1db69670395b-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"75ed49e3-dc17-45c0-96ec-1db69670395b\") " pod="openshift-logging/logging-loki-compactor-0" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.993190 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/75ed49e3-dc17-45c0-96ec-1db69670395b-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"75ed49e3-dc17-45c0-96ec-1db69670395b\") " pod="openshift-logging/logging-loki-compactor-0" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.993277 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bce49c11-10b4-4c30-a1a4-16cf32cb42fd-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"bce49c11-10b4-4c30-a1a4-16cf32cb42fd\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.993354 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/bce49c11-10b4-4c30-a1a4-16cf32cb42fd-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"bce49c11-10b4-4c30-a1a4-16cf32cb42fd\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.993413 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-cbc3c0a6-f0b9-41dd-af1a-b3c519567652\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cbc3c0a6-f0b9-41dd-af1a-b3c519567652\") pod \"logging-loki-compactor-0\" (UID: \"75ed49e3-dc17-45c0-96ec-1db69670395b\") " pod="openshift-logging/logging-loki-compactor-0" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.993996 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bce49c11-10b4-4c30-a1a4-16cf32cb42fd-config\") pod \"logging-loki-index-gateway-0\" (UID: \"bce49c11-10b4-4c30-a1a4-16cf32cb42fd\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.994444 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bce49c11-10b4-4c30-a1a4-16cf32cb42fd-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"bce49c11-10b4-4c30-a1a4-16cf32cb42fd\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.994533 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75ed49e3-dc17-45c0-96ec-1db69670395b-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"75ed49e3-dc17-45c0-96ec-1db69670395b\") " 
pod="openshift-logging/logging-loki-compactor-0" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.995411 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75ed49e3-dc17-45c0-96ec-1db69670395b-config\") pod \"logging-loki-compactor-0\" (UID: \"75ed49e3-dc17-45c0-96ec-1db69670395b\") " pod="openshift-logging/logging-loki-compactor-0" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.996232 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.996277 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.996280 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-cbc3c0a6-f0b9-41dd-af1a-b3c519567652\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cbc3c0a6-f0b9-41dd-af1a-b3c519567652\") pod \"logging-loki-compactor-0\" (UID: \"75ed49e3-dc17-45c0-96ec-1db69670395b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/13ad635aa84c009e978db10d3b2f17c273177761a9de490f68557cd98507bc8b/globalmount\"" pod="openshift-logging/logging-loki-compactor-0" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.996308 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a437917e-04a4-4399-ab70-4993b7e8d6df\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a437917e-04a4-4399-ab70-4993b7e8d6df\") pod \"logging-loki-index-gateway-0\" (UID: \"bce49c11-10b4-4c30-a1a4-16cf32cb42fd\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a2051066ff9d16ba230f99332179ff72f7b9e5e2e8aa664bb2ea2fdd41583ebf/globalmount\"" pod="openshift-logging/logging-loki-index-gateway-0" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.996560 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/bce49c11-10b4-4c30-a1a4-16cf32cb42fd-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"bce49c11-10b4-4c30-a1a4-16cf32cb42fd\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.997539 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/bce49c11-10b4-4c30-a1a4-16cf32cb42fd-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"bce49c11-10b4-4c30-a1a4-16cf32cb42fd\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.998179 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/75ed49e3-dc17-45c0-96ec-1db69670395b-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"75ed49e3-dc17-45c0-96ec-1db69670395b\") " pod="openshift-logging/logging-loki-compactor-0" Mar 09 14:17:01 crc kubenswrapper[4722]: I0309 14:17:01.998539 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/75ed49e3-dc17-45c0-96ec-1db69670395b-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: 
\"75ed49e3-dc17-45c0-96ec-1db69670395b\") " pod="openshift-logging/logging-loki-compactor-0" Mar 09 14:17:02 crc kubenswrapper[4722]: I0309 14:17:02.000111 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/bce49c11-10b4-4c30-a1a4-16cf32cb42fd-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"bce49c11-10b4-4c30-a1a4-16cf32cb42fd\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 09 14:17:02 crc kubenswrapper[4722]: I0309 14:17:02.006077 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/75ed49e3-dc17-45c0-96ec-1db69670395b-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"75ed49e3-dc17-45c0-96ec-1db69670395b\") " pod="openshift-logging/logging-loki-compactor-0" Mar 09 14:17:02 crc kubenswrapper[4722]: I0309 14:17:02.011865 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2v6h\" (UniqueName: \"kubernetes.io/projected/bce49c11-10b4-4c30-a1a4-16cf32cb42fd-kube-api-access-g2v6h\") pod \"logging-loki-index-gateway-0\" (UID: \"bce49c11-10b4-4c30-a1a4-16cf32cb42fd\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 09 14:17:02 crc kubenswrapper[4722]: I0309 14:17:02.024731 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqhl9\" (UniqueName: \"kubernetes.io/projected/75ed49e3-dc17-45c0-96ec-1db69670395b-kube-api-access-jqhl9\") pod \"logging-loki-compactor-0\" (UID: \"75ed49e3-dc17-45c0-96ec-1db69670395b\") " pod="openshift-logging/logging-loki-compactor-0" Mar 09 14:17:02 crc kubenswrapper[4722]: I0309 14:17:02.032666 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-cbc3c0a6-f0b9-41dd-af1a-b3c519567652\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-cbc3c0a6-f0b9-41dd-af1a-b3c519567652\") pod \"logging-loki-compactor-0\" (UID: \"75ed49e3-dc17-45c0-96ec-1db69670395b\") " pod="openshift-logging/logging-loki-compactor-0" Mar 09 14:17:02 crc kubenswrapper[4722]: I0309 14:17:02.035430 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a437917e-04a4-4399-ab70-4993b7e8d6df\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a437917e-04a4-4399-ab70-4993b7e8d6df\") pod \"logging-loki-index-gateway-0\" (UID: \"bce49c11-10b4-4c30-a1a4-16cf32cb42fd\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 09 14:17:02 crc kubenswrapper[4722]: I0309 14:17:02.092053 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-ingester-0"] Mar 09 14:17:02 crc kubenswrapper[4722]: W0309 14:17:02.095851 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a23db8b_8a30_47b8_bf39_6f193899fcee.slice/crio-6c735b0063250569b3bad9d1995345791816e32ed0660414062c7a9468572d88 WatchSource:0}: Error finding container 6c735b0063250569b3bad9d1995345791816e32ed0660414062c7a9468572d88: Status 404 returned error can't find the container with id 6c735b0063250569b3bad9d1995345791816e32ed0660414062c7a9468572d88 Mar 09 14:17:02 crc kubenswrapper[4722]: I0309 14:17:02.145536 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-index-gateway-0" Mar 09 14:17:02 crc kubenswrapper[4722]: I0309 14:17:02.335492 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-compactor-0" Mar 09 14:17:02 crc kubenswrapper[4722]: I0309 14:17:02.618149 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"] Mar 09 14:17:02 crc kubenswrapper[4722]: W0309 14:17:02.626482 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbce49c11_10b4_4c30_a1a4_16cf32cb42fd.slice/crio-1638d47d92f5b62cb279a487b49c0e4b54578a19f208561394a04cb06d65d752 WatchSource:0}: Error finding container 1638d47d92f5b62cb279a487b49c0e4b54578a19f208561394a04cb06d65d752: Status 404 returned error can't find the container with id 1638d47d92f5b62cb279a487b49c0e4b54578a19f208561394a04cb06d65d752 Mar 09 14:17:02 crc kubenswrapper[4722]: I0309 14:17:02.790023 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-compactor-0"] Mar 09 14:17:02 crc kubenswrapper[4722]: I0309 14:17:02.842992 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-ingester-0" event={"ID":"6a23db8b-8a30-47b8-bf39-6f193899fcee","Type":"ContainerStarted","Data":"6c735b0063250569b3bad9d1995345791816e32ed0660414062c7a9468572d88"} Mar 09 14:17:02 crc kubenswrapper[4722]: I0309 14:17:02.844668 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-compactor-0" event={"ID":"75ed49e3-dc17-45c0-96ec-1db69670395b","Type":"ContainerStarted","Data":"c05f7e402dab281739c0f77b37f3e2ea394f6ca19b060a2cfbf19ef0aabdca46"} Mar 09 14:17:02 crc kubenswrapper[4722]: I0309 14:17:02.845748 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-index-gateway-0" event={"ID":"bce49c11-10b4-4c30-a1a4-16cf32cb42fd","Type":"ContainerStarted","Data":"1638d47d92f5b62cb279a487b49c0e4b54578a19f208561394a04cb06d65d752"} Mar 09 14:17:05 crc kubenswrapper[4722]: I0309 14:17:05.874024 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-compactor-0" event={"ID":"75ed49e3-dc17-45c0-96ec-1db69670395b","Type":"ContainerStarted","Data":"6cf06100ea9806b080fa92fba103259cda68456411386eb1b128884db99bc85c"} Mar 09 14:17:05 crc kubenswrapper[4722]: I0309 14:17:05.874462 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-compactor-0" Mar 09 14:17:05 crc kubenswrapper[4722]: I0309 14:17:05.875401 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-6c5ff86c56-n5jdr" event={"ID":"8becd072-3095-4717-a83d-e56cf0d0f816","Type":"ContainerStarted","Data":"93e8c54b1349812753720e9364271e99051b05f095bfdad19081fd25052e3f27"} Mar 09 14:17:05 crc kubenswrapper[4722]: I0309 14:17:05.876768 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-query-frontend-6d6859c548-5b72h" event={"ID":"5ccc948e-2185-44fd-90c4-3ae3228f6224","Type":"ContainerStarted","Data":"4c946f2bfa248faed3b0541d87119246f97d294f9a5ac5943b14f6963d499d59"} Mar 09 14:17:05 crc kubenswrapper[4722]: I0309 14:17:05.876993 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-query-frontend-6d6859c548-5b72h" Mar 09 14:17:05 crc kubenswrapper[4722]: I0309 14:17:05.878742 4722 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-index-gateway-0" event={"ID":"bce49c11-10b4-4c30-a1a4-16cf32cb42fd","Type":"ContainerStarted","Data":"d47046cd47de4cfbcdc3c5b62345e51bd4e8ca8feed82edc4765ae368cd08e03"} Mar 09 14:17:05 crc kubenswrapper[4722]: I0309 14:17:05.878878 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-index-gateway-0" Mar 09 14:17:05 crc kubenswrapper[4722]: I0309 14:17:05.880068 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-6c5ff86c56-dvps5" event={"ID":"e265fe14-7154-4fbb-a7c3-33557166f71d","Type":"ContainerStarted","Data":"edbab73bbcc01b2ae49d210842963dbe417d99b0e2da3c806bd5469cc9632574"} Mar 09 14:17:05 crc kubenswrapper[4722]: I0309 14:17:05.884023 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-ingester-0" event={"ID":"6a23db8b-8a30-47b8-bf39-6f193899fcee","Type":"ContainerStarted","Data":"d839beac914d340a7b289c43e22cfeba7a1ca7663251a0b1307308737b1bff2d"} Mar 09 14:17:05 crc kubenswrapper[4722]: I0309 14:17:05.884833 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-ingester-0" Mar 09 14:17:05 crc kubenswrapper[4722]: I0309 14:17:05.890160 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-querier-76bf7b6d45-fv4dh" event={"ID":"ba829d53-02a8-4003-a5ee-b9b36d8404e3","Type":"ContainerStarted","Data":"e9b71268e2a744072167b55c23f1a308bf622def148bced9ee654e3ab125220f"} Mar 09 14:17:05 crc kubenswrapper[4722]: I0309 14:17:05.890323 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-querier-76bf7b6d45-fv4dh" Mar 09 14:17:05 crc kubenswrapper[4722]: I0309 14:17:05.892081 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-distributor-5d5548c9f5-r6x4b" event={"ID":"822bc43f-dfed-4440-be35-1bf58f50456b","Type":"ContainerStarted","Data":"e971c54efb0bc5c6ed40f5f9e5a09f2227b69a105238cb328fb8c615964266f0"} Mar 09 14:17:05 crc kubenswrapper[4722]: I0309 14:17:05.892272 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-distributor-5d5548c9f5-r6x4b" Mar 09 14:17:05 crc kubenswrapper[4722]: I0309 14:17:05.926333 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-compactor-0" podStartSLOduration=3.454911553 podStartE2EDuration="5.926306919s" podCreationTimestamp="2026-03-09 14:17:00 +0000 UTC" firstStartedPulling="2026-03-09 14:17:02.795466414 +0000 UTC m=+863.351034990" lastFinishedPulling="2026-03-09 14:17:05.26686178 +0000 UTC m=+865.822430356" observedRunningTime="2026-03-09 14:17:05.918579745 +0000 UTC m=+866.474148321" watchObservedRunningTime="2026-03-09 14:17:05.926306919 +0000 UTC m=+866.481875495" Mar 09 14:17:05 crc kubenswrapper[4722]: I0309 14:17:05.953322 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-query-frontend-6d6859c548-5b72h" podStartSLOduration=2.292576092 podStartE2EDuration="5.953304785s" podCreationTimestamp="2026-03-09 14:17:00 +0000 UTC" firstStartedPulling="2026-03-09 14:17:01.603965377 +0000 UTC m=+862.159533953" lastFinishedPulling="2026-03-09 14:17:05.26469407 +0000 UTC m=+865.820262646" observedRunningTime="2026-03-09 14:17:05.952957856 +0000 UTC m=+866.508526432" watchObservedRunningTime="2026-03-09 
14:17:05.953304785 +0000 UTC m=+866.508873361" Mar 09 14:17:05 crc kubenswrapper[4722]: I0309 14:17:05.981583 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-index-gateway-0" podStartSLOduration=3.425875378 podStartE2EDuration="5.981564866s" podCreationTimestamp="2026-03-09 14:17:00 +0000 UTC" firstStartedPulling="2026-03-09 14:17:02.62967835 +0000 UTC m=+863.185246926" lastFinishedPulling="2026-03-09 14:17:05.185367838 +0000 UTC m=+865.740936414" observedRunningTime="2026-03-09 14:17:05.978685016 +0000 UTC m=+866.534253592" watchObservedRunningTime="2026-03-09 14:17:05.981564866 +0000 UTC m=+866.537133442" Mar 09 14:17:06 crc kubenswrapper[4722]: I0309 14:17:06.022143 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-querier-76bf7b6d45-fv4dh" podStartSLOduration=2.001828375 podStartE2EDuration="6.022123807s" podCreationTimestamp="2026-03-09 14:17:00 +0000 UTC" firstStartedPulling="2026-03-09 14:17:01.244377567 +0000 UTC m=+861.799946143" lastFinishedPulling="2026-03-09 14:17:05.264672999 +0000 UTC m=+865.820241575" observedRunningTime="2026-03-09 14:17:06.019943248 +0000 UTC m=+866.575511824" watchObservedRunningTime="2026-03-09 14:17:06.022123807 +0000 UTC m=+866.577692383" Mar 09 14:17:06 crc kubenswrapper[4722]: I0309 14:17:06.057925 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-distributor-5d5548c9f5-r6x4b" podStartSLOduration=2.145351932 podStartE2EDuration="6.057889866s" podCreationTimestamp="2026-03-09 14:17:00 +0000 UTC" firstStartedPulling="2026-03-09 14:17:01.352693811 +0000 UTC m=+861.908262387" lastFinishedPulling="2026-03-09 14:17:05.265231755 +0000 UTC m=+865.820800321" observedRunningTime="2026-03-09 14:17:06.051224682 +0000 UTC m=+866.606793268" watchObservedRunningTime="2026-03-09 14:17:06.057889866 +0000 UTC m=+866.613458442" Mar 09 14:17:06 crc kubenswrapper[4722]: I0309 14:17:06.084971 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-ingester-0" podStartSLOduration=2.925697774 podStartE2EDuration="6.084946914s" podCreationTimestamp="2026-03-09 14:17:00 +0000 UTC" firstStartedPulling="2026-03-09 14:17:02.098292652 +0000 UTC m=+862.653861228" lastFinishedPulling="2026-03-09 14:17:05.257541772 +0000 UTC m=+865.813110368" observedRunningTime="2026-03-09 14:17:06.077816977 +0000 UTC m=+866.633385553" watchObservedRunningTime="2026-03-09 14:17:06.084946914 +0000 UTC m=+866.640515490" Mar 09 14:17:07 crc kubenswrapper[4722]: I0309 14:17:07.906870 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-6c5ff86c56-dvps5" event={"ID":"e265fe14-7154-4fbb-a7c3-33557166f71d","Type":"ContainerStarted","Data":"28cd4ff38607097f66a1df2163844a1ab98f5c7f2cf327d656c1846fc0de3f70"} Mar 09 14:17:07 crc kubenswrapper[4722]: I0309 14:17:07.907512 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-6c5ff86c56-dvps5" Mar 09 14:17:07 crc kubenswrapper[4722]: I0309 14:17:07.909548 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-6c5ff86c56-n5jdr" event={"ID":"8becd072-3095-4717-a83d-e56cf0d0f816","Type":"ContainerStarted","Data":"3b505e09c628014045e71143b04e8bb4e639238235e0b0d3c9081be4a4899a2e"} Mar 09 14:17:07 crc kubenswrapper[4722]: I0309 14:17:07.920880 4722 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-6c5ff86c56-dvps5" Mar 09 14:17:07 crc kubenswrapper[4722]: I0309 14:17:07.935096 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-gateway-6c5ff86c56-dvps5" podStartSLOduration=2.367664758 podStartE2EDuration="7.935069137s" podCreationTimestamp="2026-03-09 14:17:00 +0000 UTC" firstStartedPulling="2026-03-09 14:17:01.760438983 +0000 UTC m=+862.316007549" lastFinishedPulling="2026-03-09 14:17:07.327843352 +0000 UTC m=+867.883411928" observedRunningTime="2026-03-09 14:17:07.926996913 +0000 UTC m=+868.482565509" watchObservedRunningTime="2026-03-09 14:17:07.935069137 +0000 UTC m=+868.490637723" Mar 09 14:17:07 crc kubenswrapper[4722]: I0309 14:17:07.958481 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-gateway-6c5ff86c56-n5jdr" podStartSLOduration=2.33518681 podStartE2EDuration="7.958454883s" podCreationTimestamp="2026-03-09 14:17:00 +0000 UTC" firstStartedPulling="2026-03-09 14:17:01.696756412 +0000 UTC m=+862.252324988" lastFinishedPulling="2026-03-09 14:17:07.320024485 +0000 UTC m=+867.875593061" observedRunningTime="2026-03-09 14:17:07.953197308 +0000 UTC m=+868.508787325" watchObservedRunningTime="2026-03-09 14:17:07.958454883 +0000 UTC m=+868.514023489" Mar 09 14:17:08 crc kubenswrapper[4722]: I0309 14:17:08.916729 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-6c5ff86c56-dvps5" Mar 09 14:17:08 crc kubenswrapper[4722]: I0309 14:17:08.917043 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-6c5ff86c56-n5jdr" Mar 09 14:17:08 crc kubenswrapper[4722]: I0309 14:17:08.917055 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-6c5ff86c56-n5jdr" Mar 09 14:17:08 crc kubenswrapper[4722]: I0309 14:17:08.925334 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-6c5ff86c56-dvps5" Mar 09 14:17:08 crc kubenswrapper[4722]: I0309 14:17:08.926933 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-6c5ff86c56-n5jdr" Mar 09 14:17:08 crc kubenswrapper[4722]: I0309 14:17:08.933193 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-6c5ff86c56-n5jdr" Mar 09 14:17:20 crc kubenswrapper[4722]: I0309 14:17:20.706912 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-distributor-5d5548c9f5-r6x4b" Mar 09 14:17:20 crc kubenswrapper[4722]: I0309 14:17:20.910854 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-querier-76bf7b6d45-fv4dh" Mar 09 14:17:21 crc kubenswrapper[4722]: I0309 14:17:21.091914 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-query-frontend-6d6859c548-5b72h" Mar 09 14:17:21 crc kubenswrapper[4722]: I0309 14:17:21.872625 4722 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: this instance owns no tokens Mar 09 14:17:21 crc kubenswrapper[4722]: I0309 14:17:21.872703 4722 
prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="6a23db8b-8a30-47b8-bf39-6f193899fcee" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 09 14:17:22 crc kubenswrapper[4722]: I0309 14:17:22.158285 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-index-gateway-0" Mar 09 14:17:22 crc kubenswrapper[4722]: I0309 14:17:22.342288 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-compactor-0" Mar 09 14:17:31 crc kubenswrapper[4722]: I0309 14:17:31.878025 4722 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: this instance owns no tokens Mar 09 14:17:31 crc kubenswrapper[4722]: I0309 14:17:31.880651 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="6a23db8b-8a30-47b8-bf39-6f193899fcee" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 09 14:17:41 crc kubenswrapper[4722]: I0309 14:17:41.874700 4722 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: waiting for 15s after being ready Mar 09 14:17:41 crc kubenswrapper[4722]: I0309 14:17:41.875449 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="6a23db8b-8a30-47b8-bf39-6f193899fcee" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 09 14:17:51 crc kubenswrapper[4722]: I0309 14:17:51.873416 4722 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: waiting for 15s after being ready Mar 09 14:17:51 crc kubenswrapper[4722]: I0309 14:17:51.874296 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="6a23db8b-8a30-47b8-bf39-6f193899fcee" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 09 14:18:00 crc kubenswrapper[4722]: I0309 14:18:00.145719 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551098-rb55h"] Mar 09 14:18:00 crc kubenswrapper[4722]: I0309 14:18:00.147246 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551098-rb55h" Mar 09 14:18:00 crc kubenswrapper[4722]: I0309 14:18:00.149532 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 14:18:00 crc kubenswrapper[4722]: I0309 14:18:00.149847 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-dr2x6" Mar 09 14:18:00 crc kubenswrapper[4722]: I0309 14:18:00.150852 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 14:18:00 crc kubenswrapper[4722]: I0309 14:18:00.163080 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551098-rb55h"] Mar 09 14:18:00 crc kubenswrapper[4722]: I0309 14:18:00.313413 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmmt9\" (UniqueName: \"kubernetes.io/projected/9020ca81-ccfb-4a03-ac36-ae9b259c68b1-kube-api-access-mmmt9\") pod \"auto-csr-approver-29551098-rb55h\" (UID: \"9020ca81-ccfb-4a03-ac36-ae9b259c68b1\") " pod="openshift-infra/auto-csr-approver-29551098-rb55h" Mar 09 14:18:00 crc kubenswrapper[4722]: I0309 14:18:00.415683 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmmt9\" (UniqueName: \"kubernetes.io/projected/9020ca81-ccfb-4a03-ac36-ae9b259c68b1-kube-api-access-mmmt9\") pod \"auto-csr-approver-29551098-rb55h\" (UID: \"9020ca81-ccfb-4a03-ac36-ae9b259c68b1\") " pod="openshift-infra/auto-csr-approver-29551098-rb55h" Mar 09 14:18:00 crc kubenswrapper[4722]: I0309 14:18:00.439264 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmmt9\" (UniqueName: \"kubernetes.io/projected/9020ca81-ccfb-4a03-ac36-ae9b259c68b1-kube-api-access-mmmt9\") pod \"auto-csr-approver-29551098-rb55h\" (UID: \"9020ca81-ccfb-4a03-ac36-ae9b259c68b1\") " pod="openshift-infra/auto-csr-approver-29551098-rb55h" Mar 09 14:18:00 crc kubenswrapper[4722]: I0309 14:18:00.468289 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551098-rb55h" Mar 09 14:18:00 crc kubenswrapper[4722]: I0309 14:18:00.891672 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551098-rb55h"] Mar 09 14:18:01 crc kubenswrapper[4722]: I0309 14:18:01.420969 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551098-rb55h" event={"ID":"9020ca81-ccfb-4a03-ac36-ae9b259c68b1","Type":"ContainerStarted","Data":"7cc9678916a520c25d56fc87c5a9c63e1c3016be4ec1739c734580a7c06d22d8"} Mar 09 14:18:01 crc kubenswrapper[4722]: I0309 14:18:01.874318 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-ingester-0" Mar 09 14:18:03 crc kubenswrapper[4722]: I0309 14:18:03.445356 4722 generic.go:334] "Generic (PLEG): container finished" podID="9020ca81-ccfb-4a03-ac36-ae9b259c68b1" containerID="32938ebe1b7b6b82bb2086db38761774319a2bbe444d065ebebb2cb4f6773226" exitCode=0 Mar 09 14:18:03 crc kubenswrapper[4722]: I0309 14:18:03.445677 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551098-rb55h" event={"ID":"9020ca81-ccfb-4a03-ac36-ae9b259c68b1","Type":"ContainerDied","Data":"32938ebe1b7b6b82bb2086db38761774319a2bbe444d065ebebb2cb4f6773226"} Mar 09 14:18:04 crc kubenswrapper[4722]: I0309 14:18:04.761191 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551098-rb55h" Mar 09 14:18:04 crc kubenswrapper[4722]: I0309 14:18:04.894560 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmmt9\" (UniqueName: \"kubernetes.io/projected/9020ca81-ccfb-4a03-ac36-ae9b259c68b1-kube-api-access-mmmt9\") pod \"9020ca81-ccfb-4a03-ac36-ae9b259c68b1\" (UID: \"9020ca81-ccfb-4a03-ac36-ae9b259c68b1\") " Mar 09 14:18:04 crc kubenswrapper[4722]: I0309 14:18:04.902413 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9020ca81-ccfb-4a03-ac36-ae9b259c68b1-kube-api-access-mmmt9" (OuterVolumeSpecName: "kube-api-access-mmmt9") pod "9020ca81-ccfb-4a03-ac36-ae9b259c68b1" (UID: "9020ca81-ccfb-4a03-ac36-ae9b259c68b1"). InnerVolumeSpecName "kube-api-access-mmmt9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:18:04 crc kubenswrapper[4722]: I0309 14:18:04.996893 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mmmt9\" (UniqueName: \"kubernetes.io/projected/9020ca81-ccfb-4a03-ac36-ae9b259c68b1-kube-api-access-mmmt9\") on node \"crc\" DevicePath \"\"" Mar 09 14:18:05 crc kubenswrapper[4722]: I0309 14:18:05.467364 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551098-rb55h" event={"ID":"9020ca81-ccfb-4a03-ac36-ae9b259c68b1","Type":"ContainerDied","Data":"7cc9678916a520c25d56fc87c5a9c63e1c3016be4ec1739c734580a7c06d22d8"} Mar 09 14:18:05 crc kubenswrapper[4722]: I0309 14:18:05.467436 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7cc9678916a520c25d56fc87c5a9c63e1c3016be4ec1739c734580a7c06d22d8" Mar 09 14:18:05 crc kubenswrapper[4722]: I0309 14:18:05.467433 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551098-rb55h" Mar 09 14:18:05 crc kubenswrapper[4722]: I0309 14:18:05.836405 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551092-5pnkn"] Mar 09 14:18:05 crc kubenswrapper[4722]: I0309 14:18:05.846062 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551092-5pnkn"] Mar 09 14:18:06 crc kubenswrapper[4722]: I0309 14:18:06.167167 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f18133e6-a08d-48be-82a2-c77e4c05e170" path="/var/lib/kubelet/pods/f18133e6-a08d-48be-82a2-c77e4c05e170/volumes" Mar 09 14:18:15 crc kubenswrapper[4722]: I0309 14:18:15.870088 4722 scope.go:117] "RemoveContainer" containerID="2902dcdfebbe6acd840510f06d63043c66ae19a0cc75f47007ba4184460ae59c" Mar 09 14:18:19 crc kubenswrapper[4722]: I0309 14:18:19.514515 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/collector-7gz5b"] Mar 09 14:18:19 crc kubenswrapper[4722]: E0309 14:18:19.515525 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9020ca81-ccfb-4a03-ac36-ae9b259c68b1" containerName="oc" Mar 09 14:18:19 crc kubenswrapper[4722]: I0309 14:18:19.515548 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="9020ca81-ccfb-4a03-ac36-ae9b259c68b1" containerName="oc" Mar 09 14:18:19 crc kubenswrapper[4722]: I0309 14:18:19.515817 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="9020ca81-ccfb-4a03-ac36-ae9b259c68b1" containerName="oc" Mar 09 14:18:19 crc kubenswrapper[4722]: I0309 14:18:19.516717 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-7gz5b" Mar 09 14:18:19 crc kubenswrapper[4722]: I0309 14:18:19.523299 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-config" Mar 09 14:18:19 crc kubenswrapper[4722]: I0309 14:18:19.523661 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-metrics" Mar 09 14:18:19 crc kubenswrapper[4722]: I0309 14:18:19.523660 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-syslog-receiver" Mar 09 14:18:19 crc kubenswrapper[4722]: I0309 14:18:19.523801 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-dockercfg-xzp6n" Mar 09 14:18:19 crc kubenswrapper[4722]: I0309 14:18:19.524414 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-token" Mar 09 14:18:19 crc kubenswrapper[4722]: I0309 14:18:19.533151 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-7gz5b"] Mar 09 14:18:19 crc kubenswrapper[4722]: I0309 14:18:19.534030 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-trustbundle" Mar 09 14:18:19 crc kubenswrapper[4722]: I0309 14:18:19.546534 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/bbd41556-249d-43e5-8114-e982f97b66e7-metrics\") pod \"collector-7gz5b\" (UID: \"bbd41556-249d-43e5-8114-e982f97b66e7\") " pod="openshift-logging/collector-7gz5b" Mar 09 14:18:19 crc kubenswrapper[4722]: I0309 14:18:19.546588 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/bbd41556-249d-43e5-8114-e982f97b66e7-trusted-ca\") pod \"collector-7gz5b\" (UID: \"bbd41556-249d-43e5-8114-e982f97b66e7\") " pod="openshift-logging/collector-7gz5b" Mar 09 14:18:19 crc kubenswrapper[4722]: I0309 14:18:19.546654 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/bbd41556-249d-43e5-8114-e982f97b66e7-entrypoint\") pod \"collector-7gz5b\" (UID: \"bbd41556-249d-43e5-8114-e982f97b66e7\") " pod="openshift-logging/collector-7gz5b" Mar 09 14:18:19 crc kubenswrapper[4722]: I0309 14:18:19.546689 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bbd41556-249d-43e5-8114-e982f97b66e7-tmp\") pod \"collector-7gz5b\" (UID: \"bbd41556-249d-43e5-8114-e982f97b66e7\") " pod="openshift-logging/collector-7gz5b" Mar 09 14:18:19 crc kubenswrapper[4722]: I0309 14:18:19.546715 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/bbd41556-249d-43e5-8114-e982f97b66e7-collector-token\") pod \"collector-7gz5b\" (UID: \"bbd41556-249d-43e5-8114-e982f97b66e7\") " pod="openshift-logging/collector-7gz5b" Mar 09 14:18:19 crc kubenswrapper[4722]: I0309 14:18:19.546737 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/bbd41556-249d-43e5-8114-e982f97b66e7-datadir\") pod \"collector-7gz5b\" (UID: \"bbd41556-249d-43e5-8114-e982f97b66e7\") " pod="openshift-logging/collector-7gz5b" Mar 09 14:18:19 crc kubenswrapper[4722]: I0309 14:18:19.546785 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbd41556-249d-43e5-8114-e982f97b66e7-config\") pod \"collector-7gz5b\" (UID: \"bbd41556-249d-43e5-8114-e982f97b66e7\") " pod="openshift-logging/collector-7gz5b" Mar 09 14:18:19 crc kubenswrapper[4722]: I0309 14:18:19.546807 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zp6cl\" (UniqueName: \"kubernetes.io/projected/bbd41556-249d-43e5-8114-e982f97b66e7-kube-api-access-zp6cl\") pod \"collector-7gz5b\" (UID: \"bbd41556-249d-43e5-8114-e982f97b66e7\") " pod="openshift-logging/collector-7gz5b" Mar 09 14:18:19 crc kubenswrapper[4722]: I0309 14:18:19.546831 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/bbd41556-249d-43e5-8114-e982f97b66e7-collector-syslog-receiver\") pod \"collector-7gz5b\" (UID: \"bbd41556-249d-43e5-8114-e982f97b66e7\") " pod="openshift-logging/collector-7gz5b" Mar 09 14:18:19 crc kubenswrapper[4722]: I0309 14:18:19.546875 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/bbd41556-249d-43e5-8114-e982f97b66e7-sa-token\") pod \"collector-7gz5b\" (UID: \"bbd41556-249d-43e5-8114-e982f97b66e7\") " pod="openshift-logging/collector-7gz5b" Mar 09 14:18:19 crc kubenswrapper[4722]: I0309 14:18:19.546902 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: 
\"kubernetes.io/configmap/bbd41556-249d-43e5-8114-e982f97b66e7-config-openshift-service-cacrt\") pod \"collector-7gz5b\" (UID: \"bbd41556-249d-43e5-8114-e982f97b66e7\") " pod="openshift-logging/collector-7gz5b" Mar 09 14:18:19 crc kubenswrapper[4722]: I0309 14:18:19.592655 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-logging/collector-7gz5b"] Mar 09 14:18:19 crc kubenswrapper[4722]: E0309 14:18:19.593247 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[collector-syslog-receiver collector-token config config-openshift-service-cacrt datadir entrypoint kube-api-access-zp6cl metrics sa-token tmp trusted-ca], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-logging/collector-7gz5b" podUID="bbd41556-249d-43e5-8114-e982f97b66e7" Mar 09 14:18:19 crc kubenswrapper[4722]: I0309 14:18:19.634727 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-7gz5b" Mar 09 14:18:19 crc kubenswrapper[4722]: I0309 14:18:19.642016 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-7gz5b" Mar 09 14:18:19 crc kubenswrapper[4722]: I0309 14:18:19.648101 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/bbd41556-249d-43e5-8114-e982f97b66e7-entrypoint\") pod \"collector-7gz5b\" (UID: \"bbd41556-249d-43e5-8114-e982f97b66e7\") " pod="openshift-logging/collector-7gz5b" Mar 09 14:18:19 crc kubenswrapper[4722]: I0309 14:18:19.648151 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bbd41556-249d-43e5-8114-e982f97b66e7-tmp\") pod \"collector-7gz5b\" (UID: \"bbd41556-249d-43e5-8114-e982f97b66e7\") " pod="openshift-logging/collector-7gz5b" Mar 09 14:18:19 crc kubenswrapper[4722]: I0309 14:18:19.648176 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/bbd41556-249d-43e5-8114-e982f97b66e7-collector-token\") pod \"collector-7gz5b\" (UID: \"bbd41556-249d-43e5-8114-e982f97b66e7\") " pod="openshift-logging/collector-7gz5b" Mar 09 14:18:19 crc kubenswrapper[4722]: I0309 14:18:19.648194 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/bbd41556-249d-43e5-8114-e982f97b66e7-datadir\") pod \"collector-7gz5b\" (UID: \"bbd41556-249d-43e5-8114-e982f97b66e7\") " pod="openshift-logging/collector-7gz5b" Mar 09 14:18:19 crc kubenswrapper[4722]: I0309 14:18:19.648253 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbd41556-249d-43e5-8114-e982f97b66e7-config\") pod \"collector-7gz5b\" (UID: \"bbd41556-249d-43e5-8114-e982f97b66e7\") " pod="openshift-logging/collector-7gz5b" Mar 09 14:18:19 crc kubenswrapper[4722]: I0309 14:18:19.648274 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zp6cl\" (UniqueName: \"kubernetes.io/projected/bbd41556-249d-43e5-8114-e982f97b66e7-kube-api-access-zp6cl\") pod \"collector-7gz5b\" (UID: \"bbd41556-249d-43e5-8114-e982f97b66e7\") " pod="openshift-logging/collector-7gz5b" Mar 09 14:18:19 crc kubenswrapper[4722]: I0309 14:18:19.648298 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-syslog-receiver\" (UniqueName: 
\"kubernetes.io/secret/bbd41556-249d-43e5-8114-e982f97b66e7-collector-syslog-receiver\") pod \"collector-7gz5b\" (UID: \"bbd41556-249d-43e5-8114-e982f97b66e7\") " pod="openshift-logging/collector-7gz5b" Mar 09 14:18:19 crc kubenswrapper[4722]: I0309 14:18:19.648332 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/bbd41556-249d-43e5-8114-e982f97b66e7-sa-token\") pod \"collector-7gz5b\" (UID: \"bbd41556-249d-43e5-8114-e982f97b66e7\") " pod="openshift-logging/collector-7gz5b" Mar 09 14:18:19 crc kubenswrapper[4722]: I0309 14:18:19.648349 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/bbd41556-249d-43e5-8114-e982f97b66e7-config-openshift-service-cacrt\") pod \"collector-7gz5b\" (UID: \"bbd41556-249d-43e5-8114-e982f97b66e7\") " pod="openshift-logging/collector-7gz5b" Mar 09 14:18:19 crc kubenswrapper[4722]: I0309 14:18:19.648370 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/bbd41556-249d-43e5-8114-e982f97b66e7-metrics\") pod \"collector-7gz5b\" (UID: \"bbd41556-249d-43e5-8114-e982f97b66e7\") " pod="openshift-logging/collector-7gz5b" Mar 09 14:18:19 crc kubenswrapper[4722]: I0309 14:18:19.648391 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bbd41556-249d-43e5-8114-e982f97b66e7-trusted-ca\") pod \"collector-7gz5b\" (UID: \"bbd41556-249d-43e5-8114-e982f97b66e7\") " pod="openshift-logging/collector-7gz5b" Mar 09 14:18:19 crc kubenswrapper[4722]: I0309 14:18:19.648997 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/bbd41556-249d-43e5-8114-e982f97b66e7-datadir\") pod \"collector-7gz5b\" (UID: \"bbd41556-249d-43e5-8114-e982f97b66e7\") " pod="openshift-logging/collector-7gz5b" Mar 09 14:18:19 crc kubenswrapper[4722]: I0309 14:18:19.649286 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/bbd41556-249d-43e5-8114-e982f97b66e7-config-openshift-service-cacrt\") pod \"collector-7gz5b\" (UID: \"bbd41556-249d-43e5-8114-e982f97b66e7\") " pod="openshift-logging/collector-7gz5b" Mar 09 14:18:19 crc kubenswrapper[4722]: I0309 14:18:19.649371 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/bbd41556-249d-43e5-8114-e982f97b66e7-entrypoint\") pod \"collector-7gz5b\" (UID: \"bbd41556-249d-43e5-8114-e982f97b66e7\") " pod="openshift-logging/collector-7gz5b" Mar 09 14:18:19 crc kubenswrapper[4722]: I0309 14:18:19.649598 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbd41556-249d-43e5-8114-e982f97b66e7-config\") pod \"collector-7gz5b\" (UID: \"bbd41556-249d-43e5-8114-e982f97b66e7\") " pod="openshift-logging/collector-7gz5b" Mar 09 14:18:19 crc kubenswrapper[4722]: I0309 14:18:19.650495 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bbd41556-249d-43e5-8114-e982f97b66e7-trusted-ca\") pod \"collector-7gz5b\" (UID: \"bbd41556-249d-43e5-8114-e982f97b66e7\") " pod="openshift-logging/collector-7gz5b" Mar 09 14:18:19 crc kubenswrapper[4722]: I0309 14:18:19.654039 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bbd41556-249d-43e5-8114-e982f97b66e7-tmp\") pod \"collector-7gz5b\" (UID: \"bbd41556-249d-43e5-8114-e982f97b66e7\") " pod="openshift-logging/collector-7gz5b" Mar 09 14:18:19 crc kubenswrapper[4722]: I0309 14:18:19.654525 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/bbd41556-249d-43e5-8114-e982f97b66e7-metrics\") pod \"collector-7gz5b\" (UID: \"bbd41556-249d-43e5-8114-e982f97b66e7\") " pod="openshift-logging/collector-7gz5b" Mar 09 14:18:19 crc kubenswrapper[4722]: I0309 14:18:19.658735 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/bbd41556-249d-43e5-8114-e982f97b66e7-collector-syslog-receiver\") pod \"collector-7gz5b\" (UID: \"bbd41556-249d-43e5-8114-e982f97b66e7\") " pod="openshift-logging/collector-7gz5b" Mar 09 14:18:19 crc kubenswrapper[4722]: I0309 14:18:19.658767 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/bbd41556-249d-43e5-8114-e982f97b66e7-collector-token\") pod \"collector-7gz5b\" (UID: \"bbd41556-249d-43e5-8114-e982f97b66e7\") " pod="openshift-logging/collector-7gz5b" Mar 09 14:18:19 crc kubenswrapper[4722]: I0309 14:18:19.671042 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zp6cl\" (UniqueName: \"kubernetes.io/projected/bbd41556-249d-43e5-8114-e982f97b66e7-kube-api-access-zp6cl\") pod \"collector-7gz5b\" (UID: \"bbd41556-249d-43e5-8114-e982f97b66e7\") " pod="openshift-logging/collector-7gz5b" Mar 09 14:18:19 crc kubenswrapper[4722]: I0309 14:18:19.676189 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/bbd41556-249d-43e5-8114-e982f97b66e7-sa-token\") pod \"collector-7gz5b\" (UID: \"bbd41556-249d-43e5-8114-e982f97b66e7\") " pod="openshift-logging/collector-7gz5b" Mar 09 14:18:19 crc kubenswrapper[4722]: I0309 14:18:19.850765 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zp6cl\" (UniqueName: \"kubernetes.io/projected/bbd41556-249d-43e5-8114-e982f97b66e7-kube-api-access-zp6cl\") pod \"bbd41556-249d-43e5-8114-e982f97b66e7\" (UID: \"bbd41556-249d-43e5-8114-e982f97b66e7\") " Mar 09 14:18:19 crc kubenswrapper[4722]: I0309 14:18:19.850832 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bbd41556-249d-43e5-8114-e982f97b66e7-tmp\") pod \"bbd41556-249d-43e5-8114-e982f97b66e7\" (UID: \"bbd41556-249d-43e5-8114-e982f97b66e7\") " Mar 09 14:18:19 crc kubenswrapper[4722]: I0309 14:18:19.850884 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/bbd41556-249d-43e5-8114-e982f97b66e7-collector-token\") pod \"bbd41556-249d-43e5-8114-e982f97b66e7\" (UID: \"bbd41556-249d-43e5-8114-e982f97b66e7\") " Mar 09 14:18:19 crc kubenswrapper[4722]: I0309 14:18:19.850948 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bbd41556-249d-43e5-8114-e982f97b66e7-trusted-ca\") pod \"bbd41556-249d-43e5-8114-e982f97b66e7\" (UID: \"bbd41556-249d-43e5-8114-e982f97b66e7\") " Mar 09 14:18:19 crc kubenswrapper[4722]: I0309 14:18:19.851071 
Mar 09 14:18:19 crc kubenswrapper[4722]: I0309 14:18:19.851126 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/bbd41556-249d-43e5-8114-e982f97b66e7-config-openshift-service-cacrt\") pod \"bbd41556-249d-43e5-8114-e982f97b66e7\" (UID: \"bbd41556-249d-43e5-8114-e982f97b66e7\") "
Mar 09 14:18:19 crc kubenswrapper[4722]: I0309 14:18:19.851182 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbd41556-249d-43e5-8114-e982f97b66e7-config\") pod \"bbd41556-249d-43e5-8114-e982f97b66e7\" (UID: \"bbd41556-249d-43e5-8114-e982f97b66e7\") "
Mar 09 14:18:19 crc kubenswrapper[4722]: I0309 14:18:19.851271 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/bbd41556-249d-43e5-8114-e982f97b66e7-collector-syslog-receiver\") pod \"bbd41556-249d-43e5-8114-e982f97b66e7\" (UID: \"bbd41556-249d-43e5-8114-e982f97b66e7\") "
Mar 09 14:18:19 crc kubenswrapper[4722]: I0309 14:18:19.851333 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/bbd41556-249d-43e5-8114-e982f97b66e7-datadir\") pod \"bbd41556-249d-43e5-8114-e982f97b66e7\" (UID: \"bbd41556-249d-43e5-8114-e982f97b66e7\") "
Mar 09 14:18:19 crc kubenswrapper[4722]: I0309 14:18:19.851384 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/bbd41556-249d-43e5-8114-e982f97b66e7-entrypoint\") pod \"bbd41556-249d-43e5-8114-e982f97b66e7\" (UID: \"bbd41556-249d-43e5-8114-e982f97b66e7\") "
Mar 09 14:18:19 crc kubenswrapper[4722]: I0309 14:18:19.851430 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/bbd41556-249d-43e5-8114-e982f97b66e7-sa-token\") pod \"bbd41556-249d-43e5-8114-e982f97b66e7\" (UID: \"bbd41556-249d-43e5-8114-e982f97b66e7\") "
Mar 09 14:18:19 crc kubenswrapper[4722]: I0309 14:18:19.851500 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bbd41556-249d-43e5-8114-e982f97b66e7-datadir" (OuterVolumeSpecName: "datadir") pod "bbd41556-249d-43e5-8114-e982f97b66e7" (UID: "bbd41556-249d-43e5-8114-e982f97b66e7"). InnerVolumeSpecName "datadir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 09 14:18:19 crc kubenswrapper[4722]: I0309 14:18:19.851595 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bbd41556-249d-43e5-8114-e982f97b66e7-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bbd41556-249d-43e5-8114-e982f97b66e7" (UID: "bbd41556-249d-43e5-8114-e982f97b66e7"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 14:18:19 crc kubenswrapper[4722]: I0309 14:18:19.851839 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bbd41556-249d-43e5-8114-e982f97b66e7-entrypoint" (OuterVolumeSpecName: "entrypoint") pod "bbd41556-249d-43e5-8114-e982f97b66e7" (UID: "bbd41556-249d-43e5-8114-e982f97b66e7"). InnerVolumeSpecName "entrypoint". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 14:18:19 crc kubenswrapper[4722]: I0309 14:18:19.851920 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bbd41556-249d-43e5-8114-e982f97b66e7-config-openshift-service-cacrt" (OuterVolumeSpecName: "config-openshift-service-cacrt") pod "bbd41556-249d-43e5-8114-e982f97b66e7" (UID: "bbd41556-249d-43e5-8114-e982f97b66e7"). InnerVolumeSpecName "config-openshift-service-cacrt". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 14:18:19 crc kubenswrapper[4722]: I0309 14:18:19.851959 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bbd41556-249d-43e5-8114-e982f97b66e7-config" (OuterVolumeSpecName: "config") pod "bbd41556-249d-43e5-8114-e982f97b66e7" (UID: "bbd41556-249d-43e5-8114-e982f97b66e7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 14:18:19 crc kubenswrapper[4722]: I0309 14:18:19.851972 4722 reconciler_common.go:293] "Volume detached for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/bbd41556-249d-43e5-8114-e982f97b66e7-datadir\") on node \"crc\" DevicePath \"\""
Mar 09 14:18:19 crc kubenswrapper[4722]: I0309 14:18:19.851995 4722 reconciler_common.go:293] "Volume detached for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/bbd41556-249d-43e5-8114-e982f97b66e7-entrypoint\") on node \"crc\" DevicePath \"\""
Mar 09 14:18:19 crc kubenswrapper[4722]: I0309 14:18:19.852008 4722 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bbd41556-249d-43e5-8114-e982f97b66e7-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 09 14:18:19 crc kubenswrapper[4722]: I0309 14:18:19.854043 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bbd41556-249d-43e5-8114-e982f97b66e7-tmp" (OuterVolumeSpecName: "tmp") pod "bbd41556-249d-43e5-8114-e982f97b66e7" (UID: "bbd41556-249d-43e5-8114-e982f97b66e7"). InnerVolumeSpecName "tmp". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 14:18:19 crc kubenswrapper[4722]: I0309 14:18:19.854587 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbd41556-249d-43e5-8114-e982f97b66e7-kube-api-access-zp6cl" (OuterVolumeSpecName: "kube-api-access-zp6cl") pod "bbd41556-249d-43e5-8114-e982f97b66e7" (UID: "bbd41556-249d-43e5-8114-e982f97b66e7"). InnerVolumeSpecName "kube-api-access-zp6cl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:18:19 crc kubenswrapper[4722]: I0309 14:18:19.854941 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbd41556-249d-43e5-8114-e982f97b66e7-collector-syslog-receiver" (OuterVolumeSpecName: "collector-syslog-receiver") pod "bbd41556-249d-43e5-8114-e982f97b66e7" (UID: "bbd41556-249d-43e5-8114-e982f97b66e7"). InnerVolumeSpecName "collector-syslog-receiver". PluginName "kubernetes.io/secret", VolumeGidValue ""
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:18:19 crc kubenswrapper[4722]: I0309 14:18:19.855247 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbd41556-249d-43e5-8114-e982f97b66e7-metrics" (OuterVolumeSpecName: "metrics") pod "bbd41556-249d-43e5-8114-e982f97b66e7" (UID: "bbd41556-249d-43e5-8114-e982f97b66e7"). InnerVolumeSpecName "metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:18:19 crc kubenswrapper[4722]: I0309 14:18:19.855480 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbd41556-249d-43e5-8114-e982f97b66e7-sa-token" (OuterVolumeSpecName: "sa-token") pod "bbd41556-249d-43e5-8114-e982f97b66e7" (UID: "bbd41556-249d-43e5-8114-e982f97b66e7"). InnerVolumeSpecName "sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:18:19 crc kubenswrapper[4722]: I0309 14:18:19.857384 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbd41556-249d-43e5-8114-e982f97b66e7-collector-token" (OuterVolumeSpecName: "collector-token") pod "bbd41556-249d-43e5-8114-e982f97b66e7" (UID: "bbd41556-249d-43e5-8114-e982f97b66e7"). InnerVolumeSpecName "collector-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:18:19 crc kubenswrapper[4722]: I0309 14:18:19.953082 4722 reconciler_common.go:293] "Volume detached for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/bbd41556-249d-43e5-8114-e982f97b66e7-sa-token\") on node \"crc\" DevicePath \"\"" Mar 09 14:18:19 crc kubenswrapper[4722]: I0309 14:18:19.953116 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zp6cl\" (UniqueName: \"kubernetes.io/projected/bbd41556-249d-43e5-8114-e982f97b66e7-kube-api-access-zp6cl\") on node \"crc\" DevicePath \"\"" Mar 09 14:18:19 crc kubenswrapper[4722]: I0309 14:18:19.953128 4722 reconciler_common.go:293] "Volume detached for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bbd41556-249d-43e5-8114-e982f97b66e7-tmp\") on node \"crc\" DevicePath \"\"" Mar 09 14:18:19 crc kubenswrapper[4722]: I0309 14:18:19.953153 4722 reconciler_common.go:293] "Volume detached for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/bbd41556-249d-43e5-8114-e982f97b66e7-collector-token\") on node \"crc\" DevicePath \"\"" Mar 09 14:18:19 crc kubenswrapper[4722]: I0309 14:18:19.953164 4722 reconciler_common.go:293] "Volume detached for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/bbd41556-249d-43e5-8114-e982f97b66e7-metrics\") on node \"crc\" DevicePath \"\"" Mar 09 14:18:19 crc kubenswrapper[4722]: I0309 14:18:19.953177 4722 reconciler_common.go:293] "Volume detached for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/bbd41556-249d-43e5-8114-e982f97b66e7-config-openshift-service-cacrt\") on node \"crc\" DevicePath \"\"" Mar 09 14:18:19 crc kubenswrapper[4722]: I0309 14:18:19.953187 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbd41556-249d-43e5-8114-e982f97b66e7-config\") on node \"crc\" DevicePath \"\"" Mar 09 14:18:19 crc kubenswrapper[4722]: I0309 14:18:19.953196 4722 reconciler_common.go:293] "Volume detached for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/bbd41556-249d-43e5-8114-e982f97b66e7-collector-syslog-receiver\") on node \"crc\" DevicePath \"\"" Mar 09 14:18:20 crc kubenswrapper[4722]: I0309 14:18:20.653692 4722 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-7gz5b" Mar 09 14:18:20 crc kubenswrapper[4722]: I0309 14:18:20.715565 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-logging/collector-7gz5b"] Mar 09 14:18:20 crc kubenswrapper[4722]: I0309 14:18:20.728474 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-logging/collector-7gz5b"] Mar 09 14:18:20 crc kubenswrapper[4722]: I0309 14:18:20.737173 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/collector-rj4pp"] Mar 09 14:18:20 crc kubenswrapper[4722]: I0309 14:18:20.738216 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-rj4pp" Mar 09 14:18:20 crc kubenswrapper[4722]: I0309 14:18:20.740540 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-dockercfg-xzp6n" Mar 09 14:18:20 crc kubenswrapper[4722]: I0309 14:18:20.740632 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-metrics" Mar 09 14:18:20 crc kubenswrapper[4722]: I0309 14:18:20.740856 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-syslog-receiver" Mar 09 14:18:20 crc kubenswrapper[4722]: I0309 14:18:20.741821 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-config" Mar 09 14:18:20 crc kubenswrapper[4722]: I0309 14:18:20.742628 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-token" Mar 09 14:18:20 crc kubenswrapper[4722]: I0309 14:18:20.745007 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-rj4pp"] Mar 09 14:18:20 crc kubenswrapper[4722]: I0309 14:18:20.749528 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-trustbundle" Mar 09 14:18:20 crc kubenswrapper[4722]: I0309 14:18:20.768541 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6cjn\" (UniqueName: \"kubernetes.io/projected/b7932da8-d764-41c7-b8ac-038cc75e50fd-kube-api-access-x6cjn\") pod \"collector-rj4pp\" (UID: \"b7932da8-d764-41c7-b8ac-038cc75e50fd\") " pod="openshift-logging/collector-rj4pp" Mar 09 14:18:20 crc kubenswrapper[4722]: I0309 14:18:20.768619 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b7932da8-d764-41c7-b8ac-038cc75e50fd-tmp\") pod \"collector-rj4pp\" (UID: \"b7932da8-d764-41c7-b8ac-038cc75e50fd\") " pod="openshift-logging/collector-rj4pp" Mar 09 14:18:20 crc kubenswrapper[4722]: I0309 14:18:20.768647 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/b7932da8-d764-41c7-b8ac-038cc75e50fd-metrics\") pod \"collector-rj4pp\" (UID: \"b7932da8-d764-41c7-b8ac-038cc75e50fd\") " pod="openshift-logging/collector-rj4pp" Mar 09 14:18:20 crc kubenswrapper[4722]: I0309 14:18:20.768719 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/b7932da8-d764-41c7-b8ac-038cc75e50fd-datadir\") pod \"collector-rj4pp\" (UID: \"b7932da8-d764-41c7-b8ac-038cc75e50fd\") " pod="openshift-logging/collector-rj4pp" Mar 09 14:18:20 crc kubenswrapper[4722]: I0309 14:18:20.768795 4722 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/b7932da8-d764-41c7-b8ac-038cc75e50fd-collector-syslog-receiver\") pod \"collector-rj4pp\" (UID: \"b7932da8-d764-41c7-b8ac-038cc75e50fd\") " pod="openshift-logging/collector-rj4pp" Mar 09 14:18:20 crc kubenswrapper[4722]: I0309 14:18:20.768824 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/b7932da8-d764-41c7-b8ac-038cc75e50fd-collector-token\") pod \"collector-rj4pp\" (UID: \"b7932da8-d764-41c7-b8ac-038cc75e50fd\") " pod="openshift-logging/collector-rj4pp" Mar 09 14:18:20 crc kubenswrapper[4722]: I0309 14:18:20.768993 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b7932da8-d764-41c7-b8ac-038cc75e50fd-trusted-ca\") pod \"collector-rj4pp\" (UID: \"b7932da8-d764-41c7-b8ac-038cc75e50fd\") " pod="openshift-logging/collector-rj4pp" Mar 09 14:18:20 crc kubenswrapper[4722]: I0309 14:18:20.769080 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/b7932da8-d764-41c7-b8ac-038cc75e50fd-entrypoint\") pod \"collector-rj4pp\" (UID: \"b7932da8-d764-41c7-b8ac-038cc75e50fd\") " pod="openshift-logging/collector-rj4pp" Mar 09 14:18:20 crc kubenswrapper[4722]: I0309 14:18:20.769120 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/b7932da8-d764-41c7-b8ac-038cc75e50fd-config-openshift-service-cacrt\") pod \"collector-rj4pp\" (UID: \"b7932da8-d764-41c7-b8ac-038cc75e50fd\") " pod="openshift-logging/collector-rj4pp" Mar 09 14:18:20 crc kubenswrapper[4722]: I0309 14:18:20.769141 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7932da8-d764-41c7-b8ac-038cc75e50fd-config\") pod \"collector-rj4pp\" (UID: \"b7932da8-d764-41c7-b8ac-038cc75e50fd\") " pod="openshift-logging/collector-rj4pp" Mar 09 14:18:20 crc kubenswrapper[4722]: I0309 14:18:20.769158 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/b7932da8-d764-41c7-b8ac-038cc75e50fd-sa-token\") pod \"collector-rj4pp\" (UID: \"b7932da8-d764-41c7-b8ac-038cc75e50fd\") " pod="openshift-logging/collector-rj4pp" Mar 09 14:18:20 crc kubenswrapper[4722]: I0309 14:18:20.870318 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/b7932da8-d764-41c7-b8ac-038cc75e50fd-datadir\") pod \"collector-rj4pp\" (UID: \"b7932da8-d764-41c7-b8ac-038cc75e50fd\") " pod="openshift-logging/collector-rj4pp" Mar 09 14:18:20 crc kubenswrapper[4722]: I0309 14:18:20.870400 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/b7932da8-d764-41c7-b8ac-038cc75e50fd-collector-syslog-receiver\") pod \"collector-rj4pp\" (UID: \"b7932da8-d764-41c7-b8ac-038cc75e50fd\") " pod="openshift-logging/collector-rj4pp" Mar 09 14:18:20 crc kubenswrapper[4722]: I0309 14:18:20.870445 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/b7932da8-d764-41c7-b8ac-038cc75e50fd-collector-token\") pod \"collector-rj4pp\" (UID: \"b7932da8-d764-41c7-b8ac-038cc75e50fd\") " pod="openshift-logging/collector-rj4pp" Mar 09 14:18:20 crc kubenswrapper[4722]: I0309 14:18:20.870461 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/b7932da8-d764-41c7-b8ac-038cc75e50fd-datadir\") pod \"collector-rj4pp\" (UID: \"b7932da8-d764-41c7-b8ac-038cc75e50fd\") " pod="openshift-logging/collector-rj4pp" Mar 09 14:18:20 crc kubenswrapper[4722]: I0309 14:18:20.870496 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b7932da8-d764-41c7-b8ac-038cc75e50fd-trusted-ca\") pod \"collector-rj4pp\" (UID: \"b7932da8-d764-41c7-b8ac-038cc75e50fd\") " pod="openshift-logging/collector-rj4pp" Mar 09 14:18:20 crc kubenswrapper[4722]: I0309 14:18:20.870556 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/b7932da8-d764-41c7-b8ac-038cc75e50fd-entrypoint\") pod \"collector-rj4pp\" (UID: \"b7932da8-d764-41c7-b8ac-038cc75e50fd\") " pod="openshift-logging/collector-rj4pp" Mar 09 14:18:20 crc kubenswrapper[4722]: I0309 14:18:20.870618 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/b7932da8-d764-41c7-b8ac-038cc75e50fd-config-openshift-service-cacrt\") pod \"collector-rj4pp\" (UID: \"b7932da8-d764-41c7-b8ac-038cc75e50fd\") " pod="openshift-logging/collector-rj4pp" Mar 09 14:18:20 crc kubenswrapper[4722]: I0309 14:18:20.870662 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7932da8-d764-41c7-b8ac-038cc75e50fd-config\") pod \"collector-rj4pp\" (UID: \"b7932da8-d764-41c7-b8ac-038cc75e50fd\") " pod="openshift-logging/collector-rj4pp" Mar 09 14:18:20 crc kubenswrapper[4722]: I0309 14:18:20.870692 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/b7932da8-d764-41c7-b8ac-038cc75e50fd-sa-token\") pod \"collector-rj4pp\" (UID: \"b7932da8-d764-41c7-b8ac-038cc75e50fd\") " pod="openshift-logging/collector-rj4pp" Mar 09 14:18:20 crc kubenswrapper[4722]: I0309 14:18:20.870762 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6cjn\" (UniqueName: \"kubernetes.io/projected/b7932da8-d764-41c7-b8ac-038cc75e50fd-kube-api-access-x6cjn\") pod \"collector-rj4pp\" (UID: \"b7932da8-d764-41c7-b8ac-038cc75e50fd\") " pod="openshift-logging/collector-rj4pp" Mar 09 14:18:20 crc kubenswrapper[4722]: I0309 14:18:20.870800 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b7932da8-d764-41c7-b8ac-038cc75e50fd-tmp\") pod \"collector-rj4pp\" (UID: \"b7932da8-d764-41c7-b8ac-038cc75e50fd\") " pod="openshift-logging/collector-rj4pp" Mar 09 14:18:20 crc kubenswrapper[4722]: I0309 14:18:20.870830 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/b7932da8-d764-41c7-b8ac-038cc75e50fd-metrics\") pod \"collector-rj4pp\" (UID: \"b7932da8-d764-41c7-b8ac-038cc75e50fd\") " pod="openshift-logging/collector-rj4pp" Mar 09 14:18:20 crc kubenswrapper[4722]: 
I0309 14:18:20.872873 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/b7932da8-d764-41c7-b8ac-038cc75e50fd-entrypoint\") pod \"collector-rj4pp\" (UID: \"b7932da8-d764-41c7-b8ac-038cc75e50fd\") " pod="openshift-logging/collector-rj4pp" Mar 09 14:18:20 crc kubenswrapper[4722]: I0309 14:18:20.873221 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b7932da8-d764-41c7-b8ac-038cc75e50fd-trusted-ca\") pod \"collector-rj4pp\" (UID: \"b7932da8-d764-41c7-b8ac-038cc75e50fd\") " pod="openshift-logging/collector-rj4pp" Mar 09 14:18:20 crc kubenswrapper[4722]: I0309 14:18:20.874009 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b7932da8-d764-41c7-b8ac-038cc75e50fd-config\") pod \"collector-rj4pp\" (UID: \"b7932da8-d764-41c7-b8ac-038cc75e50fd\") " pod="openshift-logging/collector-rj4pp" Mar 09 14:18:20 crc kubenswrapper[4722]: I0309 14:18:20.873530 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/b7932da8-d764-41c7-b8ac-038cc75e50fd-config-openshift-service-cacrt\") pod \"collector-rj4pp\" (UID: \"b7932da8-d764-41c7-b8ac-038cc75e50fd\") " pod="openshift-logging/collector-rj4pp" Mar 09 14:18:20 crc kubenswrapper[4722]: I0309 14:18:20.876735 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b7932da8-d764-41c7-b8ac-038cc75e50fd-tmp\") pod \"collector-rj4pp\" (UID: \"b7932da8-d764-41c7-b8ac-038cc75e50fd\") " pod="openshift-logging/collector-rj4pp" Mar 09 14:18:20 crc kubenswrapper[4722]: I0309 14:18:20.876939 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/b7932da8-d764-41c7-b8ac-038cc75e50fd-collector-token\") pod \"collector-rj4pp\" (UID: \"b7932da8-d764-41c7-b8ac-038cc75e50fd\") " pod="openshift-logging/collector-rj4pp" Mar 09 14:18:20 crc kubenswrapper[4722]: I0309 14:18:20.880874 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/b7932da8-d764-41c7-b8ac-038cc75e50fd-metrics\") pod \"collector-rj4pp\" (UID: \"b7932da8-d764-41c7-b8ac-038cc75e50fd\") " pod="openshift-logging/collector-rj4pp" Mar 09 14:18:20 crc kubenswrapper[4722]: I0309 14:18:20.890009 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6cjn\" (UniqueName: \"kubernetes.io/projected/b7932da8-d764-41c7-b8ac-038cc75e50fd-kube-api-access-x6cjn\") pod \"collector-rj4pp\" (UID: \"b7932da8-d764-41c7-b8ac-038cc75e50fd\") " pod="openshift-logging/collector-rj4pp" Mar 09 14:18:20 crc kubenswrapper[4722]: I0309 14:18:20.891685 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/b7932da8-d764-41c7-b8ac-038cc75e50fd-collector-syslog-receiver\") pod \"collector-rj4pp\" (UID: \"b7932da8-d764-41c7-b8ac-038cc75e50fd\") " pod="openshift-logging/collector-rj4pp" Mar 09 14:18:20 crc kubenswrapper[4722]: I0309 14:18:20.908259 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/b7932da8-d764-41c7-b8ac-038cc75e50fd-sa-token\") pod \"collector-rj4pp\" (UID: \"b7932da8-d764-41c7-b8ac-038cc75e50fd\") " 
pod="openshift-logging/collector-rj4pp" Mar 09 14:18:21 crc kubenswrapper[4722]: I0309 14:18:21.063522 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-rj4pp" Mar 09 14:18:21 crc kubenswrapper[4722]: I0309 14:18:21.505021 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-rj4pp"] Mar 09 14:18:21 crc kubenswrapper[4722]: W0309 14:18:21.518536 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7932da8_d764_41c7_b8ac_038cc75e50fd.slice/crio-fdec8970e2ee2c44c37651dbc36fb7130bf4cdc50e9d6f65790bd32099d1a5d4 WatchSource:0}: Error finding container fdec8970e2ee2c44c37651dbc36fb7130bf4cdc50e9d6f65790bd32099d1a5d4: Status 404 returned error can't find the container with id fdec8970e2ee2c44c37651dbc36fb7130bf4cdc50e9d6f65790bd32099d1a5d4 Mar 09 14:18:21 crc kubenswrapper[4722]: I0309 14:18:21.663520 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/collector-rj4pp" event={"ID":"b7932da8-d764-41c7-b8ac-038cc75e50fd","Type":"ContainerStarted","Data":"fdec8970e2ee2c44c37651dbc36fb7130bf4cdc50e9d6f65790bd32099d1a5d4"} Mar 09 14:18:22 crc kubenswrapper[4722]: I0309 14:18:22.169951 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbd41556-249d-43e5-8114-e982f97b66e7" path="/var/lib/kubelet/pods/bbd41556-249d-43e5-8114-e982f97b66e7/volumes" Mar 09 14:18:27 crc kubenswrapper[4722]: I0309 14:18:27.710151 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/collector-rj4pp" event={"ID":"b7932da8-d764-41c7-b8ac-038cc75e50fd","Type":"ContainerStarted","Data":"a96b593e49cc6a4e1b220ae42782cc55e0979897023202e357a9535c9fc91a18"} Mar 09 14:18:28 crc kubenswrapper[4722]: I0309 14:18:28.747919 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/collector-rj4pp" podStartSLOduration=2.752372941 podStartE2EDuration="8.747893812s" podCreationTimestamp="2026-03-09 14:18:20 +0000 UTC" firstStartedPulling="2026-03-09 14:18:21.52150461 +0000 UTC m=+942.077073196" lastFinishedPulling="2026-03-09 14:18:27.517025491 +0000 UTC m=+948.072594067" observedRunningTime="2026-03-09 14:18:28.736429233 +0000 UTC m=+949.291997859" watchObservedRunningTime="2026-03-09 14:18:28.747893812 +0000 UTC m=+949.303462408" Mar 09 14:18:51 crc kubenswrapper[4722]: I0309 14:18:51.527671 4722 patch_prober.go:28] interesting pod/machine-config-daemon-hjrrb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 14:18:51 crc kubenswrapper[4722]: I0309 14:18:51.528474 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 14:18:52 crc kubenswrapper[4722]: I0309 14:18:52.949623 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8tdrv"] Mar 09 14:18:52 crc kubenswrapper[4722]: I0309 14:18:52.956745 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8tdrv"] Mar 09 14:18:52 crc kubenswrapper[4722]: I0309 
14:18:52.956903 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8tdrv" Mar 09 14:18:53 crc kubenswrapper[4722]: I0309 14:18:53.111462 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54383c09-c05b-4ee3-b3bc-8f256f75b2f6-catalog-content\") pod \"certified-operators-8tdrv\" (UID: \"54383c09-c05b-4ee3-b3bc-8f256f75b2f6\") " pod="openshift-marketplace/certified-operators-8tdrv" Mar 09 14:18:53 crc kubenswrapper[4722]: I0309 14:18:53.111523 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54383c09-c05b-4ee3-b3bc-8f256f75b2f6-utilities\") pod \"certified-operators-8tdrv\" (UID: \"54383c09-c05b-4ee3-b3bc-8f256f75b2f6\") " pod="openshift-marketplace/certified-operators-8tdrv" Mar 09 14:18:53 crc kubenswrapper[4722]: I0309 14:18:53.111563 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nl4s6\" (UniqueName: \"kubernetes.io/projected/54383c09-c05b-4ee3-b3bc-8f256f75b2f6-kube-api-access-nl4s6\") pod \"certified-operators-8tdrv\" (UID: \"54383c09-c05b-4ee3-b3bc-8f256f75b2f6\") " pod="openshift-marketplace/certified-operators-8tdrv" Mar 09 14:18:53 crc kubenswrapper[4722]: I0309 14:18:53.212979 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54383c09-c05b-4ee3-b3bc-8f256f75b2f6-catalog-content\") pod \"certified-operators-8tdrv\" (UID: \"54383c09-c05b-4ee3-b3bc-8f256f75b2f6\") " pod="openshift-marketplace/certified-operators-8tdrv" Mar 09 14:18:53 crc kubenswrapper[4722]: I0309 14:18:53.213038 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54383c09-c05b-4ee3-b3bc-8f256f75b2f6-utilities\") pod \"certified-operators-8tdrv\" (UID: \"54383c09-c05b-4ee3-b3bc-8f256f75b2f6\") " pod="openshift-marketplace/certified-operators-8tdrv" Mar 09 14:18:53 crc kubenswrapper[4722]: I0309 14:18:53.213066 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nl4s6\" (UniqueName: \"kubernetes.io/projected/54383c09-c05b-4ee3-b3bc-8f256f75b2f6-kube-api-access-nl4s6\") pod \"certified-operators-8tdrv\" (UID: \"54383c09-c05b-4ee3-b3bc-8f256f75b2f6\") " pod="openshift-marketplace/certified-operators-8tdrv" Mar 09 14:18:53 crc kubenswrapper[4722]: I0309 14:18:53.213606 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54383c09-c05b-4ee3-b3bc-8f256f75b2f6-catalog-content\") pod \"certified-operators-8tdrv\" (UID: \"54383c09-c05b-4ee3-b3bc-8f256f75b2f6\") " pod="openshift-marketplace/certified-operators-8tdrv" Mar 09 14:18:53 crc kubenswrapper[4722]: I0309 14:18:53.213787 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54383c09-c05b-4ee3-b3bc-8f256f75b2f6-utilities\") pod \"certified-operators-8tdrv\" (UID: \"54383c09-c05b-4ee3-b3bc-8f256f75b2f6\") " pod="openshift-marketplace/certified-operators-8tdrv" Mar 09 14:18:53 crc kubenswrapper[4722]: I0309 14:18:53.235005 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nl4s6\" (UniqueName: 
\"kubernetes.io/projected/54383c09-c05b-4ee3-b3bc-8f256f75b2f6-kube-api-access-nl4s6\") pod \"certified-operators-8tdrv\" (UID: \"54383c09-c05b-4ee3-b3bc-8f256f75b2f6\") " pod="openshift-marketplace/certified-operators-8tdrv" Mar 09 14:18:53 crc kubenswrapper[4722]: I0309 14:18:53.276164 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8tdrv" Mar 09 14:18:53 crc kubenswrapper[4722]: I0309 14:18:53.815884 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8tdrv"] Mar 09 14:18:53 crc kubenswrapper[4722]: I0309 14:18:53.916887 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8tdrv" event={"ID":"54383c09-c05b-4ee3-b3bc-8f256f75b2f6","Type":"ContainerStarted","Data":"68882470a29a7f05f45e3ab5a1220437ab54b89e5e7bc215fb77890ce07c532a"} Mar 09 14:18:54 crc kubenswrapper[4722]: I0309 14:18:54.926167 4722 generic.go:334] "Generic (PLEG): container finished" podID="54383c09-c05b-4ee3-b3bc-8f256f75b2f6" containerID="c3cebe48546555b88ce078145fab9d166cc1d0bacebd9d95be046bd3856e7aec" exitCode=0 Mar 09 14:18:54 crc kubenswrapper[4722]: I0309 14:18:54.926413 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8tdrv" event={"ID":"54383c09-c05b-4ee3-b3bc-8f256f75b2f6","Type":"ContainerDied","Data":"c3cebe48546555b88ce078145fab9d166cc1d0bacebd9d95be046bd3856e7aec"} Mar 09 14:18:56 crc kubenswrapper[4722]: I0309 14:18:56.940839 4722 generic.go:334] "Generic (PLEG): container finished" podID="54383c09-c05b-4ee3-b3bc-8f256f75b2f6" containerID="a658b24b1e8f432dc8d5e6b980f61e121a00c0a02b45a0c903feaadee351de49" exitCode=0 Mar 09 14:18:56 crc kubenswrapper[4722]: I0309 14:18:56.940937 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8tdrv" event={"ID":"54383c09-c05b-4ee3-b3bc-8f256f75b2f6","Type":"ContainerDied","Data":"a658b24b1e8f432dc8d5e6b980f61e121a00c0a02b45a0c903feaadee351de49"} Mar 09 14:18:57 crc kubenswrapper[4722]: I0309 14:18:57.960893 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8tdrv" event={"ID":"54383c09-c05b-4ee3-b3bc-8f256f75b2f6","Type":"ContainerStarted","Data":"fc79b53cf57964eda9562d59a9556eb146c5a567c561241397cfebcda7f6ec50"} Mar 09 14:18:57 crc kubenswrapper[4722]: I0309 14:18:57.984348 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8tdrv" podStartSLOduration=3.529418001 podStartE2EDuration="5.984323884s" podCreationTimestamp="2026-03-09 14:18:52 +0000 UTC" firstStartedPulling="2026-03-09 14:18:54.930282035 +0000 UTC m=+975.485850621" lastFinishedPulling="2026-03-09 14:18:57.385187908 +0000 UTC m=+977.940756504" observedRunningTime="2026-03-09 14:18:57.983716638 +0000 UTC m=+978.539285214" watchObservedRunningTime="2026-03-09 14:18:57.984323884 +0000 UTC m=+978.539892470" Mar 09 14:19:00 crc kubenswrapper[4722]: I0309 14:19:00.337600 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82l5sn8"] Mar 09 14:19:00 crc kubenswrapper[4722]: I0309 14:19:00.339225 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82l5sn8" Mar 09 14:19:00 crc kubenswrapper[4722]: I0309 14:19:00.340846 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 09 14:19:00 crc kubenswrapper[4722]: I0309 14:19:00.352950 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82l5sn8"] Mar 09 14:19:00 crc kubenswrapper[4722]: I0309 14:19:00.426355 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/55a96bf1-f247-49ec-9ecf-feb3ae63a814-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82l5sn8\" (UID: \"55a96bf1-f247-49ec-9ecf-feb3ae63a814\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82l5sn8" Mar 09 14:19:00 crc kubenswrapper[4722]: I0309 14:19:00.426402 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/55a96bf1-f247-49ec-9ecf-feb3ae63a814-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82l5sn8\" (UID: \"55a96bf1-f247-49ec-9ecf-feb3ae63a814\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82l5sn8" Mar 09 14:19:00 crc kubenswrapper[4722]: I0309 14:19:00.426552 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zfxf\" (UniqueName: \"kubernetes.io/projected/55a96bf1-f247-49ec-9ecf-feb3ae63a814-kube-api-access-5zfxf\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82l5sn8\" (UID: \"55a96bf1-f247-49ec-9ecf-feb3ae63a814\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82l5sn8" Mar 09 14:19:00 crc kubenswrapper[4722]: I0309 14:19:00.528033 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/55a96bf1-f247-49ec-9ecf-feb3ae63a814-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82l5sn8\" (UID: \"55a96bf1-f247-49ec-9ecf-feb3ae63a814\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82l5sn8" Mar 09 14:19:00 crc kubenswrapper[4722]: I0309 14:19:00.528089 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/55a96bf1-f247-49ec-9ecf-feb3ae63a814-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82l5sn8\" (UID: \"55a96bf1-f247-49ec-9ecf-feb3ae63a814\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82l5sn8" Mar 09 14:19:00 crc kubenswrapper[4722]: I0309 14:19:00.528146 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zfxf\" (UniqueName: \"kubernetes.io/projected/55a96bf1-f247-49ec-9ecf-feb3ae63a814-kube-api-access-5zfxf\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82l5sn8\" (UID: \"55a96bf1-f247-49ec-9ecf-feb3ae63a814\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82l5sn8" Mar 09 14:19:00 crc kubenswrapper[4722]: I0309 14:19:00.528546 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/55a96bf1-f247-49ec-9ecf-feb3ae63a814-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82l5sn8\" (UID: \"55a96bf1-f247-49ec-9ecf-feb3ae63a814\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82l5sn8" Mar 09 14:19:00 crc kubenswrapper[4722]: I0309 14:19:00.528616 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/55a96bf1-f247-49ec-9ecf-feb3ae63a814-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82l5sn8\" (UID: \"55a96bf1-f247-49ec-9ecf-feb3ae63a814\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82l5sn8" Mar 09 14:19:00 crc kubenswrapper[4722]: I0309 14:19:00.548789 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zfxf\" (UniqueName: \"kubernetes.io/projected/55a96bf1-f247-49ec-9ecf-feb3ae63a814-kube-api-access-5zfxf\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82l5sn8\" (UID: \"55a96bf1-f247-49ec-9ecf-feb3ae63a814\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82l5sn8" Mar 09 14:19:00 crc kubenswrapper[4722]: I0309 14:19:00.654121 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82l5sn8" Mar 09 14:19:00 crc kubenswrapper[4722]: I0309 14:19:00.978800 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82l5sn8"] Mar 09 14:19:02 crc kubenswrapper[4722]: I0309 14:19:02.020166 4722 generic.go:334] "Generic (PLEG): container finished" podID="55a96bf1-f247-49ec-9ecf-feb3ae63a814" containerID="e7e0177c168cf56abf716aed8545e09a307faf904ecd65ea355651a01c3205d1" exitCode=0 Mar 09 14:19:02 crc kubenswrapper[4722]: I0309 14:19:02.020236 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82l5sn8" event={"ID":"55a96bf1-f247-49ec-9ecf-feb3ae63a814","Type":"ContainerDied","Data":"e7e0177c168cf56abf716aed8545e09a307faf904ecd65ea355651a01c3205d1"} Mar 09 14:19:02 crc kubenswrapper[4722]: I0309 14:19:02.020514 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82l5sn8" event={"ID":"55a96bf1-f247-49ec-9ecf-feb3ae63a814","Type":"ContainerStarted","Data":"4bf4bac91ae788f123b69bc7d759d2efd1510a8f27b23c15206b6cd71520cb78"} Mar 09 14:19:02 crc kubenswrapper[4722]: I0309 14:19:02.064299 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-564p8"] Mar 09 14:19:02 crc kubenswrapper[4722]: I0309 14:19:02.066540 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-564p8" Mar 09 14:19:02 crc kubenswrapper[4722]: I0309 14:19:02.079181 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-564p8"] Mar 09 14:19:02 crc kubenswrapper[4722]: I0309 14:19:02.188161 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96289499-73be-40e3-beb0-95dd99d9dacd-catalog-content\") pod \"redhat-operators-564p8\" (UID: \"96289499-73be-40e3-beb0-95dd99d9dacd\") " pod="openshift-marketplace/redhat-operators-564p8" Mar 09 14:19:02 crc kubenswrapper[4722]: I0309 14:19:02.188347 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96289499-73be-40e3-beb0-95dd99d9dacd-utilities\") pod \"redhat-operators-564p8\" (UID: \"96289499-73be-40e3-beb0-95dd99d9dacd\") " pod="openshift-marketplace/redhat-operators-564p8" Mar 09 14:19:02 crc kubenswrapper[4722]: I0309 14:19:02.188397 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h85m6\" (UniqueName: \"kubernetes.io/projected/96289499-73be-40e3-beb0-95dd99d9dacd-kube-api-access-h85m6\") pod \"redhat-operators-564p8\" (UID: \"96289499-73be-40e3-beb0-95dd99d9dacd\") " pod="openshift-marketplace/redhat-operators-564p8" Mar 09 14:19:02 crc kubenswrapper[4722]: I0309 14:19:02.289928 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96289499-73be-40e3-beb0-95dd99d9dacd-utilities\") pod \"redhat-operators-564p8\" (UID: \"96289499-73be-40e3-beb0-95dd99d9dacd\") " pod="openshift-marketplace/redhat-operators-564p8" Mar 09 14:19:02 crc kubenswrapper[4722]: I0309 14:19:02.289997 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h85m6\" (UniqueName: \"kubernetes.io/projected/96289499-73be-40e3-beb0-95dd99d9dacd-kube-api-access-h85m6\") pod \"redhat-operators-564p8\" (UID: \"96289499-73be-40e3-beb0-95dd99d9dacd\") " pod="openshift-marketplace/redhat-operators-564p8" Mar 09 14:19:02 crc kubenswrapper[4722]: I0309 14:19:02.290103 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96289499-73be-40e3-beb0-95dd99d9dacd-catalog-content\") pod \"redhat-operators-564p8\" (UID: \"96289499-73be-40e3-beb0-95dd99d9dacd\") " pod="openshift-marketplace/redhat-operators-564p8" Mar 09 14:19:02 crc kubenswrapper[4722]: I0309 14:19:02.290551 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96289499-73be-40e3-beb0-95dd99d9dacd-utilities\") pod \"redhat-operators-564p8\" (UID: \"96289499-73be-40e3-beb0-95dd99d9dacd\") " pod="openshift-marketplace/redhat-operators-564p8" Mar 09 14:19:02 crc kubenswrapper[4722]: I0309 14:19:02.290670 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96289499-73be-40e3-beb0-95dd99d9dacd-catalog-content\") pod \"redhat-operators-564p8\" (UID: \"96289499-73be-40e3-beb0-95dd99d9dacd\") " pod="openshift-marketplace/redhat-operators-564p8" Mar 09 14:19:02 crc kubenswrapper[4722]: I0309 14:19:02.316423 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-h85m6\" (UniqueName: \"kubernetes.io/projected/96289499-73be-40e3-beb0-95dd99d9dacd-kube-api-access-h85m6\") pod \"redhat-operators-564p8\" (UID: \"96289499-73be-40e3-beb0-95dd99d9dacd\") " pod="openshift-marketplace/redhat-operators-564p8" Mar 09 14:19:02 crc kubenswrapper[4722]: I0309 14:19:02.384891 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-564p8" Mar 09 14:19:02 crc kubenswrapper[4722]: I0309 14:19:02.838352 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-564p8"] Mar 09 14:19:03 crc kubenswrapper[4722]: I0309 14:19:03.028482 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-564p8" event={"ID":"96289499-73be-40e3-beb0-95dd99d9dacd","Type":"ContainerStarted","Data":"31fc13675822d7ca66c2c7c58b3257e52bcc6033eb2450b2c2e07408b4344915"} Mar 09 14:19:03 crc kubenswrapper[4722]: I0309 14:19:03.277060 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8tdrv" Mar 09 14:19:03 crc kubenswrapper[4722]: I0309 14:19:03.277886 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8tdrv" Mar 09 14:19:03 crc kubenswrapper[4722]: I0309 14:19:03.324414 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8tdrv" Mar 09 14:19:04 crc kubenswrapper[4722]: I0309 14:19:04.037970 4722 generic.go:334] "Generic (PLEG): container finished" podID="55a96bf1-f247-49ec-9ecf-feb3ae63a814" containerID="cf8f1533de0eb77d8d754b0e70ba84975d758c07f8eda9679c9bf415917055e7" exitCode=0 Mar 09 14:19:04 crc kubenswrapper[4722]: I0309 14:19:04.038559 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82l5sn8" event={"ID":"55a96bf1-f247-49ec-9ecf-feb3ae63a814","Type":"ContainerDied","Data":"cf8f1533de0eb77d8d754b0e70ba84975d758c07f8eda9679c9bf415917055e7"} Mar 09 14:19:04 crc kubenswrapper[4722]: I0309 14:19:04.041662 4722 generic.go:334] "Generic (PLEG): container finished" podID="96289499-73be-40e3-beb0-95dd99d9dacd" containerID="89a98873c734286cb961b908cccb9017d3fd8f467a3e8b3c258a59c9ab8cf131" exitCode=0 Mar 09 14:19:04 crc kubenswrapper[4722]: I0309 14:19:04.042626 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-564p8" event={"ID":"96289499-73be-40e3-beb0-95dd99d9dacd","Type":"ContainerDied","Data":"89a98873c734286cb961b908cccb9017d3fd8f467a3e8b3c258a59c9ab8cf131"} Mar 09 14:19:04 crc kubenswrapper[4722]: I0309 14:19:04.088293 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8tdrv" Mar 09 14:19:05 crc kubenswrapper[4722]: I0309 14:19:05.064240 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-564p8" event={"ID":"96289499-73be-40e3-beb0-95dd99d9dacd","Type":"ContainerStarted","Data":"62fff1460329808fe35ae0d002972417c34c62fbdba1c4f02c9f917fec807ccd"} Mar 09 14:19:05 crc kubenswrapper[4722]: I0309 14:19:05.067812 4722 generic.go:334] "Generic (PLEG): container finished" podID="55a96bf1-f247-49ec-9ecf-feb3ae63a814" containerID="5e3a6ab94a77f41287e8197ce4a567be7091c2e8d137c211e9741a48f23e270d" exitCode=0 Mar 09 14:19:05 crc kubenswrapper[4722]: I0309 14:19:05.067877 4722 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82l5sn8" event={"ID":"55a96bf1-f247-49ec-9ecf-feb3ae63a814","Type":"ContainerDied","Data":"5e3a6ab94a77f41287e8197ce4a567be7091c2e8d137c211e9741a48f23e270d"} Mar 09 14:19:06 crc kubenswrapper[4722]: I0309 14:19:06.076799 4722 generic.go:334] "Generic (PLEG): container finished" podID="96289499-73be-40e3-beb0-95dd99d9dacd" containerID="62fff1460329808fe35ae0d002972417c34c62fbdba1c4f02c9f917fec807ccd" exitCode=0 Mar 09 14:19:06 crc kubenswrapper[4722]: I0309 14:19:06.076892 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-564p8" event={"ID":"96289499-73be-40e3-beb0-95dd99d9dacd","Type":"ContainerDied","Data":"62fff1460329808fe35ae0d002972417c34c62fbdba1c4f02c9f917fec807ccd"} Mar 09 14:19:06 crc kubenswrapper[4722]: I0309 14:19:06.471179 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82l5sn8" Mar 09 14:19:06 crc kubenswrapper[4722]: I0309 14:19:06.579945 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/55a96bf1-f247-49ec-9ecf-feb3ae63a814-bundle\") pod \"55a96bf1-f247-49ec-9ecf-feb3ae63a814\" (UID: \"55a96bf1-f247-49ec-9ecf-feb3ae63a814\") " Mar 09 14:19:06 crc kubenswrapper[4722]: I0309 14:19:06.580318 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zfxf\" (UniqueName: \"kubernetes.io/projected/55a96bf1-f247-49ec-9ecf-feb3ae63a814-kube-api-access-5zfxf\") pod \"55a96bf1-f247-49ec-9ecf-feb3ae63a814\" (UID: \"55a96bf1-f247-49ec-9ecf-feb3ae63a814\") " Mar 09 14:19:06 crc kubenswrapper[4722]: I0309 14:19:06.580471 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/55a96bf1-f247-49ec-9ecf-feb3ae63a814-util\") pod \"55a96bf1-f247-49ec-9ecf-feb3ae63a814\" (UID: \"55a96bf1-f247-49ec-9ecf-feb3ae63a814\") " Mar 09 14:19:06 crc kubenswrapper[4722]: I0309 14:19:06.580937 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55a96bf1-f247-49ec-9ecf-feb3ae63a814-bundle" (OuterVolumeSpecName: "bundle") pod "55a96bf1-f247-49ec-9ecf-feb3ae63a814" (UID: "55a96bf1-f247-49ec-9ecf-feb3ae63a814"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:19:06 crc kubenswrapper[4722]: I0309 14:19:06.592158 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55a96bf1-f247-49ec-9ecf-feb3ae63a814-kube-api-access-5zfxf" (OuterVolumeSpecName: "kube-api-access-5zfxf") pod "55a96bf1-f247-49ec-9ecf-feb3ae63a814" (UID: "55a96bf1-f247-49ec-9ecf-feb3ae63a814"). InnerVolumeSpecName "kube-api-access-5zfxf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:19:06 crc kubenswrapper[4722]: I0309 14:19:06.682427 4722 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/55a96bf1-f247-49ec-9ecf-feb3ae63a814-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:19:06 crc kubenswrapper[4722]: I0309 14:19:06.682461 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zfxf\" (UniqueName: \"kubernetes.io/projected/55a96bf1-f247-49ec-9ecf-feb3ae63a814-kube-api-access-5zfxf\") on node \"crc\" DevicePath \"\"" Mar 09 14:19:06 crc kubenswrapper[4722]: I0309 14:19:06.854794 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8tdrv"] Mar 09 14:19:07 crc kubenswrapper[4722]: I0309 14:19:07.085187 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82l5sn8" event={"ID":"55a96bf1-f247-49ec-9ecf-feb3ae63a814","Type":"ContainerDied","Data":"4bf4bac91ae788f123b69bc7d759d2efd1510a8f27b23c15206b6cd71520cb78"} Mar 09 14:19:07 crc kubenswrapper[4722]: I0309 14:19:07.085512 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4bf4bac91ae788f123b69bc7d759d2efd1510a8f27b23c15206b6cd71520cb78" Mar 09 14:19:07 crc kubenswrapper[4722]: I0309 14:19:07.085237 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82l5sn8" Mar 09 14:19:07 crc kubenswrapper[4722]: I0309 14:19:07.088113 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-564p8" event={"ID":"96289499-73be-40e3-beb0-95dd99d9dacd","Type":"ContainerStarted","Data":"0da2d648ad5c9045606ec53f7e62448104c4aa4bb78ab1f818bdeed5dcbaceba"} Mar 09 14:19:07 crc kubenswrapper[4722]: I0309 14:19:07.088323 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8tdrv" podUID="54383c09-c05b-4ee3-b3bc-8f256f75b2f6" containerName="registry-server" containerID="cri-o://fc79b53cf57964eda9562d59a9556eb146c5a567c561241397cfebcda7f6ec50" gracePeriod=2 Mar 09 14:19:07 crc kubenswrapper[4722]: I0309 14:19:07.117982 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-564p8" podStartSLOduration=2.444817022 podStartE2EDuration="5.117957466s" podCreationTimestamp="2026-03-09 14:19:02 +0000 UTC" firstStartedPulling="2026-03-09 14:19:04.043461387 +0000 UTC m=+984.599029963" lastFinishedPulling="2026-03-09 14:19:06.716601811 +0000 UTC m=+987.272170407" observedRunningTime="2026-03-09 14:19:07.111466605 +0000 UTC m=+987.667035211" watchObservedRunningTime="2026-03-09 14:19:07.117957466 +0000 UTC m=+987.673526052" Mar 09 14:19:07 crc kubenswrapper[4722]: I0309 14:19:07.564266 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55a96bf1-f247-49ec-9ecf-feb3ae63a814-util" (OuterVolumeSpecName: "util") pod "55a96bf1-f247-49ec-9ecf-feb3ae63a814" (UID: "55a96bf1-f247-49ec-9ecf-feb3ae63a814"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:19:07 crc kubenswrapper[4722]: I0309 14:19:07.595861 4722 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/55a96bf1-f247-49ec-9ecf-feb3ae63a814-util\") on node \"crc\" DevicePath \"\"" Mar 09 14:19:08 crc kubenswrapper[4722]: I0309 14:19:08.098532 4722 generic.go:334] "Generic (PLEG): container finished" podID="54383c09-c05b-4ee3-b3bc-8f256f75b2f6" containerID="fc79b53cf57964eda9562d59a9556eb146c5a567c561241397cfebcda7f6ec50" exitCode=0 Mar 09 14:19:08 crc kubenswrapper[4722]: I0309 14:19:08.098647 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8tdrv" event={"ID":"54383c09-c05b-4ee3-b3bc-8f256f75b2f6","Type":"ContainerDied","Data":"fc79b53cf57964eda9562d59a9556eb146c5a567c561241397cfebcda7f6ec50"} Mar 09 14:19:08 crc kubenswrapper[4722]: I0309 14:19:08.660279 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8tdrv" Mar 09 14:19:08 crc kubenswrapper[4722]: I0309 14:19:08.715183 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54383c09-c05b-4ee3-b3bc-8f256f75b2f6-catalog-content\") pod \"54383c09-c05b-4ee3-b3bc-8f256f75b2f6\" (UID: \"54383c09-c05b-4ee3-b3bc-8f256f75b2f6\") " Mar 09 14:19:08 crc kubenswrapper[4722]: I0309 14:19:08.715267 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54383c09-c05b-4ee3-b3bc-8f256f75b2f6-utilities\") pod \"54383c09-c05b-4ee3-b3bc-8f256f75b2f6\" (UID: \"54383c09-c05b-4ee3-b3bc-8f256f75b2f6\") " Mar 09 14:19:08 crc kubenswrapper[4722]: I0309 14:19:08.715373 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nl4s6\" (UniqueName: \"kubernetes.io/projected/54383c09-c05b-4ee3-b3bc-8f256f75b2f6-kube-api-access-nl4s6\") pod \"54383c09-c05b-4ee3-b3bc-8f256f75b2f6\" (UID: \"54383c09-c05b-4ee3-b3bc-8f256f75b2f6\") " Mar 09 14:19:08 crc kubenswrapper[4722]: I0309 14:19:08.718475 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54383c09-c05b-4ee3-b3bc-8f256f75b2f6-utilities" (OuterVolumeSpecName: "utilities") pod "54383c09-c05b-4ee3-b3bc-8f256f75b2f6" (UID: "54383c09-c05b-4ee3-b3bc-8f256f75b2f6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:19:08 crc kubenswrapper[4722]: I0309 14:19:08.723444 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54383c09-c05b-4ee3-b3bc-8f256f75b2f6-kube-api-access-nl4s6" (OuterVolumeSpecName: "kube-api-access-nl4s6") pod "54383c09-c05b-4ee3-b3bc-8f256f75b2f6" (UID: "54383c09-c05b-4ee3-b3bc-8f256f75b2f6"). InnerVolumeSpecName "kube-api-access-nl4s6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:19:08 crc kubenswrapper[4722]: I0309 14:19:08.818366 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54383c09-c05b-4ee3-b3bc-8f256f75b2f6-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 14:19:08 crc kubenswrapper[4722]: I0309 14:19:08.818410 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nl4s6\" (UniqueName: \"kubernetes.io/projected/54383c09-c05b-4ee3-b3bc-8f256f75b2f6-kube-api-access-nl4s6\") on node \"crc\" DevicePath \"\"" Mar 09 14:19:08 crc kubenswrapper[4722]: I0309 14:19:08.832084 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54383c09-c05b-4ee3-b3bc-8f256f75b2f6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "54383c09-c05b-4ee3-b3bc-8f256f75b2f6" (UID: "54383c09-c05b-4ee3-b3bc-8f256f75b2f6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:19:08 crc kubenswrapper[4722]: I0309 14:19:08.919880 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54383c09-c05b-4ee3-b3bc-8f256f75b2f6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 14:19:09 crc kubenswrapper[4722]: I0309 14:19:09.106273 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8tdrv" event={"ID":"54383c09-c05b-4ee3-b3bc-8f256f75b2f6","Type":"ContainerDied","Data":"68882470a29a7f05f45e3ab5a1220437ab54b89e5e7bc215fb77890ce07c532a"} Mar 09 14:19:09 crc kubenswrapper[4722]: I0309 14:19:09.106340 4722 scope.go:117] "RemoveContainer" containerID="fc79b53cf57964eda9562d59a9556eb146c5a567c561241397cfebcda7f6ec50" Mar 09 14:19:09 crc kubenswrapper[4722]: I0309 14:19:09.106359 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8tdrv" Mar 09 14:19:09 crc kubenswrapper[4722]: I0309 14:19:09.127223 4722 scope.go:117] "RemoveContainer" containerID="a658b24b1e8f432dc8d5e6b980f61e121a00c0a02b45a0c903feaadee351de49" Mar 09 14:19:09 crc kubenswrapper[4722]: I0309 14:19:09.142801 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8tdrv"] Mar 09 14:19:09 crc kubenswrapper[4722]: I0309 14:19:09.153422 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8tdrv"] Mar 09 14:19:09 crc kubenswrapper[4722]: I0309 14:19:09.166437 4722 scope.go:117] "RemoveContainer" containerID="c3cebe48546555b88ce078145fab9d166cc1d0bacebd9d95be046bd3856e7aec" Mar 09 14:19:09 crc kubenswrapper[4722]: I0309 14:19:09.682619 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-svlc5"] Mar 09 14:19:09 crc kubenswrapper[4722]: E0309 14:19:09.683189 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55a96bf1-f247-49ec-9ecf-feb3ae63a814" containerName="util" Mar 09 14:19:09 crc kubenswrapper[4722]: I0309 14:19:09.683257 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="55a96bf1-f247-49ec-9ecf-feb3ae63a814" containerName="util" Mar 09 14:19:09 crc kubenswrapper[4722]: E0309 14:19:09.683279 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54383c09-c05b-4ee3-b3bc-8f256f75b2f6" containerName="registry-server" Mar 09 14:19:09 crc kubenswrapper[4722]: I0309 14:19:09.683287 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="54383c09-c05b-4ee3-b3bc-8f256f75b2f6" containerName="registry-server" Mar 09 14:19:09 crc kubenswrapper[4722]: E0309 14:19:09.683306 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55a96bf1-f247-49ec-9ecf-feb3ae63a814" containerName="pull" Mar 09 14:19:09 crc kubenswrapper[4722]: I0309 14:19:09.683313 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="55a96bf1-f247-49ec-9ecf-feb3ae63a814" containerName="pull" Mar 09 14:19:09 crc kubenswrapper[4722]: E0309 14:19:09.683328 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54383c09-c05b-4ee3-b3bc-8f256f75b2f6" containerName="extract-utilities" Mar 09 14:19:09 crc kubenswrapper[4722]: I0309 14:19:09.683335 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="54383c09-c05b-4ee3-b3bc-8f256f75b2f6" containerName="extract-utilities" Mar 09 14:19:09 crc kubenswrapper[4722]: E0309 14:19:09.683353 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54383c09-c05b-4ee3-b3bc-8f256f75b2f6" containerName="extract-content" Mar 09 14:19:09 crc kubenswrapper[4722]: I0309 14:19:09.683360 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="54383c09-c05b-4ee3-b3bc-8f256f75b2f6" containerName="extract-content" Mar 09 14:19:09 crc kubenswrapper[4722]: E0309 14:19:09.683373 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55a96bf1-f247-49ec-9ecf-feb3ae63a814" containerName="extract" Mar 09 14:19:09 crc kubenswrapper[4722]: I0309 14:19:09.683381 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="55a96bf1-f247-49ec-9ecf-feb3ae63a814" containerName="extract" Mar 09 14:19:09 crc kubenswrapper[4722]: I0309 14:19:09.683546 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="55a96bf1-f247-49ec-9ecf-feb3ae63a814" containerName="extract" Mar 09 14:19:09 crc kubenswrapper[4722]: I0309 14:19:09.683564 4722 
memory_manager.go:354] "RemoveStaleState removing state" podUID="54383c09-c05b-4ee3-b3bc-8f256f75b2f6" containerName="registry-server" Mar 09 14:19:09 crc kubenswrapper[4722]: I0309 14:19:09.684225 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-svlc5" Mar 09 14:19:09 crc kubenswrapper[4722]: I0309 14:19:09.685889 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 09 14:19:09 crc kubenswrapper[4722]: I0309 14:19:09.686244 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 09 14:19:09 crc kubenswrapper[4722]: I0309 14:19:09.686357 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-7jctd" Mar 09 14:19:09 crc kubenswrapper[4722]: I0309 14:19:09.696374 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-svlc5"] Mar 09 14:19:09 crc kubenswrapper[4722]: I0309 14:19:09.834591 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9pm8\" (UniqueName: \"kubernetes.io/projected/0556dc80-2dcf-4f82-8b4c-96198af98e00-kube-api-access-x9pm8\") pod \"nmstate-operator-75c5dccd6c-svlc5\" (UID: \"0556dc80-2dcf-4f82-8b4c-96198af98e00\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-svlc5" Mar 09 14:19:09 crc kubenswrapper[4722]: I0309 14:19:09.935835 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9pm8\" (UniqueName: \"kubernetes.io/projected/0556dc80-2dcf-4f82-8b4c-96198af98e00-kube-api-access-x9pm8\") pod \"nmstate-operator-75c5dccd6c-svlc5\" (UID: \"0556dc80-2dcf-4f82-8b4c-96198af98e00\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-svlc5" Mar 09 14:19:09 crc kubenswrapper[4722]: I0309 14:19:09.958404 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9pm8\" (UniqueName: \"kubernetes.io/projected/0556dc80-2dcf-4f82-8b4c-96198af98e00-kube-api-access-x9pm8\") pod \"nmstate-operator-75c5dccd6c-svlc5\" (UID: \"0556dc80-2dcf-4f82-8b4c-96198af98e00\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-svlc5" Mar 09 14:19:10 crc kubenswrapper[4722]: I0309 14:19:10.001236 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-svlc5" Mar 09 14:19:10 crc kubenswrapper[4722]: I0309 14:19:10.174906 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54383c09-c05b-4ee3-b3bc-8f256f75b2f6" path="/var/lib/kubelet/pods/54383c09-c05b-4ee3-b3bc-8f256f75b2f6/volumes" Mar 09 14:19:10 crc kubenswrapper[4722]: I0309 14:19:10.546374 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-svlc5"] Mar 09 14:19:10 crc kubenswrapper[4722]: W0309 14:19:10.557218 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0556dc80_2dcf_4f82_8b4c_96198af98e00.slice/crio-20c0a843873d1c9cd31ebcaf7adc64a423658656bfffffc30e327ccad417bc88 WatchSource:0}: Error finding container 20c0a843873d1c9cd31ebcaf7adc64a423658656bfffffc30e327ccad417bc88: Status 404 returned error can't find the container with id 20c0a843873d1c9cd31ebcaf7adc64a423658656bfffffc30e327ccad417bc88 Mar 09 14:19:11 crc kubenswrapper[4722]: I0309 14:19:11.136528 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-svlc5" event={"ID":"0556dc80-2dcf-4f82-8b4c-96198af98e00","Type":"ContainerStarted","Data":"20c0a843873d1c9cd31ebcaf7adc64a423658656bfffffc30e327ccad417bc88"} Mar 09 14:19:12 crc kubenswrapper[4722]: I0309 14:19:12.385929 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-564p8" Mar 09 14:19:12 crc kubenswrapper[4722]: I0309 14:19:12.386314 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-564p8" Mar 09 14:19:13 crc kubenswrapper[4722]: I0309 14:19:13.430190 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-564p8" podUID="96289499-73be-40e3-beb0-95dd99d9dacd" containerName="registry-server" probeResult="failure" output=< Mar 09 14:19:13 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s Mar 09 14:19:13 crc kubenswrapper[4722]: > Mar 09 14:19:14 crc kubenswrapper[4722]: I0309 14:19:14.159842 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-svlc5" event={"ID":"0556dc80-2dcf-4f82-8b4c-96198af98e00","Type":"ContainerStarted","Data":"dbe2a125d451300dd58f86cff3845dc271f157d43108a234ca6cbc6e3b1d12dd"} Mar 09 14:19:14 crc kubenswrapper[4722]: I0309 14:19:14.191894 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-svlc5" podStartSLOduration=2.382455288 podStartE2EDuration="5.19186607s" podCreationTimestamp="2026-03-09 14:19:09 +0000 UTC" firstStartedPulling="2026-03-09 14:19:10.559430092 +0000 UTC m=+991.114998668" lastFinishedPulling="2026-03-09 14:19:13.368840874 +0000 UTC m=+993.924409450" observedRunningTime="2026-03-09 14:19:14.178419445 +0000 UTC m=+994.733988071" watchObservedRunningTime="2026-03-09 14:19:14.19186607 +0000 UTC m=+994.747434656" Mar 09 14:19:15 crc kubenswrapper[4722]: I0309 14:19:15.255486 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-zvlr9"] Mar 09 14:19:15 crc kubenswrapper[4722]: I0309 14:19:15.257611 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-69594cc75-zvlr9" Mar 09 14:19:15 crc kubenswrapper[4722]: I0309 14:19:15.264302 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-c6x8v" Mar 09 14:19:15 crc kubenswrapper[4722]: I0309 14:19:15.277064 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-jdp69"] Mar 09 14:19:15 crc kubenswrapper[4722]: I0309 14:19:15.277999 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-786f45cff4-jdp69" Mar 09 14:19:15 crc kubenswrapper[4722]: I0309 14:19:15.288594 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-zvlr9"] Mar 09 14:19:15 crc kubenswrapper[4722]: I0309 14:19:15.306938 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 09 14:19:15 crc kubenswrapper[4722]: I0309 14:19:15.343469 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzwt6\" (UniqueName: \"kubernetes.io/projected/6f9a1c28-1ff4-4ead-91d8-5dc47c7b4d55-kube-api-access-gzwt6\") pod \"nmstate-metrics-69594cc75-zvlr9\" (UID: \"6f9a1c28-1ff4-4ead-91d8-5dc47c7b4d55\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-zvlr9" Mar 09 14:19:15 crc kubenswrapper[4722]: I0309 14:19:15.347891 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-67pzj"] Mar 09 14:19:15 crc kubenswrapper[4722]: I0309 14:19:15.349036 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-67pzj" Mar 09 14:19:15 crc kubenswrapper[4722]: I0309 14:19:15.356694 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-jdp69"] Mar 09 14:19:15 crc kubenswrapper[4722]: I0309 14:19:15.435758 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-vnpzr"] Mar 09 14:19:15 crc kubenswrapper[4722]: I0309 14:19:15.441867 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-vnpzr" Mar 09 14:19:15 crc kubenswrapper[4722]: I0309 14:19:15.444774 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/ade25a79-1e43-41ed-be91-ce97aa1c4103-dbus-socket\") pod \"nmstate-handler-67pzj\" (UID: \"ade25a79-1e43-41ed-be91-ce97aa1c4103\") " pod="openshift-nmstate/nmstate-handler-67pzj" Mar 09 14:19:15 crc kubenswrapper[4722]: I0309 14:19:15.444889 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwrth\" (UniqueName: \"kubernetes.io/projected/ade25a79-1e43-41ed-be91-ce97aa1c4103-kube-api-access-mwrth\") pod \"nmstate-handler-67pzj\" (UID: \"ade25a79-1e43-41ed-be91-ce97aa1c4103\") " pod="openshift-nmstate/nmstate-handler-67pzj" Mar 09 14:19:15 crc kubenswrapper[4722]: I0309 14:19:15.444916 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/ade25a79-1e43-41ed-be91-ce97aa1c4103-ovs-socket\") pod \"nmstate-handler-67pzj\" (UID: \"ade25a79-1e43-41ed-be91-ce97aa1c4103\") " pod="openshift-nmstate/nmstate-handler-67pzj" Mar 09 14:19:15 crc kubenswrapper[4722]: I0309 14:19:15.445027 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzwt6\" (UniqueName: \"kubernetes.io/projected/6f9a1c28-1ff4-4ead-91d8-5dc47c7b4d55-kube-api-access-gzwt6\") pod \"nmstate-metrics-69594cc75-zvlr9\" (UID: \"6f9a1c28-1ff4-4ead-91d8-5dc47c7b4d55\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-zvlr9" Mar 09 14:19:15 crc kubenswrapper[4722]: I0309 14:19:15.445058 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nbz5\" (UniqueName: \"kubernetes.io/projected/c0af161b-a8d5-4a36-b1c2-0a4d43820c73-kube-api-access-2nbz5\") pod \"nmstate-webhook-786f45cff4-jdp69\" (UID: \"c0af161b-a8d5-4a36-b1c2-0a4d43820c73\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-jdp69" Mar 09 14:19:15 crc kubenswrapper[4722]: I0309 14:19:15.445133 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/ade25a79-1e43-41ed-be91-ce97aa1c4103-nmstate-lock\") pod \"nmstate-handler-67pzj\" (UID: \"ade25a79-1e43-41ed-be91-ce97aa1c4103\") " pod="openshift-nmstate/nmstate-handler-67pzj" Mar 09 14:19:15 crc kubenswrapper[4722]: I0309 14:19:15.445170 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/c0af161b-a8d5-4a36-b1c2-0a4d43820c73-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-jdp69\" (UID: \"c0af161b-a8d5-4a36-b1c2-0a4d43820c73\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-jdp69" Mar 09 14:19:15 crc kubenswrapper[4722]: I0309 14:19:15.446635 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 09 14:19:15 crc kubenswrapper[4722]: I0309 14:19:15.447104 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 09 14:19:15 crc kubenswrapper[4722]: I0309 14:19:15.447559 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-tk9th" Mar 09 14:19:15 crc kubenswrapper[4722]: I0309 14:19:15.450467 4722 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-vnpzr"] Mar 09 14:19:15 crc kubenswrapper[4722]: I0309 14:19:15.467525 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzwt6\" (UniqueName: \"kubernetes.io/projected/6f9a1c28-1ff4-4ead-91d8-5dc47c7b4d55-kube-api-access-gzwt6\") pod \"nmstate-metrics-69594cc75-zvlr9\" (UID: \"6f9a1c28-1ff4-4ead-91d8-5dc47c7b4d55\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-zvlr9" Mar 09 14:19:15 crc kubenswrapper[4722]: I0309 14:19:15.547343 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/ade25a79-1e43-41ed-be91-ce97aa1c4103-nmstate-lock\") pod \"nmstate-handler-67pzj\" (UID: \"ade25a79-1e43-41ed-be91-ce97aa1c4103\") " pod="openshift-nmstate/nmstate-handler-67pzj" Mar 09 14:19:15 crc kubenswrapper[4722]: I0309 14:19:15.547388 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/6a030cdd-139c-4d44-bf79-43f14e97d9f9-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-vnpzr\" (UID: \"6a030cdd-139c-4d44-bf79-43f14e97d9f9\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-vnpzr" Mar 09 14:19:15 crc kubenswrapper[4722]: I0309 14:19:15.547415 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/c0af161b-a8d5-4a36-b1c2-0a4d43820c73-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-jdp69\" (UID: \"c0af161b-a8d5-4a36-b1c2-0a4d43820c73\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-jdp69" Mar 09 14:19:15 crc kubenswrapper[4722]: I0309 14:19:15.547451 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/ade25a79-1e43-41ed-be91-ce97aa1c4103-dbus-socket\") pod \"nmstate-handler-67pzj\" (UID: \"ade25a79-1e43-41ed-be91-ce97aa1c4103\") " pod="openshift-nmstate/nmstate-handler-67pzj" Mar 09 14:19:15 crc kubenswrapper[4722]: I0309 14:19:15.547482 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/6a030cdd-139c-4d44-bf79-43f14e97d9f9-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-vnpzr\" (UID: \"6a030cdd-139c-4d44-bf79-43f14e97d9f9\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-vnpzr" Mar 09 14:19:15 crc kubenswrapper[4722]: I0309 14:19:15.547511 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jpvp\" (UniqueName: \"kubernetes.io/projected/6a030cdd-139c-4d44-bf79-43f14e97d9f9-kube-api-access-4jpvp\") pod \"nmstate-console-plugin-5dcbbd79cf-vnpzr\" (UID: \"6a030cdd-139c-4d44-bf79-43f14e97d9f9\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-vnpzr" Mar 09 14:19:15 crc kubenswrapper[4722]: I0309 14:19:15.547516 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/ade25a79-1e43-41ed-be91-ce97aa1c4103-nmstate-lock\") pod \"nmstate-handler-67pzj\" (UID: \"ade25a79-1e43-41ed-be91-ce97aa1c4103\") " pod="openshift-nmstate/nmstate-handler-67pzj" Mar 09 14:19:15 crc kubenswrapper[4722]: I0309 14:19:15.547685 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwrth\" 
(UniqueName: \"kubernetes.io/projected/ade25a79-1e43-41ed-be91-ce97aa1c4103-kube-api-access-mwrth\") pod \"nmstate-handler-67pzj\" (UID: \"ade25a79-1e43-41ed-be91-ce97aa1c4103\") " pod="openshift-nmstate/nmstate-handler-67pzj" Mar 09 14:19:15 crc kubenswrapper[4722]: I0309 14:19:15.547728 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/ade25a79-1e43-41ed-be91-ce97aa1c4103-ovs-socket\") pod \"nmstate-handler-67pzj\" (UID: \"ade25a79-1e43-41ed-be91-ce97aa1c4103\") " pod="openshift-nmstate/nmstate-handler-67pzj" Mar 09 14:19:15 crc kubenswrapper[4722]: I0309 14:19:15.547787 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nbz5\" (UniqueName: \"kubernetes.io/projected/c0af161b-a8d5-4a36-b1c2-0a4d43820c73-kube-api-access-2nbz5\") pod \"nmstate-webhook-786f45cff4-jdp69\" (UID: \"c0af161b-a8d5-4a36-b1c2-0a4d43820c73\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-jdp69" Mar 09 14:19:15 crc kubenswrapper[4722]: I0309 14:19:15.547859 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/ade25a79-1e43-41ed-be91-ce97aa1c4103-dbus-socket\") pod \"nmstate-handler-67pzj\" (UID: \"ade25a79-1e43-41ed-be91-ce97aa1c4103\") " pod="openshift-nmstate/nmstate-handler-67pzj" Mar 09 14:19:15 crc kubenswrapper[4722]: I0309 14:19:15.547891 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/ade25a79-1e43-41ed-be91-ce97aa1c4103-ovs-socket\") pod \"nmstate-handler-67pzj\" (UID: \"ade25a79-1e43-41ed-be91-ce97aa1c4103\") " pod="openshift-nmstate/nmstate-handler-67pzj" Mar 09 14:19:15 crc kubenswrapper[4722]: I0309 14:19:15.551791 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/c0af161b-a8d5-4a36-b1c2-0a4d43820c73-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-jdp69\" (UID: \"c0af161b-a8d5-4a36-b1c2-0a4d43820c73\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-jdp69" Mar 09 14:19:15 crc kubenswrapper[4722]: I0309 14:19:15.576440 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-69594cc75-zvlr9" Mar 09 14:19:15 crc kubenswrapper[4722]: I0309 14:19:15.580311 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nbz5\" (UniqueName: \"kubernetes.io/projected/c0af161b-a8d5-4a36-b1c2-0a4d43820c73-kube-api-access-2nbz5\") pod \"nmstate-webhook-786f45cff4-jdp69\" (UID: \"c0af161b-a8d5-4a36-b1c2-0a4d43820c73\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-jdp69" Mar 09 14:19:15 crc kubenswrapper[4722]: I0309 14:19:15.592424 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwrth\" (UniqueName: \"kubernetes.io/projected/ade25a79-1e43-41ed-be91-ce97aa1c4103-kube-api-access-mwrth\") pod \"nmstate-handler-67pzj\" (UID: \"ade25a79-1e43-41ed-be91-ce97aa1c4103\") " pod="openshift-nmstate/nmstate-handler-67pzj" Mar 09 14:19:15 crc kubenswrapper[4722]: I0309 14:19:15.623294 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-786f45cff4-jdp69" Mar 09 14:19:15 crc kubenswrapper[4722]: I0309 14:19:15.643158 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-78fdf7cd4f-mt82k"] Mar 09 14:19:15 crc kubenswrapper[4722]: I0309 14:19:15.644099 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-78fdf7cd4f-mt82k" Mar 09 14:19:15 crc kubenswrapper[4722]: I0309 14:19:15.650143 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/6a030cdd-139c-4d44-bf79-43f14e97d9f9-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-vnpzr\" (UID: \"6a030cdd-139c-4d44-bf79-43f14e97d9f9\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-vnpzr" Mar 09 14:19:15 crc kubenswrapper[4722]: I0309 14:19:15.650965 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/6a030cdd-139c-4d44-bf79-43f14e97d9f9-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-vnpzr\" (UID: \"6a030cdd-139c-4d44-bf79-43f14e97d9f9\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-vnpzr" Mar 09 14:19:15 crc kubenswrapper[4722]: I0309 14:19:15.651068 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/6a030cdd-139c-4d44-bf79-43f14e97d9f9-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-vnpzr\" (UID: \"6a030cdd-139c-4d44-bf79-43f14e97d9f9\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-vnpzr" Mar 09 14:19:15 crc kubenswrapper[4722]: I0309 14:19:15.651100 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jpvp\" (UniqueName: \"kubernetes.io/projected/6a030cdd-139c-4d44-bf79-43f14e97d9f9-kube-api-access-4jpvp\") pod \"nmstate-console-plugin-5dcbbd79cf-vnpzr\" (UID: \"6a030cdd-139c-4d44-bf79-43f14e97d9f9\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-vnpzr" Mar 09 14:19:15 crc kubenswrapper[4722]: I0309 14:19:15.665079 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/6a030cdd-139c-4d44-bf79-43f14e97d9f9-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-vnpzr\" (UID: \"6a030cdd-139c-4d44-bf79-43f14e97d9f9\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-vnpzr" Mar 09 14:19:15 crc kubenswrapper[4722]: I0309 14:19:15.678908 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-67pzj" Mar 09 14:19:15 crc kubenswrapper[4722]: I0309 14:19:15.696818 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jpvp\" (UniqueName: \"kubernetes.io/projected/6a030cdd-139c-4d44-bf79-43f14e97d9f9-kube-api-access-4jpvp\") pod \"nmstate-console-plugin-5dcbbd79cf-vnpzr\" (UID: \"6a030cdd-139c-4d44-bf79-43f14e97d9f9\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-vnpzr" Mar 09 14:19:15 crc kubenswrapper[4722]: I0309 14:19:15.708786 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-78fdf7cd4f-mt82k"] Mar 09 14:19:15 crc kubenswrapper[4722]: W0309 14:19:15.729319 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podade25a79_1e43_41ed_be91_ce97aa1c4103.slice/crio-ab16f89100c5f36a25cb7e241c707973b9ff7892c82017f26a23bbc2e8923c03 WatchSource:0}: Error finding container ab16f89100c5f36a25cb7e241c707973b9ff7892c82017f26a23bbc2e8923c03: Status 404 returned error can't find the container with id ab16f89100c5f36a25cb7e241c707973b9ff7892c82017f26a23bbc2e8923c03 Mar 09 14:19:15 crc kubenswrapper[4722]: I0309 14:19:15.752904 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/aa31f801-ed80-405f-960c-74c254d4f9ca-console-serving-cert\") pod \"console-78fdf7cd4f-mt82k\" (UID: \"aa31f801-ed80-405f-960c-74c254d4f9ca\") " pod="openshift-console/console-78fdf7cd4f-mt82k" Mar 09 14:19:15 crc kubenswrapper[4722]: I0309 14:19:15.752979 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/aa31f801-ed80-405f-960c-74c254d4f9ca-service-ca\") pod \"console-78fdf7cd4f-mt82k\" (UID: \"aa31f801-ed80-405f-960c-74c254d4f9ca\") " pod="openshift-console/console-78fdf7cd4f-mt82k" Mar 09 14:19:15 crc kubenswrapper[4722]: I0309 14:19:15.753006 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpqsr\" (UniqueName: \"kubernetes.io/projected/aa31f801-ed80-405f-960c-74c254d4f9ca-kube-api-access-bpqsr\") pod \"console-78fdf7cd4f-mt82k\" (UID: \"aa31f801-ed80-405f-960c-74c254d4f9ca\") " pod="openshift-console/console-78fdf7cd4f-mt82k" Mar 09 14:19:15 crc kubenswrapper[4722]: I0309 14:19:15.753068 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa31f801-ed80-405f-960c-74c254d4f9ca-trusted-ca-bundle\") pod \"console-78fdf7cd4f-mt82k\" (UID: \"aa31f801-ed80-405f-960c-74c254d4f9ca\") " pod="openshift-console/console-78fdf7cd4f-mt82k" Mar 09 14:19:15 crc kubenswrapper[4722]: I0309 14:19:15.753098 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/aa31f801-ed80-405f-960c-74c254d4f9ca-oauth-serving-cert\") pod \"console-78fdf7cd4f-mt82k\" (UID: \"aa31f801-ed80-405f-960c-74c254d4f9ca\") " pod="openshift-console/console-78fdf7cd4f-mt82k" Mar 09 14:19:15 crc kubenswrapper[4722]: I0309 14:19:15.753155 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/aa31f801-ed80-405f-960c-74c254d4f9ca-console-oauth-config\") pod \"console-78fdf7cd4f-mt82k\" (UID: \"aa31f801-ed80-405f-960c-74c254d4f9ca\") " pod="openshift-console/console-78fdf7cd4f-mt82k" Mar 09 14:19:15 crc kubenswrapper[4722]: I0309 14:19:15.753246 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/aa31f801-ed80-405f-960c-74c254d4f9ca-console-config\") pod \"console-78fdf7cd4f-mt82k\" (UID: \"aa31f801-ed80-405f-960c-74c254d4f9ca\") " pod="openshift-console/console-78fdf7cd4f-mt82k" Mar 09 14:19:15 crc kubenswrapper[4722]: I0309 14:19:15.755644 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-vnpzr" Mar 09 14:19:15 crc kubenswrapper[4722]: I0309 14:19:15.855251 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/aa31f801-ed80-405f-960c-74c254d4f9ca-console-oauth-config\") pod \"console-78fdf7cd4f-mt82k\" (UID: \"aa31f801-ed80-405f-960c-74c254d4f9ca\") " pod="openshift-console/console-78fdf7cd4f-mt82k" Mar 09 14:19:15 crc kubenswrapper[4722]: I0309 14:19:15.855343 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/aa31f801-ed80-405f-960c-74c254d4f9ca-console-config\") pod \"console-78fdf7cd4f-mt82k\" (UID: \"aa31f801-ed80-405f-960c-74c254d4f9ca\") " pod="openshift-console/console-78fdf7cd4f-mt82k" Mar 09 14:19:15 crc kubenswrapper[4722]: I0309 14:19:15.855381 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/aa31f801-ed80-405f-960c-74c254d4f9ca-console-serving-cert\") pod \"console-78fdf7cd4f-mt82k\" (UID: \"aa31f801-ed80-405f-960c-74c254d4f9ca\") " pod="openshift-console/console-78fdf7cd4f-mt82k" Mar 09 14:19:15 crc kubenswrapper[4722]: I0309 14:19:15.855419 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/aa31f801-ed80-405f-960c-74c254d4f9ca-service-ca\") pod \"console-78fdf7cd4f-mt82k\" (UID: \"aa31f801-ed80-405f-960c-74c254d4f9ca\") " pod="openshift-console/console-78fdf7cd4f-mt82k" Mar 09 14:19:15 crc kubenswrapper[4722]: I0309 14:19:15.855442 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpqsr\" (UniqueName: \"kubernetes.io/projected/aa31f801-ed80-405f-960c-74c254d4f9ca-kube-api-access-bpqsr\") pod \"console-78fdf7cd4f-mt82k\" (UID: \"aa31f801-ed80-405f-960c-74c254d4f9ca\") " pod="openshift-console/console-78fdf7cd4f-mt82k" Mar 09 14:19:15 crc kubenswrapper[4722]: I0309 14:19:15.855495 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa31f801-ed80-405f-960c-74c254d4f9ca-trusted-ca-bundle\") pod \"console-78fdf7cd4f-mt82k\" (UID: \"aa31f801-ed80-405f-960c-74c254d4f9ca\") " pod="openshift-console/console-78fdf7cd4f-mt82k" Mar 09 14:19:15 crc kubenswrapper[4722]: I0309 14:19:15.855524 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/aa31f801-ed80-405f-960c-74c254d4f9ca-oauth-serving-cert\") pod \"console-78fdf7cd4f-mt82k\" (UID: \"aa31f801-ed80-405f-960c-74c254d4f9ca\") " 
pod="openshift-console/console-78fdf7cd4f-mt82k" Mar 09 14:19:15 crc kubenswrapper[4722]: I0309 14:19:15.856885 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/aa31f801-ed80-405f-960c-74c254d4f9ca-oauth-serving-cert\") pod \"console-78fdf7cd4f-mt82k\" (UID: \"aa31f801-ed80-405f-960c-74c254d4f9ca\") " pod="openshift-console/console-78fdf7cd4f-mt82k" Mar 09 14:19:15 crc kubenswrapper[4722]: I0309 14:19:15.857585 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/aa31f801-ed80-405f-960c-74c254d4f9ca-service-ca\") pod \"console-78fdf7cd4f-mt82k\" (UID: \"aa31f801-ed80-405f-960c-74c254d4f9ca\") " pod="openshift-console/console-78fdf7cd4f-mt82k" Mar 09 14:19:15 crc kubenswrapper[4722]: I0309 14:19:15.857843 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/aa31f801-ed80-405f-960c-74c254d4f9ca-console-config\") pod \"console-78fdf7cd4f-mt82k\" (UID: \"aa31f801-ed80-405f-960c-74c254d4f9ca\") " pod="openshift-console/console-78fdf7cd4f-mt82k" Mar 09 14:19:15 crc kubenswrapper[4722]: I0309 14:19:15.859121 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa31f801-ed80-405f-960c-74c254d4f9ca-trusted-ca-bundle\") pod \"console-78fdf7cd4f-mt82k\" (UID: \"aa31f801-ed80-405f-960c-74c254d4f9ca\") " pod="openshift-console/console-78fdf7cd4f-mt82k" Mar 09 14:19:15 crc kubenswrapper[4722]: I0309 14:19:15.862135 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/aa31f801-ed80-405f-960c-74c254d4f9ca-console-oauth-config\") pod \"console-78fdf7cd4f-mt82k\" (UID: \"aa31f801-ed80-405f-960c-74c254d4f9ca\") " pod="openshift-console/console-78fdf7cd4f-mt82k" Mar 09 14:19:15 crc kubenswrapper[4722]: I0309 14:19:15.862734 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/aa31f801-ed80-405f-960c-74c254d4f9ca-console-serving-cert\") pod \"console-78fdf7cd4f-mt82k\" (UID: \"aa31f801-ed80-405f-960c-74c254d4f9ca\") " pod="openshift-console/console-78fdf7cd4f-mt82k" Mar 09 14:19:15 crc kubenswrapper[4722]: I0309 14:19:15.909065 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpqsr\" (UniqueName: \"kubernetes.io/projected/aa31f801-ed80-405f-960c-74c254d4f9ca-kube-api-access-bpqsr\") pod \"console-78fdf7cd4f-mt82k\" (UID: \"aa31f801-ed80-405f-960c-74c254d4f9ca\") " pod="openshift-console/console-78fdf7cd4f-mt82k" Mar 09 14:19:16 crc kubenswrapper[4722]: I0309 14:19:16.016613 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-78fdf7cd4f-mt82k" Mar 09 14:19:16 crc kubenswrapper[4722]: I0309 14:19:16.074703 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-zvlr9"] Mar 09 14:19:16 crc kubenswrapper[4722]: I0309 14:19:16.147684 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-jdp69"] Mar 09 14:19:16 crc kubenswrapper[4722]: I0309 14:19:16.197598 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-67pzj" event={"ID":"ade25a79-1e43-41ed-be91-ce97aa1c4103","Type":"ContainerStarted","Data":"ab16f89100c5f36a25cb7e241c707973b9ff7892c82017f26a23bbc2e8923c03"} Mar 09 14:19:16 crc kubenswrapper[4722]: I0309 14:19:16.198671 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-zvlr9" event={"ID":"6f9a1c28-1ff4-4ead-91d8-5dc47c7b4d55","Type":"ContainerStarted","Data":"fa574846efed93f5d0c1740f193a3147e8ce5704735f1e30ff8767ee2f9d1501"} Mar 09 14:19:16 crc kubenswrapper[4722]: W0309 14:19:16.213352 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0af161b_a8d5_4a36_b1c2_0a4d43820c73.slice/crio-a2897f9819238dc8060bc1dc47e8fb2c9fc432956b58c5ca8dbf8d147418f324 WatchSource:0}: Error finding container a2897f9819238dc8060bc1dc47e8fb2c9fc432956b58c5ca8dbf8d147418f324: Status 404 returned error can't find the container with id a2897f9819238dc8060bc1dc47e8fb2c9fc432956b58c5ca8dbf8d147418f324 Mar 09 14:19:16 crc kubenswrapper[4722]: I0309 14:19:16.243730 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-vnpzr"] Mar 09 14:19:16 crc kubenswrapper[4722]: W0309 14:19:16.267511 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a030cdd_139c_4d44_bf79_43f14e97d9f9.slice/crio-bf9e4aa4844790876cbc6107399217bebac2045e7ba107a68d50a9a7a2d08b57 WatchSource:0}: Error finding container bf9e4aa4844790876cbc6107399217bebac2045e7ba107a68d50a9a7a2d08b57: Status 404 returned error can't find the container with id bf9e4aa4844790876cbc6107399217bebac2045e7ba107a68d50a9a7a2d08b57 Mar 09 14:19:16 crc kubenswrapper[4722]: I0309 14:19:16.532687 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-78fdf7cd4f-mt82k"] Mar 09 14:19:16 crc kubenswrapper[4722]: W0309 14:19:16.542353 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa31f801_ed80_405f_960c_74c254d4f9ca.slice/crio-06afa867c517b762f56818737eb26daa1472d479f2b7b9888aad848b5ed83f4c WatchSource:0}: Error finding container 06afa867c517b762f56818737eb26daa1472d479f2b7b9888aad848b5ed83f4c: Status 404 returned error can't find the container with id 06afa867c517b762f56818737eb26daa1472d479f2b7b9888aad848b5ed83f4c Mar 09 14:19:17 crc kubenswrapper[4722]: I0309 14:19:17.208318 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-78fdf7cd4f-mt82k" event={"ID":"aa31f801-ed80-405f-960c-74c254d4f9ca","Type":"ContainerStarted","Data":"e88aefd45843603369ebe8bd286ece8982c0ae3f0cd8e6b50b0d085a2c5d1930"} Mar 09 14:19:17 crc kubenswrapper[4722]: I0309 14:19:17.208771 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-78fdf7cd4f-mt82k" 
event={"ID":"aa31f801-ed80-405f-960c-74c254d4f9ca","Type":"ContainerStarted","Data":"06afa867c517b762f56818737eb26daa1472d479f2b7b9888aad848b5ed83f4c"} Mar 09 14:19:17 crc kubenswrapper[4722]: I0309 14:19:17.209148 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-786f45cff4-jdp69" event={"ID":"c0af161b-a8d5-4a36-b1c2-0a4d43820c73","Type":"ContainerStarted","Data":"a2897f9819238dc8060bc1dc47e8fb2c9fc432956b58c5ca8dbf8d147418f324"} Mar 09 14:19:17 crc kubenswrapper[4722]: I0309 14:19:17.210683 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-vnpzr" event={"ID":"6a030cdd-139c-4d44-bf79-43f14e97d9f9","Type":"ContainerStarted","Data":"bf9e4aa4844790876cbc6107399217bebac2045e7ba107a68d50a9a7a2d08b57"} Mar 09 14:19:17 crc kubenswrapper[4722]: I0309 14:19:17.233754 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-78fdf7cd4f-mt82k" podStartSLOduration=2.23373433 podStartE2EDuration="2.23373433s" podCreationTimestamp="2026-03-09 14:19:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:19:17.228828984 +0000 UTC m=+997.784397550" watchObservedRunningTime="2026-03-09 14:19:17.23373433 +0000 UTC m=+997.789302906" Mar 09 14:19:20 crc kubenswrapper[4722]: I0309 14:19:20.238167 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-vnpzr" event={"ID":"6a030cdd-139c-4d44-bf79-43f14e97d9f9","Type":"ContainerStarted","Data":"2f56a7e0ab760699f3a3521f6a647d567d77085ad19904604d727611931f5dc4"} Mar 09 14:19:20 crc kubenswrapper[4722]: I0309 14:19:20.240757 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-zvlr9" event={"ID":"6f9a1c28-1ff4-4ead-91d8-5dc47c7b4d55","Type":"ContainerStarted","Data":"223d37d4945f6474fac12d36687e182db83f6b339528485a82b8074b29e33a71"} Mar 09 14:19:20 crc kubenswrapper[4722]: I0309 14:19:20.242772 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-786f45cff4-jdp69" event={"ID":"c0af161b-a8d5-4a36-b1c2-0a4d43820c73","Type":"ContainerStarted","Data":"0fd290b98e10582c80885938a48b81f3ef5e68d9d78a75f252faf55f0b2e4966"} Mar 09 14:19:20 crc kubenswrapper[4722]: I0309 14:19:20.242930 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-786f45cff4-jdp69" Mar 09 14:19:20 crc kubenswrapper[4722]: I0309 14:19:20.246779 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-67pzj" event={"ID":"ade25a79-1e43-41ed-be91-ce97aa1c4103","Type":"ContainerStarted","Data":"ebd7cb1f83ff12364c046914edd869aaad3e394bb12a4af4b0a543b9a9b4aa49"} Mar 09 14:19:20 crc kubenswrapper[4722]: I0309 14:19:20.247187 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-67pzj" Mar 09 14:19:20 crc kubenswrapper[4722]: I0309 14:19:20.268914 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-vnpzr" podStartSLOduration=1.832567201 podStartE2EDuration="5.268887203s" podCreationTimestamp="2026-03-09 14:19:15 +0000 UTC" firstStartedPulling="2026-03-09 14:19:16.271127235 +0000 UTC m=+996.826695811" lastFinishedPulling="2026-03-09 14:19:19.707447237 +0000 UTC m=+1000.263015813" 
observedRunningTime="2026-03-09 14:19:20.254329288 +0000 UTC m=+1000.809897884" watchObservedRunningTime="2026-03-09 14:19:20.268887203 +0000 UTC m=+1000.824455799" Mar 09 14:19:20 crc kubenswrapper[4722]: I0309 14:19:20.284241 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-786f45cff4-jdp69" podStartSLOduration=1.775926182 podStartE2EDuration="5.284194299s" podCreationTimestamp="2026-03-09 14:19:15 +0000 UTC" firstStartedPulling="2026-03-09 14:19:16.218155998 +0000 UTC m=+996.773724574" lastFinishedPulling="2026-03-09 14:19:19.726424115 +0000 UTC m=+1000.281992691" observedRunningTime="2026-03-09 14:19:20.277048781 +0000 UTC m=+1000.832617407" watchObservedRunningTime="2026-03-09 14:19:20.284194299 +0000 UTC m=+1000.839762895" Mar 09 14:19:20 crc kubenswrapper[4722]: I0309 14:19:20.307249 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-67pzj" podStartSLOduration=1.305054449 podStartE2EDuration="5.307232491s" podCreationTimestamp="2026-03-09 14:19:15 +0000 UTC" firstStartedPulling="2026-03-09 14:19:15.731323931 +0000 UTC m=+996.286892507" lastFinishedPulling="2026-03-09 14:19:19.733501973 +0000 UTC m=+1000.289070549" observedRunningTime="2026-03-09 14:19:20.305833832 +0000 UTC m=+1000.861402428" watchObservedRunningTime="2026-03-09 14:19:20.307232491 +0000 UTC m=+1000.862801067" Mar 09 14:19:21 crc kubenswrapper[4722]: I0309 14:19:21.527816 4722 patch_prober.go:28] interesting pod/machine-config-daemon-hjrrb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 14:19:21 crc kubenswrapper[4722]: I0309 14:19:21.528192 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 14:19:22 crc kubenswrapper[4722]: I0309 14:19:22.427415 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-564p8" Mar 09 14:19:22 crc kubenswrapper[4722]: I0309 14:19:22.469395 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-564p8" Mar 09 14:19:22 crc kubenswrapper[4722]: I0309 14:19:22.657474 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-564p8"] Mar 09 14:19:23 crc kubenswrapper[4722]: I0309 14:19:23.272347 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-zvlr9" event={"ID":"6f9a1c28-1ff4-4ead-91d8-5dc47c7b4d55","Type":"ContainerStarted","Data":"f695f34fbb250dadfcf1d84213bd5082306200b723632f64ca6e80dcd94d2609"} Mar 09 14:19:23 crc kubenswrapper[4722]: I0309 14:19:23.298606 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-69594cc75-zvlr9" podStartSLOduration=1.976367357 podStartE2EDuration="8.298581424s" podCreationTimestamp="2026-03-09 14:19:15 +0000 UTC" firstStartedPulling="2026-03-09 14:19:16.117835712 +0000 UTC m=+996.673404288" lastFinishedPulling="2026-03-09 14:19:22.440049779 +0000 UTC m=+1002.995618355" observedRunningTime="2026-03-09 
14:19:23.294530811 +0000 UTC m=+1003.850099407" watchObservedRunningTime="2026-03-09 14:19:23.298581424 +0000 UTC m=+1003.854150010" Mar 09 14:19:24 crc kubenswrapper[4722]: I0309 14:19:24.281869 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-564p8" podUID="96289499-73be-40e3-beb0-95dd99d9dacd" containerName="registry-server" containerID="cri-o://0da2d648ad5c9045606ec53f7e62448104c4aa4bb78ab1f818bdeed5dcbaceba" gracePeriod=2 Mar 09 14:19:24 crc kubenswrapper[4722]: I0309 14:19:24.854294 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-564p8" Mar 09 14:19:24 crc kubenswrapper[4722]: I0309 14:19:24.945656 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96289499-73be-40e3-beb0-95dd99d9dacd-utilities\") pod \"96289499-73be-40e3-beb0-95dd99d9dacd\" (UID: \"96289499-73be-40e3-beb0-95dd99d9dacd\") " Mar 09 14:19:24 crc kubenswrapper[4722]: I0309 14:19:24.945738 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96289499-73be-40e3-beb0-95dd99d9dacd-catalog-content\") pod \"96289499-73be-40e3-beb0-95dd99d9dacd\" (UID: \"96289499-73be-40e3-beb0-95dd99d9dacd\") " Mar 09 14:19:24 crc kubenswrapper[4722]: I0309 14:19:24.945903 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h85m6\" (UniqueName: \"kubernetes.io/projected/96289499-73be-40e3-beb0-95dd99d9dacd-kube-api-access-h85m6\") pod \"96289499-73be-40e3-beb0-95dd99d9dacd\" (UID: \"96289499-73be-40e3-beb0-95dd99d9dacd\") " Mar 09 14:19:24 crc kubenswrapper[4722]: I0309 14:19:24.946830 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96289499-73be-40e3-beb0-95dd99d9dacd-utilities" (OuterVolumeSpecName: "utilities") pod "96289499-73be-40e3-beb0-95dd99d9dacd" (UID: "96289499-73be-40e3-beb0-95dd99d9dacd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:19:24 crc kubenswrapper[4722]: I0309 14:19:24.951874 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96289499-73be-40e3-beb0-95dd99d9dacd-kube-api-access-h85m6" (OuterVolumeSpecName: "kube-api-access-h85m6") pod "96289499-73be-40e3-beb0-95dd99d9dacd" (UID: "96289499-73be-40e3-beb0-95dd99d9dacd"). InnerVolumeSpecName "kube-api-access-h85m6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:19:25 crc kubenswrapper[4722]: I0309 14:19:25.048321 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h85m6\" (UniqueName: \"kubernetes.io/projected/96289499-73be-40e3-beb0-95dd99d9dacd-kube-api-access-h85m6\") on node \"crc\" DevicePath \"\"" Mar 09 14:19:25 crc kubenswrapper[4722]: I0309 14:19:25.048366 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96289499-73be-40e3-beb0-95dd99d9dacd-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 14:19:25 crc kubenswrapper[4722]: I0309 14:19:25.119569 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96289499-73be-40e3-beb0-95dd99d9dacd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "96289499-73be-40e3-beb0-95dd99d9dacd" (UID: "96289499-73be-40e3-beb0-95dd99d9dacd"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:19:25 crc kubenswrapper[4722]: I0309 14:19:25.150180 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96289499-73be-40e3-beb0-95dd99d9dacd-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 14:19:25 crc kubenswrapper[4722]: I0309 14:19:25.292952 4722 generic.go:334] "Generic (PLEG): container finished" podID="96289499-73be-40e3-beb0-95dd99d9dacd" containerID="0da2d648ad5c9045606ec53f7e62448104c4aa4bb78ab1f818bdeed5dcbaceba" exitCode=0 Mar 09 14:19:25 crc kubenswrapper[4722]: I0309 14:19:25.293001 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-564p8" event={"ID":"96289499-73be-40e3-beb0-95dd99d9dacd","Type":"ContainerDied","Data":"0da2d648ad5c9045606ec53f7e62448104c4aa4bb78ab1f818bdeed5dcbaceba"} Mar 09 14:19:25 crc kubenswrapper[4722]: I0309 14:19:25.293028 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-564p8" event={"ID":"96289499-73be-40e3-beb0-95dd99d9dacd","Type":"ContainerDied","Data":"31fc13675822d7ca66c2c7c58b3257e52bcc6033eb2450b2c2e07408b4344915"} Mar 09 14:19:25 crc kubenswrapper[4722]: I0309 14:19:25.293044 4722 scope.go:117] "RemoveContainer" containerID="0da2d648ad5c9045606ec53f7e62448104c4aa4bb78ab1f818bdeed5dcbaceba" Mar 09 14:19:25 crc kubenswrapper[4722]: I0309 14:19:25.293172 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-564p8" Mar 09 14:19:25 crc kubenswrapper[4722]: I0309 14:19:25.324662 4722 scope.go:117] "RemoveContainer" containerID="62fff1460329808fe35ae0d002972417c34c62fbdba1c4f02c9f917fec807ccd" Mar 09 14:19:25 crc kubenswrapper[4722]: I0309 14:19:25.325912 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-564p8"] Mar 09 14:19:25 crc kubenswrapper[4722]: I0309 14:19:25.340503 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-564p8"] Mar 09 14:19:25 crc kubenswrapper[4722]: I0309 14:19:25.353991 4722 scope.go:117] "RemoveContainer" containerID="89a98873c734286cb961b908cccb9017d3fd8f467a3e8b3c258a59c9ab8cf131" Mar 09 14:19:25 crc kubenswrapper[4722]: I0309 14:19:25.387923 4722 scope.go:117] "RemoveContainer" containerID="0da2d648ad5c9045606ec53f7e62448104c4aa4bb78ab1f818bdeed5dcbaceba" Mar 09 14:19:25 crc kubenswrapper[4722]: E0309 14:19:25.388685 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0da2d648ad5c9045606ec53f7e62448104c4aa4bb78ab1f818bdeed5dcbaceba\": container with ID starting with 0da2d648ad5c9045606ec53f7e62448104c4aa4bb78ab1f818bdeed5dcbaceba not found: ID does not exist" containerID="0da2d648ad5c9045606ec53f7e62448104c4aa4bb78ab1f818bdeed5dcbaceba" Mar 09 14:19:25 crc kubenswrapper[4722]: I0309 14:19:25.388731 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0da2d648ad5c9045606ec53f7e62448104c4aa4bb78ab1f818bdeed5dcbaceba"} err="failed to get container status \"0da2d648ad5c9045606ec53f7e62448104c4aa4bb78ab1f818bdeed5dcbaceba\": rpc error: code = NotFound desc = could not find container \"0da2d648ad5c9045606ec53f7e62448104c4aa4bb78ab1f818bdeed5dcbaceba\": container with ID starting with 0da2d648ad5c9045606ec53f7e62448104c4aa4bb78ab1f818bdeed5dcbaceba not found: ID 
does not exist" Mar 09 14:19:25 crc kubenswrapper[4722]: I0309 14:19:25.388759 4722 scope.go:117] "RemoveContainer" containerID="62fff1460329808fe35ae0d002972417c34c62fbdba1c4f02c9f917fec807ccd" Mar 09 14:19:25 crc kubenswrapper[4722]: E0309 14:19:25.389382 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62fff1460329808fe35ae0d002972417c34c62fbdba1c4f02c9f917fec807ccd\": container with ID starting with 62fff1460329808fe35ae0d002972417c34c62fbdba1c4f02c9f917fec807ccd not found: ID does not exist" containerID="62fff1460329808fe35ae0d002972417c34c62fbdba1c4f02c9f917fec807ccd" Mar 09 14:19:25 crc kubenswrapper[4722]: I0309 14:19:25.389440 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62fff1460329808fe35ae0d002972417c34c62fbdba1c4f02c9f917fec807ccd"} err="failed to get container status \"62fff1460329808fe35ae0d002972417c34c62fbdba1c4f02c9f917fec807ccd\": rpc error: code = NotFound desc = could not find container \"62fff1460329808fe35ae0d002972417c34c62fbdba1c4f02c9f917fec807ccd\": container with ID starting with 62fff1460329808fe35ae0d002972417c34c62fbdba1c4f02c9f917fec807ccd not found: ID does not exist" Mar 09 14:19:25 crc kubenswrapper[4722]: I0309 14:19:25.389472 4722 scope.go:117] "RemoveContainer" containerID="89a98873c734286cb961b908cccb9017d3fd8f467a3e8b3c258a59c9ab8cf131" Mar 09 14:19:25 crc kubenswrapper[4722]: E0309 14:19:25.389823 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89a98873c734286cb961b908cccb9017d3fd8f467a3e8b3c258a59c9ab8cf131\": container with ID starting with 89a98873c734286cb961b908cccb9017d3fd8f467a3e8b3c258a59c9ab8cf131 not found: ID does not exist" containerID="89a98873c734286cb961b908cccb9017d3fd8f467a3e8b3c258a59c9ab8cf131" Mar 09 14:19:25 crc kubenswrapper[4722]: I0309 14:19:25.389845 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89a98873c734286cb961b908cccb9017d3fd8f467a3e8b3c258a59c9ab8cf131"} err="failed to get container status \"89a98873c734286cb961b908cccb9017d3fd8f467a3e8b3c258a59c9ab8cf131\": rpc error: code = NotFound desc = could not find container \"89a98873c734286cb961b908cccb9017d3fd8f467a3e8b3c258a59c9ab8cf131\": container with ID starting with 89a98873c734286cb961b908cccb9017d3fd8f467a3e8b3c258a59c9ab8cf131 not found: ID does not exist" Mar 09 14:19:25 crc kubenswrapper[4722]: I0309 14:19:25.706641 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-67pzj" Mar 09 14:19:26 crc kubenswrapper[4722]: I0309 14:19:26.017420 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-78fdf7cd4f-mt82k" Mar 09 14:19:26 crc kubenswrapper[4722]: I0309 14:19:26.017470 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-78fdf7cd4f-mt82k" Mar 09 14:19:26 crc kubenswrapper[4722]: I0309 14:19:26.023075 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-78fdf7cd4f-mt82k" Mar 09 14:19:26 crc kubenswrapper[4722]: I0309 14:19:26.158758 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96289499-73be-40e3-beb0-95dd99d9dacd" path="/var/lib/kubelet/pods/96289499-73be-40e3-beb0-95dd99d9dacd/volumes" Mar 09 14:19:26 crc kubenswrapper[4722]: I0309 14:19:26.302745 4722 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-78fdf7cd4f-mt82k" Mar 09 14:19:26 crc kubenswrapper[4722]: I0309 14:19:26.362366 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5d9c759fb9-nkb8p"] Mar 09 14:19:35 crc kubenswrapper[4722]: I0309 14:19:35.632146 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-786f45cff4-jdp69" Mar 09 14:19:45 crc kubenswrapper[4722]: I0309 14:19:45.778759 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8dh75"] Mar 09 14:19:45 crc kubenswrapper[4722]: E0309 14:19:45.779939 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96289499-73be-40e3-beb0-95dd99d9dacd" containerName="extract-content" Mar 09 14:19:45 crc kubenswrapper[4722]: I0309 14:19:45.779957 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="96289499-73be-40e3-beb0-95dd99d9dacd" containerName="extract-content" Mar 09 14:19:45 crc kubenswrapper[4722]: E0309 14:19:45.779976 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96289499-73be-40e3-beb0-95dd99d9dacd" containerName="registry-server" Mar 09 14:19:45 crc kubenswrapper[4722]: I0309 14:19:45.779984 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="96289499-73be-40e3-beb0-95dd99d9dacd" containerName="registry-server" Mar 09 14:19:45 crc kubenswrapper[4722]: E0309 14:19:45.780000 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96289499-73be-40e3-beb0-95dd99d9dacd" containerName="extract-utilities" Mar 09 14:19:45 crc kubenswrapper[4722]: I0309 14:19:45.780009 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="96289499-73be-40e3-beb0-95dd99d9dacd" containerName="extract-utilities" Mar 09 14:19:45 crc kubenswrapper[4722]: I0309 14:19:45.780256 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="96289499-73be-40e3-beb0-95dd99d9dacd" containerName="registry-server" Mar 09 14:19:45 crc kubenswrapper[4722]: I0309 14:19:45.781605 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8dh75" Mar 09 14:19:45 crc kubenswrapper[4722]: I0309 14:19:45.788327 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8dh75"] Mar 09 14:19:45 crc kubenswrapper[4722]: I0309 14:19:45.905178 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ba97601-bd7e-4763-8c59-0af1273fd01f-catalog-content\") pod \"redhat-marketplace-8dh75\" (UID: \"8ba97601-bd7e-4763-8c59-0af1273fd01f\") " pod="openshift-marketplace/redhat-marketplace-8dh75" Mar 09 14:19:45 crc kubenswrapper[4722]: I0309 14:19:45.906110 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kd2vn\" (UniqueName: \"kubernetes.io/projected/8ba97601-bd7e-4763-8c59-0af1273fd01f-kube-api-access-kd2vn\") pod \"redhat-marketplace-8dh75\" (UID: \"8ba97601-bd7e-4763-8c59-0af1273fd01f\") " pod="openshift-marketplace/redhat-marketplace-8dh75" Mar 09 14:19:45 crc kubenswrapper[4722]: I0309 14:19:45.906349 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ba97601-bd7e-4763-8c59-0af1273fd01f-utilities\") pod \"redhat-marketplace-8dh75\" (UID: \"8ba97601-bd7e-4763-8c59-0af1273fd01f\") " pod="openshift-marketplace/redhat-marketplace-8dh75" Mar 09 14:19:46 crc kubenswrapper[4722]: I0309 14:19:46.008488 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ba97601-bd7e-4763-8c59-0af1273fd01f-utilities\") pod \"redhat-marketplace-8dh75\" (UID: \"8ba97601-bd7e-4763-8c59-0af1273fd01f\") " pod="openshift-marketplace/redhat-marketplace-8dh75" Mar 09 14:19:46 crc kubenswrapper[4722]: I0309 14:19:46.008598 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ba97601-bd7e-4763-8c59-0af1273fd01f-catalog-content\") pod \"redhat-marketplace-8dh75\" (UID: \"8ba97601-bd7e-4763-8c59-0af1273fd01f\") " pod="openshift-marketplace/redhat-marketplace-8dh75" Mar 09 14:19:46 crc kubenswrapper[4722]: I0309 14:19:46.008626 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kd2vn\" (UniqueName: \"kubernetes.io/projected/8ba97601-bd7e-4763-8c59-0af1273fd01f-kube-api-access-kd2vn\") pod \"redhat-marketplace-8dh75\" (UID: \"8ba97601-bd7e-4763-8c59-0af1273fd01f\") " pod="openshift-marketplace/redhat-marketplace-8dh75" Mar 09 14:19:46 crc kubenswrapper[4722]: I0309 14:19:46.009382 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ba97601-bd7e-4763-8c59-0af1273fd01f-utilities\") pod \"redhat-marketplace-8dh75\" (UID: \"8ba97601-bd7e-4763-8c59-0af1273fd01f\") " pod="openshift-marketplace/redhat-marketplace-8dh75" Mar 09 14:19:46 crc kubenswrapper[4722]: I0309 14:19:46.009603 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ba97601-bd7e-4763-8c59-0af1273fd01f-catalog-content\") pod \"redhat-marketplace-8dh75\" (UID: \"8ba97601-bd7e-4763-8c59-0af1273fd01f\") " pod="openshift-marketplace/redhat-marketplace-8dh75" Mar 09 14:19:46 crc kubenswrapper[4722]: I0309 14:19:46.045135 4722 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-kd2vn\" (UniqueName: \"kubernetes.io/projected/8ba97601-bd7e-4763-8c59-0af1273fd01f-kube-api-access-kd2vn\") pod \"redhat-marketplace-8dh75\" (UID: \"8ba97601-bd7e-4763-8c59-0af1273fd01f\") " pod="openshift-marketplace/redhat-marketplace-8dh75"
Mar 09 14:19:46 crc kubenswrapper[4722]: I0309 14:19:46.102695 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8dh75"
Mar 09 14:19:46 crc kubenswrapper[4722]: I0309 14:19:46.558323 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8dh75"]
Mar 09 14:19:47 crc kubenswrapper[4722]: I0309 14:19:47.502138 4722 generic.go:334] "Generic (PLEG): container finished" podID="8ba97601-bd7e-4763-8c59-0af1273fd01f" containerID="6ea932039d4db6c98713a4fd135a2bf606f0659160cc20109a72cc8251652047" exitCode=0
Mar 09 14:19:47 crc kubenswrapper[4722]: I0309 14:19:47.502182 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8dh75" event={"ID":"8ba97601-bd7e-4763-8c59-0af1273fd01f","Type":"ContainerDied","Data":"6ea932039d4db6c98713a4fd135a2bf606f0659160cc20109a72cc8251652047"}
Mar 09 14:19:47 crc kubenswrapper[4722]: I0309 14:19:47.502694 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8dh75" event={"ID":"8ba97601-bd7e-4763-8c59-0af1273fd01f","Type":"ContainerStarted","Data":"4b13b28d43d58c9a022ddbebb03cccb80b343a8768f5d90ba0df030ca8e872ea"}
Mar 09 14:19:49 crc kubenswrapper[4722]: I0309 14:19:49.528449 4722 generic.go:334] "Generic (PLEG): container finished" podID="8ba97601-bd7e-4763-8c59-0af1273fd01f" containerID="87f90862d3f61c97ac117dcdc06dd0b503aeeb88a959ccae2c53f92874ee295b" exitCode=0
Mar 09 14:19:49 crc kubenswrapper[4722]: I0309 14:19:49.528528 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8dh75" event={"ID":"8ba97601-bd7e-4763-8c59-0af1273fd01f","Type":"ContainerDied","Data":"87f90862d3f61c97ac117dcdc06dd0b503aeeb88a959ccae2c53f92874ee295b"}
Mar 09 14:19:50 crc kubenswrapper[4722]: I0309 14:19:50.545365 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8dh75" event={"ID":"8ba97601-bd7e-4763-8c59-0af1273fd01f","Type":"ContainerStarted","Data":"9cc4456f91b3c0afe2203a8e93a844650068489d457671f1b99643cd32e27b29"}
Mar 09 14:19:50 crc kubenswrapper[4722]: I0309 14:19:50.564391 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8dh75" podStartSLOduration=3.033479997 podStartE2EDuration="5.564374608s" podCreationTimestamp="2026-03-09 14:19:45 +0000 UTC" firstStartedPulling="2026-03-09 14:19:47.503871088 +0000 UTC m=+1028.059439664" lastFinishedPulling="2026-03-09 14:19:50.034765699 +0000 UTC m=+1030.590334275" observedRunningTime="2026-03-09 14:19:50.562183937 +0000 UTC m=+1031.117752523" watchObservedRunningTime="2026-03-09 14:19:50.564374608 +0000 UTC m=+1031.119943184"
Mar 09 14:19:51 crc kubenswrapper[4722]: I0309 14:19:51.437026 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-5d9c759fb9-nkb8p" podUID="720bc29d-ca91-4ee2-9a0c-5f32659e650e" containerName="console" containerID="cri-o://e4390ee405b9ff45c1807c4e21e61fad235bb5c948893f0bf7107ea13c6f322e" gracePeriod=15
Mar 09 14:19:51 crc kubenswrapper[4722]: I0309 14:19:51.527864 4722 patch_prober.go:28] interesting pod/machine-config-daemon-hjrrb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 14:19:51 crc kubenswrapper[4722]: I0309 14:19:51.527919 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 14:19:51 crc kubenswrapper[4722]: I0309 14:19:51.527961 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb"
Mar 09 14:19:51 crc kubenswrapper[4722]: I0309 14:19:51.528727 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"50a94b1e196b515b7f1ddd4cb650f99db9da76851a6a18093dd50246aaec5007"} pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 09 14:19:51 crc kubenswrapper[4722]: I0309 14:19:51.528795 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerName="machine-config-daemon" containerID="cri-o://50a94b1e196b515b7f1ddd4cb650f99db9da76851a6a18093dd50246aaec5007" gracePeriod=600
Mar 09 14:19:51 crc kubenswrapper[4722]: I0309 14:19:51.891071 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5d9c759fb9-nkb8p_720bc29d-ca91-4ee2-9a0c-5f32659e650e/console/0.log"
Mar 09 14:19:51 crc kubenswrapper[4722]: I0309 14:19:51.891428 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5d9c759fb9-nkb8p"
Mar 09 14:19:52 crc kubenswrapper[4722]: I0309 14:19:52.023554 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x52zr\" (UniqueName: \"kubernetes.io/projected/720bc29d-ca91-4ee2-9a0c-5f32659e650e-kube-api-access-x52zr\") pod \"720bc29d-ca91-4ee2-9a0c-5f32659e650e\" (UID: \"720bc29d-ca91-4ee2-9a0c-5f32659e650e\") "
Mar 09 14:19:52 crc kubenswrapper[4722]: I0309 14:19:52.023911 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/720bc29d-ca91-4ee2-9a0c-5f32659e650e-console-oauth-config\") pod \"720bc29d-ca91-4ee2-9a0c-5f32659e650e\" (UID: \"720bc29d-ca91-4ee2-9a0c-5f32659e650e\") "
Mar 09 14:19:52 crc kubenswrapper[4722]: I0309 14:19:52.023932 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/720bc29d-ca91-4ee2-9a0c-5f32659e650e-console-serving-cert\") pod \"720bc29d-ca91-4ee2-9a0c-5f32659e650e\" (UID: \"720bc29d-ca91-4ee2-9a0c-5f32659e650e\") "
Mar 09 14:19:52 crc kubenswrapper[4722]: I0309 14:19:52.023963 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/720bc29d-ca91-4ee2-9a0c-5f32659e650e-console-config\") pod \"720bc29d-ca91-4ee2-9a0c-5f32659e650e\" (UID: \"720bc29d-ca91-4ee2-9a0c-5f32659e650e\") "
Mar 09 14:19:52 crc kubenswrapper[4722]: I0309 14:19:52.024021 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/720bc29d-ca91-4ee2-9a0c-5f32659e650e-oauth-serving-cert\") pod \"720bc29d-ca91-4ee2-9a0c-5f32659e650e\" (UID: \"720bc29d-ca91-4ee2-9a0c-5f32659e650e\") "
Mar 09 14:19:52 crc kubenswrapper[4722]: I0309 14:19:52.024054 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/720bc29d-ca91-4ee2-9a0c-5f32659e650e-service-ca\") pod \"720bc29d-ca91-4ee2-9a0c-5f32659e650e\" (UID: \"720bc29d-ca91-4ee2-9a0c-5f32659e650e\") "
Mar 09 14:19:52 crc kubenswrapper[4722]: I0309 14:19:52.024069 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/720bc29d-ca91-4ee2-9a0c-5f32659e650e-trusted-ca-bundle\") pod \"720bc29d-ca91-4ee2-9a0c-5f32659e650e\" (UID: \"720bc29d-ca91-4ee2-9a0c-5f32659e650e\") "
Mar 09 14:19:52 crc kubenswrapper[4722]: I0309 14:19:52.024657 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/720bc29d-ca91-4ee2-9a0c-5f32659e650e-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "720bc29d-ca91-4ee2-9a0c-5f32659e650e" (UID: "720bc29d-ca91-4ee2-9a0c-5f32659e650e"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 14:19:52 crc kubenswrapper[4722]: I0309 14:19:52.024732 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/720bc29d-ca91-4ee2-9a0c-5f32659e650e-service-ca" (OuterVolumeSpecName: "service-ca") pod "720bc29d-ca91-4ee2-9a0c-5f32659e650e" (UID: "720bc29d-ca91-4ee2-9a0c-5f32659e650e"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 14:19:52 crc kubenswrapper[4722]: I0309 14:19:52.024770 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/720bc29d-ca91-4ee2-9a0c-5f32659e650e-console-config" (OuterVolumeSpecName: "console-config") pod "720bc29d-ca91-4ee2-9a0c-5f32659e650e" (UID: "720bc29d-ca91-4ee2-9a0c-5f32659e650e"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 14:19:52 crc kubenswrapper[4722]: I0309 14:19:52.024810 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/720bc29d-ca91-4ee2-9a0c-5f32659e650e-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "720bc29d-ca91-4ee2-9a0c-5f32659e650e" (UID: "720bc29d-ca91-4ee2-9a0c-5f32659e650e"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 14:19:52 crc kubenswrapper[4722]: I0309 14:19:52.029643 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/720bc29d-ca91-4ee2-9a0c-5f32659e650e-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "720bc29d-ca91-4ee2-9a0c-5f32659e650e" (UID: "720bc29d-ca91-4ee2-9a0c-5f32659e650e"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:19:52 crc kubenswrapper[4722]: I0309 14:19:52.029692 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/720bc29d-ca91-4ee2-9a0c-5f32659e650e-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "720bc29d-ca91-4ee2-9a0c-5f32659e650e" (UID: "720bc29d-ca91-4ee2-9a0c-5f32659e650e"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:19:52 crc kubenswrapper[4722]: I0309 14:19:52.030489 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/720bc29d-ca91-4ee2-9a0c-5f32659e650e-kube-api-access-x52zr" (OuterVolumeSpecName: "kube-api-access-x52zr") pod "720bc29d-ca91-4ee2-9a0c-5f32659e650e" (UID: "720bc29d-ca91-4ee2-9a0c-5f32659e650e"). InnerVolumeSpecName "kube-api-access-x52zr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:19:52 crc kubenswrapper[4722]: I0309 14:19:52.125266 4722 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/720bc29d-ca91-4ee2-9a0c-5f32659e650e-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 09 14:19:52 crc kubenswrapper[4722]: I0309 14:19:52.125300 4722 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/720bc29d-ca91-4ee2-9a0c-5f32659e650e-service-ca\") on node \"crc\" DevicePath \"\""
Mar 09 14:19:52 crc kubenswrapper[4722]: I0309 14:19:52.125310 4722 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/720bc29d-ca91-4ee2-9a0c-5f32659e650e-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 14:19:52 crc kubenswrapper[4722]: I0309 14:19:52.125318 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x52zr\" (UniqueName: \"kubernetes.io/projected/720bc29d-ca91-4ee2-9a0c-5f32659e650e-kube-api-access-x52zr\") on node \"crc\" DevicePath \"\""
Mar 09 14:19:52 crc kubenswrapper[4722]: I0309 14:19:52.125330 4722 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/720bc29d-ca91-4ee2-9a0c-5f32659e650e-console-oauth-config\") on node \"crc\" DevicePath \"\""
Mar 09 14:19:52 crc kubenswrapper[4722]: I0309 14:19:52.125372 4722 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/720bc29d-ca91-4ee2-9a0c-5f32659e650e-console-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 09 14:19:52 crc kubenswrapper[4722]: I0309 14:19:52.125383 4722 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/720bc29d-ca91-4ee2-9a0c-5f32659e650e-console-config\") on node \"crc\" DevicePath \"\""
Mar 09 14:19:52 crc kubenswrapper[4722]: I0309 14:19:52.565009 4722 generic.go:334] "Generic (PLEG): container finished" podID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerID="50a94b1e196b515b7f1ddd4cb650f99db9da76851a6a18093dd50246aaec5007" exitCode=0
Mar 09 14:19:52 crc kubenswrapper[4722]: I0309 14:19:52.565052 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" event={"ID":"dac2aaf5-653b-4b2a-8efe-ed26bac8d648","Type":"ContainerDied","Data":"50a94b1e196b515b7f1ddd4cb650f99db9da76851a6a18093dd50246aaec5007"}
Mar 09 14:19:52 crc kubenswrapper[4722]: I0309 14:19:52.565107 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" event={"ID":"dac2aaf5-653b-4b2a-8efe-ed26bac8d648","Type":"ContainerStarted","Data":"61f8d88e021e1998adcf282dcc3a5969939b9f1d00069284614000e527956e5e"}
Mar 09 14:19:52 crc kubenswrapper[4722]: I0309 14:19:52.565166 4722 scope.go:117] "RemoveContainer" containerID="cc45a812c78ad6bdbc54dbec7789e158b5ae14665e6cafed5462e27caf19d00d"
Mar 09 14:19:52 crc kubenswrapper[4722]: I0309 14:19:52.567926 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5d9c759fb9-nkb8p_720bc29d-ca91-4ee2-9a0c-5f32659e650e/console/0.log"
Mar 09 14:19:52 crc kubenswrapper[4722]: I0309 14:19:52.568001 4722 generic.go:334] "Generic (PLEG): container finished" podID="720bc29d-ca91-4ee2-9a0c-5f32659e650e" containerID="e4390ee405b9ff45c1807c4e21e61fad235bb5c948893f0bf7107ea13c6f322e" exitCode=2
Mar 09 14:19:52 crc kubenswrapper[4722]: I0309 14:19:52.568046 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d9c759fb9-nkb8p" event={"ID":"720bc29d-ca91-4ee2-9a0c-5f32659e650e","Type":"ContainerDied","Data":"e4390ee405b9ff45c1807c4e21e61fad235bb5c948893f0bf7107ea13c6f322e"}
Mar 09 14:19:52 crc kubenswrapper[4722]: I0309 14:19:52.568085 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d9c759fb9-nkb8p" event={"ID":"720bc29d-ca91-4ee2-9a0c-5f32659e650e","Type":"ContainerDied","Data":"ce63b766ed219542b8ed0002471573ffa3dd7c5c1c3a56804a39e71ae295b4b1"}
Mar 09 14:19:52 crc kubenswrapper[4722]: I0309 14:19:52.568110 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5d9c759fb9-nkb8p"
Mar 09 14:19:52 crc kubenswrapper[4722]: I0309 14:19:52.626044 4722 scope.go:117] "RemoveContainer" containerID="e4390ee405b9ff45c1807c4e21e61fad235bb5c948893f0bf7107ea13c6f322e"
Mar 09 14:19:52 crc kubenswrapper[4722]: I0309 14:19:52.635287 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5d9c759fb9-nkb8p"]
Mar 09 14:19:52 crc kubenswrapper[4722]: I0309 14:19:52.641460 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5d9c759fb9-nkb8p"]
Mar 09 14:19:52 crc kubenswrapper[4722]: I0309 14:19:52.662400 4722 scope.go:117] "RemoveContainer" containerID="e4390ee405b9ff45c1807c4e21e61fad235bb5c948893f0bf7107ea13c6f322e"
Mar 09 14:19:52 crc kubenswrapper[4722]: E0309 14:19:52.663532 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4390ee405b9ff45c1807c4e21e61fad235bb5c948893f0bf7107ea13c6f322e\": container with ID starting with e4390ee405b9ff45c1807c4e21e61fad235bb5c948893f0bf7107ea13c6f322e not found: ID does not exist" containerID="e4390ee405b9ff45c1807c4e21e61fad235bb5c948893f0bf7107ea13c6f322e"
Mar 09 14:19:52 crc kubenswrapper[4722]: I0309 14:19:52.663562 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4390ee405b9ff45c1807c4e21e61fad235bb5c948893f0bf7107ea13c6f322e"} err="failed to get container status \"e4390ee405b9ff45c1807c4e21e61fad235bb5c948893f0bf7107ea13c6f322e\": rpc error: code = NotFound desc = could not find container \"e4390ee405b9ff45c1807c4e21e61fad235bb5c948893f0bf7107ea13c6f322e\": container with ID starting with e4390ee405b9ff45c1807c4e21e61fad235bb5c948893f0bf7107ea13c6f322e not found: ID does not exist"
Mar 09 14:19:54 crc kubenswrapper[4722]: I0309 14:19:54.158137 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="720bc29d-ca91-4ee2-9a0c-5f32659e650e" path="/var/lib/kubelet/pods/720bc29d-ca91-4ee2-9a0c-5f32659e650e/volumes"
Mar 09 14:19:55 crc kubenswrapper[4722]: I0309 14:19:55.968854 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4tw5s7"]
Mar 09 14:19:55 crc kubenswrapper[4722]: E0309 14:19:55.969674 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="720bc29d-ca91-4ee2-9a0c-5f32659e650e" containerName="console"
Mar 09 14:19:55 crc kubenswrapper[4722]: I0309 14:19:55.969693 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="720bc29d-ca91-4ee2-9a0c-5f32659e650e" containerName="console"
Mar 09 14:19:55 crc kubenswrapper[4722]: I0309 14:19:55.969897 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="720bc29d-ca91-4ee2-9a0c-5f32659e650e" containerName="console"
Mar 09 14:19:55 crc kubenswrapper[4722]: I0309 14:19:55.971064 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4tw5s7"
Mar 09 14:19:55 crc kubenswrapper[4722]: I0309 14:19:55.975244 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Mar 09 14:19:55 crc kubenswrapper[4722]: I0309 14:19:55.977767 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4tw5s7"]
Mar 09 14:19:56 crc kubenswrapper[4722]: I0309 14:19:56.095292 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ef685835-f0f9-45e4-b0e1-9213895704e4-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4tw5s7\" (UID: \"ef685835-f0f9-45e4-b0e1-9213895704e4\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4tw5s7"
Mar 09 14:19:56 crc kubenswrapper[4722]: I0309 14:19:56.095340 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28mbz\" (UniqueName: \"kubernetes.io/projected/ef685835-f0f9-45e4-b0e1-9213895704e4-kube-api-access-28mbz\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4tw5s7\" (UID: \"ef685835-f0f9-45e4-b0e1-9213895704e4\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4tw5s7"
Mar 09 14:19:56 crc kubenswrapper[4722]: I0309 14:19:56.095578 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ef685835-f0f9-45e4-b0e1-9213895704e4-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4tw5s7\" (UID: \"ef685835-f0f9-45e4-b0e1-9213895704e4\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4tw5s7"
Mar 09 14:19:56 crc kubenswrapper[4722]: I0309 14:19:56.103259 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8dh75"
Mar 09 14:19:56 crc kubenswrapper[4722]: I0309 14:19:56.103358 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8dh75"
Mar 09 14:19:56 crc kubenswrapper[4722]: I0309 14:19:56.147824 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8dh75"
Mar 09 14:19:56 crc kubenswrapper[4722]: I0309 14:19:56.197509 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28mbz\" (UniqueName: \"kubernetes.io/projected/ef685835-f0f9-45e4-b0e1-9213895704e4-kube-api-access-28mbz\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4tw5s7\" (UID: \"ef685835-f0f9-45e4-b0e1-9213895704e4\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4tw5s7"
Mar 09 14:19:56 crc kubenswrapper[4722]: I0309 14:19:56.197772 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ef685835-f0f9-45e4-b0e1-9213895704e4-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4tw5s7\" (UID: \"ef685835-f0f9-45e4-b0e1-9213895704e4\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4tw5s7"
Mar 09 14:19:56 crc kubenswrapper[4722]: I0309 14:19:56.197964 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ef685835-f0f9-45e4-b0e1-9213895704e4-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4tw5s7\" (UID: \"ef685835-f0f9-45e4-b0e1-9213895704e4\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4tw5s7"
Mar 09 14:19:56 crc kubenswrapper[4722]: I0309 14:19:56.198483 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ef685835-f0f9-45e4-b0e1-9213895704e4-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4tw5s7\" (UID: \"ef685835-f0f9-45e4-b0e1-9213895704e4\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4tw5s7"
Mar 09 14:19:56 crc kubenswrapper[4722]: I0309 14:19:56.198693 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ef685835-f0f9-45e4-b0e1-9213895704e4-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4tw5s7\" (UID: \"ef685835-f0f9-45e4-b0e1-9213895704e4\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4tw5s7"
Mar 09 14:19:56 crc kubenswrapper[4722]: I0309 14:19:56.238225 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28mbz\" (UniqueName: \"kubernetes.io/projected/ef685835-f0f9-45e4-b0e1-9213895704e4-kube-api-access-28mbz\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4tw5s7\" (UID: \"ef685835-f0f9-45e4-b0e1-9213895704e4\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4tw5s7"
Mar 09 14:19:56 crc kubenswrapper[4722]: I0309 14:19:56.294989 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4tw5s7"
Mar 09 14:19:56 crc kubenswrapper[4722]: I0309 14:19:56.651756 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8dh75"
Mar 09 14:19:56 crc kubenswrapper[4722]: I0309 14:19:56.765629 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4tw5s7"]
Mar 09 14:19:57 crc kubenswrapper[4722]: I0309 14:19:57.612265 4722 generic.go:334] "Generic (PLEG): container finished" podID="ef685835-f0f9-45e4-b0e1-9213895704e4" containerID="16b88ee7bd498f079639504776babed26d23684e2d65f9c4d6dcad0583a4698a" exitCode=0
Mar 09 14:19:57 crc kubenswrapper[4722]: I0309 14:19:57.612362 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4tw5s7" event={"ID":"ef685835-f0f9-45e4-b0e1-9213895704e4","Type":"ContainerDied","Data":"16b88ee7bd498f079639504776babed26d23684e2d65f9c4d6dcad0583a4698a"}
Mar 09 14:19:57 crc kubenswrapper[4722]: I0309 14:19:57.614038 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4tw5s7" event={"ID":"ef685835-f0f9-45e4-b0e1-9213895704e4","Type":"ContainerStarted","Data":"460c9d4c3c744a9f5bcfb880ae2e48e4b70b413cdc80576327e167887f9f52d7"}
Mar 09 14:19:59 crc kubenswrapper[4722]: I0309 14:19:59.514755 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8dh75"]
Mar 09 14:19:59 crc kubenswrapper[4722]: I0309 14:19:59.515313 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8dh75" podUID="8ba97601-bd7e-4763-8c59-0af1273fd01f" containerName="registry-server" containerID="cri-o://9cc4456f91b3c0afe2203a8e93a844650068489d457671f1b99643cd32e27b29" gracePeriod=2
Mar 09 14:19:59 crc kubenswrapper[4722]: I0309 14:19:59.632034 4722 generic.go:334] "Generic (PLEG): container finished" podID="ef685835-f0f9-45e4-b0e1-9213895704e4" containerID="5858b1e108c172290f693fbe2bd0a2a73bdba77b06bc81c1e88118016630eed8" exitCode=0
Mar 09 14:19:59 crc kubenswrapper[4722]: I0309 14:19:59.632079 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4tw5s7" event={"ID":"ef685835-f0f9-45e4-b0e1-9213895704e4","Type":"ContainerDied","Data":"5858b1e108c172290f693fbe2bd0a2a73bdba77b06bc81c1e88118016630eed8"}
Mar 09 14:20:00 crc kubenswrapper[4722]: I0309 14:20:00.037282 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8dh75"
Mar 09 14:20:00 crc kubenswrapper[4722]: I0309 14:20:00.133124 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551100-2cmr4"]
Mar 09 14:20:00 crc kubenswrapper[4722]: E0309 14:20:00.133489 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ba97601-bd7e-4763-8c59-0af1273fd01f" containerName="extract-content"
Mar 09 14:20:00 crc kubenswrapper[4722]: I0309 14:20:00.133505 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ba97601-bd7e-4763-8c59-0af1273fd01f" containerName="extract-content"
Mar 09 14:20:00 crc kubenswrapper[4722]: E0309 14:20:00.133527 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ba97601-bd7e-4763-8c59-0af1273fd01f" containerName="registry-server"
Mar 09 14:20:00 crc kubenswrapper[4722]: I0309 14:20:00.133532 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ba97601-bd7e-4763-8c59-0af1273fd01f" containerName="registry-server"
Mar 09 14:20:00 crc kubenswrapper[4722]: E0309 14:20:00.133573 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ba97601-bd7e-4763-8c59-0af1273fd01f" containerName="extract-utilities"
Mar 09 14:20:00 crc kubenswrapper[4722]: I0309 14:20:00.133580 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ba97601-bd7e-4763-8c59-0af1273fd01f" containerName="extract-utilities"
Mar 09 14:20:00 crc kubenswrapper[4722]: I0309 14:20:00.133753 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ba97601-bd7e-4763-8c59-0af1273fd01f" containerName="registry-server"
Mar 09 14:20:00 crc kubenswrapper[4722]: I0309 14:20:00.134289 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551100-2cmr4"
Mar 09 14:20:00 crc kubenswrapper[4722]: I0309 14:20:00.139890 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-dr2x6"
Mar 09 14:20:00 crc kubenswrapper[4722]: I0309 14:20:00.139931 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 14:20:00 crc kubenswrapper[4722]: I0309 14:20:00.139963 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 14:20:00 crc kubenswrapper[4722]: I0309 14:20:00.161015 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ba97601-bd7e-4763-8c59-0af1273fd01f-utilities\") pod \"8ba97601-bd7e-4763-8c59-0af1273fd01f\" (UID: \"8ba97601-bd7e-4763-8c59-0af1273fd01f\") "
Mar 09 14:20:00 crc kubenswrapper[4722]: I0309 14:20:00.161054 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ba97601-bd7e-4763-8c59-0af1273fd01f-catalog-content\") pod \"8ba97601-bd7e-4763-8c59-0af1273fd01f\" (UID: \"8ba97601-bd7e-4763-8c59-0af1273fd01f\") "
Mar 09 14:20:00 crc kubenswrapper[4722]: I0309 14:20:00.161137 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kd2vn\" (UniqueName: \"kubernetes.io/projected/8ba97601-bd7e-4763-8c59-0af1273fd01f-kube-api-access-kd2vn\") pod \"8ba97601-bd7e-4763-8c59-0af1273fd01f\" (UID: \"8ba97601-bd7e-4763-8c59-0af1273fd01f\") "
Mar 09 14:20:00 crc kubenswrapper[4722]: I0309 14:20:00.163317 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ba97601-bd7e-4763-8c59-0af1273fd01f-utilities" (OuterVolumeSpecName: "utilities") pod "8ba97601-bd7e-4763-8c59-0af1273fd01f" (UID: "8ba97601-bd7e-4763-8c59-0af1273fd01f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 14:20:00 crc kubenswrapper[4722]: I0309 14:20:00.170857 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551100-2cmr4"]
Mar 09 14:20:00 crc kubenswrapper[4722]: I0309 14:20:00.188625 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ba97601-bd7e-4763-8c59-0af1273fd01f-kube-api-access-kd2vn" (OuterVolumeSpecName: "kube-api-access-kd2vn") pod "8ba97601-bd7e-4763-8c59-0af1273fd01f" (UID: "8ba97601-bd7e-4763-8c59-0af1273fd01f"). InnerVolumeSpecName "kube-api-access-kd2vn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:20:00 crc kubenswrapper[4722]: I0309 14:20:00.196541 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ba97601-bd7e-4763-8c59-0af1273fd01f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8ba97601-bd7e-4763-8c59-0af1273fd01f" (UID: "8ba97601-bd7e-4763-8c59-0af1273fd01f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 14:20:00 crc kubenswrapper[4722]: I0309 14:20:00.263101 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8xvb\" (UniqueName: \"kubernetes.io/projected/bbc4a147-c662-4236-b11b-16239fa031a0-kube-api-access-r8xvb\") pod \"auto-csr-approver-29551100-2cmr4\" (UID: \"bbc4a147-c662-4236-b11b-16239fa031a0\") " pod="openshift-infra/auto-csr-approver-29551100-2cmr4"
Mar 09 14:20:00 crc kubenswrapper[4722]: I0309 14:20:00.263227 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kd2vn\" (UniqueName: \"kubernetes.io/projected/8ba97601-bd7e-4763-8c59-0af1273fd01f-kube-api-access-kd2vn\") on node \"crc\" DevicePath \"\""
Mar 09 14:20:00 crc kubenswrapper[4722]: I0309 14:20:00.263248 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ba97601-bd7e-4763-8c59-0af1273fd01f-utilities\") on node \"crc\" DevicePath \"\""
Mar 09 14:20:00 crc kubenswrapper[4722]: I0309 14:20:00.263262 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ba97601-bd7e-4763-8c59-0af1273fd01f-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 09 14:20:00 crc kubenswrapper[4722]: I0309 14:20:00.364580 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8xvb\" (UniqueName: \"kubernetes.io/projected/bbc4a147-c662-4236-b11b-16239fa031a0-kube-api-access-r8xvb\") pod \"auto-csr-approver-29551100-2cmr4\" (UID: \"bbc4a147-c662-4236-b11b-16239fa031a0\") " pod="openshift-infra/auto-csr-approver-29551100-2cmr4"
Mar 09 14:20:00 crc kubenswrapper[4722]: I0309 14:20:00.381736 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8xvb\" (UniqueName: \"kubernetes.io/projected/bbc4a147-c662-4236-b11b-16239fa031a0-kube-api-access-r8xvb\") pod \"auto-csr-approver-29551100-2cmr4\" (UID: \"bbc4a147-c662-4236-b11b-16239fa031a0\") " pod="openshift-infra/auto-csr-approver-29551100-2cmr4"
Mar 09 14:20:00 crc kubenswrapper[4722]: I0309 14:20:00.455737 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551100-2cmr4"
Mar 09 14:20:00 crc kubenswrapper[4722]: I0309 14:20:00.646037 4722 generic.go:334] "Generic (PLEG): container finished" podID="8ba97601-bd7e-4763-8c59-0af1273fd01f" containerID="9cc4456f91b3c0afe2203a8e93a844650068489d457671f1b99643cd32e27b29" exitCode=0
Mar 09 14:20:00 crc kubenswrapper[4722]: I0309 14:20:00.646117 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8dh75"
Mar 09 14:20:00 crc kubenswrapper[4722]: I0309 14:20:00.646118 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8dh75" event={"ID":"8ba97601-bd7e-4763-8c59-0af1273fd01f","Type":"ContainerDied","Data":"9cc4456f91b3c0afe2203a8e93a844650068489d457671f1b99643cd32e27b29"}
Mar 09 14:20:00 crc kubenswrapper[4722]: I0309 14:20:00.646264 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8dh75" event={"ID":"8ba97601-bd7e-4763-8c59-0af1273fd01f","Type":"ContainerDied","Data":"4b13b28d43d58c9a022ddbebb03cccb80b343a8768f5d90ba0df030ca8e872ea"}
Mar 09 14:20:00 crc kubenswrapper[4722]: I0309 14:20:00.646299 4722 scope.go:117] "RemoveContainer" containerID="9cc4456f91b3c0afe2203a8e93a844650068489d457671f1b99643cd32e27b29"
Mar 09 14:20:00 crc kubenswrapper[4722]: I0309 14:20:00.653243 4722 generic.go:334] "Generic (PLEG): container finished" podID="ef685835-f0f9-45e4-b0e1-9213895704e4" containerID="85affa059ff9742f69e030ff1d6f24111eb4565ecadaf835607a83a1b824a866" exitCode=0
Mar 09 14:20:00 crc kubenswrapper[4722]: I0309 14:20:00.653642 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4tw5s7" event={"ID":"ef685835-f0f9-45e4-b0e1-9213895704e4","Type":"ContainerDied","Data":"85affa059ff9742f69e030ff1d6f24111eb4565ecadaf835607a83a1b824a866"}
Mar 09 14:20:00 crc kubenswrapper[4722]: I0309 14:20:00.683674 4722 scope.go:117] "RemoveContainer" containerID="87f90862d3f61c97ac117dcdc06dd0b503aeeb88a959ccae2c53f92874ee295b"
Mar 09 14:20:00 crc kubenswrapper[4722]: I0309 14:20:00.692783 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8dh75"]
Mar 09 14:20:00 crc kubenswrapper[4722]: I0309 14:20:00.705161 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8dh75"]
Mar 09 14:20:00 crc kubenswrapper[4722]: I0309 14:20:00.711974 4722 scope.go:117] "RemoveContainer" containerID="6ea932039d4db6c98713a4fd135a2bf606f0659160cc20109a72cc8251652047"
Mar 09 14:20:00 crc kubenswrapper[4722]: I0309 14:20:00.736464 4722 scope.go:117] "RemoveContainer" containerID="9cc4456f91b3c0afe2203a8e93a844650068489d457671f1b99643cd32e27b29"
Mar 09 14:20:00 crc kubenswrapper[4722]: E0309 14:20:00.736731 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cc4456f91b3c0afe2203a8e93a844650068489d457671f1b99643cd32e27b29\": container with ID starting with 9cc4456f91b3c0afe2203a8e93a844650068489d457671f1b99643cd32e27b29 not found: ID does not exist" containerID="9cc4456f91b3c0afe2203a8e93a844650068489d457671f1b99643cd32e27b29"
Mar 09 14:20:00 crc kubenswrapper[4722]: I0309 14:20:00.736771 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cc4456f91b3c0afe2203a8e93a844650068489d457671f1b99643cd32e27b29"} err="failed to get container status \"9cc4456f91b3c0afe2203a8e93a844650068489d457671f1b99643cd32e27b29\": rpc error: code = NotFound desc = could not find container \"9cc4456f91b3c0afe2203a8e93a844650068489d457671f1b99643cd32e27b29\": container with ID starting with 9cc4456f91b3c0afe2203a8e93a844650068489d457671f1b99643cd32e27b29 not found: ID does not exist"
Mar 09 14:20:00 crc kubenswrapper[4722]: I0309 14:20:00.736804 4722 scope.go:117] "RemoveContainer" containerID="87f90862d3f61c97ac117dcdc06dd0b503aeeb88a959ccae2c53f92874ee295b"
Mar 09 14:20:00 crc kubenswrapper[4722]: E0309 14:20:00.737140 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87f90862d3f61c97ac117dcdc06dd0b503aeeb88a959ccae2c53f92874ee295b\": container with ID starting with 87f90862d3f61c97ac117dcdc06dd0b503aeeb88a959ccae2c53f92874ee295b not found: ID does not exist" containerID="87f90862d3f61c97ac117dcdc06dd0b503aeeb88a959ccae2c53f92874ee295b"
Mar 09 14:20:00 crc kubenswrapper[4722]: I0309 14:20:00.737173 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87f90862d3f61c97ac117dcdc06dd0b503aeeb88a959ccae2c53f92874ee295b"} err="failed to get container status \"87f90862d3f61c97ac117dcdc06dd0b503aeeb88a959ccae2c53f92874ee295b\": rpc error: code = NotFound desc = could not find container \"87f90862d3f61c97ac117dcdc06dd0b503aeeb88a959ccae2c53f92874ee295b\": container with ID starting with 87f90862d3f61c97ac117dcdc06dd0b503aeeb88a959ccae2c53f92874ee295b not found: ID does not exist"
Mar 09 14:20:00 crc kubenswrapper[4722]: I0309 14:20:00.737195 4722 scope.go:117] "RemoveContainer" containerID="6ea932039d4db6c98713a4fd135a2bf606f0659160cc20109a72cc8251652047"
Mar 09 14:20:00 crc kubenswrapper[4722]: E0309 14:20:00.737488 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ea932039d4db6c98713a4fd135a2bf606f0659160cc20109a72cc8251652047\": container with ID starting with 6ea932039d4db6c98713a4fd135a2bf606f0659160cc20109a72cc8251652047 not found: ID does not exist" containerID="6ea932039d4db6c98713a4fd135a2bf606f0659160cc20109a72cc8251652047"
Mar 09 14:20:00 crc kubenswrapper[4722]: I0309 14:20:00.737513 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ea932039d4db6c98713a4fd135a2bf606f0659160cc20109a72cc8251652047"} err="failed to get container status \"6ea932039d4db6c98713a4fd135a2bf606f0659160cc20109a72cc8251652047\": rpc error: code = NotFound desc = could not find container \"6ea932039d4db6c98713a4fd135a2bf606f0659160cc20109a72cc8251652047\": container with ID starting with 6ea932039d4db6c98713a4fd135a2bf606f0659160cc20109a72cc8251652047 not found: ID does not exist"
Mar 09 14:20:00 crc kubenswrapper[4722]: I0309 14:20:00.949921 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551100-2cmr4"]
Mar 09 14:20:01 crc kubenswrapper[4722]: I0309 14:20:01.664340 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551100-2cmr4" event={"ID":"bbc4a147-c662-4236-b11b-16239fa031a0","Type":"ContainerStarted","Data":"5be35280f29ece0ea0a0fb7adc7ea9f45223d2e6f3c34d1375e2292d78b6ffb1"}
Mar 09 14:20:02 crc kubenswrapper[4722]: I0309 14:20:02.038259 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4tw5s7"
Mar 09 14:20:02 crc kubenswrapper[4722]: I0309 14:20:02.168791 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ba97601-bd7e-4763-8c59-0af1273fd01f" path="/var/lib/kubelet/pods/8ba97601-bd7e-4763-8c59-0af1273fd01f/volumes"
Mar 09 14:20:02 crc kubenswrapper[4722]: I0309 14:20:02.201675 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ef685835-f0f9-45e4-b0e1-9213895704e4-bundle\") pod \"ef685835-f0f9-45e4-b0e1-9213895704e4\" (UID: \"ef685835-f0f9-45e4-b0e1-9213895704e4\") "
Mar 09 14:20:02 crc kubenswrapper[4722]: I0309 14:20:02.201748 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ef685835-f0f9-45e4-b0e1-9213895704e4-util\") pod \"ef685835-f0f9-45e4-b0e1-9213895704e4\" (UID: \"ef685835-f0f9-45e4-b0e1-9213895704e4\") "
Mar 09 14:20:02 crc kubenswrapper[4722]: I0309 14:20:02.201819 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28mbz\" (UniqueName: \"kubernetes.io/projected/ef685835-f0f9-45e4-b0e1-9213895704e4-kube-api-access-28mbz\") pod \"ef685835-f0f9-45e4-b0e1-9213895704e4\" (UID: \"ef685835-f0f9-45e4-b0e1-9213895704e4\") "
Mar 09 14:20:02 crc kubenswrapper[4722]: I0309 14:20:02.202867 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef685835-f0f9-45e4-b0e1-9213895704e4-bundle" (OuterVolumeSpecName: "bundle") pod "ef685835-f0f9-45e4-b0e1-9213895704e4" (UID: "ef685835-f0f9-45e4-b0e1-9213895704e4"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 14:20:02 crc kubenswrapper[4722]: I0309 14:20:02.207951 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef685835-f0f9-45e4-b0e1-9213895704e4-kube-api-access-28mbz" (OuterVolumeSpecName: "kube-api-access-28mbz") pod "ef685835-f0f9-45e4-b0e1-9213895704e4" (UID: "ef685835-f0f9-45e4-b0e1-9213895704e4"). InnerVolumeSpecName "kube-api-access-28mbz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:20:02 crc kubenswrapper[4722]: I0309 14:20:02.303157 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28mbz\" (UniqueName: \"kubernetes.io/projected/ef685835-f0f9-45e4-b0e1-9213895704e4-kube-api-access-28mbz\") on node \"crc\" DevicePath \"\""
Mar 09 14:20:02 crc kubenswrapper[4722]: I0309 14:20:02.303216 4722 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ef685835-f0f9-45e4-b0e1-9213895704e4-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 14:20:02 crc kubenswrapper[4722]: I0309 14:20:02.453735 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef685835-f0f9-45e4-b0e1-9213895704e4-util" (OuterVolumeSpecName: "util") pod "ef685835-f0f9-45e4-b0e1-9213895704e4" (UID: "ef685835-f0f9-45e4-b0e1-9213895704e4"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 14:20:02 crc kubenswrapper[4722]: I0309 14:20:02.506984 4722 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ef685835-f0f9-45e4-b0e1-9213895704e4-util\") on node \"crc\" DevicePath \"\""
Mar 09 14:20:02 crc kubenswrapper[4722]: I0309 14:20:02.674147 4722 generic.go:334] "Generic (PLEG): container finished" podID="bbc4a147-c662-4236-b11b-16239fa031a0" containerID="dd11c5d281c72d880edcd7ab0de997dadc7ed6b7c81eb6f7415412e1d4bb7a0f" exitCode=0
Mar 09 14:20:02 crc kubenswrapper[4722]: I0309 14:20:02.674234 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551100-2cmr4" event={"ID":"bbc4a147-c662-4236-b11b-16239fa031a0","Type":"ContainerDied","Data":"dd11c5d281c72d880edcd7ab0de997dadc7ed6b7c81eb6f7415412e1d4bb7a0f"}
Mar 09 14:20:02 crc kubenswrapper[4722]: I0309 14:20:02.676588 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4tw5s7" event={"ID":"ef685835-f0f9-45e4-b0e1-9213895704e4","Type":"ContainerDied","Data":"460c9d4c3c744a9f5bcfb880ae2e48e4b70b413cdc80576327e167887f9f52d7"}
Mar 09 14:20:02 crc kubenswrapper[4722]: I0309 14:20:02.676613 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="460c9d4c3c744a9f5bcfb880ae2e48e4b70b413cdc80576327e167887f9f52d7"
Mar 09 14:20:02 crc kubenswrapper[4722]: I0309 14:20:02.676650 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4tw5s7"
Mar 09 14:20:03 crc kubenswrapper[4722]: I0309 14:20:03.964879 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551100-2cmr4"
Mar 09 14:20:04 crc kubenswrapper[4722]: I0309 14:20:04.032585 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8xvb\" (UniqueName: \"kubernetes.io/projected/bbc4a147-c662-4236-b11b-16239fa031a0-kube-api-access-r8xvb\") pod \"bbc4a147-c662-4236-b11b-16239fa031a0\" (UID: \"bbc4a147-c662-4236-b11b-16239fa031a0\") "
Mar 09 14:20:04 crc kubenswrapper[4722]: I0309 14:20:04.040641 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbc4a147-c662-4236-b11b-16239fa031a0-kube-api-access-r8xvb" (OuterVolumeSpecName: "kube-api-access-r8xvb") pod "bbc4a147-c662-4236-b11b-16239fa031a0" (UID: "bbc4a147-c662-4236-b11b-16239fa031a0"). InnerVolumeSpecName "kube-api-access-r8xvb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:20:04 crc kubenswrapper[4722]: I0309 14:20:04.134358 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8xvb\" (UniqueName: \"kubernetes.io/projected/bbc4a147-c662-4236-b11b-16239fa031a0-kube-api-access-r8xvb\") on node \"crc\" DevicePath \"\""
Mar 09 14:20:04 crc kubenswrapper[4722]: I0309 14:20:04.697661 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551100-2cmr4" event={"ID":"bbc4a147-c662-4236-b11b-16239fa031a0","Type":"ContainerDied","Data":"5be35280f29ece0ea0a0fb7adc7ea9f45223d2e6f3c34d1375e2292d78b6ffb1"}
Mar 09 14:20:04 crc kubenswrapper[4722]: I0309 14:20:04.698042 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5be35280f29ece0ea0a0fb7adc7ea9f45223d2e6f3c34d1375e2292d78b6ffb1"
Mar 09 14:20:04 crc kubenswrapper[4722]: I0309 14:20:04.698121 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551100-2cmr4"
Mar 09 14:20:05 crc kubenswrapper[4722]: I0309 14:20:05.046017 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551094-7t8bv"]
Mar 09 14:20:05 crc kubenswrapper[4722]: I0309 14:20:05.058643 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551094-7t8bv"]
Mar 09 14:20:06 crc kubenswrapper[4722]: I0309 14:20:06.157134 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fcf0635-09ce-4b6b-b899-b51db22e1b37" path="/var/lib/kubelet/pods/0fcf0635-09ce-4b6b-b899-b51db22e1b37/volumes"
Mar 09 14:20:08 crc kubenswrapper[4722]: I0309 14:20:08.126640 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-z644p"]
Mar 09 14:20:08 crc kubenswrapper[4722]: E0309 14:20:08.127238 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef685835-f0f9-45e4-b0e1-9213895704e4" containerName="extract"
Mar 09 14:20:08 crc kubenswrapper[4722]: I0309 14:20:08.127254 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef685835-f0f9-45e4-b0e1-9213895704e4" containerName="extract"
Mar 09 14:20:08 crc kubenswrapper[4722]: E0309 14:20:08.127301 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef685835-f0f9-45e4-b0e1-9213895704e4" containerName="pull"
Mar 09 14:20:08 crc kubenswrapper[4722]: I0309 14:20:08.127310 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef685835-f0f9-45e4-b0e1-9213895704e4" containerName="pull"
Mar 09 14:20:08 crc kubenswrapper[4722]: E0309 14:20:08.127334 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbc4a147-c662-4236-b11b-16239fa031a0" containerName="oc"
Mar 09 14:20:08 crc kubenswrapper[4722]: I0309 14:20:08.127342 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbc4a147-c662-4236-b11b-16239fa031a0" containerName="oc"
Mar 09 14:20:08 crc kubenswrapper[4722]: E0309 14:20:08.127358 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef685835-f0f9-45e4-b0e1-9213895704e4" containerName="util"
Mar 09 14:20:08 crc kubenswrapper[4722]: I0309 14:20:08.127364 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef685835-f0f9-45e4-b0e1-9213895704e4" containerName="util"
Mar 09 14:20:08 crc kubenswrapper[4722]: I0309 14:20:08.127496 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef685835-f0f9-45e4-b0e1-9213895704e4" containerName="extract"
Mar 09 14:20:08 crc kubenswrapper[4722]: I0309 14:20:08.127512 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbc4a147-c662-4236-b11b-16239fa031a0" containerName="oc"
Mar 09 14:20:08 crc kubenswrapper[4722]: I0309 14:20:08.128524 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-z644p"
Mar 09 14:20:08 crc kubenswrapper[4722]: I0309 14:20:08.158458 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-z644p"]
Mar 09 14:20:08 crc kubenswrapper[4722]: I0309 14:20:08.307085 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g29r4\" (UniqueName: \"kubernetes.io/projected/2880773d-b66d-4d9b-9185-24ff99f9d7c2-kube-api-access-g29r4\") pod \"community-operators-z644p\" (UID: \"2880773d-b66d-4d9b-9185-24ff99f9d7c2\") " pod="openshift-marketplace/community-operators-z644p"
Mar 09 14:20:08 crc kubenswrapper[4722]: I0309 14:20:08.307183 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2880773d-b66d-4d9b-9185-24ff99f9d7c2-utilities\") pod \"community-operators-z644p\" (UID: \"2880773d-b66d-4d9b-9185-24ff99f9d7c2\") " pod="openshift-marketplace/community-operators-z644p"
Mar 09 14:20:08 crc kubenswrapper[4722]: I0309 14:20:08.307218 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2880773d-b66d-4d9b-9185-24ff99f9d7c2-catalog-content\") pod \"community-operators-z644p\" (UID: \"2880773d-b66d-4d9b-9185-24ff99f9d7c2\") " pod="openshift-marketplace/community-operators-z644p"
Mar 09 14:20:08 crc kubenswrapper[4722]: I0309 14:20:08.408639 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g29r4\" (UniqueName: \"kubernetes.io/projected/2880773d-b66d-4d9b-9185-24ff99f9d7c2-kube-api-access-g29r4\") pod \"community-operators-z644p\" (UID: \"2880773d-b66d-4d9b-9185-24ff99f9d7c2\") " pod="openshift-marketplace/community-operators-z644p"
Mar 09 14:20:08 crc kubenswrapper[4722]: I0309 14:20:08.408698 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2880773d-b66d-4d9b-9185-24ff99f9d7c2-utilities\") pod \"community-operators-z644p\" (UID: \"2880773d-b66d-4d9b-9185-24ff99f9d7c2\") " pod="openshift-marketplace/community-operators-z644p"
Mar 09 14:20:08 crc kubenswrapper[4722]: I0309 14:20:08.408721 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2880773d-b66d-4d9b-9185-24ff99f9d7c2-catalog-content\") pod \"community-operators-z644p\" (UID: \"2880773d-b66d-4d9b-9185-24ff99f9d7c2\") " pod="openshift-marketplace/community-operators-z644p"
Mar 09 14:20:08 crc kubenswrapper[4722]: I0309 14:20:08.409173 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2880773d-b66d-4d9b-9185-24ff99f9d7c2-catalog-content\") pod \"community-operators-z644p\" (UID: \"2880773d-b66d-4d9b-9185-24ff99f9d7c2\") " pod="openshift-marketplace/community-operators-z644p"
Mar 09 14:20:08 crc kubenswrapper[4722]: I0309 14:20:08.409453 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2880773d-b66d-4d9b-9185-24ff99f9d7c2-utilities\") pod \"community-operators-z644p\" (UID: \"2880773d-b66d-4d9b-9185-24ff99f9d7c2\") " pod="openshift-marketplace/community-operators-z644p"
Mar 09 14:20:08 crc kubenswrapper[4722]: I0309 14:20:08.427255 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g29r4\" (UniqueName: \"kubernetes.io/projected/2880773d-b66d-4d9b-9185-24ff99f9d7c2-kube-api-access-g29r4\") pod \"community-operators-z644p\" (UID: \"2880773d-b66d-4d9b-9185-24ff99f9d7c2\") " pod="openshift-marketplace/community-operators-z644p"
Mar 09 14:20:08 crc kubenswrapper[4722]: I0309 14:20:08.482668 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-z644p"
Mar 09 14:20:08 crc kubenswrapper[4722]: I0309 14:20:08.941531 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-z644p"]
Mar 09 14:20:09 crc kubenswrapper[4722]: I0309 14:20:09.745616 4722 generic.go:334] "Generic (PLEG): container finished" podID="2880773d-b66d-4d9b-9185-24ff99f9d7c2" containerID="2daff81b12e26c66af8e1678202ade9fa2c8758d57425d0377e2efa872fcc4c1" exitCode=0
Mar 09 14:20:09 crc kubenswrapper[4722]: I0309 14:20:09.745660 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z644p" event={"ID":"2880773d-b66d-4d9b-9185-24ff99f9d7c2","Type":"ContainerDied","Data":"2daff81b12e26c66af8e1678202ade9fa2c8758d57425d0377e2efa872fcc4c1"}
Mar 09 14:20:09 crc kubenswrapper[4722]: I0309 14:20:09.745686 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z644p" event={"ID":"2880773d-b66d-4d9b-9185-24ff99f9d7c2","Type":"ContainerStarted","Data":"4f68c753373d862956dfac7013fc33af2e89a73d33d12587bed5e00c2477a49d"}
Mar 09 14:20:10 crc kubenswrapper[4722]: I0309 14:20:10.756632 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z644p" event={"ID":"2880773d-b66d-4d9b-9185-24ff99f9d7c2","Type":"ContainerStarted","Data":"0d03caa6c4f6a04f0f5e9a0dd3ce52e217e583481ee4df446eb949ad00d64ccb"}
Mar 09 14:20:11 crc kubenswrapper[4722]: I0309 14:20:11.766664 4722 generic.go:334] "Generic (PLEG): container finished" podID="2880773d-b66d-4d9b-9185-24ff99f9d7c2" containerID="0d03caa6c4f6a04f0f5e9a0dd3ce52e217e583481ee4df446eb949ad00d64ccb" exitCode=0
Mar 09 14:20:11 crc kubenswrapper[4722]: I0309 14:20:11.766711 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z644p" event={"ID":"2880773d-b66d-4d9b-9185-24ff99f9d7c2","Type":"ContainerDied","Data":"0d03caa6c4f6a04f0f5e9a0dd3ce52e217e583481ee4df446eb949ad00d64ccb"}
Mar 09 14:20:12 crc kubenswrapper[4722]: I0309 14:20:12.777371 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z644p" event={"ID":"2880773d-b66d-4d9b-9185-24ff99f9d7c2","Type":"ContainerStarted","Data":"7cef14b9d6b3c3b69ea34a64a3c14e7dad05ff25ec8bda048748b920a10ad1c8"}
Mar 09 14:20:12 crc kubenswrapper[4722]: I0309 14:20:12.794845 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-z644p" podStartSLOduration=2.280982593 podStartE2EDuration="4.79482996s" podCreationTimestamp="2026-03-09 14:20:08 +0000 UTC" firstStartedPulling="2026-03-09 14:20:09.748899723 +0000 UTC m=+1050.304468339" lastFinishedPulling="2026-03-09 14:20:12.26274709 +0000 UTC m=+1052.818315706" observedRunningTime="2026-03-09 14:20:12.792063784 +0000 UTC m=+1053.347632380" watchObservedRunningTime="2026-03-09 14:20:12.79482996 +0000 UTC m=+1053.350398536"
Mar 09 14:20:13 crc kubenswrapper[4722]: I0309 14:20:13.091871 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-5689854475-89q94"]
Mar 09 14:20:13 crc kubenswrapper[4722]: I0309 14:20:13.093120 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5689854475-89q94"
Mar 09 14:20:13 crc kubenswrapper[4722]: I0309 14:20:13.095006 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert"
Mar 09 14:20:13 crc kubenswrapper[4722]: I0309 14:20:13.095303 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert"
Mar 09 14:20:13 crc kubenswrapper[4722]: I0309 14:20:13.095385 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-hpjnp"
Mar 09 14:20:13 crc kubenswrapper[4722]: I0309 14:20:13.096144 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt"
Mar 09 14:20:13 crc kubenswrapper[4722]: I0309 14:20:13.120337 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt"
Mar 09 14:20:13 crc kubenswrapper[4722]: I0309 14:20:13.154421 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5689854475-89q94"]
Mar 09 14:20:13 crc kubenswrapper[4722]: I0309 14:20:13.282774 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3ea04cb5-4d36-42a9-bb83-c6f943619d16-apiservice-cert\") pod \"metallb-operator-controller-manager-5689854475-89q94\" (UID: \"3ea04cb5-4d36-42a9-bb83-c6f943619d16\") " pod="metallb-system/metallb-operator-controller-manager-5689854475-89q94"
Mar 09 14:20:13 crc kubenswrapper[4722]: I0309 14:20:13.283126 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3ea04cb5-4d36-42a9-bb83-c6f943619d16-webhook-cert\") pod \"metallb-operator-controller-manager-5689854475-89q94\" (UID: \"3ea04cb5-4d36-42a9-bb83-c6f943619d16\") " pod="metallb-system/metallb-operator-controller-manager-5689854475-89q94"
Mar 09 14:20:13 crc kubenswrapper[4722]: I0309 14:20:13.283282 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7b69f\" (UniqueName: \"kubernetes.io/projected/3ea04cb5-4d36-42a9-bb83-c6f943619d16-kube-api-access-7b69f\") pod \"metallb-operator-controller-manager-5689854475-89q94\" (UID: \"3ea04cb5-4d36-42a9-bb83-c6f943619d16\") " pod="metallb-system/metallb-operator-controller-manager-5689854475-89q94"
Mar 09 14:20:13 crc kubenswrapper[4722]: I0309 14:20:13.385084 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3ea04cb5-4d36-42a9-bb83-c6f943619d16-apiservice-cert\") pod \"metallb-operator-controller-manager-5689854475-89q94\" (UID: \"3ea04cb5-4d36-42a9-bb83-c6f943619d16\") " pod="metallb-system/metallb-operator-controller-manager-5689854475-89q94"
Mar 09 14:20:13 crc kubenswrapper[4722]: I0309 14:20:13.385170 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3ea04cb5-4d36-42a9-bb83-c6f943619d16-webhook-cert\") pod \"metallb-operator-controller-manager-5689854475-89q94\" (UID: \"3ea04cb5-4d36-42a9-bb83-c6f943619d16\") " pod="metallb-system/metallb-operator-controller-manager-5689854475-89q94"
Mar 09 14:20:13 crc kubenswrapper[4722]: I0309 14:20:13.385222 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7b69f\" (UniqueName: \"kubernetes.io/projected/3ea04cb5-4d36-42a9-bb83-c6f943619d16-kube-api-access-7b69f\") pod \"metallb-operator-controller-manager-5689854475-89q94\" (UID: \"3ea04cb5-4d36-42a9-bb83-c6f943619d16\") " pod="metallb-system/metallb-operator-controller-manager-5689854475-89q94"
Mar 09 14:20:13 crc kubenswrapper[4722]: I0309 14:20:13.403162 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3ea04cb5-4d36-42a9-bb83-c6f943619d16-webhook-cert\") pod \"metallb-operator-controller-manager-5689854475-89q94\" (UID: \"3ea04cb5-4d36-42a9-bb83-c6f943619d16\") " pod="metallb-system/metallb-operator-controller-manager-5689854475-89q94"
Mar 09 14:20:13 crc kubenswrapper[4722]: I0309 14:20:13.403400 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3ea04cb5-4d36-42a9-bb83-c6f943619d16-apiservice-cert\") pod \"metallb-operator-controller-manager-5689854475-89q94\" (UID: \"3ea04cb5-4d36-42a9-bb83-c6f943619d16\") " pod="metallb-system/metallb-operator-controller-manager-5689854475-89q94"
Mar 09 14:20:13 crc kubenswrapper[4722]: I0309 14:20:13.413531 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7b69f\" (UniqueName: \"kubernetes.io/projected/3ea04cb5-4d36-42a9-bb83-c6f943619d16-kube-api-access-7b69f\") pod \"metallb-operator-controller-manager-5689854475-89q94\" (UID: \"3ea04cb5-4d36-42a9-bb83-c6f943619d16\") " pod="metallb-system/metallb-operator-controller-manager-5689854475-89q94"
Mar 09 14:20:13 crc kubenswrapper[4722]: I0309 14:20:13.421321 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5689854475-89q94"
Mar 09 14:20:13 crc kubenswrapper[4722]: I0309 14:20:13.447769 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-798745ff96-864pz"]
Mar 09 14:20:13 crc kubenswrapper[4722]: I0309 14:20:13.448688 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-798745ff96-864pz"
Mar 09 14:20:13 crc kubenswrapper[4722]: I0309 14:20:13.451159 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert"
Mar 09 14:20:13 crc kubenswrapper[4722]: I0309 14:20:13.451410 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Mar 09 14:20:13 crc kubenswrapper[4722]: I0309 14:20:13.451744 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-mfvbb"
Mar 09 14:20:13 crc kubenswrapper[4722]: I0309 14:20:13.476878 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-798745ff96-864pz"]
Mar 09 14:20:13 crc kubenswrapper[4722]: I0309 14:20:13.588654 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgvd2\" (UniqueName: \"kubernetes.io/projected/f67efad4-1b85-4f64-9e98-55eb2da89fb6-kube-api-access-pgvd2\") pod \"metallb-operator-webhook-server-798745ff96-864pz\" (UID: \"f67efad4-1b85-4f64-9e98-55eb2da89fb6\") " pod="metallb-system/metallb-operator-webhook-server-798745ff96-864pz"
Mar 09 14:20:13 crc kubenswrapper[4722]: I0309 14:20:13.588712 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f67efad4-1b85-4f64-9e98-55eb2da89fb6-apiservice-cert\") pod \"metallb-operator-webhook-server-798745ff96-864pz\" (UID: \"f67efad4-1b85-4f64-9e98-55eb2da89fb6\") " pod="metallb-system/metallb-operator-webhook-server-798745ff96-864pz"
Mar 09 14:20:13 crc kubenswrapper[4722]: I0309 14:20:13.588830 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f67efad4-1b85-4f64-9e98-55eb2da89fb6-webhook-cert\") pod \"metallb-operator-webhook-server-798745ff96-864pz\" (UID: \"f67efad4-1b85-4f64-9e98-55eb2da89fb6\") " pod="metallb-system/metallb-operator-webhook-server-798745ff96-864pz"
Mar 09 14:20:13 crc kubenswrapper[4722]: I0309 14:20:13.690477 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f67efad4-1b85-4f64-9e98-55eb2da89fb6-webhook-cert\") pod \"metallb-operator-webhook-server-798745ff96-864pz\" (UID: \"f67efad4-1b85-4f64-9e98-55eb2da89fb6\") " pod="metallb-system/metallb-operator-webhook-server-798745ff96-864pz"
Mar 09 14:20:13 crc kubenswrapper[4722]: I0309 14:20:13.690612 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgvd2\" (UniqueName: \"kubernetes.io/projected/f67efad4-1b85-4f64-9e98-55eb2da89fb6-kube-api-access-pgvd2\") pod \"metallb-operator-webhook-server-798745ff96-864pz\" (UID: \"f67efad4-1b85-4f64-9e98-55eb2da89fb6\") " pod="metallb-system/metallb-operator-webhook-server-798745ff96-864pz"
Mar 09 14:20:13 crc kubenswrapper[4722]: I0309 14:20:13.690645 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f67efad4-1b85-4f64-9e98-55eb2da89fb6-apiservice-cert\") pod \"metallb-operator-webhook-server-798745ff96-864pz\" (UID: \"f67efad4-1b85-4f64-9e98-55eb2da89fb6\") " pod="metallb-system/metallb-operator-webhook-server-798745ff96-864pz"
Mar 09 14:20:13 crc kubenswrapper[4722]: I0309 14:20:13.695281 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f67efad4-1b85-4f64-9e98-55eb2da89fb6-webhook-cert\") pod \"metallb-operator-webhook-server-798745ff96-864pz\" (UID: \"f67efad4-1b85-4f64-9e98-55eb2da89fb6\") " pod="metallb-system/metallb-operator-webhook-server-798745ff96-864pz"
Mar 09 14:20:13 crc kubenswrapper[4722]: I0309 14:20:13.697744 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f67efad4-1b85-4f64-9e98-55eb2da89fb6-apiservice-cert\") pod \"metallb-operator-webhook-server-798745ff96-864pz\" (UID: \"f67efad4-1b85-4f64-9e98-55eb2da89fb6\") " pod="metallb-system/metallb-operator-webhook-server-798745ff96-864pz"
Mar 09 14:20:13 crc kubenswrapper[4722]: I0309 14:20:13.708584 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgvd2\" (UniqueName: \"kubernetes.io/projected/f67efad4-1b85-4f64-9e98-55eb2da89fb6-kube-api-access-pgvd2\") pod \"metallb-operator-webhook-server-798745ff96-864pz\" (UID: \"f67efad4-1b85-4f64-9e98-55eb2da89fb6\") " pod="metallb-system/metallb-operator-webhook-server-798745ff96-864pz"
Mar 09 14:20:13 crc kubenswrapper[4722]: I0309 14:20:13.815097 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-798745ff96-864pz"
Mar 09 14:20:13 crc kubenswrapper[4722]: I0309 14:20:13.991794 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5689854475-89q94"]
Mar 09 14:20:14 crc kubenswrapper[4722]: I0309 14:20:14.316442 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-798745ff96-864pz"]
Mar 09 14:20:14 crc kubenswrapper[4722]: W0309 14:20:14.327987 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf67efad4_1b85_4f64_9e98_55eb2da89fb6.slice/crio-5a23eac148306ab84e5ecf1d78308902d8d217898d74673308c98cb2ced33c62 WatchSource:0}: Error finding container 5a23eac148306ab84e5ecf1d78308902d8d217898d74673308c98cb2ced33c62: Status 404 returned error can't find the container with id 5a23eac148306ab84e5ecf1d78308902d8d217898d74673308c98cb2ced33c62
Mar 09 14:20:14 crc kubenswrapper[4722]: I0309 14:20:14.792186 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5689854475-89q94" event={"ID":"3ea04cb5-4d36-42a9-bb83-c6f943619d16","Type":"ContainerStarted","Data":"f5e928f807ae1e63169a6b2cf09beb145bbd6f9e82dfed619a4219aab02e1532"}
Mar 09 14:20:14 crc kubenswrapper[4722]: I0309 14:20:14.793333 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-798745ff96-864pz" event={"ID":"f67efad4-1b85-4f64-9e98-55eb2da89fb6","Type":"ContainerStarted","Data":"5a23eac148306ab84e5ecf1d78308902d8d217898d74673308c98cb2ced33c62"}
Mar 09 14:20:16 crc kubenswrapper[4722]: I0309 14:20:16.006048 4722 scope.go:117] "RemoveContainer" containerID="84c5a23e2b2328d9ab22aebec1965a907d8d90028ccd5b6209b26045f99b6725"
Mar 09 14:20:18 crc kubenswrapper[4722]: I0309 14:20:18.482886 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-z644p"
Mar 09 14:20:18 crc kubenswrapper[4722]: I0309 14:20:18.483609 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness"
status="" pod="openshift-marketplace/community-operators-z644p" Mar 09 14:20:18 crc kubenswrapper[4722]: I0309 14:20:18.533720 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-z644p" Mar 09 14:20:18 crc kubenswrapper[4722]: I0309 14:20:18.880883 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-z644p" Mar 09 14:20:20 crc kubenswrapper[4722]: I0309 14:20:20.838394 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5689854475-89q94" event={"ID":"3ea04cb5-4d36-42a9-bb83-c6f943619d16","Type":"ContainerStarted","Data":"0d966fe7c2e8e3b2f7acee820d22775c7a77d23eaef3601e29397335bcda71ac"} Mar 09 14:20:20 crc kubenswrapper[4722]: I0309 14:20:20.838789 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-5689854475-89q94" Mar 09 14:20:20 crc kubenswrapper[4722]: I0309 14:20:20.840880 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-798745ff96-864pz" event={"ID":"f67efad4-1b85-4f64-9e98-55eb2da89fb6","Type":"ContainerStarted","Data":"40e8a42e49bdaa139e2f793ecc9c4f756fce5f98b5a4f6cd78ab4bf4c69dc0ee"} Mar 09 14:20:20 crc kubenswrapper[4722]: I0309 14:20:20.841061 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-798745ff96-864pz" Mar 09 14:20:20 crc kubenswrapper[4722]: I0309 14:20:20.860119 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-5689854475-89q94" podStartSLOduration=2.006059664 podStartE2EDuration="7.860101737s" podCreationTimestamp="2026-03-09 14:20:13 +0000 UTC" firstStartedPulling="2026-03-09 14:20:14.003760575 +0000 UTC m=+1054.559329161" lastFinishedPulling="2026-03-09 14:20:19.857802658 +0000 UTC m=+1060.413371234" observedRunningTime="2026-03-09 14:20:20.854596315 +0000 UTC m=+1061.410164921" watchObservedRunningTime="2026-03-09 14:20:20.860101737 +0000 UTC m=+1061.415670313" Mar 09 14:20:20 crc kubenswrapper[4722]: I0309 14:20:20.887805 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-798745ff96-864pz" podStartSLOduration=2.342286056 podStartE2EDuration="7.887785054s" podCreationTimestamp="2026-03-09 14:20:13 +0000 UTC" firstStartedPulling="2026-03-09 14:20:14.330453353 +0000 UTC m=+1054.886021929" lastFinishedPulling="2026-03-09 14:20:19.875952351 +0000 UTC m=+1060.431520927" observedRunningTime="2026-03-09 14:20:20.879895425 +0000 UTC m=+1061.435464021" watchObservedRunningTime="2026-03-09 14:20:20.887785054 +0000 UTC m=+1061.443353630" Mar 09 14:20:20 crc kubenswrapper[4722]: I0309 14:20:20.926339 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-z644p"] Mar 09 14:20:20 crc kubenswrapper[4722]: I0309 14:20:20.926866 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-z644p" podUID="2880773d-b66d-4d9b-9185-24ff99f9d7c2" containerName="registry-server" containerID="cri-o://7cef14b9d6b3c3b69ea34a64a3c14e7dad05ff25ec8bda048748b920a10ad1c8" gracePeriod=2 Mar 09 14:20:21 crc kubenswrapper[4722]: I0309 14:20:21.369102 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-z644p" Mar 09 14:20:21 crc kubenswrapper[4722]: I0309 14:20:21.529707 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2880773d-b66d-4d9b-9185-24ff99f9d7c2-utilities\") pod \"2880773d-b66d-4d9b-9185-24ff99f9d7c2\" (UID: \"2880773d-b66d-4d9b-9185-24ff99f9d7c2\") " Mar 09 14:20:21 crc kubenswrapper[4722]: I0309 14:20:21.530178 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g29r4\" (UniqueName: \"kubernetes.io/projected/2880773d-b66d-4d9b-9185-24ff99f9d7c2-kube-api-access-g29r4\") pod \"2880773d-b66d-4d9b-9185-24ff99f9d7c2\" (UID: \"2880773d-b66d-4d9b-9185-24ff99f9d7c2\") " Mar 09 14:20:21 crc kubenswrapper[4722]: I0309 14:20:21.530471 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2880773d-b66d-4d9b-9185-24ff99f9d7c2-catalog-content\") pod \"2880773d-b66d-4d9b-9185-24ff99f9d7c2\" (UID: \"2880773d-b66d-4d9b-9185-24ff99f9d7c2\") " Mar 09 14:20:21 crc kubenswrapper[4722]: I0309 14:20:21.530754 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2880773d-b66d-4d9b-9185-24ff99f9d7c2-utilities" (OuterVolumeSpecName: "utilities") pod "2880773d-b66d-4d9b-9185-24ff99f9d7c2" (UID: "2880773d-b66d-4d9b-9185-24ff99f9d7c2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:20:21 crc kubenswrapper[4722]: I0309 14:20:21.531237 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2880773d-b66d-4d9b-9185-24ff99f9d7c2-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 14:20:21 crc kubenswrapper[4722]: I0309 14:20:21.553889 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2880773d-b66d-4d9b-9185-24ff99f9d7c2-kube-api-access-g29r4" (OuterVolumeSpecName: "kube-api-access-g29r4") pod "2880773d-b66d-4d9b-9185-24ff99f9d7c2" (UID: "2880773d-b66d-4d9b-9185-24ff99f9d7c2"). InnerVolumeSpecName "kube-api-access-g29r4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:20:21 crc kubenswrapper[4722]: I0309 14:20:21.583429 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2880773d-b66d-4d9b-9185-24ff99f9d7c2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2880773d-b66d-4d9b-9185-24ff99f9d7c2" (UID: "2880773d-b66d-4d9b-9185-24ff99f9d7c2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:20:21 crc kubenswrapper[4722]: I0309 14:20:21.633856 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2880773d-b66d-4d9b-9185-24ff99f9d7c2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 14:20:21 crc kubenswrapper[4722]: I0309 14:20:21.633893 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g29r4\" (UniqueName: \"kubernetes.io/projected/2880773d-b66d-4d9b-9185-24ff99f9d7c2-kube-api-access-g29r4\") on node \"crc\" DevicePath \"\"" Mar 09 14:20:21 crc kubenswrapper[4722]: I0309 14:20:21.851332 4722 generic.go:334] "Generic (PLEG): container finished" podID="2880773d-b66d-4d9b-9185-24ff99f9d7c2" containerID="7cef14b9d6b3c3b69ea34a64a3c14e7dad05ff25ec8bda048748b920a10ad1c8" exitCode=0 Mar 09 14:20:21 crc kubenswrapper[4722]: I0309 14:20:21.851415 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-z644p" Mar 09 14:20:21 crc kubenswrapper[4722]: I0309 14:20:21.851433 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z644p" event={"ID":"2880773d-b66d-4d9b-9185-24ff99f9d7c2","Type":"ContainerDied","Data":"7cef14b9d6b3c3b69ea34a64a3c14e7dad05ff25ec8bda048748b920a10ad1c8"} Mar 09 14:20:21 crc kubenswrapper[4722]: I0309 14:20:21.852920 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-z644p" event={"ID":"2880773d-b66d-4d9b-9185-24ff99f9d7c2","Type":"ContainerDied","Data":"4f68c753373d862956dfac7013fc33af2e89a73d33d12587bed5e00c2477a49d"} Mar 09 14:20:21 crc kubenswrapper[4722]: I0309 14:20:21.852948 4722 scope.go:117] "RemoveContainer" containerID="7cef14b9d6b3c3b69ea34a64a3c14e7dad05ff25ec8bda048748b920a10ad1c8" Mar 09 14:20:21 crc kubenswrapper[4722]: I0309 14:20:21.878677 4722 scope.go:117] "RemoveContainer" containerID="0d03caa6c4f6a04f0f5e9a0dd3ce52e217e583481ee4df446eb949ad00d64ccb" Mar 09 14:20:21 crc kubenswrapper[4722]: I0309 14:20:21.895999 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-z644p"] Mar 09 14:20:21 crc kubenswrapper[4722]: I0309 14:20:21.901053 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-z644p"] Mar 09 14:20:21 crc kubenswrapper[4722]: I0309 14:20:21.909412 4722 scope.go:117] "RemoveContainer" containerID="2daff81b12e26c66af8e1678202ade9fa2c8758d57425d0377e2efa872fcc4c1" Mar 09 14:20:21 crc kubenswrapper[4722]: I0309 14:20:21.927350 4722 scope.go:117] "RemoveContainer" containerID="7cef14b9d6b3c3b69ea34a64a3c14e7dad05ff25ec8bda048748b920a10ad1c8" Mar 09 14:20:21 crc kubenswrapper[4722]: E0309 14:20:21.927808 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7cef14b9d6b3c3b69ea34a64a3c14e7dad05ff25ec8bda048748b920a10ad1c8\": container with ID starting with 7cef14b9d6b3c3b69ea34a64a3c14e7dad05ff25ec8bda048748b920a10ad1c8 not found: ID does not exist" containerID="7cef14b9d6b3c3b69ea34a64a3c14e7dad05ff25ec8bda048748b920a10ad1c8" Mar 09 14:20:21 crc kubenswrapper[4722]: I0309 14:20:21.927838 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cef14b9d6b3c3b69ea34a64a3c14e7dad05ff25ec8bda048748b920a10ad1c8"} err="failed to get container status 
\"7cef14b9d6b3c3b69ea34a64a3c14e7dad05ff25ec8bda048748b920a10ad1c8\": rpc error: code = NotFound desc = could not find container \"7cef14b9d6b3c3b69ea34a64a3c14e7dad05ff25ec8bda048748b920a10ad1c8\": container with ID starting with 7cef14b9d6b3c3b69ea34a64a3c14e7dad05ff25ec8bda048748b920a10ad1c8 not found: ID does not exist" Mar 09 14:20:21 crc kubenswrapper[4722]: I0309 14:20:21.927859 4722 scope.go:117] "RemoveContainer" containerID="0d03caa6c4f6a04f0f5e9a0dd3ce52e217e583481ee4df446eb949ad00d64ccb" Mar 09 14:20:21 crc kubenswrapper[4722]: E0309 14:20:21.928363 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d03caa6c4f6a04f0f5e9a0dd3ce52e217e583481ee4df446eb949ad00d64ccb\": container with ID starting with 0d03caa6c4f6a04f0f5e9a0dd3ce52e217e583481ee4df446eb949ad00d64ccb not found: ID does not exist" containerID="0d03caa6c4f6a04f0f5e9a0dd3ce52e217e583481ee4df446eb949ad00d64ccb" Mar 09 14:20:21 crc kubenswrapper[4722]: I0309 14:20:21.928436 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d03caa6c4f6a04f0f5e9a0dd3ce52e217e583481ee4df446eb949ad00d64ccb"} err="failed to get container status \"0d03caa6c4f6a04f0f5e9a0dd3ce52e217e583481ee4df446eb949ad00d64ccb\": rpc error: code = NotFound desc = could not find container \"0d03caa6c4f6a04f0f5e9a0dd3ce52e217e583481ee4df446eb949ad00d64ccb\": container with ID starting with 0d03caa6c4f6a04f0f5e9a0dd3ce52e217e583481ee4df446eb949ad00d64ccb not found: ID does not exist" Mar 09 14:20:21 crc kubenswrapper[4722]: I0309 14:20:21.928516 4722 scope.go:117] "RemoveContainer" containerID="2daff81b12e26c66af8e1678202ade9fa2c8758d57425d0377e2efa872fcc4c1" Mar 09 14:20:21 crc kubenswrapper[4722]: E0309 14:20:21.928871 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2daff81b12e26c66af8e1678202ade9fa2c8758d57425d0377e2efa872fcc4c1\": container with ID starting with 2daff81b12e26c66af8e1678202ade9fa2c8758d57425d0377e2efa872fcc4c1 not found: ID does not exist" containerID="2daff81b12e26c66af8e1678202ade9fa2c8758d57425d0377e2efa872fcc4c1" Mar 09 14:20:21 crc kubenswrapper[4722]: I0309 14:20:21.928901 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2daff81b12e26c66af8e1678202ade9fa2c8758d57425d0377e2efa872fcc4c1"} err="failed to get container status \"2daff81b12e26c66af8e1678202ade9fa2c8758d57425d0377e2efa872fcc4c1\": rpc error: code = NotFound desc = could not find container \"2daff81b12e26c66af8e1678202ade9fa2c8758d57425d0377e2efa872fcc4c1\": container with ID starting with 2daff81b12e26c66af8e1678202ade9fa2c8758d57425d0377e2efa872fcc4c1 not found: ID does not exist" Mar 09 14:20:22 crc kubenswrapper[4722]: I0309 14:20:22.156937 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2880773d-b66d-4d9b-9185-24ff99f9d7c2" path="/var/lib/kubelet/pods/2880773d-b66d-4d9b-9185-24ff99f9d7c2/volumes" Mar 09 14:20:33 crc kubenswrapper[4722]: I0309 14:20:33.819822 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-798745ff96-864pz" Mar 09 14:20:53 crc kubenswrapper[4722]: I0309 14:20:53.424481 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-5689854475-89q94" Mar 09 14:20:54 crc kubenswrapper[4722]: I0309 14:20:54.214944 4722 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-2nlfl"] Mar 09 14:20:54 crc kubenswrapper[4722]: E0309 14:20:54.215409 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2880773d-b66d-4d9b-9185-24ff99f9d7c2" containerName="extract-content" Mar 09 14:20:54 crc kubenswrapper[4722]: I0309 14:20:54.215433 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="2880773d-b66d-4d9b-9185-24ff99f9d7c2" containerName="extract-content" Mar 09 14:20:54 crc kubenswrapper[4722]: E0309 14:20:54.215460 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2880773d-b66d-4d9b-9185-24ff99f9d7c2" containerName="registry-server" Mar 09 14:20:54 crc kubenswrapper[4722]: I0309 14:20:54.215471 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="2880773d-b66d-4d9b-9185-24ff99f9d7c2" containerName="registry-server" Mar 09 14:20:54 crc kubenswrapper[4722]: E0309 14:20:54.215522 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2880773d-b66d-4d9b-9185-24ff99f9d7c2" containerName="extract-utilities" Mar 09 14:20:54 crc kubenswrapper[4722]: I0309 14:20:54.215533 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="2880773d-b66d-4d9b-9185-24ff99f9d7c2" containerName="extract-utilities" Mar 09 14:20:54 crc kubenswrapper[4722]: I0309 14:20:54.215812 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="2880773d-b66d-4d9b-9185-24ff99f9d7c2" containerName="registry-server" Mar 09 14:20:54 crc kubenswrapper[4722]: I0309 14:20:54.216607 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-2nlfl" Mar 09 14:20:54 crc kubenswrapper[4722]: I0309 14:20:54.218996 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-bvllc" Mar 09 14:20:54 crc kubenswrapper[4722]: I0309 14:20:54.219307 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 09 14:20:54 crc kubenswrapper[4722]: I0309 14:20:54.226060 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-6vn96"] Mar 09 14:20:54 crc kubenswrapper[4722]: I0309 14:20:54.230578 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-6vn96" Mar 09 14:20:54 crc kubenswrapper[4722]: I0309 14:20:54.232619 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 09 14:20:54 crc kubenswrapper[4722]: I0309 14:20:54.232977 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 09 14:20:54 crc kubenswrapper[4722]: I0309 14:20:54.241250 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-2nlfl"] Mar 09 14:20:54 crc kubenswrapper[4722]: I0309 14:20:54.312566 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-kwml8"] Mar 09 14:20:54 crc kubenswrapper[4722]: I0309 14:20:54.314193 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-kwml8" Mar 09 14:20:54 crc kubenswrapper[4722]: I0309 14:20:54.319045 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 09 14:20:54 crc kubenswrapper[4722]: I0309 14:20:54.319244 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-98kpc" Mar 09 14:20:54 crc kubenswrapper[4722]: I0309 14:20:54.319483 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 09 14:20:54 crc kubenswrapper[4722]: I0309 14:20:54.319484 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 09 14:20:54 crc kubenswrapper[4722]: I0309 14:20:54.329116 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-86ddb6bd46-6w5ww"] Mar 09 14:20:54 crc kubenswrapper[4722]: I0309 14:20:54.330368 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-86ddb6bd46-6w5ww" Mar 09 14:20:54 crc kubenswrapper[4722]: I0309 14:20:54.332436 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 09 14:20:54 crc kubenswrapper[4722]: I0309 14:20:54.342493 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-6w5ww"] Mar 09 14:20:54 crc kubenswrapper[4722]: I0309 14:20:54.369768 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46zsz\" (UniqueName: \"kubernetes.io/projected/8557439a-0367-4823-af83-28955a17cc08-kube-api-access-46zsz\") pod \"frr-k8s-webhook-server-7f989f654f-2nlfl\" (UID: \"8557439a-0367-4823-af83-28955a17cc08\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-2nlfl" Mar 09 14:20:54 crc kubenswrapper[4722]: I0309 14:20:54.370082 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82vf5\" (UniqueName: \"kubernetes.io/projected/29ed2858-4fd0-4817-8ed3-b3515ac035d7-kube-api-access-82vf5\") pod \"frr-k8s-6vn96\" (UID: \"29ed2858-4fd0-4817-8ed3-b3515ac035d7\") " pod="metallb-system/frr-k8s-6vn96" Mar 09 14:20:54 crc kubenswrapper[4722]: I0309 14:20:54.370302 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/29ed2858-4fd0-4817-8ed3-b3515ac035d7-metrics\") pod \"frr-k8s-6vn96\" (UID: \"29ed2858-4fd0-4817-8ed3-b3515ac035d7\") " pod="metallb-system/frr-k8s-6vn96" Mar 09 14:20:54 crc kubenswrapper[4722]: I0309 14:20:54.370436 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/29ed2858-4fd0-4817-8ed3-b3515ac035d7-frr-conf\") pod \"frr-k8s-6vn96\" (UID: \"29ed2858-4fd0-4817-8ed3-b3515ac035d7\") " pod="metallb-system/frr-k8s-6vn96" Mar 09 14:20:54 crc kubenswrapper[4722]: I0309 14:20:54.370538 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8557439a-0367-4823-af83-28955a17cc08-cert\") pod \"frr-k8s-webhook-server-7f989f654f-2nlfl\" (UID: \"8557439a-0367-4823-af83-28955a17cc08\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-2nlfl" Mar 09 14:20:54 crc kubenswrapper[4722]: I0309 14:20:54.370647 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/29ed2858-4fd0-4817-8ed3-b3515ac035d7-frr-sockets\") pod \"frr-k8s-6vn96\" (UID: \"29ed2858-4fd0-4817-8ed3-b3515ac035d7\") " pod="metallb-system/frr-k8s-6vn96" Mar 09 14:20:54 crc kubenswrapper[4722]: I0309 14:20:54.370769 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/29ed2858-4fd0-4817-8ed3-b3515ac035d7-frr-startup\") pod \"frr-k8s-6vn96\" (UID: \"29ed2858-4fd0-4817-8ed3-b3515ac035d7\") " pod="metallb-system/frr-k8s-6vn96" Mar 09 14:20:54 crc kubenswrapper[4722]: I0309 14:20:54.370863 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/29ed2858-4fd0-4817-8ed3-b3515ac035d7-reloader\") pod \"frr-k8s-6vn96\" (UID: \"29ed2858-4fd0-4817-8ed3-b3515ac035d7\") " pod="metallb-system/frr-k8s-6vn96" Mar 09 14:20:54 crc kubenswrapper[4722]: I0309 14:20:54.370955 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/29ed2858-4fd0-4817-8ed3-b3515ac035d7-metrics-certs\") pod \"frr-k8s-6vn96\" (UID: \"29ed2858-4fd0-4817-8ed3-b3515ac035d7\") " pod="metallb-system/frr-k8s-6vn96" Mar 09 14:20:54 crc kubenswrapper[4722]: I0309 14:20:54.472313 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/29ed2858-4fd0-4817-8ed3-b3515ac035d7-frr-startup\") pod \"frr-k8s-6vn96\" (UID: \"29ed2858-4fd0-4817-8ed3-b3515ac035d7\") " pod="metallb-system/frr-k8s-6vn96" Mar 09 14:20:54 crc kubenswrapper[4722]: I0309 14:20:54.472358 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/29ed2858-4fd0-4817-8ed3-b3515ac035d7-reloader\") pod \"frr-k8s-6vn96\" (UID: \"29ed2858-4fd0-4817-8ed3-b3515ac035d7\") " pod="metallb-system/frr-k8s-6vn96" Mar 09 14:20:54 crc kubenswrapper[4722]: I0309 14:20:54.472376 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/29ed2858-4fd0-4817-8ed3-b3515ac035d7-metrics-certs\") pod \"frr-k8s-6vn96\" (UID: \"29ed2858-4fd0-4817-8ed3-b3515ac035d7\") " pod="metallb-system/frr-k8s-6vn96" Mar 09 14:20:54 crc kubenswrapper[4722]: I0309 14:20:54.472403 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46zsz\" (UniqueName: \"kubernetes.io/projected/8557439a-0367-4823-af83-28955a17cc08-kube-api-access-46zsz\") pod \"frr-k8s-webhook-server-7f989f654f-2nlfl\" (UID: \"8557439a-0367-4823-af83-28955a17cc08\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-2nlfl" Mar 09 14:20:54 crc kubenswrapper[4722]: I0309 14:20:54.472451 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/93b8f0be-bf52-4559-8cf6-338026cb6610-memberlist\") pod \"speaker-kwml8\" (UID: \"93b8f0be-bf52-4559-8cf6-338026cb6610\") " pod="metallb-system/speaker-kwml8" Mar 09 14:20:54 crc kubenswrapper[4722]: I0309 14:20:54.472468 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: 
\"kubernetes.io/configmap/93b8f0be-bf52-4559-8cf6-338026cb6610-metallb-excludel2\") pod \"speaker-kwml8\" (UID: \"93b8f0be-bf52-4559-8cf6-338026cb6610\") " pod="metallb-system/speaker-kwml8" Mar 09 14:20:54 crc kubenswrapper[4722]: I0309 14:20:54.472492 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82vf5\" (UniqueName: \"kubernetes.io/projected/29ed2858-4fd0-4817-8ed3-b3515ac035d7-kube-api-access-82vf5\") pod \"frr-k8s-6vn96\" (UID: \"29ed2858-4fd0-4817-8ed3-b3515ac035d7\") " pod="metallb-system/frr-k8s-6vn96" Mar 09 14:20:54 crc kubenswrapper[4722]: I0309 14:20:54.472536 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0b264ba4-1398-4d3e-bb2b-83cd34a0ccf6-cert\") pod \"controller-86ddb6bd46-6w5ww\" (UID: \"0b264ba4-1398-4d3e-bb2b-83cd34a0ccf6\") " pod="metallb-system/controller-86ddb6bd46-6w5ww" Mar 09 14:20:54 crc kubenswrapper[4722]: I0309 14:20:54.472565 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/29ed2858-4fd0-4817-8ed3-b3515ac035d7-metrics\") pod \"frr-k8s-6vn96\" (UID: \"29ed2858-4fd0-4817-8ed3-b3515ac035d7\") " pod="metallb-system/frr-k8s-6vn96" Mar 09 14:20:54 crc kubenswrapper[4722]: I0309 14:20:54.472582 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93b8f0be-bf52-4559-8cf6-338026cb6610-metrics-certs\") pod \"speaker-kwml8\" (UID: \"93b8f0be-bf52-4559-8cf6-338026cb6610\") " pod="metallb-system/speaker-kwml8" Mar 09 14:20:54 crc kubenswrapper[4722]: I0309 14:20:54.472601 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2xlg\" (UniqueName: \"kubernetes.io/projected/0b264ba4-1398-4d3e-bb2b-83cd34a0ccf6-kube-api-access-z2xlg\") pod \"controller-86ddb6bd46-6w5ww\" (UID: \"0b264ba4-1398-4d3e-bb2b-83cd34a0ccf6\") " pod="metallb-system/controller-86ddb6bd46-6w5ww" Mar 09 14:20:54 crc kubenswrapper[4722]: I0309 14:20:54.472620 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5q54\" (UniqueName: \"kubernetes.io/projected/93b8f0be-bf52-4559-8cf6-338026cb6610-kube-api-access-q5q54\") pod \"speaker-kwml8\" (UID: \"93b8f0be-bf52-4559-8cf6-338026cb6610\") " pod="metallb-system/speaker-kwml8" Mar 09 14:20:54 crc kubenswrapper[4722]: I0309 14:20:54.472646 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/29ed2858-4fd0-4817-8ed3-b3515ac035d7-frr-conf\") pod \"frr-k8s-6vn96\" (UID: \"29ed2858-4fd0-4817-8ed3-b3515ac035d7\") " pod="metallb-system/frr-k8s-6vn96" Mar 09 14:20:54 crc kubenswrapper[4722]: I0309 14:20:54.472670 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8557439a-0367-4823-af83-28955a17cc08-cert\") pod \"frr-k8s-webhook-server-7f989f654f-2nlfl\" (UID: \"8557439a-0367-4823-af83-28955a17cc08\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-2nlfl" Mar 09 14:20:54 crc kubenswrapper[4722]: I0309 14:20:54.472707 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/29ed2858-4fd0-4817-8ed3-b3515ac035d7-frr-sockets\") pod \"frr-k8s-6vn96\" (UID: 
\"29ed2858-4fd0-4817-8ed3-b3515ac035d7\") " pod="metallb-system/frr-k8s-6vn96" Mar 09 14:20:54 crc kubenswrapper[4722]: I0309 14:20:54.472731 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0b264ba4-1398-4d3e-bb2b-83cd34a0ccf6-metrics-certs\") pod \"controller-86ddb6bd46-6w5ww\" (UID: \"0b264ba4-1398-4d3e-bb2b-83cd34a0ccf6\") " pod="metallb-system/controller-86ddb6bd46-6w5ww" Mar 09 14:20:54 crc kubenswrapper[4722]: I0309 14:20:54.473078 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/29ed2858-4fd0-4817-8ed3-b3515ac035d7-reloader\") pod \"frr-k8s-6vn96\" (UID: \"29ed2858-4fd0-4817-8ed3-b3515ac035d7\") " pod="metallb-system/frr-k8s-6vn96" Mar 09 14:20:54 crc kubenswrapper[4722]: E0309 14:20:54.473155 4722 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Mar 09 14:20:54 crc kubenswrapper[4722]: E0309 14:20:54.473194 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/29ed2858-4fd0-4817-8ed3-b3515ac035d7-metrics-certs podName:29ed2858-4fd0-4817-8ed3-b3515ac035d7 nodeName:}" failed. No retries permitted until 2026-03-09 14:20:54.973180057 +0000 UTC m=+1095.528748633 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/29ed2858-4fd0-4817-8ed3-b3515ac035d7-metrics-certs") pod "frr-k8s-6vn96" (UID: "29ed2858-4fd0-4817-8ed3-b3515ac035d7") : secret "frr-k8s-certs-secret" not found Mar 09 14:20:54 crc kubenswrapper[4722]: I0309 14:20:54.473597 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/29ed2858-4fd0-4817-8ed3-b3515ac035d7-frr-startup\") pod \"frr-k8s-6vn96\" (UID: \"29ed2858-4fd0-4817-8ed3-b3515ac035d7\") " pod="metallb-system/frr-k8s-6vn96" Mar 09 14:20:54 crc kubenswrapper[4722]: I0309 14:20:54.473824 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/29ed2858-4fd0-4817-8ed3-b3515ac035d7-frr-conf\") pod \"frr-k8s-6vn96\" (UID: \"29ed2858-4fd0-4817-8ed3-b3515ac035d7\") " pod="metallb-system/frr-k8s-6vn96" Mar 09 14:20:54 crc kubenswrapper[4722]: I0309 14:20:54.473908 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/29ed2858-4fd0-4817-8ed3-b3515ac035d7-metrics\") pod \"frr-k8s-6vn96\" (UID: \"29ed2858-4fd0-4817-8ed3-b3515ac035d7\") " pod="metallb-system/frr-k8s-6vn96" Mar 09 14:20:54 crc kubenswrapper[4722]: I0309 14:20:54.474280 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/29ed2858-4fd0-4817-8ed3-b3515ac035d7-frr-sockets\") pod \"frr-k8s-6vn96\" (UID: \"29ed2858-4fd0-4817-8ed3-b3515ac035d7\") " pod="metallb-system/frr-k8s-6vn96" Mar 09 14:20:54 crc kubenswrapper[4722]: I0309 14:20:54.480153 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8557439a-0367-4823-af83-28955a17cc08-cert\") pod \"frr-k8s-webhook-server-7f989f654f-2nlfl\" (UID: \"8557439a-0367-4823-af83-28955a17cc08\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-2nlfl" Mar 09 14:20:54 crc kubenswrapper[4722]: I0309 14:20:54.491071 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-82vf5\" (UniqueName: \"kubernetes.io/projected/29ed2858-4fd0-4817-8ed3-b3515ac035d7-kube-api-access-82vf5\") pod \"frr-k8s-6vn96\" (UID: \"29ed2858-4fd0-4817-8ed3-b3515ac035d7\") " pod="metallb-system/frr-k8s-6vn96" Mar 09 14:20:54 crc kubenswrapper[4722]: I0309 14:20:54.496020 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46zsz\" (UniqueName: \"kubernetes.io/projected/8557439a-0367-4823-af83-28955a17cc08-kube-api-access-46zsz\") pod \"frr-k8s-webhook-server-7f989f654f-2nlfl\" (UID: \"8557439a-0367-4823-af83-28955a17cc08\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-2nlfl" Mar 09 14:20:54 crc kubenswrapper[4722]: I0309 14:20:54.548961 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-2nlfl" Mar 09 14:20:54 crc kubenswrapper[4722]: I0309 14:20:54.574306 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/93b8f0be-bf52-4559-8cf6-338026cb6610-memberlist\") pod \"speaker-kwml8\" (UID: \"93b8f0be-bf52-4559-8cf6-338026cb6610\") " pod="metallb-system/speaker-kwml8" Mar 09 14:20:54 crc kubenswrapper[4722]: I0309 14:20:54.574666 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/93b8f0be-bf52-4559-8cf6-338026cb6610-metallb-excludel2\") pod \"speaker-kwml8\" (UID: \"93b8f0be-bf52-4559-8cf6-338026cb6610\") " pod="metallb-system/speaker-kwml8" Mar 09 14:20:54 crc kubenswrapper[4722]: I0309 14:20:54.574797 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0b264ba4-1398-4d3e-bb2b-83cd34a0ccf6-cert\") pod \"controller-86ddb6bd46-6w5ww\" (UID: \"0b264ba4-1398-4d3e-bb2b-83cd34a0ccf6\") " pod="metallb-system/controller-86ddb6bd46-6w5ww" Mar 09 14:20:54 crc kubenswrapper[4722]: I0309 14:20:54.574892 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93b8f0be-bf52-4559-8cf6-338026cb6610-metrics-certs\") pod \"speaker-kwml8\" (UID: \"93b8f0be-bf52-4559-8cf6-338026cb6610\") " pod="metallb-system/speaker-kwml8" Mar 09 14:20:54 crc kubenswrapper[4722]: I0309 14:20:54.574981 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2xlg\" (UniqueName: \"kubernetes.io/projected/0b264ba4-1398-4d3e-bb2b-83cd34a0ccf6-kube-api-access-z2xlg\") pod \"controller-86ddb6bd46-6w5ww\" (UID: \"0b264ba4-1398-4d3e-bb2b-83cd34a0ccf6\") " pod="metallb-system/controller-86ddb6bd46-6w5ww" Mar 09 14:20:54 crc kubenswrapper[4722]: E0309 14:20:54.574524 4722 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 09 14:20:54 crc kubenswrapper[4722]: E0309 14:20:54.575153 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93b8f0be-bf52-4559-8cf6-338026cb6610-memberlist podName:93b8f0be-bf52-4559-8cf6-338026cb6610 nodeName:}" failed. No retries permitted until 2026-03-09 14:20:55.075128328 +0000 UTC m=+1095.630696964 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/93b8f0be-bf52-4559-8cf6-338026cb6610-memberlist") pod "speaker-kwml8" (UID: "93b8f0be-bf52-4559-8cf6-338026cb6610") : secret "metallb-memberlist" not found Mar 09 14:20:54 crc kubenswrapper[4722]: I0309 14:20:54.575062 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5q54\" (UniqueName: \"kubernetes.io/projected/93b8f0be-bf52-4559-8cf6-338026cb6610-kube-api-access-q5q54\") pod \"speaker-kwml8\" (UID: \"93b8f0be-bf52-4559-8cf6-338026cb6610\") " pod="metallb-system/speaker-kwml8" Mar 09 14:20:54 crc kubenswrapper[4722]: I0309 14:20:54.575353 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0b264ba4-1398-4d3e-bb2b-83cd34a0ccf6-metrics-certs\") pod \"controller-86ddb6bd46-6w5ww\" (UID: \"0b264ba4-1398-4d3e-bb2b-83cd34a0ccf6\") " pod="metallb-system/controller-86ddb6bd46-6w5ww" Mar 09 14:20:54 crc kubenswrapper[4722]: I0309 14:20:54.575431 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/93b8f0be-bf52-4559-8cf6-338026cb6610-metallb-excludel2\") pod \"speaker-kwml8\" (UID: \"93b8f0be-bf52-4559-8cf6-338026cb6610\") " pod="metallb-system/speaker-kwml8" Mar 09 14:20:54 crc kubenswrapper[4722]: I0309 14:20:54.578988 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 09 14:20:54 crc kubenswrapper[4722]: I0309 14:20:54.579183 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93b8f0be-bf52-4559-8cf6-338026cb6610-metrics-certs\") pod \"speaker-kwml8\" (UID: \"93b8f0be-bf52-4559-8cf6-338026cb6610\") " pod="metallb-system/speaker-kwml8" Mar 09 14:20:54 crc kubenswrapper[4722]: I0309 14:20:54.579419 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0b264ba4-1398-4d3e-bb2b-83cd34a0ccf6-metrics-certs\") pod \"controller-86ddb6bd46-6w5ww\" (UID: \"0b264ba4-1398-4d3e-bb2b-83cd34a0ccf6\") " pod="metallb-system/controller-86ddb6bd46-6w5ww" Mar 09 14:20:54 crc kubenswrapper[4722]: I0309 14:20:54.593560 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0b264ba4-1398-4d3e-bb2b-83cd34a0ccf6-cert\") pod \"controller-86ddb6bd46-6w5ww\" (UID: \"0b264ba4-1398-4d3e-bb2b-83cd34a0ccf6\") " pod="metallb-system/controller-86ddb6bd46-6w5ww" Mar 09 14:20:54 crc kubenswrapper[4722]: I0309 14:20:54.597049 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5q54\" (UniqueName: \"kubernetes.io/projected/93b8f0be-bf52-4559-8cf6-338026cb6610-kube-api-access-q5q54\") pod \"speaker-kwml8\" (UID: \"93b8f0be-bf52-4559-8cf6-338026cb6610\") " pod="metallb-system/speaker-kwml8" Mar 09 14:20:54 crc kubenswrapper[4722]: I0309 14:20:54.600258 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2xlg\" (UniqueName: \"kubernetes.io/projected/0b264ba4-1398-4d3e-bb2b-83cd34a0ccf6-kube-api-access-z2xlg\") pod \"controller-86ddb6bd46-6w5ww\" (UID: \"0b264ba4-1398-4d3e-bb2b-83cd34a0ccf6\") " pod="metallb-system/controller-86ddb6bd46-6w5ww" Mar 09 14:20:54 crc kubenswrapper[4722]: I0309 14:20:54.642838 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-86ddb6bd46-6w5ww" Mar 09 14:20:54 crc kubenswrapper[4722]: I0309 14:20:54.985155 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-2nlfl"] Mar 09 14:20:54 crc kubenswrapper[4722]: I0309 14:20:54.986332 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/29ed2858-4fd0-4817-8ed3-b3515ac035d7-metrics-certs\") pod \"frr-k8s-6vn96\" (UID: \"29ed2858-4fd0-4817-8ed3-b3515ac035d7\") " pod="metallb-system/frr-k8s-6vn96" Mar 09 14:20:54 crc kubenswrapper[4722]: I0309 14:20:54.992555 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/29ed2858-4fd0-4817-8ed3-b3515ac035d7-metrics-certs\") pod \"frr-k8s-6vn96\" (UID: \"29ed2858-4fd0-4817-8ed3-b3515ac035d7\") " pod="metallb-system/frr-k8s-6vn96" Mar 09 14:20:55 crc kubenswrapper[4722]: I0309 14:20:55.085767 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-6w5ww"] Mar 09 14:20:55 crc kubenswrapper[4722]: I0309 14:20:55.088393 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/93b8f0be-bf52-4559-8cf6-338026cb6610-memberlist\") pod \"speaker-kwml8\" (UID: \"93b8f0be-bf52-4559-8cf6-338026cb6610\") " pod="metallb-system/speaker-kwml8" Mar 09 14:20:55 crc kubenswrapper[4722]: E0309 14:20:55.088540 4722 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 09 14:20:55 crc kubenswrapper[4722]: E0309 14:20:55.088587 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93b8f0be-bf52-4559-8cf6-338026cb6610-memberlist podName:93b8f0be-bf52-4559-8cf6-338026cb6610 nodeName:}" failed. No retries permitted until 2026-03-09 14:20:56.088572072 +0000 UTC m=+1096.644140648 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/93b8f0be-bf52-4559-8cf6-338026cb6610-memberlist") pod "speaker-kwml8" (UID: "93b8f0be-bf52-4559-8cf6-338026cb6610") : secret "metallb-memberlist" not found Mar 09 14:20:55 crc kubenswrapper[4722]: I0309 14:20:55.136630 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-2nlfl" event={"ID":"8557439a-0367-4823-af83-28955a17cc08","Type":"ContainerStarted","Data":"6a568ebea61b10da229d3683a0d658980149d7fd063f73d09bfa0d3a8502502c"} Mar 09 14:20:55 crc kubenswrapper[4722]: I0309 14:20:55.137486 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-6w5ww" event={"ID":"0b264ba4-1398-4d3e-bb2b-83cd34a0ccf6","Type":"ContainerStarted","Data":"d9d8f2cc359fa2d93bcc9f434de35986d1b90aedc154e58d3efa1028f25c6ac9"} Mar 09 14:20:55 crc kubenswrapper[4722]: I0309 14:20:55.160504 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-6vn96" Mar 09 14:20:56 crc kubenswrapper[4722]: I0309 14:20:56.103977 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/93b8f0be-bf52-4559-8cf6-338026cb6610-memberlist\") pod \"speaker-kwml8\" (UID: \"93b8f0be-bf52-4559-8cf6-338026cb6610\") " pod="metallb-system/speaker-kwml8" Mar 09 14:20:56 crc kubenswrapper[4722]: I0309 14:20:56.121066 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/93b8f0be-bf52-4559-8cf6-338026cb6610-memberlist\") pod \"speaker-kwml8\" (UID: \"93b8f0be-bf52-4559-8cf6-338026cb6610\") " pod="metallb-system/speaker-kwml8" Mar 09 14:20:56 crc kubenswrapper[4722]: I0309 14:20:56.130140 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-kwml8" Mar 09 14:20:56 crc kubenswrapper[4722]: I0309 14:20:56.147270 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-6w5ww" event={"ID":"0b264ba4-1398-4d3e-bb2b-83cd34a0ccf6","Type":"ContainerStarted","Data":"76f242e79cbd19e75984d46d82a5f4b2c20ae2e2c2b0342bef1dc50d42103976"} Mar 09 14:20:56 crc kubenswrapper[4722]: I0309 14:20:56.147313 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-6w5ww" event={"ID":"0b264ba4-1398-4d3e-bb2b-83cd34a0ccf6","Type":"ContainerStarted","Data":"a1a02869fd9ddd439c962393949fee7151323c975474b74ba0c3e6af42e7e3d9"} Mar 09 14:20:56 crc kubenswrapper[4722]: I0309 14:20:56.147377 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-86ddb6bd46-6w5ww" Mar 09 14:20:56 crc kubenswrapper[4722]: I0309 14:20:56.161589 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6vn96" event={"ID":"29ed2858-4fd0-4817-8ed3-b3515ac035d7","Type":"ContainerStarted","Data":"3e816ae665e147d2e7e67dd4126826f84f2369d501d515375c9be9c62b49db9b"} Mar 09 14:20:56 crc kubenswrapper[4722]: I0309 14:20:56.166947 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-86ddb6bd46-6w5ww" podStartSLOduration=2.166929985 podStartE2EDuration="2.166929985s" podCreationTimestamp="2026-03-09 14:20:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:20:56.162913944 +0000 UTC m=+1096.718482530" watchObservedRunningTime="2026-03-09 14:20:56.166929985 +0000 UTC m=+1096.722498561" Mar 09 14:20:57 crc kubenswrapper[4722]: I0309 14:20:57.158754 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-kwml8" event={"ID":"93b8f0be-bf52-4559-8cf6-338026cb6610","Type":"ContainerStarted","Data":"38fd029bf052afcdf1869a543aa5295e94531651da65feb3be86f77d477879b3"} Mar 09 14:20:57 crc kubenswrapper[4722]: I0309 14:20:57.159136 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-kwml8" event={"ID":"93b8f0be-bf52-4559-8cf6-338026cb6610","Type":"ContainerStarted","Data":"f7a226d5093d75ff82b179a47e747e51f8211a60175fbcdd137f0c5d5378a95f"} Mar 09 14:20:57 crc kubenswrapper[4722]: I0309 14:20:57.159153 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-kwml8" event={"ID":"93b8f0be-bf52-4559-8cf6-338026cb6610","Type":"ContainerStarted","Data":"2d6d174ad15840ed6d260c9e2e0b99f13a5eb15dca57a57fa39e7681a9ae0cea"} Mar 09 
14:20:57 crc kubenswrapper[4722]: I0309 14:20:57.159335 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-kwml8" Mar 09 14:20:57 crc kubenswrapper[4722]: I0309 14:20:57.181862 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-kwml8" podStartSLOduration=3.181845222 podStartE2EDuration="3.181845222s" podCreationTimestamp="2026-03-09 14:20:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:20:57.177987036 +0000 UTC m=+1097.733555612" watchObservedRunningTime="2026-03-09 14:20:57.181845222 +0000 UTC m=+1097.737413798" Mar 09 14:21:03 crc kubenswrapper[4722]: I0309 14:21:03.213383 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-2nlfl" event={"ID":"8557439a-0367-4823-af83-28955a17cc08","Type":"ContainerStarted","Data":"e319108c72ee500be331451610739dd7591d749c08a2d18d2a197ae9514e65d4"} Mar 09 14:21:03 crc kubenswrapper[4722]: I0309 14:21:03.213989 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-2nlfl" Mar 09 14:21:03 crc kubenswrapper[4722]: I0309 14:21:03.218425 4722 generic.go:334] "Generic (PLEG): container finished" podID="29ed2858-4fd0-4817-8ed3-b3515ac035d7" containerID="7ed4cc7b32b4e05f13f815765a3e7ea6c75b2080331eeb8f5a6c0d43c9e42b89" exitCode=0 Mar 09 14:21:03 crc kubenswrapper[4722]: I0309 14:21:03.218481 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6vn96" event={"ID":"29ed2858-4fd0-4817-8ed3-b3515ac035d7","Type":"ContainerDied","Data":"7ed4cc7b32b4e05f13f815765a3e7ea6c75b2080331eeb8f5a6c0d43c9e42b89"} Mar 09 14:21:03 crc kubenswrapper[4722]: I0309 14:21:03.236974 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-2nlfl" podStartSLOduration=1.847012104 podStartE2EDuration="9.236955138s" podCreationTimestamp="2026-03-09 14:20:54 +0000 UTC" firstStartedPulling="2026-03-09 14:20:54.998073969 +0000 UTC m=+1095.553642545" lastFinishedPulling="2026-03-09 14:21:02.388017003 +0000 UTC m=+1102.943585579" observedRunningTime="2026-03-09 14:21:03.233691368 +0000 UTC m=+1103.789259944" watchObservedRunningTime="2026-03-09 14:21:03.236955138 +0000 UTC m=+1103.792523714" Mar 09 14:21:04 crc kubenswrapper[4722]: I0309 14:21:04.229106 4722 generic.go:334] "Generic (PLEG): container finished" podID="29ed2858-4fd0-4817-8ed3-b3515ac035d7" containerID="82a207930f079892c9b5713f21d712c362b40c0b232a546a78ce4a8b86c6a3bd" exitCode=0 Mar 09 14:21:04 crc kubenswrapper[4722]: I0309 14:21:04.229186 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6vn96" event={"ID":"29ed2858-4fd0-4817-8ed3-b3515ac035d7","Type":"ContainerDied","Data":"82a207930f079892c9b5713f21d712c362b40c0b232a546a78ce4a8b86c6a3bd"} Mar 09 14:21:05 crc kubenswrapper[4722]: I0309 14:21:05.240932 4722 generic.go:334] "Generic (PLEG): container finished" podID="29ed2858-4fd0-4817-8ed3-b3515ac035d7" containerID="81e04d2aac1e737ec047369d43f3578f67a686e64a39c389f0dd73818b8dc344" exitCode=0 Mar 09 14:21:05 crc kubenswrapper[4722]: I0309 14:21:05.240977 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6vn96" 
event={"ID":"29ed2858-4fd0-4817-8ed3-b3515ac035d7","Type":"ContainerDied","Data":"81e04d2aac1e737ec047369d43f3578f67a686e64a39c389f0dd73818b8dc344"} Mar 09 14:21:06 crc kubenswrapper[4722]: I0309 14:21:06.134758 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-kwml8" Mar 09 14:21:06 crc kubenswrapper[4722]: I0309 14:21:06.268289 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6vn96" event={"ID":"29ed2858-4fd0-4817-8ed3-b3515ac035d7","Type":"ContainerStarted","Data":"06dcd515f92c094b3d9539d7158b8cbfa88cadc3994d6796639eebf14ddb2d68"} Mar 09 14:21:06 crc kubenswrapper[4722]: I0309 14:21:06.269566 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6vn96" event={"ID":"29ed2858-4fd0-4817-8ed3-b3515ac035d7","Type":"ContainerStarted","Data":"40c8bb2cc4d08c1bcfbefb3e5c754878ea6dbb95c5ed66cca764ddd6b30ebede"} Mar 09 14:21:06 crc kubenswrapper[4722]: I0309 14:21:06.269703 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6vn96" event={"ID":"29ed2858-4fd0-4817-8ed3-b3515ac035d7","Type":"ContainerStarted","Data":"25ac224479b35514fa9566dd2da263dce903042897ff1bff71d03f769bee9df3"} Mar 09 14:21:06 crc kubenswrapper[4722]: I0309 14:21:06.269835 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6vn96" event={"ID":"29ed2858-4fd0-4817-8ed3-b3515ac035d7","Type":"ContainerStarted","Data":"228312f95f7cd63c660e199315bd7b89094078a743e9a9f8c8185bc6054ecca4"} Mar 09 14:21:06 crc kubenswrapper[4722]: I0309 14:21:06.270036 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6vn96" event={"ID":"29ed2858-4fd0-4817-8ed3-b3515ac035d7","Type":"ContainerStarted","Data":"5d32ee9d09c3f1a6f773c148a99e203bcff909b156f0b9409b87196b806bbf1a"} Mar 09 14:21:07 crc kubenswrapper[4722]: I0309 14:21:07.285113 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6vn96" event={"ID":"29ed2858-4fd0-4817-8ed3-b3515ac035d7","Type":"ContainerStarted","Data":"1b08696b0ec9ed935d9e28ba7170ae2f6fa11c90048a8177434b2f80c76cffa1"} Mar 09 14:21:07 crc kubenswrapper[4722]: I0309 14:21:07.285454 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-6vn96" Mar 09 14:21:07 crc kubenswrapper[4722]: I0309 14:21:07.314627 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-6vn96" podStartSLOduration=6.16998166 podStartE2EDuration="13.314604477s" podCreationTimestamp="2026-03-09 14:20:54 +0000 UTC" firstStartedPulling="2026-03-09 14:20:55.262322729 +0000 UTC m=+1095.817891315" lastFinishedPulling="2026-03-09 14:21:02.406945556 +0000 UTC m=+1102.962514132" observedRunningTime="2026-03-09 14:21:07.309045803 +0000 UTC m=+1107.864614389" watchObservedRunningTime="2026-03-09 14:21:07.314604477 +0000 UTC m=+1107.870173053" Mar 09 14:21:09 crc kubenswrapper[4722]: I0309 14:21:09.071698 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-dw8w2"] Mar 09 14:21:09 crc kubenswrapper[4722]: I0309 14:21:09.073150 4722 util.go:30] "No sandbox for pod can be found. 
Mar 09 14:21:09 crc kubenswrapper[4722]: I0309 14:21:09.074885 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-fkv88"
Mar 09 14:21:09 crc kubenswrapper[4722]: I0309 14:21:09.075239 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt"
Mar 09 14:21:09 crc kubenswrapper[4722]: I0309 14:21:09.075378 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt"
Mar 09 14:21:09 crc kubenswrapper[4722]: I0309 14:21:09.085949 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-dw8w2"]
Mar 09 14:21:09 crc kubenswrapper[4722]: I0309 14:21:09.237401 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlflw\" (UniqueName: \"kubernetes.io/projected/b0a83909-d285-4405-ae04-9356b2a29db9-kube-api-access-dlflw\") pod \"openstack-operator-index-dw8w2\" (UID: \"b0a83909-d285-4405-ae04-9356b2a29db9\") " pod="openstack-operators/openstack-operator-index-dw8w2"
Mar 09 14:21:09 crc kubenswrapper[4722]: I0309 14:21:09.339019 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlflw\" (UniqueName: \"kubernetes.io/projected/b0a83909-d285-4405-ae04-9356b2a29db9-kube-api-access-dlflw\") pod \"openstack-operator-index-dw8w2\" (UID: \"b0a83909-d285-4405-ae04-9356b2a29db9\") " pod="openstack-operators/openstack-operator-index-dw8w2"
Mar 09 14:21:09 crc kubenswrapper[4722]: I0309 14:21:09.355677 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlflw\" (UniqueName: \"kubernetes.io/projected/b0a83909-d285-4405-ae04-9356b2a29db9-kube-api-access-dlflw\") pod \"openstack-operator-index-dw8w2\" (UID: \"b0a83909-d285-4405-ae04-9356b2a29db9\") " pod="openstack-operators/openstack-operator-index-dw8w2"
Mar 09 14:21:09 crc kubenswrapper[4722]: I0309 14:21:09.392824 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-dw8w2"
Mar 09 14:21:09 crc kubenswrapper[4722]: I0309 14:21:09.834780 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-dw8w2"]
Mar 09 14:21:09 crc kubenswrapper[4722]: W0309 14:21:09.840237 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0a83909_d285_4405_ae04_9356b2a29db9.slice/crio-e45ad6f01547ed2ff17c4db01912a841dec157a484772da58dd726b2942942ed WatchSource:0}: Error finding container e45ad6f01547ed2ff17c4db01912a841dec157a484772da58dd726b2942942ed: Status 404 returned error can't find the container with id e45ad6f01547ed2ff17c4db01912a841dec157a484772da58dd726b2942942ed
Mar 09 14:21:10 crc kubenswrapper[4722]: I0309 14:21:10.196775 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-6vn96"
Mar 09 14:21:10 crc kubenswrapper[4722]: I0309 14:21:10.220609 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-6vn96"
Mar 09 14:21:10 crc kubenswrapper[4722]: I0309 14:21:10.310373 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-dw8w2" event={"ID":"b0a83909-d285-4405-ae04-9356b2a29db9","Type":"ContainerStarted","Data":"e45ad6f01547ed2ff17c4db01912a841dec157a484772da58dd726b2942942ed"}
Mar 09 14:21:11 crc kubenswrapper[4722]: I0309 14:21:11.853182 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-dw8w2"]
Mar 09 14:21:12 crc kubenswrapper[4722]: I0309 14:21:12.462953 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-7qbrd"]
Mar 09 14:21:12 crc kubenswrapper[4722]: I0309 14:21:12.464570 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-7qbrd"
Mar 09 14:21:12 crc kubenswrapper[4722]: I0309 14:21:12.469489 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-7qbrd"]
Mar 09 14:21:12 crc kubenswrapper[4722]: I0309 14:21:12.594234 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkfzb\" (UniqueName: \"kubernetes.io/projected/8ac36d47-4501-4033-aee7-ce9ed8ed7002-kube-api-access-vkfzb\") pod \"openstack-operator-index-7qbrd\" (UID: \"8ac36d47-4501-4033-aee7-ce9ed8ed7002\") " pod="openstack-operators/openstack-operator-index-7qbrd"
Mar 09 14:21:12 crc kubenswrapper[4722]: I0309 14:21:12.695193 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkfzb\" (UniqueName: \"kubernetes.io/projected/8ac36d47-4501-4033-aee7-ce9ed8ed7002-kube-api-access-vkfzb\") pod \"openstack-operator-index-7qbrd\" (UID: \"8ac36d47-4501-4033-aee7-ce9ed8ed7002\") " pod="openstack-operators/openstack-operator-index-7qbrd"
Mar 09 14:21:12 crc kubenswrapper[4722]: I0309 14:21:12.714820 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkfzb\" (UniqueName: \"kubernetes.io/projected/8ac36d47-4501-4033-aee7-ce9ed8ed7002-kube-api-access-vkfzb\") pod \"openstack-operator-index-7qbrd\" (UID: \"8ac36d47-4501-4033-aee7-ce9ed8ed7002\") " pod="openstack-operators/openstack-operator-index-7qbrd"
Mar 09 14:21:12 crc kubenswrapper[4722]: I0309 14:21:12.821423 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-7qbrd"
Mar 09 14:21:13 crc kubenswrapper[4722]: I0309 14:21:13.247881 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-7qbrd"]
Mar 09 14:21:13 crc kubenswrapper[4722]: I0309 14:21:13.338962 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-7qbrd" event={"ID":"8ac36d47-4501-4033-aee7-ce9ed8ed7002","Type":"ContainerStarted","Data":"f5f6998fb66c06d8d8708ab7e7eaa34651f0efe31f84273db77c00faa63cc859"}
Mar 09 14:21:13 crc kubenswrapper[4722]: I0309 14:21:13.340651 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-dw8w2" event={"ID":"b0a83909-d285-4405-ae04-9356b2a29db9","Type":"ContainerStarted","Data":"2b63c6facc599bcce666bdb107e5371f8dc0086fc8f81d5e9913cf4007c6b1e1"}
Mar 09 14:21:13 crc kubenswrapper[4722]: I0309 14:21:13.340775 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-dw8w2" podUID="b0a83909-d285-4405-ae04-9356b2a29db9" containerName="registry-server" containerID="cri-o://2b63c6facc599bcce666bdb107e5371f8dc0086fc8f81d5e9913cf4007c6b1e1" gracePeriod=2
Mar 09 14:21:13 crc kubenswrapper[4722]: I0309 14:21:13.360807 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-dw8w2" podStartSLOduration=1.585209648 podStartE2EDuration="4.360784105s" podCreationTimestamp="2026-03-09 14:21:09 +0000 UTC" firstStartedPulling="2026-03-09 14:21:09.842130351 +0000 UTC m=+1110.397698927" lastFinishedPulling="2026-03-09 14:21:12.617704808 +0000 UTC m=+1113.173273384" observedRunningTime="2026-03-09 14:21:13.354301966 +0000 UTC m=+1113.909870542" watchObservedRunningTime="2026-03-09 14:21:13.360784105 +0000 UTC m=+1113.916352681"
Mar 09 14:21:13 crc kubenswrapper[4722]: I0309 14:21:13.722607 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-dw8w2"
Mar 09 14:21:13 crc kubenswrapper[4722]: I0309 14:21:13.813332 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlflw\" (UniqueName: \"kubernetes.io/projected/b0a83909-d285-4405-ae04-9356b2a29db9-kube-api-access-dlflw\") pod \"b0a83909-d285-4405-ae04-9356b2a29db9\" (UID: \"b0a83909-d285-4405-ae04-9356b2a29db9\") "
Mar 09 14:21:13 crc kubenswrapper[4722]: I0309 14:21:13.819327 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0a83909-d285-4405-ae04-9356b2a29db9-kube-api-access-dlflw" (OuterVolumeSpecName: "kube-api-access-dlflw") pod "b0a83909-d285-4405-ae04-9356b2a29db9" (UID: "b0a83909-d285-4405-ae04-9356b2a29db9"). InnerVolumeSpecName "kube-api-access-dlflw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:21:13 crc kubenswrapper[4722]: I0309 14:21:13.915589 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlflw\" (UniqueName: \"kubernetes.io/projected/b0a83909-d285-4405-ae04-9356b2a29db9-kube-api-access-dlflw\") on node \"crc\" DevicePath \"\""
Mar 09 14:21:14 crc kubenswrapper[4722]: I0309 14:21:14.349490 4722 generic.go:334] "Generic (PLEG): container finished" podID="b0a83909-d285-4405-ae04-9356b2a29db9" containerID="2b63c6facc599bcce666bdb107e5371f8dc0086fc8f81d5e9913cf4007c6b1e1" exitCode=0
Mar 09 14:21:14 crc kubenswrapper[4722]: I0309 14:21:14.349546 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-dw8w2" event={"ID":"b0a83909-d285-4405-ae04-9356b2a29db9","Type":"ContainerDied","Data":"2b63c6facc599bcce666bdb107e5371f8dc0086fc8f81d5e9913cf4007c6b1e1"}
Mar 09 14:21:14 crc kubenswrapper[4722]: I0309 14:21:14.349570 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-dw8w2" event={"ID":"b0a83909-d285-4405-ae04-9356b2a29db9","Type":"ContainerDied","Data":"e45ad6f01547ed2ff17c4db01912a841dec157a484772da58dd726b2942942ed"}
Mar 09 14:21:14 crc kubenswrapper[4722]: I0309 14:21:14.349571 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-dw8w2"
Mar 09 14:21:14 crc kubenswrapper[4722]: I0309 14:21:14.349585 4722 scope.go:117] "RemoveContainer" containerID="2b63c6facc599bcce666bdb107e5371f8dc0086fc8f81d5e9913cf4007c6b1e1"
Mar 09 14:21:14 crc kubenswrapper[4722]: I0309 14:21:14.351566 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-7qbrd" event={"ID":"8ac36d47-4501-4033-aee7-ce9ed8ed7002","Type":"ContainerStarted","Data":"6adf907913ba33a54b43059100ba51bbb9f8f927134943ccbc706b2478b42924"}
Mar 09 14:21:14 crc kubenswrapper[4722]: I0309 14:21:14.369356 4722 scope.go:117] "RemoveContainer" containerID="2b63c6facc599bcce666bdb107e5371f8dc0086fc8f81d5e9913cf4007c6b1e1"
Mar 09 14:21:14 crc kubenswrapper[4722]: E0309 14:21:14.370036 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b63c6facc599bcce666bdb107e5371f8dc0086fc8f81d5e9913cf4007c6b1e1\": container with ID starting with 2b63c6facc599bcce666bdb107e5371f8dc0086fc8f81d5e9913cf4007c6b1e1 not found: ID does not exist" containerID="2b63c6facc599bcce666bdb107e5371f8dc0086fc8f81d5e9913cf4007c6b1e1"
Mar 09 14:21:14 crc kubenswrapper[4722]: I0309 14:21:14.370264 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b63c6facc599bcce666bdb107e5371f8dc0086fc8f81d5e9913cf4007c6b1e1"} err="failed to get container status \"2b63c6facc599bcce666bdb107e5371f8dc0086fc8f81d5e9913cf4007c6b1e1\": rpc error: code = NotFound desc = could not find container \"2b63c6facc599bcce666bdb107e5371f8dc0086fc8f81d5e9913cf4007c6b1e1\": container with ID starting with 2b63c6facc599bcce666bdb107e5371f8dc0086fc8f81d5e9913cf4007c6b1e1 not found: ID does not exist"
Mar 09 14:21:14 crc kubenswrapper[4722]: I0309 14:21:14.375736 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-dw8w2"]
Mar 09 14:21:14 crc kubenswrapper[4722]: I0309 14:21:14.387046 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-dw8w2"]
Mar 09 14:21:14 crc kubenswrapper[4722]: I0309 14:21:14.394356 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-7qbrd" podStartSLOduration=2.340566552 podStartE2EDuration="2.394328749s" podCreationTimestamp="2026-03-09 14:21:12 +0000 UTC" firstStartedPulling="2026-03-09 14:21:13.260041508 +0000 UTC m=+1113.815610084" lastFinishedPulling="2026-03-09 14:21:13.313803685 +0000 UTC m=+1113.869372281" observedRunningTime="2026-03-09 14:21:14.384690452 +0000 UTC m=+1114.940259028" watchObservedRunningTime="2026-03-09 14:21:14.394328749 +0000 UTC m=+1114.949897335"
Mar 09 14:21:14 crc kubenswrapper[4722]: I0309 14:21:14.556216 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-2nlfl"
Mar 09 14:21:14 crc kubenswrapper[4722]: I0309 14:21:14.646588 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-86ddb6bd46-6w5ww"
Mar 09 14:21:15 crc kubenswrapper[4722]: I0309 14:21:15.164126 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-6vn96"
Mar 09 14:21:16 crc kubenswrapper[4722]: I0309 14:21:16.157675 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0a83909-d285-4405-ae04-9356b2a29db9" path="/var/lib/kubelet/pods/b0a83909-d285-4405-ae04-9356b2a29db9/volumes"
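The RemoveContainer / NotFound sequence above is the normal idempotent-cleanup path: kubelet asks the runtime to delete a container that CRI-O has already removed, gets NotFound back, logs it, and moves on. A minimal sketch of that tolerance pattern, using a stand-in client interface rather than kubelet's actual types:

    package main

    import (
        "fmt"

        "google.golang.org/grpc/codes"
        "google.golang.org/grpc/status"
    )

    // runtimeClient is a hypothetical subset of a CRI-style client.
    type runtimeClient interface {
        RemoveContainer(id string) error
    }

    // removeIdempotent treats NotFound as success: the container may already
    // have been removed by a racing deletion, as in the log above.
    func removeIdempotent(rt runtimeClient, id string) error {
        err := rt.RemoveContainer(id)
        if err != nil && status.Code(err) != codes.NotFound {
            return err
        }
        return nil
    }

    // fakeClient always reports the container as already gone.
    type fakeClient struct{}

    func (fakeClient) RemoveContainer(id string) error {
        return status.Error(codes.NotFound, "could not find container "+id)
    }

    func main() {
        // Mirrors the 2b63c6fa... deletion above: NotFound is swallowed.
        fmt.Println(removeIdempotent(fakeClient{}, "2b63c6fa"))
    }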
path="/var/lib/kubelet/pods/b0a83909-d285-4405-ae04-9356b2a29db9/volumes" Mar 09 14:21:22 crc kubenswrapper[4722]: I0309 14:21:22.822713 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-7qbrd" Mar 09 14:21:22 crc kubenswrapper[4722]: I0309 14:21:22.824669 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-7qbrd" Mar 09 14:21:22 crc kubenswrapper[4722]: I0309 14:21:22.851775 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-7qbrd" Mar 09 14:21:23 crc kubenswrapper[4722]: I0309 14:21:23.445736 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-7qbrd" Mar 09 14:21:30 crc kubenswrapper[4722]: I0309 14:21:30.307338 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/b46038f17be43086a4100f87067c3f77525f8955a044026c95f8e3646bc72km"] Mar 09 14:21:30 crc kubenswrapper[4722]: E0309 14:21:30.308175 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0a83909-d285-4405-ae04-9356b2a29db9" containerName="registry-server" Mar 09 14:21:30 crc kubenswrapper[4722]: I0309 14:21:30.308191 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0a83909-d285-4405-ae04-9356b2a29db9" containerName="registry-server" Mar 09 14:21:30 crc kubenswrapper[4722]: I0309 14:21:30.308435 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0a83909-d285-4405-ae04-9356b2a29db9" containerName="registry-server" Mar 09 14:21:30 crc kubenswrapper[4722]: I0309 14:21:30.309998 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/b46038f17be43086a4100f87067c3f77525f8955a044026c95f8e3646bc72km" Mar 09 14:21:30 crc kubenswrapper[4722]: I0309 14:21:30.312280 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-gp8ct" Mar 09 14:21:30 crc kubenswrapper[4722]: I0309 14:21:30.323678 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/b46038f17be43086a4100f87067c3f77525f8955a044026c95f8e3646bc72km"] Mar 09 14:21:30 crc kubenswrapper[4722]: I0309 14:21:30.400096 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fcc40a21-72a2-4ffc-9148-848cc22b9ada-bundle\") pod \"b46038f17be43086a4100f87067c3f77525f8955a044026c95f8e3646bc72km\" (UID: \"fcc40a21-72a2-4ffc-9148-848cc22b9ada\") " pod="openstack-operators/b46038f17be43086a4100f87067c3f77525f8955a044026c95f8e3646bc72km" Mar 09 14:21:30 crc kubenswrapper[4722]: I0309 14:21:30.400481 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fcc40a21-72a2-4ffc-9148-848cc22b9ada-util\") pod \"b46038f17be43086a4100f87067c3f77525f8955a044026c95f8e3646bc72km\" (UID: \"fcc40a21-72a2-4ffc-9148-848cc22b9ada\") " pod="openstack-operators/b46038f17be43086a4100f87067c3f77525f8955a044026c95f8e3646bc72km" Mar 09 14:21:30 crc kubenswrapper[4722]: I0309 14:21:30.400636 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-km655\" (UniqueName: \"kubernetes.io/projected/fcc40a21-72a2-4ffc-9148-848cc22b9ada-kube-api-access-km655\") pod 
\"b46038f17be43086a4100f87067c3f77525f8955a044026c95f8e3646bc72km\" (UID: \"fcc40a21-72a2-4ffc-9148-848cc22b9ada\") " pod="openstack-operators/b46038f17be43086a4100f87067c3f77525f8955a044026c95f8e3646bc72km" Mar 09 14:21:30 crc kubenswrapper[4722]: I0309 14:21:30.502718 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fcc40a21-72a2-4ffc-9148-848cc22b9ada-bundle\") pod \"b46038f17be43086a4100f87067c3f77525f8955a044026c95f8e3646bc72km\" (UID: \"fcc40a21-72a2-4ffc-9148-848cc22b9ada\") " pod="openstack-operators/b46038f17be43086a4100f87067c3f77525f8955a044026c95f8e3646bc72km" Mar 09 14:21:30 crc kubenswrapper[4722]: I0309 14:21:30.502833 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fcc40a21-72a2-4ffc-9148-848cc22b9ada-util\") pod \"b46038f17be43086a4100f87067c3f77525f8955a044026c95f8e3646bc72km\" (UID: \"fcc40a21-72a2-4ffc-9148-848cc22b9ada\") " pod="openstack-operators/b46038f17be43086a4100f87067c3f77525f8955a044026c95f8e3646bc72km" Mar 09 14:21:30 crc kubenswrapper[4722]: I0309 14:21:30.502865 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-km655\" (UniqueName: \"kubernetes.io/projected/fcc40a21-72a2-4ffc-9148-848cc22b9ada-kube-api-access-km655\") pod \"b46038f17be43086a4100f87067c3f77525f8955a044026c95f8e3646bc72km\" (UID: \"fcc40a21-72a2-4ffc-9148-848cc22b9ada\") " pod="openstack-operators/b46038f17be43086a4100f87067c3f77525f8955a044026c95f8e3646bc72km" Mar 09 14:21:30 crc kubenswrapper[4722]: I0309 14:21:30.503726 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fcc40a21-72a2-4ffc-9148-848cc22b9ada-bundle\") pod \"b46038f17be43086a4100f87067c3f77525f8955a044026c95f8e3646bc72km\" (UID: \"fcc40a21-72a2-4ffc-9148-848cc22b9ada\") " pod="openstack-operators/b46038f17be43086a4100f87067c3f77525f8955a044026c95f8e3646bc72km" Mar 09 14:21:30 crc kubenswrapper[4722]: I0309 14:21:30.503944 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fcc40a21-72a2-4ffc-9148-848cc22b9ada-util\") pod \"b46038f17be43086a4100f87067c3f77525f8955a044026c95f8e3646bc72km\" (UID: \"fcc40a21-72a2-4ffc-9148-848cc22b9ada\") " pod="openstack-operators/b46038f17be43086a4100f87067c3f77525f8955a044026c95f8e3646bc72km" Mar 09 14:21:30 crc kubenswrapper[4722]: I0309 14:21:30.533861 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-km655\" (UniqueName: \"kubernetes.io/projected/fcc40a21-72a2-4ffc-9148-848cc22b9ada-kube-api-access-km655\") pod \"b46038f17be43086a4100f87067c3f77525f8955a044026c95f8e3646bc72km\" (UID: \"fcc40a21-72a2-4ffc-9148-848cc22b9ada\") " pod="openstack-operators/b46038f17be43086a4100f87067c3f77525f8955a044026c95f8e3646bc72km" Mar 09 14:21:30 crc kubenswrapper[4722]: I0309 14:21:30.635720 4722 util.go:30] "No sandbox for pod can be found. 
Mar 09 14:21:31 crc kubenswrapper[4722]: I0309 14:21:31.132354 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/b46038f17be43086a4100f87067c3f77525f8955a044026c95f8e3646bc72km"]
Mar 09 14:21:31 crc kubenswrapper[4722]: W0309 14:21:31.144568 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfcc40a21_72a2_4ffc_9148_848cc22b9ada.slice/crio-ecbb1d3e9debdf5ace5ff49b7b69e6b2aa7efd3fdf93d119093894557cd2460a WatchSource:0}: Error finding container ecbb1d3e9debdf5ace5ff49b7b69e6b2aa7efd3fdf93d119093894557cd2460a: Status 404 returned error can't find the container with id ecbb1d3e9debdf5ace5ff49b7b69e6b2aa7efd3fdf93d119093894557cd2460a
Mar 09 14:21:31 crc kubenswrapper[4722]: I0309 14:21:31.478832 4722 generic.go:334] "Generic (PLEG): container finished" podID="fcc40a21-72a2-4ffc-9148-848cc22b9ada" containerID="8cb0e55b8aa2cee49f6d9eb14eb261d177da424759f37c104f4e387e0f0ffdfc" exitCode=0
Mar 09 14:21:31 crc kubenswrapper[4722]: I0309 14:21:31.478878 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b46038f17be43086a4100f87067c3f77525f8955a044026c95f8e3646bc72km" event={"ID":"fcc40a21-72a2-4ffc-9148-848cc22b9ada","Type":"ContainerDied","Data":"8cb0e55b8aa2cee49f6d9eb14eb261d177da424759f37c104f4e387e0f0ffdfc"}
Mar 09 14:21:31 crc kubenswrapper[4722]: I0309 14:21:31.479131 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b46038f17be43086a4100f87067c3f77525f8955a044026c95f8e3646bc72km" event={"ID":"fcc40a21-72a2-4ffc-9148-848cc22b9ada","Type":"ContainerStarted","Data":"ecbb1d3e9debdf5ace5ff49b7b69e6b2aa7efd3fdf93d119093894557cd2460a"}
Mar 09 14:21:33 crc kubenswrapper[4722]: I0309 14:21:33.501901 4722 generic.go:334] "Generic (PLEG): container finished" podID="fcc40a21-72a2-4ffc-9148-848cc22b9ada" containerID="ac80b34f6ef0b8127a43e42d0784bfcf87c83c61876b2e4204bbe9d76476dba5" exitCode=0
Mar 09 14:21:33 crc kubenswrapper[4722]: I0309 14:21:33.501993 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b46038f17be43086a4100f87067c3f77525f8955a044026c95f8e3646bc72km" event={"ID":"fcc40a21-72a2-4ffc-9148-848cc22b9ada","Type":"ContainerDied","Data":"ac80b34f6ef0b8127a43e42d0784bfcf87c83c61876b2e4204bbe9d76476dba5"}
Mar 09 14:21:34 crc kubenswrapper[4722]: I0309 14:21:34.511355 4722 generic.go:334] "Generic (PLEG): container finished" podID="fcc40a21-72a2-4ffc-9148-848cc22b9ada" containerID="7bfe6aa4b6c8662219a918ce9699525e6d812a33ae4829fd4b6fc4092d50634d" exitCode=0
Mar 09 14:21:34 crc kubenswrapper[4722]: I0309 14:21:34.511467 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b46038f17be43086a4100f87067c3f77525f8955a044026c95f8e3646bc72km" event={"ID":"fcc40a21-72a2-4ffc-9148-848cc22b9ada","Type":"ContainerDied","Data":"7bfe6aa4b6c8662219a918ce9699525e6d812a33ae4829fd4b6fc4092d50634d"}
Mar 09 14:21:35 crc kubenswrapper[4722]: I0309 14:21:35.865058 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/b46038f17be43086a4100f87067c3f77525f8955a044026c95f8e3646bc72km"
Mar 09 14:21:35 crc kubenswrapper[4722]: I0309 14:21:35.999464 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fcc40a21-72a2-4ffc-9148-848cc22b9ada-util\") pod \"fcc40a21-72a2-4ffc-9148-848cc22b9ada\" (UID: \"fcc40a21-72a2-4ffc-9148-848cc22b9ada\") "
Mar 09 14:21:35 crc kubenswrapper[4722]: I0309 14:21:35.999595 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-km655\" (UniqueName: \"kubernetes.io/projected/fcc40a21-72a2-4ffc-9148-848cc22b9ada-kube-api-access-km655\") pod \"fcc40a21-72a2-4ffc-9148-848cc22b9ada\" (UID: \"fcc40a21-72a2-4ffc-9148-848cc22b9ada\") "
Mar 09 14:21:35 crc kubenswrapper[4722]: I0309 14:21:35.999732 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fcc40a21-72a2-4ffc-9148-848cc22b9ada-bundle\") pod \"fcc40a21-72a2-4ffc-9148-848cc22b9ada\" (UID: \"fcc40a21-72a2-4ffc-9148-848cc22b9ada\") "
Mar 09 14:21:36 crc kubenswrapper[4722]: I0309 14:21:36.000387 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fcc40a21-72a2-4ffc-9148-848cc22b9ada-bundle" (OuterVolumeSpecName: "bundle") pod "fcc40a21-72a2-4ffc-9148-848cc22b9ada" (UID: "fcc40a21-72a2-4ffc-9148-848cc22b9ada"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 14:21:36 crc kubenswrapper[4722]: I0309 14:21:36.005276 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcc40a21-72a2-4ffc-9148-848cc22b9ada-kube-api-access-km655" (OuterVolumeSpecName: "kube-api-access-km655") pod "fcc40a21-72a2-4ffc-9148-848cc22b9ada" (UID: "fcc40a21-72a2-4ffc-9148-848cc22b9ada"). InnerVolumeSpecName "kube-api-access-km655". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:21:36 crc kubenswrapper[4722]: I0309 14:21:36.012776 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fcc40a21-72a2-4ffc-9148-848cc22b9ada-util" (OuterVolumeSpecName: "util") pod "fcc40a21-72a2-4ffc-9148-848cc22b9ada" (UID: "fcc40a21-72a2-4ffc-9148-848cc22b9ada"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 14:21:36 crc kubenswrapper[4722]: I0309 14:21:36.101619 4722 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fcc40a21-72a2-4ffc-9148-848cc22b9ada-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 14:21:36 crc kubenswrapper[4722]: I0309 14:21:36.101838 4722 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fcc40a21-72a2-4ffc-9148-848cc22b9ada-util\") on node \"crc\" DevicePath \"\""
Mar 09 14:21:36 crc kubenswrapper[4722]: I0309 14:21:36.101906 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-km655\" (UniqueName: \"kubernetes.io/projected/fcc40a21-72a2-4ffc-9148-848cc22b9ada-kube-api-access-km655\") on node \"crc\" DevicePath \"\""
Mar 09 14:21:36 crc kubenswrapper[4722]: I0309 14:21:36.535155 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b46038f17be43086a4100f87067c3f77525f8955a044026c95f8e3646bc72km" event={"ID":"fcc40a21-72a2-4ffc-9148-848cc22b9ada","Type":"ContainerDied","Data":"ecbb1d3e9debdf5ace5ff49b7b69e6b2aa7efd3fdf93d119093894557cd2460a"}
Mar 09 14:21:36 crc kubenswrapper[4722]: I0309 14:21:36.535226 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ecbb1d3e9debdf5ace5ff49b7b69e6b2aa7efd3fdf93d119093894557cd2460a"
Mar 09 14:21:36 crc kubenswrapper[4722]: I0309 14:21:36.535326 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/b46038f17be43086a4100f87067c3f77525f8955a044026c95f8e3646bc72km"
Mar 09 14:21:42 crc kubenswrapper[4722]: I0309 14:21:42.359951 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-5b979cff56-vwbnz"]
Mar 09 14:21:42 crc kubenswrapper[4722]: E0309 14:21:42.382523 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcc40a21-72a2-4ffc-9148-848cc22b9ada" containerName="pull"
Mar 09 14:21:42 crc kubenswrapper[4722]: I0309 14:21:42.382559 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcc40a21-72a2-4ffc-9148-848cc22b9ada" containerName="pull"
Mar 09 14:21:42 crc kubenswrapper[4722]: E0309 14:21:42.382589 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcc40a21-72a2-4ffc-9148-848cc22b9ada" containerName="util"
Mar 09 14:21:42 crc kubenswrapper[4722]: I0309 14:21:42.382597 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcc40a21-72a2-4ffc-9148-848cc22b9ada" containerName="util"
Mar 09 14:21:42 crc kubenswrapper[4722]: E0309 14:21:42.382644 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcc40a21-72a2-4ffc-9148-848cc22b9ada" containerName="extract"
Mar 09 14:21:42 crc kubenswrapper[4722]: I0309 14:21:42.382650 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcc40a21-72a2-4ffc-9148-848cc22b9ada" containerName="extract"
Mar 09 14:21:42 crc kubenswrapper[4722]: I0309 14:21:42.383483 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcc40a21-72a2-4ffc-9148-848cc22b9ada" containerName="extract"
Mar 09 14:21:42 crc kubenswrapper[4722]: I0309 14:21:42.384389 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-5b979cff56-vwbnz"
Mar 09 14:21:42 crc kubenswrapper[4722]: I0309 14:21:42.387719 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-2n52p"
Mar 09 14:21:42 crc kubenswrapper[4722]: I0309 14:21:42.409820 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-5b979cff56-vwbnz"]
Mar 09 14:21:42 crc kubenswrapper[4722]: I0309 14:21:42.421074 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88pkj\" (UniqueName: \"kubernetes.io/projected/bdac45ca-36d4-41c5-b5e5-332d70558171-kube-api-access-88pkj\") pod \"openstack-operator-controller-init-5b979cff56-vwbnz\" (UID: \"bdac45ca-36d4-41c5-b5e5-332d70558171\") " pod="openstack-operators/openstack-operator-controller-init-5b979cff56-vwbnz"
Mar 09 14:21:42 crc kubenswrapper[4722]: I0309 14:21:42.522046 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88pkj\" (UniqueName: \"kubernetes.io/projected/bdac45ca-36d4-41c5-b5e5-332d70558171-kube-api-access-88pkj\") pod \"openstack-operator-controller-init-5b979cff56-vwbnz\" (UID: \"bdac45ca-36d4-41c5-b5e5-332d70558171\") " pod="openstack-operators/openstack-operator-controller-init-5b979cff56-vwbnz"
Mar 09 14:21:42 crc kubenswrapper[4722]: I0309 14:21:42.540778 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88pkj\" (UniqueName: \"kubernetes.io/projected/bdac45ca-36d4-41c5-b5e5-332d70558171-kube-api-access-88pkj\") pod \"openstack-operator-controller-init-5b979cff56-vwbnz\" (UID: \"bdac45ca-36d4-41c5-b5e5-332d70558171\") " pod="openstack-operators/openstack-operator-controller-init-5b979cff56-vwbnz"
Mar 09 14:21:42 crc kubenswrapper[4722]: I0309 14:21:42.718531 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-5b979cff56-vwbnz"
Mar 09 14:21:43 crc kubenswrapper[4722]: I0309 14:21:43.209554 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-5b979cff56-vwbnz"]
Mar 09 14:21:43 crc kubenswrapper[4722]: I0309 14:21:43.591393 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-5b979cff56-vwbnz" event={"ID":"bdac45ca-36d4-41c5-b5e5-332d70558171","Type":"ContainerStarted","Data":"e57c05971380057a146b6b5fa9b7b6d98393098b25014e1a4f7bdd7b43d98d26"}
Mar 09 14:21:47 crc kubenswrapper[4722]: I0309 14:21:47.625806 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-5b979cff56-vwbnz" event={"ID":"bdac45ca-36d4-41c5-b5e5-332d70558171","Type":"ContainerStarted","Data":"9375f94a471a39f0fc6a55916228c8a20239245e447bacbc825eba7f44f3106d"}
Mar 09 14:21:47 crc kubenswrapper[4722]: I0309 14:21:47.626608 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-5b979cff56-vwbnz"
Mar 09 14:21:47 crc kubenswrapper[4722]: I0309 14:21:47.680773 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-5b979cff56-vwbnz" podStartSLOduration=1.708242451 podStartE2EDuration="5.680755521s" podCreationTimestamp="2026-03-09 14:21:42 +0000 UTC" firstStartedPulling="2026-03-09 14:21:43.222409501 +0000 UTC m=+1143.777978077" lastFinishedPulling="2026-03-09 14:21:47.194922571 +0000 UTC m=+1147.750491147" observedRunningTime="2026-03-09 14:21:47.66910585 +0000 UTC m=+1148.224674466" watchObservedRunningTime="2026-03-09 14:21:47.680755521 +0000 UTC m=+1148.236324097"
Mar 09 14:21:51 crc kubenswrapper[4722]: I0309 14:21:51.527838 4722 patch_prober.go:28] interesting pod/machine-config-daemon-hjrrb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 14:21:51 crc kubenswrapper[4722]: I0309 14:21:51.528462 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 14:21:52 crc kubenswrapper[4722]: I0309 14:21:52.722378 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-5b979cff56-vwbnz"
Mar 09 14:22:00 crc kubenswrapper[4722]: I0309 14:22:00.140671 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551102-xr98b"]
Mar 09 14:22:00 crc kubenswrapper[4722]: I0309 14:22:00.142863 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551102-xr98b"
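The patch_prober/prober pair above records a failed HTTP liveness probe: a GET to http://127.0.0.1:8798/health that is refused because nothing is listening. A minimal reproduction of that check (the URL comes from the log; the 1s timeout and the 200-399 success window are assumptions matching common probe semantics):

    package main

    import (
        "fmt"
        "net/http"
        "time"
    )

    func main() {
        client := &http.Client{Timeout: time.Second}
        resp, err := client.Get("http://127.0.0.1:8798/health")
        if err != nil {
            // e.g. dial tcp 127.0.0.1:8798: connect: connection refused
            fmt.Println("probe failure:", err)
            return
        }
        defer resp.Body.Close()
        if resp.StatusCode >= 200 && resp.StatusCode < 400 {
            fmt.Println("probe success:", resp.Status)
        } else {
            fmt.Println("probe failure:", resp.Status)
        }
    }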
Mar 09 14:22:00 crc kubenswrapper[4722]: I0309 14:22:00.145609 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 14:22:00 crc kubenswrapper[4722]: I0309 14:22:00.146151 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 14:22:00 crc kubenswrapper[4722]: I0309 14:22:00.147849 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-dr2x6"
Mar 09 14:22:00 crc kubenswrapper[4722]: I0309 14:22:00.159797 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551102-xr98b"]
Mar 09 14:22:00 crc kubenswrapper[4722]: I0309 14:22:00.262000 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwtx4\" (UniqueName: \"kubernetes.io/projected/94a144d8-4427-4dd2-99df-825094bc5a4b-kube-api-access-mwtx4\") pod \"auto-csr-approver-29551102-xr98b\" (UID: \"94a144d8-4427-4dd2-99df-825094bc5a4b\") " pod="openshift-infra/auto-csr-approver-29551102-xr98b"
Mar 09 14:22:00 crc kubenswrapper[4722]: I0309 14:22:00.363842 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwtx4\" (UniqueName: \"kubernetes.io/projected/94a144d8-4427-4dd2-99df-825094bc5a4b-kube-api-access-mwtx4\") pod \"auto-csr-approver-29551102-xr98b\" (UID: \"94a144d8-4427-4dd2-99df-825094bc5a4b\") " pod="openshift-infra/auto-csr-approver-29551102-xr98b"
Mar 09 14:22:00 crc kubenswrapper[4722]: I0309 14:22:00.382497 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwtx4\" (UniqueName: \"kubernetes.io/projected/94a144d8-4427-4dd2-99df-825094bc5a4b-kube-api-access-mwtx4\") pod \"auto-csr-approver-29551102-xr98b\" (UID: \"94a144d8-4427-4dd2-99df-825094bc5a4b\") " pod="openshift-infra/auto-csr-approver-29551102-xr98b"
Mar 09 14:22:00 crc kubenswrapper[4722]: I0309 14:22:00.474371 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551102-xr98b"
Mar 09 14:22:00 crc kubenswrapper[4722]: I0309 14:22:00.966492 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551102-xr98b"]
Mar 09 14:22:01 crc kubenswrapper[4722]: I0309 14:22:01.739245 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551102-xr98b" event={"ID":"94a144d8-4427-4dd2-99df-825094bc5a4b","Type":"ContainerStarted","Data":"4c7f71c613261cc51b83d2eb3b9e2b1fc3272996d455aecfaa12681b8eac85c9"}
Mar 09 14:22:02 crc kubenswrapper[4722]: I0309 14:22:02.753284 4722 generic.go:334] "Generic (PLEG): container finished" podID="94a144d8-4427-4dd2-99df-825094bc5a4b" containerID="428579cc73e13fdc6466d3bc9654e1b74df863575b4241ab50e43a0a961ac2b1" exitCode=0
Mar 09 14:22:02 crc kubenswrapper[4722]: I0309 14:22:02.753535 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551102-xr98b" event={"ID":"94a144d8-4427-4dd2-99df-825094bc5a4b","Type":"ContainerDied","Data":"428579cc73e13fdc6466d3bc9654e1b74df863575b4241ab50e43a0a961ac2b1"}
Mar 09 14:22:04 crc kubenswrapper[4722]: I0309 14:22:04.092680 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551102-xr98b"
Mar 09 14:22:04 crc kubenswrapper[4722]: I0309 14:22:04.157931 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwtx4\" (UniqueName: \"kubernetes.io/projected/94a144d8-4427-4dd2-99df-825094bc5a4b-kube-api-access-mwtx4\") pod \"94a144d8-4427-4dd2-99df-825094bc5a4b\" (UID: \"94a144d8-4427-4dd2-99df-825094bc5a4b\") "
Mar 09 14:22:04 crc kubenswrapper[4722]: I0309 14:22:04.167343 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94a144d8-4427-4dd2-99df-825094bc5a4b-kube-api-access-mwtx4" (OuterVolumeSpecName: "kube-api-access-mwtx4") pod "94a144d8-4427-4dd2-99df-825094bc5a4b" (UID: "94a144d8-4427-4dd2-99df-825094bc5a4b"). InnerVolumeSpecName "kube-api-access-mwtx4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:22:04 crc kubenswrapper[4722]: I0309 14:22:04.260457 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwtx4\" (UniqueName: \"kubernetes.io/projected/94a144d8-4427-4dd2-99df-825094bc5a4b-kube-api-access-mwtx4\") on node \"crc\" DevicePath \"\""
Mar 09 14:22:04 crc kubenswrapper[4722]: I0309 14:22:04.770953 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551102-xr98b" event={"ID":"94a144d8-4427-4dd2-99df-825094bc5a4b","Type":"ContainerDied","Data":"4c7f71c613261cc51b83d2eb3b9e2b1fc3272996d455aecfaa12681b8eac85c9"}
Mar 09 14:22:04 crc kubenswrapper[4722]: I0309 14:22:04.771022 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c7f71c613261cc51b83d2eb3b9e2b1fc3272996d455aecfaa12681b8eac85c9"
Mar 09 14:22:04 crc kubenswrapper[4722]: I0309 14:22:04.771117 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551102-xr98b"
Mar 09 14:22:05 crc kubenswrapper[4722]: I0309 14:22:05.168572 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551096-hkrz5"]
Mar 09 14:22:05 crc kubenswrapper[4722]: I0309 14:22:05.181580 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551096-hkrz5"]
Mar 09 14:22:06 crc kubenswrapper[4722]: I0309 14:22:06.168867 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0741506d-5c21-4ce8-be9e-98caaea80864" path="/var/lib/kubelet/pods/0741506d-5c21-4ce8-be9e-98caaea80864/volumes"
Mar 09 14:22:16 crc kubenswrapper[4722]: I0309 14:22:16.162546 4722 scope.go:117] "RemoveContainer" containerID="070b01014ac1c78dfc5ec279e65efc22b02c251ada626c1d2a07b3697b21f4d9"
Mar 09 14:22:21 crc kubenswrapper[4722]: I0309 14:22:21.529329 4722 patch_prober.go:28] interesting pod/machine-config-daemon-hjrrb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 14:22:21 crc kubenswrapper[4722]: I0309 14:22:21.529761 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.314747 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6db6876945-vppnv"]
Mar 09 14:22:34 crc kubenswrapper[4722]: E0309 14:22:34.315712 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94a144d8-4427-4dd2-99df-825094bc5a4b" containerName="oc"
Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.315723 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="94a144d8-4427-4dd2-99df-825094bc5a4b" containerName="oc"
Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.315856 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="94a144d8-4427-4dd2-99df-825094bc5a4b" containerName="oc"
Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.316399 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-vppnv"
Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.320629 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-l42s2"
Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.337033 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-rsd9l"]
Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.338111 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-rsd9l"
Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.340786 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-x9wq7"
Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.349153 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-5d87c9d997-bmpgd"]
Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.350186 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-bmpgd"
Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.352943 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-5vld5"
Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.358876 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6db6876945-vppnv"]
Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.367110 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-rsd9l"]
Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.382040 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmpg4\" (UniqueName: \"kubernetes.io/projected/8f839106-1673-4589-9391-0cd7748e658c-kube-api-access-gmpg4\") pod \"barbican-operator-controller-manager-6db6876945-vppnv\" (UID: \"8f839106-1673-4589-9391-0cd7748e658c\") " pod="openstack-operators/barbican-operator-controller-manager-6db6876945-vppnv"
Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.420294 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-5d87c9d997-bmpgd"]
Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.458327 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-cf99c678f-ct7x8"]
Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.459347 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-ct7x8"
Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.462561 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-nps5f"
Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.482097 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-64db6967f8-zjf7b"]
Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.483153 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-zjf7b"
Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.484634 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvwd8\" (UniqueName: \"kubernetes.io/projected/edd71e1d-6ff0-4918-9cd8-a342efba2df5-kube-api-access-fvwd8\") pod \"cinder-operator-controller-manager-55d77d7b5c-rsd9l\" (UID: \"edd71e1d-6ff0-4918-9cd8-a342efba2df5\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-rsd9l"
Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.484668 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmpg4\" (UniqueName: \"kubernetes.io/projected/8f839106-1673-4589-9391-0cd7748e658c-kube-api-access-gmpg4\") pod \"barbican-operator-controller-manager-6db6876945-vppnv\" (UID: \"8f839106-1673-4589-9391-0cd7748e658c\") " pod="openstack-operators/barbican-operator-controller-manager-6db6876945-vppnv"
Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.484703 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5slz\" (UniqueName: \"kubernetes.io/projected/a1a5e35a-83f6-4886-86db-55738f51f7e8-kube-api-access-h5slz\") pod \"designate-operator-controller-manager-5d87c9d997-bmpgd\" (UID: \"a1a5e35a-83f6-4886-86db-55738f51f7e8\") " pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-bmpgd"
Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.486756 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-c4fpv"
Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.508036 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmpg4\" (UniqueName: \"kubernetes.io/projected/8f839106-1673-4589-9391-0cd7748e658c-kube-api-access-gmpg4\") pod \"barbican-operator-controller-manager-6db6876945-vppnv\" (UID: \"8f839106-1673-4589-9391-0cd7748e658c\") " pod="openstack-operators/barbican-operator-controller-manager-6db6876945-vppnv"
Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.554759 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-cf99c678f-ct7x8"]
Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.562318 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-64db6967f8-zjf7b"]
Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.569228 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-6npqv"]
Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.570372 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-6npqv"
Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.579479 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-6npqv"]
Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.586633 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-kbj4f"
Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.587462 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5slz\" (UniqueName: \"kubernetes.io/projected/a1a5e35a-83f6-4886-86db-55738f51f7e8-kube-api-access-h5slz\") pod \"designate-operator-controller-manager-5d87c9d997-bmpgd\" (UID: \"a1a5e35a-83f6-4886-86db-55738f51f7e8\") " pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-bmpgd"
Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.587519 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtdjp\" (UniqueName: \"kubernetes.io/projected/22043c71-5292-422c-99e5-c88ea1aef638-kube-api-access-vtdjp\") pod \"glance-operator-controller-manager-64db6967f8-zjf7b\" (UID: \"22043c71-5292-422c-99e5-c88ea1aef638\") " pod="openstack-operators/glance-operator-controller-manager-64db6967f8-zjf7b"
Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.587595 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2h7fn\" (UniqueName: \"kubernetes.io/projected/f21c35ef-c8ea-4331-a747-44a62c6f2e74-kube-api-access-2h7fn\") pod \"heat-operator-controller-manager-cf99c678f-ct7x8\" (UID: \"f21c35ef-c8ea-4331-a747-44a62c6f2e74\") " pod="openstack-operators/heat-operator-controller-manager-cf99c678f-ct7x8"
Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.587662 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvwd8\" (UniqueName: \"kubernetes.io/projected/edd71e1d-6ff0-4918-9cd8-a342efba2df5-kube-api-access-fvwd8\") pod \"cinder-operator-controller-manager-55d77d7b5c-rsd9l\" (UID: \"edd71e1d-6ff0-4918-9cd8-a342efba2df5\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-rsd9l"
Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.590430 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-f7fcc58b9-lvfgg"]
Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.591724 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-lvfgg"
Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.610718 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-545456dc4-hnzfw"]
Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.611732 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-hnzfw"
Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.612141 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert"
Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.612331 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-sdlvx"
Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.620218 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-q4g52"
Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.632295 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvwd8\" (UniqueName: \"kubernetes.io/projected/edd71e1d-6ff0-4918-9cd8-a342efba2df5-kube-api-access-fvwd8\") pod \"cinder-operator-controller-manager-55d77d7b5c-rsd9l\" (UID: \"edd71e1d-6ff0-4918-9cd8-a342efba2df5\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-rsd9l"
Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.644391 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-vppnv"
Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.647460 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5slz\" (UniqueName: \"kubernetes.io/projected/a1a5e35a-83f6-4886-86db-55738f51f7e8-kube-api-access-h5slz\") pod \"designate-operator-controller-manager-5d87c9d997-bmpgd\" (UID: \"a1a5e35a-83f6-4886-86db-55738f51f7e8\") " pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-bmpgd"
Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.650963 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-f7fcc58b9-lvfgg"]
Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.671109 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-rsd9l"
Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.691634 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-bmpgd"
Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.694124 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-545456dc4-hnzfw"]
Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.697531 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7a62b98d-e9d4-4cbc-bea8-0da13fcc4467-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-lvfgg\" (UID: \"7a62b98d-e9d4-4cbc-bea8-0da13fcc4467\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-lvfgg"
Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.697689 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtdjp\" (UniqueName: \"kubernetes.io/projected/22043c71-5292-422c-99e5-c88ea1aef638-kube-api-access-vtdjp\") pod \"glance-operator-controller-manager-64db6967f8-zjf7b\" (UID: \"22043c71-5292-422c-99e5-c88ea1aef638\") " pod="openstack-operators/glance-operator-controller-manager-64db6967f8-zjf7b"
Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.697735 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ccr9\" (UniqueName: \"kubernetes.io/projected/663f1719-30f7-4588-a183-4a59787e8d8d-kube-api-access-7ccr9\") pod \"horizon-operator-controller-manager-78bc7f9bd9-6npqv\" (UID: \"663f1719-30f7-4588-a183-4a59787e8d8d\") " pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-6npqv"
Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.697769 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67prx\" (UniqueName: \"kubernetes.io/projected/7a62b98d-e9d4-4cbc-bea8-0da13fcc4467-kube-api-access-67prx\") pod \"infra-operator-controller-manager-f7fcc58b9-lvfgg\" (UID: \"7a62b98d-e9d4-4cbc-bea8-0da13fcc4467\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-lvfgg"
Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.697790 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-th2gc\" (UniqueName: \"kubernetes.io/projected/ec9f1f5e-26f5-4683-bf41-c85981da9d18-kube-api-access-th2gc\") pod \"ironic-operator-controller-manager-545456dc4-hnzfw\" (UID: \"ec9f1f5e-26f5-4683-bf41-c85981da9d18\") " pod="openstack-operators/ironic-operator-controller-manager-545456dc4-hnzfw"
Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.697830 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2h7fn\" (UniqueName: \"kubernetes.io/projected/f21c35ef-c8ea-4331-a747-44a62c6f2e74-kube-api-access-2h7fn\") pod \"heat-operator-controller-manager-cf99c678f-ct7x8\" (UID: \"f21c35ef-c8ea-4331-a747-44a62c6f2e74\") " pod="openstack-operators/heat-operator-controller-manager-cf99c678f-ct7x8"
Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.720055 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7c789f89c6-2hxzr"]
Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.721347 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-2hxzr"
Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.726297 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-t42d6"
Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.740003 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2h7fn\" (UniqueName: \"kubernetes.io/projected/f21c35ef-c8ea-4331-a747-44a62c6f2e74-kube-api-access-2h7fn\") pod \"heat-operator-controller-manager-cf99c678f-ct7x8\" (UID: \"f21c35ef-c8ea-4331-a747-44a62c6f2e74\") " pod="openstack-operators/heat-operator-controller-manager-cf99c678f-ct7x8"
Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.757891 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-28855"]
Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.758984 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-67d996989d-28855"
Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.760809 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtdjp\" (UniqueName: \"kubernetes.io/projected/22043c71-5292-422c-99e5-c88ea1aef638-kube-api-access-vtdjp\") pod \"glance-operator-controller-manager-64db6967f8-zjf7b\" (UID: \"22043c71-5292-422c-99e5-c88ea1aef638\") " pod="openstack-operators/glance-operator-controller-manager-64db6967f8-zjf7b"
Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.777481 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-jkvzn"
Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.784215 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-ct7x8"
Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.795848 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-28855"]
Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.805602 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ccr9\" (UniqueName: \"kubernetes.io/projected/663f1719-30f7-4588-a183-4a59787e8d8d-kube-api-access-7ccr9\") pod \"horizon-operator-controller-manager-78bc7f9bd9-6npqv\" (UID: \"663f1719-30f7-4588-a183-4a59787e8d8d\") " pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-6npqv"
Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.805652 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67prx\" (UniqueName: \"kubernetes.io/projected/7a62b98d-e9d4-4cbc-bea8-0da13fcc4467-kube-api-access-67prx\") pod \"infra-operator-controller-manager-f7fcc58b9-lvfgg\" (UID: \"7a62b98d-e9d4-4cbc-bea8-0da13fcc4467\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-lvfgg"
Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.805674 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-th2gc\" (UniqueName: \"kubernetes.io/projected/ec9f1f5e-26f5-4683-bf41-c85981da9d18-kube-api-access-th2gc\") pod \"ironic-operator-controller-manager-545456dc4-hnzfw\" (UID: \"ec9f1f5e-26f5-4683-bf41-c85981da9d18\") " pod="openstack-operators/ironic-operator-controller-manager-545456dc4-hnzfw"
Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.805705 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cjdv\" (UniqueName: \"kubernetes.io/projected/4de6db14-6f3e-4c4e-a61d-39c6648209dd-kube-api-access-8cjdv\") pod \"keystone-operator-controller-manager-7c789f89c6-2hxzr\" (UID: \"4de6db14-6f3e-4c4e-a61d-39c6648209dd\") " pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-2hxzr"
Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.805734 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7a62b98d-e9d4-4cbc-bea8-0da13fcc4467-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-lvfgg\" (UID: \"7a62b98d-e9d4-4cbc-bea8-0da13fcc4467\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-lvfgg"
Mar 09 14:22:34 crc kubenswrapper[4722]: E0309 14:22:34.805881 4722 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Mar 09 14:22:34 crc kubenswrapper[4722]: E0309 14:22:34.805927 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a62b98d-e9d4-4cbc-bea8-0da13fcc4467-cert podName:7a62b98d-e9d4-4cbc-bea8-0da13fcc4467 nodeName:}" failed. No retries permitted until 2026-03-09 14:22:35.305911675 +0000 UTC m=+1195.861480251 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7a62b98d-e9d4-4cbc-bea8-0da13fcc4467-cert") pod "infra-operator-controller-manager-f7fcc58b9-lvfgg" (UID: "7a62b98d-e9d4-4cbc-bea8-0da13fcc4467") : secret "infra-operator-webhook-server-cert" not found Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.821258 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7c789f89c6-2hxzr"] Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.838304 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-zjf7b" Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.845219 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ccr9\" (UniqueName: \"kubernetes.io/projected/663f1719-30f7-4588-a183-4a59787e8d8d-kube-api-access-7ccr9\") pod \"horizon-operator-controller-manager-78bc7f9bd9-6npqv\" (UID: \"663f1719-30f7-4588-a183-4a59787e8d8d\") " pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-6npqv" Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.852295 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-l8bds"] Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.853355 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-l8bds" Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.857485 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-vzssg" Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.858267 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67prx\" (UniqueName: \"kubernetes.io/projected/7a62b98d-e9d4-4cbc-bea8-0da13fcc4467-kube-api-access-67prx\") pod \"infra-operator-controller-manager-f7fcc58b9-lvfgg\" (UID: \"7a62b98d-e9d4-4cbc-bea8-0da13fcc4467\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-lvfgg" Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.859919 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-th2gc\" (UniqueName: \"kubernetes.io/projected/ec9f1f5e-26f5-4683-bf41-c85981da9d18-kube-api-access-th2gc\") pod \"ironic-operator-controller-manager-545456dc4-hnzfw\" (UID: \"ec9f1f5e-26f5-4683-bf41-c85981da9d18\") " pod="openstack-operators/ironic-operator-controller-manager-545456dc4-hnzfw" Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.906055 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-l8bds"] Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.906663 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cjdv\" (UniqueName: \"kubernetes.io/projected/4de6db14-6f3e-4c4e-a61d-39c6648209dd-kube-api-access-8cjdv\") pod \"keystone-operator-controller-manager-7c789f89c6-2hxzr\" (UID: \"4de6db14-6f3e-4c4e-a61d-39c6648209dd\") " pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-2hxzr" Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.906735 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ch28f\" (UniqueName: 
\"kubernetes.io/projected/dae536b6-7a22-435e-b307-a8ab6b54779d-kube-api-access-ch28f\") pod \"manila-operator-controller-manager-67d996989d-28855\" (UID: \"dae536b6-7a22-435e-b307-a8ab6b54779d\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-28855" Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.940470 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cjdv\" (UniqueName: \"kubernetes.io/projected/4de6db14-6f3e-4c4e-a61d-39c6648209dd-kube-api-access-8cjdv\") pod \"keystone-operator-controller-manager-7c789f89c6-2hxzr\" (UID: \"4de6db14-6f3e-4c4e-a61d-39c6648209dd\") " pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-2hxzr" Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.960060 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54688575f-5fcw8"] Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.961356 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-54688575f-5fcw8" Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.968550 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-dz6cq" Mar 09 14:22:34 crc kubenswrapper[4722]: I0309 14:22:34.984792 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54688575f-5fcw8"] Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.006071 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-74b6b5dc96-jwgfg"] Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.008086 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-jwgfg" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.008543 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ch28f\" (UniqueName: \"kubernetes.io/projected/dae536b6-7a22-435e-b307-a8ab6b54779d-kube-api-access-ch28f\") pod \"manila-operator-controller-manager-67d996989d-28855\" (UID: \"dae536b6-7a22-435e-b307-a8ab6b54779d\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-28855" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.008607 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zn9qd\" (UniqueName: \"kubernetes.io/projected/74cb981b-ce89-479e-8573-fdda25190637-kube-api-access-zn9qd\") pod \"mariadb-operator-controller-manager-7b6bfb6475-l8bds\" (UID: \"74cb981b-ce89-479e-8573-fdda25190637\") " pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-l8bds" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.008857 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-6npqv" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.015143 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-r7mvl" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.039084 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-n5zc7"] Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.040179 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-n5zc7" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.043893 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-jdn8x" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.056877 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-n5zc7"] Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.058019 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ch28f\" (UniqueName: \"kubernetes.io/projected/dae536b6-7a22-435e-b307-a8ab6b54779d-kube-api-access-ch28f\") pod \"manila-operator-controller-manager-67d996989d-28855\" (UID: \"dae536b6-7a22-435e-b307-a8ab6b54779d\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-28855" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.117693 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zn9qd\" (UniqueName: \"kubernetes.io/projected/74cb981b-ce89-479e-8573-fdda25190637-kube-api-access-zn9qd\") pod \"mariadb-operator-controller-manager-7b6bfb6475-l8bds\" (UID: \"74cb981b-ce89-479e-8573-fdda25190637\") " pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-l8bds" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.118497 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-424vz\" (UniqueName: \"kubernetes.io/projected/febfeb1a-d5a3-46b8-bc4f-fe3266905e8c-kube-api-access-424vz\") pod \"neutron-operator-controller-manager-54688575f-5fcw8\" (UID: \"febfeb1a-d5a3-46b8-bc4f-fe3266905e8c\") " pod="openstack-operators/neutron-operator-controller-manager-54688575f-5fcw8" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.118630 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f85zm\" (UniqueName: \"kubernetes.io/projected/a9ff56ca-00a6-484f-a477-0dca4f3a0f5c-kube-api-access-f85zm\") pod \"nova-operator-controller-manager-74b6b5dc96-jwgfg\" (UID: \"a9ff56ca-00a6-484f-a477-0dca4f3a0f5c\") " pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-jwgfg" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.118722 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kg98n\" (UniqueName: \"kubernetes.io/projected/717ffc3a-7a6d-4a7c-837f-d1ed92489b68-kube-api-access-kg98n\") pod \"octavia-operator-controller-manager-5d86c7ddb7-n5zc7\" (UID: \"717ffc3a-7a6d-4a7c-837f-d1ed92489b68\") " pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-n5zc7" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.137349 4722 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-74b6b5dc96-jwgfg"] Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.143480 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zn9qd\" (UniqueName: \"kubernetes.io/projected/74cb981b-ce89-479e-8573-fdda25190637-kube-api-access-zn9qd\") pod \"mariadb-operator-controller-manager-7b6bfb6475-l8bds\" (UID: \"74cb981b-ce89-479e-8573-fdda25190637\") " pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-l8bds" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.158389 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-hnzfw" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.183437 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9ctn5qw"] Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.184650 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9ctn5qw" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.187257 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-6zmqr" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.188017 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.207088 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-2hxzr" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.222392 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-424vz\" (UniqueName: \"kubernetes.io/projected/febfeb1a-d5a3-46b8-bc4f-fe3266905e8c-kube-api-access-424vz\") pod \"neutron-operator-controller-manager-54688575f-5fcw8\" (UID: \"febfeb1a-d5a3-46b8-bc4f-fe3266905e8c\") " pod="openstack-operators/neutron-operator-controller-manager-54688575f-5fcw8" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.222456 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f85zm\" (UniqueName: \"kubernetes.io/projected/a9ff56ca-00a6-484f-a477-0dca4f3a0f5c-kube-api-access-f85zm\") pod \"nova-operator-controller-manager-74b6b5dc96-jwgfg\" (UID: \"a9ff56ca-00a6-484f-a477-0dca4f3a0f5c\") " pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-jwgfg" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.222488 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kg98n\" (UniqueName: \"kubernetes.io/projected/717ffc3a-7a6d-4a7c-837f-d1ed92489b68-kube-api-access-kg98n\") pod \"octavia-operator-controller-manager-5d86c7ddb7-n5zc7\" (UID: \"717ffc3a-7a6d-4a7c-837f-d1ed92489b68\") " pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-n5zc7" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.228228 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-75684d597f-hgkzs"] Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.229466 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-hgkzs" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.237566 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-s5vhj" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.238546 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9ctn5qw"] Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.248177 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f85zm\" (UniqueName: \"kubernetes.io/projected/a9ff56ca-00a6-484f-a477-0dca4f3a0f5c-kube-api-access-f85zm\") pod \"nova-operator-controller-manager-74b6b5dc96-jwgfg\" (UID: \"a9ff56ca-00a6-484f-a477-0dca4f3a0f5c\") " pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-jwgfg" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.248268 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-9b9ff9f4d-56qz9"] Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.251606 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-56qz9" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.253639 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kg98n\" (UniqueName: \"kubernetes.io/projected/717ffc3a-7a6d-4a7c-837f-d1ed92489b68-kube-api-access-kg98n\") pod \"octavia-operator-controller-manager-5d86c7ddb7-n5zc7\" (UID: \"717ffc3a-7a6d-4a7c-837f-d1ed92489b68\") " pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-n5zc7" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.255116 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-75684d597f-hgkzs"] Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.255479 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-6jnl4" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.263098 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-67d996989d-28855" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.263743 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-648564c9fc-wrzrq"] Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.265090 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-wrzrq" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.267878 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-rgs2d" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.270105 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-424vz\" (UniqueName: \"kubernetes.io/projected/febfeb1a-d5a3-46b8-bc4f-fe3266905e8c-kube-api-access-424vz\") pod \"neutron-operator-controller-manager-54688575f-5fcw8\" (UID: \"febfeb1a-d5a3-46b8-bc4f-fe3266905e8c\") " pod="openstack-operators/neutron-operator-controller-manager-54688575f-5fcw8" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.286482 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-l8bds" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.287636 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-9b9ff9f4d-56qz9"] Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.299797 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-66c8b7dfbb-m7fv2"] Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.307013 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-66c8b7dfbb-m7fv2" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.334231 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-jpq92" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.336320 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4ckm\" (UniqueName: \"kubernetes.io/projected/df8b52ff-f61e-4aca-a408-240590699ae6-kube-api-access-s4ckm\") pod \"ovn-operator-controller-manager-75684d597f-hgkzs\" (UID: \"df8b52ff-f61e-4aca-a408-240590699ae6\") " pod="openstack-operators/ovn-operator-controller-manager-75684d597f-hgkzs" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.349691 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7a62b98d-e9d4-4cbc-bea8-0da13fcc4467-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-lvfgg\" (UID: \"7a62b98d-e9d4-4cbc-bea8-0da13fcc4467\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-lvfgg" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.349848 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5e25c11b-f9c6-4542-9c0c-394ea6bc2c17-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9ctn5qw\" (UID: \"5e25c11b-f9c6-4542-9c0c-394ea6bc2c17\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9ctn5qw" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.349977 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66czw\" (UniqueName: \"kubernetes.io/projected/5e25c11b-f9c6-4542-9c0c-394ea6bc2c17-kube-api-access-66czw\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9ctn5qw\" (UID: 
\"5e25c11b-f9c6-4542-9c0c-394ea6bc2c17\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9ctn5qw" Mar 09 14:22:35 crc kubenswrapper[4722]: E0309 14:22:35.351770 4722 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 09 14:22:35 crc kubenswrapper[4722]: E0309 14:22:35.351848 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a62b98d-e9d4-4cbc-bea8-0da13fcc4467-cert podName:7a62b98d-e9d4-4cbc-bea8-0da13fcc4467 nodeName:}" failed. No retries permitted until 2026-03-09 14:22:36.351828807 +0000 UTC m=+1196.907397383 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7a62b98d-e9d4-4cbc-bea8-0da13fcc4467-cert") pod "infra-operator-controller-manager-f7fcc58b9-lvfgg" (UID: "7a62b98d-e9d4-4cbc-bea8-0da13fcc4467") : secret "infra-operator-webhook-server-cert" not found Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.353044 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-648564c9fc-wrzrq"] Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.364497 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-54688575f-5fcw8" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.375246 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-66c8b7dfbb-m7fv2"] Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.392017 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-jwgfg" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.426356 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-n5zc7" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.452598 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66czw\" (UniqueName: \"kubernetes.io/projected/5e25c11b-f9c6-4542-9c0c-394ea6bc2c17-kube-api-access-66czw\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9ctn5qw\" (UID: \"5e25c11b-f9c6-4542-9c0c-394ea6bc2c17\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9ctn5qw" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.452838 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25vsz\" (UniqueName: \"kubernetes.io/projected/e7b4c7c9-7c4f-4a13-8367-759f5f5ce368-kube-api-access-25vsz\") pod \"placement-operator-controller-manager-648564c9fc-wrzrq\" (UID: \"e7b4c7c9-7c4f-4a13-8367-759f5f5ce368\") " pod="openstack-operators/placement-operator-controller-manager-648564c9fc-wrzrq" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.452935 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzw5h\" (UniqueName: \"kubernetes.io/projected/a9df5689-5d83-4206-be2b-cf6877d70e23-kube-api-access-tzw5h\") pod \"swift-operator-controller-manager-9b9ff9f4d-56qz9\" (UID: \"a9df5689-5d83-4206-be2b-cf6877d70e23\") " pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-56qz9" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.453015 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4ckm\" (UniqueName: \"kubernetes.io/projected/df8b52ff-f61e-4aca-a408-240590699ae6-kube-api-access-s4ckm\") pod \"ovn-operator-controller-manager-75684d597f-hgkzs\" (UID: \"df8b52ff-f61e-4aca-a408-240590699ae6\") " pod="openstack-operators/ovn-operator-controller-manager-75684d597f-hgkzs" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.453147 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gs5j\" (UniqueName: \"kubernetes.io/projected/5bf14ad6-64cf-48f7-99e6-fabac12849e2-kube-api-access-5gs5j\") pod \"telemetry-operator-controller-manager-66c8b7dfbb-m7fv2\" (UID: \"5bf14ad6-64cf-48f7-99e6-fabac12849e2\") " pod="openstack-operators/telemetry-operator-controller-manager-66c8b7dfbb-m7fv2" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.459406 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5e25c11b-f9c6-4542-9c0c-394ea6bc2c17-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9ctn5qw\" (UID: \"5e25c11b-f9c6-4542-9c0c-394ea6bc2c17\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9ctn5qw" Mar 09 14:22:35 crc kubenswrapper[4722]: E0309 14:22:35.459708 4722 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 09 14:22:35 crc kubenswrapper[4722]: E0309 14:22:35.459942 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e25c11b-f9c6-4542-9c0c-394ea6bc2c17-cert podName:5e25c11b-f9c6-4542-9c0c-394ea6bc2c17 nodeName:}" failed. No retries permitted until 2026-03-09 14:22:35.959924669 +0000 UTC m=+1196.515493245 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5e25c11b-f9c6-4542-9c0c-394ea6bc2c17-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9ctn5qw" (UID: "5e25c11b-f9c6-4542-9c0c-394ea6bc2c17") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.477047 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4ckm\" (UniqueName: \"kubernetes.io/projected/df8b52ff-f61e-4aca-a408-240590699ae6-kube-api-access-s4ckm\") pod \"ovn-operator-controller-manager-75684d597f-hgkzs\" (UID: \"df8b52ff-f61e-4aca-a408-240590699ae6\") " pod="openstack-operators/ovn-operator-controller-manager-75684d597f-hgkzs" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.482702 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-55b5ff4dbb-pg8qn"] Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.483913 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-pg8qn" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.491261 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-b6mvx" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.491424 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-55b5ff4dbb-pg8qn"] Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.493971 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66czw\" (UniqueName: \"kubernetes.io/projected/5e25c11b-f9c6-4542-9c0c-394ea6bc2c17-kube-api-access-66czw\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9ctn5qw\" (UID: \"5e25c11b-f9c6-4542-9c0c-394ea6bc2c17\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9ctn5qw" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.507521 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-pkbqb"] Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.508955 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-pkbqb" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.513578 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-pkbqb"] Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.519623 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-vq4jw" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.536398 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-55cd86c56-dm2dr"] Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.538036 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-55cd86c56-dm2dr" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.540833 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.541009 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.541133 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-m8ctf" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.550172 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-55cd86c56-dm2dr"] Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.561683 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25vsz\" (UniqueName: \"kubernetes.io/projected/e7b4c7c9-7c4f-4a13-8367-759f5f5ce368-kube-api-access-25vsz\") pod \"placement-operator-controller-manager-648564c9fc-wrzrq\" (UID: \"e7b4c7c9-7c4f-4a13-8367-759f5f5ce368\") " pod="openstack-operators/placement-operator-controller-manager-648564c9fc-wrzrq" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.561733 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzw5h\" (UniqueName: \"kubernetes.io/projected/a9df5689-5d83-4206-be2b-cf6877d70e23-kube-api-access-tzw5h\") pod \"swift-operator-controller-manager-9b9ff9f4d-56qz9\" (UID: \"a9df5689-5d83-4206-be2b-cf6877d70e23\") " pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-56qz9" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.561922 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gs5j\" (UniqueName: \"kubernetes.io/projected/5bf14ad6-64cf-48f7-99e6-fabac12849e2-kube-api-access-5gs5j\") pod \"telemetry-operator-controller-manager-66c8b7dfbb-m7fv2\" (UID: \"5bf14ad6-64cf-48f7-99e6-fabac12849e2\") " pod="openstack-operators/telemetry-operator-controller-manager-66c8b7dfbb-m7fv2" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.561976 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2k8tw\" (UniqueName: \"kubernetes.io/projected/ef36bc5a-2962-4c1e-a5fd-98f61d525d5d-kube-api-access-2k8tw\") pod \"test-operator-controller-manager-55b5ff4dbb-pg8qn\" (UID: \"ef36bc5a-2962-4c1e-a5fd-98f61d525d5d\") " pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-pg8qn" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.562010 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98c22319-d5f8-4a0b-8a30-89b9d832f354-metrics-certs\") pod \"openstack-operator-controller-manager-55cd86c56-dm2dr\" (UID: \"98c22319-d5f8-4a0b-8a30-89b9d832f354\") " pod="openstack-operators/openstack-operator-controller-manager-55cd86c56-dm2dr" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.562047 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnc6f\" (UniqueName: \"kubernetes.io/projected/0eac7341-5bab-4c97-a730-b7eeb0a75899-kube-api-access-jnc6f\") pod \"watcher-operator-controller-manager-bccc79885-pkbqb\" (UID: 
\"0eac7341-5bab-4c97-a730-b7eeb0a75899\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-pkbqb" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.562077 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/98c22319-d5f8-4a0b-8a30-89b9d832f354-webhook-certs\") pod \"openstack-operator-controller-manager-55cd86c56-dm2dr\" (UID: \"98c22319-d5f8-4a0b-8a30-89b9d832f354\") " pod="openstack-operators/openstack-operator-controller-manager-55cd86c56-dm2dr" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.562100 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpgbw\" (UniqueName: \"kubernetes.io/projected/98c22319-d5f8-4a0b-8a30-89b9d832f354-kube-api-access-fpgbw\") pod \"openstack-operator-controller-manager-55cd86c56-dm2dr\" (UID: \"98c22319-d5f8-4a0b-8a30-89b9d832f354\") " pod="openstack-operators/openstack-operator-controller-manager-55cd86c56-dm2dr" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.562905 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xswl4"] Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.564247 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xswl4" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.568938 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-b54k2" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.571950 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xswl4"] Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.587122 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzw5h\" (UniqueName: \"kubernetes.io/projected/a9df5689-5d83-4206-be2b-cf6877d70e23-kube-api-access-tzw5h\") pod \"swift-operator-controller-manager-9b9ff9f4d-56qz9\" (UID: \"a9df5689-5d83-4206-be2b-cf6877d70e23\") " pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-56qz9" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.588244 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25vsz\" (UniqueName: \"kubernetes.io/projected/e7b4c7c9-7c4f-4a13-8367-759f5f5ce368-kube-api-access-25vsz\") pod \"placement-operator-controller-manager-648564c9fc-wrzrq\" (UID: \"e7b4c7c9-7c4f-4a13-8367-759f5f5ce368\") " pod="openstack-operators/placement-operator-controller-manager-648564c9fc-wrzrq" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.588986 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gs5j\" (UniqueName: \"kubernetes.io/projected/5bf14ad6-64cf-48f7-99e6-fabac12849e2-kube-api-access-5gs5j\") pod \"telemetry-operator-controller-manager-66c8b7dfbb-m7fv2\" (UID: \"5bf14ad6-64cf-48f7-99e6-fabac12849e2\") " pod="openstack-operators/telemetry-operator-controller-manager-66c8b7dfbb-m7fv2" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.643072 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-hgkzs" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.666358 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-wrzrq" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.668074 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnc6f\" (UniqueName: \"kubernetes.io/projected/0eac7341-5bab-4c97-a730-b7eeb0a75899-kube-api-access-jnc6f\") pod \"watcher-operator-controller-manager-bccc79885-pkbqb\" (UID: \"0eac7341-5bab-4c97-a730-b7eeb0a75899\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-pkbqb" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.668127 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k565v\" (UniqueName: \"kubernetes.io/projected/f9ff9b26-9d5a-4194-bab5-1b9fb5dee947-kube-api-access-k565v\") pod \"rabbitmq-cluster-operator-manager-668c99d594-xswl4\" (UID: \"f9ff9b26-9d5a-4194-bab5-1b9fb5dee947\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xswl4" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.668191 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/98c22319-d5f8-4a0b-8a30-89b9d832f354-webhook-certs\") pod \"openstack-operator-controller-manager-55cd86c56-dm2dr\" (UID: \"98c22319-d5f8-4a0b-8a30-89b9d832f354\") " pod="openstack-operators/openstack-operator-controller-manager-55cd86c56-dm2dr" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.668256 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpgbw\" (UniqueName: \"kubernetes.io/projected/98c22319-d5f8-4a0b-8a30-89b9d832f354-kube-api-access-fpgbw\") pod \"openstack-operator-controller-manager-55cd86c56-dm2dr\" (UID: \"98c22319-d5f8-4a0b-8a30-89b9d832f354\") " pod="openstack-operators/openstack-operator-controller-manager-55cd86c56-dm2dr" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.668556 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2k8tw\" (UniqueName: \"kubernetes.io/projected/ef36bc5a-2962-4c1e-a5fd-98f61d525d5d-kube-api-access-2k8tw\") pod \"test-operator-controller-manager-55b5ff4dbb-pg8qn\" (UID: \"ef36bc5a-2962-4c1e-a5fd-98f61d525d5d\") " pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-pg8qn" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.668641 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98c22319-d5f8-4a0b-8a30-89b9d832f354-metrics-certs\") pod \"openstack-operator-controller-manager-55cd86c56-dm2dr\" (UID: \"98c22319-d5f8-4a0b-8a30-89b9d832f354\") " pod="openstack-operators/openstack-operator-controller-manager-55cd86c56-dm2dr" Mar 09 14:22:35 crc kubenswrapper[4722]: E0309 14:22:35.668854 4722 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 09 14:22:35 crc kubenswrapper[4722]: E0309 14:22:35.668864 4722 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 09 14:22:35 crc kubenswrapper[4722]: E0309 14:22:35.668904 4722 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/98c22319-d5f8-4a0b-8a30-89b9d832f354-webhook-certs podName:98c22319-d5f8-4a0b-8a30-89b9d832f354 nodeName:}" failed. No retries permitted until 2026-03-09 14:22:36.168887169 +0000 UTC m=+1196.724455865 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/98c22319-d5f8-4a0b-8a30-89b9d832f354-webhook-certs") pod "openstack-operator-controller-manager-55cd86c56-dm2dr" (UID: "98c22319-d5f8-4a0b-8a30-89b9d832f354") : secret "webhook-server-cert" not found Mar 09 14:22:35 crc kubenswrapper[4722]: E0309 14:22:35.668923 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98c22319-d5f8-4a0b-8a30-89b9d832f354-metrics-certs podName:98c22319-d5f8-4a0b-8a30-89b9d832f354 nodeName:}" failed. No retries permitted until 2026-03-09 14:22:36.16891343 +0000 UTC m=+1196.724482016 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/98c22319-d5f8-4a0b-8a30-89b9d832f354-metrics-certs") pod "openstack-operator-controller-manager-55cd86c56-dm2dr" (UID: "98c22319-d5f8-4a0b-8a30-89b9d832f354") : secret "metrics-server-cert" not found Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.688262 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnc6f\" (UniqueName: \"kubernetes.io/projected/0eac7341-5bab-4c97-a730-b7eeb0a75899-kube-api-access-jnc6f\") pod \"watcher-operator-controller-manager-bccc79885-pkbqb\" (UID: \"0eac7341-5bab-4c97-a730-b7eeb0a75899\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-pkbqb" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.689579 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpgbw\" (UniqueName: \"kubernetes.io/projected/98c22319-d5f8-4a0b-8a30-89b9d832f354-kube-api-access-fpgbw\") pod \"openstack-operator-controller-manager-55cd86c56-dm2dr\" (UID: \"98c22319-d5f8-4a0b-8a30-89b9d832f354\") " pod="openstack-operators/openstack-operator-controller-manager-55cd86c56-dm2dr" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.690403 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2k8tw\" (UniqueName: \"kubernetes.io/projected/ef36bc5a-2962-4c1e-a5fd-98f61d525d5d-kube-api-access-2k8tw\") pod \"test-operator-controller-manager-55b5ff4dbb-pg8qn\" (UID: \"ef36bc5a-2962-4c1e-a5fd-98f61d525d5d\") " pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-pg8qn" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.704643 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-56qz9" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.719752 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-66c8b7dfbb-m7fv2" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.769821 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k565v\" (UniqueName: \"kubernetes.io/projected/f9ff9b26-9d5a-4194-bab5-1b9fb5dee947-kube-api-access-k565v\") pod \"rabbitmq-cluster-operator-manager-668c99d594-xswl4\" (UID: \"f9ff9b26-9d5a-4194-bab5-1b9fb5dee947\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xswl4" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.795044 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k565v\" (UniqueName: \"kubernetes.io/projected/f9ff9b26-9d5a-4194-bab5-1b9fb5dee947-kube-api-access-k565v\") pod \"rabbitmq-cluster-operator-manager-668c99d594-xswl4\" (UID: \"f9ff9b26-9d5a-4194-bab5-1b9fb5dee947\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xswl4" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.834727 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xswl4" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.950187 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-pg8qn" Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.973589 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5e25c11b-f9c6-4542-9c0c-394ea6bc2c17-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9ctn5qw\" (UID: \"5e25c11b-f9c6-4542-9c0c-394ea6bc2c17\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9ctn5qw" Mar 09 14:22:35 crc kubenswrapper[4722]: E0309 14:22:35.973860 4722 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 09 14:22:35 crc kubenswrapper[4722]: E0309 14:22:35.973923 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e25c11b-f9c6-4542-9c0c-394ea6bc2c17-cert podName:5e25c11b-f9c6-4542-9c0c-394ea6bc2c17 nodeName:}" failed. No retries permitted until 2026-03-09 14:22:36.97390447 +0000 UTC m=+1197.529473046 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5e25c11b-f9c6-4542-9c0c-394ea6bc2c17-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9ctn5qw" (UID: "5e25c11b-f9c6-4542-9c0c-394ea6bc2c17") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 09 14:22:35 crc kubenswrapper[4722]: I0309 14:22:35.977129 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-pkbqb" Mar 09 14:22:36 crc kubenswrapper[4722]: I0309 14:22:36.169790 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-rsd9l"] Mar 09 14:22:36 crc kubenswrapper[4722]: I0309 14:22:36.169832 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-cf99c678f-ct7x8"] Mar 09 14:22:36 crc kubenswrapper[4722]: I0309 14:22:36.175416 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-5d87c9d997-bmpgd"] Mar 09 14:22:36 crc kubenswrapper[4722]: W0309 14:22:36.178082 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf21c35ef_c8ea_4331_a747_44a62c6f2e74.slice/crio-db705a4322e4cf556e6fbb16261b771d67bb1ef9a5e6a4b22d578521d13a1dbb WatchSource:0}: Error finding container db705a4322e4cf556e6fbb16261b771d67bb1ef9a5e6a4b22d578521d13a1dbb: Status 404 returned error can't find the container with id db705a4322e4cf556e6fbb16261b771d67bb1ef9a5e6a4b22d578521d13a1dbb Mar 09 14:22:36 crc kubenswrapper[4722]: I0309 14:22:36.178306 4722 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 14:22:36 crc kubenswrapper[4722]: I0309 14:22:36.178586 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/98c22319-d5f8-4a0b-8a30-89b9d832f354-webhook-certs\") pod \"openstack-operator-controller-manager-55cd86c56-dm2dr\" (UID: \"98c22319-d5f8-4a0b-8a30-89b9d832f354\") " pod="openstack-operators/openstack-operator-controller-manager-55cd86c56-dm2dr" Mar 09 14:22:36 crc kubenswrapper[4722]: I0309 14:22:36.178734 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98c22319-d5f8-4a0b-8a30-89b9d832f354-metrics-certs\") pod \"openstack-operator-controller-manager-55cd86c56-dm2dr\" (UID: \"98c22319-d5f8-4a0b-8a30-89b9d832f354\") " pod="openstack-operators/openstack-operator-controller-manager-55cd86c56-dm2dr" Mar 09 14:22:36 crc kubenswrapper[4722]: E0309 14:22:36.179384 4722 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 09 14:22:36 crc kubenswrapper[4722]: E0309 14:22:36.179447 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98c22319-d5f8-4a0b-8a30-89b9d832f354-metrics-certs podName:98c22319-d5f8-4a0b-8a30-89b9d832f354 nodeName:}" failed. No retries permitted until 2026-03-09 14:22:37.179431239 +0000 UTC m=+1197.734999815 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/98c22319-d5f8-4a0b-8a30-89b9d832f354-metrics-certs") pod "openstack-operator-controller-manager-55cd86c56-dm2dr" (UID: "98c22319-d5f8-4a0b-8a30-89b9d832f354") : secret "metrics-server-cert" not found Mar 09 14:22:36 crc kubenswrapper[4722]: E0309 14:22:36.179390 4722 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 09 14:22:36 crc kubenswrapper[4722]: E0309 14:22:36.179476 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98c22319-d5f8-4a0b-8a30-89b9d832f354-webhook-certs podName:98c22319-d5f8-4a0b-8a30-89b9d832f354 nodeName:}" failed. No retries permitted until 2026-03-09 14:22:37.17947098 +0000 UTC m=+1197.735039556 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/98c22319-d5f8-4a0b-8a30-89b9d832f354-webhook-certs") pod "openstack-operator-controller-manager-55cd86c56-dm2dr" (UID: "98c22319-d5f8-4a0b-8a30-89b9d832f354") : secret "webhook-server-cert" not found Mar 09 14:22:36 crc kubenswrapper[4722]: W0309 14:22:36.187328 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1a5e35a_83f6_4886_86db_55738f51f7e8.slice/crio-9ec96490750612e1150861e4cb25a428950b9c31af5d4a7838cb38c0f3bd5673 WatchSource:0}: Error finding container 9ec96490750612e1150861e4cb25a428950b9c31af5d4a7838cb38c0f3bd5673: Status 404 returned error can't find the container with id 9ec96490750612e1150861e4cb25a428950b9c31af5d4a7838cb38c0f3bd5673 Mar 09 14:22:36 crc kubenswrapper[4722]: I0309 14:22:36.213792 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6db6876945-vppnv"] Mar 09 14:22:36 crc kubenswrapper[4722]: I0309 14:22:36.382016 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7a62b98d-e9d4-4cbc-bea8-0da13fcc4467-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-lvfgg\" (UID: \"7a62b98d-e9d4-4cbc-bea8-0da13fcc4467\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-lvfgg" Mar 09 14:22:36 crc kubenswrapper[4722]: E0309 14:22:36.382162 4722 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 09 14:22:36 crc kubenswrapper[4722]: E0309 14:22:36.382332 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a62b98d-e9d4-4cbc-bea8-0da13fcc4467-cert podName:7a62b98d-e9d4-4cbc-bea8-0da13fcc4467 nodeName:}" failed. No retries permitted until 2026-03-09 14:22:38.382310787 +0000 UTC m=+1198.937879423 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7a62b98d-e9d4-4cbc-bea8-0da13fcc4467-cert") pod "infra-operator-controller-manager-f7fcc58b9-lvfgg" (UID: "7a62b98d-e9d4-4cbc-bea8-0da13fcc4467") : secret "infra-operator-webhook-server-cert" not found Mar 09 14:22:36 crc kubenswrapper[4722]: I0309 14:22:36.453854 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7c789f89c6-2hxzr"] Mar 09 14:22:36 crc kubenswrapper[4722]: I0309 14:22:36.463714 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-64db6967f8-zjf7b"] Mar 09 14:22:36 crc kubenswrapper[4722]: W0309 14:22:36.463770 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4de6db14_6f3e_4c4e_a61d_39c6648209dd.slice/crio-ceeb38fe5a5daa5e0a412ee91bd74a49a76f17a50018dd83023f30dc2bb158cd WatchSource:0}: Error finding container ceeb38fe5a5daa5e0a412ee91bd74a49a76f17a50018dd83023f30dc2bb158cd: Status 404 returned error can't find the container with id ceeb38fe5a5daa5e0a412ee91bd74a49a76f17a50018dd83023f30dc2bb158cd Mar 09 14:22:36 crc kubenswrapper[4722]: I0309 14:22:36.471790 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-6npqv"] Mar 09 14:22:36 crc kubenswrapper[4722]: W0309 14:22:36.495402 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddae536b6_7a22_435e_b307_a8ab6b54779d.slice/crio-2f1221c1b8660cdcfbdf68e337adf5e6cb3616e12672aa5489aa21c670976377 WatchSource:0}: Error finding container 2f1221c1b8660cdcfbdf68e337adf5e6cb3616e12672aa5489aa21c670976377: Status 404 returned error can't find the container with id 2f1221c1b8660cdcfbdf68e337adf5e6cb3616e12672aa5489aa21c670976377 Mar 09 14:22:36 crc kubenswrapper[4722]: I0309 14:22:36.500356 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-28855"] Mar 09 14:22:36 crc kubenswrapper[4722]: I0309 14:22:36.912509 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-6npqv" event={"ID":"663f1719-30f7-4588-a183-4a59787e8d8d","Type":"ContainerStarted","Data":"236baebcd84c6ae71135e0d6e6b4443ea0935dfa5a2392ed782cfe32c030a622"} Mar 09 14:22:36 crc kubenswrapper[4722]: I0309 14:22:36.915114 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-67d996989d-28855" event={"ID":"dae536b6-7a22-435e-b307-a8ab6b54779d","Type":"ContainerStarted","Data":"2f1221c1b8660cdcfbdf68e337adf5e6cb3616e12672aa5489aa21c670976377"} Mar 09 14:22:36 crc kubenswrapper[4722]: I0309 14:22:36.935883 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-bmpgd" event={"ID":"a1a5e35a-83f6-4886-86db-55738f51f7e8","Type":"ContainerStarted","Data":"9ec96490750612e1150861e4cb25a428950b9c31af5d4a7838cb38c0f3bd5673"} Mar 09 14:22:36 crc kubenswrapper[4722]: I0309 14:22:36.951455 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-rsd9l" event={"ID":"edd71e1d-6ff0-4918-9cd8-a342efba2df5","Type":"ContainerStarted","Data":"c721be86063fe7f16163c12a3dfedd598e765156476a1cb13ae399a5048bdd88"} 
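The E-level entries above show kubelet's per-volume retry backoff at work: each failed MountVolume.SetUp for a webhook-cert Secret ("infra-operator-webhook-server-cert", "openstack-baremetal-operator-webhook-server-cert", "webhook-server-cert", "metrics-server-cert") is rescheduled with a doubled durationBeforeRetry (500ms, then 1s, then 2s), while pods whose volumes did mount proceed straight to ContainerStarted. The W-level manager.go:1169 "can't find the container" messages are typically a benign startup race between cadvisor's cgroup watcher and CRI-O: the pod cgroup appears before the container is registered, so the watch event 404s and is retried. Below is a minimal Go sketch of the observed backoff cadence -- an illustration of the pattern in the durationBeforeRetry values only, not kubelet's actual nestedpendingoperations implementation, which also caps the delay rather than doubling indefinitely:

    package main

    import (
        "fmt"
        "time"
    )

    // Reproduce the durationBeforeRetry progression seen in the log above:
    // 500ms -> 1s -> 2s, doubling after each consecutive mount failure.
    func main() {
        backoff := 500 * time.Millisecond
        for attempt := 1; attempt <= 4; attempt++ {
            fmt.Printf("attempt %d: durationBeforeRetry %v\n", attempt, backoff)
            backoff *= 2 // kubelet doubles the wait per failure (up to a cap)
        }
    }

Once the missing Secrets are published (here, presumably by the operator bundle or cert-manager shortly after these entries), the next retry mounts the "cert", "webhook-certs", and "metrics-certs" volumes and the affected controller-manager pods start like the others logged above.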
Mar 09 14:22:36 crc kubenswrapper[4722]: I0309 14:22:36.957544 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-ct7x8" event={"ID":"f21c35ef-c8ea-4331-a747-44a62c6f2e74","Type":"ContainerStarted","Data":"db705a4322e4cf556e6fbb16261b771d67bb1ef9a5e6a4b22d578521d13a1dbb"}
Mar 09 14:22:36 crc kubenswrapper[4722]: I0309 14:22:36.959131 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-2hxzr" event={"ID":"4de6db14-6f3e-4c4e-a61d-39c6648209dd","Type":"ContainerStarted","Data":"ceeb38fe5a5daa5e0a412ee91bd74a49a76f17a50018dd83023f30dc2bb158cd"}
Mar 09 14:22:36 crc kubenswrapper[4722]: I0309 14:22:36.960927 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-vppnv" event={"ID":"8f839106-1673-4589-9391-0cd7748e658c","Type":"ContainerStarted","Data":"f362e7b5256f2eb6bedf334093c14f50975bfdecfa51150b5229384e3c3203ff"}
Mar 09 14:22:36 crc kubenswrapper[4722]: I0309 14:22:36.976563 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-zjf7b" event={"ID":"22043c71-5292-422c-99e5-c88ea1aef638","Type":"ContainerStarted","Data":"02ce78043e9d689c13951941368781243332d74d9aca2b85f8141a1e8e379b72"}
Mar 09 14:22:37 crc kubenswrapper[4722]: I0309 14:22:37.005488 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5e25c11b-f9c6-4542-9c0c-394ea6bc2c17-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9ctn5qw\" (UID: \"5e25c11b-f9c6-4542-9c0c-394ea6bc2c17\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9ctn5qw"
Mar 09 14:22:37 crc kubenswrapper[4722]: E0309 14:22:37.005624 4722 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 09 14:22:37 crc kubenswrapper[4722]: E0309 14:22:37.005693 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e25c11b-f9c6-4542-9c0c-394ea6bc2c17-cert podName:5e25c11b-f9c6-4542-9c0c-394ea6bc2c17 nodeName:}" failed. No retries permitted until 2026-03-09 14:22:39.005673714 +0000 UTC m=+1199.561242300 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5e25c11b-f9c6-4542-9c0c-394ea6bc2c17-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9ctn5qw" (UID: "5e25c11b-f9c6-4542-9c0c-394ea6bc2c17") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 09 14:22:37 crc kubenswrapper[4722]: I0309 14:22:37.215578 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/98c22319-d5f8-4a0b-8a30-89b9d832f354-webhook-certs\") pod \"openstack-operator-controller-manager-55cd86c56-dm2dr\" (UID: \"98c22319-d5f8-4a0b-8a30-89b9d832f354\") " pod="openstack-operators/openstack-operator-controller-manager-55cd86c56-dm2dr"
Mar 09 14:22:37 crc kubenswrapper[4722]: I0309 14:22:37.215760 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98c22319-d5f8-4a0b-8a30-89b9d832f354-metrics-certs\") pod \"openstack-operator-controller-manager-55cd86c56-dm2dr\" (UID: \"98c22319-d5f8-4a0b-8a30-89b9d832f354\") " pod="openstack-operators/openstack-operator-controller-manager-55cd86c56-dm2dr"
Mar 09 14:22:37 crc kubenswrapper[4722]: E0309 14:22:37.215771 4722 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Mar 09 14:22:37 crc kubenswrapper[4722]: E0309 14:22:37.215840 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98c22319-d5f8-4a0b-8a30-89b9d832f354-webhook-certs podName:98c22319-d5f8-4a0b-8a30-89b9d832f354 nodeName:}" failed. No retries permitted until 2026-03-09 14:22:39.215820266 +0000 UTC m=+1199.771388842 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/98c22319-d5f8-4a0b-8a30-89b9d832f354-webhook-certs") pod "openstack-operator-controller-manager-55cd86c56-dm2dr" (UID: "98c22319-d5f8-4a0b-8a30-89b9d832f354") : secret "webhook-server-cert" not found
Mar 09 14:22:37 crc kubenswrapper[4722]: E0309 14:22:37.215857 4722 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Mar 09 14:22:37 crc kubenswrapper[4722]: E0309 14:22:37.215883 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98c22319-d5f8-4a0b-8a30-89b9d832f354-metrics-certs podName:98c22319-d5f8-4a0b-8a30-89b9d832f354 nodeName:}" failed. No retries permitted until 2026-03-09 14:22:39.215875198 +0000 UTC m=+1199.771443774 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/98c22319-d5f8-4a0b-8a30-89b9d832f354-metrics-certs") pod "openstack-operator-controller-manager-55cd86c56-dm2dr" (UID: "98c22319-d5f8-4a0b-8a30-89b9d832f354") : secret "metrics-server-cert" not found
Mar 09 14:22:37 crc kubenswrapper[4722]: I0309 14:22:37.285279 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-75684d597f-hgkzs"]
Mar 09 14:22:37 crc kubenswrapper[4722]: W0309 14:22:37.286419 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfebfeb1a_d5a3_46b8_bc4f_fe3266905e8c.slice/crio-2a4702925273ba54ea33817f34e5a6a787935975d44cbff9597776b983f28167 WatchSource:0}: Error finding container 2a4702925273ba54ea33817f34e5a6a787935975d44cbff9597776b983f28167: Status 404 returned error can't find the container with id 2a4702925273ba54ea33817f34e5a6a787935975d44cbff9597776b983f28167
Mar 09 14:22:37 crc kubenswrapper[4722]: I0309 14:22:37.309802 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54688575f-5fcw8"]
Mar 09 14:22:37 crc kubenswrapper[4722]: I0309 14:22:37.325440 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-74b6b5dc96-jwgfg"]
Mar 09 14:22:37 crc kubenswrapper[4722]: I0309 14:22:37.354163 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-545456dc4-hnzfw"]
Mar 09 14:22:37 crc kubenswrapper[4722]: I0309 14:22:37.362539 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-n5zc7"]
Mar 09 14:22:37 crc kubenswrapper[4722]: I0309 14:22:37.370813 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-66c8b7dfbb-m7fv2"]
Mar 09 14:22:37 crc kubenswrapper[4722]: W0309 14:22:37.373666 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec9f1f5e_26f5_4683_bf41_c85981da9d18.slice/crio-07fd2495f2b5464f8cfe51b890570deda103bc9ffcaa1d38ec361a20f7037514 WatchSource:0}: Error finding container 07fd2495f2b5464f8cfe51b890570deda103bc9ffcaa1d38ec361a20f7037514: Status 404 returned error can't find the container with id 07fd2495f2b5464f8cfe51b890570deda103bc9ffcaa1d38ec361a20f7037514
Mar 09 14:22:37 crc kubenswrapper[4722]: I0309 14:22:37.386700 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-648564c9fc-wrzrq"]
Mar 09 14:22:37 crc kubenswrapper[4722]: I0309 14:22:37.403783 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-l8bds"]
Mar 09 14:22:37 crc kubenswrapper[4722]: I0309 14:22:37.587157 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xswl4"]
Mar 09 14:22:37 crc kubenswrapper[4722]: W0309 14:22:37.595409 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9ff9b26_9d5a_4194_bab5_1b9fb5dee947.slice/crio-ba659e5b8cb4e566ebb75fdbc8b2e53ae4450310d3833ad50a9ccc5dfa0249f2 WatchSource:0}: Error finding container ba659e5b8cb4e566ebb75fdbc8b2e53ae4450310d3833ad50a9ccc5dfa0249f2: Status 404 returned error can't find the container with id ba659e5b8cb4e566ebb75fdbc8b2e53ae4450310d3833ad50a9ccc5dfa0249f2
Mar 09 14:22:37 crc kubenswrapper[4722]: I0309 14:22:37.611771 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-pkbqb"]
Mar 09 14:22:37 crc kubenswrapper[4722]: E0309 14:22:37.629040 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:06311600a491c689493552e7ff26e36df740fa4e7c143fca874bef19f24afb97,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jnc6f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-bccc79885-pkbqb_openstack-operators(0eac7341-5bab-4c97-a730-b7eeb0a75899): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Mar 09 14:22:37 crc kubenswrapper[4722]: E0309 14:22:37.630256 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-pkbqb" podUID="0eac7341-5bab-4c97-a730-b7eeb0a75899"
Mar 09 14:22:37 crc kubenswrapper[4722]: I0309 14:22:37.633792 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-9b9ff9f4d-56qz9"]
Mar 09 14:22:37 crc kubenswrapper[4722]: I0309 14:22:37.644256 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-55b5ff4dbb-pg8qn"]
Mar 09 14:22:37 crc kubenswrapper[4722]: W0309 14:22:37.658179 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9df5689_5d83_4206_be2b_cf6877d70e23.slice/crio-3335c40450b7c70f964bfa421d012b1addca18802ca0d76a2300887ac391bb1e WatchSource:0}: Error finding container 3335c40450b7c70f964bfa421d012b1addca18802ca0d76a2300887ac391bb1e: Status 404 returned error can't find the container with id 3335c40450b7c70f964bfa421d012b1addca18802ca0d76a2300887ac391bb1e
Mar 09 14:22:37 crc kubenswrapper[4722]: E0309 14:22:37.662730 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:f309cdea8084a4b1e8cbcd732d6e250fd93c55cfd1b48ba9026907c8591faab7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tzw5h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-9b9ff9f4d-56qz9_openstack-operators(a9df5689-5d83-4206-be2b-cf6877d70e23): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Mar 09 14:22:37 crc kubenswrapper[4722]: E0309 14:22:37.664068 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-56qz9" podUID="a9df5689-5d83-4206-be2b-cf6877d70e23"
Mar 09 14:22:37 crc kubenswrapper[4722]: W0309 14:22:37.666541 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef36bc5a_2962_4c1e_a5fd_98f61d525d5d.slice/crio-2c4e2922b37fe66c10767227ab87fe381dd86feb3e45899c6fb95111c8f4cba9 WatchSource:0}: Error finding container 2c4e2922b37fe66c10767227ab87fe381dd86feb3e45899c6fb95111c8f4cba9: Status 404 returned error can't find the container with id 2c4e2922b37fe66c10767227ab87fe381dd86feb3e45899c6fb95111c8f4cba9
Mar 09 14:22:37 crc kubenswrapper[4722]: E0309 14:22:37.673327 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:9d03f03aa9a460f1fcac8875064808c03e4ecd0388873bbfb9c7dc58331f3968,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2k8tw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-55b5ff4dbb-pg8qn_openstack-operators(ef36bc5a-2962-4c1e-a5fd-98f61d525d5d): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Mar 09 14:22:37 crc kubenswrapper[4722]: E0309 14:22:37.675225 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-pg8qn" podUID="ef36bc5a-2962-4c1e-a5fd-98f61d525d5d"
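[Annotation] The "pull QPS exceeded" failures above are the kubelet throttling its own image pulls, not the registry refusing them: pulls beyond the configured queries-per-second budget fail immediately with ErrImagePull and are retried later under back-off. Conceptually this is a token bucket; a minimal Python sketch under assumed values (the rate and burst here are illustrative stand-ins for the kubelet's registryPullQPS/registryBurst settings, not confirmed defaults):

    import time

    class PullLimiter:
        """Toy token bucket; rate/burst are illustrative, not kubelet constants."""
        def __init__(self, rate=5.0, burst=10):
            self.rate, self.burst = rate, float(burst)
            self.tokens = self.burst
            self.last = time.monotonic()

        def try_pull(self):
            now = time.monotonic()
            # Refill tokens for the elapsed time, capped at the burst size.
            self.tokens = min(self.burst, self.tokens + (now - self.last) * self.rate)
            self.last = now
            if self.tokens >= 1.0:
                self.tokens -= 1.0
                return True
            return False  # caller would surface this as "pull QPS exceeded"

    limiter = PullLimiter()
    print([limiter.try_pull() for _ in range(12)])  # initial burst succeeds, the rest are throttled

This explains why many operator pods launched in the same second fail at once and then recover on retry without any change to the cluster.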
pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-56qz9" event={"ID":"a9df5689-5d83-4206-be2b-cf6877d70e23","Type":"ContainerStarted","Data":"3335c40450b7c70f964bfa421d012b1addca18802ca0d76a2300887ac391bb1e"} Mar 09 14:22:38 crc kubenswrapper[4722]: E0309 14:22:38.008260 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:f309cdea8084a4b1e8cbcd732d6e250fd93c55cfd1b48ba9026907c8591faab7\\\"\"" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-56qz9" podUID="a9df5689-5d83-4206-be2b-cf6877d70e23" Mar 09 14:22:38 crc kubenswrapper[4722]: I0309 14:22:38.014744 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-hgkzs" event={"ID":"df8b52ff-f61e-4aca-a408-240590699ae6","Type":"ContainerStarted","Data":"85dd8f27a9d5f00be13382ac8d5f87117e5b79e9612dc210216ab4ecb0a0ee62"} Mar 09 14:22:38 crc kubenswrapper[4722]: I0309 14:22:38.030876 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-jwgfg" event={"ID":"a9ff56ca-00a6-484f-a477-0dca4f3a0f5c","Type":"ContainerStarted","Data":"59950bd6cc7d383aa981ce9139740e4d608bd9f4440bfdd805dc3efbed9de500"} Mar 09 14:22:38 crc kubenswrapper[4722]: I0309 14:22:38.049041 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-wrzrq" event={"ID":"e7b4c7c9-7c4f-4a13-8367-759f5f5ce368","Type":"ContainerStarted","Data":"e869f42a935e9b019dcd8de015f440173be9c62956cd575cca8d6c4001e3bc35"} Mar 09 14:22:38 crc kubenswrapper[4722]: I0309 14:22:38.050085 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-54688575f-5fcw8" event={"ID":"febfeb1a-d5a3-46b8-bc4f-fe3266905e8c","Type":"ContainerStarted","Data":"2a4702925273ba54ea33817f34e5a6a787935975d44cbff9597776b983f28167"} Mar 09 14:22:38 crc kubenswrapper[4722]: I0309 14:22:38.058967 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-hnzfw" event={"ID":"ec9f1f5e-26f5-4683-bf41-c85981da9d18","Type":"ContainerStarted","Data":"07fd2495f2b5464f8cfe51b890570deda103bc9ffcaa1d38ec361a20f7037514"} Mar 09 14:22:38 crc kubenswrapper[4722]: I0309 14:22:38.061622 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-n5zc7" event={"ID":"717ffc3a-7a6d-4a7c-837f-d1ed92489b68","Type":"ContainerStarted","Data":"0db50b0c23da989fb8ba8ab09a7f1f74a8949fa4001cf53efa94595f9ec7dba2"} Mar 09 14:22:38 crc kubenswrapper[4722]: I0309 14:22:38.063631 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xswl4" event={"ID":"f9ff9b26-9d5a-4194-bab5-1b9fb5dee947","Type":"ContainerStarted","Data":"ba659e5b8cb4e566ebb75fdbc8b2e53ae4450310d3833ad50a9ccc5dfa0249f2"} Mar 09 14:22:38 crc kubenswrapper[4722]: I0309 14:22:38.069393 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-pg8qn" event={"ID":"ef36bc5a-2962-4c1e-a5fd-98f61d525d5d","Type":"ContainerStarted","Data":"2c4e2922b37fe66c10767227ab87fe381dd86feb3e45899c6fb95111c8f4cba9"} Mar 09 14:22:38 crc kubenswrapper[4722]: I0309 14:22:38.071628 4722 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-l8bds" event={"ID":"74cb981b-ce89-479e-8573-fdda25190637","Type":"ContainerStarted","Data":"dd29ecc9798e2265054405609662acdf7237cb3584b07da01f03cf8834f7fb78"} Mar 09 14:22:38 crc kubenswrapper[4722]: E0309 14:22:38.072570 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:9d03f03aa9a460f1fcac8875064808c03e4ecd0388873bbfb9c7dc58331f3968\\\"\"" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-pg8qn" podUID="ef36bc5a-2962-4c1e-a5fd-98f61d525d5d" Mar 09 14:22:38 crc kubenswrapper[4722]: I0309 14:22:38.083119 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-66c8b7dfbb-m7fv2" event={"ID":"5bf14ad6-64cf-48f7-99e6-fabac12849e2","Type":"ContainerStarted","Data":"fd04320ff25900162d1a2edcf4095eafa64f636ee5421d56ba76e78bf2292c7e"} Mar 09 14:22:38 crc kubenswrapper[4722]: I0309 14:22:38.084406 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-pkbqb" event={"ID":"0eac7341-5bab-4c97-a730-b7eeb0a75899","Type":"ContainerStarted","Data":"8310345bfb0c3258b54aed3c66e16096e3d64586bc45ea2d1907207f9aa70258"} Mar 09 14:22:38 crc kubenswrapper[4722]: E0309 14:22:38.085622 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:06311600a491c689493552e7ff26e36df740fa4e7c143fca874bef19f24afb97\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-pkbqb" podUID="0eac7341-5bab-4c97-a730-b7eeb0a75899" Mar 09 14:22:38 crc kubenswrapper[4722]: I0309 14:22:38.441010 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7a62b98d-e9d4-4cbc-bea8-0da13fcc4467-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-lvfgg\" (UID: \"7a62b98d-e9d4-4cbc-bea8-0da13fcc4467\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-lvfgg" Mar 09 14:22:38 crc kubenswrapper[4722]: E0309 14:22:38.441216 4722 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 09 14:22:38 crc kubenswrapper[4722]: E0309 14:22:38.441467 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a62b98d-e9d4-4cbc-bea8-0da13fcc4467-cert podName:7a62b98d-e9d4-4cbc-bea8-0da13fcc4467 nodeName:}" failed. No retries permitted until 2026-03-09 14:22:42.441431187 +0000 UTC m=+1202.996999823 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7a62b98d-e9d4-4cbc-bea8-0da13fcc4467-cert") pod "infra-operator-controller-manager-f7fcc58b9-lvfgg" (UID: "7a62b98d-e9d4-4cbc-bea8-0da13fcc4467") : secret "infra-operator-webhook-server-cert" not found Mar 09 14:22:39 crc kubenswrapper[4722]: I0309 14:22:39.065311 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5e25c11b-f9c6-4542-9c0c-394ea6bc2c17-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9ctn5qw\" (UID: \"5e25c11b-f9c6-4542-9c0c-394ea6bc2c17\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9ctn5qw" Mar 09 14:22:39 crc kubenswrapper[4722]: E0309 14:22:39.065506 4722 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 09 14:22:39 crc kubenswrapper[4722]: E0309 14:22:39.065553 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e25c11b-f9c6-4542-9c0c-394ea6bc2c17-cert podName:5e25c11b-f9c6-4542-9c0c-394ea6bc2c17 nodeName:}" failed. No retries permitted until 2026-03-09 14:22:43.065538154 +0000 UTC m=+1203.621106730 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5e25c11b-f9c6-4542-9c0c-394ea6bc2c17-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9ctn5qw" (UID: "5e25c11b-f9c6-4542-9c0c-394ea6bc2c17") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 09 14:22:39 crc kubenswrapper[4722]: E0309 14:22:39.102801 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:f309cdea8084a4b1e8cbcd732d6e250fd93c55cfd1b48ba9026907c8591faab7\\\"\"" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-56qz9" podUID="a9df5689-5d83-4206-be2b-cf6877d70e23" Mar 09 14:22:39 crc kubenswrapper[4722]: E0309 14:22:39.102899 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:06311600a491c689493552e7ff26e36df740fa4e7c143fca874bef19f24afb97\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-pkbqb" podUID="0eac7341-5bab-4c97-a730-b7eeb0a75899" Mar 09 14:22:39 crc kubenswrapper[4722]: E0309 14:22:39.103289 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:9d03f03aa9a460f1fcac8875064808c03e4ecd0388873bbfb9c7dc58331f3968\\\"\"" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-pg8qn" podUID="ef36bc5a-2962-4c1e-a5fd-98f61d525d5d" Mar 09 14:22:39 crc kubenswrapper[4722]: I0309 14:22:39.267926 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98c22319-d5f8-4a0b-8a30-89b9d832f354-metrics-certs\") pod \"openstack-operator-controller-manager-55cd86c56-dm2dr\" (UID: \"98c22319-d5f8-4a0b-8a30-89b9d832f354\") " pod="openstack-operators/openstack-operator-controller-manager-55cd86c56-dm2dr" Mar 09 14:22:39 crc kubenswrapper[4722]: 
I0309 14:22:39.268012 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/98c22319-d5f8-4a0b-8a30-89b9d832f354-webhook-certs\") pod \"openstack-operator-controller-manager-55cd86c56-dm2dr\" (UID: \"98c22319-d5f8-4a0b-8a30-89b9d832f354\") " pod="openstack-operators/openstack-operator-controller-manager-55cd86c56-dm2dr" Mar 09 14:22:39 crc kubenswrapper[4722]: E0309 14:22:39.268593 4722 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 09 14:22:39 crc kubenswrapper[4722]: E0309 14:22:39.268850 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98c22319-d5f8-4a0b-8a30-89b9d832f354-webhook-certs podName:98c22319-d5f8-4a0b-8a30-89b9d832f354 nodeName:}" failed. No retries permitted until 2026-03-09 14:22:43.268639628 +0000 UTC m=+1203.824208194 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/98c22319-d5f8-4a0b-8a30-89b9d832f354-webhook-certs") pod "openstack-operator-controller-manager-55cd86c56-dm2dr" (UID: "98c22319-d5f8-4a0b-8a30-89b9d832f354") : secret "webhook-server-cert" not found Mar 09 14:22:39 crc kubenswrapper[4722]: E0309 14:22:39.269429 4722 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 09 14:22:39 crc kubenswrapper[4722]: E0309 14:22:39.269461 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98c22319-d5f8-4a0b-8a30-89b9d832f354-metrics-certs podName:98c22319-d5f8-4a0b-8a30-89b9d832f354 nodeName:}" failed. No retries permitted until 2026-03-09 14:22:43.269453629 +0000 UTC m=+1203.825022205 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/98c22319-d5f8-4a0b-8a30-89b9d832f354-metrics-certs") pod "openstack-operator-controller-manager-55cd86c56-dm2dr" (UID: "98c22319-d5f8-4a0b-8a30-89b9d832f354") : secret "metrics-server-cert" not found Mar 09 14:22:42 crc kubenswrapper[4722]: I0309 14:22:42.447284 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7a62b98d-e9d4-4cbc-bea8-0da13fcc4467-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-lvfgg\" (UID: \"7a62b98d-e9d4-4cbc-bea8-0da13fcc4467\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-lvfgg" Mar 09 14:22:42 crc kubenswrapper[4722]: E0309 14:22:42.447428 4722 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 09 14:22:42 crc kubenswrapper[4722]: E0309 14:22:42.447752 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a62b98d-e9d4-4cbc-bea8-0da13fcc4467-cert podName:7a62b98d-e9d4-4cbc-bea8-0da13fcc4467 nodeName:}" failed. No retries permitted until 2026-03-09 14:22:50.447732862 +0000 UTC m=+1211.003301438 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7a62b98d-e9d4-4cbc-bea8-0da13fcc4467-cert") pod "infra-operator-controller-manager-f7fcc58b9-lvfgg" (UID: "7a62b98d-e9d4-4cbc-bea8-0da13fcc4467") : secret "infra-operator-webhook-server-cert" not found Mar 09 14:22:43 crc kubenswrapper[4722]: I0309 14:22:43.073336 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5e25c11b-f9c6-4542-9c0c-394ea6bc2c17-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9ctn5qw\" (UID: \"5e25c11b-f9c6-4542-9c0c-394ea6bc2c17\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9ctn5qw" Mar 09 14:22:43 crc kubenswrapper[4722]: E0309 14:22:43.073545 4722 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 09 14:22:43 crc kubenswrapper[4722]: E0309 14:22:43.073610 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e25c11b-f9c6-4542-9c0c-394ea6bc2c17-cert podName:5e25c11b-f9c6-4542-9c0c-394ea6bc2c17 nodeName:}" failed. No retries permitted until 2026-03-09 14:22:51.073588616 +0000 UTC m=+1211.629157192 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5e25c11b-f9c6-4542-9c0c-394ea6bc2c17-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9ctn5qw" (UID: "5e25c11b-f9c6-4542-9c0c-394ea6bc2c17") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 09 14:22:43 crc kubenswrapper[4722]: I0309 14:22:43.276860 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98c22319-d5f8-4a0b-8a30-89b9d832f354-metrics-certs\") pod \"openstack-operator-controller-manager-55cd86c56-dm2dr\" (UID: \"98c22319-d5f8-4a0b-8a30-89b9d832f354\") " pod="openstack-operators/openstack-operator-controller-manager-55cd86c56-dm2dr" Mar 09 14:22:43 crc kubenswrapper[4722]: I0309 14:22:43.277429 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/98c22319-d5f8-4a0b-8a30-89b9d832f354-webhook-certs\") pod \"openstack-operator-controller-manager-55cd86c56-dm2dr\" (UID: \"98c22319-d5f8-4a0b-8a30-89b9d832f354\") " pod="openstack-operators/openstack-operator-controller-manager-55cd86c56-dm2dr" Mar 09 14:22:43 crc kubenswrapper[4722]: E0309 14:22:43.277286 4722 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 09 14:22:43 crc kubenswrapper[4722]: E0309 14:22:43.277517 4722 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 09 14:22:43 crc kubenswrapper[4722]: E0309 14:22:43.277761 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98c22319-d5f8-4a0b-8a30-89b9d832f354-webhook-certs podName:98c22319-d5f8-4a0b-8a30-89b9d832f354 nodeName:}" failed. No retries permitted until 2026-03-09 14:22:51.277695927 +0000 UTC m=+1211.833264503 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/98c22319-d5f8-4a0b-8a30-89b9d832f354-webhook-certs") pod "openstack-operator-controller-manager-55cd86c56-dm2dr" (UID: "98c22319-d5f8-4a0b-8a30-89b9d832f354") : secret "webhook-server-cert" not found Mar 09 14:22:43 crc kubenswrapper[4722]: E0309 14:22:43.278047 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98c22319-d5f8-4a0b-8a30-89b9d832f354-metrics-certs podName:98c22319-d5f8-4a0b-8a30-89b9d832f354 nodeName:}" failed. No retries permitted until 2026-03-09 14:22:51.278032756 +0000 UTC m=+1211.833601332 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/98c22319-d5f8-4a0b-8a30-89b9d832f354-metrics-certs") pod "openstack-operator-controller-manager-55cd86c56-dm2dr" (UID: "98c22319-d5f8-4a0b-8a30-89b9d832f354") : secret "metrics-server-cert" not found Mar 09 14:22:50 crc kubenswrapper[4722]: E0309 14:22:50.094609 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/cinder-operator@sha256:7961c67cfc87de69055f8330771af625f73d857426c4bb17ebb888ead843fff3" Mar 09 14:22:50 crc kubenswrapper[4722]: E0309 14:22:50.095296 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:7961c67cfc87de69055f8330771af625f73d857426c4bb17ebb888ead843fff3,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fvwd8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
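[Annotation] The durationBeforeRetry values for the failing cert mounts double on every attempt: 1s and 2s earlier in the log, then the 4s and 8s retries above. That is an exponential back-off on failed volume operations; a minimal Python sketch of the policy (the initial delay, factor, and cap are illustrative assumptions, not the kubelet's exact constants):

    def backoff_delays(initial=1.0, factor=2.0, cap=300.0):
        """Yield capped, exponentially growing retry delays (illustrative constants)."""
        delay = initial
        while True:
            yield delay
            delay = min(delay * factor, cap)

    gen = backoff_delays()
    print([next(gen) for _ in range(4)])  # [1.0, 2.0, 4.0, 8.0], matching the retries in the log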
Mar 09 14:22:50 crc kubenswrapper[4722]: E0309 14:22:50.094609 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/cinder-operator@sha256:7961c67cfc87de69055f8330771af625f73d857426c4bb17ebb888ead843fff3"
Mar 09 14:22:50 crc kubenswrapper[4722]: E0309 14:22:50.095296 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:7961c67cfc87de69055f8330771af625f73d857426c4bb17ebb888ead843fff3,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fvwd8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-55d77d7b5c-rsd9l_openstack-operators(edd71e1d-6ff0-4918-9cd8-a342efba2df5): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 09 14:22:50 crc kubenswrapper[4722]: E0309 14:22:50.096820 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-rsd9l" podUID="edd71e1d-6ff0-4918-9cd8-a342efba2df5"
Mar 09 14:22:50 crc kubenswrapper[4722]: E0309 14:22:50.192738 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/cinder-operator@sha256:7961c67cfc87de69055f8330771af625f73d857426c4bb17ebb888ead843fff3\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-rsd9l" podUID="edd71e1d-6ff0-4918-9cd8-a342efba2df5"
Mar 09 14:22:50 crc kubenswrapper[4722]: I0309 14:22:50.512834 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7a62b98d-e9d4-4cbc-bea8-0da13fcc4467-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-lvfgg\" (UID: \"7a62b98d-e9d4-4cbc-bea8-0da13fcc4467\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-lvfgg"
Mar 09 14:22:50 crc kubenswrapper[4722]: I0309 14:22:50.518485 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7a62b98d-e9d4-4cbc-bea8-0da13fcc4467-cert\") pod \"infra-operator-controller-manager-f7fcc58b9-lvfgg\" (UID: \"7a62b98d-e9d4-4cbc-bea8-0da13fcc4467\") " pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-lvfgg"
Mar 09 14:22:50 crc kubenswrapper[4722]: E0309 14:22:50.721803 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:5592ec4a6fbe2c832d1828b51af0b907e5d733d478b6f378a9b2f6d6cf0ac505"
Mar 09 14:22:50 crc kubenswrapper[4722]: E0309 14:22:50.722015 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:5592ec4a6fbe2c832d1828b51af0b907e5d733d478b6f378a9b2f6d6cf0ac505,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zn9qd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-7b6bfb6475-l8bds_openstack-operators(74cb981b-ce89-479e-8573-fdda25190637): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 09 14:22:50 crc kubenswrapper[4722]: E0309 14:22:50.723268 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-l8bds" podUID="74cb981b-ce89-479e-8573-fdda25190637"
Mar 09 14:22:50 crc kubenswrapper[4722]: I0309 14:22:50.727444 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-sdlvx"
Mar 09 14:22:50 crc kubenswrapper[4722]: I0309 14:22:50.735148 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-lvfgg"
Mar 09 14:22:51 crc kubenswrapper[4722]: I0309 14:22:51.129617 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5e25c11b-f9c6-4542-9c0c-394ea6bc2c17-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9ctn5qw\" (UID: \"5e25c11b-f9c6-4542-9c0c-394ea6bc2c17\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9ctn5qw"
Mar 09 14:22:51 crc kubenswrapper[4722]: I0309 14:22:51.149320 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5e25c11b-f9c6-4542-9c0c-394ea6bc2c17-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9ctn5qw\" (UID: \"5e25c11b-f9c6-4542-9c0c-394ea6bc2c17\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9ctn5qw"
Mar 09 14:22:51 crc kubenswrapper[4722]: I0309 14:22:51.197069 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-6zmqr"
Mar 09 14:22:51 crc kubenswrapper[4722]: E0309 14:22:51.197659 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:5592ec4a6fbe2c832d1828b51af0b907e5d733d478b6f378a9b2f6d6cf0ac505\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-l8bds" podUID="74cb981b-ce89-479e-8573-fdda25190637"
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9ctn5qw" Mar 09 14:22:51 crc kubenswrapper[4722]: I0309 14:22:51.333052 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98c22319-d5f8-4a0b-8a30-89b9d832f354-metrics-certs\") pod \"openstack-operator-controller-manager-55cd86c56-dm2dr\" (UID: \"98c22319-d5f8-4a0b-8a30-89b9d832f354\") " pod="openstack-operators/openstack-operator-controller-manager-55cd86c56-dm2dr" Mar 09 14:22:51 crc kubenswrapper[4722]: I0309 14:22:51.333158 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/98c22319-d5f8-4a0b-8a30-89b9d832f354-webhook-certs\") pod \"openstack-operator-controller-manager-55cd86c56-dm2dr\" (UID: \"98c22319-d5f8-4a0b-8a30-89b9d832f354\") " pod="openstack-operators/openstack-operator-controller-manager-55cd86c56-dm2dr" Mar 09 14:22:51 crc kubenswrapper[4722]: I0309 14:22:51.337490 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98c22319-d5f8-4a0b-8a30-89b9d832f354-metrics-certs\") pod \"openstack-operator-controller-manager-55cd86c56-dm2dr\" (UID: \"98c22319-d5f8-4a0b-8a30-89b9d832f354\") " pod="openstack-operators/openstack-operator-controller-manager-55cd86c56-dm2dr" Mar 09 14:22:51 crc kubenswrapper[4722]: I0309 14:22:51.337761 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/98c22319-d5f8-4a0b-8a30-89b9d832f354-webhook-certs\") pod \"openstack-operator-controller-manager-55cd86c56-dm2dr\" (UID: \"98c22319-d5f8-4a0b-8a30-89b9d832f354\") " pod="openstack-operators/openstack-operator-controller-manager-55cd86c56-dm2dr" Mar 09 14:22:51 crc kubenswrapper[4722]: I0309 14:22:51.422319 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-m8ctf" Mar 09 14:22:51 crc kubenswrapper[4722]: I0309 14:22:51.431860 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-55cd86c56-dm2dr" Mar 09 14:22:51 crc kubenswrapper[4722]: I0309 14:22:51.528085 4722 patch_prober.go:28] interesting pod/machine-config-daemon-hjrrb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 14:22:51 crc kubenswrapper[4722]: I0309 14:22:51.528159 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 14:22:51 crc kubenswrapper[4722]: I0309 14:22:51.528249 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" Mar 09 14:22:51 crc kubenswrapper[4722]: I0309 14:22:51.528968 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"61f8d88e021e1998adcf282dcc3a5969939b9f1d00069284614000e527956e5e"} pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 14:22:51 crc kubenswrapper[4722]: I0309 14:22:51.529028 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerName="machine-config-daemon" containerID="cri-o://61f8d88e021e1998adcf282dcc3a5969939b9f1d00069284614000e527956e5e" gracePeriod=600 Mar 09 14:22:51 crc kubenswrapper[4722]: E0309 14:22:51.530359 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/horizon-operator@sha256:114c0dee0bab1d453890e9dcc7727de749055bdbea049384d5696e7ac8d78fe3" Mar 09 14:22:51 crc kubenswrapper[4722]: E0309 14:22:51.530817 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:114c0dee0bab1d453890e9dcc7727de749055bdbea049384d5696e7ac8d78fe3,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7ccr9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
Mar 09 14:22:51 crc kubenswrapper[4722]: E0309 14:22:51.530359 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/horizon-operator@sha256:114c0dee0bab1d453890e9dcc7727de749055bdbea049384d5696e7ac8d78fe3"
Mar 09 14:22:51 crc kubenswrapper[4722]: E0309 14:22:51.530817 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:114c0dee0bab1d453890e9dcc7727de749055bdbea049384d5696e7ac8d78fe3,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7ccr9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-78bc7f9bd9-6npqv_openstack-operators(663f1719-30f7-4588-a183-4a59787e8d8d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 09 14:22:51 crc kubenswrapper[4722]: E0309 14:22:51.533858 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-6npqv" podUID="663f1719-30f7-4588-a183-4a59787e8d8d"
Mar 09 14:22:52 crc kubenswrapper[4722]: E0309 14:22:52.092692 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/placement-operator@sha256:bb939885bd04593ad03af901adb77ee2a2d18529b328c23288c7cc7a2ba5282e"
Mar 09 14:22:52 crc kubenswrapper[4722]: E0309 14:22:52.092902 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:bb939885bd04593ad03af901adb77ee2a2d18529b328c23288c7cc7a2ba5282e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-25vsz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-648564c9fc-wrzrq_openstack-operators(e7b4c7c9-7c4f-4a13-8367-759f5f5ce368): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 09 14:22:52 crc kubenswrapper[4722]: E0309 14:22:52.094688 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-wrzrq" podUID="e7b4c7c9-7c4f-4a13-8367-759f5f5ce368"
Mar 09 14:22:52 crc kubenswrapper[4722]: I0309 14:22:52.205116 4722 generic.go:334] "Generic (PLEG): container finished" podID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerID="61f8d88e021e1998adcf282dcc3a5969939b9f1d00069284614000e527956e5e" exitCode=0
Mar 09 14:22:52 crc kubenswrapper[4722]: I0309 14:22:52.205963 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" event={"ID":"dac2aaf5-653b-4b2a-8efe-ed26bac8d648","Type":"ContainerDied","Data":"61f8d88e021e1998adcf282dcc3a5969939b9f1d00069284614000e527956e5e"}
Mar 09 14:22:52 crc kubenswrapper[4722]: I0309 14:22:52.206001 4722 scope.go:117] "RemoveContainer" containerID="50a94b1e196b515b7f1ddd4cb650f99db9da76851a6a18093dd50246aaec5007"
Mar 09 14:22:52 crc kubenswrapper[4722]: E0309 14:22:52.207155 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:bb939885bd04593ad03af901adb77ee2a2d18529b328c23288c7cc7a2ba5282e\\\"\"" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-wrzrq" podUID="e7b4c7c9-7c4f-4a13-8367-759f5f5ce368"
Mar 09 14:22:52 crc kubenswrapper[4722]: E0309 14:22:52.207237 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:114c0dee0bab1d453890e9dcc7727de749055bdbea049384d5696e7ac8d78fe3\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-6npqv" podUID="663f1719-30f7-4588-a183-4a59787e8d8d"
Mar 09 14:22:52 crc kubenswrapper[4722]: E0309 14:22:52.611355 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/glance-operator@sha256:81e43c058d9af1d3bc31704010c630bc2a574c2ee388aa0ffe8c7b9621a7d051"
Mar 09 14:22:52 crc kubenswrapper[4722]: E0309 14:22:52.612043 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/glance-operator@sha256:81e43c058d9af1d3bc31704010c630bc2a574c2ee388aa0ffe8c7b9621a7d051,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vtdjp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-64db6967f8-zjf7b_openstack-operators(22043c71-5292-422c-99e5-c88ea1aef638): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 09 14:22:52 crc kubenswrapper[4722]: E0309 14:22:52.613573 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-zjf7b" podUID="22043c71-5292-422c-99e5-c88ea1aef638"
Mar 09 14:22:53 crc kubenswrapper[4722]: E0309 14:22:53.160510 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ironic-operator@sha256:e41dfadd2c3bbcae29f8c43cd2feea6724a48cdef127d65d1d37816bb9945a01"
Mar 09 14:22:53 crc kubenswrapper[4722]:
E0309 14:22:53.160703 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:e41dfadd2c3bbcae29f8c43cd2feea6724a48cdef127d65d1d37816bb9945a01,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-th2gc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-545456dc4-hnzfw_openstack-operators(ec9f1f5e-26f5-4683-bf41-c85981da9d18): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 14:22:53 crc kubenswrapper[4722]: E0309 14:22:53.161987 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-hnzfw" podUID="ec9f1f5e-26f5-4683-bf41-c85981da9d18" Mar 09 14:22:53 crc kubenswrapper[4722]: E0309 14:22:53.213592 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/glance-operator@sha256:81e43c058d9af1d3bc31704010c630bc2a574c2ee388aa0ffe8c7b9621a7d051\\\"\"" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-zjf7b" podUID="22043c71-5292-422c-99e5-c88ea1aef638" Mar 09 14:22:53 crc kubenswrapper[4722]: E0309 14:22:53.214293 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:e41dfadd2c3bbcae29f8c43cd2feea6724a48cdef127d65d1d37816bb9945a01\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-hnzfw" podUID="ec9f1f5e-26f5-4683-bf41-c85981da9d18" Mar 09 14:22:53 crc kubenswrapper[4722]: E0309 14:22:53.744104 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/designate-operator@sha256:508859beb0e5b69169393dbb0039dc03a9d4ba05f16f6ff74f9b25e19d446214" Mar 09 14:22:53 crc kubenswrapper[4722]: E0309 14:22:53.744360 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/designate-operator@sha256:508859beb0e5b69169393dbb0039dc03a9d4ba05f16f6ff74f9b25e19d446214,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-h5slz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-5d87c9d997-bmpgd_openstack-operators(a1a5e35a-83f6-4886-86db-55738f51f7e8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 14:22:53 crc kubenswrapper[4722]: E0309 14:22:53.745524 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-bmpgd" podUID="a1a5e35a-83f6-4886-86db-55738f51f7e8" Mar 09 14:22:54 crc kubenswrapper[4722]: E0309 14:22:54.227035 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/designate-operator@sha256:508859beb0e5b69169393dbb0039dc03a9d4ba05f16f6ff74f9b25e19d446214\\\"\"" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-bmpgd" podUID="a1a5e35a-83f6-4886-86db-55738f51f7e8" Mar 09 14:22:55 crc kubenswrapper[4722]: E0309 14:22:55.406636 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:2d59045b8d8e6f9c5483c4fdda7c5057218d553200dc4bcf26789980ac1d9abd" Mar 09 14:22:55 crc kubenswrapper[4722]: E0309 14:22:55.406871 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:2d59045b8d8e6f9c5483c4fdda7c5057218d553200dc4bcf26789980ac1d9abd,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kg98n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-5d86c7ddb7-n5zc7_openstack-operators(717ffc3a-7a6d-4a7c-837f-d1ed92489b68): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 14:22:55 crc 
kubenswrapper[4722]: E0309 14:22:55.408101 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-n5zc7" podUID="717ffc3a-7a6d-4a7c-837f-d1ed92489b68" Mar 09 14:22:55 crc kubenswrapper[4722]: E0309 14:22:55.887779 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Mar 09 14:22:55 crc kubenswrapper[4722]: E0309 14:22:55.888241 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-k565v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-xswl4_openstack-operators(f9ff9b26-9d5a-4194-bab5-1b9fb5dee947): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 14:22:55 crc kubenswrapper[4722]: E0309 14:22:55.889434 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xswl4" podUID="f9ff9b26-9d5a-4194-bab5-1b9fb5dee947" Mar 09 14:22:56 crc kubenswrapper[4722]: E0309 14:22:56.240921 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:2d59045b8d8e6f9c5483c4fdda7c5057218d553200dc4bcf26789980ac1d9abd\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-n5zc7" podUID="717ffc3a-7a6d-4a7c-837f-d1ed92489b68" Mar 09 14:22:56 crc kubenswrapper[4722]: E0309 14:22:56.241283 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xswl4" podUID="f9ff9b26-9d5a-4194-bab5-1b9fb5dee947" Mar 09 14:22:57 crc kubenswrapper[4722]: E0309 14:22:57.161344 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.20:5001/openstack-k8s-operators/telemetry-operator:b58a49689efe8e6e9aeeec8f9da2f714463da105" Mar 09 14:22:57 crc kubenswrapper[4722]: E0309 14:22:57.161723 4722 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.20:5001/openstack-k8s-operators/telemetry-operator:b58a49689efe8e6e9aeeec8f9da2f714463da105" Mar 09 14:22:57 crc kubenswrapper[4722]: E0309 14:22:57.161888 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.20:5001/openstack-k8s-operators/telemetry-operator:b58a49689efe8e6e9aeeec8f9da2f714463da105,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5gs5j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-66c8b7dfbb-m7fv2_openstack-operators(5bf14ad6-64cf-48f7-99e6-fabac12849e2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 14:22:57 crc kubenswrapper[4722]: E0309 14:22:57.163113 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/telemetry-operator-controller-manager-66c8b7dfbb-m7fv2" podUID="5bf14ad6-64cf-48f7-99e6-fabac12849e2" Mar 09 14:22:57 crc kubenswrapper[4722]: E0309 14:22:57.248139 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.20:5001/openstack-k8s-operators/telemetry-operator:b58a49689efe8e6e9aeeec8f9da2f714463da105\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-66c8b7dfbb-m7fv2" podUID="5bf14ad6-64cf-48f7-99e6-fabac12849e2" Mar 09 14:22:58 crc kubenswrapper[4722]: E0309 14:22:58.336285 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:9d723ab33964ee44704eed3223b64e828349d45dee04695434a6fcf4b6807d4c" Mar 09 14:22:58 crc kubenswrapper[4722]: E0309 14:22:58.336465 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:9d723ab33964ee44704eed3223b64e828349d45dee04695434a6fcf4b6807d4c,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8cjdv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-7c789f89c6-2hxzr_openstack-operators(4de6db14-6f3e-4c4e-a61d-39c6648209dd): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 14:22:58 crc kubenswrapper[4722]: E0309 14:22:58.338037 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-2hxzr" podUID="4de6db14-6f3e-4c4e-a61d-39c6648209dd" Mar 09 14:22:58 crc kubenswrapper[4722]: I0309 14:22:58.855733 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9ctn5qw"] Mar 09 14:22:58 crc kubenswrapper[4722]: I0309 14:22:58.956662 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-f7fcc58b9-lvfgg"] Mar 09 14:22:58 crc kubenswrapper[4722]: I0309 14:22:58.994000 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-55cd86c56-dm2dr"] Mar 09 14:22:59 crc kubenswrapper[4722]: I0309 14:22:59.262233 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-pkbqb" event={"ID":"0eac7341-5bab-4c97-a730-b7eeb0a75899","Type":"ContainerStarted","Data":"f2574dea3a1e2ccf8a53c32ead5790de25878d4720362da4871cdd0bb71c8a0f"} Mar 09 14:22:59 crc kubenswrapper[4722]: I0309 14:22:59.262481 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-pkbqb" Mar 09 14:22:59 crc kubenswrapper[4722]: I0309 14:22:59.263922 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-54688575f-5fcw8" event={"ID":"febfeb1a-d5a3-46b8-bc4f-fe3266905e8c","Type":"ContainerStarted","Data":"50b9bc8d53b354e58aad52df3f604aa6efb34421e516c38ddb65fcfe0e560fad"} Mar 09 14:22:59 crc kubenswrapper[4722]: I0309 14:22:59.264056 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-54688575f-5fcw8" Mar 09 14:22:59 crc kubenswrapper[4722]: I0309 14:22:59.265347 4722 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-56qz9" event={"ID":"a9df5689-5d83-4206-be2b-cf6877d70e23","Type":"ContainerStarted","Data":"7b90ab1b4d8ef2c443f53b7a503c742ed75e29f430c3ef869ae535ea8c9a08c2"} Mar 09 14:22:59 crc kubenswrapper[4722]: I0309 14:22:59.265505 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-56qz9" Mar 09 14:22:59 crc kubenswrapper[4722]: I0309 14:22:59.267279 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-hgkzs" event={"ID":"df8b52ff-f61e-4aca-a408-240590699ae6","Type":"ContainerStarted","Data":"9ee98f0ba96b989ec81202064c9c5c6a9fc391943e93eafa7da4d27052db2cd9"} Mar 09 14:22:59 crc kubenswrapper[4722]: I0309 14:22:59.267391 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-hgkzs" Mar 09 14:22:59 crc kubenswrapper[4722]: I0309 14:22:59.269074 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-jwgfg" event={"ID":"a9ff56ca-00a6-484f-a477-0dca4f3a0f5c","Type":"ContainerStarted","Data":"d2e8ebe08a5452de0fbd7475441f6e816198c3093416a568892ae073348db75e"} Mar 09 14:22:59 crc kubenswrapper[4722]: I0309 14:22:59.269117 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-jwgfg" Mar 09 14:22:59 crc kubenswrapper[4722]: I0309 14:22:59.269926 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9ctn5qw" event={"ID":"5e25c11b-f9c6-4542-9c0c-394ea6bc2c17","Type":"ContainerStarted","Data":"47b472e0d7337435a78370e5aa18cadd89e8920932f621a18368eeeb23f3ea83"} Mar 09 14:22:59 crc kubenswrapper[4722]: I0309 14:22:59.270935 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-pg8qn" event={"ID":"ef36bc5a-2962-4c1e-a5fd-98f61d525d5d","Type":"ContainerStarted","Data":"6782d8882e0a59ca3e58a50a0e001600a53007a67f9e563627e6db6131091f0c"} Mar 09 14:22:59 crc kubenswrapper[4722]: I0309 14:22:59.271307 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-pg8qn" Mar 09 14:22:59 crc kubenswrapper[4722]: I0309 14:22:59.272785 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-ct7x8" event={"ID":"f21c35ef-c8ea-4331-a747-44a62c6f2e74","Type":"ContainerStarted","Data":"daf50fcb70e742726fdc3b24774f11f98dea479bc40bcb4838c7f9304343cbb9"} Mar 09 14:22:59 crc kubenswrapper[4722]: I0309 14:22:59.272914 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-ct7x8" Mar 09 14:22:59 crc kubenswrapper[4722]: I0309 14:22:59.274437 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-55cd86c56-dm2dr" event={"ID":"98c22319-d5f8-4a0b-8a30-89b9d832f354","Type":"ContainerStarted","Data":"eef7b6b9c2478f3c9be2954de876f0c104d23e496f6e5c9281a6f9a6be437ed8"} Mar 09 14:22:59 crc kubenswrapper[4722]: I0309 14:22:59.274466 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-operator-controller-manager-55cd86c56-dm2dr" event={"ID":"98c22319-d5f8-4a0b-8a30-89b9d832f354","Type":"ContainerStarted","Data":"d37612c1ed03eb1dd8db6c583f989c244ff4ca608f5d6ee628691fcc9fe4e68f"} Mar 09 14:22:59 crc kubenswrapper[4722]: I0309 14:22:59.274868 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-55cd86c56-dm2dr" Mar 09 14:22:59 crc kubenswrapper[4722]: I0309 14:22:59.276838 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-vppnv" event={"ID":"8f839106-1673-4589-9391-0cd7748e658c","Type":"ContainerStarted","Data":"ef2f3bf2cd7b4ea0616419f561917b62a7437eadc3b9d42a838e320f168d64aa"} Mar 09 14:22:59 crc kubenswrapper[4722]: I0309 14:22:59.276896 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-vppnv" Mar 09 14:22:59 crc kubenswrapper[4722]: I0309 14:22:59.280913 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" event={"ID":"dac2aaf5-653b-4b2a-8efe-ed26bac8d648","Type":"ContainerStarted","Data":"2d686d3e92fab7cd0f339e5d57afd546181543a3a9585b91ecf278050136cecb"} Mar 09 14:22:59 crc kubenswrapper[4722]: I0309 14:22:59.283021 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-67d996989d-28855" event={"ID":"dae536b6-7a22-435e-b307-a8ab6b54779d","Type":"ContainerStarted","Data":"a4540a41a96415eeff01048adf88a3ba30ad126ba822c509398515c15cf89d18"} Mar 09 14:22:59 crc kubenswrapper[4722]: I0309 14:22:59.283781 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-67d996989d-28855" Mar 09 14:22:59 crc kubenswrapper[4722]: I0309 14:22:59.288574 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-lvfgg" event={"ID":"7a62b98d-e9d4-4cbc-bea8-0da13fcc4467","Type":"ContainerStarted","Data":"8baf3db06263614e0c059e7f14e9e7fb9578d5edf1d5302f8d504c2645042053"} Mar 09 14:22:59 crc kubenswrapper[4722]: E0309 14:22:59.292897 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:9d723ab33964ee44704eed3223b64e828349d45dee04695434a6fcf4b6807d4c\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-2hxzr" podUID="4de6db14-6f3e-4c4e-a61d-39c6648209dd" Mar 09 14:22:59 crc kubenswrapper[4722]: I0309 14:22:59.296815 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-pkbqb" podStartSLOduration=4.489676893 podStartE2EDuration="25.296800028s" podCreationTimestamp="2026-03-09 14:22:34 +0000 UTC" firstStartedPulling="2026-03-09 14:22:37.628885657 +0000 UTC m=+1198.184454233" lastFinishedPulling="2026-03-09 14:22:58.436008792 +0000 UTC m=+1218.991577368" observedRunningTime="2026-03-09 14:22:59.280629328 +0000 UTC m=+1219.836197904" watchObservedRunningTime="2026-03-09 14:22:59.296800028 +0000 UTC m=+1219.852368604" Mar 09 14:22:59 crc kubenswrapper[4722]: I0309 14:22:59.313992 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-jwgfg" podStartSLOduration=6.829020013 podStartE2EDuration="25.313972696s" podCreationTimestamp="2026-03-09 14:22:34 +0000 UTC" firstStartedPulling="2026-03-09 14:22:37.404726902 +0000 UTC m=+1197.960295478" lastFinishedPulling="2026-03-09 14:22:55.889679595 +0000 UTC m=+1216.445248161" observedRunningTime="2026-03-09 14:22:59.306860946 +0000 UTC m=+1219.862429522" watchObservedRunningTime="2026-03-09 14:22:59.313972696 +0000 UTC m=+1219.869541272" Mar 09 14:22:59 crc kubenswrapper[4722]: I0309 14:22:59.335943 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-56qz9" podStartSLOduration=4.593486961 podStartE2EDuration="25.335926212s" podCreationTimestamp="2026-03-09 14:22:34 +0000 UTC" firstStartedPulling="2026-03-09 14:22:37.662465472 +0000 UTC m=+1198.218034048" lastFinishedPulling="2026-03-09 14:22:58.404904723 +0000 UTC m=+1218.960473299" observedRunningTime="2026-03-09 14:22:59.333786185 +0000 UTC m=+1219.889354771" watchObservedRunningTime="2026-03-09 14:22:59.335926212 +0000 UTC m=+1219.891494788" Mar 09 14:22:59 crc kubenswrapper[4722]: I0309 14:22:59.354472 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-vppnv" podStartSLOduration=5.671109026 podStartE2EDuration="25.354454745s" podCreationTimestamp="2026-03-09 14:22:34 +0000 UTC" firstStartedPulling="2026-03-09 14:22:36.217979586 +0000 UTC m=+1196.773548162" lastFinishedPulling="2026-03-09 14:22:55.901325305 +0000 UTC m=+1216.456893881" observedRunningTime="2026-03-09 14:22:59.34934305 +0000 UTC m=+1219.904911616" watchObservedRunningTime="2026-03-09 14:22:59.354454745 +0000 UTC m=+1219.910023321" Mar 09 14:22:59 crc kubenswrapper[4722]: I0309 14:22:59.384346 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-ct7x8" podStartSLOduration=5.665537537 podStartE2EDuration="25.384329232s" podCreationTimestamp="2026-03-09 14:22:34 +0000 UTC" firstStartedPulling="2026-03-09 14:22:36.182020817 +0000 UTC m=+1196.737589393" lastFinishedPulling="2026-03-09 14:22:55.900812512 +0000 UTC m=+1216.456381088" observedRunningTime="2026-03-09 14:22:59.379866353 +0000 UTC m=+1219.935434929" watchObservedRunningTime="2026-03-09 14:22:59.384329232 +0000 UTC m=+1219.939897808" Mar 09 14:22:59 crc kubenswrapper[4722]: I0309 14:22:59.457934 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-hgkzs" podStartSLOduration=5.630914634 podStartE2EDuration="25.457917773s" podCreationTimestamp="2026-03-09 14:22:34 +0000 UTC" firstStartedPulling="2026-03-09 14:22:37.287392744 +0000 UTC m=+1197.842961320" lastFinishedPulling="2026-03-09 14:22:57.114395883 +0000 UTC m=+1217.669964459" observedRunningTime="2026-03-09 14:22:59.43678215 +0000 UTC m=+1219.992350726" watchObservedRunningTime="2026-03-09 14:22:59.457917773 +0000 UTC m=+1220.013486349" Mar 09 14:22:59 crc kubenswrapper[4722]: I0309 14:22:59.460508 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-54688575f-5fcw8" podStartSLOduration=5.845919756 podStartE2EDuration="25.460498202s" podCreationTimestamp="2026-03-09 14:22:34 +0000 UTC" firstStartedPulling="2026-03-09 14:22:37.310838309 +0000 UTC 
m=+1197.866406885" lastFinishedPulling="2026-03-09 14:22:56.925416745 +0000 UTC m=+1217.480985331" observedRunningTime="2026-03-09 14:22:59.456798053 +0000 UTC m=+1220.012366629" watchObservedRunningTime="2026-03-09 14:22:59.460498202 +0000 UTC m=+1220.016066778" Mar 09 14:22:59 crc kubenswrapper[4722]: I0309 14:22:59.488450 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-pg8qn" podStartSLOduration=4.656924501 podStartE2EDuration="25.488435057s" podCreationTimestamp="2026-03-09 14:22:34 +0000 UTC" firstStartedPulling="2026-03-09 14:22:37.673100155 +0000 UTC m=+1198.228668731" lastFinishedPulling="2026-03-09 14:22:58.504610711 +0000 UTC m=+1219.060179287" observedRunningTime="2026-03-09 14:22:59.487910773 +0000 UTC m=+1220.043479349" watchObservedRunningTime="2026-03-09 14:22:59.488435057 +0000 UTC m=+1220.044003633" Mar 09 14:22:59 crc kubenswrapper[4722]: I0309 14:22:59.529992 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-55cd86c56-dm2dr" podStartSLOduration=24.529968204 podStartE2EDuration="24.529968204s" podCreationTimestamp="2026-03-09 14:22:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:22:59.526368908 +0000 UTC m=+1220.081937504" watchObservedRunningTime="2026-03-09 14:22:59.529968204 +0000 UTC m=+1220.085536800" Mar 09 14:23:00 crc kubenswrapper[4722]: I0309 14:23:00.182799 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-67d996989d-28855" podStartSLOduration=6.807159181 podStartE2EDuration="26.182778906s" podCreationTimestamp="2026-03-09 14:22:34 +0000 UTC" firstStartedPulling="2026-03-09 14:22:36.51406183 +0000 UTC m=+1197.069630416" lastFinishedPulling="2026-03-09 14:22:55.889681565 +0000 UTC m=+1216.445250141" observedRunningTime="2026-03-09 14:22:59.618308379 +0000 UTC m=+1220.173876955" watchObservedRunningTime="2026-03-09 14:23:00.182778906 +0000 UTC m=+1220.738347502" Mar 09 14:23:04 crc kubenswrapper[4722]: I0309 14:23:04.648880 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-vppnv" Mar 09 14:23:04 crc kubenswrapper[4722]: I0309 14:23:04.787378 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-ct7x8" Mar 09 14:23:05 crc kubenswrapper[4722]: I0309 14:23:05.266561 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-67d996989d-28855" Mar 09 14:23:05 crc kubenswrapper[4722]: I0309 14:23:05.368743 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-54688575f-5fcw8" Mar 09 14:23:05 crc kubenswrapper[4722]: I0309 14:23:05.397908 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-jwgfg" Mar 09 14:23:05 crc kubenswrapper[4722]: I0309 14:23:05.655735 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-hgkzs" Mar 09 14:23:05 crc kubenswrapper[4722]: I0309 14:23:05.710894 4722 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-56qz9" Mar 09 14:23:05 crc kubenswrapper[4722]: I0309 14:23:05.952914 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-pg8qn" Mar 09 14:23:05 crc kubenswrapper[4722]: I0309 14:23:05.986747 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-pkbqb" Mar 09 14:23:07 crc kubenswrapper[4722]: I0309 14:23:07.355327 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9ctn5qw" event={"ID":"5e25c11b-f9c6-4542-9c0c-394ea6bc2c17","Type":"ContainerStarted","Data":"2947e9366eb22bbce7325286e08c9f598b6274003b4528b8f993e61ce6bd571c"} Mar 09 14:23:07 crc kubenswrapper[4722]: I0309 14:23:07.356025 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9ctn5qw" Mar 09 14:23:07 crc kubenswrapper[4722]: I0309 14:23:07.357338 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-l8bds" event={"ID":"74cb981b-ce89-479e-8573-fdda25190637","Type":"ContainerStarted","Data":"a154881d450e8d126219a8b7b71b123d2b85326acfe992710eb4d2e649f4e3c6"} Mar 09 14:23:07 crc kubenswrapper[4722]: I0309 14:23:07.358231 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-l8bds" Mar 09 14:23:07 crc kubenswrapper[4722]: I0309 14:23:07.359546 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-6npqv" event={"ID":"663f1719-30f7-4588-a183-4a59787e8d8d","Type":"ContainerStarted","Data":"cc53e753376a06f24df49e8719f6b99b6498f23832d2a5a2733b29b1ffd1f797"} Mar 09 14:23:07 crc kubenswrapper[4722]: I0309 14:23:07.359706 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-6npqv" Mar 09 14:23:07 crc kubenswrapper[4722]: I0309 14:23:07.360973 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-wrzrq" event={"ID":"e7b4c7c9-7c4f-4a13-8367-759f5f5ce368","Type":"ContainerStarted","Data":"aeed0051b34a2518ea37e91d9cc3bb5447759d2a6ea8749d2c396bbd85c423f5"} Mar 09 14:23:07 crc kubenswrapper[4722]: I0309 14:23:07.361232 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-wrzrq" Mar 09 14:23:07 crc kubenswrapper[4722]: I0309 14:23:07.362433 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-rsd9l" event={"ID":"edd71e1d-6ff0-4918-9cd8-a342efba2df5","Type":"ContainerStarted","Data":"bbecdfa31f6d2cb7e7ce9caf923d4fcc88d779db00d4a9076e2154fc0b34a22a"} Mar 09 14:23:07 crc kubenswrapper[4722]: I0309 14:23:07.362615 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-rsd9l" Mar 09 14:23:07 crc kubenswrapper[4722]: I0309 14:23:07.363838 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-lvfgg" event={"ID":"7a62b98d-e9d4-4cbc-bea8-0da13fcc4467","Type":"ContainerStarted","Data":"32a0e58da063dd7bade8b3cf5a28e5e43ca5f2b14ee02f6b41515830777fa4f0"} Mar 09 14:23:07 crc kubenswrapper[4722]: I0309 14:23:07.363911 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-lvfgg" Mar 09 14:23:07 crc kubenswrapper[4722]: I0309 14:23:07.365311 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-hnzfw" event={"ID":"ec9f1f5e-26f5-4683-bf41-c85981da9d18","Type":"ContainerStarted","Data":"ed3f78f85b1a4a65b61d1a60a0bc4c588ac85a3fc264827eba225ad765864e3b"} Mar 09 14:23:07 crc kubenswrapper[4722]: I0309 14:23:07.365533 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-hnzfw" Mar 09 14:23:07 crc kubenswrapper[4722]: I0309 14:23:07.499124 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9ctn5qw" podStartSLOduration=25.810267295 podStartE2EDuration="33.499100277s" podCreationTimestamp="2026-03-09 14:22:34 +0000 UTC" firstStartedPulling="2026-03-09 14:22:58.897517205 +0000 UTC m=+1219.453085781" lastFinishedPulling="2026-03-09 14:23:06.586350167 +0000 UTC m=+1227.141918763" observedRunningTime="2026-03-09 14:23:07.419056784 +0000 UTC m=+1227.974625360" watchObservedRunningTime="2026-03-09 14:23:07.499100277 +0000 UTC m=+1228.054668853" Mar 09 14:23:07 crc kubenswrapper[4722]: I0309 14:23:07.573641 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-lvfgg" podStartSLOduration=25.975729287 podStartE2EDuration="33.573613054s" podCreationTimestamp="2026-03-09 14:22:34 +0000 UTC" firstStartedPulling="2026-03-09 14:22:58.978291268 +0000 UTC m=+1219.533859844" lastFinishedPulling="2026-03-09 14:23:06.576175025 +0000 UTC m=+1227.131743611" observedRunningTime="2026-03-09 14:23:07.569779412 +0000 UTC m=+1228.125347988" watchObservedRunningTime="2026-03-09 14:23:07.573613054 +0000 UTC m=+1228.129181630" Mar 09 14:23:07 crc kubenswrapper[4722]: I0309 14:23:07.600045 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-rsd9l" podStartSLOduration=3.198864543 podStartE2EDuration="33.600009597s" podCreationTimestamp="2026-03-09 14:22:34 +0000 UTC" firstStartedPulling="2026-03-09 14:22:36.17799785 +0000 UTC m=+1196.733566426" lastFinishedPulling="2026-03-09 14:23:06.579142894 +0000 UTC m=+1227.134711480" observedRunningTime="2026-03-09 14:23:07.511518699 +0000 UTC m=+1228.067087275" watchObservedRunningTime="2026-03-09 14:23:07.600009597 +0000 UTC m=+1228.155578253" Mar 09 14:23:07 crc kubenswrapper[4722]: I0309 14:23:07.625042 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-l8bds" podStartSLOduration=4.40441738 podStartE2EDuration="33.625025314s" podCreationTimestamp="2026-03-09 14:22:34 +0000 UTC" firstStartedPulling="2026-03-09 14:22:37.404331971 +0000 UTC m=+1197.959900547" lastFinishedPulling="2026-03-09 14:23:06.624939905 +0000 UTC m=+1227.180508481" observedRunningTime="2026-03-09 14:23:07.614107414 +0000 UTC 
m=+1228.169675990" watchObservedRunningTime="2026-03-09 14:23:07.625025314 +0000 UTC m=+1228.180593880" Mar 09 14:23:07 crc kubenswrapper[4722]: I0309 14:23:07.654215 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-hnzfw" podStartSLOduration=4.46706181 podStartE2EDuration="33.654187572s" podCreationTimestamp="2026-03-09 14:22:34 +0000 UTC" firstStartedPulling="2026-03-09 14:22:37.393114522 +0000 UTC m=+1197.948683098" lastFinishedPulling="2026-03-09 14:23:06.580240244 +0000 UTC m=+1227.135808860" observedRunningTime="2026-03-09 14:23:07.648334686 +0000 UTC m=+1228.203903262" watchObservedRunningTime="2026-03-09 14:23:07.654187572 +0000 UTC m=+1228.209756148" Mar 09 14:23:07 crc kubenswrapper[4722]: I0309 14:23:07.694453 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-wrzrq" podStartSLOduration=4.70978139 podStartE2EDuration="33.694434205s" podCreationTimestamp="2026-03-09 14:22:34 +0000 UTC" firstStartedPulling="2026-03-09 14:22:37.409335604 +0000 UTC m=+1197.964904180" lastFinishedPulling="2026-03-09 14:23:06.393988419 +0000 UTC m=+1226.949556995" observedRunningTime="2026-03-09 14:23:07.689503203 +0000 UTC m=+1228.245071779" watchObservedRunningTime="2026-03-09 14:23:07.694434205 +0000 UTC m=+1228.250002781" Mar 09 14:23:07 crc kubenswrapper[4722]: I0309 14:23:07.717830 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-6npqv" podStartSLOduration=3.624066667 podStartE2EDuration="33.717813038s" podCreationTimestamp="2026-03-09 14:22:34 +0000 UTC" firstStartedPulling="2026-03-09 14:22:36.487151041 +0000 UTC m=+1197.042719617" lastFinishedPulling="2026-03-09 14:23:06.580897402 +0000 UTC m=+1227.136465988" observedRunningTime="2026-03-09 14:23:07.712903767 +0000 UTC m=+1228.268472343" watchObservedRunningTime="2026-03-09 14:23:07.717813038 +0000 UTC m=+1228.273381614" Mar 09 14:23:09 crc kubenswrapper[4722]: I0309 14:23:09.383783 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-bmpgd" event={"ID":"a1a5e35a-83f6-4886-86db-55738f51f7e8","Type":"ContainerStarted","Data":"a5071a8f2f64f6d065cb37df3156dd774b36c714f4c66f9ab7e936ae0a729609"} Mar 09 14:23:09 crc kubenswrapper[4722]: I0309 14:23:09.384657 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-bmpgd" Mar 09 14:23:09 crc kubenswrapper[4722]: I0309 14:23:09.385721 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xswl4" event={"ID":"f9ff9b26-9d5a-4194-bab5-1b9fb5dee947","Type":"ContainerStarted","Data":"ed3fbb43f36319a7eb8ab1a8588e8778eb342cc179a9ad24d7451bed03535581"} Mar 09 14:23:09 crc kubenswrapper[4722]: I0309 14:23:09.388610 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-zjf7b" event={"ID":"22043c71-5292-422c-99e5-c88ea1aef638","Type":"ContainerStarted","Data":"1233abd42f31e32e4b780ef1591655c23c9a6b3fe437a03d3f6f9931ed23c220"} Mar 09 14:23:09 crc kubenswrapper[4722]: I0309 14:23:09.389018 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-zjf7b" 
Mar 09 14:23:09 crc kubenswrapper[4722]: I0309 14:23:09.415022 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-bmpgd" podStartSLOduration=2.823704243 podStartE2EDuration="35.41500179s" podCreationTimestamp="2026-03-09 14:22:34 +0000 UTC" firstStartedPulling="2026-03-09 14:22:36.19710508 +0000 UTC m=+1196.752673656" lastFinishedPulling="2026-03-09 14:23:08.788402627 +0000 UTC m=+1229.343971203" observedRunningTime="2026-03-09 14:23:09.407179982 +0000 UTC m=+1229.962748578" watchObservedRunningTime="2026-03-09 14:23:09.41500179 +0000 UTC m=+1229.970570376" Mar 09 14:23:09 crc kubenswrapper[4722]: I0309 14:23:09.435170 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-zjf7b" podStartSLOduration=2.93689306 podStartE2EDuration="35.435144467s" podCreationTimestamp="2026-03-09 14:22:34 +0000 UTC" firstStartedPulling="2026-03-09 14:22:36.486894935 +0000 UTC m=+1197.042463511" lastFinishedPulling="2026-03-09 14:23:08.985146352 +0000 UTC m=+1229.540714918" observedRunningTime="2026-03-09 14:23:09.429738633 +0000 UTC m=+1229.985307199" watchObservedRunningTime="2026-03-09 14:23:09.435144467 +0000 UTC m=+1229.990713053" Mar 09 14:23:09 crc kubenswrapper[4722]: I0309 14:23:09.450405 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xswl4" podStartSLOduration=3.41376027 podStartE2EDuration="34.450375993s" podCreationTimestamp="2026-03-09 14:22:35 +0000 UTC" firstStartedPulling="2026-03-09 14:22:37.599462503 +0000 UTC m=+1198.155031079" lastFinishedPulling="2026-03-09 14:23:08.636078226 +0000 UTC m=+1229.191646802" observedRunningTime="2026-03-09 14:23:09.449268753 +0000 UTC m=+1230.004837329" watchObservedRunningTime="2026-03-09 14:23:09.450375993 +0000 UTC m=+1230.005944559" Mar 09 14:23:11 crc kubenswrapper[4722]: I0309 14:23:11.415127 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-n5zc7" event={"ID":"717ffc3a-7a6d-4a7c-837f-d1ed92489b68","Type":"ContainerStarted","Data":"4215330fca94e183b407b09a814fe82779e48c0e1475bfc1884f69234bef8531"} Mar 09 14:23:11 crc kubenswrapper[4722]: I0309 14:23:11.415651 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-n5zc7" Mar 09 14:23:11 crc kubenswrapper[4722]: I0309 14:23:11.435149 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-n5zc7" podStartSLOduration=3.984874476 podStartE2EDuration="37.435133541s" podCreationTimestamp="2026-03-09 14:22:34 +0000 UTC" firstStartedPulling="2026-03-09 14:22:37.405345958 +0000 UTC m=+1197.960914534" lastFinishedPulling="2026-03-09 14:23:10.855605023 +0000 UTC m=+1231.411173599" observedRunningTime="2026-03-09 14:23:11.430036165 +0000 UTC m=+1231.985604791" watchObservedRunningTime="2026-03-09 14:23:11.435133541 +0000 UTC m=+1231.990702117" Mar 09 14:23:11 crc kubenswrapper[4722]: I0309 14:23:11.441579 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-55cd86c56-dm2dr" Mar 09 14:23:12 crc kubenswrapper[4722]: I0309 14:23:12.423569 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/telemetry-operator-controller-manager-66c8b7dfbb-m7fv2" event={"ID":"5bf14ad6-64cf-48f7-99e6-fabac12849e2","Type":"ContainerStarted","Data":"787222444ce2eed0099002be713c411e9d3b09b2c30c469655e4a2716fd32c8c"} Mar 09 14:23:12 crc kubenswrapper[4722]: I0309 14:23:12.424049 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-66c8b7dfbb-m7fv2" Mar 09 14:23:12 crc kubenswrapper[4722]: I0309 14:23:12.437513 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-66c8b7dfbb-m7fv2" podStartSLOduration=4.09045695 podStartE2EDuration="38.43749422s" podCreationTimestamp="2026-03-09 14:22:34 +0000 UTC" firstStartedPulling="2026-03-09 14:22:37.409585051 +0000 UTC m=+1197.965153627" lastFinishedPulling="2026-03-09 14:23:11.756622321 +0000 UTC m=+1232.312190897" observedRunningTime="2026-03-09 14:23:12.437245604 +0000 UTC m=+1232.992814190" watchObservedRunningTime="2026-03-09 14:23:12.43749422 +0000 UTC m=+1232.993062806" Mar 09 14:23:14 crc kubenswrapper[4722]: I0309 14:23:14.442074 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-2hxzr" event={"ID":"4de6db14-6f3e-4c4e-a61d-39c6648209dd","Type":"ContainerStarted","Data":"708a193ff4194bb83d14fc922c7c15ba4f75bdbfb539cc02781e2e41893036b5"} Mar 09 14:23:14 crc kubenswrapper[4722]: I0309 14:23:14.443184 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-2hxzr" Mar 09 14:23:14 crc kubenswrapper[4722]: I0309 14:23:14.673416 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-rsd9l" Mar 09 14:23:14 crc kubenswrapper[4722]: I0309 14:23:14.689464 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-2hxzr" podStartSLOduration=3.615772126 podStartE2EDuration="40.689443131s" podCreationTimestamp="2026-03-09 14:22:34 +0000 UTC" firstStartedPulling="2026-03-09 14:22:36.476163928 +0000 UTC m=+1197.031732504" lastFinishedPulling="2026-03-09 14:23:13.549834933 +0000 UTC m=+1234.105403509" observedRunningTime="2026-03-09 14:23:14.461978477 +0000 UTC m=+1235.017547053" watchObservedRunningTime="2026-03-09 14:23:14.689443131 +0000 UTC m=+1235.245011707" Mar 09 14:23:14 crc kubenswrapper[4722]: I0309 14:23:14.697767 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-bmpgd" Mar 09 14:23:14 crc kubenswrapper[4722]: I0309 14:23:14.840625 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-zjf7b" Mar 09 14:23:15 crc kubenswrapper[4722]: I0309 14:23:15.012383 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-6npqv" Mar 09 14:23:15 crc kubenswrapper[4722]: I0309 14:23:15.164421 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-hnzfw" Mar 09 14:23:15 crc kubenswrapper[4722]: I0309 14:23:15.291520 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-l8bds" Mar 09 14:23:15 crc kubenswrapper[4722]: I0309 14:23:15.670396 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-wrzrq" Mar 09 14:23:20 crc kubenswrapper[4722]: I0309 14:23:20.744290 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-lvfgg" Mar 09 14:23:21 crc kubenswrapper[4722]: I0309 14:23:21.214046 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9ctn5qw" Mar 09 14:23:25 crc kubenswrapper[4722]: I0309 14:23:25.210884 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-2hxzr" Mar 09 14:23:25 crc kubenswrapper[4722]: I0309 14:23:25.432067 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-n5zc7" Mar 09 14:23:25 crc kubenswrapper[4722]: I0309 14:23:25.724480 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-66c8b7dfbb-m7fv2" Mar 09 14:23:43 crc kubenswrapper[4722]: I0309 14:23:43.317133 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-mdqwg"] Mar 09 14:23:43 crc kubenswrapper[4722]: I0309 14:23:43.319319 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-mdqwg" Mar 09 14:23:43 crc kubenswrapper[4722]: I0309 14:23:43.323563 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 09 14:23:43 crc kubenswrapper[4722]: I0309 14:23:43.325444 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-9llls" Mar 09 14:23:43 crc kubenswrapper[4722]: I0309 14:23:43.326033 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 09 14:23:43 crc kubenswrapper[4722]: I0309 14:23:43.337433 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 09 14:23:43 crc kubenswrapper[4722]: I0309 14:23:43.342747 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-mdqwg"] Mar 09 14:23:43 crc kubenswrapper[4722]: I0309 14:23:43.475218 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nh2n8\" (UniqueName: \"kubernetes.io/projected/fdbc665b-813d-4bce-ab2e-e0a2408bd149-kube-api-access-nh2n8\") pod \"dnsmasq-dns-675f4bcbfc-mdqwg\" (UID: \"fdbc665b-813d-4bce-ab2e-e0a2408bd149\") " pod="openstack/dnsmasq-dns-675f4bcbfc-mdqwg" Mar 09 14:23:43 crc kubenswrapper[4722]: I0309 14:23:43.475307 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdbc665b-813d-4bce-ab2e-e0a2408bd149-config\") pod \"dnsmasq-dns-675f4bcbfc-mdqwg\" (UID: \"fdbc665b-813d-4bce-ab2e-e0a2408bd149\") " pod="openstack/dnsmasq-dns-675f4bcbfc-mdqwg" Mar 09 14:23:43 crc kubenswrapper[4722]: I0309 14:23:43.483530 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-vrhjb"] Mar 09 14:23:43 crc 
kubenswrapper[4722]: I0309 14:23:43.485464 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-vrhjb" Mar 09 14:23:43 crc kubenswrapper[4722]: I0309 14:23:43.496379 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 09 14:23:43 crc kubenswrapper[4722]: I0309 14:23:43.506544 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-vrhjb"] Mar 09 14:23:43 crc kubenswrapper[4722]: I0309 14:23:43.577189 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nh2n8\" (UniqueName: \"kubernetes.io/projected/fdbc665b-813d-4bce-ab2e-e0a2408bd149-kube-api-access-nh2n8\") pod \"dnsmasq-dns-675f4bcbfc-mdqwg\" (UID: \"fdbc665b-813d-4bce-ab2e-e0a2408bd149\") " pod="openstack/dnsmasq-dns-675f4bcbfc-mdqwg" Mar 09 14:23:43 crc kubenswrapper[4722]: I0309 14:23:43.577282 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdbc665b-813d-4bce-ab2e-e0a2408bd149-config\") pod \"dnsmasq-dns-675f4bcbfc-mdqwg\" (UID: \"fdbc665b-813d-4bce-ab2e-e0a2408bd149\") " pod="openstack/dnsmasq-dns-675f4bcbfc-mdqwg" Mar 09 14:23:43 crc kubenswrapper[4722]: I0309 14:23:43.577348 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f77eae01-5673-4dda-95b8-c9b540e75f92-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-vrhjb\" (UID: \"f77eae01-5673-4dda-95b8-c9b540e75f92\") " pod="openstack/dnsmasq-dns-78dd6ddcc-vrhjb" Mar 09 14:23:43 crc kubenswrapper[4722]: I0309 14:23:43.577385 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4l5z\" (UniqueName: \"kubernetes.io/projected/f77eae01-5673-4dda-95b8-c9b540e75f92-kube-api-access-f4l5z\") pod \"dnsmasq-dns-78dd6ddcc-vrhjb\" (UID: \"f77eae01-5673-4dda-95b8-c9b540e75f92\") " pod="openstack/dnsmasq-dns-78dd6ddcc-vrhjb" Mar 09 14:23:43 crc kubenswrapper[4722]: I0309 14:23:43.577407 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f77eae01-5673-4dda-95b8-c9b540e75f92-config\") pod \"dnsmasq-dns-78dd6ddcc-vrhjb\" (UID: \"f77eae01-5673-4dda-95b8-c9b540e75f92\") " pod="openstack/dnsmasq-dns-78dd6ddcc-vrhjb" Mar 09 14:23:43 crc kubenswrapper[4722]: I0309 14:23:43.578313 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdbc665b-813d-4bce-ab2e-e0a2408bd149-config\") pod \"dnsmasq-dns-675f4bcbfc-mdqwg\" (UID: \"fdbc665b-813d-4bce-ab2e-e0a2408bd149\") " pod="openstack/dnsmasq-dns-675f4bcbfc-mdqwg" Mar 09 14:23:43 crc kubenswrapper[4722]: I0309 14:23:43.596052 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nh2n8\" (UniqueName: \"kubernetes.io/projected/fdbc665b-813d-4bce-ab2e-e0a2408bd149-kube-api-access-nh2n8\") pod \"dnsmasq-dns-675f4bcbfc-mdqwg\" (UID: \"fdbc665b-813d-4bce-ab2e-e0a2408bd149\") " pod="openstack/dnsmasq-dns-675f4bcbfc-mdqwg" Mar 09 14:23:43 crc kubenswrapper[4722]: I0309 14:23:43.638137 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-mdqwg" Mar 09 14:23:43 crc kubenswrapper[4722]: I0309 14:23:43.679057 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f77eae01-5673-4dda-95b8-c9b540e75f92-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-vrhjb\" (UID: \"f77eae01-5673-4dda-95b8-c9b540e75f92\") " pod="openstack/dnsmasq-dns-78dd6ddcc-vrhjb" Mar 09 14:23:43 crc kubenswrapper[4722]: I0309 14:23:43.679115 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4l5z\" (UniqueName: \"kubernetes.io/projected/f77eae01-5673-4dda-95b8-c9b540e75f92-kube-api-access-f4l5z\") pod \"dnsmasq-dns-78dd6ddcc-vrhjb\" (UID: \"f77eae01-5673-4dda-95b8-c9b540e75f92\") " pod="openstack/dnsmasq-dns-78dd6ddcc-vrhjb" Mar 09 14:23:43 crc kubenswrapper[4722]: I0309 14:23:43.679149 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f77eae01-5673-4dda-95b8-c9b540e75f92-config\") pod \"dnsmasq-dns-78dd6ddcc-vrhjb\" (UID: \"f77eae01-5673-4dda-95b8-c9b540e75f92\") " pod="openstack/dnsmasq-dns-78dd6ddcc-vrhjb" Mar 09 14:23:43 crc kubenswrapper[4722]: I0309 14:23:43.680031 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f77eae01-5673-4dda-95b8-c9b540e75f92-config\") pod \"dnsmasq-dns-78dd6ddcc-vrhjb\" (UID: \"f77eae01-5673-4dda-95b8-c9b540e75f92\") " pod="openstack/dnsmasq-dns-78dd6ddcc-vrhjb" Mar 09 14:23:43 crc kubenswrapper[4722]: I0309 14:23:43.680663 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f77eae01-5673-4dda-95b8-c9b540e75f92-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-vrhjb\" (UID: \"f77eae01-5673-4dda-95b8-c9b540e75f92\") " pod="openstack/dnsmasq-dns-78dd6ddcc-vrhjb" Mar 09 14:23:43 crc kubenswrapper[4722]: I0309 14:23:43.712422 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4l5z\" (UniqueName: \"kubernetes.io/projected/f77eae01-5673-4dda-95b8-c9b540e75f92-kube-api-access-f4l5z\") pod \"dnsmasq-dns-78dd6ddcc-vrhjb\" (UID: \"f77eae01-5673-4dda-95b8-c9b540e75f92\") " pod="openstack/dnsmasq-dns-78dd6ddcc-vrhjb" Mar 09 14:23:43 crc kubenswrapper[4722]: I0309 14:23:43.809798 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-vrhjb" Mar 09 14:23:44 crc kubenswrapper[4722]: I0309 14:23:44.106952 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-mdqwg"] Mar 09 14:23:44 crc kubenswrapper[4722]: I0309 14:23:44.283568 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-vrhjb"] Mar 09 14:23:44 crc kubenswrapper[4722]: I0309 14:23:44.715323 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-mdqwg" event={"ID":"fdbc665b-813d-4bce-ab2e-e0a2408bd149","Type":"ContainerStarted","Data":"917f6e0caa00bc9146de3de9e043f0b437c86ceb70d9482844ac8efb171a46a8"} Mar 09 14:23:44 crc kubenswrapper[4722]: I0309 14:23:44.716994 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-vrhjb" event={"ID":"f77eae01-5673-4dda-95b8-c9b540e75f92","Type":"ContainerStarted","Data":"4c41214d3e3f1b12b5d9e2b2526820e9ce9f36114c31ea49159af633ecac7c23"} Mar 09 14:23:46 crc kubenswrapper[4722]: I0309 14:23:46.351462 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-mdqwg"] Mar 09 14:23:46 crc kubenswrapper[4722]: I0309 14:23:46.421878 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-lpmrz"] Mar 09 14:23:46 crc kubenswrapper[4722]: I0309 14:23:46.423298 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-lpmrz" Mar 09 14:23:46 crc kubenswrapper[4722]: I0309 14:23:46.522646 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-lpmrz"] Mar 09 14:23:46 crc kubenswrapper[4722]: I0309 14:23:46.561863 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd7183ce-685d-4fe4-9bc9-5d58c95bc9e0-config\") pod \"dnsmasq-dns-5ccc8479f9-lpmrz\" (UID: \"dd7183ce-685d-4fe4-9bc9-5d58c95bc9e0\") " pod="openstack/dnsmasq-dns-5ccc8479f9-lpmrz" Mar 09 14:23:46 crc kubenswrapper[4722]: I0309 14:23:46.561950 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd7183ce-685d-4fe4-9bc9-5d58c95bc9e0-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-lpmrz\" (UID: \"dd7183ce-685d-4fe4-9bc9-5d58c95bc9e0\") " pod="openstack/dnsmasq-dns-5ccc8479f9-lpmrz" Mar 09 14:23:46 crc kubenswrapper[4722]: I0309 14:23:46.562089 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hn8j6\" (UniqueName: \"kubernetes.io/projected/dd7183ce-685d-4fe4-9bc9-5d58c95bc9e0-kube-api-access-hn8j6\") pod \"dnsmasq-dns-5ccc8479f9-lpmrz\" (UID: \"dd7183ce-685d-4fe4-9bc9-5d58c95bc9e0\") " pod="openstack/dnsmasq-dns-5ccc8479f9-lpmrz" Mar 09 14:23:46 crc kubenswrapper[4722]: I0309 14:23:46.663138 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hn8j6\" (UniqueName: \"kubernetes.io/projected/dd7183ce-685d-4fe4-9bc9-5d58c95bc9e0-kube-api-access-hn8j6\") pod \"dnsmasq-dns-5ccc8479f9-lpmrz\" (UID: \"dd7183ce-685d-4fe4-9bc9-5d58c95bc9e0\") " pod="openstack/dnsmasq-dns-5ccc8479f9-lpmrz" Mar 09 14:23:46 crc kubenswrapper[4722]: I0309 14:23:46.663328 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd7183ce-685d-4fe4-9bc9-5d58c95bc9e0-config\") pod 
\"dnsmasq-dns-5ccc8479f9-lpmrz\" (UID: \"dd7183ce-685d-4fe4-9bc9-5d58c95bc9e0\") " pod="openstack/dnsmasq-dns-5ccc8479f9-lpmrz" Mar 09 14:23:46 crc kubenswrapper[4722]: I0309 14:23:46.663370 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd7183ce-685d-4fe4-9bc9-5d58c95bc9e0-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-lpmrz\" (UID: \"dd7183ce-685d-4fe4-9bc9-5d58c95bc9e0\") " pod="openstack/dnsmasq-dns-5ccc8479f9-lpmrz" Mar 09 14:23:46 crc kubenswrapper[4722]: I0309 14:23:46.664146 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd7183ce-685d-4fe4-9bc9-5d58c95bc9e0-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-lpmrz\" (UID: \"dd7183ce-685d-4fe4-9bc9-5d58c95bc9e0\") " pod="openstack/dnsmasq-dns-5ccc8479f9-lpmrz" Mar 09 14:23:46 crc kubenswrapper[4722]: I0309 14:23:46.664853 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd7183ce-685d-4fe4-9bc9-5d58c95bc9e0-config\") pod \"dnsmasq-dns-5ccc8479f9-lpmrz\" (UID: \"dd7183ce-685d-4fe4-9bc9-5d58c95bc9e0\") " pod="openstack/dnsmasq-dns-5ccc8479f9-lpmrz" Mar 09 14:23:46 crc kubenswrapper[4722]: I0309 14:23:46.707194 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hn8j6\" (UniqueName: \"kubernetes.io/projected/dd7183ce-685d-4fe4-9bc9-5d58c95bc9e0-kube-api-access-hn8j6\") pod \"dnsmasq-dns-5ccc8479f9-lpmrz\" (UID: \"dd7183ce-685d-4fe4-9bc9-5d58c95bc9e0\") " pod="openstack/dnsmasq-dns-5ccc8479f9-lpmrz" Mar 09 14:23:46 crc kubenswrapper[4722]: I0309 14:23:46.789733 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-lpmrz" Mar 09 14:23:46 crc kubenswrapper[4722]: I0309 14:23:46.824137 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-vrhjb"] Mar 09 14:23:46 crc kubenswrapper[4722]: I0309 14:23:46.849948 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-lb2r7"] Mar 09 14:23:46 crc kubenswrapper[4722]: I0309 14:23:46.851635 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-lb2r7" Mar 09 14:23:46 crc kubenswrapper[4722]: I0309 14:23:46.859680 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-lb2r7"] Mar 09 14:23:46 crc kubenswrapper[4722]: I0309 14:23:46.967121 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dg249\" (UniqueName: \"kubernetes.io/projected/6dca0862-8223-4ba0-9fc8-eb8c861baa60-kube-api-access-dg249\") pod \"dnsmasq-dns-57d769cc4f-lb2r7\" (UID: \"6dca0862-8223-4ba0-9fc8-eb8c861baa60\") " pod="openstack/dnsmasq-dns-57d769cc4f-lb2r7" Mar 09 14:23:46 crc kubenswrapper[4722]: I0309 14:23:46.967686 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6dca0862-8223-4ba0-9fc8-eb8c861baa60-config\") pod \"dnsmasq-dns-57d769cc4f-lb2r7\" (UID: \"6dca0862-8223-4ba0-9fc8-eb8c861baa60\") " pod="openstack/dnsmasq-dns-57d769cc4f-lb2r7" Mar 09 14:23:46 crc kubenswrapper[4722]: I0309 14:23:46.967756 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6dca0862-8223-4ba0-9fc8-eb8c861baa60-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-lb2r7\" (UID: \"6dca0862-8223-4ba0-9fc8-eb8c861baa60\") " pod="openstack/dnsmasq-dns-57d769cc4f-lb2r7" Mar 09 14:23:47 crc kubenswrapper[4722]: I0309 14:23:47.071801 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dg249\" (UniqueName: \"kubernetes.io/projected/6dca0862-8223-4ba0-9fc8-eb8c861baa60-kube-api-access-dg249\") pod \"dnsmasq-dns-57d769cc4f-lb2r7\" (UID: \"6dca0862-8223-4ba0-9fc8-eb8c861baa60\") " pod="openstack/dnsmasq-dns-57d769cc4f-lb2r7" Mar 09 14:23:47 crc kubenswrapper[4722]: I0309 14:23:47.072155 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6dca0862-8223-4ba0-9fc8-eb8c861baa60-config\") pod \"dnsmasq-dns-57d769cc4f-lb2r7\" (UID: \"6dca0862-8223-4ba0-9fc8-eb8c861baa60\") " pod="openstack/dnsmasq-dns-57d769cc4f-lb2r7" Mar 09 14:23:47 crc kubenswrapper[4722]: I0309 14:23:47.072188 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6dca0862-8223-4ba0-9fc8-eb8c861baa60-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-lb2r7\" (UID: \"6dca0862-8223-4ba0-9fc8-eb8c861baa60\") " pod="openstack/dnsmasq-dns-57d769cc4f-lb2r7" Mar 09 14:23:47 crc kubenswrapper[4722]: I0309 14:23:47.073720 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6dca0862-8223-4ba0-9fc8-eb8c861baa60-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-lb2r7\" (UID: \"6dca0862-8223-4ba0-9fc8-eb8c861baa60\") " pod="openstack/dnsmasq-dns-57d769cc4f-lb2r7" Mar 09 14:23:47 crc kubenswrapper[4722]: I0309 14:23:47.074054 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6dca0862-8223-4ba0-9fc8-eb8c861baa60-config\") pod \"dnsmasq-dns-57d769cc4f-lb2r7\" (UID: \"6dca0862-8223-4ba0-9fc8-eb8c861baa60\") " pod="openstack/dnsmasq-dns-57d769cc4f-lb2r7" Mar 09 14:23:47 crc kubenswrapper[4722]: I0309 14:23:47.103733 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dg249\" (UniqueName: 
\"kubernetes.io/projected/6dca0862-8223-4ba0-9fc8-eb8c861baa60-kube-api-access-dg249\") pod \"dnsmasq-dns-57d769cc4f-lb2r7\" (UID: \"6dca0862-8223-4ba0-9fc8-eb8c861baa60\") " pod="openstack/dnsmasq-dns-57d769cc4f-lb2r7" Mar 09 14:23:47 crc kubenswrapper[4722]: I0309 14:23:47.279154 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-lb2r7" Mar 09 14:23:47 crc kubenswrapper[4722]: I0309 14:23:47.459266 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-lpmrz"] Mar 09 14:23:47 crc kubenswrapper[4722]: W0309 14:23:47.473550 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd7183ce_685d_4fe4_9bc9_5d58c95bc9e0.slice/crio-6350dcb160191f9e91020920bf73d9e068cf8b1977185682e983f5532d411f46 WatchSource:0}: Error finding container 6350dcb160191f9e91020920bf73d9e068cf8b1977185682e983f5532d411f46: Status 404 returned error can't find the container with id 6350dcb160191f9e91020920bf73d9e068cf8b1977185682e983f5532d411f46 Mar 09 14:23:47 crc kubenswrapper[4722]: I0309 14:23:47.602020 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 09 14:23:47 crc kubenswrapper[4722]: I0309 14:23:47.604508 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 09 14:23:47 crc kubenswrapper[4722]: I0309 14:23:47.615930 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 09 14:23:47 crc kubenswrapper[4722]: I0309 14:23:47.616124 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 09 14:23:47 crc kubenswrapper[4722]: I0309 14:23:47.616261 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 09 14:23:47 crc kubenswrapper[4722]: I0309 14:23:47.616395 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 09 14:23:47 crc kubenswrapper[4722]: I0309 14:23:47.616492 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-sj5h5" Mar 09 14:23:47 crc kubenswrapper[4722]: I0309 14:23:47.616586 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 09 14:23:47 crc kubenswrapper[4722]: I0309 14:23:47.616695 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 09 14:23:47 crc kubenswrapper[4722]: I0309 14:23:47.638281 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 09 14:23:47 crc kubenswrapper[4722]: I0309 14:23:47.789516 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1c98e541-4b72-465d-8799-89e8c9791c3e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c98e541-4b72-465d-8799-89e8c9791c3e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 14:23:47 crc kubenswrapper[4722]: I0309 14:23:47.789548 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1c98e541-4b72-465d-8799-89e8c9791c3e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"1c98e541-4b72-465d-8799-89e8c9791c3e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 14:23:47 crc kubenswrapper[4722]: I0309 14:23:47.789565 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1c98e541-4b72-465d-8799-89e8c9791c3e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c98e541-4b72-465d-8799-89e8c9791c3e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 14:23:47 crc kubenswrapper[4722]: I0309 14:23:47.789593 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1c98e541-4b72-465d-8799-89e8c9791c3e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c98e541-4b72-465d-8799-89e8c9791c3e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 14:23:47 crc kubenswrapper[4722]: I0309 14:23:47.789618 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1c98e541-4b72-465d-8799-89e8c9791c3e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c98e541-4b72-465d-8799-89e8c9791c3e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 14:23:47 crc kubenswrapper[4722]: I0309 14:23:47.789637 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-35d3f19e-3c36-4038-9906-dc1e23e87cf8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-35d3f19e-3c36-4038-9906-dc1e23e87cf8\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c98e541-4b72-465d-8799-89e8c9791c3e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 14:23:47 crc kubenswrapper[4722]: I0309 14:23:47.789812 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xptdn\" (UniqueName: \"kubernetes.io/projected/1c98e541-4b72-465d-8799-89e8c9791c3e-kube-api-access-xptdn\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c98e541-4b72-465d-8799-89e8c9791c3e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 14:23:47 crc kubenswrapper[4722]: I0309 14:23:47.789937 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1c98e541-4b72-465d-8799-89e8c9791c3e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c98e541-4b72-465d-8799-89e8c9791c3e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 14:23:47 crc kubenswrapper[4722]: I0309 14:23:47.789983 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1c98e541-4b72-465d-8799-89e8c9791c3e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c98e541-4b72-465d-8799-89e8c9791c3e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 14:23:47 crc kubenswrapper[4722]: I0309 14:23:47.790048 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1c98e541-4b72-465d-8799-89e8c9791c3e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c98e541-4b72-465d-8799-89e8c9791c3e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 14:23:47 crc kubenswrapper[4722]: I0309 14:23:47.790182 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/1c98e541-4b72-465d-8799-89e8c9791c3e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c98e541-4b72-465d-8799-89e8c9791c3e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 14:23:47 crc kubenswrapper[4722]: I0309 14:23:47.798628 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-lpmrz" event={"ID":"dd7183ce-685d-4fe4-9bc9-5d58c95bc9e0","Type":"ContainerStarted","Data":"6350dcb160191f9e91020920bf73d9e068cf8b1977185682e983f5532d411f46"} Mar 09 14:23:47 crc kubenswrapper[4722]: I0309 14:23:47.893000 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1c98e541-4b72-465d-8799-89e8c9791c3e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c98e541-4b72-465d-8799-89e8c9791c3e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 14:23:47 crc kubenswrapper[4722]: I0309 14:23:47.893061 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1c98e541-4b72-465d-8799-89e8c9791c3e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c98e541-4b72-465d-8799-89e8c9791c3e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 14:23:47 crc kubenswrapper[4722]: I0309 14:23:47.893088 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-35d3f19e-3c36-4038-9906-dc1e23e87cf8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-35d3f19e-3c36-4038-9906-dc1e23e87cf8\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c98e541-4b72-465d-8799-89e8c9791c3e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 14:23:47 crc kubenswrapper[4722]: I0309 14:23:47.893149 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xptdn\" (UniqueName: \"kubernetes.io/projected/1c98e541-4b72-465d-8799-89e8c9791c3e-kube-api-access-xptdn\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c98e541-4b72-465d-8799-89e8c9791c3e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 14:23:47 crc kubenswrapper[4722]: I0309 14:23:47.893220 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1c98e541-4b72-465d-8799-89e8c9791c3e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c98e541-4b72-465d-8799-89e8c9791c3e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 14:23:47 crc kubenswrapper[4722]: I0309 14:23:47.893255 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1c98e541-4b72-465d-8799-89e8c9791c3e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c98e541-4b72-465d-8799-89e8c9791c3e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 14:23:47 crc kubenswrapper[4722]: I0309 14:23:47.893295 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1c98e541-4b72-465d-8799-89e8c9791c3e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c98e541-4b72-465d-8799-89e8c9791c3e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 14:23:47 crc kubenswrapper[4722]: I0309 14:23:47.893342 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1c98e541-4b72-465d-8799-89e8c9791c3e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"1c98e541-4b72-465d-8799-89e8c9791c3e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 14:23:47 crc kubenswrapper[4722]: I0309 14:23:47.893392 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1c98e541-4b72-465d-8799-89e8c9791c3e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c98e541-4b72-465d-8799-89e8c9791c3e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 14:23:47 crc kubenswrapper[4722]: I0309 14:23:47.893416 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1c98e541-4b72-465d-8799-89e8c9791c3e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c98e541-4b72-465d-8799-89e8c9791c3e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 14:23:47 crc kubenswrapper[4722]: I0309 14:23:47.893436 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1c98e541-4b72-465d-8799-89e8c9791c3e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c98e541-4b72-465d-8799-89e8c9791c3e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 14:23:47 crc kubenswrapper[4722]: I0309 14:23:47.894031 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1c98e541-4b72-465d-8799-89e8c9791c3e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c98e541-4b72-465d-8799-89e8c9791c3e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 14:23:47 crc kubenswrapper[4722]: I0309 14:23:47.894884 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1c98e541-4b72-465d-8799-89e8c9791c3e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c98e541-4b72-465d-8799-89e8c9791c3e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 14:23:47 crc kubenswrapper[4722]: I0309 14:23:47.895656 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1c98e541-4b72-465d-8799-89e8c9791c3e-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c98e541-4b72-465d-8799-89e8c9791c3e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 14:23:47 crc kubenswrapper[4722]: I0309 14:23:47.896488 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1c98e541-4b72-465d-8799-89e8c9791c3e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c98e541-4b72-465d-8799-89e8c9791c3e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 14:23:47 crc kubenswrapper[4722]: I0309 14:23:47.900367 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1c98e541-4b72-465d-8799-89e8c9791c3e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c98e541-4b72-465d-8799-89e8c9791c3e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 14:23:47 crc kubenswrapper[4722]: I0309 14:23:47.900713 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1c98e541-4b72-465d-8799-89e8c9791c3e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c98e541-4b72-465d-8799-89e8c9791c3e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 14:23:47 crc kubenswrapper[4722]: I0309 14:23:47.908179 4722 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1c98e541-4b72-465d-8799-89e8c9791c3e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c98e541-4b72-465d-8799-89e8c9791c3e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 14:23:47 crc kubenswrapper[4722]: I0309 14:23:47.910064 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 09 14:23:47 crc kubenswrapper[4722]: I0309 14:23:47.910099 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-35d3f19e-3c36-4038-9906-dc1e23e87cf8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-35d3f19e-3c36-4038-9906-dc1e23e87cf8\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c98e541-4b72-465d-8799-89e8c9791c3e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/bd40df8b19fd4b7f91f0427928b43bf8fc8992041a81fccf4003d9f0fcaf3986/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Mar 09 14:23:47 crc kubenswrapper[4722]: I0309 14:23:47.911454 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1c98e541-4b72-465d-8799-89e8c9791c3e-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c98e541-4b72-465d-8799-89e8c9791c3e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 14:23:47 crc kubenswrapper[4722]: I0309 14:23:47.915036 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1c98e541-4b72-465d-8799-89e8c9791c3e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c98e541-4b72-465d-8799-89e8c9791c3e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 14:23:47 crc kubenswrapper[4722]: I0309 14:23:47.917761 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xptdn\" (UniqueName: \"kubernetes.io/projected/1c98e541-4b72-465d-8799-89e8c9791c3e-kube-api-access-xptdn\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c98e541-4b72-465d-8799-89e8c9791c3e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 14:23:47 crc kubenswrapper[4722]: I0309 14:23:47.955681 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-35d3f19e-3c36-4038-9906-dc1e23e87cf8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-35d3f19e-3c36-4038-9906-dc1e23e87cf8\") pod \"rabbitmq-cell1-server-0\" (UID: \"1c98e541-4b72-465d-8799-89e8c9791c3e\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 14:23:47 crc kubenswrapper[4722]: I0309 14:23:47.978702 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-lb2r7"] Mar 09 14:23:47 crc kubenswrapper[4722]: I0309 14:23:47.995659 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 09 14:23:47 crc kubenswrapper[4722]: I0309 14:23:47.997561 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.003117 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.003129 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.003473 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.003681 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.003892 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-64x8b" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.004056 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.005020 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.053811 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.075435 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-2"] Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.078054 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-2" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.088847 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-1"] Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.090749 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-1" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.099231 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6f4e007a-4a18-40e6-bf96-4a751e00cd73-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"6f4e007a-4a18-40e6-bf96-4a751e00cd73\") " pod="openstack/rabbitmq-server-0" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.099353 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8l5q\" (UniqueName: \"kubernetes.io/projected/6f4e007a-4a18-40e6-bf96-4a751e00cd73-kube-api-access-w8l5q\") pod \"rabbitmq-server-0\" (UID: \"6f4e007a-4a18-40e6-bf96-4a751e00cd73\") " pod="openstack/rabbitmq-server-0" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.099411 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6f4e007a-4a18-40e6-bf96-4a751e00cd73-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6f4e007a-4a18-40e6-bf96-4a751e00cd73\") " pod="openstack/rabbitmq-server-0" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.099469 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6f4e007a-4a18-40e6-bf96-4a751e00cd73-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6f4e007a-4a18-40e6-bf96-4a751e00cd73\") " pod="openstack/rabbitmq-server-0" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.099509 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6f4e007a-4a18-40e6-bf96-4a751e00cd73-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6f4e007a-4a18-40e6-bf96-4a751e00cd73\") " pod="openstack/rabbitmq-server-0" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.099576 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6f4e007a-4a18-40e6-bf96-4a751e00cd73-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6f4e007a-4a18-40e6-bf96-4a751e00cd73\") " pod="openstack/rabbitmq-server-0" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.099652 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6f4e007a-4a18-40e6-bf96-4a751e00cd73-config-data\") pod \"rabbitmq-server-0\" (UID: \"6f4e007a-4a18-40e6-bf96-4a751e00cd73\") " pod="openstack/rabbitmq-server-0" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.100408 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6f4e007a-4a18-40e6-bf96-4a751e00cd73-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6f4e007a-4a18-40e6-bf96-4a751e00cd73\") " pod="openstack/rabbitmq-server-0" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.100530 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6f4e007a-4a18-40e6-bf96-4a751e00cd73-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6f4e007a-4a18-40e6-bf96-4a751e00cd73\") " pod="openstack/rabbitmq-server-0" Mar 09 
14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.100589 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d921ac5a-690a-43f2-832b-27c969b7b3d5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d921ac5a-690a-43f2-832b-27c969b7b3d5\") pod \"rabbitmq-server-0\" (UID: \"6f4e007a-4a18-40e6-bf96-4a751e00cd73\") " pod="openstack/rabbitmq-server-0" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.100670 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6f4e007a-4a18-40e6-bf96-4a751e00cd73-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6f4e007a-4a18-40e6-bf96-4a751e00cd73\") " pod="openstack/rabbitmq-server-0" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.103792 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-2"] Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.124769 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-1"] Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.203801 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/17ce7999-f86f-45fa-ae07-785f70d797a1-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"17ce7999-f86f-45fa-ae07-785f70d797a1\") " pod="openstack/rabbitmq-server-2" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.203876 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6f4e007a-4a18-40e6-bf96-4a751e00cd73-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6f4e007a-4a18-40e6-bf96-4a751e00cd73\") " pod="openstack/rabbitmq-server-0" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.203906 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c6ada086-becc-4f4a-a0a0-0aad894dc550-pod-info\") pod \"rabbitmq-server-1\" (UID: \"c6ada086-becc-4f4a-a0a0-0aad894dc550\") " pod="openstack/rabbitmq-server-1" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.203927 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/17ce7999-f86f-45fa-ae07-785f70d797a1-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"17ce7999-f86f-45fa-ae07-785f70d797a1\") " pod="openstack/rabbitmq-server-2" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.203953 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c6ada086-becc-4f4a-a0a0-0aad894dc550-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"c6ada086-becc-4f4a-a0a0-0aad894dc550\") " pod="openstack/rabbitmq-server-1" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.203989 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d921ac5a-690a-43f2-832b-27c969b7b3d5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d921ac5a-690a-43f2-832b-27c969b7b3d5\") pod \"rabbitmq-server-0\" (UID: \"6f4e007a-4a18-40e6-bf96-4a751e00cd73\") " pod="openstack/rabbitmq-server-0" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.204031 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6f4e007a-4a18-40e6-bf96-4a751e00cd73-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6f4e007a-4a18-40e6-bf96-4a751e00cd73\") " pod="openstack/rabbitmq-server-0" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.204074 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c6ada086-becc-4f4a-a0a0-0aad894dc550-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"c6ada086-becc-4f4a-a0a0-0aad894dc550\") " pod="openstack/rabbitmq-server-1" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.204126 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c6ada086-becc-4f4a-a0a0-0aad894dc550-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"c6ada086-becc-4f4a-a0a0-0aad894dc550\") " pod="openstack/rabbitmq-server-1" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.204557 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pvdb\" (UniqueName: \"kubernetes.io/projected/c6ada086-becc-4f4a-a0a0-0aad894dc550-kube-api-access-4pvdb\") pod \"rabbitmq-server-1\" (UID: \"c6ada086-becc-4f4a-a0a0-0aad894dc550\") " pod="openstack/rabbitmq-server-1" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.204685 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/17ce7999-f86f-45fa-ae07-785f70d797a1-server-conf\") pod \"rabbitmq-server-2\" (UID: \"17ce7999-f86f-45fa-ae07-785f70d797a1\") " pod="openstack/rabbitmq-server-2" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.204827 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6f4e007a-4a18-40e6-bf96-4a751e00cd73-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6f4e007a-4a18-40e6-bf96-4a751e00cd73\") " pod="openstack/rabbitmq-server-0" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.204882 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c6ada086-becc-4f4a-a0a0-0aad894dc550-server-conf\") pod \"rabbitmq-server-1\" (UID: \"c6ada086-becc-4f4a-a0a0-0aad894dc550\") " pod="openstack/rabbitmq-server-1" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.204960 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c6ada086-becc-4f4a-a0a0-0aad894dc550-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"c6ada086-becc-4f4a-a0a0-0aad894dc550\") " pod="openstack/rabbitmq-server-1" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.204986 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpn7f\" (UniqueName: \"kubernetes.io/projected/17ce7999-f86f-45fa-ae07-785f70d797a1-kube-api-access-zpn7f\") pod \"rabbitmq-server-2\" (UID: \"17ce7999-f86f-45fa-ae07-785f70d797a1\") " pod="openstack/rabbitmq-server-2" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.205014 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/6f4e007a-4a18-40e6-bf96-4a751e00cd73-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"6f4e007a-4a18-40e6-bf96-4a751e00cd73\") " pod="openstack/rabbitmq-server-0" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.205097 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-49d64f63-cdc8-4473-958f-e6d9024e45ab\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-49d64f63-cdc8-4473-958f-e6d9024e45ab\") pod \"rabbitmq-server-2\" (UID: \"17ce7999-f86f-45fa-ae07-785f70d797a1\") " pod="openstack/rabbitmq-server-2" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.205124 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8l5q\" (UniqueName: \"kubernetes.io/projected/6f4e007a-4a18-40e6-bf96-4a751e00cd73-kube-api-access-w8l5q\") pod \"rabbitmq-server-0\" (UID: \"6f4e007a-4a18-40e6-bf96-4a751e00cd73\") " pod="openstack/rabbitmq-server-0" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.205173 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6f4e007a-4a18-40e6-bf96-4a751e00cd73-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6f4e007a-4a18-40e6-bf96-4a751e00cd73\") " pod="openstack/rabbitmq-server-0" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.205240 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6f4e007a-4a18-40e6-bf96-4a751e00cd73-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6f4e007a-4a18-40e6-bf96-4a751e00cd73\") " pod="openstack/rabbitmq-server-0" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.205279 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c6ada086-becc-4f4a-a0a0-0aad894dc550-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"c6ada086-becc-4f4a-a0a0-0aad894dc550\") " pod="openstack/rabbitmq-server-1" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.205321 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c6ada086-becc-4f4a-a0a0-0aad894dc550-config-data\") pod \"rabbitmq-server-1\" (UID: \"c6ada086-becc-4f4a-a0a0-0aad894dc550\") " pod="openstack/rabbitmq-server-1" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.205347 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6f4e007a-4a18-40e6-bf96-4a751e00cd73-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6f4e007a-4a18-40e6-bf96-4a751e00cd73\") " pod="openstack/rabbitmq-server-0" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.205370 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-2885c5b9-d401-48f7-ae60-bfa289e45112\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2885c5b9-d401-48f7-ae60-bfa289e45112\") pod \"rabbitmq-server-1\" (UID: \"c6ada086-becc-4f4a-a0a0-0aad894dc550\") " pod="openstack/rabbitmq-server-1" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.205409 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/17ce7999-f86f-45fa-ae07-785f70d797a1-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"17ce7999-f86f-45fa-ae07-785f70d797a1\") " pod="openstack/rabbitmq-server-2" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.205425 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/17ce7999-f86f-45fa-ae07-785f70d797a1-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"17ce7999-f86f-45fa-ae07-785f70d797a1\") " pod="openstack/rabbitmq-server-2" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.205454 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/17ce7999-f86f-45fa-ae07-785f70d797a1-pod-info\") pod \"rabbitmq-server-2\" (UID: \"17ce7999-f86f-45fa-ae07-785f70d797a1\") " pod="openstack/rabbitmq-server-2" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.205481 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6f4e007a-4a18-40e6-bf96-4a751e00cd73-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6f4e007a-4a18-40e6-bf96-4a751e00cd73\") " pod="openstack/rabbitmq-server-0" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.205539 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/17ce7999-f86f-45fa-ae07-785f70d797a1-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"17ce7999-f86f-45fa-ae07-785f70d797a1\") " pod="openstack/rabbitmq-server-2" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.205557 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/17ce7999-f86f-45fa-ae07-785f70d797a1-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"17ce7999-f86f-45fa-ae07-785f70d797a1\") " pod="openstack/rabbitmq-server-2" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.205591 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6f4e007a-4a18-40e6-bf96-4a751e00cd73-config-data\") pod \"rabbitmq-server-0\" (UID: \"6f4e007a-4a18-40e6-bf96-4a751e00cd73\") " pod="openstack/rabbitmq-server-0" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.205613 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c6ada086-becc-4f4a-a0a0-0aad894dc550-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"c6ada086-becc-4f4a-a0a0-0aad894dc550\") " pod="openstack/rabbitmq-server-1" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.205630 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6f4e007a-4a18-40e6-bf96-4a751e00cd73-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6f4e007a-4a18-40e6-bf96-4a751e00cd73\") " pod="openstack/rabbitmq-server-0" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.205662 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/17ce7999-f86f-45fa-ae07-785f70d797a1-config-data\") pod \"rabbitmq-server-2\" (UID: \"17ce7999-f86f-45fa-ae07-785f70d797a1\") 
" pod="openstack/rabbitmq-server-2" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.205665 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6f4e007a-4a18-40e6-bf96-4a751e00cd73-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6f4e007a-4a18-40e6-bf96-4a751e00cd73\") " pod="openstack/rabbitmq-server-0" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.205701 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6f4e007a-4a18-40e6-bf96-4a751e00cd73-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6f4e007a-4a18-40e6-bf96-4a751e00cd73\") " pod="openstack/rabbitmq-server-0" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.206414 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6f4e007a-4a18-40e6-bf96-4a751e00cd73-config-data\") pod \"rabbitmq-server-0\" (UID: \"6f4e007a-4a18-40e6-bf96-4a751e00cd73\") " pod="openstack/rabbitmq-server-0" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.206909 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6f4e007a-4a18-40e6-bf96-4a751e00cd73-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6f4e007a-4a18-40e6-bf96-4a751e00cd73\") " pod="openstack/rabbitmq-server-0" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.207386 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.207410 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d921ac5a-690a-43f2-832b-27c969b7b3d5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d921ac5a-690a-43f2-832b-27c969b7b3d5\") pod \"rabbitmq-server-0\" (UID: \"6f4e007a-4a18-40e6-bf96-4a751e00cd73\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f27f5e88ec17f45c75ef399615f6a29b6f2764d2fa42d7f4a87340be7a71cdcc/globalmount\"" pod="openstack/rabbitmq-server-0" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.212574 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6f4e007a-4a18-40e6-bf96-4a751e00cd73-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"6f4e007a-4a18-40e6-bf96-4a751e00cd73\") " pod="openstack/rabbitmq-server-0" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.220391 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6f4e007a-4a18-40e6-bf96-4a751e00cd73-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6f4e007a-4a18-40e6-bf96-4a751e00cd73\") " pod="openstack/rabbitmq-server-0" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.225654 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6f4e007a-4a18-40e6-bf96-4a751e00cd73-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6f4e007a-4a18-40e6-bf96-4a751e00cd73\") " pod="openstack/rabbitmq-server-0" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.228560 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/6f4e007a-4a18-40e6-bf96-4a751e00cd73-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6f4e007a-4a18-40e6-bf96-4a751e00cd73\") " pod="openstack/rabbitmq-server-0" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.241993 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8l5q\" (UniqueName: \"kubernetes.io/projected/6f4e007a-4a18-40e6-bf96-4a751e00cd73-kube-api-access-w8l5q\") pod \"rabbitmq-server-0\" (UID: \"6f4e007a-4a18-40e6-bf96-4a751e00cd73\") " pod="openstack/rabbitmq-server-0" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.255145 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.276617 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d921ac5a-690a-43f2-832b-27c969b7b3d5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d921ac5a-690a-43f2-832b-27c969b7b3d5\") pod \"rabbitmq-server-0\" (UID: \"6f4e007a-4a18-40e6-bf96-4a751e00cd73\") " pod="openstack/rabbitmq-server-0" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.308179 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c6ada086-becc-4f4a-a0a0-0aad894dc550-pod-info\") pod \"rabbitmq-server-1\" (UID: \"c6ada086-becc-4f4a-a0a0-0aad894dc550\") " pod="openstack/rabbitmq-server-1" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.308225 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/17ce7999-f86f-45fa-ae07-785f70d797a1-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"17ce7999-f86f-45fa-ae07-785f70d797a1\") " pod="openstack/rabbitmq-server-2" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.308254 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c6ada086-becc-4f4a-a0a0-0aad894dc550-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"c6ada086-becc-4f4a-a0a0-0aad894dc550\") " pod="openstack/rabbitmq-server-1" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.308294 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c6ada086-becc-4f4a-a0a0-0aad894dc550-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"c6ada086-becc-4f4a-a0a0-0aad894dc550\") " pod="openstack/rabbitmq-server-1" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.308320 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c6ada086-becc-4f4a-a0a0-0aad894dc550-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"c6ada086-becc-4f4a-a0a0-0aad894dc550\") " pod="openstack/rabbitmq-server-1" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.308339 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pvdb\" (UniqueName: \"kubernetes.io/projected/c6ada086-becc-4f4a-a0a0-0aad894dc550-kube-api-access-4pvdb\") pod \"rabbitmq-server-1\" (UID: \"c6ada086-becc-4f4a-a0a0-0aad894dc550\") " pod="openstack/rabbitmq-server-1" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.308362 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/17ce7999-f86f-45fa-ae07-785f70d797a1-server-conf\") pod \"rabbitmq-server-2\" (UID: \"17ce7999-f86f-45fa-ae07-785f70d797a1\") " pod="openstack/rabbitmq-server-2" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.308378 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c6ada086-becc-4f4a-a0a0-0aad894dc550-server-conf\") pod \"rabbitmq-server-1\" (UID: \"c6ada086-becc-4f4a-a0a0-0aad894dc550\") " pod="openstack/rabbitmq-server-1" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.308398 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c6ada086-becc-4f4a-a0a0-0aad894dc550-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"c6ada086-becc-4f4a-a0a0-0aad894dc550\") " pod="openstack/rabbitmq-server-1" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.308415 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpn7f\" (UniqueName: \"kubernetes.io/projected/17ce7999-f86f-45fa-ae07-785f70d797a1-kube-api-access-zpn7f\") pod \"rabbitmq-server-2\" (UID: \"17ce7999-f86f-45fa-ae07-785f70d797a1\") " pod="openstack/rabbitmq-server-2" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.308442 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-49d64f63-cdc8-4473-958f-e6d9024e45ab\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-49d64f63-cdc8-4473-958f-e6d9024e45ab\") pod \"rabbitmq-server-2\" (UID: \"17ce7999-f86f-45fa-ae07-785f70d797a1\") " pod="openstack/rabbitmq-server-2" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.308480 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c6ada086-becc-4f4a-a0a0-0aad894dc550-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"c6ada086-becc-4f4a-a0a0-0aad894dc550\") " pod="openstack/rabbitmq-server-1" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.308494 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c6ada086-becc-4f4a-a0a0-0aad894dc550-config-data\") pod \"rabbitmq-server-1\" (UID: \"c6ada086-becc-4f4a-a0a0-0aad894dc550\") " pod="openstack/rabbitmq-server-1" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.308508 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-2885c5b9-d401-48f7-ae60-bfa289e45112\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2885c5b9-d401-48f7-ae60-bfa289e45112\") pod \"rabbitmq-server-1\" (UID: \"c6ada086-becc-4f4a-a0a0-0aad894dc550\") " pod="openstack/rabbitmq-server-1" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.308526 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/17ce7999-f86f-45fa-ae07-785f70d797a1-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"17ce7999-f86f-45fa-ae07-785f70d797a1\") " pod="openstack/rabbitmq-server-2" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.308539 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/17ce7999-f86f-45fa-ae07-785f70d797a1-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"17ce7999-f86f-45fa-ae07-785f70d797a1\") " 
pod="openstack/rabbitmq-server-2" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.308590 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/17ce7999-f86f-45fa-ae07-785f70d797a1-pod-info\") pod \"rabbitmq-server-2\" (UID: \"17ce7999-f86f-45fa-ae07-785f70d797a1\") " pod="openstack/rabbitmq-server-2" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.308627 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/17ce7999-f86f-45fa-ae07-785f70d797a1-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"17ce7999-f86f-45fa-ae07-785f70d797a1\") " pod="openstack/rabbitmq-server-2" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.308646 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/17ce7999-f86f-45fa-ae07-785f70d797a1-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"17ce7999-f86f-45fa-ae07-785f70d797a1\") " pod="openstack/rabbitmq-server-2" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.308666 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c6ada086-becc-4f4a-a0a0-0aad894dc550-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"c6ada086-becc-4f4a-a0a0-0aad894dc550\") " pod="openstack/rabbitmq-server-1" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.308686 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/17ce7999-f86f-45fa-ae07-785f70d797a1-config-data\") pod \"rabbitmq-server-2\" (UID: \"17ce7999-f86f-45fa-ae07-785f70d797a1\") " pod="openstack/rabbitmq-server-2" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.308706 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/17ce7999-f86f-45fa-ae07-785f70d797a1-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"17ce7999-f86f-45fa-ae07-785f70d797a1\") " pod="openstack/rabbitmq-server-2" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.309113 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/17ce7999-f86f-45fa-ae07-785f70d797a1-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"17ce7999-f86f-45fa-ae07-785f70d797a1\") " pod="openstack/rabbitmq-server-2" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.309894 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/17ce7999-f86f-45fa-ae07-785f70d797a1-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"17ce7999-f86f-45fa-ae07-785f70d797a1\") " pod="openstack/rabbitmq-server-2" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.312109 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c6ada086-becc-4f4a-a0a0-0aad894dc550-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"c6ada086-becc-4f4a-a0a0-0aad894dc550\") " pod="openstack/rabbitmq-server-1" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.313804 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/17ce7999-f86f-45fa-ae07-785f70d797a1-server-conf\") pod \"rabbitmq-server-2\" (UID: \"17ce7999-f86f-45fa-ae07-785f70d797a1\") " pod="openstack/rabbitmq-server-2" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.314041 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c6ada086-becc-4f4a-a0a0-0aad894dc550-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"c6ada086-becc-4f4a-a0a0-0aad894dc550\") " pod="openstack/rabbitmq-server-1" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.314241 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c6ada086-becc-4f4a-a0a0-0aad894dc550-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"c6ada086-becc-4f4a-a0a0-0aad894dc550\") " pod="openstack/rabbitmq-server-1" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.314307 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c6ada086-becc-4f4a-a0a0-0aad894dc550-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"c6ada086-becc-4f4a-a0a0-0aad894dc550\") " pod="openstack/rabbitmq-server-1" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.314891 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c6ada086-becc-4f4a-a0a0-0aad894dc550-config-data\") pod \"rabbitmq-server-1\" (UID: \"c6ada086-becc-4f4a-a0a0-0aad894dc550\") " pod="openstack/rabbitmq-server-1" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.316041 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.316082 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-2885c5b9-d401-48f7-ae60-bfa289e45112\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2885c5b9-d401-48f7-ae60-bfa289e45112\") pod \"rabbitmq-server-1\" (UID: \"c6ada086-becc-4f4a-a0a0-0aad894dc550\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/723a9ce05431756e03b00745520f00559943d81b66433ce834e3e67d95e138ab/globalmount\"" pod="openstack/rabbitmq-server-1" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.316145 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.316170 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-49d64f63-cdc8-4473-958f-e6d9024e45ab\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-49d64f63-cdc8-4473-958f-e6d9024e45ab\") pod \"rabbitmq-server-2\" (UID: \"17ce7999-f86f-45fa-ae07-785f70d797a1\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/0c27981484d45a0452f6bc7b25565dc834ac0db89d430ab9341cec8b8dfe57f8/globalmount\"" pod="openstack/rabbitmq-server-2" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.316594 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c6ada086-becc-4f4a-a0a0-0aad894dc550-server-conf\") pod \"rabbitmq-server-1\" (UID: \"c6ada086-becc-4f4a-a0a0-0aad894dc550\") " pod="openstack/rabbitmq-server-1" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.318574 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/17ce7999-f86f-45fa-ae07-785f70d797a1-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"17ce7999-f86f-45fa-ae07-785f70d797a1\") " pod="openstack/rabbitmq-server-2" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.319455 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/17ce7999-f86f-45fa-ae07-785f70d797a1-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"17ce7999-f86f-45fa-ae07-785f70d797a1\") " pod="openstack/rabbitmq-server-2" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.325091 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c6ada086-becc-4f4a-a0a0-0aad894dc550-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"c6ada086-becc-4f4a-a0a0-0aad894dc550\") " pod="openstack/rabbitmq-server-1" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.325486 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/17ce7999-f86f-45fa-ae07-785f70d797a1-config-data\") pod \"rabbitmq-server-2\" (UID: \"17ce7999-f86f-45fa-ae07-785f70d797a1\") " pod="openstack/rabbitmq-server-2" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.325648 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c6ada086-becc-4f4a-a0a0-0aad894dc550-pod-info\") pod \"rabbitmq-server-1\" (UID: \"c6ada086-becc-4f4a-a0a0-0aad894dc550\") " pod="openstack/rabbitmq-server-1" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.327840 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/17ce7999-f86f-45fa-ae07-785f70d797a1-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"17ce7999-f86f-45fa-ae07-785f70d797a1\") " pod="openstack/rabbitmq-server-2" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.329135 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/17ce7999-f86f-45fa-ae07-785f70d797a1-pod-info\") pod \"rabbitmq-server-2\" (UID: \"17ce7999-f86f-45fa-ae07-785f70d797a1\") " pod="openstack/rabbitmq-server-2" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.329985 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c6ada086-becc-4f4a-a0a0-0aad894dc550-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"c6ada086-becc-4f4a-a0a0-0aad894dc550\") " pod="openstack/rabbitmq-server-1" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.331482 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpn7f\" (UniqueName: \"kubernetes.io/projected/17ce7999-f86f-45fa-ae07-785f70d797a1-kube-api-access-zpn7f\") pod \"rabbitmq-server-2\" (UID: \"17ce7999-f86f-45fa-ae07-785f70d797a1\") " pod="openstack/rabbitmq-server-2" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.332094 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pvdb\" (UniqueName: \"kubernetes.io/projected/c6ada086-becc-4f4a-a0a0-0aad894dc550-kube-api-access-4pvdb\") pod \"rabbitmq-server-1\" (UID: \"c6ada086-becc-4f4a-a0a0-0aad894dc550\") " pod="openstack/rabbitmq-server-1" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.335315 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/17ce7999-f86f-45fa-ae07-785f70d797a1-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"17ce7999-f86f-45fa-ae07-785f70d797a1\") " pod="openstack/rabbitmq-server-2" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.360126 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.360732 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-2885c5b9-d401-48f7-ae60-bfa289e45112\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2885c5b9-d401-48f7-ae60-bfa289e45112\") pod \"rabbitmq-server-1\" (UID: \"c6ada086-becc-4f4a-a0a0-0aad894dc550\") " pod="openstack/rabbitmq-server-1" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.381309 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-49d64f63-cdc8-4473-958f-e6d9024e45ab\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-49d64f63-cdc8-4473-958f-e6d9024e45ab\") pod \"rabbitmq-server-2\" (UID: \"17ce7999-f86f-45fa-ae07-785f70d797a1\") " pod="openstack/rabbitmq-server-2" Mar 09 14:23:48 crc kubenswrapper[4722]: I0309 14:23:48.412076 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-2" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:48.428376 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-1" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:48.807290 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:48.809345 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:48.811885 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:48.812454 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-64zsq" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:48.812744 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:48.813654 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-lb2r7" event={"ID":"6dca0862-8223-4ba0-9fc8-eb8c861baa60","Type":"ContainerStarted","Data":"576e33664ab0ccb35a12912c364ded97a43df7fdb6084047b444222b6f2b2678"} Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:48.819957 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:48.820771 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:48.828490 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:48.919586 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwnhb\" (UniqueName: \"kubernetes.io/projected/a79aaedb-92ba-42fc-8cc7-9ecb007d2ac7-kube-api-access-hwnhb\") pod \"openstack-galera-0\" (UID: \"a79aaedb-92ba-42fc-8cc7-9ecb007d2ac7\") " pod="openstack/openstack-galera-0" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:48.919638 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a79aaedb-92ba-42fc-8cc7-9ecb007d2ac7-kolla-config\") pod \"openstack-galera-0\" (UID: \"a79aaedb-92ba-42fc-8cc7-9ecb007d2ac7\") " pod="openstack/openstack-galera-0" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:48.919666 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a79aaedb-92ba-42fc-8cc7-9ecb007d2ac7-operator-scripts\") pod \"openstack-galera-0\" (UID: \"a79aaedb-92ba-42fc-8cc7-9ecb007d2ac7\") " pod="openstack/openstack-galera-0" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:48.919715 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a79aaedb-92ba-42fc-8cc7-9ecb007d2ac7-config-data-generated\") pod \"openstack-galera-0\" (UID: \"a79aaedb-92ba-42fc-8cc7-9ecb007d2ac7\") " pod="openstack/openstack-galera-0" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:48.919749 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-e9615e94-13d7-4346-840b-d2c55a1390c7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e9615e94-13d7-4346-840b-d2c55a1390c7\") pod \"openstack-galera-0\" (UID: \"a79aaedb-92ba-42fc-8cc7-9ecb007d2ac7\") " pod="openstack/openstack-galera-0" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:48.919818 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a79aaedb-92ba-42fc-8cc7-9ecb007d2ac7-config-data-default\") pod \"openstack-galera-0\" (UID: \"a79aaedb-92ba-42fc-8cc7-9ecb007d2ac7\") " pod="openstack/openstack-galera-0" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:48.919837 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a79aaedb-92ba-42fc-8cc7-9ecb007d2ac7-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"a79aaedb-92ba-42fc-8cc7-9ecb007d2ac7\") " pod="openstack/openstack-galera-0" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:48.919851 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a79aaedb-92ba-42fc-8cc7-9ecb007d2ac7-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"a79aaedb-92ba-42fc-8cc7-9ecb007d2ac7\") " pod="openstack/openstack-galera-0" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:49.021961 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a79aaedb-92ba-42fc-8cc7-9ecb007d2ac7-config-data-default\") pod \"openstack-galera-0\" (UID: \"a79aaedb-92ba-42fc-8cc7-9ecb007d2ac7\") " pod="openstack/openstack-galera-0" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:49.022050 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a79aaedb-92ba-42fc-8cc7-9ecb007d2ac7-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"a79aaedb-92ba-42fc-8cc7-9ecb007d2ac7\") " pod="openstack/openstack-galera-0" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:49.022087 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a79aaedb-92ba-42fc-8cc7-9ecb007d2ac7-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"a79aaedb-92ba-42fc-8cc7-9ecb007d2ac7\") " pod="openstack/openstack-galera-0" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:49.022116 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwnhb\" (UniqueName: \"kubernetes.io/projected/a79aaedb-92ba-42fc-8cc7-9ecb007d2ac7-kube-api-access-hwnhb\") pod \"openstack-galera-0\" (UID: \"a79aaedb-92ba-42fc-8cc7-9ecb007d2ac7\") " pod="openstack/openstack-galera-0" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:49.022144 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a79aaedb-92ba-42fc-8cc7-9ecb007d2ac7-kolla-config\") pod \"openstack-galera-0\" (UID: \"a79aaedb-92ba-42fc-8cc7-9ecb007d2ac7\") " pod="openstack/openstack-galera-0" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:49.022181 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a79aaedb-92ba-42fc-8cc7-9ecb007d2ac7-operator-scripts\") pod \"openstack-galera-0\" (UID: \"a79aaedb-92ba-42fc-8cc7-9ecb007d2ac7\") " pod="openstack/openstack-galera-0" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:49.022249 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a79aaedb-92ba-42fc-8cc7-9ecb007d2ac7-config-data-generated\") pod 
\"openstack-galera-0\" (UID: \"a79aaedb-92ba-42fc-8cc7-9ecb007d2ac7\") " pod="openstack/openstack-galera-0" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:49.022288 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-e9615e94-13d7-4346-840b-d2c55a1390c7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e9615e94-13d7-4346-840b-d2c55a1390c7\") pod \"openstack-galera-0\" (UID: \"a79aaedb-92ba-42fc-8cc7-9ecb007d2ac7\") " pod="openstack/openstack-galera-0" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:49.023758 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a79aaedb-92ba-42fc-8cc7-9ecb007d2ac7-kolla-config\") pod \"openstack-galera-0\" (UID: \"a79aaedb-92ba-42fc-8cc7-9ecb007d2ac7\") " pod="openstack/openstack-galera-0" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:49.025045 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a79aaedb-92ba-42fc-8cc7-9ecb007d2ac7-config-data-default\") pod \"openstack-galera-0\" (UID: \"a79aaedb-92ba-42fc-8cc7-9ecb007d2ac7\") " pod="openstack/openstack-galera-0" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:49.025273 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a79aaedb-92ba-42fc-8cc7-9ecb007d2ac7-operator-scripts\") pod \"openstack-galera-0\" (UID: \"a79aaedb-92ba-42fc-8cc7-9ecb007d2ac7\") " pod="openstack/openstack-galera-0" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:49.025579 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a79aaedb-92ba-42fc-8cc7-9ecb007d2ac7-config-data-generated\") pod \"openstack-galera-0\" (UID: \"a79aaedb-92ba-42fc-8cc7-9ecb007d2ac7\") " pod="openstack/openstack-galera-0" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:49.029236 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:49.029278 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-e9615e94-13d7-4346-840b-d2c55a1390c7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e9615e94-13d7-4346-840b-d2c55a1390c7\") pod \"openstack-galera-0\" (UID: \"a79aaedb-92ba-42fc-8cc7-9ecb007d2ac7\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/44c7808e02d285205dbe3b8057f7237650bf78781cc6f5d6a1e7ec949181f3a2/globalmount\"" pod="openstack/openstack-galera-0" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:49.029514 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a79aaedb-92ba-42fc-8cc7-9ecb007d2ac7-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"a79aaedb-92ba-42fc-8cc7-9ecb007d2ac7\") " pod="openstack/openstack-galera-0" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:49.030603 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/a79aaedb-92ba-42fc-8cc7-9ecb007d2ac7-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"a79aaedb-92ba-42fc-8cc7-9ecb007d2ac7\") " pod="openstack/openstack-galera-0" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:49.055406 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwnhb\" (UniqueName: \"kubernetes.io/projected/a79aaedb-92ba-42fc-8cc7-9ecb007d2ac7-kube-api-access-hwnhb\") pod \"openstack-galera-0\" (UID: \"a79aaedb-92ba-42fc-8cc7-9ecb007d2ac7\") " pod="openstack/openstack-galera-0" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:49.109235 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-e9615e94-13d7-4346-840b-d2c55a1390c7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-e9615e94-13d7-4346-840b-d2c55a1390c7\") pod \"openstack-galera-0\" (UID: \"a79aaedb-92ba-42fc-8cc7-9ecb007d2ac7\") " pod="openstack/openstack-galera-0" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:49.147802 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:50.125667 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:50.140188 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:50.150276 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:50.150403 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:50.150445 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-ncxjt" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:50.150474 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:50.190789 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:50.255406 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4159e308-3ccf-45d9-a97b-8133542007a8-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"4159e308-3ccf-45d9-a97b-8133542007a8\") " pod="openstack/openstack-cell1-galera-0" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:50.255482 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4159e308-3ccf-45d9-a97b-8133542007a8-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"4159e308-3ccf-45d9-a97b-8133542007a8\") " pod="openstack/openstack-cell1-galera-0" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:50.255651 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4159e308-3ccf-45d9-a97b-8133542007a8-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"4159e308-3ccf-45d9-a97b-8133542007a8\") " pod="openstack/openstack-cell1-galera-0" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:50.255674 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvwgq\" (UniqueName: \"kubernetes.io/projected/4159e308-3ccf-45d9-a97b-8133542007a8-kube-api-access-gvwgq\") pod \"openstack-cell1-galera-0\" (UID: \"4159e308-3ccf-45d9-a97b-8133542007a8\") " pod="openstack/openstack-cell1-galera-0" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:50.255880 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4159e308-3ccf-45d9-a97b-8133542007a8-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"4159e308-3ccf-45d9-a97b-8133542007a8\") " pod="openstack/openstack-cell1-galera-0" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:50.255966 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-ce768298-dec2-4672-ba7d-800099f7bb8b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ce768298-dec2-4672-ba7d-800099f7bb8b\") pod \"openstack-cell1-galera-0\" (UID: \"4159e308-3ccf-45d9-a97b-8133542007a8\") " pod="openstack/openstack-cell1-galera-0" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:50.255992 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4159e308-3ccf-45d9-a97b-8133542007a8-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"4159e308-3ccf-45d9-a97b-8133542007a8\") " pod="openstack/openstack-cell1-galera-0" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:50.256016 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4159e308-3ccf-45d9-a97b-8133542007a8-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"4159e308-3ccf-45d9-a97b-8133542007a8\") " pod="openstack/openstack-cell1-galera-0" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:50.358575 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-ce768298-dec2-4672-ba7d-800099f7bb8b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ce768298-dec2-4672-ba7d-800099f7bb8b\") pod \"openstack-cell1-galera-0\" (UID: \"4159e308-3ccf-45d9-a97b-8133542007a8\") " pod="openstack/openstack-cell1-galera-0" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:50.358620 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4159e308-3ccf-45d9-a97b-8133542007a8-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"4159e308-3ccf-45d9-a97b-8133542007a8\") " pod="openstack/openstack-cell1-galera-0" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:50.358646 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4159e308-3ccf-45d9-a97b-8133542007a8-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"4159e308-3ccf-45d9-a97b-8133542007a8\") " pod="openstack/openstack-cell1-galera-0" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:50.358668 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4159e308-3ccf-45d9-a97b-8133542007a8-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"4159e308-3ccf-45d9-a97b-8133542007a8\") " pod="openstack/openstack-cell1-galera-0" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:50.358699 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4159e308-3ccf-45d9-a97b-8133542007a8-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"4159e308-3ccf-45d9-a97b-8133542007a8\") " pod="openstack/openstack-cell1-galera-0" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:50.358786 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4159e308-3ccf-45d9-a97b-8133542007a8-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"4159e308-3ccf-45d9-a97b-8133542007a8\") " pod="openstack/openstack-cell1-galera-0" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:50.358805 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvwgq\" (UniqueName: \"kubernetes.io/projected/4159e308-3ccf-45d9-a97b-8133542007a8-kube-api-access-gvwgq\") pod \"openstack-cell1-galera-0\" (UID: \"4159e308-3ccf-45d9-a97b-8133542007a8\") " pod="openstack/openstack-cell1-galera-0" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:50.358924 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4159e308-3ccf-45d9-a97b-8133542007a8-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"4159e308-3ccf-45d9-a97b-8133542007a8\") " pod="openstack/openstack-cell1-galera-0" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:50.362356 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4159e308-3ccf-45d9-a97b-8133542007a8-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"4159e308-3ccf-45d9-a97b-8133542007a8\") " pod="openstack/openstack-cell1-galera-0" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:50.365513 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4159e308-3ccf-45d9-a97b-8133542007a8-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"4159e308-3ccf-45d9-a97b-8133542007a8\") " pod="openstack/openstack-cell1-galera-0" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:50.366015 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4159e308-3ccf-45d9-a97b-8133542007a8-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"4159e308-3ccf-45d9-a97b-8133542007a8\") " pod="openstack/openstack-cell1-galera-0" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:50.366471 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4159e308-3ccf-45d9-a97b-8133542007a8-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"4159e308-3ccf-45d9-a97b-8133542007a8\") " pod="openstack/openstack-cell1-galera-0" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:50.367096 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:50.367120 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-ce768298-dec2-4672-ba7d-800099f7bb8b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ce768298-dec2-4672-ba7d-800099f7bb8b\") pod \"openstack-cell1-galera-0\" (UID: \"4159e308-3ccf-45d9-a97b-8133542007a8\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/dc578e1176610d2801d9cd21eee66e41bc6b04a67c4195574ab093d2dd1b5ae5/globalmount\"" pod="openstack/openstack-cell1-galera-0" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:50.393139 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4159e308-3ccf-45d9-a97b-8133542007a8-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"4159e308-3ccf-45d9-a97b-8133542007a8\") " pod="openstack/openstack-cell1-galera-0" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:50.405315 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:50.406437 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4159e308-3ccf-45d9-a97b-8133542007a8-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"4159e308-3ccf-45d9-a97b-8133542007a8\") " pod="openstack/openstack-cell1-galera-0" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:50.406809 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:50.408696 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvwgq\" (UniqueName: \"kubernetes.io/projected/4159e308-3ccf-45d9-a97b-8133542007a8-kube-api-access-gvwgq\") pod \"openstack-cell1-galera-0\" (UID: \"4159e308-3ccf-45d9-a97b-8133542007a8\") " pod="openstack/openstack-cell1-galera-0" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:50.409105 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-4n4rr" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:50.412721 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:50.413466 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:50.420946 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:50.461755 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a6188a6-3f71-48b5-9013-66d297c205a7-combined-ca-bundle\") pod \"memcached-0\" (UID: \"0a6188a6-3f71-48b5-9013-66d297c205a7\") " pod="openstack/memcached-0" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:50.462051 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktdrr\" (UniqueName: \"kubernetes.io/projected/0a6188a6-3f71-48b5-9013-66d297c205a7-kube-api-access-ktdrr\") pod \"memcached-0\" (UID: \"0a6188a6-3f71-48b5-9013-66d297c205a7\") " pod="openstack/memcached-0" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:50.462141 4722 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0a6188a6-3f71-48b5-9013-66d297c205a7-kolla-config\") pod \"memcached-0\" (UID: \"0a6188a6-3f71-48b5-9013-66d297c205a7\") " pod="openstack/memcached-0" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:50.462192 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a6188a6-3f71-48b5-9013-66d297c205a7-memcached-tls-certs\") pod \"memcached-0\" (UID: \"0a6188a6-3f71-48b5-9013-66d297c205a7\") " pod="openstack/memcached-0" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:50.462337 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0a6188a6-3f71-48b5-9013-66d297c205a7-config-data\") pod \"memcached-0\" (UID: \"0a6188a6-3f71-48b5-9013-66d297c205a7\") " pod="openstack/memcached-0" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:50.466069 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-ce768298-dec2-4672-ba7d-800099f7bb8b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ce768298-dec2-4672-ba7d-800099f7bb8b\") pod \"openstack-cell1-galera-0\" (UID: \"4159e308-3ccf-45d9-a97b-8133542007a8\") " pod="openstack/openstack-cell1-galera-0" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:50.506112 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:50.564443 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a6188a6-3f71-48b5-9013-66d297c205a7-combined-ca-bundle\") pod \"memcached-0\" (UID: \"0a6188a6-3f71-48b5-9013-66d297c205a7\") " pod="openstack/memcached-0" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:50.564600 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktdrr\" (UniqueName: \"kubernetes.io/projected/0a6188a6-3f71-48b5-9013-66d297c205a7-kube-api-access-ktdrr\") pod \"memcached-0\" (UID: \"0a6188a6-3f71-48b5-9013-66d297c205a7\") " pod="openstack/memcached-0" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:50.564681 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0a6188a6-3f71-48b5-9013-66d297c205a7-kolla-config\") pod \"memcached-0\" (UID: \"0a6188a6-3f71-48b5-9013-66d297c205a7\") " pod="openstack/memcached-0" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:50.564717 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a6188a6-3f71-48b5-9013-66d297c205a7-memcached-tls-certs\") pod \"memcached-0\" (UID: \"0a6188a6-3f71-48b5-9013-66d297c205a7\") " pod="openstack/memcached-0" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:50.565604 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0a6188a6-3f71-48b5-9013-66d297c205a7-config-data\") pod \"memcached-0\" (UID: \"0a6188a6-3f71-48b5-9013-66d297c205a7\") " pod="openstack/memcached-0" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:50.565982 4722 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0a6188a6-3f71-48b5-9013-66d297c205a7-kolla-config\") pod \"memcached-0\" (UID: \"0a6188a6-3f71-48b5-9013-66d297c205a7\") " pod="openstack/memcached-0" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:50.566620 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0a6188a6-3f71-48b5-9013-66d297c205a7-config-data\") pod \"memcached-0\" (UID: \"0a6188a6-3f71-48b5-9013-66d297c205a7\") " pod="openstack/memcached-0" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:50.586801 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a6188a6-3f71-48b5-9013-66d297c205a7-combined-ca-bundle\") pod \"memcached-0\" (UID: \"0a6188a6-3f71-48b5-9013-66d297c205a7\") " pod="openstack/memcached-0" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:50.587450 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a6188a6-3f71-48b5-9013-66d297c205a7-memcached-tls-certs\") pod \"memcached-0\" (UID: \"0a6188a6-3f71-48b5-9013-66d297c205a7\") " pod="openstack/memcached-0" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:50.596449 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktdrr\" (UniqueName: \"kubernetes.io/projected/0a6188a6-3f71-48b5-9013-66d297c205a7-kube-api-access-ktdrr\") pod \"memcached-0\" (UID: \"0a6188a6-3f71-48b5-9013-66d297c205a7\") " pod="openstack/memcached-0" Mar 09 14:23:50 crc kubenswrapper[4722]: I0309 14:23:50.827851 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 09 14:23:51 crc kubenswrapper[4722]: I0309 14:23:51.380110 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-2"] Mar 09 14:23:51 crc kubenswrapper[4722]: I0309 14:23:51.396766 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 09 14:23:51 crc kubenswrapper[4722]: I0309 14:23:51.404398 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 09 14:23:51 crc kubenswrapper[4722]: I0309 14:23:51.411233 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 09 14:23:51 crc kubenswrapper[4722]: I0309 14:23:51.418066 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-1"] Mar 09 14:23:52 crc kubenswrapper[4722]: I0309 14:23:52.914147 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 09 14:23:52 crc kubenswrapper[4722]: I0309 14:23:52.915743 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 09 14:23:52 crc kubenswrapper[4722]: I0309 14:23:52.924867 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-lds5f" Mar 09 14:23:52 crc kubenswrapper[4722]: I0309 14:23:52.927730 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 09 14:23:52 crc kubenswrapper[4722]: I0309 14:23:52.929086 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcfrm\" (UniqueName: \"kubernetes.io/projected/4d8e49ae-5932-472c-a714-7872980c5a9b-kube-api-access-xcfrm\") pod \"kube-state-metrics-0\" (UID: \"4d8e49ae-5932-472c-a714-7872980c5a9b\") " pod="openstack/kube-state-metrics-0" Mar 09 14:23:53 crc kubenswrapper[4722]: I0309 14:23:53.033319 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcfrm\" (UniqueName: \"kubernetes.io/projected/4d8e49ae-5932-472c-a714-7872980c5a9b-kube-api-access-xcfrm\") pod \"kube-state-metrics-0\" (UID: \"4d8e49ae-5932-472c-a714-7872980c5a9b\") " pod="openstack/kube-state-metrics-0" Mar 09 14:23:53 crc kubenswrapper[4722]: I0309 14:23:53.088658 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcfrm\" (UniqueName: \"kubernetes.io/projected/4d8e49ae-5932-472c-a714-7872980c5a9b-kube-api-access-xcfrm\") pod \"kube-state-metrics-0\" (UID: \"4d8e49ae-5932-472c-a714-7872980c5a9b\") " pod="openstack/kube-state-metrics-0" Mar 09 14:23:53 crc kubenswrapper[4722]: I0309 14:23:53.262020 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 09 14:23:53 crc kubenswrapper[4722]: I0309 14:23:53.779800 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-ui-dashboards-66cbf594b5-bzqhp"] Mar 09 14:23:53 crc kubenswrapper[4722]: I0309 14:23:53.781788 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-ui-dashboards-66cbf594b5-bzqhp" Mar 09 14:23:53 crc kubenswrapper[4722]: I0309 14:23:53.789888 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-ui-dashboards-sa-dockercfg-lv89c" Mar 09 14:23:53 crc kubenswrapper[4722]: I0309 14:23:53.790093 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-ui-dashboards" Mar 09 14:23:53 crc kubenswrapper[4722]: I0309 14:23:53.798231 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-ui-dashboards-66cbf594b5-bzqhp"] Mar 09 14:23:53 crc kubenswrapper[4722]: I0309 14:23:53.858568 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gpkg\" (UniqueName: \"kubernetes.io/projected/0d15a083-af10-4638-b6bc-9de1f89f123f-kube-api-access-4gpkg\") pod \"observability-ui-dashboards-66cbf594b5-bzqhp\" (UID: \"0d15a083-af10-4638-b6bc-9de1f89f123f\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-bzqhp" Mar 09 14:23:53 crc kubenswrapper[4722]: I0309 14:23:53.859065 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d15a083-af10-4638-b6bc-9de1f89f123f-serving-cert\") pod \"observability-ui-dashboards-66cbf594b5-bzqhp\" (UID: \"0d15a083-af10-4638-b6bc-9de1f89f123f\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-bzqhp" Mar 09 14:23:53 crc kubenswrapper[4722]: I0309 14:23:53.963543 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d15a083-af10-4638-b6bc-9de1f89f123f-serving-cert\") pod \"observability-ui-dashboards-66cbf594b5-bzqhp\" (UID: \"0d15a083-af10-4638-b6bc-9de1f89f123f\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-bzqhp" Mar 09 14:23:53 crc kubenswrapper[4722]: I0309 14:23:53.963605 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gpkg\" (UniqueName: \"kubernetes.io/projected/0d15a083-af10-4638-b6bc-9de1f89f123f-kube-api-access-4gpkg\") pod \"observability-ui-dashboards-66cbf594b5-bzqhp\" (UID: \"0d15a083-af10-4638-b6bc-9de1f89f123f\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-bzqhp" Mar 09 14:23:53 crc kubenswrapper[4722]: E0309 14:23:53.964089 4722 secret.go:188] Couldn't get secret openshift-operators/observability-ui-dashboards: secret "observability-ui-dashboards" not found Mar 09 14:23:53 crc kubenswrapper[4722]: E0309 14:23:53.964194 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d15a083-af10-4638-b6bc-9de1f89f123f-serving-cert podName:0d15a083-af10-4638-b6bc-9de1f89f123f nodeName:}" failed. No retries permitted until 2026-03-09 14:23:54.46417239 +0000 UTC m=+1275.019740966 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/0d15a083-af10-4638-b6bc-9de1f89f123f-serving-cert") pod "observability-ui-dashboards-66cbf594b5-bzqhp" (UID: "0d15a083-af10-4638-b6bc-9de1f89f123f") : secret "observability-ui-dashboards" not found Mar 09 14:23:53 crc kubenswrapper[4722]: I0309 14:23:53.996391 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gpkg\" (UniqueName: \"kubernetes.io/projected/0d15a083-af10-4638-b6bc-9de1f89f123f-kube-api-access-4gpkg\") pod \"observability-ui-dashboards-66cbf594b5-bzqhp\" (UID: \"0d15a083-af10-4638-b6bc-9de1f89f123f\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-bzqhp" Mar 09 14:23:54 crc kubenswrapper[4722]: I0309 14:23:54.089399 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-6dc4c5dd4b-rk4jt"] Mar 09 14:23:54 crc kubenswrapper[4722]: I0309 14:23:54.090842 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6dc4c5dd4b-rk4jt" Mar 09 14:23:54 crc kubenswrapper[4722]: I0309 14:23:54.098025 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6dc4c5dd4b-rk4jt"] Mar 09 14:23:54 crc kubenswrapper[4722]: I0309 14:23:54.169262 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6dc5b476-5a42-4c98-9a95-0e3b29f2f771-console-serving-cert\") pod \"console-6dc4c5dd4b-rk4jt\" (UID: \"6dc5b476-5a42-4c98-9a95-0e3b29f2f771\") " pod="openshift-console/console-6dc4c5dd4b-rk4jt" Mar 09 14:23:54 crc kubenswrapper[4722]: I0309 14:23:54.169347 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6dc5b476-5a42-4c98-9a95-0e3b29f2f771-service-ca\") pod \"console-6dc4c5dd4b-rk4jt\" (UID: \"6dc5b476-5a42-4c98-9a95-0e3b29f2f771\") " pod="openshift-console/console-6dc4c5dd4b-rk4jt" Mar 09 14:23:54 crc kubenswrapper[4722]: I0309 14:23:54.169574 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6dc5b476-5a42-4c98-9a95-0e3b29f2f771-console-oauth-config\") pod \"console-6dc4c5dd4b-rk4jt\" (UID: \"6dc5b476-5a42-4c98-9a95-0e3b29f2f771\") " pod="openshift-console/console-6dc4c5dd4b-rk4jt" Mar 09 14:23:54 crc kubenswrapper[4722]: I0309 14:23:54.169843 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6dc5b476-5a42-4c98-9a95-0e3b29f2f771-trusted-ca-bundle\") pod \"console-6dc4c5dd4b-rk4jt\" (UID: \"6dc5b476-5a42-4c98-9a95-0e3b29f2f771\") " pod="openshift-console/console-6dc4c5dd4b-rk4jt" Mar 09 14:23:54 crc kubenswrapper[4722]: I0309 14:23:54.170050 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6dc5b476-5a42-4c98-9a95-0e3b29f2f771-oauth-serving-cert\") pod \"console-6dc4c5dd4b-rk4jt\" (UID: \"6dc5b476-5a42-4c98-9a95-0e3b29f2f771\") " pod="openshift-console/console-6dc4c5dd4b-rk4jt" Mar 09 14:23:54 crc kubenswrapper[4722]: I0309 14:23:54.170121 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mtll\" (UniqueName: 
\"kubernetes.io/projected/6dc5b476-5a42-4c98-9a95-0e3b29f2f771-kube-api-access-5mtll\") pod \"console-6dc4c5dd4b-rk4jt\" (UID: \"6dc5b476-5a42-4c98-9a95-0e3b29f2f771\") " pod="openshift-console/console-6dc4c5dd4b-rk4jt" Mar 09 14:23:54 crc kubenswrapper[4722]: I0309 14:23:54.170227 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6dc5b476-5a42-4c98-9a95-0e3b29f2f771-console-config\") pod \"console-6dc4c5dd4b-rk4jt\" (UID: \"6dc5b476-5a42-4c98-9a95-0e3b29f2f771\") " pod="openshift-console/console-6dc4c5dd4b-rk4jt" Mar 09 14:23:54 crc kubenswrapper[4722]: I0309 14:23:54.271383 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6dc5b476-5a42-4c98-9a95-0e3b29f2f771-oauth-serving-cert\") pod \"console-6dc4c5dd4b-rk4jt\" (UID: \"6dc5b476-5a42-4c98-9a95-0e3b29f2f771\") " pod="openshift-console/console-6dc4c5dd4b-rk4jt" Mar 09 14:23:54 crc kubenswrapper[4722]: I0309 14:23:54.271436 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mtll\" (UniqueName: \"kubernetes.io/projected/6dc5b476-5a42-4c98-9a95-0e3b29f2f771-kube-api-access-5mtll\") pod \"console-6dc4c5dd4b-rk4jt\" (UID: \"6dc5b476-5a42-4c98-9a95-0e3b29f2f771\") " pod="openshift-console/console-6dc4c5dd4b-rk4jt" Mar 09 14:23:54 crc kubenswrapper[4722]: I0309 14:23:54.271470 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6dc5b476-5a42-4c98-9a95-0e3b29f2f771-console-config\") pod \"console-6dc4c5dd4b-rk4jt\" (UID: \"6dc5b476-5a42-4c98-9a95-0e3b29f2f771\") " pod="openshift-console/console-6dc4c5dd4b-rk4jt" Mar 09 14:23:54 crc kubenswrapper[4722]: I0309 14:23:54.271508 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6dc5b476-5a42-4c98-9a95-0e3b29f2f771-console-serving-cert\") pod \"console-6dc4c5dd4b-rk4jt\" (UID: \"6dc5b476-5a42-4c98-9a95-0e3b29f2f771\") " pod="openshift-console/console-6dc4c5dd4b-rk4jt" Mar 09 14:23:54 crc kubenswrapper[4722]: I0309 14:23:54.271559 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6dc5b476-5a42-4c98-9a95-0e3b29f2f771-service-ca\") pod \"console-6dc4c5dd4b-rk4jt\" (UID: \"6dc5b476-5a42-4c98-9a95-0e3b29f2f771\") " pod="openshift-console/console-6dc4c5dd4b-rk4jt" Mar 09 14:23:54 crc kubenswrapper[4722]: I0309 14:23:54.271598 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6dc5b476-5a42-4c98-9a95-0e3b29f2f771-console-oauth-config\") pod \"console-6dc4c5dd4b-rk4jt\" (UID: \"6dc5b476-5a42-4c98-9a95-0e3b29f2f771\") " pod="openshift-console/console-6dc4c5dd4b-rk4jt" Mar 09 14:23:54 crc kubenswrapper[4722]: I0309 14:23:54.271663 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6dc5b476-5a42-4c98-9a95-0e3b29f2f771-trusted-ca-bundle\") pod \"console-6dc4c5dd4b-rk4jt\" (UID: \"6dc5b476-5a42-4c98-9a95-0e3b29f2f771\") " pod="openshift-console/console-6dc4c5dd4b-rk4jt" Mar 09 14:23:54 crc kubenswrapper[4722]: I0309 14:23:54.272334 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" 
(UniqueName: \"kubernetes.io/configmap/6dc5b476-5a42-4c98-9a95-0e3b29f2f771-oauth-serving-cert\") pod \"console-6dc4c5dd4b-rk4jt\" (UID: \"6dc5b476-5a42-4c98-9a95-0e3b29f2f771\") " pod="openshift-console/console-6dc4c5dd4b-rk4jt" Mar 09 14:23:54 crc kubenswrapper[4722]: I0309 14:23:54.272395 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6dc5b476-5a42-4c98-9a95-0e3b29f2f771-console-config\") pod \"console-6dc4c5dd4b-rk4jt\" (UID: \"6dc5b476-5a42-4c98-9a95-0e3b29f2f771\") " pod="openshift-console/console-6dc4c5dd4b-rk4jt" Mar 09 14:23:54 crc kubenswrapper[4722]: I0309 14:23:54.275192 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6dc5b476-5a42-4c98-9a95-0e3b29f2f771-trusted-ca-bundle\") pod \"console-6dc4c5dd4b-rk4jt\" (UID: \"6dc5b476-5a42-4c98-9a95-0e3b29f2f771\") " pod="openshift-console/console-6dc4c5dd4b-rk4jt" Mar 09 14:23:54 crc kubenswrapper[4722]: I0309 14:23:54.276073 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6dc5b476-5a42-4c98-9a95-0e3b29f2f771-console-oauth-config\") pod \"console-6dc4c5dd4b-rk4jt\" (UID: \"6dc5b476-5a42-4c98-9a95-0e3b29f2f771\") " pod="openshift-console/console-6dc4c5dd4b-rk4jt" Mar 09 14:23:54 crc kubenswrapper[4722]: I0309 14:23:54.277957 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6dc5b476-5a42-4c98-9a95-0e3b29f2f771-service-ca\") pod \"console-6dc4c5dd4b-rk4jt\" (UID: \"6dc5b476-5a42-4c98-9a95-0e3b29f2f771\") " pod="openshift-console/console-6dc4c5dd4b-rk4jt" Mar 09 14:23:54 crc kubenswrapper[4722]: I0309 14:23:54.289039 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6dc5b476-5a42-4c98-9a95-0e3b29f2f771-console-serving-cert\") pod \"console-6dc4c5dd4b-rk4jt\" (UID: \"6dc5b476-5a42-4c98-9a95-0e3b29f2f771\") " pod="openshift-console/console-6dc4c5dd4b-rk4jt" Mar 09 14:23:54 crc kubenswrapper[4722]: I0309 14:23:54.292347 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 09 14:23:54 crc kubenswrapper[4722]: I0309 14:23:54.295183 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 09 14:23:54 crc kubenswrapper[4722]: I0309 14:23:54.306953 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-g5gbv" Mar 09 14:23:54 crc kubenswrapper[4722]: I0309 14:23:54.310278 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Mar 09 14:23:54 crc kubenswrapper[4722]: I0309 14:23:54.310485 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Mar 09 14:23:54 crc kubenswrapper[4722]: I0309 14:23:54.310599 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Mar 09 14:23:54 crc kubenswrapper[4722]: I0309 14:23:54.313390 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Mar 09 14:23:54 crc kubenswrapper[4722]: I0309 14:23:54.313781 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Mar 09 14:23:54 crc kubenswrapper[4722]: I0309 14:23:54.315392 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Mar 09 14:23:54 crc kubenswrapper[4722]: I0309 14:23:54.315749 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Mar 09 14:23:54 crc kubenswrapper[4722]: I0309 14:23:54.330732 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 09 14:23:54 crc kubenswrapper[4722]: I0309 14:23:54.342030 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mtll\" (UniqueName: \"kubernetes.io/projected/6dc5b476-5a42-4c98-9a95-0e3b29f2f771-kube-api-access-5mtll\") pod \"console-6dc4c5dd4b-rk4jt\" (UID: \"6dc5b476-5a42-4c98-9a95-0e3b29f2f771\") " pod="openshift-console/console-6dc4c5dd4b-rk4jt" Mar 09 14:23:54 crc kubenswrapper[4722]: I0309 14:23:54.434742 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6dc4c5dd4b-rk4jt" Mar 09 14:23:54 crc kubenswrapper[4722]: I0309 14:23:54.482619 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/e0db26a0-2877-48fa-b706-b5558f9973d5-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"e0db26a0-2877-48fa-b706-b5558f9973d5\") " pod="openstack/prometheus-metric-storage-0" Mar 09 14:23:54 crc kubenswrapper[4722]: I0309 14:23:54.483089 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7sslz\" (UniqueName: \"kubernetes.io/projected/e0db26a0-2877-48fa-b706-b5558f9973d5-kube-api-access-7sslz\") pod \"prometheus-metric-storage-0\" (UID: \"e0db26a0-2877-48fa-b706-b5558f9973d5\") " pod="openstack/prometheus-metric-storage-0" Mar 09 14:23:54 crc kubenswrapper[4722]: I0309 14:23:54.483132 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e0db26a0-2877-48fa-b706-b5558f9973d5-config\") pod \"prometheus-metric-storage-0\" (UID: \"e0db26a0-2877-48fa-b706-b5558f9973d5\") " pod="openstack/prometheus-metric-storage-0" Mar 09 14:23:54 crc kubenswrapper[4722]: I0309 14:23:54.483162 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e0db26a0-2877-48fa-b706-b5558f9973d5-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"e0db26a0-2877-48fa-b706-b5558f9973d5\") " pod="openstack/prometheus-metric-storage-0" Mar 09 14:23:54 crc kubenswrapper[4722]: I0309 14:23:54.483274 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d15a083-af10-4638-b6bc-9de1f89f123f-serving-cert\") pod \"observability-ui-dashboards-66cbf594b5-bzqhp\" (UID: \"0d15a083-af10-4638-b6bc-9de1f89f123f\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-bzqhp" Mar 09 14:23:54 crc kubenswrapper[4722]: I0309 14:23:54.483299 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e0db26a0-2877-48fa-b706-b5558f9973d5-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"e0db26a0-2877-48fa-b706-b5558f9973d5\") " pod="openstack/prometheus-metric-storage-0" Mar 09 14:23:54 crc kubenswrapper[4722]: I0309 14:23:54.483357 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e0db26a0-2877-48fa-b706-b5558f9973d5-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"e0db26a0-2877-48fa-b706-b5558f9973d5\") " pod="openstack/prometheus-metric-storage-0" Mar 09 14:23:54 crc kubenswrapper[4722]: I0309 14:23:54.483394 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e0db26a0-2877-48fa-b706-b5558f9973d5-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"e0db26a0-2877-48fa-b706-b5558f9973d5\") " pod="openstack/prometheus-metric-storage-0" Mar 09 14:23:54 crc kubenswrapper[4722]: I0309 14:23:54.483443 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e0db26a0-2877-48fa-b706-b5558f9973d5-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"e0db26a0-2877-48fa-b706-b5558f9973d5\") " pod="openstack/prometheus-metric-storage-0" Mar 09 14:23:54 crc kubenswrapper[4722]: I0309 14:23:54.483481 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d60fc4df-9ddd-42e1-8e0d-2624907f046b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d60fc4df-9ddd-42e1-8e0d-2624907f046b\") pod \"prometheus-metric-storage-0\" (UID: \"e0db26a0-2877-48fa-b706-b5558f9973d5\") " pod="openstack/prometheus-metric-storage-0" Mar 09 14:23:54 crc kubenswrapper[4722]: I0309 14:23:54.483517 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/e0db26a0-2877-48fa-b706-b5558f9973d5-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"e0db26a0-2877-48fa-b706-b5558f9973d5\") " pod="openstack/prometheus-metric-storage-0" Mar 09 14:23:54 crc kubenswrapper[4722]: I0309 14:23:54.516174 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d15a083-af10-4638-b6bc-9de1f89f123f-serving-cert\") pod \"observability-ui-dashboards-66cbf594b5-bzqhp\" (UID: \"0d15a083-af10-4638-b6bc-9de1f89f123f\") " pod="openshift-operators/observability-ui-dashboards-66cbf594b5-bzqhp" Mar 09 14:23:54 crc kubenswrapper[4722]: I0309 14:23:54.584840 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/e0db26a0-2877-48fa-b706-b5558f9973d5-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"e0db26a0-2877-48fa-b706-b5558f9973d5\") " pod="openstack/prometheus-metric-storage-0" Mar 09 14:23:54 crc kubenswrapper[4722]: I0309 14:23:54.584896 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7sslz\" (UniqueName: \"kubernetes.io/projected/e0db26a0-2877-48fa-b706-b5558f9973d5-kube-api-access-7sslz\") pod \"prometheus-metric-storage-0\" (UID: \"e0db26a0-2877-48fa-b706-b5558f9973d5\") " pod="openstack/prometheus-metric-storage-0" Mar 09 14:23:54 crc kubenswrapper[4722]: I0309 14:23:54.584939 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e0db26a0-2877-48fa-b706-b5558f9973d5-config\") pod \"prometheus-metric-storage-0\" (UID: \"e0db26a0-2877-48fa-b706-b5558f9973d5\") " pod="openstack/prometheus-metric-storage-0" Mar 09 14:23:54 crc kubenswrapper[4722]: I0309 14:23:54.584966 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e0db26a0-2877-48fa-b706-b5558f9973d5-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"e0db26a0-2877-48fa-b706-b5558f9973d5\") " pod="openstack/prometheus-metric-storage-0" Mar 09 14:23:54 crc kubenswrapper[4722]: I0309 14:23:54.585042 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e0db26a0-2877-48fa-b706-b5558f9973d5-thanos-prometheus-http-client-file\") pod 
\"prometheus-metric-storage-0\" (UID: \"e0db26a0-2877-48fa-b706-b5558f9973d5\") " pod="openstack/prometheus-metric-storage-0" Mar 09 14:23:54 crc kubenswrapper[4722]: I0309 14:23:54.585096 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e0db26a0-2877-48fa-b706-b5558f9973d5-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"e0db26a0-2877-48fa-b706-b5558f9973d5\") " pod="openstack/prometheus-metric-storage-0" Mar 09 14:23:54 crc kubenswrapper[4722]: I0309 14:23:54.585132 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e0db26a0-2877-48fa-b706-b5558f9973d5-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"e0db26a0-2877-48fa-b706-b5558f9973d5\") " pod="openstack/prometheus-metric-storage-0" Mar 09 14:23:54 crc kubenswrapper[4722]: I0309 14:23:54.585180 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e0db26a0-2877-48fa-b706-b5558f9973d5-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"e0db26a0-2877-48fa-b706-b5558f9973d5\") " pod="openstack/prometheus-metric-storage-0" Mar 09 14:23:54 crc kubenswrapper[4722]: I0309 14:23:54.585221 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d60fc4df-9ddd-42e1-8e0d-2624907f046b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d60fc4df-9ddd-42e1-8e0d-2624907f046b\") pod \"prometheus-metric-storage-0\" (UID: \"e0db26a0-2877-48fa-b706-b5558f9973d5\") " pod="openstack/prometheus-metric-storage-0" Mar 09 14:23:54 crc kubenswrapper[4722]: I0309 14:23:54.585270 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/e0db26a0-2877-48fa-b706-b5558f9973d5-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"e0db26a0-2877-48fa-b706-b5558f9973d5\") " pod="openstack/prometheus-metric-storage-0" Mar 09 14:23:54 crc kubenswrapper[4722]: I0309 14:23:54.585704 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/e0db26a0-2877-48fa-b706-b5558f9973d5-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"e0db26a0-2877-48fa-b706-b5558f9973d5\") " pod="openstack/prometheus-metric-storage-0" Mar 09 14:23:54 crc kubenswrapper[4722]: I0309 14:23:54.585960 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/e0db26a0-2877-48fa-b706-b5558f9973d5-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"e0db26a0-2877-48fa-b706-b5558f9973d5\") " pod="openstack/prometheus-metric-storage-0" Mar 09 14:23:54 crc kubenswrapper[4722]: I0309 14:23:54.586674 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e0db26a0-2877-48fa-b706-b5558f9973d5-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"e0db26a0-2877-48fa-b706-b5558f9973d5\") " pod="openstack/prometheus-metric-storage-0" Mar 09 14:23:54 crc kubenswrapper[4722]: I0309 14:23:54.590936 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e0db26a0-2877-48fa-b706-b5558f9973d5-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"e0db26a0-2877-48fa-b706-b5558f9973d5\") " pod="openstack/prometheus-metric-storage-0" Mar 09 14:23:54 crc kubenswrapper[4722]: I0309 14:23:54.591628 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e0db26a0-2877-48fa-b706-b5558f9973d5-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"e0db26a0-2877-48fa-b706-b5558f9973d5\") " pod="openstack/prometheus-metric-storage-0" Mar 09 14:23:54 crc kubenswrapper[4722]: I0309 14:23:54.591784 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e0db26a0-2877-48fa-b706-b5558f9973d5-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"e0db26a0-2877-48fa-b706-b5558f9973d5\") " pod="openstack/prometheus-metric-storage-0" Mar 09 14:23:54 crc kubenswrapper[4722]: I0309 14:23:54.593338 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e0db26a0-2877-48fa-b706-b5558f9973d5-config\") pod \"prometheus-metric-storage-0\" (UID: \"e0db26a0-2877-48fa-b706-b5558f9973d5\") " pod="openstack/prometheus-metric-storage-0" Mar 09 14:23:54 crc kubenswrapper[4722]: I0309 14:23:54.593586 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e0db26a0-2877-48fa-b706-b5558f9973d5-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"e0db26a0-2877-48fa-b706-b5558f9973d5\") " pod="openstack/prometheus-metric-storage-0" Mar 09 14:23:54 crc kubenswrapper[4722]: I0309 14:23:54.599235 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 09 14:23:54 crc kubenswrapper[4722]: I0309 14:23:54.599358 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d60fc4df-9ddd-42e1-8e0d-2624907f046b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d60fc4df-9ddd-42e1-8e0d-2624907f046b\") pod \"prometheus-metric-storage-0\" (UID: \"e0db26a0-2877-48fa-b706-b5558f9973d5\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/5450886054d333d0379a47b47c9d8ace333ac0051caf38aa48f7152c27378c49/globalmount\"" pod="openstack/prometheus-metric-storage-0" Mar 09 14:23:54 crc kubenswrapper[4722]: I0309 14:23:54.605989 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7sslz\" (UniqueName: \"kubernetes.io/projected/e0db26a0-2877-48fa-b706-b5558f9973d5-kube-api-access-7sslz\") pod \"prometheus-metric-storage-0\" (UID: \"e0db26a0-2877-48fa-b706-b5558f9973d5\") " pod="openstack/prometheus-metric-storage-0" Mar 09 14:23:54 crc kubenswrapper[4722]: I0309 14:23:54.666832 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d60fc4df-9ddd-42e1-8e0d-2624907f046b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d60fc4df-9ddd-42e1-8e0d-2624907f046b\") pod \"prometheus-metric-storage-0\" (UID: \"e0db26a0-2877-48fa-b706-b5558f9973d5\") " pod="openstack/prometheus-metric-storage-0" Mar 09 14:23:54 crc kubenswrapper[4722]: I0309 14:23:54.726856 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-ui-dashboards-66cbf594b5-bzqhp" Mar 09 14:23:54 crc kubenswrapper[4722]: I0309 14:23:54.792571 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 09 14:23:55 crc kubenswrapper[4722]: I0309 14:23:55.904879 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-b8gzx"] Mar 09 14:23:55 crc kubenswrapper[4722]: I0309 14:23:55.914775 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-b8gzx" Mar 09 14:23:55 crc kubenswrapper[4722]: I0309 14:23:55.917971 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-k6ng6"] Mar 09 14:23:55 crc kubenswrapper[4722]: I0309 14:23:55.921846 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-k6ng6" Mar 09 14:23:55 crc kubenswrapper[4722]: I0309 14:23:55.930214 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 09 14:23:55 crc kubenswrapper[4722]: I0309 14:23:55.930499 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 09 14:23:55 crc kubenswrapper[4722]: I0309 14:23:55.930726 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-2pj2z" Mar 09 14:23:55 crc kubenswrapper[4722]: I0309 14:23:55.940046 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-b8gzx"] Mar 09 14:23:55 crc kubenswrapper[4722]: I0309 14:23:55.954127 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-k6ng6"] Mar 09 14:23:56 crc kubenswrapper[4722]: I0309 14:23:56.012749 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/32bc4279-b6a2-4846-801c-ddf3a01db8b2-var-run\") pod \"ovn-controller-b8gzx\" (UID: \"32bc4279-b6a2-4846-801c-ddf3a01db8b2\") " pod="openstack/ovn-controller-b8gzx" Mar 09 14:23:56 crc kubenswrapper[4722]: I0309 14:23:56.013546 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/d6228268-4d1f-464b-b733-a2f308211670-var-log\") pod \"ovn-controller-ovs-k6ng6\" (UID: \"d6228268-4d1f-464b-b733-a2f308211670\") " pod="openstack/ovn-controller-ovs-k6ng6" Mar 09 14:23:56 crc kubenswrapper[4722]: I0309 14:23:56.013619 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d6228268-4d1f-464b-b733-a2f308211670-scripts\") pod \"ovn-controller-ovs-k6ng6\" (UID: \"d6228268-4d1f-464b-b733-a2f308211670\") " pod="openstack/ovn-controller-ovs-k6ng6" Mar 09 14:23:56 crc kubenswrapper[4722]: I0309 14:23:56.013690 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/d6228268-4d1f-464b-b733-a2f308211670-var-lib\") pod \"ovn-controller-ovs-k6ng6\" (UID: \"d6228268-4d1f-464b-b733-a2f308211670\") " pod="openstack/ovn-controller-ovs-k6ng6" Mar 09 14:23:56 crc kubenswrapper[4722]: I0309 14:23:56.013779 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z26fb\" (UniqueName: 
\"kubernetes.io/projected/d6228268-4d1f-464b-b733-a2f308211670-kube-api-access-z26fb\") pod \"ovn-controller-ovs-k6ng6\" (UID: \"d6228268-4d1f-464b-b733-a2f308211670\") " pod="openstack/ovn-controller-ovs-k6ng6" Mar 09 14:23:56 crc kubenswrapper[4722]: I0309 14:23:56.013843 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d6228268-4d1f-464b-b733-a2f308211670-var-run\") pod \"ovn-controller-ovs-k6ng6\" (UID: \"d6228268-4d1f-464b-b733-a2f308211670\") " pod="openstack/ovn-controller-ovs-k6ng6" Mar 09 14:23:56 crc kubenswrapper[4722]: I0309 14:23:56.013931 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/d6228268-4d1f-464b-b733-a2f308211670-etc-ovs\") pod \"ovn-controller-ovs-k6ng6\" (UID: \"d6228268-4d1f-464b-b733-a2f308211670\") " pod="openstack/ovn-controller-ovs-k6ng6" Mar 09 14:23:56 crc kubenswrapper[4722]: I0309 14:23:56.014000 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjsph\" (UniqueName: \"kubernetes.io/projected/32bc4279-b6a2-4846-801c-ddf3a01db8b2-kube-api-access-jjsph\") pod \"ovn-controller-b8gzx\" (UID: \"32bc4279-b6a2-4846-801c-ddf3a01db8b2\") " pod="openstack/ovn-controller-b8gzx" Mar 09 14:23:56 crc kubenswrapper[4722]: I0309 14:23:56.014075 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/32bc4279-b6a2-4846-801c-ddf3a01db8b2-var-log-ovn\") pod \"ovn-controller-b8gzx\" (UID: \"32bc4279-b6a2-4846-801c-ddf3a01db8b2\") " pod="openstack/ovn-controller-b8gzx" Mar 09 14:23:56 crc kubenswrapper[4722]: I0309 14:23:56.014136 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/32bc4279-b6a2-4846-801c-ddf3a01db8b2-var-run-ovn\") pod \"ovn-controller-b8gzx\" (UID: \"32bc4279-b6a2-4846-801c-ddf3a01db8b2\") " pod="openstack/ovn-controller-b8gzx" Mar 09 14:23:56 crc kubenswrapper[4722]: I0309 14:23:56.014209 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/32bc4279-b6a2-4846-801c-ddf3a01db8b2-ovn-controller-tls-certs\") pod \"ovn-controller-b8gzx\" (UID: \"32bc4279-b6a2-4846-801c-ddf3a01db8b2\") " pod="openstack/ovn-controller-b8gzx" Mar 09 14:23:56 crc kubenswrapper[4722]: I0309 14:23:56.014361 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/32bc4279-b6a2-4846-801c-ddf3a01db8b2-scripts\") pod \"ovn-controller-b8gzx\" (UID: \"32bc4279-b6a2-4846-801c-ddf3a01db8b2\") " pod="openstack/ovn-controller-b8gzx" Mar 09 14:23:56 crc kubenswrapper[4722]: I0309 14:23:56.014432 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32bc4279-b6a2-4846-801c-ddf3a01db8b2-combined-ca-bundle\") pod \"ovn-controller-b8gzx\" (UID: \"32bc4279-b6a2-4846-801c-ddf3a01db8b2\") " pod="openstack/ovn-controller-b8gzx" Mar 09 14:23:56 crc kubenswrapper[4722]: I0309 14:23:56.116115 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z26fb\" (UniqueName: 
\"kubernetes.io/projected/d6228268-4d1f-464b-b733-a2f308211670-kube-api-access-z26fb\") pod \"ovn-controller-ovs-k6ng6\" (UID: \"d6228268-4d1f-464b-b733-a2f308211670\") " pod="openstack/ovn-controller-ovs-k6ng6" Mar 09 14:23:56 crc kubenswrapper[4722]: I0309 14:23:56.116180 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d6228268-4d1f-464b-b733-a2f308211670-var-run\") pod \"ovn-controller-ovs-k6ng6\" (UID: \"d6228268-4d1f-464b-b733-a2f308211670\") " pod="openstack/ovn-controller-ovs-k6ng6" Mar 09 14:23:56 crc kubenswrapper[4722]: I0309 14:23:56.116270 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/d6228268-4d1f-464b-b733-a2f308211670-etc-ovs\") pod \"ovn-controller-ovs-k6ng6\" (UID: \"d6228268-4d1f-464b-b733-a2f308211670\") " pod="openstack/ovn-controller-ovs-k6ng6" Mar 09 14:23:56 crc kubenswrapper[4722]: I0309 14:23:56.116308 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjsph\" (UniqueName: \"kubernetes.io/projected/32bc4279-b6a2-4846-801c-ddf3a01db8b2-kube-api-access-jjsph\") pod \"ovn-controller-b8gzx\" (UID: \"32bc4279-b6a2-4846-801c-ddf3a01db8b2\") " pod="openstack/ovn-controller-b8gzx" Mar 09 14:23:56 crc kubenswrapper[4722]: I0309 14:23:56.116351 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/32bc4279-b6a2-4846-801c-ddf3a01db8b2-var-log-ovn\") pod \"ovn-controller-b8gzx\" (UID: \"32bc4279-b6a2-4846-801c-ddf3a01db8b2\") " pod="openstack/ovn-controller-b8gzx" Mar 09 14:23:56 crc kubenswrapper[4722]: I0309 14:23:56.116379 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/32bc4279-b6a2-4846-801c-ddf3a01db8b2-var-run-ovn\") pod \"ovn-controller-b8gzx\" (UID: \"32bc4279-b6a2-4846-801c-ddf3a01db8b2\") " pod="openstack/ovn-controller-b8gzx" Mar 09 14:23:56 crc kubenswrapper[4722]: I0309 14:23:56.116413 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/32bc4279-b6a2-4846-801c-ddf3a01db8b2-ovn-controller-tls-certs\") pod \"ovn-controller-b8gzx\" (UID: \"32bc4279-b6a2-4846-801c-ddf3a01db8b2\") " pod="openstack/ovn-controller-b8gzx" Mar 09 14:23:56 crc kubenswrapper[4722]: I0309 14:23:56.116434 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/32bc4279-b6a2-4846-801c-ddf3a01db8b2-scripts\") pod \"ovn-controller-b8gzx\" (UID: \"32bc4279-b6a2-4846-801c-ddf3a01db8b2\") " pod="openstack/ovn-controller-b8gzx" Mar 09 14:23:56 crc kubenswrapper[4722]: I0309 14:23:56.116465 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32bc4279-b6a2-4846-801c-ddf3a01db8b2-combined-ca-bundle\") pod \"ovn-controller-b8gzx\" (UID: \"32bc4279-b6a2-4846-801c-ddf3a01db8b2\") " pod="openstack/ovn-controller-b8gzx" Mar 09 14:23:56 crc kubenswrapper[4722]: I0309 14:23:56.116511 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/d6228268-4d1f-464b-b733-a2f308211670-var-log\") pod \"ovn-controller-ovs-k6ng6\" (UID: \"d6228268-4d1f-464b-b733-a2f308211670\") " 
pod="openstack/ovn-controller-ovs-k6ng6" Mar 09 14:23:56 crc kubenswrapper[4722]: I0309 14:23:56.116532 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/32bc4279-b6a2-4846-801c-ddf3a01db8b2-var-run\") pod \"ovn-controller-b8gzx\" (UID: \"32bc4279-b6a2-4846-801c-ddf3a01db8b2\") " pod="openstack/ovn-controller-b8gzx" Mar 09 14:23:56 crc kubenswrapper[4722]: I0309 14:23:56.116560 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d6228268-4d1f-464b-b733-a2f308211670-scripts\") pod \"ovn-controller-ovs-k6ng6\" (UID: \"d6228268-4d1f-464b-b733-a2f308211670\") " pod="openstack/ovn-controller-ovs-k6ng6" Mar 09 14:23:56 crc kubenswrapper[4722]: I0309 14:23:56.116590 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/d6228268-4d1f-464b-b733-a2f308211670-var-lib\") pod \"ovn-controller-ovs-k6ng6\" (UID: \"d6228268-4d1f-464b-b733-a2f308211670\") " pod="openstack/ovn-controller-ovs-k6ng6" Mar 09 14:23:56 crc kubenswrapper[4722]: I0309 14:23:56.116752 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/d6228268-4d1f-464b-b733-a2f308211670-etc-ovs\") pod \"ovn-controller-ovs-k6ng6\" (UID: \"d6228268-4d1f-464b-b733-a2f308211670\") " pod="openstack/ovn-controller-ovs-k6ng6" Mar 09 14:23:56 crc kubenswrapper[4722]: I0309 14:23:56.116874 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d6228268-4d1f-464b-b733-a2f308211670-var-run\") pod \"ovn-controller-ovs-k6ng6\" (UID: \"d6228268-4d1f-464b-b733-a2f308211670\") " pod="openstack/ovn-controller-ovs-k6ng6" Mar 09 14:23:56 crc kubenswrapper[4722]: I0309 14:23:56.116937 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/d6228268-4d1f-464b-b733-a2f308211670-var-log\") pod \"ovn-controller-ovs-k6ng6\" (UID: \"d6228268-4d1f-464b-b733-a2f308211670\") " pod="openstack/ovn-controller-ovs-k6ng6" Mar 09 14:23:56 crc kubenswrapper[4722]: I0309 14:23:56.116943 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/32bc4279-b6a2-4846-801c-ddf3a01db8b2-var-run\") pod \"ovn-controller-b8gzx\" (UID: \"32bc4279-b6a2-4846-801c-ddf3a01db8b2\") " pod="openstack/ovn-controller-b8gzx" Mar 09 14:23:56 crc kubenswrapper[4722]: I0309 14:23:56.117004 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/d6228268-4d1f-464b-b733-a2f308211670-var-lib\") pod \"ovn-controller-ovs-k6ng6\" (UID: \"d6228268-4d1f-464b-b733-a2f308211670\") " pod="openstack/ovn-controller-ovs-k6ng6" Mar 09 14:23:56 crc kubenswrapper[4722]: I0309 14:23:56.117153 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/32bc4279-b6a2-4846-801c-ddf3a01db8b2-var-log-ovn\") pod \"ovn-controller-b8gzx\" (UID: \"32bc4279-b6a2-4846-801c-ddf3a01db8b2\") " pod="openstack/ovn-controller-b8gzx" Mar 09 14:23:56 crc kubenswrapper[4722]: I0309 14:23:56.117421 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/32bc4279-b6a2-4846-801c-ddf3a01db8b2-var-run-ovn\") pod \"ovn-controller-b8gzx\" 
(UID: \"32bc4279-b6a2-4846-801c-ddf3a01db8b2\") " pod="openstack/ovn-controller-b8gzx" Mar 09 14:23:56 crc kubenswrapper[4722]: I0309 14:23:56.118705 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/32bc4279-b6a2-4846-801c-ddf3a01db8b2-scripts\") pod \"ovn-controller-b8gzx\" (UID: \"32bc4279-b6a2-4846-801c-ddf3a01db8b2\") " pod="openstack/ovn-controller-b8gzx" Mar 09 14:23:56 crc kubenswrapper[4722]: I0309 14:23:56.119504 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d6228268-4d1f-464b-b733-a2f308211670-scripts\") pod \"ovn-controller-ovs-k6ng6\" (UID: \"d6228268-4d1f-464b-b733-a2f308211670\") " pod="openstack/ovn-controller-ovs-k6ng6" Mar 09 14:23:56 crc kubenswrapper[4722]: I0309 14:23:56.122699 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/32bc4279-b6a2-4846-801c-ddf3a01db8b2-ovn-controller-tls-certs\") pod \"ovn-controller-b8gzx\" (UID: \"32bc4279-b6a2-4846-801c-ddf3a01db8b2\") " pod="openstack/ovn-controller-b8gzx" Mar 09 14:23:56 crc kubenswrapper[4722]: I0309 14:23:56.130859 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32bc4279-b6a2-4846-801c-ddf3a01db8b2-combined-ca-bundle\") pod \"ovn-controller-b8gzx\" (UID: \"32bc4279-b6a2-4846-801c-ddf3a01db8b2\") " pod="openstack/ovn-controller-b8gzx" Mar 09 14:23:56 crc kubenswrapper[4722]: I0309 14:23:56.138373 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjsph\" (UniqueName: \"kubernetes.io/projected/32bc4279-b6a2-4846-801c-ddf3a01db8b2-kube-api-access-jjsph\") pod \"ovn-controller-b8gzx\" (UID: \"32bc4279-b6a2-4846-801c-ddf3a01db8b2\") " pod="openstack/ovn-controller-b8gzx" Mar 09 14:23:56 crc kubenswrapper[4722]: I0309 14:23:56.139249 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z26fb\" (UniqueName: \"kubernetes.io/projected/d6228268-4d1f-464b-b733-a2f308211670-kube-api-access-z26fb\") pod \"ovn-controller-ovs-k6ng6\" (UID: \"d6228268-4d1f-464b-b733-a2f308211670\") " pod="openstack/ovn-controller-ovs-k6ng6" Mar 09 14:23:56 crc kubenswrapper[4722]: I0309 14:23:56.247156 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-b8gzx" Mar 09 14:23:56 crc kubenswrapper[4722]: I0309 14:23:56.265348 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-k6ng6" Mar 09 14:23:56 crc kubenswrapper[4722]: I0309 14:23:56.442205 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 09 14:23:56 crc kubenswrapper[4722]: I0309 14:23:56.443999 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 09 14:23:56 crc kubenswrapper[4722]: I0309 14:23:56.446653 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-r49m5" Mar 09 14:23:56 crc kubenswrapper[4722]: I0309 14:23:56.446898 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 09 14:23:56 crc kubenswrapper[4722]: I0309 14:23:56.452786 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 09 14:23:56 crc kubenswrapper[4722]: I0309 14:23:56.453091 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 09 14:23:56 crc kubenswrapper[4722]: I0309 14:23:56.453335 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 09 14:23:56 crc kubenswrapper[4722]: I0309 14:23:56.457216 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 09 14:23:56 crc kubenswrapper[4722]: I0309 14:23:56.525850 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f2d1a87-0e77-4753-b87a-39b2b5f333a4-config\") pod \"ovsdbserver-nb-0\" (UID: \"3f2d1a87-0e77-4753-b87a-39b2b5f333a4\") " pod="openstack/ovsdbserver-nb-0" Mar 09 14:23:56 crc kubenswrapper[4722]: I0309 14:23:56.525921 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3f2d1a87-0e77-4753-b87a-39b2b5f333a4-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"3f2d1a87-0e77-4753-b87a-39b2b5f333a4\") " pod="openstack/ovsdbserver-nb-0" Mar 09 14:23:56 crc kubenswrapper[4722]: I0309 14:23:56.525959 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f2d1a87-0e77-4753-b87a-39b2b5f333a4-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"3f2d1a87-0e77-4753-b87a-39b2b5f333a4\") " pod="openstack/ovsdbserver-nb-0" Mar 09 14:23:56 crc kubenswrapper[4722]: I0309 14:23:56.526054 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f2d1a87-0e77-4753-b87a-39b2b5f333a4-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3f2d1a87-0e77-4753-b87a-39b2b5f333a4\") " pod="openstack/ovsdbserver-nb-0" Mar 09 14:23:56 crc kubenswrapper[4722]: I0309 14:23:56.526087 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3f2d1a87-0e77-4753-b87a-39b2b5f333a4-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"3f2d1a87-0e77-4753-b87a-39b2b5f333a4\") " pod="openstack/ovsdbserver-nb-0" Mar 09 14:23:56 crc kubenswrapper[4722]: I0309 14:23:56.526104 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f2d1a87-0e77-4753-b87a-39b2b5f333a4-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3f2d1a87-0e77-4753-b87a-39b2b5f333a4\") " pod="openstack/ovsdbserver-nb-0" Mar 09 14:23:56 crc kubenswrapper[4722]: I0309 14:23:56.526127 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"pvc-fc269846-8778-4505-bc67-e10deff2e80d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fc269846-8778-4505-bc67-e10deff2e80d\") pod \"ovsdbserver-nb-0\" (UID: \"3f2d1a87-0e77-4753-b87a-39b2b5f333a4\") " pod="openstack/ovsdbserver-nb-0" Mar 09 14:23:56 crc kubenswrapper[4722]: I0309 14:23:56.526361 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvqcz\" (UniqueName: \"kubernetes.io/projected/3f2d1a87-0e77-4753-b87a-39b2b5f333a4-kube-api-access-mvqcz\") pod \"ovsdbserver-nb-0\" (UID: \"3f2d1a87-0e77-4753-b87a-39b2b5f333a4\") " pod="openstack/ovsdbserver-nb-0" Mar 09 14:23:56 crc kubenswrapper[4722]: I0309 14:23:56.628582 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f2d1a87-0e77-4753-b87a-39b2b5f333a4-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3f2d1a87-0e77-4753-b87a-39b2b5f333a4\") " pod="openstack/ovsdbserver-nb-0" Mar 09 14:23:56 crc kubenswrapper[4722]: I0309 14:23:56.628643 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3f2d1a87-0e77-4753-b87a-39b2b5f333a4-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"3f2d1a87-0e77-4753-b87a-39b2b5f333a4\") " pod="openstack/ovsdbserver-nb-0" Mar 09 14:23:56 crc kubenswrapper[4722]: I0309 14:23:56.628669 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f2d1a87-0e77-4753-b87a-39b2b5f333a4-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3f2d1a87-0e77-4753-b87a-39b2b5f333a4\") " pod="openstack/ovsdbserver-nb-0" Mar 09 14:23:56 crc kubenswrapper[4722]: I0309 14:23:56.628705 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-fc269846-8778-4505-bc67-e10deff2e80d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fc269846-8778-4505-bc67-e10deff2e80d\") pod \"ovsdbserver-nb-0\" (UID: \"3f2d1a87-0e77-4753-b87a-39b2b5f333a4\") " pod="openstack/ovsdbserver-nb-0" Mar 09 14:23:56 crc kubenswrapper[4722]: I0309 14:23:56.629110 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/3f2d1a87-0e77-4753-b87a-39b2b5f333a4-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"3f2d1a87-0e77-4753-b87a-39b2b5f333a4\") " pod="openstack/ovsdbserver-nb-0" Mar 09 14:23:56 crc kubenswrapper[4722]: I0309 14:23:56.630657 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvqcz\" (UniqueName: \"kubernetes.io/projected/3f2d1a87-0e77-4753-b87a-39b2b5f333a4-kube-api-access-mvqcz\") pod \"ovsdbserver-nb-0\" (UID: \"3f2d1a87-0e77-4753-b87a-39b2b5f333a4\") " pod="openstack/ovsdbserver-nb-0" Mar 09 14:23:56 crc kubenswrapper[4722]: I0309 14:23:56.630812 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f2d1a87-0e77-4753-b87a-39b2b5f333a4-config\") pod \"ovsdbserver-nb-0\" (UID: \"3f2d1a87-0e77-4753-b87a-39b2b5f333a4\") " pod="openstack/ovsdbserver-nb-0" Mar 09 14:23:56 crc kubenswrapper[4722]: I0309 14:23:56.631854 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3f2d1a87-0e77-4753-b87a-39b2b5f333a4-config\") pod \"ovsdbserver-nb-0\" (UID: 
\"3f2d1a87-0e77-4753-b87a-39b2b5f333a4\") " pod="openstack/ovsdbserver-nb-0" Mar 09 14:23:56 crc kubenswrapper[4722]: I0309 14:23:56.632004 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3f2d1a87-0e77-4753-b87a-39b2b5f333a4-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"3f2d1a87-0e77-4753-b87a-39b2b5f333a4\") " pod="openstack/ovsdbserver-nb-0" Mar 09 14:23:56 crc kubenswrapper[4722]: I0309 14:23:56.632062 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f2d1a87-0e77-4753-b87a-39b2b5f333a4-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"3f2d1a87-0e77-4753-b87a-39b2b5f333a4\") " pod="openstack/ovsdbserver-nb-0" Mar 09 14:23:56 crc kubenswrapper[4722]: I0309 14:23:56.632206 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f2d1a87-0e77-4753-b87a-39b2b5f333a4-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3f2d1a87-0e77-4753-b87a-39b2b5f333a4\") " pod="openstack/ovsdbserver-nb-0" Mar 09 14:23:56 crc kubenswrapper[4722]: I0309 14:23:56.632754 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3f2d1a87-0e77-4753-b87a-39b2b5f333a4-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"3f2d1a87-0e77-4753-b87a-39b2b5f333a4\") " pod="openstack/ovsdbserver-nb-0" Mar 09 14:23:56 crc kubenswrapper[4722]: I0309 14:23:56.636673 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f2d1a87-0e77-4753-b87a-39b2b5f333a4-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"3f2d1a87-0e77-4753-b87a-39b2b5f333a4\") " pod="openstack/ovsdbserver-nb-0" Mar 09 14:23:56 crc kubenswrapper[4722]: I0309 14:23:56.637616 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3f2d1a87-0e77-4753-b87a-39b2b5f333a4-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"3f2d1a87-0e77-4753-b87a-39b2b5f333a4\") " pod="openstack/ovsdbserver-nb-0" Mar 09 14:23:56 crc kubenswrapper[4722]: I0309 14:23:56.649786 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvqcz\" (UniqueName: \"kubernetes.io/projected/3f2d1a87-0e77-4753-b87a-39b2b5f333a4-kube-api-access-mvqcz\") pod \"ovsdbserver-nb-0\" (UID: \"3f2d1a87-0e77-4753-b87a-39b2b5f333a4\") " pod="openstack/ovsdbserver-nb-0" Mar 09 14:23:56 crc kubenswrapper[4722]: I0309 14:23:56.651311 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 09 14:23:56 crc kubenswrapper[4722]: I0309 14:23:56.651410 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-fc269846-8778-4505-bc67-e10deff2e80d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fc269846-8778-4505-bc67-e10deff2e80d\") pod \"ovsdbserver-nb-0\" (UID: \"3f2d1a87-0e77-4753-b87a-39b2b5f333a4\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/e5272732707f7b93692022cd555b97bb5d9a8728c04c44caf3f98bdac9aa43a0/globalmount\"" pod="openstack/ovsdbserver-nb-0" Mar 09 14:23:56 crc kubenswrapper[4722]: I0309 14:23:56.702866 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-fc269846-8778-4505-bc67-e10deff2e80d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-fc269846-8778-4505-bc67-e10deff2e80d\") pod \"ovsdbserver-nb-0\" (UID: \"3f2d1a87-0e77-4753-b87a-39b2b5f333a4\") " pod="openstack/ovsdbserver-nb-0" Mar 09 14:23:56 crc kubenswrapper[4722]: W0309 14:23:56.765315 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda79aaedb_92ba_42fc_8cc7_9ecb007d2ac7.slice/crio-ea204fd8c32ad3ccd826ef6053ae9cfb6e3b356ac18f6dd403054cf04f3fba43 WatchSource:0}: Error finding container ea204fd8c32ad3ccd826ef6053ae9cfb6e3b356ac18f6dd403054cf04f3fba43: Status 404 returned error can't find the container with id ea204fd8c32ad3ccd826ef6053ae9cfb6e3b356ac18f6dd403054cf04f3fba43 Mar 09 14:23:56 crc kubenswrapper[4722]: W0309 14:23:56.767549 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f4e007a_4a18_40e6_bf96_4a751e00cd73.slice/crio-8826d7e3b5c4577b329e1668f6236ec0bcbdcc1ff604699b20e9e79150ba01d7 WatchSource:0}: Error finding container 8826d7e3b5c4577b329e1668f6236ec0bcbdcc1ff604699b20e9e79150ba01d7: Status 404 returned error can't find the container with id 8826d7e3b5c4577b329e1668f6236ec0bcbdcc1ff604699b20e9e79150ba01d7 Mar 09 14:23:56 crc kubenswrapper[4722]: I0309 14:23:56.774574 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 09 14:23:56 crc kubenswrapper[4722]: I0309 14:23:56.973653 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6f4e007a-4a18-40e6-bf96-4a751e00cd73","Type":"ContainerStarted","Data":"8826d7e3b5c4577b329e1668f6236ec0bcbdcc1ff604699b20e9e79150ba01d7"} Mar 09 14:23:57 crc kubenswrapper[4722]: I0309 14:23:57.004375 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1c98e541-4b72-465d-8799-89e8c9791c3e","Type":"ContainerStarted","Data":"07c39314f6fb722f15d56d57d5a45a08afff090e92aa0fd3ad5d75a4f3acf3b8"} Mar 09 14:23:57 crc kubenswrapper[4722]: I0309 14:23:57.014564 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"c6ada086-becc-4f4a-a0a0-0aad894dc550","Type":"ContainerStarted","Data":"7daf28ee40a43853387b92aa2acd493b5b17000dad0500d62280373a0f538481"} Mar 09 14:23:57 crc kubenswrapper[4722]: I0309 14:23:57.023656 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"a79aaedb-92ba-42fc-8cc7-9ecb007d2ac7","Type":"ContainerStarted","Data":"ea204fd8c32ad3ccd826ef6053ae9cfb6e3b356ac18f6dd403054cf04f3fba43"} Mar 09 14:23:57 crc kubenswrapper[4722]: I0309 14:23:57.024631 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"17ce7999-f86f-45fa-ae07-785f70d797a1","Type":"ContainerStarted","Data":"5bf93b12a7d905614f0f42183e3fc57fef84de58c9b870b252fb21aa59eeed03"} Mar 09 14:23:59 crc kubenswrapper[4722]: I0309 14:23:59.702476 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 09 14:23:59 crc kubenswrapper[4722]: I0309 14:23:59.704899 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 09 14:23:59 crc kubenswrapper[4722]: I0309 14:23:59.708400 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 09 14:23:59 crc kubenswrapper[4722]: I0309 14:23:59.708742 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-568x9" Mar 09 14:23:59 crc kubenswrapper[4722]: I0309 14:23:59.713543 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 09 14:23:59 crc kubenswrapper[4722]: I0309 14:23:59.713900 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 09 14:23:59 crc kubenswrapper[4722]: I0309 14:23:59.724502 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 09 14:23:59 crc kubenswrapper[4722]: I0309 14:23:59.813472 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-16e96a21-cfef-430f-9fe9-631fdc692059\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-16e96a21-cfef-430f-9fe9-631fdc692059\") pod \"ovsdbserver-sb-0\" (UID: \"9aed43b2-88da-4388-b0ce-77699c8f978c\") " pod="openstack/ovsdbserver-sb-0" Mar 09 14:23:59 crc kubenswrapper[4722]: I0309 14:23:59.813551 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9aed43b2-88da-4388-b0ce-77699c8f978c-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"9aed43b2-88da-4388-b0ce-77699c8f978c\") " pod="openstack/ovsdbserver-sb-0" Mar 09 14:23:59 crc kubenswrapper[4722]: I0309 14:23:59.813617 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9aed43b2-88da-4388-b0ce-77699c8f978c-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"9aed43b2-88da-4388-b0ce-77699c8f978c\") " pod="openstack/ovsdbserver-sb-0" Mar 09 14:23:59 crc kubenswrapper[4722]: I0309 14:23:59.813652 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpxrp\" (UniqueName: \"kubernetes.io/projected/9aed43b2-88da-4388-b0ce-77699c8f978c-kube-api-access-qpxrp\") pod \"ovsdbserver-sb-0\" (UID: \"9aed43b2-88da-4388-b0ce-77699c8f978c\") " pod="openstack/ovsdbserver-sb-0" Mar 09 14:23:59 crc kubenswrapper[4722]: I0309 14:23:59.813752 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9aed43b2-88da-4388-b0ce-77699c8f978c-config\") pod \"ovsdbserver-sb-0\" (UID: \"9aed43b2-88da-4388-b0ce-77699c8f978c\") " pod="openstack/ovsdbserver-sb-0" Mar 09 14:23:59 crc kubenswrapper[4722]: I0309 14:23:59.813787 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9aed43b2-88da-4388-b0ce-77699c8f978c-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"9aed43b2-88da-4388-b0ce-77699c8f978c\") " pod="openstack/ovsdbserver-sb-0" Mar 09 14:23:59 crc kubenswrapper[4722]: I0309 14:23:59.813827 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9aed43b2-88da-4388-b0ce-77699c8f978c-metrics-certs-tls-certs\") pod 
\"ovsdbserver-sb-0\" (UID: \"9aed43b2-88da-4388-b0ce-77699c8f978c\") " pod="openstack/ovsdbserver-sb-0" Mar 09 14:23:59 crc kubenswrapper[4722]: I0309 14:23:59.813865 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9aed43b2-88da-4388-b0ce-77699c8f978c-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"9aed43b2-88da-4388-b0ce-77699c8f978c\") " pod="openstack/ovsdbserver-sb-0" Mar 09 14:23:59 crc kubenswrapper[4722]: I0309 14:23:59.915046 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9aed43b2-88da-4388-b0ce-77699c8f978c-config\") pod \"ovsdbserver-sb-0\" (UID: \"9aed43b2-88da-4388-b0ce-77699c8f978c\") " pod="openstack/ovsdbserver-sb-0" Mar 09 14:23:59 crc kubenswrapper[4722]: I0309 14:23:59.915097 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9aed43b2-88da-4388-b0ce-77699c8f978c-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"9aed43b2-88da-4388-b0ce-77699c8f978c\") " pod="openstack/ovsdbserver-sb-0" Mar 09 14:23:59 crc kubenswrapper[4722]: I0309 14:23:59.915133 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9aed43b2-88da-4388-b0ce-77699c8f978c-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"9aed43b2-88da-4388-b0ce-77699c8f978c\") " pod="openstack/ovsdbserver-sb-0" Mar 09 14:23:59 crc kubenswrapper[4722]: I0309 14:23:59.915159 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9aed43b2-88da-4388-b0ce-77699c8f978c-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"9aed43b2-88da-4388-b0ce-77699c8f978c\") " pod="openstack/ovsdbserver-sb-0" Mar 09 14:23:59 crc kubenswrapper[4722]: I0309 14:23:59.915195 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-16e96a21-cfef-430f-9fe9-631fdc692059\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-16e96a21-cfef-430f-9fe9-631fdc692059\") pod \"ovsdbserver-sb-0\" (UID: \"9aed43b2-88da-4388-b0ce-77699c8f978c\") " pod="openstack/ovsdbserver-sb-0" Mar 09 14:23:59 crc kubenswrapper[4722]: I0309 14:23:59.915285 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9aed43b2-88da-4388-b0ce-77699c8f978c-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"9aed43b2-88da-4388-b0ce-77699c8f978c\") " pod="openstack/ovsdbserver-sb-0" Mar 09 14:23:59 crc kubenswrapper[4722]: I0309 14:23:59.915326 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9aed43b2-88da-4388-b0ce-77699c8f978c-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"9aed43b2-88da-4388-b0ce-77699c8f978c\") " pod="openstack/ovsdbserver-sb-0" Mar 09 14:23:59 crc kubenswrapper[4722]: I0309 14:23:59.915349 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpxrp\" (UniqueName: \"kubernetes.io/projected/9aed43b2-88da-4388-b0ce-77699c8f978c-kube-api-access-qpxrp\") pod \"ovsdbserver-sb-0\" (UID: \"9aed43b2-88da-4388-b0ce-77699c8f978c\") " pod="openstack/ovsdbserver-sb-0" Mar 09 14:23:59 crc kubenswrapper[4722]: I0309 14:23:59.915991 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9aed43b2-88da-4388-b0ce-77699c8f978c-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"9aed43b2-88da-4388-b0ce-77699c8f978c\") " pod="openstack/ovsdbserver-sb-0" Mar 09 14:23:59 crc kubenswrapper[4722]: I0309 14:23:59.916598 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9aed43b2-88da-4388-b0ce-77699c8f978c-config\") pod \"ovsdbserver-sb-0\" (UID: \"9aed43b2-88da-4388-b0ce-77699c8f978c\") " pod="openstack/ovsdbserver-sb-0" Mar 09 14:23:59 crc kubenswrapper[4722]: I0309 14:23:59.916954 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9aed43b2-88da-4388-b0ce-77699c8f978c-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"9aed43b2-88da-4388-b0ce-77699c8f978c\") " pod="openstack/ovsdbserver-sb-0" Mar 09 14:23:59 crc kubenswrapper[4722]: I0309 14:23:59.920553 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9aed43b2-88da-4388-b0ce-77699c8f978c-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"9aed43b2-88da-4388-b0ce-77699c8f978c\") " pod="openstack/ovsdbserver-sb-0" Mar 09 14:23:59 crc kubenswrapper[4722]: I0309 14:23:59.920892 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9aed43b2-88da-4388-b0ce-77699c8f978c-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"9aed43b2-88da-4388-b0ce-77699c8f978c\") " pod="openstack/ovsdbserver-sb-0" Mar 09 14:23:59 crc kubenswrapper[4722]: I0309 14:23:59.921136 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
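
The csi_attacher entry just above shows the kubelet skipping the optional CSI device-staging step: the kubevirt.io.hostpath-provisioner driver does not advertise the STAGE_UNSTAGE_VOLUME node capability, so the entry that follows records MountVolume.MountDevice as succeeded without any NodeStageVolume call, and mounting proceeds directly to the per-pod SetUp (NodePublishVolume). Below is a minimal, self-contained Go sketch of that decision under stated assumptions — the Driver type and MountDevice signature here are hypothetical simplifications, not kubelet's API; the real check lives in kubelet's csi_attacher.go and queries the driver's NodeGetCapabilities over gRPC:

    // Illustrative sketch only: models the branch behind the log line
    // "attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice..."
    // Types and names are hypothetical stand-ins for kubelet internals.
    package main

    import "fmt"

    // NodeCapability mirrors a CSI node-service capability a driver may advertise.
    type NodeCapability string

    const StageUnstageVolume NodeCapability = "STAGE_UNSTAGE_VOLUME"

    // Driver is a hypothetical stand-in for a CSI node plugin.
    type Driver struct {
        Name         string
        Capabilities map[NodeCapability]bool
    }

    // MountDevice models the staging step: only drivers advertising
    // STAGE_UNSTAGE_VOLUME get a NodeStageVolume call at the global mount
    // path; otherwise kubelet treats MountDevice as an immediate success and
    // goes straight to the per-pod SetUp seen in the surrounding log entries.
    func MountDevice(d Driver, volumeID, globalMountPath string) error {
        if !d.Capabilities[StageUnstageVolume] {
            fmt.Println("attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...")
            return nil // recorded as "MountVolume.MountDevice succeeded", as in the log
        }
        fmt.Printf("NodeStageVolume(%s) at %s\n", volumeID, globalMountPath)
        return nil
    }

    func main() {
        // The hostpath provisioner in this log advertises no staging capability.
        hostpath := Driver{
            Name:         "kubevirt.io.hostpath-provisioner",
            Capabilities: map[NodeCapability]bool{},
        }
        _ = MountDevice(hostpath,
            "pvc-16e96a21-cfef-430f-9fe9-631fdc692059",
            "/var/lib/kubelet/plugins/kubernetes.io/csi/.../globalmount")
    }

Run against the capability-less driver, the sketch takes the skip branch, which is why the log still reports a device mount path for the PVC even though no staging ever happened.
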
Mar 09 14:23:59 crc kubenswrapper[4722]: I0309 14:23:59.921166 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-16e96a21-cfef-430f-9fe9-631fdc692059\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-16e96a21-cfef-430f-9fe9-631fdc692059\") pod \"ovsdbserver-sb-0\" (UID: \"9aed43b2-88da-4388-b0ce-77699c8f978c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/80a274d2c286ff36e3626a6cf550cf0654a2fffcb7051a3f60ca57ca4e3c1afb/globalmount\"" pod="openstack/ovsdbserver-sb-0" Mar 09 14:23:59 crc kubenswrapper[4722]: I0309 14:23:59.921562 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9aed43b2-88da-4388-b0ce-77699c8f978c-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"9aed43b2-88da-4388-b0ce-77699c8f978c\") " pod="openstack/ovsdbserver-sb-0" Mar 09 14:23:59 crc kubenswrapper[4722]: I0309 14:23:59.956539 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-16e96a21-cfef-430f-9fe9-631fdc692059\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-16e96a21-cfef-430f-9fe9-631fdc692059\") pod \"ovsdbserver-sb-0\" (UID: \"9aed43b2-88da-4388-b0ce-77699c8f978c\") " pod="openstack/ovsdbserver-sb-0" Mar 09 14:23:59 crc kubenswrapper[4722]: I0309 14:23:59.959332 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpxrp\" (UniqueName: \"kubernetes.io/projected/9aed43b2-88da-4388-b0ce-77699c8f978c-kube-api-access-qpxrp\") pod \"ovsdbserver-sb-0\" (UID: \"9aed43b2-88da-4388-b0ce-77699c8f978c\") " pod="openstack/ovsdbserver-sb-0" Mar 09 14:24:00 crc kubenswrapper[4722]: I0309 14:24:00.041631 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 09 14:24:00 crc kubenswrapper[4722]: I0309 14:24:00.133299 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551104-vdth8"] Mar 09 14:24:00 crc kubenswrapper[4722]: I0309 14:24:00.134830 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551104-vdth8" Mar 09 14:24:00 crc kubenswrapper[4722]: I0309 14:24:00.137631 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 14:24:00 crc kubenswrapper[4722]: I0309 14:24:00.137631 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-dr2x6" Mar 09 14:24:00 crc kubenswrapper[4722]: I0309 14:24:00.137674 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 14:24:00 crc kubenswrapper[4722]: I0309 14:24:00.141865 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551104-vdth8"] Mar 09 14:24:00 crc kubenswrapper[4722]: I0309 14:24:00.226530 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zs8z4\" (UniqueName: \"kubernetes.io/projected/b2ae19a8-dc3a-4368-94d0-a8be2f8ed011-kube-api-access-zs8z4\") pod \"auto-csr-approver-29551104-vdth8\" (UID: \"b2ae19a8-dc3a-4368-94d0-a8be2f8ed011\") " pod="openshift-infra/auto-csr-approver-29551104-vdth8" Mar 09 14:24:00 crc kubenswrapper[4722]: I0309 14:24:00.329070 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zs8z4\" (UniqueName: \"kubernetes.io/projected/b2ae19a8-dc3a-4368-94d0-a8be2f8ed011-kube-api-access-zs8z4\") pod \"auto-csr-approver-29551104-vdth8\" (UID: \"b2ae19a8-dc3a-4368-94d0-a8be2f8ed011\") " pod="openshift-infra/auto-csr-approver-29551104-vdth8" Mar 09 14:24:00 crc kubenswrapper[4722]: I0309 14:24:00.346240 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zs8z4\" (UniqueName: \"kubernetes.io/projected/b2ae19a8-dc3a-4368-94d0-a8be2f8ed011-kube-api-access-zs8z4\") pod \"auto-csr-approver-29551104-vdth8\" (UID: \"b2ae19a8-dc3a-4368-94d0-a8be2f8ed011\") " pod="openshift-infra/auto-csr-approver-29551104-vdth8" Mar 09 14:24:00 crc kubenswrapper[4722]: I0309 14:24:00.464868 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551104-vdth8" Mar 09 14:24:10 crc kubenswrapper[4722]: I0309 14:24:10.714022 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 09 14:24:11 crc kubenswrapper[4722]: E0309 14:24:11.112890 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 09 14:24:11 crc kubenswrapper[4722]: E0309 14:24:11.113132 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nh2n8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-mdqwg_openstack(fdbc665b-813d-4bce-ab2e-e0a2408bd149): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 14:24:11 crc kubenswrapper[4722]: E0309 14:24:11.114445 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-mdqwg" podUID="fdbc665b-813d-4bce-ab2e-e0a2408bd149" Mar 09 14:24:11 crc kubenswrapper[4722]: E0309 14:24:11.146114 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 09 14:24:11 crc kubenswrapper[4722]: E0309 14:24:11.146318 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh5dfhb6h64h676hc4h78h97h669h54chfbh696hb5h54bh5d4h6bh64h644h677h584h5cbh698h9dh5bbh5f8h5b8hcdh644h5c7h694hbfh589q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hn8j6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5ccc8479f9-lpmrz_openstack(dd7183ce-685d-4fe4-9bc9-5d58c95bc9e0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 14:24:11 crc kubenswrapper[4722]: E0309 14:24:11.147549 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-5ccc8479f9-lpmrz" podUID="dd7183ce-685d-4fe4-9bc9-5d58c95bc9e0" Mar 09 14:24:11 crc kubenswrapper[4722]: E0309 14:24:11.176365 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-5ccc8479f9-lpmrz" podUID="dd7183ce-685d-4fe4-9bc9-5d58c95bc9e0" Mar 09 14:24:12 crc kubenswrapper[4722]: W0309 14:24:12.915189 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a6188a6_3f71_48b5_9013_66d297c205a7.slice/crio-e2d43fe45a07a4e241a2abf41283c428eb0155376ef31276dad821f0d390344b WatchSource:0}: Error finding container e2d43fe45a07a4e241a2abf41283c428eb0155376ef31276dad821f0d390344b: Status 404 returned error can't find the container with id 
e2d43fe45a07a4e241a2abf41283c428eb0155376ef31276dad821f0d390344b Mar 09 14:24:12 crc kubenswrapper[4722]: E0309 14:24:12.966377 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 09 14:24:12 crc kubenswrapper[4722]: E0309 14:24:12.966814 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-f4l5z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-vrhjb_openstack(f77eae01-5673-4dda-95b8-c9b540e75f92): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 14:24:12 crc kubenswrapper[4722]: E0309 14:24:12.967405 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 09 14:24:12 crc kubenswrapper[4722]: E0309 14:24:12.967630 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dg249,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-lb2r7_openstack(6dca0862-8223-4ba0-9fc8-eb8c861baa60): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 14:24:12 crc kubenswrapper[4722]: E0309 14:24:12.967954 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-vrhjb" podUID="f77eae01-5673-4dda-95b8-c9b540e75f92" Mar 09 14:24:12 crc kubenswrapper[4722]: E0309 14:24:12.969129 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-lb2r7" podUID="6dca0862-8223-4ba0-9fc8-eb8c861baa60" Mar 09 14:24:13 crc kubenswrapper[4722]: I0309 14:24:13.103392 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-mdqwg" Mar 09 14:24:13 crc kubenswrapper[4722]: I0309 14:24:13.196483 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdbc665b-813d-4bce-ab2e-e0a2408bd149-config\") pod \"fdbc665b-813d-4bce-ab2e-e0a2408bd149\" (UID: \"fdbc665b-813d-4bce-ab2e-e0a2408bd149\") " Mar 09 14:24:13 crc kubenswrapper[4722]: I0309 14:24:13.197242 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nh2n8\" (UniqueName: \"kubernetes.io/projected/fdbc665b-813d-4bce-ab2e-e0a2408bd149-kube-api-access-nh2n8\") pod \"fdbc665b-813d-4bce-ab2e-e0a2408bd149\" (UID: \"fdbc665b-813d-4bce-ab2e-e0a2408bd149\") " Mar 09 14:24:13 crc kubenswrapper[4722]: I0309 14:24:13.197256 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fdbc665b-813d-4bce-ab2e-e0a2408bd149-config" (OuterVolumeSpecName: "config") pod "fdbc665b-813d-4bce-ab2e-e0a2408bd149" (UID: "fdbc665b-813d-4bce-ab2e-e0a2408bd149"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:24:13 crc kubenswrapper[4722]: I0309 14:24:13.199341 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fdbc665b-813d-4bce-ab2e-e0a2408bd149-config\") on node \"crc\" DevicePath \"\"" Mar 09 14:24:13 crc kubenswrapper[4722]: I0309 14:24:13.204944 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdbc665b-813d-4bce-ab2e-e0a2408bd149-kube-api-access-nh2n8" (OuterVolumeSpecName: "kube-api-access-nh2n8") pod "fdbc665b-813d-4bce-ab2e-e0a2408bd149" (UID: "fdbc665b-813d-4bce-ab2e-e0a2408bd149"). InnerVolumeSpecName "kube-api-access-nh2n8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:24:13 crc kubenswrapper[4722]: I0309 14:24:13.231748 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"0a6188a6-3f71-48b5-9013-66d297c205a7","Type":"ContainerStarted","Data":"e2d43fe45a07a4e241a2abf41283c428eb0155376ef31276dad821f0d390344b"} Mar 09 14:24:13 crc kubenswrapper[4722]: I0309 14:24:13.235951 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-mdqwg" Mar 09 14:24:13 crc kubenswrapper[4722]: I0309 14:24:13.236498 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-mdqwg" event={"ID":"fdbc665b-813d-4bce-ab2e-e0a2408bd149","Type":"ContainerDied","Data":"917f6e0caa00bc9146de3de9e043f0b437c86ceb70d9482844ac8efb171a46a8"} Mar 09 14:24:13 crc kubenswrapper[4722]: E0309 14:24:13.242437 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-lb2r7" podUID="6dca0862-8223-4ba0-9fc8-eb8c861baa60" Mar 09 14:24:13 crc kubenswrapper[4722]: I0309 14:24:13.300988 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nh2n8\" (UniqueName: \"kubernetes.io/projected/fdbc665b-813d-4bce-ab2e-e0a2408bd149-kube-api-access-nh2n8\") on node \"crc\" DevicePath \"\"" Mar 09 14:24:13 crc kubenswrapper[4722]: I0309 14:24:13.481049 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-mdqwg"] Mar 09 14:24:13 crc kubenswrapper[4722]: I0309 14:24:13.503366 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-mdqwg"] Mar 09 14:24:14 crc kubenswrapper[4722]: I0309 14:24:14.120905 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-k6ng6"] Mar 09 14:24:14 crc kubenswrapper[4722]: I0309 14:24:14.168121 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fdbc665b-813d-4bce-ab2e-e0a2408bd149" path="/var/lib/kubelet/pods/fdbc665b-813d-4bce-ab2e-e0a2408bd149/volumes" Mar 09 14:24:14 crc kubenswrapper[4722]: I0309 14:24:14.249060 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-k6ng6" event={"ID":"d6228268-4d1f-464b-b733-a2f308211670","Type":"ContainerStarted","Data":"e416399a9e4ea97693ae73a8541a1f79b95e225f175ecfa7b71dc923b955f77f"} Mar 09 14:24:14 crc kubenswrapper[4722]: I0309 14:24:14.252765 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"a79aaedb-92ba-42fc-8cc7-9ecb007d2ac7","Type":"ContainerStarted","Data":"a3e29f47d72b8b31f0af322d62ef838e7c3f001dfa6ce7a8f8c623b6f9008e31"} Mar 09 14:24:14 crc kubenswrapper[4722]: I0309 14:24:14.380561 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-ui-dashboards-66cbf594b5-bzqhp"] Mar 09 14:24:14 crc kubenswrapper[4722]: I0309 14:24:14.409738 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6dc4c5dd4b-rk4jt"] Mar 09 14:24:14 crc kubenswrapper[4722]: I0309 14:24:14.422114 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 09 14:24:14 crc kubenswrapper[4722]: I0309 14:24:14.491358 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 09 14:24:14 crc kubenswrapper[4722]: I0309 14:24:14.632321 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551104-vdth8"] Mar 09 14:24:14 crc kubenswrapper[4722]: I0309 14:24:14.645469 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 09 14:24:14 crc kubenswrapper[4722]: I0309 14:24:14.655416 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/ovn-controller-b8gzx"] Mar 09 14:24:14 crc kubenswrapper[4722]: I0309 14:24:14.665566 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 09 14:24:15 crc kubenswrapper[4722]: W0309 14:24:15.107779 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3f2d1a87_0e77_4753_b87a_39b2b5f333a4.slice/crio-583cded0353bd48174efb7c6f5d20dcf98f49013518ca5923cec125f76ea81ef WatchSource:0}: Error finding container 583cded0353bd48174efb7c6f5d20dcf98f49013518ca5923cec125f76ea81ef: Status 404 returned error can't find the container with id 583cded0353bd48174efb7c6f5d20dcf98f49013518ca5923cec125f76ea81ef Mar 09 14:24:15 crc kubenswrapper[4722]: W0309 14:24:15.112173 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4159e308_3ccf_45d9_a97b_8133542007a8.slice/crio-7a8a0110044fef0257bad8b54005f8d553c23bd95aadfb35bedb18fc84bc5372 WatchSource:0}: Error finding container 7a8a0110044fef0257bad8b54005f8d553c23bd95aadfb35bedb18fc84bc5372: Status 404 returned error can't find the container with id 7a8a0110044fef0257bad8b54005f8d553c23bd95aadfb35bedb18fc84bc5372 Mar 09 14:24:15 crc kubenswrapper[4722]: W0309 14:24:15.118948 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d15a083_af10_4638_b6bc_9de1f89f123f.slice/crio-e5dc868ce2b53e705f7d5dff5110e8547e002521807a131a8d099d9f657dc62f WatchSource:0}: Error finding container e5dc868ce2b53e705f7d5dff5110e8547e002521807a131a8d099d9f657dc62f: Status 404 returned error can't find the container with id e5dc868ce2b53e705f7d5dff5110e8547e002521807a131a8d099d9f657dc62f Mar 09 14:24:15 crc kubenswrapper[4722]: W0309 14:24:15.122226 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32bc4279_b6a2_4846_801c_ddf3a01db8b2.slice/crio-ecf703e9566aa18a2b97f8d3b8f3dfc17b3b440a126263415e5949757e46a77c WatchSource:0}: Error finding container ecf703e9566aa18a2b97f8d3b8f3dfc17b3b440a126263415e5949757e46a77c: Status 404 returned error can't find the container with id ecf703e9566aa18a2b97f8d3b8f3dfc17b3b440a126263415e5949757e46a77c Mar 09 14:24:15 crc kubenswrapper[4722]: W0309 14:24:15.130398 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb2ae19a8_dc3a_4368_94d0_a8be2f8ed011.slice/crio-8332983898ba97ae251795cebc2139f10795b85cae4b24af4a5d75e074ad1c38 WatchSource:0}: Error finding container 8332983898ba97ae251795cebc2139f10795b85cae4b24af4a5d75e074ad1c38: Status 404 returned error can't find the container with id 8332983898ba97ae251795cebc2139f10795b85cae4b24af4a5d75e074ad1c38 Mar 09 14:24:15 crc kubenswrapper[4722]: W0309 14:24:15.141065 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0db26a0_2877_48fa_b706_b5558f9973d5.slice/crio-1cd523e3b573fe09cf95a9251bec022f3e7c1a846e2ea18bb9119d72c3848821 WatchSource:0}: Error finding container 1cd523e3b573fe09cf95a9251bec022f3e7c1a846e2ea18bb9119d72c3848821: Status 404 returned error can't find the container with id 1cd523e3b573fe09cf95a9251bec022f3e7c1a846e2ea18bb9119d72c3848821 Mar 09 14:24:15 crc kubenswrapper[4722]: I0309 14:24:15.276716 4722 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"c6ada086-becc-4f4a-a0a0-0aad894dc550","Type":"ContainerStarted","Data":"c14c75a63d784852902e25e93e5a8cf7646ecf4eccaa9b3cddd0d364975c6f55"} Mar 09 14:24:15 crc kubenswrapper[4722]: I0309 14:24:15.278815 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4d8e49ae-5932-472c-a714-7872980c5a9b","Type":"ContainerStarted","Data":"2dcc0b6e482a4bf6c4d23c45dbd2fab04460373e1df31b072003bbb064cfafc1"} Mar 09 14:24:15 crc kubenswrapper[4722]: I0309 14:24:15.280368 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"3f2d1a87-0e77-4753-b87a-39b2b5f333a4","Type":"ContainerStarted","Data":"583cded0353bd48174efb7c6f5d20dcf98f49013518ca5923cec125f76ea81ef"} Mar 09 14:24:15 crc kubenswrapper[4722]: I0309 14:24:15.283889 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e0db26a0-2877-48fa-b706-b5558f9973d5","Type":"ContainerStarted","Data":"1cd523e3b573fe09cf95a9251bec022f3e7c1a846e2ea18bb9119d72c3848821"} Mar 09 14:24:15 crc kubenswrapper[4722]: I0309 14:24:15.285978 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"17ce7999-f86f-45fa-ae07-785f70d797a1","Type":"ContainerStarted","Data":"2ed1c447dbb8dfe73b7c01fa28b0e8e47079d52fb2e0e72560df64042052f747"} Mar 09 14:24:15 crc kubenswrapper[4722]: I0309 14:24:15.291853 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-vrhjb" event={"ID":"f77eae01-5673-4dda-95b8-c9b540e75f92","Type":"ContainerDied","Data":"4c41214d3e3f1b12b5d9e2b2526820e9ce9f36114c31ea49159af633ecac7c23"} Mar 09 14:24:15 crc kubenswrapper[4722]: I0309 14:24:15.291904 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c41214d3e3f1b12b5d9e2b2526820e9ce9f36114c31ea49159af633ecac7c23" Mar 09 14:24:15 crc kubenswrapper[4722]: I0309 14:24:15.294177 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6f4e007a-4a18-40e6-bf96-4a751e00cd73","Type":"ContainerStarted","Data":"89991879f6e59e858e98954d53f4101c5c7935bc1ad02ef1f93145110f421678"} Mar 09 14:24:15 crc kubenswrapper[4722]: I0309 14:24:15.296514 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1c98e541-4b72-465d-8799-89e8c9791c3e","Type":"ContainerStarted","Data":"83b4abfa07a9f0cdc86b0978dab11bb5c16ae3ddce1e3930e50e2705f0aa51fa"} Mar 09 14:24:15 crc kubenswrapper[4722]: I0309 14:24:15.303122 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"4159e308-3ccf-45d9-a97b-8133542007a8","Type":"ContainerStarted","Data":"7a8a0110044fef0257bad8b54005f8d553c23bd95aadfb35bedb18fc84bc5372"} Mar 09 14:24:15 crc kubenswrapper[4722]: I0309 14:24:15.310886 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-b8gzx" event={"ID":"32bc4279-b6a2-4846-801c-ddf3a01db8b2","Type":"ContainerStarted","Data":"ecf703e9566aa18a2b97f8d3b8f3dfc17b3b440a126263415e5949757e46a77c"} Mar 09 14:24:15 crc kubenswrapper[4722]: I0309 14:24:15.313624 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-ui-dashboards-66cbf594b5-bzqhp" event={"ID":"0d15a083-af10-4638-b6bc-9de1f89f123f","Type":"ContainerStarted","Data":"e5dc868ce2b53e705f7d5dff5110e8547e002521807a131a8d099d9f657dc62f"} Mar 09 14:24:15 crc 
kubenswrapper[4722]: I0309 14:24:15.315984 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6dc4c5dd4b-rk4jt" event={"ID":"6dc5b476-5a42-4c98-9a95-0e3b29f2f771","Type":"ContainerStarted","Data":"69d32ae3951fc47f513ffed93525a84c59790c556935c4506ee34ae19fef6cd9"} Mar 09 14:24:15 crc kubenswrapper[4722]: I0309 14:24:15.317726 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551104-vdth8" event={"ID":"b2ae19a8-dc3a-4368-94d0-a8be2f8ed011","Type":"ContainerStarted","Data":"8332983898ba97ae251795cebc2139f10795b85cae4b24af4a5d75e074ad1c38"} Mar 09 14:24:15 crc kubenswrapper[4722]: I0309 14:24:15.460684 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 09 14:24:15 crc kubenswrapper[4722]: W0309 14:24:15.469068 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9aed43b2_88da_4388_b0ce_77699c8f978c.slice/crio-4f3e2ab65e8f58745b25be4021c892dd9458427f183e2547128c21f78ec10c73 WatchSource:0}: Error finding container 4f3e2ab65e8f58745b25be4021c892dd9458427f183e2547128c21f78ec10c73: Status 404 returned error can't find the container with id 4f3e2ab65e8f58745b25be4021c892dd9458427f183e2547128c21f78ec10c73 Mar 09 14:24:15 crc kubenswrapper[4722]: I0309 14:24:15.470529 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-vrhjb" Mar 09 14:24:15 crc kubenswrapper[4722]: I0309 14:24:15.562768 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f77eae01-5673-4dda-95b8-c9b540e75f92-dns-svc\") pod \"f77eae01-5673-4dda-95b8-c9b540e75f92\" (UID: \"f77eae01-5673-4dda-95b8-c9b540e75f92\") " Mar 09 14:24:15 crc kubenswrapper[4722]: I0309 14:24:15.562871 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4l5z\" (UniqueName: \"kubernetes.io/projected/f77eae01-5673-4dda-95b8-c9b540e75f92-kube-api-access-f4l5z\") pod \"f77eae01-5673-4dda-95b8-c9b540e75f92\" (UID: \"f77eae01-5673-4dda-95b8-c9b540e75f92\") " Mar 09 14:24:15 crc kubenswrapper[4722]: I0309 14:24:15.562940 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f77eae01-5673-4dda-95b8-c9b540e75f92-config\") pod \"f77eae01-5673-4dda-95b8-c9b540e75f92\" (UID: \"f77eae01-5673-4dda-95b8-c9b540e75f92\") " Mar 09 14:24:15 crc kubenswrapper[4722]: I0309 14:24:15.563684 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f77eae01-5673-4dda-95b8-c9b540e75f92-config" (OuterVolumeSpecName: "config") pod "f77eae01-5673-4dda-95b8-c9b540e75f92" (UID: "f77eae01-5673-4dda-95b8-c9b540e75f92"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:24:15 crc kubenswrapper[4722]: I0309 14:24:15.563923 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f77eae01-5673-4dda-95b8-c9b540e75f92-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f77eae01-5673-4dda-95b8-c9b540e75f92" (UID: "f77eae01-5673-4dda-95b8-c9b540e75f92"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:24:15 crc kubenswrapper[4722]: I0309 14:24:15.568656 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f77eae01-5673-4dda-95b8-c9b540e75f92-kube-api-access-f4l5z" (OuterVolumeSpecName: "kube-api-access-f4l5z") pod "f77eae01-5673-4dda-95b8-c9b540e75f92" (UID: "f77eae01-5673-4dda-95b8-c9b540e75f92"). InnerVolumeSpecName "kube-api-access-f4l5z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:24:15 crc kubenswrapper[4722]: I0309 14:24:15.666494 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f77eae01-5673-4dda-95b8-c9b540e75f92-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 14:24:15 crc kubenswrapper[4722]: I0309 14:24:15.666912 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f4l5z\" (UniqueName: \"kubernetes.io/projected/f77eae01-5673-4dda-95b8-c9b540e75f92-kube-api-access-f4l5z\") on node \"crc\" DevicePath \"\"" Mar 09 14:24:15 crc kubenswrapper[4722]: I0309 14:24:15.666930 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f77eae01-5673-4dda-95b8-c9b540e75f92-config\") on node \"crc\" DevicePath \"\"" Mar 09 14:24:16 crc kubenswrapper[4722]: I0309 14:24:16.329704 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6dc4c5dd4b-rk4jt" event={"ID":"6dc5b476-5a42-4c98-9a95-0e3b29f2f771","Type":"ContainerStarted","Data":"b8f5b1038e62f1fa81cc3b716e91ed6508a1746f147df7b94d2cb47945f17185"} Mar 09 14:24:16 crc kubenswrapper[4722]: I0309 14:24:16.334877 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"4159e308-3ccf-45d9-a97b-8133542007a8","Type":"ContainerStarted","Data":"04d0845974c3397f942aeb76173d964a9704d054052bbb9f216deff5a900d2bc"} Mar 09 14:24:16 crc kubenswrapper[4722]: I0309 14:24:16.338778 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"0a6188a6-3f71-48b5-9013-66d297c205a7","Type":"ContainerStarted","Data":"f5f2d02a7434432d0fac2fb248a202e25f11546680f477faf9fba34cf3ea5bca"} Mar 09 14:24:16 crc kubenswrapper[4722]: I0309 14:24:16.338900 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 09 14:24:16 crc kubenswrapper[4722]: I0309 14:24:16.340307 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"9aed43b2-88da-4388-b0ce-77699c8f978c","Type":"ContainerStarted","Data":"4f3e2ab65e8f58745b25be4021c892dd9458427f183e2547128c21f78ec10c73"} Mar 09 14:24:16 crc kubenswrapper[4722]: I0309 14:24:16.340504 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-vrhjb" Mar 09 14:24:16 crc kubenswrapper[4722]: I0309 14:24:16.360480 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6dc4c5dd4b-rk4jt" podStartSLOduration=22.360445207 podStartE2EDuration="22.360445207s" podCreationTimestamp="2026-03-09 14:23:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:24:16.350345348 +0000 UTC m=+1296.905913924" watchObservedRunningTime="2026-03-09 14:24:16.360445207 +0000 UTC m=+1296.916013783" Mar 09 14:24:16 crc kubenswrapper[4722]: I0309 14:24:16.370105 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=24.09814745 podStartE2EDuration="26.370089314s" podCreationTimestamp="2026-03-09 14:23:50 +0000 UTC" firstStartedPulling="2026-03-09 14:24:12.923318583 +0000 UTC m=+1293.478887159" lastFinishedPulling="2026-03-09 14:24:15.195260447 +0000 UTC m=+1295.750829023" observedRunningTime="2026-03-09 14:24:16.366766256 +0000 UTC m=+1296.922334862" watchObservedRunningTime="2026-03-09 14:24:16.370089314 +0000 UTC m=+1296.925657890" Mar 09 14:24:16 crc kubenswrapper[4722]: I0309 14:24:16.445890 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-vrhjb"] Mar 09 14:24:16 crc kubenswrapper[4722]: I0309 14:24:16.458614 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-vrhjb"] Mar 09 14:24:17 crc kubenswrapper[4722]: I0309 14:24:17.356305 4722 generic.go:334] "Generic (PLEG): container finished" podID="a79aaedb-92ba-42fc-8cc7-9ecb007d2ac7" containerID="a3e29f47d72b8b31f0af322d62ef838e7c3f001dfa6ce7a8f8c623b6f9008e31" exitCode=0 Mar 09 14:24:17 crc kubenswrapper[4722]: I0309 14:24:17.356396 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"a79aaedb-92ba-42fc-8cc7-9ecb007d2ac7","Type":"ContainerDied","Data":"a3e29f47d72b8b31f0af322d62ef838e7c3f001dfa6ce7a8f8c623b6f9008e31"} Mar 09 14:24:18 crc kubenswrapper[4722]: I0309 14:24:18.160709 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f77eae01-5673-4dda-95b8-c9b540e75f92" path="/var/lib/kubelet/pods/f77eae01-5673-4dda-95b8-c9b540e75f92/volumes" Mar 09 14:24:19 crc kubenswrapper[4722]: I0309 14:24:19.381731 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-mw6tx"] Mar 09 14:24:19 crc kubenswrapper[4722]: I0309 14:24:19.382743 4722 generic.go:334] "Generic (PLEG): container finished" podID="4159e308-3ccf-45d9-a97b-8133542007a8" containerID="04d0845974c3397f942aeb76173d964a9704d054052bbb9f216deff5a900d2bc" exitCode=0 Mar 09 14:24:19 crc kubenswrapper[4722]: I0309 14:24:19.386621 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"4159e308-3ccf-45d9-a97b-8133542007a8","Type":"ContainerDied","Data":"04d0845974c3397f942aeb76173d964a9704d054052bbb9f216deff5a900d2bc"} Mar 09 14:24:19 crc kubenswrapper[4722]: I0309 14:24:19.387589 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-mw6tx" Mar 09 14:24:19 crc kubenswrapper[4722]: I0309 14:24:19.392647 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Mar 09 14:24:19 crc kubenswrapper[4722]: I0309 14:24:19.401290 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-mw6tx"] Mar 09 14:24:19 crc kubenswrapper[4722]: I0309 14:24:19.462671 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f28b2af2-0f97-4160-9641-6771f3deb9d1-combined-ca-bundle\") pod \"ovn-controller-metrics-mw6tx\" (UID: \"f28b2af2-0f97-4160-9641-6771f3deb9d1\") " pod="openstack/ovn-controller-metrics-mw6tx" Mar 09 14:24:19 crc kubenswrapper[4722]: I0309 14:24:19.462720 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/f28b2af2-0f97-4160-9641-6771f3deb9d1-ovn-rundir\") pod \"ovn-controller-metrics-mw6tx\" (UID: \"f28b2af2-0f97-4160-9641-6771f3deb9d1\") " pod="openstack/ovn-controller-metrics-mw6tx" Mar 09 14:24:19 crc kubenswrapper[4722]: I0309 14:24:19.462791 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/f28b2af2-0f97-4160-9641-6771f3deb9d1-ovs-rundir\") pod \"ovn-controller-metrics-mw6tx\" (UID: \"f28b2af2-0f97-4160-9641-6771f3deb9d1\") " pod="openstack/ovn-controller-metrics-mw6tx" Mar 09 14:24:19 crc kubenswrapper[4722]: I0309 14:24:19.462869 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f28b2af2-0f97-4160-9641-6771f3deb9d1-config\") pod \"ovn-controller-metrics-mw6tx\" (UID: \"f28b2af2-0f97-4160-9641-6771f3deb9d1\") " pod="openstack/ovn-controller-metrics-mw6tx" Mar 09 14:24:19 crc kubenswrapper[4722]: I0309 14:24:19.462968 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzlg5\" (UniqueName: \"kubernetes.io/projected/f28b2af2-0f97-4160-9641-6771f3deb9d1-kube-api-access-bzlg5\") pod \"ovn-controller-metrics-mw6tx\" (UID: \"f28b2af2-0f97-4160-9641-6771f3deb9d1\") " pod="openstack/ovn-controller-metrics-mw6tx" Mar 09 14:24:19 crc kubenswrapper[4722]: I0309 14:24:19.463028 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f28b2af2-0f97-4160-9641-6771f3deb9d1-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-mw6tx\" (UID: \"f28b2af2-0f97-4160-9641-6771f3deb9d1\") " pod="openstack/ovn-controller-metrics-mw6tx" Mar 09 14:24:19 crc kubenswrapper[4722]: I0309 14:24:19.537377 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-lb2r7"] Mar 09 14:24:19 crc kubenswrapper[4722]: I0309 14:24:19.565263 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzlg5\" (UniqueName: \"kubernetes.io/projected/f28b2af2-0f97-4160-9641-6771f3deb9d1-kube-api-access-bzlg5\") pod \"ovn-controller-metrics-mw6tx\" (UID: \"f28b2af2-0f97-4160-9641-6771f3deb9d1\") " pod="openstack/ovn-controller-metrics-mw6tx" Mar 09 14:24:19 crc kubenswrapper[4722]: I0309 14:24:19.565358 4722 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f28b2af2-0f97-4160-9641-6771f3deb9d1-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-mw6tx\" (UID: \"f28b2af2-0f97-4160-9641-6771f3deb9d1\") " pod="openstack/ovn-controller-metrics-mw6tx" Mar 09 14:24:19 crc kubenswrapper[4722]: I0309 14:24:19.565399 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f28b2af2-0f97-4160-9641-6771f3deb9d1-combined-ca-bundle\") pod \"ovn-controller-metrics-mw6tx\" (UID: \"f28b2af2-0f97-4160-9641-6771f3deb9d1\") " pod="openstack/ovn-controller-metrics-mw6tx" Mar 09 14:24:19 crc kubenswrapper[4722]: I0309 14:24:19.565440 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/f28b2af2-0f97-4160-9641-6771f3deb9d1-ovn-rundir\") pod \"ovn-controller-metrics-mw6tx\" (UID: \"f28b2af2-0f97-4160-9641-6771f3deb9d1\") " pod="openstack/ovn-controller-metrics-mw6tx" Mar 09 14:24:19 crc kubenswrapper[4722]: I0309 14:24:19.565499 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/f28b2af2-0f97-4160-9641-6771f3deb9d1-ovs-rundir\") pod \"ovn-controller-metrics-mw6tx\" (UID: \"f28b2af2-0f97-4160-9641-6771f3deb9d1\") " pod="openstack/ovn-controller-metrics-mw6tx" Mar 09 14:24:19 crc kubenswrapper[4722]: I0309 14:24:19.565570 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f28b2af2-0f97-4160-9641-6771f3deb9d1-config\") pod \"ovn-controller-metrics-mw6tx\" (UID: \"f28b2af2-0f97-4160-9641-6771f3deb9d1\") " pod="openstack/ovn-controller-metrics-mw6tx" Mar 09 14:24:19 crc kubenswrapper[4722]: I0309 14:24:19.566314 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/f28b2af2-0f97-4160-9641-6771f3deb9d1-ovn-rundir\") pod \"ovn-controller-metrics-mw6tx\" (UID: \"f28b2af2-0f97-4160-9641-6771f3deb9d1\") " pod="openstack/ovn-controller-metrics-mw6tx" Mar 09 14:24:19 crc kubenswrapper[4722]: I0309 14:24:19.566338 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/f28b2af2-0f97-4160-9641-6771f3deb9d1-ovs-rundir\") pod \"ovn-controller-metrics-mw6tx\" (UID: \"f28b2af2-0f97-4160-9641-6771f3deb9d1\") " pod="openstack/ovn-controller-metrics-mw6tx" Mar 09 14:24:19 crc kubenswrapper[4722]: I0309 14:24:19.566504 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f28b2af2-0f97-4160-9641-6771f3deb9d1-config\") pod \"ovn-controller-metrics-mw6tx\" (UID: \"f28b2af2-0f97-4160-9641-6771f3deb9d1\") " pod="openstack/ovn-controller-metrics-mw6tx" Mar 09 14:24:19 crc kubenswrapper[4722]: I0309 14:24:19.572714 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f28b2af2-0f97-4160-9641-6771f3deb9d1-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-mw6tx\" (UID: \"f28b2af2-0f97-4160-9641-6771f3deb9d1\") " pod="openstack/ovn-controller-metrics-mw6tx" Mar 09 14:24:19 crc kubenswrapper[4722]: I0309 14:24:19.575329 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f28b2af2-0f97-4160-9641-6771f3deb9d1-combined-ca-bundle\") pod \"ovn-controller-metrics-mw6tx\" (UID: \"f28b2af2-0f97-4160-9641-6771f3deb9d1\") " pod="openstack/ovn-controller-metrics-mw6tx" Mar 09 14:24:19 crc kubenswrapper[4722]: I0309 14:24:19.598519 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-vnvsd"] Mar 09 14:24:19 crc kubenswrapper[4722]: I0309 14:24:19.599281 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzlg5\" (UniqueName: \"kubernetes.io/projected/f28b2af2-0f97-4160-9641-6771f3deb9d1-kube-api-access-bzlg5\") pod \"ovn-controller-metrics-mw6tx\" (UID: \"f28b2af2-0f97-4160-9641-6771f3deb9d1\") " pod="openstack/ovn-controller-metrics-mw6tx" Mar 09 14:24:19 crc kubenswrapper[4722]: I0309 14:24:19.600764 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-vnvsd" Mar 09 14:24:19 crc kubenswrapper[4722]: I0309 14:24:19.604316 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 09 14:24:19 crc kubenswrapper[4722]: I0309 14:24:19.627330 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-vnvsd"] Mar 09 14:24:19 crc kubenswrapper[4722]: I0309 14:24:19.668108 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knvf9\" (UniqueName: \"kubernetes.io/projected/c4e534de-c45a-48fd-b097-1b3276f0d083-kube-api-access-knvf9\") pod \"dnsmasq-dns-7f896c8c65-vnvsd\" (UID: \"c4e534de-c45a-48fd-b097-1b3276f0d083\") " pod="openstack/dnsmasq-dns-7f896c8c65-vnvsd" Mar 09 14:24:19 crc kubenswrapper[4722]: I0309 14:24:19.668248 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4e534de-c45a-48fd-b097-1b3276f0d083-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-vnvsd\" (UID: \"c4e534de-c45a-48fd-b097-1b3276f0d083\") " pod="openstack/dnsmasq-dns-7f896c8c65-vnvsd" Mar 09 14:24:19 crc kubenswrapper[4722]: I0309 14:24:19.668331 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c4e534de-c45a-48fd-b097-1b3276f0d083-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-vnvsd\" (UID: \"c4e534de-c45a-48fd-b097-1b3276f0d083\") " pod="openstack/dnsmasq-dns-7f896c8c65-vnvsd" Mar 09 14:24:19 crc kubenswrapper[4722]: I0309 14:24:19.668557 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4e534de-c45a-48fd-b097-1b3276f0d083-config\") pod \"dnsmasq-dns-7f896c8c65-vnvsd\" (UID: \"c4e534de-c45a-48fd-b097-1b3276f0d083\") " pod="openstack/dnsmasq-dns-7f896c8c65-vnvsd" Mar 09 14:24:19 crc kubenswrapper[4722]: I0309 14:24:19.732285 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-mw6tx" Mar 09 14:24:19 crc kubenswrapper[4722]: I0309 14:24:19.770924 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4e534de-c45a-48fd-b097-1b3276f0d083-config\") pod \"dnsmasq-dns-7f896c8c65-vnvsd\" (UID: \"c4e534de-c45a-48fd-b097-1b3276f0d083\") " pod="openstack/dnsmasq-dns-7f896c8c65-vnvsd" Mar 09 14:24:19 crc kubenswrapper[4722]: I0309 14:24:19.770988 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knvf9\" (UniqueName: \"kubernetes.io/projected/c4e534de-c45a-48fd-b097-1b3276f0d083-kube-api-access-knvf9\") pod \"dnsmasq-dns-7f896c8c65-vnvsd\" (UID: \"c4e534de-c45a-48fd-b097-1b3276f0d083\") " pod="openstack/dnsmasq-dns-7f896c8c65-vnvsd" Mar 09 14:24:19 crc kubenswrapper[4722]: I0309 14:24:19.771039 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4e534de-c45a-48fd-b097-1b3276f0d083-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-vnvsd\" (UID: \"c4e534de-c45a-48fd-b097-1b3276f0d083\") " pod="openstack/dnsmasq-dns-7f896c8c65-vnvsd" Mar 09 14:24:19 crc kubenswrapper[4722]: I0309 14:24:19.771080 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c4e534de-c45a-48fd-b097-1b3276f0d083-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-vnvsd\" (UID: \"c4e534de-c45a-48fd-b097-1b3276f0d083\") " pod="openstack/dnsmasq-dns-7f896c8c65-vnvsd" Mar 09 14:24:19 crc kubenswrapper[4722]: I0309 14:24:19.771097 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-lpmrz"] Mar 09 14:24:19 crc kubenswrapper[4722]: I0309 14:24:19.772642 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c4e534de-c45a-48fd-b097-1b3276f0d083-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-vnvsd\" (UID: \"c4e534de-c45a-48fd-b097-1b3276f0d083\") " pod="openstack/dnsmasq-dns-7f896c8c65-vnvsd" Mar 09 14:24:19 crc kubenswrapper[4722]: I0309 14:24:19.773004 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4e534de-c45a-48fd-b097-1b3276f0d083-config\") pod \"dnsmasq-dns-7f896c8c65-vnvsd\" (UID: \"c4e534de-c45a-48fd-b097-1b3276f0d083\") " pod="openstack/dnsmasq-dns-7f896c8c65-vnvsd" Mar 09 14:24:19 crc kubenswrapper[4722]: I0309 14:24:19.773294 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4e534de-c45a-48fd-b097-1b3276f0d083-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-vnvsd\" (UID: \"c4e534de-c45a-48fd-b097-1b3276f0d083\") " pod="openstack/dnsmasq-dns-7f896c8c65-vnvsd" Mar 09 14:24:19 crc kubenswrapper[4722]: I0309 14:24:19.804053 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knvf9\" (UniqueName: \"kubernetes.io/projected/c4e534de-c45a-48fd-b097-1b3276f0d083-kube-api-access-knvf9\") pod \"dnsmasq-dns-7f896c8c65-vnvsd\" (UID: \"c4e534de-c45a-48fd-b097-1b3276f0d083\") " pod="openstack/dnsmasq-dns-7f896c8c65-vnvsd" Mar 09 14:24:19 crc kubenswrapper[4722]: I0309 14:24:19.807632 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-lvds5"] Mar 09 14:24:19 crc kubenswrapper[4722]: I0309 14:24:19.809698 4722 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-lvds5" Mar 09 14:24:19 crc kubenswrapper[4722]: I0309 14:24:19.815772 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 09 14:24:19 crc kubenswrapper[4722]: I0309 14:24:19.861669 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-lvds5"] Mar 09 14:24:19 crc kubenswrapper[4722]: I0309 14:24:19.873715 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5z9x\" (UniqueName: \"kubernetes.io/projected/0ae3ba92-3a9f-4f65-8c32-2b3f4d1f95e7-kube-api-access-t5z9x\") pod \"dnsmasq-dns-86db49b7ff-lvds5\" (UID: \"0ae3ba92-3a9f-4f65-8c32-2b3f4d1f95e7\") " pod="openstack/dnsmasq-dns-86db49b7ff-lvds5" Mar 09 14:24:19 crc kubenswrapper[4722]: I0309 14:24:19.873817 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ae3ba92-3a9f-4f65-8c32-2b3f4d1f95e7-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-lvds5\" (UID: \"0ae3ba92-3a9f-4f65-8c32-2b3f4d1f95e7\") " pod="openstack/dnsmasq-dns-86db49b7ff-lvds5" Mar 09 14:24:19 crc kubenswrapper[4722]: I0309 14:24:19.873888 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ae3ba92-3a9f-4f65-8c32-2b3f4d1f95e7-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-lvds5\" (UID: \"0ae3ba92-3a9f-4f65-8c32-2b3f4d1f95e7\") " pod="openstack/dnsmasq-dns-86db49b7ff-lvds5" Mar 09 14:24:19 crc kubenswrapper[4722]: I0309 14:24:19.874089 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ae3ba92-3a9f-4f65-8c32-2b3f4d1f95e7-config\") pod \"dnsmasq-dns-86db49b7ff-lvds5\" (UID: \"0ae3ba92-3a9f-4f65-8c32-2b3f4d1f95e7\") " pod="openstack/dnsmasq-dns-86db49b7ff-lvds5" Mar 09 14:24:19 crc kubenswrapper[4722]: I0309 14:24:19.874141 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ae3ba92-3a9f-4f65-8c32-2b3f4d1f95e7-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-lvds5\" (UID: \"0ae3ba92-3a9f-4f65-8c32-2b3f4d1f95e7\") " pod="openstack/dnsmasq-dns-86db49b7ff-lvds5" Mar 09 14:24:19 crc kubenswrapper[4722]: I0309 14:24:19.979603 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ae3ba92-3a9f-4f65-8c32-2b3f4d1f95e7-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-lvds5\" (UID: \"0ae3ba92-3a9f-4f65-8c32-2b3f4d1f95e7\") " pod="openstack/dnsmasq-dns-86db49b7ff-lvds5" Mar 09 14:24:19 crc kubenswrapper[4722]: I0309 14:24:19.979748 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ae3ba92-3a9f-4f65-8c32-2b3f4d1f95e7-config\") pod \"dnsmasq-dns-86db49b7ff-lvds5\" (UID: \"0ae3ba92-3a9f-4f65-8c32-2b3f4d1f95e7\") " pod="openstack/dnsmasq-dns-86db49b7ff-lvds5" Mar 09 14:24:19 crc kubenswrapper[4722]: I0309 14:24:19.979772 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ae3ba92-3a9f-4f65-8c32-2b3f4d1f95e7-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-lvds5\" (UID: \"0ae3ba92-3a9f-4f65-8c32-2b3f4d1f95e7\") " 
pod="openstack/dnsmasq-dns-86db49b7ff-lvds5" Mar 09 14:24:19 crc kubenswrapper[4722]: I0309 14:24:19.979839 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5z9x\" (UniqueName: \"kubernetes.io/projected/0ae3ba92-3a9f-4f65-8c32-2b3f4d1f95e7-kube-api-access-t5z9x\") pod \"dnsmasq-dns-86db49b7ff-lvds5\" (UID: \"0ae3ba92-3a9f-4f65-8c32-2b3f4d1f95e7\") " pod="openstack/dnsmasq-dns-86db49b7ff-lvds5" Mar 09 14:24:19 crc kubenswrapper[4722]: I0309 14:24:19.979894 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ae3ba92-3a9f-4f65-8c32-2b3f4d1f95e7-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-lvds5\" (UID: \"0ae3ba92-3a9f-4f65-8c32-2b3f4d1f95e7\") " pod="openstack/dnsmasq-dns-86db49b7ff-lvds5" Mar 09 14:24:19 crc kubenswrapper[4722]: I0309 14:24:19.980571 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ae3ba92-3a9f-4f65-8c32-2b3f4d1f95e7-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-lvds5\" (UID: \"0ae3ba92-3a9f-4f65-8c32-2b3f4d1f95e7\") " pod="openstack/dnsmasq-dns-86db49b7ff-lvds5" Mar 09 14:24:19 crc kubenswrapper[4722]: I0309 14:24:19.980714 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ae3ba92-3a9f-4f65-8c32-2b3f4d1f95e7-config\") pod \"dnsmasq-dns-86db49b7ff-lvds5\" (UID: \"0ae3ba92-3a9f-4f65-8c32-2b3f4d1f95e7\") " pod="openstack/dnsmasq-dns-86db49b7ff-lvds5" Mar 09 14:24:19 crc kubenswrapper[4722]: I0309 14:24:19.980862 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ae3ba92-3a9f-4f65-8c32-2b3f4d1f95e7-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-lvds5\" (UID: \"0ae3ba92-3a9f-4f65-8c32-2b3f4d1f95e7\") " pod="openstack/dnsmasq-dns-86db49b7ff-lvds5" Mar 09 14:24:19 crc kubenswrapper[4722]: I0309 14:24:19.981452 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ae3ba92-3a9f-4f65-8c32-2b3f4d1f95e7-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-lvds5\" (UID: \"0ae3ba92-3a9f-4f65-8c32-2b3f4d1f95e7\") " pod="openstack/dnsmasq-dns-86db49b7ff-lvds5" Mar 09 14:24:19 crc kubenswrapper[4722]: I0309 14:24:19.994936 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-vnvsd" Mar 09 14:24:20 crc kubenswrapper[4722]: I0309 14:24:20.003892 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5z9x\" (UniqueName: \"kubernetes.io/projected/0ae3ba92-3a9f-4f65-8c32-2b3f4d1f95e7-kube-api-access-t5z9x\") pod \"dnsmasq-dns-86db49b7ff-lvds5\" (UID: \"0ae3ba92-3a9f-4f65-8c32-2b3f4d1f95e7\") " pod="openstack/dnsmasq-dns-86db49b7ff-lvds5" Mar 09 14:24:20 crc kubenswrapper[4722]: I0309 14:24:20.130107 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-lb2r7" Mar 09 14:24:20 crc kubenswrapper[4722]: I0309 14:24:20.161980 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-lvds5" Mar 09 14:24:20 crc kubenswrapper[4722]: I0309 14:24:20.182770 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dg249\" (UniqueName: \"kubernetes.io/projected/6dca0862-8223-4ba0-9fc8-eb8c861baa60-kube-api-access-dg249\") pod \"6dca0862-8223-4ba0-9fc8-eb8c861baa60\" (UID: \"6dca0862-8223-4ba0-9fc8-eb8c861baa60\") " Mar 09 14:24:20 crc kubenswrapper[4722]: I0309 14:24:20.182872 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6dca0862-8223-4ba0-9fc8-eb8c861baa60-config\") pod \"6dca0862-8223-4ba0-9fc8-eb8c861baa60\" (UID: \"6dca0862-8223-4ba0-9fc8-eb8c861baa60\") " Mar 09 14:24:20 crc kubenswrapper[4722]: I0309 14:24:20.182984 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6dca0862-8223-4ba0-9fc8-eb8c861baa60-dns-svc\") pod \"6dca0862-8223-4ba0-9fc8-eb8c861baa60\" (UID: \"6dca0862-8223-4ba0-9fc8-eb8c861baa60\") " Mar 09 14:24:20 crc kubenswrapper[4722]: I0309 14:24:20.183922 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6dca0862-8223-4ba0-9fc8-eb8c861baa60-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6dca0862-8223-4ba0-9fc8-eb8c861baa60" (UID: "6dca0862-8223-4ba0-9fc8-eb8c861baa60"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:24:20 crc kubenswrapper[4722]: I0309 14:24:20.186645 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6dca0862-8223-4ba0-9fc8-eb8c861baa60-config" (OuterVolumeSpecName: "config") pod "6dca0862-8223-4ba0-9fc8-eb8c861baa60" (UID: "6dca0862-8223-4ba0-9fc8-eb8c861baa60"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:24:20 crc kubenswrapper[4722]: I0309 14:24:20.188348 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6dca0862-8223-4ba0-9fc8-eb8c861baa60-kube-api-access-dg249" (OuterVolumeSpecName: "kube-api-access-dg249") pod "6dca0862-8223-4ba0-9fc8-eb8c861baa60" (UID: "6dca0862-8223-4ba0-9fc8-eb8c861baa60"). InnerVolumeSpecName "kube-api-access-dg249". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:24:20 crc kubenswrapper[4722]: I0309 14:24:20.285137 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dg249\" (UniqueName: \"kubernetes.io/projected/6dca0862-8223-4ba0-9fc8-eb8c861baa60-kube-api-access-dg249\") on node \"crc\" DevicePath \"\"" Mar 09 14:24:20 crc kubenswrapper[4722]: I0309 14:24:20.285178 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6dca0862-8223-4ba0-9fc8-eb8c861baa60-config\") on node \"crc\" DevicePath \"\"" Mar 09 14:24:20 crc kubenswrapper[4722]: I0309 14:24:20.285190 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6dca0862-8223-4ba0-9fc8-eb8c861baa60-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 14:24:20 crc kubenswrapper[4722]: I0309 14:24:20.399922 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-lb2r7" event={"ID":"6dca0862-8223-4ba0-9fc8-eb8c861baa60","Type":"ContainerDied","Data":"576e33664ab0ccb35a12912c364ded97a43df7fdb6084047b444222b6f2b2678"} Mar 09 14:24:20 crc kubenswrapper[4722]: I0309 14:24:20.400088 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-lb2r7" Mar 09 14:24:20 crc kubenswrapper[4722]: I0309 14:24:20.504815 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-lpmrz" Mar 09 14:24:20 crc kubenswrapper[4722]: I0309 14:24:20.557234 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-lb2r7"] Mar 09 14:24:20 crc kubenswrapper[4722]: I0309 14:24:20.570099 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-lb2r7"] Mar 09 14:24:20 crc kubenswrapper[4722]: I0309 14:24:20.589635 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd7183ce-685d-4fe4-9bc9-5d58c95bc9e0-config\") pod \"dd7183ce-685d-4fe4-9bc9-5d58c95bc9e0\" (UID: \"dd7183ce-685d-4fe4-9bc9-5d58c95bc9e0\") " Mar 09 14:24:20 crc kubenswrapper[4722]: I0309 14:24:20.589878 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd7183ce-685d-4fe4-9bc9-5d58c95bc9e0-dns-svc\") pod \"dd7183ce-685d-4fe4-9bc9-5d58c95bc9e0\" (UID: \"dd7183ce-685d-4fe4-9bc9-5d58c95bc9e0\") " Mar 09 14:24:20 crc kubenswrapper[4722]: I0309 14:24:20.589937 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hn8j6\" (UniqueName: \"kubernetes.io/projected/dd7183ce-685d-4fe4-9bc9-5d58c95bc9e0-kube-api-access-hn8j6\") pod \"dd7183ce-685d-4fe4-9bc9-5d58c95bc9e0\" (UID: \"dd7183ce-685d-4fe4-9bc9-5d58c95bc9e0\") " Mar 09 14:24:20 crc kubenswrapper[4722]: I0309 14:24:20.593996 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd7183ce-685d-4fe4-9bc9-5d58c95bc9e0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "dd7183ce-685d-4fe4-9bc9-5d58c95bc9e0" (UID: "dd7183ce-685d-4fe4-9bc9-5d58c95bc9e0"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:24:20 crc kubenswrapper[4722]: I0309 14:24:20.594357 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd7183ce-685d-4fe4-9bc9-5d58c95bc9e0-kube-api-access-hn8j6" (OuterVolumeSpecName: "kube-api-access-hn8j6") pod "dd7183ce-685d-4fe4-9bc9-5d58c95bc9e0" (UID: "dd7183ce-685d-4fe4-9bc9-5d58c95bc9e0"). InnerVolumeSpecName "kube-api-access-hn8j6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:24:20 crc kubenswrapper[4722]: I0309 14:24:20.600196 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd7183ce-685d-4fe4-9bc9-5d58c95bc9e0-config" (OuterVolumeSpecName: "config") pod "dd7183ce-685d-4fe4-9bc9-5d58c95bc9e0" (UID: "dd7183ce-685d-4fe4-9bc9-5d58c95bc9e0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:24:20 crc kubenswrapper[4722]: I0309 14:24:20.692508 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dd7183ce-685d-4fe4-9bc9-5d58c95bc9e0-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 14:24:20 crc kubenswrapper[4722]: I0309 14:24:20.692542 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hn8j6\" (UniqueName: \"kubernetes.io/projected/dd7183ce-685d-4fe4-9bc9-5d58c95bc9e0-kube-api-access-hn8j6\") on node \"crc\" DevicePath \"\"" Mar 09 14:24:20 crc kubenswrapper[4722]: I0309 14:24:20.692553 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd7183ce-685d-4fe4-9bc9-5d58c95bc9e0-config\") on node \"crc\" DevicePath \"\"" Mar 09 14:24:20 crc kubenswrapper[4722]: I0309 14:24:20.831441 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 09 14:24:21 crc kubenswrapper[4722]: I0309 14:24:21.108196 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-vnvsd"] Mar 09 14:24:21 crc kubenswrapper[4722]: I0309 14:24:21.291305 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-lvds5"] Mar 09 14:24:21 crc kubenswrapper[4722]: I0309 14:24:21.307077 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-mw6tx"] Mar 09 14:24:21 crc kubenswrapper[4722]: I0309 14:24:21.417575 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-lpmrz" event={"ID":"dd7183ce-685d-4fe4-9bc9-5d58c95bc9e0","Type":"ContainerDied","Data":"6350dcb160191f9e91020920bf73d9e068cf8b1977185682e983f5532d411f46"} Mar 09 14:24:21 crc kubenswrapper[4722]: I0309 14:24:21.417656 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-lpmrz" Mar 09 14:24:21 crc kubenswrapper[4722]: I0309 14:24:21.470881 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-lpmrz"] Mar 09 14:24:21 crc kubenswrapper[4722]: I0309 14:24:21.478107 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-lpmrz"] Mar 09 14:24:21 crc kubenswrapper[4722]: W0309 14:24:21.556451 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ae3ba92_3a9f_4f65_8c32_2b3f4d1f95e7.slice/crio-fe472556cd60ef8c56aa441231ec648ac421358272f3d5ea3c68c4a6b38797a4 WatchSource:0}: Error finding container fe472556cd60ef8c56aa441231ec648ac421358272f3d5ea3c68c4a6b38797a4: Status 404 returned error can't find the container with id fe472556cd60ef8c56aa441231ec648ac421358272f3d5ea3c68c4a6b38797a4 Mar 09 14:24:22 crc kubenswrapper[4722]: I0309 14:24:22.162829 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6dca0862-8223-4ba0-9fc8-eb8c861baa60" path="/var/lib/kubelet/pods/6dca0862-8223-4ba0-9fc8-eb8c861baa60/volumes" Mar 09 14:24:22 crc kubenswrapper[4722]: I0309 14:24:22.163691 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd7183ce-685d-4fe4-9bc9-5d58c95bc9e0" path="/var/lib/kubelet/pods/dd7183ce-685d-4fe4-9bc9-5d58c95bc9e0/volumes" Mar 09 14:24:22 crc kubenswrapper[4722]: I0309 14:24:22.430690 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-lvds5" event={"ID":"0ae3ba92-3a9f-4f65-8c32-2b3f4d1f95e7","Type":"ContainerStarted","Data":"fe472556cd60ef8c56aa441231ec648ac421358272f3d5ea3c68c4a6b38797a4"} Mar 09 14:24:22 crc kubenswrapper[4722]: I0309 14:24:22.433524 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-ui-dashboards-66cbf594b5-bzqhp" event={"ID":"0d15a083-af10-4638-b6bc-9de1f89f123f","Type":"ContainerStarted","Data":"d91716ff6750587b9480e7decfa05826eed853f14bc4611749c48a4d672b27e6"} Mar 09 14:24:22 crc kubenswrapper[4722]: I0309 14:24:22.435811 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-mw6tx" event={"ID":"f28b2af2-0f97-4160-9641-6771f3deb9d1","Type":"ContainerStarted","Data":"ab16ea3bad09fb52b7580211edcad91eefdc36a34d2d5222094a83b0e63f711a"} Mar 09 14:24:22 crc kubenswrapper[4722]: I0309 14:24:22.438092 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"4159e308-3ccf-45d9-a97b-8133542007a8","Type":"ContainerStarted","Data":"80ff28c7cda34665de0bebe76a8a1102a025d02953717f80f765be65e05ede59"} Mar 09 14:24:22 crc kubenswrapper[4722]: I0309 14:24:22.441745 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"a79aaedb-92ba-42fc-8cc7-9ecb007d2ac7","Type":"ContainerStarted","Data":"ed50884a2474119d9434961efa1de2e0e6821dcf7bc3597580905135c8f6ae50"} Mar 09 14:24:22 crc kubenswrapper[4722]: I0309 14:24:22.443989 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551104-vdth8" event={"ID":"b2ae19a8-dc3a-4368-94d0-a8be2f8ed011","Type":"ContainerStarted","Data":"032f4d0bf04298bfee55256aec077d596bd22f1c59ec4dc749b6b9452d36eb42"} Mar 09 14:24:22 crc kubenswrapper[4722]: I0309 14:24:22.445629 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-vnvsd" 
event={"ID":"c4e534de-c45a-48fd-b097-1b3276f0d083","Type":"ContainerStarted","Data":"4f6de60d9235054b4b9cb72f128536455432e99327746a6f5124b5d20744d558"} Mar 09 14:24:22 crc kubenswrapper[4722]: I0309 14:24:22.465129 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-ui-dashboards-66cbf594b5-bzqhp" podStartSLOduration=24.2064419 podStartE2EDuration="29.46510692s" podCreationTimestamp="2026-03-09 14:23:53 +0000 UTC" firstStartedPulling="2026-03-09 14:24:15.122140018 +0000 UTC m=+1295.677708634" lastFinishedPulling="2026-03-09 14:24:20.380805078 +0000 UTC m=+1300.936373654" observedRunningTime="2026-03-09 14:24:22.449182525 +0000 UTC m=+1303.004751101" watchObservedRunningTime="2026-03-09 14:24:22.46510692 +0000 UTC m=+1303.020675516" Mar 09 14:24:22 crc kubenswrapper[4722]: I0309 14:24:22.479152 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=19.141482702 podStartE2EDuration="35.479134373s" podCreationTimestamp="2026-03-09 14:23:47 +0000 UTC" firstStartedPulling="2026-03-09 14:23:56.774683061 +0000 UTC m=+1277.330251637" lastFinishedPulling="2026-03-09 14:24:13.112334732 +0000 UTC m=+1293.667903308" observedRunningTime="2026-03-09 14:24:22.476807661 +0000 UTC m=+1303.032376237" watchObservedRunningTime="2026-03-09 14:24:22.479134373 +0000 UTC m=+1303.034702949" Mar 09 14:24:22 crc kubenswrapper[4722]: I0309 14:24:22.545425 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551104-vdth8" podStartSLOduration=17.128927362 podStartE2EDuration="22.54539921s" podCreationTimestamp="2026-03-09 14:24:00 +0000 UTC" firstStartedPulling="2026-03-09 14:24:15.138788871 +0000 UTC m=+1295.694357447" lastFinishedPulling="2026-03-09 14:24:20.555260719 +0000 UTC m=+1301.110829295" observedRunningTime="2026-03-09 14:24:22.505678721 +0000 UTC m=+1303.061247297" watchObservedRunningTime="2026-03-09 14:24:22.54539921 +0000 UTC m=+1303.100967796" Mar 09 14:24:22 crc kubenswrapper[4722]: I0309 14:24:22.568564 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=33.568530117 podStartE2EDuration="33.568530117s" podCreationTimestamp="2026-03-09 14:23:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:24:22.533342899 +0000 UTC m=+1303.088911475" watchObservedRunningTime="2026-03-09 14:24:22.568530117 +0000 UTC m=+1303.124098693" Mar 09 14:24:23 crc kubenswrapper[4722]: I0309 14:24:23.243880 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-vnvsd"] Mar 09 14:24:23 crc kubenswrapper[4722]: I0309 14:24:23.304943 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-thqfj"] Mar 09 14:24:23 crc kubenswrapper[4722]: I0309 14:24:23.307306 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-thqfj" Mar 09 14:24:23 crc kubenswrapper[4722]: I0309 14:24:23.360288 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-thqfj"] Mar 09 14:24:23 crc kubenswrapper[4722]: I0309 14:24:23.378294 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d029d7d8-83b7-441d-a7b3-014e9ea76618-config\") pod \"dnsmasq-dns-698758b865-thqfj\" (UID: \"d029d7d8-83b7-441d-a7b3-014e9ea76618\") " pod="openstack/dnsmasq-dns-698758b865-thqfj" Mar 09 14:24:23 crc kubenswrapper[4722]: I0309 14:24:23.378493 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hk7r8\" (UniqueName: \"kubernetes.io/projected/d029d7d8-83b7-441d-a7b3-014e9ea76618-kube-api-access-hk7r8\") pod \"dnsmasq-dns-698758b865-thqfj\" (UID: \"d029d7d8-83b7-441d-a7b3-014e9ea76618\") " pod="openstack/dnsmasq-dns-698758b865-thqfj" Mar 09 14:24:23 crc kubenswrapper[4722]: I0309 14:24:23.378639 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d029d7d8-83b7-441d-a7b3-014e9ea76618-dns-svc\") pod \"dnsmasq-dns-698758b865-thqfj\" (UID: \"d029d7d8-83b7-441d-a7b3-014e9ea76618\") " pod="openstack/dnsmasq-dns-698758b865-thqfj" Mar 09 14:24:23 crc kubenswrapper[4722]: I0309 14:24:23.378740 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d029d7d8-83b7-441d-a7b3-014e9ea76618-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-thqfj\" (UID: \"d029d7d8-83b7-441d-a7b3-014e9ea76618\") " pod="openstack/dnsmasq-dns-698758b865-thqfj" Mar 09 14:24:23 crc kubenswrapper[4722]: I0309 14:24:23.378825 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d029d7d8-83b7-441d-a7b3-014e9ea76618-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-thqfj\" (UID: \"d029d7d8-83b7-441d-a7b3-014e9ea76618\") " pod="openstack/dnsmasq-dns-698758b865-thqfj" Mar 09 14:24:23 crc kubenswrapper[4722]: I0309 14:24:23.480821 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hk7r8\" (UniqueName: \"kubernetes.io/projected/d029d7d8-83b7-441d-a7b3-014e9ea76618-kube-api-access-hk7r8\") pod \"dnsmasq-dns-698758b865-thqfj\" (UID: \"d029d7d8-83b7-441d-a7b3-014e9ea76618\") " pod="openstack/dnsmasq-dns-698758b865-thqfj" Mar 09 14:24:23 crc kubenswrapper[4722]: I0309 14:24:23.480890 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d029d7d8-83b7-441d-a7b3-014e9ea76618-dns-svc\") pod \"dnsmasq-dns-698758b865-thqfj\" (UID: \"d029d7d8-83b7-441d-a7b3-014e9ea76618\") " pod="openstack/dnsmasq-dns-698758b865-thqfj" Mar 09 14:24:23 crc kubenswrapper[4722]: I0309 14:24:23.480912 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d029d7d8-83b7-441d-a7b3-014e9ea76618-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-thqfj\" (UID: \"d029d7d8-83b7-441d-a7b3-014e9ea76618\") " pod="openstack/dnsmasq-dns-698758b865-thqfj" Mar 09 14:24:23 crc kubenswrapper[4722]: I0309 14:24:23.480952 4722 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d029d7d8-83b7-441d-a7b3-014e9ea76618-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-thqfj\" (UID: \"d029d7d8-83b7-441d-a7b3-014e9ea76618\") " pod="openstack/dnsmasq-dns-698758b865-thqfj" Mar 09 14:24:23 crc kubenswrapper[4722]: I0309 14:24:23.481042 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d029d7d8-83b7-441d-a7b3-014e9ea76618-config\") pod \"dnsmasq-dns-698758b865-thqfj\" (UID: \"d029d7d8-83b7-441d-a7b3-014e9ea76618\") " pod="openstack/dnsmasq-dns-698758b865-thqfj" Mar 09 14:24:23 crc kubenswrapper[4722]: I0309 14:24:23.482062 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d029d7d8-83b7-441d-a7b3-014e9ea76618-config\") pod \"dnsmasq-dns-698758b865-thqfj\" (UID: \"d029d7d8-83b7-441d-a7b3-014e9ea76618\") " pod="openstack/dnsmasq-dns-698758b865-thqfj" Mar 09 14:24:23 crc kubenswrapper[4722]: I0309 14:24:23.482300 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d029d7d8-83b7-441d-a7b3-014e9ea76618-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-thqfj\" (UID: \"d029d7d8-83b7-441d-a7b3-014e9ea76618\") " pod="openstack/dnsmasq-dns-698758b865-thqfj" Mar 09 14:24:23 crc kubenswrapper[4722]: I0309 14:24:23.482741 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d029d7d8-83b7-441d-a7b3-014e9ea76618-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-thqfj\" (UID: \"d029d7d8-83b7-441d-a7b3-014e9ea76618\") " pod="openstack/dnsmasq-dns-698758b865-thqfj" Mar 09 14:24:23 crc kubenswrapper[4722]: I0309 14:24:23.483065 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d029d7d8-83b7-441d-a7b3-014e9ea76618-dns-svc\") pod \"dnsmasq-dns-698758b865-thqfj\" (UID: \"d029d7d8-83b7-441d-a7b3-014e9ea76618\") " pod="openstack/dnsmasq-dns-698758b865-thqfj" Mar 09 14:24:23 crc kubenswrapper[4722]: I0309 14:24:23.491143 4722 generic.go:334] "Generic (PLEG): container finished" podID="b2ae19a8-dc3a-4368-94d0-a8be2f8ed011" containerID="032f4d0bf04298bfee55256aec077d596bd22f1c59ec4dc749b6b9452d36eb42" exitCode=0 Mar 09 14:24:23 crc kubenswrapper[4722]: I0309 14:24:23.491208 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551104-vdth8" event={"ID":"b2ae19a8-dc3a-4368-94d0-a8be2f8ed011","Type":"ContainerDied","Data":"032f4d0bf04298bfee55256aec077d596bd22f1c59ec4dc749b6b9452d36eb42"} Mar 09 14:24:23 crc kubenswrapper[4722]: I0309 14:24:23.494061 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-b8gzx" event={"ID":"32bc4279-b6a2-4846-801c-ddf3a01db8b2","Type":"ContainerStarted","Data":"9bb24e0782280dc7fec0e88e405e33ea0fc2c896d1a6390ef314b025f54081e3"} Mar 09 14:24:23 crc kubenswrapper[4722]: I0309 14:24:23.494467 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-b8gzx" Mar 09 14:24:23 crc kubenswrapper[4722]: I0309 14:24:23.502531 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-k6ng6" event={"ID":"d6228268-4d1f-464b-b733-a2f308211670","Type":"ContainerStarted","Data":"8e85a592cde392bab415876b16806a13b8b4410e6f679abcdd0c939e9acd53f9"} Mar 09 14:24:23 crc kubenswrapper[4722]: I0309 
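The pod_startup_latency_tracker lines above are internally consistent: podStartE2EDuration is observedRunningTime minus podCreationTimestamp, and podStartSLOduration appears to be that figure minus the time spent pulling images (lastFinishedPulling minus firstStartedPulling). Checking the observability-ui-dashboards entry with a short Go calculation, values copied from the log:

    package main

    import "fmt"

    func main() {
    	// Monotonic m=+ offsets from the log, in seconds.
    	firstStartedPulling := 1295.677708634
    	lastFinishedPulling := 1300.936373654
    	e2e := 29.46510692 // podStartE2EDuration

    	pull := lastFinishedPulling - firstStartedPulling
    	slo := e2e - pull
    	fmt.Printf("image pull took %.9fs\n", pull)      // ~5.258665020s
    	fmt.Printf("podStartSLOduration = %.7fs\n", slo) // ~24.2064419s, matching the log
    }

The openstack-cell1-galera-0 entry fits the same reading: its pull timestamps are the zero time (no pull was needed), so SLO and E2E durations coincide at 33.568530117s.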
Mar 09 14:24:23 crc kubenswrapper[4722]: I0309 14:24:23.513906 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hk7r8\" (UniqueName: \"kubernetes.io/projected/d029d7d8-83b7-441d-a7b3-014e9ea76618-kube-api-access-hk7r8\") pod \"dnsmasq-dns-698758b865-thqfj\" (UID: \"d029d7d8-83b7-441d-a7b3-014e9ea76618\") " pod="openstack/dnsmasq-dns-698758b865-thqfj"
Mar 09 14:24:23 crc kubenswrapper[4722]: I0309 14:24:23.514142 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4d8e49ae-5932-472c-a714-7872980c5a9b","Type":"ContainerStarted","Data":"f46a83809e16de46291a90c86e9dec05353037c7e623d1559e8436af46ab3f4a"}
Mar 09 14:24:23 crc kubenswrapper[4722]: I0309 14:24:23.514976 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Mar 09 14:24:23 crc kubenswrapper[4722]: I0309 14:24:23.517694 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"3f2d1a87-0e77-4753-b87a-39b2b5f333a4","Type":"ContainerStarted","Data":"8fa19286a8b3aa1df0cd6c491ee59a2aa5485d24fbb855dcdae3ec4ebf0b1093"}
Mar 09 14:24:23 crc kubenswrapper[4722]: I0309 14:24:23.566391 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-b8gzx" podStartSLOduration=23.297818902 podStartE2EDuration="28.566372497s" podCreationTimestamp="2026-03-09 14:23:55 +0000 UTC" firstStartedPulling="2026-03-09 14:24:15.126470173 +0000 UTC m=+1295.682038749" lastFinishedPulling="2026-03-09 14:24:20.395023768 +0000 UTC m=+1300.950592344" observedRunningTime="2026-03-09 14:24:23.561743572 +0000 UTC m=+1304.117312148" watchObservedRunningTime="2026-03-09 14:24:23.566372497 +0000 UTC m=+1304.121941073"
Mar 09 14:24:23 crc kubenswrapper[4722]: I0309 14:24:23.594815 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=24.91163497 podStartE2EDuration="31.594779363s" podCreationTimestamp="2026-03-09 14:23:52 +0000 UTC" firstStartedPulling="2026-03-09 14:24:15.183325058 +0000 UTC m=+1295.738893634" lastFinishedPulling="2026-03-09 14:24:21.866469451 +0000 UTC m=+1302.422038027" observedRunningTime="2026-03-09 14:24:23.583154574 +0000 UTC m=+1304.138723160" watchObservedRunningTime="2026-03-09 14:24:23.594779363 +0000 UTC m=+1304.150347949"
Mar 09 14:24:23 crc kubenswrapper[4722]: I0309 14:24:23.665901 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-thqfj"
Mar 09 14:24:24 crc kubenswrapper[4722]: I0309 14:24:24.411394 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"]
Mar 09 14:24:24 crc kubenswrapper[4722]: I0309 14:24:24.419145 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Mar 09 14:24:24 crc kubenswrapper[4722]: I0309 14:24:24.422342 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-gzxb9"
Mar 09 14:24:24 crc kubenswrapper[4722]: I0309 14:24:24.422589 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data"
Mar 09 14:24:24 crc kubenswrapper[4722]: I0309 14:24:24.425220 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files"
Mar 09 14:24:24 crc kubenswrapper[4722]: I0309 14:24:24.426125 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf"
Mar 09 14:24:24 crc kubenswrapper[4722]: I0309 14:24:24.440224 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6dc4c5dd4b-rk4jt"
Mar 09 14:24:24 crc kubenswrapper[4722]: I0309 14:24:24.440260 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-6dc4c5dd4b-rk4jt"
Mar 09 14:24:24 crc kubenswrapper[4722]: I0309 14:24:24.442433 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Mar 09 14:24:24 crc kubenswrapper[4722]: I0309 14:24:24.447445 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6dc4c5dd4b-rk4jt"
Mar 09 14:24:24 crc kubenswrapper[4722]: I0309 14:24:24.508313 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-659550b0-da52-42aa-9b3a-5d222d566682\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-659550b0-da52-42aa-9b3a-5d222d566682\") pod \"swift-storage-0\" (UID: \"7463e84f-f457-4409-9621-507d331e06b5\") " pod="openstack/swift-storage-0"
Mar 09 14:24:24 crc kubenswrapper[4722]: I0309 14:24:24.509723 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/7463e84f-f457-4409-9621-507d331e06b5-lock\") pod \"swift-storage-0\" (UID: \"7463e84f-f457-4409-9621-507d331e06b5\") " pod="openstack/swift-storage-0"
Mar 09 14:24:24 crc kubenswrapper[4722]: I0309 14:24:24.509866 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/7463e84f-f457-4409-9621-507d331e06b5-cache\") pod \"swift-storage-0\" (UID: \"7463e84f-f457-4409-9621-507d331e06b5\") " pod="openstack/swift-storage-0"
Mar 09 14:24:24 crc kubenswrapper[4722]: I0309 14:24:24.510062 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7463e84f-f457-4409-9621-507d331e06b5-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"7463e84f-f457-4409-9621-507d331e06b5\") " pod="openstack/swift-storage-0"
Mar 09 14:24:24 crc kubenswrapper[4722]: I0309 14:24:24.510834 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gfjj\" (UniqueName: \"kubernetes.io/projected/7463e84f-f457-4409-9621-507d331e06b5-kube-api-access-7gfjj\") pod \"swift-storage-0\" (UID: \"7463e84f-f457-4409-9621-507d331e06b5\") " pod="openstack/swift-storage-0"
Mar 09 14:24:24 crc kubenswrapper[4722]: I0309 14:24:24.511510 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7463e84f-f457-4409-9621-507d331e06b5-etc-swift\") pod \"swift-storage-0\" (UID: \"7463e84f-f457-4409-9621-507d331e06b5\") " pod="openstack/swift-storage-0"
Mar 09 14:24:24 crc kubenswrapper[4722]: I0309 14:24:24.531588 4722 generic.go:334] "Generic (PLEG): container finished" podID="0ae3ba92-3a9f-4f65-8c32-2b3f4d1f95e7" containerID="f3a6349731b8bd48668c095339efa4febc5af6a7eb1936c487dd1fc5d805a644" exitCode=0
Mar 09 14:24:24 crc kubenswrapper[4722]: I0309 14:24:24.531633 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-lvds5" event={"ID":"0ae3ba92-3a9f-4f65-8c32-2b3f4d1f95e7","Type":"ContainerDied","Data":"f3a6349731b8bd48668c095339efa4febc5af6a7eb1936c487dd1fc5d805a644"}
Mar 09 14:24:24 crc kubenswrapper[4722]: I0309 14:24:24.532949 4722 generic.go:334] "Generic (PLEG): container finished" podID="d6228268-4d1f-464b-b733-a2f308211670" containerID="8e85a592cde392bab415876b16806a13b8b4410e6f679abcdd0c939e9acd53f9" exitCode=0
Mar 09 14:24:24 crc kubenswrapper[4722]: I0309 14:24:24.533005 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-k6ng6" event={"ID":"d6228268-4d1f-464b-b733-a2f308211670","Type":"ContainerDied","Data":"8e85a592cde392bab415876b16806a13b8b4410e6f679abcdd0c939e9acd53f9"}
Mar 09 14:24:24 crc kubenswrapper[4722]: I0309 14:24:24.535512 4722 generic.go:334] "Generic (PLEG): container finished" podID="c4e534de-c45a-48fd-b097-1b3276f0d083" containerID="ee4e553f04f9a52cc2f0cc3472165d8cba9a10ff1afd2d7039d22faf4042b96c" exitCode=0
Mar 09 14:24:24 crc kubenswrapper[4722]: I0309 14:24:24.539342 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-vnvsd" event={"ID":"c4e534de-c45a-48fd-b097-1b3276f0d083","Type":"ContainerDied","Data":"ee4e553f04f9a52cc2f0cc3472165d8cba9a10ff1afd2d7039d22faf4042b96c"}
Mar 09 14:24:24 crc kubenswrapper[4722]: I0309 14:24:24.541311 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6dc4c5dd4b-rk4jt"
Mar 09 14:24:24 crc kubenswrapper[4722]: I0309 14:24:24.613637 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gfjj\" (UniqueName: \"kubernetes.io/projected/7463e84f-f457-4409-9621-507d331e06b5-kube-api-access-7gfjj\") pod \"swift-storage-0\" (UID: \"7463e84f-f457-4409-9621-507d331e06b5\") " pod="openstack/swift-storage-0"
Mar 09 14:24:24 crc kubenswrapper[4722]: I0309 14:24:24.613908 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7463e84f-f457-4409-9621-507d331e06b5-etc-swift\") pod \"swift-storage-0\" (UID: \"7463e84f-f457-4409-9621-507d331e06b5\") " pod="openstack/swift-storage-0"
Mar 09 14:24:24 crc kubenswrapper[4722]: I0309 14:24:24.614091 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-659550b0-da52-42aa-9b3a-5d222d566682\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-659550b0-da52-42aa-9b3a-5d222d566682\") pod \"swift-storage-0\" (UID: \"7463e84f-f457-4409-9621-507d331e06b5\") " pod="openstack/swift-storage-0"
Mar 09 14:24:24 crc kubenswrapper[4722]: I0309 14:24:24.614249 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/7463e84f-f457-4409-9621-507d331e06b5-lock\") pod \"swift-storage-0\" (UID: \"7463e84f-f457-4409-9621-507d331e06b5\") " pod="openstack/swift-storage-0"
Mar 09 14:24:24 crc kubenswrapper[4722]: I0309 14:24:24.614333 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/7463e84f-f457-4409-9621-507d331e06b5-cache\") pod \"swift-storage-0\" (UID: \"7463e84f-f457-4409-9621-507d331e06b5\") " pod="openstack/swift-storage-0"
Mar 09 14:24:24 crc kubenswrapper[4722]: I0309 14:24:24.614515 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7463e84f-f457-4409-9621-507d331e06b5-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"7463e84f-f457-4409-9621-507d331e06b5\") " pod="openstack/swift-storage-0"
Mar 09 14:24:24 crc kubenswrapper[4722]: I0309 14:24:24.621919 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/7463e84f-f457-4409-9621-507d331e06b5-lock\") pod \"swift-storage-0\" (UID: \"7463e84f-f457-4409-9621-507d331e06b5\") " pod="openstack/swift-storage-0"
Mar 09 14:24:24 crc kubenswrapper[4722]: I0309 14:24:24.622213 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/7463e84f-f457-4409-9621-507d331e06b5-cache\") pod \"swift-storage-0\" (UID: \"7463e84f-f457-4409-9621-507d331e06b5\") " pod="openstack/swift-storage-0"
Mar 09 14:24:24 crc kubenswrapper[4722]: E0309 14:24:24.625427 4722 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 09 14:24:24 crc kubenswrapper[4722]: E0309 14:24:24.625718 4722 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 09 14:24:24 crc kubenswrapper[4722]: I0309 14:24:24.627246 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7463e84f-f457-4409-9621-507d331e06b5-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"7463e84f-f457-4409-9621-507d331e06b5\") " pod="openstack/swift-storage-0"
Mar 09 14:24:24 crc kubenswrapper[4722]: I0309 14:24:24.634555 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 09 14:24:24 crc kubenswrapper[4722]: I0309 14:24:24.634619 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-659550b0-da52-42aa-9b3a-5d222d566682\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-659550b0-da52-42aa-9b3a-5d222d566682\") pod \"swift-storage-0\" (UID: \"7463e84f-f457-4409-9621-507d331e06b5\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/0ce270dd5f4da3f9614c52c1a4fa85509e67929938d3b395c6a91d96ddc8f6cf/globalmount\"" pod="openstack/swift-storage-0"
Mar 09 14:24:24 crc kubenswrapper[4722]: E0309 14:24:24.646071 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7463e84f-f457-4409-9621-507d331e06b5-etc-swift podName:7463e84f-f457-4409-9621-507d331e06b5 nodeName:}" failed. No retries permitted until 2026-03-09 14:24:25.146026797 +0000 UTC m=+1305.701595373 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/7463e84f-f457-4409-9621-507d331e06b5-etc-swift") pod "swift-storage-0" (UID: "7463e84f-f457-4409-9621-507d331e06b5") : configmap "swift-ring-files" not found
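The etc-swift failure above is a projected volume whose ConfigMap source, swift-ring-files, does not yet have usable content at mount time; the swift-ring-rebalance job created shortly afterwards is presumably what publishes it. For reference, a projected ConfigMap source looks roughly like this in the Kubernetes Go API. This is an illustrative reconstruction, not the operator's actual pod spec: marking the source Optional would let the pod start without ring files, so keeping it required and letting the kubelet retry, as happens here, is the safer behavior.

    package main

    import (
    	"fmt"

    	corev1 "k8s.io/api/core/v1" // requires the k8s.io/api module
    )

    // etcSwiftVolume sketches the projected volume seen in the log.
    func etcSwiftVolume() corev1.Volume {
    	optional := false // Optional=false: SetUp fails until the ConfigMap exists
    	return corev1.Volume{
    		Name: "etc-swift",
    		VolumeSource: corev1.VolumeSource{
    			Projected: &corev1.ProjectedVolumeSource{
    				Sources: []corev1.VolumeProjection{{
    					ConfigMap: &corev1.ConfigMapProjection{
    						LocalObjectReference: corev1.LocalObjectReference{Name: "swift-ring-files"},
    						Optional:             &optional,
    					},
    				}},
    			},
    		},
    	}
    }

    func main() {
    	fmt.Printf("%+v\n", etcSwiftVolume())
    }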
Mar 09 14:24:24 crc kubenswrapper[4722]: I0309 14:24:24.685027 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gfjj\" (UniqueName: \"kubernetes.io/projected/7463e84f-f457-4409-9621-507d331e06b5-kube-api-access-7gfjj\") pod \"swift-storage-0\" (UID: \"7463e84f-f457-4409-9621-507d331e06b5\") " pod="openstack/swift-storage-0"
Mar 09 14:24:24 crc kubenswrapper[4722]: I0309 14:24:24.738692 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-659550b0-da52-42aa-9b3a-5d222d566682\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-659550b0-da52-42aa-9b3a-5d222d566682\") pod \"swift-storage-0\" (UID: \"7463e84f-f457-4409-9621-507d331e06b5\") " pod="openstack/swift-storage-0"
Mar 09 14:24:24 crc kubenswrapper[4722]: I0309 14:24:24.756038 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-78fdf7cd4f-mt82k"]
Mar 09 14:24:24 crc kubenswrapper[4722]: I0309 14:24:24.866060 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-vtpb7"]
Mar 09 14:24:24 crc kubenswrapper[4722]: I0309 14:24:24.872546 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-vtpb7"
Mar 09 14:24:24 crc kubenswrapper[4722]: I0309 14:24:24.880770 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data"
Mar 09 14:24:24 crc kubenswrapper[4722]: I0309 14:24:24.880821 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data"
Mar 09 14:24:24 crc kubenswrapper[4722]: I0309 14:24:24.881058 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts"
Mar 09 14:24:24 crc kubenswrapper[4722]: I0309 14:24:24.889534 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-vtpb7"]
Mar 09 14:24:24 crc kubenswrapper[4722]: I0309 14:24:24.934027 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb0983d9-3f03-406f-a485-f89ba50341fc-combined-ca-bundle\") pod \"swift-ring-rebalance-vtpb7\" (UID: \"cb0983d9-3f03-406f-a485-f89ba50341fc\") " pod="openstack/swift-ring-rebalance-vtpb7"
Mar 09 14:24:24 crc kubenswrapper[4722]: I0309 14:24:24.934116 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/cb0983d9-3f03-406f-a485-f89ba50341fc-etc-swift\") pod \"swift-ring-rebalance-vtpb7\" (UID: \"cb0983d9-3f03-406f-a485-f89ba50341fc\") " pod="openstack/swift-ring-rebalance-vtpb7"
Mar 09 14:24:24 crc kubenswrapper[4722]: I0309 14:24:24.934395 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cb0983d9-3f03-406f-a485-f89ba50341fc-scripts\") pod \"swift-ring-rebalance-vtpb7\" (UID: \"cb0983d9-3f03-406f-a485-f89ba50341fc\") " pod="openstack/swift-ring-rebalance-vtpb7"
Mar 09 14:24:24 crc kubenswrapper[4722]: I0309 14:24:24.934501 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/cb0983d9-3f03-406f-a485-f89ba50341fc-ring-data-devices\") pod \"swift-ring-rebalance-vtpb7\" (UID: \"cb0983d9-3f03-406f-a485-f89ba50341fc\") " pod="openstack/swift-ring-rebalance-vtpb7"
Mar 09 14:24:24 crc kubenswrapper[4722]: I0309 14:24:24.934784 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/cb0983d9-3f03-406f-a485-f89ba50341fc-swiftconf\") pod \"swift-ring-rebalance-vtpb7\" (UID: \"cb0983d9-3f03-406f-a485-f89ba50341fc\") " pod="openstack/swift-ring-rebalance-vtpb7"
Mar 09 14:24:24 crc kubenswrapper[4722]: I0309 14:24:24.934896 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/cb0983d9-3f03-406f-a485-f89ba50341fc-dispersionconf\") pod \"swift-ring-rebalance-vtpb7\" (UID: \"cb0983d9-3f03-406f-a485-f89ba50341fc\") " pod="openstack/swift-ring-rebalance-vtpb7"
Mar 09 14:24:24 crc kubenswrapper[4722]: I0309 14:24:24.934921 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nwbd\" (UniqueName: \"kubernetes.io/projected/cb0983d9-3f03-406f-a485-f89ba50341fc-kube-api-access-8nwbd\") pod \"swift-ring-rebalance-vtpb7\" (UID: \"cb0983d9-3f03-406f-a485-f89ba50341fc\") " pod="openstack/swift-ring-rebalance-vtpb7"
Mar 09 14:24:25 crc kubenswrapper[4722]: I0309 14:24:25.037732 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/cb0983d9-3f03-406f-a485-f89ba50341fc-swiftconf\") pod \"swift-ring-rebalance-vtpb7\" (UID: \"cb0983d9-3f03-406f-a485-f89ba50341fc\") " pod="openstack/swift-ring-rebalance-vtpb7"
Mar 09 14:24:25 crc kubenswrapper[4722]: I0309 14:24:25.037815 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/cb0983d9-3f03-406f-a485-f89ba50341fc-dispersionconf\") pod \"swift-ring-rebalance-vtpb7\" (UID: \"cb0983d9-3f03-406f-a485-f89ba50341fc\") " pod="openstack/swift-ring-rebalance-vtpb7"
Mar 09 14:24:25 crc kubenswrapper[4722]: I0309 14:24:25.037845 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nwbd\" (UniqueName: \"kubernetes.io/projected/cb0983d9-3f03-406f-a485-f89ba50341fc-kube-api-access-8nwbd\") pod \"swift-ring-rebalance-vtpb7\" (UID: \"cb0983d9-3f03-406f-a485-f89ba50341fc\") " pod="openstack/swift-ring-rebalance-vtpb7"
Mar 09 14:24:25 crc kubenswrapper[4722]: I0309 14:24:25.037920 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb0983d9-3f03-406f-a485-f89ba50341fc-combined-ca-bundle\") pod \"swift-ring-rebalance-vtpb7\" (UID: \"cb0983d9-3f03-406f-a485-f89ba50341fc\") " pod="openstack/swift-ring-rebalance-vtpb7"
Mar 09 14:24:25 crc kubenswrapper[4722]: I0309 14:24:25.037984 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/cb0983d9-3f03-406f-a485-f89ba50341fc-etc-swift\") pod \"swift-ring-rebalance-vtpb7\" (UID: \"cb0983d9-3f03-406f-a485-f89ba50341fc\") " pod="openstack/swift-ring-rebalance-vtpb7"
Mar 09 14:24:25 crc kubenswrapper[4722]: I0309 14:24:25.038067 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cb0983d9-3f03-406f-a485-f89ba50341fc-scripts\") pod \"swift-ring-rebalance-vtpb7\" (UID: \"cb0983d9-3f03-406f-a485-f89ba50341fc\") " pod="openstack/swift-ring-rebalance-vtpb7"
Mar 09 14:24:25 crc kubenswrapper[4722]: I0309 14:24:25.038106 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/cb0983d9-3f03-406f-a485-f89ba50341fc-ring-data-devices\") pod \"swift-ring-rebalance-vtpb7\" (UID: \"cb0983d9-3f03-406f-a485-f89ba50341fc\") " pod="openstack/swift-ring-rebalance-vtpb7"
Mar 09 14:24:25 crc kubenswrapper[4722]: I0309 14:24:25.040820 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/cb0983d9-3f03-406f-a485-f89ba50341fc-etc-swift\") pod \"swift-ring-rebalance-vtpb7\" (UID: \"cb0983d9-3f03-406f-a485-f89ba50341fc\") " pod="openstack/swift-ring-rebalance-vtpb7"
Mar 09 14:24:25 crc kubenswrapper[4722]: I0309 14:24:25.041123 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cb0983d9-3f03-406f-a485-f89ba50341fc-scripts\") pod \"swift-ring-rebalance-vtpb7\" (UID: \"cb0983d9-3f03-406f-a485-f89ba50341fc\") " pod="openstack/swift-ring-rebalance-vtpb7"
Mar 09 14:24:25 crc kubenswrapper[4722]: I0309 14:24:25.041234 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/cb0983d9-3f03-406f-a485-f89ba50341fc-ring-data-devices\") pod \"swift-ring-rebalance-vtpb7\" (UID: \"cb0983d9-3f03-406f-a485-f89ba50341fc\") " pod="openstack/swift-ring-rebalance-vtpb7"
Mar 09 14:24:25 crc kubenswrapper[4722]: I0309 14:24:25.055579 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/cb0983d9-3f03-406f-a485-f89ba50341fc-dispersionconf\") pod \"swift-ring-rebalance-vtpb7\" (UID: \"cb0983d9-3f03-406f-a485-f89ba50341fc\") " pod="openstack/swift-ring-rebalance-vtpb7"
Mar 09 14:24:25 crc kubenswrapper[4722]: I0309 14:24:25.060990 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb0983d9-3f03-406f-a485-f89ba50341fc-combined-ca-bundle\") pod \"swift-ring-rebalance-vtpb7\" (UID: \"cb0983d9-3f03-406f-a485-f89ba50341fc\") " pod="openstack/swift-ring-rebalance-vtpb7"
Mar 09 14:24:25 crc kubenswrapper[4722]: I0309 14:24:25.063472 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/cb0983d9-3f03-406f-a485-f89ba50341fc-swiftconf\") pod \"swift-ring-rebalance-vtpb7\" (UID: \"cb0983d9-3f03-406f-a485-f89ba50341fc\") " pod="openstack/swift-ring-rebalance-vtpb7"
Mar 09 14:24:25 crc kubenswrapper[4722]: I0309 14:24:25.085716 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nwbd\" (UniqueName: \"kubernetes.io/projected/cb0983d9-3f03-406f-a485-f89ba50341fc-kube-api-access-8nwbd\") pod \"swift-ring-rebalance-vtpb7\" (UID: \"cb0983d9-3f03-406f-a485-f89ba50341fc\") " pod="openstack/swift-ring-rebalance-vtpb7"
Mar 09 14:24:25 crc kubenswrapper[4722]: I0309 14:24:25.196326 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-vtpb7"
Mar 09 14:24:25 crc kubenswrapper[4722]: I0309 14:24:25.243431 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7463e84f-f457-4409-9621-507d331e06b5-etc-swift\") pod \"swift-storage-0\" (UID: \"7463e84f-f457-4409-9621-507d331e06b5\") " pod="openstack/swift-storage-0"
Mar 09 14:24:25 crc kubenswrapper[4722]: E0309 14:24:25.243832 4722 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 09 14:24:25 crc kubenswrapper[4722]: E0309 14:24:25.243861 4722 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 09 14:24:25 crc kubenswrapper[4722]: E0309 14:24:25.243918 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7463e84f-f457-4409-9621-507d331e06b5-etc-swift podName:7463e84f-f457-4409-9621-507d331e06b5 nodeName:}" failed. No retries permitted until 2026-03-09 14:24:26.243897804 +0000 UTC m=+1306.799466380 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/7463e84f-f457-4409-9621-507d331e06b5-etc-swift") pod "swift-storage-0" (UID: "7463e84f-f457-4409-9621-507d331e06b5") : configmap "swift-ring-files" not found
Mar 09 14:24:25 crc kubenswrapper[4722]: I0309 14:24:25.546509 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e0db26a0-2877-48fa-b706-b5558f9973d5","Type":"ContainerStarted","Data":"f2225cc475fe79f373efd2edc2c53e8d89e8acca9f05facf3b97f00de7c31ca4"}
Mar 09 14:24:26 crc kubenswrapper[4722]: I0309 14:24:26.275843 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7463e84f-f457-4409-9621-507d331e06b5-etc-swift\") pod \"swift-storage-0\" (UID: \"7463e84f-f457-4409-9621-507d331e06b5\") " pod="openstack/swift-storage-0"
Mar 09 14:24:26 crc kubenswrapper[4722]: E0309 14:24:26.276484 4722 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 09 14:24:26 crc kubenswrapper[4722]: E0309 14:24:26.276509 4722 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 09 14:24:26 crc kubenswrapper[4722]: E0309 14:24:26.276555 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7463e84f-f457-4409-9621-507d331e06b5-etc-swift podName:7463e84f-f457-4409-9621-507d331e06b5 nodeName:}" failed. No retries permitted until 2026-03-09 14:24:28.276538481 +0000 UTC m=+1308.832107117 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/7463e84f-f457-4409-9621-507d331e06b5-etc-swift") pod "swift-storage-0" (UID: "7463e84f-f457-4409-9621-507d331e06b5") : configmap "swift-ring-files" not found
Mar 09 14:24:26 crc kubenswrapper[4722]: I0309 14:24:26.519853 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-vnvsd"
Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-vnvsd" Mar 09 14:24:26 crc kubenswrapper[4722]: I0309 14:24:26.568905 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551104-vdth8" event={"ID":"b2ae19a8-dc3a-4368-94d0-a8be2f8ed011","Type":"ContainerDied","Data":"8332983898ba97ae251795cebc2139f10795b85cae4b24af4a5d75e074ad1c38"} Mar 09 14:24:26 crc kubenswrapper[4722]: I0309 14:24:26.569004 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8332983898ba97ae251795cebc2139f10795b85cae4b24af4a5d75e074ad1c38" Mar 09 14:24:26 crc kubenswrapper[4722]: I0309 14:24:26.575999 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-vnvsd" Mar 09 14:24:26 crc kubenswrapper[4722]: I0309 14:24:26.576762 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-vnvsd" event={"ID":"c4e534de-c45a-48fd-b097-1b3276f0d083","Type":"ContainerDied","Data":"4f6de60d9235054b4b9cb72f128536455432e99327746a6f5124b5d20744d558"} Mar 09 14:24:26 crc kubenswrapper[4722]: I0309 14:24:26.576810 4722 scope.go:117] "RemoveContainer" containerID="ee4e553f04f9a52cc2f0cc3472165d8cba9a10ff1afd2d7039d22faf4042b96c" Mar 09 14:24:26 crc kubenswrapper[4722]: I0309 14:24:26.581979 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4e534de-c45a-48fd-b097-1b3276f0d083-dns-svc\") pod \"c4e534de-c45a-48fd-b097-1b3276f0d083\" (UID: \"c4e534de-c45a-48fd-b097-1b3276f0d083\") " Mar 09 14:24:26 crc kubenswrapper[4722]: I0309 14:24:26.582043 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-knvf9\" (UniqueName: \"kubernetes.io/projected/c4e534de-c45a-48fd-b097-1b3276f0d083-kube-api-access-knvf9\") pod \"c4e534de-c45a-48fd-b097-1b3276f0d083\" (UID: \"c4e534de-c45a-48fd-b097-1b3276f0d083\") " Mar 09 14:24:26 crc kubenswrapper[4722]: I0309 14:24:26.582109 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c4e534de-c45a-48fd-b097-1b3276f0d083-ovsdbserver-sb\") pod \"c4e534de-c45a-48fd-b097-1b3276f0d083\" (UID: \"c4e534de-c45a-48fd-b097-1b3276f0d083\") " Mar 09 14:24:26 crc kubenswrapper[4722]: I0309 14:24:26.582342 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4e534de-c45a-48fd-b097-1b3276f0d083-config\") pod \"c4e534de-c45a-48fd-b097-1b3276f0d083\" (UID: \"c4e534de-c45a-48fd-b097-1b3276f0d083\") " Mar 09 14:24:26 crc kubenswrapper[4722]: I0309 14:24:26.588133 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4e534de-c45a-48fd-b097-1b3276f0d083-kube-api-access-knvf9" (OuterVolumeSpecName: "kube-api-access-knvf9") pod "c4e534de-c45a-48fd-b097-1b3276f0d083" (UID: "c4e534de-c45a-48fd-b097-1b3276f0d083"). InnerVolumeSpecName "kube-api-access-knvf9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:24:26 crc kubenswrapper[4722]: I0309 14:24:26.613021 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4e534de-c45a-48fd-b097-1b3276f0d083-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c4e534de-c45a-48fd-b097-1b3276f0d083" (UID: "c4e534de-c45a-48fd-b097-1b3276f0d083"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:24:26 crc kubenswrapper[4722]: I0309 14:24:26.617599 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4e534de-c45a-48fd-b097-1b3276f0d083-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c4e534de-c45a-48fd-b097-1b3276f0d083" (UID: "c4e534de-c45a-48fd-b097-1b3276f0d083"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:24:26 crc kubenswrapper[4722]: I0309 14:24:26.628068 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4e534de-c45a-48fd-b097-1b3276f0d083-config" (OuterVolumeSpecName: "config") pod "c4e534de-c45a-48fd-b097-1b3276f0d083" (UID: "c4e534de-c45a-48fd-b097-1b3276f0d083"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:24:26 crc kubenswrapper[4722]: I0309 14:24:26.687845 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4e534de-c45a-48fd-b097-1b3276f0d083-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 14:24:26 crc kubenswrapper[4722]: I0309 14:24:26.687872 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-knvf9\" (UniqueName: \"kubernetes.io/projected/c4e534de-c45a-48fd-b097-1b3276f0d083-kube-api-access-knvf9\") on node \"crc\" DevicePath \"\"" Mar 09 14:24:26 crc kubenswrapper[4722]: I0309 14:24:26.687882 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c4e534de-c45a-48fd-b097-1b3276f0d083-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 09 14:24:26 crc kubenswrapper[4722]: I0309 14:24:26.687892 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4e534de-c45a-48fd-b097-1b3276f0d083-config\") on node \"crc\" DevicePath \"\"" Mar 09 14:24:26 crc kubenswrapper[4722]: I0309 14:24:26.717738 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551104-vdth8" Mar 09 14:24:26 crc kubenswrapper[4722]: I0309 14:24:26.789621 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zs8z4\" (UniqueName: \"kubernetes.io/projected/b2ae19a8-dc3a-4368-94d0-a8be2f8ed011-kube-api-access-zs8z4\") pod \"b2ae19a8-dc3a-4368-94d0-a8be2f8ed011\" (UID: \"b2ae19a8-dc3a-4368-94d0-a8be2f8ed011\") " Mar 09 14:24:26 crc kubenswrapper[4722]: I0309 14:24:26.804098 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2ae19a8-dc3a-4368-94d0-a8be2f8ed011-kube-api-access-zs8z4" (OuterVolumeSpecName: "kube-api-access-zs8z4") pod "b2ae19a8-dc3a-4368-94d0-a8be2f8ed011" (UID: "b2ae19a8-dc3a-4368-94d0-a8be2f8ed011"). InnerVolumeSpecName "kube-api-access-zs8z4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:24:26 crc kubenswrapper[4722]: I0309 14:24:26.892815 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zs8z4\" (UniqueName: \"kubernetes.io/projected/b2ae19a8-dc3a-4368-94d0-a8be2f8ed011-kube-api-access-zs8z4\") on node \"crc\" DevicePath \"\"" Mar 09 14:24:26 crc kubenswrapper[4722]: I0309 14:24:26.976265 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-vnvsd"] Mar 09 14:24:26 crc kubenswrapper[4722]: I0309 14:24:26.989157 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-vnvsd"] Mar 09 14:24:27 crc kubenswrapper[4722]: I0309 14:24:27.101821 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-vtpb7"] Mar 09 14:24:27 crc kubenswrapper[4722]: I0309 14:24:27.148738 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-thqfj"] Mar 09 14:24:27 crc kubenswrapper[4722]: I0309 14:24:27.585713 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-vtpb7" event={"ID":"cb0983d9-3f03-406f-a485-f89ba50341fc","Type":"ContainerStarted","Data":"e23d446c9c64a15350b2cbdfc8a3c98a4bb27e01afffe096e3d9b12ec895189b"} Mar 09 14:24:27 crc kubenswrapper[4722]: I0309 14:24:27.588005 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-lvds5" event={"ID":"0ae3ba92-3a9f-4f65-8c32-2b3f4d1f95e7","Type":"ContainerStarted","Data":"f6534d1e16ac3434c37eec26b3cbd5e4e89ad65721ae34a000ce2ae8185d8263"} Mar 09 14:24:27 crc kubenswrapper[4722]: I0309 14:24:27.588179 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-lvds5" Mar 09 14:24:27 crc kubenswrapper[4722]: I0309 14:24:27.590405 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-k6ng6" event={"ID":"d6228268-4d1f-464b-b733-a2f308211670","Type":"ContainerStarted","Data":"6718f932bb601c0533545103893ff8701fef352f875c1ac0565f7d5596deadb9"} Mar 09 14:24:27 crc kubenswrapper[4722]: I0309 14:24:27.590606 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-k6ng6" Mar 09 14:24:27 crc kubenswrapper[4722]: I0309 14:24:27.592253 4722 generic.go:334] "Generic (PLEG): container finished" podID="d029d7d8-83b7-441d-a7b3-014e9ea76618" containerID="938cb38ce445ab609ed1a74b409c8033ceeaf1593440b7134131222510a86963" exitCode=0 Mar 09 14:24:27 crc kubenswrapper[4722]: I0309 14:24:27.592348 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-thqfj" event={"ID":"d029d7d8-83b7-441d-a7b3-014e9ea76618","Type":"ContainerDied","Data":"938cb38ce445ab609ed1a74b409c8033ceeaf1593440b7134131222510a86963"} Mar 09 14:24:27 crc kubenswrapper[4722]: I0309 14:24:27.592383 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-thqfj" event={"ID":"d029d7d8-83b7-441d-a7b3-014e9ea76618","Type":"ContainerStarted","Data":"da7eedebb6465c8445304593b4e941d077cea8c3545d9ea5426f85503f9413e5"} Mar 09 14:24:27 crc kubenswrapper[4722]: I0309 14:24:27.594949 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"9aed43b2-88da-4388-b0ce-77699c8f978c","Type":"ContainerStarted","Data":"ca1ea97c85da926804a21069a5fc803fb8237129737b846f76c06dce91055b6d"} Mar 09 14:24:27 crc kubenswrapper[4722]: I0309 14:24:27.597003 4722 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-mw6tx" event={"ID":"f28b2af2-0f97-4160-9641-6771f3deb9d1","Type":"ContainerStarted","Data":"11b6054c1d39a961d4976e11313188cbb68e9a74678d23e9281dc4fc282e2a83"} Mar 09 14:24:27 crc kubenswrapper[4722]: I0309 14:24:27.607099 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551104-vdth8" Mar 09 14:24:27 crc kubenswrapper[4722]: I0309 14:24:27.608472 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"3f2d1a87-0e77-4753-b87a-39b2b5f333a4","Type":"ContainerStarted","Data":"4e15880b6bb03b9ba7e79fc05b6e1e28df6abbd8817729059330eb65392b097f"} Mar 09 14:24:27 crc kubenswrapper[4722]: I0309 14:24:27.618923 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-lvds5" podStartSLOduration=8.06545704 podStartE2EDuration="8.618906295s" podCreationTimestamp="2026-03-09 14:24:19 +0000 UTC" firstStartedPulling="2026-03-09 14:24:21.631116847 +0000 UTC m=+1302.186685423" lastFinishedPulling="2026-03-09 14:24:22.184566102 +0000 UTC m=+1302.740134678" observedRunningTime="2026-03-09 14:24:27.613979493 +0000 UTC m=+1308.169548079" watchObservedRunningTime="2026-03-09 14:24:27.618906295 +0000 UTC m=+1308.174474871" Mar 09 14:24:27 crc kubenswrapper[4722]: I0309 14:24:27.658553 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-k6ng6" podStartSLOduration=26.423311919 podStartE2EDuration="32.658527561s" podCreationTimestamp="2026-03-09 14:23:55 +0000 UTC" firstStartedPulling="2026-03-09 14:24:14.155353726 +0000 UTC m=+1294.710922302" lastFinishedPulling="2026-03-09 14:24:20.390569368 +0000 UTC m=+1300.946137944" observedRunningTime="2026-03-09 14:24:27.641890207 +0000 UTC m=+1308.197458793" watchObservedRunningTime="2026-03-09 14:24:27.658527561 +0000 UTC m=+1308.214096137" Mar 09 14:24:27 crc kubenswrapper[4722]: I0309 14:24:27.669827 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=21.217121776 podStartE2EDuration="32.669810491s" podCreationTimestamp="2026-03-09 14:23:55 +0000 UTC" firstStartedPulling="2026-03-09 14:24:15.111454613 +0000 UTC m=+1295.667023229" lastFinishedPulling="2026-03-09 14:24:26.564143368 +0000 UTC m=+1307.119711944" observedRunningTime="2026-03-09 14:24:27.66937322 +0000 UTC m=+1308.224941806" watchObservedRunningTime="2026-03-09 14:24:27.669810491 +0000 UTC m=+1308.225379057" Mar 09 14:24:27 crc kubenswrapper[4722]: I0309 14:24:27.692874 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=18.583475703 podStartE2EDuration="29.692851716s" podCreationTimestamp="2026-03-09 14:23:58 +0000 UTC" firstStartedPulling="2026-03-09 14:24:15.47261348 +0000 UTC m=+1296.028182056" lastFinishedPulling="2026-03-09 14:24:26.581989493 +0000 UTC m=+1307.137558069" observedRunningTime="2026-03-09 14:24:27.687142574 +0000 UTC m=+1308.242711170" watchObservedRunningTime="2026-03-09 14:24:27.692851716 +0000 UTC m=+1308.248420312" Mar 09 14:24:27 crc kubenswrapper[4722]: I0309 14:24:27.789747 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-mw6tx" podStartSLOduration=3.780552149 podStartE2EDuration="8.789727978s" podCreationTimestamp="2026-03-09 14:24:19 +0000 UTC" firstStartedPulling="2026-03-09 
14:24:21.554700271 +0000 UTC m=+1302.110268847" lastFinishedPulling="2026-03-09 14:24:26.5638761 +0000 UTC m=+1307.119444676" observedRunningTime="2026-03-09 14:24:27.723157563 +0000 UTC m=+1308.278726149" watchObservedRunningTime="2026-03-09 14:24:27.789727978 +0000 UTC m=+1308.345296554" Mar 09 14:24:27 crc kubenswrapper[4722]: I0309 14:24:27.851431 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551098-rb55h"] Mar 09 14:24:27 crc kubenswrapper[4722]: I0309 14:24:27.866793 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551098-rb55h"] Mar 09 14:24:28 crc kubenswrapper[4722]: I0309 14:24:28.162615 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9020ca81-ccfb-4a03-ac36-ae9b259c68b1" path="/var/lib/kubelet/pods/9020ca81-ccfb-4a03-ac36-ae9b259c68b1/volumes" Mar 09 14:24:28 crc kubenswrapper[4722]: I0309 14:24:28.164069 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4e534de-c45a-48fd-b097-1b3276f0d083" path="/var/lib/kubelet/pods/c4e534de-c45a-48fd-b097-1b3276f0d083/volumes" Mar 09 14:24:28 crc kubenswrapper[4722]: I0309 14:24:28.340474 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7463e84f-f457-4409-9621-507d331e06b5-etc-swift\") pod \"swift-storage-0\" (UID: \"7463e84f-f457-4409-9621-507d331e06b5\") " pod="openstack/swift-storage-0" Mar 09 14:24:28 crc kubenswrapper[4722]: E0309 14:24:28.341721 4722 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 09 14:24:28 crc kubenswrapper[4722]: E0309 14:24:28.341745 4722 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 09 14:24:28 crc kubenswrapper[4722]: E0309 14:24:28.342188 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7463e84f-f457-4409-9621-507d331e06b5-etc-swift podName:7463e84f-f457-4409-9621-507d331e06b5 nodeName:}" failed. No retries permitted until 2026-03-09 14:24:32.342164475 +0000 UTC m=+1312.897733051 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/7463e84f-f457-4409-9621-507d331e06b5-etc-swift") pod "swift-storage-0" (UID: "7463e84f-f457-4409-9621-507d331e06b5") : configmap "swift-ring-files" not found Mar 09 14:24:28 crc kubenswrapper[4722]: I0309 14:24:28.618678 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-k6ng6" event={"ID":"d6228268-4d1f-464b-b733-a2f308211670","Type":"ContainerStarted","Data":"84f9a18197b8d2b884dcd9d8346eece0884e2f801be5f4565680b933a2d67fe2"} Mar 09 14:24:28 crc kubenswrapper[4722]: I0309 14:24:28.618727 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-k6ng6" Mar 09 14:24:28 crc kubenswrapper[4722]: I0309 14:24:28.621387 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-thqfj" event={"ID":"d029d7d8-83b7-441d-a7b3-014e9ea76618","Type":"ContainerStarted","Data":"96d495a9e82f8249a90710718e21a86830a2a6a682c4099fa1ca5d36da70d331"} Mar 09 14:24:28 crc kubenswrapper[4722]: I0309 14:24:28.622168 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-thqfj" Mar 09 14:24:28 crc kubenswrapper[4722]: I0309 14:24:28.675741 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-thqfj" podStartSLOduration=5.675713066 podStartE2EDuration="5.675713066s" podCreationTimestamp="2026-03-09 14:24:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:24:28.659707299 +0000 UTC m=+1309.215275875" watchObservedRunningTime="2026-03-09 14:24:28.675713066 +0000 UTC m=+1309.231281642" Mar 09 14:24:29 crc kubenswrapper[4722]: I0309 14:24:29.149819 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 09 14:24:29 crc kubenswrapper[4722]: I0309 14:24:29.149911 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 09 14:24:29 crc kubenswrapper[4722]: I0309 14:24:29.258781 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 09 14:24:29 crc kubenswrapper[4722]: I0309 14:24:29.718007 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 09 14:24:29 crc kubenswrapper[4722]: I0309 14:24:29.775581 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 09 14:24:29 crc kubenswrapper[4722]: I0309 14:24:29.829340 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 09 14:24:30 crc kubenswrapper[4722]: I0309 14:24:30.042804 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 09 14:24:30 crc kubenswrapper[4722]: I0309 14:24:30.042911 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 09 14:24:30 crc kubenswrapper[4722]: I0309 14:24:30.101805 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 09 14:24:30 crc kubenswrapper[4722]: I0309 14:24:30.506711 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 09 14:24:30 crc kubenswrapper[4722]: I0309 
14:24:30.506867 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 09 14:24:30 crc kubenswrapper[4722]: I0309 14:24:30.605848 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 09 14:24:30 crc kubenswrapper[4722]: I0309 14:24:30.651635 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 09 14:24:30 crc kubenswrapper[4722]: I0309 14:24:30.697999 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 09 14:24:30 crc kubenswrapper[4722]: I0309 14:24:30.709047 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 09 14:24:30 crc kubenswrapper[4722]: I0309 14:24:30.767914 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 09 14:24:31 crc kubenswrapper[4722]: I0309 14:24:31.058068 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 09 14:24:31 crc kubenswrapper[4722]: E0309 14:24:31.058516 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4e534de-c45a-48fd-b097-1b3276f0d083" containerName="init" Mar 09 14:24:31 crc kubenswrapper[4722]: I0309 14:24:31.058534 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4e534de-c45a-48fd-b097-1b3276f0d083" containerName="init" Mar 09 14:24:31 crc kubenswrapper[4722]: E0309 14:24:31.058563 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2ae19a8-dc3a-4368-94d0-a8be2f8ed011" containerName="oc" Mar 09 14:24:31 crc kubenswrapper[4722]: I0309 14:24:31.058570 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2ae19a8-dc3a-4368-94d0-a8be2f8ed011" containerName="oc" Mar 09 14:24:31 crc kubenswrapper[4722]: I0309 14:24:31.058797 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2ae19a8-dc3a-4368-94d0-a8be2f8ed011" containerName="oc" Mar 09 14:24:31 crc kubenswrapper[4722]: I0309 14:24:31.058826 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4e534de-c45a-48fd-b097-1b3276f0d083" containerName="init" Mar 09 14:24:31 crc kubenswrapper[4722]: I0309 14:24:31.060196 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 09 14:24:31 crc kubenswrapper[4722]: I0309 14:24:31.069159 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 09 14:24:31 crc kubenswrapper[4722]: I0309 14:24:31.069289 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-zdhn2" Mar 09 14:24:31 crc kubenswrapper[4722]: I0309 14:24:31.069305 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 09 14:24:31 crc kubenswrapper[4722]: I0309 14:24:31.069431 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 09 14:24:31 crc kubenswrapper[4722]: I0309 14:24:31.097430 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 09 14:24:31 crc kubenswrapper[4722]: I0309 14:24:31.113814 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6585120e-2007-43b3-a72d-4e80fb7ab2fb-config\") pod \"ovn-northd-0\" (UID: \"6585120e-2007-43b3-a72d-4e80fb7ab2fb\") " pod="openstack/ovn-northd-0" Mar 09 14:24:31 crc kubenswrapper[4722]: I0309 14:24:31.114196 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6585120e-2007-43b3-a72d-4e80fb7ab2fb-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"6585120e-2007-43b3-a72d-4e80fb7ab2fb\") " pod="openstack/ovn-northd-0" Mar 09 14:24:31 crc kubenswrapper[4722]: I0309 14:24:31.114360 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6585120e-2007-43b3-a72d-4e80fb7ab2fb-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"6585120e-2007-43b3-a72d-4e80fb7ab2fb\") " pod="openstack/ovn-northd-0" Mar 09 14:24:31 crc kubenswrapper[4722]: I0309 14:24:31.114461 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/6585120e-2007-43b3-a72d-4e80fb7ab2fb-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"6585120e-2007-43b3-a72d-4e80fb7ab2fb\") " pod="openstack/ovn-northd-0" Mar 09 14:24:31 crc kubenswrapper[4722]: I0309 14:24:31.115068 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6585120e-2007-43b3-a72d-4e80fb7ab2fb-scripts\") pod \"ovn-northd-0\" (UID: \"6585120e-2007-43b3-a72d-4e80fb7ab2fb\") " pod="openstack/ovn-northd-0" Mar 09 14:24:31 crc kubenswrapper[4722]: I0309 14:24:31.115172 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6585120e-2007-43b3-a72d-4e80fb7ab2fb-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"6585120e-2007-43b3-a72d-4e80fb7ab2fb\") " pod="openstack/ovn-northd-0" Mar 09 14:24:31 crc kubenswrapper[4722]: I0309 14:24:31.115272 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66gk5\" (UniqueName: \"kubernetes.io/projected/6585120e-2007-43b3-a72d-4e80fb7ab2fb-kube-api-access-66gk5\") pod \"ovn-northd-0\" (UID: \"6585120e-2007-43b3-a72d-4e80fb7ab2fb\") " pod="openstack/ovn-northd-0" Mar 09 14:24:31 crc kubenswrapper[4722]: 
I0309 14:24:31.135515 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-fd4g6"] Mar 09 14:24:31 crc kubenswrapper[4722]: I0309 14:24:31.136967 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-fd4g6" Mar 09 14:24:31 crc kubenswrapper[4722]: I0309 14:24:31.154089 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-fd4g6"] Mar 09 14:24:31 crc kubenswrapper[4722]: I0309 14:24:31.163796 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-33f7-account-create-update-tc7x6"] Mar 09 14:24:31 crc kubenswrapper[4722]: I0309 14:24:31.165192 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-33f7-account-create-update-tc7x6" Mar 09 14:24:31 crc kubenswrapper[4722]: I0309 14:24:31.167349 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 09 14:24:31 crc kubenswrapper[4722]: I0309 14:24:31.183104 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-33f7-account-create-update-tc7x6"] Mar 09 14:24:31 crc kubenswrapper[4722]: I0309 14:24:31.217034 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qbgb\" (UniqueName: \"kubernetes.io/projected/1d9d7716-b622-4ae7-9f3a-480c5807525b-kube-api-access-5qbgb\") pod \"glance-33f7-account-create-update-tc7x6\" (UID: \"1d9d7716-b622-4ae7-9f3a-480c5807525b\") " pod="openstack/glance-33f7-account-create-update-tc7x6" Mar 09 14:24:31 crc kubenswrapper[4722]: I0309 14:24:31.217111 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6585120e-2007-43b3-a72d-4e80fb7ab2fb-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"6585120e-2007-43b3-a72d-4e80fb7ab2fb\") " pod="openstack/ovn-northd-0" Mar 09 14:24:31 crc kubenswrapper[4722]: I0309 14:24:31.217192 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6585120e-2007-43b3-a72d-4e80fb7ab2fb-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"6585120e-2007-43b3-a72d-4e80fb7ab2fb\") " pod="openstack/ovn-northd-0" Mar 09 14:24:31 crc kubenswrapper[4722]: I0309 14:24:31.217307 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pfvd\" (UniqueName: \"kubernetes.io/projected/b730192a-1410-4860-a847-f5e5974fd728-kube-api-access-7pfvd\") pod \"glance-db-create-fd4g6\" (UID: \"b730192a-1410-4860-a847-f5e5974fd728\") " pod="openstack/glance-db-create-fd4g6" Mar 09 14:24:31 crc kubenswrapper[4722]: I0309 14:24:31.217367 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/6585120e-2007-43b3-a72d-4e80fb7ab2fb-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"6585120e-2007-43b3-a72d-4e80fb7ab2fb\") " pod="openstack/ovn-northd-0" Mar 09 14:24:31 crc kubenswrapper[4722]: I0309 14:24:31.217392 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6585120e-2007-43b3-a72d-4e80fb7ab2fb-scripts\") pod \"ovn-northd-0\" (UID: \"6585120e-2007-43b3-a72d-4e80fb7ab2fb\") " pod="openstack/ovn-northd-0" Mar 09 14:24:31 crc kubenswrapper[4722]: I0309 14:24:31.217454 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b730192a-1410-4860-a847-f5e5974fd728-operator-scripts\") pod \"glance-db-create-fd4g6\" (UID: \"b730192a-1410-4860-a847-f5e5974fd728\") " pod="openstack/glance-db-create-fd4g6" Mar 09 14:24:31 crc kubenswrapper[4722]: I0309 14:24:31.217480 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6585120e-2007-43b3-a72d-4e80fb7ab2fb-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"6585120e-2007-43b3-a72d-4e80fb7ab2fb\") " pod="openstack/ovn-northd-0" Mar 09 14:24:31 crc kubenswrapper[4722]: I0309 14:24:31.217504 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66gk5\" (UniqueName: \"kubernetes.io/projected/6585120e-2007-43b3-a72d-4e80fb7ab2fb-kube-api-access-66gk5\") pod \"ovn-northd-0\" (UID: \"6585120e-2007-43b3-a72d-4e80fb7ab2fb\") " pod="openstack/ovn-northd-0" Mar 09 14:24:31 crc kubenswrapper[4722]: I0309 14:24:31.217577 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d9d7716-b622-4ae7-9f3a-480c5807525b-operator-scripts\") pod \"glance-33f7-account-create-update-tc7x6\" (UID: \"1d9d7716-b622-4ae7-9f3a-480c5807525b\") " pod="openstack/glance-33f7-account-create-update-tc7x6" Mar 09 14:24:31 crc kubenswrapper[4722]: I0309 14:24:31.217608 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6585120e-2007-43b3-a72d-4e80fb7ab2fb-config\") pod \"ovn-northd-0\" (UID: \"6585120e-2007-43b3-a72d-4e80fb7ab2fb\") " pod="openstack/ovn-northd-0" Mar 09 14:24:31 crc kubenswrapper[4722]: I0309 14:24:31.218295 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/6585120e-2007-43b3-a72d-4e80fb7ab2fb-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"6585120e-2007-43b3-a72d-4e80fb7ab2fb\") " pod="openstack/ovn-northd-0" Mar 09 14:24:31 crc kubenswrapper[4722]: I0309 14:24:31.219052 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6585120e-2007-43b3-a72d-4e80fb7ab2fb-config\") pod \"ovn-northd-0\" (UID: \"6585120e-2007-43b3-a72d-4e80fb7ab2fb\") " pod="openstack/ovn-northd-0" Mar 09 14:24:31 crc kubenswrapper[4722]: I0309 14:24:31.219594 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6585120e-2007-43b3-a72d-4e80fb7ab2fb-scripts\") pod \"ovn-northd-0\" (UID: \"6585120e-2007-43b3-a72d-4e80fb7ab2fb\") " pod="openstack/ovn-northd-0" Mar 09 14:24:31 crc kubenswrapper[4722]: I0309 14:24:31.223800 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6585120e-2007-43b3-a72d-4e80fb7ab2fb-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"6585120e-2007-43b3-a72d-4e80fb7ab2fb\") " pod="openstack/ovn-northd-0" Mar 09 14:24:31 crc kubenswrapper[4722]: I0309 14:24:31.224674 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/6585120e-2007-43b3-a72d-4e80fb7ab2fb-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"6585120e-2007-43b3-a72d-4e80fb7ab2fb\") " pod="openstack/ovn-northd-0" Mar 09 
14:24:31 crc kubenswrapper[4722]: I0309 14:24:31.230919 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/6585120e-2007-43b3-a72d-4e80fb7ab2fb-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"6585120e-2007-43b3-a72d-4e80fb7ab2fb\") " pod="openstack/ovn-northd-0" Mar 09 14:24:31 crc kubenswrapper[4722]: I0309 14:24:31.237327 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66gk5\" (UniqueName: \"kubernetes.io/projected/6585120e-2007-43b3-a72d-4e80fb7ab2fb-kube-api-access-66gk5\") pod \"ovn-northd-0\" (UID: \"6585120e-2007-43b3-a72d-4e80fb7ab2fb\") " pod="openstack/ovn-northd-0" Mar 09 14:24:31 crc kubenswrapper[4722]: I0309 14:24:31.319728 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d9d7716-b622-4ae7-9f3a-480c5807525b-operator-scripts\") pod \"glance-33f7-account-create-update-tc7x6\" (UID: \"1d9d7716-b622-4ae7-9f3a-480c5807525b\") " pod="openstack/glance-33f7-account-create-update-tc7x6" Mar 09 14:24:31 crc kubenswrapper[4722]: I0309 14:24:31.320105 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qbgb\" (UniqueName: \"kubernetes.io/projected/1d9d7716-b622-4ae7-9f3a-480c5807525b-kube-api-access-5qbgb\") pod \"glance-33f7-account-create-update-tc7x6\" (UID: \"1d9d7716-b622-4ae7-9f3a-480c5807525b\") " pod="openstack/glance-33f7-account-create-update-tc7x6" Mar 09 14:24:31 crc kubenswrapper[4722]: I0309 14:24:31.320190 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pfvd\" (UniqueName: \"kubernetes.io/projected/b730192a-1410-4860-a847-f5e5974fd728-kube-api-access-7pfvd\") pod \"glance-db-create-fd4g6\" (UID: \"b730192a-1410-4860-a847-f5e5974fd728\") " pod="openstack/glance-db-create-fd4g6" Mar 09 14:24:31 crc kubenswrapper[4722]: I0309 14:24:31.320276 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b730192a-1410-4860-a847-f5e5974fd728-operator-scripts\") pod \"glance-db-create-fd4g6\" (UID: \"b730192a-1410-4860-a847-f5e5974fd728\") " pod="openstack/glance-db-create-fd4g6" Mar 09 14:24:31 crc kubenswrapper[4722]: I0309 14:24:31.320611 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d9d7716-b622-4ae7-9f3a-480c5807525b-operator-scripts\") pod \"glance-33f7-account-create-update-tc7x6\" (UID: \"1d9d7716-b622-4ae7-9f3a-480c5807525b\") " pod="openstack/glance-33f7-account-create-update-tc7x6" Mar 09 14:24:31 crc kubenswrapper[4722]: I0309 14:24:31.320923 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b730192a-1410-4860-a847-f5e5974fd728-operator-scripts\") pod \"glance-db-create-fd4g6\" (UID: \"b730192a-1410-4860-a847-f5e5974fd728\") " pod="openstack/glance-db-create-fd4g6" Mar 09 14:24:31 crc kubenswrapper[4722]: I0309 14:24:31.338756 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qbgb\" (UniqueName: \"kubernetes.io/projected/1d9d7716-b622-4ae7-9f3a-480c5807525b-kube-api-access-5qbgb\") pod \"glance-33f7-account-create-update-tc7x6\" (UID: \"1d9d7716-b622-4ae7-9f3a-480c5807525b\") " pod="openstack/glance-33f7-account-create-update-tc7x6" Mar 09 14:24:31 crc 
kubenswrapper[4722]: I0309 14:24:31.338746 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pfvd\" (UniqueName: \"kubernetes.io/projected/b730192a-1410-4860-a847-f5e5974fd728-kube-api-access-7pfvd\") pod \"glance-db-create-fd4g6\" (UID: \"b730192a-1410-4860-a847-f5e5974fd728\") " pod="openstack/glance-db-create-fd4g6" Mar 09 14:24:31 crc kubenswrapper[4722]: I0309 14:24:31.385336 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 09 14:24:31 crc kubenswrapper[4722]: I0309 14:24:31.464372 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-fd4g6" Mar 09 14:24:31 crc kubenswrapper[4722]: I0309 14:24:31.486896 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-33f7-account-create-update-tc7x6" Mar 09 14:24:31 crc kubenswrapper[4722]: I0309 14:24:31.661531 4722 generic.go:334] "Generic (PLEG): container finished" podID="e0db26a0-2877-48fa-b706-b5558f9973d5" containerID="f2225cc475fe79f373efd2edc2c53e8d89e8acca9f05facf3b97f00de7c31ca4" exitCode=0 Mar 09 14:24:31 crc kubenswrapper[4722]: I0309 14:24:31.661638 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e0db26a0-2877-48fa-b706-b5558f9973d5","Type":"ContainerDied","Data":"f2225cc475fe79f373efd2edc2c53e8d89e8acca9f05facf3b97f00de7c31ca4"} Mar 09 14:24:31 crc kubenswrapper[4722]: I0309 14:24:31.805465 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-4qrms"] Mar 09 14:24:31 crc kubenswrapper[4722]: I0309 14:24:31.807050 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-4qrms" Mar 09 14:24:31 crc kubenswrapper[4722]: I0309 14:24:31.821936 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-4qrms"] Mar 09 14:24:31 crc kubenswrapper[4722]: I0309 14:24:31.834275 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9btkk\" (UniqueName: \"kubernetes.io/projected/5397205b-9c4a-4575-8bfa-8604e88784e9-kube-api-access-9btkk\") pod \"keystone-db-create-4qrms\" (UID: \"5397205b-9c4a-4575-8bfa-8604e88784e9\") " pod="openstack/keystone-db-create-4qrms" Mar 09 14:24:31 crc kubenswrapper[4722]: I0309 14:24:31.834447 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5397205b-9c4a-4575-8bfa-8604e88784e9-operator-scripts\") pod \"keystone-db-create-4qrms\" (UID: \"5397205b-9c4a-4575-8bfa-8604e88784e9\") " pod="openstack/keystone-db-create-4qrms" Mar 09 14:24:31 crc kubenswrapper[4722]: I0309 14:24:31.912292 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-23c6-account-create-update-nk68b"] Mar 09 14:24:31 crc kubenswrapper[4722]: I0309 14:24:31.914403 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-23c6-account-create-update-nk68b" Mar 09 14:24:31 crc kubenswrapper[4722]: I0309 14:24:31.916575 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 09 14:24:31 crc kubenswrapper[4722]: I0309 14:24:31.921714 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-23c6-account-create-update-nk68b"] Mar 09 14:24:31 crc kubenswrapper[4722]: I0309 14:24:31.936754 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpl9d\" (UniqueName: \"kubernetes.io/projected/e55131a4-f18d-417f-8d87-408a7f3bb919-kube-api-access-gpl9d\") pod \"keystone-23c6-account-create-update-nk68b\" (UID: \"e55131a4-f18d-417f-8d87-408a7f3bb919\") " pod="openstack/keystone-23c6-account-create-update-nk68b" Mar 09 14:24:31 crc kubenswrapper[4722]: I0309 14:24:31.936894 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e55131a4-f18d-417f-8d87-408a7f3bb919-operator-scripts\") pod \"keystone-23c6-account-create-update-nk68b\" (UID: \"e55131a4-f18d-417f-8d87-408a7f3bb919\") " pod="openstack/keystone-23c6-account-create-update-nk68b" Mar 09 14:24:31 crc kubenswrapper[4722]: I0309 14:24:31.937255 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9btkk\" (UniqueName: \"kubernetes.io/projected/5397205b-9c4a-4575-8bfa-8604e88784e9-kube-api-access-9btkk\") pod \"keystone-db-create-4qrms\" (UID: \"5397205b-9c4a-4575-8bfa-8604e88784e9\") " pod="openstack/keystone-db-create-4qrms" Mar 09 14:24:31 crc kubenswrapper[4722]: I0309 14:24:31.937366 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5397205b-9c4a-4575-8bfa-8604e88784e9-operator-scripts\") pod \"keystone-db-create-4qrms\" (UID: \"5397205b-9c4a-4575-8bfa-8604e88784e9\") " pod="openstack/keystone-db-create-4qrms" Mar 09 14:24:31 crc kubenswrapper[4722]: I0309 14:24:31.941671 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5397205b-9c4a-4575-8bfa-8604e88784e9-operator-scripts\") pod \"keystone-db-create-4qrms\" (UID: \"5397205b-9c4a-4575-8bfa-8604e88784e9\") " pod="openstack/keystone-db-create-4qrms" Mar 09 14:24:31 crc kubenswrapper[4722]: I0309 14:24:31.956360 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9btkk\" (UniqueName: \"kubernetes.io/projected/5397205b-9c4a-4575-8bfa-8604e88784e9-kube-api-access-9btkk\") pod \"keystone-db-create-4qrms\" (UID: \"5397205b-9c4a-4575-8bfa-8604e88784e9\") " pod="openstack/keystone-db-create-4qrms" Mar 09 14:24:32 crc kubenswrapper[4722]: I0309 14:24:32.031365 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-65cx2"] Mar 09 14:24:32 crc kubenswrapper[4722]: I0309 14:24:32.033287 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-65cx2" Mar 09 14:24:32 crc kubenswrapper[4722]: I0309 14:24:32.052626 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpl9d\" (UniqueName: \"kubernetes.io/projected/e55131a4-f18d-417f-8d87-408a7f3bb919-kube-api-access-gpl9d\") pod \"keystone-23c6-account-create-update-nk68b\" (UID: \"e55131a4-f18d-417f-8d87-408a7f3bb919\") " pod="openstack/keystone-23c6-account-create-update-nk68b" Mar 09 14:24:32 crc kubenswrapper[4722]: I0309 14:24:32.052728 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e55131a4-f18d-417f-8d87-408a7f3bb919-operator-scripts\") pod \"keystone-23c6-account-create-update-nk68b\" (UID: \"e55131a4-f18d-417f-8d87-408a7f3bb919\") " pod="openstack/keystone-23c6-account-create-update-nk68b" Mar 09 14:24:32 crc kubenswrapper[4722]: I0309 14:24:32.054074 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e55131a4-f18d-417f-8d87-408a7f3bb919-operator-scripts\") pod \"keystone-23c6-account-create-update-nk68b\" (UID: \"e55131a4-f18d-417f-8d87-408a7f3bb919\") " pod="openstack/keystone-23c6-account-create-update-nk68b" Mar 09 14:24:32 crc kubenswrapper[4722]: I0309 14:24:32.075976 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-65cx2"] Mar 09 14:24:32 crc kubenswrapper[4722]: I0309 14:24:32.106148 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpl9d\" (UniqueName: \"kubernetes.io/projected/e55131a4-f18d-417f-8d87-408a7f3bb919-kube-api-access-gpl9d\") pod \"keystone-23c6-account-create-update-nk68b\" (UID: \"e55131a4-f18d-417f-8d87-408a7f3bb919\") " pod="openstack/keystone-23c6-account-create-update-nk68b" Mar 09 14:24:32 crc kubenswrapper[4722]: I0309 14:24:32.120373 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-c437-account-create-update-mdmhs"] Mar 09 14:24:32 crc kubenswrapper[4722]: I0309 14:24:32.121746 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-c437-account-create-update-mdmhs" Mar 09 14:24:32 crc kubenswrapper[4722]: I0309 14:24:32.123740 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 09 14:24:32 crc kubenswrapper[4722]: I0309 14:24:32.127304 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-c437-account-create-update-mdmhs"] Mar 09 14:24:32 crc kubenswrapper[4722]: I0309 14:24:32.147594 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-4qrms" Mar 09 14:24:32 crc kubenswrapper[4722]: I0309 14:24:32.154645 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cjf6\" (UniqueName: \"kubernetes.io/projected/e9b4c63d-ce42-432e-bbcf-abe490c33d2e-kube-api-access-5cjf6\") pod \"placement-c437-account-create-update-mdmhs\" (UID: \"e9b4c63d-ce42-432e-bbcf-abe490c33d2e\") " pod="openstack/placement-c437-account-create-update-mdmhs" Mar 09 14:24:32 crc kubenswrapper[4722]: I0309 14:24:32.154691 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9b4c63d-ce42-432e-bbcf-abe490c33d2e-operator-scripts\") pod \"placement-c437-account-create-update-mdmhs\" (UID: \"e9b4c63d-ce42-432e-bbcf-abe490c33d2e\") " pod="openstack/placement-c437-account-create-update-mdmhs" Mar 09 14:24:32 crc kubenswrapper[4722]: I0309 14:24:32.154710 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggjzb\" (UniqueName: \"kubernetes.io/projected/759a8d7f-dc3f-4432-96ca-4adaf31331ae-kube-api-access-ggjzb\") pod \"placement-db-create-65cx2\" (UID: \"759a8d7f-dc3f-4432-96ca-4adaf31331ae\") " pod="openstack/placement-db-create-65cx2" Mar 09 14:24:32 crc kubenswrapper[4722]: I0309 14:24:32.154785 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/759a8d7f-dc3f-4432-96ca-4adaf31331ae-operator-scripts\") pod \"placement-db-create-65cx2\" (UID: \"759a8d7f-dc3f-4432-96ca-4adaf31331ae\") " pod="openstack/placement-db-create-65cx2" Mar 09 14:24:32 crc kubenswrapper[4722]: I0309 14:24:32.251885 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-23c6-account-create-update-nk68b" Mar 09 14:24:32 crc kubenswrapper[4722]: I0309 14:24:32.256977 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/759a8d7f-dc3f-4432-96ca-4adaf31331ae-operator-scripts\") pod \"placement-db-create-65cx2\" (UID: \"759a8d7f-dc3f-4432-96ca-4adaf31331ae\") " pod="openstack/placement-db-create-65cx2" Mar 09 14:24:32 crc kubenswrapper[4722]: I0309 14:24:32.257367 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cjf6\" (UniqueName: \"kubernetes.io/projected/e9b4c63d-ce42-432e-bbcf-abe490c33d2e-kube-api-access-5cjf6\") pod \"placement-c437-account-create-update-mdmhs\" (UID: \"e9b4c63d-ce42-432e-bbcf-abe490c33d2e\") " pod="openstack/placement-c437-account-create-update-mdmhs" Mar 09 14:24:32 crc kubenswrapper[4722]: I0309 14:24:32.257412 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9b4c63d-ce42-432e-bbcf-abe490c33d2e-operator-scripts\") pod \"placement-c437-account-create-update-mdmhs\" (UID: \"e9b4c63d-ce42-432e-bbcf-abe490c33d2e\") " pod="openstack/placement-c437-account-create-update-mdmhs" Mar 09 14:24:32 crc kubenswrapper[4722]: I0309 14:24:32.257442 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggjzb\" (UniqueName: \"kubernetes.io/projected/759a8d7f-dc3f-4432-96ca-4adaf31331ae-kube-api-access-ggjzb\") pod \"placement-db-create-65cx2\" (UID: \"759a8d7f-dc3f-4432-96ca-4adaf31331ae\") " pod="openstack/placement-db-create-65cx2" Mar 09 14:24:32 crc kubenswrapper[4722]: I0309 14:24:32.259027 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9b4c63d-ce42-432e-bbcf-abe490c33d2e-operator-scripts\") pod \"placement-c437-account-create-update-mdmhs\" (UID: \"e9b4c63d-ce42-432e-bbcf-abe490c33d2e\") " pod="openstack/placement-c437-account-create-update-mdmhs" Mar 09 14:24:32 crc kubenswrapper[4722]: I0309 14:24:32.264776 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/759a8d7f-dc3f-4432-96ca-4adaf31331ae-operator-scripts\") pod \"placement-db-create-65cx2\" (UID: \"759a8d7f-dc3f-4432-96ca-4adaf31331ae\") " pod="openstack/placement-db-create-65cx2" Mar 09 14:24:32 crc kubenswrapper[4722]: I0309 14:24:32.292775 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggjzb\" (UniqueName: \"kubernetes.io/projected/759a8d7f-dc3f-4432-96ca-4adaf31331ae-kube-api-access-ggjzb\") pod \"placement-db-create-65cx2\" (UID: \"759a8d7f-dc3f-4432-96ca-4adaf31331ae\") " pod="openstack/placement-db-create-65cx2" Mar 09 14:24:32 crc kubenswrapper[4722]: I0309 14:24:32.299127 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cjf6\" (UniqueName: \"kubernetes.io/projected/e9b4c63d-ce42-432e-bbcf-abe490c33d2e-kube-api-access-5cjf6\") pod \"placement-c437-account-create-update-mdmhs\" (UID: \"e9b4c63d-ce42-432e-bbcf-abe490c33d2e\") " pod="openstack/placement-c437-account-create-update-mdmhs" Mar 09 14:24:32 crc kubenswrapper[4722]: I0309 14:24:32.359802 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/7463e84f-f457-4409-9621-507d331e06b5-etc-swift\") pod \"swift-storage-0\" (UID: \"7463e84f-f457-4409-9621-507d331e06b5\") " pod="openstack/swift-storage-0" Mar 09 14:24:32 crc kubenswrapper[4722]: E0309 14:24:32.360002 4722 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 09 14:24:32 crc kubenswrapper[4722]: E0309 14:24:32.360637 4722 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 09 14:24:32 crc kubenswrapper[4722]: E0309 14:24:32.360720 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7463e84f-f457-4409-9621-507d331e06b5-etc-swift podName:7463e84f-f457-4409-9621-507d331e06b5 nodeName:}" failed. No retries permitted until 2026-03-09 14:24:40.360700708 +0000 UTC m=+1320.916269294 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/7463e84f-f457-4409-9621-507d331e06b5-etc-swift") pod "swift-storage-0" (UID: "7463e84f-f457-4409-9621-507d331e06b5") : configmap "swift-ring-files" not found Mar 09 14:24:32 crc kubenswrapper[4722]: I0309 14:24:32.409634 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-65cx2" Mar 09 14:24:32 crc kubenswrapper[4722]: I0309 14:24:32.445153 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-c437-account-create-update-mdmhs" Mar 09 14:24:32 crc kubenswrapper[4722]: W0309 14:24:32.598785 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d9d7716_b622_4ae7_9f3a_480c5807525b.slice/crio-91b7ae298e0ab69a2103821486880be47ed3f9348a720a697b8478a1f590fd89 WatchSource:0}: Error finding container 91b7ae298e0ab69a2103821486880be47ed3f9348a720a697b8478a1f590fd89: Status 404 returned error can't find the container with id 91b7ae298e0ab69a2103821486880be47ed3f9348a720a697b8478a1f590fd89 Mar 09 14:24:32 crc kubenswrapper[4722]: I0309 14:24:32.605621 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-33f7-account-create-update-tc7x6"] Mar 09 14:24:32 crc kubenswrapper[4722]: I0309 14:24:32.614805 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-fd4g6"] Mar 09 14:24:32 crc kubenswrapper[4722]: I0309 14:24:32.630312 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 09 14:24:32 crc kubenswrapper[4722]: I0309 14:24:32.677173 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-fd4g6" event={"ID":"b730192a-1410-4860-a847-f5e5974fd728","Type":"ContainerStarted","Data":"9ca143abe660467637008b2c4095e68af2061028d95dc81fb736b54abc9d697f"} Mar 09 14:24:32 crc kubenswrapper[4722]: I0309 14:24:32.681974 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"6585120e-2007-43b3-a72d-4e80fb7ab2fb","Type":"ContainerStarted","Data":"cc277e152b8c24c6825510e8b68658b70afb4d5386b0fb67701d0e47d11d7064"} Mar 09 14:24:32 crc kubenswrapper[4722]: I0309 14:24:32.685247 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-vtpb7" event={"ID":"cb0983d9-3f03-406f-a485-f89ba50341fc","Type":"ContainerStarted","Data":"9029fac70b952306b5efbc4eb7199723abe5367889318f7ad69441edc1614af8"} Mar 09 14:24:32 crc 
kubenswrapper[4722]: I0309 14:24:32.690059 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-33f7-account-create-update-tc7x6" event={"ID":"1d9d7716-b622-4ae7-9f3a-480c5807525b","Type":"ContainerStarted","Data":"91b7ae298e0ab69a2103821486880be47ed3f9348a720a697b8478a1f590fd89"} Mar 09 14:24:32 crc kubenswrapper[4722]: I0309 14:24:32.707124 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-vtpb7" podStartSLOduration=4.002334476 podStartE2EDuration="8.707106431s" podCreationTimestamp="2026-03-09 14:24:24 +0000 UTC" firstStartedPulling="2026-03-09 14:24:27.148371822 +0000 UTC m=+1307.703940388" lastFinishedPulling="2026-03-09 14:24:31.853143777 +0000 UTC m=+1312.408712343" observedRunningTime="2026-03-09 14:24:32.706972998 +0000 UTC m=+1313.262541574" watchObservedRunningTime="2026-03-09 14:24:32.707106431 +0000 UTC m=+1313.262675007" Mar 09 14:24:32 crc kubenswrapper[4722]: W0309 14:24:32.724078 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5397205b_9c4a_4575_8bfa_8604e88784e9.slice/crio-1c16b9dcc082d49d74cdb51f1f143682479734bfdc64f7b0e6e051f6f9a35e46 WatchSource:0}: Error finding container 1c16b9dcc082d49d74cdb51f1f143682479734bfdc64f7b0e6e051f6f9a35e46: Status 404 returned error can't find the container with id 1c16b9dcc082d49d74cdb51f1f143682479734bfdc64f7b0e6e051f6f9a35e46 Mar 09 14:24:32 crc kubenswrapper[4722]: I0309 14:24:32.726948 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-4qrms"] Mar 09 14:24:32 crc kubenswrapper[4722]: I0309 14:24:32.903883 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-23c6-account-create-update-nk68b"] Mar 09 14:24:32 crc kubenswrapper[4722]: W0309 14:24:32.912711 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode55131a4_f18d_417f_8d87_408a7f3bb919.slice/crio-238ad06d878827501436d90aa18630b753b6e3b6955daccfc79bf2c6bbf27490 WatchSource:0}: Error finding container 238ad06d878827501436d90aa18630b753b6e3b6955daccfc79bf2c6bbf27490: Status 404 returned error can't find the container with id 238ad06d878827501436d90aa18630b753b6e3b6955daccfc79bf2c6bbf27490 Mar 09 14:24:33 crc kubenswrapper[4722]: I0309 14:24:33.035392 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-7dk6r"] Mar 09 14:24:33 crc kubenswrapper[4722]: I0309 14:24:33.037572 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-7dk6r" Mar 09 14:24:33 crc kubenswrapper[4722]: I0309 14:24:33.052122 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-7dk6r"] Mar 09 14:24:33 crc kubenswrapper[4722]: I0309 14:24:33.076350 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vwfc\" (UniqueName: \"kubernetes.io/projected/f9e36087-1859-42c2-bf99-100d32617755-kube-api-access-5vwfc\") pod \"mysqld-exporter-openstack-db-create-7dk6r\" (UID: \"f9e36087-1859-42c2-bf99-100d32617755\") " pod="openstack/mysqld-exporter-openstack-db-create-7dk6r" Mar 09 14:24:33 crc kubenswrapper[4722]: I0309 14:24:33.076424 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9e36087-1859-42c2-bf99-100d32617755-operator-scripts\") pod \"mysqld-exporter-openstack-db-create-7dk6r\" (UID: \"f9e36087-1859-42c2-bf99-100d32617755\") " pod="openstack/mysqld-exporter-openstack-db-create-7dk6r" Mar 09 14:24:33 crc kubenswrapper[4722]: I0309 14:24:33.098717 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-65cx2"] Mar 09 14:24:33 crc kubenswrapper[4722]: I0309 14:24:33.109531 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-c437-account-create-update-mdmhs"] Mar 09 14:24:33 crc kubenswrapper[4722]: I0309 14:24:33.178299 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vwfc\" (UniqueName: \"kubernetes.io/projected/f9e36087-1859-42c2-bf99-100d32617755-kube-api-access-5vwfc\") pod \"mysqld-exporter-openstack-db-create-7dk6r\" (UID: \"f9e36087-1859-42c2-bf99-100d32617755\") " pod="openstack/mysqld-exporter-openstack-db-create-7dk6r" Mar 09 14:24:33 crc kubenswrapper[4722]: I0309 14:24:33.178372 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9e36087-1859-42c2-bf99-100d32617755-operator-scripts\") pod \"mysqld-exporter-openstack-db-create-7dk6r\" (UID: \"f9e36087-1859-42c2-bf99-100d32617755\") " pod="openstack/mysqld-exporter-openstack-db-create-7dk6r" Mar 09 14:24:33 crc kubenswrapper[4722]: I0309 14:24:33.179082 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9e36087-1859-42c2-bf99-100d32617755-operator-scripts\") pod \"mysqld-exporter-openstack-db-create-7dk6r\" (UID: \"f9e36087-1859-42c2-bf99-100d32617755\") " pod="openstack/mysqld-exporter-openstack-db-create-7dk6r" Mar 09 14:24:33 crc kubenswrapper[4722]: I0309 14:24:33.236479 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-34cc-account-create-update-hqh5l"] Mar 09 14:24:33 crc kubenswrapper[4722]: I0309 14:24:33.238697 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-34cc-account-create-update-hqh5l" Mar 09 14:24:33 crc kubenswrapper[4722]: I0309 14:24:33.240527 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vwfc\" (UniqueName: \"kubernetes.io/projected/f9e36087-1859-42c2-bf99-100d32617755-kube-api-access-5vwfc\") pod \"mysqld-exporter-openstack-db-create-7dk6r\" (UID: \"f9e36087-1859-42c2-bf99-100d32617755\") " pod="openstack/mysqld-exporter-openstack-db-create-7dk6r" Mar 09 14:24:33 crc kubenswrapper[4722]: I0309 14:24:33.241474 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-openstack-db-secret" Mar 09 14:24:33 crc kubenswrapper[4722]: I0309 14:24:33.253563 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-34cc-account-create-update-hqh5l"] Mar 09 14:24:33 crc kubenswrapper[4722]: I0309 14:24:33.273047 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 09 14:24:33 crc kubenswrapper[4722]: I0309 14:24:33.282744 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nw9dt\" (UniqueName: \"kubernetes.io/projected/d94802a7-3f7c-4172-b781-a1eac89761d6-kube-api-access-nw9dt\") pod \"mysqld-exporter-34cc-account-create-update-hqh5l\" (UID: \"d94802a7-3f7c-4172-b781-a1eac89761d6\") " pod="openstack/mysqld-exporter-34cc-account-create-update-hqh5l" Mar 09 14:24:33 crc kubenswrapper[4722]: I0309 14:24:33.282876 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d94802a7-3f7c-4172-b781-a1eac89761d6-operator-scripts\") pod \"mysqld-exporter-34cc-account-create-update-hqh5l\" (UID: \"d94802a7-3f7c-4172-b781-a1eac89761d6\") " pod="openstack/mysqld-exporter-34cc-account-create-update-hqh5l" Mar 09 14:24:33 crc kubenswrapper[4722]: I0309 14:24:33.368813 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-7dk6r" Mar 09 14:24:33 crc kubenswrapper[4722]: I0309 14:24:33.385204 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nw9dt\" (UniqueName: \"kubernetes.io/projected/d94802a7-3f7c-4172-b781-a1eac89761d6-kube-api-access-nw9dt\") pod \"mysqld-exporter-34cc-account-create-update-hqh5l\" (UID: \"d94802a7-3f7c-4172-b781-a1eac89761d6\") " pod="openstack/mysqld-exporter-34cc-account-create-update-hqh5l" Mar 09 14:24:33 crc kubenswrapper[4722]: I0309 14:24:33.385371 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d94802a7-3f7c-4172-b781-a1eac89761d6-operator-scripts\") pod \"mysqld-exporter-34cc-account-create-update-hqh5l\" (UID: \"d94802a7-3f7c-4172-b781-a1eac89761d6\") " pod="openstack/mysqld-exporter-34cc-account-create-update-hqh5l" Mar 09 14:24:33 crc kubenswrapper[4722]: I0309 14:24:33.386322 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d94802a7-3f7c-4172-b781-a1eac89761d6-operator-scripts\") pod \"mysqld-exporter-34cc-account-create-update-hqh5l\" (UID: \"d94802a7-3f7c-4172-b781-a1eac89761d6\") " pod="openstack/mysqld-exporter-34cc-account-create-update-hqh5l" Mar 09 14:24:33 crc kubenswrapper[4722]: I0309 14:24:33.406037 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nw9dt\" (UniqueName: \"kubernetes.io/projected/d94802a7-3f7c-4172-b781-a1eac89761d6-kube-api-access-nw9dt\") pod \"mysqld-exporter-34cc-account-create-update-hqh5l\" (UID: \"d94802a7-3f7c-4172-b781-a1eac89761d6\") " pod="openstack/mysqld-exporter-34cc-account-create-update-hqh5l" Mar 09 14:24:33 crc kubenswrapper[4722]: I0309 14:24:33.579799 4722 util.go:30] "No sandbox for pod can be found. 
Mar 09 14:24:33 crc kubenswrapper[4722]: I0309 14:24:33.579799 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-34cc-account-create-update-hqh5l"
Mar 09 14:24:33 crc kubenswrapper[4722]: I0309 14:24:33.667839 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-thqfj"
Mar 09 14:24:33 crc kubenswrapper[4722]: I0309 14:24:33.737057 4722 generic.go:334] "Generic (PLEG): container finished" podID="b730192a-1410-4860-a847-f5e5974fd728" containerID="05e4e023af02d9f6a33763b4fa8ad55942cdd55d767418bf2f9c9ebd94a63ce4" exitCode=0
Mar 09 14:24:33 crc kubenswrapper[4722]: I0309 14:24:33.737139 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-fd4g6" event={"ID":"b730192a-1410-4860-a847-f5e5974fd728","Type":"ContainerDied","Data":"05e4e023af02d9f6a33763b4fa8ad55942cdd55d767418bf2f9c9ebd94a63ce4"}
Mar 09 14:24:33 crc kubenswrapper[4722]: I0309 14:24:33.739623 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c437-account-create-update-mdmhs" event={"ID":"e9b4c63d-ce42-432e-bbcf-abe490c33d2e","Type":"ContainerStarted","Data":"84a5442f49e47806ced5341e3762f4d3f9e499c6bee42c6d293f5c8c8d961f4d"}
Mar 09 14:24:33 crc kubenswrapper[4722]: I0309 14:24:33.739664 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c437-account-create-update-mdmhs" event={"ID":"e9b4c63d-ce42-432e-bbcf-abe490c33d2e","Type":"ContainerStarted","Data":"d27b2b3011a5b5abf2ab9850f5e65b5b1f837c21ecf210c21fa4584d6b954b31"}
Mar 09 14:24:33 crc kubenswrapper[4722]: I0309 14:24:33.742440 4722 generic.go:334] "Generic (PLEG): container finished" podID="1d9d7716-b622-4ae7-9f3a-480c5807525b" containerID="ab1bc7f3f075346e0c3f08ec47a17a509a3b83aa2fd81823f6dbd4674ac6db74" exitCode=0
Mar 09 14:24:33 crc kubenswrapper[4722]: I0309 14:24:33.742520 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-33f7-account-create-update-tc7x6" event={"ID":"1d9d7716-b622-4ae7-9f3a-480c5807525b","Type":"ContainerDied","Data":"ab1bc7f3f075346e0c3f08ec47a17a509a3b83aa2fd81823f6dbd4674ac6db74"}
Mar 09 14:24:33 crc kubenswrapper[4722]: I0309 14:24:33.744152 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-lvds5"]
Mar 09 14:24:33 crc kubenswrapper[4722]: I0309 14:24:33.744397 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-lvds5" podUID="0ae3ba92-3a9f-4f65-8c32-2b3f4d1f95e7" containerName="dnsmasq-dns" containerID="cri-o://f6534d1e16ac3434c37eec26b3cbd5e4e89ad65721ae34a000ce2ae8185d8263" gracePeriod=10
Mar 09 14:24:33 crc kubenswrapper[4722]: I0309 14:24:33.751359 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-lvds5"
Mar 09 14:24:33 crc kubenswrapper[4722]: I0309 14:24:33.761424 4722 generic.go:334] "Generic (PLEG): container finished" podID="e55131a4-f18d-417f-8d87-408a7f3bb919" containerID="8ba9f880a35b90799326c51b4ef8ca4c612c935b1f0676c9b692a22f589c5bf6" exitCode=0
Mar 09 14:24:33 crc kubenswrapper[4722]: I0309 14:24:33.761508 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-23c6-account-create-update-nk68b" event={"ID":"e55131a4-f18d-417f-8d87-408a7f3bb919","Type":"ContainerDied","Data":"8ba9f880a35b90799326c51b4ef8ca4c612c935b1f0676c9b692a22f589c5bf6"}
Mar 09 14:24:33 crc kubenswrapper[4722]: I0309 14:24:33.761532 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-23c6-account-create-update-nk68b" event={"ID":"e55131a4-f18d-417f-8d87-408a7f3bb919","Type":"ContainerStarted","Data":"238ad06d878827501436d90aa18630b753b6e3b6955daccfc79bf2c6bbf27490"}
Mar 09 14:24:33 crc kubenswrapper[4722]: I0309 14:24:33.772821 4722 generic.go:334] "Generic (PLEG): container finished" podID="5397205b-9c4a-4575-8bfa-8604e88784e9" containerID="9eb0caf05965c54430c17365b4f1c9c40b3e0003965ce27c87e41bfdeae42841" exitCode=0
Mar 09 14:24:33 crc kubenswrapper[4722]: I0309 14:24:33.772928 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-4qrms" event={"ID":"5397205b-9c4a-4575-8bfa-8604e88784e9","Type":"ContainerDied","Data":"9eb0caf05965c54430c17365b4f1c9c40b3e0003965ce27c87e41bfdeae42841"}
Mar 09 14:24:33 crc kubenswrapper[4722]: I0309 14:24:33.772963 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-4qrms" event={"ID":"5397205b-9c4a-4575-8bfa-8604e88784e9","Type":"ContainerStarted","Data":"1c16b9dcc082d49d74cdb51f1f143682479734bfdc64f7b0e6e051f6f9a35e46"}
Mar 09 14:24:33 crc kubenswrapper[4722]: I0309 14:24:33.785609 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-65cx2" event={"ID":"759a8d7f-dc3f-4432-96ca-4adaf31331ae","Type":"ContainerStarted","Data":"474844d6f470855b27765fc7bc63d527480411f21441a78f7ec486c1dc4a2e0b"}
Mar 09 14:24:33 crc kubenswrapper[4722]: I0309 14:24:33.785669 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-65cx2" event={"ID":"759a8d7f-dc3f-4432-96ca-4adaf31331ae","Type":"ContainerStarted","Data":"a31d6f2b13102399523826e128e33ab3c62609927f88d3cc0bc390629c472df2"}
Mar 09 14:24:33 crc kubenswrapper[4722]: I0309 14:24:33.847712 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-c437-account-create-update-mdmhs" podStartSLOduration=1.8476883160000002 podStartE2EDuration="1.847688316s" podCreationTimestamp="2026-03-09 14:24:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:24:33.829717967 +0000 UTC m=+1314.385286543" watchObservedRunningTime="2026-03-09 14:24:33.847688316 +0000 UTC m=+1314.403256892"
Mar 09 14:24:33 crc kubenswrapper[4722]: I0309 14:24:33.897107 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-65cx2" podStartSLOduration=1.897085753 podStartE2EDuration="1.897085753s" podCreationTimestamp="2026-03-09 14:24:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:24:33.858893315 +0000 UTC m=+1314.414461891" watchObservedRunningTime="2026-03-09 14:24:33.897085753 +0000 UTC m=+1314.452654329"
Mar 09 14:24:33 crc kubenswrapper[4722]: I0309 14:24:33.933519 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-7dk6r"]
Mar 09 14:24:34 crc kubenswrapper[4722]: W0309 14:24:34.089097 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9e36087_1859_42c2_bf99_100d32617755.slice/crio-b03fcd081c9e02c56408d0171b25cd54e42f95b5fc882b4eabf8b671d3435ec2 WatchSource:0}: Error finding container b03fcd081c9e02c56408d0171b25cd54e42f95b5fc882b4eabf8b671d3435ec2: Status 404 returned error can't find the container with id b03fcd081c9e02c56408d0171b25cd54e42f95b5fc882b4eabf8b671d3435ec2
Mar 09 14:24:34 crc kubenswrapper[4722]: I0309 14:24:34.743614 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-lvds5"
Mar 09 14:24:34 crc kubenswrapper[4722]: I0309 14:24:34.789458 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-34cc-account-create-update-hqh5l"]
Mar 09 14:24:34 crc kubenswrapper[4722]: I0309 14:24:34.797328 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-7dk6r" event={"ID":"f9e36087-1859-42c2-bf99-100d32617755","Type":"ContainerStarted","Data":"e7d19a62a82beca0226b21f1e83b0cce1b7119fc6bc4bdd57cf3ae7a89c8013d"}
Mar 09 14:24:34 crc kubenswrapper[4722]: I0309 14:24:34.797372 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-7dk6r" event={"ID":"f9e36087-1859-42c2-bf99-100d32617755","Type":"ContainerStarted","Data":"b03fcd081c9e02c56408d0171b25cd54e42f95b5fc882b4eabf8b671d3435ec2"}
Mar 09 14:24:34 crc kubenswrapper[4722]: I0309 14:24:34.816507 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-openstack-db-create-7dk6r" podStartSLOduration=1.816473891 podStartE2EDuration="1.816473891s" podCreationTimestamp="2026-03-09 14:24:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:24:34.814408506 +0000 UTC m=+1315.369977082" watchObservedRunningTime="2026-03-09 14:24:34.816473891 +0000 UTC m=+1315.372042467"
Mar 09 14:24:34 crc kubenswrapper[4722]: I0309 14:24:34.816738 4722 generic.go:334] "Generic (PLEG): container finished" podID="0ae3ba92-3a9f-4f65-8c32-2b3f4d1f95e7" containerID="f6534d1e16ac3434c37eec26b3cbd5e4e89ad65721ae34a000ce2ae8185d8263" exitCode=0
Mar 09 14:24:34 crc kubenswrapper[4722]: I0309 14:24:34.816874 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-lvds5" event={"ID":"0ae3ba92-3a9f-4f65-8c32-2b3f4d1f95e7","Type":"ContainerDied","Data":"f6534d1e16ac3434c37eec26b3cbd5e4e89ad65721ae34a000ce2ae8185d8263"}
Mar 09 14:24:34 crc kubenswrapper[4722]: I0309 14:24:34.817024 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-lvds5" event={"ID":"0ae3ba92-3a9f-4f65-8c32-2b3f4d1f95e7","Type":"ContainerDied","Data":"fe472556cd60ef8c56aa441231ec648ac421358272f3d5ea3c68c4a6b38797a4"}
Mar 09 14:24:34 crc kubenswrapper[4722]: I0309 14:24:34.816855 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-lvds5"
Mar 09 14:24:34 crc kubenswrapper[4722]: I0309 14:24:34.817052 4722 scope.go:117] "RemoveContainer" containerID="f6534d1e16ac3434c37eec26b3cbd5e4e89ad65721ae34a000ce2ae8185d8263"
Mar 09 14:24:34 crc kubenswrapper[4722]: I0309 14:24:34.820766 4722 generic.go:334] "Generic (PLEG): container finished" podID="759a8d7f-dc3f-4432-96ca-4adaf31331ae" containerID="474844d6f470855b27765fc7bc63d527480411f21441a78f7ec486c1dc4a2e0b" exitCode=0
Mar 09 14:24:34 crc kubenswrapper[4722]: I0309 14:24:34.820855 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-65cx2" event={"ID":"759a8d7f-dc3f-4432-96ca-4adaf31331ae","Type":"ContainerDied","Data":"474844d6f470855b27765fc7bc63d527480411f21441a78f7ec486c1dc4a2e0b"}
Mar 09 14:24:34 crc kubenswrapper[4722]: I0309 14:24:34.823316 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"6585120e-2007-43b3-a72d-4e80fb7ab2fb","Type":"ContainerStarted","Data":"a1b0fb4a97c6cd8b486dc561286b2607b0aa6dde9675eda37871b4b488eedecd"}
Mar 09 14:24:34 crc kubenswrapper[4722]: I0309 14:24:34.829092 4722 generic.go:334] "Generic (PLEG): container finished" podID="e9b4c63d-ce42-432e-bbcf-abe490c33d2e" containerID="84a5442f49e47806ced5341e3762f4d3f9e499c6bee42c6d293f5c8c8d961f4d" exitCode=0
Mar 09 14:24:34 crc kubenswrapper[4722]: I0309 14:24:34.829335 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c437-account-create-update-mdmhs" event={"ID":"e9b4c63d-ce42-432e-bbcf-abe490c33d2e","Type":"ContainerDied","Data":"84a5442f49e47806ced5341e3762f4d3f9e499c6bee42c6d293f5c8c8d961f4d"}
Mar 09 14:24:34 crc kubenswrapper[4722]: I0309 14:24:34.852291 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ae3ba92-3a9f-4f65-8c32-2b3f4d1f95e7-ovsdbserver-nb\") pod \"0ae3ba92-3a9f-4f65-8c32-2b3f4d1f95e7\" (UID: \"0ae3ba92-3a9f-4f65-8c32-2b3f4d1f95e7\") "
Mar 09 14:24:34 crc kubenswrapper[4722]: I0309 14:24:34.852619 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5z9x\" (UniqueName: \"kubernetes.io/projected/0ae3ba92-3a9f-4f65-8c32-2b3f4d1f95e7-kube-api-access-t5z9x\") pod \"0ae3ba92-3a9f-4f65-8c32-2b3f4d1f95e7\" (UID: \"0ae3ba92-3a9f-4f65-8c32-2b3f4d1f95e7\") "
Mar 09 14:24:34 crc kubenswrapper[4722]: I0309 14:24:34.852819 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ae3ba92-3a9f-4f65-8c32-2b3f4d1f95e7-ovsdbserver-sb\") pod \"0ae3ba92-3a9f-4f65-8c32-2b3f4d1f95e7\" (UID: \"0ae3ba92-3a9f-4f65-8c32-2b3f4d1f95e7\") "
Mar 09 14:24:34 crc kubenswrapper[4722]: I0309 14:24:34.852889 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ae3ba92-3a9f-4f65-8c32-2b3f4d1f95e7-dns-svc\") pod \"0ae3ba92-3a9f-4f65-8c32-2b3f4d1f95e7\" (UID: \"0ae3ba92-3a9f-4f65-8c32-2b3f4d1f95e7\") "
Mar 09 14:24:34 crc kubenswrapper[4722]: I0309 14:24:34.852953 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ae3ba92-3a9f-4f65-8c32-2b3f4d1f95e7-config\") pod \"0ae3ba92-3a9f-4f65-8c32-2b3f4d1f95e7\" (UID: \"0ae3ba92-3a9f-4f65-8c32-2b3f4d1f95e7\") "
Mar 09 14:24:34 crc kubenswrapper[4722]: I0309 14:24:34.873503 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ae3ba92-3a9f-4f65-8c32-2b3f4d1f95e7-kube-api-access-t5z9x" (OuterVolumeSpecName: "kube-api-access-t5z9x") pod "0ae3ba92-3a9f-4f65-8c32-2b3f4d1f95e7" (UID: "0ae3ba92-3a9f-4f65-8c32-2b3f4d1f95e7"). InnerVolumeSpecName "kube-api-access-t5z9x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:24:34 crc kubenswrapper[4722]: I0309 14:24:34.904176 4722 scope.go:117] "RemoveContainer" containerID="f3a6349731b8bd48668c095339efa4febc5af6a7eb1936c487dd1fc5d805a644"
Mar 09 14:24:34 crc kubenswrapper[4722]: I0309 14:24:34.925997 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ae3ba92-3a9f-4f65-8c32-2b3f4d1f95e7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0ae3ba92-3a9f-4f65-8c32-2b3f4d1f95e7" (UID: "0ae3ba92-3a9f-4f65-8c32-2b3f4d1f95e7"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 14:24:34 crc kubenswrapper[4722]: I0309 14:24:34.927264 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ae3ba92-3a9f-4f65-8c32-2b3f4d1f95e7-config" (OuterVolumeSpecName: "config") pod "0ae3ba92-3a9f-4f65-8c32-2b3f4d1f95e7" (UID: "0ae3ba92-3a9f-4f65-8c32-2b3f4d1f95e7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 14:24:34 crc kubenswrapper[4722]: I0309 14:24:34.928702 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ae3ba92-3a9f-4f65-8c32-2b3f4d1f95e7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0ae3ba92-3a9f-4f65-8c32-2b3f4d1f95e7" (UID: "0ae3ba92-3a9f-4f65-8c32-2b3f4d1f95e7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 14:24:34 crc kubenswrapper[4722]: I0309 14:24:34.933375 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ae3ba92-3a9f-4f65-8c32-2b3f4d1f95e7-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0ae3ba92-3a9f-4f65-8c32-2b3f4d1f95e7" (UID: "0ae3ba92-3a9f-4f65-8c32-2b3f4d1f95e7"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 14:24:34 crc kubenswrapper[4722]: I0309 14:24:34.937932 4722 scope.go:117] "RemoveContainer" containerID="f6534d1e16ac3434c37eec26b3cbd5e4e89ad65721ae34a000ce2ae8185d8263"
Mar 09 14:24:34 crc kubenswrapper[4722]: E0309 14:24:34.939365 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6534d1e16ac3434c37eec26b3cbd5e4e89ad65721ae34a000ce2ae8185d8263\": container with ID starting with f6534d1e16ac3434c37eec26b3cbd5e4e89ad65721ae34a000ce2ae8185d8263 not found: ID does not exist" containerID="f6534d1e16ac3434c37eec26b3cbd5e4e89ad65721ae34a000ce2ae8185d8263"
Mar 09 14:24:34 crc kubenswrapper[4722]: I0309 14:24:34.939410 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6534d1e16ac3434c37eec26b3cbd5e4e89ad65721ae34a000ce2ae8185d8263"} err="failed to get container status \"f6534d1e16ac3434c37eec26b3cbd5e4e89ad65721ae34a000ce2ae8185d8263\": rpc error: code = NotFound desc = could not find container \"f6534d1e16ac3434c37eec26b3cbd5e4e89ad65721ae34a000ce2ae8185d8263\": container with ID starting with f6534d1e16ac3434c37eec26b3cbd5e4e89ad65721ae34a000ce2ae8185d8263 not found: ID does not exist"
Mar 09 14:24:34 crc kubenswrapper[4722]: I0309 14:24:34.939466 4722 scope.go:117] "RemoveContainer" containerID="f3a6349731b8bd48668c095339efa4febc5af6a7eb1936c487dd1fc5d805a644"
Mar 09 14:24:34 crc kubenswrapper[4722]: E0309 14:24:34.940818 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3a6349731b8bd48668c095339efa4febc5af6a7eb1936c487dd1fc5d805a644\": container with ID starting with f3a6349731b8bd48668c095339efa4febc5af6a7eb1936c487dd1fc5d805a644 not found: ID does not exist" containerID="f3a6349731b8bd48668c095339efa4febc5af6a7eb1936c487dd1fc5d805a644"
Mar 09 14:24:34 crc kubenswrapper[4722]: I0309 14:24:34.940842 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3a6349731b8bd48668c095339efa4febc5af6a7eb1936c487dd1fc5d805a644"} err="failed to get container status \"f3a6349731b8bd48668c095339efa4febc5af6a7eb1936c487dd1fc5d805a644\": rpc error: code = NotFound desc = could not find container \"f3a6349731b8bd48668c095339efa4febc5af6a7eb1936c487dd1fc5d805a644\": container with ID starting with f3a6349731b8bd48668c095339efa4febc5af6a7eb1936c487dd1fc5d805a644 not found: ID does not exist"
Mar 09 14:24:34 crc kubenswrapper[4722]: I0309 14:24:34.956672 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5z9x\" (UniqueName: \"kubernetes.io/projected/0ae3ba92-3a9f-4f65-8c32-2b3f4d1f95e7-kube-api-access-t5z9x\") on node \"crc\" DevicePath \"\""
Mar 09 14:24:34 crc kubenswrapper[4722]: I0309 14:24:34.956695 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0ae3ba92-3a9f-4f65-8c32-2b3f4d1f95e7-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 09 14:24:34 crc kubenswrapper[4722]: I0309 14:24:34.956704 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0ae3ba92-3a9f-4f65-8c32-2b3f4d1f95e7-dns-svc\") on node \"crc\" DevicePath \"\""
on node \"crc\" DevicePath \"\"" Mar 09 14:24:34 crc kubenswrapper[4722]: I0309 14:24:34.956721 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0ae3ba92-3a9f-4f65-8c32-2b3f4d1f95e7-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 09 14:24:35 crc kubenswrapper[4722]: I0309 14:24:35.168342 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-lvds5"] Mar 09 14:24:35 crc kubenswrapper[4722]: I0309 14:24:35.179839 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-lvds5"] Mar 09 14:24:35 crc kubenswrapper[4722]: I0309 14:24:35.403466 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-fd4g6" Mar 09 14:24:35 crc kubenswrapper[4722]: I0309 14:24:35.571084 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b730192a-1410-4860-a847-f5e5974fd728-operator-scripts\") pod \"b730192a-1410-4860-a847-f5e5974fd728\" (UID: \"b730192a-1410-4860-a847-f5e5974fd728\") " Mar 09 14:24:35 crc kubenswrapper[4722]: I0309 14:24:35.571168 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7pfvd\" (UniqueName: \"kubernetes.io/projected/b730192a-1410-4860-a847-f5e5974fd728-kube-api-access-7pfvd\") pod \"b730192a-1410-4860-a847-f5e5974fd728\" (UID: \"b730192a-1410-4860-a847-f5e5974fd728\") " Mar 09 14:24:35 crc kubenswrapper[4722]: I0309 14:24:35.571763 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b730192a-1410-4860-a847-f5e5974fd728-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b730192a-1410-4860-a847-f5e5974fd728" (UID: "b730192a-1410-4860-a847-f5e5974fd728"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:24:35 crc kubenswrapper[4722]: I0309 14:24:35.571896 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b730192a-1410-4860-a847-f5e5974fd728-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 14:24:35 crc kubenswrapper[4722]: I0309 14:24:35.576878 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b730192a-1410-4860-a847-f5e5974fd728-kube-api-access-7pfvd" (OuterVolumeSpecName: "kube-api-access-7pfvd") pod "b730192a-1410-4860-a847-f5e5974fd728" (UID: "b730192a-1410-4860-a847-f5e5974fd728"). InnerVolumeSpecName "kube-api-access-7pfvd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:24:35 crc kubenswrapper[4722]: I0309 14:24:35.627147 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-4qrms" Mar 09 14:24:35 crc kubenswrapper[4722]: I0309 14:24:35.634095 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-33f7-account-create-update-tc7x6" Mar 09 14:24:35 crc kubenswrapper[4722]: I0309 14:24:35.644969 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-23c6-account-create-update-nk68b" Mar 09 14:24:35 crc kubenswrapper[4722]: I0309 14:24:35.675834 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7pfvd\" (UniqueName: \"kubernetes.io/projected/b730192a-1410-4860-a847-f5e5974fd728-kube-api-access-7pfvd\") on node \"crc\" DevicePath \"\"" Mar 09 14:24:35 crc kubenswrapper[4722]: I0309 14:24:35.777266 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5qbgb\" (UniqueName: \"kubernetes.io/projected/1d9d7716-b622-4ae7-9f3a-480c5807525b-kube-api-access-5qbgb\") pod \"1d9d7716-b622-4ae7-9f3a-480c5807525b\" (UID: \"1d9d7716-b622-4ae7-9f3a-480c5807525b\") " Mar 09 14:24:35 crc kubenswrapper[4722]: I0309 14:24:35.777321 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gpl9d\" (UniqueName: \"kubernetes.io/projected/e55131a4-f18d-417f-8d87-408a7f3bb919-kube-api-access-gpl9d\") pod \"e55131a4-f18d-417f-8d87-408a7f3bb919\" (UID: \"e55131a4-f18d-417f-8d87-408a7f3bb919\") " Mar 09 14:24:35 crc kubenswrapper[4722]: I0309 14:24:35.777383 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5397205b-9c4a-4575-8bfa-8604e88784e9-operator-scripts\") pod \"5397205b-9c4a-4575-8bfa-8604e88784e9\" (UID: \"5397205b-9c4a-4575-8bfa-8604e88784e9\") " Mar 09 14:24:35 crc kubenswrapper[4722]: I0309 14:24:35.777433 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9btkk\" (UniqueName: \"kubernetes.io/projected/5397205b-9c4a-4575-8bfa-8604e88784e9-kube-api-access-9btkk\") pod \"5397205b-9c4a-4575-8bfa-8604e88784e9\" (UID: \"5397205b-9c4a-4575-8bfa-8604e88784e9\") " Mar 09 14:24:35 crc kubenswrapper[4722]: I0309 14:24:35.777490 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d9d7716-b622-4ae7-9f3a-480c5807525b-operator-scripts\") pod \"1d9d7716-b622-4ae7-9f3a-480c5807525b\" (UID: \"1d9d7716-b622-4ae7-9f3a-480c5807525b\") " Mar 09 14:24:35 crc kubenswrapper[4722]: I0309 14:24:35.777580 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e55131a4-f18d-417f-8d87-408a7f3bb919-operator-scripts\") pod \"e55131a4-f18d-417f-8d87-408a7f3bb919\" (UID: \"e55131a4-f18d-417f-8d87-408a7f3bb919\") " Mar 09 14:24:35 crc kubenswrapper[4722]: I0309 14:24:35.778364 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5397205b-9c4a-4575-8bfa-8604e88784e9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5397205b-9c4a-4575-8bfa-8604e88784e9" (UID: "5397205b-9c4a-4575-8bfa-8604e88784e9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:24:35 crc kubenswrapper[4722]: I0309 14:24:35.778418 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d9d7716-b622-4ae7-9f3a-480c5807525b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1d9d7716-b622-4ae7-9f3a-480c5807525b" (UID: "1d9d7716-b622-4ae7-9f3a-480c5807525b"). InnerVolumeSpecName "operator-scripts". 
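Each volume above passes through the same teardown progression: "operationExecutor.UnmountVolume started" (reconciler_common.go:159), "UnmountVolume.TearDown succeeded" (operation_generator.go:803), and finally "Volume detached" (reconciler_common.go:293). A toy Go model of that walk; the state names come from the log itself, while the type and transitions are illustrative only:

    package main

    import "fmt"

    type volState int

    const (
    	unmountStarted volState = iota // "operationExecutor.UnmountVolume started"
    	tornDown                       // "UnmountVolume.TearDown succeeded"
    	detached                       // "Volume detached ... DevicePath \"\""
    )

    var names = map[volState]string{
    	unmountStarted: "unmount started",
    	tornDown:       "teardown succeeded",
    	detached:       "detached",
    }

    func main() {
    	// Walk two of the volumes being torn down above through the states in order.
    	for _, vol := range []string{"operator-scripts", "kube-api-access-9btkk"} {
    		for s := unmountStarted; s <= detached; s++ {
    			fmt.Println(vol, "->", names[s])
    		}
    	}
    }
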
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:24:35 crc kubenswrapper[4722]: I0309 14:24:35.778469 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e55131a4-f18d-417f-8d87-408a7f3bb919-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e55131a4-f18d-417f-8d87-408a7f3bb919" (UID: "e55131a4-f18d-417f-8d87-408a7f3bb919"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:24:35 crc kubenswrapper[4722]: I0309 14:24:35.780919 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d9d7716-b622-4ae7-9f3a-480c5807525b-kube-api-access-5qbgb" (OuterVolumeSpecName: "kube-api-access-5qbgb") pod "1d9d7716-b622-4ae7-9f3a-480c5807525b" (UID: "1d9d7716-b622-4ae7-9f3a-480c5807525b"). InnerVolumeSpecName "kube-api-access-5qbgb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:24:35 crc kubenswrapper[4722]: I0309 14:24:35.781539 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e55131a4-f18d-417f-8d87-408a7f3bb919-kube-api-access-gpl9d" (OuterVolumeSpecName: "kube-api-access-gpl9d") pod "e55131a4-f18d-417f-8d87-408a7f3bb919" (UID: "e55131a4-f18d-417f-8d87-408a7f3bb919"). InnerVolumeSpecName "kube-api-access-gpl9d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:24:35 crc kubenswrapper[4722]: I0309 14:24:35.782362 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5397205b-9c4a-4575-8bfa-8604e88784e9-kube-api-access-9btkk" (OuterVolumeSpecName: "kube-api-access-9btkk") pod "5397205b-9c4a-4575-8bfa-8604e88784e9" (UID: "5397205b-9c4a-4575-8bfa-8604e88784e9"). InnerVolumeSpecName "kube-api-access-9btkk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:24:35 crc kubenswrapper[4722]: I0309 14:24:35.850072 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-23c6-account-create-update-nk68b" event={"ID":"e55131a4-f18d-417f-8d87-408a7f3bb919","Type":"ContainerDied","Data":"238ad06d878827501436d90aa18630b753b6e3b6955daccfc79bf2c6bbf27490"} Mar 09 14:24:35 crc kubenswrapper[4722]: I0309 14:24:35.850124 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="238ad06d878827501436d90aa18630b753b6e3b6955daccfc79bf2c6bbf27490" Mar 09 14:24:35 crc kubenswrapper[4722]: I0309 14:24:35.850181 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-23c6-account-create-update-nk68b" Mar 09 14:24:35 crc kubenswrapper[4722]: I0309 14:24:35.854921 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-4qrms" event={"ID":"5397205b-9c4a-4575-8bfa-8604e88784e9","Type":"ContainerDied","Data":"1c16b9dcc082d49d74cdb51f1f143682479734bfdc64f7b0e6e051f6f9a35e46"} Mar 09 14:24:35 crc kubenswrapper[4722]: I0309 14:24:35.854948 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c16b9dcc082d49d74cdb51f1f143682479734bfdc64f7b0e6e051f6f9a35e46" Mar 09 14:24:35 crc kubenswrapper[4722]: I0309 14:24:35.854960 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-4qrms" Mar 09 14:24:35 crc kubenswrapper[4722]: I0309 14:24:35.857849 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-fd4g6" event={"ID":"b730192a-1410-4860-a847-f5e5974fd728","Type":"ContainerDied","Data":"9ca143abe660467637008b2c4095e68af2061028d95dc81fb736b54abc9d697f"} Mar 09 14:24:35 crc kubenswrapper[4722]: I0309 14:24:35.857893 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ca143abe660467637008b2c4095e68af2061028d95dc81fb736b54abc9d697f" Mar 09 14:24:35 crc kubenswrapper[4722]: I0309 14:24:35.857985 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-fd4g6" Mar 09 14:24:35 crc kubenswrapper[4722]: I0309 14:24:35.866060 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"6585120e-2007-43b3-a72d-4e80fb7ab2fb","Type":"ContainerStarted","Data":"8646b91c50974af0718a87938561944b59d4c9e9453fe805df57d0868427837f"} Mar 09 14:24:35 crc kubenswrapper[4722]: I0309 14:24:35.866113 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 09 14:24:35 crc kubenswrapper[4722]: I0309 14:24:35.868971 4722 generic.go:334] "Generic (PLEG): container finished" podID="f9e36087-1859-42c2-bf99-100d32617755" containerID="e7d19a62a82beca0226b21f1e83b0cce1b7119fc6bc4bdd57cf3ae7a89c8013d" exitCode=0 Mar 09 14:24:35 crc kubenswrapper[4722]: I0309 14:24:35.869009 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-7dk6r" event={"ID":"f9e36087-1859-42c2-bf99-100d32617755","Type":"ContainerDied","Data":"e7d19a62a82beca0226b21f1e83b0cce1b7119fc6bc4bdd57cf3ae7a89c8013d"} Mar 09 14:24:35 crc kubenswrapper[4722]: I0309 14:24:35.871928 4722 generic.go:334] "Generic (PLEG): container finished" podID="d94802a7-3f7c-4172-b781-a1eac89761d6" containerID="48c3518495d5e233291f49d08b4afd8f53185fc9cc71a1db82bf42a1673c4010" exitCode=0 Mar 09 14:24:35 crc kubenswrapper[4722]: I0309 14:24:35.872010 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-34cc-account-create-update-hqh5l" event={"ID":"d94802a7-3f7c-4172-b781-a1eac89761d6","Type":"ContainerDied","Data":"48c3518495d5e233291f49d08b4afd8f53185fc9cc71a1db82bf42a1673c4010"} Mar 09 14:24:35 crc kubenswrapper[4722]: I0309 14:24:35.872051 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-34cc-account-create-update-hqh5l" event={"ID":"d94802a7-3f7c-4172-b781-a1eac89761d6","Type":"ContainerStarted","Data":"395d8710806573ef5c4f2df400cef95eb1b9e7ef068985e7612bf714dced9451"} Mar 09 14:24:35 crc kubenswrapper[4722]: I0309 14:24:35.879966 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5qbgb\" (UniqueName: \"kubernetes.io/projected/1d9d7716-b622-4ae7-9f3a-480c5807525b-kube-api-access-5qbgb\") on node \"crc\" DevicePath \"\"" Mar 09 14:24:35 crc kubenswrapper[4722]: I0309 14:24:35.880335 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gpl9d\" (UniqueName: \"kubernetes.io/projected/e55131a4-f18d-417f-8d87-408a7f3bb919-kube-api-access-gpl9d\") on node \"crc\" DevicePath \"\"" Mar 09 14:24:35 crc kubenswrapper[4722]: I0309 14:24:35.880349 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5397205b-9c4a-4575-8bfa-8604e88784e9-operator-scripts\") on node 
\"crc\" DevicePath \"\"" Mar 09 14:24:35 crc kubenswrapper[4722]: I0309 14:24:35.880392 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9btkk\" (UniqueName: \"kubernetes.io/projected/5397205b-9c4a-4575-8bfa-8604e88784e9-kube-api-access-9btkk\") on node \"crc\" DevicePath \"\"" Mar 09 14:24:35 crc kubenswrapper[4722]: I0309 14:24:35.880408 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1d9d7716-b622-4ae7-9f3a-480c5807525b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 14:24:35 crc kubenswrapper[4722]: I0309 14:24:35.880419 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e55131a4-f18d-417f-8d87-408a7f3bb919-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 14:24:35 crc kubenswrapper[4722]: I0309 14:24:35.884076 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-33f7-account-create-update-tc7x6" Mar 09 14:24:35 crc kubenswrapper[4722]: I0309 14:24:35.885236 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-33f7-account-create-update-tc7x6" event={"ID":"1d9d7716-b622-4ae7-9f3a-480c5807525b","Type":"ContainerDied","Data":"91b7ae298e0ab69a2103821486880be47ed3f9348a720a697b8478a1f590fd89"} Mar 09 14:24:35 crc kubenswrapper[4722]: I0309 14:24:35.885452 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91b7ae298e0ab69a2103821486880be47ed3f9348a720a697b8478a1f590fd89" Mar 09 14:24:35 crc kubenswrapper[4722]: I0309 14:24:35.890223 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=3.387842395 podStartE2EDuration="4.890187143s" podCreationTimestamp="2026-03-09 14:24:31 +0000 UTC" firstStartedPulling="2026-03-09 14:24:32.63539496 +0000 UTC m=+1313.190963536" lastFinishedPulling="2026-03-09 14:24:34.137739708 +0000 UTC m=+1314.693308284" observedRunningTime="2026-03-09 14:24:35.885156589 +0000 UTC m=+1316.440725155" watchObservedRunningTime="2026-03-09 14:24:35.890187143 +0000 UTC m=+1316.445755719" Mar 09 14:24:36 crc kubenswrapper[4722]: I0309 14:24:36.166713 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ae3ba92-3a9f-4f65-8c32-2b3f4d1f95e7" path="/var/lib/kubelet/pods/0ae3ba92-3a9f-4f65-8c32-2b3f4d1f95e7/volumes" Mar 09 14:24:36 crc kubenswrapper[4722]: I0309 14:24:36.357801 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-c437-account-create-update-mdmhs" Mar 09 14:24:36 crc kubenswrapper[4722]: I0309 14:24:36.490715 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-65cx2" Mar 09 14:24:36 crc kubenswrapper[4722]: I0309 14:24:36.495152 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5cjf6\" (UniqueName: \"kubernetes.io/projected/e9b4c63d-ce42-432e-bbcf-abe490c33d2e-kube-api-access-5cjf6\") pod \"e9b4c63d-ce42-432e-bbcf-abe490c33d2e\" (UID: \"e9b4c63d-ce42-432e-bbcf-abe490c33d2e\") " Mar 09 14:24:36 crc kubenswrapper[4722]: I0309 14:24:36.495252 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9b4c63d-ce42-432e-bbcf-abe490c33d2e-operator-scripts\") pod \"e9b4c63d-ce42-432e-bbcf-abe490c33d2e\" (UID: \"e9b4c63d-ce42-432e-bbcf-abe490c33d2e\") " Mar 09 14:24:36 crc kubenswrapper[4722]: I0309 14:24:36.496329 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9b4c63d-ce42-432e-bbcf-abe490c33d2e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e9b4c63d-ce42-432e-bbcf-abe490c33d2e" (UID: "e9b4c63d-ce42-432e-bbcf-abe490c33d2e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:24:36 crc kubenswrapper[4722]: I0309 14:24:36.501226 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9b4c63d-ce42-432e-bbcf-abe490c33d2e-kube-api-access-5cjf6" (OuterVolumeSpecName: "kube-api-access-5cjf6") pod "e9b4c63d-ce42-432e-bbcf-abe490c33d2e" (UID: "e9b4c63d-ce42-432e-bbcf-abe490c33d2e"). InnerVolumeSpecName "kube-api-access-5cjf6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:24:36 crc kubenswrapper[4722]: I0309 14:24:36.597701 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/759a8d7f-dc3f-4432-96ca-4adaf31331ae-operator-scripts\") pod \"759a8d7f-dc3f-4432-96ca-4adaf31331ae\" (UID: \"759a8d7f-dc3f-4432-96ca-4adaf31331ae\") " Mar 09 14:24:36 crc kubenswrapper[4722]: I0309 14:24:36.597959 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ggjzb\" (UniqueName: \"kubernetes.io/projected/759a8d7f-dc3f-4432-96ca-4adaf31331ae-kube-api-access-ggjzb\") pod \"759a8d7f-dc3f-4432-96ca-4adaf31331ae\" (UID: \"759a8d7f-dc3f-4432-96ca-4adaf31331ae\") " Mar 09 14:24:36 crc kubenswrapper[4722]: I0309 14:24:36.598230 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/759a8d7f-dc3f-4432-96ca-4adaf31331ae-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "759a8d7f-dc3f-4432-96ca-4adaf31331ae" (UID: "759a8d7f-dc3f-4432-96ca-4adaf31331ae"). InnerVolumeSpecName "operator-scripts". 
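For ovn-northd-0 (reported a few entries above), podStartSLOduration (3.387842395s) is smaller than podStartE2EDuration (4.890187143s): the SLO figure appears to exclude the image-pull window, lastFinishedPulling - firstStartedPulling = 14:24:34.137739708 - 14:24:32.635394960 = 1.502344748s. The subtraction checks out, up to float rounding:

    package main

    import "fmt"

    func main() {
    	// Seconds past 14:24:00, copied from the ovn-northd-0 entry above.
    	created := 31.0
    	watchObserved := 35.890187143
    	pullStart, pullEnd := 32.635394960, 34.137739708

    	e2e := watchObserved - created     // ~4.890187143s (podStartE2EDuration)
    	slo := e2e - (pullEnd - pullStart) // ~3.387842395s (podStartSLOduration)
    	fmt.Printf("e2e=%.9f slo=%.9f\n", e2e, slo)
    }
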
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:24:36 crc kubenswrapper[4722]: I0309 14:24:36.598926 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/759a8d7f-dc3f-4432-96ca-4adaf31331ae-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 14:24:36 crc kubenswrapper[4722]: I0309 14:24:36.598949 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5cjf6\" (UniqueName: \"kubernetes.io/projected/e9b4c63d-ce42-432e-bbcf-abe490c33d2e-kube-api-access-5cjf6\") on node \"crc\" DevicePath \"\"" Mar 09 14:24:36 crc kubenswrapper[4722]: I0309 14:24:36.598961 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e9b4c63d-ce42-432e-bbcf-abe490c33d2e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 14:24:36 crc kubenswrapper[4722]: I0309 14:24:36.600696 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/759a8d7f-dc3f-4432-96ca-4adaf31331ae-kube-api-access-ggjzb" (OuterVolumeSpecName: "kube-api-access-ggjzb") pod "759a8d7f-dc3f-4432-96ca-4adaf31331ae" (UID: "759a8d7f-dc3f-4432-96ca-4adaf31331ae"). InnerVolumeSpecName "kube-api-access-ggjzb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:24:36 crc kubenswrapper[4722]: I0309 14:24:36.701284 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ggjzb\" (UniqueName: \"kubernetes.io/projected/759a8d7f-dc3f-4432-96ca-4adaf31331ae-kube-api-access-ggjzb\") on node \"crc\" DevicePath \"\"" Mar 09 14:24:36 crc kubenswrapper[4722]: I0309 14:24:36.897964 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-65cx2" Mar 09 14:24:36 crc kubenswrapper[4722]: I0309 14:24:36.898148 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-65cx2" event={"ID":"759a8d7f-dc3f-4432-96ca-4adaf31331ae","Type":"ContainerDied","Data":"a31d6f2b13102399523826e128e33ab3c62609927f88d3cc0bc390629c472df2"} Mar 09 14:24:36 crc kubenswrapper[4722]: I0309 14:24:36.898434 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a31d6f2b13102399523826e128e33ab3c62609927f88d3cc0bc390629c472df2" Mar 09 14:24:36 crc kubenswrapper[4722]: I0309 14:24:36.900458 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c437-account-create-update-mdmhs" event={"ID":"e9b4c63d-ce42-432e-bbcf-abe490c33d2e","Type":"ContainerDied","Data":"d27b2b3011a5b5abf2ab9850f5e65b5b1f837c21ecf210c21fa4584d6b954b31"} Mar 09 14:24:36 crc kubenswrapper[4722]: I0309 14:24:36.900527 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-c437-account-create-update-mdmhs" Mar 09 14:24:36 crc kubenswrapper[4722]: I0309 14:24:36.900499 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d27b2b3011a5b5abf2ab9850f5e65b5b1f837c21ecf210c21fa4584d6b954b31" Mar 09 14:24:37 crc kubenswrapper[4722]: I0309 14:24:37.394509 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-7dk6r" Mar 09 14:24:37 crc kubenswrapper[4722]: I0309 14:24:37.518083 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9e36087-1859-42c2-bf99-100d32617755-operator-scripts\") pod \"f9e36087-1859-42c2-bf99-100d32617755\" (UID: \"f9e36087-1859-42c2-bf99-100d32617755\") " Mar 09 14:24:37 crc kubenswrapper[4722]: I0309 14:24:37.518423 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vwfc\" (UniqueName: \"kubernetes.io/projected/f9e36087-1859-42c2-bf99-100d32617755-kube-api-access-5vwfc\") pod \"f9e36087-1859-42c2-bf99-100d32617755\" (UID: \"f9e36087-1859-42c2-bf99-100d32617755\") " Mar 09 14:24:37 crc kubenswrapper[4722]: I0309 14:24:37.519170 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9e36087-1859-42c2-bf99-100d32617755-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f9e36087-1859-42c2-bf99-100d32617755" (UID: "f9e36087-1859-42c2-bf99-100d32617755"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:24:37 crc kubenswrapper[4722]: I0309 14:24:37.519867 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9e36087-1859-42c2-bf99-100d32617755-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 14:24:37 crc kubenswrapper[4722]: I0309 14:24:37.531785 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9e36087-1859-42c2-bf99-100d32617755-kube-api-access-5vwfc" (OuterVolumeSpecName: "kube-api-access-5vwfc") pod "f9e36087-1859-42c2-bf99-100d32617755" (UID: "f9e36087-1859-42c2-bf99-100d32617755"). InnerVolumeSpecName "kube-api-access-5vwfc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:24:37 crc kubenswrapper[4722]: I0309 14:24:37.550941 4722 util.go:48] "No ready sandbox for pod can be found. 
Mar 09 14:24:37 crc kubenswrapper[4722]: I0309 14:24:37.550941 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-34cc-account-create-update-hqh5l"
Mar 09 14:24:37 crc kubenswrapper[4722]: I0309 14:24:37.621452 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5vwfc\" (UniqueName: \"kubernetes.io/projected/f9e36087-1859-42c2-bf99-100d32617755-kube-api-access-5vwfc\") on node \"crc\" DevicePath \"\""
Mar 09 14:24:37 crc kubenswrapper[4722]: I0309 14:24:37.723128 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nw9dt\" (UniqueName: \"kubernetes.io/projected/d94802a7-3f7c-4172-b781-a1eac89761d6-kube-api-access-nw9dt\") pod \"d94802a7-3f7c-4172-b781-a1eac89761d6\" (UID: \"d94802a7-3f7c-4172-b781-a1eac89761d6\") "
Mar 09 14:24:37 crc kubenswrapper[4722]: I0309 14:24:37.723166 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d94802a7-3f7c-4172-b781-a1eac89761d6-operator-scripts\") pod \"d94802a7-3f7c-4172-b781-a1eac89761d6\" (UID: \"d94802a7-3f7c-4172-b781-a1eac89761d6\") "
Mar 09 14:24:37 crc kubenswrapper[4722]: I0309 14:24:37.724367 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d94802a7-3f7c-4172-b781-a1eac89761d6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d94802a7-3f7c-4172-b781-a1eac89761d6" (UID: "d94802a7-3f7c-4172-b781-a1eac89761d6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 14:24:37 crc kubenswrapper[4722]: I0309 14:24:37.726861 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d94802a7-3f7c-4172-b781-a1eac89761d6-kube-api-access-nw9dt" (OuterVolumeSpecName: "kube-api-access-nw9dt") pod "d94802a7-3f7c-4172-b781-a1eac89761d6" (UID: "d94802a7-3f7c-4172-b781-a1eac89761d6"). InnerVolumeSpecName "kube-api-access-nw9dt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:24:37 crc kubenswrapper[4722]: I0309 14:24:37.781264 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-69xgs"]
Mar 09 14:24:37 crc kubenswrapper[4722]: E0309 14:24:37.781693 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ae3ba92-3a9f-4f65-8c32-2b3f4d1f95e7" containerName="init"
Mar 09 14:24:37 crc kubenswrapper[4722]: I0309 14:24:37.781709 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ae3ba92-3a9f-4f65-8c32-2b3f4d1f95e7" containerName="init"
Mar 09 14:24:37 crc kubenswrapper[4722]: E0309 14:24:37.781725 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ae3ba92-3a9f-4f65-8c32-2b3f4d1f95e7" containerName="dnsmasq-dns"
Mar 09 14:24:37 crc kubenswrapper[4722]: I0309 14:24:37.781731 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ae3ba92-3a9f-4f65-8c32-2b3f4d1f95e7" containerName="dnsmasq-dns"
Mar 09 14:24:37 crc kubenswrapper[4722]: E0309 14:24:37.781745 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b730192a-1410-4860-a847-f5e5974fd728" containerName="mariadb-database-create"
Mar 09 14:24:37 crc kubenswrapper[4722]: I0309 14:24:37.781753 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="b730192a-1410-4860-a847-f5e5974fd728" containerName="mariadb-database-create"
Mar 09 14:24:37 crc kubenswrapper[4722]: E0309 14:24:37.781762 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9b4c63d-ce42-432e-bbcf-abe490c33d2e" containerName="mariadb-account-create-update"
Mar 09 14:24:37 crc kubenswrapper[4722]: I0309 14:24:37.781767 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9b4c63d-ce42-432e-bbcf-abe490c33d2e" containerName="mariadb-account-create-update"
Mar 09 14:24:37 crc kubenswrapper[4722]: E0309 14:24:37.781781 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5397205b-9c4a-4575-8bfa-8604e88784e9" containerName="mariadb-database-create"
Mar 09 14:24:37 crc kubenswrapper[4722]: I0309 14:24:37.781787 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="5397205b-9c4a-4575-8bfa-8604e88784e9" containerName="mariadb-database-create"
Mar 09 14:24:37 crc kubenswrapper[4722]: E0309 14:24:37.781799 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d9d7716-b622-4ae7-9f3a-480c5807525b" containerName="mariadb-account-create-update"
Mar 09 14:24:37 crc kubenswrapper[4722]: I0309 14:24:37.781806 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d9d7716-b622-4ae7-9f3a-480c5807525b" containerName="mariadb-account-create-update"
Mar 09 14:24:37 crc kubenswrapper[4722]: E0309 14:24:37.781819 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e55131a4-f18d-417f-8d87-408a7f3bb919" containerName="mariadb-account-create-update"
Mar 09 14:24:37 crc kubenswrapper[4722]: I0309 14:24:37.781825 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="e55131a4-f18d-417f-8d87-408a7f3bb919" containerName="mariadb-account-create-update"
Mar 09 14:24:37 crc kubenswrapper[4722]: E0309 14:24:37.781836 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="759a8d7f-dc3f-4432-96ca-4adaf31331ae" containerName="mariadb-database-create"
Mar 09 14:24:37 crc kubenswrapper[4722]: I0309 14:24:37.781842 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="759a8d7f-dc3f-4432-96ca-4adaf31331ae" containerName="mariadb-database-create"
Mar 09 14:24:37 crc kubenswrapper[4722]: E0309 14:24:37.781851 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d94802a7-3f7c-4172-b781-a1eac89761d6" containerName="mariadb-account-create-update"
Mar 09 14:24:37 crc kubenswrapper[4722]: I0309 14:24:37.781858 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="d94802a7-3f7c-4172-b781-a1eac89761d6" containerName="mariadb-account-create-update"
Mar 09 14:24:37 crc kubenswrapper[4722]: E0309 14:24:37.781869 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9e36087-1859-42c2-bf99-100d32617755" containerName="mariadb-database-create"
Mar 09 14:24:37 crc kubenswrapper[4722]: I0309 14:24:37.781877 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9e36087-1859-42c2-bf99-100d32617755" containerName="mariadb-database-create"
Mar 09 14:24:37 crc kubenswrapper[4722]: I0309 14:24:37.782050 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="e55131a4-f18d-417f-8d87-408a7f3bb919" containerName="mariadb-account-create-update"
Mar 09 14:24:37 crc kubenswrapper[4722]: I0309 14:24:37.782065 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="759a8d7f-dc3f-4432-96ca-4adaf31331ae" containerName="mariadb-database-create"
Mar 09 14:24:37 crc kubenswrapper[4722]: I0309 14:24:37.782074 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9e36087-1859-42c2-bf99-100d32617755" containerName="mariadb-database-create"
Mar 09 14:24:37 crc kubenswrapper[4722]: I0309 14:24:37.782086 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="d94802a7-3f7c-4172-b781-a1eac89761d6" containerName="mariadb-account-create-update"
Mar 09 14:24:37 crc kubenswrapper[4722]: I0309 14:24:37.782096 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d9d7716-b622-4ae7-9f3a-480c5807525b" containerName="mariadb-account-create-update"
Mar 09 14:24:37 crc kubenswrapper[4722]: I0309 14:24:37.782108 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9b4c63d-ce42-432e-bbcf-abe490c33d2e" containerName="mariadb-account-create-update"
Mar 09 14:24:37 crc kubenswrapper[4722]: I0309 14:24:37.782117 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="b730192a-1410-4860-a847-f5e5974fd728" containerName="mariadb-database-create"
Mar 09 14:24:37 crc kubenswrapper[4722]: I0309 14:24:37.782128 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="5397205b-9c4a-4575-8bfa-8604e88784e9" containerName="mariadb-database-create"
Mar 09 14:24:37 crc kubenswrapper[4722]: I0309 14:24:37.782139 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ae3ba92-3a9f-4f65-8c32-2b3f4d1f95e7" containerName="dnsmasq-dns"
Mar 09 14:24:37 crc kubenswrapper[4722]: I0309 14:24:37.782903 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-69xgs"
Mar 09 14:24:37 crc kubenswrapper[4722]: I0309 14:24:37.784913 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret"
Mar 09 14:24:37 crc kubenswrapper[4722]: I0309 14:24:37.793345 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-69xgs"]
Mar 09 14:24:37 crc kubenswrapper[4722]: I0309 14:24:37.825885 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nw9dt\" (UniqueName: \"kubernetes.io/projected/d94802a7-3f7c-4172-b781-a1eac89761d6-kube-api-access-nw9dt\") on node \"crc\" DevicePath \"\""
Mar 09 14:24:37 crc kubenswrapper[4722]: I0309 14:24:37.825917 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d94802a7-3f7c-4172-b781-a1eac89761d6-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 14:24:37 crc kubenswrapper[4722]: I0309 14:24:37.912169 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-34cc-account-create-update-hqh5l" event={"ID":"d94802a7-3f7c-4172-b781-a1eac89761d6","Type":"ContainerDied","Data":"395d8710806573ef5c4f2df400cef95eb1b9e7ef068985e7612bf714dced9451"}
Mar 09 14:24:37 crc kubenswrapper[4722]: I0309 14:24:37.912220 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-34cc-account-create-update-hqh5l"
Mar 09 14:24:37 crc kubenswrapper[4722]: I0309 14:24:37.912227 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="395d8710806573ef5c4f2df400cef95eb1b9e7ef068985e7612bf714dced9451"
Mar 09 14:24:37 crc kubenswrapper[4722]: I0309 14:24:37.914039 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-7dk6r" event={"ID":"f9e36087-1859-42c2-bf99-100d32617755","Type":"ContainerDied","Data":"b03fcd081c9e02c56408d0171b25cd54e42f95b5fc882b4eabf8b671d3435ec2"}
Mar 09 14:24:37 crc kubenswrapper[4722]: I0309 14:24:37.914060 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b03fcd081c9e02c56408d0171b25cd54e42f95b5fc882b4eabf8b671d3435ec2"
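The cpu_manager.go:410 / state_mem.go:107 / memory_manager.go:354 burst above runs as the ADD for root-account-create-update-69xgs is synced: the CPU and memory managers sweep out pinning state still keyed to the pods that were just deleted. A toy Go version in the spirit of that sweep (types and data are illustrative only):

    package main

    import "fmt"

    func main() {
    	type key struct{ podUID, container string }
    	// Leftover assignments keyed to already-deleted pods (illustrative data).
    	assignments := map[key]string{
    		{podUID: "0ae3ba92", container: "dnsmasq-dns"}:             "cpus 0-1",
    		{podUID: "b730192a", container: "mariadb-database-create"}: "cpus 2",
    	}
    	active := map[string]bool{} // none of the deleted pods are active any more
    	for k := range assignments {
    		if !active[k.podUID] {
    			fmt.Println("RemoveStaleState: removing container", k.podUID, k.container)
    			delete(assignments, k)
    		}
    	}
    	fmt.Println("remaining assignments:", len(assignments)) // 0
    }
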
Mar 09 14:24:37 crc kubenswrapper[4722]: I0309 14:24:37.914098 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-7dk6r"
Mar 09 14:24:37 crc kubenswrapper[4722]: I0309 14:24:37.927362 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3bbf7c99-d896-4ccc-ac04-5c81ddfa3a11-operator-scripts\") pod \"root-account-create-update-69xgs\" (UID: \"3bbf7c99-d896-4ccc-ac04-5c81ddfa3a11\") " pod="openstack/root-account-create-update-69xgs"
Mar 09 14:24:37 crc kubenswrapper[4722]: I0309 14:24:37.927501 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsfq2\" (UniqueName: \"kubernetes.io/projected/3bbf7c99-d896-4ccc-ac04-5c81ddfa3a11-kube-api-access-lsfq2\") pod \"root-account-create-update-69xgs\" (UID: \"3bbf7c99-d896-4ccc-ac04-5c81ddfa3a11\") " pod="openstack/root-account-create-update-69xgs"
Mar 09 14:24:38 crc kubenswrapper[4722]: I0309 14:24:38.030391 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3bbf7c99-d896-4ccc-ac04-5c81ddfa3a11-operator-scripts\") pod \"root-account-create-update-69xgs\" (UID: \"3bbf7c99-d896-4ccc-ac04-5c81ddfa3a11\") " pod="openstack/root-account-create-update-69xgs"
Mar 09 14:24:38 crc kubenswrapper[4722]: I0309 14:24:38.030611 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsfq2\" (UniqueName: \"kubernetes.io/projected/3bbf7c99-d896-4ccc-ac04-5c81ddfa3a11-kube-api-access-lsfq2\") pod \"root-account-create-update-69xgs\" (UID: \"3bbf7c99-d896-4ccc-ac04-5c81ddfa3a11\") " pod="openstack/root-account-create-update-69xgs"
Mar 09 14:24:38 crc kubenswrapper[4722]: I0309 14:24:38.031219 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3bbf7c99-d896-4ccc-ac04-5c81ddfa3a11-operator-scripts\") pod \"root-account-create-update-69xgs\" (UID: \"3bbf7c99-d896-4ccc-ac04-5c81ddfa3a11\") " pod="openstack/root-account-create-update-69xgs"
Mar 09 14:24:38 crc kubenswrapper[4722]: I0309 14:24:38.047289 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsfq2\" (UniqueName: \"kubernetes.io/projected/3bbf7c99-d896-4ccc-ac04-5c81ddfa3a11-kube-api-access-lsfq2\") pod \"root-account-create-update-69xgs\" (UID: \"3bbf7c99-d896-4ccc-ac04-5c81ddfa3a11\") " pod="openstack/root-account-create-update-69xgs"
Mar 09 14:24:38 crc kubenswrapper[4722]: I0309 14:24:38.110796 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-69xgs"
Mar 09 14:24:40 crc kubenswrapper[4722]: I0309 14:24:40.380726 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7463e84f-f457-4409-9621-507d331e06b5-etc-swift\") pod \"swift-storage-0\" (UID: \"7463e84f-f457-4409-9621-507d331e06b5\") " pod="openstack/swift-storage-0"
Mar 09 14:24:40 crc kubenswrapper[4722]: E0309 14:24:40.380984 4722 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 09 14:24:40 crc kubenswrapper[4722]: E0309 14:24:40.381396 4722 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 09 14:24:40 crc kubenswrapper[4722]: E0309 14:24:40.381463 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7463e84f-f457-4409-9621-507d331e06b5-etc-swift podName:7463e84f-f457-4409-9621-507d331e06b5 nodeName:}" failed. No retries permitted until 2026-03-09 14:24:56.381443465 +0000 UTC m=+1336.937012051 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/7463e84f-f457-4409-9621-507d331e06b5-etc-swift") pod "swift-storage-0" (UID: "7463e84f-f457-4409-9621-507d331e06b5") : configmap "swift-ring-files" not found
Mar 09 14:24:40 crc kubenswrapper[4722]: I0309 14:24:40.942544 4722 generic.go:334] "Generic (PLEG): container finished" podID="cb0983d9-3f03-406f-a485-f89ba50341fc" containerID="9029fac70b952306b5efbc4eb7199723abe5367889318f7ad69441edc1614af8" exitCode=0
Mar 09 14:24:40 crc kubenswrapper[4722]: I0309 14:24:40.942589 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-vtpb7" event={"ID":"cb0983d9-3f03-406f-a485-f89ba50341fc","Type":"ContainerDied","Data":"9029fac70b952306b5efbc4eb7199723abe5367889318f7ad69441edc1614af8"}
Mar 09 14:24:41 crc kubenswrapper[4722]: I0309 14:24:41.307855 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-rl9l9"]
Mar 09 14:24:41 crc kubenswrapper[4722]: I0309 14:24:41.309513 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-rl9l9"
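The etc-swift failure above is a startup-ordering issue, not a permanent error: swift-storage-0 mounts a projected volume from configmap swift-ring-files, which the swift-ring-rebalance-vtpb7 job only finishes producing at 14:24:40.94, so MountVolume.SetUp fails and nestedpendingoperations schedules the next attempt 16s out (14:24:56.38). The retry interval grows roughly exponentially up to a cap; a sketch with assumed constants (kubelet's actual base and cap may differ):

    package main

    import (
    	"fmt"
    	"time"
    )

    // A capped, doubling retry delay in the style of the volume-mount retry
    // above (durationBeforeRetry 16s). Base and cap are assumptions, not
    // kubelet's actual constants.
    func backoff(attempt int) time.Duration {
    	d := 500 * time.Millisecond << attempt
    	if limit := 2 * time.Minute; d > limit {
    		d = limit
    	}
    	return d
    }

    func main() {
    	for i := 0; i <= 6; i++ {
    		fmt.Println(i, backoff(i)) // attempt 5 -> 16s, matching the log
    	}
    }
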
Need to start a new one" pod="openstack/glance-db-sync-rl9l9" Mar 09 14:24:41 crc kubenswrapper[4722]: I0309 14:24:41.311462 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-xv77d" Mar 09 14:24:41 crc kubenswrapper[4722]: I0309 14:24:41.311689 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 09 14:24:41 crc kubenswrapper[4722]: I0309 14:24:41.323320 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-rl9l9"] Mar 09 14:24:41 crc kubenswrapper[4722]: I0309 14:24:41.401996 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17f0c1f6-4aea-4ada-aaec-3493cec60053-combined-ca-bundle\") pod \"glance-db-sync-rl9l9\" (UID: \"17f0c1f6-4aea-4ada-aaec-3493cec60053\") " pod="openstack/glance-db-sync-rl9l9" Mar 09 14:24:41 crc kubenswrapper[4722]: I0309 14:24:41.402069 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17f0c1f6-4aea-4ada-aaec-3493cec60053-config-data\") pod \"glance-db-sync-rl9l9\" (UID: \"17f0c1f6-4aea-4ada-aaec-3493cec60053\") " pod="openstack/glance-db-sync-rl9l9" Mar 09 14:24:41 crc kubenswrapper[4722]: I0309 14:24:41.402211 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfpc4\" (UniqueName: \"kubernetes.io/projected/17f0c1f6-4aea-4ada-aaec-3493cec60053-kube-api-access-nfpc4\") pod \"glance-db-sync-rl9l9\" (UID: \"17f0c1f6-4aea-4ada-aaec-3493cec60053\") " pod="openstack/glance-db-sync-rl9l9" Mar 09 14:24:41 crc kubenswrapper[4722]: I0309 14:24:41.402312 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/17f0c1f6-4aea-4ada-aaec-3493cec60053-db-sync-config-data\") pod \"glance-db-sync-rl9l9\" (UID: \"17f0c1f6-4aea-4ada-aaec-3493cec60053\") " pod="openstack/glance-db-sync-rl9l9" Mar 09 14:24:41 crc kubenswrapper[4722]: I0309 14:24:41.504136 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17f0c1f6-4aea-4ada-aaec-3493cec60053-combined-ca-bundle\") pod \"glance-db-sync-rl9l9\" (UID: \"17f0c1f6-4aea-4ada-aaec-3493cec60053\") " pod="openstack/glance-db-sync-rl9l9" Mar 09 14:24:41 crc kubenswrapper[4722]: I0309 14:24:41.504244 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17f0c1f6-4aea-4ada-aaec-3493cec60053-config-data\") pod \"glance-db-sync-rl9l9\" (UID: \"17f0c1f6-4aea-4ada-aaec-3493cec60053\") " pod="openstack/glance-db-sync-rl9l9" Mar 09 14:24:41 crc kubenswrapper[4722]: I0309 14:24:41.504283 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfpc4\" (UniqueName: \"kubernetes.io/projected/17f0c1f6-4aea-4ada-aaec-3493cec60053-kube-api-access-nfpc4\") pod \"glance-db-sync-rl9l9\" (UID: \"17f0c1f6-4aea-4ada-aaec-3493cec60053\") " pod="openstack/glance-db-sync-rl9l9" Mar 09 14:24:41 crc kubenswrapper[4722]: I0309 14:24:41.504342 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/17f0c1f6-4aea-4ada-aaec-3493cec60053-db-sync-config-data\") pod 
\"glance-db-sync-rl9l9\" (UID: \"17f0c1f6-4aea-4ada-aaec-3493cec60053\") " pod="openstack/glance-db-sync-rl9l9" Mar 09 14:24:41 crc kubenswrapper[4722]: I0309 14:24:41.509681 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17f0c1f6-4aea-4ada-aaec-3493cec60053-config-data\") pod \"glance-db-sync-rl9l9\" (UID: \"17f0c1f6-4aea-4ada-aaec-3493cec60053\") " pod="openstack/glance-db-sync-rl9l9" Mar 09 14:24:41 crc kubenswrapper[4722]: I0309 14:24:41.510249 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/17f0c1f6-4aea-4ada-aaec-3493cec60053-db-sync-config-data\") pod \"glance-db-sync-rl9l9\" (UID: \"17f0c1f6-4aea-4ada-aaec-3493cec60053\") " pod="openstack/glance-db-sync-rl9l9" Mar 09 14:24:41 crc kubenswrapper[4722]: I0309 14:24:41.510481 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17f0c1f6-4aea-4ada-aaec-3493cec60053-combined-ca-bundle\") pod \"glance-db-sync-rl9l9\" (UID: \"17f0c1f6-4aea-4ada-aaec-3493cec60053\") " pod="openstack/glance-db-sync-rl9l9" Mar 09 14:24:41 crc kubenswrapper[4722]: I0309 14:24:41.521888 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfpc4\" (UniqueName: \"kubernetes.io/projected/17f0c1f6-4aea-4ada-aaec-3493cec60053-kube-api-access-nfpc4\") pod \"glance-db-sync-rl9l9\" (UID: \"17f0c1f6-4aea-4ada-aaec-3493cec60053\") " pod="openstack/glance-db-sync-rl9l9" Mar 09 14:24:41 crc kubenswrapper[4722]: I0309 14:24:41.633550 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-rl9l9" Mar 09 14:24:42 crc kubenswrapper[4722]: I0309 14:24:42.262638 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-69xgs"] Mar 09 14:24:42 crc kubenswrapper[4722]: I0309 14:24:42.320539 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 09 14:24:42 crc kubenswrapper[4722]: I0309 14:24:42.487336 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-vtpb7" Mar 09 14:24:42 crc kubenswrapper[4722]: I0309 14:24:42.548765 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-rl9l9"] Mar 09 14:24:42 crc kubenswrapper[4722]: I0309 14:24:42.639599 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nwbd\" (UniqueName: \"kubernetes.io/projected/cb0983d9-3f03-406f-a485-f89ba50341fc-kube-api-access-8nwbd\") pod \"cb0983d9-3f03-406f-a485-f89ba50341fc\" (UID: \"cb0983d9-3f03-406f-a485-f89ba50341fc\") " Mar 09 14:24:42 crc kubenswrapper[4722]: I0309 14:24:42.639655 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb0983d9-3f03-406f-a485-f89ba50341fc-combined-ca-bundle\") pod \"cb0983d9-3f03-406f-a485-f89ba50341fc\" (UID: \"cb0983d9-3f03-406f-a485-f89ba50341fc\") " Mar 09 14:24:42 crc kubenswrapper[4722]: I0309 14:24:42.639718 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cb0983d9-3f03-406f-a485-f89ba50341fc-scripts\") pod \"cb0983d9-3f03-406f-a485-f89ba50341fc\" (UID: \"cb0983d9-3f03-406f-a485-f89ba50341fc\") " Mar 09 14:24:42 crc kubenswrapper[4722]: I0309 14:24:42.639756 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/cb0983d9-3f03-406f-a485-f89ba50341fc-dispersionconf\") pod \"cb0983d9-3f03-406f-a485-f89ba50341fc\" (UID: \"cb0983d9-3f03-406f-a485-f89ba50341fc\") " Mar 09 14:24:42 crc kubenswrapper[4722]: I0309 14:24:42.639788 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/cb0983d9-3f03-406f-a485-f89ba50341fc-etc-swift\") pod \"cb0983d9-3f03-406f-a485-f89ba50341fc\" (UID: \"cb0983d9-3f03-406f-a485-f89ba50341fc\") " Mar 09 14:24:42 crc kubenswrapper[4722]: I0309 14:24:42.639830 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/cb0983d9-3f03-406f-a485-f89ba50341fc-swiftconf\") pod \"cb0983d9-3f03-406f-a485-f89ba50341fc\" (UID: \"cb0983d9-3f03-406f-a485-f89ba50341fc\") " Mar 09 14:24:42 crc kubenswrapper[4722]: I0309 14:24:42.639878 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/cb0983d9-3f03-406f-a485-f89ba50341fc-ring-data-devices\") pod \"cb0983d9-3f03-406f-a485-f89ba50341fc\" (UID: \"cb0983d9-3f03-406f-a485-f89ba50341fc\") " Mar 09 14:24:42 crc kubenswrapper[4722]: I0309 14:24:42.641073 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb0983d9-3f03-406f-a485-f89ba50341fc-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "cb0983d9-3f03-406f-a485-f89ba50341fc" (UID: "cb0983d9-3f03-406f-a485-f89ba50341fc"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:24:42 crc kubenswrapper[4722]: I0309 14:24:42.641492 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb0983d9-3f03-406f-a485-f89ba50341fc-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "cb0983d9-3f03-406f-a485-f89ba50341fc" (UID: "cb0983d9-3f03-406f-a485-f89ba50341fc"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:24:42 crc kubenswrapper[4722]: I0309 14:24:42.742171 4722 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/cb0983d9-3f03-406f-a485-f89ba50341fc-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 09 14:24:42 crc kubenswrapper[4722]: I0309 14:24:42.742237 4722 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/cb0983d9-3f03-406f-a485-f89ba50341fc-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 09 14:24:42 crc kubenswrapper[4722]: I0309 14:24:42.772000 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb0983d9-3f03-406f-a485-f89ba50341fc-kube-api-access-8nwbd" (OuterVolumeSpecName: "kube-api-access-8nwbd") pod "cb0983d9-3f03-406f-a485-f89ba50341fc" (UID: "cb0983d9-3f03-406f-a485-f89ba50341fc"). InnerVolumeSpecName "kube-api-access-8nwbd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:24:42 crc kubenswrapper[4722]: I0309 14:24:42.844024 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8nwbd\" (UniqueName: \"kubernetes.io/projected/cb0983d9-3f03-406f-a485-f89ba50341fc-kube-api-access-8nwbd\") on node \"crc\" DevicePath \"\"" Mar 09 14:24:42 crc kubenswrapper[4722]: I0309 14:24:42.873300 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb0983d9-3f03-406f-a485-f89ba50341fc-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "cb0983d9-3f03-406f-a485-f89ba50341fc" (UID: "cb0983d9-3f03-406f-a485-f89ba50341fc"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:24:42 crc kubenswrapper[4722]: I0309 14:24:42.945757 4722 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/cb0983d9-3f03-406f-a485-f89ba50341fc-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 09 14:24:42 crc kubenswrapper[4722]: I0309 14:24:42.962972 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e0db26a0-2877-48fa-b706-b5558f9973d5","Type":"ContainerStarted","Data":"f43ad056ea4fd9b6d874ac1b12dbe3d25d05f26348f873c90015860853bd08a4"} Mar 09 14:24:42 crc kubenswrapper[4722]: I0309 14:24:42.964861 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-vtpb7" event={"ID":"cb0983d9-3f03-406f-a485-f89ba50341fc","Type":"ContainerDied","Data":"e23d446c9c64a15350b2cbdfc8a3c98a4bb27e01afffe096e3d9b12ec895189b"} Mar 09 14:24:42 crc kubenswrapper[4722]: I0309 14:24:42.964903 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e23d446c9c64a15350b2cbdfc8a3c98a4bb27e01afffe096e3d9b12ec895189b" Mar 09 14:24:42 crc kubenswrapper[4722]: I0309 14:24:42.964881 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-vtpb7" Mar 09 14:24:42 crc kubenswrapper[4722]: I0309 14:24:42.966106 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-rl9l9" event={"ID":"17f0c1f6-4aea-4ada-aaec-3493cec60053","Type":"ContainerStarted","Data":"11e4b7703195d05b92ce0f1b24130c1d3c997e881712bb06cc8a629c76aeaf60"} Mar 09 14:24:42 crc kubenswrapper[4722]: I0309 14:24:42.967387 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-69xgs" event={"ID":"3bbf7c99-d896-4ccc-ac04-5c81ddfa3a11","Type":"ContainerStarted","Data":"82b7ffa14e8d018b723803168eca53c467a157bd62f531e66d17a31a488d4d6f"} Mar 09 14:24:43 crc kubenswrapper[4722]: I0309 14:24:43.095589 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb0983d9-3f03-406f-a485-f89ba50341fc-scripts" (OuterVolumeSpecName: "scripts") pod "cb0983d9-3f03-406f-a485-f89ba50341fc" (UID: "cb0983d9-3f03-406f-a485-f89ba50341fc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:24:43 crc kubenswrapper[4722]: I0309 14:24:43.104174 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb0983d9-3f03-406f-a485-f89ba50341fc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cb0983d9-3f03-406f-a485-f89ba50341fc" (UID: "cb0983d9-3f03-406f-a485-f89ba50341fc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:24:43 crc kubenswrapper[4722]: I0309 14:24:43.107434 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb0983d9-3f03-406f-a485-f89ba50341fc-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "cb0983d9-3f03-406f-a485-f89ba50341fc" (UID: "cb0983d9-3f03-406f-a485-f89ba50341fc"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:24:43 crc kubenswrapper[4722]: I0309 14:24:43.150461 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb0983d9-3f03-406f-a485-f89ba50341fc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:24:43 crc kubenswrapper[4722]: I0309 14:24:43.150492 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cb0983d9-3f03-406f-a485-f89ba50341fc-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 14:24:43 crc kubenswrapper[4722]: I0309 14:24:43.150501 4722 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/cb0983d9-3f03-406f-a485-f89ba50341fc-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 09 14:24:43 crc kubenswrapper[4722]: I0309 14:24:43.451753 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-sbfvj"] Mar 09 14:24:43 crc kubenswrapper[4722]: E0309 14:24:43.452611 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb0983d9-3f03-406f-a485-f89ba50341fc" containerName="swift-ring-rebalance" Mar 09 14:24:43 crc kubenswrapper[4722]: I0309 14:24:43.452633 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb0983d9-3f03-406f-a485-f89ba50341fc" containerName="swift-ring-rebalance" Mar 09 14:24:43 crc kubenswrapper[4722]: I0309 14:24:43.452885 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb0983d9-3f03-406f-a485-f89ba50341fc" containerName="swift-ring-rebalance" Mar 09 14:24:43 crc kubenswrapper[4722]: I0309 14:24:43.454038 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-sbfvj" Mar 09 14:24:43 crc kubenswrapper[4722]: I0309 14:24:43.468570 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-sbfvj"] Mar 09 14:24:43 crc kubenswrapper[4722]: I0309 14:24:43.554885 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-b04f-account-create-update-bmkn7"] Mar 09 14:24:43 crc kubenswrapper[4722]: I0309 14:24:43.556567 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-b04f-account-create-update-bmkn7" Mar 09 14:24:43 crc kubenswrapper[4722]: I0309 14:24:43.558919 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-openstack-cell1-db-secret" Mar 09 14:24:43 crc kubenswrapper[4722]: I0309 14:24:43.567850 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b9c74d7-fa1d-4c29-8875-47d506627d77-operator-scripts\") pod \"mysqld-exporter-openstack-cell1-db-create-sbfvj\" (UID: \"2b9c74d7-fa1d-4c29-8875-47d506627d77\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-sbfvj" Mar 09 14:24:43 crc kubenswrapper[4722]: I0309 14:24:43.567906 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p45ng\" (UniqueName: \"kubernetes.io/projected/2b9c74d7-fa1d-4c29-8875-47d506627d77-kube-api-access-p45ng\") pod \"mysqld-exporter-openstack-cell1-db-create-sbfvj\" (UID: \"2b9c74d7-fa1d-4c29-8875-47d506627d77\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-sbfvj" Mar 09 14:24:43 crc kubenswrapper[4722]: I0309 14:24:43.575431 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-b04f-account-create-update-bmkn7"] Mar 09 14:24:43 crc kubenswrapper[4722]: I0309 14:24:43.669893 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b9c74d7-fa1d-4c29-8875-47d506627d77-operator-scripts\") pod \"mysqld-exporter-openstack-cell1-db-create-sbfvj\" (UID: \"2b9c74d7-fa1d-4c29-8875-47d506627d77\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-sbfvj" Mar 09 14:24:43 crc kubenswrapper[4722]: I0309 14:24:43.669935 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p45ng\" (UniqueName: \"kubernetes.io/projected/2b9c74d7-fa1d-4c29-8875-47d506627d77-kube-api-access-p45ng\") pod \"mysqld-exporter-openstack-cell1-db-create-sbfvj\" (UID: \"2b9c74d7-fa1d-4c29-8875-47d506627d77\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-sbfvj" Mar 09 14:24:43 crc kubenswrapper[4722]: I0309 14:24:43.670079 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d162f3c-13b0-4ec3-955b-5a1bf804c61c-operator-scripts\") pod \"mysqld-exporter-b04f-account-create-update-bmkn7\" (UID: \"4d162f3c-13b0-4ec3-955b-5a1bf804c61c\") " pod="openstack/mysqld-exporter-b04f-account-create-update-bmkn7" Mar 09 14:24:43 crc kubenswrapper[4722]: I0309 14:24:43.670123 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ng89v\" (UniqueName: \"kubernetes.io/projected/4d162f3c-13b0-4ec3-955b-5a1bf804c61c-kube-api-access-ng89v\") pod \"mysqld-exporter-b04f-account-create-update-bmkn7\" (UID: \"4d162f3c-13b0-4ec3-955b-5a1bf804c61c\") " pod="openstack/mysqld-exporter-b04f-account-create-update-bmkn7" Mar 09 14:24:43 crc kubenswrapper[4722]: I0309 14:24:43.670833 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b9c74d7-fa1d-4c29-8875-47d506627d77-operator-scripts\") pod \"mysqld-exporter-openstack-cell1-db-create-sbfvj\" (UID: \"2b9c74d7-fa1d-4c29-8875-47d506627d77\") " 
pod="openstack/mysqld-exporter-openstack-cell1-db-create-sbfvj" Mar 09 14:24:43 crc kubenswrapper[4722]: I0309 14:24:43.692120 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p45ng\" (UniqueName: \"kubernetes.io/projected/2b9c74d7-fa1d-4c29-8875-47d506627d77-kube-api-access-p45ng\") pod \"mysqld-exporter-openstack-cell1-db-create-sbfvj\" (UID: \"2b9c74d7-fa1d-4c29-8875-47d506627d77\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-sbfvj" Mar 09 14:24:43 crc kubenswrapper[4722]: I0309 14:24:43.772055 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d162f3c-13b0-4ec3-955b-5a1bf804c61c-operator-scripts\") pod \"mysqld-exporter-b04f-account-create-update-bmkn7\" (UID: \"4d162f3c-13b0-4ec3-955b-5a1bf804c61c\") " pod="openstack/mysqld-exporter-b04f-account-create-update-bmkn7" Mar 09 14:24:43 crc kubenswrapper[4722]: I0309 14:24:43.772217 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ng89v\" (UniqueName: \"kubernetes.io/projected/4d162f3c-13b0-4ec3-955b-5a1bf804c61c-kube-api-access-ng89v\") pod \"mysqld-exporter-b04f-account-create-update-bmkn7\" (UID: \"4d162f3c-13b0-4ec3-955b-5a1bf804c61c\") " pod="openstack/mysqld-exporter-b04f-account-create-update-bmkn7" Mar 09 14:24:43 crc kubenswrapper[4722]: I0309 14:24:43.772940 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d162f3c-13b0-4ec3-955b-5a1bf804c61c-operator-scripts\") pod \"mysqld-exporter-b04f-account-create-update-bmkn7\" (UID: \"4d162f3c-13b0-4ec3-955b-5a1bf804c61c\") " pod="openstack/mysqld-exporter-b04f-account-create-update-bmkn7" Mar 09 14:24:43 crc kubenswrapper[4722]: I0309 14:24:43.781756 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-sbfvj" Mar 09 14:24:43 crc kubenswrapper[4722]: I0309 14:24:43.802478 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ng89v\" (UniqueName: \"kubernetes.io/projected/4d162f3c-13b0-4ec3-955b-5a1bf804c61c-kube-api-access-ng89v\") pod \"mysqld-exporter-b04f-account-create-update-bmkn7\" (UID: \"4d162f3c-13b0-4ec3-955b-5a1bf804c61c\") " pod="openstack/mysqld-exporter-b04f-account-create-update-bmkn7" Mar 09 14:24:43 crc kubenswrapper[4722]: I0309 14:24:43.889930 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-b04f-account-create-update-bmkn7" Mar 09 14:24:43 crc kubenswrapper[4722]: I0309 14:24:43.995695 4722 generic.go:334] "Generic (PLEG): container finished" podID="3bbf7c99-d896-4ccc-ac04-5c81ddfa3a11" containerID="9914625d14d065d2c597a16fdbdc977022690005aca822fcfd2634b47fba082d" exitCode=0 Mar 09 14:24:43 crc kubenswrapper[4722]: I0309 14:24:43.995739 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-69xgs" event={"ID":"3bbf7c99-d896-4ccc-ac04-5c81ddfa3a11","Type":"ContainerDied","Data":"9914625d14d065d2c597a16fdbdc977022690005aca822fcfd2634b47fba082d"} Mar 09 14:24:44 crc kubenswrapper[4722]: I0309 14:24:44.355978 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-sbfvj"] Mar 09 14:24:44 crc kubenswrapper[4722]: W0309 14:24:44.369277 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b9c74d7_fa1d_4c29_8875_47d506627d77.slice/crio-d35e31fc232e7d43edbc920d5798086b8bcfd99a67012c1ffcc3464e87a8fdcc WatchSource:0}: Error finding container d35e31fc232e7d43edbc920d5798086b8bcfd99a67012c1ffcc3464e87a8fdcc: Status 404 returned error can't find the container with id d35e31fc232e7d43edbc920d5798086b8bcfd99a67012c1ffcc3464e87a8fdcc Mar 09 14:24:44 crc kubenswrapper[4722]: I0309 14:24:44.506659 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-b04f-account-create-update-bmkn7"] Mar 09 14:24:44 crc kubenswrapper[4722]: W0309 14:24:44.509506 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d162f3c_13b0_4ec3_955b_5a1bf804c61c.slice/crio-1a695f2881cd95df00077b82e07e8a393d542a632078dd88c11a1f1b8e81830a WatchSource:0}: Error finding container 1a695f2881cd95df00077b82e07e8a393d542a632078dd88c11a1f1b8e81830a: Status 404 returned error can't find the container with id 1a695f2881cd95df00077b82e07e8a393d542a632078dd88c11a1f1b8e81830a Mar 09 14:24:45 crc kubenswrapper[4722]: I0309 14:24:45.016739 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-b04f-account-create-update-bmkn7" event={"ID":"4d162f3c-13b0-4ec3-955b-5a1bf804c61c","Type":"ContainerStarted","Data":"1a695f2881cd95df00077b82e07e8a393d542a632078dd88c11a1f1b8e81830a"} Mar 09 14:24:45 crc kubenswrapper[4722]: I0309 14:24:45.025127 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-sbfvj" event={"ID":"2b9c74d7-fa1d-4c29-8875-47d506627d77","Type":"ContainerStarted","Data":"bd82c164ee409dde9a58c38a0dd185a5560bf460d2e5290d5d5480ce344605d9"} Mar 09 14:24:45 crc kubenswrapper[4722]: I0309 14:24:45.025192 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-sbfvj" event={"ID":"2b9c74d7-fa1d-4c29-8875-47d506627d77","Type":"ContainerStarted","Data":"d35e31fc232e7d43edbc920d5798086b8bcfd99a67012c1ffcc3464e87a8fdcc"} Mar 09 14:24:45 crc kubenswrapper[4722]: I0309 14:24:45.055636 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-openstack-cell1-db-create-sbfvj" podStartSLOduration=2.055608034 podStartE2EDuration="2.055608034s" podCreationTimestamp="2026-03-09 14:24:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-09 14:24:45.050491828 +0000 UTC m=+1325.606060414" watchObservedRunningTime="2026-03-09 14:24:45.055608034 +0000 UTC m=+1325.611176620" Mar 09 14:24:45 crc kubenswrapper[4722]: I0309 14:24:45.599302 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-69xgs" Mar 09 14:24:45 crc kubenswrapper[4722]: I0309 14:24:45.737275 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lsfq2\" (UniqueName: \"kubernetes.io/projected/3bbf7c99-d896-4ccc-ac04-5c81ddfa3a11-kube-api-access-lsfq2\") pod \"3bbf7c99-d896-4ccc-ac04-5c81ddfa3a11\" (UID: \"3bbf7c99-d896-4ccc-ac04-5c81ddfa3a11\") " Mar 09 14:24:45 crc kubenswrapper[4722]: I0309 14:24:45.737395 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3bbf7c99-d896-4ccc-ac04-5c81ddfa3a11-operator-scripts\") pod \"3bbf7c99-d896-4ccc-ac04-5c81ddfa3a11\" (UID: \"3bbf7c99-d896-4ccc-ac04-5c81ddfa3a11\") " Mar 09 14:24:45 crc kubenswrapper[4722]: I0309 14:24:45.738144 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3bbf7c99-d896-4ccc-ac04-5c81ddfa3a11-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3bbf7c99-d896-4ccc-ac04-5c81ddfa3a11" (UID: "3bbf7c99-d896-4ccc-ac04-5c81ddfa3a11"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:24:45 crc kubenswrapper[4722]: I0309 14:24:45.745925 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bbf7c99-d896-4ccc-ac04-5c81ddfa3a11-kube-api-access-lsfq2" (OuterVolumeSpecName: "kube-api-access-lsfq2") pod "3bbf7c99-d896-4ccc-ac04-5c81ddfa3a11" (UID: "3bbf7c99-d896-4ccc-ac04-5c81ddfa3a11"). InnerVolumeSpecName "kube-api-access-lsfq2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:24:45 crc kubenswrapper[4722]: I0309 14:24:45.839832 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lsfq2\" (UniqueName: \"kubernetes.io/projected/3bbf7c99-d896-4ccc-ac04-5c81ddfa3a11-kube-api-access-lsfq2\") on node \"crc\" DevicePath \"\"" Mar 09 14:24:45 crc kubenswrapper[4722]: I0309 14:24:45.839870 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3bbf7c99-d896-4ccc-ac04-5c81ddfa3a11-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 14:24:46 crc kubenswrapper[4722]: I0309 14:24:46.039122 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e0db26a0-2877-48fa-b706-b5558f9973d5","Type":"ContainerStarted","Data":"4aa6764ae1c61b4084dcc5f0a098a411934735e696ad3102d63cf6771c505e59"} Mar 09 14:24:46 crc kubenswrapper[4722]: I0309 14:24:46.041571 4722 generic.go:334] "Generic (PLEG): container finished" podID="4d162f3c-13b0-4ec3-955b-5a1bf804c61c" containerID="2d71a3ff9d06d6730989830c92086552aabfd7058ad9bdbfa73c47fa2069e148" exitCode=0 Mar 09 14:24:46 crc kubenswrapper[4722]: I0309 14:24:46.041615 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-b04f-account-create-update-bmkn7" event={"ID":"4d162f3c-13b0-4ec3-955b-5a1bf804c61c","Type":"ContainerDied","Data":"2d71a3ff9d06d6730989830c92086552aabfd7058ad9bdbfa73c47fa2069e148"} Mar 09 14:24:46 crc kubenswrapper[4722]: I0309 14:24:46.046086 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-69xgs" Mar 09 14:24:46 crc kubenswrapper[4722]: I0309 14:24:46.048430 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-69xgs" event={"ID":"3bbf7c99-d896-4ccc-ac04-5c81ddfa3a11","Type":"ContainerDied","Data":"82b7ffa14e8d018b723803168eca53c467a157bd62f531e66d17a31a488d4d6f"} Mar 09 14:24:46 crc kubenswrapper[4722]: I0309 14:24:46.048469 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="82b7ffa14e8d018b723803168eca53c467a157bd62f531e66d17a31a488d4d6f" Mar 09 14:24:46 crc kubenswrapper[4722]: I0309 14:24:46.049926 4722 generic.go:334] "Generic (PLEG): container finished" podID="2b9c74d7-fa1d-4c29-8875-47d506627d77" containerID="bd82c164ee409dde9a58c38a0dd185a5560bf460d2e5290d5d5480ce344605d9" exitCode=0 Mar 09 14:24:46 crc kubenswrapper[4722]: I0309 14:24:46.049983 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-sbfvj" event={"ID":"2b9c74d7-fa1d-4c29-8875-47d506627d77","Type":"ContainerDied","Data":"bd82c164ee409dde9a58c38a0dd185a5560bf460d2e5290d5d5480ce344605d9"} Mar 09 14:24:47 crc kubenswrapper[4722]: I0309 14:24:47.067087 4722 generic.go:334] "Generic (PLEG): container finished" podID="17ce7999-f86f-45fa-ae07-785f70d797a1" containerID="2ed1c447dbb8dfe73b7c01fa28b0e8e47079d52fb2e0e72560df64042052f747" exitCode=0 Mar 09 14:24:47 crc kubenswrapper[4722]: I0309 14:24:47.067257 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"17ce7999-f86f-45fa-ae07-785f70d797a1","Type":"ContainerDied","Data":"2ed1c447dbb8dfe73b7c01fa28b0e8e47079d52fb2e0e72560df64042052f747"} Mar 09 14:24:47 crc kubenswrapper[4722]: I0309 14:24:47.071830 4722 generic.go:334] "Generic (PLEG): container finished" podID="6f4e007a-4a18-40e6-bf96-4a751e00cd73" 
containerID="89991879f6e59e858e98954d53f4101c5c7935bc1ad02ef1f93145110f421678" exitCode=0 Mar 09 14:24:47 crc kubenswrapper[4722]: I0309 14:24:47.071883 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6f4e007a-4a18-40e6-bf96-4a751e00cd73","Type":"ContainerDied","Data":"89991879f6e59e858e98954d53f4101c5c7935bc1ad02ef1f93145110f421678"} Mar 09 14:24:47 crc kubenswrapper[4722]: I0309 14:24:47.075315 4722 generic.go:334] "Generic (PLEG): container finished" podID="1c98e541-4b72-465d-8799-89e8c9791c3e" containerID="83b4abfa07a9f0cdc86b0978dab11bb5c16ae3ddce1e3930e50e2705f0aa51fa" exitCode=0 Mar 09 14:24:47 crc kubenswrapper[4722]: I0309 14:24:47.075382 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1c98e541-4b72-465d-8799-89e8c9791c3e","Type":"ContainerDied","Data":"83b4abfa07a9f0cdc86b0978dab11bb5c16ae3ddce1e3930e50e2705f0aa51fa"} Mar 09 14:24:47 crc kubenswrapper[4722]: I0309 14:24:47.077861 4722 generic.go:334] "Generic (PLEG): container finished" podID="c6ada086-becc-4f4a-a0a0-0aad894dc550" containerID="c14c75a63d784852902e25e93e5a8cf7646ecf4eccaa9b3cddd0d364975c6f55" exitCode=0 Mar 09 14:24:47 crc kubenswrapper[4722]: I0309 14:24:47.078030 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"c6ada086-becc-4f4a-a0a0-0aad894dc550","Type":"ContainerDied","Data":"c14c75a63d784852902e25e93e5a8cf7646ecf4eccaa9b3cddd0d364975c6f55"} Mar 09 14:24:47 crc kubenswrapper[4722]: I0309 14:24:47.647418 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-b04f-account-create-update-bmkn7" Mar 09 14:24:47 crc kubenswrapper[4722]: I0309 14:24:47.689220 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-sbfvj" Mar 09 14:24:47 crc kubenswrapper[4722]: I0309 14:24:47.800869 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d162f3c-13b0-4ec3-955b-5a1bf804c61c-operator-scripts\") pod \"4d162f3c-13b0-4ec3-955b-5a1bf804c61c\" (UID: \"4d162f3c-13b0-4ec3-955b-5a1bf804c61c\") " Mar 09 14:24:47 crc kubenswrapper[4722]: I0309 14:24:47.800955 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ng89v\" (UniqueName: \"kubernetes.io/projected/4d162f3c-13b0-4ec3-955b-5a1bf804c61c-kube-api-access-ng89v\") pod \"4d162f3c-13b0-4ec3-955b-5a1bf804c61c\" (UID: \"4d162f3c-13b0-4ec3-955b-5a1bf804c61c\") " Mar 09 14:24:47 crc kubenswrapper[4722]: I0309 14:24:47.801072 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b9c74d7-fa1d-4c29-8875-47d506627d77-operator-scripts\") pod \"2b9c74d7-fa1d-4c29-8875-47d506627d77\" (UID: \"2b9c74d7-fa1d-4c29-8875-47d506627d77\") " Mar 09 14:24:47 crc kubenswrapper[4722]: I0309 14:24:47.801222 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p45ng\" (UniqueName: \"kubernetes.io/projected/2b9c74d7-fa1d-4c29-8875-47d506627d77-kube-api-access-p45ng\") pod \"2b9c74d7-fa1d-4c29-8875-47d506627d77\" (UID: \"2b9c74d7-fa1d-4c29-8875-47d506627d77\") " Mar 09 14:24:47 crc kubenswrapper[4722]: I0309 14:24:47.801720 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d162f3c-13b0-4ec3-955b-5a1bf804c61c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4d162f3c-13b0-4ec3-955b-5a1bf804c61c" (UID: "4d162f3c-13b0-4ec3-955b-5a1bf804c61c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:24:47 crc kubenswrapper[4722]: I0309 14:24:47.801776 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b9c74d7-fa1d-4c29-8875-47d506627d77-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2b9c74d7-fa1d-4c29-8875-47d506627d77" (UID: "2b9c74d7-fa1d-4c29-8875-47d506627d77"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:24:47 crc kubenswrapper[4722]: I0309 14:24:47.806474 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d162f3c-13b0-4ec3-955b-5a1bf804c61c-kube-api-access-ng89v" (OuterVolumeSpecName: "kube-api-access-ng89v") pod "4d162f3c-13b0-4ec3-955b-5a1bf804c61c" (UID: "4d162f3c-13b0-4ec3-955b-5a1bf804c61c"). InnerVolumeSpecName "kube-api-access-ng89v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:24:47 crc kubenswrapper[4722]: I0309 14:24:47.806554 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b9c74d7-fa1d-4c29-8875-47d506627d77-kube-api-access-p45ng" (OuterVolumeSpecName: "kube-api-access-p45ng") pod "2b9c74d7-fa1d-4c29-8875-47d506627d77" (UID: "2b9c74d7-fa1d-4c29-8875-47d506627d77"). InnerVolumeSpecName "kube-api-access-p45ng". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:24:47 crc kubenswrapper[4722]: I0309 14:24:47.903882 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p45ng\" (UniqueName: \"kubernetes.io/projected/2b9c74d7-fa1d-4c29-8875-47d506627d77-kube-api-access-p45ng\") on node \"crc\" DevicePath \"\"" Mar 09 14:24:47 crc kubenswrapper[4722]: I0309 14:24:47.904351 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d162f3c-13b0-4ec3-955b-5a1bf804c61c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 14:24:47 crc kubenswrapper[4722]: I0309 14:24:47.904439 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ng89v\" (UniqueName: \"kubernetes.io/projected/4d162f3c-13b0-4ec3-955b-5a1bf804c61c-kube-api-access-ng89v\") on node \"crc\" DevicePath \"\"" Mar 09 14:24:47 crc kubenswrapper[4722]: I0309 14:24:47.904519 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b9c74d7-fa1d-4c29-8875-47d506627d77-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 14:24:48 crc kubenswrapper[4722]: I0309 14:24:48.091024 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1c98e541-4b72-465d-8799-89e8c9791c3e","Type":"ContainerStarted","Data":"491a32d2b4f4528a2d9ac9ed69a4bc0a4f2ea0173c8712dc6ab51834b9601f38"} Mar 09 14:24:48 crc kubenswrapper[4722]: I0309 14:24:48.092412 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 09 14:24:48 crc kubenswrapper[4722]: I0309 14:24:48.095683 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"c6ada086-becc-4f4a-a0a0-0aad894dc550","Type":"ContainerStarted","Data":"0cdebc0d2406fceecb9b5fd85c03284427aaea6f5c3f1ce6e67e86ed2f14dfed"} Mar 09 14:24:48 crc kubenswrapper[4722]: I0309 14:24:48.095878 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-1" Mar 09 14:24:48 crc kubenswrapper[4722]: I0309 14:24:48.097469 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-sbfvj" event={"ID":"2b9c74d7-fa1d-4c29-8875-47d506627d77","Type":"ContainerDied","Data":"d35e31fc232e7d43edbc920d5798086b8bcfd99a67012c1ffcc3464e87a8fdcc"} Mar 09 14:24:48 crc kubenswrapper[4722]: I0309 14:24:48.097491 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-sbfvj" Mar 09 14:24:48 crc kubenswrapper[4722]: I0309 14:24:48.097504 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d35e31fc232e7d43edbc920d5798086b8bcfd99a67012c1ffcc3464e87a8fdcc" Mar 09 14:24:48 crc kubenswrapper[4722]: I0309 14:24:48.120571 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"17ce7999-f86f-45fa-ae07-785f70d797a1","Type":"ContainerStarted","Data":"269fab5abed3a65c1a525f2ce6fb7275926a484f18e769dddcc4268e414b80cd"} Mar 09 14:24:48 crc kubenswrapper[4722]: I0309 14:24:48.120878 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-2" Mar 09 14:24:48 crc kubenswrapper[4722]: I0309 14:24:48.123705 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6f4e007a-4a18-40e6-bf96-4a751e00cd73","Type":"ContainerStarted","Data":"1d4f1f48ede004d37f49113410c9accaab4140f421e9ac96954fd4856a15b12f"} Mar 09 14:24:48 crc kubenswrapper[4722]: I0309 14:24:48.124551 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 09 14:24:48 crc kubenswrapper[4722]: I0309 14:24:48.129967 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-b04f-account-create-update-bmkn7" event={"ID":"4d162f3c-13b0-4ec3-955b-5a1bf804c61c","Type":"ContainerDied","Data":"1a695f2881cd95df00077b82e07e8a393d542a632078dd88c11a1f1b8e81830a"} Mar 09 14:24:48 crc kubenswrapper[4722]: I0309 14:24:48.130003 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a695f2881cd95df00077b82e07e8a393d542a632078dd88c11a1f1b8e81830a" Mar 09 14:24:48 crc kubenswrapper[4722]: I0309 14:24:48.130053 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-b04f-account-create-update-bmkn7" Mar 09 14:24:48 crc kubenswrapper[4722]: I0309 14:24:48.146762 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=45.774470411 podStartE2EDuration="1m2.146741695s" podCreationTimestamp="2026-03-09 14:23:46 +0000 UTC" firstStartedPulling="2026-03-09 14:23:56.774688321 +0000 UTC m=+1277.330256897" lastFinishedPulling="2026-03-09 14:24:13.146959605 +0000 UTC m=+1293.702528181" observedRunningTime="2026-03-09 14:24:48.14056116 +0000 UTC m=+1328.696129736" watchObservedRunningTime="2026-03-09 14:24:48.146741695 +0000 UTC m=+1328.702310271" Mar 09 14:24:48 crc kubenswrapper[4722]: I0309 14:24:48.178057 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-2" podStartSLOduration=45.939471379 podStartE2EDuration="1m2.17803644s" podCreationTimestamp="2026-03-09 14:23:46 +0000 UTC" firstStartedPulling="2026-03-09 14:23:56.76455501 +0000 UTC m=+1277.320123576" lastFinishedPulling="2026-03-09 14:24:13.003120061 +0000 UTC m=+1293.558688637" observedRunningTime="2026-03-09 14:24:48.170661282 +0000 UTC m=+1328.726229858" watchObservedRunningTime="2026-03-09 14:24:48.17803644 +0000 UTC m=+1328.733605016" Mar 09 14:24:48 crc kubenswrapper[4722]: I0309 14:24:48.227435 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-1" podStartSLOduration=45.964470154 podStartE2EDuration="1m2.227416796s" podCreationTimestamp="2026-03-09 14:23:46 +0000 UTC" firstStartedPulling="2026-03-09 14:23:56.764460227 +0000 UTC m=+1277.320028803" lastFinishedPulling="2026-03-09 14:24:13.027406869 +0000 UTC m=+1293.582975445" observedRunningTime="2026-03-09 14:24:48.201681759 +0000 UTC m=+1328.757250335" watchObservedRunningTime="2026-03-09 14:24:48.227416796 +0000 UTC m=+1328.782985372" Mar 09 14:24:48 crc kubenswrapper[4722]: I0309 14:24:48.255124 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=45.839257138 podStartE2EDuration="1m2.255101984s" podCreationTimestamp="2026-03-09 14:23:46 +0000 UTC" firstStartedPulling="2026-03-09 14:23:56.774639169 +0000 UTC m=+1277.330207745" lastFinishedPulling="2026-03-09 14:24:13.190484015 +0000 UTC m=+1293.746052591" observedRunningTime="2026-03-09 14:24:48.244647385 +0000 UTC m=+1328.800215961" watchObservedRunningTime="2026-03-09 14:24:48.255101984 +0000 UTC m=+1328.810670560" Mar 09 14:24:49 crc kubenswrapper[4722]: I0309 14:24:49.158220 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-69xgs"] Mar 09 14:24:49 crc kubenswrapper[4722]: I0309 14:24:49.175747 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-69xgs"] Mar 09 14:24:49 crc kubenswrapper[4722]: I0309 14:24:49.812280 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-78fdf7cd4f-mt82k" podUID="aa31f801-ed80-405f-960c-74c254d4f9ca" containerName="console" containerID="cri-o://e88aefd45843603369ebe8bd286ece8982c0ae3f0cd8e6b50b0d085a2c5d1930" gracePeriod=15 Mar 09 14:24:50 crc kubenswrapper[4722]: I0309 14:24:50.155717 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-78fdf7cd4f-mt82k_aa31f801-ed80-405f-960c-74c254d4f9ca/console/0.log" Mar 09 14:24:50 crc kubenswrapper[4722]: I0309 14:24:50.155934 4722 
generic.go:334] "Generic (PLEG): container finished" podID="aa31f801-ed80-405f-960c-74c254d4f9ca" containerID="e88aefd45843603369ebe8bd286ece8982c0ae3f0cd8e6b50b0d085a2c5d1930" exitCode=2 Mar 09 14:24:50 crc kubenswrapper[4722]: I0309 14:24:50.159278 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bbf7c99-d896-4ccc-ac04-5c81ddfa3a11" path="/var/lib/kubelet/pods/3bbf7c99-d896-4ccc-ac04-5c81ddfa3a11/volumes" Mar 09 14:24:50 crc kubenswrapper[4722]: I0309 14:24:50.159839 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-78fdf7cd4f-mt82k" event={"ID":"aa31f801-ed80-405f-960c-74c254d4f9ca","Type":"ContainerDied","Data":"e88aefd45843603369ebe8bd286ece8982c0ae3f0cd8e6b50b0d085a2c5d1930"} Mar 09 14:24:51 crc kubenswrapper[4722]: I0309 14:24:51.472687 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 09 14:24:53 crc kubenswrapper[4722]: I0309 14:24:53.649319 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-0"] Mar 09 14:24:53 crc kubenswrapper[4722]: E0309 14:24:53.650108 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d162f3c-13b0-4ec3-955b-5a1bf804c61c" containerName="mariadb-account-create-update" Mar 09 14:24:53 crc kubenswrapper[4722]: I0309 14:24:53.650125 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d162f3c-13b0-4ec3-955b-5a1bf804c61c" containerName="mariadb-account-create-update" Mar 09 14:24:53 crc kubenswrapper[4722]: E0309 14:24:53.650136 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bbf7c99-d896-4ccc-ac04-5c81ddfa3a11" containerName="mariadb-account-create-update" Mar 09 14:24:53 crc kubenswrapper[4722]: I0309 14:24:53.650144 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bbf7c99-d896-4ccc-ac04-5c81ddfa3a11" containerName="mariadb-account-create-update" Mar 09 14:24:53 crc kubenswrapper[4722]: E0309 14:24:53.650180 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b9c74d7-fa1d-4c29-8875-47d506627d77" containerName="mariadb-database-create" Mar 09 14:24:53 crc kubenswrapper[4722]: I0309 14:24:53.650187 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b9c74d7-fa1d-4c29-8875-47d506627d77" containerName="mariadb-database-create" Mar 09 14:24:53 crc kubenswrapper[4722]: I0309 14:24:53.650421 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b9c74d7-fa1d-4c29-8875-47d506627d77" containerName="mariadb-database-create" Mar 09 14:24:53 crc kubenswrapper[4722]: I0309 14:24:53.650446 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d162f3c-13b0-4ec3-955b-5a1bf804c61c" containerName="mariadb-account-create-update" Mar 09 14:24:53 crc kubenswrapper[4722]: I0309 14:24:53.650463 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bbf7c99-d896-4ccc-ac04-5c81ddfa3a11" containerName="mariadb-account-create-update" Mar 09 14:24:53 crc kubenswrapper[4722]: I0309 14:24:53.651148 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-0" Mar 09 14:24:53 crc kubenswrapper[4722]: I0309 14:24:53.660242 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-config-data" Mar 09 14:24:53 crc kubenswrapper[4722]: I0309 14:24:53.674514 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Mar 09 14:24:53 crc kubenswrapper[4722]: I0309 14:24:53.817669 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdb981de-3376-4da4-834d-ae2446f02b8e-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"cdb981de-3376-4da4-834d-ae2446f02b8e\") " pod="openstack/mysqld-exporter-0" Mar 09 14:24:53 crc kubenswrapper[4722]: I0309 14:24:53.817974 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9v7s7\" (UniqueName: \"kubernetes.io/projected/cdb981de-3376-4da4-834d-ae2446f02b8e-kube-api-access-9v7s7\") pod \"mysqld-exporter-0\" (UID: \"cdb981de-3376-4da4-834d-ae2446f02b8e\") " pod="openstack/mysqld-exporter-0" Mar 09 14:24:53 crc kubenswrapper[4722]: I0309 14:24:53.818131 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdb981de-3376-4da4-834d-ae2446f02b8e-config-data\") pod \"mysqld-exporter-0\" (UID: \"cdb981de-3376-4da4-834d-ae2446f02b8e\") " pod="openstack/mysqld-exporter-0" Mar 09 14:24:53 crc kubenswrapper[4722]: I0309 14:24:53.920162 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdb981de-3376-4da4-834d-ae2446f02b8e-config-data\") pod \"mysqld-exporter-0\" (UID: \"cdb981de-3376-4da4-834d-ae2446f02b8e\") " pod="openstack/mysqld-exporter-0" Mar 09 14:24:53 crc kubenswrapper[4722]: I0309 14:24:53.920242 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdb981de-3376-4da4-834d-ae2446f02b8e-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"cdb981de-3376-4da4-834d-ae2446f02b8e\") " pod="openstack/mysqld-exporter-0" Mar 09 14:24:53 crc kubenswrapper[4722]: I0309 14:24:53.920322 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9v7s7\" (UniqueName: \"kubernetes.io/projected/cdb981de-3376-4da4-834d-ae2446f02b8e-kube-api-access-9v7s7\") pod \"mysqld-exporter-0\" (UID: \"cdb981de-3376-4da4-834d-ae2446f02b8e\") " pod="openstack/mysqld-exporter-0" Mar 09 14:24:53 crc kubenswrapper[4722]: I0309 14:24:53.928925 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdb981de-3376-4da4-834d-ae2446f02b8e-config-data\") pod \"mysqld-exporter-0\" (UID: \"cdb981de-3376-4da4-834d-ae2446f02b8e\") " pod="openstack/mysqld-exporter-0" Mar 09 14:24:53 crc kubenswrapper[4722]: I0309 14:24:53.929478 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdb981de-3376-4da4-834d-ae2446f02b8e-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"cdb981de-3376-4da4-834d-ae2446f02b8e\") " pod="openstack/mysqld-exporter-0" Mar 09 14:24:53 crc kubenswrapper[4722]: I0309 14:24:53.944583 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9v7s7\" (UniqueName: 
\"kubernetes.io/projected/cdb981de-3376-4da4-834d-ae2446f02b8e-kube-api-access-9v7s7\") pod \"mysqld-exporter-0\" (UID: \"cdb981de-3376-4da4-834d-ae2446f02b8e\") " pod="openstack/mysqld-exporter-0" Mar 09 14:24:53 crc kubenswrapper[4722]: I0309 14:24:53.976826 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Mar 09 14:24:54 crc kubenswrapper[4722]: I0309 14:24:54.183367 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-dxsxq"] Mar 09 14:24:54 crc kubenswrapper[4722]: I0309 14:24:54.185057 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-dxsxq" Mar 09 14:24:54 crc kubenswrapper[4722]: I0309 14:24:54.190246 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 09 14:24:54 crc kubenswrapper[4722]: I0309 14:24:54.196720 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-dxsxq"] Mar 09 14:24:54 crc kubenswrapper[4722]: I0309 14:24:54.226020 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4s6h6\" (UniqueName: \"kubernetes.io/projected/10519e91-e280-418b-947a-114e2696e8a8-kube-api-access-4s6h6\") pod \"root-account-create-update-dxsxq\" (UID: \"10519e91-e280-418b-947a-114e2696e8a8\") " pod="openstack/root-account-create-update-dxsxq" Mar 09 14:24:54 crc kubenswrapper[4722]: I0309 14:24:54.226235 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/10519e91-e280-418b-947a-114e2696e8a8-operator-scripts\") pod \"root-account-create-update-dxsxq\" (UID: \"10519e91-e280-418b-947a-114e2696e8a8\") " pod="openstack/root-account-create-update-dxsxq" Mar 09 14:24:54 crc kubenswrapper[4722]: I0309 14:24:54.327882 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/10519e91-e280-418b-947a-114e2696e8a8-operator-scripts\") pod \"root-account-create-update-dxsxq\" (UID: \"10519e91-e280-418b-947a-114e2696e8a8\") " pod="openstack/root-account-create-update-dxsxq" Mar 09 14:24:54 crc kubenswrapper[4722]: I0309 14:24:54.327987 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4s6h6\" (UniqueName: \"kubernetes.io/projected/10519e91-e280-418b-947a-114e2696e8a8-kube-api-access-4s6h6\") pod \"root-account-create-update-dxsxq\" (UID: \"10519e91-e280-418b-947a-114e2696e8a8\") " pod="openstack/root-account-create-update-dxsxq" Mar 09 14:24:54 crc kubenswrapper[4722]: I0309 14:24:54.328766 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/10519e91-e280-418b-947a-114e2696e8a8-operator-scripts\") pod \"root-account-create-update-dxsxq\" (UID: \"10519e91-e280-418b-947a-114e2696e8a8\") " pod="openstack/root-account-create-update-dxsxq" Mar 09 14:24:54 crc kubenswrapper[4722]: I0309 14:24:54.355374 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4s6h6\" (UniqueName: \"kubernetes.io/projected/10519e91-e280-418b-947a-114e2696e8a8-kube-api-access-4s6h6\") pod \"root-account-create-update-dxsxq\" (UID: \"10519e91-e280-418b-947a-114e2696e8a8\") " pod="openstack/root-account-create-update-dxsxq" Mar 09 14:24:54 crc 
kubenswrapper[4722]: I0309 14:24:54.562791 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-dxsxq" Mar 09 14:24:56 crc kubenswrapper[4722]: I0309 14:24:56.297245 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-b8gzx" podUID="32bc4279-b6a2-4846-801c-ddf3a01db8b2" containerName="ovn-controller" probeResult="failure" output=< Mar 09 14:24:56 crc kubenswrapper[4722]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 09 14:24:56 crc kubenswrapper[4722]: > Mar 09 14:24:56 crc kubenswrapper[4722]: I0309 14:24:56.477850 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7463e84f-f457-4409-9621-507d331e06b5-etc-swift\") pod \"swift-storage-0\" (UID: \"7463e84f-f457-4409-9621-507d331e06b5\") " pod="openstack/swift-storage-0" Mar 09 14:24:56 crc kubenswrapper[4722]: I0309 14:24:56.484955 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7463e84f-f457-4409-9621-507d331e06b5-etc-swift\") pod \"swift-storage-0\" (UID: \"7463e84f-f457-4409-9621-507d331e06b5\") " pod="openstack/swift-storage-0" Mar 09 14:24:56 crc kubenswrapper[4722]: I0309 14:24:56.558312 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 09 14:24:57 crc kubenswrapper[4722]: I0309 14:24:57.018180 4722 patch_prober.go:28] interesting pod/console-78fdf7cd4f-mt82k container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.91:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 14:24:57 crc kubenswrapper[4722]: I0309 14:24:57.018569 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-78fdf7cd4f-mt82k" podUID="aa31f801-ed80-405f-960c-74c254d4f9ca" containerName="console" probeResult="failure" output="Get \"https://10.217.0.91:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 14:24:58 crc kubenswrapper[4722]: I0309 14:24:58.257268 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="1c98e541-4b72-465d-8799-89e8c9791c3e" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.133:5671: connect: connection refused" Mar 09 14:24:58 crc kubenswrapper[4722]: I0309 14:24:58.377255 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="6f4e007a-4a18-40e6-bf96-4a751e00cd73" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.134:5671: connect: connection refused" Mar 09 14:24:58 crc kubenswrapper[4722]: I0309 14:24:58.414933 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-2" podUID="17ce7999-f86f-45fa-ae07-785f70d797a1" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.135:5671: connect: connection refused" Mar 09 14:24:58 crc kubenswrapper[4722]: I0309 14:24:58.432384 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-1" podUID="c6ada086-becc-4f4a-a0a0-0aad894dc550" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.136:5671: connect: connection refused" Mar 09 14:24:59 crc 
kubenswrapper[4722]: I0309 14:24:59.192718 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-78fdf7cd4f-mt82k_aa31f801-ed80-405f-960c-74c254d4f9ca/console/0.log" Mar 09 14:24:59 crc kubenswrapper[4722]: I0309 14:24:59.193411 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-78fdf7cd4f-mt82k" Mar 09 14:24:59 crc kubenswrapper[4722]: I0309 14:24:59.238055 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/aa31f801-ed80-405f-960c-74c254d4f9ca-console-oauth-config\") pod \"aa31f801-ed80-405f-960c-74c254d4f9ca\" (UID: \"aa31f801-ed80-405f-960c-74c254d4f9ca\") " Mar 09 14:24:59 crc kubenswrapper[4722]: I0309 14:24:59.238174 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/aa31f801-ed80-405f-960c-74c254d4f9ca-service-ca\") pod \"aa31f801-ed80-405f-960c-74c254d4f9ca\" (UID: \"aa31f801-ed80-405f-960c-74c254d4f9ca\") " Mar 09 14:24:59 crc kubenswrapper[4722]: I0309 14:24:59.238245 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/aa31f801-ed80-405f-960c-74c254d4f9ca-oauth-serving-cert\") pod \"aa31f801-ed80-405f-960c-74c254d4f9ca\" (UID: \"aa31f801-ed80-405f-960c-74c254d4f9ca\") " Mar 09 14:24:59 crc kubenswrapper[4722]: I0309 14:24:59.238343 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa31f801-ed80-405f-960c-74c254d4f9ca-trusted-ca-bundle\") pod \"aa31f801-ed80-405f-960c-74c254d4f9ca\" (UID: \"aa31f801-ed80-405f-960c-74c254d4f9ca\") " Mar 09 14:24:59 crc kubenswrapper[4722]: I0309 14:24:59.238382 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpqsr\" (UniqueName: \"kubernetes.io/projected/aa31f801-ed80-405f-960c-74c254d4f9ca-kube-api-access-bpqsr\") pod \"aa31f801-ed80-405f-960c-74c254d4f9ca\" (UID: \"aa31f801-ed80-405f-960c-74c254d4f9ca\") " Mar 09 14:24:59 crc kubenswrapper[4722]: I0309 14:24:59.238484 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/aa31f801-ed80-405f-960c-74c254d4f9ca-console-serving-cert\") pod \"aa31f801-ed80-405f-960c-74c254d4f9ca\" (UID: \"aa31f801-ed80-405f-960c-74c254d4f9ca\") " Mar 09 14:24:59 crc kubenswrapper[4722]: I0309 14:24:59.238535 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/aa31f801-ed80-405f-960c-74c254d4f9ca-console-config\") pod \"aa31f801-ed80-405f-960c-74c254d4f9ca\" (UID: \"aa31f801-ed80-405f-960c-74c254d4f9ca\") " Mar 09 14:24:59 crc kubenswrapper[4722]: I0309 14:24:59.240882 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa31f801-ed80-405f-960c-74c254d4f9ca-console-config" (OuterVolumeSpecName: "console-config") pod "aa31f801-ed80-405f-960c-74c254d4f9ca" (UID: "aa31f801-ed80-405f-960c-74c254d4f9ca"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:24:59 crc kubenswrapper[4722]: I0309 14:24:59.240914 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa31f801-ed80-405f-960c-74c254d4f9ca-service-ca" (OuterVolumeSpecName: "service-ca") pod "aa31f801-ed80-405f-960c-74c254d4f9ca" (UID: "aa31f801-ed80-405f-960c-74c254d4f9ca"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:24:59 crc kubenswrapper[4722]: I0309 14:24:59.242656 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa31f801-ed80-405f-960c-74c254d4f9ca-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "aa31f801-ed80-405f-960c-74c254d4f9ca" (UID: "aa31f801-ed80-405f-960c-74c254d4f9ca"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:24:59 crc kubenswrapper[4722]: I0309 14:24:59.243059 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa31f801-ed80-405f-960c-74c254d4f9ca-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "aa31f801-ed80-405f-960c-74c254d4f9ca" (UID: "aa31f801-ed80-405f-960c-74c254d4f9ca"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:24:59 crc kubenswrapper[4722]: I0309 14:24:59.245288 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa31f801-ed80-405f-960c-74c254d4f9ca-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "aa31f801-ed80-405f-960c-74c254d4f9ca" (UID: "aa31f801-ed80-405f-960c-74c254d4f9ca"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:24:59 crc kubenswrapper[4722]: I0309 14:24:59.245332 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa31f801-ed80-405f-960c-74c254d4f9ca-kube-api-access-bpqsr" (OuterVolumeSpecName: "kube-api-access-bpqsr") pod "aa31f801-ed80-405f-960c-74c254d4f9ca" (UID: "aa31f801-ed80-405f-960c-74c254d4f9ca"). InnerVolumeSpecName "kube-api-access-bpqsr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:24:59 crc kubenswrapper[4722]: I0309 14:24:59.246430 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa31f801-ed80-405f-960c-74c254d4f9ca-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "aa31f801-ed80-405f-960c-74c254d4f9ca" (UID: "aa31f801-ed80-405f-960c-74c254d4f9ca"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:24:59 crc kubenswrapper[4722]: I0309 14:24:59.252439 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-78fdf7cd4f-mt82k_aa31f801-ed80-405f-960c-74c254d4f9ca/console/0.log" Mar 09 14:24:59 crc kubenswrapper[4722]: I0309 14:24:59.252489 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-78fdf7cd4f-mt82k" event={"ID":"aa31f801-ed80-405f-960c-74c254d4f9ca","Type":"ContainerDied","Data":"06afa867c517b762f56818737eb26daa1472d479f2b7b9888aad848b5ed83f4c"} Mar 09 14:24:59 crc kubenswrapper[4722]: I0309 14:24:59.252525 4722 scope.go:117] "RemoveContainer" containerID="e88aefd45843603369ebe8bd286ece8982c0ae3f0cd8e6b50b0d085a2c5d1930" Mar 09 14:24:59 crc kubenswrapper[4722]: I0309 14:24:59.252560 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-78fdf7cd4f-mt82k" Mar 09 14:24:59 crc kubenswrapper[4722]: I0309 14:24:59.300500 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-78fdf7cd4f-mt82k"] Mar 09 14:24:59 crc kubenswrapper[4722]: I0309 14:24:59.311897 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-78fdf7cd4f-mt82k"] Mar 09 14:24:59 crc kubenswrapper[4722]: I0309 14:24:59.340905 4722 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/aa31f801-ed80-405f-960c-74c254d4f9ca-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 14:24:59 crc kubenswrapper[4722]: I0309 14:24:59.340926 4722 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/aa31f801-ed80-405f-960c-74c254d4f9ca-console-config\") on node \"crc\" DevicePath \"\"" Mar 09 14:24:59 crc kubenswrapper[4722]: I0309 14:24:59.340934 4722 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/aa31f801-ed80-405f-960c-74c254d4f9ca-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 09 14:24:59 crc kubenswrapper[4722]: I0309 14:24:59.340943 4722 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/aa31f801-ed80-405f-960c-74c254d4f9ca-service-ca\") on node \"crc\" DevicePath \"\"" Mar 09 14:24:59 crc kubenswrapper[4722]: I0309 14:24:59.340955 4722 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/aa31f801-ed80-405f-960c-74c254d4f9ca-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 09 14:24:59 crc kubenswrapper[4722]: I0309 14:24:59.340964 4722 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa31f801-ed80-405f-960c-74c254d4f9ca-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:24:59 crc kubenswrapper[4722]: I0309 14:24:59.340972 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bpqsr\" (UniqueName: \"kubernetes.io/projected/aa31f801-ed80-405f-960c-74c254d4f9ca-kube-api-access-bpqsr\") on node \"crc\" DevicePath \"\"" Mar 09 14:24:59 crc kubenswrapper[4722]: I0309 14:24:59.551585 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Mar 09 14:24:59 crc kubenswrapper[4722]: W0309 14:24:59.569815 4722 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcdb981de_3376_4da4_834d_ae2446f02b8e.slice/crio-acb467c49c3d1959f49b618ffd68b5618a2943275f9f2a4bc3d5ebb9d50257f3 WatchSource:0}: Error finding container acb467c49c3d1959f49b618ffd68b5618a2943275f9f2a4bc3d5ebb9d50257f3: Status 404 returned error can't find the container with id acb467c49c3d1959f49b618ffd68b5618a2943275f9f2a4bc3d5ebb9d50257f3 Mar 09 14:24:59 crc kubenswrapper[4722]: I0309 14:24:59.763577 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-dxsxq"] Mar 09 14:24:59 crc kubenswrapper[4722]: I0309 14:24:59.785013 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 09 14:25:00 crc kubenswrapper[4722]: I0309 14:25:00.162070 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa31f801-ed80-405f-960c-74c254d4f9ca" path="/var/lib/kubelet/pods/aa31f801-ed80-405f-960c-74c254d4f9ca/volumes" Mar 09 14:25:00 crc kubenswrapper[4722]: I0309 14:25:00.266944 4722 generic.go:334] "Generic (PLEG): container finished" podID="10519e91-e280-418b-947a-114e2696e8a8" containerID="79e69e9857ff6bcf1ff3a0ee629adedb582f12c826219910c63e7b4fe8b7365e" exitCode=0 Mar 09 14:25:00 crc kubenswrapper[4722]: I0309 14:25:00.267009 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-dxsxq" event={"ID":"10519e91-e280-418b-947a-114e2696e8a8","Type":"ContainerDied","Data":"79e69e9857ff6bcf1ff3a0ee629adedb582f12c826219910c63e7b4fe8b7365e"} Mar 09 14:25:00 crc kubenswrapper[4722]: I0309 14:25:00.267038 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-dxsxq" event={"ID":"10519e91-e280-418b-947a-114e2696e8a8","Type":"ContainerStarted","Data":"d7301e66929a5eff4694f8801e174ff0935c965a811738805d6c98d40f538a13"} Mar 09 14:25:00 crc kubenswrapper[4722]: I0309 14:25:00.270097 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e0db26a0-2877-48fa-b706-b5558f9973d5","Type":"ContainerStarted","Data":"255b938bfd69ad600dd36a900edecc9cf5d23939501d8d23e0bda82feb284e0e"} Mar 09 14:25:00 crc kubenswrapper[4722]: I0309 14:25:00.271561 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"cdb981de-3376-4da4-834d-ae2446f02b8e","Type":"ContainerStarted","Data":"acb467c49c3d1959f49b618ffd68b5618a2943275f9f2a4bc3d5ebb9d50257f3"} Mar 09 14:25:00 crc kubenswrapper[4722]: I0309 14:25:00.274375 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-rl9l9" event={"ID":"17f0c1f6-4aea-4ada-aaec-3493cec60053","Type":"ContainerStarted","Data":"362a3baf5e0261fd5426cdde6dead9ccd0a206479e0b9484ca198bd9bce59b63"} Mar 09 14:25:00 crc kubenswrapper[4722]: I0309 14:25:00.280877 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7463e84f-f457-4409-9621-507d331e06b5","Type":"ContainerStarted","Data":"e11658d79422a1a074bb9b1a3ec5ce8317f19d61c705bffc4a3f2498b311d3c3"} Mar 09 14:25:00 crc kubenswrapper[4722]: I0309 14:25:00.308812 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-rl9l9" podStartSLOduration=2.846011307 podStartE2EDuration="19.308795132s" podCreationTimestamp="2026-03-09 14:24:41 +0000 UTC" firstStartedPulling="2026-03-09 14:24:42.539317448 +0000 UTC m=+1323.094886014" lastFinishedPulling="2026-03-09 14:24:59.002101263 +0000 UTC m=+1339.557669839" 
observedRunningTime="2026-03-09 14:25:00.302455626 +0000 UTC m=+1340.858024202" watchObservedRunningTime="2026-03-09 14:25:00.308795132 +0000 UTC m=+1340.864363708" Mar 09 14:25:00 crc kubenswrapper[4722]: I0309 14:25:00.348286 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=23.518220691 podStartE2EDuration="1m7.348264255s" podCreationTimestamp="2026-03-09 14:23:53 +0000 UTC" firstStartedPulling="2026-03-09 14:24:15.170584479 +0000 UTC m=+1295.726153055" lastFinishedPulling="2026-03-09 14:24:59.000628033 +0000 UTC m=+1339.556196619" observedRunningTime="2026-03-09 14:25:00.337496807 +0000 UTC m=+1340.893065383" watchObservedRunningTime="2026-03-09 14:25:00.348264255 +0000 UTC m=+1340.903832831" Mar 09 14:25:01 crc kubenswrapper[4722]: I0309 14:25:01.286884 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-b8gzx" podUID="32bc4279-b6a2-4846-801c-ddf3a01db8b2" containerName="ovn-controller" probeResult="failure" output=< Mar 09 14:25:01 crc kubenswrapper[4722]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 09 14:25:01 crc kubenswrapper[4722]: > Mar 09 14:25:01 crc kubenswrapper[4722]: I0309 14:25:01.326169 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-k6ng6" Mar 09 14:25:01 crc kubenswrapper[4722]: I0309 14:25:01.335618 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-k6ng6" Mar 09 14:25:01 crc kubenswrapper[4722]: I0309 14:25:01.566384 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-b8gzx-config-p4922"] Mar 09 14:25:01 crc kubenswrapper[4722]: E0309 14:25:01.567079 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa31f801-ed80-405f-960c-74c254d4f9ca" containerName="console" Mar 09 14:25:01 crc kubenswrapper[4722]: I0309 14:25:01.567092 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa31f801-ed80-405f-960c-74c254d4f9ca" containerName="console" Mar 09 14:25:01 crc kubenswrapper[4722]: I0309 14:25:01.567379 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa31f801-ed80-405f-960c-74c254d4f9ca" containerName="console" Mar 09 14:25:01 crc kubenswrapper[4722]: I0309 14:25:01.568117 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-b8gzx-config-p4922" Mar 09 14:25:01 crc kubenswrapper[4722]: I0309 14:25:01.574649 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 09 14:25:01 crc kubenswrapper[4722]: I0309 14:25:01.589656 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-b8gzx-config-p4922"] Mar 09 14:25:01 crc kubenswrapper[4722]: I0309 14:25:01.695302 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvshn\" (UniqueName: \"kubernetes.io/projected/f615c5ce-b7dd-4206-9bc6-98630f704a4d-kube-api-access-fvshn\") pod \"ovn-controller-b8gzx-config-p4922\" (UID: \"f615c5ce-b7dd-4206-9bc6-98630f704a4d\") " pod="openstack/ovn-controller-b8gzx-config-p4922" Mar 09 14:25:01 crc kubenswrapper[4722]: I0309 14:25:01.695429 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f615c5ce-b7dd-4206-9bc6-98630f704a4d-scripts\") pod \"ovn-controller-b8gzx-config-p4922\" (UID: \"f615c5ce-b7dd-4206-9bc6-98630f704a4d\") " pod="openstack/ovn-controller-b8gzx-config-p4922" Mar 09 14:25:01 crc kubenswrapper[4722]: I0309 14:25:01.695464 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f615c5ce-b7dd-4206-9bc6-98630f704a4d-var-run-ovn\") pod \"ovn-controller-b8gzx-config-p4922\" (UID: \"f615c5ce-b7dd-4206-9bc6-98630f704a4d\") " pod="openstack/ovn-controller-b8gzx-config-p4922" Mar 09 14:25:01 crc kubenswrapper[4722]: I0309 14:25:01.695496 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f615c5ce-b7dd-4206-9bc6-98630f704a4d-var-run\") pod \"ovn-controller-b8gzx-config-p4922\" (UID: \"f615c5ce-b7dd-4206-9bc6-98630f704a4d\") " pod="openstack/ovn-controller-b8gzx-config-p4922" Mar 09 14:25:01 crc kubenswrapper[4722]: I0309 14:25:01.695653 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f615c5ce-b7dd-4206-9bc6-98630f704a4d-var-log-ovn\") pod \"ovn-controller-b8gzx-config-p4922\" (UID: \"f615c5ce-b7dd-4206-9bc6-98630f704a4d\") " pod="openstack/ovn-controller-b8gzx-config-p4922" Mar 09 14:25:01 crc kubenswrapper[4722]: I0309 14:25:01.695889 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f615c5ce-b7dd-4206-9bc6-98630f704a4d-additional-scripts\") pod \"ovn-controller-b8gzx-config-p4922\" (UID: \"f615c5ce-b7dd-4206-9bc6-98630f704a4d\") " pod="openstack/ovn-controller-b8gzx-config-p4922" Mar 09 14:25:01 crc kubenswrapper[4722]: I0309 14:25:01.797557 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvshn\" (UniqueName: \"kubernetes.io/projected/f615c5ce-b7dd-4206-9bc6-98630f704a4d-kube-api-access-fvshn\") pod \"ovn-controller-b8gzx-config-p4922\" (UID: \"f615c5ce-b7dd-4206-9bc6-98630f704a4d\") " pod="openstack/ovn-controller-b8gzx-config-p4922" Mar 09 14:25:01 crc kubenswrapper[4722]: I0309 14:25:01.797668 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/f615c5ce-b7dd-4206-9bc6-98630f704a4d-scripts\") pod \"ovn-controller-b8gzx-config-p4922\" (UID: \"f615c5ce-b7dd-4206-9bc6-98630f704a4d\") " pod="openstack/ovn-controller-b8gzx-config-p4922" Mar 09 14:25:01 crc kubenswrapper[4722]: I0309 14:25:01.797693 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f615c5ce-b7dd-4206-9bc6-98630f704a4d-var-run-ovn\") pod \"ovn-controller-b8gzx-config-p4922\" (UID: \"f615c5ce-b7dd-4206-9bc6-98630f704a4d\") " pod="openstack/ovn-controller-b8gzx-config-p4922" Mar 09 14:25:01 crc kubenswrapper[4722]: I0309 14:25:01.797713 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f615c5ce-b7dd-4206-9bc6-98630f704a4d-var-run\") pod \"ovn-controller-b8gzx-config-p4922\" (UID: \"f615c5ce-b7dd-4206-9bc6-98630f704a4d\") " pod="openstack/ovn-controller-b8gzx-config-p4922" Mar 09 14:25:01 crc kubenswrapper[4722]: I0309 14:25:01.797772 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f615c5ce-b7dd-4206-9bc6-98630f704a4d-var-log-ovn\") pod \"ovn-controller-b8gzx-config-p4922\" (UID: \"f615c5ce-b7dd-4206-9bc6-98630f704a4d\") " pod="openstack/ovn-controller-b8gzx-config-p4922" Mar 09 14:25:01 crc kubenswrapper[4722]: I0309 14:25:01.797852 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f615c5ce-b7dd-4206-9bc6-98630f704a4d-additional-scripts\") pod \"ovn-controller-b8gzx-config-p4922\" (UID: \"f615c5ce-b7dd-4206-9bc6-98630f704a4d\") " pod="openstack/ovn-controller-b8gzx-config-p4922" Mar 09 14:25:01 crc kubenswrapper[4722]: I0309 14:25:01.798435 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f615c5ce-b7dd-4206-9bc6-98630f704a4d-var-run\") pod \"ovn-controller-b8gzx-config-p4922\" (UID: \"f615c5ce-b7dd-4206-9bc6-98630f704a4d\") " pod="openstack/ovn-controller-b8gzx-config-p4922" Mar 09 14:25:01 crc kubenswrapper[4722]: I0309 14:25:01.798445 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f615c5ce-b7dd-4206-9bc6-98630f704a4d-var-log-ovn\") pod \"ovn-controller-b8gzx-config-p4922\" (UID: \"f615c5ce-b7dd-4206-9bc6-98630f704a4d\") " pod="openstack/ovn-controller-b8gzx-config-p4922" Mar 09 14:25:01 crc kubenswrapper[4722]: I0309 14:25:01.798576 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f615c5ce-b7dd-4206-9bc6-98630f704a4d-var-run-ovn\") pod \"ovn-controller-b8gzx-config-p4922\" (UID: \"f615c5ce-b7dd-4206-9bc6-98630f704a4d\") " pod="openstack/ovn-controller-b8gzx-config-p4922" Mar 09 14:25:01 crc kubenswrapper[4722]: I0309 14:25:01.802800 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f615c5ce-b7dd-4206-9bc6-98630f704a4d-additional-scripts\") pod \"ovn-controller-b8gzx-config-p4922\" (UID: \"f615c5ce-b7dd-4206-9bc6-98630f704a4d\") " pod="openstack/ovn-controller-b8gzx-config-p4922" Mar 09 14:25:01 crc kubenswrapper[4722]: I0309 14:25:01.804112 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/f615c5ce-b7dd-4206-9bc6-98630f704a4d-scripts\") pod \"ovn-controller-b8gzx-config-p4922\" (UID: \"f615c5ce-b7dd-4206-9bc6-98630f704a4d\") " pod="openstack/ovn-controller-b8gzx-config-p4922" Mar 09 14:25:01 crc kubenswrapper[4722]: I0309 14:25:01.815039 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvshn\" (UniqueName: \"kubernetes.io/projected/f615c5ce-b7dd-4206-9bc6-98630f704a4d-kube-api-access-fvshn\") pod \"ovn-controller-b8gzx-config-p4922\" (UID: \"f615c5ce-b7dd-4206-9bc6-98630f704a4d\") " pod="openstack/ovn-controller-b8gzx-config-p4922" Mar 09 14:25:01 crc kubenswrapper[4722]: I0309 14:25:01.920965 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-b8gzx-config-p4922" Mar 09 14:25:01 crc kubenswrapper[4722]: I0309 14:25:01.944602 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-dxsxq" Mar 09 14:25:02 crc kubenswrapper[4722]: I0309 14:25:02.104982 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/10519e91-e280-418b-947a-114e2696e8a8-operator-scripts\") pod \"10519e91-e280-418b-947a-114e2696e8a8\" (UID: \"10519e91-e280-418b-947a-114e2696e8a8\") " Mar 09 14:25:02 crc kubenswrapper[4722]: I0309 14:25:02.105116 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4s6h6\" (UniqueName: \"kubernetes.io/projected/10519e91-e280-418b-947a-114e2696e8a8-kube-api-access-4s6h6\") pod \"10519e91-e280-418b-947a-114e2696e8a8\" (UID: \"10519e91-e280-418b-947a-114e2696e8a8\") " Mar 09 14:25:02 crc kubenswrapper[4722]: I0309 14:25:02.105636 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10519e91-e280-418b-947a-114e2696e8a8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "10519e91-e280-418b-947a-114e2696e8a8" (UID: "10519e91-e280-418b-947a-114e2696e8a8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:25:02 crc kubenswrapper[4722]: I0309 14:25:02.105957 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/10519e91-e280-418b-947a-114e2696e8a8-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 14:25:02 crc kubenswrapper[4722]: I0309 14:25:02.108378 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10519e91-e280-418b-947a-114e2696e8a8-kube-api-access-4s6h6" (OuterVolumeSpecName: "kube-api-access-4s6h6") pod "10519e91-e280-418b-947a-114e2696e8a8" (UID: "10519e91-e280-418b-947a-114e2696e8a8"). InnerVolumeSpecName "kube-api-access-4s6h6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:25:02 crc kubenswrapper[4722]: I0309 14:25:02.207676 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4s6h6\" (UniqueName: \"kubernetes.io/projected/10519e91-e280-418b-947a-114e2696e8a8-kube-api-access-4s6h6\") on node \"crc\" DevicePath \"\"" Mar 09 14:25:02 crc kubenswrapper[4722]: I0309 14:25:02.311880 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-dxsxq" event={"ID":"10519e91-e280-418b-947a-114e2696e8a8","Type":"ContainerDied","Data":"d7301e66929a5eff4694f8801e174ff0935c965a811738805d6c98d40f538a13"} Mar 09 14:25:02 crc kubenswrapper[4722]: I0309 14:25:02.311924 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d7301e66929a5eff4694f8801e174ff0935c965a811738805d6c98d40f538a13" Mar 09 14:25:02 crc kubenswrapper[4722]: I0309 14:25:02.311976 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-dxsxq" Mar 09 14:25:02 crc kubenswrapper[4722]: I0309 14:25:02.313892 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"cdb981de-3376-4da4-834d-ae2446f02b8e","Type":"ContainerStarted","Data":"a923ea13e4da33d77768149e9e6f1a1366d9961f621630001d7b5d50fdd6ce08"} Mar 09 14:25:02 crc kubenswrapper[4722]: I0309 14:25:02.322674 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7463e84f-f457-4409-9621-507d331e06b5","Type":"ContainerStarted","Data":"1ce1aaa580096fdb9749ab84338e7fe086692902ac7be5d074d5835ef26e764f"} Mar 09 14:25:02 crc kubenswrapper[4722]: I0309 14:25:02.324821 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7463e84f-f457-4409-9621-507d331e06b5","Type":"ContainerStarted","Data":"14ca28f3e4b4054adc5670e66baf5879b1163303e9b63422d5518d3e86e7a0ef"} Mar 09 14:25:02 crc kubenswrapper[4722]: I0309 14:25:02.342373 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-0" podStartSLOduration=7.446133275 podStartE2EDuration="9.342347265s" podCreationTimestamp="2026-03-09 14:24:53 +0000 UTC" firstStartedPulling="2026-03-09 14:24:59.572565638 +0000 UTC m=+1340.128134214" lastFinishedPulling="2026-03-09 14:25:01.468779628 +0000 UTC m=+1342.024348204" observedRunningTime="2026-03-09 14:25:02.334617941 +0000 UTC m=+1342.890186517" watchObservedRunningTime="2026-03-09 14:25:02.342347265 +0000 UTC m=+1342.897915851" Mar 09 14:25:02 crc kubenswrapper[4722]: I0309 14:25:02.446688 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-b8gzx-config-p4922"] Mar 09 14:25:02 crc kubenswrapper[4722]: W0309 14:25:02.464031 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf615c5ce_b7dd_4206_9bc6_98630f704a4d.slice/crio-ddb283750e21be12e3f21138108bd7b8b5352042976cd071df16dcc0f70b7abd WatchSource:0}: Error finding container ddb283750e21be12e3f21138108bd7b8b5352042976cd071df16dcc0f70b7abd: Status 404 returned error can't find the container with id ddb283750e21be12e3f21138108bd7b8b5352042976cd071df16dcc0f70b7abd Mar 09 14:25:03 crc kubenswrapper[4722]: I0309 14:25:03.338623 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"7463e84f-f457-4409-9621-507d331e06b5","Type":"ContainerStarted","Data":"dd4d4038c352dcd5e2b75924b63a325fec720046b520742e254b6e7a753c72a9"} Mar 09 14:25:03 crc kubenswrapper[4722]: I0309 14:25:03.339064 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7463e84f-f457-4409-9621-507d331e06b5","Type":"ContainerStarted","Data":"086bfa47b38e04f792f2907daeb26fb064c28bf348df8f2d29a62cac2fac6f91"} Mar 09 14:25:03 crc kubenswrapper[4722]: I0309 14:25:03.340615 4722 generic.go:334] "Generic (PLEG): container finished" podID="f615c5ce-b7dd-4206-9bc6-98630f704a4d" containerID="2fa001aeecbb2e4d0fbf27e36b5acac60963a831415d2c1be0bb7d705ff66e2e" exitCode=0 Mar 09 14:25:03 crc kubenswrapper[4722]: I0309 14:25:03.340672 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-b8gzx-config-p4922" event={"ID":"f615c5ce-b7dd-4206-9bc6-98630f704a4d","Type":"ContainerDied","Data":"2fa001aeecbb2e4d0fbf27e36b5acac60963a831415d2c1be0bb7d705ff66e2e"} Mar 09 14:25:03 crc kubenswrapper[4722]: I0309 14:25:03.340728 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-b8gzx-config-p4922" event={"ID":"f615c5ce-b7dd-4206-9bc6-98630f704a4d","Type":"ContainerStarted","Data":"ddb283750e21be12e3f21138108bd7b8b5352042976cd071df16dcc0f70b7abd"} Mar 09 14:25:04 crc kubenswrapper[4722]: I0309 14:25:04.361671 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7463e84f-f457-4409-9621-507d331e06b5","Type":"ContainerStarted","Data":"7eb4b29cf00108ed93b6f25691006ee41f998adc224e447b7e65df24f3e083fc"} Mar 09 14:25:04 crc kubenswrapper[4722]: I0309 14:25:04.362046 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7463e84f-f457-4409-9621-507d331e06b5","Type":"ContainerStarted","Data":"f0592101978792babcb1c624e1d356c0af4d7af0020e8358ff365065ee6b5e79"} Mar 09 14:25:04 crc kubenswrapper[4722]: I0309 14:25:04.833936 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Mar 09 14:25:04 crc kubenswrapper[4722]: I0309 14:25:04.938172 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-b8gzx-config-p4922" Mar 09 14:25:05 crc kubenswrapper[4722]: I0309 14:25:05.000691 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f615c5ce-b7dd-4206-9bc6-98630f704a4d-scripts\") pod \"f615c5ce-b7dd-4206-9bc6-98630f704a4d\" (UID: \"f615c5ce-b7dd-4206-9bc6-98630f704a4d\") " Mar 09 14:25:05 crc kubenswrapper[4722]: I0309 14:25:05.000759 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f615c5ce-b7dd-4206-9bc6-98630f704a4d-additional-scripts\") pod \"f615c5ce-b7dd-4206-9bc6-98630f704a4d\" (UID: \"f615c5ce-b7dd-4206-9bc6-98630f704a4d\") " Mar 09 14:25:05 crc kubenswrapper[4722]: I0309 14:25:05.000835 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f615c5ce-b7dd-4206-9bc6-98630f704a4d-var-run\") pod \"f615c5ce-b7dd-4206-9bc6-98630f704a4d\" (UID: \"f615c5ce-b7dd-4206-9bc6-98630f704a4d\") " Mar 09 14:25:05 crc kubenswrapper[4722]: I0309 14:25:05.000974 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvshn\" (UniqueName: \"kubernetes.io/projected/f615c5ce-b7dd-4206-9bc6-98630f704a4d-kube-api-access-fvshn\") pod \"f615c5ce-b7dd-4206-9bc6-98630f704a4d\" (UID: \"f615c5ce-b7dd-4206-9bc6-98630f704a4d\") " Mar 09 14:25:05 crc kubenswrapper[4722]: I0309 14:25:05.001054 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f615c5ce-b7dd-4206-9bc6-98630f704a4d-var-log-ovn\") pod \"f615c5ce-b7dd-4206-9bc6-98630f704a4d\" (UID: \"f615c5ce-b7dd-4206-9bc6-98630f704a4d\") " Mar 09 14:25:05 crc kubenswrapper[4722]: I0309 14:25:05.001106 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f615c5ce-b7dd-4206-9bc6-98630f704a4d-var-run-ovn\") pod \"f615c5ce-b7dd-4206-9bc6-98630f704a4d\" (UID: \"f615c5ce-b7dd-4206-9bc6-98630f704a4d\") " Mar 09 14:25:05 crc kubenswrapper[4722]: I0309 14:25:05.002806 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f615c5ce-b7dd-4206-9bc6-98630f704a4d-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "f615c5ce-b7dd-4206-9bc6-98630f704a4d" (UID: "f615c5ce-b7dd-4206-9bc6-98630f704a4d"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 14:25:05 crc kubenswrapper[4722]: I0309 14:25:05.003052 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f615c5ce-b7dd-4206-9bc6-98630f704a4d-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "f615c5ce-b7dd-4206-9bc6-98630f704a4d" (UID: "f615c5ce-b7dd-4206-9bc6-98630f704a4d"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 14:25:05 crc kubenswrapper[4722]: I0309 14:25:05.003368 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f615c5ce-b7dd-4206-9bc6-98630f704a4d-var-run" (OuterVolumeSpecName: "var-run") pod "f615c5ce-b7dd-4206-9bc6-98630f704a4d" (UID: "f615c5ce-b7dd-4206-9bc6-98630f704a4d"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 14:25:05 crc kubenswrapper[4722]: I0309 14:25:05.004217 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f615c5ce-b7dd-4206-9bc6-98630f704a4d-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "f615c5ce-b7dd-4206-9bc6-98630f704a4d" (UID: "f615c5ce-b7dd-4206-9bc6-98630f704a4d"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:25:05 crc kubenswrapper[4722]: I0309 14:25:05.004359 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f615c5ce-b7dd-4206-9bc6-98630f704a4d-scripts" (OuterVolumeSpecName: "scripts") pod "f615c5ce-b7dd-4206-9bc6-98630f704a4d" (UID: "f615c5ce-b7dd-4206-9bc6-98630f704a4d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:25:05 crc kubenswrapper[4722]: I0309 14:25:05.027143 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f615c5ce-b7dd-4206-9bc6-98630f704a4d-kube-api-access-fvshn" (OuterVolumeSpecName: "kube-api-access-fvshn") pod "f615c5ce-b7dd-4206-9bc6-98630f704a4d" (UID: "f615c5ce-b7dd-4206-9bc6-98630f704a4d"). InnerVolumeSpecName "kube-api-access-fvshn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:25:05 crc kubenswrapper[4722]: I0309 14:25:05.103785 4722 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f615c5ce-b7dd-4206-9bc6-98630f704a4d-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 09 14:25:05 crc kubenswrapper[4722]: I0309 14:25:05.103821 4722 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f615c5ce-b7dd-4206-9bc6-98630f704a4d-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 09 14:25:05 crc kubenswrapper[4722]: I0309 14:25:05.103831 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f615c5ce-b7dd-4206-9bc6-98630f704a4d-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 14:25:05 crc kubenswrapper[4722]: I0309 14:25:05.103840 4722 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f615c5ce-b7dd-4206-9bc6-98630f704a4d-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 14:25:05 crc kubenswrapper[4722]: I0309 14:25:05.103851 4722 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f615c5ce-b7dd-4206-9bc6-98630f704a4d-var-run\") on node \"crc\" DevicePath \"\"" Mar 09 14:25:05 crc kubenswrapper[4722]: I0309 14:25:05.103859 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fvshn\" (UniqueName: \"kubernetes.io/projected/f615c5ce-b7dd-4206-9bc6-98630f704a4d-kube-api-access-fvshn\") on node \"crc\" DevicePath \"\"" Mar 09 14:25:05 crc kubenswrapper[4722]: I0309 14:25:05.373739 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-b8gzx-config-p4922" event={"ID":"f615c5ce-b7dd-4206-9bc6-98630f704a4d","Type":"ContainerDied","Data":"ddb283750e21be12e3f21138108bd7b8b5352042976cd071df16dcc0f70b7abd"} Mar 09 14:25:05 crc kubenswrapper[4722]: I0309 14:25:05.373797 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-b8gzx-config-p4922" Mar 09 14:25:05 crc kubenswrapper[4722]: I0309 14:25:05.373805 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ddb283750e21be12e3f21138108bd7b8b5352042976cd071df16dcc0f70b7abd" Mar 09 14:25:05 crc kubenswrapper[4722]: I0309 14:25:05.377347 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7463e84f-f457-4409-9621-507d331e06b5","Type":"ContainerStarted","Data":"500c5ef931d562706526fd678a5833474794e8e30ba4e60a1ab3bac4b70c32c3"} Mar 09 14:25:05 crc kubenswrapper[4722]: I0309 14:25:05.377388 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7463e84f-f457-4409-9621-507d331e06b5","Type":"ContainerStarted","Data":"6445f99d241f31b40a915f3a5bf68ec451f824878604993e8c55c147a7cadfba"} Mar 09 14:25:06 crc kubenswrapper[4722]: I0309 14:25:06.048193 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-b8gzx-config-p4922"] Mar 09 14:25:06 crc kubenswrapper[4722]: I0309 14:25:06.067074 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-b8gzx-config-p4922"] Mar 09 14:25:06 crc kubenswrapper[4722]: I0309 14:25:06.161163 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f615c5ce-b7dd-4206-9bc6-98630f704a4d" path="/var/lib/kubelet/pods/f615c5ce-b7dd-4206-9bc6-98630f704a4d/volumes" Mar 09 14:25:06 crc kubenswrapper[4722]: I0309 14:25:06.318526 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-b8gzx" Mar 09 14:25:07 crc kubenswrapper[4722]: I0309 14:25:07.402497 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7463e84f-f457-4409-9621-507d331e06b5","Type":"ContainerStarted","Data":"aa574eebbf945861bf7c3a2027886ac8da276b185932a25d616db4ccabb0a705"} Mar 09 14:25:07 crc kubenswrapper[4722]: I0309 14:25:07.403086 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7463e84f-f457-4409-9621-507d331e06b5","Type":"ContainerStarted","Data":"a410bff8ef1dd379af9918607ce4858e6e9bd3d2af03588ed251bf8243a23c15"} Mar 09 14:25:07 crc kubenswrapper[4722]: I0309 14:25:07.403107 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7463e84f-f457-4409-9621-507d331e06b5","Type":"ContainerStarted","Data":"baa74e2b252f2dc4f518a9b563aeb5d80a66a8dfa881b0a70bcbcf80a82d357a"} Mar 09 14:25:07 crc kubenswrapper[4722]: I0309 14:25:07.403119 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7463e84f-f457-4409-9621-507d331e06b5","Type":"ContainerStarted","Data":"c9858535d449928f26e3045eb4e0bed1af96b1d03708cee4ecfcf60d3db792a5"} Mar 09 14:25:08 crc kubenswrapper[4722]: I0309 14:25:08.258448 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 09 14:25:08 crc kubenswrapper[4722]: I0309 14:25:08.370843 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="6f4e007a-4a18-40e6-bf96-4a751e00cd73" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.134:5671: connect: connection refused" Mar 09 14:25:08 crc kubenswrapper[4722]: I0309 14:25:08.415408 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-2" Mar 09 14:25:08 crc 
kubenswrapper[4722]: I0309 14:25:08.423664 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7463e84f-f457-4409-9621-507d331e06b5","Type":"ContainerStarted","Data":"18570badd2d18a13f2fe3ffb82d40fc9c02cd8fd33df7ba1dc068f8067c91797"} Mar 09 14:25:08 crc kubenswrapper[4722]: I0309 14:25:08.432404 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-1" Mar 09 14:25:09 crc kubenswrapper[4722]: I0309 14:25:09.443693 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7463e84f-f457-4409-9621-507d331e06b5","Type":"ContainerStarted","Data":"bed4615f14998c95a408924c7b540be214f0cfc703389de70ab109988d386076"} Mar 09 14:25:09 crc kubenswrapper[4722]: I0309 14:25:09.444274 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7463e84f-f457-4409-9621-507d331e06b5","Type":"ContainerStarted","Data":"066448fe8262fb0071c5f8407bbd4e46f589fb5cb7bec325ee665a66fa0e16b7"} Mar 09 14:25:09 crc kubenswrapper[4722]: I0309 14:25:09.522643 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=40.10859988 podStartE2EDuration="46.522624305s" podCreationTimestamp="2026-03-09 14:24:23 +0000 UTC" firstStartedPulling="2026-03-09 14:24:59.795719326 +0000 UTC m=+1340.351287912" lastFinishedPulling="2026-03-09 14:25:06.209743761 +0000 UTC m=+1346.765312337" observedRunningTime="2026-03-09 14:25:09.520651601 +0000 UTC m=+1350.076220187" watchObservedRunningTime="2026-03-09 14:25:09.522624305 +0000 UTC m=+1350.078192881" Mar 09 14:25:09 crc kubenswrapper[4722]: I0309 14:25:09.790355 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-lgh7d"] Mar 09 14:25:09 crc kubenswrapper[4722]: E0309 14:25:09.790747 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f615c5ce-b7dd-4206-9bc6-98630f704a4d" containerName="ovn-config" Mar 09 14:25:09 crc kubenswrapper[4722]: I0309 14:25:09.790764 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f615c5ce-b7dd-4206-9bc6-98630f704a4d" containerName="ovn-config" Mar 09 14:25:09 crc kubenswrapper[4722]: E0309 14:25:09.790782 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10519e91-e280-418b-947a-114e2696e8a8" containerName="mariadb-account-create-update" Mar 09 14:25:09 crc kubenswrapper[4722]: I0309 14:25:09.790789 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="10519e91-e280-418b-947a-114e2696e8a8" containerName="mariadb-account-create-update" Mar 09 14:25:09 crc kubenswrapper[4722]: I0309 14:25:09.790961 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f615c5ce-b7dd-4206-9bc6-98630f704a4d" containerName="ovn-config" Mar 09 14:25:09 crc kubenswrapper[4722]: I0309 14:25:09.790973 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="10519e91-e280-418b-947a-114e2696e8a8" containerName="mariadb-account-create-update" Mar 09 14:25:09 crc kubenswrapper[4722]: I0309 14:25:09.792029 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-lgh7d" Mar 09 14:25:09 crc kubenswrapper[4722]: I0309 14:25:09.792863 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Mar 09 14:25:09 crc kubenswrapper[4722]: I0309 14:25:09.795394 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Mar 09 14:25:09 crc kubenswrapper[4722]: I0309 14:25:09.800815 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Mar 09 14:25:09 crc kubenswrapper[4722]: I0309 14:25:09.808306 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-lgh7d"] Mar 09 14:25:09 crc kubenswrapper[4722]: I0309 14:25:09.900510 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e762951c-0c20-44c7-927e-c314d6baf1c7-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-lgh7d\" (UID: \"e762951c-0c20-44c7-927e-c314d6baf1c7\") " pod="openstack/dnsmasq-dns-764c5664d7-lgh7d" Mar 09 14:25:09 crc kubenswrapper[4722]: I0309 14:25:09.900643 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e762951c-0c20-44c7-927e-c314d6baf1c7-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-lgh7d\" (UID: \"e762951c-0c20-44c7-927e-c314d6baf1c7\") " pod="openstack/dnsmasq-dns-764c5664d7-lgh7d" Mar 09 14:25:09 crc kubenswrapper[4722]: I0309 14:25:09.900711 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e762951c-0c20-44c7-927e-c314d6baf1c7-config\") pod \"dnsmasq-dns-764c5664d7-lgh7d\" (UID: \"e762951c-0c20-44c7-927e-c314d6baf1c7\") " pod="openstack/dnsmasq-dns-764c5664d7-lgh7d" Mar 09 14:25:09 crc kubenswrapper[4722]: I0309 14:25:09.901473 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e762951c-0c20-44c7-927e-c314d6baf1c7-dns-svc\") pod \"dnsmasq-dns-764c5664d7-lgh7d\" (UID: \"e762951c-0c20-44c7-927e-c314d6baf1c7\") " pod="openstack/dnsmasq-dns-764c5664d7-lgh7d" Mar 09 14:25:09 crc kubenswrapper[4722]: I0309 14:25:09.901705 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e762951c-0c20-44c7-927e-c314d6baf1c7-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-lgh7d\" (UID: \"e762951c-0c20-44c7-927e-c314d6baf1c7\") " pod="openstack/dnsmasq-dns-764c5664d7-lgh7d" Mar 09 14:25:09 crc kubenswrapper[4722]: I0309 14:25:09.901758 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wlr7\" (UniqueName: \"kubernetes.io/projected/e762951c-0c20-44c7-927e-c314d6baf1c7-kube-api-access-7wlr7\") pod \"dnsmasq-dns-764c5664d7-lgh7d\" (UID: \"e762951c-0c20-44c7-927e-c314d6baf1c7\") " pod="openstack/dnsmasq-dns-764c5664d7-lgh7d" Mar 09 14:25:10 crc kubenswrapper[4722]: I0309 14:25:10.003605 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e762951c-0c20-44c7-927e-c314d6baf1c7-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-lgh7d\" (UID: \"e762951c-0c20-44c7-927e-c314d6baf1c7\") " 
pod="openstack/dnsmasq-dns-764c5664d7-lgh7d" Mar 09 14:25:10 crc kubenswrapper[4722]: I0309 14:25:10.003655 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wlr7\" (UniqueName: \"kubernetes.io/projected/e762951c-0c20-44c7-927e-c314d6baf1c7-kube-api-access-7wlr7\") pod \"dnsmasq-dns-764c5664d7-lgh7d\" (UID: \"e762951c-0c20-44c7-927e-c314d6baf1c7\") " pod="openstack/dnsmasq-dns-764c5664d7-lgh7d" Mar 09 14:25:10 crc kubenswrapper[4722]: I0309 14:25:10.003699 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e762951c-0c20-44c7-927e-c314d6baf1c7-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-lgh7d\" (UID: \"e762951c-0c20-44c7-927e-c314d6baf1c7\") " pod="openstack/dnsmasq-dns-764c5664d7-lgh7d" Mar 09 14:25:10 crc kubenswrapper[4722]: I0309 14:25:10.003754 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e762951c-0c20-44c7-927e-c314d6baf1c7-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-lgh7d\" (UID: \"e762951c-0c20-44c7-927e-c314d6baf1c7\") " pod="openstack/dnsmasq-dns-764c5664d7-lgh7d" Mar 09 14:25:10 crc kubenswrapper[4722]: I0309 14:25:10.003790 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e762951c-0c20-44c7-927e-c314d6baf1c7-config\") pod \"dnsmasq-dns-764c5664d7-lgh7d\" (UID: \"e762951c-0c20-44c7-927e-c314d6baf1c7\") " pod="openstack/dnsmasq-dns-764c5664d7-lgh7d" Mar 09 14:25:10 crc kubenswrapper[4722]: I0309 14:25:10.003874 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e762951c-0c20-44c7-927e-c314d6baf1c7-dns-svc\") pod \"dnsmasq-dns-764c5664d7-lgh7d\" (UID: \"e762951c-0c20-44c7-927e-c314d6baf1c7\") " pod="openstack/dnsmasq-dns-764c5664d7-lgh7d" Mar 09 14:25:10 crc kubenswrapper[4722]: I0309 14:25:10.004988 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e762951c-0c20-44c7-927e-c314d6baf1c7-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-lgh7d\" (UID: \"e762951c-0c20-44c7-927e-c314d6baf1c7\") " pod="openstack/dnsmasq-dns-764c5664d7-lgh7d" Mar 09 14:25:10 crc kubenswrapper[4722]: I0309 14:25:10.005012 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e762951c-0c20-44c7-927e-c314d6baf1c7-dns-svc\") pod \"dnsmasq-dns-764c5664d7-lgh7d\" (UID: \"e762951c-0c20-44c7-927e-c314d6baf1c7\") " pod="openstack/dnsmasq-dns-764c5664d7-lgh7d" Mar 09 14:25:10 crc kubenswrapper[4722]: I0309 14:25:10.004997 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e762951c-0c20-44c7-927e-c314d6baf1c7-config\") pod \"dnsmasq-dns-764c5664d7-lgh7d\" (UID: \"e762951c-0c20-44c7-927e-c314d6baf1c7\") " pod="openstack/dnsmasq-dns-764c5664d7-lgh7d" Mar 09 14:25:10 crc kubenswrapper[4722]: I0309 14:25:10.005041 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e762951c-0c20-44c7-927e-c314d6baf1c7-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-lgh7d\" (UID: \"e762951c-0c20-44c7-927e-c314d6baf1c7\") " pod="openstack/dnsmasq-dns-764c5664d7-lgh7d" Mar 09 14:25:10 crc kubenswrapper[4722]: I0309 14:25:10.005261 
4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e762951c-0c20-44c7-927e-c314d6baf1c7-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-lgh7d\" (UID: \"e762951c-0c20-44c7-927e-c314d6baf1c7\") " pod="openstack/dnsmasq-dns-764c5664d7-lgh7d" Mar 09 14:25:10 crc kubenswrapper[4722]: I0309 14:25:10.025682 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wlr7\" (UniqueName: \"kubernetes.io/projected/e762951c-0c20-44c7-927e-c314d6baf1c7-kube-api-access-7wlr7\") pod \"dnsmasq-dns-764c5664d7-lgh7d\" (UID: \"e762951c-0c20-44c7-927e-c314d6baf1c7\") " pod="openstack/dnsmasq-dns-764c5664d7-lgh7d" Mar 09 14:25:10 crc kubenswrapper[4722]: I0309 14:25:10.108751 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-lgh7d" Mar 09 14:25:10 crc kubenswrapper[4722]: I0309 14:25:10.454548 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Mar 09 14:25:10 crc kubenswrapper[4722]: I0309 14:25:10.611552 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-lgh7d"] Mar 09 14:25:11 crc kubenswrapper[4722]: I0309 14:25:11.485680 4722 generic.go:334] "Generic (PLEG): container finished" podID="e762951c-0c20-44c7-927e-c314d6baf1c7" containerID="2d288737d0b02b3d7e1abeadb235b24ba3bec04f56a1b34e72a52bc15136a763" exitCode=0 Mar 09 14:25:11 crc kubenswrapper[4722]: I0309 14:25:11.485784 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-lgh7d" event={"ID":"e762951c-0c20-44c7-927e-c314d6baf1c7","Type":"ContainerDied","Data":"2d288737d0b02b3d7e1abeadb235b24ba3bec04f56a1b34e72a52bc15136a763"} Mar 09 14:25:11 crc kubenswrapper[4722]: I0309 14:25:11.486397 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-lgh7d" event={"ID":"e762951c-0c20-44c7-927e-c314d6baf1c7","Type":"ContainerStarted","Data":"92e67a80a82f31780adc28ab6b875690201291843a37b1df669b30468dc4cc5c"} Mar 09 14:25:11 crc kubenswrapper[4722]: I0309 14:25:11.487959 4722 generic.go:334] "Generic (PLEG): container finished" podID="17f0c1f6-4aea-4ada-aaec-3493cec60053" containerID="362a3baf5e0261fd5426cdde6dead9ccd0a206479e0b9484ca198bd9bce59b63" exitCode=0 Mar 09 14:25:11 crc kubenswrapper[4722]: I0309 14:25:11.488031 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-rl9l9" event={"ID":"17f0c1f6-4aea-4ada-aaec-3493cec60053","Type":"ContainerDied","Data":"362a3baf5e0261fd5426cdde6dead9ccd0a206479e0b9484ca198bd9bce59b63"} Mar 09 14:25:12 crc kubenswrapper[4722]: I0309 14:25:12.500144 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-lgh7d" event={"ID":"e762951c-0c20-44c7-927e-c314d6baf1c7","Type":"ContainerStarted","Data":"6971521a0a931ee86b3fbac4e1562fd3eeadfafabb001aa15b6254ce3e269857"} Mar 09 14:25:13 crc kubenswrapper[4722]: I0309 14:25:13.016527 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-rl9l9" Mar 09 14:25:13 crc kubenswrapper[4722]: I0309 14:25:13.034430 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-764c5664d7-lgh7d" podStartSLOduration=4.034390485 podStartE2EDuration="4.034390485s" podCreationTimestamp="2026-03-09 14:25:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:25:12.527967684 +0000 UTC m=+1353.083536280" watchObservedRunningTime="2026-03-09 14:25:13.034390485 +0000 UTC m=+1353.589959081" Mar 09 14:25:13 crc kubenswrapper[4722]: I0309 14:25:13.064569 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/17f0c1f6-4aea-4ada-aaec-3493cec60053-db-sync-config-data\") pod \"17f0c1f6-4aea-4ada-aaec-3493cec60053\" (UID: \"17f0c1f6-4aea-4ada-aaec-3493cec60053\") " Mar 09 14:25:13 crc kubenswrapper[4722]: I0309 14:25:13.064942 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17f0c1f6-4aea-4ada-aaec-3493cec60053-config-data\") pod \"17f0c1f6-4aea-4ada-aaec-3493cec60053\" (UID: \"17f0c1f6-4aea-4ada-aaec-3493cec60053\") " Mar 09 14:25:13 crc kubenswrapper[4722]: I0309 14:25:13.065116 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nfpc4\" (UniqueName: \"kubernetes.io/projected/17f0c1f6-4aea-4ada-aaec-3493cec60053-kube-api-access-nfpc4\") pod \"17f0c1f6-4aea-4ada-aaec-3493cec60053\" (UID: \"17f0c1f6-4aea-4ada-aaec-3493cec60053\") " Mar 09 14:25:13 crc kubenswrapper[4722]: I0309 14:25:13.065478 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17f0c1f6-4aea-4ada-aaec-3493cec60053-combined-ca-bundle\") pod \"17f0c1f6-4aea-4ada-aaec-3493cec60053\" (UID: \"17f0c1f6-4aea-4ada-aaec-3493cec60053\") " Mar 09 14:25:13 crc kubenswrapper[4722]: I0309 14:25:13.069396 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17f0c1f6-4aea-4ada-aaec-3493cec60053-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "17f0c1f6-4aea-4ada-aaec-3493cec60053" (UID: "17f0c1f6-4aea-4ada-aaec-3493cec60053"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:25:13 crc kubenswrapper[4722]: I0309 14:25:13.070303 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17f0c1f6-4aea-4ada-aaec-3493cec60053-kube-api-access-nfpc4" (OuterVolumeSpecName: "kube-api-access-nfpc4") pod "17f0c1f6-4aea-4ada-aaec-3493cec60053" (UID: "17f0c1f6-4aea-4ada-aaec-3493cec60053"). InnerVolumeSpecName "kube-api-access-nfpc4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:25:13 crc kubenswrapper[4722]: I0309 14:25:13.099946 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17f0c1f6-4aea-4ada-aaec-3493cec60053-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "17f0c1f6-4aea-4ada-aaec-3493cec60053" (UID: "17f0c1f6-4aea-4ada-aaec-3493cec60053"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:25:13 crc kubenswrapper[4722]: I0309 14:25:13.123427 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17f0c1f6-4aea-4ada-aaec-3493cec60053-config-data" (OuterVolumeSpecName: "config-data") pod "17f0c1f6-4aea-4ada-aaec-3493cec60053" (UID: "17f0c1f6-4aea-4ada-aaec-3493cec60053"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:25:13 crc kubenswrapper[4722]: I0309 14:25:13.168415 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17f0c1f6-4aea-4ada-aaec-3493cec60053-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:25:13 crc kubenswrapper[4722]: I0309 14:25:13.168454 4722 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/17f0c1f6-4aea-4ada-aaec-3493cec60053-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 14:25:13 crc kubenswrapper[4722]: I0309 14:25:13.168469 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17f0c1f6-4aea-4ada-aaec-3493cec60053-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 14:25:13 crc kubenswrapper[4722]: I0309 14:25:13.168480 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nfpc4\" (UniqueName: \"kubernetes.io/projected/17f0c1f6-4aea-4ada-aaec-3493cec60053-kube-api-access-nfpc4\") on node \"crc\" DevicePath \"\"" Mar 09 14:25:13 crc kubenswrapper[4722]: I0309 14:25:13.471447 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 09 14:25:13 crc kubenswrapper[4722]: I0309 14:25:13.471761 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="e0db26a0-2877-48fa-b706-b5558f9973d5" containerName="prometheus" containerID="cri-o://f43ad056ea4fd9b6d874ac1b12dbe3d25d05f26348f873c90015860853bd08a4" gracePeriod=600 Mar 09 14:25:13 crc kubenswrapper[4722]: I0309 14:25:13.471822 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="e0db26a0-2877-48fa-b706-b5558f9973d5" containerName="config-reloader" containerID="cri-o://4aa6764ae1c61b4084dcc5f0a098a411934735e696ad3102d63cf6771c505e59" gracePeriod=600 Mar 09 14:25:13 crc kubenswrapper[4722]: I0309 14:25:13.471862 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="e0db26a0-2877-48fa-b706-b5558f9973d5" containerName="thanos-sidecar" containerID="cri-o://255b938bfd69ad600dd36a900edecc9cf5d23939501d8d23e0bda82feb284e0e" gracePeriod=600 Mar 09 14:25:13 crc kubenswrapper[4722]: I0309 14:25:13.527310 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-rl9l9" Mar 09 14:25:13 crc kubenswrapper[4722]: I0309 14:25:13.531348 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-rl9l9" event={"ID":"17f0c1f6-4aea-4ada-aaec-3493cec60053","Type":"ContainerDied","Data":"11e4b7703195d05b92ce0f1b24130c1d3c997e881712bb06cc8a629c76aeaf60"} Mar 09 14:25:13 crc kubenswrapper[4722]: I0309 14:25:13.531429 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11e4b7703195d05b92ce0f1b24130c1d3c997e881712bb06cc8a629c76aeaf60" Mar 09 14:25:13 crc kubenswrapper[4722]: I0309 14:25:13.531461 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-764c5664d7-lgh7d" Mar 09 14:25:13 crc kubenswrapper[4722]: I0309 14:25:13.999078 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-lgh7d"] Mar 09 14:25:14 crc kubenswrapper[4722]: I0309 14:25:14.019711 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-qccbt"] Mar 09 14:25:14 crc kubenswrapper[4722]: E0309 14:25:14.021062 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17f0c1f6-4aea-4ada-aaec-3493cec60053" containerName="glance-db-sync" Mar 09 14:25:14 crc kubenswrapper[4722]: I0309 14:25:14.021089 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="17f0c1f6-4aea-4ada-aaec-3493cec60053" containerName="glance-db-sync" Mar 09 14:25:14 crc kubenswrapper[4722]: I0309 14:25:14.021648 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="17f0c1f6-4aea-4ada-aaec-3493cec60053" containerName="glance-db-sync" Mar 09 14:25:14 crc kubenswrapper[4722]: I0309 14:25:14.029024 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-qccbt" Mar 09 14:25:14 crc kubenswrapper[4722]: I0309 14:25:14.050856 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-qccbt"] Mar 09 14:25:14 crc kubenswrapper[4722]: I0309 14:25:14.116408 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f87e19a1-bff1-4e82-8677-3812b2f51f46-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-qccbt\" (UID: \"f87e19a1-bff1-4e82-8677-3812b2f51f46\") " pod="openstack/dnsmasq-dns-74f6bcbc87-qccbt" Mar 09 14:25:14 crc kubenswrapper[4722]: I0309 14:25:14.116476 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffrlk\" (UniqueName: \"kubernetes.io/projected/f87e19a1-bff1-4e82-8677-3812b2f51f46-kube-api-access-ffrlk\") pod \"dnsmasq-dns-74f6bcbc87-qccbt\" (UID: \"f87e19a1-bff1-4e82-8677-3812b2f51f46\") " pod="openstack/dnsmasq-dns-74f6bcbc87-qccbt" Mar 09 14:25:14 crc kubenswrapper[4722]: I0309 14:25:14.116555 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f87e19a1-bff1-4e82-8677-3812b2f51f46-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-qccbt\" (UID: \"f87e19a1-bff1-4e82-8677-3812b2f51f46\") " pod="openstack/dnsmasq-dns-74f6bcbc87-qccbt" Mar 09 14:25:14 crc kubenswrapper[4722]: I0309 14:25:14.116833 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f87e19a1-bff1-4e82-8677-3812b2f51f46-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-qccbt\" (UID: \"f87e19a1-bff1-4e82-8677-3812b2f51f46\") " pod="openstack/dnsmasq-dns-74f6bcbc87-qccbt" Mar 09 14:25:14 crc kubenswrapper[4722]: I0309 14:25:14.117103 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f87e19a1-bff1-4e82-8677-3812b2f51f46-config\") pod \"dnsmasq-dns-74f6bcbc87-qccbt\" (UID: \"f87e19a1-bff1-4e82-8677-3812b2f51f46\") " pod="openstack/dnsmasq-dns-74f6bcbc87-qccbt" Mar 09 14:25:14 crc kubenswrapper[4722]: I0309 14:25:14.117133 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f87e19a1-bff1-4e82-8677-3812b2f51f46-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-qccbt\" (UID: \"f87e19a1-bff1-4e82-8677-3812b2f51f46\") " pod="openstack/dnsmasq-dns-74f6bcbc87-qccbt" Mar 09 14:25:14 crc kubenswrapper[4722]: I0309 14:25:14.219416 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f87e19a1-bff1-4e82-8677-3812b2f51f46-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-qccbt\" (UID: \"f87e19a1-bff1-4e82-8677-3812b2f51f46\") " pod="openstack/dnsmasq-dns-74f6bcbc87-qccbt" Mar 09 14:25:14 crc kubenswrapper[4722]: I0309 14:25:14.219489 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f87e19a1-bff1-4e82-8677-3812b2f51f46-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-qccbt\" (UID: \"f87e19a1-bff1-4e82-8677-3812b2f51f46\") " pod="openstack/dnsmasq-dns-74f6bcbc87-qccbt" Mar 09 14:25:14 crc kubenswrapper[4722]: I0309 14:25:14.219547 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f87e19a1-bff1-4e82-8677-3812b2f51f46-config\") pod \"dnsmasq-dns-74f6bcbc87-qccbt\" (UID: \"f87e19a1-bff1-4e82-8677-3812b2f51f46\") " pod="openstack/dnsmasq-dns-74f6bcbc87-qccbt" Mar 09 14:25:14 crc kubenswrapper[4722]: I0309 14:25:14.219563 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f87e19a1-bff1-4e82-8677-3812b2f51f46-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-qccbt\" (UID: \"f87e19a1-bff1-4e82-8677-3812b2f51f46\") " pod="openstack/dnsmasq-dns-74f6bcbc87-qccbt" Mar 09 14:25:14 crc kubenswrapper[4722]: I0309 14:25:14.219638 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f87e19a1-bff1-4e82-8677-3812b2f51f46-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-qccbt\" (UID: \"f87e19a1-bff1-4e82-8677-3812b2f51f46\") " pod="openstack/dnsmasq-dns-74f6bcbc87-qccbt" Mar 09 14:25:14 crc kubenswrapper[4722]: I0309 14:25:14.219654 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffrlk\" (UniqueName: \"kubernetes.io/projected/f87e19a1-bff1-4e82-8677-3812b2f51f46-kube-api-access-ffrlk\") pod \"dnsmasq-dns-74f6bcbc87-qccbt\" (UID: \"f87e19a1-bff1-4e82-8677-3812b2f51f46\") " pod="openstack/dnsmasq-dns-74f6bcbc87-qccbt" Mar 09 14:25:14 crc kubenswrapper[4722]: I0309 14:25:14.220636 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f87e19a1-bff1-4e82-8677-3812b2f51f46-config\") pod \"dnsmasq-dns-74f6bcbc87-qccbt\" (UID: \"f87e19a1-bff1-4e82-8677-3812b2f51f46\") " pod="openstack/dnsmasq-dns-74f6bcbc87-qccbt" Mar 09 14:25:14 crc kubenswrapper[4722]: I0309 14:25:14.220736 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f87e19a1-bff1-4e82-8677-3812b2f51f46-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-qccbt\" (UID: \"f87e19a1-bff1-4e82-8677-3812b2f51f46\") " pod="openstack/dnsmasq-dns-74f6bcbc87-qccbt" Mar 09 14:25:14 crc kubenswrapper[4722]: I0309 14:25:14.220964 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f87e19a1-bff1-4e82-8677-3812b2f51f46-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-qccbt\" (UID: \"f87e19a1-bff1-4e82-8677-3812b2f51f46\") " pod="openstack/dnsmasq-dns-74f6bcbc87-qccbt" Mar 09 14:25:14 crc kubenswrapper[4722]: I0309 14:25:14.220991 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f87e19a1-bff1-4e82-8677-3812b2f51f46-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-qccbt\" (UID: \"f87e19a1-bff1-4e82-8677-3812b2f51f46\") " pod="openstack/dnsmasq-dns-74f6bcbc87-qccbt" Mar 09 14:25:14 crc kubenswrapper[4722]: I0309 14:25:14.221043 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f87e19a1-bff1-4e82-8677-3812b2f51f46-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-qccbt\" (UID: \"f87e19a1-bff1-4e82-8677-3812b2f51f46\") " pod="openstack/dnsmasq-dns-74f6bcbc87-qccbt" Mar 09 14:25:14 crc kubenswrapper[4722]: I0309 14:25:14.240487 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffrlk\" (UniqueName: 
\"kubernetes.io/projected/f87e19a1-bff1-4e82-8677-3812b2f51f46-kube-api-access-ffrlk\") pod \"dnsmasq-dns-74f6bcbc87-qccbt\" (UID: \"f87e19a1-bff1-4e82-8677-3812b2f51f46\") " pod="openstack/dnsmasq-dns-74f6bcbc87-qccbt" Mar 09 14:25:14 crc kubenswrapper[4722]: I0309 14:25:14.358168 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-qccbt" Mar 09 14:25:14 crc kubenswrapper[4722]: I0309 14:25:14.566570 4722 generic.go:334] "Generic (PLEG): container finished" podID="e0db26a0-2877-48fa-b706-b5558f9973d5" containerID="255b938bfd69ad600dd36a900edecc9cf5d23939501d8d23e0bda82feb284e0e" exitCode=0 Mar 09 14:25:14 crc kubenswrapper[4722]: I0309 14:25:14.566953 4722 generic.go:334] "Generic (PLEG): container finished" podID="e0db26a0-2877-48fa-b706-b5558f9973d5" containerID="4aa6764ae1c61b4084dcc5f0a098a411934735e696ad3102d63cf6771c505e59" exitCode=0 Mar 09 14:25:14 crc kubenswrapper[4722]: I0309 14:25:14.566968 4722 generic.go:334] "Generic (PLEG): container finished" podID="e0db26a0-2877-48fa-b706-b5558f9973d5" containerID="f43ad056ea4fd9b6d874ac1b12dbe3d25d05f26348f873c90015860853bd08a4" exitCode=0 Mar 09 14:25:14 crc kubenswrapper[4722]: I0309 14:25:14.566715 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e0db26a0-2877-48fa-b706-b5558f9973d5","Type":"ContainerDied","Data":"255b938bfd69ad600dd36a900edecc9cf5d23939501d8d23e0bda82feb284e0e"} Mar 09 14:25:14 crc kubenswrapper[4722]: I0309 14:25:14.567245 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e0db26a0-2877-48fa-b706-b5558f9973d5","Type":"ContainerDied","Data":"4aa6764ae1c61b4084dcc5f0a098a411934735e696ad3102d63cf6771c505e59"} Mar 09 14:25:14 crc kubenswrapper[4722]: I0309 14:25:14.567265 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e0db26a0-2877-48fa-b706-b5558f9973d5","Type":"ContainerDied","Data":"f43ad056ea4fd9b6d874ac1b12dbe3d25d05f26348f873c90015860853bd08a4"} Mar 09 14:25:14 crc kubenswrapper[4722]: I0309 14:25:14.567279 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e0db26a0-2877-48fa-b706-b5558f9973d5","Type":"ContainerDied","Data":"1cd523e3b573fe09cf95a9251bec022f3e7c1a846e2ea18bb9119d72c3848821"} Mar 09 14:25:14 crc kubenswrapper[4722]: I0309 14:25:14.567290 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1cd523e3b573fe09cf95a9251bec022f3e7c1a846e2ea18bb9119d72c3848821" Mar 09 14:25:14 crc kubenswrapper[4722]: I0309 14:25:14.569307 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 09 14:25:14 crc kubenswrapper[4722]: I0309 14:25:14.732286 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d60fc4df-9ddd-42e1-8e0d-2624907f046b\") pod \"e0db26a0-2877-48fa-b706-b5558f9973d5\" (UID: \"e0db26a0-2877-48fa-b706-b5558f9973d5\") " Mar 09 14:25:14 crc kubenswrapper[4722]: I0309 14:25:14.732472 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e0db26a0-2877-48fa-b706-b5558f9973d5-tls-assets\") pod \"e0db26a0-2877-48fa-b706-b5558f9973d5\" (UID: \"e0db26a0-2877-48fa-b706-b5558f9973d5\") " Mar 09 14:25:14 crc kubenswrapper[4722]: I0309 14:25:14.732706 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e0db26a0-2877-48fa-b706-b5558f9973d5-prometheus-metric-storage-rulefiles-0\") pod \"e0db26a0-2877-48fa-b706-b5558f9973d5\" (UID: \"e0db26a0-2877-48fa-b706-b5558f9973d5\") " Mar 09 14:25:14 crc kubenswrapper[4722]: I0309 14:25:14.732761 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e0db26a0-2877-48fa-b706-b5558f9973d5-web-config\") pod \"e0db26a0-2877-48fa-b706-b5558f9973d5\" (UID: \"e0db26a0-2877-48fa-b706-b5558f9973d5\") " Mar 09 14:25:14 crc kubenswrapper[4722]: I0309 14:25:14.732897 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7sslz\" (UniqueName: \"kubernetes.io/projected/e0db26a0-2877-48fa-b706-b5558f9973d5-kube-api-access-7sslz\") pod \"e0db26a0-2877-48fa-b706-b5558f9973d5\" (UID: \"e0db26a0-2877-48fa-b706-b5558f9973d5\") " Mar 09 14:25:14 crc kubenswrapper[4722]: I0309 14:25:14.732975 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e0db26a0-2877-48fa-b706-b5558f9973d5-thanos-prometheus-http-client-file\") pod \"e0db26a0-2877-48fa-b706-b5558f9973d5\" (UID: \"e0db26a0-2877-48fa-b706-b5558f9973d5\") " Mar 09 14:25:14 crc kubenswrapper[4722]: I0309 14:25:14.733069 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e0db26a0-2877-48fa-b706-b5558f9973d5-config-out\") pod \"e0db26a0-2877-48fa-b706-b5558f9973d5\" (UID: \"e0db26a0-2877-48fa-b706-b5558f9973d5\") " Mar 09 14:25:14 crc kubenswrapper[4722]: I0309 14:25:14.733106 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e0db26a0-2877-48fa-b706-b5558f9973d5-config\") pod \"e0db26a0-2877-48fa-b706-b5558f9973d5\" (UID: \"e0db26a0-2877-48fa-b706-b5558f9973d5\") " Mar 09 14:25:14 crc kubenswrapper[4722]: I0309 14:25:14.733134 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/e0db26a0-2877-48fa-b706-b5558f9973d5-prometheus-metric-storage-rulefiles-2\") pod \"e0db26a0-2877-48fa-b706-b5558f9973d5\" (UID: \"e0db26a0-2877-48fa-b706-b5558f9973d5\") " Mar 09 14:25:14 crc kubenswrapper[4722]: I0309 14:25:14.733176 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/e0db26a0-2877-48fa-b706-b5558f9973d5-prometheus-metric-storage-rulefiles-1\") pod \"e0db26a0-2877-48fa-b706-b5558f9973d5\" (UID: \"e0db26a0-2877-48fa-b706-b5558f9973d5\") " Mar 09 14:25:14 crc kubenswrapper[4722]: I0309 14:25:14.740099 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0db26a0-2877-48fa-b706-b5558f9973d5-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "e0db26a0-2877-48fa-b706-b5558f9973d5" (UID: "e0db26a0-2877-48fa-b706-b5558f9973d5"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:25:14 crc kubenswrapper[4722]: I0309 14:25:14.740335 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0db26a0-2877-48fa-b706-b5558f9973d5-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "e0db26a0-2877-48fa-b706-b5558f9973d5" (UID: "e0db26a0-2877-48fa-b706-b5558f9973d5"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:25:14 crc kubenswrapper[4722]: I0309 14:25:14.740634 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0db26a0-2877-48fa-b706-b5558f9973d5-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "e0db26a0-2877-48fa-b706-b5558f9973d5" (UID: "e0db26a0-2877-48fa-b706-b5558f9973d5"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:25:14 crc kubenswrapper[4722]: I0309 14:25:14.743362 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0db26a0-2877-48fa-b706-b5558f9973d5-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "e0db26a0-2877-48fa-b706-b5558f9973d5" (UID: "e0db26a0-2877-48fa-b706-b5558f9973d5"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:25:14 crc kubenswrapper[4722]: I0309 14:25:14.743499 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0db26a0-2877-48fa-b706-b5558f9973d5-kube-api-access-7sslz" (OuterVolumeSpecName: "kube-api-access-7sslz") pod "e0db26a0-2877-48fa-b706-b5558f9973d5" (UID: "e0db26a0-2877-48fa-b706-b5558f9973d5"). InnerVolumeSpecName "kube-api-access-7sslz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:25:14 crc kubenswrapper[4722]: I0309 14:25:14.746513 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0db26a0-2877-48fa-b706-b5558f9973d5-config" (OuterVolumeSpecName: "config") pod "e0db26a0-2877-48fa-b706-b5558f9973d5" (UID: "e0db26a0-2877-48fa-b706-b5558f9973d5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:25:14 crc kubenswrapper[4722]: I0309 14:25:14.746903 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0db26a0-2877-48fa-b706-b5558f9973d5-config-out" (OuterVolumeSpecName: "config-out") pod "e0db26a0-2877-48fa-b706-b5558f9973d5" (UID: "e0db26a0-2877-48fa-b706-b5558f9973d5"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:25:14 crc kubenswrapper[4722]: I0309 14:25:14.755640 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0db26a0-2877-48fa-b706-b5558f9973d5-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "e0db26a0-2877-48fa-b706-b5558f9973d5" (UID: "e0db26a0-2877-48fa-b706-b5558f9973d5"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:25:14 crc kubenswrapper[4722]: I0309 14:25:14.779980 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d60fc4df-9ddd-42e1-8e0d-2624907f046b" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "e0db26a0-2877-48fa-b706-b5558f9973d5" (UID: "e0db26a0-2877-48fa-b706-b5558f9973d5"). InnerVolumeSpecName "pvc-d60fc4df-9ddd-42e1-8e0d-2624907f046b". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 09 14:25:14 crc kubenswrapper[4722]: I0309 14:25:14.786374 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0db26a0-2877-48fa-b706-b5558f9973d5-web-config" (OuterVolumeSpecName: "web-config") pod "e0db26a0-2877-48fa-b706-b5558f9973d5" (UID: "e0db26a0-2877-48fa-b706-b5558f9973d5"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:25:14 crc kubenswrapper[4722]: I0309 14:25:14.837223 4722 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e0db26a0-2877-48fa-b706-b5558f9973d5-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Mar 09 14:25:14 crc kubenswrapper[4722]: I0309 14:25:14.837260 4722 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e0db26a0-2877-48fa-b706-b5558f9973d5-web-config\") on node \"crc\" DevicePath \"\"" Mar 09 14:25:14 crc kubenswrapper[4722]: I0309 14:25:14.837272 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7sslz\" (UniqueName: \"kubernetes.io/projected/e0db26a0-2877-48fa-b706-b5558f9973d5-kube-api-access-7sslz\") on node \"crc\" DevicePath \"\"" Mar 09 14:25:14 crc kubenswrapper[4722]: I0309 14:25:14.837282 4722 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e0db26a0-2877-48fa-b706-b5558f9973d5-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Mar 09 14:25:14 crc kubenswrapper[4722]: I0309 14:25:14.837291 4722 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e0db26a0-2877-48fa-b706-b5558f9973d5-config-out\") on node \"crc\" DevicePath \"\"" Mar 09 14:25:14 crc kubenswrapper[4722]: I0309 14:25:14.837300 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/e0db26a0-2877-48fa-b706-b5558f9973d5-config\") on node \"crc\" DevicePath \"\"" Mar 09 14:25:14 crc kubenswrapper[4722]: I0309 14:25:14.837309 4722 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/e0db26a0-2877-48fa-b706-b5558f9973d5-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\"" Mar 09 14:25:14 crc kubenswrapper[4722]: I0309 14:25:14.837318 4722 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" 
(UniqueName: \"kubernetes.io/configmap/e0db26a0-2877-48fa-b706-b5558f9973d5-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\"" Mar 09 14:25:14 crc kubenswrapper[4722]: I0309 14:25:14.837372 4722 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-d60fc4df-9ddd-42e1-8e0d-2624907f046b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d60fc4df-9ddd-42e1-8e0d-2624907f046b\") on node \"crc\" " Mar 09 14:25:14 crc kubenswrapper[4722]: I0309 14:25:14.837383 4722 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e0db26a0-2877-48fa-b706-b5558f9973d5-tls-assets\") on node \"crc\" DevicePath \"\"" Mar 09 14:25:14 crc kubenswrapper[4722]: I0309 14:25:14.860313 4722 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 09 14:25:14 crc kubenswrapper[4722]: I0309 14:25:14.860506 4722 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-d60fc4df-9ddd-42e1-8e0d-2624907f046b" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d60fc4df-9ddd-42e1-8e0d-2624907f046b") on node "crc" Mar 09 14:25:14 crc kubenswrapper[4722]: I0309 14:25:14.938814 4722 reconciler_common.go:293] "Volume detached for volume \"pvc-d60fc4df-9ddd-42e1-8e0d-2624907f046b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d60fc4df-9ddd-42e1-8e0d-2624907f046b\") on node \"crc\" DevicePath \"\"" Mar 09 14:25:14 crc kubenswrapper[4722]: W0309 14:25:14.952602 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf87e19a1_bff1_4e82_8677_3812b2f51f46.slice/crio-a6165b4cd2ff710979ecd42b5bdffe7ae31013a27dfbe5b42133dfa53e52f52b WatchSource:0}: Error finding container a6165b4cd2ff710979ecd42b5bdffe7ae31013a27dfbe5b42133dfa53e52f52b: Status 404 returned error can't find the container with id a6165b4cd2ff710979ecd42b5bdffe7ae31013a27dfbe5b42133dfa53e52f52b Mar 09 14:25:14 crc kubenswrapper[4722]: I0309 14:25:14.956225 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-qccbt"] Mar 09 14:25:15 crc kubenswrapper[4722]: I0309 14:25:15.591844 4722 generic.go:334] "Generic (PLEG): container finished" podID="f87e19a1-bff1-4e82-8677-3812b2f51f46" containerID="46949b946146e4e13918c7e305ade73c59e9952a3685ca766fa24cb64b6b976d" exitCode=0 Mar 09 14:25:15 crc kubenswrapper[4722]: I0309 14:25:15.592034 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-qccbt" event={"ID":"f87e19a1-bff1-4e82-8677-3812b2f51f46","Type":"ContainerDied","Data":"46949b946146e4e13918c7e305ade73c59e9952a3685ca766fa24cb64b6b976d"} Mar 09 14:25:15 crc kubenswrapper[4722]: I0309 14:25:15.592344 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-qccbt" event={"ID":"f87e19a1-bff1-4e82-8677-3812b2f51f46","Type":"ContainerStarted","Data":"a6165b4cd2ff710979ecd42b5bdffe7ae31013a27dfbe5b42133dfa53e52f52b"} Mar 09 14:25:15 crc kubenswrapper[4722]: I0309 14:25:15.593429 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-764c5664d7-lgh7d" podUID="e762951c-0c20-44c7-927e-c314d6baf1c7" containerName="dnsmasq-dns" containerID="cri-o://6971521a0a931ee86b3fbac4e1562fd3eeadfafabb001aa15b6254ce3e269857" gracePeriod=10 Mar 09 14:25:15 crc kubenswrapper[4722]: I0309 14:25:15.593481 4722 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 09 14:25:15 crc kubenswrapper[4722]: I0309 14:25:15.750473 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 09 14:25:15 crc kubenswrapper[4722]: I0309 14:25:15.767577 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 09 14:25:15 crc kubenswrapper[4722]: I0309 14:25:15.808785 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 09 14:25:15 crc kubenswrapper[4722]: E0309 14:25:15.809230 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0db26a0-2877-48fa-b706-b5558f9973d5" containerName="config-reloader" Mar 09 14:25:15 crc kubenswrapper[4722]: I0309 14:25:15.809248 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0db26a0-2877-48fa-b706-b5558f9973d5" containerName="config-reloader" Mar 09 14:25:15 crc kubenswrapper[4722]: E0309 14:25:15.809260 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0db26a0-2877-48fa-b706-b5558f9973d5" containerName="prometheus" Mar 09 14:25:15 crc kubenswrapper[4722]: I0309 14:25:15.809266 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0db26a0-2877-48fa-b706-b5558f9973d5" containerName="prometheus" Mar 09 14:25:15 crc kubenswrapper[4722]: E0309 14:25:15.809280 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0db26a0-2877-48fa-b706-b5558f9973d5" containerName="thanos-sidecar" Mar 09 14:25:15 crc kubenswrapper[4722]: I0309 14:25:15.809286 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0db26a0-2877-48fa-b706-b5558f9973d5" containerName="thanos-sidecar" Mar 09 14:25:15 crc kubenswrapper[4722]: E0309 14:25:15.809318 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0db26a0-2877-48fa-b706-b5558f9973d5" containerName="init-config-reloader" Mar 09 14:25:15 crc kubenswrapper[4722]: I0309 14:25:15.809324 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0db26a0-2877-48fa-b706-b5558f9973d5" containerName="init-config-reloader" Mar 09 14:25:15 crc kubenswrapper[4722]: I0309 14:25:15.809503 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0db26a0-2877-48fa-b706-b5558f9973d5" containerName="config-reloader" Mar 09 14:25:15 crc kubenswrapper[4722]: I0309 14:25:15.809518 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0db26a0-2877-48fa-b706-b5558f9973d5" containerName="thanos-sidecar" Mar 09 14:25:15 crc kubenswrapper[4722]: I0309 14:25:15.809530 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0db26a0-2877-48fa-b706-b5558f9973d5" containerName="prometheus" Mar 09 14:25:15 crc kubenswrapper[4722]: I0309 14:25:15.845004 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 09 14:25:15 crc kubenswrapper[4722]: I0309 14:25:15.845110 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 09 14:25:15 crc kubenswrapper[4722]: I0309 14:25:15.849819 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Mar 09 14:25:15 crc kubenswrapper[4722]: I0309 14:25:15.850040 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Mar 09 14:25:15 crc kubenswrapper[4722]: I0309 14:25:15.850179 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Mar 09 14:25:15 crc kubenswrapper[4722]: I0309 14:25:15.850366 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Mar 09 14:25:15 crc kubenswrapper[4722]: I0309 14:25:15.850598 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Mar 09 14:25:15 crc kubenswrapper[4722]: I0309 14:25:15.850747 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-g5gbv" Mar 09 14:25:15 crc kubenswrapper[4722]: I0309 14:25:15.851378 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Mar 09 14:25:15 crc kubenswrapper[4722]: I0309 14:25:15.851561 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Mar 09 14:25:15 crc kubenswrapper[4722]: I0309 14:25:15.863782 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Mar 09 14:25:15 crc kubenswrapper[4722]: I0309 14:25:15.966617 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0e07fbab-4a47-4e59-aa72-f0a4521296af-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"0e07fbab-4a47-4e59-aa72-f0a4521296af\") " pod="openstack/prometheus-metric-storage-0" Mar 09 14:25:15 crc kubenswrapper[4722]: I0309 14:25:15.966713 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0e07fbab-4a47-4e59-aa72-f0a4521296af-config\") pod \"prometheus-metric-storage-0\" (UID: \"0e07fbab-4a47-4e59-aa72-f0a4521296af\") " pod="openstack/prometheus-metric-storage-0" Mar 09 14:25:15 crc kubenswrapper[4722]: I0309 14:25:15.966756 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0e07fbab-4a47-4e59-aa72-f0a4521296af-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"0e07fbab-4a47-4e59-aa72-f0a4521296af\") " pod="openstack/prometheus-metric-storage-0" Mar 09 14:25:15 crc kubenswrapper[4722]: I0309 14:25:15.966786 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67q9v\" (UniqueName: \"kubernetes.io/projected/0e07fbab-4a47-4e59-aa72-f0a4521296af-kube-api-access-67q9v\") pod \"prometheus-metric-storage-0\" (UID: \"0e07fbab-4a47-4e59-aa72-f0a4521296af\") " pod="openstack/prometheus-metric-storage-0" Mar 09 14:25:15 crc kubenswrapper[4722]: I0309 14:25:15.966833 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/0e07fbab-4a47-4e59-aa72-f0a4521296af-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"0e07fbab-4a47-4e59-aa72-f0a4521296af\") " pod="openstack/prometheus-metric-storage-0" Mar 09 14:25:15 crc kubenswrapper[4722]: I0309 14:25:15.966868 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e07fbab-4a47-4e59-aa72-f0a4521296af-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"0e07fbab-4a47-4e59-aa72-f0a4521296af\") " pod="openstack/prometheus-metric-storage-0" Mar 09 14:25:15 crc kubenswrapper[4722]: I0309 14:25:15.966901 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/0e07fbab-4a47-4e59-aa72-f0a4521296af-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"0e07fbab-4a47-4e59-aa72-f0a4521296af\") " pod="openstack/prometheus-metric-storage-0" Mar 09 14:25:15 crc kubenswrapper[4722]: I0309 14:25:15.966925 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/0e07fbab-4a47-4e59-aa72-f0a4521296af-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"0e07fbab-4a47-4e59-aa72-f0a4521296af\") " pod="openstack/prometheus-metric-storage-0" Mar 09 14:25:15 crc kubenswrapper[4722]: I0309 14:25:15.966946 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0e07fbab-4a47-4e59-aa72-f0a4521296af-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"0e07fbab-4a47-4e59-aa72-f0a4521296af\") " pod="openstack/prometheus-metric-storage-0" Mar 09 14:25:15 crc kubenswrapper[4722]: I0309 14:25:15.966972 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d60fc4df-9ddd-42e1-8e0d-2624907f046b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d60fc4df-9ddd-42e1-8e0d-2624907f046b\") pod \"prometheus-metric-storage-0\" (UID: \"0e07fbab-4a47-4e59-aa72-f0a4521296af\") " pod="openstack/prometheus-metric-storage-0" Mar 09 14:25:15 crc kubenswrapper[4722]: I0309 14:25:15.967313 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0e07fbab-4a47-4e59-aa72-f0a4521296af-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"0e07fbab-4a47-4e59-aa72-f0a4521296af\") " pod="openstack/prometheus-metric-storage-0" Mar 09 14:25:15 crc kubenswrapper[4722]: I0309 14:25:15.967474 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/0e07fbab-4a47-4e59-aa72-f0a4521296af-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"0e07fbab-4a47-4e59-aa72-f0a4521296af\") " pod="openstack/prometheus-metric-storage-0" Mar 09 14:25:15 crc kubenswrapper[4722]: I0309 14:25:15.967633 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0e07fbab-4a47-4e59-aa72-f0a4521296af-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"0e07fbab-4a47-4e59-aa72-f0a4521296af\") " pod="openstack/prometheus-metric-storage-0" Mar 09 14:25:16 crc kubenswrapper[4722]: I0309 14:25:16.074433 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0e07fbab-4a47-4e59-aa72-f0a4521296af-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"0e07fbab-4a47-4e59-aa72-f0a4521296af\") " pod="openstack/prometheus-metric-storage-0" Mar 09 14:25:16 crc kubenswrapper[4722]: I0309 14:25:16.074709 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/0e07fbab-4a47-4e59-aa72-f0a4521296af-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"0e07fbab-4a47-4e59-aa72-f0a4521296af\") " pod="openstack/prometheus-metric-storage-0" Mar 09 14:25:16 crc kubenswrapper[4722]: I0309 14:25:16.074831 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0e07fbab-4a47-4e59-aa72-f0a4521296af-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"0e07fbab-4a47-4e59-aa72-f0a4521296af\") " pod="openstack/prometheus-metric-storage-0" Mar 09 14:25:16 crc kubenswrapper[4722]: I0309 14:25:16.074915 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0e07fbab-4a47-4e59-aa72-f0a4521296af-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"0e07fbab-4a47-4e59-aa72-f0a4521296af\") " pod="openstack/prometheus-metric-storage-0" Mar 09 14:25:16 crc kubenswrapper[4722]: I0309 14:25:16.074972 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0e07fbab-4a47-4e59-aa72-f0a4521296af-config\") pod \"prometheus-metric-storage-0\" (UID: \"0e07fbab-4a47-4e59-aa72-f0a4521296af\") " pod="openstack/prometheus-metric-storage-0" Mar 09 14:25:16 crc kubenswrapper[4722]: I0309 14:25:16.075003 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0e07fbab-4a47-4e59-aa72-f0a4521296af-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"0e07fbab-4a47-4e59-aa72-f0a4521296af\") " pod="openstack/prometheus-metric-storage-0" Mar 09 14:25:16 crc kubenswrapper[4722]: I0309 14:25:16.075032 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67q9v\" (UniqueName: \"kubernetes.io/projected/0e07fbab-4a47-4e59-aa72-f0a4521296af-kube-api-access-67q9v\") pod \"prometheus-metric-storage-0\" (UID: \"0e07fbab-4a47-4e59-aa72-f0a4521296af\") " pod="openstack/prometheus-metric-storage-0" Mar 09 14:25:16 crc kubenswrapper[4722]: I0309 14:25:16.075052 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/0e07fbab-4a47-4e59-aa72-f0a4521296af-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: 
\"0e07fbab-4a47-4e59-aa72-f0a4521296af\") " pod="openstack/prometheus-metric-storage-0" Mar 09 14:25:16 crc kubenswrapper[4722]: I0309 14:25:16.075073 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e07fbab-4a47-4e59-aa72-f0a4521296af-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"0e07fbab-4a47-4e59-aa72-f0a4521296af\") " pod="openstack/prometheus-metric-storage-0" Mar 09 14:25:16 crc kubenswrapper[4722]: I0309 14:25:16.075088 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/0e07fbab-4a47-4e59-aa72-f0a4521296af-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"0e07fbab-4a47-4e59-aa72-f0a4521296af\") " pod="openstack/prometheus-metric-storage-0" Mar 09 14:25:16 crc kubenswrapper[4722]: I0309 14:25:16.075104 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/0e07fbab-4a47-4e59-aa72-f0a4521296af-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"0e07fbab-4a47-4e59-aa72-f0a4521296af\") " pod="openstack/prometheus-metric-storage-0" Mar 09 14:25:16 crc kubenswrapper[4722]: I0309 14:25:16.075122 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0e07fbab-4a47-4e59-aa72-f0a4521296af-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"0e07fbab-4a47-4e59-aa72-f0a4521296af\") " pod="openstack/prometheus-metric-storage-0" Mar 09 14:25:16 crc kubenswrapper[4722]: I0309 14:25:16.075140 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d60fc4df-9ddd-42e1-8e0d-2624907f046b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d60fc4df-9ddd-42e1-8e0d-2624907f046b\") pod \"prometheus-metric-storage-0\" (UID: \"0e07fbab-4a47-4e59-aa72-f0a4521296af\") " pod="openstack/prometheus-metric-storage-0" Mar 09 14:25:16 crc kubenswrapper[4722]: I0309 14:25:16.075623 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/0e07fbab-4a47-4e59-aa72-f0a4521296af-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"0e07fbab-4a47-4e59-aa72-f0a4521296af\") " pod="openstack/prometheus-metric-storage-0" Mar 09 14:25:16 crc kubenswrapper[4722]: I0309 14:25:16.075826 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/0e07fbab-4a47-4e59-aa72-f0a4521296af-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"0e07fbab-4a47-4e59-aa72-f0a4521296af\") " pod="openstack/prometheus-metric-storage-0" Mar 09 14:25:16 crc kubenswrapper[4722]: I0309 14:25:16.076194 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/0e07fbab-4a47-4e59-aa72-f0a4521296af-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"0e07fbab-4a47-4e59-aa72-f0a4521296af\") " pod="openstack/prometheus-metric-storage-0" Mar 09 14:25:16 
crc kubenswrapper[4722]: I0309 14:25:16.086948 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/0e07fbab-4a47-4e59-aa72-f0a4521296af-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"0e07fbab-4a47-4e59-aa72-f0a4521296af\") " pod="openstack/prometheus-metric-storage-0" Mar 09 14:25:16 crc kubenswrapper[4722]: I0309 14:25:16.087046 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e07fbab-4a47-4e59-aa72-f0a4521296af-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"0e07fbab-4a47-4e59-aa72-f0a4521296af\") " pod="openstack/prometheus-metric-storage-0" Mar 09 14:25:16 crc kubenswrapper[4722]: I0309 14:25:16.087288 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0e07fbab-4a47-4e59-aa72-f0a4521296af-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"0e07fbab-4a47-4e59-aa72-f0a4521296af\") " pod="openstack/prometheus-metric-storage-0" Mar 09 14:25:16 crc kubenswrapper[4722]: I0309 14:25:16.087530 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0e07fbab-4a47-4e59-aa72-f0a4521296af-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"0e07fbab-4a47-4e59-aa72-f0a4521296af\") " pod="openstack/prometheus-metric-storage-0" Mar 09 14:25:16 crc kubenswrapper[4722]: I0309 14:25:16.087932 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/0e07fbab-4a47-4e59-aa72-f0a4521296af-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"0e07fbab-4a47-4e59-aa72-f0a4521296af\") " pod="openstack/prometheus-metric-storage-0" Mar 09 14:25:16 crc kubenswrapper[4722]: I0309 14:25:16.090241 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/0e07fbab-4a47-4e59-aa72-f0a4521296af-config\") pod \"prometheus-metric-storage-0\" (UID: \"0e07fbab-4a47-4e59-aa72-f0a4521296af\") " pod="openstack/prometheus-metric-storage-0" Mar 09 14:25:16 crc kubenswrapper[4722]: I0309 14:25:16.091154 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 09 14:25:16 crc kubenswrapper[4722]: I0309 14:25:16.091178 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d60fc4df-9ddd-42e1-8e0d-2624907f046b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d60fc4df-9ddd-42e1-8e0d-2624907f046b\") pod \"prometheus-metric-storage-0\" (UID: \"0e07fbab-4a47-4e59-aa72-f0a4521296af\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/5450886054d333d0379a47b47c9d8ace333ac0051caf38aa48f7152c27378c49/globalmount\"" pod="openstack/prometheus-metric-storage-0" Mar 09 14:25:16 crc kubenswrapper[4722]: I0309 14:25:16.092764 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0e07fbab-4a47-4e59-aa72-f0a4521296af-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"0e07fbab-4a47-4e59-aa72-f0a4521296af\") " pod="openstack/prometheus-metric-storage-0" Mar 09 14:25:16 crc kubenswrapper[4722]: I0309 14:25:16.103046 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/0e07fbab-4a47-4e59-aa72-f0a4521296af-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"0e07fbab-4a47-4e59-aa72-f0a4521296af\") " pod="openstack/prometheus-metric-storage-0" Mar 09 14:25:16 crc kubenswrapper[4722]: I0309 14:25:16.110395 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67q9v\" (UniqueName: \"kubernetes.io/projected/0e07fbab-4a47-4e59-aa72-f0a4521296af-kube-api-access-67q9v\") pod \"prometheus-metric-storage-0\" (UID: \"0e07fbab-4a47-4e59-aa72-f0a4521296af\") " pod="openstack/prometheus-metric-storage-0" Mar 09 14:25:16 crc kubenswrapper[4722]: I0309 14:25:16.167894 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0db26a0-2877-48fa-b706-b5558f9973d5" path="/var/lib/kubelet/pods/e0db26a0-2877-48fa-b706-b5558f9973d5/volumes" Mar 09 14:25:16 crc kubenswrapper[4722]: I0309 14:25:16.252872 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d60fc4df-9ddd-42e1-8e0d-2624907f046b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d60fc4df-9ddd-42e1-8e0d-2624907f046b\") pod \"prometheus-metric-storage-0\" (UID: \"0e07fbab-4a47-4e59-aa72-f0a4521296af\") " pod="openstack/prometheus-metric-storage-0" Mar 09 14:25:16 crc kubenswrapper[4722]: I0309 14:25:16.331871 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-lgh7d" Mar 09 14:25:16 crc kubenswrapper[4722]: I0309 14:25:16.383124 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e762951c-0c20-44c7-927e-c314d6baf1c7-config\") pod \"e762951c-0c20-44c7-927e-c314d6baf1c7\" (UID: \"e762951c-0c20-44c7-927e-c314d6baf1c7\") " Mar 09 14:25:16 crc kubenswrapper[4722]: I0309 14:25:16.383167 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e762951c-0c20-44c7-927e-c314d6baf1c7-ovsdbserver-nb\") pod \"e762951c-0c20-44c7-927e-c314d6baf1c7\" (UID: \"e762951c-0c20-44c7-927e-c314d6baf1c7\") " Mar 09 14:25:16 crc kubenswrapper[4722]: I0309 14:25:16.383234 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e762951c-0c20-44c7-927e-c314d6baf1c7-ovsdbserver-sb\") pod \"e762951c-0c20-44c7-927e-c314d6baf1c7\" (UID: \"e762951c-0c20-44c7-927e-c314d6baf1c7\") " Mar 09 14:25:16 crc kubenswrapper[4722]: I0309 14:25:16.383287 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e762951c-0c20-44c7-927e-c314d6baf1c7-dns-swift-storage-0\") pod \"e762951c-0c20-44c7-927e-c314d6baf1c7\" (UID: \"e762951c-0c20-44c7-927e-c314d6baf1c7\") " Mar 09 14:25:16 crc kubenswrapper[4722]: I0309 14:25:16.383339 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e762951c-0c20-44c7-927e-c314d6baf1c7-dns-svc\") pod \"e762951c-0c20-44c7-927e-c314d6baf1c7\" (UID: \"e762951c-0c20-44c7-927e-c314d6baf1c7\") " Mar 09 14:25:16 crc kubenswrapper[4722]: I0309 14:25:16.383451 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wlr7\" (UniqueName: \"kubernetes.io/projected/e762951c-0c20-44c7-927e-c314d6baf1c7-kube-api-access-7wlr7\") pod \"e762951c-0c20-44c7-927e-c314d6baf1c7\" (UID: \"e762951c-0c20-44c7-927e-c314d6baf1c7\") " Mar 09 14:25:16 crc kubenswrapper[4722]: I0309 14:25:16.387427 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e762951c-0c20-44c7-927e-c314d6baf1c7-kube-api-access-7wlr7" (OuterVolumeSpecName: "kube-api-access-7wlr7") pod "e762951c-0c20-44c7-927e-c314d6baf1c7" (UID: "e762951c-0c20-44c7-927e-c314d6baf1c7"). InnerVolumeSpecName "kube-api-access-7wlr7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:25:16 crc kubenswrapper[4722]: I0309 14:25:16.439222 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e762951c-0c20-44c7-927e-c314d6baf1c7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e762951c-0c20-44c7-927e-c314d6baf1c7" (UID: "e762951c-0c20-44c7-927e-c314d6baf1c7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:25:16 crc kubenswrapper[4722]: I0309 14:25:16.447144 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e762951c-0c20-44c7-927e-c314d6baf1c7-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e762951c-0c20-44c7-927e-c314d6baf1c7" (UID: "e762951c-0c20-44c7-927e-c314d6baf1c7"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:25:16 crc kubenswrapper[4722]: I0309 14:25:16.447618 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e762951c-0c20-44c7-927e-c314d6baf1c7-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e762951c-0c20-44c7-927e-c314d6baf1c7" (UID: "e762951c-0c20-44c7-927e-c314d6baf1c7"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:25:16 crc kubenswrapper[4722]: I0309 14:25:16.455054 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e762951c-0c20-44c7-927e-c314d6baf1c7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e762951c-0c20-44c7-927e-c314d6baf1c7" (UID: "e762951c-0c20-44c7-927e-c314d6baf1c7"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:25:16 crc kubenswrapper[4722]: I0309 14:25:16.478819 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e762951c-0c20-44c7-927e-c314d6baf1c7-config" (OuterVolumeSpecName: "config") pod "e762951c-0c20-44c7-927e-c314d6baf1c7" (UID: "e762951c-0c20-44c7-927e-c314d6baf1c7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:25:16 crc kubenswrapper[4722]: I0309 14:25:16.485550 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e762951c-0c20-44c7-927e-c314d6baf1c7-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 09 14:25:16 crc kubenswrapper[4722]: I0309 14:25:16.485584 4722 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e762951c-0c20-44c7-927e-c314d6baf1c7-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 09 14:25:16 crc kubenswrapper[4722]: I0309 14:25:16.485596 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e762951c-0c20-44c7-927e-c314d6baf1c7-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 14:25:16 crc kubenswrapper[4722]: I0309 14:25:16.485607 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wlr7\" (UniqueName: \"kubernetes.io/projected/e762951c-0c20-44c7-927e-c314d6baf1c7-kube-api-access-7wlr7\") on node \"crc\" DevicePath \"\"" Mar 09 14:25:16 crc kubenswrapper[4722]: I0309 14:25:16.485616 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e762951c-0c20-44c7-927e-c314d6baf1c7-config\") on node \"crc\" DevicePath \"\"" Mar 09 14:25:16 crc kubenswrapper[4722]: I0309 14:25:16.485624 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e762951c-0c20-44c7-927e-c314d6baf1c7-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 09 14:25:16 crc kubenswrapper[4722]: I0309 14:25:16.495310 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 09 14:25:16 crc kubenswrapper[4722]: I0309 14:25:16.606618 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-qccbt" event={"ID":"f87e19a1-bff1-4e82-8677-3812b2f51f46","Type":"ContainerStarted","Data":"d7a02dfe8d14a05bddff13ef5f261df15fa1e4afa9505ac0d75001bbd58e3903"} Mar 09 14:25:16 crc kubenswrapper[4722]: I0309 14:25:16.607356 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74f6bcbc87-qccbt" Mar 09 14:25:16 crc kubenswrapper[4722]: I0309 14:25:16.615182 4722 generic.go:334] "Generic (PLEG): container finished" podID="e762951c-0c20-44c7-927e-c314d6baf1c7" containerID="6971521a0a931ee86b3fbac4e1562fd3eeadfafabb001aa15b6254ce3e269857" exitCode=0 Mar 09 14:25:16 crc kubenswrapper[4722]: I0309 14:25:16.615344 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-lgh7d" event={"ID":"e762951c-0c20-44c7-927e-c314d6baf1c7","Type":"ContainerDied","Data":"6971521a0a931ee86b3fbac4e1562fd3eeadfafabb001aa15b6254ce3e269857"} Mar 09 14:25:16 crc kubenswrapper[4722]: I0309 14:25:16.615418 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-lgh7d" event={"ID":"e762951c-0c20-44c7-927e-c314d6baf1c7","Type":"ContainerDied","Data":"92e67a80a82f31780adc28ab6b875690201291843a37b1df669b30468dc4cc5c"} Mar 09 14:25:16 crc kubenswrapper[4722]: I0309 14:25:16.615607 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-lgh7d" Mar 09 14:25:16 crc kubenswrapper[4722]: I0309 14:25:16.615641 4722 scope.go:117] "RemoveContainer" containerID="6971521a0a931ee86b3fbac4e1562fd3eeadfafabb001aa15b6254ce3e269857" Mar 09 14:25:16 crc kubenswrapper[4722]: I0309 14:25:16.643850 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74f6bcbc87-qccbt" podStartSLOduration=3.643834701 podStartE2EDuration="3.643834701s" podCreationTimestamp="2026-03-09 14:25:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:25:16.627021826 +0000 UTC m=+1357.182590422" watchObservedRunningTime="2026-03-09 14:25:16.643834701 +0000 UTC m=+1357.199403277" Mar 09 14:25:16 crc kubenswrapper[4722]: I0309 14:25:16.654470 4722 scope.go:117] "RemoveContainer" containerID="2d288737d0b02b3d7e1abeadb235b24ba3bec04f56a1b34e72a52bc15136a763" Mar 09 14:25:16 crc kubenswrapper[4722]: I0309 14:25:16.668360 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-lgh7d"] Mar 09 14:25:16 crc kubenswrapper[4722]: I0309 14:25:16.680529 4722 scope.go:117] "RemoveContainer" containerID="6971521a0a931ee86b3fbac4e1562fd3eeadfafabb001aa15b6254ce3e269857" Mar 09 14:25:16 crc kubenswrapper[4722]: E0309 14:25:16.684621 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6971521a0a931ee86b3fbac4e1562fd3eeadfafabb001aa15b6254ce3e269857\": container with ID starting with 6971521a0a931ee86b3fbac4e1562fd3eeadfafabb001aa15b6254ce3e269857 not found: ID does not exist" containerID="6971521a0a931ee86b3fbac4e1562fd3eeadfafabb001aa15b6254ce3e269857" Mar 09 14:25:16 crc kubenswrapper[4722]: I0309 14:25:16.684655 4722 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6971521a0a931ee86b3fbac4e1562fd3eeadfafabb001aa15b6254ce3e269857"} err="failed to get container status \"6971521a0a931ee86b3fbac4e1562fd3eeadfafabb001aa15b6254ce3e269857\": rpc error: code = NotFound desc = could not find container \"6971521a0a931ee86b3fbac4e1562fd3eeadfafabb001aa15b6254ce3e269857\": container with ID starting with 6971521a0a931ee86b3fbac4e1562fd3eeadfafabb001aa15b6254ce3e269857 not found: ID does not exist" Mar 09 14:25:16 crc kubenswrapper[4722]: I0309 14:25:16.684675 4722 scope.go:117] "RemoveContainer" containerID="2d288737d0b02b3d7e1abeadb235b24ba3bec04f56a1b34e72a52bc15136a763" Mar 09 14:25:16 crc kubenswrapper[4722]: E0309 14:25:16.684967 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d288737d0b02b3d7e1abeadb235b24ba3bec04f56a1b34e72a52bc15136a763\": container with ID starting with 2d288737d0b02b3d7e1abeadb235b24ba3bec04f56a1b34e72a52bc15136a763 not found: ID does not exist" containerID="2d288737d0b02b3d7e1abeadb235b24ba3bec04f56a1b34e72a52bc15136a763" Mar 09 14:25:16 crc kubenswrapper[4722]: I0309 14:25:16.684987 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d288737d0b02b3d7e1abeadb235b24ba3bec04f56a1b34e72a52bc15136a763"} err="failed to get container status \"2d288737d0b02b3d7e1abeadb235b24ba3bec04f56a1b34e72a52bc15136a763\": rpc error: code = NotFound desc = could not find container \"2d288737d0b02b3d7e1abeadb235b24ba3bec04f56a1b34e72a52bc15136a763\": container with ID starting with 2d288737d0b02b3d7e1abeadb235b24ba3bec04f56a1b34e72a52bc15136a763 not found: ID does not exist" Mar 09 14:25:16 crc kubenswrapper[4722]: I0309 14:25:16.689115 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-lgh7d"] Mar 09 14:25:17 crc kubenswrapper[4722]: I0309 14:25:17.002171 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 09 14:25:17 crc kubenswrapper[4722]: W0309 14:25:17.022029 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e07fbab_4a47_4e59_aa72_f0a4521296af.slice/crio-bdf4a7d97d9db3011caa33f68d6b5fca001ee24d8fcb92c17ea1e7957aaa83a5 WatchSource:0}: Error finding container bdf4a7d97d9db3011caa33f68d6b5fca001ee24d8fcb92c17ea1e7957aaa83a5: Status 404 returned error can't find the container with id bdf4a7d97d9db3011caa33f68d6b5fca001ee24d8fcb92c17ea1e7957aaa83a5 Mar 09 14:25:17 crc kubenswrapper[4722]: I0309 14:25:17.626751 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"0e07fbab-4a47-4e59-aa72-f0a4521296af","Type":"ContainerStarted","Data":"bdf4a7d97d9db3011caa33f68d6b5fca001ee24d8fcb92c17ea1e7957aaa83a5"} Mar 09 14:25:17 crc kubenswrapper[4722]: I0309 14:25:17.947511 4722 scope.go:117] "RemoveContainer" containerID="32938ebe1b7b6b82bb2086db38761774319a2bbe444d065ebebb2cb4f6773226" Mar 09 14:25:18 crc kubenswrapper[4722]: I0309 14:25:18.169762 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e762951c-0c20-44c7-927e-c314d6baf1c7" path="/var/lib/kubelet/pods/e762951c-0c20-44c7-927e-c314d6baf1c7/volumes" Mar 09 14:25:18 crc kubenswrapper[4722]: I0309 14:25:18.361976 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 09 14:25:18 crc kubenswrapper[4722]: I0309 14:25:18.749769 4722 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-fbldj"] Mar 09 14:25:18 crc kubenswrapper[4722]: E0309 14:25:18.750286 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e762951c-0c20-44c7-927e-c314d6baf1c7" containerName="dnsmasq-dns" Mar 09 14:25:18 crc kubenswrapper[4722]: I0309 14:25:18.750303 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="e762951c-0c20-44c7-927e-c314d6baf1c7" containerName="dnsmasq-dns" Mar 09 14:25:18 crc kubenswrapper[4722]: E0309 14:25:18.750349 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e762951c-0c20-44c7-927e-c314d6baf1c7" containerName="init" Mar 09 14:25:18 crc kubenswrapper[4722]: I0309 14:25:18.750358 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="e762951c-0c20-44c7-927e-c314d6baf1c7" containerName="init" Mar 09 14:25:18 crc kubenswrapper[4722]: I0309 14:25:18.750607 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="e762951c-0c20-44c7-927e-c314d6baf1c7" containerName="dnsmasq-dns" Mar 09 14:25:18 crc kubenswrapper[4722]: I0309 14:25:18.751460 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-fbldj" Mar 09 14:25:18 crc kubenswrapper[4722]: I0309 14:25:18.763041 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-fbldj"] Mar 09 14:25:18 crc kubenswrapper[4722]: I0309 14:25:18.835605 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e8291dd-3577-4312-9ef5-fddf9a98b9db-operator-scripts\") pod \"heat-db-create-fbldj\" (UID: \"4e8291dd-3577-4312-9ef5-fddf9a98b9db\") " pod="openstack/heat-db-create-fbldj" Mar 09 14:25:18 crc kubenswrapper[4722]: I0309 14:25:18.835843 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gj2zc\" (UniqueName: \"kubernetes.io/projected/4e8291dd-3577-4312-9ef5-fddf9a98b9db-kube-api-access-gj2zc\") pod \"heat-db-create-fbldj\" (UID: \"4e8291dd-3577-4312-9ef5-fddf9a98b9db\") " pod="openstack/heat-db-create-fbldj" Mar 09 14:25:18 crc kubenswrapper[4722]: I0309 14:25:18.837751 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-28mvd"] Mar 09 14:25:18 crc kubenswrapper[4722]: I0309 14:25:18.839071 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-28mvd" Mar 09 14:25:18 crc kubenswrapper[4722]: I0309 14:25:18.850686 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-28mvd"] Mar 09 14:25:18 crc kubenswrapper[4722]: I0309 14:25:18.919430 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-b229-account-create-update-8qcbk"] Mar 09 14:25:18 crc kubenswrapper[4722]: I0309 14:25:18.921527 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-b229-account-create-update-8qcbk" Mar 09 14:25:18 crc kubenswrapper[4722]: I0309 14:25:18.928810 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret" Mar 09 14:25:18 crc kubenswrapper[4722]: I0309 14:25:18.943355 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-b229-account-create-update-8qcbk"] Mar 09 14:25:18 crc kubenswrapper[4722]: I0309 14:25:18.951935 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e8291dd-3577-4312-9ef5-fddf9a98b9db-operator-scripts\") pod \"heat-db-create-fbldj\" (UID: \"4e8291dd-3577-4312-9ef5-fddf9a98b9db\") " pod="openstack/heat-db-create-fbldj" Mar 09 14:25:18 crc kubenswrapper[4722]: I0309 14:25:18.952148 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gj2zc\" (UniqueName: \"kubernetes.io/projected/4e8291dd-3577-4312-9ef5-fddf9a98b9db-kube-api-access-gj2zc\") pod \"heat-db-create-fbldj\" (UID: \"4e8291dd-3577-4312-9ef5-fddf9a98b9db\") " pod="openstack/heat-db-create-fbldj" Mar 09 14:25:18 crc kubenswrapper[4722]: I0309 14:25:18.952230 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxljj\" (UniqueName: \"kubernetes.io/projected/7c70c5f8-b667-4d94-96f4-7184d2fc36aa-kube-api-access-zxljj\") pod \"cinder-db-create-28mvd\" (UID: \"7c70c5f8-b667-4d94-96f4-7184d2fc36aa\") " pod="openstack/cinder-db-create-28mvd" Mar 09 14:25:18 crc kubenswrapper[4722]: I0309 14:25:18.952366 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c70c5f8-b667-4d94-96f4-7184d2fc36aa-operator-scripts\") pod \"cinder-db-create-28mvd\" (UID: \"7c70c5f8-b667-4d94-96f4-7184d2fc36aa\") " pod="openstack/cinder-db-create-28mvd" Mar 09 14:25:18 crc kubenswrapper[4722]: I0309 14:25:18.966819 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e8291dd-3577-4312-9ef5-fddf9a98b9db-operator-scripts\") pod \"heat-db-create-fbldj\" (UID: \"4e8291dd-3577-4312-9ef5-fddf9a98b9db\") " pod="openstack/heat-db-create-fbldj" Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.054893 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wr2l8\" (UniqueName: \"kubernetes.io/projected/ee1ac8e7-cf85-4fe4-9e02-7be1c6ac8119-kube-api-access-wr2l8\") pod \"heat-b229-account-create-update-8qcbk\" (UID: \"ee1ac8e7-cf85-4fe4-9e02-7be1c6ac8119\") " pod="openstack/heat-b229-account-create-update-8qcbk" Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.054996 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxljj\" (UniqueName: \"kubernetes.io/projected/7c70c5f8-b667-4d94-96f4-7184d2fc36aa-kube-api-access-zxljj\") pod \"cinder-db-create-28mvd\" (UID: \"7c70c5f8-b667-4d94-96f4-7184d2fc36aa\") " pod="openstack/cinder-db-create-28mvd" Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.055082 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c70c5f8-b667-4d94-96f4-7184d2fc36aa-operator-scripts\") pod \"cinder-db-create-28mvd\" (UID: \"7c70c5f8-b667-4d94-96f4-7184d2fc36aa\") " 
pod="openstack/cinder-db-create-28mvd" Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.055172 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee1ac8e7-cf85-4fe4-9e02-7be1c6ac8119-operator-scripts\") pod \"heat-b229-account-create-update-8qcbk\" (UID: \"ee1ac8e7-cf85-4fe4-9e02-7be1c6ac8119\") " pod="openstack/heat-b229-account-create-update-8qcbk" Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.056324 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c70c5f8-b667-4d94-96f4-7184d2fc36aa-operator-scripts\") pod \"cinder-db-create-28mvd\" (UID: \"7c70c5f8-b667-4d94-96f4-7184d2fc36aa\") " pod="openstack/cinder-db-create-28mvd" Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.077098 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gj2zc\" (UniqueName: \"kubernetes.io/projected/4e8291dd-3577-4312-9ef5-fddf9a98b9db-kube-api-access-gj2zc\") pod \"heat-db-create-fbldj\" (UID: \"4e8291dd-3577-4312-9ef5-fddf9a98b9db\") " pod="openstack/heat-db-create-fbldj" Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.085273 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-c159-account-create-update-wrg5t"] Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.086919 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-c159-account-create-update-wrg5t" Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.092522 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.095242 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-c159-account-create-update-wrg5t"] Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.102761 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxljj\" (UniqueName: \"kubernetes.io/projected/7c70c5f8-b667-4d94-96f4-7184d2fc36aa-kube-api-access-zxljj\") pod \"cinder-db-create-28mvd\" (UID: \"7c70c5f8-b667-4d94-96f4-7184d2fc36aa\") " pod="openstack/cinder-db-create-28mvd" Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.153306 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-gw98d"] Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.154346 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-28mvd" Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.155170 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-gw98d" Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.159399 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee1ac8e7-cf85-4fe4-9e02-7be1c6ac8119-operator-scripts\") pod \"heat-b229-account-create-update-8qcbk\" (UID: \"ee1ac8e7-cf85-4fe4-9e02-7be1c6ac8119\") " pod="openstack/heat-b229-account-create-update-8qcbk" Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.167354 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wr2l8\" (UniqueName: \"kubernetes.io/projected/ee1ac8e7-cf85-4fe4-9e02-7be1c6ac8119-kube-api-access-wr2l8\") pod \"heat-b229-account-create-update-8qcbk\" (UID: \"ee1ac8e7-cf85-4fe4-9e02-7be1c6ac8119\") " pod="openstack/heat-b229-account-create-update-8qcbk" Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.167642 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6cddd9c-6855-4f24-bbe7-b628c2431354-operator-scripts\") pod \"cinder-c159-account-create-update-wrg5t\" (UID: \"d6cddd9c-6855-4f24-bbe7-b628c2431354\") " pod="openstack/cinder-c159-account-create-update-wrg5t" Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.167948 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52zk9\" (UniqueName: \"kubernetes.io/projected/d6cddd9c-6855-4f24-bbe7-b628c2431354-kube-api-access-52zk9\") pod \"cinder-c159-account-create-update-wrg5t\" (UID: \"d6cddd9c-6855-4f24-bbe7-b628c2431354\") " pod="openstack/cinder-c159-account-create-update-wrg5t" Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.159993 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee1ac8e7-cf85-4fe4-9e02-7be1c6ac8119-operator-scripts\") pod \"heat-b229-account-create-update-8qcbk\" (UID: \"ee1ac8e7-cf85-4fe4-9e02-7be1c6ac8119\") " pod="openstack/heat-b229-account-create-update-8qcbk" Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.169377 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-jfd9g" Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.176886 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.177092 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.177385 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.198421 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-gw98d"] Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.205538 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wr2l8\" (UniqueName: \"kubernetes.io/projected/ee1ac8e7-cf85-4fe4-9e02-7be1c6ac8119-kube-api-access-wr2l8\") pod \"heat-b229-account-create-update-8qcbk\" (UID: \"ee1ac8e7-cf85-4fe4-9e02-7be1c6ac8119\") " pod="openstack/heat-b229-account-create-update-8qcbk" Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.227616 4722 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/barbican-db-create-2f2fz"] Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.231282 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-2f2fz" Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.252176 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-2f2fz"] Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.257809 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-b229-account-create-update-8qcbk" Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.270362 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pbfw\" (UniqueName: \"kubernetes.io/projected/1f457549-38ab-40d6-97ad-160ce234e8e5-kube-api-access-7pbfw\") pod \"keystone-db-sync-gw98d\" (UID: \"1f457549-38ab-40d6-97ad-160ce234e8e5\") " pod="openstack/keystone-db-sync-gw98d" Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.270431 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52zk9\" (UniqueName: \"kubernetes.io/projected/d6cddd9c-6855-4f24-bbe7-b628c2431354-kube-api-access-52zk9\") pod \"cinder-c159-account-create-update-wrg5t\" (UID: \"d6cddd9c-6855-4f24-bbe7-b628c2431354\") " pod="openstack/cinder-c159-account-create-update-wrg5t" Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.270534 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f457549-38ab-40d6-97ad-160ce234e8e5-combined-ca-bundle\") pod \"keystone-db-sync-gw98d\" (UID: \"1f457549-38ab-40d6-97ad-160ce234e8e5\") " pod="openstack/keystone-db-sync-gw98d" Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.270582 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f457549-38ab-40d6-97ad-160ce234e8e5-config-data\") pod \"keystone-db-sync-gw98d\" (UID: \"1f457549-38ab-40d6-97ad-160ce234e8e5\") " pod="openstack/keystone-db-sync-gw98d" Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.270760 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6cddd9c-6855-4f24-bbe7-b628c2431354-operator-scripts\") pod \"cinder-c159-account-create-update-wrg5t\" (UID: \"d6cddd9c-6855-4f24-bbe7-b628c2431354\") " pod="openstack/cinder-c159-account-create-update-wrg5t" Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.272692 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6cddd9c-6855-4f24-bbe7-b628c2431354-operator-scripts\") pod \"cinder-c159-account-create-update-wrg5t\" (UID: \"d6cddd9c-6855-4f24-bbe7-b628c2431354\") " pod="openstack/cinder-c159-account-create-update-wrg5t" Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.323946 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52zk9\" (UniqueName: \"kubernetes.io/projected/d6cddd9c-6855-4f24-bbe7-b628c2431354-kube-api-access-52zk9\") pod \"cinder-c159-account-create-update-wrg5t\" (UID: \"d6cddd9c-6855-4f24-bbe7-b628c2431354\") " pod="openstack/cinder-c159-account-create-update-wrg5t" Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.358783 4722 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/neutron-db-create-d6qd7"] Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.360506 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-d6qd7" Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.371026 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-fbldj" Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.372976 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pbfw\" (UniqueName: \"kubernetes.io/projected/1f457549-38ab-40d6-97ad-160ce234e8e5-kube-api-access-7pbfw\") pod \"keystone-db-sync-gw98d\" (UID: \"1f457549-38ab-40d6-97ad-160ce234e8e5\") " pod="openstack/keystone-db-sync-gw98d" Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.373059 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69af744d-8cc5-460e-8a51-f867d17f8e49-operator-scripts\") pod \"barbican-db-create-2f2fz\" (UID: \"69af744d-8cc5-460e-8a51-f867d17f8e49\") " pod="openstack/barbican-db-create-2f2fz" Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.373094 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctwg2\" (UniqueName: \"kubernetes.io/projected/69af744d-8cc5-460e-8a51-f867d17f8e49-kube-api-access-ctwg2\") pod \"barbican-db-create-2f2fz\" (UID: \"69af744d-8cc5-460e-8a51-f867d17f8e49\") " pod="openstack/barbican-db-create-2f2fz" Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.373152 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f457549-38ab-40d6-97ad-160ce234e8e5-combined-ca-bundle\") pod \"keystone-db-sync-gw98d\" (UID: \"1f457549-38ab-40d6-97ad-160ce234e8e5\") " pod="openstack/keystone-db-sync-gw98d" Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.373197 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f457549-38ab-40d6-97ad-160ce234e8e5-config-data\") pod \"keystone-db-sync-gw98d\" (UID: \"1f457549-38ab-40d6-97ad-160ce234e8e5\") " pod="openstack/keystone-db-sync-gw98d" Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.379240 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f457549-38ab-40d6-97ad-160ce234e8e5-combined-ca-bundle\") pod \"keystone-db-sync-gw98d\" (UID: \"1f457549-38ab-40d6-97ad-160ce234e8e5\") " pod="openstack/keystone-db-sync-gw98d" Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.382892 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f457549-38ab-40d6-97ad-160ce234e8e5-config-data\") pod \"keystone-db-sync-gw98d\" (UID: \"1f457549-38ab-40d6-97ad-160ce234e8e5\") " pod="openstack/keystone-db-sync-gw98d" Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.399091 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-d6qd7"] Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.402778 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-c159-account-create-update-wrg5t" Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.406072 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-b1e8-account-create-update-6xt7j"] Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.407376 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-b1e8-account-create-update-6xt7j" Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.412652 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.421118 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pbfw\" (UniqueName: \"kubernetes.io/projected/1f457549-38ab-40d6-97ad-160ce234e8e5-kube-api-access-7pbfw\") pod \"keystone-db-sync-gw98d\" (UID: \"1f457549-38ab-40d6-97ad-160ce234e8e5\") " pod="openstack/keystone-db-sync-gw98d" Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.442168 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-b1e8-account-create-update-6xt7j"] Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.483586 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbx8k\" (UniqueName: \"kubernetes.io/projected/8e2e23b8-e05f-4e51-b435-baba257c7a55-kube-api-access-qbx8k\") pod \"neutron-db-create-d6qd7\" (UID: \"8e2e23b8-e05f-4e51-b435-baba257c7a55\") " pod="openstack/neutron-db-create-d6qd7" Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.483622 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e2e23b8-e05f-4e51-b435-baba257c7a55-operator-scripts\") pod \"neutron-db-create-d6qd7\" (UID: \"8e2e23b8-e05f-4e51-b435-baba257c7a55\") " pod="openstack/neutron-db-create-d6qd7" Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.483651 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6934dfa5-4f2a-4241-8cc2-20a87e4659e8-operator-scripts\") pod \"barbican-b1e8-account-create-update-6xt7j\" (UID: \"6934dfa5-4f2a-4241-8cc2-20a87e4659e8\") " pod="openstack/barbican-b1e8-account-create-update-6xt7j" Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.483725 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69af744d-8cc5-460e-8a51-f867d17f8e49-operator-scripts\") pod \"barbican-db-create-2f2fz\" (UID: \"69af744d-8cc5-460e-8a51-f867d17f8e49\") " pod="openstack/barbican-db-create-2f2fz" Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.483745 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctwg2\" (UniqueName: \"kubernetes.io/projected/69af744d-8cc5-460e-8a51-f867d17f8e49-kube-api-access-ctwg2\") pod \"barbican-db-create-2f2fz\" (UID: \"69af744d-8cc5-460e-8a51-f867d17f8e49\") " pod="openstack/barbican-db-create-2f2fz" Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.483867 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4mm5\" (UniqueName: \"kubernetes.io/projected/6934dfa5-4f2a-4241-8cc2-20a87e4659e8-kube-api-access-p4mm5\") pod \"barbican-b1e8-account-create-update-6xt7j\" (UID: 
\"6934dfa5-4f2a-4241-8cc2-20a87e4659e8\") " pod="openstack/barbican-b1e8-account-create-update-6xt7j" Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.484648 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69af744d-8cc5-460e-8a51-f867d17f8e49-operator-scripts\") pod \"barbican-db-create-2f2fz\" (UID: \"69af744d-8cc5-460e-8a51-f867d17f8e49\") " pod="openstack/barbican-db-create-2f2fz" Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.586037 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4mm5\" (UniqueName: \"kubernetes.io/projected/6934dfa5-4f2a-4241-8cc2-20a87e4659e8-kube-api-access-p4mm5\") pod \"barbican-b1e8-account-create-update-6xt7j\" (UID: \"6934dfa5-4f2a-4241-8cc2-20a87e4659e8\") " pod="openstack/barbican-b1e8-account-create-update-6xt7j" Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.586120 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbx8k\" (UniqueName: \"kubernetes.io/projected/8e2e23b8-e05f-4e51-b435-baba257c7a55-kube-api-access-qbx8k\") pod \"neutron-db-create-d6qd7\" (UID: \"8e2e23b8-e05f-4e51-b435-baba257c7a55\") " pod="openstack/neutron-db-create-d6qd7" Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.586145 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e2e23b8-e05f-4e51-b435-baba257c7a55-operator-scripts\") pod \"neutron-db-create-d6qd7\" (UID: \"8e2e23b8-e05f-4e51-b435-baba257c7a55\") " pod="openstack/neutron-db-create-d6qd7" Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.586175 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6934dfa5-4f2a-4241-8cc2-20a87e4659e8-operator-scripts\") pod \"barbican-b1e8-account-create-update-6xt7j\" (UID: \"6934dfa5-4f2a-4241-8cc2-20a87e4659e8\") " pod="openstack/barbican-b1e8-account-create-update-6xt7j" Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.586955 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e2e23b8-e05f-4e51-b435-baba257c7a55-operator-scripts\") pod \"neutron-db-create-d6qd7\" (UID: \"8e2e23b8-e05f-4e51-b435-baba257c7a55\") " pod="openstack/neutron-db-create-d6qd7" Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.587461 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6934dfa5-4f2a-4241-8cc2-20a87e4659e8-operator-scripts\") pod \"barbican-b1e8-account-create-update-6xt7j\" (UID: \"6934dfa5-4f2a-4241-8cc2-20a87e4659e8\") " pod="openstack/barbican-b1e8-account-create-update-6xt7j" Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.659134 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-ea4e-account-create-update-vgttx"] Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.661101 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-ea4e-account-create-update-vgttx" Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.663404 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.669271 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-ea4e-account-create-update-vgttx"] Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.679311 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctwg2\" (UniqueName: \"kubernetes.io/projected/69af744d-8cc5-460e-8a51-f867d17f8e49-kube-api-access-ctwg2\") pod \"barbican-db-create-2f2fz\" (UID: \"69af744d-8cc5-460e-8a51-f867d17f8e49\") " pod="openstack/barbican-db-create-2f2fz" Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.686121 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbx8k\" (UniqueName: \"kubernetes.io/projected/8e2e23b8-e05f-4e51-b435-baba257c7a55-kube-api-access-qbx8k\") pod \"neutron-db-create-d6qd7\" (UID: \"8e2e23b8-e05f-4e51-b435-baba257c7a55\") " pod="openstack/neutron-db-create-d6qd7" Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.688048 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4mm5\" (UniqueName: \"kubernetes.io/projected/6934dfa5-4f2a-4241-8cc2-20a87e4659e8-kube-api-access-p4mm5\") pod \"barbican-b1e8-account-create-update-6xt7j\" (UID: \"6934dfa5-4f2a-4241-8cc2-20a87e4659e8\") " pod="openstack/barbican-b1e8-account-create-update-6xt7j" Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.708917 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-gw98d" Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.736534 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-2f2fz" Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.749784 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-d6qd7" Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.763357 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-b1e8-account-create-update-6xt7j" Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.792919 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e291a0c4-8966-4c54-b4c9-fa2ed6c28b1b-operator-scripts\") pod \"neutron-ea4e-account-create-update-vgttx\" (UID: \"e291a0c4-8966-4c54-b4c9-fa2ed6c28b1b\") " pod="openstack/neutron-ea4e-account-create-update-vgttx" Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.793153 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jn8rs\" (UniqueName: \"kubernetes.io/projected/e291a0c4-8966-4c54-b4c9-fa2ed6c28b1b-kube-api-access-jn8rs\") pod \"neutron-ea4e-account-create-update-vgttx\" (UID: \"e291a0c4-8966-4c54-b4c9-fa2ed6c28b1b\") " pod="openstack/neutron-ea4e-account-create-update-vgttx" Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.895750 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e291a0c4-8966-4c54-b4c9-fa2ed6c28b1b-operator-scripts\") pod \"neutron-ea4e-account-create-update-vgttx\" (UID: \"e291a0c4-8966-4c54-b4c9-fa2ed6c28b1b\") " pod="openstack/neutron-ea4e-account-create-update-vgttx" Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.896242 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jn8rs\" (UniqueName: \"kubernetes.io/projected/e291a0c4-8966-4c54-b4c9-fa2ed6c28b1b-kube-api-access-jn8rs\") pod \"neutron-ea4e-account-create-update-vgttx\" (UID: \"e291a0c4-8966-4c54-b4c9-fa2ed6c28b1b\") " pod="openstack/neutron-ea4e-account-create-update-vgttx" Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.896795 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e291a0c4-8966-4c54-b4c9-fa2ed6c28b1b-operator-scripts\") pod \"neutron-ea4e-account-create-update-vgttx\" (UID: \"e291a0c4-8966-4c54-b4c9-fa2ed6c28b1b\") " pod="openstack/neutron-ea4e-account-create-update-vgttx" Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.933389 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jn8rs\" (UniqueName: \"kubernetes.io/projected/e291a0c4-8966-4c54-b4c9-fa2ed6c28b1b-kube-api-access-jn8rs\") pod \"neutron-ea4e-account-create-update-vgttx\" (UID: \"e291a0c4-8966-4c54-b4c9-fa2ed6c28b1b\") " pod="openstack/neutron-ea4e-account-create-update-vgttx" Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.972038 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-28mvd"] Mar 09 14:25:19 crc kubenswrapper[4722]: I0309 14:25:19.988103 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-ea4e-account-create-update-vgttx" Mar 09 14:25:20 crc kubenswrapper[4722]: I0309 14:25:20.301212 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-b229-account-create-update-8qcbk"] Mar 09 14:25:20 crc kubenswrapper[4722]: I0309 14:25:20.672246 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-28mvd" event={"ID":"7c70c5f8-b667-4d94-96f4-7184d2fc36aa","Type":"ContainerStarted","Data":"0818377d7e7c8f400a6f0a5a2ec3ea62dc6867105f8b49f85ba04d8ffe1a915d"} Mar 09 14:25:20 crc kubenswrapper[4722]: I0309 14:25:20.672287 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-28mvd" event={"ID":"7c70c5f8-b667-4d94-96f4-7184d2fc36aa","Type":"ContainerStarted","Data":"39780260fa846c5d42a18c34e884d1f3474d275baccb014230e060d0606cdaf1"} Mar 09 14:25:20 crc kubenswrapper[4722]: I0309 14:25:20.676524 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-b229-account-create-update-8qcbk" event={"ID":"ee1ac8e7-cf85-4fe4-9e02-7be1c6ac8119","Type":"ContainerStarted","Data":"1ef10ab0e5ce60e5b0f7a11819a85a66d4a6d04218225bb94af9e274afcf0af3"} Mar 09 14:25:20 crc kubenswrapper[4722]: I0309 14:25:20.690608 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"0e07fbab-4a47-4e59-aa72-f0a4521296af","Type":"ContainerStarted","Data":"20c16653ad371dd67281816bdd470b5061bf7b0b72384a8e4ed5df82525a12d7"} Mar 09 14:25:20 crc kubenswrapper[4722]: I0309 14:25:20.723015 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-2f2fz"] Mar 09 14:25:20 crc kubenswrapper[4722]: I0309 14:25:20.737274 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-fbldj"] Mar 09 14:25:20 crc kubenswrapper[4722]: I0309 14:25:20.749303 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-gw98d"] Mar 09 14:25:20 crc kubenswrapper[4722]: I0309 14:25:20.759632 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-c159-account-create-update-wrg5t"] Mar 09 14:25:20 crc kubenswrapper[4722]: I0309 14:25:20.836848 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-b1e8-account-create-update-6xt7j"] Mar 09 14:25:20 crc kubenswrapper[4722]: I0309 14:25:20.947251 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-d6qd7"] Mar 09 14:25:20 crc kubenswrapper[4722]: W0309 14:25:20.964826 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e2e23b8_e05f_4e51_b435_baba257c7a55.slice/crio-83eb56dea6d08051a67002266ba78f22719462829b47e2a89ae8c199468cc59f WatchSource:0}: Error finding container 83eb56dea6d08051a67002266ba78f22719462829b47e2a89ae8c199468cc59f: Status 404 returned error can't find the container with id 83eb56dea6d08051a67002266ba78f22719462829b47e2a89ae8c199468cc59f Mar 09 14:25:20 crc kubenswrapper[4722]: I0309 14:25:20.978252 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-ea4e-account-create-update-vgttx"] Mar 09 14:25:20 crc kubenswrapper[4722]: W0309 14:25:20.993724 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode291a0c4_8966_4c54_b4c9_fa2ed6c28b1b.slice/crio-7f6ed815aebb17944ca8606c4dead154bf208ae59773b8a5e4dde89e2f009428 WatchSource:0}: Error finding 
container 7f6ed815aebb17944ca8606c4dead154bf208ae59773b8a5e4dde89e2f009428: Status 404 returned error can't find the container with id 7f6ed815aebb17944ca8606c4dead154bf208ae59773b8a5e4dde89e2f009428 Mar 09 14:25:21 crc kubenswrapper[4722]: I0309 14:25:21.529858 4722 patch_prober.go:28] interesting pod/machine-config-daemon-hjrrb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 14:25:21 crc kubenswrapper[4722]: I0309 14:25:21.530170 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 14:25:21 crc kubenswrapper[4722]: I0309 14:25:21.701495 4722 generic.go:334] "Generic (PLEG): container finished" podID="7c70c5f8-b667-4d94-96f4-7184d2fc36aa" containerID="0818377d7e7c8f400a6f0a5a2ec3ea62dc6867105f8b49f85ba04d8ffe1a915d" exitCode=0 Mar 09 14:25:21 crc kubenswrapper[4722]: I0309 14:25:21.701552 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-28mvd" event={"ID":"7c70c5f8-b667-4d94-96f4-7184d2fc36aa","Type":"ContainerDied","Data":"0818377d7e7c8f400a6f0a5a2ec3ea62dc6867105f8b49f85ba04d8ffe1a915d"} Mar 09 14:25:21 crc kubenswrapper[4722]: I0309 14:25:21.703642 4722 generic.go:334] "Generic (PLEG): container finished" podID="8e2e23b8-e05f-4e51-b435-baba257c7a55" containerID="e343ca6f06216a8b36f8366941533ba1cb1fd88a064c2a90674afaf0e6169fee" exitCode=0 Mar 09 14:25:21 crc kubenswrapper[4722]: I0309 14:25:21.703698 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-d6qd7" event={"ID":"8e2e23b8-e05f-4e51-b435-baba257c7a55","Type":"ContainerDied","Data":"e343ca6f06216a8b36f8366941533ba1cb1fd88a064c2a90674afaf0e6169fee"} Mar 09 14:25:21 crc kubenswrapper[4722]: I0309 14:25:21.703719 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-d6qd7" event={"ID":"8e2e23b8-e05f-4e51-b435-baba257c7a55","Type":"ContainerStarted","Data":"83eb56dea6d08051a67002266ba78f22719462829b47e2a89ae8c199468cc59f"} Mar 09 14:25:21 crc kubenswrapper[4722]: I0309 14:25:21.706696 4722 generic.go:334] "Generic (PLEG): container finished" podID="4e8291dd-3577-4312-9ef5-fddf9a98b9db" containerID="ae82c8c817df06b4b73dcd2172abb119c668da4ef69be56ed5bb917ed7198e43" exitCode=0 Mar 09 14:25:21 crc kubenswrapper[4722]: I0309 14:25:21.706741 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-fbldj" event={"ID":"4e8291dd-3577-4312-9ef5-fddf9a98b9db","Type":"ContainerDied","Data":"ae82c8c817df06b4b73dcd2172abb119c668da4ef69be56ed5bb917ed7198e43"} Mar 09 14:25:21 crc kubenswrapper[4722]: I0309 14:25:21.706781 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-fbldj" event={"ID":"4e8291dd-3577-4312-9ef5-fddf9a98b9db","Type":"ContainerStarted","Data":"ba7705459149977570f0b1f308837213aadf065becc359a800d73c7e9ef3f4a6"} Mar 09 14:25:21 crc kubenswrapper[4722]: I0309 14:25:21.709076 4722 generic.go:334] "Generic (PLEG): container finished" podID="d6cddd9c-6855-4f24-bbe7-b628c2431354" containerID="942a11c95cd02763b3e8064661fec1d9adf3cea22dc10a549c6a28ccb2e72f87" exitCode=0 Mar 09 14:25:21 crc kubenswrapper[4722]: I0309 
14:25:21.709142 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-c159-account-create-update-wrg5t" event={"ID":"d6cddd9c-6855-4f24-bbe7-b628c2431354","Type":"ContainerDied","Data":"942a11c95cd02763b3e8064661fec1d9adf3cea22dc10a549c6a28ccb2e72f87"} Mar 09 14:25:21 crc kubenswrapper[4722]: I0309 14:25:21.709183 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-c159-account-create-update-wrg5t" event={"ID":"d6cddd9c-6855-4f24-bbe7-b628c2431354","Type":"ContainerStarted","Data":"426b40e2532de5ce7612e15991d344f1d1742898a53e22962ab2dae02bf4dd9a"} Mar 09 14:25:21 crc kubenswrapper[4722]: I0309 14:25:21.712477 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-b1e8-account-create-update-6xt7j" event={"ID":"6934dfa5-4f2a-4241-8cc2-20a87e4659e8","Type":"ContainerStarted","Data":"3ffd34a8cc592524621e7a2905633a55ed45699ded92c0dbfb682c1d36cd612c"} Mar 09 14:25:21 crc kubenswrapper[4722]: I0309 14:25:21.712537 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-b1e8-account-create-update-6xt7j" event={"ID":"6934dfa5-4f2a-4241-8cc2-20a87e4659e8","Type":"ContainerStarted","Data":"c3794cc21790d625fb43284e7245826034dc2d3020b3ce0e7d7a59a0442b418c"} Mar 09 14:25:21 crc kubenswrapper[4722]: I0309 14:25:21.724760 4722 generic.go:334] "Generic (PLEG): container finished" podID="69af744d-8cc5-460e-8a51-f867d17f8e49" containerID="e6aa7414333ac95def3ea245d3435a51edd711e5e6e631810dfb12614097070f" exitCode=0 Mar 09 14:25:21 crc kubenswrapper[4722]: I0309 14:25:21.724820 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-2f2fz" event={"ID":"69af744d-8cc5-460e-8a51-f867d17f8e49","Type":"ContainerDied","Data":"e6aa7414333ac95def3ea245d3435a51edd711e5e6e631810dfb12614097070f"} Mar 09 14:25:21 crc kubenswrapper[4722]: I0309 14:25:21.724862 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-2f2fz" event={"ID":"69af744d-8cc5-460e-8a51-f867d17f8e49","Type":"ContainerStarted","Data":"e9319e00c6d6caae6065737bf7eea699631185531f9f8942b45fcde87558fccc"} Mar 09 14:25:21 crc kubenswrapper[4722]: I0309 14:25:21.727835 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ea4e-account-create-update-vgttx" event={"ID":"e291a0c4-8966-4c54-b4c9-fa2ed6c28b1b","Type":"ContainerStarted","Data":"ae0f0a10b39149567e274974d05e02450f5f2fa41fba87a0a9d831ebf28b999e"} Mar 09 14:25:21 crc kubenswrapper[4722]: I0309 14:25:21.727880 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ea4e-account-create-update-vgttx" event={"ID":"e291a0c4-8966-4c54-b4c9-fa2ed6c28b1b","Type":"ContainerStarted","Data":"7f6ed815aebb17944ca8606c4dead154bf208ae59773b8a5e4dde89e2f009428"} Mar 09 14:25:21 crc kubenswrapper[4722]: I0309 14:25:21.730246 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-gw98d" event={"ID":"1f457549-38ab-40d6-97ad-160ce234e8e5","Type":"ContainerStarted","Data":"23d132a7a9a0f496f6f9f80055221524c4d065c89d7d5167de17cae4413fd1a3"} Mar 09 14:25:21 crc kubenswrapper[4722]: I0309 14:25:21.733174 4722 generic.go:334] "Generic (PLEG): container finished" podID="ee1ac8e7-cf85-4fe4-9e02-7be1c6ac8119" containerID="24a467e4c3a886fcf22d058f6004b174454c597a6959d3e90b0c7b0a148e24d1" exitCode=0 Mar 09 14:25:21 crc kubenswrapper[4722]: I0309 14:25:21.733928 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-b229-account-create-update-8qcbk" 
event={"ID":"ee1ac8e7-cf85-4fe4-9e02-7be1c6ac8119","Type":"ContainerDied","Data":"24a467e4c3a886fcf22d058f6004b174454c597a6959d3e90b0c7b0a148e24d1"} Mar 09 14:25:21 crc kubenswrapper[4722]: I0309 14:25:21.812848 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-b1e8-account-create-update-6xt7j" podStartSLOduration=2.812823725 podStartE2EDuration="2.812823725s" podCreationTimestamp="2026-03-09 14:25:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:25:21.797870751 +0000 UTC m=+1362.353439327" watchObservedRunningTime="2026-03-09 14:25:21.812823725 +0000 UTC m=+1362.368392301" Mar 09 14:25:21 crc kubenswrapper[4722]: I0309 14:25:21.855055 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-ea4e-account-create-update-vgttx" podStartSLOduration=2.855038934 podStartE2EDuration="2.855038934s" podCreationTimestamp="2026-03-09 14:25:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:25:21.848257936 +0000 UTC m=+1362.403826522" watchObservedRunningTime="2026-03-09 14:25:21.855038934 +0000 UTC m=+1362.410607510" Mar 09 14:25:22 crc kubenswrapper[4722]: I0309 14:25:22.746070 4722 generic.go:334] "Generic (PLEG): container finished" podID="e291a0c4-8966-4c54-b4c9-fa2ed6c28b1b" containerID="ae0f0a10b39149567e274974d05e02450f5f2fa41fba87a0a9d831ebf28b999e" exitCode=0 Mar 09 14:25:22 crc kubenswrapper[4722]: I0309 14:25:22.746141 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ea4e-account-create-update-vgttx" event={"ID":"e291a0c4-8966-4c54-b4c9-fa2ed6c28b1b","Type":"ContainerDied","Data":"ae0f0a10b39149567e274974d05e02450f5f2fa41fba87a0a9d831ebf28b999e"} Mar 09 14:25:22 crc kubenswrapper[4722]: I0309 14:25:22.748683 4722 generic.go:334] "Generic (PLEG): container finished" podID="6934dfa5-4f2a-4241-8cc2-20a87e4659e8" containerID="3ffd34a8cc592524621e7a2905633a55ed45699ded92c0dbfb682c1d36cd612c" exitCode=0 Mar 09 14:25:22 crc kubenswrapper[4722]: I0309 14:25:22.749033 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-b1e8-account-create-update-6xt7j" event={"ID":"6934dfa5-4f2a-4241-8cc2-20a87e4659e8","Type":"ContainerDied","Data":"3ffd34a8cc592524621e7a2905633a55ed45699ded92c0dbfb682c1d36cd612c"} Mar 09 14:25:24 crc kubenswrapper[4722]: I0309 14:25:24.360397 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-74f6bcbc87-qccbt" Mar 09 14:25:24 crc kubenswrapper[4722]: I0309 14:25:24.418916 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-thqfj"] Mar 09 14:25:24 crc kubenswrapper[4722]: I0309 14:25:24.419137 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-thqfj" podUID="d029d7d8-83b7-441d-a7b3-014e9ea76618" containerName="dnsmasq-dns" containerID="cri-o://96d495a9e82f8249a90710718e21a86830a2a6a682c4099fa1ca5d36da70d331" gracePeriod=10 Mar 09 14:25:24 crc kubenswrapper[4722]: I0309 14:25:24.780509 4722 generic.go:334] "Generic (PLEG): container finished" podID="d029d7d8-83b7-441d-a7b3-014e9ea76618" containerID="96d495a9e82f8249a90710718e21a86830a2a6a682c4099fa1ca5d36da70d331" exitCode=0 Mar 09 14:25:24 crc kubenswrapper[4722]: I0309 14:25:24.780556 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-698758b865-thqfj" event={"ID":"d029d7d8-83b7-441d-a7b3-014e9ea76618","Type":"ContainerDied","Data":"96d495a9e82f8249a90710718e21a86830a2a6a682c4099fa1ca5d36da70d331"} Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.278517 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-28mvd" Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.299344 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-c159-account-create-update-wrg5t" Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.379913 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-d6qd7" Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.385446 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52zk9\" (UniqueName: \"kubernetes.io/projected/d6cddd9c-6855-4f24-bbe7-b628c2431354-kube-api-access-52zk9\") pod \"d6cddd9c-6855-4f24-bbe7-b628c2431354\" (UID: \"d6cddd9c-6855-4f24-bbe7-b628c2431354\") " Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.385495 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6cddd9c-6855-4f24-bbe7-b628c2431354-operator-scripts\") pod \"d6cddd9c-6855-4f24-bbe7-b628c2431354\" (UID: \"d6cddd9c-6855-4f24-bbe7-b628c2431354\") " Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.385664 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c70c5f8-b667-4d94-96f4-7184d2fc36aa-operator-scripts\") pod \"7c70c5f8-b667-4d94-96f4-7184d2fc36aa\" (UID: \"7c70c5f8-b667-4d94-96f4-7184d2fc36aa\") " Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.385721 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zxljj\" (UniqueName: \"kubernetes.io/projected/7c70c5f8-b667-4d94-96f4-7184d2fc36aa-kube-api-access-zxljj\") pod \"7c70c5f8-b667-4d94-96f4-7184d2fc36aa\" (UID: \"7c70c5f8-b667-4d94-96f4-7184d2fc36aa\") " Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.387464 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c70c5f8-b667-4d94-96f4-7184d2fc36aa-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7c70c5f8-b667-4d94-96f4-7184d2fc36aa" (UID: "7c70c5f8-b667-4d94-96f4-7184d2fc36aa"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.387507 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6cddd9c-6855-4f24-bbe7-b628c2431354-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d6cddd9c-6855-4f24-bbe7-b628c2431354" (UID: "d6cddd9c-6855-4f24-bbe7-b628c2431354"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.393854 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-b229-account-create-update-8qcbk" Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.471561 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c70c5f8-b667-4d94-96f4-7184d2fc36aa-kube-api-access-zxljj" (OuterVolumeSpecName: "kube-api-access-zxljj") pod "7c70c5f8-b667-4d94-96f4-7184d2fc36aa" (UID: "7c70c5f8-b667-4d94-96f4-7184d2fc36aa"). InnerVolumeSpecName "kube-api-access-zxljj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.471614 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6cddd9c-6855-4f24-bbe7-b628c2431354-kube-api-access-52zk9" (OuterVolumeSpecName: "kube-api-access-52zk9") pod "d6cddd9c-6855-4f24-bbe7-b628c2431354" (UID: "d6cddd9c-6855-4f24-bbe7-b628c2431354"). InnerVolumeSpecName "kube-api-access-52zk9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.493731 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e2e23b8-e05f-4e51-b435-baba257c7a55-operator-scripts\") pod \"8e2e23b8-e05f-4e51-b435-baba257c7a55\" (UID: \"8e2e23b8-e05f-4e51-b435-baba257c7a55\") " Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.493808 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qbx8k\" (UniqueName: \"kubernetes.io/projected/8e2e23b8-e05f-4e51-b435-baba257c7a55-kube-api-access-qbx8k\") pod \"8e2e23b8-e05f-4e51-b435-baba257c7a55\" (UID: \"8e2e23b8-e05f-4e51-b435-baba257c7a55\") " Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.493956 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee1ac8e7-cf85-4fe4-9e02-7be1c6ac8119-operator-scripts\") pod \"ee1ac8e7-cf85-4fe4-9e02-7be1c6ac8119\" (UID: \"ee1ac8e7-cf85-4fe4-9e02-7be1c6ac8119\") " Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.494044 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wr2l8\" (UniqueName: \"kubernetes.io/projected/ee1ac8e7-cf85-4fe4-9e02-7be1c6ac8119-kube-api-access-wr2l8\") pod \"ee1ac8e7-cf85-4fe4-9e02-7be1c6ac8119\" (UID: \"ee1ac8e7-cf85-4fe4-9e02-7be1c6ac8119\") " Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.494289 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e2e23b8-e05f-4e51-b435-baba257c7a55-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8e2e23b8-e05f-4e51-b435-baba257c7a55" (UID: "8e2e23b8-e05f-4e51-b435-baba257c7a55"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.494412 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee1ac8e7-cf85-4fe4-9e02-7be1c6ac8119-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ee1ac8e7-cf85-4fe4-9e02-7be1c6ac8119" (UID: "ee1ac8e7-cf85-4fe4-9e02-7be1c6ac8119"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.494947 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-52zk9\" (UniqueName: \"kubernetes.io/projected/d6cddd9c-6855-4f24-bbe7-b628c2431354-kube-api-access-52zk9\") on node \"crc\" DevicePath \"\"" Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.494973 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d6cddd9c-6855-4f24-bbe7-b628c2431354-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.494984 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e2e23b8-e05f-4e51-b435-baba257c7a55-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.495003 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c70c5f8-b667-4d94-96f4-7184d2fc36aa-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.495015 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zxljj\" (UniqueName: \"kubernetes.io/projected/7c70c5f8-b667-4d94-96f4-7184d2fc36aa-kube-api-access-zxljj\") on node \"crc\" DevicePath \"\"" Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.495026 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ee1ac8e7-cf85-4fe4-9e02-7be1c6ac8119-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.497444 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e2e23b8-e05f-4e51-b435-baba257c7a55-kube-api-access-qbx8k" (OuterVolumeSpecName: "kube-api-access-qbx8k") pod "8e2e23b8-e05f-4e51-b435-baba257c7a55" (UID: "8e2e23b8-e05f-4e51-b435-baba257c7a55"). InnerVolumeSpecName "kube-api-access-qbx8k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.497504 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee1ac8e7-cf85-4fe4-9e02-7be1c6ac8119-kube-api-access-wr2l8" (OuterVolumeSpecName: "kube-api-access-wr2l8") pod "ee1ac8e7-cf85-4fe4-9e02-7be1c6ac8119" (UID: "ee1ac8e7-cf85-4fe4-9e02-7be1c6ac8119"). InnerVolumeSpecName "kube-api-access-wr2l8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.549024 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-2f2fz" Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.571133 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-fbldj" Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.588238 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-b1e8-account-create-update-6xt7j" Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.597242 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wr2l8\" (UniqueName: \"kubernetes.io/projected/ee1ac8e7-cf85-4fe4-9e02-7be1c6ac8119-kube-api-access-wr2l8\") on node \"crc\" DevicePath \"\"" Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.597281 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qbx8k\" (UniqueName: \"kubernetes.io/projected/8e2e23b8-e05f-4e51-b435-baba257c7a55-kube-api-access-qbx8k\") on node \"crc\" DevicePath \"\"" Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.603279 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-ea4e-account-create-update-vgttx" Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.617251 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-thqfj" Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.698316 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jn8rs\" (UniqueName: \"kubernetes.io/projected/e291a0c4-8966-4c54-b4c9-fa2ed6c28b1b-kube-api-access-jn8rs\") pod \"e291a0c4-8966-4c54-b4c9-fa2ed6c28b1b\" (UID: \"e291a0c4-8966-4c54-b4c9-fa2ed6c28b1b\") " Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.698513 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d029d7d8-83b7-441d-a7b3-014e9ea76618-dns-svc\") pod \"d029d7d8-83b7-441d-a7b3-014e9ea76618\" (UID: \"d029d7d8-83b7-441d-a7b3-014e9ea76618\") " Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.698541 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctwg2\" (UniqueName: \"kubernetes.io/projected/69af744d-8cc5-460e-8a51-f867d17f8e49-kube-api-access-ctwg2\") pod \"69af744d-8cc5-460e-8a51-f867d17f8e49\" (UID: \"69af744d-8cc5-460e-8a51-f867d17f8e49\") " Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.698574 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e8291dd-3577-4312-9ef5-fddf9a98b9db-operator-scripts\") pod \"4e8291dd-3577-4312-9ef5-fddf9a98b9db\" (UID: \"4e8291dd-3577-4312-9ef5-fddf9a98b9db\") " Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.698604 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4mm5\" (UniqueName: \"kubernetes.io/projected/6934dfa5-4f2a-4241-8cc2-20a87e4659e8-kube-api-access-p4mm5\") pod \"6934dfa5-4f2a-4241-8cc2-20a87e4659e8\" (UID: \"6934dfa5-4f2a-4241-8cc2-20a87e4659e8\") " Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.698639 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d029d7d8-83b7-441d-a7b3-014e9ea76618-ovsdbserver-sb\") pod \"d029d7d8-83b7-441d-a7b3-014e9ea76618\" (UID: \"d029d7d8-83b7-441d-a7b3-014e9ea76618\") " Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.698671 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6934dfa5-4f2a-4241-8cc2-20a87e4659e8-operator-scripts\") pod \"6934dfa5-4f2a-4241-8cc2-20a87e4659e8\" (UID: \"6934dfa5-4f2a-4241-8cc2-20a87e4659e8\") " 
Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.698705 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gj2zc\" (UniqueName: \"kubernetes.io/projected/4e8291dd-3577-4312-9ef5-fddf9a98b9db-kube-api-access-gj2zc\") pod \"4e8291dd-3577-4312-9ef5-fddf9a98b9db\" (UID: \"4e8291dd-3577-4312-9ef5-fddf9a98b9db\") "
Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.698722 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d029d7d8-83b7-441d-a7b3-014e9ea76618-config\") pod \"d029d7d8-83b7-441d-a7b3-014e9ea76618\" (UID: \"d029d7d8-83b7-441d-a7b3-014e9ea76618\") "
Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.698743 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d029d7d8-83b7-441d-a7b3-014e9ea76618-ovsdbserver-nb\") pod \"d029d7d8-83b7-441d-a7b3-014e9ea76618\" (UID: \"d029d7d8-83b7-441d-a7b3-014e9ea76618\") "
Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.698785 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69af744d-8cc5-460e-8a51-f867d17f8e49-operator-scripts\") pod \"69af744d-8cc5-460e-8a51-f867d17f8e49\" (UID: \"69af744d-8cc5-460e-8a51-f867d17f8e49\") "
Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.698842 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e291a0c4-8966-4c54-b4c9-fa2ed6c28b1b-operator-scripts\") pod \"e291a0c4-8966-4c54-b4c9-fa2ed6c28b1b\" (UID: \"e291a0c4-8966-4c54-b4c9-fa2ed6c28b1b\") "
Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.698862 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hk7r8\" (UniqueName: \"kubernetes.io/projected/d029d7d8-83b7-441d-a7b3-014e9ea76618-kube-api-access-hk7r8\") pod \"d029d7d8-83b7-441d-a7b3-014e9ea76618\" (UID: \"d029d7d8-83b7-441d-a7b3-014e9ea76618\") "
Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.699129 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e8291dd-3577-4312-9ef5-fddf9a98b9db-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4e8291dd-3577-4312-9ef5-fddf9a98b9db" (UID: "4e8291dd-3577-4312-9ef5-fddf9a98b9db"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.699577 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6934dfa5-4f2a-4241-8cc2-20a87e4659e8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6934dfa5-4f2a-4241-8cc2-20a87e4659e8" (UID: "6934dfa5-4f2a-4241-8cc2-20a87e4659e8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.699602 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69af744d-8cc5-460e-8a51-f867d17f8e49-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "69af744d-8cc5-460e-8a51-f867d17f8e49" (UID: "69af744d-8cc5-460e-8a51-f867d17f8e49"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.699663 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4e8291dd-3577-4312-9ef5-fddf9a98b9db-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.699918 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e291a0c4-8966-4c54-b4c9-fa2ed6c28b1b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e291a0c4-8966-4c54-b4c9-fa2ed6c28b1b" (UID: "e291a0c4-8966-4c54-b4c9-fa2ed6c28b1b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.702749 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e291a0c4-8966-4c54-b4c9-fa2ed6c28b1b-kube-api-access-jn8rs" (OuterVolumeSpecName: "kube-api-access-jn8rs") pod "e291a0c4-8966-4c54-b4c9-fa2ed6c28b1b" (UID: "e291a0c4-8966-4c54-b4c9-fa2ed6c28b1b"). InnerVolumeSpecName "kube-api-access-jn8rs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.703220 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e8291dd-3577-4312-9ef5-fddf9a98b9db-kube-api-access-gj2zc" (OuterVolumeSpecName: "kube-api-access-gj2zc") pod "4e8291dd-3577-4312-9ef5-fddf9a98b9db" (UID: "4e8291dd-3577-4312-9ef5-fddf9a98b9db"). InnerVolumeSpecName "kube-api-access-gj2zc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.703334 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6934dfa5-4f2a-4241-8cc2-20a87e4659e8-kube-api-access-p4mm5" (OuterVolumeSpecName: "kube-api-access-p4mm5") pod "6934dfa5-4f2a-4241-8cc2-20a87e4659e8" (UID: "6934dfa5-4f2a-4241-8cc2-20a87e4659e8"). InnerVolumeSpecName "kube-api-access-p4mm5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.703914 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d029d7d8-83b7-441d-a7b3-014e9ea76618-kube-api-access-hk7r8" (OuterVolumeSpecName: "kube-api-access-hk7r8") pod "d029d7d8-83b7-441d-a7b3-014e9ea76618" (UID: "d029d7d8-83b7-441d-a7b3-014e9ea76618"). InnerVolumeSpecName "kube-api-access-hk7r8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.704454 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69af744d-8cc5-460e-8a51-f867d17f8e49-kube-api-access-ctwg2" (OuterVolumeSpecName: "kube-api-access-ctwg2") pod "69af744d-8cc5-460e-8a51-f867d17f8e49" (UID: "69af744d-8cc5-460e-8a51-f867d17f8e49"). InnerVolumeSpecName "kube-api-access-ctwg2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.748594 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d029d7d8-83b7-441d-a7b3-014e9ea76618-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d029d7d8-83b7-441d-a7b3-014e9ea76618" (UID: "d029d7d8-83b7-441d-a7b3-014e9ea76618"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.750730 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d029d7d8-83b7-441d-a7b3-014e9ea76618-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d029d7d8-83b7-441d-a7b3-014e9ea76618" (UID: "d029d7d8-83b7-441d-a7b3-014e9ea76618"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.753064 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d029d7d8-83b7-441d-a7b3-014e9ea76618-config" (OuterVolumeSpecName: "config") pod "d029d7d8-83b7-441d-a7b3-014e9ea76618" (UID: "d029d7d8-83b7-441d-a7b3-014e9ea76618"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.756605 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d029d7d8-83b7-441d-a7b3-014e9ea76618-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d029d7d8-83b7-441d-a7b3-014e9ea76618" (UID: "d029d7d8-83b7-441d-a7b3-014e9ea76618"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.802632 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d029d7d8-83b7-441d-a7b3-014e9ea76618-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.802665 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ctwg2\" (UniqueName: \"kubernetes.io/projected/69af744d-8cc5-460e-8a51-f867d17f8e49-kube-api-access-ctwg2\") on node \"crc\" DevicePath \"\"" Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.802677 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4mm5\" (UniqueName: \"kubernetes.io/projected/6934dfa5-4f2a-4241-8cc2-20a87e4659e8-kube-api-access-p4mm5\") on node \"crc\" DevicePath \"\"" Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.802685 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d029d7d8-83b7-441d-a7b3-014e9ea76618-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.802694 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6934dfa5-4f2a-4241-8cc2-20a87e4659e8-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.802703 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d029d7d8-83b7-441d-a7b3-014e9ea76618-config\") on node \"crc\" DevicePath \"\"" Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.802712 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gj2zc\" (UniqueName: \"kubernetes.io/projected/4e8291dd-3577-4312-9ef5-fddf9a98b9db-kube-api-access-gj2zc\") on node \"crc\" DevicePath \"\"" Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.802720 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d029d7d8-83b7-441d-a7b3-014e9ea76618-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 09 14:25:26 crc 
Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.802736 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e291a0c4-8966-4c54-b4c9-fa2ed6c28b1b-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.802744 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hk7r8\" (UniqueName: \"kubernetes.io/projected/d029d7d8-83b7-441d-a7b3-014e9ea76618-kube-api-access-hk7r8\") on node \"crc\" DevicePath \"\""
Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.802752 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jn8rs\" (UniqueName: \"kubernetes.io/projected/e291a0c4-8966-4c54-b4c9-fa2ed6c28b1b-kube-api-access-jn8rs\") on node \"crc\" DevicePath \"\""
Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.806180 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-thqfj" event={"ID":"d029d7d8-83b7-441d-a7b3-014e9ea76618","Type":"ContainerDied","Data":"da7eedebb6465c8445304593b4e941d077cea8c3545d9ea5426f85503f9413e5"}
Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.806242 4722 scope.go:117] "RemoveContainer" containerID="96d495a9e82f8249a90710718e21a86830a2a6a682c4099fa1ca5d36da70d331"
Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.806237 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-thqfj"
Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.808048 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-c159-account-create-update-wrg5t" event={"ID":"d6cddd9c-6855-4f24-bbe7-b628c2431354","Type":"ContainerDied","Data":"426b40e2532de5ce7612e15991d344f1d1742898a53e22962ab2dae02bf4dd9a"}
Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.808095 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="426b40e2532de5ce7612e15991d344f1d1742898a53e22962ab2dae02bf4dd9a"
Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.808165 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-c159-account-create-update-wrg5t"
Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.819840 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-gw98d" event={"ID":"1f457549-38ab-40d6-97ad-160ce234e8e5","Type":"ContainerStarted","Data":"df3dbbc38c06144873dfadfa1c59d99738ef935ead1458109e138b25317b0454"}
Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.832686 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-b229-account-create-update-8qcbk"
Need to start a new one" pod="openstack/heat-b229-account-create-update-8qcbk" Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.832711 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-b229-account-create-update-8qcbk" event={"ID":"ee1ac8e7-cf85-4fe4-9e02-7be1c6ac8119","Type":"ContainerDied","Data":"1ef10ab0e5ce60e5b0f7a11819a85a66d4a6d04218225bb94af9e274afcf0af3"} Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.833895 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ef10ab0e5ce60e5b0f7a11819a85a66d4a6d04218225bb94af9e274afcf0af3" Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.839069 4722 scope.go:117] "RemoveContainer" containerID="938cb38ce445ab609ed1a74b409c8033ceeaf1593440b7134131222510a86963" Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.844189 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-b1e8-account-create-update-6xt7j" Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.844624 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-b1e8-account-create-update-6xt7j" event={"ID":"6934dfa5-4f2a-4241-8cc2-20a87e4659e8","Type":"ContainerDied","Data":"c3794cc21790d625fb43284e7245826034dc2d3020b3ce0e7d7a59a0442b418c"} Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.844655 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3794cc21790d625fb43284e7245826034dc2d3020b3ce0e7d7a59a0442b418c" Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.848783 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-gw98d" podStartSLOduration=2.449856436 podStartE2EDuration="7.848753915s" podCreationTimestamp="2026-03-09 14:25:19 +0000 UTC" firstStartedPulling="2026-03-09 14:25:20.782096318 +0000 UTC m=+1361.337664894" lastFinishedPulling="2026-03-09 14:25:26.180993797 +0000 UTC m=+1366.736562373" observedRunningTime="2026-03-09 14:25:26.839457567 +0000 UTC m=+1367.395026153" watchObservedRunningTime="2026-03-09 14:25:26.848753915 +0000 UTC m=+1367.404322501" Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.862266 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-2f2fz" event={"ID":"69af744d-8cc5-460e-8a51-f867d17f8e49","Type":"ContainerDied","Data":"e9319e00c6d6caae6065737bf7eea699631185531f9f8942b45fcde87558fccc"} Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.862300 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9319e00c6d6caae6065737bf7eea699631185531f9f8942b45fcde87558fccc" Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.862349 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-2f2fz" Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.875298 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-fbldj" event={"ID":"4e8291dd-3577-4312-9ef5-fddf9a98b9db","Type":"ContainerDied","Data":"ba7705459149977570f0b1f308837213aadf065becc359a800d73c7e9ef3f4a6"} Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.875332 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba7705459149977570f0b1f308837213aadf065becc359a800d73c7e9ef3f4a6" Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.875383 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-fbldj" Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.877513 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-ea4e-account-create-update-vgttx" Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.877516 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ea4e-account-create-update-vgttx" event={"ID":"e291a0c4-8966-4c54-b4c9-fa2ed6c28b1b","Type":"ContainerDied","Data":"7f6ed815aebb17944ca8606c4dead154bf208ae59773b8a5e4dde89e2f009428"} Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.878429 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f6ed815aebb17944ca8606c4dead154bf208ae59773b8a5e4dde89e2f009428" Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.881025 4722 generic.go:334] "Generic (PLEG): container finished" podID="0e07fbab-4a47-4e59-aa72-f0a4521296af" containerID="20c16653ad371dd67281816bdd470b5061bf7b0b72384a8e4ed5df82525a12d7" exitCode=0 Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.881074 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"0e07fbab-4a47-4e59-aa72-f0a4521296af","Type":"ContainerDied","Data":"20c16653ad371dd67281816bdd470b5061bf7b0b72384a8e4ed5df82525a12d7"} Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.888270 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-28mvd" event={"ID":"7c70c5f8-b667-4d94-96f4-7184d2fc36aa","Type":"ContainerDied","Data":"39780260fa846c5d42a18c34e884d1f3474d275baccb014230e060d0606cdaf1"} Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.888311 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="39780260fa846c5d42a18c34e884d1f3474d275baccb014230e060d0606cdaf1" Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.888360 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-28mvd" Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.897451 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-thqfj"] Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.901751 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-d6qd7" event={"ID":"8e2e23b8-e05f-4e51-b435-baba257c7a55","Type":"ContainerDied","Data":"83eb56dea6d08051a67002266ba78f22719462829b47e2a89ae8c199468cc59f"} Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.901789 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83eb56dea6d08051a67002266ba78f22719462829b47e2a89ae8c199468cc59f" Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.901845 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-d6qd7" Mar 09 14:25:26 crc kubenswrapper[4722]: I0309 14:25:26.917038 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-thqfj"] Mar 09 14:25:27 crc kubenswrapper[4722]: I0309 14:25:27.914500 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"0e07fbab-4a47-4e59-aa72-f0a4521296af","Type":"ContainerStarted","Data":"38499bf9c5372707d5d95e2047e0945f3da095a98e2e6d148414c77401c94dad"} Mar 09 14:25:28 crc kubenswrapper[4722]: I0309 14:25:28.160549 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d029d7d8-83b7-441d-a7b3-014e9ea76618" path="/var/lib/kubelet/pods/d029d7d8-83b7-441d-a7b3-014e9ea76618/volumes" Mar 09 14:25:29 crc kubenswrapper[4722]: I0309 14:25:29.933647 4722 generic.go:334] "Generic (PLEG): container finished" podID="1f457549-38ab-40d6-97ad-160ce234e8e5" containerID="df3dbbc38c06144873dfadfa1c59d99738ef935ead1458109e138b25317b0454" exitCode=0 Mar 09 14:25:29 crc kubenswrapper[4722]: I0309 14:25:29.933705 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-gw98d" event={"ID":"1f457549-38ab-40d6-97ad-160ce234e8e5","Type":"ContainerDied","Data":"df3dbbc38c06144873dfadfa1c59d99738ef935ead1458109e138b25317b0454"} Mar 09 14:25:30 crc kubenswrapper[4722]: I0309 14:25:30.944697 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"0e07fbab-4a47-4e59-aa72-f0a4521296af","Type":"ContainerStarted","Data":"ad5404ea05f55bc342ca7c682f087860fb585c7b56d12b68c3c4a533bc8772e9"} Mar 09 14:25:30 crc kubenswrapper[4722]: I0309 14:25:30.945024 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"0e07fbab-4a47-4e59-aa72-f0a4521296af","Type":"ContainerStarted","Data":"a1615f55c9a86d29c9f41ff2fcdf482c31d49d4633038fa0cc2d1dfd9139b3b2"} Mar 09 14:25:30 crc kubenswrapper[4722]: I0309 14:25:30.991291 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=15.991260958 podStartE2EDuration="15.991260958s" podCreationTimestamp="2026-03-09 14:25:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:25:30.971735857 +0000 UTC m=+1371.527304443" watchObservedRunningTime="2026-03-09 14:25:30.991260958 +0000 UTC m=+1371.546829544" Mar 09 14:25:31 crc kubenswrapper[4722]: I0309 14:25:31.426506 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-gw98d" Mar 09 14:25:31 crc kubenswrapper[4722]: I0309 14:25:31.495966 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Mar 09 14:25:31 crc kubenswrapper[4722]: I0309 14:25:31.496510 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Mar 09 14:25:31 crc kubenswrapper[4722]: I0309 14:25:31.506896 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7pbfw\" (UniqueName: \"kubernetes.io/projected/1f457549-38ab-40d6-97ad-160ce234e8e5-kube-api-access-7pbfw\") pod \"1f457549-38ab-40d6-97ad-160ce234e8e5\" (UID: \"1f457549-38ab-40d6-97ad-160ce234e8e5\") " Mar 09 14:25:31 crc kubenswrapper[4722]: I0309 14:25:31.507016 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f457549-38ab-40d6-97ad-160ce234e8e5-combined-ca-bundle\") pod \"1f457549-38ab-40d6-97ad-160ce234e8e5\" (UID: \"1f457549-38ab-40d6-97ad-160ce234e8e5\") " Mar 09 14:25:31 crc kubenswrapper[4722]: I0309 14:25:31.507103 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f457549-38ab-40d6-97ad-160ce234e8e5-config-data\") pod \"1f457549-38ab-40d6-97ad-160ce234e8e5\" (UID: \"1f457549-38ab-40d6-97ad-160ce234e8e5\") " Mar 09 14:25:31 crc kubenswrapper[4722]: I0309 14:25:31.514312 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Mar 09 14:25:31 crc kubenswrapper[4722]: I0309 14:25:31.518744 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f457549-38ab-40d6-97ad-160ce234e8e5-kube-api-access-7pbfw" (OuterVolumeSpecName: "kube-api-access-7pbfw") pod "1f457549-38ab-40d6-97ad-160ce234e8e5" (UID: "1f457549-38ab-40d6-97ad-160ce234e8e5"). InnerVolumeSpecName "kube-api-access-7pbfw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:25:31 crc kubenswrapper[4722]: I0309 14:25:31.534178 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f457549-38ab-40d6-97ad-160ce234e8e5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1f457549-38ab-40d6-97ad-160ce234e8e5" (UID: "1f457549-38ab-40d6-97ad-160ce234e8e5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:25:31 crc kubenswrapper[4722]: I0309 14:25:31.571363 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f457549-38ab-40d6-97ad-160ce234e8e5-config-data" (OuterVolumeSpecName: "config-data") pod "1f457549-38ab-40d6-97ad-160ce234e8e5" (UID: "1f457549-38ab-40d6-97ad-160ce234e8e5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:25:31 crc kubenswrapper[4722]: I0309 14:25:31.609799 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7pbfw\" (UniqueName: \"kubernetes.io/projected/1f457549-38ab-40d6-97ad-160ce234e8e5-kube-api-access-7pbfw\") on node \"crc\" DevicePath \"\"" Mar 09 14:25:31 crc kubenswrapper[4722]: I0309 14:25:31.609832 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f457549-38ab-40d6-97ad-160ce234e8e5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:25:31 crc kubenswrapper[4722]: I0309 14:25:31.609842 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f457549-38ab-40d6-97ad-160ce234e8e5-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 14:25:31 crc kubenswrapper[4722]: I0309 14:25:31.954854 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-gw98d" event={"ID":"1f457549-38ab-40d6-97ad-160ce234e8e5","Type":"ContainerDied","Data":"23d132a7a9a0f496f6f9f80055221524c4d065c89d7d5167de17cae4413fd1a3"} Mar 09 14:25:31 crc kubenswrapper[4722]: I0309 14:25:31.955979 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23d132a7a9a0f496f6f9f80055221524c4d065c89d7d5167de17cae4413fd1a3" Mar 09 14:25:31 crc kubenswrapper[4722]: I0309 14:25:31.954884 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-gw98d" Mar 09 14:25:31 crc kubenswrapper[4722]: I0309 14:25:31.970118 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.198116 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-xh4vc"] Mar 09 14:25:32 crc kubenswrapper[4722]: E0309 14:25:32.199589 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f457549-38ab-40d6-97ad-160ce234e8e5" containerName="keystone-db-sync" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.199616 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f457549-38ab-40d6-97ad-160ce234e8e5" containerName="keystone-db-sync" Mar 09 14:25:32 crc kubenswrapper[4722]: E0309 14:25:32.199641 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6934dfa5-4f2a-4241-8cc2-20a87e4659e8" containerName="mariadb-account-create-update" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.199647 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="6934dfa5-4f2a-4241-8cc2-20a87e4659e8" containerName="mariadb-account-create-update" Mar 09 14:25:32 crc kubenswrapper[4722]: E0309 14:25:32.199660 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d029d7d8-83b7-441d-a7b3-014e9ea76618" containerName="dnsmasq-dns" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.199665 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="d029d7d8-83b7-441d-a7b3-014e9ea76618" containerName="dnsmasq-dns" Mar 09 14:25:32 crc kubenswrapper[4722]: E0309 14:25:32.199683 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e291a0c4-8966-4c54-b4c9-fa2ed6c28b1b" containerName="mariadb-account-create-update" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.199689 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="e291a0c4-8966-4c54-b4c9-fa2ed6c28b1b" containerName="mariadb-account-create-update" 
Mar 09 14:25:32 crc kubenswrapper[4722]: E0309 14:25:32.199698 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee1ac8e7-cf85-4fe4-9e02-7be1c6ac8119" containerName="mariadb-account-create-update"
Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.199704 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee1ac8e7-cf85-4fe4-9e02-7be1c6ac8119" containerName="mariadb-account-create-update"
Mar 09 14:25:32 crc kubenswrapper[4722]: E0309 14:25:32.199723 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c70c5f8-b667-4d94-96f4-7184d2fc36aa" containerName="mariadb-database-create"
Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.199729 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c70c5f8-b667-4d94-96f4-7184d2fc36aa" containerName="mariadb-database-create"
Mar 09 14:25:32 crc kubenswrapper[4722]: E0309 14:25:32.199741 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d029d7d8-83b7-441d-a7b3-014e9ea76618" containerName="init"
Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.199746 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="d029d7d8-83b7-441d-a7b3-014e9ea76618" containerName="init"
Mar 09 14:25:32 crc kubenswrapper[4722]: E0309 14:25:32.199755 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e2e23b8-e05f-4e51-b435-baba257c7a55" containerName="mariadb-database-create"
Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.199761 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e2e23b8-e05f-4e51-b435-baba257c7a55" containerName="mariadb-database-create"
Mar 09 14:25:32 crc kubenswrapper[4722]: E0309 14:25:32.199777 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69af744d-8cc5-460e-8a51-f867d17f8e49" containerName="mariadb-database-create"
Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.199783 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="69af744d-8cc5-460e-8a51-f867d17f8e49" containerName="mariadb-database-create"
Mar 09 14:25:32 crc kubenswrapper[4722]: E0309 14:25:32.199795 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e8291dd-3577-4312-9ef5-fddf9a98b9db" containerName="mariadb-database-create"
Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.199801 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e8291dd-3577-4312-9ef5-fddf9a98b9db" containerName="mariadb-database-create"
Mar 09 14:25:32 crc kubenswrapper[4722]: E0309 14:25:32.199815 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6cddd9c-6855-4f24-bbe7-b628c2431354" containerName="mariadb-account-create-update"
Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.199821 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6cddd9c-6855-4f24-bbe7-b628c2431354" containerName="mariadb-account-create-update"
Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.199993 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="d029d7d8-83b7-441d-a7b3-014e9ea76618" containerName="dnsmasq-dns"
Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.200007 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6cddd9c-6855-4f24-bbe7-b628c2431354" containerName="mariadb-account-create-update"
Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.200020 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee1ac8e7-cf85-4fe4-9e02-7be1c6ac8119" containerName="mariadb-account-create-update"
Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.200027 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e2e23b8-e05f-4e51-b435-baba257c7a55" containerName="mariadb-database-create"
Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.200037 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="6934dfa5-4f2a-4241-8cc2-20a87e4659e8" containerName="mariadb-account-create-update"
Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.200043 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f457549-38ab-40d6-97ad-160ce234e8e5" containerName="keystone-db-sync"
Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.200055 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="69af744d-8cc5-460e-8a51-f867d17f8e49" containerName="mariadb-database-create"
Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.200064 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="e291a0c4-8966-4c54-b4c9-fa2ed6c28b1b" containerName="mariadb-account-create-update"
Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.200071 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c70c5f8-b667-4d94-96f4-7184d2fc36aa" containerName="mariadb-database-create"
Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.200083 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e8291dd-3577-4312-9ef5-fddf9a98b9db" containerName="mariadb-database-create"
Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.201414 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-xh4vc"
Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.219702 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-xh4vc"]
Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.271427 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-6ssn4"]
Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.277464 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-6ssn4"
Need to start a new one" pod="openstack/keystone-bootstrap-6ssn4" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.281132 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.281416 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.281529 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.293344 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-6ssn4"] Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.293725 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-jfd9g" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.306740 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.323831 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25586930-1c5f-4223-bbda-9c045110613e-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-xh4vc\" (UID: \"25586930-1c5f-4223-bbda-9c045110613e\") " pod="openstack/dnsmasq-dns-847c4cc679-xh4vc" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.323930 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lvfq\" (UniqueName: \"kubernetes.io/projected/f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b-kube-api-access-6lvfq\") pod \"keystone-bootstrap-6ssn4\" (UID: \"f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b\") " pod="openstack/keystone-bootstrap-6ssn4" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.323976 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b-fernet-keys\") pod \"keystone-bootstrap-6ssn4\" (UID: \"f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b\") " pod="openstack/keystone-bootstrap-6ssn4" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.323997 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b-scripts\") pod \"keystone-bootstrap-6ssn4\" (UID: \"f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b\") " pod="openstack/keystone-bootstrap-6ssn4" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.324029 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b-config-data\") pod \"keystone-bootstrap-6ssn4\" (UID: \"f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b\") " pod="openstack/keystone-bootstrap-6ssn4" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.324058 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b-combined-ca-bundle\") pod \"keystone-bootstrap-6ssn4\" (UID: \"f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b\") " pod="openstack/keystone-bootstrap-6ssn4" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.324075 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/25586930-1c5f-4223-bbda-9c045110613e-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-xh4vc\" (UID: \"25586930-1c5f-4223-bbda-9c045110613e\") " pod="openstack/dnsmasq-dns-847c4cc679-xh4vc" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.324109 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25586930-1c5f-4223-bbda-9c045110613e-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-xh4vc\" (UID: \"25586930-1c5f-4223-bbda-9c045110613e\") " pod="openstack/dnsmasq-dns-847c4cc679-xh4vc" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.324137 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zltk4\" (UniqueName: \"kubernetes.io/projected/25586930-1c5f-4223-bbda-9c045110613e-kube-api-access-zltk4\") pod \"dnsmasq-dns-847c4cc679-xh4vc\" (UID: \"25586930-1c5f-4223-bbda-9c045110613e\") " pod="openstack/dnsmasq-dns-847c4cc679-xh4vc" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.324163 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25586930-1c5f-4223-bbda-9c045110613e-config\") pod \"dnsmasq-dns-847c4cc679-xh4vc\" (UID: \"25586930-1c5f-4223-bbda-9c045110613e\") " pod="openstack/dnsmasq-dns-847c4cc679-xh4vc" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.324192 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b-credential-keys\") pod \"keystone-bootstrap-6ssn4\" (UID: \"f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b\") " pod="openstack/keystone-bootstrap-6ssn4" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.324343 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25586930-1c5f-4223-bbda-9c045110613e-dns-svc\") pod \"dnsmasq-dns-847c4cc679-xh4vc\" (UID: \"25586930-1c5f-4223-bbda-9c045110613e\") " pod="openstack/dnsmasq-dns-847c4cc679-xh4vc" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.331170 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-j4lgt"] Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.333055 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-j4lgt" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.344527 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-bb86g" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.344765 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.387011 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-j4lgt"] Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.432388 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b-config-data\") pod \"keystone-bootstrap-6ssn4\" (UID: \"f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b\") " pod="openstack/keystone-bootstrap-6ssn4" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.432458 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b-combined-ca-bundle\") pod \"keystone-bootstrap-6ssn4\" (UID: \"f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b\") " pod="openstack/keystone-bootstrap-6ssn4" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.432488 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/25586930-1c5f-4223-bbda-9c045110613e-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-xh4vc\" (UID: \"25586930-1c5f-4223-bbda-9c045110613e\") " pod="openstack/dnsmasq-dns-847c4cc679-xh4vc" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.432527 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25586930-1c5f-4223-bbda-9c045110613e-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-xh4vc\" (UID: \"25586930-1c5f-4223-bbda-9c045110613e\") " pod="openstack/dnsmasq-dns-847c4cc679-xh4vc" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.432548 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tw8df\" (UniqueName: \"kubernetes.io/projected/7f51218c-6b15-4f4a-ad49-1ba0ccd5e292-kube-api-access-tw8df\") pod \"heat-db-sync-j4lgt\" (UID: \"7f51218c-6b15-4f4a-ad49-1ba0ccd5e292\") " pod="openstack/heat-db-sync-j4lgt" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.432575 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zltk4\" (UniqueName: \"kubernetes.io/projected/25586930-1c5f-4223-bbda-9c045110613e-kube-api-access-zltk4\") pod \"dnsmasq-dns-847c4cc679-xh4vc\" (UID: \"25586930-1c5f-4223-bbda-9c045110613e\") " pod="openstack/dnsmasq-dns-847c4cc679-xh4vc" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.432601 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25586930-1c5f-4223-bbda-9c045110613e-config\") pod \"dnsmasq-dns-847c4cc679-xh4vc\" (UID: \"25586930-1c5f-4223-bbda-9c045110613e\") " pod="openstack/dnsmasq-dns-847c4cc679-xh4vc" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.432620 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f51218c-6b15-4f4a-ad49-1ba0ccd5e292-combined-ca-bundle\") pod 
\"heat-db-sync-j4lgt\" (UID: \"7f51218c-6b15-4f4a-ad49-1ba0ccd5e292\") " pod="openstack/heat-db-sync-j4lgt" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.432640 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b-credential-keys\") pod \"keystone-bootstrap-6ssn4\" (UID: \"f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b\") " pod="openstack/keystone-bootstrap-6ssn4" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.432660 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f51218c-6b15-4f4a-ad49-1ba0ccd5e292-config-data\") pod \"heat-db-sync-j4lgt\" (UID: \"7f51218c-6b15-4f4a-ad49-1ba0ccd5e292\") " pod="openstack/heat-db-sync-j4lgt" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.432684 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25586930-1c5f-4223-bbda-9c045110613e-dns-svc\") pod \"dnsmasq-dns-847c4cc679-xh4vc\" (UID: \"25586930-1c5f-4223-bbda-9c045110613e\") " pod="openstack/dnsmasq-dns-847c4cc679-xh4vc" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.432727 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25586930-1c5f-4223-bbda-9c045110613e-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-xh4vc\" (UID: \"25586930-1c5f-4223-bbda-9c045110613e\") " pod="openstack/dnsmasq-dns-847c4cc679-xh4vc" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.432798 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lvfq\" (UniqueName: \"kubernetes.io/projected/f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b-kube-api-access-6lvfq\") pod \"keystone-bootstrap-6ssn4\" (UID: \"f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b\") " pod="openstack/keystone-bootstrap-6ssn4" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.432837 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b-fernet-keys\") pod \"keystone-bootstrap-6ssn4\" (UID: \"f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b\") " pod="openstack/keystone-bootstrap-6ssn4" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.432854 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b-scripts\") pod \"keystone-bootstrap-6ssn4\" (UID: \"f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b\") " pod="openstack/keystone-bootstrap-6ssn4" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.433935 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25586930-1c5f-4223-bbda-9c045110613e-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-xh4vc\" (UID: \"25586930-1c5f-4223-bbda-9c045110613e\") " pod="openstack/dnsmasq-dns-847c4cc679-xh4vc" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.434063 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25586930-1c5f-4223-bbda-9c045110613e-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-xh4vc\" (UID: \"25586930-1c5f-4223-bbda-9c045110613e\") " pod="openstack/dnsmasq-dns-847c4cc679-xh4vc" Mar 09 14:25:32 crc 
Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.435002 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25586930-1c5f-4223-bbda-9c045110613e-config\") pod \"dnsmasq-dns-847c4cc679-xh4vc\" (UID: \"25586930-1c5f-4223-bbda-9c045110613e\") " pod="openstack/dnsmasq-dns-847c4cc679-xh4vc"
Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.435389 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/25586930-1c5f-4223-bbda-9c045110613e-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-xh4vc\" (UID: \"25586930-1c5f-4223-bbda-9c045110613e\") " pod="openstack/dnsmasq-dns-847c4cc679-xh4vc"
Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.438409 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b-fernet-keys\") pod \"keystone-bootstrap-6ssn4\" (UID: \"f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b\") " pod="openstack/keystone-bootstrap-6ssn4"
Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.441938 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b-combined-ca-bundle\") pod \"keystone-bootstrap-6ssn4\" (UID: \"f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b\") " pod="openstack/keystone-bootstrap-6ssn4"
Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.446656 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b-credential-keys\") pod \"keystone-bootstrap-6ssn4\" (UID: \"f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b\") " pod="openstack/keystone-bootstrap-6ssn4"
Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.448616 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b-scripts\") pod \"keystone-bootstrap-6ssn4\" (UID: \"f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b\") " pod="openstack/keystone-bootstrap-6ssn4"
Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.465853 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b-config-data\") pod \"keystone-bootstrap-6ssn4\" (UID: \"f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b\") " pod="openstack/keystone-bootstrap-6ssn4"
Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.466616 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lvfq\" (UniqueName: \"kubernetes.io/projected/f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b-kube-api-access-6lvfq\") pod \"keystone-bootstrap-6ssn4\" (UID: \"f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b\") " pod="openstack/keystone-bootstrap-6ssn4"
Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.472751 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zltk4\" (UniqueName: \"kubernetes.io/projected/25586930-1c5f-4223-bbda-9c045110613e-kube-api-access-zltk4\") pod \"dnsmasq-dns-847c4cc679-xh4vc\" (UID: \"25586930-1c5f-4223-bbda-9c045110613e\") " pod="openstack/dnsmasq-dns-847c4cc679-xh4vc"
\"kubernetes.io/projected/25586930-1c5f-4223-bbda-9c045110613e-kube-api-access-zltk4\") pod \"dnsmasq-dns-847c4cc679-xh4vc\" (UID: \"25586930-1c5f-4223-bbda-9c045110613e\") " pod="openstack/dnsmasq-dns-847c4cc679-xh4vc" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.483229 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-l9wtc"] Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.484779 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-l9wtc" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.494454 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.494715 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-d8dpk" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.494836 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.534158 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qqtt\" (UniqueName: \"kubernetes.io/projected/69530d68-be96-4605-be46-8053083aa178-kube-api-access-4qqtt\") pod \"neutron-db-sync-l9wtc\" (UID: \"69530d68-be96-4605-be46-8053083aa178\") " pod="openstack/neutron-db-sync-l9wtc" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.534259 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tw8df\" (UniqueName: \"kubernetes.io/projected/7f51218c-6b15-4f4a-ad49-1ba0ccd5e292-kube-api-access-tw8df\") pod \"heat-db-sync-j4lgt\" (UID: \"7f51218c-6b15-4f4a-ad49-1ba0ccd5e292\") " pod="openstack/heat-db-sync-j4lgt" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.534293 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f51218c-6b15-4f4a-ad49-1ba0ccd5e292-combined-ca-bundle\") pod \"heat-db-sync-j4lgt\" (UID: \"7f51218c-6b15-4f4a-ad49-1ba0ccd5e292\") " pod="openstack/heat-db-sync-j4lgt" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.534318 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f51218c-6b15-4f4a-ad49-1ba0ccd5e292-config-data\") pod \"heat-db-sync-j4lgt\" (UID: \"7f51218c-6b15-4f4a-ad49-1ba0ccd5e292\") " pod="openstack/heat-db-sync-j4lgt" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.534368 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/69530d68-be96-4605-be46-8053083aa178-config\") pod \"neutron-db-sync-l9wtc\" (UID: \"69530d68-be96-4605-be46-8053083aa178\") " pod="openstack/neutron-db-sync-l9wtc" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.534396 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69530d68-be96-4605-be46-8053083aa178-combined-ca-bundle\") pod \"neutron-db-sync-l9wtc\" (UID: \"69530d68-be96-4605-be46-8053083aa178\") " pod="openstack/neutron-db-sync-l9wtc" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.538230 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-xh4vc" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.539751 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f51218c-6b15-4f4a-ad49-1ba0ccd5e292-combined-ca-bundle\") pod \"heat-db-sync-j4lgt\" (UID: \"7f51218c-6b15-4f4a-ad49-1ba0ccd5e292\") " pod="openstack/heat-db-sync-j4lgt" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.553020 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f51218c-6b15-4f4a-ad49-1ba0ccd5e292-config-data\") pod \"heat-db-sync-j4lgt\" (UID: \"7f51218c-6b15-4f4a-ad49-1ba0ccd5e292\") " pod="openstack/heat-db-sync-j4lgt" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.559763 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tw8df\" (UniqueName: \"kubernetes.io/projected/7f51218c-6b15-4f4a-ad49-1ba0ccd5e292-kube-api-access-tw8df\") pod \"heat-db-sync-j4lgt\" (UID: \"7f51218c-6b15-4f4a-ad49-1ba0ccd5e292\") " pod="openstack/heat-db-sync-j4lgt" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.581235 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-l9wtc"] Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.601077 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-6ssn4" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.640639 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/69530d68-be96-4605-be46-8053083aa178-config\") pod \"neutron-db-sync-l9wtc\" (UID: \"69530d68-be96-4605-be46-8053083aa178\") " pod="openstack/neutron-db-sync-l9wtc" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.640966 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69530d68-be96-4605-be46-8053083aa178-combined-ca-bundle\") pod \"neutron-db-sync-l9wtc\" (UID: \"69530d68-be96-4605-be46-8053083aa178\") " pod="openstack/neutron-db-sync-l9wtc" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.641185 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qqtt\" (UniqueName: \"kubernetes.io/projected/69530d68-be96-4605-be46-8053083aa178-kube-api-access-4qqtt\") pod \"neutron-db-sync-l9wtc\" (UID: \"69530d68-be96-4605-be46-8053083aa178\") " pod="openstack/neutron-db-sync-l9wtc" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.645901 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-tlbnm"] Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.646773 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69530d68-be96-4605-be46-8053083aa178-combined-ca-bundle\") pod \"neutron-db-sync-l9wtc\" (UID: \"69530d68-be96-4605-be46-8053083aa178\") " pod="openstack/neutron-db-sync-l9wtc" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.647775 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-tlbnm" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.668721 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-j4lgt" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.670354 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-r5rcp" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.670638 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.671642 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/69530d68-be96-4605-be46-8053083aa178-config\") pod \"neutron-db-sync-l9wtc\" (UID: \"69530d68-be96-4605-be46-8053083aa178\") " pod="openstack/neutron-db-sync-l9wtc" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.699938 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-fzj6s"] Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.700844 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qqtt\" (UniqueName: \"kubernetes.io/projected/69530d68-be96-4605-be46-8053083aa178-kube-api-access-4qqtt\") pod \"neutron-db-sync-l9wtc\" (UID: \"69530d68-be96-4605-be46-8053083aa178\") " pod="openstack/neutron-db-sync-l9wtc" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.702001 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-fzj6s" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.715839 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-wdrpg" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.716044 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.720068 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.750888 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-tlbnm"] Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.821186 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-l9wtc" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.852449 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b63a3af5-a347-40ed-b9bc-52ad70e7ff13-combined-ca-bundle\") pod \"barbican-db-sync-tlbnm\" (UID: \"b63a3af5-a347-40ed-b9bc-52ad70e7ff13\") " pod="openstack/barbican-db-sync-tlbnm" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.852666 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbn6m\" (UniqueName: \"kubernetes.io/projected/b63a3af5-a347-40ed-b9bc-52ad70e7ff13-kube-api-access-gbn6m\") pod \"barbican-db-sync-tlbnm\" (UID: \"b63a3af5-a347-40ed-b9bc-52ad70e7ff13\") " pod="openstack/barbican-db-sync-tlbnm" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.852823 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5035bd54-0aaa-4ff3-b90a-6145145fe95c-scripts\") pod \"cinder-db-sync-fzj6s\" (UID: \"5035bd54-0aaa-4ff3-b90a-6145145fe95c\") " pod="openstack/cinder-db-sync-fzj6s" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.852935 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5035bd54-0aaa-4ff3-b90a-6145145fe95c-config-data\") pod \"cinder-db-sync-fzj6s\" (UID: \"5035bd54-0aaa-4ff3-b90a-6145145fe95c\") " pod="openstack/cinder-db-sync-fzj6s" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.852970 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5035bd54-0aaa-4ff3-b90a-6145145fe95c-etc-machine-id\") pod \"cinder-db-sync-fzj6s\" (UID: \"5035bd54-0aaa-4ff3-b90a-6145145fe95c\") " pod="openstack/cinder-db-sync-fzj6s" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.853008 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b63a3af5-a347-40ed-b9bc-52ad70e7ff13-db-sync-config-data\") pod \"barbican-db-sync-tlbnm\" (UID: \"b63a3af5-a347-40ed-b9bc-52ad70e7ff13\") " pod="openstack/barbican-db-sync-tlbnm" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.853037 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5035bd54-0aaa-4ff3-b90a-6145145fe95c-combined-ca-bundle\") pod \"cinder-db-sync-fzj6s\" (UID: \"5035bd54-0aaa-4ff3-b90a-6145145fe95c\") " pod="openstack/cinder-db-sync-fzj6s" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.866634 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-9llgc"] Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.885997 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-9llgc" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.896172 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-xh4vc"] Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.899776 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5035bd54-0aaa-4ff3-b90a-6145145fe95c-db-sync-config-data\") pod \"cinder-db-sync-fzj6s\" (UID: \"5035bd54-0aaa-4ff3-b90a-6145145fe95c\") " pod="openstack/cinder-db-sync-fzj6s" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.899827 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rw8pm\" (UniqueName: \"kubernetes.io/projected/5035bd54-0aaa-4ff3-b90a-6145145fe95c-kube-api-access-rw8pm\") pod \"cinder-db-sync-fzj6s\" (UID: \"5035bd54-0aaa-4ff3-b90a-6145145fe95c\") " pod="openstack/cinder-db-sync-fzj6s" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.904050 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.904527 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-98w4p" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.904744 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.931822 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-9llgc"] Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.953275 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-fzj6s"] Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.973069 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-gv4lz"] Mar 09 14:25:32 crc kubenswrapper[4722]: I0309 14:25:32.975280 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-gv4lz" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:32.999377 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-gv4lz"] Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.010982 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t69c7\" (UniqueName: \"kubernetes.io/projected/d1e56f45-1812-4ee1-aacc-0b012cf07111-kube-api-access-t69c7\") pod \"placement-db-sync-9llgc\" (UID: \"d1e56f45-1812-4ee1-aacc-0b012cf07111\") " pod="openstack/placement-db-sync-9llgc" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.011043 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbn6m\" (UniqueName: \"kubernetes.io/projected/b63a3af5-a347-40ed-b9bc-52ad70e7ff13-kube-api-access-gbn6m\") pod \"barbican-db-sync-tlbnm\" (UID: \"b63a3af5-a347-40ed-b9bc-52ad70e7ff13\") " pod="openstack/barbican-db-sync-tlbnm" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.011083 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1e56f45-1812-4ee1-aacc-0b012cf07111-combined-ca-bundle\") pod \"placement-db-sync-9llgc\" (UID: \"d1e56f45-1812-4ee1-aacc-0b012cf07111\") " pod="openstack/placement-db-sync-9llgc" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.011122 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1e56f45-1812-4ee1-aacc-0b012cf07111-scripts\") pod \"placement-db-sync-9llgc\" (UID: \"d1e56f45-1812-4ee1-aacc-0b012cf07111\") " pod="openstack/placement-db-sync-9llgc" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.011218 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1e56f45-1812-4ee1-aacc-0b012cf07111-config-data\") pod \"placement-db-sync-9llgc\" (UID: \"d1e56f45-1812-4ee1-aacc-0b012cf07111\") " pod="openstack/placement-db-sync-9llgc" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.011237 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5035bd54-0aaa-4ff3-b90a-6145145fe95c-scripts\") pod \"cinder-db-sync-fzj6s\" (UID: \"5035bd54-0aaa-4ff3-b90a-6145145fe95c\") " pod="openstack/cinder-db-sync-fzj6s" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.011275 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1e56f45-1812-4ee1-aacc-0b012cf07111-logs\") pod \"placement-db-sync-9llgc\" (UID: \"d1e56f45-1812-4ee1-aacc-0b012cf07111\") " pod="openstack/placement-db-sync-9llgc" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.011347 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5035bd54-0aaa-4ff3-b90a-6145145fe95c-config-data\") pod \"cinder-db-sync-fzj6s\" (UID: \"5035bd54-0aaa-4ff3-b90a-6145145fe95c\") " pod="openstack/cinder-db-sync-fzj6s" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.011386 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/5035bd54-0aaa-4ff3-b90a-6145145fe95c-etc-machine-id\") pod \"cinder-db-sync-fzj6s\" (UID: \"5035bd54-0aaa-4ff3-b90a-6145145fe95c\") " pod="openstack/cinder-db-sync-fzj6s" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.011415 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b63a3af5-a347-40ed-b9bc-52ad70e7ff13-db-sync-config-data\") pod \"barbican-db-sync-tlbnm\" (UID: \"b63a3af5-a347-40ed-b9bc-52ad70e7ff13\") " pod="openstack/barbican-db-sync-tlbnm" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.011440 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5035bd54-0aaa-4ff3-b90a-6145145fe95c-combined-ca-bundle\") pod \"cinder-db-sync-fzj6s\" (UID: \"5035bd54-0aaa-4ff3-b90a-6145145fe95c\") " pod="openstack/cinder-db-sync-fzj6s" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.011471 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5035bd54-0aaa-4ff3-b90a-6145145fe95c-db-sync-config-data\") pod \"cinder-db-sync-fzj6s\" (UID: \"5035bd54-0aaa-4ff3-b90a-6145145fe95c\") " pod="openstack/cinder-db-sync-fzj6s" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.022635 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rw8pm\" (UniqueName: \"kubernetes.io/projected/5035bd54-0aaa-4ff3-b90a-6145145fe95c-kube-api-access-rw8pm\") pod \"cinder-db-sync-fzj6s\" (UID: \"5035bd54-0aaa-4ff3-b90a-6145145fe95c\") " pod="openstack/cinder-db-sync-fzj6s" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.022665 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b63a3af5-a347-40ed-b9bc-52ad70e7ff13-combined-ca-bundle\") pod \"barbican-db-sync-tlbnm\" (UID: \"b63a3af5-a347-40ed-b9bc-52ad70e7ff13\") " pod="openstack/barbican-db-sync-tlbnm" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.033964 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5035bd54-0aaa-4ff3-b90a-6145145fe95c-etc-machine-id\") pod \"cinder-db-sync-fzj6s\" (UID: \"5035bd54-0aaa-4ff3-b90a-6145145fe95c\") " pod="openstack/cinder-db-sync-fzj6s" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.046088 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b63a3af5-a347-40ed-b9bc-52ad70e7ff13-combined-ca-bundle\") pod \"barbican-db-sync-tlbnm\" (UID: \"b63a3af5-a347-40ed-b9bc-52ad70e7ff13\") " pod="openstack/barbican-db-sync-tlbnm" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.047356 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5035bd54-0aaa-4ff3-b90a-6145145fe95c-scripts\") pod \"cinder-db-sync-fzj6s\" (UID: \"5035bd54-0aaa-4ff3-b90a-6145145fe95c\") " pod="openstack/cinder-db-sync-fzj6s" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.051650 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5035bd54-0aaa-4ff3-b90a-6145145fe95c-db-sync-config-data\") pod \"cinder-db-sync-fzj6s\" (UID: \"5035bd54-0aaa-4ff3-b90a-6145145fe95c\") " 
pod="openstack/cinder-db-sync-fzj6s" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.052056 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5035bd54-0aaa-4ff3-b90a-6145145fe95c-combined-ca-bundle\") pod \"cinder-db-sync-fzj6s\" (UID: \"5035bd54-0aaa-4ff3-b90a-6145145fe95c\") " pod="openstack/cinder-db-sync-fzj6s" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.052276 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b63a3af5-a347-40ed-b9bc-52ad70e7ff13-db-sync-config-data\") pod \"barbican-db-sync-tlbnm\" (UID: \"b63a3af5-a347-40ed-b9bc-52ad70e7ff13\") " pod="openstack/barbican-db-sync-tlbnm" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.040324 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5035bd54-0aaa-4ff3-b90a-6145145fe95c-config-data\") pod \"cinder-db-sync-fzj6s\" (UID: \"5035bd54-0aaa-4ff3-b90a-6145145fe95c\") " pod="openstack/cinder-db-sync-fzj6s" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.059763 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rw8pm\" (UniqueName: \"kubernetes.io/projected/5035bd54-0aaa-4ff3-b90a-6145145fe95c-kube-api-access-rw8pm\") pod \"cinder-db-sync-fzj6s\" (UID: \"5035bd54-0aaa-4ff3-b90a-6145145fe95c\") " pod="openstack/cinder-db-sync-fzj6s" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.060177 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbn6m\" (UniqueName: \"kubernetes.io/projected/b63a3af5-a347-40ed-b9bc-52ad70e7ff13-kube-api-access-gbn6m\") pod \"barbican-db-sync-tlbnm\" (UID: \"b63a3af5-a347-40ed-b9bc-52ad70e7ff13\") " pod="openstack/barbican-db-sync-tlbnm" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.113745 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.117193 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.123751 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.123911 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.125992 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/50946f11-a025-4a18-a7b3-3dafa15c3b2f-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-gv4lz\" (UID: \"50946f11-a025-4a18-a7b3-3dafa15c3b2f\") " pod="openstack/dnsmasq-dns-785d8bcb8c-gv4lz" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.126047 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1e56f45-1812-4ee1-aacc-0b012cf07111-scripts\") pod \"placement-db-sync-9llgc\" (UID: \"d1e56f45-1812-4ee1-aacc-0b012cf07111\") " pod="openstack/placement-db-sync-9llgc" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.126109 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50946f11-a025-4a18-a7b3-3dafa15c3b2f-config\") pod \"dnsmasq-dns-785d8bcb8c-gv4lz\" (UID: \"50946f11-a025-4a18-a7b3-3dafa15c3b2f\") " pod="openstack/dnsmasq-dns-785d8bcb8c-gv4lz" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.126190 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1e56f45-1812-4ee1-aacc-0b012cf07111-config-data\") pod \"placement-db-sync-9llgc\" (UID: \"d1e56f45-1812-4ee1-aacc-0b012cf07111\") " pod="openstack/placement-db-sync-9llgc" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.126230 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tx74\" (UniqueName: \"kubernetes.io/projected/50946f11-a025-4a18-a7b3-3dafa15c3b2f-kube-api-access-7tx74\") pod \"dnsmasq-dns-785d8bcb8c-gv4lz\" (UID: \"50946f11-a025-4a18-a7b3-3dafa15c3b2f\") " pod="openstack/dnsmasq-dns-785d8bcb8c-gv4lz" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.126254 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/50946f11-a025-4a18-a7b3-3dafa15c3b2f-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-gv4lz\" (UID: \"50946f11-a025-4a18-a7b3-3dafa15c3b2f\") " pod="openstack/dnsmasq-dns-785d8bcb8c-gv4lz" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.126287 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1e56f45-1812-4ee1-aacc-0b012cf07111-logs\") pod \"placement-db-sync-9llgc\" (UID: \"d1e56f45-1812-4ee1-aacc-0b012cf07111\") " pod="openstack/placement-db-sync-9llgc" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.126337 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/50946f11-a025-4a18-a7b3-3dafa15c3b2f-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-gv4lz\" (UID: \"50946f11-a025-4a18-a7b3-3dafa15c3b2f\") " pod="openstack/dnsmasq-dns-785d8bcb8c-gv4lz" Mar 09 14:25:33 crc kubenswrapper[4722]: 
I0309 14:25:33.126477 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t69c7\" (UniqueName: \"kubernetes.io/projected/d1e56f45-1812-4ee1-aacc-0b012cf07111-kube-api-access-t69c7\") pod \"placement-db-sync-9llgc\" (UID: \"d1e56f45-1812-4ee1-aacc-0b012cf07111\") " pod="openstack/placement-db-sync-9llgc" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.126502 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/50946f11-a025-4a18-a7b3-3dafa15c3b2f-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-gv4lz\" (UID: \"50946f11-a025-4a18-a7b3-3dafa15c3b2f\") " pod="openstack/dnsmasq-dns-785d8bcb8c-gv4lz" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.126545 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1e56f45-1812-4ee1-aacc-0b012cf07111-combined-ca-bundle\") pod \"placement-db-sync-9llgc\" (UID: \"d1e56f45-1812-4ee1-aacc-0b012cf07111\") " pod="openstack/placement-db-sync-9llgc" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.141471 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1e56f45-1812-4ee1-aacc-0b012cf07111-scripts\") pod \"placement-db-sync-9llgc\" (UID: \"d1e56f45-1812-4ee1-aacc-0b012cf07111\") " pod="openstack/placement-db-sync-9llgc" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.144602 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1e56f45-1812-4ee1-aacc-0b012cf07111-combined-ca-bundle\") pod \"placement-db-sync-9llgc\" (UID: \"d1e56f45-1812-4ee1-aacc-0b012cf07111\") " pod="openstack/placement-db-sync-9llgc" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.149077 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1e56f45-1812-4ee1-aacc-0b012cf07111-config-data\") pod \"placement-db-sync-9llgc\" (UID: \"d1e56f45-1812-4ee1-aacc-0b012cf07111\") " pod="openstack/placement-db-sync-9llgc" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.150217 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.151681 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-tlbnm" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.155365 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1e56f45-1812-4ee1-aacc-0b012cf07111-logs\") pod \"placement-db-sync-9llgc\" (UID: \"d1e56f45-1812-4ee1-aacc-0b012cf07111\") " pod="openstack/placement-db-sync-9llgc" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.160110 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t69c7\" (UniqueName: \"kubernetes.io/projected/d1e56f45-1812-4ee1-aacc-0b012cf07111-kube-api-access-t69c7\") pod \"placement-db-sync-9llgc\" (UID: \"d1e56f45-1812-4ee1-aacc-0b012cf07111\") " pod="openstack/placement-db-sync-9llgc" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.234317 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/34700700-cbd0-4a2e-b791-25b63e3de5b8-log-httpd\") pod \"ceilometer-0\" (UID: \"34700700-cbd0-4a2e-b791-25b63e3de5b8\") " pod="openstack/ceilometer-0" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.234405 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/50946f11-a025-4a18-a7b3-3dafa15c3b2f-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-gv4lz\" (UID: \"50946f11-a025-4a18-a7b3-3dafa15c3b2f\") " pod="openstack/dnsmasq-dns-785d8bcb8c-gv4lz" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.234455 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50946f11-a025-4a18-a7b3-3dafa15c3b2f-config\") pod \"dnsmasq-dns-785d8bcb8c-gv4lz\" (UID: \"50946f11-a025-4a18-a7b3-3dafa15c3b2f\") " pod="openstack/dnsmasq-dns-785d8bcb8c-gv4lz" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.234515 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tx74\" (UniqueName: \"kubernetes.io/projected/50946f11-a025-4a18-a7b3-3dafa15c3b2f-kube-api-access-7tx74\") pod \"dnsmasq-dns-785d8bcb8c-gv4lz\" (UID: \"50946f11-a025-4a18-a7b3-3dafa15c3b2f\") " pod="openstack/dnsmasq-dns-785d8bcb8c-gv4lz" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.234540 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/50946f11-a025-4a18-a7b3-3dafa15c3b2f-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-gv4lz\" (UID: \"50946f11-a025-4a18-a7b3-3dafa15c3b2f\") " pod="openstack/dnsmasq-dns-785d8bcb8c-gv4lz" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.234591 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/34700700-cbd0-4a2e-b791-25b63e3de5b8-run-httpd\") pod \"ceilometer-0\" (UID: \"34700700-cbd0-4a2e-b791-25b63e3de5b8\") " pod="openstack/ceilometer-0" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.234622 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndlhs\" (UniqueName: \"kubernetes.io/projected/34700700-cbd0-4a2e-b791-25b63e3de5b8-kube-api-access-ndlhs\") pod \"ceilometer-0\" (UID: \"34700700-cbd0-4a2e-b791-25b63e3de5b8\") " pod="openstack/ceilometer-0" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.234649 4722 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/50946f11-a025-4a18-a7b3-3dafa15c3b2f-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-gv4lz\" (UID: \"50946f11-a025-4a18-a7b3-3dafa15c3b2f\") " pod="openstack/dnsmasq-dns-785d8bcb8c-gv4lz" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.234720 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/34700700-cbd0-4a2e-b791-25b63e3de5b8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"34700700-cbd0-4a2e-b791-25b63e3de5b8\") " pod="openstack/ceilometer-0" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.234758 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34700700-cbd0-4a2e-b791-25b63e3de5b8-config-data\") pod \"ceilometer-0\" (UID: \"34700700-cbd0-4a2e-b791-25b63e3de5b8\") " pod="openstack/ceilometer-0" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.234782 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34700700-cbd0-4a2e-b791-25b63e3de5b8-scripts\") pod \"ceilometer-0\" (UID: \"34700700-cbd0-4a2e-b791-25b63e3de5b8\") " pod="openstack/ceilometer-0" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.234822 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34700700-cbd0-4a2e-b791-25b63e3de5b8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"34700700-cbd0-4a2e-b791-25b63e3de5b8\") " pod="openstack/ceilometer-0" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.234900 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/50946f11-a025-4a18-a7b3-3dafa15c3b2f-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-gv4lz\" (UID: \"50946f11-a025-4a18-a7b3-3dafa15c3b2f\") " pod="openstack/dnsmasq-dns-785d8bcb8c-gv4lz" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.236310 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/50946f11-a025-4a18-a7b3-3dafa15c3b2f-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-gv4lz\" (UID: \"50946f11-a025-4a18-a7b3-3dafa15c3b2f\") " pod="openstack/dnsmasq-dns-785d8bcb8c-gv4lz" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.236978 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/50946f11-a025-4a18-a7b3-3dafa15c3b2f-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-gv4lz\" (UID: \"50946f11-a025-4a18-a7b3-3dafa15c3b2f\") " pod="openstack/dnsmasq-dns-785d8bcb8c-gv4lz" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.237623 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50946f11-a025-4a18-a7b3-3dafa15c3b2f-config\") pod \"dnsmasq-dns-785d8bcb8c-gv4lz\" (UID: \"50946f11-a025-4a18-a7b3-3dafa15c3b2f\") " pod="openstack/dnsmasq-dns-785d8bcb8c-gv4lz" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.238913 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/50946f11-a025-4a18-a7b3-3dafa15c3b2f-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-gv4lz\" (UID: \"50946f11-a025-4a18-a7b3-3dafa15c3b2f\") " pod="openstack/dnsmasq-dns-785d8bcb8c-gv4lz" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.238914 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/50946f11-a025-4a18-a7b3-3dafa15c3b2f-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-gv4lz\" (UID: \"50946f11-a025-4a18-a7b3-3dafa15c3b2f\") " pod="openstack/dnsmasq-dns-785d8bcb8c-gv4lz" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.246967 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-fzj6s" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.266758 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tx74\" (UniqueName: \"kubernetes.io/projected/50946f11-a025-4a18-a7b3-3dafa15c3b2f-kube-api-access-7tx74\") pod \"dnsmasq-dns-785d8bcb8c-gv4lz\" (UID: \"50946f11-a025-4a18-a7b3-3dafa15c3b2f\") " pod="openstack/dnsmasq-dns-785d8bcb8c-gv4lz" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.268161 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-9llgc" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.326850 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-xh4vc"] Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.336573 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/34700700-cbd0-4a2e-b791-25b63e3de5b8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"34700700-cbd0-4a2e-b791-25b63e3de5b8\") " pod="openstack/ceilometer-0" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.336617 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34700700-cbd0-4a2e-b791-25b63e3de5b8-config-data\") pod \"ceilometer-0\" (UID: \"34700700-cbd0-4a2e-b791-25b63e3de5b8\") " pod="openstack/ceilometer-0" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.336636 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34700700-cbd0-4a2e-b791-25b63e3de5b8-scripts\") pod \"ceilometer-0\" (UID: \"34700700-cbd0-4a2e-b791-25b63e3de5b8\") " pod="openstack/ceilometer-0" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.336676 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34700700-cbd0-4a2e-b791-25b63e3de5b8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"34700700-cbd0-4a2e-b791-25b63e3de5b8\") " pod="openstack/ceilometer-0" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.336746 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/34700700-cbd0-4a2e-b791-25b63e3de5b8-log-httpd\") pod \"ceilometer-0\" (UID: \"34700700-cbd0-4a2e-b791-25b63e3de5b8\") " pod="openstack/ceilometer-0" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.336841 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/34700700-cbd0-4a2e-b791-25b63e3de5b8-run-httpd\") pod \"ceilometer-0\" (UID: 
\"34700700-cbd0-4a2e-b791-25b63e3de5b8\") " pod="openstack/ceilometer-0" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.336863 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndlhs\" (UniqueName: \"kubernetes.io/projected/34700700-cbd0-4a2e-b791-25b63e3de5b8-kube-api-access-ndlhs\") pod \"ceilometer-0\" (UID: \"34700700-cbd0-4a2e-b791-25b63e3de5b8\") " pod="openstack/ceilometer-0" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.337714 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/34700700-cbd0-4a2e-b791-25b63e3de5b8-log-httpd\") pod \"ceilometer-0\" (UID: \"34700700-cbd0-4a2e-b791-25b63e3de5b8\") " pod="openstack/ceilometer-0" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.338660 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/34700700-cbd0-4a2e-b791-25b63e3de5b8-run-httpd\") pod \"ceilometer-0\" (UID: \"34700700-cbd0-4a2e-b791-25b63e3de5b8\") " pod="openstack/ceilometer-0" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.350048 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/34700700-cbd0-4a2e-b791-25b63e3de5b8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"34700700-cbd0-4a2e-b791-25b63e3de5b8\") " pod="openstack/ceilometer-0" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.350908 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34700700-cbd0-4a2e-b791-25b63e3de5b8-config-data\") pod \"ceilometer-0\" (UID: \"34700700-cbd0-4a2e-b791-25b63e3de5b8\") " pod="openstack/ceilometer-0" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.353299 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-gv4lz" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.354078 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34700700-cbd0-4a2e-b791-25b63e3de5b8-scripts\") pod \"ceilometer-0\" (UID: \"34700700-cbd0-4a2e-b791-25b63e3de5b8\") " pod="openstack/ceilometer-0" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.358294 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34700700-cbd0-4a2e-b791-25b63e3de5b8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"34700700-cbd0-4a2e-b791-25b63e3de5b8\") " pod="openstack/ceilometer-0" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.367051 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndlhs\" (UniqueName: \"kubernetes.io/projected/34700700-cbd0-4a2e-b791-25b63e3de5b8-kube-api-access-ndlhs\") pod \"ceilometer-0\" (UID: \"34700700-cbd0-4a2e-b791-25b63e3de5b8\") " pod="openstack/ceilometer-0" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.465522 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.494436 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.497799 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.508644 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.508664 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.508934 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-xv77d" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.509126 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.555336 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.638682 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.645257 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.653456 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.653685 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.660627 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e-config-data\") pod \"glance-default-external-api-0\" (UID: \"bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e\") " pod="openstack/glance-default-external-api-0" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.660682 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e\") " pod="openstack/glance-default-external-api-0" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.660854 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e-scripts\") pod \"glance-default-external-api-0\" (UID: \"bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e\") " pod="openstack/glance-default-external-api-0" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.661613 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-92420d06-c977-4be0-b4f7-e2f75390c9aa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-92420d06-c977-4be0-b4f7-e2f75390c9aa\") pod \"glance-default-external-api-0\" (UID: \"bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e\") " pod="openstack/glance-default-external-api-0" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.662531 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e-combined-ca-bundle\") 
pod \"glance-default-external-api-0\" (UID: \"bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e\") " pod="openstack/glance-default-external-api-0" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.662602 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pp6h6\" (UniqueName: \"kubernetes.io/projected/bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e-kube-api-access-pp6h6\") pod \"glance-default-external-api-0\" (UID: \"bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e\") " pod="openstack/glance-default-external-api-0" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.662732 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e\") " pod="openstack/glance-default-external-api-0" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.662768 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e-logs\") pod \"glance-default-external-api-0\" (UID: \"bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e\") " pod="openstack/glance-default-external-api-0" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.673259 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.767872 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pp6h6\" (UniqueName: \"kubernetes.io/projected/bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e-kube-api-access-pp6h6\") pod \"glance-default-external-api-0\" (UID: \"bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e\") " pod="openstack/glance-default-external-api-0" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.768235 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a946c8a-1c04-4453-ac3f-e7305672bc2a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6a946c8a-1c04-4453-ac3f-e7305672bc2a\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.768267 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhgzh\" (UniqueName: \"kubernetes.io/projected/6a946c8a-1c04-4453-ac3f-e7305672bc2a-kube-api-access-jhgzh\") pod \"glance-default-internal-api-0\" (UID: \"6a946c8a-1c04-4453-ac3f-e7305672bc2a\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.768305 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e\") " pod="openstack/glance-default-external-api-0" Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.768323 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e-logs\") pod \"glance-default-external-api-0\" (UID: \"bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e\") " pod="openstack/glance-default-external-api-0" Mar 09 14:25:33 crc 
Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.768380 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e\") " pod="openstack/glance-default-external-api-0"
Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.768420 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6a946c8a-1c04-4453-ac3f-e7305672bc2a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6a946c8a-1c04-4453-ac3f-e7305672bc2a\") " pod="openstack/glance-default-internal-api-0"
Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.768492 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-59bfcd6f-16ca-4d2a-8185-976337e0ec2c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-59bfcd6f-16ca-4d2a-8185-976337e0ec2c\") pod \"glance-default-internal-api-0\" (UID: \"6a946c8a-1c04-4453-ac3f-e7305672bc2a\") " pod="openstack/glance-default-internal-api-0"
Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.768540 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a946c8a-1c04-4453-ac3f-e7305672bc2a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6a946c8a-1c04-4453-ac3f-e7305672bc2a\") " pod="openstack/glance-default-internal-api-0"
Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.768743 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e-scripts\") pod \"glance-default-external-api-0\" (UID: \"bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e\") " pod="openstack/glance-default-external-api-0"
Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.768787 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-92420d06-c977-4be0-b4f7-e2f75390c9aa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-92420d06-c977-4be0-b4f7-e2f75390c9aa\") pod \"glance-default-external-api-0\" (UID: \"bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e\") " pod="openstack/glance-default-external-api-0"
Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.768808 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a946c8a-1c04-4453-ac3f-e7305672bc2a-logs\") pod \"glance-default-internal-api-0\" (UID: \"6a946c8a-1c04-4453-ac3f-e7305672bc2a\") " pod="openstack/glance-default-internal-api-0"
Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.768871 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a946c8a-1c04-4453-ac3f-e7305672bc2a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6a946c8a-1c04-4453-ac3f-e7305672bc2a\") " pod="openstack/glance-default-internal-api-0"
Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.768892 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e\") " pod="openstack/glance-default-external-api-0"
Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.768913 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a946c8a-1c04-4453-ac3f-e7305672bc2a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6a946c8a-1c04-4453-ac3f-e7305672bc2a\") " pod="openstack/glance-default-internal-api-0"
Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.770719 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e-logs\") pod \"glance-default-external-api-0\" (UID: \"bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e\") " pod="openstack/glance-default-external-api-0"
Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.774684 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e\") " pod="openstack/glance-default-external-api-0"
Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.783546 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e\") " pod="openstack/glance-default-external-api-0"
Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.785397 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e-scripts\") pod \"glance-default-external-api-0\" (UID: \"bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e\") " pod="openstack/glance-default-external-api-0"
Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.788733 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e\") " pod="openstack/glance-default-external-api-0"
Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.789388 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e-config-data\") pod \"glance-default-external-api-0\" (UID: \"bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e\") " pod="openstack/glance-default-external-api-0"
Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.791038 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.791074 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-92420d06-c977-4be0-b4f7-e2f75390c9aa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-92420d06-c977-4be0-b4f7-e2f75390c9aa\") pod \"glance-default-external-api-0\" (UID: \"bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/4bd5fd036fa183ff3c7bd061e321acfa035788f5e30cbd29138724604e749ce5/globalmount\"" pod="openstack/glance-default-external-api-0"
Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.824794 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-6ssn4"]
Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.848445 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pp6h6\" (UniqueName: \"kubernetes.io/projected/bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e-kube-api-access-pp6h6\") pod \"glance-default-external-api-0\" (UID: \"bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e\") " pod="openstack/glance-default-external-api-0"
Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.871865 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a946c8a-1c04-4453-ac3f-e7305672bc2a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6a946c8a-1c04-4453-ac3f-e7305672bc2a\") " pod="openstack/glance-default-internal-api-0"
Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.871924 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhgzh\" (UniqueName: \"kubernetes.io/projected/6a946c8a-1c04-4453-ac3f-e7305672bc2a-kube-api-access-jhgzh\") pod \"glance-default-internal-api-0\" (UID: \"6a946c8a-1c04-4453-ac3f-e7305672bc2a\") " pod="openstack/glance-default-internal-api-0"
Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.871992 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6a946c8a-1c04-4453-ac3f-e7305672bc2a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6a946c8a-1c04-4453-ac3f-e7305672bc2a\") " pod="openstack/glance-default-internal-api-0"
Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.872051 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-59bfcd6f-16ca-4d2a-8185-976337e0ec2c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-59bfcd6f-16ca-4d2a-8185-976337e0ec2c\") pod \"glance-default-internal-api-0\" (UID: \"6a946c8a-1c04-4453-ac3f-e7305672bc2a\") " pod="openstack/glance-default-internal-api-0"
Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.872084 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a946c8a-1c04-4453-ac3f-e7305672bc2a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6a946c8a-1c04-4453-ac3f-e7305672bc2a\") " pod="openstack/glance-default-internal-api-0"
Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.872130 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a946c8a-1c04-4453-ac3f-e7305672bc2a-logs\") pod \"glance-default-internal-api-0\" (UID: \"6a946c8a-1c04-4453-ac3f-e7305672bc2a\") " pod="openstack/glance-default-internal-api-0"
Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.872169 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a946c8a-1c04-4453-ac3f-e7305672bc2a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6a946c8a-1c04-4453-ac3f-e7305672bc2a\") " pod="openstack/glance-default-internal-api-0"
Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.872193 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a946c8a-1c04-4453-ac3f-e7305672bc2a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6a946c8a-1c04-4453-ac3f-e7305672bc2a\") " pod="openstack/glance-default-internal-api-0"
Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.875815 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a946c8a-1c04-4453-ac3f-e7305672bc2a-logs\") pod \"glance-default-internal-api-0\" (UID: \"6a946c8a-1c04-4453-ac3f-e7305672bc2a\") " pod="openstack/glance-default-internal-api-0"
Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.884954 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6a946c8a-1c04-4453-ac3f-e7305672bc2a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6a946c8a-1c04-4453-ac3f-e7305672bc2a\") " pod="openstack/glance-default-internal-api-0"
Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.897612 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-j4lgt"]
Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.897790 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a946c8a-1c04-4453-ac3f-e7305672bc2a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6a946c8a-1c04-4453-ac3f-e7305672bc2a\") " pod="openstack/glance-default-internal-api-0"
Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.907835 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a946c8a-1c04-4453-ac3f-e7305672bc2a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6a946c8a-1c04-4453-ac3f-e7305672bc2a\") " pod="openstack/glance-default-internal-api-0"
Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.910672 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a946c8a-1c04-4453-ac3f-e7305672bc2a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6a946c8a-1c04-4453-ac3f-e7305672bc2a\") " pod="openstack/glance-default-internal-api-0"
Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.914304 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a946c8a-1c04-4453-ac3f-e7305672bc2a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6a946c8a-1c04-4453-ac3f-e7305672bc2a\") " pod="openstack/glance-default-internal-api-0"
Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.926053 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhgzh\" (UniqueName: \"kubernetes.io/projected/6a946c8a-1c04-4453-ac3f-e7305672bc2a-kube-api-access-jhgzh\") pod \"glance-default-internal-api-0\" (UID: \"6a946c8a-1c04-4453-ac3f-e7305672bc2a\") " pod="openstack/glance-default-internal-api-0"
Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.972809 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 09 14:25:33 crc kubenswrapper[4722]: I0309 14:25:33.972873 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-59bfcd6f-16ca-4d2a-8185-976337e0ec2c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-59bfcd6f-16ca-4d2a-8185-976337e0ec2c\") pod \"glance-default-internal-api-0\" (UID: \"6a946c8a-1c04-4453-ac3f-e7305672bc2a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a3395c88a7dc2d3ef264f22a8309ab5263d0d43341a96b8565f3c55ce5be97e0/globalmount\"" pod="openstack/glance-default-internal-api-0"
Mar 09 14:25:34 crc kubenswrapper[4722]: I0309 14:25:34.032242 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-xh4vc" event={"ID":"25586930-1c5f-4223-bbda-9c045110613e","Type":"ContainerStarted","Data":"1d6fa7543d99d7e40043c5b14d06c90e608052360d96302b0779b57661dc8a22"}
Mar 09 14:25:34 crc kubenswrapper[4722]: I0309 14:25:34.092549 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-j4lgt" event={"ID":"7f51218c-6b15-4f4a-ad49-1ba0ccd5e292","Type":"ContainerStarted","Data":"766d7400c3f1f6e3f5e39cbc7a00cea245bd30221eb791c1d7edeee72536af50"}
Mar 09 14:25:34 crc kubenswrapper[4722]: I0309 14:25:34.105452 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-6ssn4" event={"ID":"f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b","Type":"ContainerStarted","Data":"a51d9e78db82f603fda3f557a68aa6a7effd64cb2a1ae34cb042b43fc045bea8"}
Mar 09 14:25:34 crc kubenswrapper[4722]: I0309 14:25:34.182331 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-59bfcd6f-16ca-4d2a-8185-976337e0ec2c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-59bfcd6f-16ca-4d2a-8185-976337e0ec2c\") pod \"glance-default-internal-api-0\" (UID: \"6a946c8a-1c04-4453-ac3f-e7305672bc2a\") " pod="openstack/glance-default-internal-api-0"
Mar 09 14:25:34 crc kubenswrapper[4722]: I0309 14:25:34.211259 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-l9wtc"]
Mar 09 14:25:34 crc kubenswrapper[4722]: I0309 14:25:34.251767 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-92420d06-c977-4be0-b4f7-e2f75390c9aa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-92420d06-c977-4be0-b4f7-e2f75390c9aa\") pod \"glance-default-external-api-0\" (UID: \"bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e\") " pod="openstack/glance-default-external-api-0"
Mar 09 14:25:34 crc kubenswrapper[4722]: I0309 14:25:34.333704 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 09 14:25:34 crc kubenswrapper[4722]: I0309 14:25:34.480116 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 09 14:25:34 crc kubenswrapper[4722]: I0309 14:25:34.508690 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-tlbnm"]
Mar 09 14:25:34 crc kubenswrapper[4722]: I0309 14:25:34.552153 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-9llgc"]
Mar 09 14:25:34 crc kubenswrapper[4722]: I0309 14:25:34.594444 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-fzj6s"]
Mar 09 14:25:34 crc kubenswrapper[4722]: I0309 14:25:34.658703 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 09 14:25:34 crc kubenswrapper[4722]: I0309 14:25:34.765125 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 09 14:25:34 crc kubenswrapper[4722]: I0309 14:25:34.891072 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-gv4lz"]
Mar 09 14:25:34 crc kubenswrapper[4722]: I0309 14:25:34.949321 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 09 14:25:35 crc kubenswrapper[4722]: I0309 14:25:35.139534 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-tlbnm" event={"ID":"b63a3af5-a347-40ed-b9bc-52ad70e7ff13","Type":"ContainerStarted","Data":"0417d7acb31f6478d8c5a2810f3257e0cd333fa09a260f974fedc6ed106d91db"}
Mar 09 14:25:35 crc kubenswrapper[4722]: I0309 14:25:35.155032 4722 generic.go:334] "Generic (PLEG): container finished" podID="25586930-1c5f-4223-bbda-9c045110613e" containerID="5fdecc292050654160a6e72539b0e1931ad0b189ee273f70c7ded2180ce430f3" exitCode=0
Mar 09 14:25:35 crc kubenswrapper[4722]: I0309 14:25:35.155100 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-xh4vc" event={"ID":"25586930-1c5f-4223-bbda-9c045110613e","Type":"ContainerDied","Data":"5fdecc292050654160a6e72539b0e1931ad0b189ee273f70c7ded2180ce430f3"}
Mar 09 14:25:35 crc kubenswrapper[4722]: I0309 14:25:35.160530 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-9llgc" event={"ID":"d1e56f45-1812-4ee1-aacc-0b012cf07111","Type":"ContainerStarted","Data":"7eac4554a11940cebe903bf3e04f5c16028a3e281dccd0c3261949ceabb29e88"}
Mar 09 14:25:35 crc kubenswrapper[4722]: I0309 14:25:35.165568 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-gv4lz" event={"ID":"50946f11-a025-4a18-a7b3-3dafa15c3b2f","Type":"ContainerStarted","Data":"fbecf7eaf5b6bff6668a614978bf1bf63ae89f079f6543d4dad2e46116f01992"}
Mar 09 14:25:35 crc kubenswrapper[4722]: I0309 14:25:35.167576 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-6ssn4" event={"ID":"f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b","Type":"ContainerStarted","Data":"730a73dac07e27476a0ac62dd9ca48a96be66fe6494762e1268c7950776a78d1"}
Mar 09 14:25:35 crc kubenswrapper[4722]: I0309 14:25:35.172004 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-l9wtc" event={"ID":"69530d68-be96-4605-be46-8053083aa178","Type":"ContainerStarted","Data":"911525eabae91f320ecd486d8c9736832ea7f1256cdd8b97a7ae5e490d7c6c14"}
Mar 09 14:25:35 crc kubenswrapper[4722]: I0309 14:25:35.172048 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-l9wtc" event={"ID":"69530d68-be96-4605-be46-8053083aa178","Type":"ContainerStarted","Data":"3c9e54fbcde6482e4278145ee4c88bc3f0b0e9fa8cb13d43852c8ab34ce8cd32"}
Mar 09 14:25:35 crc kubenswrapper[4722]: I0309 14:25:35.173463 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"34700700-cbd0-4a2e-b791-25b63e3de5b8","Type":"ContainerStarted","Data":"0ea2488b2edd679df22bff5d6545eaf705fc305844f02792062e8792aeb721c9"}
Mar 09 14:25:35 crc kubenswrapper[4722]: I0309 14:25:35.200483 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-fzj6s" event={"ID":"5035bd54-0aaa-4ff3-b90a-6145145fe95c","Type":"ContainerStarted","Data":"96928eacf917034a694476bc33209ba61518a7241f7699fec5544d7ef3781c9a"}
Mar 09 14:25:35 crc kubenswrapper[4722]: W0309 14:25:35.216342 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a946c8a_1c04_4453_ac3f_e7305672bc2a.slice/crio-06c66c84ab1a3cf79dbd647d7c3c822602cecd95915fca8df8e3d1b8bcacf221 WatchSource:0}: Error finding container 06c66c84ab1a3cf79dbd647d7c3c822602cecd95915fca8df8e3d1b8bcacf221: Status 404 returned error can't find the container with id 06c66c84ab1a3cf79dbd647d7c3c822602cecd95915fca8df8e3d1b8bcacf221
Mar 09 14:25:35 crc kubenswrapper[4722]: I0309 14:25:35.241196 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 09 14:25:35 crc kubenswrapper[4722]: I0309 14:25:35.252018 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-6ssn4" podStartSLOduration=3.251997325 podStartE2EDuration="3.251997325s" podCreationTimestamp="2026-03-09 14:25:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:25:35.195057109 +0000 UTC m=+1375.750625685" watchObservedRunningTime="2026-03-09 14:25:35.251997325 +0000 UTC m=+1375.807565901"
Mar 09 14:25:35 crc kubenswrapper[4722]: I0309 14:25:35.252408 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-l9wtc" podStartSLOduration=3.252404867 podStartE2EDuration="3.252404867s" podCreationTimestamp="2026-03-09 14:25:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:25:35.216630816 +0000 UTC m=+1375.772199392" watchObservedRunningTime="2026-03-09 14:25:35.252404867 +0000 UTC m=+1375.807973443"
Mar 09 14:25:35 crc kubenswrapper[4722]: I0309 14:25:35.270128 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 09 14:25:35 crc kubenswrapper[4722]: W0309 14:25:35.413491 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbdf48693_5b1f_466a_ac6d_a3a4dcdcf80e.slice/crio-549b983b2229843f7ccff13adad00f9907e8e999a3d10d18b1763326c2e20c13 WatchSource:0}: Error finding container 549b983b2229843f7ccff13adad00f9907e8e999a3d10d18b1763326c2e20c13: Status 404 returned error can't find the container with id 549b983b2229843f7ccff13adad00f9907e8e999a3d10d18b1763326c2e20c13
Mar 09 14:25:35 crc kubenswrapper[4722]: I0309 14:25:35.413635 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 09 14:25:35 crc kubenswrapper[4722]: I0309 14:25:35.770409 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-xh4vc"
Mar 09 14:25:35 crc kubenswrapper[4722]: I0309 14:25:35.964838 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25586930-1c5f-4223-bbda-9c045110613e-config\") pod \"25586930-1c5f-4223-bbda-9c045110613e\" (UID: \"25586930-1c5f-4223-bbda-9c045110613e\") "
Mar 09 14:25:35 crc kubenswrapper[4722]: I0309 14:25:35.965216 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/25586930-1c5f-4223-bbda-9c045110613e-dns-swift-storage-0\") pod \"25586930-1c5f-4223-bbda-9c045110613e\" (UID: \"25586930-1c5f-4223-bbda-9c045110613e\") "
Mar 09 14:25:35 crc kubenswrapper[4722]: I0309 14:25:35.965300 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25586930-1c5f-4223-bbda-9c045110613e-ovsdbserver-nb\") pod \"25586930-1c5f-4223-bbda-9c045110613e\" (UID: \"25586930-1c5f-4223-bbda-9c045110613e\") "
Mar 09 14:25:35 crc kubenswrapper[4722]: I0309 14:25:35.965327 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25586930-1c5f-4223-bbda-9c045110613e-dns-svc\") pod \"25586930-1c5f-4223-bbda-9c045110613e\" (UID: \"25586930-1c5f-4223-bbda-9c045110613e\") "
Mar 09 14:25:35 crc kubenswrapper[4722]: I0309 14:25:35.965379 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zltk4\" (UniqueName: \"kubernetes.io/projected/25586930-1c5f-4223-bbda-9c045110613e-kube-api-access-zltk4\") pod \"25586930-1c5f-4223-bbda-9c045110613e\" (UID: \"25586930-1c5f-4223-bbda-9c045110613e\") "
Mar 09 14:25:35 crc kubenswrapper[4722]: I0309 14:25:35.965413 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25586930-1c5f-4223-bbda-9c045110613e-ovsdbserver-sb\") pod \"25586930-1c5f-4223-bbda-9c045110613e\" (UID: \"25586930-1c5f-4223-bbda-9c045110613e\") "
Mar 09 14:25:36 crc kubenswrapper[4722]: I0309 14:25:36.009925 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25586930-1c5f-4223-bbda-9c045110613e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "25586930-1c5f-4223-bbda-9c045110613e" (UID: "25586930-1c5f-4223-bbda-9c045110613e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 14:25:36 crc kubenswrapper[4722]: I0309 14:25:36.072359 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/25586930-1c5f-4223-bbda-9c045110613e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 09 14:25:36 crc kubenswrapper[4722]: I0309 14:25:36.077393 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25586930-1c5f-4223-bbda-9c045110613e-kube-api-access-zltk4" (OuterVolumeSpecName: "kube-api-access-zltk4") pod "25586930-1c5f-4223-bbda-9c045110613e" (UID: "25586930-1c5f-4223-bbda-9c045110613e"). InnerVolumeSpecName "kube-api-access-zltk4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:25:36 crc kubenswrapper[4722]: I0309 14:25:36.089429 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25586930-1c5f-4223-bbda-9c045110613e-config" (OuterVolumeSpecName: "config") pod "25586930-1c5f-4223-bbda-9c045110613e" (UID: "25586930-1c5f-4223-bbda-9c045110613e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 14:25:36 crc kubenswrapper[4722]: I0309 14:25:36.158782 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25586930-1c5f-4223-bbda-9c045110613e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "25586930-1c5f-4223-bbda-9c045110613e" (UID: "25586930-1c5f-4223-bbda-9c045110613e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 14:25:36 crc kubenswrapper[4722]: I0309 14:25:36.176921 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zltk4\" (UniqueName: \"kubernetes.io/projected/25586930-1c5f-4223-bbda-9c045110613e-kube-api-access-zltk4\") on node \"crc\" DevicePath \"\""
Mar 09 14:25:36 crc kubenswrapper[4722]: I0309 14:25:36.176948 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25586930-1c5f-4223-bbda-9c045110613e-config\") on node \"crc\" DevicePath \"\""
Mar 09 14:25:36 crc kubenswrapper[4722]: I0309 14:25:36.176958 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25586930-1c5f-4223-bbda-9c045110613e-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 09 14:25:36 crc kubenswrapper[4722]: I0309 14:25:36.195631 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25586930-1c5f-4223-bbda-9c045110613e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "25586930-1c5f-4223-bbda-9c045110613e" (UID: "25586930-1c5f-4223-bbda-9c045110613e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 14:25:36 crc kubenswrapper[4722]: I0309 14:25:36.203357 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25586930-1c5f-4223-bbda-9c045110613e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "25586930-1c5f-4223-bbda-9c045110613e" (UID: "25586930-1c5f-4223-bbda-9c045110613e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 14:25:36 crc kubenswrapper[4722]: I0309 14:25:36.254647 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e","Type":"ContainerStarted","Data":"549b983b2229843f7ccff13adad00f9907e8e999a3d10d18b1763326c2e20c13"}
Mar 09 14:25:36 crc kubenswrapper[4722]: I0309 14:25:36.265045 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-xh4vc" event={"ID":"25586930-1c5f-4223-bbda-9c045110613e","Type":"ContainerDied","Data":"1d6fa7543d99d7e40043c5b14d06c90e608052360d96302b0779b57661dc8a22"}
Mar 09 14:25:36 crc kubenswrapper[4722]: I0309 14:25:36.265086 4722 scope.go:117] "RemoveContainer" containerID="5fdecc292050654160a6e72539b0e1931ad0b189ee273f70c7ded2180ce430f3"
Mar 09 14:25:36 crc kubenswrapper[4722]: I0309 14:25:36.265214 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-xh4vc"
Mar 09 14:25:36 crc kubenswrapper[4722]: I0309 14:25:36.280714 4722 generic.go:334] "Generic (PLEG): container finished" podID="50946f11-a025-4a18-a7b3-3dafa15c3b2f" containerID="b63bb0bbcc06518f6bdcc16a44a7db665233979d61911135e7b9af0d5c85aa53" exitCode=0
Mar 09 14:25:36 crc kubenswrapper[4722]: I0309 14:25:36.282904 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-gv4lz" event={"ID":"50946f11-a025-4a18-a7b3-3dafa15c3b2f","Type":"ContainerDied","Data":"b63bb0bbcc06518f6bdcc16a44a7db665233979d61911135e7b9af0d5c85aa53"}
Mar 09 14:25:36 crc kubenswrapper[4722]: I0309 14:25:36.285347 4722 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/25586930-1c5f-4223-bbda-9c045110613e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 09 14:25:36 crc kubenswrapper[4722]: I0309 14:25:36.285390 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25586930-1c5f-4223-bbda-9c045110613e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 09 14:25:36 crc kubenswrapper[4722]: I0309 14:25:36.306669 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6a946c8a-1c04-4453-ac3f-e7305672bc2a","Type":"ContainerStarted","Data":"06c66c84ab1a3cf79dbd647d7c3c822602cecd95915fca8df8e3d1b8bcacf221"}
Mar 09 14:25:36 crc kubenswrapper[4722]: I0309 14:25:36.383723 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-xh4vc"]
Mar 09 14:25:36 crc kubenswrapper[4722]: I0309 14:25:36.424436 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-xh4vc"]
Mar 09 14:25:36 crc kubenswrapper[4722]: E0309 14:25:36.661051 4722 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod25586930_1c5f_4223_bbda_9c045110613e.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod25586930_1c5f_4223_bbda_9c045110613e.slice/crio-1d6fa7543d99d7e40043c5b14d06c90e608052360d96302b0779b57661dc8a22\": RecentStats: unable to find data in memory cache]"
Mar 09 14:25:37 crc kubenswrapper[4722]: I0309 14:25:37.338120 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-gv4lz" event={"ID":"50946f11-a025-4a18-a7b3-3dafa15c3b2f","Type":"ContainerStarted","Data":"8462432b5cf6bd80898d429fde3ac72bf95277a5349b45032409711c6330e695"}
Mar 09 14:25:37 crc kubenswrapper[4722]: I0309 14:25:37.339422 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-785d8bcb8c-gv4lz"
Mar 09 14:25:37 crc kubenswrapper[4722]: I0309 14:25:37.342390 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6a946c8a-1c04-4453-ac3f-e7305672bc2a","Type":"ContainerStarted","Data":"ee8056cff9b66b64015f6a67e20f17d900daef19b7e1712f52cb634e0b3b8a43"}
Mar 09 14:25:37 crc kubenswrapper[4722]: I0309 14:25:37.344783 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e","Type":"ContainerStarted","Data":"fa472c6776005131d535047382281f2a71af316d713f97635d857b68b418fa16"}
Mar 09 14:25:37 crc kubenswrapper[4722]: I0309 14:25:37.361582 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-785d8bcb8c-gv4lz" podStartSLOduration=5.3615637320000005 podStartE2EDuration="5.361563732s" podCreationTimestamp="2026-03-09 14:25:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:25:37.358452237 +0000 UTC m=+1377.914020813" watchObservedRunningTime="2026-03-09 14:25:37.361563732 +0000 UTC m=+1377.917132308"
Mar 09 14:25:38 crc kubenswrapper[4722]: I0309 14:25:38.168001 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25586930-1c5f-4223-bbda-9c045110613e" path="/var/lib/kubelet/pods/25586930-1c5f-4223-bbda-9c045110613e/volumes"
Mar 09 14:25:38 crc kubenswrapper[4722]: I0309 14:25:38.360646 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="6a946c8a-1c04-4453-ac3f-e7305672bc2a" containerName="glance-log" containerID="cri-o://ee8056cff9b66b64015f6a67e20f17d900daef19b7e1712f52cb634e0b3b8a43" gracePeriod=30
Mar 09 14:25:38 crc kubenswrapper[4722]: I0309 14:25:38.360698 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="6a946c8a-1c04-4453-ac3f-e7305672bc2a" containerName="glance-httpd" containerID="cri-o://6a3438f512f7c0694330f71d0207e49281ab8f2c8c1ead1d5fe2173fab3274d8" gracePeriod=30
Mar 09 14:25:38 crc kubenswrapper[4722]: I0309 14:25:38.361987 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6a946c8a-1c04-4453-ac3f-e7305672bc2a","Type":"ContainerStarted","Data":"6a3438f512f7c0694330f71d0207e49281ab8f2c8c1ead1d5fe2173fab3274d8"}
Mar 09 14:25:38 crc kubenswrapper[4722]: I0309 14:25:38.372548 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e","Type":"ContainerStarted","Data":"1ad4a3032934339b7242327bcac91622a2eb45e3bcae1e676a15748689ebb407"}
Mar 09 14:25:38 crc kubenswrapper[4722]: I0309 14:25:38.372726 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e" containerName="glance-log" containerID="cri-o://fa472c6776005131d535047382281f2a71af316d713f97635d857b68b418fa16" gracePeriod=30
Mar 09 14:25:38 crc kubenswrapper[4722]: I0309 14:25:38.372851 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e" containerName="glance-httpd" containerID="cri-o://1ad4a3032934339b7242327bcac91622a2eb45e3bcae1e676a15748689ebb407" gracePeriod=30
Mar 09 14:25:38 crc kubenswrapper[4722]: I0309 14:25:38.392155 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.392135756 podStartE2EDuration="6.392135756s" podCreationTimestamp="2026-03-09 14:25:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:25:38.377383527 +0000 UTC m=+1378.932952103" watchObservedRunningTime="2026-03-09 14:25:38.392135756 +0000 UTC m=+1378.947704332"
Mar 09 14:25:38 crc kubenswrapper[4722]: I0309 14:25:38.418860 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=6.418836065 podStartE2EDuration="6.418836065s" podCreationTimestamp="2026-03-09 14:25:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:25:38.404667573 +0000 UTC m=+1378.960236149" watchObservedRunningTime="2026-03-09 14:25:38.418836065 +0000 UTC m=+1378.974404641"
Mar 09 14:25:39 crc kubenswrapper[4722]: I0309 14:25:39.387939 4722 generic.go:334] "Generic (PLEG): container finished" podID="bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e" containerID="1ad4a3032934339b7242327bcac91622a2eb45e3bcae1e676a15748689ebb407" exitCode=0
Mar 09 14:25:39 crc kubenswrapper[4722]: I0309 14:25:39.388296 4722 generic.go:334] "Generic (PLEG): container finished" podID="bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e" containerID="fa472c6776005131d535047382281f2a71af316d713f97635d857b68b418fa16" exitCode=143
Mar 09 14:25:39 crc kubenswrapper[4722]: I0309 14:25:39.388006 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e","Type":"ContainerDied","Data":"1ad4a3032934339b7242327bcac91622a2eb45e3bcae1e676a15748689ebb407"}
Mar 09 14:25:39 crc kubenswrapper[4722]: I0309 14:25:39.388396 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e","Type":"ContainerDied","Data":"fa472c6776005131d535047382281f2a71af316d713f97635d857b68b418fa16"}
Mar 09 14:25:39 crc kubenswrapper[4722]: I0309 14:25:39.391667 4722 generic.go:334] "Generic (PLEG): container finished" podID="6a946c8a-1c04-4453-ac3f-e7305672bc2a" containerID="6a3438f512f7c0694330f71d0207e49281ab8f2c8c1ead1d5fe2173fab3274d8" exitCode=0
Mar 09 14:25:39 crc kubenswrapper[4722]: I0309 14:25:39.391694 4722 generic.go:334] "Generic (PLEG): container finished" podID="6a946c8a-1c04-4453-ac3f-e7305672bc2a" containerID="ee8056cff9b66b64015f6a67e20f17d900daef19b7e1712f52cb634e0b3b8a43" exitCode=143
Mar 09 14:25:39 crc kubenswrapper[4722]: I0309 14:25:39.391715 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6a946c8a-1c04-4453-ac3f-e7305672bc2a","Type":"ContainerDied","Data":"6a3438f512f7c0694330f71d0207e49281ab8f2c8c1ead1d5fe2173fab3274d8"}
Mar 09 14:25:39 crc kubenswrapper[4722]: I0309 14:25:39.391744 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6a946c8a-1c04-4453-ac3f-e7305672bc2a","Type":"ContainerDied","Data":"ee8056cff9b66b64015f6a67e20f17d900daef19b7e1712f52cb634e0b3b8a43"}
Mar 09 14:25:40 crc kubenswrapper[4722]: I0309 14:25:40.403688 4722 generic.go:334] "Generic (PLEG): container finished" podID="f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b" containerID="730a73dac07e27476a0ac62dd9ca48a96be66fe6494762e1268c7950776a78d1" exitCode=0
Mar 09 14:25:40 crc kubenswrapper[4722]: I0309 14:25:40.403783 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-6ssn4" event={"ID":"f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b","Type":"ContainerDied","Data":"730a73dac07e27476a0ac62dd9ca48a96be66fe6494762e1268c7950776a78d1"}
Mar 09 14:25:43 crc kubenswrapper[4722]: I0309 14:25:43.355421 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-785d8bcb8c-gv4lz"
Mar 09 14:25:43 crc kubenswrapper[4722]: I0309 14:25:43.415145 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-qccbt"]
Mar 09 14:25:43 crc kubenswrapper[4722]: I0309 14:25:43.415394 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-74f6bcbc87-qccbt" podUID="f87e19a1-bff1-4e82-8677-3812b2f51f46" containerName="dnsmasq-dns" containerID="cri-o://d7a02dfe8d14a05bddff13ef5f261df15fa1e4afa9505ac0d75001bbd58e3903" gracePeriod=10
Mar 09 14:25:44 crc kubenswrapper[4722]: I0309 14:25:44.361440 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-qccbt" podUID="f87e19a1-bff1-4e82-8677-3812b2f51f46" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.172:5353: connect: connection refused"
Mar 09 14:25:44 crc kubenswrapper[4722]: I0309 14:25:44.470843 4722 generic.go:334] "Generic (PLEG): container finished" podID="f87e19a1-bff1-4e82-8677-3812b2f51f46" containerID="d7a02dfe8d14a05bddff13ef5f261df15fa1e4afa9505ac0d75001bbd58e3903" exitCode=0
Mar 09 14:25:44 crc kubenswrapper[4722]: I0309 14:25:44.471075 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-qccbt" event={"ID":"f87e19a1-bff1-4e82-8677-3812b2f51f46","Type":"ContainerDied","Data":"d7a02dfe8d14a05bddff13ef5f261df15fa1e4afa9505ac0d75001bbd58e3903"}
Mar 09 14:25:45 crc kubenswrapper[4722]: I0309 14:25:45.871051 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 09 14:25:45 crc kubenswrapper[4722]: I0309 14:25:45.879386 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.000916 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a946c8a-1c04-4453-ac3f-e7305672bc2a-scripts\") pod \"6a946c8a-1c04-4453-ac3f-e7305672bc2a\" (UID: \"6a946c8a-1c04-4453-ac3f-e7305672bc2a\") "
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.000971 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a946c8a-1c04-4453-ac3f-e7305672bc2a-logs\") pod \"6a946c8a-1c04-4453-ac3f-e7305672bc2a\" (UID: \"6a946c8a-1c04-4453-ac3f-e7305672bc2a\") "
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.001015 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a946c8a-1c04-4453-ac3f-e7305672bc2a-combined-ca-bundle\") pod \"6a946c8a-1c04-4453-ac3f-e7305672bc2a\" (UID: \"6a946c8a-1c04-4453-ac3f-e7305672bc2a\") "
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.001043 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e-scripts\") pod \"bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e\" (UID: \"bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e\") "
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.001068 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e-combined-ca-bundle\") pod \"bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e\" (UID: \"bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e\") "
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.001115 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e-logs\") pod \"bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e\" (UID: \"bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e\") "
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.001259 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-92420d06-c977-4be0-b4f7-e2f75390c9aa\") pod \"bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e\" (UID: \"bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e\") "
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.001296 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e-public-tls-certs\") pod \"bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e\" (UID: \"bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e\") "
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.001334 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhgzh\" (UniqueName: \"kubernetes.io/projected/6a946c8a-1c04-4453-ac3f-e7305672bc2a-kube-api-access-jhgzh\") pod \"6a946c8a-1c04-4453-ac3f-e7305672bc2a\" (UID: \"6a946c8a-1c04-4453-ac3f-e7305672bc2a\") "
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.001377 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6a946c8a-1c04-4453-ac3f-e7305672bc2a-httpd-run\") pod \"6a946c8a-1c04-4453-ac3f-e7305672bc2a\" (UID: \"6a946c8a-1c04-4453-ac3f-e7305672bc2a\") "
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.001430 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-59bfcd6f-16ca-4d2a-8185-976337e0ec2c\") pod \"6a946c8a-1c04-4453-ac3f-e7305672bc2a\" (UID: \"6a946c8a-1c04-4453-ac3f-e7305672bc2a\") "
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.001487 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e-httpd-run\") pod \"bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e\" (UID: \"bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e\") "
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.001524 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e-config-data\") pod \"bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e\" (UID: \"bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e\") "
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.001542 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a946c8a-1c04-4453-ac3f-e7305672bc2a-internal-tls-certs\") pod \"6a946c8a-1c04-4453-ac3f-e7305672bc2a\" (UID: \"6a946c8a-1c04-4453-ac3f-e7305672bc2a\") "
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.001583 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pp6h6\" (UniqueName: \"kubernetes.io/projected/bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e-kube-api-access-pp6h6\") pod \"bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e\" (UID: \"bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e\") "
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.001622 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a946c8a-1c04-4453-ac3f-e7305672bc2a-config-data\") pod \"6a946c8a-1c04-4453-ac3f-e7305672bc2a\" (UID: \"6a946c8a-1c04-4453-ac3f-e7305672bc2a\") "
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.002375 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e-logs" (OuterVolumeSpecName: "logs") pod "bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e" (UID: "bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.002770 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e" (UID: "bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.003168 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a946c8a-1c04-4453-ac3f-e7305672bc2a-logs" (OuterVolumeSpecName: "logs") pod "6a946c8a-1c04-4453-ac3f-e7305672bc2a" (UID: "6a946c8a-1c04-4453-ac3f-e7305672bc2a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.003321 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a946c8a-1c04-4453-ac3f-e7305672bc2a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "6a946c8a-1c04-4453-ac3f-e7305672bc2a" (UID: "6a946c8a-1c04-4453-ac3f-e7305672bc2a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.008137 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e-kube-api-access-pp6h6" (OuterVolumeSpecName: "kube-api-access-pp6h6") pod "bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e" (UID: "bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e"). InnerVolumeSpecName "kube-api-access-pp6h6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.008871 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a946c8a-1c04-4453-ac3f-e7305672bc2a-scripts" (OuterVolumeSpecName: "scripts") pod "6a946c8a-1c04-4453-ac3f-e7305672bc2a" (UID: "6a946c8a-1c04-4453-ac3f-e7305672bc2a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.010597 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a946c8a-1c04-4453-ac3f-e7305672bc2a-kube-api-access-jhgzh" (OuterVolumeSpecName: "kube-api-access-jhgzh") pod "6a946c8a-1c04-4453-ac3f-e7305672bc2a" (UID: "6a946c8a-1c04-4453-ac3f-e7305672bc2a"). InnerVolumeSpecName "kube-api-access-jhgzh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.011886 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e-scripts" (OuterVolumeSpecName: "scripts") pod "bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e" (UID: "bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.026843 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-92420d06-c977-4be0-b4f7-e2f75390c9aa" (OuterVolumeSpecName: "glance") pod "bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e" (UID: "bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e"). InnerVolumeSpecName "pvc-92420d06-c977-4be0-b4f7-e2f75390c9aa". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.052559 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e" (UID: "bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:25:46 crc kubenswrapper[4722]: E0309 14:25:46.057235 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-59bfcd6f-16ca-4d2a-8185-976337e0ec2c podName:6a946c8a-1c04-4453-ac3f-e7305672bc2a nodeName:}" failed. No retries permitted until 2026-03-09 14:25:46.557210989 +0000 UTC m=+1387.112779565 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "glance" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-59bfcd6f-16ca-4d2a-8185-976337e0ec2c") pod "6a946c8a-1c04-4453-ac3f-e7305672bc2a" (UID: "6a946c8a-1c04-4453-ac3f-e7305672bc2a") : kubernetes.io/csi: Unmounter.TearDownAt failed: rpc error: code = Unknown desc = check target path: could not get consistent content of /proc/mounts after 3 attempts
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.077600 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e" (UID: "bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.080421 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a946c8a-1c04-4453-ac3f-e7305672bc2a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6a946c8a-1c04-4453-ac3f-e7305672bc2a" (UID: "6a946c8a-1c04-4453-ac3f-e7305672bc2a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.083451 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e-config-data" (OuterVolumeSpecName: "config-data") pod "bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e" (UID: "bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.088050 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a946c8a-1c04-4453-ac3f-e7305672bc2a-config-data" (OuterVolumeSpecName: "config-data") pod "6a946c8a-1c04-4453-ac3f-e7305672bc2a" (UID: "6a946c8a-1c04-4453-ac3f-e7305672bc2a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.104444 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a946c8a-1c04-4453-ac3f-e7305672bc2a-config-data\") on node \"crc\" DevicePath \"\""
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.104483 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6a946c8a-1c04-4453-ac3f-e7305672bc2a-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.104491 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a946c8a-1c04-4453-ac3f-e7305672bc2a-logs\") on node \"crc\" DevicePath \"\""
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.104499 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a946c8a-1c04-4453-ac3f-e7305672bc2a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.104509 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.104519 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.104530 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e-logs\") on node \"crc\" DevicePath \"\""
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.104562 4722 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-92420d06-c977-4be0-b4f7-e2f75390c9aa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-92420d06-c977-4be0-b4f7-e2f75390c9aa\") on node \"crc\" "
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.104573 4722 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.104583 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhgzh\" (UniqueName: \"kubernetes.io/projected/6a946c8a-1c04-4453-ac3f-e7305672bc2a-kube-api-access-jhgzh\") on node \"crc\" DevicePath \"\""
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.104599 4722 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6a946c8a-1c04-4453-ac3f-e7305672bc2a-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.104608 4722 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.104616 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e-config-data\") on node \"crc\" DevicePath \"\""
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.104624 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pp6h6\" (UniqueName: \"kubernetes.io/projected/bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e-kube-api-access-pp6h6\") on node \"crc\" DevicePath \"\""
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.106161 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a946c8a-1c04-4453-ac3f-e7305672bc2a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "6a946c8a-1c04-4453-ac3f-e7305672bc2a" (UID: "6a946c8a-1c04-4453-ac3f-e7305672bc2a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.129254 4722 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.129384 4722 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-92420d06-c977-4be0-b4f7-e2f75390c9aa" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-92420d06-c977-4be0-b4f7-e2f75390c9aa") on node "crc"
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.208234 4722 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a946c8a-1c04-4453-ac3f-e7305672bc2a-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.208274 4722 reconciler_common.go:293] "Volume detached for volume \"pvc-92420d06-c977-4be0-b4f7-e2f75390c9aa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-92420d06-c977-4be0-b4f7-e2f75390c9aa\") on node \"crc\" DevicePath \"\""
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.492681 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6a946c8a-1c04-4453-ac3f-e7305672bc2a","Type":"ContainerDied","Data":"06c66c84ab1a3cf79dbd647d7c3c822602cecd95915fca8df8e3d1b8bcacf221"}
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.493038 4722 scope.go:117] "RemoveContainer" containerID="6a3438f512f7c0694330f71d0207e49281ab8f2c8c1ead1d5fe2173fab3274d8"
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.492747 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.497154 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e","Type":"ContainerDied","Data":"549b983b2229843f7ccff13adad00f9907e8e999a3d10d18b1763326c2e20c13"}
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.497350 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.533484 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.544747 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.570253 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 09 14:25:46 crc kubenswrapper[4722]: E0309 14:25:46.570959 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e" containerName="glance-httpd"
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.570987 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e" containerName="glance-httpd"
Mar 09 14:25:46 crc kubenswrapper[4722]: E0309 14:25:46.571020 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25586930-1c5f-4223-bbda-9c045110613e" containerName="init"
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.571030 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="25586930-1c5f-4223-bbda-9c045110613e" containerName="init"
Mar 09 14:25:46 crc kubenswrapper[4722]: E0309 14:25:46.571043 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e" containerName="glance-log"
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.571051 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e" containerName="glance-log"
Mar 09 14:25:46 crc kubenswrapper[4722]: E0309 14:25:46.571068 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a946c8a-1c04-4453-ac3f-e7305672bc2a" containerName="glance-log"
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.571076 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a946c8a-1c04-4453-ac3f-e7305672bc2a" containerName="glance-log"
Mar 09 14:25:46 crc kubenswrapper[4722]: E0309 14:25:46.571109 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a946c8a-1c04-4453-ac3f-e7305672bc2a" containerName="glance-httpd"
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.571117 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a946c8a-1c04-4453-ac3f-e7305672bc2a" containerName="glance-httpd"
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.571418 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e" containerName="glance-log"
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.571441 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e" containerName="glance-httpd"
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.571461 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="25586930-1c5f-4223-bbda-9c045110613e" containerName="init"
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.571480 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a946c8a-1c04-4453-ac3f-e7305672bc2a" containerName="glance-log"
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.571498 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a946c8a-1c04-4453-ac3f-e7305672bc2a" containerName="glance-httpd"
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.573118 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.575219 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.575444 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.585016 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.618315 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-59bfcd6f-16ca-4d2a-8185-976337e0ec2c\") pod \"6a946c8a-1c04-4453-ac3f-e7305672bc2a\" (UID: \"6a946c8a-1c04-4453-ac3f-e7305672bc2a\") "
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.618730 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/905e426d-8ef1-442e-abb2-69905e8fc61a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"905e426d-8ef1-442e-abb2-69905e8fc61a\") " pod="openstack/glance-default-external-api-0"
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.618775 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/905e426d-8ef1-442e-abb2-69905e8fc61a-scripts\") pod \"glance-default-external-api-0\" (UID: \"905e426d-8ef1-442e-abb2-69905e8fc61a\") " pod="openstack/glance-default-external-api-0"
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.618837 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/905e426d-8ef1-442e-abb2-69905e8fc61a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"905e426d-8ef1-442e-abb2-69905e8fc61a\") " pod="openstack/glance-default-external-api-0"
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.618910 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/905e426d-8ef1-442e-abb2-69905e8fc61a-logs\") pod \"glance-default-external-api-0\" (UID: \"905e426d-8ef1-442e-abb2-69905e8fc61a\") " pod="openstack/glance-default-external-api-0"
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.618928 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/905e426d-8ef1-442e-abb2-69905e8fc61a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"905e426d-8ef1-442e-abb2-69905e8fc61a\") " pod="openstack/glance-default-external-api-0"
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.619016 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-92420d06-c977-4be0-b4f7-e2f75390c9aa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-92420d06-c977-4be0-b4f7-e2f75390c9aa\") pod \"glance-default-external-api-0\" (UID: \"905e426d-8ef1-442e-abb2-69905e8fc61a\") " pod="openstack/glance-default-external-api-0"
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.619042 4722 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/905e426d-8ef1-442e-abb2-69905e8fc61a-config-data\") pod \"glance-default-external-api-0\" (UID: \"905e426d-8ef1-442e-abb2-69905e8fc61a\") " pod="openstack/glance-default-external-api-0" Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.619134 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvpdt\" (UniqueName: \"kubernetes.io/projected/905e426d-8ef1-442e-abb2-69905e8fc61a-kube-api-access-cvpdt\") pod \"glance-default-external-api-0\" (UID: \"905e426d-8ef1-442e-abb2-69905e8fc61a\") " pod="openstack/glance-default-external-api-0" Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.640666 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-59bfcd6f-16ca-4d2a-8185-976337e0ec2c" (OuterVolumeSpecName: "glance") pod "6a946c8a-1c04-4453-ac3f-e7305672bc2a" (UID: "6a946c8a-1c04-4453-ac3f-e7305672bc2a"). InnerVolumeSpecName "pvc-59bfcd6f-16ca-4d2a-8185-976337e0ec2c". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.732070 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.737724 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/905e426d-8ef1-442e-abb2-69905e8fc61a-config-data\") pod \"glance-default-external-api-0\" (UID: \"905e426d-8ef1-442e-abb2-69905e8fc61a\") " pod="openstack/glance-default-external-api-0" Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.738835 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvpdt\" (UniqueName: \"kubernetes.io/projected/905e426d-8ef1-442e-abb2-69905e8fc61a-kube-api-access-cvpdt\") pod \"glance-default-external-api-0\" (UID: \"905e426d-8ef1-442e-abb2-69905e8fc61a\") " pod="openstack/glance-default-external-api-0" Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.738944 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/905e426d-8ef1-442e-abb2-69905e8fc61a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"905e426d-8ef1-442e-abb2-69905e8fc61a\") " pod="openstack/glance-default-external-api-0" Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.738975 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/905e426d-8ef1-442e-abb2-69905e8fc61a-scripts\") pod \"glance-default-external-api-0\" (UID: \"905e426d-8ef1-442e-abb2-69905e8fc61a\") " pod="openstack/glance-default-external-api-0" Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.739031 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/905e426d-8ef1-442e-abb2-69905e8fc61a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"905e426d-8ef1-442e-abb2-69905e8fc61a\") " pod="openstack/glance-default-external-api-0" Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.739120 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/905e426d-8ef1-442e-abb2-69905e8fc61a-logs\") 
pod \"glance-default-external-api-0\" (UID: \"905e426d-8ef1-442e-abb2-69905e8fc61a\") " pod="openstack/glance-default-external-api-0" Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.739149 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/905e426d-8ef1-442e-abb2-69905e8fc61a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"905e426d-8ef1-442e-abb2-69905e8fc61a\") " pod="openstack/glance-default-external-api-0" Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.739312 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-92420d06-c977-4be0-b4f7-e2f75390c9aa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-92420d06-c977-4be0-b4f7-e2f75390c9aa\") pod \"glance-default-external-api-0\" (UID: \"905e426d-8ef1-442e-abb2-69905e8fc61a\") " pod="openstack/glance-default-external-api-0" Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.739397 4722 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-59bfcd6f-16ca-4d2a-8185-976337e0ec2c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-59bfcd6f-16ca-4d2a-8185-976337e0ec2c\") on node \"crc\" " Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.739895 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/905e426d-8ef1-442e-abb2-69905e8fc61a-logs\") pod \"glance-default-external-api-0\" (UID: \"905e426d-8ef1-442e-abb2-69905e8fc61a\") " pod="openstack/glance-default-external-api-0" Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.740244 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/905e426d-8ef1-442e-abb2-69905e8fc61a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"905e426d-8ef1-442e-abb2-69905e8fc61a\") " pod="openstack/glance-default-external-api-0" Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.745481 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.745649 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/905e426d-8ef1-442e-abb2-69905e8fc61a-config-data\") pod \"glance-default-external-api-0\" (UID: \"905e426d-8ef1-442e-abb2-69905e8fc61a\") " pod="openstack/glance-default-external-api-0" Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.753929 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/905e426d-8ef1-442e-abb2-69905e8fc61a-scripts\") pod \"glance-default-external-api-0\" (UID: \"905e426d-8ef1-442e-abb2-69905e8fc61a\") " pod="openstack/glance-default-external-api-0" Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.753989 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/905e426d-8ef1-442e-abb2-69905e8fc61a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"905e426d-8ef1-442e-abb2-69905e8fc61a\") " pod="openstack/glance-default-external-api-0" Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.754180 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/905e426d-8ef1-442e-abb2-69905e8fc61a-combined-ca-bundle\") pod 
\"glance-default-external-api-0\" (UID: \"905e426d-8ef1-442e-abb2-69905e8fc61a\") " pod="openstack/glance-default-external-api-0" Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.757148 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.759516 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.768416 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.768461 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-92420d06-c977-4be0-b4f7-e2f75390c9aa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-92420d06-c977-4be0-b4f7-e2f75390c9aa\") pod \"glance-default-external-api-0\" (UID: \"905e426d-8ef1-442e-abb2-69905e8fc61a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/4bd5fd036fa183ff3c7bd061e321acfa035788f5e30cbd29138724604e749ce5/globalmount\"" pod="openstack/glance-default-external-api-0" Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.769558 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.769914 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.773997 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvpdt\" (UniqueName: \"kubernetes.io/projected/905e426d-8ef1-442e-abb2-69905e8fc61a-kube-api-access-cvpdt\") pod \"glance-default-external-api-0\" (UID: \"905e426d-8ef1-442e-abb2-69905e8fc61a\") " pod="openstack/glance-default-external-api-0" Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.781711 4722 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.781877 4722 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-59bfcd6f-16ca-4d2a-8185-976337e0ec2c" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-59bfcd6f-16ca-4d2a-8185-976337e0ec2c") on node "crc"
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.820410 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.838136 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-92420d06-c977-4be0-b4f7-e2f75390c9aa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-92420d06-c977-4be0-b4f7-e2f75390c9aa\") pod \"glance-default-external-api-0\" (UID: \"905e426d-8ef1-442e-abb2-69905e8fc61a\") " pod="openstack/glance-default-external-api-0"
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.854551 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4\") " pod="openstack/glance-default-internal-api-0"
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.854633 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4\") " pod="openstack/glance-default-internal-api-0"
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.854712 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-59bfcd6f-16ca-4d2a-8185-976337e0ec2c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-59bfcd6f-16ca-4d2a-8185-976337e0ec2c\") pod \"glance-default-internal-api-0\" (UID: \"f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4\") " pod="openstack/glance-default-internal-api-0"
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.854769 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4\") " pod="openstack/glance-default-internal-api-0"
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.854878 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4\") " pod="openstack/glance-default-internal-api-0"
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.854908 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4\") " pod="openstack/glance-default-internal-api-0"
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.854949 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4-logs\") pod \"glance-default-internal-api-0\" (UID: \"f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4\") " pod="openstack/glance-default-internal-api-0"
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.855045 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2hxk\" (UniqueName: \"kubernetes.io/projected/f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4-kube-api-access-r2hxk\") pod \"glance-default-internal-api-0\" (UID: \"f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4\") " pod="openstack/glance-default-internal-api-0"
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.870216 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.870257 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-59bfcd6f-16ca-4d2a-8185-976337e0ec2c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-59bfcd6f-16ca-4d2a-8185-976337e0ec2c\") pod \"glance-default-internal-api-0\" (UID: \"f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a3395c88a7dc2d3ef264f22a8309ab5263d0d43341a96b8565f3c55ce5be97e0/globalmount\"" pod="openstack/glance-default-internal-api-0"
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.909892 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.946328 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-59bfcd6f-16ca-4d2a-8185-976337e0ec2c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-59bfcd6f-16ca-4d2a-8185-976337e0ec2c\") pod \"glance-default-internal-api-0\" (UID: \"f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4\") " pod="openstack/glance-default-internal-api-0"
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.956788 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2hxk\" (UniqueName: \"kubernetes.io/projected/f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4-kube-api-access-r2hxk\") pod \"glance-default-internal-api-0\" (UID: \"f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4\") " pod="openstack/glance-default-internal-api-0"
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.956879 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4\") " pod="openstack/glance-default-internal-api-0"
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.956907 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4\") " pod="openstack/glance-default-internal-api-0"
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.956936 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4\") " pod="openstack/glance-default-internal-api-0"
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.957031 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4\") " pod="openstack/glance-default-internal-api-0"
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.957050 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4\") " pod="openstack/glance-default-internal-api-0"
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.957076 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4-logs\") pod \"glance-default-internal-api-0\" (UID: \"f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4\") " pod="openstack/glance-default-internal-api-0"
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.957606 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4-logs\") pod \"glance-default-internal-api-0\" (UID: \"f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4\") " pod="openstack/glance-default-internal-api-0"
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.959757 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4\") " pod="openstack/glance-default-internal-api-0"
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.962813 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4\") " pod="openstack/glance-default-internal-api-0"
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.971056 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4\") " pod="openstack/glance-default-internal-api-0"
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.971663 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4\") " pod="openstack/glance-default-internal-api-0"
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.974253 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4\") " pod="openstack/glance-default-internal-api-0"
Mar 09 14:25:46 crc kubenswrapper[4722]: I0309 14:25:46.977036 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2hxk\" (UniqueName: \"kubernetes.io/projected/f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4-kube-api-access-r2hxk\") pod \"glance-default-internal-api-0\" (UID: \"f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4\") " pod="openstack/glance-default-internal-api-0"
Mar 09 14:25:47 crc kubenswrapper[4722]: I0309 14:25:47.240691 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 09 14:25:48 crc kubenswrapper[4722]: I0309 14:25:48.164062 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a946c8a-1c04-4453-ac3f-e7305672bc2a" path="/var/lib/kubelet/pods/6a946c8a-1c04-4453-ac3f-e7305672bc2a/volumes"
Mar 09 14:25:48 crc kubenswrapper[4722]: I0309 14:25:48.165777 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e" path="/var/lib/kubelet/pods/bdf48693-5b1f-466a-ac6d-a3a4dcdcf80e/volumes"
Mar 09 14:25:49 crc kubenswrapper[4722]: I0309 14:25:49.360745 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-qccbt" podUID="f87e19a1-bff1-4e82-8677-3812b2f51f46" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.172:5353: connect: connection refused"
Mar 09 14:25:50 crc kubenswrapper[4722]: I0309 14:25:50.712840 4722 scope.go:117] "RemoveContainer" containerID="ee8056cff9b66b64015f6a67e20f17d900daef19b7e1712f52cb634e0b3b8a43"
Mar 09 14:25:50 crc kubenswrapper[4722]: E0309 14:25:50.718526 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified"
Mar 09 14:25:50 crc kubenswrapper[4722]: E0309 14:25:50.718725 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:heat-db-sync,Image:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,Command:[/bin/bash],Args:[-c /usr/bin/heat-manage --config-dir /etc/heat/heat.conf.d db_sync],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/heat/heat.conf.d/00-default.conf,SubPath:00-default.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/heat/heat.conf.d/01-custom.conf,SubPath:01-custom.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tw8df,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42418,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42418,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-db-sync-j4lgt_openstack(7f51218c-6b15-4f4a-ad49-1ba0ccd5e292): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 09 14:25:50 crc kubenswrapper[4722]: E0309 14:25:50.720104 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/heat-db-sync-j4lgt" podUID="7f51218c-6b15-4f4a-ad49-1ba0ccd5e292"
Mar 09 14:25:51 crc kubenswrapper[4722]: E0309 14:25:51.444284 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified"
Mar 09 14:25:51 crc kubenswrapper[4722]: E0309 14:25:51.444773 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gbn6m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-tlbnm_openstack(b63a3af5-a347-40ed-b9bc-52ad70e7ff13): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 09 14:25:51 crc kubenswrapper[4722]: E0309 14:25:51.446499 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-tlbnm" podUID="b63a3af5-a347-40ed-b9bc-52ad70e7ff13"
Mar 09 14:25:51 crc kubenswrapper[4722]: I0309 14:25:51.527739 4722 patch_prober.go:28] interesting pod/machine-config-daemon-hjrrb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 14:25:51 crc kubenswrapper[4722]: I0309 14:25:51.527807 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 14:25:51 crc kubenswrapper[4722]: E0309 14:25:51.557635 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified\\\"\"" pod="openstack/heat-db-sync-j4lgt" podUID="7f51218c-6b15-4f4a-ad49-1ba0ccd5e292"
Mar 09 14:25:51 crc kubenswrapper[4722]: E0309 14:25:51.557633 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-tlbnm" podUID="b63a3af5-a347-40ed-b9bc-52ad70e7ff13"
Mar 09 14:25:59 crc kubenswrapper[4722]: I0309 14:25:59.360412 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-qccbt" podUID="f87e19a1-bff1-4e82-8677-3812b2f51f46" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.172:5353: i/o timeout"
Mar 09 14:25:59 crc kubenswrapper[4722]: I0309 14:25:59.361034 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74f6bcbc87-qccbt"
Mar 09 14:25:59 crc kubenswrapper[4722]: I0309 14:25:59.818283 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-6ssn4"
Mar 09 14:25:59 crc kubenswrapper[4722]: I0309 14:25:59.863891 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6lvfq\" (UniqueName: \"kubernetes.io/projected/f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b-kube-api-access-6lvfq\") pod \"f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b\" (UID: \"f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b\") "
Mar 09 14:25:59 crc kubenswrapper[4722]: I0309 14:25:59.864059 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b-credential-keys\") pod \"f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b\" (UID: \"f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b\") "
Mar 09 14:25:59 crc kubenswrapper[4722]: I0309 14:25:59.864086 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b-scripts\") pod \"f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b\" (UID: \"f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b\") "
Mar 09 14:25:59 crc kubenswrapper[4722]: I0309 14:25:59.864124 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b-config-data\") pod \"f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b\" (UID: \"f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b\") "
Mar 09 14:25:59 crc kubenswrapper[4722]: I0309 14:25:59.864268 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b-fernet-keys\") pod \"f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b\" (UID: \"f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b\") "
Mar 09 14:25:59 crc kubenswrapper[4722]: I0309 14:25:59.864324 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b-combined-ca-bundle\") pod \"f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b\" (UID: \"f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b\") "
Mar 09 14:25:59 crc kubenswrapper[4722]: I0309 14:25:59.870541 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b-scripts" (OuterVolumeSpecName: "scripts") pod "f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b" (UID: "f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:25:59 crc kubenswrapper[4722]: I0309 14:25:59.870537 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b-kube-api-access-6lvfq" (OuterVolumeSpecName: "kube-api-access-6lvfq") pod "f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b" (UID: "f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b"). InnerVolumeSpecName "kube-api-access-6lvfq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:25:59 crc kubenswrapper[4722]: I0309 14:25:59.871367 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b" (UID: "f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:25:59 crc kubenswrapper[4722]: I0309 14:25:59.872044 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b" (UID: "f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:25:59 crc kubenswrapper[4722]: I0309 14:25:59.893176 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b-config-data" (OuterVolumeSpecName: "config-data") pod "f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b" (UID: "f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:25:59 crc kubenswrapper[4722]: I0309 14:25:59.910635 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b" (UID: "f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:25:59 crc kubenswrapper[4722]: I0309 14:25:59.967080 4722 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b-fernet-keys\") on node \"crc\" DevicePath \"\""
Mar 09 14:25:59 crc kubenswrapper[4722]: I0309 14:25:59.967109 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 14:25:59 crc kubenswrapper[4722]: I0309 14:25:59.967121 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6lvfq\" (UniqueName: \"kubernetes.io/projected/f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b-kube-api-access-6lvfq\") on node \"crc\" DevicePath \"\""
Mar 09 14:25:59 crc kubenswrapper[4722]: I0309 14:25:59.967130 4722 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b-credential-keys\") on node \"crc\" DevicePath \"\""
Mar 09 14:25:59 crc kubenswrapper[4722]: I0309 14:25:59.967138 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 14:25:59 crc kubenswrapper[4722]: I0309 14:25:59.967149 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b-config-data\") on node \"crc\" DevicePath \"\""
Mar 09 14:26:00 crc kubenswrapper[4722]: I0309 14:26:00.160924 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551106-tzqf7"]
Mar 09 14:26:00 crc kubenswrapper[4722]: E0309 14:26:00.161643 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b" containerName="keystone-bootstrap"
Mar 09 14:26:00 crc kubenswrapper[4722]: I0309 14:26:00.161663 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b" containerName="keystone-bootstrap"
Mar 09 14:26:00 crc kubenswrapper[4722]: I0309 14:26:00.161899 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b" containerName="keystone-bootstrap"
Mar 09 14:26:00 crc kubenswrapper[4722]: I0309 14:26:00.162928 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551106-tzqf7"
Mar 09 14:26:00 crc kubenswrapper[4722]: I0309 14:26:00.165249 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 14:26:00 crc kubenswrapper[4722]: I0309 14:26:00.165471 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-dr2x6"
Mar 09 14:26:00 crc kubenswrapper[4722]: I0309 14:26:00.165801 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 14:26:00 crc kubenswrapper[4722]: I0309 14:26:00.170472 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551106-tzqf7"]
Mar 09 14:26:00 crc kubenswrapper[4722]: E0309 14:26:00.222345 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified"
Mar 09 14:26:00 crc kubenswrapper[4722]: E0309 14:26:00.222495 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5b9h5b5h58ch5f5h56h695h54bhc7h66fh7dh88hb4h5c8h65dh585h5f8h7fhd9hf6h5fbhbdh5b8h687h65h65fh575h689h68h54fhbfh576hbq,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ndlhs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 /var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(34700700-cbd0-4a2e-b791-25b63e3de5b8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 09 14:26:00 crc kubenswrapper[4722]: I0309 14:26:00.270864 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9h6h\" (UniqueName: \"kubernetes.io/projected/6e7bbbcf-fe3a-4007-b98d-2692b6f6cbe5-kube-api-access-f9h6h\") pod \"auto-csr-approver-29551106-tzqf7\" (UID: \"6e7bbbcf-fe3a-4007-b98d-2692b6f6cbe5\") " pod="openshift-infra/auto-csr-approver-29551106-tzqf7"
Mar 09 14:26:00 crc kubenswrapper[4722]: I0309 14:26:00.369350 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-qccbt"
Mar 09 14:26:00 crc kubenswrapper[4722]: I0309 14:26:00.375263 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f87e19a1-bff1-4e82-8677-3812b2f51f46-ovsdbserver-nb\") pod \"f87e19a1-bff1-4e82-8677-3812b2f51f46\" (UID: \"f87e19a1-bff1-4e82-8677-3812b2f51f46\") "
Mar 09 14:26:00 crc kubenswrapper[4722]: I0309 14:26:00.375351 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f87e19a1-bff1-4e82-8677-3812b2f51f46-dns-swift-storage-0\") pod \"f87e19a1-bff1-4e82-8677-3812b2f51f46\" (UID: \"f87e19a1-bff1-4e82-8677-3812b2f51f46\") "
Mar 09 14:26:00 crc kubenswrapper[4722]: I0309 14:26:00.375414 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f87e19a1-bff1-4e82-8677-3812b2f51f46-dns-svc\") pod \"f87e19a1-bff1-4e82-8677-3812b2f51f46\" (UID: \"f87e19a1-bff1-4e82-8677-3812b2f51f46\") "
Mar 09 14:26:00 crc kubenswrapper[4722]: I0309 14:26:00.375505 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ffrlk\" (UniqueName: \"kubernetes.io/projected/f87e19a1-bff1-4e82-8677-3812b2f51f46-kube-api-access-ffrlk\") pod \"f87e19a1-bff1-4e82-8677-3812b2f51f46\" (UID: \"f87e19a1-bff1-4e82-8677-3812b2f51f46\") "
Mar 09 14:26:00 crc kubenswrapper[4722]: I0309 14:26:00.375539 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f87e19a1-bff1-4e82-8677-3812b2f51f46-config\") pod \"f87e19a1-bff1-4e82-8677-3812b2f51f46\" (UID: \"f87e19a1-bff1-4e82-8677-3812b2f51f46\") "
Mar 09 14:26:00 crc kubenswrapper[4722]: I0309 14:26:00.375614 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f87e19a1-bff1-4e82-8677-3812b2f51f46-ovsdbserver-sb\") pod \"f87e19a1-bff1-4e82-8677-3812b2f51f46\" (UID: \"f87e19a1-bff1-4e82-8677-3812b2f51f46\") "
Mar 09 14:26:00 crc kubenswrapper[4722]: I0309 14:26:00.375870 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9h6h\" (UniqueName: \"kubernetes.io/projected/6e7bbbcf-fe3a-4007-b98d-2692b6f6cbe5-kube-api-access-f9h6h\") pod \"auto-csr-approver-29551106-tzqf7\" (UID: \"6e7bbbcf-fe3a-4007-b98d-2692b6f6cbe5\") " pod="openshift-infra/auto-csr-approver-29551106-tzqf7"
Mar 09 14:26:00 crc kubenswrapper[4722]: I0309 14:26:00.386262 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f87e19a1-bff1-4e82-8677-3812b2f51f46-kube-api-access-ffrlk" (OuterVolumeSpecName: "kube-api-access-ffrlk") pod "f87e19a1-bff1-4e82-8677-3812b2f51f46" (UID: "f87e19a1-bff1-4e82-8677-3812b2f51f46"). InnerVolumeSpecName "kube-api-access-ffrlk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:26:00 crc kubenswrapper[4722]: I0309 14:26:00.415002 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9h6h\" (UniqueName: \"kubernetes.io/projected/6e7bbbcf-fe3a-4007-b98d-2692b6f6cbe5-kube-api-access-f9h6h\") pod \"auto-csr-approver-29551106-tzqf7\" (UID: \"6e7bbbcf-fe3a-4007-b98d-2692b6f6cbe5\") " pod="openshift-infra/auto-csr-approver-29551106-tzqf7"
Mar 09 14:26:00 crc kubenswrapper[4722]: I0309 14:26:00.442278 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f87e19a1-bff1-4e82-8677-3812b2f51f46-config" (OuterVolumeSpecName: "config") pod "f87e19a1-bff1-4e82-8677-3812b2f51f46" (UID: "f87e19a1-bff1-4e82-8677-3812b2f51f46"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 14:26:00 crc kubenswrapper[4722]: I0309 14:26:00.448183 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f87e19a1-bff1-4e82-8677-3812b2f51f46-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f87e19a1-bff1-4e82-8677-3812b2f51f46" (UID: "f87e19a1-bff1-4e82-8677-3812b2f51f46"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 14:26:00 crc kubenswrapper[4722]: I0309 14:26:00.454012 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f87e19a1-bff1-4e82-8677-3812b2f51f46-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f87e19a1-bff1-4e82-8677-3812b2f51f46" (UID: "f87e19a1-bff1-4e82-8677-3812b2f51f46"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 14:26:00 crc kubenswrapper[4722]: I0309 14:26:00.454659 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f87e19a1-bff1-4e82-8677-3812b2f51f46-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f87e19a1-bff1-4e82-8677-3812b2f51f46" (UID: "f87e19a1-bff1-4e82-8677-3812b2f51f46"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 14:26:00 crc kubenswrapper[4722]: I0309 14:26:00.513504 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551106-tzqf7"
Mar 09 14:26:00 crc kubenswrapper[4722]: I0309 14:26:00.526931 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f87e19a1-bff1-4e82-8677-3812b2f51f46-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f87e19a1-bff1-4e82-8677-3812b2f51f46" (UID: "f87e19a1-bff1-4e82-8677-3812b2f51f46"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 14:26:00 crc kubenswrapper[4722]: I0309 14:26:00.533937 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f87e19a1-bff1-4e82-8677-3812b2f51f46-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 09 14:26:00 crc kubenswrapper[4722]: I0309 14:26:00.533963 4722 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f87e19a1-bff1-4e82-8677-3812b2f51f46-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 09 14:26:00 crc kubenswrapper[4722]: I0309 14:26:00.533976 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f87e19a1-bff1-4e82-8677-3812b2f51f46-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 09 14:26:00 crc kubenswrapper[4722]: I0309 14:26:00.533989 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ffrlk\" (UniqueName: \"kubernetes.io/projected/f87e19a1-bff1-4e82-8677-3812b2f51f46-kube-api-access-ffrlk\") on node \"crc\" DevicePath \"\""
Mar 09 14:26:00 crc kubenswrapper[4722]: I0309 14:26:00.534001 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f87e19a1-bff1-4e82-8677-3812b2f51f46-config\") on node \"crc\" DevicePath \"\""
Mar 09 14:26:00 crc kubenswrapper[4722]: I0309 14:26:00.534012 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f87e19a1-bff1-4e82-8677-3812b2f51f46-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 09 14:26:00 crc kubenswrapper[4722]: I0309 14:26:00.659389 4722 generic.go:334] "Generic (PLEG): container finished" podID="69530d68-be96-4605-be46-8053083aa178" containerID="911525eabae91f320ecd486d8c9736832ea7f1256cdd8b97a7ae5e490d7c6c14" exitCode=0
Mar 09 14:26:00 crc kubenswrapper[4722]: I0309 14:26:00.659445 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-l9wtc" event={"ID":"69530d68-be96-4605-be46-8053083aa178","Type":"ContainerDied","Data":"911525eabae91f320ecd486d8c9736832ea7f1256cdd8b97a7ae5e490d7c6c14"}
Mar 09 14:26:00 crc kubenswrapper[4722]: I0309 14:26:00.661856 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-qccbt" event={"ID":"f87e19a1-bff1-4e82-8677-3812b2f51f46","Type":"ContainerDied","Data":"a6165b4cd2ff710979ecd42b5bdffe7ae31013a27dfbe5b42133dfa53e52f52b"}
Mar 09 14:26:00 crc kubenswrapper[4722]: I0309 14:26:00.661926 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-qccbt"
Mar 09 14:26:00 crc kubenswrapper[4722]: I0309 14:26:00.669417 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-6ssn4" event={"ID":"f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b","Type":"ContainerDied","Data":"a51d9e78db82f603fda3f557a68aa6a7effd64cb2a1ae34cb042b43fc045bea8"}
Mar 09 14:26:00 crc kubenswrapper[4722]: I0309 14:26:00.669452 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a51d9e78db82f603fda3f557a68aa6a7effd64cb2a1ae34cb042b43fc045bea8"
Mar 09 14:26:00 crc kubenswrapper[4722]: I0309 14:26:00.669475 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-6ssn4"
Mar 09 14:26:00 crc kubenswrapper[4722]: I0309 14:26:00.716568 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-qccbt"]
Mar 09 14:26:00 crc kubenswrapper[4722]: I0309 14:26:00.725911 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-qccbt"]
Mar 09 14:26:00 crc kubenswrapper[4722]: I0309 14:26:00.759613 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 09 14:26:00 crc kubenswrapper[4722]: I0309 14:26:00.892784 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-6ssn4"]
Mar 09 14:26:00 crc kubenswrapper[4722]: I0309 14:26:00.903940 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-6ssn4"]
Mar 09 14:26:00 crc kubenswrapper[4722]: I0309 14:26:00.998477 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-fx7ks"]
Mar 09 14:26:00 crc kubenswrapper[4722]: E0309 14:26:00.999293 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f87e19a1-bff1-4e82-8677-3812b2f51f46" containerName="init"
Mar 09 14:26:00 crc kubenswrapper[4722]: I0309 14:26:00.999312 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f87e19a1-bff1-4e82-8677-3812b2f51f46" containerName="init"
Mar 09 14:26:01 crc kubenswrapper[4722]: E0309 14:26:00.999369 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f87e19a1-bff1-4e82-8677-3812b2f51f46" containerName="dnsmasq-dns"
Mar 09 14:26:01 crc kubenswrapper[4722]: I0309 14:26:00.999380 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f87e19a1-bff1-4e82-8677-3812b2f51f46" containerName="dnsmasq-dns"
Mar 09 14:26:01 crc kubenswrapper[4722]: I0309 14:26:00.999674 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f87e19a1-bff1-4e82-8677-3812b2f51f46" containerName="dnsmasq-dns"
Mar 09 14:26:01 crc kubenswrapper[4722]: I0309 14:26:01.000554 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-fx7ks"
Mar 09 14:26:01 crc kubenswrapper[4722]: I0309 14:26:01.004115 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Mar 09 14:26:01 crc kubenswrapper[4722]: I0309 14:26:01.004373 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-jfd9g"
Mar 09 14:26:01 crc kubenswrapper[4722]: I0309 14:26:01.004572 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Mar 09 14:26:01 crc kubenswrapper[4722]: I0309 14:26:01.005005 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Mar 09 14:26:01 crc kubenswrapper[4722]: I0309 14:26:01.006908 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Mar 09 14:26:01 crc kubenswrapper[4722]: I0309 14:26:01.019178 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-fx7ks"]
Mar 09 14:26:01 crc kubenswrapper[4722]: I0309 14:26:01.044881 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b-config-data\") pod \"keystone-bootstrap-fx7ks\" (UID: \"cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b\") " pod="openstack/keystone-bootstrap-fx7ks"
Mar 09 14:26:01 crc kubenswrapper[4722]: I0309 14:26:01.044922 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b-combined-ca-bundle\") pod \"keystone-bootstrap-fx7ks\" (UID: \"cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b\") " pod="openstack/keystone-bootstrap-fx7ks"
Mar 09 14:26:01 crc kubenswrapper[4722]: I0309 14:26:01.044957 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b-fernet-keys\") pod \"keystone-bootstrap-fx7ks\" (UID: \"cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b\") " pod="openstack/keystone-bootstrap-fx7ks"
Mar 09 14:26:01 crc kubenswrapper[4722]: I0309 14:26:01.045044 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b-scripts\") pod \"keystone-bootstrap-fx7ks\" (UID: \"cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b\") " pod="openstack/keystone-bootstrap-fx7ks"
Mar 09 14:26:01 crc kubenswrapper[4722]: I0309 14:26:01.045080 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b-credential-keys\") pod \"keystone-bootstrap-fx7ks\" (UID: \"cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b\") " pod="openstack/keystone-bootstrap-fx7ks"
Mar 09 14:26:01 crc kubenswrapper[4722]: I0309 14:26:01.045115 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7gmj\" (UniqueName: \"kubernetes.io/projected/cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b-kube-api-access-p7gmj\") pod \"keystone-bootstrap-fx7ks\" (UID: \"cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b\") " pod="openstack/keystone-bootstrap-fx7ks"
Mar 09 14:26:01 crc kubenswrapper[4722]: I0309 14:26:01.146675 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b-credential-keys\") pod \"keystone-bootstrap-fx7ks\" (UID: \"cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b\") " pod="openstack/keystone-bootstrap-fx7ks"
Mar 09 14:26:01 crc kubenswrapper[4722]: I0309 14:26:01.146740 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7gmj\" (UniqueName: \"kubernetes.io/projected/cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b-kube-api-access-p7gmj\") pod \"keystone-bootstrap-fx7ks\" (UID: \"cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b\") " pod="openstack/keystone-bootstrap-fx7ks"
Mar 09 14:26:01 crc kubenswrapper[4722]: I0309 14:26:01.146802 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b-config-data\") pod \"keystone-bootstrap-fx7ks\" (UID: \"cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b\") " pod="openstack/keystone-bootstrap-fx7ks"
Mar 09 14:26:01 crc kubenswrapper[4722]: I0309 14:26:01.146828 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b-combined-ca-bundle\") pod \"keystone-bootstrap-fx7ks\" (UID: \"cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b\") " pod="openstack/keystone-bootstrap-fx7ks"
Mar 09 14:26:01 crc kubenswrapper[4722]: I0309 14:26:01.146855 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b-fernet-keys\") pod \"keystone-bootstrap-fx7ks\" (UID: \"cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b\") " pod="openstack/keystone-bootstrap-fx7ks"
Mar 09 14:26:01 crc kubenswrapper[4722]: I0309 14:26:01.146946 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b-scripts\") pod \"keystone-bootstrap-fx7ks\" (UID: \"cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b\") " pod="openstack/keystone-bootstrap-fx7ks"
Mar 09 14:26:01 crc kubenswrapper[4722]: I0309 14:26:01.151085 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b-config-data\") pod \"keystone-bootstrap-fx7ks\" (UID: \"cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b\") " pod="openstack/keystone-bootstrap-fx7ks"
Mar 09 14:26:01 crc kubenswrapper[4722]: I0309 14:26:01.152797 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b-scripts\") pod \"keystone-bootstrap-fx7ks\" (UID: \"cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b\") " pod="openstack/keystone-bootstrap-fx7ks"
Mar 09 14:26:01 crc kubenswrapper[4722]: I0309 14:26:01.153170 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b-combined-ca-bundle\") pod \"keystone-bootstrap-fx7ks\" (UID: \"cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b\") " pod="openstack/keystone-bootstrap-fx7ks"
Mar 09 14:26:01 crc kubenswrapper[4722]: I0309 14:26:01.154440 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b-credential-keys\") pod \"keystone-bootstrap-fx7ks\" (UID: \"cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b\") " pod="openstack/keystone-bootstrap-fx7ks"
Mar 09 14:26:01 crc kubenswrapper[4722]: I0309 14:26:01.164245 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b-fernet-keys\") pod \"keystone-bootstrap-fx7ks\" (UID: \"cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b\") " pod="openstack/keystone-bootstrap-fx7ks"
Mar 09 14:26:01 crc kubenswrapper[4722]: I0309 14:26:01.167147 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7gmj\" (UniqueName: \"kubernetes.io/projected/cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b-kube-api-access-p7gmj\") pod \"keystone-bootstrap-fx7ks\" (UID: \"cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b\") " pod="openstack/keystone-bootstrap-fx7ks"
Mar 09 14:26:01 crc kubenswrapper[4722]: I0309 14:26:01.329251 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-fx7ks"
Mar 09 14:26:01 crc kubenswrapper[4722]: I0309 14:26:01.671658 4722 scope.go:117] "RemoveContainer" containerID="1ad4a3032934339b7242327bcac91622a2eb45e3bcae1e676a15748689ebb407"
Mar 09 14:26:01 crc kubenswrapper[4722]: E0309 14:26:01.690828 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified"
Mar 09 14:26:01 crc kubenswrapper[4722]: E0309 14:26:01.691539 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rw8pm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,
Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-fzj6s_openstack(5035bd54-0aaa-4ff3-b90a-6145145fe95c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 14:26:01 crc kubenswrapper[4722]: E0309 14:26:01.694101 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-fzj6s" podUID="5035bd54-0aaa-4ff3-b90a-6145145fe95c" Mar 09 14:26:01 crc kubenswrapper[4722]: I0309 14:26:01.989996 4722 scope.go:117] "RemoveContainer" containerID="fa472c6776005131d535047382281f2a71af316d713f97635d857b68b418fa16" Mar 09 14:26:02 crc kubenswrapper[4722]: I0309 14:26:02.079431 4722 scope.go:117] "RemoveContainer" containerID="d7a02dfe8d14a05bddff13ef5f261df15fa1e4afa9505ac0d75001bbd58e3903" Mar 09 14:26:02 crc kubenswrapper[4722]: I0309 14:26:02.133795 4722 scope.go:117] "RemoveContainer" containerID="46949b946146e4e13918c7e305ade73c59e9952a3685ca766fa24cb64b6b976d" Mar 09 14:26:02 crc kubenswrapper[4722]: I0309 14:26:02.200584 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b" path="/var/lib/kubelet/pods/f2ad273c-a2e3-4d6e-b9ca-f8b67f8a680b/volumes" Mar 09 14:26:02 crc kubenswrapper[4722]: I0309 14:26:02.201329 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f87e19a1-bff1-4e82-8677-3812b2f51f46" path="/var/lib/kubelet/pods/f87e19a1-bff1-4e82-8677-3812b2f51f46/volumes" Mar 09 14:26:02 crc kubenswrapper[4722]: I0309 14:26:02.324222 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-l9wtc" Mar 09 14:26:02 crc kubenswrapper[4722]: I0309 14:26:02.358079 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551106-tzqf7"] Mar 09 14:26:02 crc kubenswrapper[4722]: W0309 14:26:02.397202 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e7bbbcf_fe3a_4007_b98d_2692b6f6cbe5.slice/crio-6b6378f42e8e568eb734f2a119dd536e84363f672cd6cf6baccb8b4cae882a07 WatchSource:0}: Error finding container 6b6378f42e8e568eb734f2a119dd536e84363f672cd6cf6baccb8b4cae882a07: Status 404 returned error can't find the container with id 6b6378f42e8e568eb734f2a119dd536e84363f672cd6cf6baccb8b4cae882a07 Mar 09 14:26:02 crc kubenswrapper[4722]: I0309 14:26:02.483482 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/69530d68-be96-4605-be46-8053083aa178-config\") pod \"69530d68-be96-4605-be46-8053083aa178\" (UID: \"69530d68-be96-4605-be46-8053083aa178\") " Mar 09 14:26:02 crc kubenswrapper[4722]: I0309 14:26:02.483788 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qqtt\" (UniqueName: \"kubernetes.io/projected/69530d68-be96-4605-be46-8053083aa178-kube-api-access-4qqtt\") pod \"69530d68-be96-4605-be46-8053083aa178\" (UID: \"69530d68-be96-4605-be46-8053083aa178\") " Mar 09 14:26:02 crc kubenswrapper[4722]: I0309 14:26:02.484009 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69530d68-be96-4605-be46-8053083aa178-combined-ca-bundle\") pod \"69530d68-be96-4605-be46-8053083aa178\" (UID: \"69530d68-be96-4605-be46-8053083aa178\") " Mar 09 14:26:02 crc kubenswrapper[4722]: I0309 14:26:02.489157 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69530d68-be96-4605-be46-8053083aa178-kube-api-access-4qqtt" (OuterVolumeSpecName: "kube-api-access-4qqtt") pod "69530d68-be96-4605-be46-8053083aa178" (UID: "69530d68-be96-4605-be46-8053083aa178"). InnerVolumeSpecName "kube-api-access-4qqtt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:26:02 crc kubenswrapper[4722]: I0309 14:26:02.514495 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69530d68-be96-4605-be46-8053083aa178-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "69530d68-be96-4605-be46-8053083aa178" (UID: "69530d68-be96-4605-be46-8053083aa178"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:26:02 crc kubenswrapper[4722]: I0309 14:26:02.536418 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69530d68-be96-4605-be46-8053083aa178-config" (OuterVolumeSpecName: "config") pod "69530d68-be96-4605-be46-8053083aa178" (UID: "69530d68-be96-4605-be46-8053083aa178"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:26:02 crc kubenswrapper[4722]: I0309 14:26:02.565517 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-fx7ks"] Mar 09 14:26:02 crc kubenswrapper[4722]: I0309 14:26:02.586837 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/69530d68-be96-4605-be46-8053083aa178-config\") on node \"crc\" DevicePath \"\"" Mar 09 14:26:02 crc kubenswrapper[4722]: I0309 14:26:02.586881 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4qqtt\" (UniqueName: \"kubernetes.io/projected/69530d68-be96-4605-be46-8053083aa178-kube-api-access-4qqtt\") on node \"crc\" DevicePath \"\"" Mar 09 14:26:02 crc kubenswrapper[4722]: I0309 14:26:02.586897 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69530d68-be96-4605-be46-8053083aa178-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:26:02 crc kubenswrapper[4722]: W0309 14:26:02.592272 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcdb62cc9_3fd6_41fd_bbbd_de3aaee2073b.slice/crio-c24a1a5f921771115b437afb566d6a43e306470f334ebb51dcb64ceecf3d7b6e WatchSource:0}: Error finding container c24a1a5f921771115b437afb566d6a43e306470f334ebb51dcb64ceecf3d7b6e: Status 404 returned error can't find the container with id c24a1a5f921771115b437afb566d6a43e306470f334ebb51dcb64ceecf3d7b6e Mar 09 14:26:02 crc kubenswrapper[4722]: I0309 14:26:02.696362 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-l9wtc" event={"ID":"69530d68-be96-4605-be46-8053083aa178","Type":"ContainerDied","Data":"3c9e54fbcde6482e4278145ee4c88bc3f0b0e9fa8cb13d43852c8ab34ce8cd32"} Mar 09 14:26:02 crc kubenswrapper[4722]: I0309 14:26:02.696409 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c9e54fbcde6482e4278145ee4c88bc3f0b0e9fa8cb13d43852c8ab34ce8cd32" Mar 09 14:26:02 crc kubenswrapper[4722]: I0309 14:26:02.696874 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-l9wtc" Mar 09 14:26:02 crc kubenswrapper[4722]: I0309 14:26:02.723770 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fx7ks" event={"ID":"cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b","Type":"ContainerStarted","Data":"c24a1a5f921771115b437afb566d6a43e306470f334ebb51dcb64ceecf3d7b6e"} Mar 09 14:26:02 crc kubenswrapper[4722]: I0309 14:26:02.728269 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551106-tzqf7" event={"ID":"6e7bbbcf-fe3a-4007-b98d-2692b6f6cbe5","Type":"ContainerStarted","Data":"6b6378f42e8e568eb734f2a119dd536e84363f672cd6cf6baccb8b4cae882a07"} Mar 09 14:26:02 crc kubenswrapper[4722]: I0309 14:26:02.734756 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-9llgc" event={"ID":"d1e56f45-1812-4ee1-aacc-0b012cf07111","Type":"ContainerStarted","Data":"8146c5ee9c87f326a93c865b0b24bb207155822d151e5247f424dbf90262a039"} Mar 09 14:26:02 crc kubenswrapper[4722]: I0309 14:26:02.738627 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"905e426d-8ef1-442e-abb2-69905e8fc61a","Type":"ContainerStarted","Data":"d22f656d55f72d75c7f1d48eb90e71d7c26d641320c24c6a229270b403b98e8e"} Mar 09 14:26:02 crc kubenswrapper[4722]: E0309 14:26:02.743387 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-fzj6s" podUID="5035bd54-0aaa-4ff3-b90a-6145145fe95c" Mar 09 14:26:02 crc kubenswrapper[4722]: I0309 14:26:02.753513 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-9llgc" podStartSLOduration=5.114432809 podStartE2EDuration="30.753494689s" podCreationTimestamp="2026-03-09 14:25:32 +0000 UTC" firstStartedPulling="2026-03-09 14:25:34.584280587 +0000 UTC m=+1375.139849163" lastFinishedPulling="2026-03-09 14:26:00.223342477 +0000 UTC m=+1400.778911043" observedRunningTime="2026-03-09 14:26:02.75171476 +0000 UTC m=+1403.307283346" watchObservedRunningTime="2026-03-09 14:26:02.753494689 +0000 UTC m=+1403.309063265" Mar 09 14:26:02 crc kubenswrapper[4722]: I0309 14:26:02.943750 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-fbs5z"] Mar 09 14:26:02 crc kubenswrapper[4722]: E0309 14:26:02.944428 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69530d68-be96-4605-be46-8053083aa178" containerName="neutron-db-sync" Mar 09 14:26:02 crc kubenswrapper[4722]: I0309 14:26:02.944456 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="69530d68-be96-4605-be46-8053083aa178" containerName="neutron-db-sync" Mar 09 14:26:02 crc kubenswrapper[4722]: I0309 14:26:02.946267 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="69530d68-be96-4605-be46-8053083aa178" containerName="neutron-db-sync" Mar 09 14:26:02 crc kubenswrapper[4722]: I0309 14:26:02.948474 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-fbs5z" Mar 09 14:26:02 crc kubenswrapper[4722]: I0309 14:26:02.963439 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-fbs5z"] Mar 09 14:26:03 crc kubenswrapper[4722]: I0309 14:26:03.069745 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6f5cbf5964-pmtn7"] Mar 09 14:26:03 crc kubenswrapper[4722]: I0309 14:26:03.073135 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6f5cbf5964-pmtn7" Mar 09 14:26:03 crc kubenswrapper[4722]: I0309 14:26:03.075717 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-d8dpk" Mar 09 14:26:03 crc kubenswrapper[4722]: I0309 14:26:03.075926 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 09 14:26:03 crc kubenswrapper[4722]: I0309 14:26:03.076123 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 09 14:26:03 crc kubenswrapper[4722]: I0309 14:26:03.076257 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 09 14:26:03 crc kubenswrapper[4722]: I0309 14:26:03.083805 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6f5cbf5964-pmtn7"] Mar 09 14:26:03 crc kubenswrapper[4722]: I0309 14:26:03.118515 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/279a10b1-27b8-49d5-8f22-2e01c3db5a04-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-fbs5z\" (UID: \"279a10b1-27b8-49d5-8f22-2e01c3db5a04\") " pod="openstack/dnsmasq-dns-55f844cf75-fbs5z" Mar 09 14:26:03 crc kubenswrapper[4722]: I0309 14:26:03.118591 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/279a10b1-27b8-49d5-8f22-2e01c3db5a04-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-fbs5z\" (UID: \"279a10b1-27b8-49d5-8f22-2e01c3db5a04\") " pod="openstack/dnsmasq-dns-55f844cf75-fbs5z" Mar 09 14:26:03 crc kubenswrapper[4722]: I0309 14:26:03.118637 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/279a10b1-27b8-49d5-8f22-2e01c3db5a04-dns-svc\") pod \"dnsmasq-dns-55f844cf75-fbs5z\" (UID: \"279a10b1-27b8-49d5-8f22-2e01c3db5a04\") " pod="openstack/dnsmasq-dns-55f844cf75-fbs5z" Mar 09 14:26:03 crc kubenswrapper[4722]: I0309 14:26:03.118685 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7hpk\" (UniqueName: \"kubernetes.io/projected/279a10b1-27b8-49d5-8f22-2e01c3db5a04-kube-api-access-f7hpk\") pod \"dnsmasq-dns-55f844cf75-fbs5z\" (UID: \"279a10b1-27b8-49d5-8f22-2e01c3db5a04\") " pod="openstack/dnsmasq-dns-55f844cf75-fbs5z" Mar 09 14:26:03 crc kubenswrapper[4722]: I0309 14:26:03.118774 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/279a10b1-27b8-49d5-8f22-2e01c3db5a04-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-fbs5z\" (UID: \"279a10b1-27b8-49d5-8f22-2e01c3db5a04\") " pod="openstack/dnsmasq-dns-55f844cf75-fbs5z" Mar 09 14:26:03 crc kubenswrapper[4722]: I0309 14:26:03.118914 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/279a10b1-27b8-49d5-8f22-2e01c3db5a04-config\") pod \"dnsmasq-dns-55f844cf75-fbs5z\" (UID: \"279a10b1-27b8-49d5-8f22-2e01c3db5a04\") " pod="openstack/dnsmasq-dns-55f844cf75-fbs5z" Mar 09 14:26:03 crc kubenswrapper[4722]: I0309 14:26:03.233155 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/279a10b1-27b8-49d5-8f22-2e01c3db5a04-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-fbs5z\" (UID: \"279a10b1-27b8-49d5-8f22-2e01c3db5a04\") " pod="openstack/dnsmasq-dns-55f844cf75-fbs5z" Mar 09 14:26:03 crc kubenswrapper[4722]: I0309 14:26:03.233270 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24650aaa-2d24-4e59-9f9a-40b929e25c10-combined-ca-bundle\") pod \"neutron-6f5cbf5964-pmtn7\" (UID: \"24650aaa-2d24-4e59-9f9a-40b929e25c10\") " pod="openstack/neutron-6f5cbf5964-pmtn7" Mar 09 14:26:03 crc kubenswrapper[4722]: I0309 14:26:03.233464 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/24650aaa-2d24-4e59-9f9a-40b929e25c10-config\") pod \"neutron-6f5cbf5964-pmtn7\" (UID: \"24650aaa-2d24-4e59-9f9a-40b929e25c10\") " pod="openstack/neutron-6f5cbf5964-pmtn7" Mar 09 14:26:03 crc kubenswrapper[4722]: I0309 14:26:03.233604 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/279a10b1-27b8-49d5-8f22-2e01c3db5a04-config\") pod \"dnsmasq-dns-55f844cf75-fbs5z\" (UID: \"279a10b1-27b8-49d5-8f22-2e01c3db5a04\") " pod="openstack/dnsmasq-dns-55f844cf75-fbs5z" Mar 09 14:26:03 crc kubenswrapper[4722]: I0309 14:26:03.233706 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ll9zp\" (UniqueName: \"kubernetes.io/projected/24650aaa-2d24-4e59-9f9a-40b929e25c10-kube-api-access-ll9zp\") pod \"neutron-6f5cbf5964-pmtn7\" (UID: \"24650aaa-2d24-4e59-9f9a-40b929e25c10\") " pod="openstack/neutron-6f5cbf5964-pmtn7" Mar 09 14:26:03 crc kubenswrapper[4722]: I0309 14:26:03.233981 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/279a10b1-27b8-49d5-8f22-2e01c3db5a04-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-fbs5z\" (UID: \"279a10b1-27b8-49d5-8f22-2e01c3db5a04\") " pod="openstack/dnsmasq-dns-55f844cf75-fbs5z" Mar 09 14:26:03 crc kubenswrapper[4722]: I0309 14:26:03.234077 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/24650aaa-2d24-4e59-9f9a-40b929e25c10-ovndb-tls-certs\") pod \"neutron-6f5cbf5964-pmtn7\" (UID: \"24650aaa-2d24-4e59-9f9a-40b929e25c10\") " pod="openstack/neutron-6f5cbf5964-pmtn7" Mar 09 14:26:03 crc kubenswrapper[4722]: I0309 14:26:03.234103 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/279a10b1-27b8-49d5-8f22-2e01c3db5a04-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-fbs5z\" (UID: \"279a10b1-27b8-49d5-8f22-2e01c3db5a04\") " pod="openstack/dnsmasq-dns-55f844cf75-fbs5z" Mar 09 14:26:03 crc kubenswrapper[4722]: I0309 14:26:03.234173 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/279a10b1-27b8-49d5-8f22-2e01c3db5a04-dns-svc\") pod \"dnsmasq-dns-55f844cf75-fbs5z\" (UID: \"279a10b1-27b8-49d5-8f22-2e01c3db5a04\") " pod="openstack/dnsmasq-dns-55f844cf75-fbs5z" Mar 09 14:26:03 crc kubenswrapper[4722]: I0309 14:26:03.234318 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7hpk\" (UniqueName: \"kubernetes.io/projected/279a10b1-27b8-49d5-8f22-2e01c3db5a04-kube-api-access-f7hpk\") pod \"dnsmasq-dns-55f844cf75-fbs5z\" (UID: \"279a10b1-27b8-49d5-8f22-2e01c3db5a04\") " pod="openstack/dnsmasq-dns-55f844cf75-fbs5z" Mar 09 14:26:03 crc kubenswrapper[4722]: I0309 14:26:03.234475 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/24650aaa-2d24-4e59-9f9a-40b929e25c10-httpd-config\") pod \"neutron-6f5cbf5964-pmtn7\" (UID: \"24650aaa-2d24-4e59-9f9a-40b929e25c10\") " pod="openstack/neutron-6f5cbf5964-pmtn7" Mar 09 14:26:03 crc kubenswrapper[4722]: I0309 14:26:03.234923 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/279a10b1-27b8-49d5-8f22-2e01c3db5a04-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-fbs5z\" (UID: \"279a10b1-27b8-49d5-8f22-2e01c3db5a04\") " pod="openstack/dnsmasq-dns-55f844cf75-fbs5z" Mar 09 14:26:03 crc kubenswrapper[4722]: I0309 14:26:03.235572 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/279a10b1-27b8-49d5-8f22-2e01c3db5a04-dns-svc\") pod \"dnsmasq-dns-55f844cf75-fbs5z\" (UID: \"279a10b1-27b8-49d5-8f22-2e01c3db5a04\") " pod="openstack/dnsmasq-dns-55f844cf75-fbs5z" Mar 09 14:26:03 crc kubenswrapper[4722]: I0309 14:26:03.236548 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/279a10b1-27b8-49d5-8f22-2e01c3db5a04-config\") pod \"dnsmasq-dns-55f844cf75-fbs5z\" (UID: \"279a10b1-27b8-49d5-8f22-2e01c3db5a04\") " pod="openstack/dnsmasq-dns-55f844cf75-fbs5z" Mar 09 14:26:03 crc kubenswrapper[4722]: I0309 14:26:03.236654 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/279a10b1-27b8-49d5-8f22-2e01c3db5a04-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-fbs5z\" (UID: \"279a10b1-27b8-49d5-8f22-2e01c3db5a04\") " pod="openstack/dnsmasq-dns-55f844cf75-fbs5z" Mar 09 14:26:03 crc kubenswrapper[4722]: I0309 14:26:03.237014 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/279a10b1-27b8-49d5-8f22-2e01c3db5a04-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-fbs5z\" (UID: \"279a10b1-27b8-49d5-8f22-2e01c3db5a04\") " pod="openstack/dnsmasq-dns-55f844cf75-fbs5z" Mar 09 14:26:03 crc kubenswrapper[4722]: I0309 14:26:03.260666 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7hpk\" (UniqueName: \"kubernetes.io/projected/279a10b1-27b8-49d5-8f22-2e01c3db5a04-kube-api-access-f7hpk\") pod \"dnsmasq-dns-55f844cf75-fbs5z\" (UID: \"279a10b1-27b8-49d5-8f22-2e01c3db5a04\") " pod="openstack/dnsmasq-dns-55f844cf75-fbs5z" Mar 09 14:26:03 crc kubenswrapper[4722]: I0309 14:26:03.277169 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-fbs5z" Mar 09 14:26:03 crc kubenswrapper[4722]: I0309 14:26:03.335748 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/24650aaa-2d24-4e59-9f9a-40b929e25c10-config\") pod \"neutron-6f5cbf5964-pmtn7\" (UID: \"24650aaa-2d24-4e59-9f9a-40b929e25c10\") " pod="openstack/neutron-6f5cbf5964-pmtn7" Mar 09 14:26:03 crc kubenswrapper[4722]: I0309 14:26:03.336184 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ll9zp\" (UniqueName: \"kubernetes.io/projected/24650aaa-2d24-4e59-9f9a-40b929e25c10-kube-api-access-ll9zp\") pod \"neutron-6f5cbf5964-pmtn7\" (UID: \"24650aaa-2d24-4e59-9f9a-40b929e25c10\") " pod="openstack/neutron-6f5cbf5964-pmtn7" Mar 09 14:26:03 crc kubenswrapper[4722]: I0309 14:26:03.336353 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/24650aaa-2d24-4e59-9f9a-40b929e25c10-ovndb-tls-certs\") pod \"neutron-6f5cbf5964-pmtn7\" (UID: \"24650aaa-2d24-4e59-9f9a-40b929e25c10\") " pod="openstack/neutron-6f5cbf5964-pmtn7" Mar 09 14:26:03 crc kubenswrapper[4722]: I0309 14:26:03.337017 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/24650aaa-2d24-4e59-9f9a-40b929e25c10-httpd-config\") pod \"neutron-6f5cbf5964-pmtn7\" (UID: \"24650aaa-2d24-4e59-9f9a-40b929e25c10\") " pod="openstack/neutron-6f5cbf5964-pmtn7" Mar 09 14:26:03 crc kubenswrapper[4722]: I0309 14:26:03.337105 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24650aaa-2d24-4e59-9f9a-40b929e25c10-combined-ca-bundle\") pod \"neutron-6f5cbf5964-pmtn7\" (UID: \"24650aaa-2d24-4e59-9f9a-40b929e25c10\") " pod="openstack/neutron-6f5cbf5964-pmtn7" Mar 09 14:26:03 crc kubenswrapper[4722]: I0309 14:26:03.341144 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/24650aaa-2d24-4e59-9f9a-40b929e25c10-ovndb-tls-certs\") pod \"neutron-6f5cbf5964-pmtn7\" (UID: \"24650aaa-2d24-4e59-9f9a-40b929e25c10\") " pod="openstack/neutron-6f5cbf5964-pmtn7" Mar 09 14:26:03 crc kubenswrapper[4722]: I0309 14:26:03.341533 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24650aaa-2d24-4e59-9f9a-40b929e25c10-combined-ca-bundle\") pod \"neutron-6f5cbf5964-pmtn7\" (UID: \"24650aaa-2d24-4e59-9f9a-40b929e25c10\") " pod="openstack/neutron-6f5cbf5964-pmtn7" Mar 09 14:26:03 crc kubenswrapper[4722]: I0309 14:26:03.349967 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/24650aaa-2d24-4e59-9f9a-40b929e25c10-httpd-config\") pod \"neutron-6f5cbf5964-pmtn7\" (UID: \"24650aaa-2d24-4e59-9f9a-40b929e25c10\") " pod="openstack/neutron-6f5cbf5964-pmtn7" Mar 09 14:26:03 crc kubenswrapper[4722]: I0309 14:26:03.361061 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/24650aaa-2d24-4e59-9f9a-40b929e25c10-config\") pod \"neutron-6f5cbf5964-pmtn7\" (UID: \"24650aaa-2d24-4e59-9f9a-40b929e25c10\") " pod="openstack/neutron-6f5cbf5964-pmtn7" Mar 09 14:26:03 crc kubenswrapper[4722]: I0309 14:26:03.363775 4722 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-ll9zp\" (UniqueName: \"kubernetes.io/projected/24650aaa-2d24-4e59-9f9a-40b929e25c10-kube-api-access-ll9zp\") pod \"neutron-6f5cbf5964-pmtn7\" (UID: \"24650aaa-2d24-4e59-9f9a-40b929e25c10\") " pod="openstack/neutron-6f5cbf5964-pmtn7" Mar 09 14:26:03 crc kubenswrapper[4722]: I0309 14:26:03.414023 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6f5cbf5964-pmtn7" Mar 09 14:26:03 crc kubenswrapper[4722]: I0309 14:26:03.671491 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 14:26:03 crc kubenswrapper[4722]: I0309 14:26:03.764752 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4","Type":"ContainerStarted","Data":"eab35e11f04f2ca25e4cf0c889b31ea97bd408319f100d96ec0a29726d467849"} Mar 09 14:26:03 crc kubenswrapper[4722]: I0309 14:26:03.890253 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-fbs5z"] Mar 09 14:26:03 crc kubenswrapper[4722]: W0309 14:26:03.899585 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod279a10b1_27b8_49d5_8f22_2e01c3db5a04.slice/crio-d53dbc0076eafbefbd5833a18d097cd091aeb9a7ae879c460d01486740d6795c WatchSource:0}: Error finding container d53dbc0076eafbefbd5833a18d097cd091aeb9a7ae879c460d01486740d6795c: Status 404 returned error can't find the container with id d53dbc0076eafbefbd5833a18d097cd091aeb9a7ae879c460d01486740d6795c Mar 09 14:26:04 crc kubenswrapper[4722]: I0309 14:26:04.243144 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6f5cbf5964-pmtn7"] Mar 09 14:26:04 crc kubenswrapper[4722]: I0309 14:26:04.361579 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-qccbt" podUID="f87e19a1-bff1-4e82-8677-3812b2f51f46" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.172:5353: i/o timeout" Mar 09 14:26:04 crc kubenswrapper[4722]: I0309 14:26:04.777430 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fx7ks" event={"ID":"cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b","Type":"ContainerStarted","Data":"3a6f8c5ed200b2e304dfa5940d263adcd0ac9b8b79a760183d551dd9d0eee91d"} Mar 09 14:26:04 crc kubenswrapper[4722]: I0309 14:26:04.782234 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"905e426d-8ef1-442e-abb2-69905e8fc61a","Type":"ContainerStarted","Data":"2b782e1d01d6f1e5d883408b56654cb190f65dda28ade9acdcc2b519429bd927"} Mar 09 14:26:04 crc kubenswrapper[4722]: I0309 14:26:04.794023 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-fx7ks" podStartSLOduration=4.794006795 podStartE2EDuration="4.794006795s" podCreationTimestamp="2026-03-09 14:26:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:26:04.792245276 +0000 UTC m=+1405.347813852" watchObservedRunningTime="2026-03-09 14:26:04.794006795 +0000 UTC m=+1405.349575371" Mar 09 14:26:04 crc kubenswrapper[4722]: I0309 14:26:04.812647 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4","Type":"ContainerStarted","Data":"a23379779376add9f8ba28be54166afd7bd04e41bcf899bf7e5acdad5d66214e"} Mar 09 14:26:04 crc kubenswrapper[4722]: I0309 14:26:04.822146 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-fbs5z" event={"ID":"279a10b1-27b8-49d5-8f22-2e01c3db5a04","Type":"ContainerStarted","Data":"33d3a450005f1f13e5f186ef301b9b16c456b6d21adcb53af06c7cf934fe591b"} Mar 09 14:26:04 crc kubenswrapper[4722]: I0309 14:26:04.822187 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-fbs5z" event={"ID":"279a10b1-27b8-49d5-8f22-2e01c3db5a04","Type":"ContainerStarted","Data":"d53dbc0076eafbefbd5833a18d097cd091aeb9a7ae879c460d01486740d6795c"} Mar 09 14:26:04 crc kubenswrapper[4722]: I0309 14:26:04.825232 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f5cbf5964-pmtn7" event={"ID":"24650aaa-2d24-4e59-9f9a-40b929e25c10","Type":"ContainerStarted","Data":"5415c2e2f0d9b8528559acb465bd00b9d5e4a5f98d5315538979c87b19b791c3"} Mar 09 14:26:04 crc kubenswrapper[4722]: I0309 14:26:04.825267 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f5cbf5964-pmtn7" event={"ID":"24650aaa-2d24-4e59-9f9a-40b929e25c10","Type":"ContainerStarted","Data":"cf70bb37a7aacb5ae3ddda8d9165846a1ed111f9dfc97a3211de8ffb1087dbba"} Mar 09 14:26:05 crc kubenswrapper[4722]: I0309 14:26:05.626272 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5487cbd8d5-crfpm"] Mar 09 14:26:05 crc kubenswrapper[4722]: I0309 14:26:05.629135 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5487cbd8d5-crfpm" Mar 09 14:26:05 crc kubenswrapper[4722]: I0309 14:26:05.634399 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 09 14:26:05 crc kubenswrapper[4722]: I0309 14:26:05.634665 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 09 14:26:05 crc kubenswrapper[4722]: I0309 14:26:05.667142 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5487cbd8d5-crfpm"] Mar 09 14:26:05 crc kubenswrapper[4722]: I0309 14:26:05.719733 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a53b24c6-01ac-48c0-8c62-f27e8309de23-public-tls-certs\") pod \"neutron-5487cbd8d5-crfpm\" (UID: \"a53b24c6-01ac-48c0-8c62-f27e8309de23\") " pod="openstack/neutron-5487cbd8d5-crfpm" Mar 09 14:26:05 crc kubenswrapper[4722]: I0309 14:26:05.719793 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a53b24c6-01ac-48c0-8c62-f27e8309de23-ovndb-tls-certs\") pod \"neutron-5487cbd8d5-crfpm\" (UID: \"a53b24c6-01ac-48c0-8c62-f27e8309de23\") " pod="openstack/neutron-5487cbd8d5-crfpm" Mar 09 14:26:05 crc kubenswrapper[4722]: I0309 14:26:05.719817 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a53b24c6-01ac-48c0-8c62-f27e8309de23-config\") pod \"neutron-5487cbd8d5-crfpm\" (UID: \"a53b24c6-01ac-48c0-8c62-f27e8309de23\") " pod="openstack/neutron-5487cbd8d5-crfpm" Mar 09 14:26:05 crc kubenswrapper[4722]: I0309 14:26:05.719831 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a53b24c6-01ac-48c0-8c62-f27e8309de23-httpd-config\") pod \"neutron-5487cbd8d5-crfpm\" (UID: \"a53b24c6-01ac-48c0-8c62-f27e8309de23\") " pod="openstack/neutron-5487cbd8d5-crfpm" Mar 09 14:26:05 crc kubenswrapper[4722]: I0309 14:26:05.719848 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a53b24c6-01ac-48c0-8c62-f27e8309de23-combined-ca-bundle\") pod \"neutron-5487cbd8d5-crfpm\" (UID: \"a53b24c6-01ac-48c0-8c62-f27e8309de23\") " pod="openstack/neutron-5487cbd8d5-crfpm" Mar 09 14:26:05 crc kubenswrapper[4722]: I0309 14:26:05.719864 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpt6n\" (UniqueName: \"kubernetes.io/projected/a53b24c6-01ac-48c0-8c62-f27e8309de23-kube-api-access-cpt6n\") pod \"neutron-5487cbd8d5-crfpm\" (UID: \"a53b24c6-01ac-48c0-8c62-f27e8309de23\") " pod="openstack/neutron-5487cbd8d5-crfpm" Mar 09 14:26:05 crc kubenswrapper[4722]: I0309 14:26:05.720002 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a53b24c6-01ac-48c0-8c62-f27e8309de23-internal-tls-certs\") pod \"neutron-5487cbd8d5-crfpm\" (UID: \"a53b24c6-01ac-48c0-8c62-f27e8309de23\") " pod="openstack/neutron-5487cbd8d5-crfpm" Mar 09 14:26:05 crc kubenswrapper[4722]: I0309 14:26:05.822590 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a53b24c6-01ac-48c0-8c62-f27e8309de23-ovndb-tls-certs\") pod \"neutron-5487cbd8d5-crfpm\" (UID: \"a53b24c6-01ac-48c0-8c62-f27e8309de23\") " pod="openstack/neutron-5487cbd8d5-crfpm" Mar 09 14:26:05 crc kubenswrapper[4722]: I0309 14:26:05.822648 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a53b24c6-01ac-48c0-8c62-f27e8309de23-config\") pod \"neutron-5487cbd8d5-crfpm\" (UID: \"a53b24c6-01ac-48c0-8c62-f27e8309de23\") " pod="openstack/neutron-5487cbd8d5-crfpm" Mar 09 14:26:05 crc kubenswrapper[4722]: I0309 14:26:05.822667 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a53b24c6-01ac-48c0-8c62-f27e8309de23-httpd-config\") pod \"neutron-5487cbd8d5-crfpm\" (UID: \"a53b24c6-01ac-48c0-8c62-f27e8309de23\") " pod="openstack/neutron-5487cbd8d5-crfpm" Mar 09 14:26:05 crc kubenswrapper[4722]: I0309 14:26:05.822685 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpt6n\" (UniqueName: \"kubernetes.io/projected/a53b24c6-01ac-48c0-8c62-f27e8309de23-kube-api-access-cpt6n\") pod \"neutron-5487cbd8d5-crfpm\" (UID: \"a53b24c6-01ac-48c0-8c62-f27e8309de23\") " pod="openstack/neutron-5487cbd8d5-crfpm" Mar 09 14:26:05 crc kubenswrapper[4722]: I0309 14:26:05.822705 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a53b24c6-01ac-48c0-8c62-f27e8309de23-combined-ca-bundle\") pod \"neutron-5487cbd8d5-crfpm\" (UID: \"a53b24c6-01ac-48c0-8c62-f27e8309de23\") " pod="openstack/neutron-5487cbd8d5-crfpm" Mar 09 14:26:05 crc kubenswrapper[4722]: I0309 14:26:05.827812 4722 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a53b24c6-01ac-48c0-8c62-f27e8309de23-internal-tls-certs\") pod \"neutron-5487cbd8d5-crfpm\" (UID: \"a53b24c6-01ac-48c0-8c62-f27e8309de23\") " pod="openstack/neutron-5487cbd8d5-crfpm" Mar 09 14:26:05 crc kubenswrapper[4722]: I0309 14:26:05.828596 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a53b24c6-01ac-48c0-8c62-f27e8309de23-public-tls-certs\") pod \"neutron-5487cbd8d5-crfpm\" (UID: \"a53b24c6-01ac-48c0-8c62-f27e8309de23\") " pod="openstack/neutron-5487cbd8d5-crfpm" Mar 09 14:26:05 crc kubenswrapper[4722]: I0309 14:26:05.859604 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a53b24c6-01ac-48c0-8c62-f27e8309de23-combined-ca-bundle\") pod \"neutron-5487cbd8d5-crfpm\" (UID: \"a53b24c6-01ac-48c0-8c62-f27e8309de23\") " pod="openstack/neutron-5487cbd8d5-crfpm" Mar 09 14:26:05 crc kubenswrapper[4722]: I0309 14:26:05.859660 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a53b24c6-01ac-48c0-8c62-f27e8309de23-httpd-config\") pod \"neutron-5487cbd8d5-crfpm\" (UID: \"a53b24c6-01ac-48c0-8c62-f27e8309de23\") " pod="openstack/neutron-5487cbd8d5-crfpm" Mar 09 14:26:05 crc kubenswrapper[4722]: I0309 14:26:05.860488 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/a53b24c6-01ac-48c0-8c62-f27e8309de23-config\") pod \"neutron-5487cbd8d5-crfpm\" (UID: \"a53b24c6-01ac-48c0-8c62-f27e8309de23\") " pod="openstack/neutron-5487cbd8d5-crfpm" Mar 09 14:26:05 crc kubenswrapper[4722]: I0309 14:26:05.860793 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a53b24c6-01ac-48c0-8c62-f27e8309de23-ovndb-tls-certs\") pod \"neutron-5487cbd8d5-crfpm\" (UID: \"a53b24c6-01ac-48c0-8c62-f27e8309de23\") " pod="openstack/neutron-5487cbd8d5-crfpm" Mar 09 14:26:05 crc kubenswrapper[4722]: I0309 14:26:05.863388 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"34700700-cbd0-4a2e-b791-25b63e3de5b8","Type":"ContainerStarted","Data":"3b2426d4609239d48a635ae0e691919369f6b42c652697acdcc320f9f919a30b"} Mar 09 14:26:05 crc kubenswrapper[4722]: I0309 14:26:05.864127 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a53b24c6-01ac-48c0-8c62-f27e8309de23-internal-tls-certs\") pod \"neutron-5487cbd8d5-crfpm\" (UID: \"a53b24c6-01ac-48c0-8c62-f27e8309de23\") " pod="openstack/neutron-5487cbd8d5-crfpm" Mar 09 14:26:05 crc kubenswrapper[4722]: I0309 14:26:05.868010 4722 generic.go:334] "Generic (PLEG): container finished" podID="279a10b1-27b8-49d5-8f22-2e01c3db5a04" containerID="33d3a450005f1f13e5f186ef301b9b16c456b6d21adcb53af06c7cf934fe591b" exitCode=0 Mar 09 14:26:05 crc kubenswrapper[4722]: I0309 14:26:05.868072 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-fbs5z" event={"ID":"279a10b1-27b8-49d5-8f22-2e01c3db5a04","Type":"ContainerDied","Data":"33d3a450005f1f13e5f186ef301b9b16c456b6d21adcb53af06c7cf934fe591b"} Mar 09 14:26:05 crc kubenswrapper[4722]: I0309 14:26:05.868098 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-fbs5z" 
event={"ID":"279a10b1-27b8-49d5-8f22-2e01c3db5a04","Type":"ContainerStarted","Data":"358f5f147e39647eee3fd33e0437b25e2c7aff3825809a733605042d2612b38c"} Mar 09 14:26:05 crc kubenswrapper[4722]: I0309 14:26:05.869535 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a53b24c6-01ac-48c0-8c62-f27e8309de23-public-tls-certs\") pod \"neutron-5487cbd8d5-crfpm\" (UID: \"a53b24c6-01ac-48c0-8c62-f27e8309de23\") " pod="openstack/neutron-5487cbd8d5-crfpm" Mar 09 14:26:05 crc kubenswrapper[4722]: I0309 14:26:05.869845 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55f844cf75-fbs5z" Mar 09 14:26:05 crc kubenswrapper[4722]: I0309 14:26:05.878979 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpt6n\" (UniqueName: \"kubernetes.io/projected/a53b24c6-01ac-48c0-8c62-f27e8309de23-kube-api-access-cpt6n\") pod \"neutron-5487cbd8d5-crfpm\" (UID: \"a53b24c6-01ac-48c0-8c62-f27e8309de23\") " pod="openstack/neutron-5487cbd8d5-crfpm" Mar 09 14:26:05 crc kubenswrapper[4722]: I0309 14:26:05.890745 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f5cbf5964-pmtn7" event={"ID":"24650aaa-2d24-4e59-9f9a-40b929e25c10","Type":"ContainerStarted","Data":"89975d701fd58751699513e7e1eafbc6124794762a9b7f960bfecb6450d873bd"} Mar 09 14:26:05 crc kubenswrapper[4722]: I0309 14:26:05.890929 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6f5cbf5964-pmtn7" Mar 09 14:26:05 crc kubenswrapper[4722]: I0309 14:26:05.939878 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55f844cf75-fbs5z" podStartSLOduration=3.93985219 podStartE2EDuration="3.93985219s" podCreationTimestamp="2026-03-09 14:26:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:26:05.897876798 +0000 UTC m=+1406.453445374" watchObservedRunningTime="2026-03-09 14:26:05.93985219 +0000 UTC m=+1406.495420766" Mar 09 14:26:05 crc kubenswrapper[4722]: I0309 14:26:05.943591 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6f5cbf5964-pmtn7" podStartSLOduration=2.943580023 podStartE2EDuration="2.943580023s" podCreationTimestamp="2026-03-09 14:26:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:26:05.92470115 +0000 UTC m=+1406.480269866" watchObservedRunningTime="2026-03-09 14:26:05.943580023 +0000 UTC m=+1406.499148599" Mar 09 14:26:05 crc kubenswrapper[4722]: I0309 14:26:05.976472 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5487cbd8d5-crfpm" Mar 09 14:26:06 crc kubenswrapper[4722]: I0309 14:26:06.792147 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5487cbd8d5-crfpm"] Mar 09 14:26:06 crc kubenswrapper[4722]: W0309 14:26:06.820306 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda53b24c6_01ac_48c0_8c62_f27e8309de23.slice/crio-c731c6cb05aebe7df0fcbb137dadee56fb6cb38e27f8832d1fd62cec2e951509 WatchSource:0}: Error finding container c731c6cb05aebe7df0fcbb137dadee56fb6cb38e27f8832d1fd62cec2e951509: Status 404 returned error can't find the container with id c731c6cb05aebe7df0fcbb137dadee56fb6cb38e27f8832d1fd62cec2e951509 Mar 09 14:26:06 crc kubenswrapper[4722]: I0309 14:26:06.924337 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5487cbd8d5-crfpm" event={"ID":"a53b24c6-01ac-48c0-8c62-f27e8309de23","Type":"ContainerStarted","Data":"c731c6cb05aebe7df0fcbb137dadee56fb6cb38e27f8832d1fd62cec2e951509"} Mar 09 14:26:06 crc kubenswrapper[4722]: I0309 14:26:06.928557 4722 generic.go:334] "Generic (PLEG): container finished" podID="6e7bbbcf-fe3a-4007-b98d-2692b6f6cbe5" containerID="f03fc0a6d2153285b68b06ee976c841e642a446a81e1e6eb709dce371d8eb9b3" exitCode=0 Mar 09 14:26:06 crc kubenswrapper[4722]: I0309 14:26:06.928602 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551106-tzqf7" event={"ID":"6e7bbbcf-fe3a-4007-b98d-2692b6f6cbe5","Type":"ContainerDied","Data":"f03fc0a6d2153285b68b06ee976c841e642a446a81e1e6eb709dce371d8eb9b3"} Mar 09 14:26:06 crc kubenswrapper[4722]: I0309 14:26:06.933659 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"905e426d-8ef1-442e-abb2-69905e8fc61a","Type":"ContainerStarted","Data":"9fdfba0d1ad4b491c62ca71f25534c103428e138f08244b726136f8e29e87ad2"} Mar 09 14:26:06 crc kubenswrapper[4722]: I0309 14:26:06.958079 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4","Type":"ContainerStarted","Data":"7d51fe379c52b16a7a6d962b827b1c07be67bbcd90aa939d56cebc2eb739f4c7"} Mar 09 14:26:07 crc kubenswrapper[4722]: I0309 14:26:07.113028 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=21.112998761 podStartE2EDuration="21.112998761s" podCreationTimestamp="2026-03-09 14:25:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:26:06.965124226 +0000 UTC m=+1407.520692802" watchObservedRunningTime="2026-03-09 14:26:07.112998761 +0000 UTC m=+1407.668567347" Mar 09 14:26:07 crc kubenswrapper[4722]: I0309 14:26:07.137883 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=21.13786325 podStartE2EDuration="21.13786325s" podCreationTimestamp="2026-03-09 14:25:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:26:07.009737732 +0000 UTC m=+1407.565306308" watchObservedRunningTime="2026-03-09 14:26:07.13786325 +0000 UTC m=+1407.693431826" Mar 09 14:26:07 crc kubenswrapper[4722]: I0309 14:26:07.248342 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 09 14:26:07 crc kubenswrapper[4722]: I0309 14:26:07.248725 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 09 14:26:07 crc kubenswrapper[4722]: I0309 14:26:07.308987 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 09 14:26:07 crc kubenswrapper[4722]: I0309 14:26:07.313643 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 09 14:26:07 crc kubenswrapper[4722]: I0309 14:26:07.963444 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-tlbnm" event={"ID":"b63a3af5-a347-40ed-b9bc-52ad70e7ff13","Type":"ContainerStarted","Data":"096316f8c04dfff3c49fef10ecd7a9fa77ee78f8e756c864851cd88a0c5c6a66"} Mar 09 14:26:07 crc kubenswrapper[4722]: I0309 14:26:07.969823 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5487cbd8d5-crfpm" event={"ID":"a53b24c6-01ac-48c0-8c62-f27e8309de23","Type":"ContainerStarted","Data":"d3ba4abf6a56fdb679f45127646186f63a7a511208e8f8b22d325ba9f4f9c918"} Mar 09 14:26:07 crc kubenswrapper[4722]: I0309 14:26:07.970152 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 09 14:26:07 crc kubenswrapper[4722]: I0309 14:26:07.970177 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 09 14:26:08 crc kubenswrapper[4722]: I0309 14:26:08.985007 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551106-tzqf7" event={"ID":"6e7bbbcf-fe3a-4007-b98d-2692b6f6cbe5","Type":"ContainerDied","Data":"6b6378f42e8e568eb734f2a119dd536e84363f672cd6cf6baccb8b4cae882a07"} Mar 09 14:26:08 crc kubenswrapper[4722]: I0309 14:26:08.985632 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b6378f42e8e568eb734f2a119dd536e84363f672cd6cf6baccb8b4cae882a07" Mar 09 14:26:09 crc kubenswrapper[4722]: I0309 14:26:09.005822 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551106-tzqf7" Mar 09 14:26:09 crc kubenswrapper[4722]: I0309 14:26:09.007396 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-tlbnm" podStartSLOduration=4.924655115 podStartE2EDuration="37.00737758s" podCreationTimestamp="2026-03-09 14:25:32 +0000 UTC" firstStartedPulling="2026-03-09 14:25:34.552829547 +0000 UTC m=+1375.108398123" lastFinishedPulling="2026-03-09 14:26:06.635552012 +0000 UTC m=+1407.191120588" observedRunningTime="2026-03-09 14:26:09.000587912 +0000 UTC m=+1409.556156498" watchObservedRunningTime="2026-03-09 14:26:09.00737758 +0000 UTC m=+1409.562946146" Mar 09 14:26:09 crc kubenswrapper[4722]: I0309 14:26:09.114152 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9h6h\" (UniqueName: \"kubernetes.io/projected/6e7bbbcf-fe3a-4007-b98d-2692b6f6cbe5-kube-api-access-f9h6h\") pod \"6e7bbbcf-fe3a-4007-b98d-2692b6f6cbe5\" (UID: \"6e7bbbcf-fe3a-4007-b98d-2692b6f6cbe5\") " Mar 09 14:26:09 crc kubenswrapper[4722]: I0309 14:26:09.119626 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e7bbbcf-fe3a-4007-b98d-2692b6f6cbe5-kube-api-access-f9h6h" (OuterVolumeSpecName: "kube-api-access-f9h6h") pod "6e7bbbcf-fe3a-4007-b98d-2692b6f6cbe5" (UID: "6e7bbbcf-fe3a-4007-b98d-2692b6f6cbe5"). InnerVolumeSpecName "kube-api-access-f9h6h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:26:09 crc kubenswrapper[4722]: I0309 14:26:09.217408 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9h6h\" (UniqueName: \"kubernetes.io/projected/6e7bbbcf-fe3a-4007-b98d-2692b6f6cbe5-kube-api-access-f9h6h\") on node \"crc\" DevicePath \"\"" Mar 09 14:26:10 crc kubenswrapper[4722]: I0309 14:26:10.002095 4722 generic.go:334] "Generic (PLEG): container finished" podID="d1e56f45-1812-4ee1-aacc-0b012cf07111" containerID="8146c5ee9c87f326a93c865b0b24bb207155822d151e5247f424dbf90262a039" exitCode=0 Mar 09 14:26:10 crc kubenswrapper[4722]: I0309 14:26:10.002180 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-9llgc" event={"ID":"d1e56f45-1812-4ee1-aacc-0b012cf07111","Type":"ContainerDied","Data":"8146c5ee9c87f326a93c865b0b24bb207155822d151e5247f424dbf90262a039"} Mar 09 14:26:10 crc kubenswrapper[4722]: I0309 14:26:10.007520 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551106-tzqf7" Mar 09 14:26:10 crc kubenswrapper[4722]: I0309 14:26:10.008582 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5487cbd8d5-crfpm" event={"ID":"a53b24c6-01ac-48c0-8c62-f27e8309de23","Type":"ContainerStarted","Data":"3e86f1b2140a6860ab063b2ed7b54181844d7003819352ffb7cdf3e8f395b8a7"} Mar 09 14:26:10 crc kubenswrapper[4722]: I0309 14:26:10.008937 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5487cbd8d5-crfpm" Mar 09 14:26:10 crc kubenswrapper[4722]: I0309 14:26:10.071880 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5487cbd8d5-crfpm" podStartSLOduration=5.071853942 podStartE2EDuration="5.071853942s" podCreationTimestamp="2026-03-09 14:26:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:26:10.044354801 +0000 UTC m=+1410.599923387" watchObservedRunningTime="2026-03-09 14:26:10.071853942 +0000 UTC m=+1410.627422518" Mar 09 14:26:10 crc kubenswrapper[4722]: I0309 14:26:10.098717 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551100-2cmr4"] Mar 09 14:26:10 crc kubenswrapper[4722]: I0309 14:26:10.111320 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551100-2cmr4"] Mar 09 14:26:10 crc kubenswrapper[4722]: I0309 14:26:10.209312 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbc4a147-c662-4236-b11b-16239fa031a0" path="/var/lib/kubelet/pods/bbc4a147-c662-4236-b11b-16239fa031a0/volumes" Mar 09 14:26:12 crc kubenswrapper[4722]: I0309 14:26:12.609349 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-9llgc" Mar 09 14:26:12 crc kubenswrapper[4722]: I0309 14:26:12.622369 4722 generic.go:334] "Generic (PLEG): container finished" podID="cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b" containerID="3a6f8c5ed200b2e304dfa5940d263adcd0ac9b8b79a760183d551dd9d0eee91d" exitCode=0 Mar 09 14:26:12 crc kubenswrapper[4722]: I0309 14:26:12.622462 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fx7ks" event={"ID":"cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b","Type":"ContainerDied","Data":"3a6f8c5ed200b2e304dfa5940d263adcd0ac9b8b79a760183d551dd9d0eee91d"} Mar 09 14:26:12 crc kubenswrapper[4722]: I0309 14:26:12.654928 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-9llgc" event={"ID":"d1e56f45-1812-4ee1-aacc-0b012cf07111","Type":"ContainerDied","Data":"7eac4554a11940cebe903bf3e04f5c16028a3e281dccd0c3261949ceabb29e88"} Mar 09 14:26:12 crc kubenswrapper[4722]: I0309 14:26:12.654990 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7eac4554a11940cebe903bf3e04f5c16028a3e281dccd0c3261949ceabb29e88" Mar 09 14:26:12 crc kubenswrapper[4722]: I0309 14:26:12.655066 4722 util.go:48] "No ready sandbox for pod can be found. 
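[Annotation, not part of the original log] The "Observed pod startup duration" entries above are internally consistent: podStartE2EDuration equals watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration equals that figure minus the image-pull window (lastFinishedPulling minus firstStartedPulling). The Go sketch below re-derives the barbican-db-sync-tlbnm numbers; the relation is inferred from the values in this log rather than taken from kubelet source, so treat it as a reading aid.

    // Re-check the startup-latency arithmetic logged by
    // pod_startup_latency_tracker.go:104 for barbican-db-sync-tlbnm.
    package main

    import (
    	"fmt"
    	"time"
    )

    func main() {
    	// Layout matching the timestamps as they appear in the log.
    	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
    	parse := func(s string) time.Time {
    		t, err := time.Parse(layout, s)
    		if err != nil {
    			panic(err)
    		}
    		return t
    	}
    	created := parse("2026-03-09 14:25:32 +0000 UTC")
    	firstPull := parse("2026-03-09 14:25:34.552829547 +0000 UTC")
    	lastPull := parse("2026-03-09 14:26:06.635552012 +0000 UTC")
    	observed := parse("2026-03-09 14:26:09.00737758 +0000 UTC")

    	e2e := observed.Sub(created)         // 37.00737758s, matches podStartE2EDuration
    	slo := e2e - lastPull.Sub(firstPull) // 4.924655115s, matches podStartSLOduration
    	fmt.Printf("e2e=%v slo=%v\n", e2e, slo)
    }

The same relation holds for heat-db-sync-j4lgt further below (44.769640974s minus a 41.749163289s pull window gives 3.020477685), and for neutron-5487cbd8d5-crfpm, whose zero-valued pull timestamps leave SLO and E2E durations equal.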
Mar 09 14:26:12 crc kubenswrapper[4722]: I0309 14:26:12.655066 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-9llgc"
Mar 09 14:26:12 crc kubenswrapper[4722]: I0309 14:26:12.766085 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1e56f45-1812-4ee1-aacc-0b012cf07111-logs\") pod \"d1e56f45-1812-4ee1-aacc-0b012cf07111\" (UID: \"d1e56f45-1812-4ee1-aacc-0b012cf07111\") "
Mar 09 14:26:12 crc kubenswrapper[4722]: I0309 14:26:12.766225 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t69c7\" (UniqueName: \"kubernetes.io/projected/d1e56f45-1812-4ee1-aacc-0b012cf07111-kube-api-access-t69c7\") pod \"d1e56f45-1812-4ee1-aacc-0b012cf07111\" (UID: \"d1e56f45-1812-4ee1-aacc-0b012cf07111\") "
Mar 09 14:26:12 crc kubenswrapper[4722]: I0309 14:26:12.766320 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1e56f45-1812-4ee1-aacc-0b012cf07111-config-data\") pod \"d1e56f45-1812-4ee1-aacc-0b012cf07111\" (UID: \"d1e56f45-1812-4ee1-aacc-0b012cf07111\") "
Mar 09 14:26:12 crc kubenswrapper[4722]: I0309 14:26:12.766736 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1e56f45-1812-4ee1-aacc-0b012cf07111-logs" (OuterVolumeSpecName: "logs") pod "d1e56f45-1812-4ee1-aacc-0b012cf07111" (UID: "d1e56f45-1812-4ee1-aacc-0b012cf07111"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 14:26:12 crc kubenswrapper[4722]: I0309 14:26:12.767530 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1e56f45-1812-4ee1-aacc-0b012cf07111-combined-ca-bundle\") pod \"d1e56f45-1812-4ee1-aacc-0b012cf07111\" (UID: \"d1e56f45-1812-4ee1-aacc-0b012cf07111\") "
Mar 09 14:26:12 crc kubenswrapper[4722]: I0309 14:26:12.767631 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1e56f45-1812-4ee1-aacc-0b012cf07111-scripts\") pod \"d1e56f45-1812-4ee1-aacc-0b012cf07111\" (UID: \"d1e56f45-1812-4ee1-aacc-0b012cf07111\") "
Mar 09 14:26:12 crc kubenswrapper[4722]: I0309 14:26:12.768545 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d1e56f45-1812-4ee1-aacc-0b012cf07111-logs\") on node \"crc\" DevicePath \"\""
Mar 09 14:26:12 crc kubenswrapper[4722]: I0309 14:26:12.773966 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1e56f45-1812-4ee1-aacc-0b012cf07111-kube-api-access-t69c7" (OuterVolumeSpecName: "kube-api-access-t69c7") pod "d1e56f45-1812-4ee1-aacc-0b012cf07111" (UID: "d1e56f45-1812-4ee1-aacc-0b012cf07111"). InnerVolumeSpecName "kube-api-access-t69c7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:26:12 crc kubenswrapper[4722]: I0309 14:26:12.789903 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1e56f45-1812-4ee1-aacc-0b012cf07111-scripts" (OuterVolumeSpecName: "scripts") pod "d1e56f45-1812-4ee1-aacc-0b012cf07111" (UID: "d1e56f45-1812-4ee1-aacc-0b012cf07111"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:26:12 crc kubenswrapper[4722]: I0309 14:26:12.812243 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1e56f45-1812-4ee1-aacc-0b012cf07111-config-data" (OuterVolumeSpecName: "config-data") pod "d1e56f45-1812-4ee1-aacc-0b012cf07111" (UID: "d1e56f45-1812-4ee1-aacc-0b012cf07111"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:26:12 crc kubenswrapper[4722]: I0309 14:26:12.812418 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1e56f45-1812-4ee1-aacc-0b012cf07111-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d1e56f45-1812-4ee1-aacc-0b012cf07111" (UID: "d1e56f45-1812-4ee1-aacc-0b012cf07111"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:26:12 crc kubenswrapper[4722]: I0309 14:26:12.870181 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1e56f45-1812-4ee1-aacc-0b012cf07111-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 14:26:12 crc kubenswrapper[4722]: I0309 14:26:12.870233 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1e56f45-1812-4ee1-aacc-0b012cf07111-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 14:26:12 crc kubenswrapper[4722]: I0309 14:26:12.870243 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t69c7\" (UniqueName: \"kubernetes.io/projected/d1e56f45-1812-4ee1-aacc-0b012cf07111-kube-api-access-t69c7\") on node \"crc\" DevicePath \"\""
Mar 09 14:26:12 crc kubenswrapper[4722]: I0309 14:26:12.870256 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1e56f45-1812-4ee1-aacc-0b012cf07111-config-data\") on node \"crc\" DevicePath \"\""
Mar 09 14:26:13 crc kubenswrapper[4722]: I0309 14:26:13.301268 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55f844cf75-fbs5z"
Mar 09 14:26:13 crc kubenswrapper[4722]: I0309 14:26:13.384797 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-gv4lz"]
Mar 09 14:26:13 crc kubenswrapper[4722]: I0309 14:26:13.385195 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-785d8bcb8c-gv4lz" podUID="50946f11-a025-4a18-a7b3-3dafa15c3b2f" containerName="dnsmasq-dns" containerID="cri-o://8462432b5cf6bd80898d429fde3ac72bf95277a5349b45032409711c6330e695" gracePeriod=10
Mar 09 14:26:13 crc kubenswrapper[4722]: I0309 14:26:13.667848 4722 generic.go:334] "Generic (PLEG): container finished" podID="b63a3af5-a347-40ed-b9bc-52ad70e7ff13" containerID="096316f8c04dfff3c49fef10ecd7a9fa77ee78f8e756c864851cd88a0c5c6a66" exitCode=0
Mar 09 14:26:13 crc kubenswrapper[4722]: I0309 14:26:13.667912 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-tlbnm" event={"ID":"b63a3af5-a347-40ed-b9bc-52ad70e7ff13","Type":"ContainerDied","Data":"096316f8c04dfff3c49fef10ecd7a9fa77ee78f8e756c864851cd88a0c5c6a66"}
Mar 09 14:26:13 crc kubenswrapper[4722]: I0309 14:26:13.671184 4722 generic.go:334] "Generic (PLEG): container finished" podID="50946f11-a025-4a18-a7b3-3dafa15c3b2f" containerID="8462432b5cf6bd80898d429fde3ac72bf95277a5349b45032409711c6330e695" exitCode=0
Mar 09 14:26:13 crc kubenswrapper[4722]: I0309 14:26:13.671393 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-gv4lz" event={"ID":"50946f11-a025-4a18-a7b3-3dafa15c3b2f","Type":"ContainerDied","Data":"8462432b5cf6bd80898d429fde3ac72bf95277a5349b45032409711c6330e695"}
Mar 09 14:26:13 crc kubenswrapper[4722]: I0309 14:26:13.818632 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-77d5d5f47b-7zdbg"]
Mar 09 14:26:13 crc kubenswrapper[4722]: E0309 14:26:13.819060 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1e56f45-1812-4ee1-aacc-0b012cf07111" containerName="placement-db-sync"
Mar 09 14:26:13 crc kubenswrapper[4722]: I0309 14:26:13.819076 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1e56f45-1812-4ee1-aacc-0b012cf07111" containerName="placement-db-sync"
Mar 09 14:26:13 crc kubenswrapper[4722]: E0309 14:26:13.819111 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e7bbbcf-fe3a-4007-b98d-2692b6f6cbe5" containerName="oc"
Mar 09 14:26:13 crc kubenswrapper[4722]: I0309 14:26:13.819118 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e7bbbcf-fe3a-4007-b98d-2692b6f6cbe5" containerName="oc"
Mar 09 14:26:13 crc kubenswrapper[4722]: I0309 14:26:13.819392 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1e56f45-1812-4ee1-aacc-0b012cf07111" containerName="placement-db-sync"
Mar 09 14:26:13 crc kubenswrapper[4722]: I0309 14:26:13.819406 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e7bbbcf-fe3a-4007-b98d-2692b6f6cbe5" containerName="oc"
Mar 09 14:26:13 crc kubenswrapper[4722]: I0309 14:26:13.820479 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-77d5d5f47b-7zdbg"
Mar 09 14:26:13 crc kubenswrapper[4722]: I0309 14:26:13.831621 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Mar 09 14:26:13 crc kubenswrapper[4722]: I0309 14:26:13.831771 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc"
Mar 09 14:26:13 crc kubenswrapper[4722]: I0309 14:26:13.831646 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-98w4p"
Mar 09 14:26:13 crc kubenswrapper[4722]: I0309 14:26:13.832973 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc"
Mar 09 14:26:13 crc kubenswrapper[4722]: I0309 14:26:13.834542 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Mar 09 14:26:13 crc kubenswrapper[4722]: I0309 14:26:13.880346 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-77d5d5f47b-7zdbg"]
Mar 09 14:26:13 crc kubenswrapper[4722]: I0309 14:26:13.920447 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/154add5d-d1cd-4c93-ae63-3c7dfe4cb035-logs\") pod \"placement-77d5d5f47b-7zdbg\" (UID: \"154add5d-d1cd-4c93-ae63-3c7dfe4cb035\") " pod="openstack/placement-77d5d5f47b-7zdbg"
Mar 09 14:26:13 crc kubenswrapper[4722]: I0309 14:26:13.920528 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/154add5d-d1cd-4c93-ae63-3c7dfe4cb035-public-tls-certs\") pod \"placement-77d5d5f47b-7zdbg\" (UID: \"154add5d-d1cd-4c93-ae63-3c7dfe4cb035\") " pod="openstack/placement-77d5d5f47b-7zdbg"
Mar 09 14:26:13 crc kubenswrapper[4722]: I0309 14:26:13.920575 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6xsh\" (UniqueName: \"kubernetes.io/projected/154add5d-d1cd-4c93-ae63-3c7dfe4cb035-kube-api-access-k6xsh\") pod \"placement-77d5d5f47b-7zdbg\" (UID: \"154add5d-d1cd-4c93-ae63-3c7dfe4cb035\") " pod="openstack/placement-77d5d5f47b-7zdbg"
Mar 09 14:26:13 crc kubenswrapper[4722]: I0309 14:26:13.920609 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/154add5d-d1cd-4c93-ae63-3c7dfe4cb035-internal-tls-certs\") pod \"placement-77d5d5f47b-7zdbg\" (UID: \"154add5d-d1cd-4c93-ae63-3c7dfe4cb035\") " pod="openstack/placement-77d5d5f47b-7zdbg"
Mar 09 14:26:13 crc kubenswrapper[4722]: I0309 14:26:13.920651 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/154add5d-d1cd-4c93-ae63-3c7dfe4cb035-config-data\") pod \"placement-77d5d5f47b-7zdbg\" (UID: \"154add5d-d1cd-4c93-ae63-3c7dfe4cb035\") " pod="openstack/placement-77d5d5f47b-7zdbg"
Mar 09 14:26:13 crc kubenswrapper[4722]: I0309 14:26:13.920683 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/154add5d-d1cd-4c93-ae63-3c7dfe4cb035-combined-ca-bundle\") pod \"placement-77d5d5f47b-7zdbg\" (UID: \"154add5d-d1cd-4c93-ae63-3c7dfe4cb035\") " pod="openstack/placement-77d5d5f47b-7zdbg"
Mar 09 14:26:13 crc kubenswrapper[4722]: I0309 14:26:13.920697 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/154add5d-d1cd-4c93-ae63-3c7dfe4cb035-scripts\") pod \"placement-77d5d5f47b-7zdbg\" (UID: \"154add5d-d1cd-4c93-ae63-3c7dfe4cb035\") " pod="openstack/placement-77d5d5f47b-7zdbg"
Mar 09 14:26:14 crc kubenswrapper[4722]: I0309 14:26:14.023421 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6xsh\" (UniqueName: \"kubernetes.io/projected/154add5d-d1cd-4c93-ae63-3c7dfe4cb035-kube-api-access-k6xsh\") pod \"placement-77d5d5f47b-7zdbg\" (UID: \"154add5d-d1cd-4c93-ae63-3c7dfe4cb035\") " pod="openstack/placement-77d5d5f47b-7zdbg"
Mar 09 14:26:14 crc kubenswrapper[4722]: I0309 14:26:14.023501 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/154add5d-d1cd-4c93-ae63-3c7dfe4cb035-internal-tls-certs\") pod \"placement-77d5d5f47b-7zdbg\" (UID: \"154add5d-d1cd-4c93-ae63-3c7dfe4cb035\") " pod="openstack/placement-77d5d5f47b-7zdbg"
Mar 09 14:26:14 crc kubenswrapper[4722]: I0309 14:26:14.023570 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/154add5d-d1cd-4c93-ae63-3c7dfe4cb035-config-data\") pod \"placement-77d5d5f47b-7zdbg\" (UID: \"154add5d-d1cd-4c93-ae63-3c7dfe4cb035\") " pod="openstack/placement-77d5d5f47b-7zdbg"
Mar 09 14:26:14 crc kubenswrapper[4722]: I0309 14:26:14.023656 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/154add5d-d1cd-4c93-ae63-3c7dfe4cb035-scripts\") pod \"placement-77d5d5f47b-7zdbg\" (UID: \"154add5d-d1cd-4c93-ae63-3c7dfe4cb035\") " pod="openstack/placement-77d5d5f47b-7zdbg"
Mar 09 14:26:14 crc kubenswrapper[4722]: I0309 14:26:14.023702 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/154add5d-d1cd-4c93-ae63-3c7dfe4cb035-combined-ca-bundle\") pod \"placement-77d5d5f47b-7zdbg\" (UID: \"154add5d-d1cd-4c93-ae63-3c7dfe4cb035\") " pod="openstack/placement-77d5d5f47b-7zdbg"
Mar 09 14:26:14 crc kubenswrapper[4722]: I0309 14:26:14.023913 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/154add5d-d1cd-4c93-ae63-3c7dfe4cb035-logs\") pod \"placement-77d5d5f47b-7zdbg\" (UID: \"154add5d-d1cd-4c93-ae63-3c7dfe4cb035\") " pod="openstack/placement-77d5d5f47b-7zdbg"
Mar 09 14:26:14 crc kubenswrapper[4722]: I0309 14:26:14.024024 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/154add5d-d1cd-4c93-ae63-3c7dfe4cb035-public-tls-certs\") pod \"placement-77d5d5f47b-7zdbg\" (UID: \"154add5d-d1cd-4c93-ae63-3c7dfe4cb035\") " pod="openstack/placement-77d5d5f47b-7zdbg"
Mar 09 14:26:14 crc kubenswrapper[4722]: I0309 14:26:14.025788 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/154add5d-d1cd-4c93-ae63-3c7dfe4cb035-logs\") pod \"placement-77d5d5f47b-7zdbg\" (UID: \"154add5d-d1cd-4c93-ae63-3c7dfe4cb035\") " pod="openstack/placement-77d5d5f47b-7zdbg"
Mar 09 14:26:14 crc kubenswrapper[4722]: I0309 14:26:14.029532 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/154add5d-d1cd-4c93-ae63-3c7dfe4cb035-internal-tls-certs\") pod \"placement-77d5d5f47b-7zdbg\" (UID: \"154add5d-d1cd-4c93-ae63-3c7dfe4cb035\") " pod="openstack/placement-77d5d5f47b-7zdbg"
Mar 09 14:26:14 crc kubenswrapper[4722]: I0309 14:26:14.031777 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/154add5d-d1cd-4c93-ae63-3c7dfe4cb035-scripts\") pod \"placement-77d5d5f47b-7zdbg\" (UID: \"154add5d-d1cd-4c93-ae63-3c7dfe4cb035\") " pod="openstack/placement-77d5d5f47b-7zdbg"
Mar 09 14:26:14 crc kubenswrapper[4722]: I0309 14:26:14.032763 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/154add5d-d1cd-4c93-ae63-3c7dfe4cb035-combined-ca-bundle\") pod \"placement-77d5d5f47b-7zdbg\" (UID: \"154add5d-d1cd-4c93-ae63-3c7dfe4cb035\") " pod="openstack/placement-77d5d5f47b-7zdbg"
Mar 09 14:26:14 crc kubenswrapper[4722]: I0309 14:26:14.033191 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/154add5d-d1cd-4c93-ae63-3c7dfe4cb035-config-data\") pod \"placement-77d5d5f47b-7zdbg\" (UID: \"154add5d-d1cd-4c93-ae63-3c7dfe4cb035\") " pod="openstack/placement-77d5d5f47b-7zdbg"
Mar 09 14:26:14 crc kubenswrapper[4722]: I0309 14:26:14.039647 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/154add5d-d1cd-4c93-ae63-3c7dfe4cb035-public-tls-certs\") pod \"placement-77d5d5f47b-7zdbg\" (UID: \"154add5d-d1cd-4c93-ae63-3c7dfe4cb035\") " pod="openstack/placement-77d5d5f47b-7zdbg"
Mar 09 14:26:14 crc kubenswrapper[4722]: I0309 14:26:14.040462 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6xsh\" (UniqueName: \"kubernetes.io/projected/154add5d-d1cd-4c93-ae63-3c7dfe4cb035-kube-api-access-k6xsh\") pod \"placement-77d5d5f47b-7zdbg\" (UID: \"154add5d-d1cd-4c93-ae63-3c7dfe4cb035\") " pod="openstack/placement-77d5d5f47b-7zdbg"
Mar 09 14:26:14 crc kubenswrapper[4722]: I0309 14:26:14.151019 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-77d5d5f47b-7zdbg"
Mar 09 14:26:15 crc kubenswrapper[4722]: I0309 14:26:15.697522 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-tlbnm" event={"ID":"b63a3af5-a347-40ed-b9bc-52ad70e7ff13","Type":"ContainerDied","Data":"0417d7acb31f6478d8c5a2810f3257e0cd333fa09a260f974fedc6ed106d91db"}
Mar 09 14:26:15 crc kubenswrapper[4722]: I0309 14:26:15.697818 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0417d7acb31f6478d8c5a2810f3257e0cd333fa09a260f974fedc6ed106d91db"
Mar 09 14:26:15 crc kubenswrapper[4722]: I0309 14:26:15.700836 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-fx7ks" event={"ID":"cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b","Type":"ContainerDied","Data":"c24a1a5f921771115b437afb566d6a43e306470f334ebb51dcb64ceecf3d7b6e"}
Mar 09 14:26:15 crc kubenswrapper[4722]: I0309 14:26:15.700897 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c24a1a5f921771115b437afb566d6a43e306470f334ebb51dcb64ceecf3d7b6e"
Mar 09 14:26:15 crc kubenswrapper[4722]: I0309 14:26:15.962128 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-fx7ks"
Mar 09 14:26:15 crc kubenswrapper[4722]: I0309 14:26:15.984025 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-tlbnm"
Mar 09 14:26:16 crc kubenswrapper[4722]: I0309 14:26:16.029035 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-gv4lz"
Mar 09 14:26:16 crc kubenswrapper[4722]: I0309 14:26:16.076002 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b-credential-keys\") pod \"cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b\" (UID: \"cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b\") "
Mar 09 14:26:16 crc kubenswrapper[4722]: I0309 14:26:16.076068 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7gmj\" (UniqueName: \"kubernetes.io/projected/cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b-kube-api-access-p7gmj\") pod \"cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b\" (UID: \"cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b\") "
Mar 09 14:26:16 crc kubenswrapper[4722]: I0309 14:26:16.076095 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/50946f11-a025-4a18-a7b3-3dafa15c3b2f-ovsdbserver-nb\") pod \"50946f11-a025-4a18-a7b3-3dafa15c3b2f\" (UID: \"50946f11-a025-4a18-a7b3-3dafa15c3b2f\") "
Mar 09 14:26:16 crc kubenswrapper[4722]: I0309 14:26:16.076154 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b-scripts\") pod \"cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b\" (UID: \"cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b\") "
Mar 09 14:26:16 crc kubenswrapper[4722]: I0309 14:26:16.076182 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b-combined-ca-bundle\") pod \"cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b\" (UID: \"cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b\") "
Mar 09 14:26:16 crc kubenswrapper[4722]: I0309 14:26:16.076253 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b-config-data\") pod \"cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b\" (UID: \"cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b\") "
Mar 09 14:26:16 crc kubenswrapper[4722]: I0309 14:26:16.076361 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b63a3af5-a347-40ed-b9bc-52ad70e7ff13-db-sync-config-data\") pod \"b63a3af5-a347-40ed-b9bc-52ad70e7ff13\" (UID: \"b63a3af5-a347-40ed-b9bc-52ad70e7ff13\") "
Mar 09 14:26:16 crc kubenswrapper[4722]: I0309 14:26:16.076381 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/50946f11-a025-4a18-a7b3-3dafa15c3b2f-dns-swift-storage-0\") pod \"50946f11-a025-4a18-a7b3-3dafa15c3b2f\" (UID: \"50946f11-a025-4a18-a7b3-3dafa15c3b2f\") "
Mar 09 14:26:16 crc kubenswrapper[4722]: I0309 14:26:16.076401 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tx74\" (UniqueName: \"kubernetes.io/projected/50946f11-a025-4a18-a7b3-3dafa15c3b2f-kube-api-access-7tx74\") pod \"50946f11-a025-4a18-a7b3-3dafa15c3b2f\" (UID: \"50946f11-a025-4a18-a7b3-3dafa15c3b2f\") "
Mar 09 14:26:16 crc kubenswrapper[4722]: I0309 14:26:16.076462 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/50946f11-a025-4a18-a7b3-3dafa15c3b2f-dns-svc\") pod \"50946f11-a025-4a18-a7b3-3dafa15c3b2f\" (UID: \"50946f11-a025-4a18-a7b3-3dafa15c3b2f\") "
Mar 09 14:26:16 crc kubenswrapper[4722]: I0309 14:26:16.076488 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b-fernet-keys\") pod \"cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b\" (UID: \"cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b\") "
Mar 09 14:26:16 crc kubenswrapper[4722]: I0309 14:26:16.076507 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b63a3af5-a347-40ed-b9bc-52ad70e7ff13-combined-ca-bundle\") pod \"b63a3af5-a347-40ed-b9bc-52ad70e7ff13\" (UID: \"b63a3af5-a347-40ed-b9bc-52ad70e7ff13\") "
Mar 09 14:26:16 crc kubenswrapper[4722]: I0309 14:26:16.076522 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gbn6m\" (UniqueName: \"kubernetes.io/projected/b63a3af5-a347-40ed-b9bc-52ad70e7ff13-kube-api-access-gbn6m\") pod \"b63a3af5-a347-40ed-b9bc-52ad70e7ff13\" (UID: \"b63a3af5-a347-40ed-b9bc-52ad70e7ff13\") "
Mar 09 14:26:16 crc kubenswrapper[4722]: I0309 14:26:16.076552 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/50946f11-a025-4a18-a7b3-3dafa15c3b2f-ovsdbserver-sb\") pod \"50946f11-a025-4a18-a7b3-3dafa15c3b2f\" (UID: \"50946f11-a025-4a18-a7b3-3dafa15c3b2f\") "
Mar 09 14:26:16 crc kubenswrapper[4722]: I0309 14:26:16.076579 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50946f11-a025-4a18-a7b3-3dafa15c3b2f-config\") pod \"50946f11-a025-4a18-a7b3-3dafa15c3b2f\" (UID: \"50946f11-a025-4a18-a7b3-3dafa15c3b2f\") "
Mar 09 14:26:16 crc kubenswrapper[4722]: I0309 14:26:16.082153 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b" (UID: "cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:26:16 crc kubenswrapper[4722]: I0309 14:26:16.091577 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b-kube-api-access-p7gmj" (OuterVolumeSpecName: "kube-api-access-p7gmj") pod "cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b" (UID: "cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b"). InnerVolumeSpecName "kube-api-access-p7gmj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:26:16 crc kubenswrapper[4722]: I0309 14:26:16.093382 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b63a3af5-a347-40ed-b9bc-52ad70e7ff13-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "b63a3af5-a347-40ed-b9bc-52ad70e7ff13" (UID: "b63a3af5-a347-40ed-b9bc-52ad70e7ff13"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:26:16 crc kubenswrapper[4722]: I0309 14:26:16.093428 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b-scripts" (OuterVolumeSpecName: "scripts") pod "cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b" (UID: "cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:26:16 crc kubenswrapper[4722]: I0309 14:26:16.097280 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b63a3af5-a347-40ed-b9bc-52ad70e7ff13-kube-api-access-gbn6m" (OuterVolumeSpecName: "kube-api-access-gbn6m") pod "b63a3af5-a347-40ed-b9bc-52ad70e7ff13" (UID: "b63a3af5-a347-40ed-b9bc-52ad70e7ff13"). InnerVolumeSpecName "kube-api-access-gbn6m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:26:16 crc kubenswrapper[4722]: I0309 14:26:16.099486 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50946f11-a025-4a18-a7b3-3dafa15c3b2f-kube-api-access-7tx74" (OuterVolumeSpecName: "kube-api-access-7tx74") pod "50946f11-a025-4a18-a7b3-3dafa15c3b2f" (UID: "50946f11-a025-4a18-a7b3-3dafa15c3b2f"). InnerVolumeSpecName "kube-api-access-7tx74". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:26:16 crc kubenswrapper[4722]: I0309 14:26:16.099641 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b" (UID: "cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:26:16 crc kubenswrapper[4722]: I0309 14:26:16.126417 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b63a3af5-a347-40ed-b9bc-52ad70e7ff13-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b63a3af5-a347-40ed-b9bc-52ad70e7ff13" (UID: "b63a3af5-a347-40ed-b9bc-52ad70e7ff13"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:26:16 crc kubenswrapper[4722]: I0309 14:26:16.167222 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b" (UID: "cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:26:16 crc kubenswrapper[4722]: I0309 14:26:16.172041 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50946f11-a025-4a18-a7b3-3dafa15c3b2f-config" (OuterVolumeSpecName: "config") pod "50946f11-a025-4a18-a7b3-3dafa15c3b2f" (UID: "50946f11-a025-4a18-a7b3-3dafa15c3b2f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 14:26:16 crc kubenswrapper[4722]: I0309 14:26:16.179482 4722 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/b63a3af5-a347-40ed-b9bc-52ad70e7ff13-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Mar 09 14:26:16 crc kubenswrapper[4722]: I0309 14:26:16.179520 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7tx74\" (UniqueName: \"kubernetes.io/projected/50946f11-a025-4a18-a7b3-3dafa15c3b2f-kube-api-access-7tx74\") on node \"crc\" DevicePath \"\""
Mar 09 14:26:16 crc kubenswrapper[4722]: I0309 14:26:16.179537 4722 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b-fernet-keys\") on node \"crc\" DevicePath \"\""
Mar 09 14:26:16 crc kubenswrapper[4722]: I0309 14:26:16.179550 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b63a3af5-a347-40ed-b9bc-52ad70e7ff13-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 14:26:16 crc kubenswrapper[4722]: I0309 14:26:16.179563 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gbn6m\" (UniqueName: \"kubernetes.io/projected/b63a3af5-a347-40ed-b9bc-52ad70e7ff13-kube-api-access-gbn6m\") on node \"crc\" DevicePath \"\""
Mar 09 14:26:16 crc kubenswrapper[4722]: I0309 14:26:16.179575 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50946f11-a025-4a18-a7b3-3dafa15c3b2f-config\") on node \"crc\" DevicePath \"\""
Mar 09 14:26:16 crc kubenswrapper[4722]: I0309 14:26:16.179586 4722 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b-credential-keys\") on node \"crc\" DevicePath \"\""
Mar 09 14:26:16 crc kubenswrapper[4722]: I0309 14:26:16.179599 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7gmj\" (UniqueName: \"kubernetes.io/projected/cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b-kube-api-access-p7gmj\") on node \"crc\" DevicePath \"\""
Mar 09 14:26:16 crc kubenswrapper[4722]: I0309 14:26:16.179609 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 14:26:16 crc kubenswrapper[4722]: I0309 14:26:16.179618 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 14:26:16 crc kubenswrapper[4722]: I0309 14:26:16.182358 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b-config-data" (OuterVolumeSpecName: "config-data") pod "cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b" (UID: "cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:26:16 crc kubenswrapper[4722]: I0309 14:26:16.190667 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50946f11-a025-4a18-a7b3-3dafa15c3b2f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "50946f11-a025-4a18-a7b3-3dafa15c3b2f" (UID: "50946f11-a025-4a18-a7b3-3dafa15c3b2f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 14:26:16 crc kubenswrapper[4722]: I0309 14:26:16.197723 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50946f11-a025-4a18-a7b3-3dafa15c3b2f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "50946f11-a025-4a18-a7b3-3dafa15c3b2f" (UID: "50946f11-a025-4a18-a7b3-3dafa15c3b2f"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 14:26:16 crc kubenswrapper[4722]: I0309 14:26:16.198565 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50946f11-a025-4a18-a7b3-3dafa15c3b2f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "50946f11-a025-4a18-a7b3-3dafa15c3b2f" (UID: "50946f11-a025-4a18-a7b3-3dafa15c3b2f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 14:26:16 crc kubenswrapper[4722]: I0309 14:26:16.198927 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50946f11-a025-4a18-a7b3-3dafa15c3b2f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "50946f11-a025-4a18-a7b3-3dafa15c3b2f" (UID: "50946f11-a025-4a18-a7b3-3dafa15c3b2f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 14:26:16 crc kubenswrapper[4722]: I0309 14:26:16.257736 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-77d5d5f47b-7zdbg"]
Mar 09 14:26:16 crc kubenswrapper[4722]: W0309 14:26:16.260471 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod154add5d_d1cd_4c93_ae63_3c7dfe4cb035.slice/crio-9fdf03ad32af0ad3f5e4a055e548127156923b886acc6d05d2b525eb0aadf456 WatchSource:0}: Error finding container 9fdf03ad32af0ad3f5e4a055e548127156923b886acc6d05d2b525eb0aadf456: Status 404 returned error can't find the container with id 9fdf03ad32af0ad3f5e4a055e548127156923b886acc6d05d2b525eb0aadf456
Mar 09 14:26:16 crc kubenswrapper[4722]: I0309 14:26:16.281956 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/50946f11-a025-4a18-a7b3-3dafa15c3b2f-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 09 14:26:16 crc kubenswrapper[4722]: I0309 14:26:16.282162 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/50946f11-a025-4a18-a7b3-3dafa15c3b2f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 09 14:26:16 crc kubenswrapper[4722]: I0309 14:26:16.282306 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/50946f11-a025-4a18-a7b3-3dafa15c3b2f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 09 14:26:16 crc kubenswrapper[4722]: I0309 14:26:16.282404 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b-config-data\") on node \"crc\" DevicePath \"\""
Mar 09 14:26:16 crc kubenswrapper[4722]: I0309 14:26:16.282485 4722 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/50946f11-a025-4a18-a7b3-3dafa15c3b2f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
pod="openstack/ceilometer-0" event={"ID":"34700700-cbd0-4a2e-b791-25b63e3de5b8","Type":"ContainerStarted","Data":"cd01cfbd6236ef8ba1ab9a77a89a9b477fdb53cf600452555b20b52ab693210f"} Mar 09 14:26:16 crc kubenswrapper[4722]: I0309 14:26:16.719128 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-gv4lz" event={"ID":"50946f11-a025-4a18-a7b3-3dafa15c3b2f","Type":"ContainerDied","Data":"fbecf7eaf5b6bff6668a614978bf1bf63ae89f079f6543d4dad2e46116f01992"} Mar 09 14:26:16 crc kubenswrapper[4722]: I0309 14:26:16.719220 4722 scope.go:117] "RemoveContainer" containerID="8462432b5cf6bd80898d429fde3ac72bf95277a5349b45032409711c6330e695" Mar 09 14:26:16 crc kubenswrapper[4722]: I0309 14:26:16.719234 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-gv4lz" Mar 09 14:26:16 crc kubenswrapper[4722]: I0309 14:26:16.723650 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-j4lgt" event={"ID":"7f51218c-6b15-4f4a-ad49-1ba0ccd5e292","Type":"ContainerStarted","Data":"7774a1b7beb6709cc7100d6b0e05365cd9498f94cb28bd5282e6c7b3a858a60d"} Mar 09 14:26:16 crc kubenswrapper[4722]: I0309 14:26:16.744946 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-fx7ks" Mar 09 14:26:16 crc kubenswrapper[4722]: I0309 14:26:16.745236 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-77d5d5f47b-7zdbg" event={"ID":"154add5d-d1cd-4c93-ae63-3c7dfe4cb035","Type":"ContainerStarted","Data":"3c9158dc8fcb0bfa1ebb4b61def0600239b2bd11f0845392eaa82372894666db"} Mar 09 14:26:16 crc kubenswrapper[4722]: I0309 14:26:16.745456 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-77d5d5f47b-7zdbg" event={"ID":"154add5d-d1cd-4c93-ae63-3c7dfe4cb035","Type":"ContainerStarted","Data":"9fdf03ad32af0ad3f5e4a055e548127156923b886acc6d05d2b525eb0aadf456"} Mar 09 14:26:16 crc kubenswrapper[4722]: I0309 14:26:16.745421 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-tlbnm" Mar 09 14:26:16 crc kubenswrapper[4722]: I0309 14:26:16.753475 4722 scope.go:117] "RemoveContainer" containerID="b63bb0bbcc06518f6bdcc16a44a7db665233979d61911135e7b9af0d5c85aa53" Mar 09 14:26:16 crc kubenswrapper[4722]: I0309 14:26:16.769667 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-j4lgt" podStartSLOduration=3.020477685 podStartE2EDuration="44.769640974s" podCreationTimestamp="2026-03-09 14:25:32 +0000 UTC" firstStartedPulling="2026-03-09 14:25:33.906611205 +0000 UTC m=+1374.462179781" lastFinishedPulling="2026-03-09 14:26:15.655774494 +0000 UTC m=+1416.211343070" observedRunningTime="2026-03-09 14:26:16.743089369 +0000 UTC m=+1417.298657945" watchObservedRunningTime="2026-03-09 14:26:16.769640974 +0000 UTC m=+1417.325209550" Mar 09 14:26:16 crc kubenswrapper[4722]: I0309 14:26:16.817967 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-gv4lz"] Mar 09 14:26:16 crc kubenswrapper[4722]: I0309 14:26:16.829516 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-gv4lz"] Mar 09 14:26:16 crc kubenswrapper[4722]: I0309 14:26:16.911497 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 09 14:26:16 crc kubenswrapper[4722]: I0309 14:26:16.912943 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 09 14:26:16 crc kubenswrapper[4722]: I0309 14:26:16.913505 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 09 14:26:16 crc kubenswrapper[4722]: I0309 14:26:16.913563 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 09 14:26:16 crc kubenswrapper[4722]: I0309 14:26:16.963425 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 09 14:26:16 crc kubenswrapper[4722]: I0309 14:26:16.967045 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.094802 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-8495c95fc9-42qqz"] Mar 09 14:26:17 crc kubenswrapper[4722]: E0309 14:26:17.095674 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b" containerName="keystone-bootstrap" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.095697 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b" containerName="keystone-bootstrap" Mar 09 14:26:17 crc kubenswrapper[4722]: E0309 14:26:17.095717 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50946f11-a025-4a18-a7b3-3dafa15c3b2f" containerName="init" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.095725 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="50946f11-a025-4a18-a7b3-3dafa15c3b2f" containerName="init" Mar 09 14:26:17 crc kubenswrapper[4722]: E0309 14:26:17.095733 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50946f11-a025-4a18-a7b3-3dafa15c3b2f" containerName="dnsmasq-dns" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.095739 4722 state_mem.go:107] "Deleted CPUSet assignment" 
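[Annotation, not part of the original log] The cpu_manager/memory_manager "RemoveStaleState" and state_mem "Deleted CPUSet assignment" entries above fire while the new keystone pod is being admitted: per-container resource assignments belonging to pods that no longer exist (keystone-bootstrap, the old dnsmasq-dns, barbican-db-sync) are purged first. A minimal Go sketch of that cleanup pattern, with hypothetical names and a simplified state shape:

    // removeStaleState (hypothetical) drops per-container assignments for
    // pods that are no longer active, one log-worthy deletion per container.
    func removeStaleState(assignments map[string]map[string]string, activePods map[string]bool) {
    	for podUID, containers := range assignments {
    		if activePods[podUID] {
    			continue
    		}
    		for containerName := range containers {
    			// corresponds to one "RemoveStaleState: removing container" /
    			// "Deleted CPUSet assignment" pair per container above
    			delete(containers, containerName)
    		}
    		delete(assignments, podUID)
    	}
    }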
podUID="50946f11-a025-4a18-a7b3-3dafa15c3b2f" containerName="dnsmasq-dns" Mar 09 14:26:17 crc kubenswrapper[4722]: E0309 14:26:17.095759 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b63a3af5-a347-40ed-b9bc-52ad70e7ff13" containerName="barbican-db-sync" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.095766 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="b63a3af5-a347-40ed-b9bc-52ad70e7ff13" containerName="barbican-db-sync" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.095985 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b" containerName="keystone-bootstrap" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.095998 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="50946f11-a025-4a18-a7b3-3dafa15c3b2f" containerName="dnsmasq-dns" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.096016 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="b63a3af5-a347-40ed-b9bc-52ad70e7ff13" containerName="barbican-db-sync" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.096825 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-8495c95fc9-42qqz" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.103356 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.103800 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.104215 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.104264 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-jfd9g" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.104300 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.104366 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.108145 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-8495c95fc9-42qqz"] Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.202508 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9k9x\" (UniqueName: \"kubernetes.io/projected/1343ca6f-93ab-45e7-8887-261b10bb1e88-kube-api-access-t9k9x\") pod \"keystone-8495c95fc9-42qqz\" (UID: \"1343ca6f-93ab-45e7-8887-261b10bb1e88\") " pod="openstack/keystone-8495c95fc9-42qqz" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.202580 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1343ca6f-93ab-45e7-8887-261b10bb1e88-public-tls-certs\") pod \"keystone-8495c95fc9-42qqz\" (UID: \"1343ca6f-93ab-45e7-8887-261b10bb1e88\") " pod="openstack/keystone-8495c95fc9-42qqz" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.202604 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1343ca6f-93ab-45e7-8887-261b10bb1e88-config-data\") pod \"keystone-8495c95fc9-42qqz\" (UID: 
\"1343ca6f-93ab-45e7-8887-261b10bb1e88\") " pod="openstack/keystone-8495c95fc9-42qqz" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.202625 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1343ca6f-93ab-45e7-8887-261b10bb1e88-scripts\") pod \"keystone-8495c95fc9-42qqz\" (UID: \"1343ca6f-93ab-45e7-8887-261b10bb1e88\") " pod="openstack/keystone-8495c95fc9-42qqz" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.202667 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1343ca6f-93ab-45e7-8887-261b10bb1e88-fernet-keys\") pod \"keystone-8495c95fc9-42qqz\" (UID: \"1343ca6f-93ab-45e7-8887-261b10bb1e88\") " pod="openstack/keystone-8495c95fc9-42qqz" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.202750 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1343ca6f-93ab-45e7-8887-261b10bb1e88-internal-tls-certs\") pod \"keystone-8495c95fc9-42qqz\" (UID: \"1343ca6f-93ab-45e7-8887-261b10bb1e88\") " pod="openstack/keystone-8495c95fc9-42qqz" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.202775 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1343ca6f-93ab-45e7-8887-261b10bb1e88-combined-ca-bundle\") pod \"keystone-8495c95fc9-42qqz\" (UID: \"1343ca6f-93ab-45e7-8887-261b10bb1e88\") " pod="openstack/keystone-8495c95fc9-42qqz" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.202835 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1343ca6f-93ab-45e7-8887-261b10bb1e88-credential-keys\") pod \"keystone-8495c95fc9-42qqz\" (UID: \"1343ca6f-93ab-45e7-8887-261b10bb1e88\") " pod="openstack/keystone-8495c95fc9-42qqz" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.271234 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-57655d7975-hskdb"] Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.284409 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-57655d7975-hskdb" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.299820 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.300131 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.300311 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-r5rcp" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.304689 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-57655d7975-hskdb"] Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.305136 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1343ca6f-93ab-45e7-8887-261b10bb1e88-public-tls-certs\") pod \"keystone-8495c95fc9-42qqz\" (UID: \"1343ca6f-93ab-45e7-8887-261b10bb1e88\") " pod="openstack/keystone-8495c95fc9-42qqz" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.305194 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1343ca6f-93ab-45e7-8887-261b10bb1e88-config-data\") pod \"keystone-8495c95fc9-42qqz\" (UID: \"1343ca6f-93ab-45e7-8887-261b10bb1e88\") " pod="openstack/keystone-8495c95fc9-42qqz" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.305246 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1343ca6f-93ab-45e7-8887-261b10bb1e88-scripts\") pod \"keystone-8495c95fc9-42qqz\" (UID: \"1343ca6f-93ab-45e7-8887-261b10bb1e88\") " pod="openstack/keystone-8495c95fc9-42qqz" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.305290 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1343ca6f-93ab-45e7-8887-261b10bb1e88-fernet-keys\") pod \"keystone-8495c95fc9-42qqz\" (UID: \"1343ca6f-93ab-45e7-8887-261b10bb1e88\") " pod="openstack/keystone-8495c95fc9-42qqz" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.305362 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1343ca6f-93ab-45e7-8887-261b10bb1e88-internal-tls-certs\") pod \"keystone-8495c95fc9-42qqz\" (UID: \"1343ca6f-93ab-45e7-8887-261b10bb1e88\") " pod="openstack/keystone-8495c95fc9-42qqz" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.305392 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1343ca6f-93ab-45e7-8887-261b10bb1e88-combined-ca-bundle\") pod \"keystone-8495c95fc9-42qqz\" (UID: \"1343ca6f-93ab-45e7-8887-261b10bb1e88\") " pod="openstack/keystone-8495c95fc9-42qqz" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.305456 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1343ca6f-93ab-45e7-8887-261b10bb1e88-credential-keys\") pod \"keystone-8495c95fc9-42qqz\" (UID: \"1343ca6f-93ab-45e7-8887-261b10bb1e88\") " pod="openstack/keystone-8495c95fc9-42qqz" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.305517 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-t9k9x\" (UniqueName: \"kubernetes.io/projected/1343ca6f-93ab-45e7-8887-261b10bb1e88-kube-api-access-t9k9x\") pod \"keystone-8495c95fc9-42qqz\" (UID: \"1343ca6f-93ab-45e7-8887-261b10bb1e88\") " pod="openstack/keystone-8495c95fc9-42qqz" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.333454 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1343ca6f-93ab-45e7-8887-261b10bb1e88-combined-ca-bundle\") pod \"keystone-8495c95fc9-42qqz\" (UID: \"1343ca6f-93ab-45e7-8887-261b10bb1e88\") " pod="openstack/keystone-8495c95fc9-42qqz" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.334821 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1343ca6f-93ab-45e7-8887-261b10bb1e88-credential-keys\") pod \"keystone-8495c95fc9-42qqz\" (UID: \"1343ca6f-93ab-45e7-8887-261b10bb1e88\") " pod="openstack/keystone-8495c95fc9-42qqz" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.343770 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1343ca6f-93ab-45e7-8887-261b10bb1e88-public-tls-certs\") pod \"keystone-8495c95fc9-42qqz\" (UID: \"1343ca6f-93ab-45e7-8887-261b10bb1e88\") " pod="openstack/keystone-8495c95fc9-42qqz" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.344284 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1343ca6f-93ab-45e7-8887-261b10bb1e88-config-data\") pod \"keystone-8495c95fc9-42qqz\" (UID: \"1343ca6f-93ab-45e7-8887-261b10bb1e88\") " pod="openstack/keystone-8495c95fc9-42qqz" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.344294 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1343ca6f-93ab-45e7-8887-261b10bb1e88-scripts\") pod \"keystone-8495c95fc9-42qqz\" (UID: \"1343ca6f-93ab-45e7-8887-261b10bb1e88\") " pod="openstack/keystone-8495c95fc9-42qqz" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.350877 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1343ca6f-93ab-45e7-8887-261b10bb1e88-internal-tls-certs\") pod \"keystone-8495c95fc9-42qqz\" (UID: \"1343ca6f-93ab-45e7-8887-261b10bb1e88\") " pod="openstack/keystone-8495c95fc9-42qqz" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.351532 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1343ca6f-93ab-45e7-8887-261b10bb1e88-fernet-keys\") pod \"keystone-8495c95fc9-42qqz\" (UID: \"1343ca6f-93ab-45e7-8887-261b10bb1e88\") " pod="openstack/keystone-8495c95fc9-42qqz" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.367921 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9k9x\" (UniqueName: \"kubernetes.io/projected/1343ca6f-93ab-45e7-8887-261b10bb1e88-kube-api-access-t9k9x\") pod \"keystone-8495c95fc9-42qqz\" (UID: \"1343ca6f-93ab-45e7-8887-261b10bb1e88\") " pod="openstack/keystone-8495c95fc9-42qqz" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.372279 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-858d6f6fd6-wqzdg"] Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.374112 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-858d6f6fd6-wqzdg" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.403140 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.407464 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3b18ae1-c2c9-4454-a591-53ce06064d82-logs\") pod \"barbican-worker-57655d7975-hskdb\" (UID: \"c3b18ae1-c2c9-4454-a591-53ce06064d82\") " pod="openstack/barbican-worker-57655d7975-hskdb" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.407554 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3b18ae1-c2c9-4454-a591-53ce06064d82-combined-ca-bundle\") pod \"barbican-worker-57655d7975-hskdb\" (UID: \"c3b18ae1-c2c9-4454-a591-53ce06064d82\") " pod="openstack/barbican-worker-57655d7975-hskdb" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.407617 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c3b18ae1-c2c9-4454-a591-53ce06064d82-config-data-custom\") pod \"barbican-worker-57655d7975-hskdb\" (UID: \"c3b18ae1-c2c9-4454-a591-53ce06064d82\") " pod="openstack/barbican-worker-57655d7975-hskdb" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.407642 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3b18ae1-c2c9-4454-a591-53ce06064d82-config-data\") pod \"barbican-worker-57655d7975-hskdb\" (UID: \"c3b18ae1-c2c9-4454-a591-53ce06064d82\") " pod="openstack/barbican-worker-57655d7975-hskdb" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.407664 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtm5c\" (UniqueName: \"kubernetes.io/projected/c3b18ae1-c2c9-4454-a591-53ce06064d82-kube-api-access-mtm5c\") pod \"barbican-worker-57655d7975-hskdb\" (UID: \"c3b18ae1-c2c9-4454-a591-53ce06064d82\") " pod="openstack/barbican-worker-57655d7975-hskdb" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.408289 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-858d6f6fd6-wqzdg"] Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.425867 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-k92sh"] Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.426672 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-8495c95fc9-42qqz" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.428598 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-k92sh" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.439995 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-k92sh"] Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.512329 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-cb7fc4c44-jrv9s"] Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.514109 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-cb7fc4c44-jrv9s" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.514565 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cc74ab6-6c85-4da3-8a79-3af240adb999-combined-ca-bundle\") pod \"barbican-keystone-listener-858d6f6fd6-wqzdg\" (UID: \"1cc74ab6-6c85-4da3-8a79-3af240adb999\") " pod="openstack/barbican-keystone-listener-858d6f6fd6-wqzdg" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.514850 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7dhc\" (UniqueName: \"kubernetes.io/projected/12245a01-16fe-424e-9fbb-59d906d90152-kube-api-access-n7dhc\") pod \"dnsmasq-dns-85ff748b95-k92sh\" (UID: \"12245a01-16fe-424e-9fbb-59d906d90152\") " pod="openstack/dnsmasq-dns-85ff748b95-k92sh" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.514872 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/12245a01-16fe-424e-9fbb-59d906d90152-dns-svc\") pod \"dnsmasq-dns-85ff748b95-k92sh\" (UID: \"12245a01-16fe-424e-9fbb-59d906d90152\") " pod="openstack/dnsmasq-dns-85ff748b95-k92sh" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.514891 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxr5p\" (UniqueName: \"kubernetes.io/projected/1cc74ab6-6c85-4da3-8a79-3af240adb999-kube-api-access-zxr5p\") pod \"barbican-keystone-listener-858d6f6fd6-wqzdg\" (UID: \"1cc74ab6-6c85-4da3-8a79-3af240adb999\") " pod="openstack/barbican-keystone-listener-858d6f6fd6-wqzdg" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.514925 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/12245a01-16fe-424e-9fbb-59d906d90152-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-k92sh\" (UID: \"12245a01-16fe-424e-9fbb-59d906d90152\") " pod="openstack/dnsmasq-dns-85ff748b95-k92sh" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.514952 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/12245a01-16fe-424e-9fbb-59d906d90152-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-k92sh\" (UID: \"12245a01-16fe-424e-9fbb-59d906d90152\") " pod="openstack/dnsmasq-dns-85ff748b95-k92sh" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.514967 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1cc74ab6-6c85-4da3-8a79-3af240adb999-logs\") pod \"barbican-keystone-listener-858d6f6fd6-wqzdg\" (UID: \"1cc74ab6-6c85-4da3-8a79-3af240adb999\") " pod="openstack/barbican-keystone-listener-858d6f6fd6-wqzdg" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.514989 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3b18ae1-c2c9-4454-a591-53ce06064d82-logs\") pod \"barbican-worker-57655d7975-hskdb\" (UID: \"c3b18ae1-c2c9-4454-a591-53ce06064d82\") " pod="openstack/barbican-worker-57655d7975-hskdb" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.515008 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12245a01-16fe-424e-9fbb-59d906d90152-config\") pod \"dnsmasq-dns-85ff748b95-k92sh\" (UID: \"12245a01-16fe-424e-9fbb-59d906d90152\") " pod="openstack/dnsmasq-dns-85ff748b95-k92sh" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.515068 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3b18ae1-c2c9-4454-a591-53ce06064d82-combined-ca-bundle\") pod \"barbican-worker-57655d7975-hskdb\" (UID: \"c3b18ae1-c2c9-4454-a591-53ce06064d82\") " pod="openstack/barbican-worker-57655d7975-hskdb" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.515086 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/12245a01-16fe-424e-9fbb-59d906d90152-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-k92sh\" (UID: \"12245a01-16fe-424e-9fbb-59d906d90152\") " pod="openstack/dnsmasq-dns-85ff748b95-k92sh" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.515155 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cc74ab6-6c85-4da3-8a79-3af240adb999-config-data\") pod \"barbican-keystone-listener-858d6f6fd6-wqzdg\" (UID: \"1cc74ab6-6c85-4da3-8a79-3af240adb999\") " pod="openstack/barbican-keystone-listener-858d6f6fd6-wqzdg" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.515175 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c3b18ae1-c2c9-4454-a591-53ce06064d82-config-data-custom\") pod \"barbican-worker-57655d7975-hskdb\" (UID: \"c3b18ae1-c2c9-4454-a591-53ce06064d82\") " pod="openstack/barbican-worker-57655d7975-hskdb" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.515218 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3b18ae1-c2c9-4454-a591-53ce06064d82-config-data\") pod \"barbican-worker-57655d7975-hskdb\" (UID: \"c3b18ae1-c2c9-4454-a591-53ce06064d82\") " pod="openstack/barbican-worker-57655d7975-hskdb" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.515244 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtm5c\" (UniqueName: \"kubernetes.io/projected/c3b18ae1-c2c9-4454-a591-53ce06064d82-kube-api-access-mtm5c\") pod \"barbican-worker-57655d7975-hskdb\" (UID: \"c3b18ae1-c2c9-4454-a591-53ce06064d82\") " pod="openstack/barbican-worker-57655d7975-hskdb" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.515273 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1cc74ab6-6c85-4da3-8a79-3af240adb999-config-data-custom\") pod \"barbican-keystone-listener-858d6f6fd6-wqzdg\" (UID: \"1cc74ab6-6c85-4da3-8a79-3af240adb999\") " pod="openstack/barbican-keystone-listener-858d6f6fd6-wqzdg" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.515770 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3b18ae1-c2c9-4454-a591-53ce06064d82-logs\") pod \"barbican-worker-57655d7975-hskdb\" (UID: \"c3b18ae1-c2c9-4454-a591-53ce06064d82\") " pod="openstack/barbican-worker-57655d7975-hskdb" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.522938 
4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3b18ae1-c2c9-4454-a591-53ce06064d82-combined-ca-bundle\") pod \"barbican-worker-57655d7975-hskdb\" (UID: \"c3b18ae1-c2c9-4454-a591-53ce06064d82\") " pod="openstack/barbican-worker-57655d7975-hskdb" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.523669 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.525282 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3b18ae1-c2c9-4454-a591-53ce06064d82-config-data\") pod \"barbican-worker-57655d7975-hskdb\" (UID: \"c3b18ae1-c2c9-4454-a591-53ce06064d82\") " pod="openstack/barbican-worker-57655d7975-hskdb" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.526120 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c3b18ae1-c2c9-4454-a591-53ce06064d82-config-data-custom\") pod \"barbican-worker-57655d7975-hskdb\" (UID: \"c3b18ae1-c2c9-4454-a591-53ce06064d82\") " pod="openstack/barbican-worker-57655d7975-hskdb" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.539824 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-cb7fc4c44-jrv9s"] Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.549783 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtm5c\" (UniqueName: \"kubernetes.io/projected/c3b18ae1-c2c9-4454-a591-53ce06064d82-kube-api-access-mtm5c\") pod \"barbican-worker-57655d7975-hskdb\" (UID: \"c3b18ae1-c2c9-4454-a591-53ce06064d82\") " pod="openstack/barbican-worker-57655d7975-hskdb" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.621704 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-57655d7975-hskdb" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.623351 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8066fa12-e1a3-483a-8c94-e7a59cc2f1d9-logs\") pod \"barbican-api-cb7fc4c44-jrv9s\" (UID: \"8066fa12-e1a3-483a-8c94-e7a59cc2f1d9\") " pod="openstack/barbican-api-cb7fc4c44-jrv9s" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.623461 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1cc74ab6-6c85-4da3-8a79-3af240adb999-config-data-custom\") pod \"barbican-keystone-listener-858d6f6fd6-wqzdg\" (UID: \"1cc74ab6-6c85-4da3-8a79-3af240adb999\") " pod="openstack/barbican-keystone-listener-858d6f6fd6-wqzdg" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.623537 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cc74ab6-6c85-4da3-8a79-3af240adb999-combined-ca-bundle\") pod \"barbican-keystone-listener-858d6f6fd6-wqzdg\" (UID: \"1cc74ab6-6c85-4da3-8a79-3af240adb999\") " pod="openstack/barbican-keystone-listener-858d6f6fd6-wqzdg" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.623611 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7dhc\" (UniqueName: \"kubernetes.io/projected/12245a01-16fe-424e-9fbb-59d906d90152-kube-api-access-n7dhc\") pod \"dnsmasq-dns-85ff748b95-k92sh\" (UID: \"12245a01-16fe-424e-9fbb-59d906d90152\") " pod="openstack/dnsmasq-dns-85ff748b95-k92sh" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.623635 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/12245a01-16fe-424e-9fbb-59d906d90152-dns-svc\") pod \"dnsmasq-dns-85ff748b95-k92sh\" (UID: \"12245a01-16fe-424e-9fbb-59d906d90152\") " pod="openstack/dnsmasq-dns-85ff748b95-k92sh" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.623654 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxr5p\" (UniqueName: \"kubernetes.io/projected/1cc74ab6-6c85-4da3-8a79-3af240adb999-kube-api-access-zxr5p\") pod \"barbican-keystone-listener-858d6f6fd6-wqzdg\" (UID: \"1cc74ab6-6c85-4da3-8a79-3af240adb999\") " pod="openstack/barbican-keystone-listener-858d6f6fd6-wqzdg" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.623699 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8066fa12-e1a3-483a-8c94-e7a59cc2f1d9-config-data\") pod \"barbican-api-cb7fc4c44-jrv9s\" (UID: \"8066fa12-e1a3-483a-8c94-e7a59cc2f1d9\") " pod="openstack/barbican-api-cb7fc4c44-jrv9s" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.623782 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/12245a01-16fe-424e-9fbb-59d906d90152-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-k92sh\" (UID: \"12245a01-16fe-424e-9fbb-59d906d90152\") " pod="openstack/dnsmasq-dns-85ff748b95-k92sh" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.623827 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/8066fa12-e1a3-483a-8c94-e7a59cc2f1d9-config-data-custom\") pod \"barbican-api-cb7fc4c44-jrv9s\" (UID: \"8066fa12-e1a3-483a-8c94-e7a59cc2f1d9\") " pod="openstack/barbican-api-cb7fc4c44-jrv9s" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.623875 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/12245a01-16fe-424e-9fbb-59d906d90152-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-k92sh\" (UID: \"12245a01-16fe-424e-9fbb-59d906d90152\") " pod="openstack/dnsmasq-dns-85ff748b95-k92sh" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.623896 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1cc74ab6-6c85-4da3-8a79-3af240adb999-logs\") pod \"barbican-keystone-listener-858d6f6fd6-wqzdg\" (UID: \"1cc74ab6-6c85-4da3-8a79-3af240adb999\") " pod="openstack/barbican-keystone-listener-858d6f6fd6-wqzdg" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.623950 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12245a01-16fe-424e-9fbb-59d906d90152-config\") pod \"dnsmasq-dns-85ff748b95-k92sh\" (UID: \"12245a01-16fe-424e-9fbb-59d906d90152\") " pod="openstack/dnsmasq-dns-85ff748b95-k92sh" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.624026 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8066fa12-e1a3-483a-8c94-e7a59cc2f1d9-combined-ca-bundle\") pod \"barbican-api-cb7fc4c44-jrv9s\" (UID: \"8066fa12-e1a3-483a-8c94-e7a59cc2f1d9\") " pod="openstack/barbican-api-cb7fc4c44-jrv9s" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.624124 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/12245a01-16fe-424e-9fbb-59d906d90152-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-k92sh\" (UID: \"12245a01-16fe-424e-9fbb-59d906d90152\") " pod="openstack/dnsmasq-dns-85ff748b95-k92sh" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.624261 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-959f6\" (UniqueName: \"kubernetes.io/projected/8066fa12-e1a3-483a-8c94-e7a59cc2f1d9-kube-api-access-959f6\") pod \"barbican-api-cb7fc4c44-jrv9s\" (UID: \"8066fa12-e1a3-483a-8c94-e7a59cc2f1d9\") " pod="openstack/barbican-api-cb7fc4c44-jrv9s" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.624339 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cc74ab6-6c85-4da3-8a79-3af240adb999-config-data\") pod \"barbican-keystone-listener-858d6f6fd6-wqzdg\" (UID: \"1cc74ab6-6c85-4da3-8a79-3af240adb999\") " pod="openstack/barbican-keystone-listener-858d6f6fd6-wqzdg" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.624775 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/12245a01-16fe-424e-9fbb-59d906d90152-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-k92sh\" (UID: \"12245a01-16fe-424e-9fbb-59d906d90152\") " pod="openstack/dnsmasq-dns-85ff748b95-k92sh" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.625063 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/1cc74ab6-6c85-4da3-8a79-3af240adb999-logs\") pod \"barbican-keystone-listener-858d6f6fd6-wqzdg\" (UID: \"1cc74ab6-6c85-4da3-8a79-3af240adb999\") " pod="openstack/barbican-keystone-listener-858d6f6fd6-wqzdg" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.637685 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1cc74ab6-6c85-4da3-8a79-3af240adb999-config-data-custom\") pod \"barbican-keystone-listener-858d6f6fd6-wqzdg\" (UID: \"1cc74ab6-6c85-4da3-8a79-3af240adb999\") " pod="openstack/barbican-keystone-listener-858d6f6fd6-wqzdg" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.648088 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/12245a01-16fe-424e-9fbb-59d906d90152-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-k92sh\" (UID: \"12245a01-16fe-424e-9fbb-59d906d90152\") " pod="openstack/dnsmasq-dns-85ff748b95-k92sh" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.651313 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxr5p\" (UniqueName: \"kubernetes.io/projected/1cc74ab6-6c85-4da3-8a79-3af240adb999-kube-api-access-zxr5p\") pod \"barbican-keystone-listener-858d6f6fd6-wqzdg\" (UID: \"1cc74ab6-6c85-4da3-8a79-3af240adb999\") " pod="openstack/barbican-keystone-listener-858d6f6fd6-wqzdg" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.652408 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cc74ab6-6c85-4da3-8a79-3af240adb999-combined-ca-bundle\") pod \"barbican-keystone-listener-858d6f6fd6-wqzdg\" (UID: \"1cc74ab6-6c85-4da3-8a79-3af240adb999\") " pod="openstack/barbican-keystone-listener-858d6f6fd6-wqzdg" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.662987 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/12245a01-16fe-424e-9fbb-59d906d90152-dns-svc\") pod \"dnsmasq-dns-85ff748b95-k92sh\" (UID: \"12245a01-16fe-424e-9fbb-59d906d90152\") " pod="openstack/dnsmasq-dns-85ff748b95-k92sh" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.663919 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/12245a01-16fe-424e-9fbb-59d906d90152-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-k92sh\" (UID: \"12245a01-16fe-424e-9fbb-59d906d90152\") " pod="openstack/dnsmasq-dns-85ff748b95-k92sh" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.664447 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12245a01-16fe-424e-9fbb-59d906d90152-config\") pod \"dnsmasq-dns-85ff748b95-k92sh\" (UID: \"12245a01-16fe-424e-9fbb-59d906d90152\") " pod="openstack/dnsmasq-dns-85ff748b95-k92sh" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.665101 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7dhc\" (UniqueName: \"kubernetes.io/projected/12245a01-16fe-424e-9fbb-59d906d90152-kube-api-access-n7dhc\") pod \"dnsmasq-dns-85ff748b95-k92sh\" (UID: \"12245a01-16fe-424e-9fbb-59d906d90152\") " pod="openstack/dnsmasq-dns-85ff748b95-k92sh" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.678578 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1cc74ab6-6c85-4da3-8a79-3af240adb999-config-data\") pod \"barbican-keystone-listener-858d6f6fd6-wqzdg\" (UID: \"1cc74ab6-6c85-4da3-8a79-3af240adb999\") " pod="openstack/barbican-keystone-listener-858d6f6fd6-wqzdg" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.726656 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-959f6\" (UniqueName: \"kubernetes.io/projected/8066fa12-e1a3-483a-8c94-e7a59cc2f1d9-kube-api-access-959f6\") pod \"barbican-api-cb7fc4c44-jrv9s\" (UID: \"8066fa12-e1a3-483a-8c94-e7a59cc2f1d9\") " pod="openstack/barbican-api-cb7fc4c44-jrv9s" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.726744 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8066fa12-e1a3-483a-8c94-e7a59cc2f1d9-logs\") pod \"barbican-api-cb7fc4c44-jrv9s\" (UID: \"8066fa12-e1a3-483a-8c94-e7a59cc2f1d9\") " pod="openstack/barbican-api-cb7fc4c44-jrv9s" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.726851 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8066fa12-e1a3-483a-8c94-e7a59cc2f1d9-config-data\") pod \"barbican-api-cb7fc4c44-jrv9s\" (UID: \"8066fa12-e1a3-483a-8c94-e7a59cc2f1d9\") " pod="openstack/barbican-api-cb7fc4c44-jrv9s" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.726884 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8066fa12-e1a3-483a-8c94-e7a59cc2f1d9-config-data-custom\") pod \"barbican-api-cb7fc4c44-jrv9s\" (UID: \"8066fa12-e1a3-483a-8c94-e7a59cc2f1d9\") " pod="openstack/barbican-api-cb7fc4c44-jrv9s" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.726928 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8066fa12-e1a3-483a-8c94-e7a59cc2f1d9-combined-ca-bundle\") pod \"barbican-api-cb7fc4c44-jrv9s\" (UID: \"8066fa12-e1a3-483a-8c94-e7a59cc2f1d9\") " pod="openstack/barbican-api-cb7fc4c44-jrv9s" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.741643 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8066fa12-e1a3-483a-8c94-e7a59cc2f1d9-logs\") pod \"barbican-api-cb7fc4c44-jrv9s\" (UID: \"8066fa12-e1a3-483a-8c94-e7a59cc2f1d9\") " pod="openstack/barbican-api-cb7fc4c44-jrv9s" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.745447 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-5c8c9c48fd-dnbhs"] Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.747116 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-5c8c9c48fd-dnbhs" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.753683 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8066fa12-e1a3-483a-8c94-e7a59cc2f1d9-combined-ca-bundle\") pod \"barbican-api-cb7fc4c44-jrv9s\" (UID: \"8066fa12-e1a3-483a-8c94-e7a59cc2f1d9\") " pod="openstack/barbican-api-cb7fc4c44-jrv9s" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.769024 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8066fa12-e1a3-483a-8c94-e7a59cc2f1d9-config-data-custom\") pod \"barbican-api-cb7fc4c44-jrv9s\" (UID: \"8066fa12-e1a3-483a-8c94-e7a59cc2f1d9\") " pod="openstack/barbican-api-cb7fc4c44-jrv9s" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.779104 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-858d6f6fd6-wqzdg" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.784977 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-959f6\" (UniqueName: \"kubernetes.io/projected/8066fa12-e1a3-483a-8c94-e7a59cc2f1d9-kube-api-access-959f6\") pod \"barbican-api-cb7fc4c44-jrv9s\" (UID: \"8066fa12-e1a3-483a-8c94-e7a59cc2f1d9\") " pod="openstack/barbican-api-cb7fc4c44-jrv9s" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.794643 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-k92sh" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.805595 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8066fa12-e1a3-483a-8c94-e7a59cc2f1d9-config-data\") pod \"barbican-api-cb7fc4c44-jrv9s\" (UID: \"8066fa12-e1a3-483a-8c94-e7a59cc2f1d9\") " pod="openstack/barbican-api-cb7fc4c44-jrv9s" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.828328 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-5576b4c89f-ddl4q"] Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.841491 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5576b4c89f-ddl4q" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.865377 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-cb7fc4c44-jrv9s" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.905323 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-77d5d5f47b-7zdbg" event={"ID":"154add5d-d1cd-4c93-ae63-3c7dfe4cb035","Type":"ContainerStarted","Data":"17e660b299cb6418cfd87beaf38373cbd8fa8b6a05a78e24fc2b2cec4cdbc8a0"} Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.934003 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56fwj\" (UniqueName: \"kubernetes.io/projected/7ff2eef6-823d-496e-b64d-abb692d53b42-kube-api-access-56fwj\") pod \"barbican-worker-5576b4c89f-ddl4q\" (UID: \"7ff2eef6-823d-496e-b64d-abb692d53b42\") " pod="openstack/barbican-worker-5576b4c89f-ddl4q" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.934078 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7ff2eef6-823d-496e-b64d-abb692d53b42-config-data-custom\") pod \"barbican-worker-5576b4c89f-ddl4q\" (UID: \"7ff2eef6-823d-496e-b64d-abb692d53b42\") " pod="openstack/barbican-worker-5576b4c89f-ddl4q" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.934100 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/612e965f-d243-486c-90a5-e4c867ef6fd5-combined-ca-bundle\") pod \"barbican-keystone-listener-5c8c9c48fd-dnbhs\" (UID: \"612e965f-d243-486c-90a5-e4c867ef6fd5\") " pod="openstack/barbican-keystone-listener-5c8c9c48fd-dnbhs" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.934146 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/612e965f-d243-486c-90a5-e4c867ef6fd5-config-data\") pod \"barbican-keystone-listener-5c8c9c48fd-dnbhs\" (UID: \"612e965f-d243-486c-90a5-e4c867ef6fd5\") " pod="openstack/barbican-keystone-listener-5c8c9c48fd-dnbhs" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.934167 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/612e965f-d243-486c-90a5-e4c867ef6fd5-config-data-custom\") pod \"barbican-keystone-listener-5c8c9c48fd-dnbhs\" (UID: \"612e965f-d243-486c-90a5-e4c867ef6fd5\") " pod="openstack/barbican-keystone-listener-5c8c9c48fd-dnbhs" Mar 09 14:26:17 crc kubenswrapper[4722]: I0309 14:26:17.945736 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ff2eef6-823d-496e-b64d-abb692d53b42-logs\") pod \"barbican-worker-5576b4c89f-ddl4q\" (UID: \"7ff2eef6-823d-496e-b64d-abb692d53b42\") " pod="openstack/barbican-worker-5576b4c89f-ddl4q" Mar 09 14:26:18 crc kubenswrapper[4722]: I0309 14:26:18.014400 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ff2eef6-823d-496e-b64d-abb692d53b42-combined-ca-bundle\") pod \"barbican-worker-5576b4c89f-ddl4q\" (UID: \"7ff2eef6-823d-496e-b64d-abb692d53b42\") " pod="openstack/barbican-worker-5576b4c89f-ddl4q" Mar 09 14:26:18 crc kubenswrapper[4722]: I0309 14:26:18.015353 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwqh4\" 
(UniqueName: \"kubernetes.io/projected/612e965f-d243-486c-90a5-e4c867ef6fd5-kube-api-access-fwqh4\") pod \"barbican-keystone-listener-5c8c9c48fd-dnbhs\" (UID: \"612e965f-d243-486c-90a5-e4c867ef6fd5\") " pod="openstack/barbican-keystone-listener-5c8c9c48fd-dnbhs" Mar 09 14:26:18 crc kubenswrapper[4722]: I0309 14:26:18.015664 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/612e965f-d243-486c-90a5-e4c867ef6fd5-logs\") pod \"barbican-keystone-listener-5c8c9c48fd-dnbhs\" (UID: \"612e965f-d243-486c-90a5-e4c867ef6fd5\") " pod="openstack/barbican-keystone-listener-5c8c9c48fd-dnbhs" Mar 09 14:26:18 crc kubenswrapper[4722]: I0309 14:26:18.015750 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ff2eef6-823d-496e-b64d-abb692d53b42-config-data\") pod \"barbican-worker-5576b4c89f-ddl4q\" (UID: \"7ff2eef6-823d-496e-b64d-abb692d53b42\") " pod="openstack/barbican-worker-5576b4c89f-ddl4q" Mar 09 14:26:18 crc kubenswrapper[4722]: I0309 14:26:18.031473 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5576b4c89f-ddl4q"] Mar 09 14:26:18 crc kubenswrapper[4722]: I0309 14:26:18.064960 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5c8c9c48fd-dnbhs"] Mar 09 14:26:18 crc kubenswrapper[4722]: I0309 14:26:18.117953 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/612e965f-d243-486c-90a5-e4c867ef6fd5-logs\") pod \"barbican-keystone-listener-5c8c9c48fd-dnbhs\" (UID: \"612e965f-d243-486c-90a5-e4c867ef6fd5\") " pod="openstack/barbican-keystone-listener-5c8c9c48fd-dnbhs" Mar 09 14:26:18 crc kubenswrapper[4722]: I0309 14:26:18.118313 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ff2eef6-823d-496e-b64d-abb692d53b42-config-data\") pod \"barbican-worker-5576b4c89f-ddl4q\" (UID: \"7ff2eef6-823d-496e-b64d-abb692d53b42\") " pod="openstack/barbican-worker-5576b4c89f-ddl4q" Mar 09 14:26:18 crc kubenswrapper[4722]: I0309 14:26:18.118655 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56fwj\" (UniqueName: \"kubernetes.io/projected/7ff2eef6-823d-496e-b64d-abb692d53b42-kube-api-access-56fwj\") pod \"barbican-worker-5576b4c89f-ddl4q\" (UID: \"7ff2eef6-823d-496e-b64d-abb692d53b42\") " pod="openstack/barbican-worker-5576b4c89f-ddl4q" Mar 09 14:26:18 crc kubenswrapper[4722]: I0309 14:26:18.119898 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7ff2eef6-823d-496e-b64d-abb692d53b42-config-data-custom\") pod \"barbican-worker-5576b4c89f-ddl4q\" (UID: \"7ff2eef6-823d-496e-b64d-abb692d53b42\") " pod="openstack/barbican-worker-5576b4c89f-ddl4q" Mar 09 14:26:18 crc kubenswrapper[4722]: I0309 14:26:18.120021 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/612e965f-d243-486c-90a5-e4c867ef6fd5-combined-ca-bundle\") pod \"barbican-keystone-listener-5c8c9c48fd-dnbhs\" (UID: \"612e965f-d243-486c-90a5-e4c867ef6fd5\") " pod="openstack/barbican-keystone-listener-5c8c9c48fd-dnbhs" Mar 09 14:26:18 crc kubenswrapper[4722]: I0309 14:26:18.120169 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/612e965f-d243-486c-90a5-e4c867ef6fd5-config-data\") pod \"barbican-keystone-listener-5c8c9c48fd-dnbhs\" (UID: \"612e965f-d243-486c-90a5-e4c867ef6fd5\") " pod="openstack/barbican-keystone-listener-5c8c9c48fd-dnbhs" Mar 09 14:26:18 crc kubenswrapper[4722]: I0309 14:26:18.120392 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/612e965f-d243-486c-90a5-e4c867ef6fd5-config-data-custom\") pod \"barbican-keystone-listener-5c8c9c48fd-dnbhs\" (UID: \"612e965f-d243-486c-90a5-e4c867ef6fd5\") " pod="openstack/barbican-keystone-listener-5c8c9c48fd-dnbhs" Mar 09 14:26:18 crc kubenswrapper[4722]: I0309 14:26:18.120651 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ff2eef6-823d-496e-b64d-abb692d53b42-logs\") pod \"barbican-worker-5576b4c89f-ddl4q\" (UID: \"7ff2eef6-823d-496e-b64d-abb692d53b42\") " pod="openstack/barbican-worker-5576b4c89f-ddl4q" Mar 09 14:26:18 crc kubenswrapper[4722]: I0309 14:26:18.120840 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ff2eef6-823d-496e-b64d-abb692d53b42-combined-ca-bundle\") pod \"barbican-worker-5576b4c89f-ddl4q\" (UID: \"7ff2eef6-823d-496e-b64d-abb692d53b42\") " pod="openstack/barbican-worker-5576b4c89f-ddl4q" Mar 09 14:26:18 crc kubenswrapper[4722]: I0309 14:26:18.120966 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwqh4\" (UniqueName: \"kubernetes.io/projected/612e965f-d243-486c-90a5-e4c867ef6fd5-kube-api-access-fwqh4\") pod \"barbican-keystone-listener-5c8c9c48fd-dnbhs\" (UID: \"612e965f-d243-486c-90a5-e4c867ef6fd5\") " pod="openstack/barbican-keystone-listener-5c8c9c48fd-dnbhs" Mar 09 14:26:18 crc kubenswrapper[4722]: I0309 14:26:18.122062 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/612e965f-d243-486c-90a5-e4c867ef6fd5-logs\") pod \"barbican-keystone-listener-5c8c9c48fd-dnbhs\" (UID: \"612e965f-d243-486c-90a5-e4c867ef6fd5\") " pod="openstack/barbican-keystone-listener-5c8c9c48fd-dnbhs" Mar 09 14:26:18 crc kubenswrapper[4722]: I0309 14:26:18.130962 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-6bc9487fbb-xx845"] Mar 09 14:26:18 crc kubenswrapper[4722]: I0309 14:26:18.133339 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7ff2eef6-823d-496e-b64d-abb692d53b42-config-data-custom\") pod \"barbican-worker-5576b4c89f-ddl4q\" (UID: \"7ff2eef6-823d-496e-b64d-abb692d53b42\") " pod="openstack/barbican-worker-5576b4c89f-ddl4q" Mar 09 14:26:18 crc kubenswrapper[4722]: I0309 14:26:18.134988 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6bc9487fbb-xx845" Mar 09 14:26:18 crc kubenswrapper[4722]: I0309 14:26:18.141141 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ff2eef6-823d-496e-b64d-abb692d53b42-logs\") pod \"barbican-worker-5576b4c89f-ddl4q\" (UID: \"7ff2eef6-823d-496e-b64d-abb692d53b42\") " pod="openstack/barbican-worker-5576b4c89f-ddl4q" Mar 09 14:26:18 crc kubenswrapper[4722]: I0309 14:26:18.144313 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/612e965f-d243-486c-90a5-e4c867ef6fd5-config-data\") pod \"barbican-keystone-listener-5c8c9c48fd-dnbhs\" (UID: \"612e965f-d243-486c-90a5-e4c867ef6fd5\") " pod="openstack/barbican-keystone-listener-5c8c9c48fd-dnbhs" Mar 09 14:26:18 crc kubenswrapper[4722]: I0309 14:26:18.146468 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/612e965f-d243-486c-90a5-e4c867ef6fd5-config-data-custom\") pod \"barbican-keystone-listener-5c8c9c48fd-dnbhs\" (UID: \"612e965f-d243-486c-90a5-e4c867ef6fd5\") " pod="openstack/barbican-keystone-listener-5c8c9c48fd-dnbhs" Mar 09 14:26:18 crc kubenswrapper[4722]: I0309 14:26:18.149501 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ff2eef6-823d-496e-b64d-abb692d53b42-combined-ca-bundle\") pod \"barbican-worker-5576b4c89f-ddl4q\" (UID: \"7ff2eef6-823d-496e-b64d-abb692d53b42\") " pod="openstack/barbican-worker-5576b4c89f-ddl4q" Mar 09 14:26:18 crc kubenswrapper[4722]: I0309 14:26:18.149952 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/612e965f-d243-486c-90a5-e4c867ef6fd5-combined-ca-bundle\") pod \"barbican-keystone-listener-5c8c9c48fd-dnbhs\" (UID: \"612e965f-d243-486c-90a5-e4c867ef6fd5\") " pod="openstack/barbican-keystone-listener-5c8c9c48fd-dnbhs" Mar 09 14:26:18 crc kubenswrapper[4722]: I0309 14:26:18.184552 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ff2eef6-823d-496e-b64d-abb692d53b42-config-data\") pod \"barbican-worker-5576b4c89f-ddl4q\" (UID: \"7ff2eef6-823d-496e-b64d-abb692d53b42\") " pod="openstack/barbican-worker-5576b4c89f-ddl4q" Mar 09 14:26:18 crc kubenswrapper[4722]: I0309 14:26:18.192688 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwqh4\" (UniqueName: \"kubernetes.io/projected/612e965f-d243-486c-90a5-e4c867ef6fd5-kube-api-access-fwqh4\") pod \"barbican-keystone-listener-5c8c9c48fd-dnbhs\" (UID: \"612e965f-d243-486c-90a5-e4c867ef6fd5\") " pod="openstack/barbican-keystone-listener-5c8c9c48fd-dnbhs" Mar 09 14:26:18 crc kubenswrapper[4722]: I0309 14:26:18.192984 4722 scope.go:117] "RemoveContainer" containerID="dd11c5d281c72d880edcd7ab0de997dadc7ed6b7c81eb6f7415412e1d4bb7a0f" Mar 09 14:26:18 crc kubenswrapper[4722]: I0309 14:26:18.198123 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56fwj\" (UniqueName: \"kubernetes.io/projected/7ff2eef6-823d-496e-b64d-abb692d53b42-kube-api-access-56fwj\") pod \"barbican-worker-5576b4c89f-ddl4q\" (UID: \"7ff2eef6-823d-496e-b64d-abb692d53b42\") " pod="openstack/barbican-worker-5576b4c89f-ddl4q" Mar 09 14:26:18 crc kubenswrapper[4722]: I0309 14:26:18.231931 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17ff98af-3f49-4e81-9d50-397d5eb8076c-config-data\") pod \"barbican-api-6bc9487fbb-xx845\" (UID: \"17ff98af-3f49-4e81-9d50-397d5eb8076c\") " pod="openstack/barbican-api-6bc9487fbb-xx845" Mar 09 14:26:18 crc kubenswrapper[4722]: I0309 14:26:18.231968 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17ff98af-3f49-4e81-9d50-397d5eb8076c-combined-ca-bundle\") pod \"barbican-api-6bc9487fbb-xx845\" (UID: \"17ff98af-3f49-4e81-9d50-397d5eb8076c\") " pod="openstack/barbican-api-6bc9487fbb-xx845" Mar 09 14:26:18 crc kubenswrapper[4722]: I0309 14:26:18.232034 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/17ff98af-3f49-4e81-9d50-397d5eb8076c-config-data-custom\") pod \"barbican-api-6bc9487fbb-xx845\" (UID: \"17ff98af-3f49-4e81-9d50-397d5eb8076c\") " pod="openstack/barbican-api-6bc9487fbb-xx845" Mar 09 14:26:18 crc kubenswrapper[4722]: I0309 14:26:18.232071 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17ff98af-3f49-4e81-9d50-397d5eb8076c-logs\") pod \"barbican-api-6bc9487fbb-xx845\" (UID: \"17ff98af-3f49-4e81-9d50-397d5eb8076c\") " pod="openstack/barbican-api-6bc9487fbb-xx845" Mar 09 14:26:18 crc kubenswrapper[4722]: I0309 14:26:18.232112 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t659p\" (UniqueName: \"kubernetes.io/projected/17ff98af-3f49-4e81-9d50-397d5eb8076c-kube-api-access-t659p\") pod \"barbican-api-6bc9487fbb-xx845\" (UID: \"17ff98af-3f49-4e81-9d50-397d5eb8076c\") " pod="openstack/barbican-api-6bc9487fbb-xx845" Mar 09 14:26:18 crc kubenswrapper[4722]: I0309 14:26:18.241587 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-77d5d5f47b-7zdbg" podStartSLOduration=5.241565487 podStartE2EDuration="5.241565487s" podCreationTimestamp="2026-03-09 14:26:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:26:17.935134003 +0000 UTC m=+1418.490702569" watchObservedRunningTime="2026-03-09 14:26:18.241565487 +0000 UTC m=+1418.797134063" Mar 09 14:26:18 crc kubenswrapper[4722]: I0309 14:26:18.246402 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50946f11-a025-4a18-a7b3-3dafa15c3b2f" path="/var/lib/kubelet/pods/50946f11-a025-4a18-a7b3-3dafa15c3b2f/volumes" Mar 09 14:26:18 crc kubenswrapper[4722]: I0309 14:26:18.247542 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6bc9487fbb-xx845"] Mar 09 14:26:18 crc kubenswrapper[4722]: I0309 14:26:18.299291 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-7ccbf8d8bb-gjl7f"] Mar 09 14:26:18 crc kubenswrapper[4722]: I0309 14:26:18.313724 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7ccbf8d8bb-gjl7f"] Mar 09 14:26:18 crc kubenswrapper[4722]: I0309 14:26:18.314706 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7ccbf8d8bb-gjl7f" Mar 09 14:26:18 crc kubenswrapper[4722]: I0309 14:26:18.325759 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-8495c95fc9-42qqz"] Mar 09 14:26:18 crc kubenswrapper[4722]: I0309 14:26:18.333529 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17ff98af-3f49-4e81-9d50-397d5eb8076c-config-data\") pod \"barbican-api-6bc9487fbb-xx845\" (UID: \"17ff98af-3f49-4e81-9d50-397d5eb8076c\") " pod="openstack/barbican-api-6bc9487fbb-xx845" Mar 09 14:26:18 crc kubenswrapper[4722]: I0309 14:26:18.333561 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17ff98af-3f49-4e81-9d50-397d5eb8076c-combined-ca-bundle\") pod \"barbican-api-6bc9487fbb-xx845\" (UID: \"17ff98af-3f49-4e81-9d50-397d5eb8076c\") " pod="openstack/barbican-api-6bc9487fbb-xx845" Mar 09 14:26:18 crc kubenswrapper[4722]: I0309 14:26:18.333660 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/17ff98af-3f49-4e81-9d50-397d5eb8076c-config-data-custom\") pod \"barbican-api-6bc9487fbb-xx845\" (UID: \"17ff98af-3f49-4e81-9d50-397d5eb8076c\") " pod="openstack/barbican-api-6bc9487fbb-xx845" Mar 09 14:26:18 crc kubenswrapper[4722]: I0309 14:26:18.333707 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17ff98af-3f49-4e81-9d50-397d5eb8076c-logs\") pod \"barbican-api-6bc9487fbb-xx845\" (UID: \"17ff98af-3f49-4e81-9d50-397d5eb8076c\") " pod="openstack/barbican-api-6bc9487fbb-xx845" Mar 09 14:26:18 crc kubenswrapper[4722]: I0309 14:26:18.333788 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t659p\" (UniqueName: \"kubernetes.io/projected/17ff98af-3f49-4e81-9d50-397d5eb8076c-kube-api-access-t659p\") pod \"barbican-api-6bc9487fbb-xx845\" (UID: \"17ff98af-3f49-4e81-9d50-397d5eb8076c\") " pod="openstack/barbican-api-6bc9487fbb-xx845" Mar 09 14:26:18 crc kubenswrapper[4722]: I0309 14:26:18.335129 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17ff98af-3f49-4e81-9d50-397d5eb8076c-logs\") pod \"barbican-api-6bc9487fbb-xx845\" (UID: \"17ff98af-3f49-4e81-9d50-397d5eb8076c\") " pod="openstack/barbican-api-6bc9487fbb-xx845" Mar 09 14:26:18 crc kubenswrapper[4722]: I0309 14:26:18.346789 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17ff98af-3f49-4e81-9d50-397d5eb8076c-combined-ca-bundle\") pod \"barbican-api-6bc9487fbb-xx845\" (UID: \"17ff98af-3f49-4e81-9d50-397d5eb8076c\") " pod="openstack/barbican-api-6bc9487fbb-xx845" Mar 09 14:26:18 crc kubenswrapper[4722]: I0309 14:26:18.359836 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/17ff98af-3f49-4e81-9d50-397d5eb8076c-config-data-custom\") pod \"barbican-api-6bc9487fbb-xx845\" (UID: \"17ff98af-3f49-4e81-9d50-397d5eb8076c\") " pod="openstack/barbican-api-6bc9487fbb-xx845" Mar 09 14:26:18 crc kubenswrapper[4722]: I0309 14:26:18.365247 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t659p\" (UniqueName: 
\"kubernetes.io/projected/17ff98af-3f49-4e81-9d50-397d5eb8076c-kube-api-access-t659p\") pod \"barbican-api-6bc9487fbb-xx845\" (UID: \"17ff98af-3f49-4e81-9d50-397d5eb8076c\") " pod="openstack/barbican-api-6bc9487fbb-xx845" Mar 09 14:26:18 crc kubenswrapper[4722]: I0309 14:26:18.384005 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17ff98af-3f49-4e81-9d50-397d5eb8076c-config-data\") pod \"barbican-api-6bc9487fbb-xx845\" (UID: \"17ff98af-3f49-4e81-9d50-397d5eb8076c\") " pod="openstack/barbican-api-6bc9487fbb-xx845" Mar 09 14:26:18 crc kubenswrapper[4722]: I0309 14:26:18.442297 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5c8c9c48fd-dnbhs" Mar 09 14:26:18 crc kubenswrapper[4722]: I0309 14:26:18.484931 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5576b4c89f-ddl4q" Mar 09 14:26:18 crc kubenswrapper[4722]: I0309 14:26:18.485835 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6bc9487fbb-xx845" Mar 09 14:26:18 crc kubenswrapper[4722]: I0309 14:26:18.487898 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2bzf\" (UniqueName: \"kubernetes.io/projected/b46c2ebf-e484-41c4-9f12-392f46798dfd-kube-api-access-q2bzf\") pod \"placement-7ccbf8d8bb-gjl7f\" (UID: \"b46c2ebf-e484-41c4-9f12-392f46798dfd\") " pod="openstack/placement-7ccbf8d8bb-gjl7f" Mar 09 14:26:18 crc kubenswrapper[4722]: I0309 14:26:18.488031 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b46c2ebf-e484-41c4-9f12-392f46798dfd-combined-ca-bundle\") pod \"placement-7ccbf8d8bb-gjl7f\" (UID: \"b46c2ebf-e484-41c4-9f12-392f46798dfd\") " pod="openstack/placement-7ccbf8d8bb-gjl7f" Mar 09 14:26:18 crc kubenswrapper[4722]: I0309 14:26:18.488080 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b46c2ebf-e484-41c4-9f12-392f46798dfd-scripts\") pod \"placement-7ccbf8d8bb-gjl7f\" (UID: \"b46c2ebf-e484-41c4-9f12-392f46798dfd\") " pod="openstack/placement-7ccbf8d8bb-gjl7f" Mar 09 14:26:18 crc kubenswrapper[4722]: I0309 14:26:18.488121 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b46c2ebf-e484-41c4-9f12-392f46798dfd-logs\") pod \"placement-7ccbf8d8bb-gjl7f\" (UID: \"b46c2ebf-e484-41c4-9f12-392f46798dfd\") " pod="openstack/placement-7ccbf8d8bb-gjl7f" Mar 09 14:26:18 crc kubenswrapper[4722]: I0309 14:26:18.488158 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b46c2ebf-e484-41c4-9f12-392f46798dfd-internal-tls-certs\") pod \"placement-7ccbf8d8bb-gjl7f\" (UID: \"b46c2ebf-e484-41c4-9f12-392f46798dfd\") " pod="openstack/placement-7ccbf8d8bb-gjl7f" Mar 09 14:26:18 crc kubenswrapper[4722]: I0309 14:26:18.488182 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b46c2ebf-e484-41c4-9f12-392f46798dfd-config-data\") pod \"placement-7ccbf8d8bb-gjl7f\" (UID: \"b46c2ebf-e484-41c4-9f12-392f46798dfd\") " 
pod="openstack/placement-7ccbf8d8bb-gjl7f" Mar 09 14:26:18 crc kubenswrapper[4722]: I0309 14:26:18.488273 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b46c2ebf-e484-41c4-9f12-392f46798dfd-public-tls-certs\") pod \"placement-7ccbf8d8bb-gjl7f\" (UID: \"b46c2ebf-e484-41c4-9f12-392f46798dfd\") " pod="openstack/placement-7ccbf8d8bb-gjl7f" Mar 09 14:26:18 crc kubenswrapper[4722]: I0309 14:26:18.589933 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b46c2ebf-e484-41c4-9f12-392f46798dfd-combined-ca-bundle\") pod \"placement-7ccbf8d8bb-gjl7f\" (UID: \"b46c2ebf-e484-41c4-9f12-392f46798dfd\") " pod="openstack/placement-7ccbf8d8bb-gjl7f" Mar 09 14:26:18 crc kubenswrapper[4722]: I0309 14:26:18.589984 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b46c2ebf-e484-41c4-9f12-392f46798dfd-scripts\") pod \"placement-7ccbf8d8bb-gjl7f\" (UID: \"b46c2ebf-e484-41c4-9f12-392f46798dfd\") " pod="openstack/placement-7ccbf8d8bb-gjl7f" Mar 09 14:26:18 crc kubenswrapper[4722]: I0309 14:26:18.590019 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b46c2ebf-e484-41c4-9f12-392f46798dfd-logs\") pod \"placement-7ccbf8d8bb-gjl7f\" (UID: \"b46c2ebf-e484-41c4-9f12-392f46798dfd\") " pod="openstack/placement-7ccbf8d8bb-gjl7f" Mar 09 14:26:18 crc kubenswrapper[4722]: I0309 14:26:18.590042 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b46c2ebf-e484-41c4-9f12-392f46798dfd-internal-tls-certs\") pod \"placement-7ccbf8d8bb-gjl7f\" (UID: \"b46c2ebf-e484-41c4-9f12-392f46798dfd\") " pod="openstack/placement-7ccbf8d8bb-gjl7f" Mar 09 14:26:18 crc kubenswrapper[4722]: I0309 14:26:18.590062 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b46c2ebf-e484-41c4-9f12-392f46798dfd-config-data\") pod \"placement-7ccbf8d8bb-gjl7f\" (UID: \"b46c2ebf-e484-41c4-9f12-392f46798dfd\") " pod="openstack/placement-7ccbf8d8bb-gjl7f" Mar 09 14:26:18 crc kubenswrapper[4722]: I0309 14:26:18.590111 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b46c2ebf-e484-41c4-9f12-392f46798dfd-public-tls-certs\") pod \"placement-7ccbf8d8bb-gjl7f\" (UID: \"b46c2ebf-e484-41c4-9f12-392f46798dfd\") " pod="openstack/placement-7ccbf8d8bb-gjl7f" Mar 09 14:26:18 crc kubenswrapper[4722]: I0309 14:26:18.590180 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2bzf\" (UniqueName: \"kubernetes.io/projected/b46c2ebf-e484-41c4-9f12-392f46798dfd-kube-api-access-q2bzf\") pod \"placement-7ccbf8d8bb-gjl7f\" (UID: \"b46c2ebf-e484-41c4-9f12-392f46798dfd\") " pod="openstack/placement-7ccbf8d8bb-gjl7f" Mar 09 14:26:18 crc kubenswrapper[4722]: I0309 14:26:18.592287 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-57655d7975-hskdb"] Mar 09 14:26:18 crc kubenswrapper[4722]: I0309 14:26:18.592729 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b46c2ebf-e484-41c4-9f12-392f46798dfd-logs\") pod \"placement-7ccbf8d8bb-gjl7f\" 
(UID: \"b46c2ebf-e484-41c4-9f12-392f46798dfd\") " pod="openstack/placement-7ccbf8d8bb-gjl7f" Mar 09 14:26:18 crc kubenswrapper[4722]: I0309 14:26:18.612076 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b46c2ebf-e484-41c4-9f12-392f46798dfd-public-tls-certs\") pod \"placement-7ccbf8d8bb-gjl7f\" (UID: \"b46c2ebf-e484-41c4-9f12-392f46798dfd\") " pod="openstack/placement-7ccbf8d8bb-gjl7f" Mar 09 14:26:18 crc kubenswrapper[4722]: I0309 14:26:18.616635 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b46c2ebf-e484-41c4-9f12-392f46798dfd-internal-tls-certs\") pod \"placement-7ccbf8d8bb-gjl7f\" (UID: \"b46c2ebf-e484-41c4-9f12-392f46798dfd\") " pod="openstack/placement-7ccbf8d8bb-gjl7f" Mar 09 14:26:18 crc kubenswrapper[4722]: I0309 14:26:18.625659 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b46c2ebf-e484-41c4-9f12-392f46798dfd-scripts\") pod \"placement-7ccbf8d8bb-gjl7f\" (UID: \"b46c2ebf-e484-41c4-9f12-392f46798dfd\") " pod="openstack/placement-7ccbf8d8bb-gjl7f" Mar 09 14:26:18 crc kubenswrapper[4722]: I0309 14:26:18.630827 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b46c2ebf-e484-41c4-9f12-392f46798dfd-combined-ca-bundle\") pod \"placement-7ccbf8d8bb-gjl7f\" (UID: \"b46c2ebf-e484-41c4-9f12-392f46798dfd\") " pod="openstack/placement-7ccbf8d8bb-gjl7f" Mar 09 14:26:18 crc kubenswrapper[4722]: I0309 14:26:18.640750 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2bzf\" (UniqueName: \"kubernetes.io/projected/b46c2ebf-e484-41c4-9f12-392f46798dfd-kube-api-access-q2bzf\") pod \"placement-7ccbf8d8bb-gjl7f\" (UID: \"b46c2ebf-e484-41c4-9f12-392f46798dfd\") " pod="openstack/placement-7ccbf8d8bb-gjl7f" Mar 09 14:26:18 crc kubenswrapper[4722]: I0309 14:26:18.641096 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b46c2ebf-e484-41c4-9f12-392f46798dfd-config-data\") pod \"placement-7ccbf8d8bb-gjl7f\" (UID: \"b46c2ebf-e484-41c4-9f12-392f46798dfd\") " pod="openstack/placement-7ccbf8d8bb-gjl7f" Mar 09 14:26:18 crc kubenswrapper[4722]: I0309 14:26:18.938810 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7ccbf8d8bb-gjl7f" Mar 09 14:26:18 crc kubenswrapper[4722]: I0309 14:26:18.955264 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-858d6f6fd6-wqzdg"] Mar 09 14:26:18 crc kubenswrapper[4722]: I0309 14:26:18.997897 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-57655d7975-hskdb" event={"ID":"c3b18ae1-c2c9-4454-a591-53ce06064d82","Type":"ContainerStarted","Data":"66173dab4c95a59cb756913887d69c0eea112d340545edc798dae7a2c3856d81"} Mar 09 14:26:19 crc kubenswrapper[4722]: I0309 14:26:19.032300 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-8495c95fc9-42qqz" event={"ID":"1343ca6f-93ab-45e7-8887-261b10bb1e88","Type":"ContainerStarted","Data":"775bf2dc49aada3fa17d57230cbe5abdace71fe5bcdd856f589664854911af5e"} Mar 09 14:26:19 crc kubenswrapper[4722]: I0309 14:26:19.032452 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-77d5d5f47b-7zdbg" Mar 09 14:26:19 crc kubenswrapper[4722]: I0309 14:26:19.032491 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-77d5d5f47b-7zdbg" Mar 09 14:26:19 crc kubenswrapper[4722]: I0309 14:26:19.090658 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-k92sh"] Mar 09 14:26:19 crc kubenswrapper[4722]: I0309 14:26:19.206283 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-cb7fc4c44-jrv9s"] Mar 09 14:26:19 crc kubenswrapper[4722]: I0309 14:26:19.692324 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5576b4c89f-ddl4q"] Mar 09 14:26:19 crc kubenswrapper[4722]: I0309 14:26:19.704537 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5c8c9c48fd-dnbhs"] Mar 09 14:26:19 crc kubenswrapper[4722]: I0309 14:26:19.841056 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-6bc9487fbb-xx845"] Mar 09 14:26:20 crc kubenswrapper[4722]: I0309 14:26:20.145813 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-k92sh" event={"ID":"12245a01-16fe-424e-9fbb-59d906d90152","Type":"ContainerStarted","Data":"605be6803ebfb0e06597572f94f7e4bf290f0d27028740b80b6c3e404eba40aa"} Mar 09 14:26:20 crc kubenswrapper[4722]: I0309 14:26:20.301917 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6bc9487fbb-xx845" event={"ID":"17ff98af-3f49-4e81-9d50-397d5eb8076c","Type":"ContainerStarted","Data":"d055441ca146b9872a1a5e0b25dde1daaf5a8774c00142e39b502801b6ea9946"} Mar 09 14:26:20 crc kubenswrapper[4722]: I0309 14:26:20.302232 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7ccbf8d8bb-gjl7f"] Mar 09 14:26:20 crc kubenswrapper[4722]: I0309 14:26:20.302257 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-8495c95fc9-42qqz" Mar 09 14:26:20 crc kubenswrapper[4722]: I0309 14:26:20.302281 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-8495c95fc9-42qqz" event={"ID":"1343ca6f-93ab-45e7-8887-261b10bb1e88","Type":"ContainerStarted","Data":"66626c28fba46639b4725052690bf2e44f7293e14dde1ad897fd801bc7a85ead"} Mar 09 14:26:20 crc kubenswrapper[4722]: I0309 14:26:20.302291 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5576b4c89f-ddl4q" 
event={"ID":"7ff2eef6-823d-496e-b64d-abb692d53b42","Type":"ContainerStarted","Data":"1a582cb47aa2ddf86dfb88140fccf4d3c93e058beccb0a75f45cdecc1bd37e30"} Mar 09 14:26:20 crc kubenswrapper[4722]: I0309 14:26:20.302306 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-858d6f6fd6-wqzdg" event={"ID":"1cc74ab6-6c85-4da3-8a79-3af240adb999","Type":"ContainerStarted","Data":"a5838f9eb0a984e2b49cf758afe91b53782d3b9ca807c3a8ce1ba1ba95711fda"} Mar 09 14:26:20 crc kubenswrapper[4722]: I0309 14:26:20.302316 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5c8c9c48fd-dnbhs" event={"ID":"612e965f-d243-486c-90a5-e4c867ef6fd5","Type":"ContainerStarted","Data":"4eb18062703bc246676595972d3b076b94ed26cbe3ead10035656cd982e7a661"} Mar 09 14:26:20 crc kubenswrapper[4722]: I0309 14:26:20.302327 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-cb7fc4c44-jrv9s" event={"ID":"8066fa12-e1a3-483a-8c94-e7a59cc2f1d9","Type":"ContainerStarted","Data":"6be84d49059047100547341362493db59e9c0a5393e70a8126117748e190932e"} Mar 09 14:26:20 crc kubenswrapper[4722]: I0309 14:26:20.302337 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-cb7fc4c44-jrv9s" event={"ID":"8066fa12-e1a3-483a-8c94-e7a59cc2f1d9","Type":"ContainerStarted","Data":"61d9b18c5dba6555fb05e13585a5ea0ad201dd7340176902c8d9b3e41445e7c6"} Mar 09 14:26:20 crc kubenswrapper[4722]: I0309 14:26:20.472918 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-8495c95fc9-42qqz" podStartSLOduration=3.472901336 podStartE2EDuration="3.472901336s" podCreationTimestamp="2026-03-09 14:26:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:26:20.43152714 +0000 UTC m=+1420.987095726" watchObservedRunningTime="2026-03-09 14:26:20.472901336 +0000 UTC m=+1421.028469912" Mar 09 14:26:21 crc kubenswrapper[4722]: I0309 14:26:21.292590 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7ccbf8d8bb-gjl7f" event={"ID":"b46c2ebf-e484-41c4-9f12-392f46798dfd","Type":"ContainerStarted","Data":"3168e3fa475f50d0f184b6877b64dc220806a605996ee6e8e62342b756410e54"} Mar 09 14:26:21 crc kubenswrapper[4722]: I0309 14:26:21.293116 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7ccbf8d8bb-gjl7f" event={"ID":"b46c2ebf-e484-41c4-9f12-392f46798dfd","Type":"ContainerStarted","Data":"83fbcb390b6b8f96ac3b92c858941c4ab5bf7b832ab4dd360d1827c23e70c724"} Mar 09 14:26:21 crc kubenswrapper[4722]: I0309 14:26:21.310961 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-cb7fc4c44-jrv9s" event={"ID":"8066fa12-e1a3-483a-8c94-e7a59cc2f1d9","Type":"ContainerStarted","Data":"55247a0ac598f6b1b4061095ab5246e6b6508d186c1cd181cb3e595bf543cc3e"} Mar 09 14:26:21 crc kubenswrapper[4722]: I0309 14:26:21.312799 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-cb7fc4c44-jrv9s" Mar 09 14:26:21 crc kubenswrapper[4722]: I0309 14:26:21.312831 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-cb7fc4c44-jrv9s" Mar 09 14:26:21 crc kubenswrapper[4722]: I0309 14:26:21.331297 4722 generic.go:334] "Generic (PLEG): container finished" podID="12245a01-16fe-424e-9fbb-59d906d90152" containerID="afb30c9b889b0ef73da00845124667b5134eba739a3db9c077ed82aae17d15e2" 
exitCode=0 Mar 09 14:26:21 crc kubenswrapper[4722]: I0309 14:26:21.331383 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-k92sh" event={"ID":"12245a01-16fe-424e-9fbb-59d906d90152","Type":"ContainerDied","Data":"afb30c9b889b0ef73da00845124667b5134eba739a3db9c077ed82aae17d15e2"} Mar 09 14:26:21 crc kubenswrapper[4722]: I0309 14:26:21.331413 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-k92sh" event={"ID":"12245a01-16fe-424e-9fbb-59d906d90152","Type":"ContainerStarted","Data":"8779974b5aec2bce4f7903ef8950a2bf797b650d511cc7b0ea5aa4fbafc22086"} Mar 09 14:26:21 crc kubenswrapper[4722]: I0309 14:26:21.332503 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-85ff748b95-k92sh" Mar 09 14:26:21 crc kubenswrapper[4722]: I0309 14:26:21.347158 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-fzj6s" event={"ID":"5035bd54-0aaa-4ff3-b90a-6145145fe95c","Type":"ContainerStarted","Data":"2360808bc57a87b6b2fa7f5fc11ab102736e19dbbb32712fc1d07735f9404fa8"} Mar 09 14:26:21 crc kubenswrapper[4722]: I0309 14:26:21.351112 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-cb7fc4c44-jrv9s" podStartSLOduration=4.351097021 podStartE2EDuration="4.351097021s" podCreationTimestamp="2026-03-09 14:26:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:26:21.335338695 +0000 UTC m=+1421.890907281" watchObservedRunningTime="2026-03-09 14:26:21.351097021 +0000 UTC m=+1421.906665607" Mar 09 14:26:21 crc kubenswrapper[4722]: I0309 14:26:21.393717 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6bc9487fbb-xx845" event={"ID":"17ff98af-3f49-4e81-9d50-397d5eb8076c","Type":"ContainerStarted","Data":"fc2ee5715c59ebb2f03ff918f277326dbac252d00fa038b71001e7df27396ee2"} Mar 09 14:26:21 crc kubenswrapper[4722]: I0309 14:26:21.393757 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6bc9487fbb-xx845" Mar 09 14:26:21 crc kubenswrapper[4722]: I0309 14:26:21.393769 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6bc9487fbb-xx845" event={"ID":"17ff98af-3f49-4e81-9d50-397d5eb8076c","Type":"ContainerStarted","Data":"ee011b195beeb3631b51cd9fb39e35c666d28917fd9146c1d9bb5a34e1056cfd"} Mar 09 14:26:21 crc kubenswrapper[4722]: I0309 14:26:21.393778 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-6bc9487fbb-xx845" Mar 09 14:26:21 crc kubenswrapper[4722]: I0309 14:26:21.396626 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-85ff748b95-k92sh" podStartSLOduration=4.3966004 podStartE2EDuration="4.3966004s" podCreationTimestamp="2026-03-09 14:26:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:26:21.394002528 +0000 UTC m=+1421.949571104" watchObservedRunningTime="2026-03-09 14:26:21.3966004 +0000 UTC m=+1421.952168976" Mar 09 14:26:21 crc kubenswrapper[4722]: I0309 14:26:21.435927 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-6bc9487fbb-xx845" podStartSLOduration=4.435911148 podStartE2EDuration="4.435911148s" podCreationTimestamp="2026-03-09 14:26:17 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:26:21.434693006 +0000 UTC m=+1421.990261582" watchObservedRunningTime="2026-03-09 14:26:21.435911148 +0000 UTC m=+1421.991479724" Mar 09 14:26:21 crc kubenswrapper[4722]: I0309 14:26:21.481975 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-fzj6s" podStartSLOduration=4.793201945 podStartE2EDuration="49.481959143s" podCreationTimestamp="2026-03-09 14:25:32 +0000 UTC" firstStartedPulling="2026-03-09 14:25:34.58471854 +0000 UTC m=+1375.140287116" lastFinishedPulling="2026-03-09 14:26:19.273475738 +0000 UTC m=+1419.829044314" observedRunningTime="2026-03-09 14:26:21.476648966 +0000 UTC m=+1422.032217542" watchObservedRunningTime="2026-03-09 14:26:21.481959143 +0000 UTC m=+1422.037527719" Mar 09 14:26:21 crc kubenswrapper[4722]: I0309 14:26:21.527494 4722 patch_prober.go:28] interesting pod/machine-config-daemon-hjrrb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 14:26:21 crc kubenswrapper[4722]: I0309 14:26:21.527546 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 14:26:21 crc kubenswrapper[4722]: I0309 14:26:21.527582 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" Mar 09 14:26:21 crc kubenswrapper[4722]: I0309 14:26:21.528345 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2d686d3e92fab7cd0f339e5d57afd546181543a3a9585b91ecf278050136cecb"} pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 14:26:21 crc kubenswrapper[4722]: I0309 14:26:21.528395 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerName="machine-config-daemon" containerID="cri-o://2d686d3e92fab7cd0f339e5d57afd546181543a3a9585b91ecf278050136cecb" gracePeriod=600 Mar 09 14:26:21 crc kubenswrapper[4722]: I0309 14:26:21.965607 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6bc9487fbb-xx845"] Mar 09 14:26:22 crc kubenswrapper[4722]: I0309 14:26:22.054658 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-847f5f7dcd-6xz74"] Mar 09 14:26:22 crc kubenswrapper[4722]: I0309 14:26:22.057654 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-847f5f7dcd-6xz74" Mar 09 14:26:22 crc kubenswrapper[4722]: I0309 14:26:22.060308 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Mar 09 14:26:22 crc kubenswrapper[4722]: I0309 14:26:22.060522 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Mar 09 14:26:22 crc kubenswrapper[4722]: I0309 14:26:22.133635 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-847f5f7dcd-6xz74"] Mar 09 14:26:22 crc kubenswrapper[4722]: I0309 14:26:22.156341 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/339f60b7-5615-4bbf-a907-ec8daeb69158-internal-tls-certs\") pod \"barbican-api-847f5f7dcd-6xz74\" (UID: \"339f60b7-5615-4bbf-a907-ec8daeb69158\") " pod="openstack/barbican-api-847f5f7dcd-6xz74" Mar 09 14:26:22 crc kubenswrapper[4722]: I0309 14:26:22.156387 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmcjf\" (UniqueName: \"kubernetes.io/projected/339f60b7-5615-4bbf-a907-ec8daeb69158-kube-api-access-vmcjf\") pod \"barbican-api-847f5f7dcd-6xz74\" (UID: \"339f60b7-5615-4bbf-a907-ec8daeb69158\") " pod="openstack/barbican-api-847f5f7dcd-6xz74" Mar 09 14:26:22 crc kubenswrapper[4722]: I0309 14:26:22.156408 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/339f60b7-5615-4bbf-a907-ec8daeb69158-public-tls-certs\") pod \"barbican-api-847f5f7dcd-6xz74\" (UID: \"339f60b7-5615-4bbf-a907-ec8daeb69158\") " pod="openstack/barbican-api-847f5f7dcd-6xz74" Mar 09 14:26:22 crc kubenswrapper[4722]: I0309 14:26:22.156461 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/339f60b7-5615-4bbf-a907-ec8daeb69158-config-data-custom\") pod \"barbican-api-847f5f7dcd-6xz74\" (UID: \"339f60b7-5615-4bbf-a907-ec8daeb69158\") " pod="openstack/barbican-api-847f5f7dcd-6xz74" Mar 09 14:26:22 crc kubenswrapper[4722]: I0309 14:26:22.156503 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/339f60b7-5615-4bbf-a907-ec8daeb69158-combined-ca-bundle\") pod \"barbican-api-847f5f7dcd-6xz74\" (UID: \"339f60b7-5615-4bbf-a907-ec8daeb69158\") " pod="openstack/barbican-api-847f5f7dcd-6xz74" Mar 09 14:26:22 crc kubenswrapper[4722]: I0309 14:26:22.156561 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/339f60b7-5615-4bbf-a907-ec8daeb69158-config-data\") pod \"barbican-api-847f5f7dcd-6xz74\" (UID: \"339f60b7-5615-4bbf-a907-ec8daeb69158\") " pod="openstack/barbican-api-847f5f7dcd-6xz74" Mar 09 14:26:22 crc kubenswrapper[4722]: I0309 14:26:22.156649 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/339f60b7-5615-4bbf-a907-ec8daeb69158-logs\") pod \"barbican-api-847f5f7dcd-6xz74\" (UID: \"339f60b7-5615-4bbf-a907-ec8daeb69158\") " pod="openstack/barbican-api-847f5f7dcd-6xz74" Mar 09 14:26:22 crc kubenswrapper[4722]: I0309 14:26:22.259516 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/339f60b7-5615-4bbf-a907-ec8daeb69158-internal-tls-certs\") pod \"barbican-api-847f5f7dcd-6xz74\" (UID: \"339f60b7-5615-4bbf-a907-ec8daeb69158\") " pod="openstack/barbican-api-847f5f7dcd-6xz74" Mar 09 14:26:22 crc kubenswrapper[4722]: I0309 14:26:22.259571 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmcjf\" (UniqueName: \"kubernetes.io/projected/339f60b7-5615-4bbf-a907-ec8daeb69158-kube-api-access-vmcjf\") pod \"barbican-api-847f5f7dcd-6xz74\" (UID: \"339f60b7-5615-4bbf-a907-ec8daeb69158\") " pod="openstack/barbican-api-847f5f7dcd-6xz74" Mar 09 14:26:22 crc kubenswrapper[4722]: I0309 14:26:22.259602 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/339f60b7-5615-4bbf-a907-ec8daeb69158-public-tls-certs\") pod \"barbican-api-847f5f7dcd-6xz74\" (UID: \"339f60b7-5615-4bbf-a907-ec8daeb69158\") " pod="openstack/barbican-api-847f5f7dcd-6xz74" Mar 09 14:26:22 crc kubenswrapper[4722]: I0309 14:26:22.259700 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/339f60b7-5615-4bbf-a907-ec8daeb69158-config-data-custom\") pod \"barbican-api-847f5f7dcd-6xz74\" (UID: \"339f60b7-5615-4bbf-a907-ec8daeb69158\") " pod="openstack/barbican-api-847f5f7dcd-6xz74" Mar 09 14:26:22 crc kubenswrapper[4722]: I0309 14:26:22.259784 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/339f60b7-5615-4bbf-a907-ec8daeb69158-combined-ca-bundle\") pod \"barbican-api-847f5f7dcd-6xz74\" (UID: \"339f60b7-5615-4bbf-a907-ec8daeb69158\") " pod="openstack/barbican-api-847f5f7dcd-6xz74" Mar 09 14:26:22 crc kubenswrapper[4722]: I0309 14:26:22.259880 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/339f60b7-5615-4bbf-a907-ec8daeb69158-config-data\") pod \"barbican-api-847f5f7dcd-6xz74\" (UID: \"339f60b7-5615-4bbf-a907-ec8daeb69158\") " pod="openstack/barbican-api-847f5f7dcd-6xz74" Mar 09 14:26:22 crc kubenswrapper[4722]: I0309 14:26:22.259998 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/339f60b7-5615-4bbf-a907-ec8daeb69158-logs\") pod \"barbican-api-847f5f7dcd-6xz74\" (UID: \"339f60b7-5615-4bbf-a907-ec8daeb69158\") " pod="openstack/barbican-api-847f5f7dcd-6xz74" Mar 09 14:26:22 crc kubenswrapper[4722]: I0309 14:26:22.260553 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/339f60b7-5615-4bbf-a907-ec8daeb69158-logs\") pod \"barbican-api-847f5f7dcd-6xz74\" (UID: \"339f60b7-5615-4bbf-a907-ec8daeb69158\") " pod="openstack/barbican-api-847f5f7dcd-6xz74" Mar 09 14:26:22 crc kubenswrapper[4722]: I0309 14:26:22.270954 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/339f60b7-5615-4bbf-a907-ec8daeb69158-combined-ca-bundle\") pod \"barbican-api-847f5f7dcd-6xz74\" (UID: \"339f60b7-5615-4bbf-a907-ec8daeb69158\") " pod="openstack/barbican-api-847f5f7dcd-6xz74" Mar 09 14:26:22 crc kubenswrapper[4722]: I0309 14:26:22.273710 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/339f60b7-5615-4bbf-a907-ec8daeb69158-internal-tls-certs\") pod \"barbican-api-847f5f7dcd-6xz74\" (UID: \"339f60b7-5615-4bbf-a907-ec8daeb69158\") " pod="openstack/barbican-api-847f5f7dcd-6xz74" Mar 09 14:26:22 crc kubenswrapper[4722]: I0309 14:26:22.273731 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/339f60b7-5615-4bbf-a907-ec8daeb69158-config-data-custom\") pod \"barbican-api-847f5f7dcd-6xz74\" (UID: \"339f60b7-5615-4bbf-a907-ec8daeb69158\") " pod="openstack/barbican-api-847f5f7dcd-6xz74" Mar 09 14:26:22 crc kubenswrapper[4722]: I0309 14:26:22.277546 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/339f60b7-5615-4bbf-a907-ec8daeb69158-config-data\") pod \"barbican-api-847f5f7dcd-6xz74\" (UID: \"339f60b7-5615-4bbf-a907-ec8daeb69158\") " pod="openstack/barbican-api-847f5f7dcd-6xz74" Mar 09 14:26:22 crc kubenswrapper[4722]: I0309 14:26:22.280774 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/339f60b7-5615-4bbf-a907-ec8daeb69158-public-tls-certs\") pod \"barbican-api-847f5f7dcd-6xz74\" (UID: \"339f60b7-5615-4bbf-a907-ec8daeb69158\") " pod="openstack/barbican-api-847f5f7dcd-6xz74" Mar 09 14:26:22 crc kubenswrapper[4722]: I0309 14:26:22.292763 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmcjf\" (UniqueName: \"kubernetes.io/projected/339f60b7-5615-4bbf-a907-ec8daeb69158-kube-api-access-vmcjf\") pod \"barbican-api-847f5f7dcd-6xz74\" (UID: \"339f60b7-5615-4bbf-a907-ec8daeb69158\") " pod="openstack/barbican-api-847f5f7dcd-6xz74" Mar 09 14:26:22 crc kubenswrapper[4722]: I0309 14:26:22.409004 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7ccbf8d8bb-gjl7f" event={"ID":"b46c2ebf-e484-41c4-9f12-392f46798dfd","Type":"ContainerStarted","Data":"637be467d87d37255f54b579526afaaae3e4805d01e96a08425994bea46e3063"} Mar 09 14:26:22 crc kubenswrapper[4722]: I0309 14:26:22.410332 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7ccbf8d8bb-gjl7f" Mar 09 14:26:22 crc kubenswrapper[4722]: I0309 14:26:22.410359 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7ccbf8d8bb-gjl7f" Mar 09 14:26:22 crc kubenswrapper[4722]: I0309 14:26:22.417257 4722 generic.go:334] "Generic (PLEG): container finished" podID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerID="2d686d3e92fab7cd0f339e5d57afd546181543a3a9585b91ecf278050136cecb" exitCode=0 Mar 09 14:26:22 crc kubenswrapper[4722]: I0309 14:26:22.418091 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" event={"ID":"dac2aaf5-653b-4b2a-8efe-ed26bac8d648","Type":"ContainerDied","Data":"2d686d3e92fab7cd0f339e5d57afd546181543a3a9585b91ecf278050136cecb"} Mar 09 14:26:22 crc kubenswrapper[4722]: I0309 14:26:22.418118 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" event={"ID":"dac2aaf5-653b-4b2a-8efe-ed26bac8d648","Type":"ContainerStarted","Data":"86a6dcdf7dcf4d48b8bdfb71d9a2c30b89f235700c424312c7e16580a86021d6"} Mar 09 14:26:22 crc kubenswrapper[4722]: I0309 14:26:22.418134 4722 scope.go:117] "RemoveContainer" containerID="61f8d88e021e1998adcf282dcc3a5969939b9f1d00069284614000e527956e5e" Mar 09 
14:26:22 crc kubenswrapper[4722]: I0309 14:26:22.440229 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-7ccbf8d8bb-gjl7f" podStartSLOduration=4.440194665 podStartE2EDuration="4.440194665s" podCreationTimestamp="2026-03-09 14:26:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:26:22.430946838 +0000 UTC m=+1422.986515414" watchObservedRunningTime="2026-03-09 14:26:22.440194665 +0000 UTC m=+1422.995763231" Mar 09 14:26:22 crc kubenswrapper[4722]: I0309 14:26:22.454179 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-847f5f7dcd-6xz74" Mar 09 14:26:23 crc kubenswrapper[4722]: I0309 14:26:23.434555 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6bc9487fbb-xx845" podUID="17ff98af-3f49-4e81-9d50-397d5eb8076c" containerName="barbican-api-log" containerID="cri-o://ee011b195beeb3631b51cd9fb39e35c666d28917fd9146c1d9bb5a34e1056cfd" gracePeriod=30 Mar 09 14:26:23 crc kubenswrapper[4722]: I0309 14:26:23.435100 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-6bc9487fbb-xx845" podUID="17ff98af-3f49-4e81-9d50-397d5eb8076c" containerName="barbican-api" containerID="cri-o://fc2ee5715c59ebb2f03ff918f277326dbac252d00fa038b71001e7df27396ee2" gracePeriod=30 Mar 09 14:26:23 crc kubenswrapper[4722]: I0309 14:26:23.785171 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 09 14:26:23 crc kubenswrapper[4722]: I0309 14:26:23.785346 4722 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 09 14:26:23 crc kubenswrapper[4722]: I0309 14:26:23.790008 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 09 14:26:23 crc kubenswrapper[4722]: I0309 14:26:23.796907 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 09 14:26:23 crc kubenswrapper[4722]: I0309 14:26:23.808103 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 09 14:26:24 crc kubenswrapper[4722]: I0309 14:26:24.470336 4722 generic.go:334] "Generic (PLEG): container finished" podID="17ff98af-3f49-4e81-9d50-397d5eb8076c" containerID="fc2ee5715c59ebb2f03ff918f277326dbac252d00fa038b71001e7df27396ee2" exitCode=0 Mar 09 14:26:24 crc kubenswrapper[4722]: I0309 14:26:24.470911 4722 generic.go:334] "Generic (PLEG): container finished" podID="17ff98af-3f49-4e81-9d50-397d5eb8076c" containerID="ee011b195beeb3631b51cd9fb39e35c666d28917fd9146c1d9bb5a34e1056cfd" exitCode=143 Mar 09 14:26:24 crc kubenswrapper[4722]: I0309 14:26:24.470981 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6bc9487fbb-xx845" event={"ID":"17ff98af-3f49-4e81-9d50-397d5eb8076c","Type":"ContainerDied","Data":"fc2ee5715c59ebb2f03ff918f277326dbac252d00fa038b71001e7df27396ee2"} Mar 09 14:26:24 crc kubenswrapper[4722]: I0309 14:26:24.471009 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6bc9487fbb-xx845" event={"ID":"17ff98af-3f49-4e81-9d50-397d5eb8076c","Type":"ContainerDied","Data":"ee011b195beeb3631b51cd9fb39e35c666d28917fd9146c1d9bb5a34e1056cfd"} Mar 09 14:26:24 crc kubenswrapper[4722]: I0309 14:26:24.998388 4722 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-6bc9487fbb-xx845" Mar 09 14:26:25 crc kubenswrapper[4722]: I0309 14:26:25.030686 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/17ff98af-3f49-4e81-9d50-397d5eb8076c-config-data-custom\") pod \"17ff98af-3f49-4e81-9d50-397d5eb8076c\" (UID: \"17ff98af-3f49-4e81-9d50-397d5eb8076c\") " Mar 09 14:26:25 crc kubenswrapper[4722]: I0309 14:26:25.030761 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17ff98af-3f49-4e81-9d50-397d5eb8076c-config-data\") pod \"17ff98af-3f49-4e81-9d50-397d5eb8076c\" (UID: \"17ff98af-3f49-4e81-9d50-397d5eb8076c\") " Mar 09 14:26:25 crc kubenswrapper[4722]: I0309 14:26:25.030825 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t659p\" (UniqueName: \"kubernetes.io/projected/17ff98af-3f49-4e81-9d50-397d5eb8076c-kube-api-access-t659p\") pod \"17ff98af-3f49-4e81-9d50-397d5eb8076c\" (UID: \"17ff98af-3f49-4e81-9d50-397d5eb8076c\") " Mar 09 14:26:25 crc kubenswrapper[4722]: I0309 14:26:25.030892 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17ff98af-3f49-4e81-9d50-397d5eb8076c-combined-ca-bundle\") pod \"17ff98af-3f49-4e81-9d50-397d5eb8076c\" (UID: \"17ff98af-3f49-4e81-9d50-397d5eb8076c\") " Mar 09 14:26:25 crc kubenswrapper[4722]: I0309 14:26:25.030944 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17ff98af-3f49-4e81-9d50-397d5eb8076c-logs\") pod \"17ff98af-3f49-4e81-9d50-397d5eb8076c\" (UID: \"17ff98af-3f49-4e81-9d50-397d5eb8076c\") " Mar 09 14:26:25 crc kubenswrapper[4722]: I0309 14:26:25.032436 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17ff98af-3f49-4e81-9d50-397d5eb8076c-logs" (OuterVolumeSpecName: "logs") pod "17ff98af-3f49-4e81-9d50-397d5eb8076c" (UID: "17ff98af-3f49-4e81-9d50-397d5eb8076c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:26:25 crc kubenswrapper[4722]: I0309 14:26:25.039325 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17ff98af-3f49-4e81-9d50-397d5eb8076c-kube-api-access-t659p" (OuterVolumeSpecName: "kube-api-access-t659p") pod "17ff98af-3f49-4e81-9d50-397d5eb8076c" (UID: "17ff98af-3f49-4e81-9d50-397d5eb8076c"). InnerVolumeSpecName "kube-api-access-t659p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:26:25 crc kubenswrapper[4722]: I0309 14:26:25.042489 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17ff98af-3f49-4e81-9d50-397d5eb8076c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "17ff98af-3f49-4e81-9d50-397d5eb8076c" (UID: "17ff98af-3f49-4e81-9d50-397d5eb8076c"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:26:25 crc kubenswrapper[4722]: I0309 14:26:25.114174 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17ff98af-3f49-4e81-9d50-397d5eb8076c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "17ff98af-3f49-4e81-9d50-397d5eb8076c" (UID: "17ff98af-3f49-4e81-9d50-397d5eb8076c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:26:25 crc kubenswrapper[4722]: I0309 14:26:25.132567 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17ff98af-3f49-4e81-9d50-397d5eb8076c-config-data" (OuterVolumeSpecName: "config-data") pod "17ff98af-3f49-4e81-9d50-397d5eb8076c" (UID: "17ff98af-3f49-4e81-9d50-397d5eb8076c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:26:25 crc kubenswrapper[4722]: I0309 14:26:25.133708 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17ff98af-3f49-4e81-9d50-397d5eb8076c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:26:25 crc kubenswrapper[4722]: I0309 14:26:25.133734 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/17ff98af-3f49-4e81-9d50-397d5eb8076c-logs\") on node \"crc\" DevicePath \"\"" Mar 09 14:26:25 crc kubenswrapper[4722]: I0309 14:26:25.133744 4722 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/17ff98af-3f49-4e81-9d50-397d5eb8076c-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 09 14:26:25 crc kubenswrapper[4722]: I0309 14:26:25.133752 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17ff98af-3f49-4e81-9d50-397d5eb8076c-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 14:26:25 crc kubenswrapper[4722]: I0309 14:26:25.133760 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t659p\" (UniqueName: \"kubernetes.io/projected/17ff98af-3f49-4e81-9d50-397d5eb8076c-kube-api-access-t659p\") on node \"crc\" DevicePath \"\"" Mar 09 14:26:25 crc kubenswrapper[4722]: I0309 14:26:25.200092 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-847f5f7dcd-6xz74"] Mar 09 14:26:25 crc kubenswrapper[4722]: I0309 14:26:25.490009 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-6bc9487fbb-xx845" Mar 09 14:26:25 crc kubenswrapper[4722]: I0309 14:26:25.489993 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-6bc9487fbb-xx845" event={"ID":"17ff98af-3f49-4e81-9d50-397d5eb8076c","Type":"ContainerDied","Data":"d055441ca146b9872a1a5e0b25dde1daaf5a8774c00142e39b502801b6ea9946"} Mar 09 14:26:25 crc kubenswrapper[4722]: I0309 14:26:25.490406 4722 scope.go:117] "RemoveContainer" containerID="fc2ee5715c59ebb2f03ff918f277326dbac252d00fa038b71001e7df27396ee2" Mar 09 14:26:25 crc kubenswrapper[4722]: I0309 14:26:25.496430 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5c8c9c48fd-dnbhs" event={"ID":"612e965f-d243-486c-90a5-e4c867ef6fd5","Type":"ContainerStarted","Data":"6dde062e8ce610cb0bdb6a4226f1f78c5f82f7c85f6beca31addc1dc3da927db"} Mar 09 14:26:25 crc kubenswrapper[4722]: I0309 14:26:25.496472 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5c8c9c48fd-dnbhs" event={"ID":"612e965f-d243-486c-90a5-e4c867ef6fd5","Type":"ContainerStarted","Data":"97635b1f131b478e46e2cef9cdfb974ee5b24a041d75c8647beb11a9af6f838c"} Mar 09 14:26:25 crc kubenswrapper[4722]: I0309 14:26:25.505616 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-57655d7975-hskdb" event={"ID":"c3b18ae1-c2c9-4454-a591-53ce06064d82","Type":"ContainerStarted","Data":"c6864d1bed9c306653a0b503bd6991b82fa3f33a3e4d35aa629f8fb746dfd504"} Mar 09 14:26:25 crc kubenswrapper[4722]: I0309 14:26:25.505653 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-57655d7975-hskdb" event={"ID":"c3b18ae1-c2c9-4454-a591-53ce06064d82","Type":"ContainerStarted","Data":"41e74f0bf251b86c307d87608970a4aa9f3d425bfc90e218b49c8fca85813b49"} Mar 09 14:26:25 crc kubenswrapper[4722]: I0309 14:26:25.511386 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5576b4c89f-ddl4q" event={"ID":"7ff2eef6-823d-496e-b64d-abb692d53b42","Type":"ContainerStarted","Data":"adfdb4ce1e596f20182b65b21b8dca335c56fc5e47968d432bf0cddcbf9a717a"} Mar 09 14:26:25 crc kubenswrapper[4722]: I0309 14:26:25.511421 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5576b4c89f-ddl4q" event={"ID":"7ff2eef6-823d-496e-b64d-abb692d53b42","Type":"ContainerStarted","Data":"855c35895bda1cf0c05ef80afb78786f8433bb293154d2bd147b2b894eb3f9e9"} Mar 09 14:26:25 crc kubenswrapper[4722]: I0309 14:26:25.523631 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-5c8c9c48fd-dnbhs" podStartSLOduration=3.68431819 podStartE2EDuration="8.523613105s" podCreationTimestamp="2026-03-09 14:26:17 +0000 UTC" firstStartedPulling="2026-03-09 14:26:19.734328397 +0000 UTC m=+1420.289896973" lastFinishedPulling="2026-03-09 14:26:24.573623312 +0000 UTC m=+1425.129191888" observedRunningTime="2026-03-09 14:26:25.512158968 +0000 UTC m=+1426.067727544" watchObservedRunningTime="2026-03-09 14:26:25.523613105 +0000 UTC m=+1426.079181681" Mar 09 14:26:25 crc kubenswrapper[4722]: I0309 14:26:25.550818 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-847f5f7dcd-6xz74" event={"ID":"339f60b7-5615-4bbf-a907-ec8daeb69158","Type":"ContainerStarted","Data":"1c5ee0d3a135c5b3092d1527d18b6cc66ce7ae0257692d23cc266d4adfbb9a33"} Mar 09 14:26:25 crc kubenswrapper[4722]: I0309 14:26:25.554975 4722 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-57655d7975-hskdb" podStartSLOduration=2.910679779 podStartE2EDuration="8.554961362s" podCreationTimestamp="2026-03-09 14:26:17 +0000 UTC" firstStartedPulling="2026-03-09 14:26:18.930314576 +0000 UTC m=+1419.485883152" lastFinishedPulling="2026-03-09 14:26:24.574596169 +0000 UTC m=+1425.130164735" observedRunningTime="2026-03-09 14:26:25.53172022 +0000 UTC m=+1426.087288796" watchObservedRunningTime="2026-03-09 14:26:25.554961362 +0000 UTC m=+1426.110529938" Mar 09 14:26:25 crc kubenswrapper[4722]: I0309 14:26:25.562639 4722 scope.go:117] "RemoveContainer" containerID="ee011b195beeb3631b51cd9fb39e35c666d28917fd9146c1d9bb5a34e1056cfd" Mar 09 14:26:25 crc kubenswrapper[4722]: I0309 14:26:25.582522 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-858d6f6fd6-wqzdg"] Mar 09 14:26:25 crc kubenswrapper[4722]: I0309 14:26:25.590560 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-5576b4c89f-ddl4q" podStartSLOduration=3.715977466 podStartE2EDuration="8.590536498s" podCreationTimestamp="2026-03-09 14:26:17 +0000 UTC" firstStartedPulling="2026-03-09 14:26:19.741445674 +0000 UTC m=+1420.297014250" lastFinishedPulling="2026-03-09 14:26:24.616004706 +0000 UTC m=+1425.171573282" observedRunningTime="2026-03-09 14:26:25.556783284 +0000 UTC m=+1426.112351860" watchObservedRunningTime="2026-03-09 14:26:25.590536498 +0000 UTC m=+1426.146105064" Mar 09 14:26:25 crc kubenswrapper[4722]: I0309 14:26:25.601603 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-858d6f6fd6-wqzdg" event={"ID":"1cc74ab6-6c85-4da3-8a79-3af240adb999","Type":"ContainerStarted","Data":"2ea6220f7559189dfe1a672a930e7f578f963c2059ccb638d6facdeba1269b53"} Mar 09 14:26:25 crc kubenswrapper[4722]: I0309 14:26:25.601651 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-858d6f6fd6-wqzdg" event={"ID":"1cc74ab6-6c85-4da3-8a79-3af240adb999","Type":"ContainerStarted","Data":"bb6a4cf444c858712ca2adabb6d43167f4c418dc0decc8bcd31c1edce0e7fe88"} Mar 09 14:26:25 crc kubenswrapper[4722]: I0309 14:26:25.608404 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-6bc9487fbb-xx845"] Mar 09 14:26:25 crc kubenswrapper[4722]: I0309 14:26:25.618655 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-6bc9487fbb-xx845"] Mar 09 14:26:25 crc kubenswrapper[4722]: I0309 14:26:25.634964 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-57655d7975-hskdb"] Mar 09 14:26:25 crc kubenswrapper[4722]: I0309 14:26:25.636629 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-858d6f6fd6-wqzdg" podStartSLOduration=3.314871291 podStartE2EDuration="8.636609294s" podCreationTimestamp="2026-03-09 14:26:17 +0000 UTC" firstStartedPulling="2026-03-09 14:26:19.217095097 +0000 UTC m=+1419.772663673" lastFinishedPulling="2026-03-09 14:26:24.5388331 +0000 UTC m=+1425.094401676" observedRunningTime="2026-03-09 14:26:25.618621996 +0000 UTC m=+1426.174190582" watchObservedRunningTime="2026-03-09 14:26:25.636609294 +0000 UTC m=+1426.192177870" Mar 09 14:26:26 crc kubenswrapper[4722]: I0309 14:26:26.162412 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17ff98af-3f49-4e81-9d50-397d5eb8076c" 
path="/var/lib/kubelet/pods/17ff98af-3f49-4e81-9d50-397d5eb8076c/volumes" Mar 09 14:26:26 crc kubenswrapper[4722]: I0309 14:26:26.616031 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-847f5f7dcd-6xz74" event={"ID":"339f60b7-5615-4bbf-a907-ec8daeb69158","Type":"ContainerStarted","Data":"58683afc357301835b764f6eaddf9e02a15d063c937e85f9195fc805a4b05c7c"} Mar 09 14:26:26 crc kubenswrapper[4722]: I0309 14:26:26.616430 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-847f5f7dcd-6xz74" event={"ID":"339f60b7-5615-4bbf-a907-ec8daeb69158","Type":"ContainerStarted","Data":"396e81d627f88608154fd7a0fe82adf2388bc8e399c5bcad261013c3b69d85d1"} Mar 09 14:26:26 crc kubenswrapper[4722]: I0309 14:26:26.616454 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-847f5f7dcd-6xz74" Mar 09 14:26:26 crc kubenswrapper[4722]: I0309 14:26:26.619579 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-858d6f6fd6-wqzdg" podUID="1cc74ab6-6c85-4da3-8a79-3af240adb999" containerName="barbican-keystone-listener-log" containerID="cri-o://bb6a4cf444c858712ca2adabb6d43167f4c418dc0decc8bcd31c1edce0e7fe88" gracePeriod=30 Mar 09 14:26:26 crc kubenswrapper[4722]: I0309 14:26:26.619697 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-858d6f6fd6-wqzdg" podUID="1cc74ab6-6c85-4da3-8a79-3af240adb999" containerName="barbican-keystone-listener" containerID="cri-o://2ea6220f7559189dfe1a672a930e7f578f963c2059ccb638d6facdeba1269b53" gracePeriod=30 Mar 09 14:26:26 crc kubenswrapper[4722]: I0309 14:26:26.661810 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-847f5f7dcd-6xz74" podStartSLOduration=5.6617903179999995 podStartE2EDuration="5.661790318s" podCreationTimestamp="2026-03-09 14:26:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:26:26.644349464 +0000 UTC m=+1427.199918040" watchObservedRunningTime="2026-03-09 14:26:26.661790318 +0000 UTC m=+1427.217358894" Mar 09 14:26:27 crc kubenswrapper[4722]: I0309 14:26:27.455054 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-847f5f7dcd-6xz74" Mar 09 14:26:27 crc kubenswrapper[4722]: I0309 14:26:27.640644 4722 generic.go:334] "Generic (PLEG): container finished" podID="1cc74ab6-6c85-4da3-8a79-3af240adb999" containerID="bb6a4cf444c858712ca2adabb6d43167f4c418dc0decc8bcd31c1edce0e7fe88" exitCode=143 Mar 09 14:26:27 crc kubenswrapper[4722]: I0309 14:26:27.641225 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-858d6f6fd6-wqzdg" event={"ID":"1cc74ab6-6c85-4da3-8a79-3af240adb999","Type":"ContainerDied","Data":"bb6a4cf444c858712ca2adabb6d43167f4c418dc0decc8bcd31c1edce0e7fe88"} Mar 09 14:26:27 crc kubenswrapper[4722]: I0309 14:26:27.641428 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-57655d7975-hskdb" podUID="c3b18ae1-c2c9-4454-a591-53ce06064d82" containerName="barbican-worker-log" containerID="cri-o://41e74f0bf251b86c307d87608970a4aa9f3d425bfc90e218b49c8fca85813b49" gracePeriod=30 Mar 09 14:26:27 crc kubenswrapper[4722]: I0309 14:26:27.641819 4722 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/barbican-worker-57655d7975-hskdb" podUID="c3b18ae1-c2c9-4454-a591-53ce06064d82" containerName="barbican-worker" containerID="cri-o://c6864d1bed9c306653a0b503bd6991b82fa3f33a3e4d35aa629f8fb746dfd504" gracePeriod=30 Mar 09 14:26:27 crc kubenswrapper[4722]: I0309 14:26:27.796389 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-85ff748b95-k92sh" Mar 09 14:26:27 crc kubenswrapper[4722]: I0309 14:26:27.869355 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-fbs5z"] Mar 09 14:26:27 crc kubenswrapper[4722]: I0309 14:26:27.869575 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55f844cf75-fbs5z" podUID="279a10b1-27b8-49d5-8f22-2e01c3db5a04" containerName="dnsmasq-dns" containerID="cri-o://358f5f147e39647eee3fd33e0437b25e2c7aff3825809a733605042d2612b38c" gracePeriod=10 Mar 09 14:26:28 crc kubenswrapper[4722]: I0309 14:26:28.278644 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-55f844cf75-fbs5z" podUID="279a10b1-27b8-49d5-8f22-2e01c3db5a04" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.198:5353: connect: connection refused" Mar 09 14:26:28 crc kubenswrapper[4722]: I0309 14:26:28.652195 4722 generic.go:334] "Generic (PLEG): container finished" podID="279a10b1-27b8-49d5-8f22-2e01c3db5a04" containerID="358f5f147e39647eee3fd33e0437b25e2c7aff3825809a733605042d2612b38c" exitCode=0 Mar 09 14:26:28 crc kubenswrapper[4722]: I0309 14:26:28.652306 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-fbs5z" event={"ID":"279a10b1-27b8-49d5-8f22-2e01c3db5a04","Type":"ContainerDied","Data":"358f5f147e39647eee3fd33e0437b25e2c7aff3825809a733605042d2612b38c"} Mar 09 14:26:28 crc kubenswrapper[4722]: I0309 14:26:28.653780 4722 generic.go:334] "Generic (PLEG): container finished" podID="c3b18ae1-c2c9-4454-a591-53ce06064d82" containerID="41e74f0bf251b86c307d87608970a4aa9f3d425bfc90e218b49c8fca85813b49" exitCode=143 Mar 09 14:26:28 crc kubenswrapper[4722]: I0309 14:26:28.653850 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-57655d7975-hskdb" event={"ID":"c3b18ae1-c2c9-4454-a591-53ce06064d82","Type":"ContainerDied","Data":"41e74f0bf251b86c307d87608970a4aa9f3d425bfc90e218b49c8fca85813b49"} Mar 09 14:26:28 crc kubenswrapper[4722]: I0309 14:26:28.655413 4722 generic.go:334] "Generic (PLEG): container finished" podID="7f51218c-6b15-4f4a-ad49-1ba0ccd5e292" containerID="7774a1b7beb6709cc7100d6b0e05365cd9498f94cb28bd5282e6c7b3a858a60d" exitCode=0 Mar 09 14:26:28 crc kubenswrapper[4722]: I0309 14:26:28.655654 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-j4lgt" event={"ID":"7f51218c-6b15-4f4a-ad49-1ba0ccd5e292","Type":"ContainerDied","Data":"7774a1b7beb6709cc7100d6b0e05365cd9498f94cb28bd5282e6c7b3a858a60d"} Mar 09 14:26:29 crc kubenswrapper[4722]: I0309 14:26:29.719159 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-cb7fc4c44-jrv9s" Mar 09 14:26:29 crc kubenswrapper[4722]: I0309 14:26:29.793573 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-cb7fc4c44-jrv9s" Mar 09 14:26:31 crc kubenswrapper[4722]: I0309 14:26:31.719795 4722 generic.go:334] "Generic (PLEG): container finished" podID="5035bd54-0aaa-4ff3-b90a-6145145fe95c" 
containerID="2360808bc57a87b6b2fa7f5fc11ab102736e19dbbb32712fc1d07735f9404fa8" exitCode=0 Mar 09 14:26:31 crc kubenswrapper[4722]: I0309 14:26:31.719850 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-fzj6s" event={"ID":"5035bd54-0aaa-4ff3-b90a-6145145fe95c","Type":"ContainerDied","Data":"2360808bc57a87b6b2fa7f5fc11ab102736e19dbbb32712fc1d07735f9404fa8"} Mar 09 14:26:32 crc kubenswrapper[4722]: I0309 14:26:32.138488 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-j4lgt" Mar 09 14:26:32 crc kubenswrapper[4722]: I0309 14:26:32.231569 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f51218c-6b15-4f4a-ad49-1ba0ccd5e292-config-data\") pod \"7f51218c-6b15-4f4a-ad49-1ba0ccd5e292\" (UID: \"7f51218c-6b15-4f4a-ad49-1ba0ccd5e292\") " Mar 09 14:26:32 crc kubenswrapper[4722]: I0309 14:26:32.231697 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f51218c-6b15-4f4a-ad49-1ba0ccd5e292-combined-ca-bundle\") pod \"7f51218c-6b15-4f4a-ad49-1ba0ccd5e292\" (UID: \"7f51218c-6b15-4f4a-ad49-1ba0ccd5e292\") " Mar 09 14:26:32 crc kubenswrapper[4722]: I0309 14:26:32.231726 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tw8df\" (UniqueName: \"kubernetes.io/projected/7f51218c-6b15-4f4a-ad49-1ba0ccd5e292-kube-api-access-tw8df\") pod \"7f51218c-6b15-4f4a-ad49-1ba0ccd5e292\" (UID: \"7f51218c-6b15-4f4a-ad49-1ba0ccd5e292\") " Mar 09 14:26:32 crc kubenswrapper[4722]: I0309 14:26:32.239611 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f51218c-6b15-4f4a-ad49-1ba0ccd5e292-kube-api-access-tw8df" (OuterVolumeSpecName: "kube-api-access-tw8df") pod "7f51218c-6b15-4f4a-ad49-1ba0ccd5e292" (UID: "7f51218c-6b15-4f4a-ad49-1ba0ccd5e292"). InnerVolumeSpecName "kube-api-access-tw8df". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:26:32 crc kubenswrapper[4722]: I0309 14:26:32.302375 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f51218c-6b15-4f4a-ad49-1ba0ccd5e292-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7f51218c-6b15-4f4a-ad49-1ba0ccd5e292" (UID: "7f51218c-6b15-4f4a-ad49-1ba0ccd5e292"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:26:32 crc kubenswrapper[4722]: I0309 14:26:32.335172 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tw8df\" (UniqueName: \"kubernetes.io/projected/7f51218c-6b15-4f4a-ad49-1ba0ccd5e292-kube-api-access-tw8df\") on node \"crc\" DevicePath \"\"" Mar 09 14:26:32 crc kubenswrapper[4722]: I0309 14:26:32.335223 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f51218c-6b15-4f4a-ad49-1ba0ccd5e292-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:26:32 crc kubenswrapper[4722]: I0309 14:26:32.343703 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f51218c-6b15-4f4a-ad49-1ba0ccd5e292-config-data" (OuterVolumeSpecName: "config-data") pod "7f51218c-6b15-4f4a-ad49-1ba0ccd5e292" (UID: "7f51218c-6b15-4f4a-ad49-1ba0ccd5e292"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:26:32 crc kubenswrapper[4722]: I0309 14:26:32.437694 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f51218c-6b15-4f4a-ad49-1ba0ccd5e292-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 14:26:32 crc kubenswrapper[4722]: I0309 14:26:32.741785 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-j4lgt" event={"ID":"7f51218c-6b15-4f4a-ad49-1ba0ccd5e292","Type":"ContainerDied","Data":"766d7400c3f1f6e3f5e39cbc7a00cea245bd30221eb791c1d7edeee72536af50"} Mar 09 14:26:32 crc kubenswrapper[4722]: I0309 14:26:32.741839 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="766d7400c3f1f6e3f5e39cbc7a00cea245bd30221eb791c1d7edeee72536af50" Mar 09 14:26:32 crc kubenswrapper[4722]: I0309 14:26:32.742915 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-j4lgt" Mar 09 14:26:33 crc kubenswrapper[4722]: I0309 14:26:33.442038 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6f5cbf5964-pmtn7" Mar 09 14:26:33 crc kubenswrapper[4722]: I0309 14:26:33.712530 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5487cbd8d5-crfpm"] Mar 09 14:26:33 crc kubenswrapper[4722]: I0309 14:26:33.713137 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5487cbd8d5-crfpm" podUID="a53b24c6-01ac-48c0-8c62-f27e8309de23" containerName="neutron-api" containerID="cri-o://d3ba4abf6a56fdb679f45127646186f63a7a511208e8f8b22d325ba9f4f9c918" gracePeriod=30 Mar 09 14:26:33 crc kubenswrapper[4722]: I0309 14:26:33.713367 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5487cbd8d5-crfpm" podUID="a53b24c6-01ac-48c0-8c62-f27e8309de23" containerName="neutron-httpd" containerID="cri-o://3e86f1b2140a6860ab063b2ed7b54181844d7003819352ffb7cdf3e8f395b8a7" gracePeriod=30 Mar 09 14:26:33 crc kubenswrapper[4722]: I0309 14:26:33.735730 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-5487cbd8d5-crfpm" podUID="a53b24c6-01ac-48c0-8c62-f27e8309de23" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.200:9696/\": EOF" Mar 09 14:26:33 crc kubenswrapper[4722]: I0309 14:26:33.818632 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6f9b466589-25vjm"] Mar 09 14:26:33 crc kubenswrapper[4722]: E0309 14:26:33.819329 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17ff98af-3f49-4e81-9d50-397d5eb8076c" containerName="barbican-api-log" Mar 09 14:26:33 crc kubenswrapper[4722]: I0309 14:26:33.819345 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="17ff98af-3f49-4e81-9d50-397d5eb8076c" containerName="barbican-api-log" Mar 09 14:26:33 crc kubenswrapper[4722]: E0309 14:26:33.819373 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17ff98af-3f49-4e81-9d50-397d5eb8076c" containerName="barbican-api" Mar 09 14:26:33 crc kubenswrapper[4722]: I0309 14:26:33.819379 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="17ff98af-3f49-4e81-9d50-397d5eb8076c" containerName="barbican-api" Mar 09 14:26:33 crc kubenswrapper[4722]: E0309 14:26:33.819407 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f51218c-6b15-4f4a-ad49-1ba0ccd5e292" containerName="heat-db-sync" Mar 09 14:26:33 crc 
kubenswrapper[4722]: I0309 14:26:33.819415 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f51218c-6b15-4f4a-ad49-1ba0ccd5e292" containerName="heat-db-sync" Mar 09 14:26:33 crc kubenswrapper[4722]: I0309 14:26:33.819655 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="17ff98af-3f49-4e81-9d50-397d5eb8076c" containerName="barbican-api" Mar 09 14:26:33 crc kubenswrapper[4722]: I0309 14:26:33.819699 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="17ff98af-3f49-4e81-9d50-397d5eb8076c" containerName="barbican-api-log" Mar 09 14:26:33 crc kubenswrapper[4722]: I0309 14:26:33.819710 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f51218c-6b15-4f4a-ad49-1ba0ccd5e292" containerName="heat-db-sync" Mar 09 14:26:33 crc kubenswrapper[4722]: I0309 14:26:33.821239 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6f9b466589-25vjm" Mar 09 14:26:33 crc kubenswrapper[4722]: I0309 14:26:33.840012 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6f9b466589-25vjm"] Mar 09 14:26:33 crc kubenswrapper[4722]: I0309 14:26:33.877781 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/36f6d192-80a4-427c-8869-643481617222-public-tls-certs\") pod \"neutron-6f9b466589-25vjm\" (UID: \"36f6d192-80a4-427c-8869-643481617222\") " pod="openstack/neutron-6f9b466589-25vjm" Mar 09 14:26:33 crc kubenswrapper[4722]: I0309 14:26:33.877829 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/36f6d192-80a4-427c-8869-643481617222-httpd-config\") pod \"neutron-6f9b466589-25vjm\" (UID: \"36f6d192-80a4-427c-8869-643481617222\") " pod="openstack/neutron-6f9b466589-25vjm" Mar 09 14:26:33 crc kubenswrapper[4722]: I0309 14:26:33.877908 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/36f6d192-80a4-427c-8869-643481617222-internal-tls-certs\") pod \"neutron-6f9b466589-25vjm\" (UID: \"36f6d192-80a4-427c-8869-643481617222\") " pod="openstack/neutron-6f9b466589-25vjm" Mar 09 14:26:33 crc kubenswrapper[4722]: I0309 14:26:33.877933 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/36f6d192-80a4-427c-8869-643481617222-ovndb-tls-certs\") pod \"neutron-6f9b466589-25vjm\" (UID: \"36f6d192-80a4-427c-8869-643481617222\") " pod="openstack/neutron-6f9b466589-25vjm" Mar 09 14:26:33 crc kubenswrapper[4722]: I0309 14:26:33.877953 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58ckc\" (UniqueName: \"kubernetes.io/projected/36f6d192-80a4-427c-8869-643481617222-kube-api-access-58ckc\") pod \"neutron-6f9b466589-25vjm\" (UID: \"36f6d192-80a4-427c-8869-643481617222\") " pod="openstack/neutron-6f9b466589-25vjm" Mar 09 14:26:33 crc kubenswrapper[4722]: I0309 14:26:33.877991 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36f6d192-80a4-427c-8869-643481617222-combined-ca-bundle\") pod \"neutron-6f9b466589-25vjm\" (UID: \"36f6d192-80a4-427c-8869-643481617222\") " pod="openstack/neutron-6f9b466589-25vjm" Mar 09 14:26:33 crc 
Mar 09 14:26:33 crc kubenswrapper[4722]: I0309 14:26:33.878031 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/36f6d192-80a4-427c-8869-643481617222-config\") pod \"neutron-6f9b466589-25vjm\" (UID: \"36f6d192-80a4-427c-8869-643481617222\") " pod="openstack/neutron-6f9b466589-25vjm"
Mar 09 14:26:33 crc kubenswrapper[4722]: I0309 14:26:33.993343 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36f6d192-80a4-427c-8869-643481617222-combined-ca-bundle\") pod \"neutron-6f9b466589-25vjm\" (UID: \"36f6d192-80a4-427c-8869-643481617222\") " pod="openstack/neutron-6f9b466589-25vjm"
Mar 09 14:26:33 crc kubenswrapper[4722]: I0309 14:26:33.995466 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/36f6d192-80a4-427c-8869-643481617222-config\") pod \"neutron-6f9b466589-25vjm\" (UID: \"36f6d192-80a4-427c-8869-643481617222\") " pod="openstack/neutron-6f9b466589-25vjm"
Mar 09 14:26:33 crc kubenswrapper[4722]: I0309 14:26:33.995854 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/36f6d192-80a4-427c-8869-643481617222-public-tls-certs\") pod \"neutron-6f9b466589-25vjm\" (UID: \"36f6d192-80a4-427c-8869-643481617222\") " pod="openstack/neutron-6f9b466589-25vjm"
Mar 09 14:26:33 crc kubenswrapper[4722]: I0309 14:26:33.995972 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/36f6d192-80a4-427c-8869-643481617222-httpd-config\") pod \"neutron-6f9b466589-25vjm\" (UID: \"36f6d192-80a4-427c-8869-643481617222\") " pod="openstack/neutron-6f9b466589-25vjm"
Mar 09 14:26:33 crc kubenswrapper[4722]: I0309 14:26:33.996327 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/36f6d192-80a4-427c-8869-643481617222-internal-tls-certs\") pod \"neutron-6f9b466589-25vjm\" (UID: \"36f6d192-80a4-427c-8869-643481617222\") " pod="openstack/neutron-6f9b466589-25vjm"
Mar 09 14:26:33 crc kubenswrapper[4722]: I0309 14:26:33.996478 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/36f6d192-80a4-427c-8869-643481617222-ovndb-tls-certs\") pod \"neutron-6f9b466589-25vjm\" (UID: \"36f6d192-80a4-427c-8869-643481617222\") " pod="openstack/neutron-6f9b466589-25vjm"
Mar 09 14:26:33 crc kubenswrapper[4722]: I0309 14:26:33.997056 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58ckc\" (UniqueName: \"kubernetes.io/projected/36f6d192-80a4-427c-8869-643481617222-kube-api-access-58ckc\") pod \"neutron-6f9b466589-25vjm\" (UID: \"36f6d192-80a4-427c-8869-643481617222\") " pod="openstack/neutron-6f9b466589-25vjm"
Mar 09 14:26:34 crc kubenswrapper[4722]: I0309 14:26:34.025549 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/36f6d192-80a4-427c-8869-643481617222-public-tls-certs\") pod \"neutron-6f9b466589-25vjm\" (UID: \"36f6d192-80a4-427c-8869-643481617222\") " pod="openstack/neutron-6f9b466589-25vjm"
Mar 09 14:26:34 crc kubenswrapper[4722]: I0309 14:26:34.025639 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/36f6d192-80a4-427c-8869-643481617222-config\") pod \"neutron-6f9b466589-25vjm\" (UID: \"36f6d192-80a4-427c-8869-643481617222\") " pod="openstack/neutron-6f9b466589-25vjm"
Mar 09 14:26:34 crc kubenswrapper[4722]: I0309 14:26:34.031362 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/36f6d192-80a4-427c-8869-643481617222-httpd-config\") pod \"neutron-6f9b466589-25vjm\" (UID: \"36f6d192-80a4-427c-8869-643481617222\") " pod="openstack/neutron-6f9b466589-25vjm"
Mar 09 14:26:34 crc kubenswrapper[4722]: I0309 14:26:34.032144 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/36f6d192-80a4-427c-8869-643481617222-ovndb-tls-certs\") pod \"neutron-6f9b466589-25vjm\" (UID: \"36f6d192-80a4-427c-8869-643481617222\") " pod="openstack/neutron-6f9b466589-25vjm"
Mar 09 14:26:34 crc kubenswrapper[4722]: I0309 14:26:34.032333 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/36f6d192-80a4-427c-8869-643481617222-internal-tls-certs\") pod \"neutron-6f9b466589-25vjm\" (UID: \"36f6d192-80a4-427c-8869-643481617222\") " pod="openstack/neutron-6f9b466589-25vjm"
Mar 09 14:26:34 crc kubenswrapper[4722]: I0309 14:26:34.035609 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58ckc\" (UniqueName: \"kubernetes.io/projected/36f6d192-80a4-427c-8869-643481617222-kube-api-access-58ckc\") pod \"neutron-6f9b466589-25vjm\" (UID: \"36f6d192-80a4-427c-8869-643481617222\") " pod="openstack/neutron-6f9b466589-25vjm"
Mar 09 14:26:34 crc kubenswrapper[4722]: I0309 14:26:34.039982 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36f6d192-80a4-427c-8869-643481617222-combined-ca-bundle\") pod \"neutron-6f9b466589-25vjm\" (UID: \"36f6d192-80a4-427c-8869-643481617222\") " pod="openstack/neutron-6f9b466589-25vjm"
Need to start a new one" pod="openstack/neutron-6f9b466589-25vjm" Mar 09 14:26:34 crc kubenswrapper[4722]: I0309 14:26:34.161210 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-847f5f7dcd-6xz74" Mar 09 14:26:34 crc kubenswrapper[4722]: I0309 14:26:34.298559 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-847f5f7dcd-6xz74" Mar 09 14:26:34 crc kubenswrapper[4722]: I0309 14:26:34.373746 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-cb7fc4c44-jrv9s"] Mar 09 14:26:34 crc kubenswrapper[4722]: I0309 14:26:34.374112 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-cb7fc4c44-jrv9s" podUID="8066fa12-e1a3-483a-8c94-e7a59cc2f1d9" containerName="barbican-api" containerID="cri-o://55247a0ac598f6b1b4061095ab5246e6b6508d186c1cd181cb3e595bf543cc3e" gracePeriod=30 Mar 09 14:26:34 crc kubenswrapper[4722]: I0309 14:26:34.374260 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-cb7fc4c44-jrv9s" podUID="8066fa12-e1a3-483a-8c94-e7a59cc2f1d9" containerName="barbican-api-log" containerID="cri-o://6be84d49059047100547341362493db59e9c0a5393e70a8126117748e190932e" gracePeriod=30 Mar 09 14:26:34 crc kubenswrapper[4722]: I0309 14:26:34.387564 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-cb7fc4c44-jrv9s" podUID="8066fa12-e1a3-483a-8c94-e7a59cc2f1d9" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.206:9311/healthcheck\": EOF" Mar 09 14:26:34 crc kubenswrapper[4722]: I0309 14:26:34.740256 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-fbs5z" Mar 09 14:26:34 crc kubenswrapper[4722]: I0309 14:26:34.812001 4722 generic.go:334] "Generic (PLEG): container finished" podID="8066fa12-e1a3-483a-8c94-e7a59cc2f1d9" containerID="6be84d49059047100547341362493db59e9c0a5393e70a8126117748e190932e" exitCode=143 Mar 09 14:26:34 crc kubenswrapper[4722]: I0309 14:26:34.812082 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-cb7fc4c44-jrv9s" event={"ID":"8066fa12-e1a3-483a-8c94-e7a59cc2f1d9","Type":"ContainerDied","Data":"6be84d49059047100547341362493db59e9c0a5393e70a8126117748e190932e"} Mar 09 14:26:34 crc kubenswrapper[4722]: I0309 14:26:34.819041 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/279a10b1-27b8-49d5-8f22-2e01c3db5a04-config\") pod \"279a10b1-27b8-49d5-8f22-2e01c3db5a04\" (UID: \"279a10b1-27b8-49d5-8f22-2e01c3db5a04\") " Mar 09 14:26:34 crc kubenswrapper[4722]: I0309 14:26:34.819191 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/279a10b1-27b8-49d5-8f22-2e01c3db5a04-ovsdbserver-nb\") pod \"279a10b1-27b8-49d5-8f22-2e01c3db5a04\" (UID: \"279a10b1-27b8-49d5-8f22-2e01c3db5a04\") " Mar 09 14:26:34 crc kubenswrapper[4722]: I0309 14:26:34.819388 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/279a10b1-27b8-49d5-8f22-2e01c3db5a04-dns-svc\") pod \"279a10b1-27b8-49d5-8f22-2e01c3db5a04\" (UID: \"279a10b1-27b8-49d5-8f22-2e01c3db5a04\") " Mar 09 14:26:34 crc kubenswrapper[4722]: I0309 14:26:34.819443 4722 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-f7hpk\" (UniqueName: \"kubernetes.io/projected/279a10b1-27b8-49d5-8f22-2e01c3db5a04-kube-api-access-f7hpk\") pod \"279a10b1-27b8-49d5-8f22-2e01c3db5a04\" (UID: \"279a10b1-27b8-49d5-8f22-2e01c3db5a04\") " Mar 09 14:26:34 crc kubenswrapper[4722]: I0309 14:26:34.819474 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/279a10b1-27b8-49d5-8f22-2e01c3db5a04-ovsdbserver-sb\") pod \"279a10b1-27b8-49d5-8f22-2e01c3db5a04\" (UID: \"279a10b1-27b8-49d5-8f22-2e01c3db5a04\") " Mar 09 14:26:34 crc kubenswrapper[4722]: I0309 14:26:34.819544 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/279a10b1-27b8-49d5-8f22-2e01c3db5a04-dns-swift-storage-0\") pod \"279a10b1-27b8-49d5-8f22-2e01c3db5a04\" (UID: \"279a10b1-27b8-49d5-8f22-2e01c3db5a04\") " Mar 09 14:26:34 crc kubenswrapper[4722]: I0309 14:26:34.823820 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/279a10b1-27b8-49d5-8f22-2e01c3db5a04-kube-api-access-f7hpk" (OuterVolumeSpecName: "kube-api-access-f7hpk") pod "279a10b1-27b8-49d5-8f22-2e01c3db5a04" (UID: "279a10b1-27b8-49d5-8f22-2e01c3db5a04"). InnerVolumeSpecName "kube-api-access-f7hpk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:26:34 crc kubenswrapper[4722]: I0309 14:26:34.835463 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-fzj6s" event={"ID":"5035bd54-0aaa-4ff3-b90a-6145145fe95c","Type":"ContainerDied","Data":"96928eacf917034a694476bc33209ba61518a7241f7699fec5544d7ef3781c9a"} Mar 09 14:26:34 crc kubenswrapper[4722]: I0309 14:26:34.835504 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="96928eacf917034a694476bc33209ba61518a7241f7699fec5544d7ef3781c9a" Mar 09 14:26:34 crc kubenswrapper[4722]: I0309 14:26:34.851168 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-fzj6s" Mar 09 14:26:34 crc kubenswrapper[4722]: I0309 14:26:34.852457 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-fbs5z" Mar 09 14:26:34 crc kubenswrapper[4722]: I0309 14:26:34.852833 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-fbs5z" event={"ID":"279a10b1-27b8-49d5-8f22-2e01c3db5a04","Type":"ContainerDied","Data":"d53dbc0076eafbefbd5833a18d097cd091aeb9a7ae879c460d01486740d6795c"} Mar 09 14:26:34 crc kubenswrapper[4722]: I0309 14:26:34.852868 4722 scope.go:117] "RemoveContainer" containerID="358f5f147e39647eee3fd33e0437b25e2c7aff3825809a733605042d2612b38c" Mar 09 14:26:34 crc kubenswrapper[4722]: I0309 14:26:34.921301 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5035bd54-0aaa-4ff3-b90a-6145145fe95c-scripts\") pod \"5035bd54-0aaa-4ff3-b90a-6145145fe95c\" (UID: \"5035bd54-0aaa-4ff3-b90a-6145145fe95c\") " Mar 09 14:26:34 crc kubenswrapper[4722]: I0309 14:26:34.921678 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5035bd54-0aaa-4ff3-b90a-6145145fe95c-config-data\") pod \"5035bd54-0aaa-4ff3-b90a-6145145fe95c\" (UID: \"5035bd54-0aaa-4ff3-b90a-6145145fe95c\") " Mar 09 14:26:34 crc kubenswrapper[4722]: I0309 14:26:34.921782 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5035bd54-0aaa-4ff3-b90a-6145145fe95c-combined-ca-bundle\") pod \"5035bd54-0aaa-4ff3-b90a-6145145fe95c\" (UID: \"5035bd54-0aaa-4ff3-b90a-6145145fe95c\") " Mar 09 14:26:34 crc kubenswrapper[4722]: I0309 14:26:34.921821 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rw8pm\" (UniqueName: \"kubernetes.io/projected/5035bd54-0aaa-4ff3-b90a-6145145fe95c-kube-api-access-rw8pm\") pod \"5035bd54-0aaa-4ff3-b90a-6145145fe95c\" (UID: \"5035bd54-0aaa-4ff3-b90a-6145145fe95c\") " Mar 09 14:26:34 crc kubenswrapper[4722]: I0309 14:26:34.922680 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5035bd54-0aaa-4ff3-b90a-6145145fe95c-etc-machine-id\") pod \"5035bd54-0aaa-4ff3-b90a-6145145fe95c\" (UID: \"5035bd54-0aaa-4ff3-b90a-6145145fe95c\") " Mar 09 14:26:34 crc kubenswrapper[4722]: I0309 14:26:34.922734 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5035bd54-0aaa-4ff3-b90a-6145145fe95c-db-sync-config-data\") pod \"5035bd54-0aaa-4ff3-b90a-6145145fe95c\" (UID: \"5035bd54-0aaa-4ff3-b90a-6145145fe95c\") " Mar 09 14:26:34 crc kubenswrapper[4722]: I0309 14:26:34.923450 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f7hpk\" (UniqueName: \"kubernetes.io/projected/279a10b1-27b8-49d5-8f22-2e01c3db5a04-kube-api-access-f7hpk\") on node \"crc\" DevicePath \"\"" Mar 09 14:26:34 crc kubenswrapper[4722]: I0309 14:26:34.926957 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5035bd54-0aaa-4ff3-b90a-6145145fe95c-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "5035bd54-0aaa-4ff3-b90a-6145145fe95c" (UID: "5035bd54-0aaa-4ff3-b90a-6145145fe95c"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 14:26:34 crc kubenswrapper[4722]: I0309 14:26:34.943373 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5035bd54-0aaa-4ff3-b90a-6145145fe95c-kube-api-access-rw8pm" (OuterVolumeSpecName: "kube-api-access-rw8pm") pod "5035bd54-0aaa-4ff3-b90a-6145145fe95c" (UID: "5035bd54-0aaa-4ff3-b90a-6145145fe95c"). InnerVolumeSpecName "kube-api-access-rw8pm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:26:34 crc kubenswrapper[4722]: I0309 14:26:34.943946 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5035bd54-0aaa-4ff3-b90a-6145145fe95c-scripts" (OuterVolumeSpecName: "scripts") pod "5035bd54-0aaa-4ff3-b90a-6145145fe95c" (UID: "5035bd54-0aaa-4ff3-b90a-6145145fe95c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:26:34 crc kubenswrapper[4722]: I0309 14:26:34.944495 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/279a10b1-27b8-49d5-8f22-2e01c3db5a04-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "279a10b1-27b8-49d5-8f22-2e01c3db5a04" (UID: "279a10b1-27b8-49d5-8f22-2e01c3db5a04"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:26:34 crc kubenswrapper[4722]: I0309 14:26:34.951242 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/279a10b1-27b8-49d5-8f22-2e01c3db5a04-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "279a10b1-27b8-49d5-8f22-2e01c3db5a04" (UID: "279a10b1-27b8-49d5-8f22-2e01c3db5a04"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:26:34 crc kubenswrapper[4722]: I0309 14:26:34.954783 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/279a10b1-27b8-49d5-8f22-2e01c3db5a04-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "279a10b1-27b8-49d5-8f22-2e01c3db5a04" (UID: "279a10b1-27b8-49d5-8f22-2e01c3db5a04"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:26:34 crc kubenswrapper[4722]: I0309 14:26:34.956903 4722 scope.go:117] "RemoveContainer" containerID="33d3a450005f1f13e5f186ef301b9b16c456b6d21adcb53af06c7cf934fe591b" Mar 09 14:26:34 crc kubenswrapper[4722]: I0309 14:26:34.968742 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/279a10b1-27b8-49d5-8f22-2e01c3db5a04-config" (OuterVolumeSpecName: "config") pod "279a10b1-27b8-49d5-8f22-2e01c3db5a04" (UID: "279a10b1-27b8-49d5-8f22-2e01c3db5a04"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:26:34 crc kubenswrapper[4722]: I0309 14:26:34.974360 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5035bd54-0aaa-4ff3-b90a-6145145fe95c-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "5035bd54-0aaa-4ff3-b90a-6145145fe95c" (UID: "5035bd54-0aaa-4ff3-b90a-6145145fe95c"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:26:35 crc kubenswrapper[4722]: I0309 14:26:35.000456 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/279a10b1-27b8-49d5-8f22-2e01c3db5a04-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "279a10b1-27b8-49d5-8f22-2e01c3db5a04" (UID: "279a10b1-27b8-49d5-8f22-2e01c3db5a04"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:26:35 crc kubenswrapper[4722]: I0309 14:26:35.025234 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/279a10b1-27b8-49d5-8f22-2e01c3db5a04-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 14:26:35 crc kubenswrapper[4722]: I0309 14:26:35.025268 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/279a10b1-27b8-49d5-8f22-2e01c3db5a04-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 09 14:26:35 crc kubenswrapper[4722]: I0309 14:26:35.025283 4722 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/279a10b1-27b8-49d5-8f22-2e01c3db5a04-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 09 14:26:35 crc kubenswrapper[4722]: I0309 14:26:35.025293 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/279a10b1-27b8-49d5-8f22-2e01c3db5a04-config\") on node \"crc\" DevicePath \"\"" Mar 09 14:26:35 crc kubenswrapper[4722]: I0309 14:26:35.025302 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/279a10b1-27b8-49d5-8f22-2e01c3db5a04-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 09 14:26:35 crc kubenswrapper[4722]: I0309 14:26:35.025312 4722 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5035bd54-0aaa-4ff3-b90a-6145145fe95c-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 09 14:26:35 crc kubenswrapper[4722]: I0309 14:26:35.025320 4722 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5035bd54-0aaa-4ff3-b90a-6145145fe95c-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 14:26:35 crc kubenswrapper[4722]: I0309 14:26:35.025328 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5035bd54-0aaa-4ff3-b90a-6145145fe95c-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 14:26:35 crc kubenswrapper[4722]: I0309 14:26:35.025336 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rw8pm\" (UniqueName: \"kubernetes.io/projected/5035bd54-0aaa-4ff3-b90a-6145145fe95c-kube-api-access-rw8pm\") on node \"crc\" DevicePath \"\"" Mar 09 14:26:35 crc kubenswrapper[4722]: I0309 14:26:35.067717 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5035bd54-0aaa-4ff3-b90a-6145145fe95c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5035bd54-0aaa-4ff3-b90a-6145145fe95c" (UID: "5035bd54-0aaa-4ff3-b90a-6145145fe95c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:26:35 crc kubenswrapper[4722]: I0309 14:26:35.088729 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5035bd54-0aaa-4ff3-b90a-6145145fe95c-config-data" (OuterVolumeSpecName: "config-data") pod "5035bd54-0aaa-4ff3-b90a-6145145fe95c" (UID: "5035bd54-0aaa-4ff3-b90a-6145145fe95c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:26:35 crc kubenswrapper[4722]: I0309 14:26:35.127489 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5035bd54-0aaa-4ff3-b90a-6145145fe95c-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 14:26:35 crc kubenswrapper[4722]: I0309 14:26:35.127522 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5035bd54-0aaa-4ff3-b90a-6145145fe95c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:26:35 crc kubenswrapper[4722]: I0309 14:26:35.220070 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-fbs5z"] Mar 09 14:26:35 crc kubenswrapper[4722]: I0309 14:26:35.235961 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-fbs5z"] Mar 09 14:26:35 crc kubenswrapper[4722]: E0309 14:26:35.272630 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="34700700-cbd0-4a2e-b791-25b63e3de5b8" Mar 09 14:26:35 crc kubenswrapper[4722]: I0309 14:26:35.425585 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6f9b466589-25vjm"] Mar 09 14:26:35 crc kubenswrapper[4722]: I0309 14:26:35.879584 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f9b466589-25vjm" event={"ID":"36f6d192-80a4-427c-8869-643481617222","Type":"ContainerStarted","Data":"b1bda527dbb22987873b6c20f684d88e2ea080ae943bfe1050dede4c7b5e32bb"} Mar 09 14:26:35 crc kubenswrapper[4722]: I0309 14:26:35.879890 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f9b466589-25vjm" event={"ID":"36f6d192-80a4-427c-8869-643481617222","Type":"ContainerStarted","Data":"67f7dfe10ea499366a60b12137a2fe49157cd0717cf80e49bba7568ca97ea677"} Mar 09 14:26:35 crc kubenswrapper[4722]: I0309 14:26:35.883070 4722 generic.go:334] "Generic (PLEG): container finished" podID="a53b24c6-01ac-48c0-8c62-f27e8309de23" containerID="3e86f1b2140a6860ab063b2ed7b54181844d7003819352ffb7cdf3e8f395b8a7" exitCode=0 Mar 09 14:26:35 crc kubenswrapper[4722]: I0309 14:26:35.883141 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5487cbd8d5-crfpm" event={"ID":"a53b24c6-01ac-48c0-8c62-f27e8309de23","Type":"ContainerDied","Data":"3e86f1b2140a6860ab063b2ed7b54181844d7003819352ffb7cdf3e8f395b8a7"} Mar 09 14:26:35 crc kubenswrapper[4722]: I0309 14:26:35.890676 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-fzj6s" Mar 09 14:26:35 crc kubenswrapper[4722]: I0309 14:26:35.899848 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="34700700-cbd0-4a2e-b791-25b63e3de5b8" containerName="ceilometer-notification-agent" containerID="cri-o://3b2426d4609239d48a635ae0e691919369f6b42c652697acdcc320f9f919a30b" gracePeriod=30 Mar 09 14:26:35 crc kubenswrapper[4722]: I0309 14:26:35.899958 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"34700700-cbd0-4a2e-b791-25b63e3de5b8","Type":"ContainerStarted","Data":"648af9c2448d9a573865eb523325d52d94f6162bb99fb1122a0292322aa61715"} Mar 09 14:26:35 crc kubenswrapper[4722]: I0309 14:26:35.900028 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="34700700-cbd0-4a2e-b791-25b63e3de5b8" containerName="proxy-httpd" containerID="cri-o://648af9c2448d9a573865eb523325d52d94f6162bb99fb1122a0292322aa61715" gracePeriod=30 Mar 09 14:26:35 crc kubenswrapper[4722]: I0309 14:26:35.900059 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="34700700-cbd0-4a2e-b791-25b63e3de5b8" containerName="sg-core" containerID="cri-o://cd01cfbd6236ef8ba1ab9a77a89a9b477fdb53cf600452555b20b52ab693210f" gracePeriod=30 Mar 09 14:26:35 crc kubenswrapper[4722]: I0309 14:26:35.900215 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 09 14:26:35 crc kubenswrapper[4722]: I0309 14:26:35.977715 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-5487cbd8d5-crfpm" podUID="a53b24c6-01ac-48c0-8c62-f27e8309de23" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.200:9696/\": dial tcp 10.217.0.200:9696: connect: connection refused" Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.076416 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 09 14:26:36 crc kubenswrapper[4722]: E0309 14:26:36.076950 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="279a10b1-27b8-49d5-8f22-2e01c3db5a04" containerName="init" Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.076972 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="279a10b1-27b8-49d5-8f22-2e01c3db5a04" containerName="init" Mar 09 14:26:36 crc kubenswrapper[4722]: E0309 14:26:36.076992 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="279a10b1-27b8-49d5-8f22-2e01c3db5a04" containerName="dnsmasq-dns" Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.077001 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="279a10b1-27b8-49d5-8f22-2e01c3db5a04" containerName="dnsmasq-dns" Mar 09 14:26:36 crc kubenswrapper[4722]: E0309 14:26:36.077018 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5035bd54-0aaa-4ff3-b90a-6145145fe95c" containerName="cinder-db-sync" Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.077024 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="5035bd54-0aaa-4ff3-b90a-6145145fe95c" containerName="cinder-db-sync" Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.077313 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="279a10b1-27b8-49d5-8f22-2e01c3db5a04" containerName="dnsmasq-dns" Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.077345 4722 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="5035bd54-0aaa-4ff3-b90a-6145145fe95c" containerName="cinder-db-sync" Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.078595 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.096102 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.099672 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.099898 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.100000 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.100113 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-wdrpg" Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.195730 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2b2dd04-3dd3-4915-ab91-201d5bfb54b1-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f2b2dd04-3dd3-4915-ab91-201d5bfb54b1\") " pod="openstack/cinder-scheduler-0" Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.196059 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f2b2dd04-3dd3-4915-ab91-201d5bfb54b1-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f2b2dd04-3dd3-4915-ab91-201d5bfb54b1\") " pod="openstack/cinder-scheduler-0" Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.196544 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2b2dd04-3dd3-4915-ab91-201d5bfb54b1-scripts\") pod \"cinder-scheduler-0\" (UID: \"f2b2dd04-3dd3-4915-ab91-201d5bfb54b1\") " pod="openstack/cinder-scheduler-0" Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.196644 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkkwx\" (UniqueName: \"kubernetes.io/projected/f2b2dd04-3dd3-4915-ab91-201d5bfb54b1-kube-api-access-vkkwx\") pod \"cinder-scheduler-0\" (UID: \"f2b2dd04-3dd3-4915-ab91-201d5bfb54b1\") " pod="openstack/cinder-scheduler-0" Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.196776 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f2b2dd04-3dd3-4915-ab91-201d5bfb54b1-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f2b2dd04-3dd3-4915-ab91-201d5bfb54b1\") " pod="openstack/cinder-scheduler-0" Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.196851 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2b2dd04-3dd3-4915-ab91-201d5bfb54b1-config-data\") pod \"cinder-scheduler-0\" (UID: \"f2b2dd04-3dd3-4915-ab91-201d5bfb54b1\") " pod="openstack/cinder-scheduler-0" Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.274499 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="279a10b1-27b8-49d5-8f22-2e01c3db5a04" path="/var/lib/kubelet/pods/279a10b1-27b8-49d5-8f22-2e01c3db5a04/volumes" Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.275231 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-hbhls"] Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.292379 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-hbhls" Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.301140 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2b2dd04-3dd3-4915-ab91-201d5bfb54b1-scripts\") pod \"cinder-scheduler-0\" (UID: \"f2b2dd04-3dd3-4915-ab91-201d5bfb54b1\") " pod="openstack/cinder-scheduler-0" Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.301225 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkkwx\" (UniqueName: \"kubernetes.io/projected/f2b2dd04-3dd3-4915-ab91-201d5bfb54b1-kube-api-access-vkkwx\") pod \"cinder-scheduler-0\" (UID: \"f2b2dd04-3dd3-4915-ab91-201d5bfb54b1\") " pod="openstack/cinder-scheduler-0" Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.301608 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f2b2dd04-3dd3-4915-ab91-201d5bfb54b1-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f2b2dd04-3dd3-4915-ab91-201d5bfb54b1\") " pod="openstack/cinder-scheduler-0" Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.301675 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f2b2dd04-3dd3-4915-ab91-201d5bfb54b1-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"f2b2dd04-3dd3-4915-ab91-201d5bfb54b1\") " pod="openstack/cinder-scheduler-0" Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.304331 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2b2dd04-3dd3-4915-ab91-201d5bfb54b1-config-data\") pod \"cinder-scheduler-0\" (UID: \"f2b2dd04-3dd3-4915-ab91-201d5bfb54b1\") " pod="openstack/cinder-scheduler-0" Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.304485 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2b2dd04-3dd3-4915-ab91-201d5bfb54b1-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f2b2dd04-3dd3-4915-ab91-201d5bfb54b1\") " pod="openstack/cinder-scheduler-0" Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.304526 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f2b2dd04-3dd3-4915-ab91-201d5bfb54b1-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f2b2dd04-3dd3-4915-ab91-201d5bfb54b1\") " pod="openstack/cinder-scheduler-0" Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.313803 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-hbhls"] Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.365049 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkkwx\" (UniqueName: \"kubernetes.io/projected/f2b2dd04-3dd3-4915-ab91-201d5bfb54b1-kube-api-access-vkkwx\") pod \"cinder-scheduler-0\" (UID: \"f2b2dd04-3dd3-4915-ab91-201d5bfb54b1\") " 
pod="openstack/cinder-scheduler-0" Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.365503 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f2b2dd04-3dd3-4915-ab91-201d5bfb54b1-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"f2b2dd04-3dd3-4915-ab91-201d5bfb54b1\") " pod="openstack/cinder-scheduler-0" Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.365816 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2b2dd04-3dd3-4915-ab91-201d5bfb54b1-scripts\") pod \"cinder-scheduler-0\" (UID: \"f2b2dd04-3dd3-4915-ab91-201d5bfb54b1\") " pod="openstack/cinder-scheduler-0" Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.365897 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2b2dd04-3dd3-4915-ab91-201d5bfb54b1-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"f2b2dd04-3dd3-4915-ab91-201d5bfb54b1\") " pod="openstack/cinder-scheduler-0" Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.366577 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2b2dd04-3dd3-4915-ab91-201d5bfb54b1-config-data\") pod \"cinder-scheduler-0\" (UID: \"f2b2dd04-3dd3-4915-ab91-201d5bfb54b1\") " pod="openstack/cinder-scheduler-0" Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.398252 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.400592 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.403908 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.410639 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5aaf2eb8-65eb-4404-93df-c16fe6796329-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-hbhls\" (UID: \"5aaf2eb8-65eb-4404-93df-c16fe6796329\") " pod="openstack/dnsmasq-dns-5c9776ccc5-hbhls" Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.410710 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6svbv\" (UniqueName: \"kubernetes.io/projected/5aaf2eb8-65eb-4404-93df-c16fe6796329-kube-api-access-6svbv\") pod \"dnsmasq-dns-5c9776ccc5-hbhls\" (UID: \"5aaf2eb8-65eb-4404-93df-c16fe6796329\") " pod="openstack/dnsmasq-dns-5c9776ccc5-hbhls" Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.410786 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5aaf2eb8-65eb-4404-93df-c16fe6796329-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-hbhls\" (UID: \"5aaf2eb8-65eb-4404-93df-c16fe6796329\") " pod="openstack/dnsmasq-dns-5c9776ccc5-hbhls" Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.410823 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5aaf2eb8-65eb-4404-93df-c16fe6796329-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-hbhls\" (UID: \"5aaf2eb8-65eb-4404-93df-c16fe6796329\") " 
pod="openstack/dnsmasq-dns-5c9776ccc5-hbhls" Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.410875 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5aaf2eb8-65eb-4404-93df-c16fe6796329-config\") pod \"dnsmasq-dns-5c9776ccc5-hbhls\" (UID: \"5aaf2eb8-65eb-4404-93df-c16fe6796329\") " pod="openstack/dnsmasq-dns-5c9776ccc5-hbhls" Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.410985 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5aaf2eb8-65eb-4404-93df-c16fe6796329-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-hbhls\" (UID: \"5aaf2eb8-65eb-4404-93df-c16fe6796329\") " pod="openstack/dnsmasq-dns-5c9776ccc5-hbhls" Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.430054 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.515515 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5aaf2eb8-65eb-4404-93df-c16fe6796329-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-hbhls\" (UID: \"5aaf2eb8-65eb-4404-93df-c16fe6796329\") " pod="openstack/dnsmasq-dns-5c9776ccc5-hbhls" Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.515561 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6svbv\" (UniqueName: \"kubernetes.io/projected/5aaf2eb8-65eb-4404-93df-c16fe6796329-kube-api-access-6svbv\") pod \"dnsmasq-dns-5c9776ccc5-hbhls\" (UID: \"5aaf2eb8-65eb-4404-93df-c16fe6796329\") " pod="openstack/dnsmasq-dns-5c9776ccc5-hbhls" Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.515599 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/896c45ad-c78a-48cd-a84d-7852705df811-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"896c45ad-c78a-48cd-a84d-7852705df811\") " pod="openstack/cinder-api-0" Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.515624 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5aaf2eb8-65eb-4404-93df-c16fe6796329-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-hbhls\" (UID: \"5aaf2eb8-65eb-4404-93df-c16fe6796329\") " pod="openstack/dnsmasq-dns-5c9776ccc5-hbhls" Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.515646 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5aaf2eb8-65eb-4404-93df-c16fe6796329-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-hbhls\" (UID: \"5aaf2eb8-65eb-4404-93df-c16fe6796329\") " pod="openstack/dnsmasq-dns-5c9776ccc5-hbhls" Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.515672 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/896c45ad-c78a-48cd-a84d-7852705df811-etc-machine-id\") pod \"cinder-api-0\" (UID: \"896c45ad-c78a-48cd-a84d-7852705df811\") " pod="openstack/cinder-api-0" Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.515692 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/5aaf2eb8-65eb-4404-93df-c16fe6796329-config\") pod \"dnsmasq-dns-5c9776ccc5-hbhls\" (UID: \"5aaf2eb8-65eb-4404-93df-c16fe6796329\") " pod="openstack/dnsmasq-dns-5c9776ccc5-hbhls" Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.515765 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5aaf2eb8-65eb-4404-93df-c16fe6796329-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-hbhls\" (UID: \"5aaf2eb8-65eb-4404-93df-c16fe6796329\") " pod="openstack/dnsmasq-dns-5c9776ccc5-hbhls" Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.515825 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/896c45ad-c78a-48cd-a84d-7852705df811-config-data-custom\") pod \"cinder-api-0\" (UID: \"896c45ad-c78a-48cd-a84d-7852705df811\") " pod="openstack/cinder-api-0" Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.515857 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/896c45ad-c78a-48cd-a84d-7852705df811-scripts\") pod \"cinder-api-0\" (UID: \"896c45ad-c78a-48cd-a84d-7852705df811\") " pod="openstack/cinder-api-0" Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.515882 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/896c45ad-c78a-48cd-a84d-7852705df811-logs\") pod \"cinder-api-0\" (UID: \"896c45ad-c78a-48cd-a84d-7852705df811\") " pod="openstack/cinder-api-0" Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.515920 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/896c45ad-c78a-48cd-a84d-7852705df811-config-data\") pod \"cinder-api-0\" (UID: \"896c45ad-c78a-48cd-a84d-7852705df811\") " pod="openstack/cinder-api-0" Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.515935 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g99j5\" (UniqueName: \"kubernetes.io/projected/896c45ad-c78a-48cd-a84d-7852705df811-kube-api-access-g99j5\") pod \"cinder-api-0\" (UID: \"896c45ad-c78a-48cd-a84d-7852705df811\") " pod="openstack/cinder-api-0" Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.517023 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5aaf2eb8-65eb-4404-93df-c16fe6796329-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-hbhls\" (UID: \"5aaf2eb8-65eb-4404-93df-c16fe6796329\") " pod="openstack/dnsmasq-dns-5c9776ccc5-hbhls" Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.517137 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5aaf2eb8-65eb-4404-93df-c16fe6796329-config\") pod \"dnsmasq-dns-5c9776ccc5-hbhls\" (UID: \"5aaf2eb8-65eb-4404-93df-c16fe6796329\") " pod="openstack/dnsmasq-dns-5c9776ccc5-hbhls" Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.521571 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5aaf2eb8-65eb-4404-93df-c16fe6796329-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-hbhls\" (UID: \"5aaf2eb8-65eb-4404-93df-c16fe6796329\") " 
pod="openstack/dnsmasq-dns-5c9776ccc5-hbhls" Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.521819 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5aaf2eb8-65eb-4404-93df-c16fe6796329-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-hbhls\" (UID: \"5aaf2eb8-65eb-4404-93df-c16fe6796329\") " pod="openstack/dnsmasq-dns-5c9776ccc5-hbhls" Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.539877 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5aaf2eb8-65eb-4404-93df-c16fe6796329-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-hbhls\" (UID: \"5aaf2eb8-65eb-4404-93df-c16fe6796329\") " pod="openstack/dnsmasq-dns-5c9776ccc5-hbhls" Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.560963 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6svbv\" (UniqueName: \"kubernetes.io/projected/5aaf2eb8-65eb-4404-93df-c16fe6796329-kube-api-access-6svbv\") pod \"dnsmasq-dns-5c9776ccc5-hbhls\" (UID: \"5aaf2eb8-65eb-4404-93df-c16fe6796329\") " pod="openstack/dnsmasq-dns-5c9776ccc5-hbhls" Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.563936 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.618318 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/896c45ad-c78a-48cd-a84d-7852705df811-scripts\") pod \"cinder-api-0\" (UID: \"896c45ad-c78a-48cd-a84d-7852705df811\") " pod="openstack/cinder-api-0" Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.618377 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/896c45ad-c78a-48cd-a84d-7852705df811-logs\") pod \"cinder-api-0\" (UID: \"896c45ad-c78a-48cd-a84d-7852705df811\") " pod="openstack/cinder-api-0" Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.618425 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/896c45ad-c78a-48cd-a84d-7852705df811-config-data\") pod \"cinder-api-0\" (UID: \"896c45ad-c78a-48cd-a84d-7852705df811\") " pod="openstack/cinder-api-0" Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.618443 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g99j5\" (UniqueName: \"kubernetes.io/projected/896c45ad-c78a-48cd-a84d-7852705df811-kube-api-access-g99j5\") pod \"cinder-api-0\" (UID: \"896c45ad-c78a-48cd-a84d-7852705df811\") " pod="openstack/cinder-api-0" Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.618482 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/896c45ad-c78a-48cd-a84d-7852705df811-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"896c45ad-c78a-48cd-a84d-7852705df811\") " pod="openstack/cinder-api-0" Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.618526 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/896c45ad-c78a-48cd-a84d-7852705df811-etc-machine-id\") pod \"cinder-api-0\" (UID: \"896c45ad-c78a-48cd-a84d-7852705df811\") " pod="openstack/cinder-api-0" Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 
14:26:36.618632 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/896c45ad-c78a-48cd-a84d-7852705df811-config-data-custom\") pod \"cinder-api-0\" (UID: \"896c45ad-c78a-48cd-a84d-7852705df811\") " pod="openstack/cinder-api-0" Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.623617 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/896c45ad-c78a-48cd-a84d-7852705df811-etc-machine-id\") pod \"cinder-api-0\" (UID: \"896c45ad-c78a-48cd-a84d-7852705df811\") " pod="openstack/cinder-api-0" Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.625030 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/896c45ad-c78a-48cd-a84d-7852705df811-logs\") pod \"cinder-api-0\" (UID: \"896c45ad-c78a-48cd-a84d-7852705df811\") " pod="openstack/cinder-api-0" Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.629602 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/896c45ad-c78a-48cd-a84d-7852705df811-config-data\") pod \"cinder-api-0\" (UID: \"896c45ad-c78a-48cd-a84d-7852705df811\") " pod="openstack/cinder-api-0" Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.630427 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/896c45ad-c78a-48cd-a84d-7852705df811-config-data-custom\") pod \"cinder-api-0\" (UID: \"896c45ad-c78a-48cd-a84d-7852705df811\") " pod="openstack/cinder-api-0" Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.632857 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/896c45ad-c78a-48cd-a84d-7852705df811-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"896c45ad-c78a-48cd-a84d-7852705df811\") " pod="openstack/cinder-api-0" Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.642754 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/896c45ad-c78a-48cd-a84d-7852705df811-scripts\") pod \"cinder-api-0\" (UID: \"896c45ad-c78a-48cd-a84d-7852705df811\") " pod="openstack/cinder-api-0" Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.661811 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g99j5\" (UniqueName: \"kubernetes.io/projected/896c45ad-c78a-48cd-a84d-7852705df811-kube-api-access-g99j5\") pod \"cinder-api-0\" (UID: \"896c45ad-c78a-48cd-a84d-7852705df811\") " pod="openstack/cinder-api-0" Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.750214 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-hbhls" Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.771541 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.948882 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f9b466589-25vjm" event={"ID":"36f6d192-80a4-427c-8869-643481617222","Type":"ContainerStarted","Data":"c39d191e54a132494ca509564be234777a4da6f3ccbc114ff5fdca14e18470a4"} Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.950423 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6f9b466589-25vjm" Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.954871 4722 generic.go:334] "Generic (PLEG): container finished" podID="34700700-cbd0-4a2e-b791-25b63e3de5b8" containerID="648af9c2448d9a573865eb523325d52d94f6162bb99fb1122a0292322aa61715" exitCode=0 Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.954898 4722 generic.go:334] "Generic (PLEG): container finished" podID="34700700-cbd0-4a2e-b791-25b63e3de5b8" containerID="cd01cfbd6236ef8ba1ab9a77a89a9b477fdb53cf600452555b20b52ab693210f" exitCode=2 Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.954917 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"34700700-cbd0-4a2e-b791-25b63e3de5b8","Type":"ContainerDied","Data":"648af9c2448d9a573865eb523325d52d94f6162bb99fb1122a0292322aa61715"} Mar 09 14:26:36 crc kubenswrapper[4722]: I0309 14:26:36.954939 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"34700700-cbd0-4a2e-b791-25b63e3de5b8","Type":"ContainerDied","Data":"cd01cfbd6236ef8ba1ab9a77a89a9b477fdb53cf600452555b20b52ab693210f"} Mar 09 14:26:37 crc kubenswrapper[4722]: I0309 14:26:37.031321 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6f9b466589-25vjm" podStartSLOduration=4.03129965 podStartE2EDuration="4.03129965s" podCreationTimestamp="2026-03-09 14:26:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:26:36.982902891 +0000 UTC m=+1437.538471477" watchObservedRunningTime="2026-03-09 14:26:37.03129965 +0000 UTC m=+1437.586868226" Mar 09 14:26:37 crc kubenswrapper[4722]: I0309 14:26:37.359651 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 09 14:26:37 crc kubenswrapper[4722]: W0309 14:26:37.362418 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf2b2dd04_3dd3_4915_ab91_201d5bfb54b1.slice/crio-140be863383b8c2a361522e8005af37d00b25161c36a6b3b4ea9d87016f85b81 WatchSource:0}: Error finding container 140be863383b8c2a361522e8005af37d00b25161c36a6b3b4ea9d87016f85b81: Status 404 returned error can't find the container with id 140be863383b8c2a361522e8005af37d00b25161c36a6b3b4ea9d87016f85b81 Mar 09 14:26:37 crc kubenswrapper[4722]: W0309 14:26:37.535374 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod896c45ad_c78a_48cd_a84d_7852705df811.slice/crio-1f235d54cebc73f53c4a69e365684ea0655dc048457f90e257da710d1017839b WatchSource:0}: Error finding container 1f235d54cebc73f53c4a69e365684ea0655dc048457f90e257da710d1017839b: Status 404 returned error can't find the container with id 1f235d54cebc73f53c4a69e365684ea0655dc048457f90e257da710d1017839b Mar 09 14:26:37 crc kubenswrapper[4722]: I0309 14:26:37.540524 4722 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-hbhls"] Mar 09 14:26:37 crc kubenswrapper[4722]: I0309 14:26:37.572928 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 09 14:26:37 crc kubenswrapper[4722]: I0309 14:26:37.974517 4722 generic.go:334] "Generic (PLEG): container finished" podID="5aaf2eb8-65eb-4404-93df-c16fe6796329" containerID="47459de10f70446fb04c6ce8a3b18a01c64f884125b7c2303d821b0bc5b88e0d" exitCode=0 Mar 09 14:26:37 crc kubenswrapper[4722]: I0309 14:26:37.974842 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-hbhls" event={"ID":"5aaf2eb8-65eb-4404-93df-c16fe6796329","Type":"ContainerDied","Data":"47459de10f70446fb04c6ce8a3b18a01c64f884125b7c2303d821b0bc5b88e0d"} Mar 09 14:26:37 crc kubenswrapper[4722]: I0309 14:26:37.974871 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-hbhls" event={"ID":"5aaf2eb8-65eb-4404-93df-c16fe6796329","Type":"ContainerStarted","Data":"d70e2bf9cec27169da9ed2921244cafb7477e2448a377092a449b75c8c654e70"} Mar 09 14:26:37 crc kubenswrapper[4722]: I0309 14:26:37.980435 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f2b2dd04-3dd3-4915-ab91-201d5bfb54b1","Type":"ContainerStarted","Data":"140be863383b8c2a361522e8005af37d00b25161c36a6b3b4ea9d87016f85b81"} Mar 09 14:26:37 crc kubenswrapper[4722]: I0309 14:26:37.982131 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"896c45ad-c78a-48cd-a84d-7852705df811","Type":"ContainerStarted","Data":"1f235d54cebc73f53c4a69e365684ea0655dc048457f90e257da710d1017839b"} Mar 09 14:26:38 crc kubenswrapper[4722]: I0309 14:26:38.279334 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-55f844cf75-fbs5z" podUID="279a10b1-27b8-49d5-8f22-2e01c3db5a04" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.198:5353: i/o timeout" Mar 09 14:26:38 crc kubenswrapper[4722]: I0309 14:26:38.571606 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 09 14:26:38 crc kubenswrapper[4722]: I0309 14:26:38.919666 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-cb7fc4c44-jrv9s" podUID="8066fa12-e1a3-483a-8c94-e7a59cc2f1d9" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.206:9311/healthcheck\": read tcp 10.217.0.2:46580->10.217.0.206:9311: read: connection reset by peer" Mar 09 14:26:38 crc kubenswrapper[4722]: I0309 14:26:38.920276 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-cb7fc4c44-jrv9s" podUID="8066fa12-e1a3-483a-8c94-e7a59cc2f1d9" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.206:9311/healthcheck\": read tcp 10.217.0.2:46570->10.217.0.206:9311: read: connection reset by peer" Mar 09 14:26:39 crc kubenswrapper[4722]: I0309 14:26:39.018122 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-hbhls" event={"ID":"5aaf2eb8-65eb-4404-93df-c16fe6796329","Type":"ContainerStarted","Data":"78edfd45b55f5b27e7faa8f4b105788a5b0ac9f34b23575abe46855f8cccbd18"} Mar 09 14:26:39 crc kubenswrapper[4722]: I0309 14:26:39.019612 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c9776ccc5-hbhls" Mar 09 14:26:39 crc kubenswrapper[4722]: I0309 14:26:39.051795 4722 generic.go:334] "Generic (PLEG): container 
finished" podID="a53b24c6-01ac-48c0-8c62-f27e8309de23" containerID="d3ba4abf6a56fdb679f45127646186f63a7a511208e8f8b22d325ba9f4f9c918" exitCode=0 Mar 09 14:26:39 crc kubenswrapper[4722]: I0309 14:26:39.051913 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5487cbd8d5-crfpm" event={"ID":"a53b24c6-01ac-48c0-8c62-f27e8309de23","Type":"ContainerDied","Data":"d3ba4abf6a56fdb679f45127646186f63a7a511208e8f8b22d325ba9f4f9c918"} Mar 09 14:26:39 crc kubenswrapper[4722]: I0309 14:26:39.075967 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c9776ccc5-hbhls" podStartSLOduration=3.075946893 podStartE2EDuration="3.075946893s" podCreationTimestamp="2026-03-09 14:26:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:26:39.061605746 +0000 UTC m=+1439.617174322" watchObservedRunningTime="2026-03-09 14:26:39.075946893 +0000 UTC m=+1439.631515469" Mar 09 14:26:39 crc kubenswrapper[4722]: I0309 14:26:39.109398 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"896c45ad-c78a-48cd-a84d-7852705df811","Type":"ContainerStarted","Data":"38c12366a8db34c5ce5fe002d1b8fdf7d15151c41c2a6275c9f9ca4b87e6dc3a"} Mar 09 14:26:39 crc kubenswrapper[4722]: I0309 14:26:39.705564 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5487cbd8d5-crfpm" Mar 09 14:26:39 crc kubenswrapper[4722]: I0309 14:26:39.813380 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a53b24c6-01ac-48c0-8c62-f27e8309de23-combined-ca-bundle\") pod \"a53b24c6-01ac-48c0-8c62-f27e8309de23\" (UID: \"a53b24c6-01ac-48c0-8c62-f27e8309de23\") " Mar 09 14:26:39 crc kubenswrapper[4722]: I0309 14:26:39.813430 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cpt6n\" (UniqueName: \"kubernetes.io/projected/a53b24c6-01ac-48c0-8c62-f27e8309de23-kube-api-access-cpt6n\") pod \"a53b24c6-01ac-48c0-8c62-f27e8309de23\" (UID: \"a53b24c6-01ac-48c0-8c62-f27e8309de23\") " Mar 09 14:26:39 crc kubenswrapper[4722]: I0309 14:26:39.813502 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a53b24c6-01ac-48c0-8c62-f27e8309de23-internal-tls-certs\") pod \"a53b24c6-01ac-48c0-8c62-f27e8309de23\" (UID: \"a53b24c6-01ac-48c0-8c62-f27e8309de23\") " Mar 09 14:26:39 crc kubenswrapper[4722]: I0309 14:26:39.813592 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a53b24c6-01ac-48c0-8c62-f27e8309de23-ovndb-tls-certs\") pod \"a53b24c6-01ac-48c0-8c62-f27e8309de23\" (UID: \"a53b24c6-01ac-48c0-8c62-f27e8309de23\") " Mar 09 14:26:39 crc kubenswrapper[4722]: I0309 14:26:39.813682 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a53b24c6-01ac-48c0-8c62-f27e8309de23-httpd-config\") pod \"a53b24c6-01ac-48c0-8c62-f27e8309de23\" (UID: \"a53b24c6-01ac-48c0-8c62-f27e8309de23\") " Mar 09 14:26:39 crc kubenswrapper[4722]: I0309 14:26:39.813804 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a53b24c6-01ac-48c0-8c62-f27e8309de23-config\") pod 
\"a53b24c6-01ac-48c0-8c62-f27e8309de23\" (UID: \"a53b24c6-01ac-48c0-8c62-f27e8309de23\") " Mar 09 14:26:39 crc kubenswrapper[4722]: I0309 14:26:39.813835 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a53b24c6-01ac-48c0-8c62-f27e8309de23-public-tls-certs\") pod \"a53b24c6-01ac-48c0-8c62-f27e8309de23\" (UID: \"a53b24c6-01ac-48c0-8c62-f27e8309de23\") " Mar 09 14:26:39 crc kubenswrapper[4722]: I0309 14:26:39.819722 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a53b24c6-01ac-48c0-8c62-f27e8309de23-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "a53b24c6-01ac-48c0-8c62-f27e8309de23" (UID: "a53b24c6-01ac-48c0-8c62-f27e8309de23"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:26:39 crc kubenswrapper[4722]: I0309 14:26:39.820911 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a53b24c6-01ac-48c0-8c62-f27e8309de23-kube-api-access-cpt6n" (OuterVolumeSpecName: "kube-api-access-cpt6n") pod "a53b24c6-01ac-48c0-8c62-f27e8309de23" (UID: "a53b24c6-01ac-48c0-8c62-f27e8309de23"). InnerVolumeSpecName "kube-api-access-cpt6n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:26:39 crc kubenswrapper[4722]: I0309 14:26:39.857763 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-cb7fc4c44-jrv9s" Mar 09 14:26:39 crc kubenswrapper[4722]: I0309 14:26:39.903840 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a53b24c6-01ac-48c0-8c62-f27e8309de23-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a53b24c6-01ac-48c0-8c62-f27e8309de23" (UID: "a53b24c6-01ac-48c0-8c62-f27e8309de23"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:26:39 crc kubenswrapper[4722]: I0309 14:26:39.914447 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a53b24c6-01ac-48c0-8c62-f27e8309de23-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a53b24c6-01ac-48c0-8c62-f27e8309de23" (UID: "a53b24c6-01ac-48c0-8c62-f27e8309de23"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:26:39 crc kubenswrapper[4722]: I0309 14:26:39.915167 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8066fa12-e1a3-483a-8c94-e7a59cc2f1d9-config-data\") pod \"8066fa12-e1a3-483a-8c94-e7a59cc2f1d9\" (UID: \"8066fa12-e1a3-483a-8c94-e7a59cc2f1d9\") " Mar 09 14:26:39 crc kubenswrapper[4722]: I0309 14:26:39.915357 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8066fa12-e1a3-483a-8c94-e7a59cc2f1d9-combined-ca-bundle\") pod \"8066fa12-e1a3-483a-8c94-e7a59cc2f1d9\" (UID: \"8066fa12-e1a3-483a-8c94-e7a59cc2f1d9\") " Mar 09 14:26:39 crc kubenswrapper[4722]: I0309 14:26:39.915409 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8066fa12-e1a3-483a-8c94-e7a59cc2f1d9-logs\") pod \"8066fa12-e1a3-483a-8c94-e7a59cc2f1d9\" (UID: \"8066fa12-e1a3-483a-8c94-e7a59cc2f1d9\") " Mar 09 14:26:39 crc kubenswrapper[4722]: I0309 14:26:39.915474 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8066fa12-e1a3-483a-8c94-e7a59cc2f1d9-config-data-custom\") pod \"8066fa12-e1a3-483a-8c94-e7a59cc2f1d9\" (UID: \"8066fa12-e1a3-483a-8c94-e7a59cc2f1d9\") " Mar 09 14:26:39 crc kubenswrapper[4722]: I0309 14:26:39.915620 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-959f6\" (UniqueName: \"kubernetes.io/projected/8066fa12-e1a3-483a-8c94-e7a59cc2f1d9-kube-api-access-959f6\") pod \"8066fa12-e1a3-483a-8c94-e7a59cc2f1d9\" (UID: \"8066fa12-e1a3-483a-8c94-e7a59cc2f1d9\") " Mar 09 14:26:39 crc kubenswrapper[4722]: I0309 14:26:39.916795 4722 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a53b24c6-01ac-48c0-8c62-f27e8309de23-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 14:26:39 crc kubenswrapper[4722]: I0309 14:26:39.916818 4722 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a53b24c6-01ac-48c0-8c62-f27e8309de23-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 09 14:26:39 crc kubenswrapper[4722]: I0309 14:26:39.916827 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a53b24c6-01ac-48c0-8c62-f27e8309de23-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:26:39 crc kubenswrapper[4722]: I0309 14:26:39.916838 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cpt6n\" (UniqueName: \"kubernetes.io/projected/a53b24c6-01ac-48c0-8c62-f27e8309de23-kube-api-access-cpt6n\") on node \"crc\" DevicePath \"\"" Mar 09 14:26:39 crc kubenswrapper[4722]: I0309 14:26:39.917507 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8066fa12-e1a3-483a-8c94-e7a59cc2f1d9-logs" (OuterVolumeSpecName: "logs") pod "8066fa12-e1a3-483a-8c94-e7a59cc2f1d9" (UID: "8066fa12-e1a3-483a-8c94-e7a59cc2f1d9"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:26:39 crc kubenswrapper[4722]: I0309 14:26:39.920640 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8066fa12-e1a3-483a-8c94-e7a59cc2f1d9-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8066fa12-e1a3-483a-8c94-e7a59cc2f1d9" (UID: "8066fa12-e1a3-483a-8c94-e7a59cc2f1d9"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:26:39 crc kubenswrapper[4722]: I0309 14:26:39.923354 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8066fa12-e1a3-483a-8c94-e7a59cc2f1d9-kube-api-access-959f6" (OuterVolumeSpecName: "kube-api-access-959f6") pod "8066fa12-e1a3-483a-8c94-e7a59cc2f1d9" (UID: "8066fa12-e1a3-483a-8c94-e7a59cc2f1d9"). InnerVolumeSpecName "kube-api-access-959f6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:26:39 crc kubenswrapper[4722]: I0309 14:26:39.923757 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a53b24c6-01ac-48c0-8c62-f27e8309de23-config" (OuterVolumeSpecName: "config") pod "a53b24c6-01ac-48c0-8c62-f27e8309de23" (UID: "a53b24c6-01ac-48c0-8c62-f27e8309de23"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:26:39 crc kubenswrapper[4722]: I0309 14:26:39.927490 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a53b24c6-01ac-48c0-8c62-f27e8309de23-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a53b24c6-01ac-48c0-8c62-f27e8309de23" (UID: "a53b24c6-01ac-48c0-8c62-f27e8309de23"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:26:39 crc kubenswrapper[4722]: I0309 14:26:39.934487 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a53b24c6-01ac-48c0-8c62-f27e8309de23-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "a53b24c6-01ac-48c0-8c62-f27e8309de23" (UID: "a53b24c6-01ac-48c0-8c62-f27e8309de23"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:26:39 crc kubenswrapper[4722]: I0309 14:26:39.951414 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8066fa12-e1a3-483a-8c94-e7a59cc2f1d9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8066fa12-e1a3-483a-8c94-e7a59cc2f1d9" (UID: "8066fa12-e1a3-483a-8c94-e7a59cc2f1d9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:26:40 crc kubenswrapper[4722]: I0309 14:26:40.019951 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8066fa12-e1a3-483a-8c94-e7a59cc2f1d9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:26:40 crc kubenswrapper[4722]: I0309 14:26:40.020333 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8066fa12-e1a3-483a-8c94-e7a59cc2f1d9-logs\") on node \"crc\" DevicePath \"\"" Mar 09 14:26:40 crc kubenswrapper[4722]: I0309 14:26:40.020348 4722 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a53b24c6-01ac-48c0-8c62-f27e8309de23-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 14:26:40 crc kubenswrapper[4722]: I0309 14:26:40.020359 4722 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8066fa12-e1a3-483a-8c94-e7a59cc2f1d9-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 09 14:26:40 crc kubenswrapper[4722]: I0309 14:26:40.020369 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-959f6\" (UniqueName: \"kubernetes.io/projected/8066fa12-e1a3-483a-8c94-e7a59cc2f1d9-kube-api-access-959f6\") on node \"crc\" DevicePath \"\"" Mar 09 14:26:40 crc kubenswrapper[4722]: I0309 14:26:40.020385 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/a53b24c6-01ac-48c0-8c62-f27e8309de23-config\") on node \"crc\" DevicePath \"\"" Mar 09 14:26:40 crc kubenswrapper[4722]: I0309 14:26:40.020397 4722 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a53b24c6-01ac-48c0-8c62-f27e8309de23-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 14:26:40 crc kubenswrapper[4722]: I0309 14:26:40.020746 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8066fa12-e1a3-483a-8c94-e7a59cc2f1d9-config-data" (OuterVolumeSpecName: "config-data") pod "8066fa12-e1a3-483a-8c94-e7a59cc2f1d9" (UID: "8066fa12-e1a3-483a-8c94-e7a59cc2f1d9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:26:40 crc kubenswrapper[4722]: I0309 14:26:40.121948 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8066fa12-e1a3-483a-8c94-e7a59cc2f1d9-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 14:26:40 crc kubenswrapper[4722]: I0309 14:26:40.122493 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="896c45ad-c78a-48cd-a84d-7852705df811" containerName="cinder-api-log" containerID="cri-o://38c12366a8db34c5ce5fe002d1b8fdf7d15151c41c2a6275c9f9ca4b87e6dc3a" gracePeriod=30 Mar 09 14:26:40 crc kubenswrapper[4722]: I0309 14:26:40.122815 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"896c45ad-c78a-48cd-a84d-7852705df811","Type":"ContainerStarted","Data":"0d0099139496608abed4e024a326ac55d0d3bf2362ba9bf1d2b568d64a22c66a"} Mar 09 14:26:40 crc kubenswrapper[4722]: I0309 14:26:40.126773 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 09 14:26:40 crc kubenswrapper[4722]: I0309 14:26:40.127184 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="896c45ad-c78a-48cd-a84d-7852705df811" containerName="cinder-api" containerID="cri-o://0d0099139496608abed4e024a326ac55d0d3bf2362ba9bf1d2b568d64a22c66a" gracePeriod=30 Mar 09 14:26:40 crc kubenswrapper[4722]: I0309 14:26:40.139422 4722 generic.go:334] "Generic (PLEG): container finished" podID="8066fa12-e1a3-483a-8c94-e7a59cc2f1d9" containerID="55247a0ac598f6b1b4061095ab5246e6b6508d186c1cd181cb3e595bf543cc3e" exitCode=0 Mar 09 14:26:40 crc kubenswrapper[4722]: I0309 14:26:40.139505 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-cb7fc4c44-jrv9s" event={"ID":"8066fa12-e1a3-483a-8c94-e7a59cc2f1d9","Type":"ContainerDied","Data":"55247a0ac598f6b1b4061095ab5246e6b6508d186c1cd181cb3e595bf543cc3e"} Mar 09 14:26:40 crc kubenswrapper[4722]: I0309 14:26:40.139530 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-cb7fc4c44-jrv9s" event={"ID":"8066fa12-e1a3-483a-8c94-e7a59cc2f1d9","Type":"ContainerDied","Data":"61d9b18c5dba6555fb05e13585a5ea0ad201dd7340176902c8d9b3e41445e7c6"} Mar 09 14:26:40 crc kubenswrapper[4722]: I0309 14:26:40.139526 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-cb7fc4c44-jrv9s" Mar 09 14:26:40 crc kubenswrapper[4722]: I0309 14:26:40.139548 4722 scope.go:117] "RemoveContainer" containerID="55247a0ac598f6b1b4061095ab5246e6b6508d186c1cd181cb3e595bf543cc3e" Mar 09 14:26:40 crc kubenswrapper[4722]: I0309 14:26:40.178928 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5487cbd8d5-crfpm" Mar 09 14:26:40 crc kubenswrapper[4722]: I0309 14:26:40.188550 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f2b2dd04-3dd3-4915-ab91-201d5bfb54b1","Type":"ContainerStarted","Data":"ac655597c39eb8a0179586c41d96cacd49308efbc35dd59979c55b269b34454d"} Mar 09 14:26:40 crc kubenswrapper[4722]: I0309 14:26:40.188587 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5487cbd8d5-crfpm" event={"ID":"a53b24c6-01ac-48c0-8c62-f27e8309de23","Type":"ContainerDied","Data":"c731c6cb05aebe7df0fcbb137dadee56fb6cb38e27f8832d1fd62cec2e951509"} Mar 09 14:26:40 crc kubenswrapper[4722]: I0309 14:26:40.206038 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.205993749 podStartE2EDuration="4.205993749s" podCreationTimestamp="2026-03-09 14:26:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:26:40.152734926 +0000 UTC m=+1440.708303502" watchObservedRunningTime="2026-03-09 14:26:40.205993749 +0000 UTC m=+1440.761562325" Mar 09 14:26:40 crc kubenswrapper[4722]: I0309 14:26:40.220031 4722 scope.go:117] "RemoveContainer" containerID="6be84d49059047100547341362493db59e9c0a5393e70a8126117748e190932e" Mar 09 14:26:40 crc kubenswrapper[4722]: I0309 14:26:40.272281 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-cb7fc4c44-jrv9s"] Mar 09 14:26:40 crc kubenswrapper[4722]: I0309 14:26:40.298300 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-cb7fc4c44-jrv9s"] Mar 09 14:26:40 crc kubenswrapper[4722]: I0309 14:26:40.298954 4722 scope.go:117] "RemoveContainer" containerID="55247a0ac598f6b1b4061095ab5246e6b6508d186c1cd181cb3e595bf543cc3e" Mar 09 14:26:40 crc kubenswrapper[4722]: E0309 14:26:40.299862 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55247a0ac598f6b1b4061095ab5246e6b6508d186c1cd181cb3e595bf543cc3e\": container with ID starting with 55247a0ac598f6b1b4061095ab5246e6b6508d186c1cd181cb3e595bf543cc3e not found: ID does not exist" containerID="55247a0ac598f6b1b4061095ab5246e6b6508d186c1cd181cb3e595bf543cc3e" Mar 09 14:26:40 crc kubenswrapper[4722]: I0309 14:26:40.299923 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55247a0ac598f6b1b4061095ab5246e6b6508d186c1cd181cb3e595bf543cc3e"} err="failed to get container status \"55247a0ac598f6b1b4061095ab5246e6b6508d186c1cd181cb3e595bf543cc3e\": rpc error: code = NotFound desc = could not find container \"55247a0ac598f6b1b4061095ab5246e6b6508d186c1cd181cb3e595bf543cc3e\": container with ID starting with 55247a0ac598f6b1b4061095ab5246e6b6508d186c1cd181cb3e595bf543cc3e not found: ID does not exist" Mar 09 14:26:40 crc kubenswrapper[4722]: I0309 14:26:40.299950 4722 scope.go:117] "RemoveContainer" containerID="6be84d49059047100547341362493db59e9c0a5393e70a8126117748e190932e" Mar 09 14:26:40 crc kubenswrapper[4722]: E0309 14:26:40.302987 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6be84d49059047100547341362493db59e9c0a5393e70a8126117748e190932e\": container with ID starting with 6be84d49059047100547341362493db59e9c0a5393e70a8126117748e190932e not found: ID does not exist" 
containerID="6be84d49059047100547341362493db59e9c0a5393e70a8126117748e190932e" Mar 09 14:26:40 crc kubenswrapper[4722]: I0309 14:26:40.303037 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6be84d49059047100547341362493db59e9c0a5393e70a8126117748e190932e"} err="failed to get container status \"6be84d49059047100547341362493db59e9c0a5393e70a8126117748e190932e\": rpc error: code = NotFound desc = could not find container \"6be84d49059047100547341362493db59e9c0a5393e70a8126117748e190932e\": container with ID starting with 6be84d49059047100547341362493db59e9c0a5393e70a8126117748e190932e not found: ID does not exist" Mar 09 14:26:40 crc kubenswrapper[4722]: I0309 14:26:40.303064 4722 scope.go:117] "RemoveContainer" containerID="3e86f1b2140a6860ab063b2ed7b54181844d7003819352ffb7cdf3e8f395b8a7" Mar 09 14:26:40 crc kubenswrapper[4722]: I0309 14:26:40.332276 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5487cbd8d5-crfpm"] Mar 09 14:26:40 crc kubenswrapper[4722]: I0309 14:26:40.342954 4722 scope.go:117] "RemoveContainer" containerID="d3ba4abf6a56fdb679f45127646186f63a7a511208e8f8b22d325ba9f4f9c918" Mar 09 14:26:40 crc kubenswrapper[4722]: I0309 14:26:40.349960 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5487cbd8d5-crfpm"] Mar 09 14:26:40 crc kubenswrapper[4722]: I0309 14:26:40.748709 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 14:26:40 crc kubenswrapper[4722]: I0309 14:26:40.845461 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34700700-cbd0-4a2e-b791-25b63e3de5b8-combined-ca-bundle\") pod \"34700700-cbd0-4a2e-b791-25b63e3de5b8\" (UID: \"34700700-cbd0-4a2e-b791-25b63e3de5b8\") " Mar 09 14:26:40 crc kubenswrapper[4722]: I0309 14:26:40.845551 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/34700700-cbd0-4a2e-b791-25b63e3de5b8-run-httpd\") pod \"34700700-cbd0-4a2e-b791-25b63e3de5b8\" (UID: \"34700700-cbd0-4a2e-b791-25b63e3de5b8\") " Mar 09 14:26:40 crc kubenswrapper[4722]: I0309 14:26:40.845575 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/34700700-cbd0-4a2e-b791-25b63e3de5b8-sg-core-conf-yaml\") pod \"34700700-cbd0-4a2e-b791-25b63e3de5b8\" (UID: \"34700700-cbd0-4a2e-b791-25b63e3de5b8\") " Mar 09 14:26:40 crc kubenswrapper[4722]: I0309 14:26:40.845753 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34700700-cbd0-4a2e-b791-25b63e3de5b8-config-data\") pod \"34700700-cbd0-4a2e-b791-25b63e3de5b8\" (UID: \"34700700-cbd0-4a2e-b791-25b63e3de5b8\") " Mar 09 14:26:40 crc kubenswrapper[4722]: I0309 14:26:40.845798 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/34700700-cbd0-4a2e-b791-25b63e3de5b8-log-httpd\") pod \"34700700-cbd0-4a2e-b791-25b63e3de5b8\" (UID: \"34700700-cbd0-4a2e-b791-25b63e3de5b8\") " Mar 09 14:26:40 crc kubenswrapper[4722]: I0309 14:26:40.845839 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34700700-cbd0-4a2e-b791-25b63e3de5b8-scripts\") pod 
\"34700700-cbd0-4a2e-b791-25b63e3de5b8\" (UID: \"34700700-cbd0-4a2e-b791-25b63e3de5b8\") " Mar 09 14:26:40 crc kubenswrapper[4722]: I0309 14:26:40.845914 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndlhs\" (UniqueName: \"kubernetes.io/projected/34700700-cbd0-4a2e-b791-25b63e3de5b8-kube-api-access-ndlhs\") pod \"34700700-cbd0-4a2e-b791-25b63e3de5b8\" (UID: \"34700700-cbd0-4a2e-b791-25b63e3de5b8\") " Mar 09 14:26:40 crc kubenswrapper[4722]: I0309 14:26:40.846641 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34700700-cbd0-4a2e-b791-25b63e3de5b8-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "34700700-cbd0-4a2e-b791-25b63e3de5b8" (UID: "34700700-cbd0-4a2e-b791-25b63e3de5b8"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:26:40 crc kubenswrapper[4722]: I0309 14:26:40.846681 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34700700-cbd0-4a2e-b791-25b63e3de5b8-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "34700700-cbd0-4a2e-b791-25b63e3de5b8" (UID: "34700700-cbd0-4a2e-b791-25b63e3de5b8"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:26:40 crc kubenswrapper[4722]: I0309 14:26:40.859385 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34700700-cbd0-4a2e-b791-25b63e3de5b8-kube-api-access-ndlhs" (OuterVolumeSpecName: "kube-api-access-ndlhs") pod "34700700-cbd0-4a2e-b791-25b63e3de5b8" (UID: "34700700-cbd0-4a2e-b791-25b63e3de5b8"). InnerVolumeSpecName "kube-api-access-ndlhs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:26:40 crc kubenswrapper[4722]: I0309 14:26:40.863390 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34700700-cbd0-4a2e-b791-25b63e3de5b8-scripts" (OuterVolumeSpecName: "scripts") pod "34700700-cbd0-4a2e-b791-25b63e3de5b8" (UID: "34700700-cbd0-4a2e-b791-25b63e3de5b8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:26:40 crc kubenswrapper[4722]: I0309 14:26:40.899029 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34700700-cbd0-4a2e-b791-25b63e3de5b8-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "34700700-cbd0-4a2e-b791-25b63e3de5b8" (UID: "34700700-cbd0-4a2e-b791-25b63e3de5b8"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:26:40 crc kubenswrapper[4722]: I0309 14:26:40.949659 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34700700-cbd0-4a2e-b791-25b63e3de5b8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "34700700-cbd0-4a2e-b791-25b63e3de5b8" (UID: "34700700-cbd0-4a2e-b791-25b63e3de5b8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:26:40 crc kubenswrapper[4722]: I0309 14:26:40.950376 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34700700-cbd0-4a2e-b791-25b63e3de5b8-combined-ca-bundle\") pod \"34700700-cbd0-4a2e-b791-25b63e3de5b8\" (UID: \"34700700-cbd0-4a2e-b791-25b63e3de5b8\") " Mar 09 14:26:40 crc kubenswrapper[4722]: W0309 14:26:40.950512 4722 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/34700700-cbd0-4a2e-b791-25b63e3de5b8/volumes/kubernetes.io~secret/combined-ca-bundle Mar 09 14:26:40 crc kubenswrapper[4722]: I0309 14:26:40.950547 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34700700-cbd0-4a2e-b791-25b63e3de5b8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "34700700-cbd0-4a2e-b791-25b63e3de5b8" (UID: "34700700-cbd0-4a2e-b791-25b63e3de5b8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:26:40 crc kubenswrapper[4722]: I0309 14:26:40.951272 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34700700-cbd0-4a2e-b791-25b63e3de5b8-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 14:26:40 crc kubenswrapper[4722]: I0309 14:26:40.951301 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndlhs\" (UniqueName: \"kubernetes.io/projected/34700700-cbd0-4a2e-b791-25b63e3de5b8-kube-api-access-ndlhs\") on node \"crc\" DevicePath \"\"" Mar 09 14:26:40 crc kubenswrapper[4722]: I0309 14:26:40.951316 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34700700-cbd0-4a2e-b791-25b63e3de5b8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:26:40 crc kubenswrapper[4722]: I0309 14:26:40.951328 4722 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/34700700-cbd0-4a2e-b791-25b63e3de5b8-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 14:26:40 crc kubenswrapper[4722]: I0309 14:26:40.951339 4722 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/34700700-cbd0-4a2e-b791-25b63e3de5b8-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 09 14:26:40 crc kubenswrapper[4722]: I0309 14:26:40.951350 4722 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/34700700-cbd0-4a2e-b791-25b63e3de5b8-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 14:26:40 crc kubenswrapper[4722]: I0309 14:26:40.963942 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34700700-cbd0-4a2e-b791-25b63e3de5b8-config-data" (OuterVolumeSpecName: "config-data") pod "34700700-cbd0-4a2e-b791-25b63e3de5b8" (UID: "34700700-cbd0-4a2e-b791-25b63e3de5b8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.013490 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.054569 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34700700-cbd0-4a2e-b791-25b63e3de5b8-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.155756 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/896c45ad-c78a-48cd-a84d-7852705df811-combined-ca-bundle\") pod \"896c45ad-c78a-48cd-a84d-7852705df811\" (UID: \"896c45ad-c78a-48cd-a84d-7852705df811\") " Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.155838 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/896c45ad-c78a-48cd-a84d-7852705df811-logs\") pod \"896c45ad-c78a-48cd-a84d-7852705df811\" (UID: \"896c45ad-c78a-48cd-a84d-7852705df811\") " Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.155864 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g99j5\" (UniqueName: \"kubernetes.io/projected/896c45ad-c78a-48cd-a84d-7852705df811-kube-api-access-g99j5\") pod \"896c45ad-c78a-48cd-a84d-7852705df811\" (UID: \"896c45ad-c78a-48cd-a84d-7852705df811\") " Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.155920 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/896c45ad-c78a-48cd-a84d-7852705df811-config-data-custom\") pod \"896c45ad-c78a-48cd-a84d-7852705df811\" (UID: \"896c45ad-c78a-48cd-a84d-7852705df811\") " Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.155959 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/896c45ad-c78a-48cd-a84d-7852705df811-config-data\") pod \"896c45ad-c78a-48cd-a84d-7852705df811\" (UID: \"896c45ad-c78a-48cd-a84d-7852705df811\") " Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.156093 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/896c45ad-c78a-48cd-a84d-7852705df811-etc-machine-id\") pod \"896c45ad-c78a-48cd-a84d-7852705df811\" (UID: \"896c45ad-c78a-48cd-a84d-7852705df811\") " Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.156225 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/896c45ad-c78a-48cd-a84d-7852705df811-scripts\") pod \"896c45ad-c78a-48cd-a84d-7852705df811\" (UID: \"896c45ad-c78a-48cd-a84d-7852705df811\") " Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.157850 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/896c45ad-c78a-48cd-a84d-7852705df811-logs" (OuterVolumeSpecName: "logs") pod "896c45ad-c78a-48cd-a84d-7852705df811" (UID: "896c45ad-c78a-48cd-a84d-7852705df811"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.157979 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/896c45ad-c78a-48cd-a84d-7852705df811-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "896c45ad-c78a-48cd-a84d-7852705df811" (UID: "896c45ad-c78a-48cd-a84d-7852705df811"). 
InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.162182 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/896c45ad-c78a-48cd-a84d-7852705df811-kube-api-access-g99j5" (OuterVolumeSpecName: "kube-api-access-g99j5") pod "896c45ad-c78a-48cd-a84d-7852705df811" (UID: "896c45ad-c78a-48cd-a84d-7852705df811"). InnerVolumeSpecName "kube-api-access-g99j5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.179543 4722 generic.go:334] "Generic (PLEG): container finished" podID="896c45ad-c78a-48cd-a84d-7852705df811" containerID="0d0099139496608abed4e024a326ac55d0d3bf2362ba9bf1d2b568d64a22c66a" exitCode=0 Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.179588 4722 generic.go:334] "Generic (PLEG): container finished" podID="896c45ad-c78a-48cd-a84d-7852705df811" containerID="38c12366a8db34c5ce5fe002d1b8fdf7d15151c41c2a6275c9f9ca4b87e6dc3a" exitCode=143 Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.179637 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"896c45ad-c78a-48cd-a84d-7852705df811","Type":"ContainerDied","Data":"0d0099139496608abed4e024a326ac55d0d3bf2362ba9bf1d2b568d64a22c66a"} Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.179693 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"896c45ad-c78a-48cd-a84d-7852705df811","Type":"ContainerDied","Data":"38c12366a8db34c5ce5fe002d1b8fdf7d15151c41c2a6275c9f9ca4b87e6dc3a"} Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.179708 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"896c45ad-c78a-48cd-a84d-7852705df811","Type":"ContainerDied","Data":"1f235d54cebc73f53c4a69e365684ea0655dc048457f90e257da710d1017839b"} Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.179727 4722 scope.go:117] "RemoveContainer" containerID="0d0099139496608abed4e024a326ac55d0d3bf2362ba9bf1d2b568d64a22c66a" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.179858 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.184703 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/896c45ad-c78a-48cd-a84d-7852705df811-scripts" (OuterVolumeSpecName: "scripts") pod "896c45ad-c78a-48cd-a84d-7852705df811" (UID: "896c45ad-c78a-48cd-a84d-7852705df811"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.184767 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/896c45ad-c78a-48cd-a84d-7852705df811-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "896c45ad-c78a-48cd-a84d-7852705df811" (UID: "896c45ad-c78a-48cd-a84d-7852705df811"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.188722 4722 generic.go:334] "Generic (PLEG): container finished" podID="34700700-cbd0-4a2e-b791-25b63e3de5b8" containerID="3b2426d4609239d48a635ae0e691919369f6b42c652697acdcc320f9f919a30b" exitCode=0 Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.188795 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"34700700-cbd0-4a2e-b791-25b63e3de5b8","Type":"ContainerDied","Data":"3b2426d4609239d48a635ae0e691919369f6b42c652697acdcc320f9f919a30b"} Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.188823 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"34700700-cbd0-4a2e-b791-25b63e3de5b8","Type":"ContainerDied","Data":"0ea2488b2edd679df22bff5d6545eaf705fc305844f02792062e8792aeb721c9"} Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.188831 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.192898 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f2b2dd04-3dd3-4915-ab91-201d5bfb54b1","Type":"ContainerStarted","Data":"098ab54ae06e8bd92c40a772e9383c8da1c225c9f461a643a0ec06a19468ca46"} Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.227952 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/896c45ad-c78a-48cd-a84d-7852705df811-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "896c45ad-c78a-48cd-a84d-7852705df811" (UID: "896c45ad-c78a-48cd-a84d-7852705df811"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.240733 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.569618575 podStartE2EDuration="5.240713298s" podCreationTimestamp="2026-03-09 14:26:36 +0000 UTC" firstStartedPulling="2026-03-09 14:26:37.365178164 +0000 UTC m=+1437.920746740" lastFinishedPulling="2026-03-09 14:26:38.036272887 +0000 UTC m=+1438.591841463" observedRunningTime="2026-03-09 14:26:41.214987827 +0000 UTC m=+1441.770556403" watchObservedRunningTime="2026-03-09 14:26:41.240713298 +0000 UTC m=+1441.796281874" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.245370 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/896c45ad-c78a-48cd-a84d-7852705df811-config-data" (OuterVolumeSpecName: "config-data") pod "896c45ad-c78a-48cd-a84d-7852705df811" (UID: "896c45ad-c78a-48cd-a84d-7852705df811"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.259017 4722 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/896c45ad-c78a-48cd-a84d-7852705df811-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.259052 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/896c45ad-c78a-48cd-a84d-7852705df811-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.259065 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/896c45ad-c78a-48cd-a84d-7852705df811-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.259073 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/896c45ad-c78a-48cd-a84d-7852705df811-logs\") on node \"crc\" DevicePath \"\"" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.259082 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g99j5\" (UniqueName: \"kubernetes.io/projected/896c45ad-c78a-48cd-a84d-7852705df811-kube-api-access-g99j5\") on node \"crc\" DevicePath \"\"" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.259094 4722 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/896c45ad-c78a-48cd-a84d-7852705df811-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.259102 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/896c45ad-c78a-48cd-a84d-7852705df811-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.330398 4722 scope.go:117] "RemoveContainer" containerID="38c12366a8db34c5ce5fe002d1b8fdf7d15151c41c2a6275c9f9ca4b87e6dc3a" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.365234 4722 scope.go:117] "RemoveContainer" containerID="0d0099139496608abed4e024a326ac55d0d3bf2362ba9bf1d2b568d64a22c66a" Mar 09 14:26:41 crc kubenswrapper[4722]: E0309 14:26:41.368920 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d0099139496608abed4e024a326ac55d0d3bf2362ba9bf1d2b568d64a22c66a\": container with ID starting with 0d0099139496608abed4e024a326ac55d0d3bf2362ba9bf1d2b568d64a22c66a not found: ID does not exist" containerID="0d0099139496608abed4e024a326ac55d0d3bf2362ba9bf1d2b568d64a22c66a" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.368969 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d0099139496608abed4e024a326ac55d0d3bf2362ba9bf1d2b568d64a22c66a"} err="failed to get container status \"0d0099139496608abed4e024a326ac55d0d3bf2362ba9bf1d2b568d64a22c66a\": rpc error: code = NotFound desc = could not find container \"0d0099139496608abed4e024a326ac55d0d3bf2362ba9bf1d2b568d64a22c66a\": container with ID starting with 0d0099139496608abed4e024a326ac55d0d3bf2362ba9bf1d2b568d64a22c66a not found: ID does not exist" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.369003 4722 scope.go:117] "RemoveContainer" containerID="38c12366a8db34c5ce5fe002d1b8fdf7d15151c41c2a6275c9f9ca4b87e6dc3a" Mar 09 14:26:41 crc kubenswrapper[4722]: E0309 
14:26:41.373680 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38c12366a8db34c5ce5fe002d1b8fdf7d15151c41c2a6275c9f9ca4b87e6dc3a\": container with ID starting with 38c12366a8db34c5ce5fe002d1b8fdf7d15151c41c2a6275c9f9ca4b87e6dc3a not found: ID does not exist" containerID="38c12366a8db34c5ce5fe002d1b8fdf7d15151c41c2a6275c9f9ca4b87e6dc3a" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.373727 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38c12366a8db34c5ce5fe002d1b8fdf7d15151c41c2a6275c9f9ca4b87e6dc3a"} err="failed to get container status \"38c12366a8db34c5ce5fe002d1b8fdf7d15151c41c2a6275c9f9ca4b87e6dc3a\": rpc error: code = NotFound desc = could not find container \"38c12366a8db34c5ce5fe002d1b8fdf7d15151c41c2a6275c9f9ca4b87e6dc3a\": container with ID starting with 38c12366a8db34c5ce5fe002d1b8fdf7d15151c41c2a6275c9f9ca4b87e6dc3a not found: ID does not exist" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.373755 4722 scope.go:117] "RemoveContainer" containerID="0d0099139496608abed4e024a326ac55d0d3bf2362ba9bf1d2b568d64a22c66a" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.374688 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d0099139496608abed4e024a326ac55d0d3bf2362ba9bf1d2b568d64a22c66a"} err="failed to get container status \"0d0099139496608abed4e024a326ac55d0d3bf2362ba9bf1d2b568d64a22c66a\": rpc error: code = NotFound desc = could not find container \"0d0099139496608abed4e024a326ac55d0d3bf2362ba9bf1d2b568d64a22c66a\": container with ID starting with 0d0099139496608abed4e024a326ac55d0d3bf2362ba9bf1d2b568d64a22c66a not found: ID does not exist" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.374708 4722 scope.go:117] "RemoveContainer" containerID="38c12366a8db34c5ce5fe002d1b8fdf7d15151c41c2a6275c9f9ca4b87e6dc3a" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.375045 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38c12366a8db34c5ce5fe002d1b8fdf7d15151c41c2a6275c9f9ca4b87e6dc3a"} err="failed to get container status \"38c12366a8db34c5ce5fe002d1b8fdf7d15151c41c2a6275c9f9ca4b87e6dc3a\": rpc error: code = NotFound desc = could not find container \"38c12366a8db34c5ce5fe002d1b8fdf7d15151c41c2a6275c9f9ca4b87e6dc3a\": container with ID starting with 38c12366a8db34c5ce5fe002d1b8fdf7d15151c41c2a6275c9f9ca4b87e6dc3a not found: ID does not exist" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.375068 4722 scope.go:117] "RemoveContainer" containerID="648af9c2448d9a573865eb523325d52d94f6162bb99fb1122a0292322aa61715" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.375796 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.393036 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.411015 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.411435 4722 scope.go:117] "RemoveContainer" containerID="cd01cfbd6236ef8ba1ab9a77a89a9b477fdb53cf600452555b20b52ab693210f" Mar 09 14:26:41 crc kubenswrapper[4722]: E0309 14:26:41.411655 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="896c45ad-c78a-48cd-a84d-7852705df811" 
containerName="cinder-api-log" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.411688 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="896c45ad-c78a-48cd-a84d-7852705df811" containerName="cinder-api-log" Mar 09 14:26:41 crc kubenswrapper[4722]: E0309 14:26:41.411699 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34700700-cbd0-4a2e-b791-25b63e3de5b8" containerName="ceilometer-notification-agent" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.411706 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="34700700-cbd0-4a2e-b791-25b63e3de5b8" containerName="ceilometer-notification-agent" Mar 09 14:26:41 crc kubenswrapper[4722]: E0309 14:26:41.411719 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a53b24c6-01ac-48c0-8c62-f27e8309de23" containerName="neutron-api" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.411725 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a53b24c6-01ac-48c0-8c62-f27e8309de23" containerName="neutron-api" Mar 09 14:26:41 crc kubenswrapper[4722]: E0309 14:26:41.411759 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8066fa12-e1a3-483a-8c94-e7a59cc2f1d9" containerName="barbican-api" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.411766 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="8066fa12-e1a3-483a-8c94-e7a59cc2f1d9" containerName="barbican-api" Mar 09 14:26:41 crc kubenswrapper[4722]: E0309 14:26:41.411783 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34700700-cbd0-4a2e-b791-25b63e3de5b8" containerName="proxy-httpd" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.411789 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="34700700-cbd0-4a2e-b791-25b63e3de5b8" containerName="proxy-httpd" Mar 09 14:26:41 crc kubenswrapper[4722]: E0309 14:26:41.411807 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8066fa12-e1a3-483a-8c94-e7a59cc2f1d9" containerName="barbican-api-log" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.411824 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="8066fa12-e1a3-483a-8c94-e7a59cc2f1d9" containerName="barbican-api-log" Mar 09 14:26:41 crc kubenswrapper[4722]: E0309 14:26:41.411839 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="896c45ad-c78a-48cd-a84d-7852705df811" containerName="cinder-api" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.411846 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="896c45ad-c78a-48cd-a84d-7852705df811" containerName="cinder-api" Mar 09 14:26:41 crc kubenswrapper[4722]: E0309 14:26:41.411864 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a53b24c6-01ac-48c0-8c62-f27e8309de23" containerName="neutron-httpd" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.411873 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a53b24c6-01ac-48c0-8c62-f27e8309de23" containerName="neutron-httpd" Mar 09 14:26:41 crc kubenswrapper[4722]: E0309 14:26:41.411894 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34700700-cbd0-4a2e-b791-25b63e3de5b8" containerName="sg-core" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.411903 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="34700700-cbd0-4a2e-b791-25b63e3de5b8" containerName="sg-core" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.412146 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="a53b24c6-01ac-48c0-8c62-f27e8309de23" 
containerName="neutron-api" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.412167 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="8066fa12-e1a3-483a-8c94-e7a59cc2f1d9" containerName="barbican-api-log" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.412183 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="34700700-cbd0-4a2e-b791-25b63e3de5b8" containerName="sg-core" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.412192 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="34700700-cbd0-4a2e-b791-25b63e3de5b8" containerName="proxy-httpd" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.412222 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="a53b24c6-01ac-48c0-8c62-f27e8309de23" containerName="neutron-httpd" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.412266 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="896c45ad-c78a-48cd-a84d-7852705df811" containerName="cinder-api" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.412285 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="34700700-cbd0-4a2e-b791-25b63e3de5b8" containerName="ceilometer-notification-agent" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.412292 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="8066fa12-e1a3-483a-8c94-e7a59cc2f1d9" containerName="barbican-api" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.412305 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="896c45ad-c78a-48cd-a84d-7852705df811" containerName="cinder-api-log" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.414772 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.417894 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.418118 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.425542 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.459686 4722 scope.go:117] "RemoveContainer" containerID="3b2426d4609239d48a635ae0e691919369f6b42c652697acdcc320f9f919a30b" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.467390 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2b1f498-8133-44c3-b9e5-fb0accca46b1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a2b1f498-8133-44c3-b9e5-fb0accca46b1\") " pod="openstack/ceilometer-0" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.467570 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a2b1f498-8133-44c3-b9e5-fb0accca46b1-run-httpd\") pod \"ceilometer-0\" (UID: \"a2b1f498-8133-44c3-b9e5-fb0accca46b1\") " pod="openstack/ceilometer-0" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.467708 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a2b1f498-8133-44c3-b9e5-fb0accca46b1-scripts\") pod \"ceilometer-0\" (UID: \"a2b1f498-8133-44c3-b9e5-fb0accca46b1\") " 
pod="openstack/ceilometer-0" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.467891 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a2b1f498-8133-44c3-b9e5-fb0accca46b1-log-httpd\") pod \"ceilometer-0\" (UID: \"a2b1f498-8133-44c3-b9e5-fb0accca46b1\") " pod="openstack/ceilometer-0" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.468241 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vftrz\" (UniqueName: \"kubernetes.io/projected/a2b1f498-8133-44c3-b9e5-fb0accca46b1-kube-api-access-vftrz\") pod \"ceilometer-0\" (UID: \"a2b1f498-8133-44c3-b9e5-fb0accca46b1\") " pod="openstack/ceilometer-0" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.468395 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2b1f498-8133-44c3-b9e5-fb0accca46b1-config-data\") pod \"ceilometer-0\" (UID: \"a2b1f498-8133-44c3-b9e5-fb0accca46b1\") " pod="openstack/ceilometer-0" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.468436 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a2b1f498-8133-44c3-b9e5-fb0accca46b1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a2b1f498-8133-44c3-b9e5-fb0accca46b1\") " pod="openstack/ceilometer-0" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.507663 4722 scope.go:117] "RemoveContainer" containerID="648af9c2448d9a573865eb523325d52d94f6162bb99fb1122a0292322aa61715" Mar 09 14:26:41 crc kubenswrapper[4722]: E0309 14:26:41.508541 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"648af9c2448d9a573865eb523325d52d94f6162bb99fb1122a0292322aa61715\": container with ID starting with 648af9c2448d9a573865eb523325d52d94f6162bb99fb1122a0292322aa61715 not found: ID does not exist" containerID="648af9c2448d9a573865eb523325d52d94f6162bb99fb1122a0292322aa61715" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.508583 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"648af9c2448d9a573865eb523325d52d94f6162bb99fb1122a0292322aa61715"} err="failed to get container status \"648af9c2448d9a573865eb523325d52d94f6162bb99fb1122a0292322aa61715\": rpc error: code = NotFound desc = could not find container \"648af9c2448d9a573865eb523325d52d94f6162bb99fb1122a0292322aa61715\": container with ID starting with 648af9c2448d9a573865eb523325d52d94f6162bb99fb1122a0292322aa61715 not found: ID does not exist" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.508622 4722 scope.go:117] "RemoveContainer" containerID="cd01cfbd6236ef8ba1ab9a77a89a9b477fdb53cf600452555b20b52ab693210f" Mar 09 14:26:41 crc kubenswrapper[4722]: E0309 14:26:41.509044 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd01cfbd6236ef8ba1ab9a77a89a9b477fdb53cf600452555b20b52ab693210f\": container with ID starting with cd01cfbd6236ef8ba1ab9a77a89a9b477fdb53cf600452555b20b52ab693210f not found: ID does not exist" containerID="cd01cfbd6236ef8ba1ab9a77a89a9b477fdb53cf600452555b20b52ab693210f" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.509072 4722 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"cd01cfbd6236ef8ba1ab9a77a89a9b477fdb53cf600452555b20b52ab693210f"} err="failed to get container status \"cd01cfbd6236ef8ba1ab9a77a89a9b477fdb53cf600452555b20b52ab693210f\": rpc error: code = NotFound desc = could not find container \"cd01cfbd6236ef8ba1ab9a77a89a9b477fdb53cf600452555b20b52ab693210f\": container with ID starting with cd01cfbd6236ef8ba1ab9a77a89a9b477fdb53cf600452555b20b52ab693210f not found: ID does not exist" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.509093 4722 scope.go:117] "RemoveContainer" containerID="3b2426d4609239d48a635ae0e691919369f6b42c652697acdcc320f9f919a30b" Mar 09 14:26:41 crc kubenswrapper[4722]: E0309 14:26:41.510771 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b2426d4609239d48a635ae0e691919369f6b42c652697acdcc320f9f919a30b\": container with ID starting with 3b2426d4609239d48a635ae0e691919369f6b42c652697acdcc320f9f919a30b not found: ID does not exist" containerID="3b2426d4609239d48a635ae0e691919369f6b42c652697acdcc320f9f919a30b" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.510842 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b2426d4609239d48a635ae0e691919369f6b42c652697acdcc320f9f919a30b"} err="failed to get container status \"3b2426d4609239d48a635ae0e691919369f6b42c652697acdcc320f9f919a30b\": rpc error: code = NotFound desc = could not find container \"3b2426d4609239d48a635ae0e691919369f6b42c652697acdcc320f9f919a30b\": container with ID starting with 3b2426d4609239d48a635ae0e691919369f6b42c652697acdcc320f9f919a30b not found: ID does not exist" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.522314 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.540777 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.563262 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.565765 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.565904 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.568795 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.569418 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.569549 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.569798 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vftrz\" (UniqueName: \"kubernetes.io/projected/a2b1f498-8133-44c3-b9e5-fb0accca46b1-kube-api-access-vftrz\") pod \"ceilometer-0\" (UID: \"a2b1f498-8133-44c3-b9e5-fb0accca46b1\") " pod="openstack/ceilometer-0" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.569897 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2b1f498-8133-44c3-b9e5-fb0accca46b1-config-data\") pod \"ceilometer-0\" (UID: \"a2b1f498-8133-44c3-b9e5-fb0accca46b1\") " pod="openstack/ceilometer-0" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.569926 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a2b1f498-8133-44c3-b9e5-fb0accca46b1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a2b1f498-8133-44c3-b9e5-fb0accca46b1\") " pod="openstack/ceilometer-0" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.569974 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2b1f498-8133-44c3-b9e5-fb0accca46b1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a2b1f498-8133-44c3-b9e5-fb0accca46b1\") " pod="openstack/ceilometer-0" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.570002 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a2b1f498-8133-44c3-b9e5-fb0accca46b1-run-httpd\") pod \"ceilometer-0\" (UID: \"a2b1f498-8133-44c3-b9e5-fb0accca46b1\") " pod="openstack/ceilometer-0" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.570023 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a2b1f498-8133-44c3-b9e5-fb0accca46b1-scripts\") pod \"ceilometer-0\" (UID: \"a2b1f498-8133-44c3-b9e5-fb0accca46b1\") " pod="openstack/ceilometer-0" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.570072 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a2b1f498-8133-44c3-b9e5-fb0accca46b1-log-httpd\") pod \"ceilometer-0\" (UID: \"a2b1f498-8133-44c3-b9e5-fb0accca46b1\") " pod="openstack/ceilometer-0" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.570597 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a2b1f498-8133-44c3-b9e5-fb0accca46b1-log-httpd\") pod \"ceilometer-0\" (UID: \"a2b1f498-8133-44c3-b9e5-fb0accca46b1\") " pod="openstack/ceilometer-0" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.571142 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/a2b1f498-8133-44c3-b9e5-fb0accca46b1-run-httpd\") pod \"ceilometer-0\" (UID: \"a2b1f498-8133-44c3-b9e5-fb0accca46b1\") " pod="openstack/ceilometer-0" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.575833 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a2b1f498-8133-44c3-b9e5-fb0accca46b1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a2b1f498-8133-44c3-b9e5-fb0accca46b1\") " pod="openstack/ceilometer-0" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.577169 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2b1f498-8133-44c3-b9e5-fb0accca46b1-config-data\") pod \"ceilometer-0\" (UID: \"a2b1f498-8133-44c3-b9e5-fb0accca46b1\") " pod="openstack/ceilometer-0" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.577327 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2b1f498-8133-44c3-b9e5-fb0accca46b1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a2b1f498-8133-44c3-b9e5-fb0accca46b1\") " pod="openstack/ceilometer-0" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.584766 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a2b1f498-8133-44c3-b9e5-fb0accca46b1-scripts\") pod \"ceilometer-0\" (UID: \"a2b1f498-8133-44c3-b9e5-fb0accca46b1\") " pod="openstack/ceilometer-0" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.590723 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vftrz\" (UniqueName: \"kubernetes.io/projected/a2b1f498-8133-44c3-b9e5-fb0accca46b1-kube-api-access-vftrz\") pod \"ceilometer-0\" (UID: \"a2b1f498-8133-44c3-b9e5-fb0accca46b1\") " pod="openstack/ceilometer-0" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.597615 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.672217 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dba3cf0e-d449-48f5-86c6-080681acf177-logs\") pod \"cinder-api-0\" (UID: \"dba3cf0e-d449-48f5-86c6-080681acf177\") " pod="openstack/cinder-api-0" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.672308 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dba3cf0e-d449-48f5-86c6-080681acf177-scripts\") pod \"cinder-api-0\" (UID: \"dba3cf0e-d449-48f5-86c6-080681acf177\") " pod="openstack/cinder-api-0" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.672446 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dba3cf0e-d449-48f5-86c6-080681acf177-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"dba3cf0e-d449-48f5-86c6-080681acf177\") " pod="openstack/cinder-api-0" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.672590 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dba3cf0e-d449-48f5-86c6-080681acf177-etc-machine-id\") pod \"cinder-api-0\" (UID: \"dba3cf0e-d449-48f5-86c6-080681acf177\") " pod="openstack/cinder-api-0" Mar 09 14:26:41 
crc kubenswrapper[4722]: I0309 14:26:41.672621 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dba3cf0e-d449-48f5-86c6-080681acf177-public-tls-certs\") pod \"cinder-api-0\" (UID: \"dba3cf0e-d449-48f5-86c6-080681acf177\") " pod="openstack/cinder-api-0" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.672670 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gt48\" (UniqueName: \"kubernetes.io/projected/dba3cf0e-d449-48f5-86c6-080681acf177-kube-api-access-4gt48\") pod \"cinder-api-0\" (UID: \"dba3cf0e-d449-48f5-86c6-080681acf177\") " pod="openstack/cinder-api-0" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.672888 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dba3cf0e-d449-48f5-86c6-080681acf177-config-data\") pod \"cinder-api-0\" (UID: \"dba3cf0e-d449-48f5-86c6-080681acf177\") " pod="openstack/cinder-api-0" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.672942 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dba3cf0e-d449-48f5-86c6-080681acf177-config-data-custom\") pod \"cinder-api-0\" (UID: \"dba3cf0e-d449-48f5-86c6-080681acf177\") " pod="openstack/cinder-api-0" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.673186 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dba3cf0e-d449-48f5-86c6-080681acf177-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"dba3cf0e-d449-48f5-86c6-080681acf177\") " pod="openstack/cinder-api-0" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.745088 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.775838 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dba3cf0e-d449-48f5-86c6-080681acf177-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"dba3cf0e-d449-48f5-86c6-080681acf177\") " pod="openstack/cinder-api-0" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.775957 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dba3cf0e-d449-48f5-86c6-080681acf177-logs\") pod \"cinder-api-0\" (UID: \"dba3cf0e-d449-48f5-86c6-080681acf177\") " pod="openstack/cinder-api-0" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.776024 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dba3cf0e-d449-48f5-86c6-080681acf177-scripts\") pod \"cinder-api-0\" (UID: \"dba3cf0e-d449-48f5-86c6-080681acf177\") " pod="openstack/cinder-api-0" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.776044 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dba3cf0e-d449-48f5-86c6-080681acf177-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"dba3cf0e-d449-48f5-86c6-080681acf177\") " pod="openstack/cinder-api-0" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.776079 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dba3cf0e-d449-48f5-86c6-080681acf177-etc-machine-id\") pod \"cinder-api-0\" (UID: \"dba3cf0e-d449-48f5-86c6-080681acf177\") " pod="openstack/cinder-api-0" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.776095 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dba3cf0e-d449-48f5-86c6-080681acf177-public-tls-certs\") pod \"cinder-api-0\" (UID: \"dba3cf0e-d449-48f5-86c6-080681acf177\") " pod="openstack/cinder-api-0" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.776120 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gt48\" (UniqueName: \"kubernetes.io/projected/dba3cf0e-d449-48f5-86c6-080681acf177-kube-api-access-4gt48\") pod \"cinder-api-0\" (UID: \"dba3cf0e-d449-48f5-86c6-080681acf177\") " pod="openstack/cinder-api-0" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.776145 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dba3cf0e-d449-48f5-86c6-080681acf177-config-data\") pod \"cinder-api-0\" (UID: \"dba3cf0e-d449-48f5-86c6-080681acf177\") " pod="openstack/cinder-api-0" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.776160 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dba3cf0e-d449-48f5-86c6-080681acf177-config-data-custom\") pod \"cinder-api-0\" (UID: \"dba3cf0e-d449-48f5-86c6-080681acf177\") " pod="openstack/cinder-api-0" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.777232 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dba3cf0e-d449-48f5-86c6-080681acf177-etc-machine-id\") pod \"cinder-api-0\" (UID: 
\"dba3cf0e-d449-48f5-86c6-080681acf177\") " pod="openstack/cinder-api-0" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.778529 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dba3cf0e-d449-48f5-86c6-080681acf177-logs\") pod \"cinder-api-0\" (UID: \"dba3cf0e-d449-48f5-86c6-080681acf177\") " pod="openstack/cinder-api-0" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.780986 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dba3cf0e-d449-48f5-86c6-080681acf177-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"dba3cf0e-d449-48f5-86c6-080681acf177\") " pod="openstack/cinder-api-0" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.781376 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dba3cf0e-d449-48f5-86c6-080681acf177-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"dba3cf0e-d449-48f5-86c6-080681acf177\") " pod="openstack/cinder-api-0" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.781959 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dba3cf0e-d449-48f5-86c6-080681acf177-scripts\") pod \"cinder-api-0\" (UID: \"dba3cf0e-d449-48f5-86c6-080681acf177\") " pod="openstack/cinder-api-0" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.782123 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dba3cf0e-d449-48f5-86c6-080681acf177-config-data-custom\") pod \"cinder-api-0\" (UID: \"dba3cf0e-d449-48f5-86c6-080681acf177\") " pod="openstack/cinder-api-0" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.782754 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dba3cf0e-d449-48f5-86c6-080681acf177-public-tls-certs\") pod \"cinder-api-0\" (UID: \"dba3cf0e-d449-48f5-86c6-080681acf177\") " pod="openstack/cinder-api-0" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.786093 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dba3cf0e-d449-48f5-86c6-080681acf177-config-data\") pod \"cinder-api-0\" (UID: \"dba3cf0e-d449-48f5-86c6-080681acf177\") " pod="openstack/cinder-api-0" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.804735 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gt48\" (UniqueName: \"kubernetes.io/projected/dba3cf0e-d449-48f5-86c6-080681acf177-kube-api-access-4gt48\") pod \"cinder-api-0\" (UID: \"dba3cf0e-d449-48f5-86c6-080681acf177\") " pod="openstack/cinder-api-0" Mar 09 14:26:41 crc kubenswrapper[4722]: I0309 14:26:41.950670 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 09 14:26:42 crc kubenswrapper[4722]: I0309 14:26:42.178971 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34700700-cbd0-4a2e-b791-25b63e3de5b8" path="/var/lib/kubelet/pods/34700700-cbd0-4a2e-b791-25b63e3de5b8/volumes" Mar 09 14:26:42 crc kubenswrapper[4722]: I0309 14:26:42.190701 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8066fa12-e1a3-483a-8c94-e7a59cc2f1d9" path="/var/lib/kubelet/pods/8066fa12-e1a3-483a-8c94-e7a59cc2f1d9/volumes" Mar 09 14:26:42 crc kubenswrapper[4722]: I0309 14:26:42.191615 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="896c45ad-c78a-48cd-a84d-7852705df811" path="/var/lib/kubelet/pods/896c45ad-c78a-48cd-a84d-7852705df811/volumes" Mar 09 14:26:42 crc kubenswrapper[4722]: I0309 14:26:42.192488 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a53b24c6-01ac-48c0-8c62-f27e8309de23" path="/var/lib/kubelet/pods/a53b24c6-01ac-48c0-8c62-f27e8309de23/volumes" Mar 09 14:26:42 crc kubenswrapper[4722]: I0309 14:26:42.223931 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 14:26:42 crc kubenswrapper[4722]: W0309 14:26:42.226412 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda2b1f498_8133_44c3_b9e5_fb0accca46b1.slice/crio-1eff49e536c100fb8b6b05bf268153f5fcbfdcb11219c3476d0f3462947cf7d5 WatchSource:0}: Error finding container 1eff49e536c100fb8b6b05bf268153f5fcbfdcb11219c3476d0f3462947cf7d5: Status 404 returned error can't find the container with id 1eff49e536c100fb8b6b05bf268153f5fcbfdcb11219c3476d0f3462947cf7d5 Mar 09 14:26:42 crc kubenswrapper[4722]: I0309 14:26:42.510012 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 09 14:26:42 crc kubenswrapper[4722]: W0309 14:26:42.513163 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddba3cf0e_d449_48f5_86c6_080681acf177.slice/crio-a1b94186e75f20e73c0d2b257b7302197260ed752ee52b081c65c95a29c26310 WatchSource:0}: Error finding container a1b94186e75f20e73c0d2b257b7302197260ed752ee52b081c65c95a29c26310: Status 404 returned error can't find the container with id a1b94186e75f20e73c0d2b257b7302197260ed752ee52b081c65c95a29c26310 Mar 09 14:26:43 crc kubenswrapper[4722]: I0309 14:26:43.229892 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a2b1f498-8133-44c3-b9e5-fb0accca46b1","Type":"ContainerStarted","Data":"4111597f6938af2698c7e65d183b920cc4a5a02e25e591a9d7c7e74ba34a1adc"} Mar 09 14:26:43 crc kubenswrapper[4722]: I0309 14:26:43.230359 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a2b1f498-8133-44c3-b9e5-fb0accca46b1","Type":"ContainerStarted","Data":"1eff49e536c100fb8b6b05bf268153f5fcbfdcb11219c3476d0f3462947cf7d5"} Mar 09 14:26:43 crc kubenswrapper[4722]: I0309 14:26:43.232579 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"dba3cf0e-d449-48f5-86c6-080681acf177","Type":"ContainerStarted","Data":"a1b94186e75f20e73c0d2b257b7302197260ed752ee52b081c65c95a29c26310"} Mar 09 14:26:44 crc kubenswrapper[4722]: I0309 14:26:44.244825 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"dba3cf0e-d449-48f5-86c6-080681acf177","Type":"ContainerStarted","Data":"ad0f60d04a6d6dec563c9334e90c82207b907bf570600de95fc9453d6d3c2b4e"} Mar 09 14:26:44 crc kubenswrapper[4722]: I0309 14:26:44.245451 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"dba3cf0e-d449-48f5-86c6-080681acf177","Type":"ContainerStarted","Data":"a3cc310e0a63330d8d783c3eb151d73b2f25cf5ce620a31a81aa2a467c02d4f5"} Mar 09 14:26:44 crc kubenswrapper[4722]: I0309 14:26:44.247179 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 09 14:26:44 crc kubenswrapper[4722]: I0309 14:26:44.249521 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a2b1f498-8133-44c3-b9e5-fb0accca46b1","Type":"ContainerStarted","Data":"8c4ef0b1b0525da141a7b474417b1d9f5f076ece945f64130e94d442a9b6d5c1"} Mar 09 14:26:45 crc kubenswrapper[4722]: I0309 14:26:45.264806 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a2b1f498-8133-44c3-b9e5-fb0accca46b1","Type":"ContainerStarted","Data":"5a99f9106014c8a16f08805e02978936d40105f4edfc311dc98e8ab1e37b2c97"} Mar 09 14:26:45 crc kubenswrapper[4722]: I0309 14:26:45.520893 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-77d5d5f47b-7zdbg" Mar 09 14:26:45 crc kubenswrapper[4722]: I0309 14:26:45.525554 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-77d5d5f47b-7zdbg" Mar 09 14:26:45 crc kubenswrapper[4722]: I0309 14:26:45.558640 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.558619629 podStartE2EDuration="4.558619629s" podCreationTimestamp="2026-03-09 14:26:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:26:44.275808997 +0000 UTC m=+1444.831377583" watchObservedRunningTime="2026-03-09 14:26:45.558619629 +0000 UTC m=+1446.114188205" Mar 09 14:26:46 crc kubenswrapper[4722]: I0309 14:26:46.751532 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c9776ccc5-hbhls" Mar 09 14:26:46 crc kubenswrapper[4722]: I0309 14:26:46.861012 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-k92sh"] Mar 09 14:26:46 crc kubenswrapper[4722]: I0309 14:26:46.869314 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-85ff748b95-k92sh" podUID="12245a01-16fe-424e-9fbb-59d906d90152" containerName="dnsmasq-dns" containerID="cri-o://8779974b5aec2bce4f7903ef8950a2bf797b650d511cc7b0ea5aa4fbafc22086" gracePeriod=10 Mar 09 14:26:46 crc kubenswrapper[4722]: I0309 14:26:46.923544 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 09 14:26:46 crc kubenswrapper[4722]: I0309 14:26:46.984821 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 09 14:26:47 crc kubenswrapper[4722]: I0309 14:26:47.318882 4722 generic.go:334] "Generic (PLEG): container finished" podID="12245a01-16fe-424e-9fbb-59d906d90152" containerID="8779974b5aec2bce4f7903ef8950a2bf797b650d511cc7b0ea5aa4fbafc22086" exitCode=0 Mar 09 14:26:47 crc kubenswrapper[4722]: I0309 14:26:47.318954 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-85ff748b95-k92sh" event={"ID":"12245a01-16fe-424e-9fbb-59d906d90152","Type":"ContainerDied","Data":"8779974b5aec2bce4f7903ef8950a2bf797b650d511cc7b0ea5aa4fbafc22086"} Mar 09 14:26:47 crc kubenswrapper[4722]: I0309 14:26:47.329285 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a2b1f498-8133-44c3-b9e5-fb0accca46b1","Type":"ContainerStarted","Data":"d72d728f1904b9bb92a777806f4d320dcd1558801a3377ecfb3691c9c51e8ef3"} Mar 09 14:26:47 crc kubenswrapper[4722]: I0309 14:26:47.329350 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="f2b2dd04-3dd3-4915-ab91-201d5bfb54b1" containerName="cinder-scheduler" containerID="cri-o://ac655597c39eb8a0179586c41d96cacd49308efbc35dd59979c55b269b34454d" gracePeriod=30 Mar 09 14:26:47 crc kubenswrapper[4722]: I0309 14:26:47.329446 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="f2b2dd04-3dd3-4915-ab91-201d5bfb54b1" containerName="probe" containerID="cri-o://098ab54ae06e8bd92c40a772e9383c8da1c225c9f461a643a0ec06a19468ca46" gracePeriod=30 Mar 09 14:26:47 crc kubenswrapper[4722]: I0309 14:26:47.457798 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.362556807 podStartE2EDuration="6.457776498s" podCreationTimestamp="2026-03-09 14:26:41 +0000 UTC" firstStartedPulling="2026-03-09 14:26:42.22847809 +0000 UTC m=+1442.784046666" lastFinishedPulling="2026-03-09 14:26:46.323697781 +0000 UTC m=+1446.879266357" observedRunningTime="2026-03-09 14:26:47.369898508 +0000 UTC m=+1447.925467084" watchObservedRunningTime="2026-03-09 14:26:47.457776498 +0000 UTC m=+1448.013345074" Mar 09 14:26:47 crc kubenswrapper[4722]: I0309 14:26:47.728115 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-k92sh" Mar 09 14:26:47 crc kubenswrapper[4722]: I0309 14:26:47.778849 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7dhc\" (UniqueName: \"kubernetes.io/projected/12245a01-16fe-424e-9fbb-59d906d90152-kube-api-access-n7dhc\") pod \"12245a01-16fe-424e-9fbb-59d906d90152\" (UID: \"12245a01-16fe-424e-9fbb-59d906d90152\") " Mar 09 14:26:47 crc kubenswrapper[4722]: I0309 14:26:47.778897 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/12245a01-16fe-424e-9fbb-59d906d90152-ovsdbserver-nb\") pod \"12245a01-16fe-424e-9fbb-59d906d90152\" (UID: \"12245a01-16fe-424e-9fbb-59d906d90152\") " Mar 09 14:26:47 crc kubenswrapper[4722]: I0309 14:26:47.778922 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/12245a01-16fe-424e-9fbb-59d906d90152-ovsdbserver-sb\") pod \"12245a01-16fe-424e-9fbb-59d906d90152\" (UID: \"12245a01-16fe-424e-9fbb-59d906d90152\") " Mar 09 14:26:47 crc kubenswrapper[4722]: I0309 14:26:47.779073 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12245a01-16fe-424e-9fbb-59d906d90152-config\") pod \"12245a01-16fe-424e-9fbb-59d906d90152\" (UID: \"12245a01-16fe-424e-9fbb-59d906d90152\") " Mar 09 14:26:47 crc kubenswrapper[4722]: I0309 14:26:47.779151 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/12245a01-16fe-424e-9fbb-59d906d90152-dns-svc\") pod \"12245a01-16fe-424e-9fbb-59d906d90152\" (UID: \"12245a01-16fe-424e-9fbb-59d906d90152\") " Mar 09 14:26:47 crc kubenswrapper[4722]: I0309 14:26:47.779269 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/12245a01-16fe-424e-9fbb-59d906d90152-dns-swift-storage-0\") pod \"12245a01-16fe-424e-9fbb-59d906d90152\" (UID: \"12245a01-16fe-424e-9fbb-59d906d90152\") " Mar 09 14:26:47 crc kubenswrapper[4722]: I0309 14:26:47.832765 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12245a01-16fe-424e-9fbb-59d906d90152-kube-api-access-n7dhc" (OuterVolumeSpecName: "kube-api-access-n7dhc") pod "12245a01-16fe-424e-9fbb-59d906d90152" (UID: "12245a01-16fe-424e-9fbb-59d906d90152"). InnerVolumeSpecName "kube-api-access-n7dhc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:26:47 crc kubenswrapper[4722]: I0309 14:26:47.882301 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7dhc\" (UniqueName: \"kubernetes.io/projected/12245a01-16fe-424e-9fbb-59d906d90152-kube-api-access-n7dhc\") on node \"crc\" DevicePath \"\"" Mar 09 14:26:47 crc kubenswrapper[4722]: I0309 14:26:47.918562 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12245a01-16fe-424e-9fbb-59d906d90152-config" (OuterVolumeSpecName: "config") pod "12245a01-16fe-424e-9fbb-59d906d90152" (UID: "12245a01-16fe-424e-9fbb-59d906d90152"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:26:47 crc kubenswrapper[4722]: I0309 14:26:47.975376 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12245a01-16fe-424e-9fbb-59d906d90152-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "12245a01-16fe-424e-9fbb-59d906d90152" (UID: "12245a01-16fe-424e-9fbb-59d906d90152"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:26:47 crc kubenswrapper[4722]: I0309 14:26:47.988172 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12245a01-16fe-424e-9fbb-59d906d90152-config\") on node \"crc\" DevicePath \"\"" Mar 09 14:26:47 crc kubenswrapper[4722]: I0309 14:26:47.988545 4722 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/12245a01-16fe-424e-9fbb-59d906d90152-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 09 14:26:47 crc kubenswrapper[4722]: I0309 14:26:47.992758 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12245a01-16fe-424e-9fbb-59d906d90152-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "12245a01-16fe-424e-9fbb-59d906d90152" (UID: "12245a01-16fe-424e-9fbb-59d906d90152"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:26:48 crc kubenswrapper[4722]: I0309 14:26:48.014959 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12245a01-16fe-424e-9fbb-59d906d90152-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "12245a01-16fe-424e-9fbb-59d906d90152" (UID: "12245a01-16fe-424e-9fbb-59d906d90152"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:26:48 crc kubenswrapper[4722]: I0309 14:26:48.015813 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12245a01-16fe-424e-9fbb-59d906d90152-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "12245a01-16fe-424e-9fbb-59d906d90152" (UID: "12245a01-16fe-424e-9fbb-59d906d90152"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:26:48 crc kubenswrapper[4722]: I0309 14:26:48.089982 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/12245a01-16fe-424e-9fbb-59d906d90152-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 09 14:26:48 crc kubenswrapper[4722]: I0309 14:26:48.090369 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/12245a01-16fe-424e-9fbb-59d906d90152-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 09 14:26:48 crc kubenswrapper[4722]: I0309 14:26:48.090381 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/12245a01-16fe-424e-9fbb-59d906d90152-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 14:26:48 crc kubenswrapper[4722]: I0309 14:26:48.347345 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-k92sh" event={"ID":"12245a01-16fe-424e-9fbb-59d906d90152","Type":"ContainerDied","Data":"605be6803ebfb0e06597572f94f7e4bf290f0d27028740b80b6c3e404eba40aa"} Mar 09 14:26:48 crc kubenswrapper[4722]: I0309 14:26:48.347434 4722 scope.go:117] "RemoveContainer" containerID="8779974b5aec2bce4f7903ef8950a2bf797b650d511cc7b0ea5aa4fbafc22086" Mar 09 14:26:48 crc kubenswrapper[4722]: I0309 14:26:48.348224 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-k92sh" Mar 09 14:26:48 crc kubenswrapper[4722]: I0309 14:26:48.348485 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 09 14:26:48 crc kubenswrapper[4722]: I0309 14:26:48.386037 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-k92sh"] Mar 09 14:26:48 crc kubenswrapper[4722]: I0309 14:26:48.406182 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-k92sh"] Mar 09 14:26:48 crc kubenswrapper[4722]: I0309 14:26:48.407191 4722 scope.go:117] "RemoveContainer" containerID="afb30c9b889b0ef73da00845124667b5134eba739a3db9c077ed82aae17d15e2" Mar 09 14:26:49 crc kubenswrapper[4722]: I0309 14:26:49.358476 4722 generic.go:334] "Generic (PLEG): container finished" podID="f2b2dd04-3dd3-4915-ab91-201d5bfb54b1" containerID="098ab54ae06e8bd92c40a772e9383c8da1c225c9f461a643a0ec06a19468ca46" exitCode=0 Mar 09 14:26:49 crc kubenswrapper[4722]: I0309 14:26:49.358534 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f2b2dd04-3dd3-4915-ab91-201d5bfb54b1","Type":"ContainerDied","Data":"098ab54ae06e8bd92c40a772e9383c8da1c225c9f461a643a0ec06a19468ca46"} Mar 09 14:26:50 crc kubenswrapper[4722]: I0309 14:26:50.160492 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12245a01-16fe-424e-9fbb-59d906d90152" path="/var/lib/kubelet/pods/12245a01-16fe-424e-9fbb-59d906d90152/volumes" Mar 09 14:26:50 crc kubenswrapper[4722]: I0309 14:26:50.302884 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7ccbf8d8bb-gjl7f" Mar 09 14:26:50 crc kubenswrapper[4722]: I0309 14:26:50.578730 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7ccbf8d8bb-gjl7f" Mar 09 14:26:50 crc kubenswrapper[4722]: I0309 14:26:50.650503 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-77d5d5f47b-7zdbg"] Mar 09 14:26:50 crc 
kubenswrapper[4722]: I0309 14:26:50.650790 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-77d5d5f47b-7zdbg" podUID="154add5d-d1cd-4c93-ae63-3c7dfe4cb035" containerName="placement-log" containerID="cri-o://3c9158dc8fcb0bfa1ebb4b61def0600239b2bd11f0845392eaa82372894666db" gracePeriod=30 Mar 09 14:26:50 crc kubenswrapper[4722]: I0309 14:26:50.652686 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-77d5d5f47b-7zdbg" podUID="154add5d-d1cd-4c93-ae63-3c7dfe4cb035" containerName="placement-api" containerID="cri-o://17e660b299cb6418cfd87beaf38373cbd8fa8b6a05a78e24fc2b2cec4cdbc8a0" gracePeriod=30 Mar 09 14:26:51 crc kubenswrapper[4722]: I0309 14:26:51.172330 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-8495c95fc9-42qqz" Mar 09 14:26:51 crc kubenswrapper[4722]: I0309 14:26:51.431838 4722 generic.go:334] "Generic (PLEG): container finished" podID="154add5d-d1cd-4c93-ae63-3c7dfe4cb035" containerID="3c9158dc8fcb0bfa1ebb4b61def0600239b2bd11f0845392eaa82372894666db" exitCode=143 Mar 09 14:26:51 crc kubenswrapper[4722]: I0309 14:26:51.431931 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-77d5d5f47b-7zdbg" event={"ID":"154add5d-d1cd-4c93-ae63-3c7dfe4cb035","Type":"ContainerDied","Data":"3c9158dc8fcb0bfa1ebb4b61def0600239b2bd11f0845392eaa82372894666db"} Mar 09 14:26:52 crc kubenswrapper[4722]: I0309 14:26:52.282132 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 09 14:26:52 crc kubenswrapper[4722]: I0309 14:26:52.391485 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f2b2dd04-3dd3-4915-ab91-201d5bfb54b1-etc-machine-id\") pod \"f2b2dd04-3dd3-4915-ab91-201d5bfb54b1\" (UID: \"f2b2dd04-3dd3-4915-ab91-201d5bfb54b1\") " Mar 09 14:26:52 crc kubenswrapper[4722]: I0309 14:26:52.391551 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vkkwx\" (UniqueName: \"kubernetes.io/projected/f2b2dd04-3dd3-4915-ab91-201d5bfb54b1-kube-api-access-vkkwx\") pod \"f2b2dd04-3dd3-4915-ab91-201d5bfb54b1\" (UID: \"f2b2dd04-3dd3-4915-ab91-201d5bfb54b1\") " Mar 09 14:26:52 crc kubenswrapper[4722]: I0309 14:26:52.391607 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2b2dd04-3dd3-4915-ab91-201d5bfb54b1-config-data\") pod \"f2b2dd04-3dd3-4915-ab91-201d5bfb54b1\" (UID: \"f2b2dd04-3dd3-4915-ab91-201d5bfb54b1\") " Mar 09 14:26:52 crc kubenswrapper[4722]: I0309 14:26:52.391644 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f2b2dd04-3dd3-4915-ab91-201d5bfb54b1-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "f2b2dd04-3dd3-4915-ab91-201d5bfb54b1" (UID: "f2b2dd04-3dd3-4915-ab91-201d5bfb54b1"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 14:26:52 crc kubenswrapper[4722]: I0309 14:26:52.391718 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f2b2dd04-3dd3-4915-ab91-201d5bfb54b1-config-data-custom\") pod \"f2b2dd04-3dd3-4915-ab91-201d5bfb54b1\" (UID: \"f2b2dd04-3dd3-4915-ab91-201d5bfb54b1\") " Mar 09 14:26:52 crc kubenswrapper[4722]: I0309 14:26:52.391874 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2b2dd04-3dd3-4915-ab91-201d5bfb54b1-scripts\") pod \"f2b2dd04-3dd3-4915-ab91-201d5bfb54b1\" (UID: \"f2b2dd04-3dd3-4915-ab91-201d5bfb54b1\") " Mar 09 14:26:52 crc kubenswrapper[4722]: I0309 14:26:52.391949 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2b2dd04-3dd3-4915-ab91-201d5bfb54b1-combined-ca-bundle\") pod \"f2b2dd04-3dd3-4915-ab91-201d5bfb54b1\" (UID: \"f2b2dd04-3dd3-4915-ab91-201d5bfb54b1\") " Mar 09 14:26:52 crc kubenswrapper[4722]: I0309 14:26:52.392495 4722 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f2b2dd04-3dd3-4915-ab91-201d5bfb54b1-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 09 14:26:52 crc kubenswrapper[4722]: I0309 14:26:52.407492 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2b2dd04-3dd3-4915-ab91-201d5bfb54b1-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "f2b2dd04-3dd3-4915-ab91-201d5bfb54b1" (UID: "f2b2dd04-3dd3-4915-ab91-201d5bfb54b1"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:26:52 crc kubenswrapper[4722]: I0309 14:26:52.409477 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2b2dd04-3dd3-4915-ab91-201d5bfb54b1-kube-api-access-vkkwx" (OuterVolumeSpecName: "kube-api-access-vkkwx") pod "f2b2dd04-3dd3-4915-ab91-201d5bfb54b1" (UID: "f2b2dd04-3dd3-4915-ab91-201d5bfb54b1"). InnerVolumeSpecName "kube-api-access-vkkwx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:26:52 crc kubenswrapper[4722]: I0309 14:26:52.411794 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2b2dd04-3dd3-4915-ab91-201d5bfb54b1-scripts" (OuterVolumeSpecName: "scripts") pod "f2b2dd04-3dd3-4915-ab91-201d5bfb54b1" (UID: "f2b2dd04-3dd3-4915-ab91-201d5bfb54b1"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:26:52 crc kubenswrapper[4722]: I0309 14:26:52.448048 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 09 14:26:52 crc kubenswrapper[4722]: E0309 14:26:52.449521 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12245a01-16fe-424e-9fbb-59d906d90152" containerName="dnsmasq-dns" Mar 09 14:26:52 crc kubenswrapper[4722]: I0309 14:26:52.449540 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="12245a01-16fe-424e-9fbb-59d906d90152" containerName="dnsmasq-dns" Mar 09 14:26:52 crc kubenswrapper[4722]: E0309 14:26:52.449550 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2b2dd04-3dd3-4915-ab91-201d5bfb54b1" containerName="probe" Mar 09 14:26:52 crc kubenswrapper[4722]: I0309 14:26:52.449557 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2b2dd04-3dd3-4915-ab91-201d5bfb54b1" containerName="probe" Mar 09 14:26:52 crc kubenswrapper[4722]: E0309 14:26:52.449592 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2b2dd04-3dd3-4915-ab91-201d5bfb54b1" containerName="cinder-scheduler" Mar 09 14:26:52 crc kubenswrapper[4722]: I0309 14:26:52.449599 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2b2dd04-3dd3-4915-ab91-201d5bfb54b1" containerName="cinder-scheduler" Mar 09 14:26:52 crc kubenswrapper[4722]: E0309 14:26:52.449614 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12245a01-16fe-424e-9fbb-59d906d90152" containerName="init" Mar 09 14:26:52 crc kubenswrapper[4722]: I0309 14:26:52.449620 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="12245a01-16fe-424e-9fbb-59d906d90152" containerName="init" Mar 09 14:26:52 crc kubenswrapper[4722]: I0309 14:26:52.449844 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="12245a01-16fe-424e-9fbb-59d906d90152" containerName="dnsmasq-dns" Mar 09 14:26:52 crc kubenswrapper[4722]: I0309 14:26:52.449865 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2b2dd04-3dd3-4915-ab91-201d5bfb54b1" containerName="probe" Mar 09 14:26:52 crc kubenswrapper[4722]: I0309 14:26:52.449876 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2b2dd04-3dd3-4915-ab91-201d5bfb54b1" containerName="cinder-scheduler" Mar 09 14:26:52 crc kubenswrapper[4722]: I0309 14:26:52.450589 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 09 14:26:52 crc kubenswrapper[4722]: I0309 14:26:52.453877 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-7l6pp" Mar 09 14:26:52 crc kubenswrapper[4722]: I0309 14:26:52.457913 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 09 14:26:52 crc kubenswrapper[4722]: I0309 14:26:52.458057 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 09 14:26:52 crc kubenswrapper[4722]: I0309 14:26:52.466510 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 09 14:26:52 crc kubenswrapper[4722]: I0309 14:26:52.473742 4722 generic.go:334] "Generic (PLEG): container finished" podID="f2b2dd04-3dd3-4915-ab91-201d5bfb54b1" containerID="ac655597c39eb8a0179586c41d96cacd49308efbc35dd59979c55b269b34454d" exitCode=0 Mar 09 14:26:52 crc kubenswrapper[4722]: I0309 14:26:52.473785 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f2b2dd04-3dd3-4915-ab91-201d5bfb54b1","Type":"ContainerDied","Data":"ac655597c39eb8a0179586c41d96cacd49308efbc35dd59979c55b269b34454d"} Mar 09 14:26:52 crc kubenswrapper[4722]: I0309 14:26:52.473821 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"f2b2dd04-3dd3-4915-ab91-201d5bfb54b1","Type":"ContainerDied","Data":"140be863383b8c2a361522e8005af37d00b25161c36a6b3b4ea9d87016f85b81"} Mar 09 14:26:52 crc kubenswrapper[4722]: I0309 14:26:52.473837 4722 scope.go:117] "RemoveContainer" containerID="098ab54ae06e8bd92c40a772e9383c8da1c225c9f461a643a0ec06a19468ca46" Mar 09 14:26:52 crc kubenswrapper[4722]: I0309 14:26:52.474348 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 09 14:26:52 crc kubenswrapper[4722]: I0309 14:26:52.499923 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mw9fv\" (UniqueName: \"kubernetes.io/projected/363b3657-6ba6-40e5-a353-7e1440ce3d01-kube-api-access-mw9fv\") pod \"openstackclient\" (UID: \"363b3657-6ba6-40e5-a353-7e1440ce3d01\") " pod="openstack/openstackclient" Mar 09 14:26:52 crc kubenswrapper[4722]: I0309 14:26:52.499968 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/363b3657-6ba6-40e5-a353-7e1440ce3d01-combined-ca-bundle\") pod \"openstackclient\" (UID: \"363b3657-6ba6-40e5-a353-7e1440ce3d01\") " pod="openstack/openstackclient" Mar 09 14:26:52 crc kubenswrapper[4722]: I0309 14:26:52.500017 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/363b3657-6ba6-40e5-a353-7e1440ce3d01-openstack-config-secret\") pod \"openstackclient\" (UID: \"363b3657-6ba6-40e5-a353-7e1440ce3d01\") " pod="openstack/openstackclient" Mar 09 14:26:52 crc kubenswrapper[4722]: I0309 14:26:52.500057 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/363b3657-6ba6-40e5-a353-7e1440ce3d01-openstack-config\") pod \"openstackclient\" (UID: \"363b3657-6ba6-40e5-a353-7e1440ce3d01\") " pod="openstack/openstackclient" Mar 09 14:26:52 crc kubenswrapper[4722]: I0309 14:26:52.500772 4722 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f2b2dd04-3dd3-4915-ab91-201d5bfb54b1-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 09 14:26:52 crc kubenswrapper[4722]: I0309 14:26:52.500830 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2b2dd04-3dd3-4915-ab91-201d5bfb54b1-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 14:26:52 crc kubenswrapper[4722]: I0309 14:26:52.500842 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vkkwx\" (UniqueName: \"kubernetes.io/projected/f2b2dd04-3dd3-4915-ab91-201d5bfb54b1-kube-api-access-vkkwx\") on node \"crc\" DevicePath \"\"" Mar 09 14:26:52 crc kubenswrapper[4722]: I0309 14:26:52.514311 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2b2dd04-3dd3-4915-ab91-201d5bfb54b1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f2b2dd04-3dd3-4915-ab91-201d5bfb54b1" (UID: "f2b2dd04-3dd3-4915-ab91-201d5bfb54b1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:26:52 crc kubenswrapper[4722]: I0309 14:26:52.541479 4722 scope.go:117] "RemoveContainer" containerID="ac655597c39eb8a0179586c41d96cacd49308efbc35dd59979c55b269b34454d" Mar 09 14:26:52 crc kubenswrapper[4722]: I0309 14:26:52.549356 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2b2dd04-3dd3-4915-ab91-201d5bfb54b1-config-data" (OuterVolumeSpecName: "config-data") pod "f2b2dd04-3dd3-4915-ab91-201d5bfb54b1" (UID: "f2b2dd04-3dd3-4915-ab91-201d5bfb54b1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:26:52 crc kubenswrapper[4722]: I0309 14:26:52.568627 4722 scope.go:117] "RemoveContainer" containerID="098ab54ae06e8bd92c40a772e9383c8da1c225c9f461a643a0ec06a19468ca46" Mar 09 14:26:52 crc kubenswrapper[4722]: E0309 14:26:52.569005 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"098ab54ae06e8bd92c40a772e9383c8da1c225c9f461a643a0ec06a19468ca46\": container with ID starting with 098ab54ae06e8bd92c40a772e9383c8da1c225c9f461a643a0ec06a19468ca46 not found: ID does not exist" containerID="098ab54ae06e8bd92c40a772e9383c8da1c225c9f461a643a0ec06a19468ca46" Mar 09 14:26:52 crc kubenswrapper[4722]: I0309 14:26:52.569039 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"098ab54ae06e8bd92c40a772e9383c8da1c225c9f461a643a0ec06a19468ca46"} err="failed to get container status \"098ab54ae06e8bd92c40a772e9383c8da1c225c9f461a643a0ec06a19468ca46\": rpc error: code = NotFound desc = could not find container \"098ab54ae06e8bd92c40a772e9383c8da1c225c9f461a643a0ec06a19468ca46\": container with ID starting with 098ab54ae06e8bd92c40a772e9383c8da1c225c9f461a643a0ec06a19468ca46 not found: ID does not exist" Mar 09 14:26:52 crc kubenswrapper[4722]: I0309 14:26:52.569060 4722 scope.go:117] "RemoveContainer" containerID="ac655597c39eb8a0179586c41d96cacd49308efbc35dd59979c55b269b34454d" Mar 09 14:26:52 crc kubenswrapper[4722]: E0309 14:26:52.569384 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac655597c39eb8a0179586c41d96cacd49308efbc35dd59979c55b269b34454d\": container with ID starting with ac655597c39eb8a0179586c41d96cacd49308efbc35dd59979c55b269b34454d not found: ID does not exist" containerID="ac655597c39eb8a0179586c41d96cacd49308efbc35dd59979c55b269b34454d" Mar 09 14:26:52 crc kubenswrapper[4722]: I0309 14:26:52.569406 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac655597c39eb8a0179586c41d96cacd49308efbc35dd59979c55b269b34454d"} err="failed to get container status \"ac655597c39eb8a0179586c41d96cacd49308efbc35dd59979c55b269b34454d\": rpc error: code = NotFound desc = could not find container \"ac655597c39eb8a0179586c41d96cacd49308efbc35dd59979c55b269b34454d\": container with ID starting with ac655597c39eb8a0179586c41d96cacd49308efbc35dd59979c55b269b34454d not found: ID does not exist" Mar 09 14:26:52 crc kubenswrapper[4722]: I0309 14:26:52.603153 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mw9fv\" (UniqueName: \"kubernetes.io/projected/363b3657-6ba6-40e5-a353-7e1440ce3d01-kube-api-access-mw9fv\") pod \"openstackclient\" (UID: \"363b3657-6ba6-40e5-a353-7e1440ce3d01\") " pod="openstack/openstackclient" Mar 09 14:26:52 crc kubenswrapper[4722]: I0309 14:26:52.603189 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/363b3657-6ba6-40e5-a353-7e1440ce3d01-combined-ca-bundle\") pod \"openstackclient\" (UID: \"363b3657-6ba6-40e5-a353-7e1440ce3d01\") " pod="openstack/openstackclient" Mar 09 14:26:52 crc kubenswrapper[4722]: I0309 14:26:52.603249 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/363b3657-6ba6-40e5-a353-7e1440ce3d01-openstack-config-secret\") pod 
\"openstackclient\" (UID: \"363b3657-6ba6-40e5-a353-7e1440ce3d01\") " pod="openstack/openstackclient" Mar 09 14:26:52 crc kubenswrapper[4722]: I0309 14:26:52.603305 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/363b3657-6ba6-40e5-a353-7e1440ce3d01-openstack-config\") pod \"openstackclient\" (UID: \"363b3657-6ba6-40e5-a353-7e1440ce3d01\") " pod="openstack/openstackclient" Mar 09 14:26:52 crc kubenswrapper[4722]: I0309 14:26:52.603405 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2b2dd04-3dd3-4915-ab91-201d5bfb54b1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:26:52 crc kubenswrapper[4722]: I0309 14:26:52.603417 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2b2dd04-3dd3-4915-ab91-201d5bfb54b1-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 14:26:52 crc kubenswrapper[4722]: I0309 14:26:52.604897 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/363b3657-6ba6-40e5-a353-7e1440ce3d01-openstack-config\") pod \"openstackclient\" (UID: \"363b3657-6ba6-40e5-a353-7e1440ce3d01\") " pod="openstack/openstackclient" Mar 09 14:26:52 crc kubenswrapper[4722]: I0309 14:26:52.608787 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/363b3657-6ba6-40e5-a353-7e1440ce3d01-openstack-config-secret\") pod \"openstackclient\" (UID: \"363b3657-6ba6-40e5-a353-7e1440ce3d01\") " pod="openstack/openstackclient" Mar 09 14:26:52 crc kubenswrapper[4722]: I0309 14:26:52.608922 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/363b3657-6ba6-40e5-a353-7e1440ce3d01-combined-ca-bundle\") pod \"openstackclient\" (UID: \"363b3657-6ba6-40e5-a353-7e1440ce3d01\") " pod="openstack/openstackclient" Mar 09 14:26:52 crc kubenswrapper[4722]: I0309 14:26:52.620324 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mw9fv\" (UniqueName: \"kubernetes.io/projected/363b3657-6ba6-40e5-a353-7e1440ce3d01-kube-api-access-mw9fv\") pod \"openstackclient\" (UID: \"363b3657-6ba6-40e5-a353-7e1440ce3d01\") " pod="openstack/openstackclient" Mar 09 14:26:52 crc kubenswrapper[4722]: I0309 14:26:52.820563 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 09 14:26:52 crc kubenswrapper[4722]: I0309 14:26:52.830534 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 09 14:26:52 crc kubenswrapper[4722]: I0309 14:26:52.838269 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 09 14:26:52 crc kubenswrapper[4722]: I0309 14:26:52.861695 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 09 14:26:52 crc kubenswrapper[4722]: I0309 14:26:52.864471 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 09 14:26:52 crc kubenswrapper[4722]: I0309 14:26:52.879939 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 09 14:26:52 crc kubenswrapper[4722]: I0309 14:26:52.890675 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 09 14:26:53 crc kubenswrapper[4722]: I0309 14:26:53.014311 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c227d2b-e035-426b-b1e1-5be3a4e06090-config-data\") pod \"cinder-scheduler-0\" (UID: \"4c227d2b-e035-426b-b1e1-5be3a4e06090\") " pod="openstack/cinder-scheduler-0" Mar 09 14:26:53 crc kubenswrapper[4722]: I0309 14:26:53.014377 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whcp5\" (UniqueName: \"kubernetes.io/projected/4c227d2b-e035-426b-b1e1-5be3a4e06090-kube-api-access-whcp5\") pod \"cinder-scheduler-0\" (UID: \"4c227d2b-e035-426b-b1e1-5be3a4e06090\") " pod="openstack/cinder-scheduler-0" Mar 09 14:26:53 crc kubenswrapper[4722]: I0309 14:26:53.014456 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4c227d2b-e035-426b-b1e1-5be3a4e06090-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"4c227d2b-e035-426b-b1e1-5be3a4e06090\") " pod="openstack/cinder-scheduler-0" Mar 09 14:26:53 crc kubenswrapper[4722]: I0309 14:26:53.014487 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c227d2b-e035-426b-b1e1-5be3a4e06090-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"4c227d2b-e035-426b-b1e1-5be3a4e06090\") " pod="openstack/cinder-scheduler-0" Mar 09 14:26:53 crc kubenswrapper[4722]: I0309 14:26:53.014536 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c227d2b-e035-426b-b1e1-5be3a4e06090-scripts\") pod \"cinder-scheduler-0\" (UID: \"4c227d2b-e035-426b-b1e1-5be3a4e06090\") " pod="openstack/cinder-scheduler-0" Mar 09 14:26:53 crc kubenswrapper[4722]: I0309 14:26:53.014806 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4c227d2b-e035-426b-b1e1-5be3a4e06090-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"4c227d2b-e035-426b-b1e1-5be3a4e06090\") " pod="openstack/cinder-scheduler-0" Mar 09 14:26:53 crc kubenswrapper[4722]: I0309 14:26:53.127442 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c227d2b-e035-426b-b1e1-5be3a4e06090-config-data\") pod \"cinder-scheduler-0\" (UID: \"4c227d2b-e035-426b-b1e1-5be3a4e06090\") " pod="openstack/cinder-scheduler-0" Mar 09 14:26:53 crc kubenswrapper[4722]: I0309 14:26:53.127792 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-whcp5\" (UniqueName: \"kubernetes.io/projected/4c227d2b-e035-426b-b1e1-5be3a4e06090-kube-api-access-whcp5\") pod \"cinder-scheduler-0\" (UID: \"4c227d2b-e035-426b-b1e1-5be3a4e06090\") " pod="openstack/cinder-scheduler-0" Mar 09 14:26:53 crc kubenswrapper[4722]: I0309 14:26:53.127965 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4c227d2b-e035-426b-b1e1-5be3a4e06090-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"4c227d2b-e035-426b-b1e1-5be3a4e06090\") " pod="openstack/cinder-scheduler-0" Mar 09 14:26:53 crc kubenswrapper[4722]: I0309 14:26:53.128070 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c227d2b-e035-426b-b1e1-5be3a4e06090-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"4c227d2b-e035-426b-b1e1-5be3a4e06090\") " pod="openstack/cinder-scheduler-0" Mar 09 14:26:53 crc kubenswrapper[4722]: I0309 14:26:53.128187 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c227d2b-e035-426b-b1e1-5be3a4e06090-scripts\") pod \"cinder-scheduler-0\" (UID: \"4c227d2b-e035-426b-b1e1-5be3a4e06090\") " pod="openstack/cinder-scheduler-0" Mar 09 14:26:53 crc kubenswrapper[4722]: I0309 14:26:53.128579 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4c227d2b-e035-426b-b1e1-5be3a4e06090-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"4c227d2b-e035-426b-b1e1-5be3a4e06090\") " pod="openstack/cinder-scheduler-0" Mar 09 14:26:53 crc kubenswrapper[4722]: I0309 14:26:53.129488 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4c227d2b-e035-426b-b1e1-5be3a4e06090-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"4c227d2b-e035-426b-b1e1-5be3a4e06090\") " pod="openstack/cinder-scheduler-0" Mar 09 14:26:53 crc kubenswrapper[4722]: I0309 14:26:53.141997 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c227d2b-e035-426b-b1e1-5be3a4e06090-scripts\") pod \"cinder-scheduler-0\" (UID: \"4c227d2b-e035-426b-b1e1-5be3a4e06090\") " pod="openstack/cinder-scheduler-0" Mar 09 14:26:53 crc kubenswrapper[4722]: I0309 14:26:53.157466 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c227d2b-e035-426b-b1e1-5be3a4e06090-config-data\") pod \"cinder-scheduler-0\" (UID: \"4c227d2b-e035-426b-b1e1-5be3a4e06090\") " pod="openstack/cinder-scheduler-0" Mar 09 14:26:53 crc kubenswrapper[4722]: I0309 14:26:53.157823 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-whcp5\" (UniqueName: \"kubernetes.io/projected/4c227d2b-e035-426b-b1e1-5be3a4e06090-kube-api-access-whcp5\") pod \"cinder-scheduler-0\" (UID: \"4c227d2b-e035-426b-b1e1-5be3a4e06090\") " pod="openstack/cinder-scheduler-0" Mar 09 14:26:53 crc kubenswrapper[4722]: I0309 14:26:53.158151 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4c227d2b-e035-426b-b1e1-5be3a4e06090-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"4c227d2b-e035-426b-b1e1-5be3a4e06090\") " pod="openstack/cinder-scheduler-0" Mar 09 14:26:53 crc kubenswrapper[4722]: I0309 14:26:53.176954 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c227d2b-e035-426b-b1e1-5be3a4e06090-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"4c227d2b-e035-426b-b1e1-5be3a4e06090\") " pod="openstack/cinder-scheduler-0" Mar 09 14:26:53 
crc kubenswrapper[4722]: I0309 14:26:53.342121 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 09 14:26:53 crc kubenswrapper[4722]: I0309 14:26:53.696940 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 09 14:26:53 crc kubenswrapper[4722]: I0309 14:26:53.901967 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 09 14:26:53 crc kubenswrapper[4722]: W0309 14:26:53.922447 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4c227d2b_e035_426b_b1e1_5be3a4e06090.slice/crio-05a775636e65f56dd98a2868db5e3973c1977277bcefd4dbd41d7e3d1e4977d3 WatchSource:0}: Error finding container 05a775636e65f56dd98a2868db5e3973c1977277bcefd4dbd41d7e3d1e4977d3: Status 404 returned error can't find the container with id 05a775636e65f56dd98a2868db5e3973c1977277bcefd4dbd41d7e3d1e4977d3 Mar 09 14:26:54 crc kubenswrapper[4722]: I0309 14:26:54.171902 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2b2dd04-3dd3-4915-ab91-201d5bfb54b1" path="/var/lib/kubelet/pods/f2b2dd04-3dd3-4915-ab91-201d5bfb54b1/volumes" Mar 09 14:26:54 crc kubenswrapper[4722]: I0309 14:26:54.390450 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-77d5d5f47b-7zdbg" Mar 09 14:26:54 crc kubenswrapper[4722]: I0309 14:26:54.480934 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/154add5d-d1cd-4c93-ae63-3c7dfe4cb035-logs\") pod \"154add5d-d1cd-4c93-ae63-3c7dfe4cb035\" (UID: \"154add5d-d1cd-4c93-ae63-3c7dfe4cb035\") " Mar 09 14:26:54 crc kubenswrapper[4722]: I0309 14:26:54.481001 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/154add5d-d1cd-4c93-ae63-3c7dfe4cb035-combined-ca-bundle\") pod \"154add5d-d1cd-4c93-ae63-3c7dfe4cb035\" (UID: \"154add5d-d1cd-4c93-ae63-3c7dfe4cb035\") " Mar 09 14:26:54 crc kubenswrapper[4722]: I0309 14:26:54.481080 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/154add5d-d1cd-4c93-ae63-3c7dfe4cb035-config-data\") pod \"154add5d-d1cd-4c93-ae63-3c7dfe4cb035\" (UID: \"154add5d-d1cd-4c93-ae63-3c7dfe4cb035\") " Mar 09 14:26:54 crc kubenswrapper[4722]: I0309 14:26:54.481216 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/154add5d-d1cd-4c93-ae63-3c7dfe4cb035-scripts\") pod \"154add5d-d1cd-4c93-ae63-3c7dfe4cb035\" (UID: \"154add5d-d1cd-4c93-ae63-3c7dfe4cb035\") " Mar 09 14:26:54 crc kubenswrapper[4722]: I0309 14:26:54.481242 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6xsh\" (UniqueName: \"kubernetes.io/projected/154add5d-d1cd-4c93-ae63-3c7dfe4cb035-kube-api-access-k6xsh\") pod \"154add5d-d1cd-4c93-ae63-3c7dfe4cb035\" (UID: \"154add5d-d1cd-4c93-ae63-3c7dfe4cb035\") " Mar 09 14:26:54 crc kubenswrapper[4722]: I0309 14:26:54.481280 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/154add5d-d1cd-4c93-ae63-3c7dfe4cb035-internal-tls-certs\") pod \"154add5d-d1cd-4c93-ae63-3c7dfe4cb035\" (UID: \"154add5d-d1cd-4c93-ae63-3c7dfe4cb035\") " Mar 09 
14:26:54 crc kubenswrapper[4722]: I0309 14:26:54.481324 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/154add5d-d1cd-4c93-ae63-3c7dfe4cb035-public-tls-certs\") pod \"154add5d-d1cd-4c93-ae63-3c7dfe4cb035\" (UID: \"154add5d-d1cd-4c93-ae63-3c7dfe4cb035\") " Mar 09 14:26:54 crc kubenswrapper[4722]: I0309 14:26:54.481469 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/154add5d-d1cd-4c93-ae63-3c7dfe4cb035-logs" (OuterVolumeSpecName: "logs") pod "154add5d-d1cd-4c93-ae63-3c7dfe4cb035" (UID: "154add5d-d1cd-4c93-ae63-3c7dfe4cb035"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:26:54 crc kubenswrapper[4722]: I0309 14:26:54.481864 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/154add5d-d1cd-4c93-ae63-3c7dfe4cb035-logs\") on node \"crc\" DevicePath \"\"" Mar 09 14:26:54 crc kubenswrapper[4722]: I0309 14:26:54.488328 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/154add5d-d1cd-4c93-ae63-3c7dfe4cb035-kube-api-access-k6xsh" (OuterVolumeSpecName: "kube-api-access-k6xsh") pod "154add5d-d1cd-4c93-ae63-3c7dfe4cb035" (UID: "154add5d-d1cd-4c93-ae63-3c7dfe4cb035"). InnerVolumeSpecName "kube-api-access-k6xsh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:26:54 crc kubenswrapper[4722]: I0309 14:26:54.488374 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/154add5d-d1cd-4c93-ae63-3c7dfe4cb035-scripts" (OuterVolumeSpecName: "scripts") pod "154add5d-d1cd-4c93-ae63-3c7dfe4cb035" (UID: "154add5d-d1cd-4c93-ae63-3c7dfe4cb035"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:26:54 crc kubenswrapper[4722]: I0309 14:26:54.520989 4722 generic.go:334] "Generic (PLEG): container finished" podID="154add5d-d1cd-4c93-ae63-3c7dfe4cb035" containerID="17e660b299cb6418cfd87beaf38373cbd8fa8b6a05a78e24fc2b2cec4cdbc8a0" exitCode=0 Mar 09 14:26:54 crc kubenswrapper[4722]: I0309 14:26:54.521076 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-77d5d5f47b-7zdbg" event={"ID":"154add5d-d1cd-4c93-ae63-3c7dfe4cb035","Type":"ContainerDied","Data":"17e660b299cb6418cfd87beaf38373cbd8fa8b6a05a78e24fc2b2cec4cdbc8a0"} Mar 09 14:26:54 crc kubenswrapper[4722]: I0309 14:26:54.521110 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-77d5d5f47b-7zdbg" event={"ID":"154add5d-d1cd-4c93-ae63-3c7dfe4cb035","Type":"ContainerDied","Data":"9fdf03ad32af0ad3f5e4a055e548127156923b886acc6d05d2b525eb0aadf456"} Mar 09 14:26:54 crc kubenswrapper[4722]: I0309 14:26:54.521128 4722 scope.go:117] "RemoveContainer" containerID="17e660b299cb6418cfd87beaf38373cbd8fa8b6a05a78e24fc2b2cec4cdbc8a0" Mar 09 14:26:54 crc kubenswrapper[4722]: I0309 14:26:54.521407 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-77d5d5f47b-7zdbg" Mar 09 14:26:54 crc kubenswrapper[4722]: I0309 14:26:54.522737 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"363b3657-6ba6-40e5-a353-7e1440ce3d01","Type":"ContainerStarted","Data":"05e571ed6425368a7434ac74018f8fdcc6f99f044d34d92e6e20e293cabc1eba"} Mar 09 14:26:54 crc kubenswrapper[4722]: I0309 14:26:54.523944 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4c227d2b-e035-426b-b1e1-5be3a4e06090","Type":"ContainerStarted","Data":"05a775636e65f56dd98a2868db5e3973c1977277bcefd4dbd41d7e3d1e4977d3"} Mar 09 14:26:54 crc kubenswrapper[4722]: I0309 14:26:54.575925 4722 scope.go:117] "RemoveContainer" containerID="3c9158dc8fcb0bfa1ebb4b61def0600239b2bd11f0845392eaa82372894666db" Mar 09 14:26:54 crc kubenswrapper[4722]: I0309 14:26:54.585053 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/154add5d-d1cd-4c93-ae63-3c7dfe4cb035-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 14:26:54 crc kubenswrapper[4722]: I0309 14:26:54.585080 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6xsh\" (UniqueName: \"kubernetes.io/projected/154add5d-d1cd-4c93-ae63-3c7dfe4cb035-kube-api-access-k6xsh\") on node \"crc\" DevicePath \"\"" Mar 09 14:26:54 crc kubenswrapper[4722]: I0309 14:26:54.606769 4722 scope.go:117] "RemoveContainer" containerID="17e660b299cb6418cfd87beaf38373cbd8fa8b6a05a78e24fc2b2cec4cdbc8a0" Mar 09 14:26:54 crc kubenswrapper[4722]: E0309 14:26:54.607290 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17e660b299cb6418cfd87beaf38373cbd8fa8b6a05a78e24fc2b2cec4cdbc8a0\": container with ID starting with 17e660b299cb6418cfd87beaf38373cbd8fa8b6a05a78e24fc2b2cec4cdbc8a0 not found: ID does not exist" containerID="17e660b299cb6418cfd87beaf38373cbd8fa8b6a05a78e24fc2b2cec4cdbc8a0" Mar 09 14:26:54 crc kubenswrapper[4722]: I0309 14:26:54.607349 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17e660b299cb6418cfd87beaf38373cbd8fa8b6a05a78e24fc2b2cec4cdbc8a0"} err="failed to get container status \"17e660b299cb6418cfd87beaf38373cbd8fa8b6a05a78e24fc2b2cec4cdbc8a0\": rpc error: code = NotFound desc = could not find container \"17e660b299cb6418cfd87beaf38373cbd8fa8b6a05a78e24fc2b2cec4cdbc8a0\": container with ID starting with 17e660b299cb6418cfd87beaf38373cbd8fa8b6a05a78e24fc2b2cec4cdbc8a0 not found: ID does not exist" Mar 09 14:26:54 crc kubenswrapper[4722]: I0309 14:26:54.607384 4722 scope.go:117] "RemoveContainer" containerID="3c9158dc8fcb0bfa1ebb4b61def0600239b2bd11f0845392eaa82372894666db" Mar 09 14:26:54 crc kubenswrapper[4722]: E0309 14:26:54.607813 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c9158dc8fcb0bfa1ebb4b61def0600239b2bd11f0845392eaa82372894666db\": container with ID starting with 3c9158dc8fcb0bfa1ebb4b61def0600239b2bd11f0845392eaa82372894666db not found: ID does not exist" containerID="3c9158dc8fcb0bfa1ebb4b61def0600239b2bd11f0845392eaa82372894666db" Mar 09 14:26:54 crc kubenswrapper[4722]: I0309 14:26:54.607858 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c9158dc8fcb0bfa1ebb4b61def0600239b2bd11f0845392eaa82372894666db"} err="failed to get container status 
\"3c9158dc8fcb0bfa1ebb4b61def0600239b2bd11f0845392eaa82372894666db\": rpc error: code = NotFound desc = could not find container \"3c9158dc8fcb0bfa1ebb4b61def0600239b2bd11f0845392eaa82372894666db\": container with ID starting with 3c9158dc8fcb0bfa1ebb4b61def0600239b2bd11f0845392eaa82372894666db not found: ID does not exist" Mar 09 14:26:54 crc kubenswrapper[4722]: I0309 14:26:54.671131 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/154add5d-d1cd-4c93-ae63-3c7dfe4cb035-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "154add5d-d1cd-4c93-ae63-3c7dfe4cb035" (UID: "154add5d-d1cd-4c93-ae63-3c7dfe4cb035"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:26:54 crc kubenswrapper[4722]: I0309 14:26:54.687023 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/154add5d-d1cd-4c93-ae63-3c7dfe4cb035-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:26:54 crc kubenswrapper[4722]: I0309 14:26:54.708637 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/154add5d-d1cd-4c93-ae63-3c7dfe4cb035-config-data" (OuterVolumeSpecName: "config-data") pod "154add5d-d1cd-4c93-ae63-3c7dfe4cb035" (UID: "154add5d-d1cd-4c93-ae63-3c7dfe4cb035"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:26:54 crc kubenswrapper[4722]: I0309 14:26:54.735412 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/154add5d-d1cd-4c93-ae63-3c7dfe4cb035-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "154add5d-d1cd-4c93-ae63-3c7dfe4cb035" (UID: "154add5d-d1cd-4c93-ae63-3c7dfe4cb035"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:26:54 crc kubenswrapper[4722]: I0309 14:26:54.764173 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/154add5d-d1cd-4c93-ae63-3c7dfe4cb035-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "154add5d-d1cd-4c93-ae63-3c7dfe4cb035" (UID: "154add5d-d1cd-4c93-ae63-3c7dfe4cb035"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:26:54 crc kubenswrapper[4722]: I0309 14:26:54.788938 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/154add5d-d1cd-4c93-ae63-3c7dfe4cb035-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 14:26:54 crc kubenswrapper[4722]: I0309 14:26:54.789192 4722 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/154add5d-d1cd-4c93-ae63-3c7dfe4cb035-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 14:26:54 crc kubenswrapper[4722]: I0309 14:26:54.789313 4722 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/154add5d-d1cd-4c93-ae63-3c7dfe4cb035-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 14:26:54 crc kubenswrapper[4722]: I0309 14:26:54.866441 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-77d5d5f47b-7zdbg"] Mar 09 14:26:54 crc kubenswrapper[4722]: I0309 14:26:54.878043 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-77d5d5f47b-7zdbg"] Mar 09 14:26:55 crc kubenswrapper[4722]: I0309 14:26:55.040657 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 09 14:26:55 crc kubenswrapper[4722]: I0309 14:26:55.538000 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4c227d2b-e035-426b-b1e1-5be3a4e06090","Type":"ContainerStarted","Data":"8a9ac2a175e66346618c96cee89ab5fb970fe921c03a24e88a11f9ced7e766ea"} Mar 09 14:26:55 crc kubenswrapper[4722]: I0309 14:26:55.538041 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4c227d2b-e035-426b-b1e1-5be3a4e06090","Type":"ContainerStarted","Data":"231f54731c6756f7b9f1284a4fb32b33d4ba1b6c9bd8d9fce721d043fab1848e"} Mar 09 14:26:55 crc kubenswrapper[4722]: I0309 14:26:55.569502 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.569480402 podStartE2EDuration="3.569480402s" podCreationTimestamp="2026-03-09 14:26:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:26:55.561888342 +0000 UTC m=+1456.117456918" watchObservedRunningTime="2026-03-09 14:26:55.569480402 +0000 UTC m=+1456.125048998" Mar 09 14:26:56 crc kubenswrapper[4722]: I0309 14:26:56.164855 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="154add5d-d1cd-4c93-ae63-3c7dfe4cb035" path="/var/lib/kubelet/pods/154add5d-d1cd-4c93-ae63-3c7dfe4cb035/volumes" Mar 09 14:26:57 crc kubenswrapper[4722]: I0309 14:26:57.233865 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-858d6f6fd6-wqzdg" Mar 09 14:26:57 crc kubenswrapper[4722]: I0309 14:26:57.372279 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zxr5p\" (UniqueName: \"kubernetes.io/projected/1cc74ab6-6c85-4da3-8a79-3af240adb999-kube-api-access-zxr5p\") pod \"1cc74ab6-6c85-4da3-8a79-3af240adb999\" (UID: \"1cc74ab6-6c85-4da3-8a79-3af240adb999\") " Mar 09 14:26:57 crc kubenswrapper[4722]: I0309 14:26:57.372344 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1cc74ab6-6c85-4da3-8a79-3af240adb999-logs\") pod \"1cc74ab6-6c85-4da3-8a79-3af240adb999\" (UID: \"1cc74ab6-6c85-4da3-8a79-3af240adb999\") " Mar 09 14:26:57 crc kubenswrapper[4722]: I0309 14:26:57.372420 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cc74ab6-6c85-4da3-8a79-3af240adb999-config-data\") pod \"1cc74ab6-6c85-4da3-8a79-3af240adb999\" (UID: \"1cc74ab6-6c85-4da3-8a79-3af240adb999\") " Mar 09 14:26:57 crc kubenswrapper[4722]: I0309 14:26:57.372489 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1cc74ab6-6c85-4da3-8a79-3af240adb999-config-data-custom\") pod \"1cc74ab6-6c85-4da3-8a79-3af240adb999\" (UID: \"1cc74ab6-6c85-4da3-8a79-3af240adb999\") " Mar 09 14:26:57 crc kubenswrapper[4722]: I0309 14:26:57.372732 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cc74ab6-6c85-4da3-8a79-3af240adb999-combined-ca-bundle\") pod \"1cc74ab6-6c85-4da3-8a79-3af240adb999\" (UID: \"1cc74ab6-6c85-4da3-8a79-3af240adb999\") " Mar 09 14:26:57 crc kubenswrapper[4722]: I0309 14:26:57.373457 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1cc74ab6-6c85-4da3-8a79-3af240adb999-logs" (OuterVolumeSpecName: "logs") pod "1cc74ab6-6c85-4da3-8a79-3af240adb999" (UID: "1cc74ab6-6c85-4da3-8a79-3af240adb999"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:26:57 crc kubenswrapper[4722]: I0309 14:26:57.375072 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1cc74ab6-6c85-4da3-8a79-3af240adb999-logs\") on node \"crc\" DevicePath \"\"" Mar 09 14:26:57 crc kubenswrapper[4722]: I0309 14:26:57.379634 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cc74ab6-6c85-4da3-8a79-3af240adb999-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1cc74ab6-6c85-4da3-8a79-3af240adb999" (UID: "1cc74ab6-6c85-4da3-8a79-3af240adb999"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:26:57 crc kubenswrapper[4722]: I0309 14:26:57.395683 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cc74ab6-6c85-4da3-8a79-3af240adb999-kube-api-access-zxr5p" (OuterVolumeSpecName: "kube-api-access-zxr5p") pod "1cc74ab6-6c85-4da3-8a79-3af240adb999" (UID: "1cc74ab6-6c85-4da3-8a79-3af240adb999"). InnerVolumeSpecName "kube-api-access-zxr5p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:26:57 crc kubenswrapper[4722]: I0309 14:26:57.434074 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cc74ab6-6c85-4da3-8a79-3af240adb999-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1cc74ab6-6c85-4da3-8a79-3af240adb999" (UID: "1cc74ab6-6c85-4da3-8a79-3af240adb999"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:26:57 crc kubenswrapper[4722]: I0309 14:26:57.441415 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cc74ab6-6c85-4da3-8a79-3af240adb999-config-data" (OuterVolumeSpecName: "config-data") pod "1cc74ab6-6c85-4da3-8a79-3af240adb999" (UID: "1cc74ab6-6c85-4da3-8a79-3af240adb999"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:26:57 crc kubenswrapper[4722]: I0309 14:26:57.477306 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zxr5p\" (UniqueName: \"kubernetes.io/projected/1cc74ab6-6c85-4da3-8a79-3af240adb999-kube-api-access-zxr5p\") on node \"crc\" DevicePath \"\"" Mar 09 14:26:57 crc kubenswrapper[4722]: I0309 14:26:57.477340 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cc74ab6-6c85-4da3-8a79-3af240adb999-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 14:26:57 crc kubenswrapper[4722]: I0309 14:26:57.477353 4722 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1cc74ab6-6c85-4da3-8a79-3af240adb999-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 09 14:26:57 crc kubenswrapper[4722]: I0309 14:26:57.477365 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cc74ab6-6c85-4da3-8a79-3af240adb999-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:26:57 crc kubenswrapper[4722]: I0309 14:26:57.569478 4722 generic.go:334] "Generic (PLEG): container finished" podID="1cc74ab6-6c85-4da3-8a79-3af240adb999" containerID="2ea6220f7559189dfe1a672a930e7f578f963c2059ccb638d6facdeba1269b53" exitCode=137 Mar 09 14:26:57 crc kubenswrapper[4722]: I0309 14:26:57.569516 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-858d6f6fd6-wqzdg" event={"ID":"1cc74ab6-6c85-4da3-8a79-3af240adb999","Type":"ContainerDied","Data":"2ea6220f7559189dfe1a672a930e7f578f963c2059ccb638d6facdeba1269b53"} Mar 09 14:26:57 crc kubenswrapper[4722]: I0309 14:26:57.569543 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-858d6f6fd6-wqzdg" event={"ID":"1cc74ab6-6c85-4da3-8a79-3af240adb999","Type":"ContainerDied","Data":"a5838f9eb0a984e2b49cf758afe91b53782d3b9ca807c3a8ce1ba1ba95711fda"} Mar 09 14:26:57 crc kubenswrapper[4722]: I0309 14:26:57.569560 4722 scope.go:117] "RemoveContainer" containerID="2ea6220f7559189dfe1a672a930e7f578f963c2059ccb638d6facdeba1269b53" Mar 09 14:26:57 crc kubenswrapper[4722]: I0309 14:26:57.569553 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-858d6f6fd6-wqzdg" Mar 09 14:26:57 crc kubenswrapper[4722]: I0309 14:26:57.592616 4722 scope.go:117] "RemoveContainer" containerID="bb6a4cf444c858712ca2adabb6d43167f4c418dc0decc8bcd31c1edce0e7fe88" Mar 09 14:26:57 crc kubenswrapper[4722]: I0309 14:26:57.608252 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-858d6f6fd6-wqzdg"] Mar 09 14:26:57 crc kubenswrapper[4722]: I0309 14:26:57.618738 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-858d6f6fd6-wqzdg"] Mar 09 14:26:57 crc kubenswrapper[4722]: I0309 14:26:57.629737 4722 scope.go:117] "RemoveContainer" containerID="2ea6220f7559189dfe1a672a930e7f578f963c2059ccb638d6facdeba1269b53" Mar 09 14:26:57 crc kubenswrapper[4722]: E0309 14:26:57.630150 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ea6220f7559189dfe1a672a930e7f578f963c2059ccb638d6facdeba1269b53\": container with ID starting with 2ea6220f7559189dfe1a672a930e7f578f963c2059ccb638d6facdeba1269b53 not found: ID does not exist" containerID="2ea6220f7559189dfe1a672a930e7f578f963c2059ccb638d6facdeba1269b53" Mar 09 14:26:57 crc kubenswrapper[4722]: I0309 14:26:57.630193 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ea6220f7559189dfe1a672a930e7f578f963c2059ccb638d6facdeba1269b53"} err="failed to get container status \"2ea6220f7559189dfe1a672a930e7f578f963c2059ccb638d6facdeba1269b53\": rpc error: code = NotFound desc = could not find container \"2ea6220f7559189dfe1a672a930e7f578f963c2059ccb638d6facdeba1269b53\": container with ID starting with 2ea6220f7559189dfe1a672a930e7f578f963c2059ccb638d6facdeba1269b53 not found: ID does not exist" Mar 09 14:26:57 crc kubenswrapper[4722]: I0309 14:26:57.630338 4722 scope.go:117] "RemoveContainer" containerID="bb6a4cf444c858712ca2adabb6d43167f4c418dc0decc8bcd31c1edce0e7fe88" Mar 09 14:26:57 crc kubenswrapper[4722]: E0309 14:26:57.630777 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb6a4cf444c858712ca2adabb6d43167f4c418dc0decc8bcd31c1edce0e7fe88\": container with ID starting with bb6a4cf444c858712ca2adabb6d43167f4c418dc0decc8bcd31c1edce0e7fe88 not found: ID does not exist" containerID="bb6a4cf444c858712ca2adabb6d43167f4c418dc0decc8bcd31c1edce0e7fe88" Mar 09 14:26:57 crc kubenswrapper[4722]: I0309 14:26:57.630831 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb6a4cf444c858712ca2adabb6d43167f4c418dc0decc8bcd31c1edce0e7fe88"} err="failed to get container status \"bb6a4cf444c858712ca2adabb6d43167f4c418dc0decc8bcd31c1edce0e7fe88\": rpc error: code = NotFound desc = could not find container \"bb6a4cf444c858712ca2adabb6d43167f4c418dc0decc8bcd31c1edce0e7fe88\": container with ID starting with bb6a4cf444c858712ca2adabb6d43167f4c418dc0decc8bcd31c1edce0e7fe88 not found: ID does not exist" Mar 09 14:26:58 crc kubenswrapper[4722]: I0309 14:26:58.165636 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1cc74ab6-6c85-4da3-8a79-3af240adb999" path="/var/lib/kubelet/pods/1cc74ab6-6c85-4da3-8a79-3af240adb999/volumes" Mar 09 14:26:58 crc kubenswrapper[4722]: I0309 14:26:58.204682 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-57655d7975-hskdb" Mar 09 14:26:58 crc kubenswrapper[4722]: I0309 14:26:58.298576 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3b18ae1-c2c9-4454-a591-53ce06064d82-combined-ca-bundle\") pod \"c3b18ae1-c2c9-4454-a591-53ce06064d82\" (UID: \"c3b18ae1-c2c9-4454-a591-53ce06064d82\") " Mar 09 14:26:58 crc kubenswrapper[4722]: I0309 14:26:58.298617 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtm5c\" (UniqueName: \"kubernetes.io/projected/c3b18ae1-c2c9-4454-a591-53ce06064d82-kube-api-access-mtm5c\") pod \"c3b18ae1-c2c9-4454-a591-53ce06064d82\" (UID: \"c3b18ae1-c2c9-4454-a591-53ce06064d82\") " Mar 09 14:26:58 crc kubenswrapper[4722]: I0309 14:26:58.298705 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c3b18ae1-c2c9-4454-a591-53ce06064d82-config-data-custom\") pod \"c3b18ae1-c2c9-4454-a591-53ce06064d82\" (UID: \"c3b18ae1-c2c9-4454-a591-53ce06064d82\") " Mar 09 14:26:58 crc kubenswrapper[4722]: I0309 14:26:58.298774 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3b18ae1-c2c9-4454-a591-53ce06064d82-config-data\") pod \"c3b18ae1-c2c9-4454-a591-53ce06064d82\" (UID: \"c3b18ae1-c2c9-4454-a591-53ce06064d82\") " Mar 09 14:26:58 crc kubenswrapper[4722]: I0309 14:26:58.298811 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3b18ae1-c2c9-4454-a591-53ce06064d82-logs\") pod \"c3b18ae1-c2c9-4454-a591-53ce06064d82\" (UID: \"c3b18ae1-c2c9-4454-a591-53ce06064d82\") " Mar 09 14:26:58 crc kubenswrapper[4722]: I0309 14:26:58.299385 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3b18ae1-c2c9-4454-a591-53ce06064d82-logs" (OuterVolumeSpecName: "logs") pod "c3b18ae1-c2c9-4454-a591-53ce06064d82" (UID: "c3b18ae1-c2c9-4454-a591-53ce06064d82"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:26:58 crc kubenswrapper[4722]: I0309 14:26:58.299619 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3b18ae1-c2c9-4454-a591-53ce06064d82-logs\") on node \"crc\" DevicePath \"\"" Mar 09 14:26:58 crc kubenswrapper[4722]: I0309 14:26:58.303104 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3b18ae1-c2c9-4454-a591-53ce06064d82-kube-api-access-mtm5c" (OuterVolumeSpecName: "kube-api-access-mtm5c") pod "c3b18ae1-c2c9-4454-a591-53ce06064d82" (UID: "c3b18ae1-c2c9-4454-a591-53ce06064d82"). InnerVolumeSpecName "kube-api-access-mtm5c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:26:58 crc kubenswrapper[4722]: I0309 14:26:58.316339 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3b18ae1-c2c9-4454-a591-53ce06064d82-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c3b18ae1-c2c9-4454-a591-53ce06064d82" (UID: "c3b18ae1-c2c9-4454-a591-53ce06064d82"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:26:58 crc kubenswrapper[4722]: I0309 14:26:58.340549 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3b18ae1-c2c9-4454-a591-53ce06064d82-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c3b18ae1-c2c9-4454-a591-53ce06064d82" (UID: "c3b18ae1-c2c9-4454-a591-53ce06064d82"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:26:58 crc kubenswrapper[4722]: I0309 14:26:58.343595 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 09 14:26:58 crc kubenswrapper[4722]: I0309 14:26:58.390366 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3b18ae1-c2c9-4454-a591-53ce06064d82-config-data" (OuterVolumeSpecName: "config-data") pod "c3b18ae1-c2c9-4454-a591-53ce06064d82" (UID: "c3b18ae1-c2c9-4454-a591-53ce06064d82"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:26:58 crc kubenswrapper[4722]: I0309 14:26:58.402682 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3b18ae1-c2c9-4454-a591-53ce06064d82-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:26:58 crc kubenswrapper[4722]: I0309 14:26:58.402744 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtm5c\" (UniqueName: \"kubernetes.io/projected/c3b18ae1-c2c9-4454-a591-53ce06064d82-kube-api-access-mtm5c\") on node \"crc\" DevicePath \"\"" Mar 09 14:26:58 crc kubenswrapper[4722]: I0309 14:26:58.402759 4722 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c3b18ae1-c2c9-4454-a591-53ce06064d82-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 09 14:26:58 crc kubenswrapper[4722]: I0309 14:26:58.402770 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3b18ae1-c2c9-4454-a591-53ce06064d82-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 14:26:58 crc kubenswrapper[4722]: I0309 14:26:58.491654 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-77cd88d8c5-gml97"] Mar 09 14:26:58 crc kubenswrapper[4722]: E0309 14:26:58.492182 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="154add5d-d1cd-4c93-ae63-3c7dfe4cb035" containerName="placement-api" Mar 09 14:26:58 crc kubenswrapper[4722]: I0309 14:26:58.492211 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="154add5d-d1cd-4c93-ae63-3c7dfe4cb035" containerName="placement-api" Mar 09 14:26:58 crc kubenswrapper[4722]: E0309 14:26:58.492225 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cc74ab6-6c85-4da3-8a79-3af240adb999" containerName="barbican-keystone-listener" Mar 09 14:26:58 crc kubenswrapper[4722]: I0309 14:26:58.492232 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cc74ab6-6c85-4da3-8a79-3af240adb999" containerName="barbican-keystone-listener" Mar 09 14:26:58 crc kubenswrapper[4722]: E0309 14:26:58.492254 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3b18ae1-c2c9-4454-a591-53ce06064d82" containerName="barbican-worker" Mar 09 14:26:58 crc kubenswrapper[4722]: I0309 14:26:58.492261 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3b18ae1-c2c9-4454-a591-53ce06064d82" 
containerName="barbican-worker" Mar 09 14:26:58 crc kubenswrapper[4722]: E0309 14:26:58.492289 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cc74ab6-6c85-4da3-8a79-3af240adb999" containerName="barbican-keystone-listener-log" Mar 09 14:26:58 crc kubenswrapper[4722]: I0309 14:26:58.492295 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cc74ab6-6c85-4da3-8a79-3af240adb999" containerName="barbican-keystone-listener-log" Mar 09 14:26:58 crc kubenswrapper[4722]: E0309 14:26:58.492318 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="154add5d-d1cd-4c93-ae63-3c7dfe4cb035" containerName="placement-log" Mar 09 14:26:58 crc kubenswrapper[4722]: I0309 14:26:58.492324 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="154add5d-d1cd-4c93-ae63-3c7dfe4cb035" containerName="placement-log" Mar 09 14:26:58 crc kubenswrapper[4722]: E0309 14:26:58.492334 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3b18ae1-c2c9-4454-a591-53ce06064d82" containerName="barbican-worker-log" Mar 09 14:26:58 crc kubenswrapper[4722]: I0309 14:26:58.492340 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3b18ae1-c2c9-4454-a591-53ce06064d82" containerName="barbican-worker-log" Mar 09 14:26:58 crc kubenswrapper[4722]: I0309 14:26:58.492515 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="154add5d-d1cd-4c93-ae63-3c7dfe4cb035" containerName="placement-api" Mar 09 14:26:58 crc kubenswrapper[4722]: I0309 14:26:58.492528 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cc74ab6-6c85-4da3-8a79-3af240adb999" containerName="barbican-keystone-listener" Mar 09 14:26:58 crc kubenswrapper[4722]: I0309 14:26:58.492536 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3b18ae1-c2c9-4454-a591-53ce06064d82" containerName="barbican-worker" Mar 09 14:26:58 crc kubenswrapper[4722]: I0309 14:26:58.492546 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cc74ab6-6c85-4da3-8a79-3af240adb999" containerName="barbican-keystone-listener-log" Mar 09 14:26:58 crc kubenswrapper[4722]: I0309 14:26:58.492562 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="154add5d-d1cd-4c93-ae63-3c7dfe4cb035" containerName="placement-log" Mar 09 14:26:58 crc kubenswrapper[4722]: I0309 14:26:58.492579 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3b18ae1-c2c9-4454-a591-53ce06064d82" containerName="barbican-worker-log" Mar 09 14:26:58 crc kubenswrapper[4722]: I0309 14:26:58.493737 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-77cd88d8c5-gml97" Mar 09 14:26:58 crc kubenswrapper[4722]: I0309 14:26:58.497644 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 09 14:26:58 crc kubenswrapper[4722]: I0309 14:26:58.498006 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 09 14:26:58 crc kubenswrapper[4722]: I0309 14:26:58.498088 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 09 14:26:58 crc kubenswrapper[4722]: I0309 14:26:58.515346 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-77cd88d8c5-gml97"] Mar 09 14:26:58 crc kubenswrapper[4722]: I0309 14:26:58.593788 4722 generic.go:334] "Generic (PLEG): container finished" podID="c3b18ae1-c2c9-4454-a591-53ce06064d82" containerID="c6864d1bed9c306653a0b503bd6991b82fa3f33a3e4d35aa629f8fb746dfd504" exitCode=137 Mar 09 14:26:58 crc kubenswrapper[4722]: I0309 14:26:58.593831 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-57655d7975-hskdb" event={"ID":"c3b18ae1-c2c9-4454-a591-53ce06064d82","Type":"ContainerDied","Data":"c6864d1bed9c306653a0b503bd6991b82fa3f33a3e4d35aa629f8fb746dfd504"} Mar 09 14:26:58 crc kubenswrapper[4722]: I0309 14:26:58.593854 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-57655d7975-hskdb" Mar 09 14:26:58 crc kubenswrapper[4722]: I0309 14:26:58.593876 4722 scope.go:117] "RemoveContainer" containerID="c6864d1bed9c306653a0b503bd6991b82fa3f33a3e4d35aa629f8fb746dfd504" Mar 09 14:26:58 crc kubenswrapper[4722]: I0309 14:26:58.593863 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-57655d7975-hskdb" event={"ID":"c3b18ae1-c2c9-4454-a591-53ce06064d82","Type":"ContainerDied","Data":"66173dab4c95a59cb756913887d69c0eea112d340545edc798dae7a2c3856d81"} Mar 09 14:26:58 crc kubenswrapper[4722]: I0309 14:26:58.614978 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/685a5733-d06e-4523-a35a-051db91eb0be-public-tls-certs\") pod \"swift-proxy-77cd88d8c5-gml97\" (UID: \"685a5733-d06e-4523-a35a-051db91eb0be\") " pod="openstack/swift-proxy-77cd88d8c5-gml97" Mar 09 14:26:58 crc kubenswrapper[4722]: I0309 14:26:58.615320 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/685a5733-d06e-4523-a35a-051db91eb0be-etc-swift\") pod \"swift-proxy-77cd88d8c5-gml97\" (UID: \"685a5733-d06e-4523-a35a-051db91eb0be\") " pod="openstack/swift-proxy-77cd88d8c5-gml97" Mar 09 14:26:58 crc kubenswrapper[4722]: I0309 14:26:58.615359 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/685a5733-d06e-4523-a35a-051db91eb0be-config-data\") pod \"swift-proxy-77cd88d8c5-gml97\" (UID: \"685a5733-d06e-4523-a35a-051db91eb0be\") " pod="openstack/swift-proxy-77cd88d8c5-gml97" Mar 09 14:26:58 crc kubenswrapper[4722]: I0309 14:26:58.615400 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/685a5733-d06e-4523-a35a-051db91eb0be-combined-ca-bundle\") pod \"swift-proxy-77cd88d8c5-gml97\" (UID: 
\"685a5733-d06e-4523-a35a-051db91eb0be\") " pod="openstack/swift-proxy-77cd88d8c5-gml97" Mar 09 14:26:58 crc kubenswrapper[4722]: I0309 14:26:58.615427 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tddg4\" (UniqueName: \"kubernetes.io/projected/685a5733-d06e-4523-a35a-051db91eb0be-kube-api-access-tddg4\") pod \"swift-proxy-77cd88d8c5-gml97\" (UID: \"685a5733-d06e-4523-a35a-051db91eb0be\") " pod="openstack/swift-proxy-77cd88d8c5-gml97" Mar 09 14:26:58 crc kubenswrapper[4722]: I0309 14:26:58.615483 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/685a5733-d06e-4523-a35a-051db91eb0be-log-httpd\") pod \"swift-proxy-77cd88d8c5-gml97\" (UID: \"685a5733-d06e-4523-a35a-051db91eb0be\") " pod="openstack/swift-proxy-77cd88d8c5-gml97" Mar 09 14:26:58 crc kubenswrapper[4722]: I0309 14:26:58.615505 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/685a5733-d06e-4523-a35a-051db91eb0be-internal-tls-certs\") pod \"swift-proxy-77cd88d8c5-gml97\" (UID: \"685a5733-d06e-4523-a35a-051db91eb0be\") " pod="openstack/swift-proxy-77cd88d8c5-gml97" Mar 09 14:26:58 crc kubenswrapper[4722]: I0309 14:26:58.615538 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/685a5733-d06e-4523-a35a-051db91eb0be-run-httpd\") pod \"swift-proxy-77cd88d8c5-gml97\" (UID: \"685a5733-d06e-4523-a35a-051db91eb0be\") " pod="openstack/swift-proxy-77cd88d8c5-gml97" Mar 09 14:26:58 crc kubenswrapper[4722]: I0309 14:26:58.649389 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-57655d7975-hskdb"] Mar 09 14:26:58 crc kubenswrapper[4722]: I0309 14:26:58.657146 4722 scope.go:117] "RemoveContainer" containerID="41e74f0bf251b86c307d87608970a4aa9f3d425bfc90e218b49c8fca85813b49" Mar 09 14:26:58 crc kubenswrapper[4722]: I0309 14:26:58.659882 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-57655d7975-hskdb"] Mar 09 14:26:58 crc kubenswrapper[4722]: I0309 14:26:58.686687 4722 scope.go:117] "RemoveContainer" containerID="c6864d1bed9c306653a0b503bd6991b82fa3f33a3e4d35aa629f8fb746dfd504" Mar 09 14:26:58 crc kubenswrapper[4722]: E0309 14:26:58.688208 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6864d1bed9c306653a0b503bd6991b82fa3f33a3e4d35aa629f8fb746dfd504\": container with ID starting with c6864d1bed9c306653a0b503bd6991b82fa3f33a3e4d35aa629f8fb746dfd504 not found: ID does not exist" containerID="c6864d1bed9c306653a0b503bd6991b82fa3f33a3e4d35aa629f8fb746dfd504" Mar 09 14:26:58 crc kubenswrapper[4722]: I0309 14:26:58.688250 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6864d1bed9c306653a0b503bd6991b82fa3f33a3e4d35aa629f8fb746dfd504"} err="failed to get container status \"c6864d1bed9c306653a0b503bd6991b82fa3f33a3e4d35aa629f8fb746dfd504\": rpc error: code = NotFound desc = could not find container \"c6864d1bed9c306653a0b503bd6991b82fa3f33a3e4d35aa629f8fb746dfd504\": container with ID starting with c6864d1bed9c306653a0b503bd6991b82fa3f33a3e4d35aa629f8fb746dfd504 not found: ID does not exist" Mar 09 14:26:58 crc kubenswrapper[4722]: I0309 14:26:58.688279 4722 scope.go:117] 
"RemoveContainer" containerID="41e74f0bf251b86c307d87608970a4aa9f3d425bfc90e218b49c8fca85813b49" Mar 09 14:26:58 crc kubenswrapper[4722]: E0309 14:26:58.688691 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41e74f0bf251b86c307d87608970a4aa9f3d425bfc90e218b49c8fca85813b49\": container with ID starting with 41e74f0bf251b86c307d87608970a4aa9f3d425bfc90e218b49c8fca85813b49 not found: ID does not exist" containerID="41e74f0bf251b86c307d87608970a4aa9f3d425bfc90e218b49c8fca85813b49" Mar 09 14:26:58 crc kubenswrapper[4722]: I0309 14:26:58.688732 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41e74f0bf251b86c307d87608970a4aa9f3d425bfc90e218b49c8fca85813b49"} err="failed to get container status \"41e74f0bf251b86c307d87608970a4aa9f3d425bfc90e218b49c8fca85813b49\": rpc error: code = NotFound desc = could not find container \"41e74f0bf251b86c307d87608970a4aa9f3d425bfc90e218b49c8fca85813b49\": container with ID starting with 41e74f0bf251b86c307d87608970a4aa9f3d425bfc90e218b49c8fca85813b49 not found: ID does not exist" Mar 09 14:26:58 crc kubenswrapper[4722]: I0309 14:26:58.721549 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/685a5733-d06e-4523-a35a-051db91eb0be-combined-ca-bundle\") pod \"swift-proxy-77cd88d8c5-gml97\" (UID: \"685a5733-d06e-4523-a35a-051db91eb0be\") " pod="openstack/swift-proxy-77cd88d8c5-gml97" Mar 09 14:26:58 crc kubenswrapper[4722]: I0309 14:26:58.721630 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tddg4\" (UniqueName: \"kubernetes.io/projected/685a5733-d06e-4523-a35a-051db91eb0be-kube-api-access-tddg4\") pod \"swift-proxy-77cd88d8c5-gml97\" (UID: \"685a5733-d06e-4523-a35a-051db91eb0be\") " pod="openstack/swift-proxy-77cd88d8c5-gml97" Mar 09 14:26:58 crc kubenswrapper[4722]: I0309 14:26:58.721702 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/685a5733-d06e-4523-a35a-051db91eb0be-log-httpd\") pod \"swift-proxy-77cd88d8c5-gml97\" (UID: \"685a5733-d06e-4523-a35a-051db91eb0be\") " pod="openstack/swift-proxy-77cd88d8c5-gml97" Mar 09 14:26:58 crc kubenswrapper[4722]: I0309 14:26:58.721754 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/685a5733-d06e-4523-a35a-051db91eb0be-internal-tls-certs\") pod \"swift-proxy-77cd88d8c5-gml97\" (UID: \"685a5733-d06e-4523-a35a-051db91eb0be\") " pod="openstack/swift-proxy-77cd88d8c5-gml97" Mar 09 14:26:58 crc kubenswrapper[4722]: I0309 14:26:58.721826 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/685a5733-d06e-4523-a35a-051db91eb0be-run-httpd\") pod \"swift-proxy-77cd88d8c5-gml97\" (UID: \"685a5733-d06e-4523-a35a-051db91eb0be\") " pod="openstack/swift-proxy-77cd88d8c5-gml97" Mar 09 14:26:58 crc kubenswrapper[4722]: I0309 14:26:58.722094 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/685a5733-d06e-4523-a35a-051db91eb0be-public-tls-certs\") pod \"swift-proxy-77cd88d8c5-gml97\" (UID: \"685a5733-d06e-4523-a35a-051db91eb0be\") " pod="openstack/swift-proxy-77cd88d8c5-gml97" Mar 09 14:26:58 crc kubenswrapper[4722]: I0309 
14:26:58.722193 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/685a5733-d06e-4523-a35a-051db91eb0be-etc-swift\") pod \"swift-proxy-77cd88d8c5-gml97\" (UID: \"685a5733-d06e-4523-a35a-051db91eb0be\") " pod="openstack/swift-proxy-77cd88d8c5-gml97" Mar 09 14:26:58 crc kubenswrapper[4722]: I0309 14:26:58.722311 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/685a5733-d06e-4523-a35a-051db91eb0be-config-data\") pod \"swift-proxy-77cd88d8c5-gml97\" (UID: \"685a5733-d06e-4523-a35a-051db91eb0be\") " pod="openstack/swift-proxy-77cd88d8c5-gml97" Mar 09 14:26:58 crc kubenswrapper[4722]: I0309 14:26:58.725492 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/685a5733-d06e-4523-a35a-051db91eb0be-run-httpd\") pod \"swift-proxy-77cd88d8c5-gml97\" (UID: \"685a5733-d06e-4523-a35a-051db91eb0be\") " pod="openstack/swift-proxy-77cd88d8c5-gml97" Mar 09 14:26:58 crc kubenswrapper[4722]: I0309 14:26:58.725992 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/685a5733-d06e-4523-a35a-051db91eb0be-log-httpd\") pod \"swift-proxy-77cd88d8c5-gml97\" (UID: \"685a5733-d06e-4523-a35a-051db91eb0be\") " pod="openstack/swift-proxy-77cd88d8c5-gml97" Mar 09 14:26:58 crc kubenswrapper[4722]: I0309 14:26:58.736844 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/685a5733-d06e-4523-a35a-051db91eb0be-public-tls-certs\") pod \"swift-proxy-77cd88d8c5-gml97\" (UID: \"685a5733-d06e-4523-a35a-051db91eb0be\") " pod="openstack/swift-proxy-77cd88d8c5-gml97" Mar 09 14:26:58 crc kubenswrapper[4722]: I0309 14:26:58.758073 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/685a5733-d06e-4523-a35a-051db91eb0be-combined-ca-bundle\") pod \"swift-proxy-77cd88d8c5-gml97\" (UID: \"685a5733-d06e-4523-a35a-051db91eb0be\") " pod="openstack/swift-proxy-77cd88d8c5-gml97" Mar 09 14:26:58 crc kubenswrapper[4722]: I0309 14:26:58.763824 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/685a5733-d06e-4523-a35a-051db91eb0be-config-data\") pod \"swift-proxy-77cd88d8c5-gml97\" (UID: \"685a5733-d06e-4523-a35a-051db91eb0be\") " pod="openstack/swift-proxy-77cd88d8c5-gml97" Mar 09 14:26:58 crc kubenswrapper[4722]: I0309 14:26:58.771055 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tddg4\" (UniqueName: \"kubernetes.io/projected/685a5733-d06e-4523-a35a-051db91eb0be-kube-api-access-tddg4\") pod \"swift-proxy-77cd88d8c5-gml97\" (UID: \"685a5733-d06e-4523-a35a-051db91eb0be\") " pod="openstack/swift-proxy-77cd88d8c5-gml97" Mar 09 14:26:58 crc kubenswrapper[4722]: I0309 14:26:58.773654 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/685a5733-d06e-4523-a35a-051db91eb0be-internal-tls-certs\") pod \"swift-proxy-77cd88d8c5-gml97\" (UID: \"685a5733-d06e-4523-a35a-051db91eb0be\") " pod="openstack/swift-proxy-77cd88d8c5-gml97" Mar 09 14:26:58 crc kubenswrapper[4722]: I0309 14:26:58.774386 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/685a5733-d06e-4523-a35a-051db91eb0be-etc-swift\") pod \"swift-proxy-77cd88d8c5-gml97\" (UID: \"685a5733-d06e-4523-a35a-051db91eb0be\") " pod="openstack/swift-proxy-77cd88d8c5-gml97" Mar 09 14:26:58 crc kubenswrapper[4722]: I0309 14:26:58.830536 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-77cd88d8c5-gml97" Mar 09 14:26:59 crc kubenswrapper[4722]: I0309 14:26:59.202693 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-5884f594db-ls9vv"] Mar 09 14:26:59 crc kubenswrapper[4722]: I0309 14:26:59.204556 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-5884f594db-ls9vv" Mar 09 14:26:59 crc kubenswrapper[4722]: I0309 14:26:59.207568 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Mar 09 14:26:59 crc kubenswrapper[4722]: I0309 14:26:59.207910 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-bb86g" Mar 09 14:26:59 crc kubenswrapper[4722]: I0309 14:26:59.208024 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data" Mar 09 14:26:59 crc kubenswrapper[4722]: I0309 14:26:59.219490 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-5884f594db-ls9vv"] Mar 09 14:26:59 crc kubenswrapper[4722]: I0309 14:26:59.326960 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-xpcxn"] Mar 09 14:26:59 crc kubenswrapper[4722]: I0309 14:26:59.337380 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f71670c9-520f-4005-a324-199bc52fac7f-combined-ca-bundle\") pod \"heat-engine-5884f594db-ls9vv\" (UID: \"f71670c9-520f-4005-a324-199bc52fac7f\") " pod="openstack/heat-engine-5884f594db-ls9vv" Mar 09 14:26:59 crc kubenswrapper[4722]: I0309 14:26:59.337540 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f71670c9-520f-4005-a324-199bc52fac7f-config-data-custom\") pod \"heat-engine-5884f594db-ls9vv\" (UID: \"f71670c9-520f-4005-a324-199bc52fac7f\") " pod="openstack/heat-engine-5884f594db-ls9vv" Mar 09 14:26:59 crc kubenswrapper[4722]: I0309 14:26:59.337566 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzv6c\" (UniqueName: \"kubernetes.io/projected/f71670c9-520f-4005-a324-199bc52fac7f-kube-api-access-xzv6c\") pod \"heat-engine-5884f594db-ls9vv\" (UID: \"f71670c9-520f-4005-a324-199bc52fac7f\") " pod="openstack/heat-engine-5884f594db-ls9vv" Mar 09 14:26:59 crc kubenswrapper[4722]: I0309 14:26:59.337640 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f71670c9-520f-4005-a324-199bc52fac7f-config-data\") pod \"heat-engine-5884f594db-ls9vv\" (UID: \"f71670c9-520f-4005-a324-199bc52fac7f\") " pod="openstack/heat-engine-5884f594db-ls9vv" Mar 09 14:26:59 crc kubenswrapper[4722]: I0309 14:26:59.339003 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7756b9d78c-xpcxn" Mar 09 14:26:59 crc kubenswrapper[4722]: I0309 14:26:59.361295 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-58665ff9-7zt7s"] Mar 09 14:26:59 crc kubenswrapper[4722]: I0309 14:26:59.362794 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-58665ff9-7zt7s" Mar 09 14:26:59 crc kubenswrapper[4722]: I0309 14:26:59.370567 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Mar 09 14:26:59 crc kubenswrapper[4722]: I0309 14:26:59.372764 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-xpcxn"] Mar 09 14:26:59 crc kubenswrapper[4722]: I0309 14:26:59.401820 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-58665ff9-7zt7s"] Mar 09 14:26:59 crc kubenswrapper[4722]: I0309 14:26:59.441383 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c1bee46-a464-474e-ae9c-e333a0ef2190-config\") pod \"dnsmasq-dns-7756b9d78c-xpcxn\" (UID: \"3c1bee46-a464-474e-ae9c-e333a0ef2190\") " pod="openstack/dnsmasq-dns-7756b9d78c-xpcxn" Mar 09 14:26:59 crc kubenswrapper[4722]: I0309 14:26:59.441458 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3c1bee46-a464-474e-ae9c-e333a0ef2190-ovsdbserver-nb\") pod \"dnsmasq-dns-7756b9d78c-xpcxn\" (UID: \"3c1bee46-a464-474e-ae9c-e333a0ef2190\") " pod="openstack/dnsmasq-dns-7756b9d78c-xpcxn" Mar 09 14:26:59 crc kubenswrapper[4722]: I0309 14:26:59.441492 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3c1bee46-a464-474e-ae9c-e333a0ef2190-dns-svc\") pod \"dnsmasq-dns-7756b9d78c-xpcxn\" (UID: \"3c1bee46-a464-474e-ae9c-e333a0ef2190\") " pod="openstack/dnsmasq-dns-7756b9d78c-xpcxn" Mar 09 14:26:59 crc kubenswrapper[4722]: I0309 14:26:59.441510 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3c1bee46-a464-474e-ae9c-e333a0ef2190-dns-swift-storage-0\") pod \"dnsmasq-dns-7756b9d78c-xpcxn\" (UID: \"3c1bee46-a464-474e-ae9c-e333a0ef2190\") " pod="openstack/dnsmasq-dns-7756b9d78c-xpcxn" Mar 09 14:26:59 crc kubenswrapper[4722]: I0309 14:26:59.441539 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f71670c9-520f-4005-a324-199bc52fac7f-config-data-custom\") pod \"heat-engine-5884f594db-ls9vv\" (UID: \"f71670c9-520f-4005-a324-199bc52fac7f\") " pod="openstack/heat-engine-5884f594db-ls9vv" Mar 09 14:26:59 crc kubenswrapper[4722]: I0309 14:26:59.441557 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzv6c\" (UniqueName: \"kubernetes.io/projected/f71670c9-520f-4005-a324-199bc52fac7f-kube-api-access-xzv6c\") pod \"heat-engine-5884f594db-ls9vv\" (UID: \"f71670c9-520f-4005-a324-199bc52fac7f\") " pod="openstack/heat-engine-5884f594db-ls9vv" Mar 09 14:26:59 crc kubenswrapper[4722]: I0309 14:26:59.441588 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/37f48e3f-e425-4806-8fb5-27724da7ad0d-config-data\") pod \"heat-cfnapi-58665ff9-7zt7s\" (UID: \"37f48e3f-e425-4806-8fb5-27724da7ad0d\") " pod="openstack/heat-cfnapi-58665ff9-7zt7s" Mar 09 14:26:59 crc kubenswrapper[4722]: I0309 14:26:59.441628 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpqhb\" (UniqueName: \"kubernetes.io/projected/3c1bee46-a464-474e-ae9c-e333a0ef2190-kube-api-access-zpqhb\") pod \"dnsmasq-dns-7756b9d78c-xpcxn\" (UID: \"3c1bee46-a464-474e-ae9c-e333a0ef2190\") " pod="openstack/dnsmasq-dns-7756b9d78c-xpcxn" Mar 09 14:26:59 crc kubenswrapper[4722]: I0309 14:26:59.441681 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f71670c9-520f-4005-a324-199bc52fac7f-config-data\") pod \"heat-engine-5884f594db-ls9vv\" (UID: \"f71670c9-520f-4005-a324-199bc52fac7f\") " pod="openstack/heat-engine-5884f594db-ls9vv" Mar 09 14:26:59 crc kubenswrapper[4722]: I0309 14:26:59.441699 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3c1bee46-a464-474e-ae9c-e333a0ef2190-ovsdbserver-sb\") pod \"dnsmasq-dns-7756b9d78c-xpcxn\" (UID: \"3c1bee46-a464-474e-ae9c-e333a0ef2190\") " pod="openstack/dnsmasq-dns-7756b9d78c-xpcxn" Mar 09 14:26:59 crc kubenswrapper[4722]: I0309 14:26:59.441735 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twx9m\" (UniqueName: \"kubernetes.io/projected/37f48e3f-e425-4806-8fb5-27724da7ad0d-kube-api-access-twx9m\") pod \"heat-cfnapi-58665ff9-7zt7s\" (UID: \"37f48e3f-e425-4806-8fb5-27724da7ad0d\") " pod="openstack/heat-cfnapi-58665ff9-7zt7s" Mar 09 14:26:59 crc kubenswrapper[4722]: I0309 14:26:59.441758 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f71670c9-520f-4005-a324-199bc52fac7f-combined-ca-bundle\") pod \"heat-engine-5884f594db-ls9vv\" (UID: \"f71670c9-520f-4005-a324-199bc52fac7f\") " pod="openstack/heat-engine-5884f594db-ls9vv" Mar 09 14:26:59 crc kubenswrapper[4722]: I0309 14:26:59.441777 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37f48e3f-e425-4806-8fb5-27724da7ad0d-combined-ca-bundle\") pod \"heat-cfnapi-58665ff9-7zt7s\" (UID: \"37f48e3f-e425-4806-8fb5-27724da7ad0d\") " pod="openstack/heat-cfnapi-58665ff9-7zt7s" Mar 09 14:26:59 crc kubenswrapper[4722]: I0309 14:26:59.441814 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/37f48e3f-e425-4806-8fb5-27724da7ad0d-config-data-custom\") pod \"heat-cfnapi-58665ff9-7zt7s\" (UID: \"37f48e3f-e425-4806-8fb5-27724da7ad0d\") " pod="openstack/heat-cfnapi-58665ff9-7zt7s" Mar 09 14:26:59 crc kubenswrapper[4722]: I0309 14:26:59.458128 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f71670c9-520f-4005-a324-199bc52fac7f-combined-ca-bundle\") pod \"heat-engine-5884f594db-ls9vv\" (UID: \"f71670c9-520f-4005-a324-199bc52fac7f\") " pod="openstack/heat-engine-5884f594db-ls9vv" Mar 09 14:26:59 crc kubenswrapper[4722]: I0309 14:26:59.458157 4722 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f71670c9-520f-4005-a324-199bc52fac7f-config-data-custom\") pod \"heat-engine-5884f594db-ls9vv\" (UID: \"f71670c9-520f-4005-a324-199bc52fac7f\") " pod="openstack/heat-engine-5884f594db-ls9vv" Mar 09 14:26:59 crc kubenswrapper[4722]: I0309 14:26:59.460023 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f71670c9-520f-4005-a324-199bc52fac7f-config-data\") pod \"heat-engine-5884f594db-ls9vv\" (UID: \"f71670c9-520f-4005-a324-199bc52fac7f\") " pod="openstack/heat-engine-5884f594db-ls9vv" Mar 09 14:26:59 crc kubenswrapper[4722]: I0309 14:26:59.469937 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-5ffdf96bcf-klbzp"] Mar 09 14:26:59 crc kubenswrapper[4722]: I0309 14:26:59.473107 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzv6c\" (UniqueName: \"kubernetes.io/projected/f71670c9-520f-4005-a324-199bc52fac7f-kube-api-access-xzv6c\") pod \"heat-engine-5884f594db-ls9vv\" (UID: \"f71670c9-520f-4005-a324-199bc52fac7f\") " pod="openstack/heat-engine-5884f594db-ls9vv" Mar 09 14:26:59 crc kubenswrapper[4722]: I0309 14:26:59.473502 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5ffdf96bcf-klbzp" Mar 09 14:26:59 crc kubenswrapper[4722]: I0309 14:26:59.480883 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-5ffdf96bcf-klbzp"] Mar 09 14:26:59 crc kubenswrapper[4722]: I0309 14:26:59.493112 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data" Mar 09 14:26:59 crc kubenswrapper[4722]: I0309 14:26:59.535887 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-5884f594db-ls9vv" Mar 09 14:26:59 crc kubenswrapper[4722]: I0309 14:26:59.544003 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpqhb\" (UniqueName: \"kubernetes.io/projected/3c1bee46-a464-474e-ae9c-e333a0ef2190-kube-api-access-zpqhb\") pod \"dnsmasq-dns-7756b9d78c-xpcxn\" (UID: \"3c1bee46-a464-474e-ae9c-e333a0ef2190\") " pod="openstack/dnsmasq-dns-7756b9d78c-xpcxn" Mar 09 14:26:59 crc kubenswrapper[4722]: I0309 14:26:59.544088 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3c1bee46-a464-474e-ae9c-e333a0ef2190-ovsdbserver-sb\") pod \"dnsmasq-dns-7756b9d78c-xpcxn\" (UID: \"3c1bee46-a464-474e-ae9c-e333a0ef2190\") " pod="openstack/dnsmasq-dns-7756b9d78c-xpcxn" Mar 09 14:26:59 crc kubenswrapper[4722]: I0309 14:26:59.544131 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twx9m\" (UniqueName: \"kubernetes.io/projected/37f48e3f-e425-4806-8fb5-27724da7ad0d-kube-api-access-twx9m\") pod \"heat-cfnapi-58665ff9-7zt7s\" (UID: \"37f48e3f-e425-4806-8fb5-27724da7ad0d\") " pod="openstack/heat-cfnapi-58665ff9-7zt7s" Mar 09 14:26:59 crc kubenswrapper[4722]: I0309 14:26:59.544150 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37f48e3f-e425-4806-8fb5-27724da7ad0d-combined-ca-bundle\") pod \"heat-cfnapi-58665ff9-7zt7s\" (UID: \"37f48e3f-e425-4806-8fb5-27724da7ad0d\") " pod="openstack/heat-cfnapi-58665ff9-7zt7s" Mar 09 14:26:59 crc kubenswrapper[4722]: I0309 14:26:59.544182 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c08dfba1-d105-4384-b906-5772148696e4-config-data-custom\") pod \"heat-api-5ffdf96bcf-klbzp\" (UID: \"c08dfba1-d105-4384-b906-5772148696e4\") " pod="openstack/heat-api-5ffdf96bcf-klbzp" Mar 09 14:26:59 crc kubenswrapper[4722]: I0309 14:26:59.544251 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/37f48e3f-e425-4806-8fb5-27724da7ad0d-config-data-custom\") pod \"heat-cfnapi-58665ff9-7zt7s\" (UID: \"37f48e3f-e425-4806-8fb5-27724da7ad0d\") " pod="openstack/heat-cfnapi-58665ff9-7zt7s" Mar 09 14:26:59 crc kubenswrapper[4722]: I0309 14:26:59.544298 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c08dfba1-d105-4384-b906-5772148696e4-config-data\") pod \"heat-api-5ffdf96bcf-klbzp\" (UID: \"c08dfba1-d105-4384-b906-5772148696e4\") " pod="openstack/heat-api-5ffdf96bcf-klbzp" Mar 09 14:26:59 crc kubenswrapper[4722]: I0309 14:26:59.544328 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqcsp\" (UniqueName: \"kubernetes.io/projected/c08dfba1-d105-4384-b906-5772148696e4-kube-api-access-nqcsp\") pod \"heat-api-5ffdf96bcf-klbzp\" (UID: \"c08dfba1-d105-4384-b906-5772148696e4\") " pod="openstack/heat-api-5ffdf96bcf-klbzp" Mar 09 14:26:59 crc kubenswrapper[4722]: I0309 14:26:59.544366 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c1bee46-a464-474e-ae9c-e333a0ef2190-config\") pod \"dnsmasq-dns-7756b9d78c-xpcxn\" (UID: 
\"3c1bee46-a464-474e-ae9c-e333a0ef2190\") " pod="openstack/dnsmasq-dns-7756b9d78c-xpcxn" Mar 09 14:26:59 crc kubenswrapper[4722]: I0309 14:26:59.544412 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3c1bee46-a464-474e-ae9c-e333a0ef2190-ovsdbserver-nb\") pod \"dnsmasq-dns-7756b9d78c-xpcxn\" (UID: \"3c1bee46-a464-474e-ae9c-e333a0ef2190\") " pod="openstack/dnsmasq-dns-7756b9d78c-xpcxn" Mar 09 14:26:59 crc kubenswrapper[4722]: I0309 14:26:59.544442 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3c1bee46-a464-474e-ae9c-e333a0ef2190-dns-svc\") pod \"dnsmasq-dns-7756b9d78c-xpcxn\" (UID: \"3c1bee46-a464-474e-ae9c-e333a0ef2190\") " pod="openstack/dnsmasq-dns-7756b9d78c-xpcxn" Mar 09 14:26:59 crc kubenswrapper[4722]: I0309 14:26:59.544458 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3c1bee46-a464-474e-ae9c-e333a0ef2190-dns-swift-storage-0\") pod \"dnsmasq-dns-7756b9d78c-xpcxn\" (UID: \"3c1bee46-a464-474e-ae9c-e333a0ef2190\") " pod="openstack/dnsmasq-dns-7756b9d78c-xpcxn" Mar 09 14:26:59 crc kubenswrapper[4722]: I0309 14:26:59.544494 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c08dfba1-d105-4384-b906-5772148696e4-combined-ca-bundle\") pod \"heat-api-5ffdf96bcf-klbzp\" (UID: \"c08dfba1-d105-4384-b906-5772148696e4\") " pod="openstack/heat-api-5ffdf96bcf-klbzp" Mar 09 14:26:59 crc kubenswrapper[4722]: I0309 14:26:59.544525 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37f48e3f-e425-4806-8fb5-27724da7ad0d-config-data\") pod \"heat-cfnapi-58665ff9-7zt7s\" (UID: \"37f48e3f-e425-4806-8fb5-27724da7ad0d\") " pod="openstack/heat-cfnapi-58665ff9-7zt7s" Mar 09 14:26:59 crc kubenswrapper[4722]: I0309 14:26:59.545373 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3c1bee46-a464-474e-ae9c-e333a0ef2190-ovsdbserver-sb\") pod \"dnsmasq-dns-7756b9d78c-xpcxn\" (UID: \"3c1bee46-a464-474e-ae9c-e333a0ef2190\") " pod="openstack/dnsmasq-dns-7756b9d78c-xpcxn" Mar 09 14:26:59 crc kubenswrapper[4722]: I0309 14:26:59.545376 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3c1bee46-a464-474e-ae9c-e333a0ef2190-ovsdbserver-nb\") pod \"dnsmasq-dns-7756b9d78c-xpcxn\" (UID: \"3c1bee46-a464-474e-ae9c-e333a0ef2190\") " pod="openstack/dnsmasq-dns-7756b9d78c-xpcxn" Mar 09 14:26:59 crc kubenswrapper[4722]: I0309 14:26:59.545893 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c1bee46-a464-474e-ae9c-e333a0ef2190-config\") pod \"dnsmasq-dns-7756b9d78c-xpcxn\" (UID: \"3c1bee46-a464-474e-ae9c-e333a0ef2190\") " pod="openstack/dnsmasq-dns-7756b9d78c-xpcxn" Mar 09 14:26:59 crc kubenswrapper[4722]: I0309 14:26:59.546011 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3c1bee46-a464-474e-ae9c-e333a0ef2190-dns-swift-storage-0\") pod \"dnsmasq-dns-7756b9d78c-xpcxn\" (UID: \"3c1bee46-a464-474e-ae9c-e333a0ef2190\") " pod="openstack/dnsmasq-dns-7756b9d78c-xpcxn" 
Mar 09 14:26:59 crc kubenswrapper[4722]: I0309 14:26:59.546281 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3c1bee46-a464-474e-ae9c-e333a0ef2190-dns-svc\") pod \"dnsmasq-dns-7756b9d78c-xpcxn\" (UID: \"3c1bee46-a464-474e-ae9c-e333a0ef2190\") " pod="openstack/dnsmasq-dns-7756b9d78c-xpcxn" Mar 09 14:26:59 crc kubenswrapper[4722]: I0309 14:26:59.550457 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37f48e3f-e425-4806-8fb5-27724da7ad0d-combined-ca-bundle\") pod \"heat-cfnapi-58665ff9-7zt7s\" (UID: \"37f48e3f-e425-4806-8fb5-27724da7ad0d\") " pod="openstack/heat-cfnapi-58665ff9-7zt7s" Mar 09 14:26:59 crc kubenswrapper[4722]: I0309 14:26:59.553880 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/37f48e3f-e425-4806-8fb5-27724da7ad0d-config-data-custom\") pod \"heat-cfnapi-58665ff9-7zt7s\" (UID: \"37f48e3f-e425-4806-8fb5-27724da7ad0d\") " pod="openstack/heat-cfnapi-58665ff9-7zt7s" Mar 09 14:26:59 crc kubenswrapper[4722]: I0309 14:26:59.555690 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37f48e3f-e425-4806-8fb5-27724da7ad0d-config-data\") pod \"heat-cfnapi-58665ff9-7zt7s\" (UID: \"37f48e3f-e425-4806-8fb5-27724da7ad0d\") " pod="openstack/heat-cfnapi-58665ff9-7zt7s" Mar 09 14:26:59 crc kubenswrapper[4722]: I0309 14:26:59.574425 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twx9m\" (UniqueName: \"kubernetes.io/projected/37f48e3f-e425-4806-8fb5-27724da7ad0d-kube-api-access-twx9m\") pod \"heat-cfnapi-58665ff9-7zt7s\" (UID: \"37f48e3f-e425-4806-8fb5-27724da7ad0d\") " pod="openstack/heat-cfnapi-58665ff9-7zt7s" Mar 09 14:26:59 crc kubenswrapper[4722]: I0309 14:26:59.577949 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpqhb\" (UniqueName: \"kubernetes.io/projected/3c1bee46-a464-474e-ae9c-e333a0ef2190-kube-api-access-zpqhb\") pod \"dnsmasq-dns-7756b9d78c-xpcxn\" (UID: \"3c1bee46-a464-474e-ae9c-e333a0ef2190\") " pod="openstack/dnsmasq-dns-7756b9d78c-xpcxn" Mar 09 14:26:59 crc kubenswrapper[4722]: I0309 14:26:59.583563 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-77cd88d8c5-gml97"] Mar 09 14:26:59 crc kubenswrapper[4722]: I0309 14:26:59.621591 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-77cd88d8c5-gml97" event={"ID":"685a5733-d06e-4523-a35a-051db91eb0be","Type":"ContainerStarted","Data":"be3ecde308cbcea91221e56bc92861c0b49dcf5aae725f977f325a3810ae78f2"} Mar 09 14:26:59 crc kubenswrapper[4722]: I0309 14:26:59.646771 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c08dfba1-d105-4384-b906-5772148696e4-config-data-custom\") pod \"heat-api-5ffdf96bcf-klbzp\" (UID: \"c08dfba1-d105-4384-b906-5772148696e4\") " pod="openstack/heat-api-5ffdf96bcf-klbzp" Mar 09 14:26:59 crc kubenswrapper[4722]: I0309 14:26:59.646862 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c08dfba1-d105-4384-b906-5772148696e4-config-data\") pod \"heat-api-5ffdf96bcf-klbzp\" (UID: \"c08dfba1-d105-4384-b906-5772148696e4\") " pod="openstack/heat-api-5ffdf96bcf-klbzp" Mar 09 14:26:59 crc 
kubenswrapper[4722]: I0309 14:26:59.646896 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqcsp\" (UniqueName: \"kubernetes.io/projected/c08dfba1-d105-4384-b906-5772148696e4-kube-api-access-nqcsp\") pod \"heat-api-5ffdf96bcf-klbzp\" (UID: \"c08dfba1-d105-4384-b906-5772148696e4\") " pod="openstack/heat-api-5ffdf96bcf-klbzp" Mar 09 14:26:59 crc kubenswrapper[4722]: I0309 14:26:59.646964 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c08dfba1-d105-4384-b906-5772148696e4-combined-ca-bundle\") pod \"heat-api-5ffdf96bcf-klbzp\" (UID: \"c08dfba1-d105-4384-b906-5772148696e4\") " pod="openstack/heat-api-5ffdf96bcf-klbzp" Mar 09 14:26:59 crc kubenswrapper[4722]: I0309 14:26:59.655284 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c08dfba1-d105-4384-b906-5772148696e4-combined-ca-bundle\") pod \"heat-api-5ffdf96bcf-klbzp\" (UID: \"c08dfba1-d105-4384-b906-5772148696e4\") " pod="openstack/heat-api-5ffdf96bcf-klbzp" Mar 09 14:26:59 crc kubenswrapper[4722]: I0309 14:26:59.662147 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c08dfba1-d105-4384-b906-5772148696e4-config-data\") pod \"heat-api-5ffdf96bcf-klbzp\" (UID: \"c08dfba1-d105-4384-b906-5772148696e4\") " pod="openstack/heat-api-5ffdf96bcf-klbzp" Mar 09 14:26:59 crc kubenswrapper[4722]: I0309 14:26:59.673179 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c08dfba1-d105-4384-b906-5772148696e4-config-data-custom\") pod \"heat-api-5ffdf96bcf-klbzp\" (UID: \"c08dfba1-d105-4384-b906-5772148696e4\") " pod="openstack/heat-api-5ffdf96bcf-klbzp" Mar 09 14:26:59 crc kubenswrapper[4722]: I0309 14:26:59.681120 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqcsp\" (UniqueName: \"kubernetes.io/projected/c08dfba1-d105-4384-b906-5772148696e4-kube-api-access-nqcsp\") pod \"heat-api-5ffdf96bcf-klbzp\" (UID: \"c08dfba1-d105-4384-b906-5772148696e4\") " pod="openstack/heat-api-5ffdf96bcf-klbzp" Mar 09 14:26:59 crc kubenswrapper[4722]: I0309 14:26:59.683344 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7756b9d78c-xpcxn" Mar 09 14:26:59 crc kubenswrapper[4722]: I0309 14:26:59.710544 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-58665ff9-7zt7s" Mar 09 14:26:59 crc kubenswrapper[4722]: I0309 14:26:59.891937 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-5ffdf96bcf-klbzp" Mar 09 14:27:00 crc kubenswrapper[4722]: I0309 14:27:00.202898 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3b18ae1-c2c9-4454-a591-53ce06064d82" path="/var/lib/kubelet/pods/c3b18ae1-c2c9-4454-a591-53ce06064d82/volumes" Mar 09 14:27:00 crc kubenswrapper[4722]: I0309 14:27:00.383194 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-5884f594db-ls9vv"] Mar 09 14:27:00 crc kubenswrapper[4722]: W0309 14:27:00.396769 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf71670c9_520f_4005_a324_199bc52fac7f.slice/crio-f8354b24d5aa015bff9d0e7c6220ce3b18d67b7dc6b099982162cc991789aeaf WatchSource:0}: Error finding container f8354b24d5aa015bff9d0e7c6220ce3b18d67b7dc6b099982162cc991789aeaf: Status 404 returned error can't find the container with id f8354b24d5aa015bff9d0e7c6220ce3b18d67b7dc6b099982162cc991789aeaf Mar 09 14:27:00 crc kubenswrapper[4722]: I0309 14:27:00.584484 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 14:27:00 crc kubenswrapper[4722]: I0309 14:27:00.584758 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a2b1f498-8133-44c3-b9e5-fb0accca46b1" containerName="ceilometer-central-agent" containerID="cri-o://4111597f6938af2698c7e65d183b920cc4a5a02e25e591a9d7c7e74ba34a1adc" gracePeriod=30 Mar 09 14:27:00 crc kubenswrapper[4722]: I0309 14:27:00.585184 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a2b1f498-8133-44c3-b9e5-fb0accca46b1" containerName="ceilometer-notification-agent" containerID="cri-o://8c4ef0b1b0525da141a7b474417b1d9f5f076ece945f64130e94d442a9b6d5c1" gracePeriod=30 Mar 09 14:27:00 crc kubenswrapper[4722]: I0309 14:27:00.585257 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a2b1f498-8133-44c3-b9e5-fb0accca46b1" containerName="proxy-httpd" containerID="cri-o://d72d728f1904b9bb92a777806f4d320dcd1558801a3377ecfb3691c9c51e8ef3" gracePeriod=30 Mar 09 14:27:00 crc kubenswrapper[4722]: I0309 14:27:00.585280 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a2b1f498-8133-44c3-b9e5-fb0accca46b1" containerName="sg-core" containerID="cri-o://5a99f9106014c8a16f08805e02978936d40105f4edfc311dc98e8ab1e37b2c97" gracePeriod=30 Mar 09 14:27:00 crc kubenswrapper[4722]: I0309 14:27:00.608350 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="a2b1f498-8133-44c3-b9e5-fb0accca46b1" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.216:3000/\": EOF" Mar 09 14:27:00 crc kubenswrapper[4722]: I0309 14:27:00.656293 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-77cd88d8c5-gml97" event={"ID":"685a5733-d06e-4523-a35a-051db91eb0be","Type":"ContainerStarted","Data":"17d6931a27e883a8284e224af72a0ab65394d31a8cc96cafc1d9e891c116021a"} Mar 09 14:27:00 crc kubenswrapper[4722]: I0309 14:27:00.664467 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-5884f594db-ls9vv" event={"ID":"f71670c9-520f-4005-a324-199bc52fac7f","Type":"ContainerStarted","Data":"f8354b24d5aa015bff9d0e7c6220ce3b18d67b7dc6b099982162cc991789aeaf"} Mar 09 14:27:00 crc kubenswrapper[4722]: I0309 14:27:00.895877 
4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-xpcxn"] Mar 09 14:27:00 crc kubenswrapper[4722]: W0309 14:27:00.901757 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c1bee46_a464_474e_ae9c_e333a0ef2190.slice/crio-c9bfffe05c307b7c132099fcfe0f331fc42a2687812efbaa4accf54ae4e82945 WatchSource:0}: Error finding container c9bfffe05c307b7c132099fcfe0f331fc42a2687812efbaa4accf54ae4e82945: Status 404 returned error can't find the container with id c9bfffe05c307b7c132099fcfe0f331fc42a2687812efbaa4accf54ae4e82945 Mar 09 14:27:00 crc kubenswrapper[4722]: I0309 14:27:00.952965 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-58665ff9-7zt7s"] Mar 09 14:27:00 crc kubenswrapper[4722]: W0309 14:27:00.969871 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37f48e3f_e425_4806_8fb5_27724da7ad0d.slice/crio-8624c0c4345c2696fd5321fd486df640d0f505a836b998620c0888808603aefd WatchSource:0}: Error finding container 8624c0c4345c2696fd5321fd486df640d0f505a836b998620c0888808603aefd: Status 404 returned error can't find the container with id 8624c0c4345c2696fd5321fd486df640d0f505a836b998620c0888808603aefd Mar 09 14:27:00 crc kubenswrapper[4722]: I0309 14:27:00.988540 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-5ffdf96bcf-klbzp"] Mar 09 14:27:01 crc kubenswrapper[4722]: I0309 14:27:01.690087 4722 generic.go:334] "Generic (PLEG): container finished" podID="a2b1f498-8133-44c3-b9e5-fb0accca46b1" containerID="d72d728f1904b9bb92a777806f4d320dcd1558801a3377ecfb3691c9c51e8ef3" exitCode=0 Mar 09 14:27:01 crc kubenswrapper[4722]: I0309 14:27:01.690609 4722 generic.go:334] "Generic (PLEG): container finished" podID="a2b1f498-8133-44c3-b9e5-fb0accca46b1" containerID="5a99f9106014c8a16f08805e02978936d40105f4edfc311dc98e8ab1e37b2c97" exitCode=2 Mar 09 14:27:01 crc kubenswrapper[4722]: I0309 14:27:01.690618 4722 generic.go:334] "Generic (PLEG): container finished" podID="a2b1f498-8133-44c3-b9e5-fb0accca46b1" containerID="8c4ef0b1b0525da141a7b474417b1d9f5f076ece945f64130e94d442a9b6d5c1" exitCode=0 Mar 09 14:27:01 crc kubenswrapper[4722]: I0309 14:27:01.690627 4722 generic.go:334] "Generic (PLEG): container finished" podID="a2b1f498-8133-44c3-b9e5-fb0accca46b1" containerID="4111597f6938af2698c7e65d183b920cc4a5a02e25e591a9d7c7e74ba34a1adc" exitCode=0 Mar 09 14:27:01 crc kubenswrapper[4722]: I0309 14:27:01.690670 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a2b1f498-8133-44c3-b9e5-fb0accca46b1","Type":"ContainerDied","Data":"d72d728f1904b9bb92a777806f4d320dcd1558801a3377ecfb3691c9c51e8ef3"} Mar 09 14:27:01 crc kubenswrapper[4722]: I0309 14:27:01.690697 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a2b1f498-8133-44c3-b9e5-fb0accca46b1","Type":"ContainerDied","Data":"5a99f9106014c8a16f08805e02978936d40105f4edfc311dc98e8ab1e37b2c97"} Mar 09 14:27:01 crc kubenswrapper[4722]: I0309 14:27:01.690710 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a2b1f498-8133-44c3-b9e5-fb0accca46b1","Type":"ContainerDied","Data":"8c4ef0b1b0525da141a7b474417b1d9f5f076ece945f64130e94d442a9b6d5c1"} Mar 09 14:27:01 crc kubenswrapper[4722]: I0309 14:27:01.690720 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"a2b1f498-8133-44c3-b9e5-fb0accca46b1","Type":"ContainerDied","Data":"4111597f6938af2698c7e65d183b920cc4a5a02e25e591a9d7c7e74ba34a1adc"} Mar 09 14:27:01 crc kubenswrapper[4722]: I0309 14:27:01.694051 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5ffdf96bcf-klbzp" event={"ID":"c08dfba1-d105-4384-b906-5772148696e4","Type":"ContainerStarted","Data":"48703b92787f9698e4a7e81da6bfa90aaf203ac85cd822625bf29c90ea7ead58"} Mar 09 14:27:01 crc kubenswrapper[4722]: I0309 14:27:01.744447 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-5884f594db-ls9vv" event={"ID":"f71670c9-520f-4005-a324-199bc52fac7f","Type":"ContainerStarted","Data":"876323b8a50b38036cbeb67f8682fe72e20d6ca089e54cdc4c767cfd67f20e7f"} Mar 09 14:27:01 crc kubenswrapper[4722]: I0309 14:27:01.744624 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-5884f594db-ls9vv" Mar 09 14:27:01 crc kubenswrapper[4722]: I0309 14:27:01.756425 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-58665ff9-7zt7s" event={"ID":"37f48e3f-e425-4806-8fb5-27724da7ad0d","Type":"ContainerStarted","Data":"8624c0c4345c2696fd5321fd486df640d0f505a836b998620c0888808603aefd"} Mar 09 14:27:01 crc kubenswrapper[4722]: I0309 14:27:01.766507 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-77cd88d8c5-gml97" event={"ID":"685a5733-d06e-4523-a35a-051db91eb0be","Type":"ContainerStarted","Data":"e6aac84ca22de311827d8706b846e27e5cef7de1a3689acec4587ccf33ac3018"} Mar 09 14:27:01 crc kubenswrapper[4722]: I0309 14:27:01.766768 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-77cd88d8c5-gml97" Mar 09 14:27:01 crc kubenswrapper[4722]: I0309 14:27:01.766814 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-77cd88d8c5-gml97" Mar 09 14:27:01 crc kubenswrapper[4722]: I0309 14:27:01.767975 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-5884f594db-ls9vv" podStartSLOduration=2.767957167 podStartE2EDuration="2.767957167s" podCreationTimestamp="2026-03-09 14:26:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:27:01.764738888 +0000 UTC m=+1462.320307464" watchObservedRunningTime="2026-03-09 14:27:01.767957167 +0000 UTC m=+1462.323525743" Mar 09 14:27:01 crc kubenswrapper[4722]: I0309 14:27:01.783294 4722 generic.go:334] "Generic (PLEG): container finished" podID="3c1bee46-a464-474e-ae9c-e333a0ef2190" containerID="86c3469bd786421a67af04f30ed7e5cdf03b6d03b83ddff2bae2a5ea53f862f4" exitCode=0 Mar 09 14:27:01 crc kubenswrapper[4722]: I0309 14:27:01.783421 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-xpcxn" event={"ID":"3c1bee46-a464-474e-ae9c-e333a0ef2190","Type":"ContainerDied","Data":"86c3469bd786421a67af04f30ed7e5cdf03b6d03b83ddff2bae2a5ea53f862f4"} Mar 09 14:27:01 crc kubenswrapper[4722]: I0309 14:27:01.783677 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-xpcxn" event={"ID":"3c1bee46-a464-474e-ae9c-e333a0ef2190","Type":"ContainerStarted","Data":"c9bfffe05c307b7c132099fcfe0f331fc42a2687812efbaa4accf54ae4e82945"} Mar 09 14:27:01 crc kubenswrapper[4722]: I0309 14:27:01.798025 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/swift-proxy-77cd88d8c5-gml97" podStartSLOduration=3.797020051 podStartE2EDuration="3.797020051s" podCreationTimestamp="2026-03-09 14:26:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:27:01.789524923 +0000 UTC m=+1462.345093499" watchObservedRunningTime="2026-03-09 14:27:01.797020051 +0000 UTC m=+1462.352588627" Mar 09 14:27:01 crc kubenswrapper[4722]: I0309 14:27:01.976286 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 14:27:02 crc kubenswrapper[4722]: I0309 14:27:02.102071 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2b1f498-8133-44c3-b9e5-fb0accca46b1-config-data\") pod \"a2b1f498-8133-44c3-b9e5-fb0accca46b1\" (UID: \"a2b1f498-8133-44c3-b9e5-fb0accca46b1\") " Mar 09 14:27:02 crc kubenswrapper[4722]: I0309 14:27:02.102586 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a2b1f498-8133-44c3-b9e5-fb0accca46b1-scripts\") pod \"a2b1f498-8133-44c3-b9e5-fb0accca46b1\" (UID: \"a2b1f498-8133-44c3-b9e5-fb0accca46b1\") " Mar 09 14:27:02 crc kubenswrapper[4722]: I0309 14:27:02.102697 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a2b1f498-8133-44c3-b9e5-fb0accca46b1-sg-core-conf-yaml\") pod \"a2b1f498-8133-44c3-b9e5-fb0accca46b1\" (UID: \"a2b1f498-8133-44c3-b9e5-fb0accca46b1\") " Mar 09 14:27:02 crc kubenswrapper[4722]: I0309 14:27:02.102742 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vftrz\" (UniqueName: \"kubernetes.io/projected/a2b1f498-8133-44c3-b9e5-fb0accca46b1-kube-api-access-vftrz\") pod \"a2b1f498-8133-44c3-b9e5-fb0accca46b1\" (UID: \"a2b1f498-8133-44c3-b9e5-fb0accca46b1\") " Mar 09 14:27:02 crc kubenswrapper[4722]: I0309 14:27:02.102813 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a2b1f498-8133-44c3-b9e5-fb0accca46b1-run-httpd\") pod \"a2b1f498-8133-44c3-b9e5-fb0accca46b1\" (UID: \"a2b1f498-8133-44c3-b9e5-fb0accca46b1\") " Mar 09 14:27:02 crc kubenswrapper[4722]: I0309 14:27:02.102888 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a2b1f498-8133-44c3-b9e5-fb0accca46b1-log-httpd\") pod \"a2b1f498-8133-44c3-b9e5-fb0accca46b1\" (UID: \"a2b1f498-8133-44c3-b9e5-fb0accca46b1\") " Mar 09 14:27:02 crc kubenswrapper[4722]: I0309 14:27:02.102914 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2b1f498-8133-44c3-b9e5-fb0accca46b1-combined-ca-bundle\") pod \"a2b1f498-8133-44c3-b9e5-fb0accca46b1\" (UID: \"a2b1f498-8133-44c3-b9e5-fb0accca46b1\") " Mar 09 14:27:02 crc kubenswrapper[4722]: I0309 14:27:02.107041 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2b1f498-8133-44c3-b9e5-fb0accca46b1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a2b1f498-8133-44c3-b9e5-fb0accca46b1" (UID: "a2b1f498-8133-44c3-b9e5-fb0accca46b1"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:27:02 crc kubenswrapper[4722]: I0309 14:27:02.107133 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2b1f498-8133-44c3-b9e5-fb0accca46b1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a2b1f498-8133-44c3-b9e5-fb0accca46b1" (UID: "a2b1f498-8133-44c3-b9e5-fb0accca46b1"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:27:02 crc kubenswrapper[4722]: I0309 14:27:02.107404 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2b1f498-8133-44c3-b9e5-fb0accca46b1-scripts" (OuterVolumeSpecName: "scripts") pod "a2b1f498-8133-44c3-b9e5-fb0accca46b1" (UID: "a2b1f498-8133-44c3-b9e5-fb0accca46b1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:27:02 crc kubenswrapper[4722]: I0309 14:27:02.113626 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2b1f498-8133-44c3-b9e5-fb0accca46b1-kube-api-access-vftrz" (OuterVolumeSpecName: "kube-api-access-vftrz") pod "a2b1f498-8133-44c3-b9e5-fb0accca46b1" (UID: "a2b1f498-8133-44c3-b9e5-fb0accca46b1"). InnerVolumeSpecName "kube-api-access-vftrz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:27:02 crc kubenswrapper[4722]: I0309 14:27:02.183391 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2b1f498-8133-44c3-b9e5-fb0accca46b1-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a2b1f498-8133-44c3-b9e5-fb0accca46b1" (UID: "a2b1f498-8133-44c3-b9e5-fb0accca46b1"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:27:02 crc kubenswrapper[4722]: I0309 14:27:02.216340 4722 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a2b1f498-8133-44c3-b9e5-fb0accca46b1-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 09 14:27:02 crc kubenswrapper[4722]: I0309 14:27:02.216619 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vftrz\" (UniqueName: \"kubernetes.io/projected/a2b1f498-8133-44c3-b9e5-fb0accca46b1-kube-api-access-vftrz\") on node \"crc\" DevicePath \"\"" Mar 09 14:27:02 crc kubenswrapper[4722]: I0309 14:27:02.216704 4722 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a2b1f498-8133-44c3-b9e5-fb0accca46b1-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 14:27:02 crc kubenswrapper[4722]: I0309 14:27:02.216878 4722 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a2b1f498-8133-44c3-b9e5-fb0accca46b1-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 14:27:02 crc kubenswrapper[4722]: I0309 14:27:02.217014 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a2b1f498-8133-44c3-b9e5-fb0accca46b1-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 14:27:02 crc kubenswrapper[4722]: I0309 14:27:02.266344 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2b1f498-8133-44c3-b9e5-fb0accca46b1-config-data" (OuterVolumeSpecName: "config-data") pod "a2b1f498-8133-44c3-b9e5-fb0accca46b1" (UID: "a2b1f498-8133-44c3-b9e5-fb0accca46b1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:27:02 crc kubenswrapper[4722]: I0309 14:27:02.276903 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2b1f498-8133-44c3-b9e5-fb0accca46b1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a2b1f498-8133-44c3-b9e5-fb0accca46b1" (UID: "a2b1f498-8133-44c3-b9e5-fb0accca46b1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:27:02 crc kubenswrapper[4722]: I0309 14:27:02.325349 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2b1f498-8133-44c3-b9e5-fb0accca46b1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:27:02 crc kubenswrapper[4722]: I0309 14:27:02.325385 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2b1f498-8133-44c3-b9e5-fb0accca46b1-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 14:27:02 crc kubenswrapper[4722]: I0309 14:27:02.867007 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a2b1f498-8133-44c3-b9e5-fb0accca46b1","Type":"ContainerDied","Data":"1eff49e536c100fb8b6b05bf268153f5fcbfdcb11219c3476d0f3462947cf7d5"} Mar 09 14:27:02 crc kubenswrapper[4722]: I0309 14:27:02.867085 4722 scope.go:117] "RemoveContainer" containerID="d72d728f1904b9bb92a777806f4d320dcd1558801a3377ecfb3691c9c51e8ef3" Mar 09 14:27:02 crc kubenswrapper[4722]: I0309 14:27:02.867094 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 14:27:02 crc kubenswrapper[4722]: I0309 14:27:02.888566 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-xpcxn" event={"ID":"3c1bee46-a464-474e-ae9c-e333a0ef2190","Type":"ContainerStarted","Data":"b4992f3e1f40480b19383ed9d7d471916f5e637a19de820f80b9af6999306965"} Mar 09 14:27:02 crc kubenswrapper[4722]: I0309 14:27:02.894722 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7756b9d78c-xpcxn" Mar 09 14:27:02 crc kubenswrapper[4722]: I0309 14:27:02.923908 4722 scope.go:117] "RemoveContainer" containerID="5a99f9106014c8a16f08805e02978936d40105f4edfc311dc98e8ab1e37b2c97" Mar 09 14:27:02 crc kubenswrapper[4722]: I0309 14:27:02.934496 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7756b9d78c-xpcxn" podStartSLOduration=3.934477812 podStartE2EDuration="3.934477812s" podCreationTimestamp="2026-03-09 14:26:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:27:02.924844076 +0000 UTC m=+1463.480412672" watchObservedRunningTime="2026-03-09 14:27:02.934477812 +0000 UTC m=+1463.490046388" Mar 09 14:27:03 crc kubenswrapper[4722]: I0309 14:27:03.011982 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 14:27:03 crc kubenswrapper[4722]: I0309 14:27:03.012490 4722 scope.go:117] "RemoveContainer" containerID="8c4ef0b1b0525da141a7b474417b1d9f5f076ece945f64130e94d442a9b6d5c1" Mar 09 14:27:03 crc kubenswrapper[4722]: I0309 14:27:03.028022 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 09 14:27:03 crc kubenswrapper[4722]: I0309 14:27:03.041441 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 09 
14:27:03 crc kubenswrapper[4722]: E0309 14:27:03.042194 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2b1f498-8133-44c3-b9e5-fb0accca46b1" containerName="ceilometer-notification-agent" Mar 09 14:27:03 crc kubenswrapper[4722]: I0309 14:27:03.042227 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2b1f498-8133-44c3-b9e5-fb0accca46b1" containerName="ceilometer-notification-agent" Mar 09 14:27:03 crc kubenswrapper[4722]: E0309 14:27:03.042261 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2b1f498-8133-44c3-b9e5-fb0accca46b1" containerName="ceilometer-central-agent" Mar 09 14:27:03 crc kubenswrapper[4722]: I0309 14:27:03.042267 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2b1f498-8133-44c3-b9e5-fb0accca46b1" containerName="ceilometer-central-agent" Mar 09 14:27:03 crc kubenswrapper[4722]: E0309 14:27:03.042284 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2b1f498-8133-44c3-b9e5-fb0accca46b1" containerName="sg-core" Mar 09 14:27:03 crc kubenswrapper[4722]: I0309 14:27:03.042291 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2b1f498-8133-44c3-b9e5-fb0accca46b1" containerName="sg-core" Mar 09 14:27:03 crc kubenswrapper[4722]: E0309 14:27:03.042308 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2b1f498-8133-44c3-b9e5-fb0accca46b1" containerName="proxy-httpd" Mar 09 14:27:03 crc kubenswrapper[4722]: I0309 14:27:03.042313 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2b1f498-8133-44c3-b9e5-fb0accca46b1" containerName="proxy-httpd" Mar 09 14:27:03 crc kubenswrapper[4722]: I0309 14:27:03.042538 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2b1f498-8133-44c3-b9e5-fb0accca46b1" containerName="ceilometer-notification-agent" Mar 09 14:27:03 crc kubenswrapper[4722]: I0309 14:27:03.042554 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2b1f498-8133-44c3-b9e5-fb0accca46b1" containerName="ceilometer-central-agent" Mar 09 14:27:03 crc kubenswrapper[4722]: I0309 14:27:03.042569 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2b1f498-8133-44c3-b9e5-fb0accca46b1" containerName="sg-core" Mar 09 14:27:03 crc kubenswrapper[4722]: I0309 14:27:03.042596 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2b1f498-8133-44c3-b9e5-fb0accca46b1" containerName="proxy-httpd" Mar 09 14:27:03 crc kubenswrapper[4722]: I0309 14:27:03.044689 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 14:27:03 crc kubenswrapper[4722]: I0309 14:27:03.061509 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 14:27:03 crc kubenswrapper[4722]: I0309 14:27:03.064091 4722 scope.go:117] "RemoveContainer" containerID="4111597f6938af2698c7e65d183b920cc4a5a02e25e591a9d7c7e74ba34a1adc" Mar 09 14:27:03 crc kubenswrapper[4722]: I0309 14:27:03.066460 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 09 14:27:03 crc kubenswrapper[4722]: I0309 14:27:03.066695 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 09 14:27:03 crc kubenswrapper[4722]: I0309 14:27:03.185008 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2a7b55ea-3f51-4dfc-b861-23be50541a1c-run-httpd\") pod \"ceilometer-0\" (UID: \"2a7b55ea-3f51-4dfc-b861-23be50541a1c\") " pod="openstack/ceilometer-0" Mar 09 14:27:03 crc kubenswrapper[4722]: I0309 14:27:03.185072 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a7b55ea-3f51-4dfc-b861-23be50541a1c-scripts\") pod \"ceilometer-0\" (UID: \"2a7b55ea-3f51-4dfc-b861-23be50541a1c\") " pod="openstack/ceilometer-0" Mar 09 14:27:03 crc kubenswrapper[4722]: I0309 14:27:03.185169 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2a7b55ea-3f51-4dfc-b861-23be50541a1c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2a7b55ea-3f51-4dfc-b861-23be50541a1c\") " pod="openstack/ceilometer-0" Mar 09 14:27:03 crc kubenswrapper[4722]: I0309 14:27:03.185944 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2a7b55ea-3f51-4dfc-b861-23be50541a1c-log-httpd\") pod \"ceilometer-0\" (UID: \"2a7b55ea-3f51-4dfc-b861-23be50541a1c\") " pod="openstack/ceilometer-0" Mar 09 14:27:03 crc kubenswrapper[4722]: I0309 14:27:03.186037 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a7b55ea-3f51-4dfc-b861-23be50541a1c-config-data\") pod \"ceilometer-0\" (UID: \"2a7b55ea-3f51-4dfc-b861-23be50541a1c\") " pod="openstack/ceilometer-0" Mar 09 14:27:03 crc kubenswrapper[4722]: I0309 14:27:03.186125 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7pk2\" (UniqueName: \"kubernetes.io/projected/2a7b55ea-3f51-4dfc-b861-23be50541a1c-kube-api-access-v7pk2\") pod \"ceilometer-0\" (UID: \"2a7b55ea-3f51-4dfc-b861-23be50541a1c\") " pod="openstack/ceilometer-0" Mar 09 14:27:03 crc kubenswrapper[4722]: I0309 14:27:03.186154 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a7b55ea-3f51-4dfc-b861-23be50541a1c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2a7b55ea-3f51-4dfc-b861-23be50541a1c\") " pod="openstack/ceilometer-0" Mar 09 14:27:03 crc kubenswrapper[4722]: I0309 14:27:03.289958 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/2a7b55ea-3f51-4dfc-b861-23be50541a1c-log-httpd\") pod \"ceilometer-0\" (UID: \"2a7b55ea-3f51-4dfc-b861-23be50541a1c\") " pod="openstack/ceilometer-0" Mar 09 14:27:03 crc kubenswrapper[4722]: I0309 14:27:03.290296 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a7b55ea-3f51-4dfc-b861-23be50541a1c-config-data\") pod \"ceilometer-0\" (UID: \"2a7b55ea-3f51-4dfc-b861-23be50541a1c\") " pod="openstack/ceilometer-0" Mar 09 14:27:03 crc kubenswrapper[4722]: I0309 14:27:03.290401 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7pk2\" (UniqueName: \"kubernetes.io/projected/2a7b55ea-3f51-4dfc-b861-23be50541a1c-kube-api-access-v7pk2\") pod \"ceilometer-0\" (UID: \"2a7b55ea-3f51-4dfc-b861-23be50541a1c\") " pod="openstack/ceilometer-0" Mar 09 14:27:03 crc kubenswrapper[4722]: I0309 14:27:03.290433 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a7b55ea-3f51-4dfc-b861-23be50541a1c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2a7b55ea-3f51-4dfc-b861-23be50541a1c\") " pod="openstack/ceilometer-0" Mar 09 14:27:03 crc kubenswrapper[4722]: I0309 14:27:03.290535 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2a7b55ea-3f51-4dfc-b861-23be50541a1c-run-httpd\") pod \"ceilometer-0\" (UID: \"2a7b55ea-3f51-4dfc-b861-23be50541a1c\") " pod="openstack/ceilometer-0" Mar 09 14:27:03 crc kubenswrapper[4722]: I0309 14:27:03.290562 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a7b55ea-3f51-4dfc-b861-23be50541a1c-scripts\") pod \"ceilometer-0\" (UID: \"2a7b55ea-3f51-4dfc-b861-23be50541a1c\") " pod="openstack/ceilometer-0" Mar 09 14:27:03 crc kubenswrapper[4722]: I0309 14:27:03.290675 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2a7b55ea-3f51-4dfc-b861-23be50541a1c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2a7b55ea-3f51-4dfc-b861-23be50541a1c\") " pod="openstack/ceilometer-0" Mar 09 14:27:03 crc kubenswrapper[4722]: I0309 14:27:03.294021 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2a7b55ea-3f51-4dfc-b861-23be50541a1c-log-httpd\") pod \"ceilometer-0\" (UID: \"2a7b55ea-3f51-4dfc-b861-23be50541a1c\") " pod="openstack/ceilometer-0" Mar 09 14:27:03 crc kubenswrapper[4722]: I0309 14:27:03.294907 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2a7b55ea-3f51-4dfc-b861-23be50541a1c-run-httpd\") pod \"ceilometer-0\" (UID: \"2a7b55ea-3f51-4dfc-b861-23be50541a1c\") " pod="openstack/ceilometer-0" Mar 09 14:27:03 crc kubenswrapper[4722]: I0309 14:27:03.304750 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a7b55ea-3f51-4dfc-b861-23be50541a1c-scripts\") pod \"ceilometer-0\" (UID: \"2a7b55ea-3f51-4dfc-b861-23be50541a1c\") " pod="openstack/ceilometer-0" Mar 09 14:27:03 crc kubenswrapper[4722]: I0309 14:27:03.306814 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/2a7b55ea-3f51-4dfc-b861-23be50541a1c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2a7b55ea-3f51-4dfc-b861-23be50541a1c\") " pod="openstack/ceilometer-0" Mar 09 14:27:03 crc kubenswrapper[4722]: I0309 14:27:03.313996 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7pk2\" (UniqueName: \"kubernetes.io/projected/2a7b55ea-3f51-4dfc-b861-23be50541a1c-kube-api-access-v7pk2\") pod \"ceilometer-0\" (UID: \"2a7b55ea-3f51-4dfc-b861-23be50541a1c\") " pod="openstack/ceilometer-0" Mar 09 14:27:03 crc kubenswrapper[4722]: I0309 14:27:03.329371 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a7b55ea-3f51-4dfc-b861-23be50541a1c-config-data\") pod \"ceilometer-0\" (UID: \"2a7b55ea-3f51-4dfc-b861-23be50541a1c\") " pod="openstack/ceilometer-0" Mar 09 14:27:03 crc kubenswrapper[4722]: I0309 14:27:03.332089 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a7b55ea-3f51-4dfc-b861-23be50541a1c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2a7b55ea-3f51-4dfc-b861-23be50541a1c\") " pod="openstack/ceilometer-0" Mar 09 14:27:03 crc kubenswrapper[4722]: I0309 14:27:03.390019 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 14:27:03 crc kubenswrapper[4722]: I0309 14:27:03.587284 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 09 14:27:03 crc kubenswrapper[4722]: I0309 14:27:03.851878 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 14:27:03 crc kubenswrapper[4722]: I0309 14:27:03.852097 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4" containerName="glance-log" containerID="cri-o://a23379779376add9f8ba28be54166afd7bd04e41bcf899bf7e5acdad5d66214e" gracePeriod=30 Mar 09 14:27:03 crc kubenswrapper[4722]: I0309 14:27:03.852529 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4" containerName="glance-httpd" containerID="cri-o://7d51fe379c52b16a7a6d962b827b1c07be67bbcd90aa939d56cebc2eb739f4c7" gracePeriod=30 Mar 09 14:27:04 crc kubenswrapper[4722]: I0309 14:27:04.183021 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2b1f498-8133-44c3-b9e5-fb0accca46b1" path="/var/lib/kubelet/pods/a2b1f498-8133-44c3-b9e5-fb0accca46b1/volumes" Mar 09 14:27:04 crc kubenswrapper[4722]: I0309 14:27:04.184275 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6f9b466589-25vjm" Mar 09 14:27:04 crc kubenswrapper[4722]: I0309 14:27:04.197184 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 14:27:04 crc kubenswrapper[4722]: I0309 14:27:04.283238 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6f5cbf5964-pmtn7"] Mar 09 14:27:04 crc kubenswrapper[4722]: I0309 14:27:04.283556 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6f5cbf5964-pmtn7" podUID="24650aaa-2d24-4e59-9f9a-40b929e25c10" containerName="neutron-api" containerID="cri-o://5415c2e2f0d9b8528559acb465bd00b9d5e4a5f98d5315538979c87b19b791c3" gracePeriod=30 
Mar 09 14:27:04 crc kubenswrapper[4722]: I0309 14:27:04.283596 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6f5cbf5964-pmtn7" podUID="24650aaa-2d24-4e59-9f9a-40b929e25c10" containerName="neutron-httpd" containerID="cri-o://89975d701fd58751699513e7e1eafbc6124794762a9b7f960bfecb6450d873bd" gracePeriod=30
Mar 09 14:27:04 crc kubenswrapper[4722]: I0309 14:27:04.918039 4722 generic.go:334] "Generic (PLEG): container finished" podID="f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4" containerID="a23379779376add9f8ba28be54166afd7bd04e41bcf899bf7e5acdad5d66214e" exitCode=143
Mar 09 14:27:04 crc kubenswrapper[4722]: I0309 14:27:04.918141 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4","Type":"ContainerDied","Data":"a23379779376add9f8ba28be54166afd7bd04e41bcf899bf7e5acdad5d66214e"}
Mar 09 14:27:04 crc kubenswrapper[4722]: I0309 14:27:04.920305 4722 generic.go:334] "Generic (PLEG): container finished" podID="24650aaa-2d24-4e59-9f9a-40b929e25c10" containerID="89975d701fd58751699513e7e1eafbc6124794762a9b7f960bfecb6450d873bd" exitCode=0
Mar 09 14:27:04 crc kubenswrapper[4722]: I0309 14:27:04.920351 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f5cbf5964-pmtn7" event={"ID":"24650aaa-2d24-4e59-9f9a-40b929e25c10","Type":"ContainerDied","Data":"89975d701fd58751699513e7e1eafbc6124794762a9b7f960bfecb6450d873bd"}
Mar 09 14:27:05 crc kubenswrapper[4722]: I0309 14:27:05.872647 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-9wrb2"]
Mar 09 14:27:05 crc kubenswrapper[4722]: I0309 14:27:05.874160 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-9wrb2"
Mar 09 14:27:05 crc kubenswrapper[4722]: I0309 14:27:05.875952 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff53300e-f89f-4204-82ce-5100fa8b10be-operator-scripts\") pod \"nova-api-db-create-9wrb2\" (UID: \"ff53300e-f89f-4204-82ce-5100fa8b10be\") " pod="openstack/nova-api-db-create-9wrb2"
Mar 09 14:27:05 crc kubenswrapper[4722]: I0309 14:27:05.876007 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fc6k\" (UniqueName: \"kubernetes.io/projected/ff53300e-f89f-4204-82ce-5100fa8b10be-kube-api-access-7fc6k\") pod \"nova-api-db-create-9wrb2\" (UID: \"ff53300e-f89f-4204-82ce-5100fa8b10be\") " pod="openstack/nova-api-db-create-9wrb2"
Mar 09 14:27:05 crc kubenswrapper[4722]: I0309 14:27:05.948010 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-9wrb2"]
Mar 09 14:27:05 crc kubenswrapper[4722]: I0309 14:27:05.982320 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff53300e-f89f-4204-82ce-5100fa8b10be-operator-scripts\") pod \"nova-api-db-create-9wrb2\" (UID: \"ff53300e-f89f-4204-82ce-5100fa8b10be\") " pod="openstack/nova-api-db-create-9wrb2"
Mar 09 14:27:05 crc kubenswrapper[4722]: I0309 14:27:05.982377 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fc6k\" (UniqueName: \"kubernetes.io/projected/ff53300e-f89f-4204-82ce-5100fa8b10be-kube-api-access-7fc6k\") pod \"nova-api-db-create-9wrb2\" (UID: \"ff53300e-f89f-4204-82ce-5100fa8b10be\") " pod="openstack/nova-api-db-create-9wrb2"
Mar 09 14:27:05 crc kubenswrapper[4722]: I0309 14:27:05.983436 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff53300e-f89f-4204-82ce-5100fa8b10be-operator-scripts\") pod \"nova-api-db-create-9wrb2\" (UID: \"ff53300e-f89f-4204-82ce-5100fa8b10be\") " pod="openstack/nova-api-db-create-9wrb2"
Mar 09 14:27:05 crc kubenswrapper[4722]: I0309 14:27:05.984458 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-48rrj"]
Mar 09 14:27:05 crc kubenswrapper[4722]: I0309 14:27:05.986141 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-48rrj"
Mar 09 14:27:06 crc kubenswrapper[4722]: I0309 14:27:06.020547 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-48rrj"]
Mar 09 14:27:06 crc kubenswrapper[4722]: I0309 14:27:06.051947 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fc6k\" (UniqueName: \"kubernetes.io/projected/ff53300e-f89f-4204-82ce-5100fa8b10be-kube-api-access-7fc6k\") pod \"nova-api-db-create-9wrb2\" (UID: \"ff53300e-f89f-4204-82ce-5100fa8b10be\") " pod="openstack/nova-api-db-create-9wrb2"
Mar 09 14:27:06 crc kubenswrapper[4722]: I0309 14:27:06.095393 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22868fd9-09e2-4ad6-b923-ab373da94453-operator-scripts\") pod \"nova-cell0-db-create-48rrj\" (UID: \"22868fd9-09e2-4ad6-b923-ab373da94453\") " pod="openstack/nova-cell0-db-create-48rrj"
Mar 09 14:27:06 crc kubenswrapper[4722]: I0309 14:27:06.095565 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqj85\" (UniqueName: \"kubernetes.io/projected/22868fd9-09e2-4ad6-b923-ab373da94453-kube-api-access-gqj85\") pod \"nova-cell0-db-create-48rrj\" (UID: \"22868fd9-09e2-4ad6-b923-ab373da94453\") " pod="openstack/nova-cell0-db-create-48rrj"
Mar 09 14:27:06 crc kubenswrapper[4722]: I0309 14:27:06.127646 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-b4fe-account-create-update-jrh5p"]
Mar 09 14:27:06 crc kubenswrapper[4722]: I0309 14:27:06.129380 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-b4fe-account-create-update-jrh5p"
Mar 09 14:27:06 crc kubenswrapper[4722]: I0309 14:27:06.132508 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret"
Mar 09 14:27:06 crc kubenswrapper[4722]: I0309 14:27:06.173759 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-b4fe-account-create-update-jrh5p"]
Mar 09 14:27:06 crc kubenswrapper[4722]: I0309 14:27:06.197851 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ce279d8-a769-4e43-89f9-18c598b6f207-operator-scripts\") pod \"nova-api-b4fe-account-create-update-jrh5p\" (UID: \"9ce279d8-a769-4e43-89f9-18c598b6f207\") " pod="openstack/nova-api-b4fe-account-create-update-jrh5p"
Mar 09 14:27:06 crc kubenswrapper[4722]: I0309 14:27:06.197942 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqj85\" (UniqueName: \"kubernetes.io/projected/22868fd9-09e2-4ad6-b923-ab373da94453-kube-api-access-gqj85\") pod \"nova-cell0-db-create-48rrj\" (UID: \"22868fd9-09e2-4ad6-b923-ab373da94453\") " pod="openstack/nova-cell0-db-create-48rrj"
Mar 09 14:27:06 crc kubenswrapper[4722]: I0309 14:27:06.198049 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8rp2\" (UniqueName: \"kubernetes.io/projected/9ce279d8-a769-4e43-89f9-18c598b6f207-kube-api-access-w8rp2\") pod \"nova-api-b4fe-account-create-update-jrh5p\" (UID: \"9ce279d8-a769-4e43-89f9-18c598b6f207\") " pod="openstack/nova-api-b4fe-account-create-update-jrh5p"
Mar 09 14:27:06 crc kubenswrapper[4722]: I0309 14:27:06.198084 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22868fd9-09e2-4ad6-b923-ab373da94453-operator-scripts\") pod \"nova-cell0-db-create-48rrj\" (UID: \"22868fd9-09e2-4ad6-b923-ab373da94453\") " pod="openstack/nova-cell0-db-create-48rrj"
Mar 09 14:27:06 crc kubenswrapper[4722]: I0309 14:27:06.201023 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22868fd9-09e2-4ad6-b923-ab373da94453-operator-scripts\") pod \"nova-cell0-db-create-48rrj\" (UID: \"22868fd9-09e2-4ad6-b923-ab373da94453\") " pod="openstack/nova-cell0-db-create-48rrj"
Mar 09 14:27:06 crc kubenswrapper[4722]: I0309 14:27:06.227026 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqj85\" (UniqueName: \"kubernetes.io/projected/22868fd9-09e2-4ad6-b923-ab373da94453-kube-api-access-gqj85\") pod \"nova-cell0-db-create-48rrj\" (UID: \"22868fd9-09e2-4ad6-b923-ab373da94453\") " pod="openstack/nova-cell0-db-create-48rrj"
Mar 09 14:27:06 crc kubenswrapper[4722]: I0309 14:27:06.229808 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-wtg6v"]
Mar 09 14:27:06 crc kubenswrapper[4722]: I0309 14:27:06.231464 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-wtg6v"
Mar 09 14:27:06 crc kubenswrapper[4722]: I0309 14:27:06.242668 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-wtg6v"]
Mar 09 14:27:06 crc kubenswrapper[4722]: I0309 14:27:06.281708 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-9wrb2"
Mar 09 14:27:06 crc kubenswrapper[4722]: I0309 14:27:06.300587 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8bcd1ab8-574f-4e9e-8b60-058524c8be9f-operator-scripts\") pod \"nova-cell1-db-create-wtg6v\" (UID: \"8bcd1ab8-574f-4e9e-8b60-058524c8be9f\") " pod="openstack/nova-cell1-db-create-wtg6v"
Mar 09 14:27:06 crc kubenswrapper[4722]: I0309 14:27:06.300667 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ce279d8-a769-4e43-89f9-18c598b6f207-operator-scripts\") pod \"nova-api-b4fe-account-create-update-jrh5p\" (UID: \"9ce279d8-a769-4e43-89f9-18c598b6f207\") " pod="openstack/nova-api-b4fe-account-create-update-jrh5p"
Mar 09 14:27:06 crc kubenswrapper[4722]: I0309 14:27:06.300776 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhxl7\" (UniqueName: \"kubernetes.io/projected/8bcd1ab8-574f-4e9e-8b60-058524c8be9f-kube-api-access-hhxl7\") pod \"nova-cell1-db-create-wtg6v\" (UID: \"8bcd1ab8-574f-4e9e-8b60-058524c8be9f\") " pod="openstack/nova-cell1-db-create-wtg6v"
Mar 09 14:27:06 crc kubenswrapper[4722]: I0309 14:27:06.300809 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8rp2\" (UniqueName: \"kubernetes.io/projected/9ce279d8-a769-4e43-89f9-18c598b6f207-kube-api-access-w8rp2\") pod \"nova-api-b4fe-account-create-update-jrh5p\" (UID: \"9ce279d8-a769-4e43-89f9-18c598b6f207\") " pod="openstack/nova-api-b4fe-account-create-update-jrh5p"
Mar 09 14:27:06 crc kubenswrapper[4722]: I0309 14:27:06.301744 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ce279d8-a769-4e43-89f9-18c598b6f207-operator-scripts\") pod \"nova-api-b4fe-account-create-update-jrh5p\" (UID: \"9ce279d8-a769-4e43-89f9-18c598b6f207\") " pod="openstack/nova-api-b4fe-account-create-update-jrh5p"
Mar 09 14:27:06 crc kubenswrapper[4722]: I0309 14:27:06.327433 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-48rrj"
Mar 09 14:27:06 crc kubenswrapper[4722]: I0309 14:27:06.358325 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-489a-account-create-update-z4jfp"]
Mar 09 14:27:06 crc kubenswrapper[4722]: I0309 14:27:06.361999 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-489a-account-create-update-z4jfp"
Mar 09 14:27:06 crc kubenswrapper[4722]: I0309 14:27:06.365634 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret"
Mar 09 14:27:06 crc kubenswrapper[4722]: I0309 14:27:06.417103 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-489a-account-create-update-z4jfp"]
Mar 09 14:27:06 crc kubenswrapper[4722]: I0309 14:27:06.417208 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8rp2\" (UniqueName: \"kubernetes.io/projected/9ce279d8-a769-4e43-89f9-18c598b6f207-kube-api-access-w8rp2\") pod \"nova-api-b4fe-account-create-update-jrh5p\" (UID: \"9ce279d8-a769-4e43-89f9-18c598b6f207\") " pod="openstack/nova-api-b4fe-account-create-update-jrh5p"
Mar 09 14:27:06 crc kubenswrapper[4722]: I0309 14:27:06.426960 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3bebeafd-3acc-450b-85b1-145cb598ac05-operator-scripts\") pod \"nova-cell0-489a-account-create-update-z4jfp\" (UID: \"3bebeafd-3acc-450b-85b1-145cb598ac05\") " pod="openstack/nova-cell0-489a-account-create-update-z4jfp"
Mar 09 14:27:06 crc kubenswrapper[4722]: I0309 14:27:06.427166 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8bcd1ab8-574f-4e9e-8b60-058524c8be9f-operator-scripts\") pod \"nova-cell1-db-create-wtg6v\" (UID: \"8bcd1ab8-574f-4e9e-8b60-058524c8be9f\") " pod="openstack/nova-cell1-db-create-wtg6v"
Mar 09 14:27:06 crc kubenswrapper[4722]: I0309 14:27:06.427280 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vt7l\" (UniqueName: \"kubernetes.io/projected/3bebeafd-3acc-450b-85b1-145cb598ac05-kube-api-access-8vt7l\") pod \"nova-cell0-489a-account-create-update-z4jfp\" (UID: \"3bebeafd-3acc-450b-85b1-145cb598ac05\") " pod="openstack/nova-cell0-489a-account-create-update-z4jfp"
Mar 09 14:27:06 crc kubenswrapper[4722]: I0309 14:27:06.427948 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhxl7\" (UniqueName: \"kubernetes.io/projected/8bcd1ab8-574f-4e9e-8b60-058524c8be9f-kube-api-access-hhxl7\") pod \"nova-cell1-db-create-wtg6v\" (UID: \"8bcd1ab8-574f-4e9e-8b60-058524c8be9f\") " pod="openstack/nova-cell1-db-create-wtg6v"
Mar 09 14:27:06 crc kubenswrapper[4722]: I0309 14:27:06.432708 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8bcd1ab8-574f-4e9e-8b60-058524c8be9f-operator-scripts\") pod \"nova-cell1-db-create-wtg6v\" (UID: \"8bcd1ab8-574f-4e9e-8b60-058524c8be9f\") " pod="openstack/nova-cell1-db-create-wtg6v"
Mar 09 14:27:06 crc kubenswrapper[4722]: I0309 14:27:06.469078 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-b4fe-account-create-update-jrh5p"
Mar 09 14:27:06 crc kubenswrapper[4722]: I0309 14:27:06.472287 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 09 14:27:06 crc kubenswrapper[4722]: I0309 14:27:06.472544 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="905e426d-8ef1-442e-abb2-69905e8fc61a" containerName="glance-log" containerID="cri-o://2b782e1d01d6f1e5d883408b56654cb190f65dda28ade9acdcc2b519429bd927" gracePeriod=30
Mar 09 14:27:06 crc kubenswrapper[4722]: I0309 14:27:06.473076 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="905e426d-8ef1-442e-abb2-69905e8fc61a" containerName="glance-httpd" containerID="cri-o://9fdfba0d1ad4b491c62ca71f25534c103428e138f08244b726136f8e29e87ad2" gracePeriod=30
Mar 09 14:27:06 crc kubenswrapper[4722]: I0309 14:27:06.479931 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhxl7\" (UniqueName: \"kubernetes.io/projected/8bcd1ab8-574f-4e9e-8b60-058524c8be9f-kube-api-access-hhxl7\") pod \"nova-cell1-db-create-wtg6v\" (UID: \"8bcd1ab8-574f-4e9e-8b60-058524c8be9f\") " pod="openstack/nova-cell1-db-create-wtg6v"
Mar 09 14:27:06 crc kubenswrapper[4722]: I0309 14:27:06.503289 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-25f0-account-create-update-cmbgw"]
Mar 09 14:27:06 crc kubenswrapper[4722]: I0309 14:27:06.505152 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-25f0-account-create-update-cmbgw"
Mar 09 14:27:06 crc kubenswrapper[4722]: I0309 14:27:06.523129 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret"
Mar 09 14:27:06 crc kubenswrapper[4722]: I0309 14:27:06.529786 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vt7l\" (UniqueName: \"kubernetes.io/projected/3bebeafd-3acc-450b-85b1-145cb598ac05-kube-api-access-8vt7l\") pod \"nova-cell0-489a-account-create-update-z4jfp\" (UID: \"3bebeafd-3acc-450b-85b1-145cb598ac05\") " pod="openstack/nova-cell0-489a-account-create-update-z4jfp"
Mar 09 14:27:06 crc kubenswrapper[4722]: I0309 14:27:06.529886 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ql2qk\" (UniqueName: \"kubernetes.io/projected/87133d68-d972-4b38-a6b9-f88733004c17-kube-api-access-ql2qk\") pod \"nova-cell1-25f0-account-create-update-cmbgw\" (UID: \"87133d68-d972-4b38-a6b9-f88733004c17\") " pod="openstack/nova-cell1-25f0-account-create-update-cmbgw"
Mar 09 14:27:06 crc kubenswrapper[4722]: I0309 14:27:06.529934 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87133d68-d972-4b38-a6b9-f88733004c17-operator-scripts\") pod \"nova-cell1-25f0-account-create-update-cmbgw\" (UID: \"87133d68-d972-4b38-a6b9-f88733004c17\") " pod="openstack/nova-cell1-25f0-account-create-update-cmbgw"
Mar 09 14:27:06 crc kubenswrapper[4722]: I0309 14:27:06.530062 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3bebeafd-3acc-450b-85b1-145cb598ac05-operator-scripts\") pod \"nova-cell0-489a-account-create-update-z4jfp\" (UID: \"3bebeafd-3acc-450b-85b1-145cb598ac05\") " pod="openstack/nova-cell0-489a-account-create-update-z4jfp"
Mar 09 14:27:06 crc kubenswrapper[4722]: I0309 14:27:06.530752 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3bebeafd-3acc-450b-85b1-145cb598ac05-operator-scripts\") pod \"nova-cell0-489a-account-create-update-z4jfp\" (UID: \"3bebeafd-3acc-450b-85b1-145cb598ac05\") " pod="openstack/nova-cell0-489a-account-create-update-z4jfp"
Mar 09 14:27:06 crc kubenswrapper[4722]: I0309 14:27:06.547676 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-25f0-account-create-update-cmbgw"]
Mar 09 14:27:06 crc kubenswrapper[4722]: I0309 14:27:06.580835 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vt7l\" (UniqueName: \"kubernetes.io/projected/3bebeafd-3acc-450b-85b1-145cb598ac05-kube-api-access-8vt7l\") pod \"nova-cell0-489a-account-create-update-z4jfp\" (UID: \"3bebeafd-3acc-450b-85b1-145cb598ac05\") " pod="openstack/nova-cell0-489a-account-create-update-z4jfp"
Mar 09 14:27:06 crc kubenswrapper[4722]: I0309 14:27:06.613833 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-wtg6v"
Mar 09 14:27:06 crc kubenswrapper[4722]: I0309 14:27:06.630958 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ql2qk\" (UniqueName: \"kubernetes.io/projected/87133d68-d972-4b38-a6b9-f88733004c17-kube-api-access-ql2qk\") pod \"nova-cell1-25f0-account-create-update-cmbgw\" (UID: \"87133d68-d972-4b38-a6b9-f88733004c17\") " pod="openstack/nova-cell1-25f0-account-create-update-cmbgw"
Mar 09 14:27:06 crc kubenswrapper[4722]: I0309 14:27:06.631026 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87133d68-d972-4b38-a6b9-f88733004c17-operator-scripts\") pod \"nova-cell1-25f0-account-create-update-cmbgw\" (UID: \"87133d68-d972-4b38-a6b9-f88733004c17\") " pod="openstack/nova-cell1-25f0-account-create-update-cmbgw"
Mar 09 14:27:06 crc kubenswrapper[4722]: I0309 14:27:06.631809 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87133d68-d972-4b38-a6b9-f88733004c17-operator-scripts\") pod \"nova-cell1-25f0-account-create-update-cmbgw\" (UID: \"87133d68-d972-4b38-a6b9-f88733004c17\") " pod="openstack/nova-cell1-25f0-account-create-update-cmbgw"
Mar 09 14:27:06 crc kubenswrapper[4722]: I0309 14:27:06.657843 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ql2qk\" (UniqueName: \"kubernetes.io/projected/87133d68-d972-4b38-a6b9-f88733004c17-kube-api-access-ql2qk\") pod \"nova-cell1-25f0-account-create-update-cmbgw\" (UID: \"87133d68-d972-4b38-a6b9-f88733004c17\") " pod="openstack/nova-cell1-25f0-account-create-update-cmbgw"
Mar 09 14:27:06 crc kubenswrapper[4722]: I0309 14:27:06.746147 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-489a-account-create-update-z4jfp"
Mar 09 14:27:06 crc kubenswrapper[4722]: I0309 14:27:06.853285 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-5b85b5677d-6lmg9"]
Mar 09 14:27:06 crc kubenswrapper[4722]: I0309 14:27:06.854788 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5b85b5677d-6lmg9"
Mar 09 14:27:06 crc kubenswrapper[4722]: I0309 14:27:06.875284 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-5f6f45fb9b-nw8cq"]
Mar 09 14:27:06 crc kubenswrapper[4722]: I0309 14:27:06.876835 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-5f6f45fb9b-nw8cq"
Mar 09 14:27:06 crc kubenswrapper[4722]: I0309 14:27:06.893935 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-678fb995f7-q5bfj"]
Mar 09 14:27:06 crc kubenswrapper[4722]: I0309 14:27:06.895901 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-678fb995f7-q5bfj"
Mar 09 14:27:06 crc kubenswrapper[4722]: I0309 14:27:06.923022 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-25f0-account-create-update-cmbgw"
Mar 09 14:27:06 crc kubenswrapper[4722]: I0309 14:27:06.926340 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-5b85b5677d-6lmg9"]
Mar 09 14:27:06 crc kubenswrapper[4722]: I0309 14:27:06.970418 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-5f6f45fb9b-nw8cq"]
Mar 09 14:27:06 crc kubenswrapper[4722]: I0309 14:27:06.974486 4722 generic.go:334] "Generic (PLEG): container finished" podID="905e426d-8ef1-442e-abb2-69905e8fc61a" containerID="2b782e1d01d6f1e5d883408b56654cb190f65dda28ade9acdcc2b519429bd927" exitCode=143
Mar 09 14:27:06 crc kubenswrapper[4722]: I0309 14:27:06.974525 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"905e426d-8ef1-442e-abb2-69905e8fc61a","Type":"ContainerDied","Data":"2b782e1d01d6f1e5d883408b56654cb190f65dda28ade9acdcc2b519429bd927"}
Mar 09 14:27:07 crc kubenswrapper[4722]: I0309 14:27:07.034593 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-678fb995f7-q5bfj"]
Mar 09 14:27:07 crc kubenswrapper[4722]: I0309 14:27:07.037412 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfb53424-4444-464a-9be1-97e2e095c496-config-data\") pod \"heat-engine-678fb995f7-q5bfj\" (UID: \"dfb53424-4444-464a-9be1-97e2e095c496\") " pod="openstack/heat-engine-678fb995f7-q5bfj"
Mar 09 14:27:07 crc kubenswrapper[4722]: I0309 14:27:07.037468 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfb53424-4444-464a-9be1-97e2e095c496-combined-ca-bundle\") pod \"heat-engine-678fb995f7-q5bfj\" (UID: \"dfb53424-4444-464a-9be1-97e2e095c496\") " pod="openstack/heat-engine-678fb995f7-q5bfj"
Mar 09 14:27:07 crc kubenswrapper[4722]: I0309 14:27:07.037516 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pt2x\" (UniqueName: \"kubernetes.io/projected/dfb53424-4444-464a-9be1-97e2e095c496-kube-api-access-4pt2x\") pod \"heat-engine-678fb995f7-q5bfj\" (UID: \"dfb53424-4444-464a-9be1-97e2e095c496\") " pod="openstack/heat-engine-678fb995f7-q5bfj"
Mar 09 14:27:07 crc kubenswrapper[4722]: I0309 14:27:07.037544 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fe8dbea-cdd7-4bda-b5f2-717a0a30d116-config-data\") pod \"heat-cfnapi-5f6f45fb9b-nw8cq\" (UID: \"6fe8dbea-cdd7-4bda-b5f2-717a0a30d116\") " pod="openstack/heat-cfnapi-5f6f45fb9b-nw8cq"
Mar 09 14:27:07 crc kubenswrapper[4722]: I0309 14:27:07.037575 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d6c4171f-11a7-47bf-900b-55fa05f03f49-config-data-custom\") pod \"heat-api-5b85b5677d-6lmg9\" (UID: \"d6c4171f-11a7-47bf-900b-55fa05f03f49\") " pod="openstack/heat-api-5b85b5677d-6lmg9"
Mar 09 14:27:07 crc kubenswrapper[4722]: I0309 14:27:07.037614 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dfb53424-4444-464a-9be1-97e2e095c496-config-data-custom\") pod \"heat-engine-678fb995f7-q5bfj\" (UID: \"dfb53424-4444-464a-9be1-97e2e095c496\") " pod="openstack/heat-engine-678fb995f7-q5bfj"
Mar 09 14:27:07 crc kubenswrapper[4722]: I0309 14:27:07.037640 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crjv4\" (UniqueName: \"kubernetes.io/projected/d6c4171f-11a7-47bf-900b-55fa05f03f49-kube-api-access-crjv4\") pod \"heat-api-5b85b5677d-6lmg9\" (UID: \"d6c4171f-11a7-47bf-900b-55fa05f03f49\") " pod="openstack/heat-api-5b85b5677d-6lmg9"
Mar 09 14:27:07 crc kubenswrapper[4722]: I0309 14:27:07.037876 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fe8dbea-cdd7-4bda-b5f2-717a0a30d116-combined-ca-bundle\") pod \"heat-cfnapi-5f6f45fb9b-nw8cq\" (UID: \"6fe8dbea-cdd7-4bda-b5f2-717a0a30d116\") " pod="openstack/heat-cfnapi-5f6f45fb9b-nw8cq"
Mar 09 14:27:07 crc kubenswrapper[4722]: I0309 14:27:07.037937 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvvn9\" (UniqueName: \"kubernetes.io/projected/6fe8dbea-cdd7-4bda-b5f2-717a0a30d116-kube-api-access-bvvn9\") pod \"heat-cfnapi-5f6f45fb9b-nw8cq\" (UID: \"6fe8dbea-cdd7-4bda-b5f2-717a0a30d116\") " pod="openstack/heat-cfnapi-5f6f45fb9b-nw8cq"
Mar 09 14:27:07 crc kubenswrapper[4722]: I0309 14:27:07.038027 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6c4171f-11a7-47bf-900b-55fa05f03f49-config-data\") pod \"heat-api-5b85b5677d-6lmg9\" (UID: \"d6c4171f-11a7-47bf-900b-55fa05f03f49\") " pod="openstack/heat-api-5b85b5677d-6lmg9"
Mar 09 14:27:07 crc kubenswrapper[4722]: I0309 14:27:07.038092 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6c4171f-11a7-47bf-900b-55fa05f03f49-combined-ca-bundle\") pod \"heat-api-5b85b5677d-6lmg9\" (UID: \"d6c4171f-11a7-47bf-900b-55fa05f03f49\") " pod="openstack/heat-api-5b85b5677d-6lmg9"
Mar 09 14:27:07 crc kubenswrapper[4722]: I0309 14:27:07.038168 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6fe8dbea-cdd7-4bda-b5f2-717a0a30d116-config-data-custom\") pod \"heat-cfnapi-5f6f45fb9b-nw8cq\" (UID: \"6fe8dbea-cdd7-4bda-b5f2-717a0a30d116\") " pod="openstack/heat-cfnapi-5f6f45fb9b-nw8cq"
Mar 09 14:27:07 crc kubenswrapper[4722]: I0309 14:27:07.140112 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pt2x\" (UniqueName: \"kubernetes.io/projected/dfb53424-4444-464a-9be1-97e2e095c496-kube-api-access-4pt2x\") pod \"heat-engine-678fb995f7-q5bfj\" (UID: \"dfb53424-4444-464a-9be1-97e2e095c496\") " pod="openstack/heat-engine-678fb995f7-q5bfj"
Mar 09 14:27:07 crc kubenswrapper[4722]: I0309 14:27:07.140212 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fe8dbea-cdd7-4bda-b5f2-717a0a30d116-config-data\") pod \"heat-cfnapi-5f6f45fb9b-nw8cq\" (UID: \"6fe8dbea-cdd7-4bda-b5f2-717a0a30d116\") " pod="openstack/heat-cfnapi-5f6f45fb9b-nw8cq"
Mar 09 14:27:07 crc kubenswrapper[4722]: I0309 14:27:07.140342 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d6c4171f-11a7-47bf-900b-55fa05f03f49-config-data-custom\") pod \"heat-api-5b85b5677d-6lmg9\" (UID: \"d6c4171f-11a7-47bf-900b-55fa05f03f49\") " pod="openstack/heat-api-5b85b5677d-6lmg9"
Mar 09 14:27:07 crc kubenswrapper[4722]: I0309 14:27:07.140386 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dfb53424-4444-464a-9be1-97e2e095c496-config-data-custom\") pod \"heat-engine-678fb995f7-q5bfj\" (UID: \"dfb53424-4444-464a-9be1-97e2e095c496\") " pod="openstack/heat-engine-678fb995f7-q5bfj"
Mar 09 14:27:07 crc kubenswrapper[4722]: I0309 14:27:07.140416 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crjv4\" (UniqueName: \"kubernetes.io/projected/d6c4171f-11a7-47bf-900b-55fa05f03f49-kube-api-access-crjv4\") pod \"heat-api-5b85b5677d-6lmg9\" (UID: \"d6c4171f-11a7-47bf-900b-55fa05f03f49\") " pod="openstack/heat-api-5b85b5677d-6lmg9"
Mar 09 14:27:07 crc kubenswrapper[4722]: I0309 14:27:07.140497 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fe8dbea-cdd7-4bda-b5f2-717a0a30d116-combined-ca-bundle\") pod \"heat-cfnapi-5f6f45fb9b-nw8cq\" (UID: \"6fe8dbea-cdd7-4bda-b5f2-717a0a30d116\") " pod="openstack/heat-cfnapi-5f6f45fb9b-nw8cq"
Mar 09 14:27:07 crc kubenswrapper[4722]: I0309 14:27:07.140521 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvvn9\" (UniqueName: \"kubernetes.io/projected/6fe8dbea-cdd7-4bda-b5f2-717a0a30d116-kube-api-access-bvvn9\") pod \"heat-cfnapi-5f6f45fb9b-nw8cq\" (UID: \"6fe8dbea-cdd7-4bda-b5f2-717a0a30d116\") " pod="openstack/heat-cfnapi-5f6f45fb9b-nw8cq"
Mar 09 14:27:07 crc kubenswrapper[4722]: I0309 14:27:07.140547 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6c4171f-11a7-47bf-900b-55fa05f03f49-config-data\") pod \"heat-api-5b85b5677d-6lmg9\" (UID: \"d6c4171f-11a7-47bf-900b-55fa05f03f49\") " pod="openstack/heat-api-5b85b5677d-6lmg9"
Mar 09 14:27:07 crc kubenswrapper[4722]: I0309 14:27:07.140569 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6c4171f-11a7-47bf-900b-55fa05f03f49-combined-ca-bundle\") pod \"heat-api-5b85b5677d-6lmg9\" (UID: \"d6c4171f-11a7-47bf-900b-55fa05f03f49\") " pod="openstack/heat-api-5b85b5677d-6lmg9"
Mar 09 14:27:07 crc kubenswrapper[4722]: I0309 14:27:07.140594 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6fe8dbea-cdd7-4bda-b5f2-717a0a30d116-config-data-custom\") pod \"heat-cfnapi-5f6f45fb9b-nw8cq\" (UID: \"6fe8dbea-cdd7-4bda-b5f2-717a0a30d116\") " pod="openstack/heat-cfnapi-5f6f45fb9b-nw8cq"
Mar 09 14:27:07 crc kubenswrapper[4722]: I0309 14:27:07.140627 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfb53424-4444-464a-9be1-97e2e095c496-config-data\") pod \"heat-engine-678fb995f7-q5bfj\" (UID: \"dfb53424-4444-464a-9be1-97e2e095c496\") " pod="openstack/heat-engine-678fb995f7-q5bfj"
Mar 09 14:27:07 crc kubenswrapper[4722]: I0309 14:27:07.140654 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfb53424-4444-464a-9be1-97e2e095c496-combined-ca-bundle\") pod \"heat-engine-678fb995f7-q5bfj\" (UID: \"dfb53424-4444-464a-9be1-97e2e095c496\") " pod="openstack/heat-engine-678fb995f7-q5bfj"
Mar 09 14:27:07 crc kubenswrapper[4722]: I0309 14:27:07.145940 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6fe8dbea-cdd7-4bda-b5f2-717a0a30d116-config-data-custom\") pod \"heat-cfnapi-5f6f45fb9b-nw8cq\" (UID: \"6fe8dbea-cdd7-4bda-b5f2-717a0a30d116\") " pod="openstack/heat-cfnapi-5f6f45fb9b-nw8cq"
Mar 09 14:27:07 crc kubenswrapper[4722]: I0309 14:27:07.147947 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fe8dbea-cdd7-4bda-b5f2-717a0a30d116-config-data\") pod \"heat-cfnapi-5f6f45fb9b-nw8cq\" (UID: \"6fe8dbea-cdd7-4bda-b5f2-717a0a30d116\") " pod="openstack/heat-cfnapi-5f6f45fb9b-nw8cq"
Mar 09 14:27:07 crc kubenswrapper[4722]: I0309 14:27:07.148114 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6c4171f-11a7-47bf-900b-55fa05f03f49-combined-ca-bundle\") pod \"heat-api-5b85b5677d-6lmg9\" (UID: \"d6c4171f-11a7-47bf-900b-55fa05f03f49\") " pod="openstack/heat-api-5b85b5677d-6lmg9"
Mar 09 14:27:07 crc kubenswrapper[4722]: I0309 14:27:07.148257 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6c4171f-11a7-47bf-900b-55fa05f03f49-config-data\") pod \"heat-api-5b85b5677d-6lmg9\" (UID: \"d6c4171f-11a7-47bf-900b-55fa05f03f49\") " pod="openstack/heat-api-5b85b5677d-6lmg9"
Mar 09 14:27:07 crc kubenswrapper[4722]: I0309 14:27:07.149905 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfb53424-4444-464a-9be1-97e2e095c496-config-data\") pod \"heat-engine-678fb995f7-q5bfj\" (UID: \"dfb53424-4444-464a-9be1-97e2e095c496\") " pod="openstack/heat-engine-678fb995f7-q5bfj"
Mar 09 14:27:07 crc kubenswrapper[4722]: I0309 14:27:07.150220 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d6c4171f-11a7-47bf-900b-55fa05f03f49-config-data-custom\") pod \"heat-api-5b85b5677d-6lmg9\" (UID: \"d6c4171f-11a7-47bf-900b-55fa05f03f49\") " pod="openstack/heat-api-5b85b5677d-6lmg9"
Mar 09 14:27:07 crc kubenswrapper[4722]: I0309 14:27:07.150833 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfb53424-4444-464a-9be1-97e2e095c496-combined-ca-bundle\") pod \"heat-engine-678fb995f7-q5bfj\" (UID: \"dfb53424-4444-464a-9be1-97e2e095c496\") " pod="openstack/heat-engine-678fb995f7-q5bfj"
Mar 09 14:27:07 crc kubenswrapper[4722]: I0309 14:27:07.151482 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dfb53424-4444-464a-9be1-97e2e095c496-config-data-custom\") pod \"heat-engine-678fb995f7-q5bfj\" (UID: \"dfb53424-4444-464a-9be1-97e2e095c496\") " pod="openstack/heat-engine-678fb995f7-q5bfj"
Mar 09 14:27:07 crc kubenswrapper[4722]: I0309 14:27:07.158865 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fe8dbea-cdd7-4bda-b5f2-717a0a30d116-combined-ca-bundle\") pod \"heat-cfnapi-5f6f45fb9b-nw8cq\" (UID: \"6fe8dbea-cdd7-4bda-b5f2-717a0a30d116\") " pod="openstack/heat-cfnapi-5f6f45fb9b-nw8cq"
Mar 09 14:27:07 crc kubenswrapper[4722]: I0309 14:27:07.159111 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pt2x\" (UniqueName: \"kubernetes.io/projected/dfb53424-4444-464a-9be1-97e2e095c496-kube-api-access-4pt2x\") pod \"heat-engine-678fb995f7-q5bfj\" (UID: \"dfb53424-4444-464a-9be1-97e2e095c496\") " pod="openstack/heat-engine-678fb995f7-q5bfj"
Mar 09 14:27:07 crc kubenswrapper[4722]: I0309 14:27:07.168022 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crjv4\" (UniqueName: \"kubernetes.io/projected/d6c4171f-11a7-47bf-900b-55fa05f03f49-kube-api-access-crjv4\") pod \"heat-api-5b85b5677d-6lmg9\" (UID: \"d6c4171f-11a7-47bf-900b-55fa05f03f49\") " pod="openstack/heat-api-5b85b5677d-6lmg9"
Mar 09 14:27:07 crc kubenswrapper[4722]: I0309 14:27:07.168060 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvvn9\" (UniqueName: \"kubernetes.io/projected/6fe8dbea-cdd7-4bda-b5f2-717a0a30d116-kube-api-access-bvvn9\") pod \"heat-cfnapi-5f6f45fb9b-nw8cq\" (UID: \"6fe8dbea-cdd7-4bda-b5f2-717a0a30d116\") " pod="openstack/heat-cfnapi-5f6f45fb9b-nw8cq"
Mar 09 14:27:07 crc kubenswrapper[4722]: I0309 14:27:07.200120 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5b85b5677d-6lmg9"
Mar 09 14:27:07 crc kubenswrapper[4722]: I0309 14:27:07.213044 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-5f6f45fb9b-nw8cq"
Mar 09 14:27:07 crc kubenswrapper[4722]: I0309 14:27:07.224837 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-678fb995f7-q5bfj"
Mar 09 14:27:07 crc kubenswrapper[4722]: I0309 14:27:07.999167 4722 generic.go:334] "Generic (PLEG): container finished" podID="f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4" containerID="7d51fe379c52b16a7a6d962b827b1c07be67bbcd90aa939d56cebc2eb739f4c7" exitCode=0
Mar 09 14:27:07 crc kubenswrapper[4722]: I0309 14:27:07.999224 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4","Type":"ContainerDied","Data":"7d51fe379c52b16a7a6d962b827b1c07be67bbcd90aa939d56cebc2eb739f4c7"}
Mar 09 14:27:08 crc kubenswrapper[4722]: I0309 14:27:08.625579 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-5ffdf96bcf-klbzp"]
Mar 09 14:27:08 crc kubenswrapper[4722]: I0309 14:27:08.666192 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-58665ff9-7zt7s"]
Mar 09 14:27:08 crc kubenswrapper[4722]: I0309 14:27:08.687187 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-6957fcb6b8-8jmt4"]
Mar 09 14:27:08 crc kubenswrapper[4722]: I0309 14:27:08.688857 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-6957fcb6b8-8jmt4"
Mar 09 14:27:08 crc kubenswrapper[4722]: I0309 14:27:08.698608 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-public-svc"
Mar 09 14:27:08 crc kubenswrapper[4722]: I0309 14:27:08.698840 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-internal-svc"
Mar 09 14:27:08 crc kubenswrapper[4722]: I0309 14:27:08.710726 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-6957fcb6b8-8jmt4"]
Mar 09 14:27:08 crc kubenswrapper[4722]: I0309 14:27:08.737089 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-5c766df7b4-znr5j"]
Mar 09 14:27:08 crc kubenswrapper[4722]: I0309 14:27:08.738881 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-5c766df7b4-znr5j"
Mar 09 14:27:08 crc kubenswrapper[4722]: I0309 14:27:08.748514 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-internal-svc"
Mar 09 14:27:08 crc kubenswrapper[4722]: I0309 14:27:08.748616 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-public-svc"
Mar 09 14:27:08 crc kubenswrapper[4722]: I0309 14:27:08.752368 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-5c766df7b4-znr5j"]
Mar 09 14:27:08 crc kubenswrapper[4722]: I0309 14:27:08.857771 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-77cd88d8c5-gml97"
Mar 09 14:27:08 crc kubenswrapper[4722]: I0309 14:27:08.860897 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-77cd88d8c5-gml97"
Mar 09 14:27:08 crc kubenswrapper[4722]: I0309 14:27:08.884386 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cd4ad6c-298e-47c9-8648-3cfa1f407aad-internal-tls-certs\") pod \"heat-api-6957fcb6b8-8jmt4\" (UID: \"9cd4ad6c-298e-47c9-8648-3cfa1f407aad\") " pod="openstack/heat-api-6957fcb6b8-8jmt4"
Mar 09 14:27:08 crc kubenswrapper[4722]: I0309 14:27:08.884443 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9213500f-da49-496b-b99e-1ec95658b48f-combined-ca-bundle\") pod \"heat-cfnapi-5c766df7b4-znr5j\" (UID: \"9213500f-da49-496b-b99e-1ec95658b48f\") " pod="openstack/heat-cfnapi-5c766df7b4-znr5j"
Mar 09 14:27:08 crc kubenswrapper[4722]: I0309 14:27:08.885735 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cd4ad6c-298e-47c9-8648-3cfa1f407aad-config-data\") pod \"heat-api-6957fcb6b8-8jmt4\" (UID: \"9cd4ad6c-298e-47c9-8648-3cfa1f407aad\") " pod="openstack/heat-api-6957fcb6b8-8jmt4"
Mar 09 14:27:08 crc kubenswrapper[4722]: I0309 14:27:08.885816 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbwqz\" (UniqueName: \"kubernetes.io/projected/9213500f-da49-496b-b99e-1ec95658b48f-kube-api-access-wbwqz\") pod \"heat-cfnapi-5c766df7b4-znr5j\" (UID: \"9213500f-da49-496b-b99e-1ec95658b48f\") " pod="openstack/heat-cfnapi-5c766df7b4-znr5j"
Mar 09 14:27:08 crc kubenswrapper[4722]: I0309 14:27:08.886022 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9213500f-da49-496b-b99e-1ec95658b48f-config-data\") pod \"heat-cfnapi-5c766df7b4-znr5j\" (UID: \"9213500f-da49-496b-b99e-1ec95658b48f\") " pod="openstack/heat-cfnapi-5c766df7b4-znr5j"
Mar 09 14:27:08 crc kubenswrapper[4722]: I0309 14:27:08.886085 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9cd4ad6c-298e-47c9-8648-3cfa1f407aad-config-data-custom\") pod \"heat-api-6957fcb6b8-8jmt4\" (UID: \"9cd4ad6c-298e-47c9-8648-3cfa1f407aad\") " pod="openstack/heat-api-6957fcb6b8-8jmt4"
Mar 09 14:27:08 crc kubenswrapper[4722]: I0309 14:27:08.886216 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cd4ad6c-298e-47c9-8648-3cfa1f407aad-combined-ca-bundle\") pod \"heat-api-6957fcb6b8-8jmt4\" (UID: \"9cd4ad6c-298e-47c9-8648-3cfa1f407aad\") " pod="openstack/heat-api-6957fcb6b8-8jmt4"
Mar 09 14:27:08 crc kubenswrapper[4722]: I0309 14:27:08.886416 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cd4ad6c-298e-47c9-8648-3cfa1f407aad-public-tls-certs\") pod \"heat-api-6957fcb6b8-8jmt4\" (UID: \"9cd4ad6c-298e-47c9-8648-3cfa1f407aad\") " pod="openstack/heat-api-6957fcb6b8-8jmt4"
Mar 09 14:27:08 crc kubenswrapper[4722]: I0309 14:27:08.886505 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9213500f-da49-496b-b99e-1ec95658b48f-config-data-custom\") pod \"heat-cfnapi-5c766df7b4-znr5j\" (UID: \"9213500f-da49-496b-b99e-1ec95658b48f\") " pod="openstack/heat-cfnapi-5c766df7b4-znr5j"
Mar 09 14:27:08 crc kubenswrapper[4722]: I0309 14:27:08.886600 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9213500f-da49-496b-b99e-1ec95658b48f-internal-tls-certs\") pod \"heat-cfnapi-5c766df7b4-znr5j\" (UID: \"9213500f-da49-496b-b99e-1ec95658b48f\") " pod="openstack/heat-cfnapi-5c766df7b4-znr5j"
Mar 09 14:27:08 crc kubenswrapper[4722]: I0309 14:27:08.886665 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9213500f-da49-496b-b99e-1ec95658b48f-public-tls-certs\") pod \"heat-cfnapi-5c766df7b4-znr5j\" (UID: \"9213500f-da49-496b-b99e-1ec95658b48f\") " pod="openstack/heat-cfnapi-5c766df7b4-znr5j"
Mar 09 14:27:08 crc kubenswrapper[4722]: I0309 14:27:08.886697 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6j9b\" (UniqueName: \"kubernetes.io/projected/9cd4ad6c-298e-47c9-8648-3cfa1f407aad-kube-api-access-p6j9b\") pod \"heat-api-6957fcb6b8-8jmt4\" (UID: \"9cd4ad6c-298e-47c9-8648-3cfa1f407aad\") " pod="openstack/heat-api-6957fcb6b8-8jmt4"
Mar 09 14:27:08 crc kubenswrapper[4722]: I0309 14:27:08.987601 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9213500f-da49-496b-b99e-1ec95658b48f-config-data-custom\") pod \"heat-cfnapi-5c766df7b4-znr5j\" (UID: \"9213500f-da49-496b-b99e-1ec95658b48f\") " pod="openstack/heat-cfnapi-5c766df7b4-znr5j"
Mar 09 14:27:08 crc kubenswrapper[4722]: I0309 14:27:08.987661 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9213500f-da49-496b-b99e-1ec95658b48f-internal-tls-certs\") pod \"heat-cfnapi-5c766df7b4-znr5j\" (UID: \"9213500f-da49-496b-b99e-1ec95658b48f\") " pod="openstack/heat-cfnapi-5c766df7b4-znr5j"
Mar 09 14:27:08 crc kubenswrapper[4722]: I0309 14:27:08.987691 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9213500f-da49-496b-b99e-1ec95658b48f-public-tls-certs\") pod \"heat-cfnapi-5c766df7b4-znr5j\" (UID: \"9213500f-da49-496b-b99e-1ec95658b48f\") " pod="openstack/heat-cfnapi-5c766df7b4-znr5j"
Mar 09 14:27:08 crc kubenswrapper[4722]: I0309 14:27:08.987711 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6j9b\" (UniqueName: \"kubernetes.io/projected/9cd4ad6c-298e-47c9-8648-3cfa1f407aad-kube-api-access-p6j9b\") pod \"heat-api-6957fcb6b8-8jmt4\" (UID: \"9cd4ad6c-298e-47c9-8648-3cfa1f407aad\") " pod="openstack/heat-api-6957fcb6b8-8jmt4"
Mar 09 14:27:08 crc kubenswrapper[4722]: I0309 14:27:08.987743 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cd4ad6c-298e-47c9-8648-3cfa1f407aad-internal-tls-certs\") pod \"heat-api-6957fcb6b8-8jmt4\" (UID: \"9cd4ad6c-298e-47c9-8648-3cfa1f407aad\") " pod="openstack/heat-api-6957fcb6b8-8jmt4"
Mar 09 14:27:08 crc kubenswrapper[4722]: I0309 14:27:08.987762 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9213500f-da49-496b-b99e-1ec95658b48f-combined-ca-bundle\") pod \"heat-cfnapi-5c766df7b4-znr5j\" (UID: \"9213500f-da49-496b-b99e-1ec95658b48f\") " pod="openstack/heat-cfnapi-5c766df7b4-znr5j"
Mar 09 14:27:08 crc kubenswrapper[4722]: I0309 14:27:08.987782 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cd4ad6c-298e-47c9-8648-3cfa1f407aad-config-data\") pod \"heat-api-6957fcb6b8-8jmt4\" (UID: \"9cd4ad6c-298e-47c9-8648-3cfa1f407aad\") " pod="openstack/heat-api-6957fcb6b8-8jmt4"
Mar 09 14:27:08 crc kubenswrapper[4722]: I0309 14:27:08.987802 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbwqz\" (UniqueName: \"kubernetes.io/projected/9213500f-da49-496b-b99e-1ec95658b48f-kube-api-access-wbwqz\") pod \"heat-cfnapi-5c766df7b4-znr5j\" (UID: \"9213500f-da49-496b-b99e-1ec95658b48f\") " pod="openstack/heat-cfnapi-5c766df7b4-znr5j"
Mar 09 14:27:08 crc kubenswrapper[4722]: I0309 14:27:08.987853 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9213500f-da49-496b-b99e-1ec95658b48f-config-data\") pod \"heat-cfnapi-5c766df7b4-znr5j\" (UID: \"9213500f-da49-496b-b99e-1ec95658b48f\") " pod="openstack/heat-cfnapi-5c766df7b4-znr5j"
Mar 09 14:27:08 crc kubenswrapper[4722]: I0309 14:27:08.987876 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9cd4ad6c-298e-47c9-8648-3cfa1f407aad-config-data-custom\") pod \"heat-api-6957fcb6b8-8jmt4\" (UID: \"9cd4ad6c-298e-47c9-8648-3cfa1f407aad\") " pod="openstack/heat-api-6957fcb6b8-8jmt4"
Mar 09 14:27:08 crc kubenswrapper[4722]: I0309 14:27:08.987917 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cd4ad6c-298e-47c9-8648-3cfa1f407aad-combined-ca-bundle\") pod \"heat-api-6957fcb6b8-8jmt4\" (UID: \"9cd4ad6c-298e-47c9-8648-3cfa1f407aad\") " pod="openstack/heat-api-6957fcb6b8-8jmt4"
Mar 09 14:27:08 crc kubenswrapper[4722]: I0309 14:27:08.987966 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cd4ad6c-298e-47c9-8648-3cfa1f407aad-public-tls-certs\") pod \"heat-api-6957fcb6b8-8jmt4\" (UID: \"9cd4ad6c-298e-47c9-8648-3cfa1f407aad\") " pod="openstack/heat-api-6957fcb6b8-8jmt4"
Mar 09 14:27:08 crc kubenswrapper[4722]: I0309 14:27:08.997736 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9213500f-da49-496b-b99e-1ec95658b48f-combined-ca-bundle\") pod \"heat-cfnapi-5c766df7b4-znr5j\" (UID: \"9213500f-da49-496b-b99e-1ec95658b48f\") " pod="openstack/heat-cfnapi-5c766df7b4-znr5j"
Mar 09 14:27:09 crc kubenswrapper[4722]: I0309 14:27:08.999397 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cd4ad6c-298e-47c9-8648-3cfa1f407aad-public-tls-certs\") pod \"heat-api-6957fcb6b8-8jmt4\" (UID: \"9cd4ad6c-298e-47c9-8648-3cfa1f407aad\") " pod="openstack/heat-api-6957fcb6b8-8jmt4"
Mar 09 14:27:09 crc kubenswrapper[4722]: I0309 14:27:09.003507 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cd4ad6c-298e-47c9-8648-3cfa1f407aad-internal-tls-certs\") pod \"heat-api-6957fcb6b8-8jmt4\" (UID: \"9cd4ad6c-298e-47c9-8648-3cfa1f407aad\") " pod="openstack/heat-api-6957fcb6b8-8jmt4"
Mar 09 14:27:09 crc kubenswrapper[4722]: I0309 14:27:09.005346 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9213500f-da49-496b-b99e-1ec95658b48f-public-tls-certs\") pod \"heat-cfnapi-5c766df7b4-znr5j\" (UID: \"9213500f-da49-496b-b99e-1ec95658b48f\") " pod="openstack/heat-cfnapi-5c766df7b4-znr5j"
Mar 09 14:27:09 crc kubenswrapper[4722]: I0309 14:27:09.005432 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9cd4ad6c-298e-47c9-8648-3cfa1f407aad-config-data-custom\") pod \"heat-api-6957fcb6b8-8jmt4\" (UID: \"9cd4ad6c-298e-47c9-8648-3cfa1f407aad\") " pod="openstack/heat-api-6957fcb6b8-8jmt4"
Mar 09 14:27:09 crc kubenswrapper[4722]: I0309 14:27:09.006243 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cd4ad6c-298e-47c9-8648-3cfa1f407aad-combined-ca-bundle\") pod \"heat-api-6957fcb6b8-8jmt4\" (UID: \"9cd4ad6c-298e-47c9-8648-3cfa1f407aad\") " pod="openstack/heat-api-6957fcb6b8-8jmt4"
Mar 09 14:27:09 crc kubenswrapper[4722]: I0309 14:27:09.013119 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6j9b\" (UniqueName: \"kubernetes.io/projected/9cd4ad6c-298e-47c9-8648-3cfa1f407aad-kube-api-access-p6j9b\") pod \"heat-api-6957fcb6b8-8jmt4\" (UID: \"9cd4ad6c-298e-47c9-8648-3cfa1f407aad\") " pod="openstack/heat-api-6957fcb6b8-8jmt4"
Mar 09 14:27:09 crc kubenswrapper[4722]: I0309 14:27:09.013133 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbwqz\" (UniqueName: \"kubernetes.io/projected/9213500f-da49-496b-b99e-1ec95658b48f-kube-api-access-wbwqz\") pod \"heat-cfnapi-5c766df7b4-znr5j\" (UID: \"9213500f-da49-496b-b99e-1ec95658b48f\") " pod="openstack/heat-cfnapi-5c766df7b4-znr5j"
Mar 09 14:27:09 crc kubenswrapper[4722]: I0309 14:27:09.013237 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9213500f-da49-496b-b99e-1ec95658b48f-config-data-custom\") pod \"heat-cfnapi-5c766df7b4-znr5j\" (UID: \"9213500f-da49-496b-b99e-1ec95658b48f\") " pod="openstack/heat-cfnapi-5c766df7b4-znr5j"
Mar 09 14:27:09 crc kubenswrapper[4722]: I0309 14:27:09.013465 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9213500f-da49-496b-b99e-1ec95658b48f-internal-tls-certs\") pod \"heat-cfnapi-5c766df7b4-znr5j\" (UID: \"9213500f-da49-496b-b99e-1ec95658b48f\") " pod="openstack/heat-cfnapi-5c766df7b4-znr5j"
Mar 09 14:27:09 crc kubenswrapper[4722]: I0309 14:27:09.014028 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9213500f-da49-496b-b99e-1ec95658b48f-config-data\") pod \"heat-cfnapi-5c766df7b4-znr5j\" (UID: \"9213500f-da49-496b-b99e-1ec95658b48f\") " pod="openstack/heat-cfnapi-5c766df7b4-znr5j"
Mar 09 14:27:09 crc kubenswrapper[4722]: I0309 14:27:09.014811 4722 generic.go:334] "Generic (PLEG): container finished" podID="24650aaa-2d24-4e59-9f9a-40b929e25c10" containerID="5415c2e2f0d9b8528559acb465bd00b9d5e4a5f98d5315538979c87b19b791c3" exitCode=0
Mar 09 14:27:09 crc kubenswrapper[4722]: I0309 14:27:09.015841 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f5cbf5964-pmtn7" event={"ID":"24650aaa-2d24-4e59-9f9a-40b929e25c10","Type":"ContainerDied","Data":"5415c2e2f0d9b8528559acb465bd00b9d5e4a5f98d5315538979c87b19b791c3"}
Mar 09 14:27:09 crc kubenswrapper[4722]: I0309 14:27:09.021499 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cd4ad6c-298e-47c9-8648-3cfa1f407aad-config-data\") pod \"heat-api-6957fcb6b8-8jmt4\" (UID: \"9cd4ad6c-298e-47c9-8648-3cfa1f407aad\") " pod="openstack/heat-api-6957fcb6b8-8jmt4"
Mar 09 14:27:09 crc kubenswrapper[4722]: I0309 14:27:09.057453 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-6957fcb6b8-8jmt4"
Mar 09 14:27:09 crc kubenswrapper[4722]: I0309 14:27:09.073309 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-5c766df7b4-znr5j"
Mar 09 14:27:09 crc kubenswrapper[4722]: I0309 14:27:09.685824 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7756b9d78c-xpcxn"
Mar 09 14:27:09 crc kubenswrapper[4722]: I0309 14:27:09.762337 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-hbhls"]
Mar 09 14:27:09 crc kubenswrapper[4722]: I0309 14:27:09.762723 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c9776ccc5-hbhls" podUID="5aaf2eb8-65eb-4404-93df-c16fe6796329" containerName="dnsmasq-dns" containerID="cri-o://78edfd45b55f5b27e7faa8f4b105788a5b0ac9f34b23575abe46855f8cccbd18" gracePeriod=10
Mar 09 14:27:10 crc kubenswrapper[4722]: I0309 14:27:10.034582 4722 generic.go:334] "Generic (PLEG): container finished" podID="905e426d-8ef1-442e-abb2-69905e8fc61a" containerID="9fdfba0d1ad4b491c62ca71f25534c103428e138f08244b726136f8e29e87ad2" exitCode=0
Mar 09 14:27:10 crc kubenswrapper[4722]: I0309 14:27:10.034817 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"905e426d-8ef1-442e-abb2-69905e8fc61a","Type":"ContainerDied","Data":"9fdfba0d1ad4b491c62ca71f25534c103428e138f08244b726136f8e29e87ad2"}
Mar 09 14:27:10 crc kubenswrapper[4722]: I0309 14:27:10.036352 4722 generic.go:334] "Generic (PLEG): container finished" podID="5aaf2eb8-65eb-4404-93df-c16fe6796329" containerID="78edfd45b55f5b27e7faa8f4b105788a5b0ac9f34b23575abe46855f8cccbd18" exitCode=0
Mar 09 14:27:10 crc kubenswrapper[4722]: I0309 14:27:10.036376 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-hbhls" event={"ID":"5aaf2eb8-65eb-4404-93df-c16fe6796329","Type":"ContainerDied","Data":"78edfd45b55f5b27e7faa8f4b105788a5b0ac9f34b23575abe46855f8cccbd18"}
Mar 09 14:27:11 crc kubenswrapper[4722]: I0309 14:27:11.751935 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5c9776ccc5-hbhls" podUID="5aaf2eb8-65eb-4404-93df-c16fe6796329" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.214:5353: connect: connection refused"
Mar 09 14:27:13 crc kubenswrapper[4722]: E0309 14:27:13.087045 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-heat-api-cfn:current-podified"
Mar 09 14:27:13 crc kubenswrapper[4722]: E0309 14:27:13.087450 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:heat-cfnapi,Image:quay.io/podified-antelope-centos9/openstack-heat-api-cfn:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_httpd_setup && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5b8h5f9hc4h5f7h666h676h598h8bhd7h5b8h665h58dh559h5ddh5b6h5cfh66fh668h8h6fh548h8h5d5h558h55ch688h68ch8dh644h57ch5bh5f9q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:heat-cfnapi-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-custom,ReadOnly:true,MountPath:/etc/heat/heat.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-twx9m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthcheck,Port:{0 8000 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:10,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthcheck,Port:{0 8000 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:10,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42418,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42418,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-cfnapi-58665ff9-7zt7s_openstack(37f48e3f-e425-4806-8fb5-27724da7ad0d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 09 14:27:13 crc kubenswrapper[4722]: E0309 14:27:13.089144 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/heat-cfnapi-58665ff9-7zt7s" podUID="37f48e3f-e425-4806-8fb5-27724da7ad0d"
Mar 09 14:27:14 crc kubenswrapper[4722]: I0309 14:27:14.142285 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"905e426d-8ef1-442e-abb2-69905e8fc61a","Type":"ContainerDied","Data":"d22f656d55f72d75c7f1d48eb90e71d7c26d641320c24c6a229270b403b98e8e"}
Mar 09 14:27:14 crc kubenswrapper[4722]: I0309 14:27:14.143081 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d22f656d55f72d75c7f1d48eb90e71d7c26d641320c24c6a229270b403b98e8e"
Mar 09 14:27:14 crc kubenswrapper[4722]: I0309 14:27:14.145766 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-hbhls" event={"ID":"5aaf2eb8-65eb-4404-93df-c16fe6796329","Type":"ContainerDied","Data":"d70e2bf9cec27169da9ed2921244cafb7477e2448a377092a449b75c8c654e70"}
Mar 09 14:27:14 crc kubenswrapper[4722]: I0309 14:27:14.145793 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d70e2bf9cec27169da9ed2921244cafb7477e2448a377092a449b75c8c654e70"
Mar 09 14:27:14 crc kubenswrapper[4722]: I0309 14:27:14.177605 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-hbhls"
Mar 09 14:27:14 crc kubenswrapper[4722]: I0309 14:27:14.256191 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 09 14:27:14 crc kubenswrapper[4722]: I0309 14:27:14.283150 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6svbv\" (UniqueName: \"kubernetes.io/projected/5aaf2eb8-65eb-4404-93df-c16fe6796329-kube-api-access-6svbv\") pod \"5aaf2eb8-65eb-4404-93df-c16fe6796329\" (UID: \"5aaf2eb8-65eb-4404-93df-c16fe6796329\") "
Mar 09 14:27:14 crc kubenswrapper[4722]: I0309 14:27:14.283223 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5aaf2eb8-65eb-4404-93df-c16fe6796329-dns-swift-storage-0\") pod \"5aaf2eb8-65eb-4404-93df-c16fe6796329\" (UID: \"5aaf2eb8-65eb-4404-93df-c16fe6796329\") "
Mar 09 14:27:14 crc kubenswrapper[4722]: I0309 14:27:14.283298 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5aaf2eb8-65eb-4404-93df-c16fe6796329-dns-svc\") pod \"5aaf2eb8-65eb-4404-93df-c16fe6796329\" (UID: \"5aaf2eb8-65eb-4404-93df-c16fe6796329\") "
Mar 09 14:27:14 crc kubenswrapper[4722]: I0309 14:27:14.283334 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5aaf2eb8-65eb-4404-93df-c16fe6796329-config\") pod \"5aaf2eb8-65eb-4404-93df-c16fe6796329\" (UID: \"5aaf2eb8-65eb-4404-93df-c16fe6796329\") "
Mar 09 14:27:14 crc kubenswrapper[4722]: I0309 14:27:14.283389 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5aaf2eb8-65eb-4404-93df-c16fe6796329-ovsdbserver-sb\") pod \"5aaf2eb8-65eb-4404-93df-c16fe6796329\" (UID: \"5aaf2eb8-65eb-4404-93df-c16fe6796329\") "
Mar 09 14:27:14 crc kubenswrapper[4722]: I0309 14:27:14.283576 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5aaf2eb8-65eb-4404-93df-c16fe6796329-ovsdbserver-nb\") pod \"5aaf2eb8-65eb-4404-93df-c16fe6796329\" (UID: \"5aaf2eb8-65eb-4404-93df-c16fe6796329\") "
Mar 09 14:27:14 crc kubenswrapper[4722]: I0309 14:27:14.309424 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5aaf2eb8-65eb-4404-93df-c16fe6796329-kube-api-access-6svbv" (OuterVolumeSpecName: "kube-api-access-6svbv") pod "5aaf2eb8-65eb-4404-93df-c16fe6796329" (UID: "5aaf2eb8-65eb-4404-93df-c16fe6796329"). InnerVolumeSpecName "kube-api-access-6svbv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:27:14 crc kubenswrapper[4722]: I0309 14:27:14.386755 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-92420d06-c977-4be0-b4f7-e2f75390c9aa\") pod \"905e426d-8ef1-442e-abb2-69905e8fc61a\" (UID: \"905e426d-8ef1-442e-abb2-69905e8fc61a\") "
Mar 09 14:27:14 crc kubenswrapper[4722]: I0309 14:27:14.386941 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvpdt\" (UniqueName: \"kubernetes.io/projected/905e426d-8ef1-442e-abb2-69905e8fc61a-kube-api-access-cvpdt\") pod \"905e426d-8ef1-442e-abb2-69905e8fc61a\" (UID: \"905e426d-8ef1-442e-abb2-69905e8fc61a\") "
Mar 09 14:27:14 crc kubenswrapper[4722]: I0309 14:27:14.386999 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/905e426d-8ef1-442e-abb2-69905e8fc61a-config-data\") pod \"905e426d-8ef1-442e-abb2-69905e8fc61a\" (UID: \"905e426d-8ef1-442e-abb2-69905e8fc61a\") "
Mar 09 14:27:14 crc kubenswrapper[4722]: I0309 14:27:14.387162 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/905e426d-8ef1-442e-abb2-69905e8fc61a-public-tls-certs\") pod \"905e426d-8ef1-442e-abb2-69905e8fc61a\" (UID: \"905e426d-8ef1-442e-abb2-69905e8fc61a\") "
Mar 09 14:27:14 crc kubenswrapper[4722]: I0309 14:27:14.387761 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/905e426d-8ef1-442e-abb2-69905e8fc61a-httpd-run\") pod \"905e426d-8ef1-442e-abb2-69905e8fc61a\" (UID: \"905e426d-8ef1-442e-abb2-69905e8fc61a\") "
Mar 09 14:27:14 crc kubenswrapper[4722]: I0309 14:27:14.387834 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/905e426d-8ef1-442e-abb2-69905e8fc61a-logs\") pod \"905e426d-8ef1-442e-abb2-69905e8fc61a\" (UID: \"905e426d-8ef1-442e-abb2-69905e8fc61a\") "
Mar 09 14:27:14 crc kubenswrapper[4722]: I0309 14:27:14.387896 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/905e426d-8ef1-442e-abb2-69905e8fc61a-combined-ca-bundle\") pod \"905e426d-8ef1-442e-abb2-69905e8fc61a\" (UID: \"905e426d-8ef1-442e-abb2-69905e8fc61a\") "
Mar 09 14:27:14 crc kubenswrapper[4722]: I0309 14:27:14.388004 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/905e426d-8ef1-442e-abb2-69905e8fc61a-scripts\") pod \"905e426d-8ef1-442e-abb2-69905e8fc61a\" (UID: \"905e426d-8ef1-442e-abb2-69905e8fc61a\") "
Mar 09 14:27:14 crc kubenswrapper[4722]: I0309 14:27:14.389539 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6svbv\" (UniqueName: \"kubernetes.io/projected/5aaf2eb8-65eb-4404-93df-c16fe6796329-kube-api-access-6svbv\") on node \"crc\" DevicePath \"\""
Mar 09 14:27:14 crc kubenswrapper[4722]: I0309 14:27:14.391072 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/905e426d-8ef1-442e-abb2-69905e8fc61a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "905e426d-8ef1-442e-abb2-69905e8fc61a" (UID:
"905e426d-8ef1-442e-abb2-69905e8fc61a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:27:14 crc kubenswrapper[4722]: I0309 14:27:14.393910 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/905e426d-8ef1-442e-abb2-69905e8fc61a-logs" (OuterVolumeSpecName: "logs") pod "905e426d-8ef1-442e-abb2-69905e8fc61a" (UID: "905e426d-8ef1-442e-abb2-69905e8fc61a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:27:14 crc kubenswrapper[4722]: I0309 14:27:14.416168 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/905e426d-8ef1-442e-abb2-69905e8fc61a-scripts" (OuterVolumeSpecName: "scripts") pod "905e426d-8ef1-442e-abb2-69905e8fc61a" (UID: "905e426d-8ef1-442e-abb2-69905e8fc61a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:27:14 crc kubenswrapper[4722]: I0309 14:27:14.431378 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/905e426d-8ef1-442e-abb2-69905e8fc61a-kube-api-access-cvpdt" (OuterVolumeSpecName: "kube-api-access-cvpdt") pod "905e426d-8ef1-442e-abb2-69905e8fc61a" (UID: "905e426d-8ef1-442e-abb2-69905e8fc61a"). InnerVolumeSpecName "kube-api-access-cvpdt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:27:14 crc kubenswrapper[4722]: I0309 14:27:14.454481 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-92420d06-c977-4be0-b4f7-e2f75390c9aa" (OuterVolumeSpecName: "glance") pod "905e426d-8ef1-442e-abb2-69905e8fc61a" (UID: "905e426d-8ef1-442e-abb2-69905e8fc61a"). InnerVolumeSpecName "pvc-92420d06-c977-4be0-b4f7-e2f75390c9aa". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 09 14:27:14 crc kubenswrapper[4722]: I0309 14:27:14.458244 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5aaf2eb8-65eb-4404-93df-c16fe6796329-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5aaf2eb8-65eb-4404-93df-c16fe6796329" (UID: "5aaf2eb8-65eb-4404-93df-c16fe6796329"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:27:14 crc kubenswrapper[4722]: I0309 14:27:14.482516 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5aaf2eb8-65eb-4404-93df-c16fe6796329-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "5aaf2eb8-65eb-4404-93df-c16fe6796329" (UID: "5aaf2eb8-65eb-4404-93df-c16fe6796329"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:27:14 crc kubenswrapper[4722]: I0309 14:27:14.491191 4722 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/905e426d-8ef1-442e-abb2-69905e8fc61a-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 09 14:27:14 crc kubenswrapper[4722]: I0309 14:27:14.491232 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/905e426d-8ef1-442e-abb2-69905e8fc61a-logs\") on node \"crc\" DevicePath \"\"" Mar 09 14:27:14 crc kubenswrapper[4722]: I0309 14:27:14.491241 4722 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5aaf2eb8-65eb-4404-93df-c16fe6796329-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 09 14:27:14 crc kubenswrapper[4722]: I0309 14:27:14.491250 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/905e426d-8ef1-442e-abb2-69905e8fc61a-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 14:27:14 crc kubenswrapper[4722]: I0309 14:27:14.491273 4722 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-92420d06-c977-4be0-b4f7-e2f75390c9aa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-92420d06-c977-4be0-b4f7-e2f75390c9aa\") on node \"crc\" " Mar 09 14:27:14 crc kubenswrapper[4722]: I0309 14:27:14.491282 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvpdt\" (UniqueName: \"kubernetes.io/projected/905e426d-8ef1-442e-abb2-69905e8fc61a-kube-api-access-cvpdt\") on node \"crc\" DevicePath \"\"" Mar 09 14:27:14 crc kubenswrapper[4722]: I0309 14:27:14.491293 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5aaf2eb8-65eb-4404-93df-c16fe6796329-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 09 14:27:14 crc kubenswrapper[4722]: I0309 14:27:14.500594 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 09 14:27:14 crc kubenswrapper[4722]: I0309 14:27:14.508861 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5aaf2eb8-65eb-4404-93df-c16fe6796329-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5aaf2eb8-65eb-4404-93df-c16fe6796329" (UID: "5aaf2eb8-65eb-4404-93df-c16fe6796329"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:27:14 crc kubenswrapper[4722]: I0309 14:27:14.529343 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5aaf2eb8-65eb-4404-93df-c16fe6796329-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5aaf2eb8-65eb-4404-93df-c16fe6796329" (UID: "5aaf2eb8-65eb-4404-93df-c16fe6796329"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:27:14 crc kubenswrapper[4722]: I0309 14:27:14.541753 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5aaf2eb8-65eb-4404-93df-c16fe6796329-config" (OuterVolumeSpecName: "config") pod "5aaf2eb8-65eb-4404-93df-c16fe6796329" (UID: "5aaf2eb8-65eb-4404-93df-c16fe6796329"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:27:14 crc kubenswrapper[4722]: I0309 14:27:14.542545 4722 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 09 14:27:14 crc kubenswrapper[4722]: I0309 14:27:14.542688 4722 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-92420d06-c977-4be0-b4f7-e2f75390c9aa" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-92420d06-c977-4be0-b4f7-e2f75390c9aa") on node "crc" Mar 09 14:27:14 crc kubenswrapper[4722]: I0309 14:27:14.545299 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/905e426d-8ef1-442e-abb2-69905e8fc61a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "905e426d-8ef1-442e-abb2-69905e8fc61a" (UID: "905e426d-8ef1-442e-abb2-69905e8fc61a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:27:14 crc kubenswrapper[4722]: I0309 14:27:14.575661 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/905e426d-8ef1-442e-abb2-69905e8fc61a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "905e426d-8ef1-442e-abb2-69905e8fc61a" (UID: "905e426d-8ef1-442e-abb2-69905e8fc61a"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:27:14 crc kubenswrapper[4722]: I0309 14:27:14.579475 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/905e426d-8ef1-442e-abb2-69905e8fc61a-config-data" (OuterVolumeSpecName: "config-data") pod "905e426d-8ef1-442e-abb2-69905e8fc61a" (UID: "905e426d-8ef1-442e-abb2-69905e8fc61a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:27:14 crc kubenswrapper[4722]: I0309 14:27:14.592136 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4-httpd-run\") pod \"f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4\" (UID: \"f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4\") " Mar 09 14:27:14 crc kubenswrapper[4722]: I0309 14:27:14.592660 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2hxk\" (UniqueName: \"kubernetes.io/projected/f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4-kube-api-access-r2hxk\") pod \"f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4\" (UID: \"f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4\") " Mar 09 14:27:14 crc kubenswrapper[4722]: I0309 14:27:14.593120 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4" (UID: "f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:27:14 crc kubenswrapper[4722]: I0309 14:27:14.595515 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-59bfcd6f-16ca-4d2a-8185-976337e0ec2c\") pod \"f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4\" (UID: \"f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4\") " Mar 09 14:27:14 crc kubenswrapper[4722]: I0309 14:27:14.595562 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4-logs\") pod \"f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4\" (UID: \"f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4\") " Mar 09 14:27:14 crc kubenswrapper[4722]: I0309 14:27:14.595600 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4-scripts\") pod \"f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4\" (UID: \"f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4\") " Mar 09 14:27:14 crc kubenswrapper[4722]: I0309 14:27:14.595697 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4-combined-ca-bundle\") pod \"f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4\" (UID: \"f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4\") " Mar 09 14:27:14 crc kubenswrapper[4722]: I0309 14:27:14.595755 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4-config-data\") pod \"f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4\" (UID: \"f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4\") " Mar 09 14:27:14 crc kubenswrapper[4722]: I0309 14:27:14.595844 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4-internal-tls-certs\") pod \"f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4\" (UID: \"f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4\") " Mar 09 14:27:14 crc kubenswrapper[4722]: I0309 14:27:14.596864 4722 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/905e426d-8ef1-442e-abb2-69905e8fc61a-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 14:27:14 crc kubenswrapper[4722]: I0309 14:27:14.596893 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/905e426d-8ef1-442e-abb2-69905e8fc61a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:27:14 crc kubenswrapper[4722]: I0309 14:27:14.596906 4722 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 09 14:27:14 crc kubenswrapper[4722]: I0309 14:27:14.596919 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5aaf2eb8-65eb-4404-93df-c16fe6796329-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 14:27:14 crc kubenswrapper[4722]: I0309 14:27:14.596930 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5aaf2eb8-65eb-4404-93df-c16fe6796329-config\") on node \"crc\" DevicePath \"\"" Mar 09 14:27:14 crc kubenswrapper[4722]: I0309 14:27:14.596942 4722 reconciler_common.go:293] 
"Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5aaf2eb8-65eb-4404-93df-c16fe6796329-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 09 14:27:14 crc kubenswrapper[4722]: I0309 14:27:14.596933 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4-logs" (OuterVolumeSpecName: "logs") pod "f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4" (UID: "f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:27:14 crc kubenswrapper[4722]: I0309 14:27:14.596957 4722 reconciler_common.go:293] "Volume detached for volume \"pvc-92420d06-c977-4be0-b4f7-e2f75390c9aa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-92420d06-c977-4be0-b4f7-e2f75390c9aa\") on node \"crc\" DevicePath \"\"" Mar 09 14:27:14 crc kubenswrapper[4722]: I0309 14:27:14.597016 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/905e426d-8ef1-442e-abb2-69905e8fc61a-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 14:27:14 crc kubenswrapper[4722]: I0309 14:27:14.598559 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4-kube-api-access-r2hxk" (OuterVolumeSpecName: "kube-api-access-r2hxk") pod "f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4" (UID: "f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4"). InnerVolumeSpecName "kube-api-access-r2hxk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:27:14 crc kubenswrapper[4722]: I0309 14:27:14.605741 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4-scripts" (OuterVolumeSpecName: "scripts") pod "f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4" (UID: "f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:27:14 crc kubenswrapper[4722]: I0309 14:27:14.641676 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-59bfcd6f-16ca-4d2a-8185-976337e0ec2c" (OuterVolumeSpecName: "glance") pod "f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4" (UID: "f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4"). InnerVolumeSpecName "pvc-59bfcd6f-16ca-4d2a-8185-976337e0ec2c". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 09 14:27:14 crc kubenswrapper[4722]: I0309 14:27:14.641942 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4" (UID: "f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:27:14 crc kubenswrapper[4722]: I0309 14:27:14.692752 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4-config-data" (OuterVolumeSpecName: "config-data") pod "f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4" (UID: "f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:27:14 crc kubenswrapper[4722]: I0309 14:27:14.700407 4722 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-59bfcd6f-16ca-4d2a-8185-976337e0ec2c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-59bfcd6f-16ca-4d2a-8185-976337e0ec2c\") on node \"crc\" " Mar 09 14:27:14 crc kubenswrapper[4722]: I0309 14:27:14.700442 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4-logs\") on node \"crc\" DevicePath \"\"" Mar 09 14:27:14 crc kubenswrapper[4722]: I0309 14:27:14.700451 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 14:27:14 crc kubenswrapper[4722]: I0309 14:27:14.700461 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:27:14 crc kubenswrapper[4722]: I0309 14:27:14.700478 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 14:27:14 crc kubenswrapper[4722]: I0309 14:27:14.700487 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r2hxk\" (UniqueName: \"kubernetes.io/projected/f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4-kube-api-access-r2hxk\") on node \"crc\" DevicePath \"\"" Mar 09 14:27:14 crc kubenswrapper[4722]: I0309 14:27:14.702653 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4" (UID: "f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:27:14 crc kubenswrapper[4722]: I0309 14:27:14.763057 4722 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Mar 09 14:27:14 crc kubenswrapper[4722]: I0309 14:27:14.763290 4722 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-59bfcd6f-16ca-4d2a-8185-976337e0ec2c" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-59bfcd6f-16ca-4d2a-8185-976337e0ec2c") on node "crc" Mar 09 14:27:14 crc kubenswrapper[4722]: I0309 14:27:14.809515 4722 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 14:27:14 crc kubenswrapper[4722]: I0309 14:27:14.809598 4722 reconciler_common.go:293] "Volume detached for volume \"pvc-59bfcd6f-16ca-4d2a-8185-976337e0ec2c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-59bfcd6f-16ca-4d2a-8185-976337e0ec2c\") on node \"crc\" DevicePath \"\"" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.164832 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-api-5ffdf96bcf-klbzp" podUID="c08dfba1-d105-4384-b906-5772148696e4" containerName="heat-api" containerID="cri-o://0e8ee638f5a8a5f6208d4a02f57136748175cc08c6f2160681a9ac8581a97ad0" gracePeriod=60 Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.165261 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5ffdf96bcf-klbzp" event={"ID":"c08dfba1-d105-4384-b906-5772148696e4","Type":"ContainerStarted","Data":"0e8ee638f5a8a5f6208d4a02f57136748175cc08c6f2160681a9ac8581a97ad0"} Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.165792 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-5ffdf96bcf-klbzp" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.168182 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4","Type":"ContainerDied","Data":"eab35e11f04f2ca25e4cf0c889b31ea97bd408319f100d96ec0a29726d467849"} Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.168443 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.168452 4722 scope.go:117] "RemoveContainer" containerID="7d51fe379c52b16a7a6d962b827b1c07be67bbcd90aa939d56cebc2eb739f4c7" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.199630 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-hbhls" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.200309 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"363b3657-6ba6-40e5-a353-7e1440ce3d01","Type":"ContainerStarted","Data":"2665a2cc6fdf17304d031274f87e29d7419a659266ec73993b31195ab05b8606"} Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.200464 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.243395 4722 scope.go:117] "RemoveContainer" containerID="a23379779376add9f8ba28be54166afd7bd04e41bcf899bf7e5acdad5d66214e" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.244459 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-5ffdf96bcf-klbzp" podStartSLOduration=3.960198653 podStartE2EDuration="16.244438647s" podCreationTimestamp="2026-03-09 14:26:59 +0000 UTC" firstStartedPulling="2026-03-09 14:27:00.947607707 +0000 UTC m=+1461.503176283" lastFinishedPulling="2026-03-09 14:27:13.231847701 +0000 UTC m=+1473.787416277" observedRunningTime="2026-03-09 14:27:15.193524829 +0000 UTC m=+1475.749093405" watchObservedRunningTime="2026-03-09 14:27:15.244438647 +0000 UTC m=+1475.800007223" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.246632 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.665581911 podStartE2EDuration="23.246619158s" podCreationTimestamp="2026-03-09 14:26:52 +0000 UTC" firstStartedPulling="2026-03-09 14:26:53.67999437 +0000 UTC m=+1454.235562946" lastFinishedPulling="2026-03-09 14:27:13.261031617 +0000 UTC m=+1473.816600193" observedRunningTime="2026-03-09 14:27:15.239086849 +0000 UTC m=+1475.794655425" watchObservedRunningTime="2026-03-09 14:27:15.246619158 +0000 UTC m=+1475.802187734" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.300820 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.355343 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.385718 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 14:27:15 crc kubenswrapper[4722]: E0309 14:27:15.386228 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4" containerName="glance-httpd" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.386241 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4" containerName="glance-httpd" Mar 09 14:27:15 crc kubenswrapper[4722]: E0309 14:27:15.386257 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4" containerName="glance-log" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.386262 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4" containerName="glance-log" Mar 09 14:27:15 crc kubenswrapper[4722]: E0309 14:27:15.386277 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="905e426d-8ef1-442e-abb2-69905e8fc61a" containerName="glance-log" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.386283 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="905e426d-8ef1-442e-abb2-69905e8fc61a" containerName="glance-log" Mar 09 14:27:15 crc kubenswrapper[4722]: E0309 14:27:15.386297 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5aaf2eb8-65eb-4404-93df-c16fe6796329" containerName="dnsmasq-dns" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.386303 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="5aaf2eb8-65eb-4404-93df-c16fe6796329" containerName="dnsmasq-dns" Mar 09 14:27:15 crc 
kubenswrapper[4722]: E0309 14:27:15.386332 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="905e426d-8ef1-442e-abb2-69905e8fc61a" containerName="glance-httpd" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.386338 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="905e426d-8ef1-442e-abb2-69905e8fc61a" containerName="glance-httpd" Mar 09 14:27:15 crc kubenswrapper[4722]: E0309 14:27:15.386353 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5aaf2eb8-65eb-4404-93df-c16fe6796329" containerName="init" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.386358 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="5aaf2eb8-65eb-4404-93df-c16fe6796329" containerName="init" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.386568 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="5aaf2eb8-65eb-4404-93df-c16fe6796329" containerName="dnsmasq-dns" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.386584 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4" containerName="glance-httpd" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.386593 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="905e426d-8ef1-442e-abb2-69905e8fc61a" containerName="glance-log" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.386605 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4" containerName="glance-log" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.386613 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="905e426d-8ef1-442e-abb2-69905e8fc61a" containerName="glance-httpd" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.388491 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.390508 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.390784 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.395935 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-xv77d" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.396235 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.427363 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.458665 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.472097 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-58665ff9-7zt7s" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.508852 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-hbhls"] Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.534771 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6f5cbf5964-pmtn7" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.543229 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-hbhls"] Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.543282 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/37f48e3f-e425-4806-8fb5-27724da7ad0d-config-data-custom\") pod \"37f48e3f-e425-4806-8fb5-27724da7ad0d\" (UID: \"37f48e3f-e425-4806-8fb5-27724da7ad0d\") " Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.543321 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37f48e3f-e425-4806-8fb5-27724da7ad0d-config-data\") pod \"37f48e3f-e425-4806-8fb5-27724da7ad0d\" (UID: \"37f48e3f-e425-4806-8fb5-27724da7ad0d\") " Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.543467 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37f48e3f-e425-4806-8fb5-27724da7ad0d-combined-ca-bundle\") pod \"37f48e3f-e425-4806-8fb5-27724da7ad0d\" (UID: \"37f48e3f-e425-4806-8fb5-27724da7ad0d\") " Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.543498 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twx9m\" (UniqueName: \"kubernetes.io/projected/37f48e3f-e425-4806-8fb5-27724da7ad0d-kube-api-access-twx9m\") pod \"37f48e3f-e425-4806-8fb5-27724da7ad0d\" (UID: \"37f48e3f-e425-4806-8fb5-27724da7ad0d\") " Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.543956 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-59bfcd6f-16ca-4d2a-8185-976337e0ec2c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-59bfcd6f-16ca-4d2a-8185-976337e0ec2c\") pod \"glance-default-internal-api-0\" (UID: \"3cebe5cd-3f7d-4c97-beb5-5702d5ad9aae\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.543995 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3cebe5cd-3f7d-4c97-beb5-5702d5ad9aae-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3cebe5cd-3f7d-4c97-beb5-5702d5ad9aae\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.544052 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3cebe5cd-3f7d-4c97-beb5-5702d5ad9aae-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"3cebe5cd-3f7d-4c97-beb5-5702d5ad9aae\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.544125 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cebe5cd-3f7d-4c97-beb5-5702d5ad9aae-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3cebe5cd-3f7d-4c97-beb5-5702d5ad9aae\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.544170 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/3cebe5cd-3f7d-4c97-beb5-5702d5ad9aae-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3cebe5cd-3f7d-4c97-beb5-5702d5ad9aae\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.544187 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cebe5cd-3f7d-4c97-beb5-5702d5ad9aae-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3cebe5cd-3f7d-4c97-beb5-5702d5ad9aae\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.544228 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnwf7\" (UniqueName: \"kubernetes.io/projected/3cebe5cd-3f7d-4c97-beb5-5702d5ad9aae-kube-api-access-pnwf7\") pod \"glance-default-internal-api-0\" (UID: \"3cebe5cd-3f7d-4c97-beb5-5702d5ad9aae\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.544243 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3cebe5cd-3f7d-4c97-beb5-5702d5ad9aae-logs\") pod \"glance-default-internal-api-0\" (UID: \"3cebe5cd-3f7d-4c97-beb5-5702d5ad9aae\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.556272 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.568258 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 09 14:27:15 crc kubenswrapper[4722]: E0309 14:27:15.568955 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24650aaa-2d24-4e59-9f9a-40b929e25c10" containerName="neutron-api" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.568975 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="24650aaa-2d24-4e59-9f9a-40b929e25c10" containerName="neutron-api" Mar 09 14:27:15 crc kubenswrapper[4722]: E0309 14:27:15.569015 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24650aaa-2d24-4e59-9f9a-40b929e25c10" containerName="neutron-httpd" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.569021 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="24650aaa-2d24-4e59-9f9a-40b929e25c10" containerName="neutron-httpd" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.569296 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="24650aaa-2d24-4e59-9f9a-40b929e25c10" containerName="neutron-api" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.569322 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="24650aaa-2d24-4e59-9f9a-40b929e25c10" containerName="neutron-httpd" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.570777 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.573173 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.573495 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.577025 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.587388 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37f48e3f-e425-4806-8fb5-27724da7ad0d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "37f48e3f-e425-4806-8fb5-27724da7ad0d" (UID: "37f48e3f-e425-4806-8fb5-27724da7ad0d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.587407 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37f48e3f-e425-4806-8fb5-27724da7ad0d-config-data" (OuterVolumeSpecName: "config-data") pod "37f48e3f-e425-4806-8fb5-27724da7ad0d" (UID: "37f48e3f-e425-4806-8fb5-27724da7ad0d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.596519 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37f48e3f-e425-4806-8fb5-27724da7ad0d-kube-api-access-twx9m" (OuterVolumeSpecName: "kube-api-access-twx9m") pod "37f48e3f-e425-4806-8fb5-27724da7ad0d" (UID: "37f48e3f-e425-4806-8fb5-27724da7ad0d"). InnerVolumeSpecName "kube-api-access-twx9m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.596661 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37f48e3f-e425-4806-8fb5-27724da7ad0d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "37f48e3f-e425-4806-8fb5-27724da7ad0d" (UID: "37f48e3f-e425-4806-8fb5-27724da7ad0d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.645801 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgf9l\" (UniqueName: \"kubernetes.io/projected/ff21e8aa-a39e-4355-a251-a86106a089c7-kube-api-access-pgf9l\") pod \"glance-default-external-api-0\" (UID: \"ff21e8aa-a39e-4355-a251-a86106a089c7\") " pod="openstack/glance-default-external-api-0" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.645884 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-59bfcd6f-16ca-4d2a-8185-976337e0ec2c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-59bfcd6f-16ca-4d2a-8185-976337e0ec2c\") pod \"glance-default-internal-api-0\" (UID: \"3cebe5cd-3f7d-4c97-beb5-5702d5ad9aae\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.645910 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3cebe5cd-3f7d-4c97-beb5-5702d5ad9aae-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3cebe5cd-3f7d-4c97-beb5-5702d5ad9aae\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.645963 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-92420d06-c977-4be0-b4f7-e2f75390c9aa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-92420d06-c977-4be0-b4f7-e2f75390c9aa\") pod \"glance-default-external-api-0\" (UID: \"ff21e8aa-a39e-4355-a251-a86106a089c7\") " pod="openstack/glance-default-external-api-0" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.645989 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3cebe5cd-3f7d-4c97-beb5-5702d5ad9aae-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"3cebe5cd-3f7d-4c97-beb5-5702d5ad9aae\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.646027 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff21e8aa-a39e-4355-a251-a86106a089c7-scripts\") pod \"glance-default-external-api-0\" (UID: \"ff21e8aa-a39e-4355-a251-a86106a089c7\") " pod="openstack/glance-default-external-api-0" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.646058 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff21e8aa-a39e-4355-a251-a86106a089c7-config-data\") pod \"glance-default-external-api-0\" (UID: \"ff21e8aa-a39e-4355-a251-a86106a089c7\") " pod="openstack/glance-default-external-api-0" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.646082 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff21e8aa-a39e-4355-a251-a86106a089c7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ff21e8aa-a39e-4355-a251-a86106a089c7\") " pod="openstack/glance-default-external-api-0" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.646108 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3cebe5cd-3f7d-4c97-beb5-5702d5ad9aae-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3cebe5cd-3f7d-4c97-beb5-5702d5ad9aae\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.646133 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ff21e8aa-a39e-4355-a251-a86106a089c7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ff21e8aa-a39e-4355-a251-a86106a089c7\") " pod="openstack/glance-default-external-api-0" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.646181 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3cebe5cd-3f7d-4c97-beb5-5702d5ad9aae-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3cebe5cd-3f7d-4c97-beb5-5702d5ad9aae\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.646205 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cebe5cd-3f7d-4c97-beb5-5702d5ad9aae-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3cebe5cd-3f7d-4c97-beb5-5702d5ad9aae\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.646299 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff21e8aa-a39e-4355-a251-a86106a089c7-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ff21e8aa-a39e-4355-a251-a86106a089c7\") " pod="openstack/glance-default-external-api-0" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.646327 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3cebe5cd-3f7d-4c97-beb5-5702d5ad9aae-logs\") pod \"glance-default-internal-api-0\" (UID: \"3cebe5cd-3f7d-4c97-beb5-5702d5ad9aae\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.646345 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnwf7\" (UniqueName: \"kubernetes.io/projected/3cebe5cd-3f7d-4c97-beb5-5702d5ad9aae-kube-api-access-pnwf7\") pod \"glance-default-internal-api-0\" (UID: \"3cebe5cd-3f7d-4c97-beb5-5702d5ad9aae\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.646383 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff21e8aa-a39e-4355-a251-a86106a089c7-logs\") pod \"glance-default-external-api-0\" (UID: \"ff21e8aa-a39e-4355-a251-a86106a089c7\") " pod="openstack/glance-default-external-api-0" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.646436 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37f48e3f-e425-4806-8fb5-27724da7ad0d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.646447 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-twx9m\" (UniqueName: \"kubernetes.io/projected/37f48e3f-e425-4806-8fb5-27724da7ad0d-kube-api-access-twx9m\") on node \"crc\" DevicePath \"\"" Mar 09 14:27:15 crc 
kubenswrapper[4722]: I0309 14:27:15.646457 4722 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/37f48e3f-e425-4806-8fb5-27724da7ad0d-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.646466 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37f48e3f-e425-4806-8fb5-27724da7ad0d-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.653798 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3cebe5cd-3f7d-4c97-beb5-5702d5ad9aae-scripts\") pod \"glance-default-internal-api-0\" (UID: \"3cebe5cd-3f7d-4c97-beb5-5702d5ad9aae\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.655681 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3cebe5cd-3f7d-4c97-beb5-5702d5ad9aae-logs\") pod \"glance-default-internal-api-0\" (UID: \"3cebe5cd-3f7d-4c97-beb5-5702d5ad9aae\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.655878 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3cebe5cd-3f7d-4c97-beb5-5702d5ad9aae-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"3cebe5cd-3f7d-4c97-beb5-5702d5ad9aae\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.664344 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cebe5cd-3f7d-4c97-beb5-5702d5ad9aae-config-data\") pod \"glance-default-internal-api-0\" (UID: \"3cebe5cd-3f7d-4c97-beb5-5702d5ad9aae\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.665510 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cebe5cd-3f7d-4c97-beb5-5702d5ad9aae-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"3cebe5cd-3f7d-4c97-beb5-5702d5ad9aae\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.667139 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3cebe5cd-3f7d-4c97-beb5-5702d5ad9aae-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"3cebe5cd-3f7d-4c97-beb5-5702d5ad9aae\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.694485 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
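The container spec dumped in the ErrImagePull trace earlier in this log spells out identical liveness and readiness probes for heat-cfnapi: HTTP GET /healthcheck on port 8000, 5s initial delay and period, 10s timeout, failure threshold 3. A minimal Go sketch of that probe block using the k8s.io/api types, with values copied from the logged spec (the enclosing container and pod definitions are assumed):

package probes

import (
	corev1 "k8s.io/api/core/v1"
	"k8s.io/apimachinery/pkg/util/intstr"
)

// healthProbe mirrors the LivenessProbe/ReadinessProbe fields logged for the
// heat-cfnapi container: HTTPGet /healthcheck on port 8000 over plain HTTP.
func healthProbe() *corev1.Probe {
	return &corev1.Probe{
		ProbeHandler: corev1.ProbeHandler{
			HTTPGet: &corev1.HTTPGetAction{
				Path:   "/healthcheck",
				Port:   intstr.FromInt(8000),
				Scheme: corev1.URISchemeHTTP,
			},
		},
		InitialDelaySeconds: 5,  // as logged: InitialDelaySeconds:5
		TimeoutSeconds:      10, // TimeoutSeconds:10
		PeriodSeconds:       5,  // PeriodSeconds:5
		SuccessThreshold:    1,  // SuccessThreshold:1
		FailureThreshold:    3,  // FailureThreshold:3
	}
}

The probe failure recorded at 14:27:11 for dnsmasq-dns-5c9776ccc5-hbhls is the TCP analogue of the same mechanism: the readiness check dialing 10.217.0.214:5353 after the dnsmasq-dns container had already exited.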
Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.694529 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-59bfcd6f-16ca-4d2a-8185-976337e0ec2c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-59bfcd6f-16ca-4d2a-8185-976337e0ec2c\") pod \"glance-default-internal-api-0\" (UID: \"3cebe5cd-3f7d-4c97-beb5-5702d5ad9aae\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a3395c88a7dc2d3ef264f22a8309ab5263d0d43341a96b8565f3c55ce5be97e0/globalmount\"" pod="openstack/glance-default-internal-api-0" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.696040 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnwf7\" (UniqueName: \"kubernetes.io/projected/3cebe5cd-3f7d-4c97-beb5-5702d5ad9aae-kube-api-access-pnwf7\") pod \"glance-default-internal-api-0\" (UID: \"3cebe5cd-3f7d-4c97-beb5-5702d5ad9aae\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.754432 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ll9zp\" (UniqueName: \"kubernetes.io/projected/24650aaa-2d24-4e59-9f9a-40b929e25c10-kube-api-access-ll9zp\") pod \"24650aaa-2d24-4e59-9f9a-40b929e25c10\" (UID: \"24650aaa-2d24-4e59-9f9a-40b929e25c10\") " Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.754757 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/24650aaa-2d24-4e59-9f9a-40b929e25c10-config\") pod \"24650aaa-2d24-4e59-9f9a-40b929e25c10\" (UID: \"24650aaa-2d24-4e59-9f9a-40b929e25c10\") " Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.754779 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24650aaa-2d24-4e59-9f9a-40b929e25c10-combined-ca-bundle\") pod \"24650aaa-2d24-4e59-9f9a-40b929e25c10\" (UID: \"24650aaa-2d24-4e59-9f9a-40b929e25c10\") " Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.754840 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/24650aaa-2d24-4e59-9f9a-40b929e25c10-httpd-config\") pod \"24650aaa-2d24-4e59-9f9a-40b929e25c10\" (UID: \"24650aaa-2d24-4e59-9f9a-40b929e25c10\") " Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.754939 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/24650aaa-2d24-4e59-9f9a-40b929e25c10-ovndb-tls-certs\") pod \"24650aaa-2d24-4e59-9f9a-40b929e25c10\" (UID: \"24650aaa-2d24-4e59-9f9a-40b929e25c10\") " Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.755160 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-48rrj"] Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.755386 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff21e8aa-a39e-4355-a251-a86106a089c7-scripts\") pod \"glance-default-external-api-0\" (UID: \"ff21e8aa-a39e-4355-a251-a86106a089c7\") " pod="openstack/glance-default-external-api-0" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.755429 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff21e8aa-a39e-4355-a251-a86106a089c7-config-data\") pod 
\"glance-default-external-api-0\" (UID: \"ff21e8aa-a39e-4355-a251-a86106a089c7\") " pod="openstack/glance-default-external-api-0" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.755459 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff21e8aa-a39e-4355-a251-a86106a089c7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ff21e8aa-a39e-4355-a251-a86106a089c7\") " pod="openstack/glance-default-external-api-0" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.755496 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ff21e8aa-a39e-4355-a251-a86106a089c7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ff21e8aa-a39e-4355-a251-a86106a089c7\") " pod="openstack/glance-default-external-api-0" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.755536 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff21e8aa-a39e-4355-a251-a86106a089c7-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ff21e8aa-a39e-4355-a251-a86106a089c7\") " pod="openstack/glance-default-external-api-0" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.757539 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff21e8aa-a39e-4355-a251-a86106a089c7-logs\") pod \"glance-default-external-api-0\" (UID: \"ff21e8aa-a39e-4355-a251-a86106a089c7\") " pod="openstack/glance-default-external-api-0" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.757603 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgf9l\" (UniqueName: \"kubernetes.io/projected/ff21e8aa-a39e-4355-a251-a86106a089c7-kube-api-access-pgf9l\") pod \"glance-default-external-api-0\" (UID: \"ff21e8aa-a39e-4355-a251-a86106a089c7\") " pod="openstack/glance-default-external-api-0" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.757776 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-92420d06-c977-4be0-b4f7-e2f75390c9aa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-92420d06-c977-4be0-b4f7-e2f75390c9aa\") pod \"glance-default-external-api-0\" (UID: \"ff21e8aa-a39e-4355-a251-a86106a089c7\") " pod="openstack/glance-default-external-api-0" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.759230 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff21e8aa-a39e-4355-a251-a86106a089c7-logs\") pod \"glance-default-external-api-0\" (UID: \"ff21e8aa-a39e-4355-a251-a86106a089c7\") " pod="openstack/glance-default-external-api-0" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.768189 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ff21e8aa-a39e-4355-a251-a86106a089c7-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ff21e8aa-a39e-4355-a251-a86106a089c7\") " pod="openstack/glance-default-external-api-0" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.769470 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.769718 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-92420d06-c977-4be0-b4f7-e2f75390c9aa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-92420d06-c977-4be0-b4f7-e2f75390c9aa\") pod \"glance-default-external-api-0\" (UID: \"ff21e8aa-a39e-4355-a251-a86106a089c7\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/4bd5fd036fa183ff3c7bd061e321acfa035788f5e30cbd29138724604e749ce5/globalmount\"" pod="openstack/glance-default-external-api-0"
Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.770255 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24650aaa-2d24-4e59-9f9a-40b929e25c10-kube-api-access-ll9zp" (OuterVolumeSpecName: "kube-api-access-ll9zp") pod "24650aaa-2d24-4e59-9f9a-40b929e25c10" (UID: "24650aaa-2d24-4e59-9f9a-40b929e25c10"). InnerVolumeSpecName "kube-api-access-ll9zp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.773902 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff21e8aa-a39e-4355-a251-a86106a089c7-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ff21e8aa-a39e-4355-a251-a86106a089c7\") " pod="openstack/glance-default-external-api-0"
Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.775100 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff21e8aa-a39e-4355-a251-a86106a089c7-config-data\") pod \"glance-default-external-api-0\" (UID: \"ff21e8aa-a39e-4355-a251-a86106a089c7\") " pod="openstack/glance-default-external-api-0"
Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.785860 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff21e8aa-a39e-4355-a251-a86106a089c7-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ff21e8aa-a39e-4355-a251-a86106a089c7\") " pod="openstack/glance-default-external-api-0"
Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.786893 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgf9l\" (UniqueName: \"kubernetes.io/projected/ff21e8aa-a39e-4355-a251-a86106a089c7-kube-api-access-pgf9l\") pod \"glance-default-external-api-0\" (UID: \"ff21e8aa-a39e-4355-a251-a86106a089c7\") " pod="openstack/glance-default-external-api-0"
Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.796194 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24650aaa-2d24-4e59-9f9a-40b929e25c10-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "24650aaa-2d24-4e59-9f9a-40b929e25c10" (UID: "24650aaa-2d24-4e59-9f9a-40b929e25c10"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.804766 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff21e8aa-a39e-4355-a251-a86106a089c7-scripts\") pod \"glance-default-external-api-0\" (UID: \"ff21e8aa-a39e-4355-a251-a86106a089c7\") " pod="openstack/glance-default-external-api-0" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.822250 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-b4fe-account-create-update-jrh5p"] Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.861085 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ll9zp\" (UniqueName: \"kubernetes.io/projected/24650aaa-2d24-4e59-9f9a-40b929e25c10-kube-api-access-ll9zp\") on node \"crc\" DevicePath \"\"" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.861106 4722 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/24650aaa-2d24-4e59-9f9a-40b929e25c10-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.882284 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-59bfcd6f-16ca-4d2a-8185-976337e0ec2c\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-59bfcd6f-16ca-4d2a-8185-976337e0ec2c\") pod \"glance-default-internal-api-0\" (UID: \"3cebe5cd-3f7d-4c97-beb5-5702d5ad9aae\") " pod="openstack/glance-default-internal-api-0" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.974087 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24650aaa-2d24-4e59-9f9a-40b929e25c10-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "24650aaa-2d24-4e59-9f9a-40b929e25c10" (UID: "24650aaa-2d24-4e59-9f9a-40b929e25c10"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:27:15 crc kubenswrapper[4722]: I0309 14:27:15.984154 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 14:27:16 crc kubenswrapper[4722]: I0309 14:27:16.012103 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-92420d06-c977-4be0-b4f7-e2f75390c9aa\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-92420d06-c977-4be0-b4f7-e2f75390c9aa\") pod \"glance-default-external-api-0\" (UID: \"ff21e8aa-a39e-4355-a251-a86106a089c7\") " pod="openstack/glance-default-external-api-0" Mar 09 14:27:16 crc kubenswrapper[4722]: I0309 14:27:16.016453 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24650aaa-2d24-4e59-9f9a-40b929e25c10-config" (OuterVolumeSpecName: "config") pod "24650aaa-2d24-4e59-9f9a-40b929e25c10" (UID: "24650aaa-2d24-4e59-9f9a-40b929e25c10"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:27:16 crc kubenswrapper[4722]: I0309 14:27:16.019772 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 09 14:27:16 crc kubenswrapper[4722]: I0309 14:27:16.023288 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-5c766df7b4-znr5j"] Mar 09 14:27:16 crc kubenswrapper[4722]: I0309 14:27:16.052741 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-5b85b5677d-6lmg9"] Mar 09 14:27:16 crc kubenswrapper[4722]: I0309 14:27:16.065793 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-25f0-account-create-update-cmbgw"] Mar 09 14:27:16 crc kubenswrapper[4722]: I0309 14:27:16.076319 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/24650aaa-2d24-4e59-9f9a-40b929e25c10-config\") on node \"crc\" DevicePath \"\"" Mar 09 14:27:16 crc kubenswrapper[4722]: I0309 14:27:16.076354 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24650aaa-2d24-4e59-9f9a-40b929e25c10-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:27:16 crc kubenswrapper[4722]: I0309 14:27:16.077257 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-9wrb2"] Mar 09 14:27:16 crc kubenswrapper[4722]: I0309 14:27:16.099399 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-678fb995f7-q5bfj"] Mar 09 14:27:16 crc kubenswrapper[4722]: I0309 14:27:16.111278 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24650aaa-2d24-4e59-9f9a-40b929e25c10-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "24650aaa-2d24-4e59-9f9a-40b929e25c10" (UID: "24650aaa-2d24-4e59-9f9a-40b929e25c10"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:27:16 crc kubenswrapper[4722]: I0309 14:27:16.121311 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-wtg6v"] Mar 09 14:27:16 crc kubenswrapper[4722]: I0309 14:27:16.133245 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 09 14:27:16 crc kubenswrapper[4722]: I0309 14:27:16.137625 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-489a-account-create-update-z4jfp"] Mar 09 14:27:16 crc kubenswrapper[4722]: I0309 14:27:16.173779 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5aaf2eb8-65eb-4404-93df-c16fe6796329" path="/var/lib/kubelet/pods/5aaf2eb8-65eb-4404-93df-c16fe6796329/volumes" Mar 09 14:27:16 crc kubenswrapper[4722]: I0309 14:27:16.183917 4722 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/24650aaa-2d24-4e59-9f9a-40b929e25c10-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 14:27:16 crc kubenswrapper[4722]: I0309 14:27:16.185466 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="905e426d-8ef1-442e-abb2-69905e8fc61a" path="/var/lib/kubelet/pods/905e426d-8ef1-442e-abb2-69905e8fc61a/volumes" Mar 09 14:27:16 crc kubenswrapper[4722]: I0309 14:27:16.186674 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4" path="/var/lib/kubelet/pods/f016b9ee-ebbb-40e4-b6c9-eed84cb2d1f4/volumes" Mar 09 14:27:16 crc kubenswrapper[4722]: I0309 14:27:16.187923 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-5f6f45fb9b-nw8cq"] Mar 09 14:27:16 crc kubenswrapper[4722]: I0309 14:27:16.187960 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-6957fcb6b8-8jmt4"] Mar 09 14:27:16 crc kubenswrapper[4722]: I0309 14:27:16.230604 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5f6f45fb9b-nw8cq" event={"ID":"6fe8dbea-cdd7-4bda-b5f2-717a0a30d116","Type":"ContainerStarted","Data":"5daeb5f8197986c6582be7f05adcad2f1dab3daf247fea598269aafd2929b229"} Mar 09 14:27:16 crc kubenswrapper[4722]: I0309 14:27:16.257016 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-678fb995f7-q5bfj" event={"ID":"dfb53424-4444-464a-9be1-97e2e095c496","Type":"ContainerStarted","Data":"8d548a7591e05f2c3428e1d40835e5da777fad9fe54741b13665685b21446a8c"} Mar 09 14:27:16 crc kubenswrapper[4722]: I0309 14:27:16.259409 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5c766df7b4-znr5j" event={"ID":"9213500f-da49-496b-b99e-1ec95658b48f","Type":"ContainerStarted","Data":"08b8f36076a9b9afbf23360ea871dec1d014a02f3bbc37147b10da79b806d583"} Mar 09 14:27:16 crc kubenswrapper[4722]: I0309 14:27:16.264506 4722 generic.go:334] "Generic (PLEG): container finished" podID="c08dfba1-d105-4384-b906-5772148696e4" containerID="0e8ee638f5a8a5f6208d4a02f57136748175cc08c6f2160681a9ac8581a97ad0" exitCode=0 Mar 09 14:27:16 crc kubenswrapper[4722]: I0309 14:27:16.264602 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5ffdf96bcf-klbzp" event={"ID":"c08dfba1-d105-4384-b906-5772148696e4","Type":"ContainerDied","Data":"0e8ee638f5a8a5f6208d4a02f57136748175cc08c6f2160681a9ac8581a97ad0"} Mar 09 14:27:16 crc kubenswrapper[4722]: I0309 14:27:16.266659 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-48rrj" event={"ID":"22868fd9-09e2-4ad6-b923-ab373da94453","Type":"ContainerStarted","Data":"46d0583863795efc8b4bde0c8e7f46ed1a647412be8ec70fa02fd0874193f011"} Mar 09 14:27:16 crc kubenswrapper[4722]: I0309 14:27:16.269197 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-25f0-account-create-update-cmbgw" event={"ID":"87133d68-d972-4b38-a6b9-f88733004c17","Type":"ContainerStarted","Data":"e0e96a5f64201a329466de9635c0d207caaec6290febc2d40046fcec75b03da0"} Mar 09 14:27:16 crc kubenswrapper[4722]: I0309 14:27:16.296189 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2a7b55ea-3f51-4dfc-b861-23be50541a1c","Type":"ContainerStarted","Data":"26a7a09f41a26a7f4d7ad2a920f5b0b4ca0d1fd3e99c5c7e77790a0632a2a390"} Mar 09 14:27:16 crc kubenswrapper[4722]: I0309 14:27:16.297582 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-9wrb2" event={"ID":"ff53300e-f89f-4204-82ce-5100fa8b10be","Type":"ContainerStarted","Data":"946fcf04258a7b63ccbdbe729e0f3b42adaa948b01222ea6f50547a20d857381"} Mar 09 14:27:16 crc kubenswrapper[4722]: I0309 14:27:16.298744 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6957fcb6b8-8jmt4" event={"ID":"9cd4ad6c-298e-47c9-8648-3cfa1f407aad","Type":"ContainerStarted","Data":"14e1e2adbf020bfb97669fe703440cebd4aef6774ae12c997f83538f125d3769"} Mar 09 14:27:16 crc kubenswrapper[4722]: I0309 14:27:16.300195 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-b4fe-account-create-update-jrh5p" event={"ID":"9ce279d8-a769-4e43-89f9-18c598b6f207","Type":"ContainerStarted","Data":"5bf4cf6890d6c726aef01bd610213f60e0e5e5aec58a0b153c3ce3185d41d131"} Mar 09 14:27:16 crc kubenswrapper[4722]: I0309 14:27:16.301082 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-58665ff9-7zt7s" event={"ID":"37f48e3f-e425-4806-8fb5-27724da7ad0d","Type":"ContainerDied","Data":"8624c0c4345c2696fd5321fd486df640d0f505a836b998620c0888808603aefd"} Mar 09 14:27:16 crc kubenswrapper[4722]: I0309 14:27:16.301312 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-58665ff9-7zt7s" Mar 09 14:27:16 crc kubenswrapper[4722]: I0309 14:27:16.302303 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5b85b5677d-6lmg9" event={"ID":"d6c4171f-11a7-47bf-900b-55fa05f03f49","Type":"ContainerStarted","Data":"00addcf75d6f6f309792bc2c6ed5476100062f632c8630932d6872f511bc9ef8"} Mar 09 14:27:16 crc kubenswrapper[4722]: I0309 14:27:16.304820 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f5cbf5964-pmtn7" event={"ID":"24650aaa-2d24-4e59-9f9a-40b929e25c10","Type":"ContainerDied","Data":"cf70bb37a7aacb5ae3ddda8d9165846a1ed111f9dfc97a3211de8ffb1087dbba"} Mar 09 14:27:16 crc kubenswrapper[4722]: I0309 14:27:16.304854 4722 scope.go:117] "RemoveContainer" containerID="89975d701fd58751699513e7e1eafbc6124794762a9b7f960bfecb6450d873bd" Mar 09 14:27:16 crc kubenswrapper[4722]: I0309 14:27:16.304981 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6f5cbf5964-pmtn7" Mar 09 14:27:16 crc kubenswrapper[4722]: I0309 14:27:16.308152 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-489a-account-create-update-z4jfp" event={"ID":"3bebeafd-3acc-450b-85b1-145cb598ac05","Type":"ContainerStarted","Data":"402c24238cf6a326ef2444b9ac441b640bcf9d1255259b6bdc24b09484f1d287"} Mar 09 14:27:16 crc kubenswrapper[4722]: I0309 14:27:16.309415 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-wtg6v" event={"ID":"8bcd1ab8-574f-4e9e-8b60-058524c8be9f","Type":"ContainerStarted","Data":"0af933b48793bf133675e5cd79c1c80fbdee995e1e813721d587fe83c6c41745"} Mar 09 14:27:16 crc kubenswrapper[4722]: I0309 14:27:16.347761 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6f5cbf5964-pmtn7"] Mar 09 14:27:16 crc kubenswrapper[4722]: I0309 14:27:16.363085 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5ffdf96bcf-klbzp" Mar 09 14:27:16 crc kubenswrapper[4722]: I0309 14:27:16.376109 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-6f5cbf5964-pmtn7"] Mar 09 14:27:16 crc kubenswrapper[4722]: I0309 14:27:16.395062 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c08dfba1-d105-4384-b906-5772148696e4-combined-ca-bundle\") pod \"c08dfba1-d105-4384-b906-5772148696e4\" (UID: \"c08dfba1-d105-4384-b906-5772148696e4\") " Mar 09 14:27:16 crc kubenswrapper[4722]: I0309 14:27:16.395164 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqcsp\" (UniqueName: \"kubernetes.io/projected/c08dfba1-d105-4384-b906-5772148696e4-kube-api-access-nqcsp\") pod \"c08dfba1-d105-4384-b906-5772148696e4\" (UID: \"c08dfba1-d105-4384-b906-5772148696e4\") " Mar 09 14:27:16 crc kubenswrapper[4722]: I0309 14:27:16.395234 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c08dfba1-d105-4384-b906-5772148696e4-config-data-custom\") pod \"c08dfba1-d105-4384-b906-5772148696e4\" (UID: \"c08dfba1-d105-4384-b906-5772148696e4\") " Mar 09 14:27:16 crc kubenswrapper[4722]: I0309 14:27:16.395295 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c08dfba1-d105-4384-b906-5772148696e4-config-data\") pod \"c08dfba1-d105-4384-b906-5772148696e4\" (UID: \"c08dfba1-d105-4384-b906-5772148696e4\") " Mar 09 14:27:16 crc kubenswrapper[4722]: I0309 14:27:16.405398 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-58665ff9-7zt7s"] Mar 09 14:27:16 crc kubenswrapper[4722]: I0309 14:27:16.408554 4722 scope.go:117] "RemoveContainer" containerID="5415c2e2f0d9b8528559acb465bd00b9d5e4a5f98d5315538979c87b19b791c3" Mar 09 14:27:16 crc kubenswrapper[4722]: I0309 14:27:16.420663 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-58665ff9-7zt7s"] Mar 09 14:27:16 crc kubenswrapper[4722]: I0309 14:27:16.433998 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c08dfba1-d105-4384-b906-5772148696e4-kube-api-access-nqcsp" (OuterVolumeSpecName: "kube-api-access-nqcsp") pod "c08dfba1-d105-4384-b906-5772148696e4" (UID: "c08dfba1-d105-4384-b906-5772148696e4"). 
InnerVolumeSpecName "kube-api-access-nqcsp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:27:16 crc kubenswrapper[4722]: I0309 14:27:16.462609 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c08dfba1-d105-4384-b906-5772148696e4-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c08dfba1-d105-4384-b906-5772148696e4" (UID: "c08dfba1-d105-4384-b906-5772148696e4"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:27:16 crc kubenswrapper[4722]: I0309 14:27:16.498231 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqcsp\" (UniqueName: \"kubernetes.io/projected/c08dfba1-d105-4384-b906-5772148696e4-kube-api-access-nqcsp\") on node \"crc\" DevicePath \"\"" Mar 09 14:27:16 crc kubenswrapper[4722]: I0309 14:27:16.498255 4722 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c08dfba1-d105-4384-b906-5772148696e4-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 09 14:27:17 crc kubenswrapper[4722]: I0309 14:27:17.063991 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 09 14:27:17 crc kubenswrapper[4722]: I0309 14:27:17.077368 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c08dfba1-d105-4384-b906-5772148696e4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c08dfba1-d105-4384-b906-5772148696e4" (UID: "c08dfba1-d105-4384-b906-5772148696e4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:27:17 crc kubenswrapper[4722]: W0309 14:27:17.114143 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3cebe5cd_3f7d_4c97_beb5_5702d5ad9aae.slice/crio-6b01dea062c313d81f474b5f198b70490fb062548de051f7b4c36cfb11c8f73e WatchSource:0}: Error finding container 6b01dea062c313d81f474b5f198b70490fb062548de051f7b4c36cfb11c8f73e: Status 404 returned error can't find the container with id 6b01dea062c313d81f474b5f198b70490fb062548de051f7b4c36cfb11c8f73e Mar 09 14:27:17 crc kubenswrapper[4722]: I0309 14:27:17.148478 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c08dfba1-d105-4384-b906-5772148696e4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:27:17 crc kubenswrapper[4722]: I0309 14:27:17.157424 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c08dfba1-d105-4384-b906-5772148696e4-config-data" (OuterVolumeSpecName: "config-data") pod "c08dfba1-d105-4384-b906-5772148696e4" (UID: "c08dfba1-d105-4384-b906-5772148696e4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:27:17 crc kubenswrapper[4722]: I0309 14:27:17.241347 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 09 14:27:17 crc kubenswrapper[4722]: I0309 14:27:17.251907 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c08dfba1-d105-4384-b906-5772148696e4-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 14:27:17 crc kubenswrapper[4722]: W0309 14:27:17.279115 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff21e8aa_a39e_4355_a251_a86106a089c7.slice/crio-ba8331a520ffcd69e6766224ef34e1e2f664139a285ae3561a182e6bd08ae0ae WatchSource:0}: Error finding container ba8331a520ffcd69e6766224ef34e1e2f664139a285ae3561a182e6bd08ae0ae: Status 404 returned error can't find the container with id ba8331a520ffcd69e6766224ef34e1e2f664139a285ae3561a182e6bd08ae0ae Mar 09 14:27:17 crc kubenswrapper[4722]: I0309 14:27:17.324546 4722 generic.go:334] "Generic (PLEG): container finished" podID="9ce279d8-a769-4e43-89f9-18c598b6f207" containerID="264abab60d8e40cc9ad26af630c68c524c74d4168716437d47ead93f83091391" exitCode=0 Mar 09 14:27:17 crc kubenswrapper[4722]: I0309 14:27:17.324650 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-b4fe-account-create-update-jrh5p" event={"ID":"9ce279d8-a769-4e43-89f9-18c598b6f207","Type":"ContainerDied","Data":"264abab60d8e40cc9ad26af630c68c524c74d4168716437d47ead93f83091391"} Mar 09 14:27:17 crc kubenswrapper[4722]: I0309 14:27:17.327103 4722 generic.go:334] "Generic (PLEG): container finished" podID="22868fd9-09e2-4ad6-b923-ab373da94453" containerID="e70b1eca4a308b4ae963e0695763ed8e18e59f63b05e049fc322abf1067f5f46" exitCode=0 Mar 09 14:27:17 crc kubenswrapper[4722]: I0309 14:27:17.327148 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-48rrj" event={"ID":"22868fd9-09e2-4ad6-b923-ab373da94453","Type":"ContainerDied","Data":"e70b1eca4a308b4ae963e0695763ed8e18e59f63b05e049fc322abf1067f5f46"} Mar 09 14:27:17 crc kubenswrapper[4722]: I0309 14:27:17.341489 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-wtg6v" event={"ID":"8bcd1ab8-574f-4e9e-8b60-058524c8be9f","Type":"ContainerStarted","Data":"fdcbebcda0784849d010d9f351b3cab1c65a75030b21be0704a612a41d13ddc4"} Mar 09 14:27:17 crc kubenswrapper[4722]: I0309 14:27:17.347599 4722 generic.go:334] "Generic (PLEG): container finished" podID="87133d68-d972-4b38-a6b9-f88733004c17" containerID="be95b2bc7740c453215243f49acbc33a603da69895f058d18be9c4f1ce1acf40" exitCode=0 Mar 09 14:27:17 crc kubenswrapper[4722]: I0309 14:27:17.347666 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-25f0-account-create-update-cmbgw" event={"ID":"87133d68-d972-4b38-a6b9-f88733004c17","Type":"ContainerDied","Data":"be95b2bc7740c453215243f49acbc33a603da69895f058d18be9c4f1ce1acf40"} Mar 09 14:27:17 crc kubenswrapper[4722]: I0309 14:27:17.357411 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-9wrb2" event={"ID":"ff53300e-f89f-4204-82ce-5100fa8b10be","Type":"ContainerStarted","Data":"92e1b081619a85f67e51a0724fade902503af8f7f9809e6c3d63fba6df4a4861"} Mar 09 14:27:17 crc kubenswrapper[4722]: I0309 14:27:17.360095 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"3cebe5cd-3f7d-4c97-beb5-5702d5ad9aae","Type":"ContainerStarted","Data":"6b01dea062c313d81f474b5f198b70490fb062548de051f7b4c36cfb11c8f73e"} Mar 09 14:27:17 crc kubenswrapper[4722]: I0309 14:27:17.362003 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ff21e8aa-a39e-4355-a251-a86106a089c7","Type":"ContainerStarted","Data":"ba8331a520ffcd69e6766224ef34e1e2f664139a285ae3561a182e6bd08ae0ae"} Mar 09 14:27:17 crc kubenswrapper[4722]: I0309 14:27:17.374661 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5ffdf96bcf-klbzp" event={"ID":"c08dfba1-d105-4384-b906-5772148696e4","Type":"ContainerDied","Data":"48703b92787f9698e4a7e81da6bfa90aaf203ac85cd822625bf29c90ea7ead58"} Mar 09 14:27:17 crc kubenswrapper[4722]: I0309 14:27:17.374717 4722 scope.go:117] "RemoveContainer" containerID="0e8ee638f5a8a5f6208d4a02f57136748175cc08c6f2160681a9ac8581a97ad0" Mar 09 14:27:17 crc kubenswrapper[4722]: I0309 14:27:17.374844 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5ffdf96bcf-klbzp" Mar 09 14:27:17 crc kubenswrapper[4722]: I0309 14:27:17.433775 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-9wrb2" podStartSLOduration=12.433039322 podStartE2EDuration="12.433039322s" podCreationTimestamp="2026-03-09 14:27:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:27:17.402744275 +0000 UTC m=+1477.958312851" watchObservedRunningTime="2026-03-09 14:27:17.433039322 +0000 UTC m=+1477.988607898" Mar 09 14:27:17 crc kubenswrapper[4722]: I0309 14:27:17.733645 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-5ffdf96bcf-klbzp"] Mar 09 14:27:17 crc kubenswrapper[4722]: I0309 14:27:17.755501 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-5ffdf96bcf-klbzp"] Mar 09 14:27:18 crc kubenswrapper[4722]: I0309 14:27:18.177913 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24650aaa-2d24-4e59-9f9a-40b929e25c10" path="/var/lib/kubelet/pods/24650aaa-2d24-4e59-9f9a-40b929e25c10/volumes" Mar 09 14:27:18 crc kubenswrapper[4722]: I0309 14:27:18.179229 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37f48e3f-e425-4806-8fb5-27724da7ad0d" path="/var/lib/kubelet/pods/37f48e3f-e425-4806-8fb5-27724da7ad0d/volumes" Mar 09 14:27:18 crc kubenswrapper[4722]: I0309 14:27:18.181774 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c08dfba1-d105-4384-b906-5772148696e4" path="/var/lib/kubelet/pods/c08dfba1-d105-4384-b906-5772148696e4/volumes" Mar 09 14:27:18 crc kubenswrapper[4722]: I0309 14:27:18.424001 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6957fcb6b8-8jmt4" event={"ID":"9cd4ad6c-298e-47c9-8648-3cfa1f407aad","Type":"ContainerStarted","Data":"7a6d6410a31756df86ae1256bbe1b8693b637feb0999a0f93b3883be294fab8b"} Mar 09 14:27:18 crc kubenswrapper[4722]: I0309 14:27:18.424537 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-6957fcb6b8-8jmt4" Mar 09 14:27:18 crc kubenswrapper[4722]: I0309 14:27:18.463661 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5c766df7b4-znr5j" 
event={"ID":"9213500f-da49-496b-b99e-1ec95658b48f","Type":"ContainerStarted","Data":"16ca1376d5e10d5ddff1520fbf35e02673d95dd3732d67bdc68c20e9647c98aa"} Mar 09 14:27:18 crc kubenswrapper[4722]: I0309 14:27:18.464923 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-5c766df7b4-znr5j" Mar 09 14:27:18 crc kubenswrapper[4722]: I0309 14:27:18.471227 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-489a-account-create-update-z4jfp" event={"ID":"3bebeafd-3acc-450b-85b1-145cb598ac05","Type":"ContainerStarted","Data":"bc4d4d8cb0425cb7086646156e49b032cf304d0fd45000968132c6a706f3fc6e"} Mar 09 14:27:18 crc kubenswrapper[4722]: I0309 14:27:18.488061 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-678fb995f7-q5bfj" event={"ID":"dfb53424-4444-464a-9be1-97e2e095c496","Type":"ContainerStarted","Data":"d13db23d16ba06dabebfe79eacf52b9d98971ac551281b501b27011139739a15"} Mar 09 14:27:18 crc kubenswrapper[4722]: I0309 14:27:18.488101 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-678fb995f7-q5bfj" Mar 09 14:27:18 crc kubenswrapper[4722]: I0309 14:27:18.507653 4722 generic.go:334] "Generic (PLEG): container finished" podID="8bcd1ab8-574f-4e9e-8b60-058524c8be9f" containerID="fdcbebcda0784849d010d9f351b3cab1c65a75030b21be0704a612a41d13ddc4" exitCode=0 Mar 09 14:27:18 crc kubenswrapper[4722]: I0309 14:27:18.507735 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-wtg6v" event={"ID":"8bcd1ab8-574f-4e9e-8b60-058524c8be9f","Type":"ContainerDied","Data":"fdcbebcda0784849d010d9f351b3cab1c65a75030b21be0704a612a41d13ddc4"} Mar 09 14:27:18 crc kubenswrapper[4722]: I0309 14:27:18.508639 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-5c766df7b4-znr5j" podStartSLOduration=9.554693127 podStartE2EDuration="10.508619491s" podCreationTimestamp="2026-03-09 14:27:08 +0000 UTC" firstStartedPulling="2026-03-09 14:27:15.725853763 +0000 UTC m=+1476.281422329" lastFinishedPulling="2026-03-09 14:27:16.679780117 +0000 UTC m=+1477.235348693" observedRunningTime="2026-03-09 14:27:18.492183037 +0000 UTC m=+1479.047751613" watchObservedRunningTime="2026-03-09 14:27:18.508619491 +0000 UTC m=+1479.064188067" Mar 09 14:27:18 crc kubenswrapper[4722]: I0309 14:27:18.514408 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-6957fcb6b8-8jmt4" podStartSLOduration=10.51436839 podStartE2EDuration="10.51436839s" podCreationTimestamp="2026-03-09 14:27:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:27:18.451465281 +0000 UTC m=+1479.007033857" watchObservedRunningTime="2026-03-09 14:27:18.51436839 +0000 UTC m=+1479.069936966" Mar 09 14:27:18 crc kubenswrapper[4722]: I0309 14:27:18.525606 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5b85b5677d-6lmg9" event={"ID":"d6c4171f-11a7-47bf-900b-55fa05f03f49","Type":"ContainerStarted","Data":"682b653dc560eb158389415e00b2b01b8e427fd763202d90d5116f1cb1eb6a9d"} Mar 09 14:27:18 crc kubenswrapper[4722]: I0309 14:27:18.525856 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-5b85b5677d-6lmg9" Mar 09 14:27:18 crc kubenswrapper[4722]: I0309 14:27:18.554240 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/heat-engine-678fb995f7-q5bfj" podStartSLOduration=12.545385029 podStartE2EDuration="12.545385029s" podCreationTimestamp="2026-03-09 14:27:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:27:18.543745683 +0000 UTC m=+1479.099314259" watchObservedRunningTime="2026-03-09 14:27:18.545385029 +0000 UTC m=+1479.100953605" Mar 09 14:27:18 crc kubenswrapper[4722]: I0309 14:27:18.563595 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"3cebe5cd-3f7d-4c97-beb5-5702d5ad9aae","Type":"ContainerStarted","Data":"a60e0282b7dae6cad05dabc133d203fa11fe532231b8108afa6b597d19f0c21e"} Mar 09 14:27:18 crc kubenswrapper[4722]: I0309 14:27:18.594134 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-5b85b5677d-6lmg9" podStartSLOduration=12.594072855 podStartE2EDuration="12.594072855s" podCreationTimestamp="2026-03-09 14:27:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:27:18.572444647 +0000 UTC m=+1479.128013223" watchObservedRunningTime="2026-03-09 14:27:18.594072855 +0000 UTC m=+1479.149641441" Mar 09 14:27:18 crc kubenswrapper[4722]: I0309 14:27:18.613471 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5f6f45fb9b-nw8cq" event={"ID":"6fe8dbea-cdd7-4bda-b5f2-717a0a30d116","Type":"ContainerStarted","Data":"29fda5293f0284bf6d8a97f0d5183516753731ed6c8af3abb442653d2357a4c8"} Mar 09 14:27:18 crc kubenswrapper[4722]: I0309 14:27:18.614348 4722 scope.go:117] "RemoveContainer" containerID="29fda5293f0284bf6d8a97f0d5183516753731ed6c8af3abb442653d2357a4c8" Mar 09 14:27:18 crc kubenswrapper[4722]: I0309 14:27:18.639021 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2a7b55ea-3f51-4dfc-b861-23be50541a1c","Type":"ContainerStarted","Data":"48eff71955332d3b52375f0a991f10f3e4374f91b7e99deb991ff46c62536e31"} Mar 09 14:27:18 crc kubenswrapper[4722]: I0309 14:27:18.643421 4722 generic.go:334] "Generic (PLEG): container finished" podID="ff53300e-f89f-4204-82ce-5100fa8b10be" containerID="92e1b081619a85f67e51a0724fade902503af8f7f9809e6c3d63fba6df4a4861" exitCode=0 Mar 09 14:27:18 crc kubenswrapper[4722]: I0309 14:27:18.643995 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-9wrb2" event={"ID":"ff53300e-f89f-4204-82ce-5100fa8b10be","Type":"ContainerDied","Data":"92e1b081619a85f67e51a0724fade902503af8f7f9809e6c3d63fba6df4a4861"} Mar 09 14:27:18 crc kubenswrapper[4722]: I0309 14:27:18.648839 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ff21e8aa-a39e-4355-a251-a86106a089c7","Type":"ContainerStarted","Data":"0af822ca9d1fb74786f4662743dfa37646cc6f269ed169533a3afd67c3adf03b"} Mar 09 14:27:19 crc kubenswrapper[4722]: I0309 14:27:19.176072 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-wtg6v" Mar 09 14:27:19 crc kubenswrapper[4722]: I0309 14:27:19.222891 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhxl7\" (UniqueName: \"kubernetes.io/projected/8bcd1ab8-574f-4e9e-8b60-058524c8be9f-kube-api-access-hhxl7\") pod \"8bcd1ab8-574f-4e9e-8b60-058524c8be9f\" (UID: \"8bcd1ab8-574f-4e9e-8b60-058524c8be9f\") " Mar 09 14:27:19 crc kubenswrapper[4722]: I0309 14:27:19.223405 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8bcd1ab8-574f-4e9e-8b60-058524c8be9f-operator-scripts\") pod \"8bcd1ab8-574f-4e9e-8b60-058524c8be9f\" (UID: \"8bcd1ab8-574f-4e9e-8b60-058524c8be9f\") " Mar 09 14:27:19 crc kubenswrapper[4722]: I0309 14:27:19.224767 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bcd1ab8-574f-4e9e-8b60-058524c8be9f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8bcd1ab8-574f-4e9e-8b60-058524c8be9f" (UID: "8bcd1ab8-574f-4e9e-8b60-058524c8be9f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:27:19 crc kubenswrapper[4722]: I0309 14:27:19.234595 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bcd1ab8-574f-4e9e-8b60-058524c8be9f-kube-api-access-hhxl7" (OuterVolumeSpecName: "kube-api-access-hhxl7") pod "8bcd1ab8-574f-4e9e-8b60-058524c8be9f" (UID: "8bcd1ab8-574f-4e9e-8b60-058524c8be9f"). InnerVolumeSpecName "kube-api-access-hhxl7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:27:19 crc kubenswrapper[4722]: I0309 14:27:19.327001 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hhxl7\" (UniqueName: \"kubernetes.io/projected/8bcd1ab8-574f-4e9e-8b60-058524c8be9f-kube-api-access-hhxl7\") on node \"crc\" DevicePath \"\"" Mar 09 14:27:19 crc kubenswrapper[4722]: I0309 14:27:19.327044 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8bcd1ab8-574f-4e9e-8b60-058524c8be9f-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 14:27:19 crc kubenswrapper[4722]: I0309 14:27:19.358859 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-25f0-account-create-update-cmbgw" Mar 09 14:27:19 crc kubenswrapper[4722]: I0309 14:27:19.368979 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-b4fe-account-create-update-jrh5p" Mar 09 14:27:19 crc kubenswrapper[4722]: I0309 14:27:19.386305 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-48rrj" Mar 09 14:27:19 crc kubenswrapper[4722]: I0309 14:27:19.428827 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ql2qk\" (UniqueName: \"kubernetes.io/projected/87133d68-d972-4b38-a6b9-f88733004c17-kube-api-access-ql2qk\") pod \"87133d68-d972-4b38-a6b9-f88733004c17\" (UID: \"87133d68-d972-4b38-a6b9-f88733004c17\") " Mar 09 14:27:19 crc kubenswrapper[4722]: I0309 14:27:19.428912 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ce279d8-a769-4e43-89f9-18c598b6f207-operator-scripts\") pod \"9ce279d8-a769-4e43-89f9-18c598b6f207\" (UID: \"9ce279d8-a769-4e43-89f9-18c598b6f207\") " Mar 09 14:27:19 crc kubenswrapper[4722]: I0309 14:27:19.428942 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87133d68-d972-4b38-a6b9-f88733004c17-operator-scripts\") pod \"87133d68-d972-4b38-a6b9-f88733004c17\" (UID: \"87133d68-d972-4b38-a6b9-f88733004c17\") " Mar 09 14:27:19 crc kubenswrapper[4722]: I0309 14:27:19.429121 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqj85\" (UniqueName: \"kubernetes.io/projected/22868fd9-09e2-4ad6-b923-ab373da94453-kube-api-access-gqj85\") pod \"22868fd9-09e2-4ad6-b923-ab373da94453\" (UID: \"22868fd9-09e2-4ad6-b923-ab373da94453\") " Mar 09 14:27:19 crc kubenswrapper[4722]: I0309 14:27:19.429224 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22868fd9-09e2-4ad6-b923-ab373da94453-operator-scripts\") pod \"22868fd9-09e2-4ad6-b923-ab373da94453\" (UID: \"22868fd9-09e2-4ad6-b923-ab373da94453\") " Mar 09 14:27:19 crc kubenswrapper[4722]: I0309 14:27:19.429242 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8rp2\" (UniqueName: \"kubernetes.io/projected/9ce279d8-a769-4e43-89f9-18c598b6f207-kube-api-access-w8rp2\") pod \"9ce279d8-a769-4e43-89f9-18c598b6f207\" (UID: \"9ce279d8-a769-4e43-89f9-18c598b6f207\") " Mar 09 14:27:19 crc kubenswrapper[4722]: I0309 14:27:19.430769 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87133d68-d972-4b38-a6b9-f88733004c17-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "87133d68-d972-4b38-a6b9-f88733004c17" (UID: "87133d68-d972-4b38-a6b9-f88733004c17"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:27:19 crc kubenswrapper[4722]: I0309 14:27:19.438525 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22868fd9-09e2-4ad6-b923-ab373da94453-kube-api-access-gqj85" (OuterVolumeSpecName: "kube-api-access-gqj85") pod "22868fd9-09e2-4ad6-b923-ab373da94453" (UID: "22868fd9-09e2-4ad6-b923-ab373da94453"). InnerVolumeSpecName "kube-api-access-gqj85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:27:19 crc kubenswrapper[4722]: I0309 14:27:19.438777 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ce279d8-a769-4e43-89f9-18c598b6f207-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9ce279d8-a769-4e43-89f9-18c598b6f207" (UID: "9ce279d8-a769-4e43-89f9-18c598b6f207"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:27:19 crc kubenswrapper[4722]: I0309 14:27:19.439112 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87133d68-d972-4b38-a6b9-f88733004c17-kube-api-access-ql2qk" (OuterVolumeSpecName: "kube-api-access-ql2qk") pod "87133d68-d972-4b38-a6b9-f88733004c17" (UID: "87133d68-d972-4b38-a6b9-f88733004c17"). InnerVolumeSpecName "kube-api-access-ql2qk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:27:19 crc kubenswrapper[4722]: I0309 14:27:19.439661 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22868fd9-09e2-4ad6-b923-ab373da94453-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "22868fd9-09e2-4ad6-b923-ab373da94453" (UID: "22868fd9-09e2-4ad6-b923-ab373da94453"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:27:19 crc kubenswrapper[4722]: I0309 14:27:19.453476 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ce279d8-a769-4e43-89f9-18c598b6f207-kube-api-access-w8rp2" (OuterVolumeSpecName: "kube-api-access-w8rp2") pod "9ce279d8-a769-4e43-89f9-18c598b6f207" (UID: "9ce279d8-a769-4e43-89f9-18c598b6f207"). InnerVolumeSpecName "kube-api-access-w8rp2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:27:19 crc kubenswrapper[4722]: I0309 14:27:19.532158 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ce279d8-a769-4e43-89f9-18c598b6f207-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 14:27:19 crc kubenswrapper[4722]: I0309 14:27:19.532459 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/87133d68-d972-4b38-a6b9-f88733004c17-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 14:27:19 crc kubenswrapper[4722]: I0309 14:27:19.532474 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gqj85\" (UniqueName: \"kubernetes.io/projected/22868fd9-09e2-4ad6-b923-ab373da94453-kube-api-access-gqj85\") on node \"crc\" DevicePath \"\"" Mar 09 14:27:19 crc kubenswrapper[4722]: I0309 14:27:19.532488 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/22868fd9-09e2-4ad6-b923-ab373da94453-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 14:27:19 crc kubenswrapper[4722]: I0309 14:27:19.532501 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8rp2\" (UniqueName: \"kubernetes.io/projected/9ce279d8-a769-4e43-89f9-18c598b6f207-kube-api-access-w8rp2\") on node \"crc\" DevicePath \"\"" Mar 09 14:27:19 crc kubenswrapper[4722]: I0309 14:27:19.532512 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ql2qk\" (UniqueName: \"kubernetes.io/projected/87133d68-d972-4b38-a6b9-f88733004c17-kube-api-access-ql2qk\") on node \"crc\" DevicePath \"\"" Mar 09 14:27:19 crc kubenswrapper[4722]: I0309 14:27:19.573960 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-5884f594db-ls9vv" Mar 09 14:27:19 crc kubenswrapper[4722]: I0309 14:27:19.690662 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"3cebe5cd-3f7d-4c97-beb5-5702d5ad9aae","Type":"ContainerStarted","Data":"638b931c5801d87e344ae1df05d6efaebaa4f0d3b96feb8e35380a8c56fa2757"} Mar 09 14:27:19 crc kubenswrapper[4722]: I0309 14:27:19.706263 4722 generic.go:334] "Generic (PLEG): container finished" podID="d6c4171f-11a7-47bf-900b-55fa05f03f49" containerID="682b653dc560eb158389415e00b2b01b8e427fd763202d90d5116f1cb1eb6a9d" exitCode=1 Mar 09 14:27:19 crc kubenswrapper[4722]: I0309 14:27:19.706505 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5b85b5677d-6lmg9" event={"ID":"d6c4171f-11a7-47bf-900b-55fa05f03f49","Type":"ContainerDied","Data":"682b653dc560eb158389415e00b2b01b8e427fd763202d90d5116f1cb1eb6a9d"} Mar 09 14:27:19 crc kubenswrapper[4722]: I0309 14:27:19.707592 4722 scope.go:117] "RemoveContainer" containerID="682b653dc560eb158389415e00b2b01b8e427fd763202d90d5116f1cb1eb6a9d" Mar 09 14:27:19 crc kubenswrapper[4722]: I0309 14:27:19.710562 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-25f0-account-create-update-cmbgw" event={"ID":"87133d68-d972-4b38-a6b9-f88733004c17","Type":"ContainerDied","Data":"e0e96a5f64201a329466de9635c0d207caaec6290febc2d40046fcec75b03da0"} Mar 09 14:27:19 crc kubenswrapper[4722]: I0309 14:27:19.710601 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0e96a5f64201a329466de9635c0d207caaec6290febc2d40046fcec75b03da0" Mar 09 14:27:19 crc kubenswrapper[4722]: I0309 14:27:19.710673 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-25f0-account-create-update-cmbgw" Mar 09 14:27:19 crc kubenswrapper[4722]: I0309 14:27:19.712096 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-b4fe-account-create-update-jrh5p" event={"ID":"9ce279d8-a769-4e43-89f9-18c598b6f207","Type":"ContainerDied","Data":"5bf4cf6890d6c726aef01bd610213f60e0e5e5aec58a0b153c3ce3185d41d131"} Mar 09 14:27:19 crc kubenswrapper[4722]: I0309 14:27:19.712111 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5bf4cf6890d6c726aef01bd610213f60e0e5e5aec58a0b153c3ce3185d41d131" Mar 09 14:27:19 crc kubenswrapper[4722]: I0309 14:27:19.712153 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-b4fe-account-create-update-jrh5p" Mar 09 14:27:19 crc kubenswrapper[4722]: I0309 14:27:19.714351 4722 generic.go:334] "Generic (PLEG): container finished" podID="3bebeafd-3acc-450b-85b1-145cb598ac05" containerID="bc4d4d8cb0425cb7086646156e49b032cf304d0fd45000968132c6a706f3fc6e" exitCode=0 Mar 09 14:27:19 crc kubenswrapper[4722]: I0309 14:27:19.714400 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-489a-account-create-update-z4jfp" event={"ID":"3bebeafd-3acc-450b-85b1-145cb598ac05","Type":"ContainerDied","Data":"bc4d4d8cb0425cb7086646156e49b032cf304d0fd45000968132c6a706f3fc6e"} Mar 09 14:27:19 crc kubenswrapper[4722]: I0309 14:27:19.733833 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.733807889 podStartE2EDuration="4.733807889s" podCreationTimestamp="2026-03-09 14:27:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:27:19.723591907 +0000 UTC m=+1480.279160483" watchObservedRunningTime="2026-03-09 14:27:19.733807889 +0000 UTC m=+1480.289376475" Mar 09 14:27:19 crc kubenswrapper[4722]: I0309 14:27:19.776888 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-48rrj" event={"ID":"22868fd9-09e2-4ad6-b923-ab373da94453","Type":"ContainerDied","Data":"46d0583863795efc8b4bde0c8e7f46ed1a647412be8ec70fa02fd0874193f011"} Mar 09 14:27:19 crc kubenswrapper[4722]: I0309 14:27:19.777111 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46d0583863795efc8b4bde0c8e7f46ed1a647412be8ec70fa02fd0874193f011" Mar 09 14:27:19 crc kubenswrapper[4722]: I0309 14:27:19.777281 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-48rrj" Mar 09 14:27:19 crc kubenswrapper[4722]: I0309 14:27:19.787884 4722 generic.go:334] "Generic (PLEG): container finished" podID="6fe8dbea-cdd7-4bda-b5f2-717a0a30d116" containerID="29fda5293f0284bf6d8a97f0d5183516753731ed6c8af3abb442653d2357a4c8" exitCode=1 Mar 09 14:27:19 crc kubenswrapper[4722]: I0309 14:27:19.787916 4722 generic.go:334] "Generic (PLEG): container finished" podID="6fe8dbea-cdd7-4bda-b5f2-717a0a30d116" containerID="5af8a71261e43a7a76f52a673e0c3d2b29962c6dd7bd2166b79ae5beb28304af" exitCode=1 Mar 09 14:27:19 crc kubenswrapper[4722]: I0309 14:27:19.787954 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5f6f45fb9b-nw8cq" event={"ID":"6fe8dbea-cdd7-4bda-b5f2-717a0a30d116","Type":"ContainerDied","Data":"29fda5293f0284bf6d8a97f0d5183516753731ed6c8af3abb442653d2357a4c8"} Mar 09 14:27:19 crc kubenswrapper[4722]: I0309 14:27:19.787980 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5f6f45fb9b-nw8cq" event={"ID":"6fe8dbea-cdd7-4bda-b5f2-717a0a30d116","Type":"ContainerDied","Data":"5af8a71261e43a7a76f52a673e0c3d2b29962c6dd7bd2166b79ae5beb28304af"} Mar 09 14:27:19 crc kubenswrapper[4722]: I0309 14:27:19.787995 4722 scope.go:117] "RemoveContainer" containerID="29fda5293f0284bf6d8a97f0d5183516753731ed6c8af3abb442653d2357a4c8" Mar 09 14:27:19 crc kubenswrapper[4722]: I0309 14:27:19.788733 4722 scope.go:117] "RemoveContainer" containerID="5af8a71261e43a7a76f52a673e0c3d2b29962c6dd7bd2166b79ae5beb28304af" Mar 09 14:27:19 crc kubenswrapper[4722]: E0309 14:27:19.788956 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-5f6f45fb9b-nw8cq_openstack(6fe8dbea-cdd7-4bda-b5f2-717a0a30d116)\"" pod="openstack/heat-cfnapi-5f6f45fb9b-nw8cq" podUID="6fe8dbea-cdd7-4bda-b5f2-717a0a30d116" Mar 09 14:27:19 crc kubenswrapper[4722]: I0309 14:27:19.796514 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-wtg6v" event={"ID":"8bcd1ab8-574f-4e9e-8b60-058524c8be9f","Type":"ContainerDied","Data":"0af933b48793bf133675e5cd79c1c80fbdee995e1e813721d587fe83c6c41745"} Mar 09 14:27:19 crc kubenswrapper[4722]: I0309 14:27:19.796534 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0af933b48793bf133675e5cd79c1c80fbdee995e1e813721d587fe83c6c41745" Mar 09 14:27:19 crc kubenswrapper[4722]: I0309 14:27:19.796591 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-wtg6v" Mar 09 14:27:19 crc kubenswrapper[4722]: I0309 14:27:19.829340 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2a7b55ea-3f51-4dfc-b861-23be50541a1c","Type":"ContainerStarted","Data":"c24875009a9faa332cc97d55a86352afdf804d1f1652af3912b252e0ee1afaea"} Mar 09 14:27:19 crc kubenswrapper[4722]: I0309 14:27:19.840646 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ff21e8aa-a39e-4355-a251-a86106a089c7","Type":"ContainerStarted","Data":"cc374c40bec1b9f2cc9509e01fe0c5108151653c6cb0fb6d9d73d38a505f63ce"} Mar 09 14:27:19 crc kubenswrapper[4722]: I0309 14:27:19.867669 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.867646801 podStartE2EDuration="4.867646801s" podCreationTimestamp="2026-03-09 14:27:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:27:19.86108339 +0000 UTC m=+1480.416651966" watchObservedRunningTime="2026-03-09 14:27:19.867646801 +0000 UTC m=+1480.423215377" Mar 09 14:27:20 crc kubenswrapper[4722]: I0309 14:27:20.475452 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-489a-account-create-update-z4jfp" Mar 09 14:27:20 crc kubenswrapper[4722]: I0309 14:27:20.507637 4722 scope.go:117] "RemoveContainer" containerID="29fda5293f0284bf6d8a97f0d5183516753731ed6c8af3abb442653d2357a4c8" Mar 09 14:27:20 crc kubenswrapper[4722]: E0309 14:27:20.519901 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29fda5293f0284bf6d8a97f0d5183516753731ed6c8af3abb442653d2357a4c8\": container with ID starting with 29fda5293f0284bf6d8a97f0d5183516753731ed6c8af3abb442653d2357a4c8 not found: ID does not exist" containerID="29fda5293f0284bf6d8a97f0d5183516753731ed6c8af3abb442653d2357a4c8" Mar 09 14:27:20 crc kubenswrapper[4722]: I0309 14:27:20.520104 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29fda5293f0284bf6d8a97f0d5183516753731ed6c8af3abb442653d2357a4c8"} err="failed to get container status \"29fda5293f0284bf6d8a97f0d5183516753731ed6c8af3abb442653d2357a4c8\": rpc error: code = NotFound desc = could not find container \"29fda5293f0284bf6d8a97f0d5183516753731ed6c8af3abb442653d2357a4c8\": container with ID starting with 29fda5293f0284bf6d8a97f0d5183516753731ed6c8af3abb442653d2357a4c8 not found: ID does not exist" Mar 09 14:27:20 crc kubenswrapper[4722]: I0309 14:27:20.580423 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3bebeafd-3acc-450b-85b1-145cb598ac05-operator-scripts\") pod \"3bebeafd-3acc-450b-85b1-145cb598ac05\" (UID: \"3bebeafd-3acc-450b-85b1-145cb598ac05\") " Mar 09 14:27:20 crc kubenswrapper[4722]: I0309 14:27:20.580601 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vt7l\" (UniqueName: \"kubernetes.io/projected/3bebeafd-3acc-450b-85b1-145cb598ac05-kube-api-access-8vt7l\") pod \"3bebeafd-3acc-450b-85b1-145cb598ac05\" (UID: \"3bebeafd-3acc-450b-85b1-145cb598ac05\") " Mar 09 14:27:20 crc kubenswrapper[4722]: I0309 14:27:20.581141 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/configmap/3bebeafd-3acc-450b-85b1-145cb598ac05-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3bebeafd-3acc-450b-85b1-145cb598ac05" (UID: "3bebeafd-3acc-450b-85b1-145cb598ac05"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:27:20 crc kubenswrapper[4722]: I0309 14:27:20.581799 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3bebeafd-3acc-450b-85b1-145cb598ac05-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 14:27:20 crc kubenswrapper[4722]: I0309 14:27:20.584905 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bebeafd-3acc-450b-85b1-145cb598ac05-kube-api-access-8vt7l" (OuterVolumeSpecName: "kube-api-access-8vt7l") pod "3bebeafd-3acc-450b-85b1-145cb598ac05" (UID: "3bebeafd-3acc-450b-85b1-145cb598ac05"). InnerVolumeSpecName "kube-api-access-8vt7l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:27:20 crc kubenswrapper[4722]: I0309 14:27:20.648771 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-9wrb2" Mar 09 14:27:20 crc kubenswrapper[4722]: I0309 14:27:20.691563 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff53300e-f89f-4204-82ce-5100fa8b10be-operator-scripts\") pod \"ff53300e-f89f-4204-82ce-5100fa8b10be\" (UID: \"ff53300e-f89f-4204-82ce-5100fa8b10be\") " Mar 09 14:27:20 crc kubenswrapper[4722]: I0309 14:27:20.691648 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7fc6k\" (UniqueName: \"kubernetes.io/projected/ff53300e-f89f-4204-82ce-5100fa8b10be-kube-api-access-7fc6k\") pod \"ff53300e-f89f-4204-82ce-5100fa8b10be\" (UID: \"ff53300e-f89f-4204-82ce-5100fa8b10be\") " Mar 09 14:27:20 crc kubenswrapper[4722]: I0309 14:27:20.692225 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vt7l\" (UniqueName: \"kubernetes.io/projected/3bebeafd-3acc-450b-85b1-145cb598ac05-kube-api-access-8vt7l\") on node \"crc\" DevicePath \"\"" Mar 09 14:27:20 crc kubenswrapper[4722]: I0309 14:27:20.692547 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff53300e-f89f-4204-82ce-5100fa8b10be-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ff53300e-f89f-4204-82ce-5100fa8b10be" (UID: "ff53300e-f89f-4204-82ce-5100fa8b10be"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:27:20 crc kubenswrapper[4722]: I0309 14:27:20.695517 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff53300e-f89f-4204-82ce-5100fa8b10be-kube-api-access-7fc6k" (OuterVolumeSpecName: "kube-api-access-7fc6k") pod "ff53300e-f89f-4204-82ce-5100fa8b10be" (UID: "ff53300e-f89f-4204-82ce-5100fa8b10be"). InnerVolumeSpecName "kube-api-access-7fc6k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:27:20 crc kubenswrapper[4722]: I0309 14:27:20.796709 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ff53300e-f89f-4204-82ce-5100fa8b10be-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 14:27:20 crc kubenswrapper[4722]: I0309 14:27:20.796744 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7fc6k\" (UniqueName: \"kubernetes.io/projected/ff53300e-f89f-4204-82ce-5100fa8b10be-kube-api-access-7fc6k\") on node \"crc\" DevicePath \"\"" Mar 09 14:27:20 crc kubenswrapper[4722]: I0309 14:27:20.849573 4722 scope.go:117] "RemoveContainer" containerID="5af8a71261e43a7a76f52a673e0c3d2b29962c6dd7bd2166b79ae5beb28304af" Mar 09 14:27:20 crc kubenswrapper[4722]: E0309 14:27:20.849888 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-5f6f45fb9b-nw8cq_openstack(6fe8dbea-cdd7-4bda-b5f2-717a0a30d116)\"" pod="openstack/heat-cfnapi-5f6f45fb9b-nw8cq" podUID="6fe8dbea-cdd7-4bda-b5f2-717a0a30d116" Mar 09 14:27:20 crc kubenswrapper[4722]: I0309 14:27:20.852411 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5b85b5677d-6lmg9" event={"ID":"d6c4171f-11a7-47bf-900b-55fa05f03f49","Type":"ContainerStarted","Data":"73506ed0545c9032408fb9295746b8107ae47b119a54aee899ccb465f961a34c"} Mar 09 14:27:20 crc kubenswrapper[4722]: I0309 14:27:20.852558 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-5b85b5677d-6lmg9" Mar 09 14:27:20 crc kubenswrapper[4722]: I0309 14:27:20.854002 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-489a-account-create-update-z4jfp" event={"ID":"3bebeafd-3acc-450b-85b1-145cb598ac05","Type":"ContainerDied","Data":"402c24238cf6a326ef2444b9ac441b640bcf9d1255259b6bdc24b09484f1d287"} Mar 09 14:27:20 crc kubenswrapper[4722]: I0309 14:27:20.854042 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="402c24238cf6a326ef2444b9ac441b640bcf9d1255259b6bdc24b09484f1d287" Mar 09 14:27:20 crc kubenswrapper[4722]: I0309 14:27:20.854014 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-489a-account-create-update-z4jfp" Mar 09 14:27:20 crc kubenswrapper[4722]: I0309 14:27:20.855363 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-9wrb2" Mar 09 14:27:20 crc kubenswrapper[4722]: I0309 14:27:20.855403 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-9wrb2" event={"ID":"ff53300e-f89f-4204-82ce-5100fa8b10be","Type":"ContainerDied","Data":"946fcf04258a7b63ccbdbe729e0f3b42adaa948b01222ea6f50547a20d857381"} Mar 09 14:27:20 crc kubenswrapper[4722]: I0309 14:27:20.855440 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="946fcf04258a7b63ccbdbe729e0f3b42adaa948b01222ea6f50547a20d857381" Mar 09 14:27:21 crc kubenswrapper[4722]: I0309 14:27:21.866519 4722 generic.go:334] "Generic (PLEG): container finished" podID="d6c4171f-11a7-47bf-900b-55fa05f03f49" containerID="73506ed0545c9032408fb9295746b8107ae47b119a54aee899ccb465f961a34c" exitCode=1 Mar 09 14:27:21 crc kubenswrapper[4722]: I0309 14:27:21.868031 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5b85b5677d-6lmg9" event={"ID":"d6c4171f-11a7-47bf-900b-55fa05f03f49","Type":"ContainerDied","Data":"73506ed0545c9032408fb9295746b8107ae47b119a54aee899ccb465f961a34c"} Mar 09 14:27:21 crc kubenswrapper[4722]: I0309 14:27:21.868912 4722 scope.go:117] "RemoveContainer" containerID="73506ed0545c9032408fb9295746b8107ae47b119a54aee899ccb465f961a34c" Mar 09 14:27:21 crc kubenswrapper[4722]: I0309 14:27:21.869519 4722 scope.go:117] "RemoveContainer" containerID="682b653dc560eb158389415e00b2b01b8e427fd763202d90d5116f1cb1eb6a9d" Mar 09 14:27:21 crc kubenswrapper[4722]: E0309 14:27:21.870153 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-5b85b5677d-6lmg9_openstack(d6c4171f-11a7-47bf-900b-55fa05f03f49)\"" pod="openstack/heat-api-5b85b5677d-6lmg9" podUID="d6c4171f-11a7-47bf-900b-55fa05f03f49" Mar 09 14:27:21 crc kubenswrapper[4722]: I0309 14:27:21.871916 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2a7b55ea-3f51-4dfc-b861-23be50541a1c","Type":"ContainerStarted","Data":"4c2160840bc5dd1aa87e9cf70ed2a592e6050a1f29dadce21d04f9229d43ba63"} Mar 09 14:27:22 crc kubenswrapper[4722]: I0309 14:27:22.201289 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-api-5b85b5677d-6lmg9" Mar 09 14:27:22 crc kubenswrapper[4722]: I0309 14:27:22.213586 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-5f6f45fb9b-nw8cq" Mar 09 14:27:22 crc kubenswrapper[4722]: I0309 14:27:22.213634 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-cfnapi-5f6f45fb9b-nw8cq" Mar 09 14:27:22 crc kubenswrapper[4722]: I0309 14:27:22.214585 4722 scope.go:117] "RemoveContainer" containerID="5af8a71261e43a7a76f52a673e0c3d2b29962c6dd7bd2166b79ae5beb28304af" Mar 09 14:27:22 crc kubenswrapper[4722]: E0309 14:27:22.214899 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-5f6f45fb9b-nw8cq_openstack(6fe8dbea-cdd7-4bda-b5f2-717a0a30d116)\"" pod="openstack/heat-cfnapi-5f6f45fb9b-nw8cq" podUID="6fe8dbea-cdd7-4bda-b5f2-717a0a30d116" Mar 09 14:27:22 crc kubenswrapper[4722]: I0309 14:27:22.884134 4722 scope.go:117] "RemoveContainer" 
containerID="73506ed0545c9032408fb9295746b8107ae47b119a54aee899ccb465f961a34c" Mar 09 14:27:22 crc kubenswrapper[4722]: E0309 14:27:22.884773 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-5b85b5677d-6lmg9_openstack(d6c4171f-11a7-47bf-900b-55fa05f03f49)\"" pod="openstack/heat-api-5b85b5677d-6lmg9" podUID="d6c4171f-11a7-47bf-900b-55fa05f03f49" Mar 09 14:27:23 crc kubenswrapper[4722]: I0309 14:27:23.894007 4722 scope.go:117] "RemoveContainer" containerID="73506ed0545c9032408fb9295746b8107ae47b119a54aee899ccb465f961a34c" Mar 09 14:27:23 crc kubenswrapper[4722]: E0309 14:27:23.894396 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-5b85b5677d-6lmg9_openstack(d6c4171f-11a7-47bf-900b-55fa05f03f49)\"" pod="openstack/heat-api-5b85b5677d-6lmg9" podUID="d6c4171f-11a7-47bf-900b-55fa05f03f49" Mar 09 14:27:24 crc kubenswrapper[4722]: I0309 14:27:24.907274 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2a7b55ea-3f51-4dfc-b861-23be50541a1c","Type":"ContainerStarted","Data":"ed42696657e1eeb95d8b396c66e34829ba43892ddf5f42c7ce6aa4a57730253b"} Mar 09 14:27:24 crc kubenswrapper[4722]: I0309 14:27:24.908920 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 09 14:27:24 crc kubenswrapper[4722]: I0309 14:27:24.907478 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2a7b55ea-3f51-4dfc-b861-23be50541a1c" containerName="proxy-httpd" containerID="cri-o://ed42696657e1eeb95d8b396c66e34829ba43892ddf5f42c7ce6aa4a57730253b" gracePeriod=30 Mar 09 14:27:24 crc kubenswrapper[4722]: I0309 14:27:24.907489 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2a7b55ea-3f51-4dfc-b861-23be50541a1c" containerName="ceilometer-notification-agent" containerID="cri-o://c24875009a9faa332cc97d55a86352afdf804d1f1652af3912b252e0ee1afaea" gracePeriod=30 Mar 09 14:27:24 crc kubenswrapper[4722]: I0309 14:27:24.907522 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2a7b55ea-3f51-4dfc-b861-23be50541a1c" containerName="sg-core" containerID="cri-o://4c2160840bc5dd1aa87e9cf70ed2a592e6050a1f29dadce21d04f9229d43ba63" gracePeriod=30 Mar 09 14:27:24 crc kubenswrapper[4722]: I0309 14:27:24.907408 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2a7b55ea-3f51-4dfc-b861-23be50541a1c" containerName="ceilometer-central-agent" containerID="cri-o://48eff71955332d3b52375f0a991f10f3e4374f91b7e99deb991ff46c62536e31" gracePeriod=30 Mar 09 14:27:24 crc kubenswrapper[4722]: I0309 14:27:24.954156 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=14.591523056 podStartE2EDuration="22.95413579s" podCreationTimestamp="2026-03-09 14:27:02 +0000 UTC" firstStartedPulling="2026-03-09 14:27:15.768703987 +0000 UTC m=+1476.324272563" lastFinishedPulling="2026-03-09 14:27:24.131316721 +0000 UTC m=+1484.686885297" observedRunningTime="2026-03-09 14:27:24.942971491 +0000 UTC m=+1485.498540087" watchObservedRunningTime="2026-03-09 14:27:24.95413579 +0000 UTC m=+1485.509704366" 
Mar 09 14:27:25 crc kubenswrapper[4722]: I0309 14:27:25.955606 4722 generic.go:334] "Generic (PLEG): container finished" podID="2a7b55ea-3f51-4dfc-b861-23be50541a1c" containerID="ed42696657e1eeb95d8b396c66e34829ba43892ddf5f42c7ce6aa4a57730253b" exitCode=0 Mar 09 14:27:25 crc kubenswrapper[4722]: I0309 14:27:25.955877 4722 generic.go:334] "Generic (PLEG): container finished" podID="2a7b55ea-3f51-4dfc-b861-23be50541a1c" containerID="4c2160840bc5dd1aa87e9cf70ed2a592e6050a1f29dadce21d04f9229d43ba63" exitCode=2 Mar 09 14:27:25 crc kubenswrapper[4722]: I0309 14:27:25.955887 4722 generic.go:334] "Generic (PLEG): container finished" podID="2a7b55ea-3f51-4dfc-b861-23be50541a1c" containerID="c24875009a9faa332cc97d55a86352afdf804d1f1652af3912b252e0ee1afaea" exitCode=0 Mar 09 14:27:25 crc kubenswrapper[4722]: I0309 14:27:25.955690 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2a7b55ea-3f51-4dfc-b861-23be50541a1c","Type":"ContainerDied","Data":"ed42696657e1eeb95d8b396c66e34829ba43892ddf5f42c7ce6aa4a57730253b"} Mar 09 14:27:25 crc kubenswrapper[4722]: I0309 14:27:25.955922 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2a7b55ea-3f51-4dfc-b861-23be50541a1c","Type":"ContainerDied","Data":"4c2160840bc5dd1aa87e9cf70ed2a592e6050a1f29dadce21d04f9229d43ba63"} Mar 09 14:27:25 crc kubenswrapper[4722]: I0309 14:27:25.955936 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2a7b55ea-3f51-4dfc-b861-23be50541a1c","Type":"ContainerDied","Data":"c24875009a9faa332cc97d55a86352afdf804d1f1652af3912b252e0ee1afaea"} Mar 09 14:27:26 crc kubenswrapper[4722]: I0309 14:27:26.007309 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-5c766df7b4-znr5j" Mar 09 14:27:26 crc kubenswrapper[4722]: I0309 14:27:26.020428 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 09 14:27:26 crc kubenswrapper[4722]: I0309 14:27:26.025526 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 09 14:27:26 crc kubenswrapper[4722]: I0309 14:27:26.077060 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-5f6f45fb9b-nw8cq"] Mar 09 14:27:26 crc kubenswrapper[4722]: I0309 14:27:26.116587 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 09 14:27:26 crc kubenswrapper[4722]: I0309 14:27:26.117114 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 09 14:27:26 crc kubenswrapper[4722]: I0309 14:27:26.133349 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 09 14:27:26 crc kubenswrapper[4722]: I0309 14:27:26.134167 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 09 14:27:26 crc kubenswrapper[4722]: I0309 14:27:26.164341 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-6957fcb6b8-8jmt4" Mar 09 14:27:26 crc kubenswrapper[4722]: I0309 14:27:26.188643 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 09 14:27:26 crc kubenswrapper[4722]: I0309 14:27:26.188751 4722 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 09 14:27:26 crc kubenswrapper[4722]: I0309 14:27:26.253599 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-5b85b5677d-6lmg9"] Mar 09 14:27:30 crc kubenswrapper[4722]: I0309 14:27:26.582678 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-5f6f45fb9b-nw8cq" Mar 09 14:27:30 crc kubenswrapper[4722]: I0309 14:27:26.645612 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-dxrwp"] Mar 09 14:27:30 crc kubenswrapper[4722]: E0309 14:27:26.646125 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ce279d8-a769-4e43-89f9-18c598b6f207" containerName="mariadb-account-create-update" Mar 09 14:27:30 crc kubenswrapper[4722]: I0309 14:27:26.646139 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ce279d8-a769-4e43-89f9-18c598b6f207" containerName="mariadb-account-create-update" Mar 09 14:27:30 crc kubenswrapper[4722]: E0309 14:27:26.646165 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bebeafd-3acc-450b-85b1-145cb598ac05" containerName="mariadb-account-create-update" Mar 09 14:27:30 crc kubenswrapper[4722]: I0309 14:27:26.646172 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bebeafd-3acc-450b-85b1-145cb598ac05" containerName="mariadb-account-create-update" Mar 09 14:27:30 crc kubenswrapper[4722]: E0309 14:27:26.646182 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fe8dbea-cdd7-4bda-b5f2-717a0a30d116" containerName="heat-cfnapi" Mar 09 14:27:30 crc kubenswrapper[4722]: I0309 14:27:26.646187 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fe8dbea-cdd7-4bda-b5f2-717a0a30d116" containerName="heat-cfnapi" Mar 09 14:27:30 crc kubenswrapper[4722]: E0309 14:27:26.646196 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c08dfba1-d105-4384-b906-5772148696e4" containerName="heat-api" Mar 09 14:27:30 crc kubenswrapper[4722]: I0309 14:27:26.646221 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="c08dfba1-d105-4384-b906-5772148696e4" containerName="heat-api" Mar 09 14:27:30 crc kubenswrapper[4722]: E0309 14:27:26.646239 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff53300e-f89f-4204-82ce-5100fa8b10be" containerName="mariadb-database-create" Mar 09 14:27:30 crc kubenswrapper[4722]: I0309 14:27:26.646245 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff53300e-f89f-4204-82ce-5100fa8b10be" containerName="mariadb-database-create" Mar 09 14:27:30 crc kubenswrapper[4722]: E0309 14:27:26.646266 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bcd1ab8-574f-4e9e-8b60-058524c8be9f" containerName="mariadb-database-create" Mar 09 14:27:30 crc kubenswrapper[4722]: I0309 14:27:26.646273 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bcd1ab8-574f-4e9e-8b60-058524c8be9f" containerName="mariadb-database-create" Mar 09 14:27:30 crc kubenswrapper[4722]: E0309 14:27:26.646292 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87133d68-d972-4b38-a6b9-f88733004c17" containerName="mariadb-account-create-update" Mar 09 14:27:30 crc kubenswrapper[4722]: I0309 14:27:26.646298 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="87133d68-d972-4b38-a6b9-f88733004c17" containerName="mariadb-account-create-update" Mar 09 14:27:30 crc kubenswrapper[4722]: E0309 14:27:26.646308 
4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fe8dbea-cdd7-4bda-b5f2-717a0a30d116" containerName="heat-cfnapi" Mar 09 14:27:30 crc kubenswrapper[4722]: I0309 14:27:26.646314 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fe8dbea-cdd7-4bda-b5f2-717a0a30d116" containerName="heat-cfnapi" Mar 09 14:27:30 crc kubenswrapper[4722]: E0309 14:27:26.646330 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22868fd9-09e2-4ad6-b923-ab373da94453" containerName="mariadb-database-create" Mar 09 14:27:30 crc kubenswrapper[4722]: I0309 14:27:26.646336 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="22868fd9-09e2-4ad6-b923-ab373da94453" containerName="mariadb-database-create" Mar 09 14:27:30 crc kubenswrapper[4722]: I0309 14:27:26.646529 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bebeafd-3acc-450b-85b1-145cb598ac05" containerName="mariadb-account-create-update" Mar 09 14:27:30 crc kubenswrapper[4722]: I0309 14:27:26.646543 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="22868fd9-09e2-4ad6-b923-ab373da94453" containerName="mariadb-database-create" Mar 09 14:27:30 crc kubenswrapper[4722]: I0309 14:27:26.646554 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fe8dbea-cdd7-4bda-b5f2-717a0a30d116" containerName="heat-cfnapi" Mar 09 14:27:30 crc kubenswrapper[4722]: I0309 14:27:26.646563 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fe8dbea-cdd7-4bda-b5f2-717a0a30d116" containerName="heat-cfnapi" Mar 09 14:27:30 crc kubenswrapper[4722]: I0309 14:27:26.646577 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="87133d68-d972-4b38-a6b9-f88733004c17" containerName="mariadb-account-create-update" Mar 09 14:27:30 crc kubenswrapper[4722]: I0309 14:27:26.646593 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ce279d8-a769-4e43-89f9-18c598b6f207" containerName="mariadb-account-create-update" Mar 09 14:27:30 crc kubenswrapper[4722]: I0309 14:27:26.646604 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff53300e-f89f-4204-82ce-5100fa8b10be" containerName="mariadb-database-create" Mar 09 14:27:30 crc kubenswrapper[4722]: I0309 14:27:26.646613 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="c08dfba1-d105-4384-b906-5772148696e4" containerName="heat-api" Mar 09 14:27:30 crc kubenswrapper[4722]: I0309 14:27:26.646624 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bcd1ab8-574f-4e9e-8b60-058524c8be9f" containerName="mariadb-database-create" Mar 09 14:27:30 crc kubenswrapper[4722]: I0309 14:27:26.647380 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-dxrwp" Mar 09 14:27:30 crc kubenswrapper[4722]: I0309 14:27:26.654059 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-6n4zr" Mar 09 14:27:30 crc kubenswrapper[4722]: I0309 14:27:26.654256 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 09 14:27:30 crc kubenswrapper[4722]: I0309 14:27:26.655127 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 09 14:27:30 crc kubenswrapper[4722]: I0309 14:27:26.679591 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-dxrwp"] Mar 09 14:27:30 crc kubenswrapper[4722]: I0309 14:27:26.742272 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fe8dbea-cdd7-4bda-b5f2-717a0a30d116-config-data\") pod \"6fe8dbea-cdd7-4bda-b5f2-717a0a30d116\" (UID: \"6fe8dbea-cdd7-4bda-b5f2-717a0a30d116\") " Mar 09 14:27:30 crc kubenswrapper[4722]: I0309 14:27:26.742411 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bvvn9\" (UniqueName: \"kubernetes.io/projected/6fe8dbea-cdd7-4bda-b5f2-717a0a30d116-kube-api-access-bvvn9\") pod \"6fe8dbea-cdd7-4bda-b5f2-717a0a30d116\" (UID: \"6fe8dbea-cdd7-4bda-b5f2-717a0a30d116\") " Mar 09 14:27:30 crc kubenswrapper[4722]: I0309 14:27:26.742443 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fe8dbea-cdd7-4bda-b5f2-717a0a30d116-combined-ca-bundle\") pod \"6fe8dbea-cdd7-4bda-b5f2-717a0a30d116\" (UID: \"6fe8dbea-cdd7-4bda-b5f2-717a0a30d116\") " Mar 09 14:27:30 crc kubenswrapper[4722]: I0309 14:27:26.742483 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6fe8dbea-cdd7-4bda-b5f2-717a0a30d116-config-data-custom\") pod \"6fe8dbea-cdd7-4bda-b5f2-717a0a30d116\" (UID: \"6fe8dbea-cdd7-4bda-b5f2-717a0a30d116\") " Mar 09 14:27:30 crc kubenswrapper[4722]: I0309 14:27:26.742958 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59nn4\" (UniqueName: \"kubernetes.io/projected/d8345b50-e7db-4e96-8ed3-1c4593079a7e-kube-api-access-59nn4\") pod \"nova-cell0-conductor-db-sync-dxrwp\" (UID: \"d8345b50-e7db-4e96-8ed3-1c4593079a7e\") " pod="openstack/nova-cell0-conductor-db-sync-dxrwp" Mar 09 14:27:30 crc kubenswrapper[4722]: I0309 14:27:26.742990 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8345b50-e7db-4e96-8ed3-1c4593079a7e-scripts\") pod \"nova-cell0-conductor-db-sync-dxrwp\" (UID: \"d8345b50-e7db-4e96-8ed3-1c4593079a7e\") " pod="openstack/nova-cell0-conductor-db-sync-dxrwp" Mar 09 14:27:30 crc kubenswrapper[4722]: I0309 14:27:26.743051 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8345b50-e7db-4e96-8ed3-1c4593079a7e-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-dxrwp\" (UID: \"d8345b50-e7db-4e96-8ed3-1c4593079a7e\") " pod="openstack/nova-cell0-conductor-db-sync-dxrwp" Mar 09 14:27:30 crc kubenswrapper[4722]: I0309 14:27:26.743146 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8345b50-e7db-4e96-8ed3-1c4593079a7e-config-data\") pod \"nova-cell0-conductor-db-sync-dxrwp\" (UID: \"d8345b50-e7db-4e96-8ed3-1c4593079a7e\") " pod="openstack/nova-cell0-conductor-db-sync-dxrwp" Mar 09 14:27:30 crc kubenswrapper[4722]: I0309 14:27:26.752429 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fe8dbea-cdd7-4bda-b5f2-717a0a30d116-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6fe8dbea-cdd7-4bda-b5f2-717a0a30d116" (UID: "6fe8dbea-cdd7-4bda-b5f2-717a0a30d116"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:27:30 crc kubenswrapper[4722]: I0309 14:27:26.754295 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fe8dbea-cdd7-4bda-b5f2-717a0a30d116-kube-api-access-bvvn9" (OuterVolumeSpecName: "kube-api-access-bvvn9") pod "6fe8dbea-cdd7-4bda-b5f2-717a0a30d116" (UID: "6fe8dbea-cdd7-4bda-b5f2-717a0a30d116"). InnerVolumeSpecName "kube-api-access-bvvn9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:27:30 crc kubenswrapper[4722]: I0309 14:27:26.799109 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fe8dbea-cdd7-4bda-b5f2-717a0a30d116-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6fe8dbea-cdd7-4bda-b5f2-717a0a30d116" (UID: "6fe8dbea-cdd7-4bda-b5f2-717a0a30d116"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:27:30 crc kubenswrapper[4722]: I0309 14:27:26.839817 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fe8dbea-cdd7-4bda-b5f2-717a0a30d116-config-data" (OuterVolumeSpecName: "config-data") pod "6fe8dbea-cdd7-4bda-b5f2-717a0a30d116" (UID: "6fe8dbea-cdd7-4bda-b5f2-717a0a30d116"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:27:30 crc kubenswrapper[4722]: I0309 14:27:26.845792 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8345b50-e7db-4e96-8ed3-1c4593079a7e-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-dxrwp\" (UID: \"d8345b50-e7db-4e96-8ed3-1c4593079a7e\") " pod="openstack/nova-cell0-conductor-db-sync-dxrwp" Mar 09 14:27:30 crc kubenswrapper[4722]: I0309 14:27:26.845895 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8345b50-e7db-4e96-8ed3-1c4593079a7e-config-data\") pod \"nova-cell0-conductor-db-sync-dxrwp\" (UID: \"d8345b50-e7db-4e96-8ed3-1c4593079a7e\") " pod="openstack/nova-cell0-conductor-db-sync-dxrwp" Mar 09 14:27:30 crc kubenswrapper[4722]: I0309 14:27:26.848804 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59nn4\" (UniqueName: \"kubernetes.io/projected/d8345b50-e7db-4e96-8ed3-1c4593079a7e-kube-api-access-59nn4\") pod \"nova-cell0-conductor-db-sync-dxrwp\" (UID: \"d8345b50-e7db-4e96-8ed3-1c4593079a7e\") " pod="openstack/nova-cell0-conductor-db-sync-dxrwp" Mar 09 14:27:30 crc kubenswrapper[4722]: I0309 14:27:26.848944 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8345b50-e7db-4e96-8ed3-1c4593079a7e-scripts\") pod \"nova-cell0-conductor-db-sync-dxrwp\" (UID: \"d8345b50-e7db-4e96-8ed3-1c4593079a7e\") " pod="openstack/nova-cell0-conductor-db-sync-dxrwp" Mar 09 14:27:30 crc kubenswrapper[4722]: I0309 14:27:26.849042 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fe8dbea-cdd7-4bda-b5f2-717a0a30d116-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 14:27:30 crc kubenswrapper[4722]: I0309 14:27:26.849054 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bvvn9\" (UniqueName: \"kubernetes.io/projected/6fe8dbea-cdd7-4bda-b5f2-717a0a30d116-kube-api-access-bvvn9\") on node \"crc\" DevicePath \"\"" Mar 09 14:27:30 crc kubenswrapper[4722]: I0309 14:27:26.849063 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fe8dbea-cdd7-4bda-b5f2-717a0a30d116-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:27:30 crc kubenswrapper[4722]: I0309 14:27:26.849072 4722 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6fe8dbea-cdd7-4bda-b5f2-717a0a30d116-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 09 14:27:30 crc kubenswrapper[4722]: I0309 14:27:26.849676 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8345b50-e7db-4e96-8ed3-1c4593079a7e-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-dxrwp\" (UID: \"d8345b50-e7db-4e96-8ed3-1c4593079a7e\") " pod="openstack/nova-cell0-conductor-db-sync-dxrwp" Mar 09 14:27:30 crc kubenswrapper[4722]: I0309 14:27:26.858674 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8345b50-e7db-4e96-8ed3-1c4593079a7e-config-data\") pod \"nova-cell0-conductor-db-sync-dxrwp\" (UID: \"d8345b50-e7db-4e96-8ed3-1c4593079a7e\") " pod="openstack/nova-cell0-conductor-db-sync-dxrwp" Mar 09 14:27:30 crc 
kubenswrapper[4722]: I0309 14:27:26.859299 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8345b50-e7db-4e96-8ed3-1c4593079a7e-scripts\") pod \"nova-cell0-conductor-db-sync-dxrwp\" (UID: \"d8345b50-e7db-4e96-8ed3-1c4593079a7e\") " pod="openstack/nova-cell0-conductor-db-sync-dxrwp" Mar 09 14:27:30 crc kubenswrapper[4722]: I0309 14:27:26.870460 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59nn4\" (UniqueName: \"kubernetes.io/projected/d8345b50-e7db-4e96-8ed3-1c4593079a7e-kube-api-access-59nn4\") pod \"nova-cell0-conductor-db-sync-dxrwp\" (UID: \"d8345b50-e7db-4e96-8ed3-1c4593079a7e\") " pod="openstack/nova-cell0-conductor-db-sync-dxrwp" Mar 09 14:27:30 crc kubenswrapper[4722]: I0309 14:27:26.883249 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-5b85b5677d-6lmg9" Mar 09 14:27:30 crc kubenswrapper[4722]: I0309 14:27:26.969311 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5f6f45fb9b-nw8cq" event={"ID":"6fe8dbea-cdd7-4bda-b5f2-717a0a30d116","Type":"ContainerDied","Data":"5daeb5f8197986c6582be7f05adcad2f1dab3daf247fea598269aafd2929b229"} Mar 09 14:27:30 crc kubenswrapper[4722]: I0309 14:27:26.969357 4722 scope.go:117] "RemoveContainer" containerID="5af8a71261e43a7a76f52a673e0c3d2b29962c6dd7bd2166b79ae5beb28304af" Mar 09 14:27:30 crc kubenswrapper[4722]: I0309 14:27:26.969397 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-5f6f45fb9b-nw8cq" Mar 09 14:27:30 crc kubenswrapper[4722]: I0309 14:27:26.970088 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-dxrwp" Mar 09 14:27:30 crc kubenswrapper[4722]: I0309 14:27:26.972299 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-5b85b5677d-6lmg9" Mar 09 14:27:30 crc kubenswrapper[4722]: I0309 14:27:26.972906 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-5b85b5677d-6lmg9" event={"ID":"d6c4171f-11a7-47bf-900b-55fa05f03f49","Type":"ContainerDied","Data":"00addcf75d6f6f309792bc2c6ed5476100062f632c8630932d6872f511bc9ef8"} Mar 09 14:27:30 crc kubenswrapper[4722]: I0309 14:27:26.972975 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 09 14:27:30 crc kubenswrapper[4722]: I0309 14:27:26.972996 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 09 14:27:30 crc kubenswrapper[4722]: I0309 14:27:26.973008 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 09 14:27:30 crc kubenswrapper[4722]: I0309 14:27:26.973165 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 09 14:27:30 crc kubenswrapper[4722]: I0309 14:27:27.026335 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-5f6f45fb9b-nw8cq"] Mar 09 14:27:30 crc kubenswrapper[4722]: I0309 14:27:27.034467 4722 scope.go:117] "RemoveContainer" containerID="73506ed0545c9032408fb9295746b8107ae47b119a54aee899ccb465f961a34c" Mar 09 14:27:30 crc kubenswrapper[4722]: I0309 14:27:27.036330 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-5f6f45fb9b-nw8cq"] Mar 09 14:27:30 crc kubenswrapper[4722]: I0309 14:27:27.053104 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crjv4\" (UniqueName: \"kubernetes.io/projected/d6c4171f-11a7-47bf-900b-55fa05f03f49-kube-api-access-crjv4\") pod \"d6c4171f-11a7-47bf-900b-55fa05f03f49\" (UID: \"d6c4171f-11a7-47bf-900b-55fa05f03f49\") " Mar 09 14:27:30 crc kubenswrapper[4722]: I0309 14:27:27.059458 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6c4171f-11a7-47bf-900b-55fa05f03f49-kube-api-access-crjv4" (OuterVolumeSpecName: "kube-api-access-crjv4") pod "d6c4171f-11a7-47bf-900b-55fa05f03f49" (UID: "d6c4171f-11a7-47bf-900b-55fa05f03f49"). InnerVolumeSpecName "kube-api-access-crjv4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:27:30 crc kubenswrapper[4722]: I0309 14:27:27.063772 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6c4171f-11a7-47bf-900b-55fa05f03f49-config-data\") pod \"d6c4171f-11a7-47bf-900b-55fa05f03f49\" (UID: \"d6c4171f-11a7-47bf-900b-55fa05f03f49\") " Mar 09 14:27:30 crc kubenswrapper[4722]: I0309 14:27:27.063812 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6c4171f-11a7-47bf-900b-55fa05f03f49-combined-ca-bundle\") pod \"d6c4171f-11a7-47bf-900b-55fa05f03f49\" (UID: \"d6c4171f-11a7-47bf-900b-55fa05f03f49\") " Mar 09 14:27:30 crc kubenswrapper[4722]: I0309 14:27:27.063924 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d6c4171f-11a7-47bf-900b-55fa05f03f49-config-data-custom\") pod \"d6c4171f-11a7-47bf-900b-55fa05f03f49\" (UID: \"d6c4171f-11a7-47bf-900b-55fa05f03f49\") " Mar 09 14:27:30 crc kubenswrapper[4722]: I0309 14:27:27.065023 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-crjv4\" (UniqueName: \"kubernetes.io/projected/d6c4171f-11a7-47bf-900b-55fa05f03f49-kube-api-access-crjv4\") on node \"crc\" DevicePath \"\"" Mar 09 14:27:30 crc kubenswrapper[4722]: I0309 14:27:27.068370 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6c4171f-11a7-47bf-900b-55fa05f03f49-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d6c4171f-11a7-47bf-900b-55fa05f03f49" (UID: "d6c4171f-11a7-47bf-900b-55fa05f03f49"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:27:30 crc kubenswrapper[4722]: I0309 14:27:27.099819 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6c4171f-11a7-47bf-900b-55fa05f03f49-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d6c4171f-11a7-47bf-900b-55fa05f03f49" (UID: "d6c4171f-11a7-47bf-900b-55fa05f03f49"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:27:30 crc kubenswrapper[4722]: I0309 14:27:27.141358 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6c4171f-11a7-47bf-900b-55fa05f03f49-config-data" (OuterVolumeSpecName: "config-data") pod "d6c4171f-11a7-47bf-900b-55fa05f03f49" (UID: "d6c4171f-11a7-47bf-900b-55fa05f03f49"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:27:30 crc kubenswrapper[4722]: I0309 14:27:27.167532 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d6c4171f-11a7-47bf-900b-55fa05f03f49-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 14:27:30 crc kubenswrapper[4722]: I0309 14:27:27.167559 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d6c4171f-11a7-47bf-900b-55fa05f03f49-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:27:30 crc kubenswrapper[4722]: I0309 14:27:27.167569 4722 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d6c4171f-11a7-47bf-900b-55fa05f03f49-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 09 14:27:30 crc kubenswrapper[4722]: I0309 14:27:27.308972 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-5b85b5677d-6lmg9"] Mar 09 14:27:30 crc kubenswrapper[4722]: I0309 14:27:27.319306 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-5b85b5677d-6lmg9"] Mar 09 14:27:30 crc kubenswrapper[4722]: I0309 14:27:28.164964 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fe8dbea-cdd7-4bda-b5f2-717a0a30d116" path="/var/lib/kubelet/pods/6fe8dbea-cdd7-4bda-b5f2-717a0a30d116/volumes" Mar 09 14:27:30 crc kubenswrapper[4722]: I0309 14:27:28.169557 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6c4171f-11a7-47bf-900b-55fa05f03f49" path="/var/lib/kubelet/pods/d6c4171f-11a7-47bf-900b-55fa05f03f49/volumes" Mar 09 14:27:30 crc kubenswrapper[4722]: I0309 14:27:30.884345 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-dxrwp"] Mar 09 14:27:30 crc kubenswrapper[4722]: W0309 14:27:30.893919 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8345b50_e7db_4e96_8ed3_1c4593079a7e.slice/crio-e7e0ef759cc0a886a8f82b7eb0c69b4ca4642295e278a1680a314d406cd2e9a9 WatchSource:0}: Error finding container e7e0ef759cc0a886a8f82b7eb0c69b4ca4642295e278a1680a314d406cd2e9a9: Status 404 returned error can't find the container with id e7e0ef759cc0a886a8f82b7eb0c69b4ca4642295e278a1680a314d406cd2e9a9 Mar 09 14:27:31 crc kubenswrapper[4722]: I0309 14:27:31.051813 4722 generic.go:334] "Generic (PLEG): container finished" podID="2a7b55ea-3f51-4dfc-b861-23be50541a1c" containerID="48eff71955332d3b52375f0a991f10f3e4374f91b7e99deb991ff46c62536e31" exitCode=0 Mar 09 14:27:31 crc kubenswrapper[4722]: I0309 14:27:31.051872 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2a7b55ea-3f51-4dfc-b861-23be50541a1c","Type":"ContainerDied","Data":"48eff71955332d3b52375f0a991f10f3e4374f91b7e99deb991ff46c62536e31"} Mar 09 14:27:31 crc kubenswrapper[4722]: I0309 14:27:31.051899 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2a7b55ea-3f51-4dfc-b861-23be50541a1c","Type":"ContainerDied","Data":"26a7a09f41a26a7f4d7ad2a920f5b0b4ca0d1fd3e99c5c7e77790a0632a2a390"} Mar 09 14:27:31 crc kubenswrapper[4722]: I0309 14:27:31.051910 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="26a7a09f41a26a7f4d7ad2a920f5b0b4ca0d1fd3e99c5c7e77790a0632a2a390" Mar 09 14:27:31 crc kubenswrapper[4722]: I0309 14:27:31.052789 4722 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-dxrwp" event={"ID":"d8345b50-e7db-4e96-8ed3-1c4593079a7e","Type":"ContainerStarted","Data":"e7e0ef759cc0a886a8f82b7eb0c69b4ca4642295e278a1680a314d406cd2e9a9"} Mar 09 14:27:31 crc kubenswrapper[4722]: I0309 14:27:31.072539 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 14:27:31 crc kubenswrapper[4722]: I0309 14:27:31.154911 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a7b55ea-3f51-4dfc-b861-23be50541a1c-scripts\") pod \"2a7b55ea-3f51-4dfc-b861-23be50541a1c\" (UID: \"2a7b55ea-3f51-4dfc-b861-23be50541a1c\") " Mar 09 14:27:31 crc kubenswrapper[4722]: I0309 14:27:31.155096 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a7b55ea-3f51-4dfc-b861-23be50541a1c-combined-ca-bundle\") pod \"2a7b55ea-3f51-4dfc-b861-23be50541a1c\" (UID: \"2a7b55ea-3f51-4dfc-b861-23be50541a1c\") " Mar 09 14:27:31 crc kubenswrapper[4722]: I0309 14:27:31.155912 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2a7b55ea-3f51-4dfc-b861-23be50541a1c-run-httpd\") pod \"2a7b55ea-3f51-4dfc-b861-23be50541a1c\" (UID: \"2a7b55ea-3f51-4dfc-b861-23be50541a1c\") " Mar 09 14:27:31 crc kubenswrapper[4722]: I0309 14:27:31.155955 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a7b55ea-3f51-4dfc-b861-23be50541a1c-config-data\") pod \"2a7b55ea-3f51-4dfc-b861-23be50541a1c\" (UID: \"2a7b55ea-3f51-4dfc-b861-23be50541a1c\") " Mar 09 14:27:31 crc kubenswrapper[4722]: I0309 14:27:31.156015 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2a7b55ea-3f51-4dfc-b861-23be50541a1c-log-httpd\") pod \"2a7b55ea-3f51-4dfc-b861-23be50541a1c\" (UID: \"2a7b55ea-3f51-4dfc-b861-23be50541a1c\") " Mar 09 14:27:31 crc kubenswrapper[4722]: I0309 14:27:31.156114 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7pk2\" (UniqueName: \"kubernetes.io/projected/2a7b55ea-3f51-4dfc-b861-23be50541a1c-kube-api-access-v7pk2\") pod \"2a7b55ea-3f51-4dfc-b861-23be50541a1c\" (UID: \"2a7b55ea-3f51-4dfc-b861-23be50541a1c\") " Mar 09 14:27:31 crc kubenswrapper[4722]: I0309 14:27:31.156161 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2a7b55ea-3f51-4dfc-b861-23be50541a1c-sg-core-conf-yaml\") pod \"2a7b55ea-3f51-4dfc-b861-23be50541a1c\" (UID: \"2a7b55ea-3f51-4dfc-b861-23be50541a1c\") " Mar 09 14:27:31 crc kubenswrapper[4722]: I0309 14:27:31.156172 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a7b55ea-3f51-4dfc-b861-23be50541a1c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2a7b55ea-3f51-4dfc-b861-23be50541a1c" (UID: "2a7b55ea-3f51-4dfc-b861-23be50541a1c"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:27:31 crc kubenswrapper[4722]: I0309 14:27:31.156534 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a7b55ea-3f51-4dfc-b861-23be50541a1c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2a7b55ea-3f51-4dfc-b861-23be50541a1c" (UID: "2a7b55ea-3f51-4dfc-b861-23be50541a1c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:27:31 crc kubenswrapper[4722]: I0309 14:27:31.157067 4722 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2a7b55ea-3f51-4dfc-b861-23be50541a1c-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 14:27:31 crc kubenswrapper[4722]: I0309 14:27:31.157086 4722 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2a7b55ea-3f51-4dfc-b861-23be50541a1c-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 14:27:31 crc kubenswrapper[4722]: I0309 14:27:31.161874 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a7b55ea-3f51-4dfc-b861-23be50541a1c-scripts" (OuterVolumeSpecName: "scripts") pod "2a7b55ea-3f51-4dfc-b861-23be50541a1c" (UID: "2a7b55ea-3f51-4dfc-b861-23be50541a1c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:27:31 crc kubenswrapper[4722]: I0309 14:27:31.161947 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a7b55ea-3f51-4dfc-b861-23be50541a1c-kube-api-access-v7pk2" (OuterVolumeSpecName: "kube-api-access-v7pk2") pod "2a7b55ea-3f51-4dfc-b861-23be50541a1c" (UID: "2a7b55ea-3f51-4dfc-b861-23be50541a1c"). InnerVolumeSpecName "kube-api-access-v7pk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:27:31 crc kubenswrapper[4722]: I0309 14:27:31.192276 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a7b55ea-3f51-4dfc-b861-23be50541a1c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2a7b55ea-3f51-4dfc-b861-23be50541a1c" (UID: "2a7b55ea-3f51-4dfc-b861-23be50541a1c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:27:31 crc kubenswrapper[4722]: I0309 14:27:31.261598 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a7b55ea-3f51-4dfc-b861-23be50541a1c-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 14:27:31 crc kubenswrapper[4722]: I0309 14:27:31.261626 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7pk2\" (UniqueName: \"kubernetes.io/projected/2a7b55ea-3f51-4dfc-b861-23be50541a1c-kube-api-access-v7pk2\") on node \"crc\" DevicePath \"\"" Mar 09 14:27:31 crc kubenswrapper[4722]: I0309 14:27:31.261639 4722 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2a7b55ea-3f51-4dfc-b861-23be50541a1c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 09 14:27:31 crc kubenswrapper[4722]: I0309 14:27:31.277470 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a7b55ea-3f51-4dfc-b861-23be50541a1c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2a7b55ea-3f51-4dfc-b861-23be50541a1c" (UID: "2a7b55ea-3f51-4dfc-b861-23be50541a1c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:27:31 crc kubenswrapper[4722]: I0309 14:27:31.285345 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a7b55ea-3f51-4dfc-b861-23be50541a1c-config-data" (OuterVolumeSpecName: "config-data") pod "2a7b55ea-3f51-4dfc-b861-23be50541a1c" (UID: "2a7b55ea-3f51-4dfc-b861-23be50541a1c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:27:31 crc kubenswrapper[4722]: I0309 14:27:31.368463 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a7b55ea-3f51-4dfc-b861-23be50541a1c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:27:31 crc kubenswrapper[4722]: I0309 14:27:31.368500 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a7b55ea-3f51-4dfc-b861-23be50541a1c-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 14:27:32 crc kubenswrapper[4722]: I0309 14:27:32.061619 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 14:27:32 crc kubenswrapper[4722]: I0309 14:27:32.100341 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 14:27:32 crc kubenswrapper[4722]: I0309 14:27:32.117993 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 09 14:27:32 crc kubenswrapper[4722]: I0309 14:27:32.134895 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 09 14:27:32 crc kubenswrapper[4722]: E0309 14:27:32.135524 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6c4171f-11a7-47bf-900b-55fa05f03f49" containerName="heat-api" Mar 09 14:27:32 crc kubenswrapper[4722]: I0309 14:27:32.135545 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6c4171f-11a7-47bf-900b-55fa05f03f49" containerName="heat-api" Mar 09 14:27:32 crc kubenswrapper[4722]: E0309 14:27:32.135575 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a7b55ea-3f51-4dfc-b861-23be50541a1c" containerName="proxy-httpd" Mar 09 14:27:32 crc kubenswrapper[4722]: I0309 14:27:32.135583 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a7b55ea-3f51-4dfc-b861-23be50541a1c" containerName="proxy-httpd" Mar 09 14:27:32 crc kubenswrapper[4722]: E0309 14:27:32.135612 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a7b55ea-3f51-4dfc-b861-23be50541a1c" containerName="ceilometer-notification-agent" Mar 09 14:27:32 crc kubenswrapper[4722]: I0309 14:27:32.135619 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a7b55ea-3f51-4dfc-b861-23be50541a1c" containerName="ceilometer-notification-agent" Mar 09 14:27:32 crc kubenswrapper[4722]: E0309 14:27:32.135638 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a7b55ea-3f51-4dfc-b861-23be50541a1c" containerName="sg-core" Mar 09 14:27:32 crc kubenswrapper[4722]: I0309 14:27:32.135646 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a7b55ea-3f51-4dfc-b861-23be50541a1c" containerName="sg-core" Mar 09 14:27:32 crc kubenswrapper[4722]: E0309 14:27:32.135673 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a7b55ea-3f51-4dfc-b861-23be50541a1c" containerName="ceilometer-central-agent" Mar 09 14:27:32 crc kubenswrapper[4722]: I0309 14:27:32.135680 4722 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2a7b55ea-3f51-4dfc-b861-23be50541a1c" containerName="ceilometer-central-agent" Mar 09 14:27:32 crc kubenswrapper[4722]: I0309 14:27:32.135918 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6c4171f-11a7-47bf-900b-55fa05f03f49" containerName="heat-api" Mar 09 14:27:32 crc kubenswrapper[4722]: I0309 14:27:32.135930 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6c4171f-11a7-47bf-900b-55fa05f03f49" containerName="heat-api" Mar 09 14:27:32 crc kubenswrapper[4722]: I0309 14:27:32.135941 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a7b55ea-3f51-4dfc-b861-23be50541a1c" containerName="sg-core" Mar 09 14:27:32 crc kubenswrapper[4722]: I0309 14:27:32.135954 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a7b55ea-3f51-4dfc-b861-23be50541a1c" containerName="ceilometer-central-agent" Mar 09 14:27:32 crc kubenswrapper[4722]: I0309 14:27:32.135970 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a7b55ea-3f51-4dfc-b861-23be50541a1c" containerName="proxy-httpd" Mar 09 14:27:32 crc kubenswrapper[4722]: I0309 14:27:32.135982 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a7b55ea-3f51-4dfc-b861-23be50541a1c" containerName="ceilometer-notification-agent" Mar 09 14:27:32 crc kubenswrapper[4722]: E0309 14:27:32.136169 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6c4171f-11a7-47bf-900b-55fa05f03f49" containerName="heat-api" Mar 09 14:27:32 crc kubenswrapper[4722]: I0309 14:27:32.136177 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6c4171f-11a7-47bf-900b-55fa05f03f49" containerName="heat-api" Mar 09 14:27:32 crc kubenswrapper[4722]: I0309 14:27:32.138687 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 14:27:32 crc kubenswrapper[4722]: I0309 14:27:32.143782 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 09 14:27:32 crc kubenswrapper[4722]: I0309 14:27:32.144011 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 09 14:27:32 crc kubenswrapper[4722]: I0309 14:27:32.176987 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a7b55ea-3f51-4dfc-b861-23be50541a1c" path="/var/lib/kubelet/pods/2a7b55ea-3f51-4dfc-b861-23be50541a1c/volumes" Mar 09 14:27:32 crc kubenswrapper[4722]: I0309 14:27:32.178197 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 14:27:32 crc kubenswrapper[4722]: I0309 14:27:32.295549 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzpms\" (UniqueName: \"kubernetes.io/projected/f7cd25c5-3f3b-4b28-b96f-0aa6114b498c-kube-api-access-kzpms\") pod \"ceilometer-0\" (UID: \"f7cd25c5-3f3b-4b28-b96f-0aa6114b498c\") " pod="openstack/ceilometer-0" Mar 09 14:27:32 crc kubenswrapper[4722]: I0309 14:27:32.295751 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f7cd25c5-3f3b-4b28-b96f-0aa6114b498c-run-httpd\") pod \"ceilometer-0\" (UID: \"f7cd25c5-3f3b-4b28-b96f-0aa6114b498c\") " pod="openstack/ceilometer-0" Mar 09 14:27:32 crc kubenswrapper[4722]: I0309 14:27:32.295940 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7cd25c5-3f3b-4b28-b96f-0aa6114b498c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f7cd25c5-3f3b-4b28-b96f-0aa6114b498c\") " pod="openstack/ceilometer-0" Mar 09 14:27:32 crc kubenswrapper[4722]: I0309 14:27:32.296297 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f7cd25c5-3f3b-4b28-b96f-0aa6114b498c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f7cd25c5-3f3b-4b28-b96f-0aa6114b498c\") " pod="openstack/ceilometer-0" Mar 09 14:27:32 crc kubenswrapper[4722]: I0309 14:27:32.296369 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7cd25c5-3f3b-4b28-b96f-0aa6114b498c-scripts\") pod \"ceilometer-0\" (UID: \"f7cd25c5-3f3b-4b28-b96f-0aa6114b498c\") " pod="openstack/ceilometer-0" Mar 09 14:27:32 crc kubenswrapper[4722]: I0309 14:27:32.296397 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7cd25c5-3f3b-4b28-b96f-0aa6114b498c-config-data\") pod \"ceilometer-0\" (UID: \"f7cd25c5-3f3b-4b28-b96f-0aa6114b498c\") " pod="openstack/ceilometer-0" Mar 09 14:27:32 crc kubenswrapper[4722]: I0309 14:27:32.296481 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f7cd25c5-3f3b-4b28-b96f-0aa6114b498c-log-httpd\") pod \"ceilometer-0\" (UID: \"f7cd25c5-3f3b-4b28-b96f-0aa6114b498c\") " pod="openstack/ceilometer-0" Mar 09 14:27:32 crc kubenswrapper[4722]: I0309 14:27:32.398277 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/f7cd25c5-3f3b-4b28-b96f-0aa6114b498c-run-httpd\") pod \"ceilometer-0\" (UID: \"f7cd25c5-3f3b-4b28-b96f-0aa6114b498c\") " pod="openstack/ceilometer-0" Mar 09 14:27:32 crc kubenswrapper[4722]: I0309 14:27:32.398352 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7cd25c5-3f3b-4b28-b96f-0aa6114b498c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f7cd25c5-3f3b-4b28-b96f-0aa6114b498c\") " pod="openstack/ceilometer-0" Mar 09 14:27:32 crc kubenswrapper[4722]: I0309 14:27:32.398431 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f7cd25c5-3f3b-4b28-b96f-0aa6114b498c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f7cd25c5-3f3b-4b28-b96f-0aa6114b498c\") " pod="openstack/ceilometer-0" Mar 09 14:27:32 crc kubenswrapper[4722]: I0309 14:27:32.398453 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7cd25c5-3f3b-4b28-b96f-0aa6114b498c-scripts\") pod \"ceilometer-0\" (UID: \"f7cd25c5-3f3b-4b28-b96f-0aa6114b498c\") " pod="openstack/ceilometer-0" Mar 09 14:27:32 crc kubenswrapper[4722]: I0309 14:27:32.398470 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7cd25c5-3f3b-4b28-b96f-0aa6114b498c-config-data\") pod \"ceilometer-0\" (UID: \"f7cd25c5-3f3b-4b28-b96f-0aa6114b498c\") " pod="openstack/ceilometer-0" Mar 09 14:27:32 crc kubenswrapper[4722]: I0309 14:27:32.398502 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f7cd25c5-3f3b-4b28-b96f-0aa6114b498c-log-httpd\") pod \"ceilometer-0\" (UID: \"f7cd25c5-3f3b-4b28-b96f-0aa6114b498c\") " pod="openstack/ceilometer-0" Mar 09 14:27:32 crc kubenswrapper[4722]: I0309 14:27:32.398554 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzpms\" (UniqueName: \"kubernetes.io/projected/f7cd25c5-3f3b-4b28-b96f-0aa6114b498c-kube-api-access-kzpms\") pod \"ceilometer-0\" (UID: \"f7cd25c5-3f3b-4b28-b96f-0aa6114b498c\") " pod="openstack/ceilometer-0" Mar 09 14:27:32 crc kubenswrapper[4722]: I0309 14:27:32.398740 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f7cd25c5-3f3b-4b28-b96f-0aa6114b498c-run-httpd\") pod \"ceilometer-0\" (UID: \"f7cd25c5-3f3b-4b28-b96f-0aa6114b498c\") " pod="openstack/ceilometer-0" Mar 09 14:27:32 crc kubenswrapper[4722]: I0309 14:27:32.399697 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f7cd25c5-3f3b-4b28-b96f-0aa6114b498c-log-httpd\") pod \"ceilometer-0\" (UID: \"f7cd25c5-3f3b-4b28-b96f-0aa6114b498c\") " pod="openstack/ceilometer-0" Mar 09 14:27:32 crc kubenswrapper[4722]: I0309 14:27:32.403329 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7cd25c5-3f3b-4b28-b96f-0aa6114b498c-scripts\") pod \"ceilometer-0\" (UID: \"f7cd25c5-3f3b-4b28-b96f-0aa6114b498c\") " pod="openstack/ceilometer-0" Mar 09 14:27:32 crc kubenswrapper[4722]: I0309 14:27:32.403444 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f7cd25c5-3f3b-4b28-b96f-0aa6114b498c-config-data\") pod \"ceilometer-0\" (UID: \"f7cd25c5-3f3b-4b28-b96f-0aa6114b498c\") " pod="openstack/ceilometer-0" Mar 09 14:27:32 crc kubenswrapper[4722]: I0309 14:27:32.403942 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7cd25c5-3f3b-4b28-b96f-0aa6114b498c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f7cd25c5-3f3b-4b28-b96f-0aa6114b498c\") " pod="openstack/ceilometer-0" Mar 09 14:27:32 crc kubenswrapper[4722]: I0309 14:27:32.410330 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f7cd25c5-3f3b-4b28-b96f-0aa6114b498c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f7cd25c5-3f3b-4b28-b96f-0aa6114b498c\") " pod="openstack/ceilometer-0" Mar 09 14:27:32 crc kubenswrapper[4722]: I0309 14:27:32.415986 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzpms\" (UniqueName: \"kubernetes.io/projected/f7cd25c5-3f3b-4b28-b96f-0aa6114b498c-kube-api-access-kzpms\") pod \"ceilometer-0\" (UID: \"f7cd25c5-3f3b-4b28-b96f-0aa6114b498c\") " pod="openstack/ceilometer-0" Mar 09 14:27:32 crc kubenswrapper[4722]: I0309 14:27:32.471979 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 14:27:33 crc kubenswrapper[4722]: I0309 14:27:33.023552 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 14:27:33 crc kubenswrapper[4722]: I0309 14:27:33.075876 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f7cd25c5-3f3b-4b28-b96f-0aa6114b498c","Type":"ContainerStarted","Data":"eae54c922f9529589ee43329029d006bb384ce22859b3bc9778943f9e106d4a6"} Mar 09 14:27:33 crc kubenswrapper[4722]: I0309 14:27:33.178078 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 09 14:27:33 crc kubenswrapper[4722]: I0309 14:27:33.178458 4722 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 09 14:27:33 crc kubenswrapper[4722]: I0309 14:27:33.181789 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 09 14:27:33 crc kubenswrapper[4722]: I0309 14:27:33.184906 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 09 14:27:33 crc kubenswrapper[4722]: I0309 14:27:33.185000 4722 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 09 14:27:33 crc kubenswrapper[4722]: I0309 14:27:33.188041 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 09 14:27:34 crc kubenswrapper[4722]: I0309 14:27:34.087085 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f7cd25c5-3f3b-4b28-b96f-0aa6114b498c","Type":"ContainerStarted","Data":"31de818897cbe2c334e4d93aef21bd7aabc4a46e9f9ad4e25698a2e64984583b"} Mar 09 14:27:34 crc kubenswrapper[4722]: I0309 14:27:34.125914 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 14:27:37 crc kubenswrapper[4722]: I0309 14:27:37.266316 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-678fb995f7-q5bfj" Mar 09 14:27:37 crc kubenswrapper[4722]: I0309 
14:27:37.331874 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-5884f594db-ls9vv"] Mar 09 14:27:37 crc kubenswrapper[4722]: I0309 14:27:37.332170 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-engine-5884f594db-ls9vv" podUID="f71670c9-520f-4005-a324-199bc52fac7f" containerName="heat-engine" containerID="cri-o://876323b8a50b38036cbeb67f8682fe72e20d6ca089e54cdc4c767cfd67f20e7f" gracePeriod=60 Mar 09 14:27:39 crc kubenswrapper[4722]: E0309 14:27:39.538966 4722 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="876323b8a50b38036cbeb67f8682fe72e20d6ca089e54cdc4c767cfd67f20e7f" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 09 14:27:39 crc kubenswrapper[4722]: E0309 14:27:39.549563 4722 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="876323b8a50b38036cbeb67f8682fe72e20d6ca089e54cdc4c767cfd67f20e7f" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 09 14:27:39 crc kubenswrapper[4722]: E0309 14:27:39.551601 4722 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="876323b8a50b38036cbeb67f8682fe72e20d6ca089e54cdc4c767cfd67f20e7f" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 09 14:27:39 crc kubenswrapper[4722]: E0309 14:27:39.551665 4722 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-5884f594db-ls9vv" podUID="f71670c9-520f-4005-a324-199bc52fac7f" containerName="heat-engine" Mar 09 14:27:40 crc kubenswrapper[4722]: I0309 14:27:40.454213 4722 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 14:27:41 crc kubenswrapper[4722]: I0309 14:27:41.198418 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-dxrwp" event={"ID":"d8345b50-e7db-4e96-8ed3-1c4593079a7e","Type":"ContainerStarted","Data":"5cfcdbdecc8b8ae4a8263e25cb31ca5a7a3a52f2aea68b74ff60f3033bbdbc44"} Mar 09 14:27:41 crc kubenswrapper[4722]: I0309 14:27:41.200420 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f7cd25c5-3f3b-4b28-b96f-0aa6114b498c","Type":"ContainerStarted","Data":"a872ee97ebf499e50f3742bdc8cbbec5579a4732c47ec733caa75f4d83a96e13"} Mar 09 14:27:41 crc kubenswrapper[4722]: I0309 14:27:41.200520 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f7cd25c5-3f3b-4b28-b96f-0aa6114b498c","Type":"ContainerStarted","Data":"04b2201652075c3dca2d3c2bdd5acf063c18205cade60221aa03618b3d270250"} Mar 09 14:27:41 crc kubenswrapper[4722]: I0309 14:27:41.223306 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-dxrwp" podStartSLOduration=5.919732002 podStartE2EDuration="15.223288402s" podCreationTimestamp="2026-03-09 14:27:26 +0000 UTC" firstStartedPulling="2026-03-09 14:27:30.902330182 +0000 UTC m=+1491.457898758" lastFinishedPulling="2026-03-09 14:27:40.205886582 +0000 UTC 
m=+1500.761455158" observedRunningTime="2026-03-09 14:27:41.212764142 +0000 UTC m=+1501.768332718" watchObservedRunningTime="2026-03-09 14:27:41.223288402 +0000 UTC m=+1501.778856978" Mar 09 14:27:43 crc kubenswrapper[4722]: I0309 14:27:43.222049 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f7cd25c5-3f3b-4b28-b96f-0aa6114b498c","Type":"ContainerStarted","Data":"91cc55bbe0c15e0bda9e4b09eb94f00df004b0b14ebbb120410702aeca6523b4"} Mar 09 14:27:43 crc kubenswrapper[4722]: I0309 14:27:43.222516 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f7cd25c5-3f3b-4b28-b96f-0aa6114b498c" containerName="ceilometer-central-agent" containerID="cri-o://31de818897cbe2c334e4d93aef21bd7aabc4a46e9f9ad4e25698a2e64984583b" gracePeriod=30 Mar 09 14:27:43 crc kubenswrapper[4722]: I0309 14:27:43.222737 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f7cd25c5-3f3b-4b28-b96f-0aa6114b498c" containerName="proxy-httpd" containerID="cri-o://91cc55bbe0c15e0bda9e4b09eb94f00df004b0b14ebbb120410702aeca6523b4" gracePeriod=30 Mar 09 14:27:43 crc kubenswrapper[4722]: I0309 14:27:43.222808 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f7cd25c5-3f3b-4b28-b96f-0aa6114b498c" containerName="sg-core" containerID="cri-o://a872ee97ebf499e50f3742bdc8cbbec5579a4732c47ec733caa75f4d83a96e13" gracePeriod=30 Mar 09 14:27:43 crc kubenswrapper[4722]: I0309 14:27:43.222871 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f7cd25c5-3f3b-4b28-b96f-0aa6114b498c" containerName="ceilometer-notification-agent" containerID="cri-o://04b2201652075c3dca2d3c2bdd5acf063c18205cade60221aa03618b3d270250" gracePeriod=30 Mar 09 14:27:43 crc kubenswrapper[4722]: I0309 14:27:43.222878 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 09 14:27:43 crc kubenswrapper[4722]: I0309 14:27:43.248140 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.490667174 podStartE2EDuration="11.248120838s" podCreationTimestamp="2026-03-09 14:27:32 +0000 UTC" firstStartedPulling="2026-03-09 14:27:33.047155717 +0000 UTC m=+1493.602724293" lastFinishedPulling="2026-03-09 14:27:42.804609381 +0000 UTC m=+1503.360177957" observedRunningTime="2026-03-09 14:27:43.245037642 +0000 UTC m=+1503.800606228" watchObservedRunningTime="2026-03-09 14:27:43.248120838 +0000 UTC m=+1503.803689414" Mar 09 14:27:44 crc kubenswrapper[4722]: I0309 14:27:44.235672 4722 generic.go:334] "Generic (PLEG): container finished" podID="f7cd25c5-3f3b-4b28-b96f-0aa6114b498c" containerID="a872ee97ebf499e50f3742bdc8cbbec5579a4732c47ec733caa75f4d83a96e13" exitCode=2 Mar 09 14:27:44 crc kubenswrapper[4722]: I0309 14:27:44.236019 4722 generic.go:334] "Generic (PLEG): container finished" podID="f7cd25c5-3f3b-4b28-b96f-0aa6114b498c" containerID="04b2201652075c3dca2d3c2bdd5acf063c18205cade60221aa03618b3d270250" exitCode=0 Mar 09 14:27:44 crc kubenswrapper[4722]: I0309 14:27:44.236032 4722 generic.go:334] "Generic (PLEG): container finished" podID="f7cd25c5-3f3b-4b28-b96f-0aa6114b498c" containerID="31de818897cbe2c334e4d93aef21bd7aabc4a46e9f9ad4e25698a2e64984583b" exitCode=0 Mar 09 14:27:44 crc kubenswrapper[4722]: I0309 14:27:44.235764 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"f7cd25c5-3f3b-4b28-b96f-0aa6114b498c","Type":"ContainerDied","Data":"a872ee97ebf499e50f3742bdc8cbbec5579a4732c47ec733caa75f4d83a96e13"} Mar 09 14:27:44 crc kubenswrapper[4722]: I0309 14:27:44.236082 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f7cd25c5-3f3b-4b28-b96f-0aa6114b498c","Type":"ContainerDied","Data":"04b2201652075c3dca2d3c2bdd5acf063c18205cade60221aa03618b3d270250"} Mar 09 14:27:44 crc kubenswrapper[4722]: I0309 14:27:44.236101 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f7cd25c5-3f3b-4b28-b96f-0aa6114b498c","Type":"ContainerDied","Data":"31de818897cbe2c334e4d93aef21bd7aabc4a46e9f9ad4e25698a2e64984583b"} Mar 09 14:27:47 crc kubenswrapper[4722]: E0309 14:27:47.208671 4722 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf71670c9_520f_4005_a324_199bc52fac7f.slice/crio-conmon-876323b8a50b38036cbeb67f8682fe72e20d6ca089e54cdc4c767cfd67f20e7f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf7cd25c5_3f3b_4b28_b96f_0aa6114b498c.slice/crio-conmon-31de818897cbe2c334e4d93aef21bd7aabc4a46e9f9ad4e25698a2e64984583b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf7cd25c5_3f3b_4b28_b96f_0aa6114b498c.slice/crio-31de818897cbe2c334e4d93aef21bd7aabc4a46e9f9ad4e25698a2e64984583b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf71670c9_520f_4005_a324_199bc52fac7f.slice/crio-876323b8a50b38036cbeb67f8682fe72e20d6ca089e54cdc4c767cfd67f20e7f.scope\": RecentStats: unable to find data in memory cache]" Mar 09 14:27:47 crc kubenswrapper[4722]: I0309 14:27:47.279893 4722 generic.go:334] "Generic (PLEG): container finished" podID="f71670c9-520f-4005-a324-199bc52fac7f" containerID="876323b8a50b38036cbeb67f8682fe72e20d6ca089e54cdc4c767cfd67f20e7f" exitCode=0 Mar 09 14:27:47 crc kubenswrapper[4722]: I0309 14:27:47.279969 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-5884f594db-ls9vv" event={"ID":"f71670c9-520f-4005-a324-199bc52fac7f","Type":"ContainerDied","Data":"876323b8a50b38036cbeb67f8682fe72e20d6ca089e54cdc4c767cfd67f20e7f"} Mar 09 14:27:47 crc kubenswrapper[4722]: I0309 14:27:47.280143 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-5884f594db-ls9vv" event={"ID":"f71670c9-520f-4005-a324-199bc52fac7f","Type":"ContainerDied","Data":"f8354b24d5aa015bff9d0e7c6220ce3b18d67b7dc6b099982162cc991789aeaf"} Mar 09 14:27:47 crc kubenswrapper[4722]: I0309 14:27:47.280161 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f8354b24d5aa015bff9d0e7c6220ce3b18d67b7dc6b099982162cc991789aeaf" Mar 09 14:27:47 crc kubenswrapper[4722]: I0309 14:27:47.330871 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-5884f594db-ls9vv" Mar 09 14:27:47 crc kubenswrapper[4722]: I0309 14:27:47.515734 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f71670c9-520f-4005-a324-199bc52fac7f-combined-ca-bundle\") pod \"f71670c9-520f-4005-a324-199bc52fac7f\" (UID: \"f71670c9-520f-4005-a324-199bc52fac7f\") " Mar 09 14:27:47 crc kubenswrapper[4722]: I0309 14:27:47.516084 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzv6c\" (UniqueName: \"kubernetes.io/projected/f71670c9-520f-4005-a324-199bc52fac7f-kube-api-access-xzv6c\") pod \"f71670c9-520f-4005-a324-199bc52fac7f\" (UID: \"f71670c9-520f-4005-a324-199bc52fac7f\") " Mar 09 14:27:47 crc kubenswrapper[4722]: I0309 14:27:47.516144 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f71670c9-520f-4005-a324-199bc52fac7f-config-data\") pod \"f71670c9-520f-4005-a324-199bc52fac7f\" (UID: \"f71670c9-520f-4005-a324-199bc52fac7f\") " Mar 09 14:27:47 crc kubenswrapper[4722]: I0309 14:27:47.516193 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f71670c9-520f-4005-a324-199bc52fac7f-config-data-custom\") pod \"f71670c9-520f-4005-a324-199bc52fac7f\" (UID: \"f71670c9-520f-4005-a324-199bc52fac7f\") " Mar 09 14:27:47 crc kubenswrapper[4722]: I0309 14:27:47.522110 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f71670c9-520f-4005-a324-199bc52fac7f-kube-api-access-xzv6c" (OuterVolumeSpecName: "kube-api-access-xzv6c") pod "f71670c9-520f-4005-a324-199bc52fac7f" (UID: "f71670c9-520f-4005-a324-199bc52fac7f"). InnerVolumeSpecName "kube-api-access-xzv6c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:27:47 crc kubenswrapper[4722]: I0309 14:27:47.524391 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f71670c9-520f-4005-a324-199bc52fac7f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "f71670c9-520f-4005-a324-199bc52fac7f" (UID: "f71670c9-520f-4005-a324-199bc52fac7f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:27:47 crc kubenswrapper[4722]: I0309 14:27:47.549104 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f71670c9-520f-4005-a324-199bc52fac7f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f71670c9-520f-4005-a324-199bc52fac7f" (UID: "f71670c9-520f-4005-a324-199bc52fac7f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:27:47 crc kubenswrapper[4722]: I0309 14:27:47.578929 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f71670c9-520f-4005-a324-199bc52fac7f-config-data" (OuterVolumeSpecName: "config-data") pod "f71670c9-520f-4005-a324-199bc52fac7f" (UID: "f71670c9-520f-4005-a324-199bc52fac7f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:27:47 crc kubenswrapper[4722]: I0309 14:27:47.618587 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f71670c9-520f-4005-a324-199bc52fac7f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:27:47 crc kubenswrapper[4722]: I0309 14:27:47.618622 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xzv6c\" (UniqueName: \"kubernetes.io/projected/f71670c9-520f-4005-a324-199bc52fac7f-kube-api-access-xzv6c\") on node \"crc\" DevicePath \"\"" Mar 09 14:27:47 crc kubenswrapper[4722]: I0309 14:27:47.618635 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f71670c9-520f-4005-a324-199bc52fac7f-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 14:27:47 crc kubenswrapper[4722]: I0309 14:27:47.618643 4722 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f71670c9-520f-4005-a324-199bc52fac7f-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 09 14:27:48 crc kubenswrapper[4722]: I0309 14:27:48.291356 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-5884f594db-ls9vv" Mar 09 14:27:48 crc kubenswrapper[4722]: I0309 14:27:48.322789 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-5884f594db-ls9vv"] Mar 09 14:27:48 crc kubenswrapper[4722]: I0309 14:27:48.336622 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-5884f594db-ls9vv"] Mar 09 14:27:50 crc kubenswrapper[4722]: I0309 14:27:50.168073 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f71670c9-520f-4005-a324-199bc52fac7f" path="/var/lib/kubelet/pods/f71670c9-520f-4005-a324-199bc52fac7f/volumes" Mar 09 14:27:52 crc kubenswrapper[4722]: I0309 14:27:52.332948 4722 generic.go:334] "Generic (PLEG): container finished" podID="d8345b50-e7db-4e96-8ed3-1c4593079a7e" containerID="5cfcdbdecc8b8ae4a8263e25cb31ca5a7a3a52f2aea68b74ff60f3033bbdbc44" exitCode=0 Mar 09 14:27:52 crc kubenswrapper[4722]: I0309 14:27:52.332986 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-dxrwp" event={"ID":"d8345b50-e7db-4e96-8ed3-1c4593079a7e","Type":"ContainerDied","Data":"5cfcdbdecc8b8ae4a8263e25cb31ca5a7a3a52f2aea68b74ff60f3033bbdbc44"} Mar 09 14:27:53 crc kubenswrapper[4722]: I0309 14:27:53.841921 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-dxrwp" Mar 09 14:27:53 crc kubenswrapper[4722]: I0309 14:27:53.972448 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8345b50-e7db-4e96-8ed3-1c4593079a7e-config-data\") pod \"d8345b50-e7db-4e96-8ed3-1c4593079a7e\" (UID: \"d8345b50-e7db-4e96-8ed3-1c4593079a7e\") " Mar 09 14:27:53 crc kubenswrapper[4722]: I0309 14:27:53.972606 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59nn4\" (UniqueName: \"kubernetes.io/projected/d8345b50-e7db-4e96-8ed3-1c4593079a7e-kube-api-access-59nn4\") pod \"d8345b50-e7db-4e96-8ed3-1c4593079a7e\" (UID: \"d8345b50-e7db-4e96-8ed3-1c4593079a7e\") " Mar 09 14:27:53 crc kubenswrapper[4722]: I0309 14:27:53.972683 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8345b50-e7db-4e96-8ed3-1c4593079a7e-scripts\") pod \"d8345b50-e7db-4e96-8ed3-1c4593079a7e\" (UID: \"d8345b50-e7db-4e96-8ed3-1c4593079a7e\") " Mar 09 14:27:53 crc kubenswrapper[4722]: I0309 14:27:53.972721 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8345b50-e7db-4e96-8ed3-1c4593079a7e-combined-ca-bundle\") pod \"d8345b50-e7db-4e96-8ed3-1c4593079a7e\" (UID: \"d8345b50-e7db-4e96-8ed3-1c4593079a7e\") " Mar 09 14:27:53 crc kubenswrapper[4722]: I0309 14:27:53.987384 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8345b50-e7db-4e96-8ed3-1c4593079a7e-scripts" (OuterVolumeSpecName: "scripts") pod "d8345b50-e7db-4e96-8ed3-1c4593079a7e" (UID: "d8345b50-e7db-4e96-8ed3-1c4593079a7e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:27:53 crc kubenswrapper[4722]: I0309 14:27:53.987622 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8345b50-e7db-4e96-8ed3-1c4593079a7e-kube-api-access-59nn4" (OuterVolumeSpecName: "kube-api-access-59nn4") pod "d8345b50-e7db-4e96-8ed3-1c4593079a7e" (UID: "d8345b50-e7db-4e96-8ed3-1c4593079a7e"). InnerVolumeSpecName "kube-api-access-59nn4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:27:54 crc kubenswrapper[4722]: I0309 14:27:54.013768 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8345b50-e7db-4e96-8ed3-1c4593079a7e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d8345b50-e7db-4e96-8ed3-1c4593079a7e" (UID: "d8345b50-e7db-4e96-8ed3-1c4593079a7e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:27:54 crc kubenswrapper[4722]: I0309 14:27:54.015330 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8345b50-e7db-4e96-8ed3-1c4593079a7e-config-data" (OuterVolumeSpecName: "config-data") pod "d8345b50-e7db-4e96-8ed3-1c4593079a7e" (UID: "d8345b50-e7db-4e96-8ed3-1c4593079a7e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:27:54 crc kubenswrapper[4722]: I0309 14:27:54.077223 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59nn4\" (UniqueName: \"kubernetes.io/projected/d8345b50-e7db-4e96-8ed3-1c4593079a7e-kube-api-access-59nn4\") on node \"crc\" DevicePath \"\"" Mar 09 14:27:54 crc kubenswrapper[4722]: I0309 14:27:54.077268 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8345b50-e7db-4e96-8ed3-1c4593079a7e-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 14:27:54 crc kubenswrapper[4722]: I0309 14:27:54.077287 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8345b50-e7db-4e96-8ed3-1c4593079a7e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:27:54 crc kubenswrapper[4722]: I0309 14:27:54.077305 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8345b50-e7db-4e96-8ed3-1c4593079a7e-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 14:27:54 crc kubenswrapper[4722]: I0309 14:27:54.360915 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-dxrwp" event={"ID":"d8345b50-e7db-4e96-8ed3-1c4593079a7e","Type":"ContainerDied","Data":"e7e0ef759cc0a886a8f82b7eb0c69b4ca4642295e278a1680a314d406cd2e9a9"} Mar 09 14:27:54 crc kubenswrapper[4722]: I0309 14:27:54.360957 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-dxrwp" Mar 09 14:27:54 crc kubenswrapper[4722]: I0309 14:27:54.360967 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e7e0ef759cc0a886a8f82b7eb0c69b4ca4642295e278a1680a314d406cd2e9a9" Mar 09 14:27:54 crc kubenswrapper[4722]: I0309 14:27:54.497117 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 09 14:27:54 crc kubenswrapper[4722]: E0309 14:27:54.497900 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f71670c9-520f-4005-a324-199bc52fac7f" containerName="heat-engine" Mar 09 14:27:54 crc kubenswrapper[4722]: I0309 14:27:54.497919 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f71670c9-520f-4005-a324-199bc52fac7f" containerName="heat-engine" Mar 09 14:27:54 crc kubenswrapper[4722]: E0309 14:27:54.497938 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8345b50-e7db-4e96-8ed3-1c4593079a7e" containerName="nova-cell0-conductor-db-sync" Mar 09 14:27:54 crc kubenswrapper[4722]: I0309 14:27:54.497946 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8345b50-e7db-4e96-8ed3-1c4593079a7e" containerName="nova-cell0-conductor-db-sync" Mar 09 14:27:54 crc kubenswrapper[4722]: I0309 14:27:54.498143 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f71670c9-520f-4005-a324-199bc52fac7f" containerName="heat-engine" Mar 09 14:27:54 crc kubenswrapper[4722]: I0309 14:27:54.498168 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8345b50-e7db-4e96-8ed3-1c4593079a7e" containerName="nova-cell0-conductor-db-sync" Mar 09 14:27:54 crc kubenswrapper[4722]: I0309 14:27:54.498934 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 09 14:27:54 crc kubenswrapper[4722]: I0309 14:27:54.502128 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 09 14:27:54 crc kubenswrapper[4722]: I0309 14:27:54.512302 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 09 14:27:54 crc kubenswrapper[4722]: I0309 14:27:54.516734 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-6n4zr" Mar 09 14:27:54 crc kubenswrapper[4722]: I0309 14:27:54.690757 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b3ff5cc-27b8-4242-b213-41632f062f72-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"2b3ff5cc-27b8-4242-b213-41632f062f72\") " pod="openstack/nova-cell0-conductor-0" Mar 09 14:27:54 crc kubenswrapper[4722]: I0309 14:27:54.690859 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b3ff5cc-27b8-4242-b213-41632f062f72-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"2b3ff5cc-27b8-4242-b213-41632f062f72\") " pod="openstack/nova-cell0-conductor-0" Mar 09 14:27:54 crc kubenswrapper[4722]: I0309 14:27:54.690934 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84n5g\" (UniqueName: \"kubernetes.io/projected/2b3ff5cc-27b8-4242-b213-41632f062f72-kube-api-access-84n5g\") pod \"nova-cell0-conductor-0\" (UID: \"2b3ff5cc-27b8-4242-b213-41632f062f72\") " pod="openstack/nova-cell0-conductor-0" Mar 09 14:27:54 crc kubenswrapper[4722]: I0309 14:27:54.793336 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b3ff5cc-27b8-4242-b213-41632f062f72-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"2b3ff5cc-27b8-4242-b213-41632f062f72\") " pod="openstack/nova-cell0-conductor-0" Mar 09 14:27:54 crc kubenswrapper[4722]: I0309 14:27:54.793687 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b3ff5cc-27b8-4242-b213-41632f062f72-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"2b3ff5cc-27b8-4242-b213-41632f062f72\") " pod="openstack/nova-cell0-conductor-0" Mar 09 14:27:54 crc kubenswrapper[4722]: I0309 14:27:54.793866 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84n5g\" (UniqueName: \"kubernetes.io/projected/2b3ff5cc-27b8-4242-b213-41632f062f72-kube-api-access-84n5g\") pod \"nova-cell0-conductor-0\" (UID: \"2b3ff5cc-27b8-4242-b213-41632f062f72\") " pod="openstack/nova-cell0-conductor-0" Mar 09 14:27:54 crc kubenswrapper[4722]: I0309 14:27:54.800000 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2b3ff5cc-27b8-4242-b213-41632f062f72-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"2b3ff5cc-27b8-4242-b213-41632f062f72\") " pod="openstack/nova-cell0-conductor-0" Mar 09 14:27:54 crc kubenswrapper[4722]: I0309 14:27:54.802475 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b3ff5cc-27b8-4242-b213-41632f062f72-config-data\") pod \"nova-cell0-conductor-0\" 
(UID: \"2b3ff5cc-27b8-4242-b213-41632f062f72\") " pod="openstack/nova-cell0-conductor-0" Mar 09 14:27:54 crc kubenswrapper[4722]: I0309 14:27:54.813334 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84n5g\" (UniqueName: \"kubernetes.io/projected/2b3ff5cc-27b8-4242-b213-41632f062f72-kube-api-access-84n5g\") pod \"nova-cell0-conductor-0\" (UID: \"2b3ff5cc-27b8-4242-b213-41632f062f72\") " pod="openstack/nova-cell0-conductor-0" Mar 09 14:27:54 crc kubenswrapper[4722]: I0309 14:27:54.815626 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 09 14:27:55 crc kubenswrapper[4722]: I0309 14:27:55.313848 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 09 14:27:55 crc kubenswrapper[4722]: I0309 14:27:55.375094 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"2b3ff5cc-27b8-4242-b213-41632f062f72","Type":"ContainerStarted","Data":"b3861c714719c0bcf26e457d7daa5c9251a203b2bd6a39f5088ac3bb0d362c83"} Mar 09 14:27:56 crc kubenswrapper[4722]: I0309 14:27:56.394349 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"2b3ff5cc-27b8-4242-b213-41632f062f72","Type":"ContainerStarted","Data":"96e81ee53e9dbce11d5850ceff97c210ee451b4fd7afa6e5f0a7332932570bec"} Mar 09 14:27:56 crc kubenswrapper[4722]: I0309 14:27:56.394863 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 09 14:27:56 crc kubenswrapper[4722]: I0309 14:27:56.415026 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.415004035 podStartE2EDuration="2.415004035s" podCreationTimestamp="2026-03-09 14:27:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:27:56.414223853 +0000 UTC m=+1516.969792459" watchObservedRunningTime="2026-03-09 14:27:56.415004035 +0000 UTC m=+1516.970572631" Mar 09 14:28:00 crc kubenswrapper[4722]: I0309 14:28:00.135683 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551108-xnm76"] Mar 09 14:28:00 crc kubenswrapper[4722]: I0309 14:28:00.138595 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551108-xnm76" Mar 09 14:28:00 crc kubenswrapper[4722]: I0309 14:28:00.141954 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 14:28:00 crc kubenswrapper[4722]: I0309 14:28:00.143257 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 14:28:00 crc kubenswrapper[4722]: I0309 14:28:00.143469 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-dr2x6" Mar 09 14:28:00 crc kubenswrapper[4722]: I0309 14:28:00.146455 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551108-xnm76"] Mar 09 14:28:00 crc kubenswrapper[4722]: I0309 14:28:00.230216 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jz9p\" (UniqueName: \"kubernetes.io/projected/6a5f5ad3-9957-4077-b5a1-7a53d5c4caaf-kube-api-access-5jz9p\") pod \"auto-csr-approver-29551108-xnm76\" (UID: \"6a5f5ad3-9957-4077-b5a1-7a53d5c4caaf\") " pod="openshift-infra/auto-csr-approver-29551108-xnm76" Mar 09 14:28:00 crc kubenswrapper[4722]: I0309 14:28:00.331871 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jz9p\" (UniqueName: \"kubernetes.io/projected/6a5f5ad3-9957-4077-b5a1-7a53d5c4caaf-kube-api-access-5jz9p\") pod \"auto-csr-approver-29551108-xnm76\" (UID: \"6a5f5ad3-9957-4077-b5a1-7a53d5c4caaf\") " pod="openshift-infra/auto-csr-approver-29551108-xnm76" Mar 09 14:28:00 crc kubenswrapper[4722]: I0309 14:28:00.355339 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jz9p\" (UniqueName: \"kubernetes.io/projected/6a5f5ad3-9957-4077-b5a1-7a53d5c4caaf-kube-api-access-5jz9p\") pod \"auto-csr-approver-29551108-xnm76\" (UID: \"6a5f5ad3-9957-4077-b5a1-7a53d5c4caaf\") " pod="openshift-infra/auto-csr-approver-29551108-xnm76" Mar 09 14:28:00 crc kubenswrapper[4722]: I0309 14:28:00.458030 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551108-xnm76" Mar 09 14:28:00 crc kubenswrapper[4722]: I0309 14:28:00.992509 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551108-xnm76"] Mar 09 14:28:01 crc kubenswrapper[4722]: I0309 14:28:01.450090 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551108-xnm76" event={"ID":"6a5f5ad3-9957-4077-b5a1-7a53d5c4caaf","Type":"ContainerStarted","Data":"a493da62570c0f0e2c23d299ea6128c2cac67a61892543dafe7b98e304247713"} Mar 09 14:28:02 crc kubenswrapper[4722]: I0309 14:28:02.481813 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="f7cd25c5-3f3b-4b28-b96f-0aa6114b498c" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 09 14:28:02 crc kubenswrapper[4722]: I0309 14:28:02.919857 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-kf5wr"] Mar 09 14:28:02 crc kubenswrapper[4722]: I0309 14:28:02.928352 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-kf5wr" Mar 09 14:28:02 crc kubenswrapper[4722]: I0309 14:28:02.931625 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-kf5wr"] Mar 09 14:28:03 crc kubenswrapper[4722]: I0309 14:28:03.025645 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-6e64-account-create-update-sddxq"] Mar 09 14:28:03 crc kubenswrapper[4722]: I0309 14:28:03.028041 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-6e64-account-create-update-sddxq" Mar 09 14:28:03 crc kubenswrapper[4722]: I0309 14:28:03.032923 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret" Mar 09 14:28:03 crc kubenswrapper[4722]: I0309 14:28:03.039533 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-6e64-account-create-update-sddxq"] Mar 09 14:28:03 crc kubenswrapper[4722]: I0309 14:28:03.121764 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f48afe9d-4a0b-4d74-b9c4-a67f1fd098f3-operator-scripts\") pod \"aodh-db-create-kf5wr\" (UID: \"f48afe9d-4a0b-4d74-b9c4-a67f1fd098f3\") " pod="openstack/aodh-db-create-kf5wr" Mar 09 14:28:03 crc kubenswrapper[4722]: I0309 14:28:03.121991 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z75zx\" (UniqueName: \"kubernetes.io/projected/f48afe9d-4a0b-4d74-b9c4-a67f1fd098f3-kube-api-access-z75zx\") pod \"aodh-db-create-kf5wr\" (UID: \"f48afe9d-4a0b-4d74-b9c4-a67f1fd098f3\") " pod="openstack/aodh-db-create-kf5wr" Mar 09 14:28:03 crc kubenswrapper[4722]: I0309 14:28:03.122228 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x72ld\" (UniqueName: \"kubernetes.io/projected/cb4ac952-30e4-4b7d-866f-5bd6c9825bb2-kube-api-access-x72ld\") pod \"aodh-6e64-account-create-update-sddxq\" (UID: \"cb4ac952-30e4-4b7d-866f-5bd6c9825bb2\") " pod="openstack/aodh-6e64-account-create-update-sddxq" Mar 09 14:28:03 crc kubenswrapper[4722]: I0309 14:28:03.122262 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cb4ac952-30e4-4b7d-866f-5bd6c9825bb2-operator-scripts\") pod \"aodh-6e64-account-create-update-sddxq\" (UID: \"cb4ac952-30e4-4b7d-866f-5bd6c9825bb2\") " pod="openstack/aodh-6e64-account-create-update-sddxq" Mar 09 14:28:03 crc kubenswrapper[4722]: I0309 14:28:03.224630 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f48afe9d-4a0b-4d74-b9c4-a67f1fd098f3-operator-scripts\") pod \"aodh-db-create-kf5wr\" (UID: \"f48afe9d-4a0b-4d74-b9c4-a67f1fd098f3\") " pod="openstack/aodh-db-create-kf5wr" Mar 09 14:28:03 crc kubenswrapper[4722]: I0309 14:28:03.224747 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z75zx\" (UniqueName: \"kubernetes.io/projected/f48afe9d-4a0b-4d74-b9c4-a67f1fd098f3-kube-api-access-z75zx\") pod \"aodh-db-create-kf5wr\" (UID: \"f48afe9d-4a0b-4d74-b9c4-a67f1fd098f3\") " pod="openstack/aodh-db-create-kf5wr" Mar 09 14:28:03 crc kubenswrapper[4722]: I0309 14:28:03.224835 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x72ld\" (UniqueName: 
\"kubernetes.io/projected/cb4ac952-30e4-4b7d-866f-5bd6c9825bb2-kube-api-access-x72ld\") pod \"aodh-6e64-account-create-update-sddxq\" (UID: \"cb4ac952-30e4-4b7d-866f-5bd6c9825bb2\") " pod="openstack/aodh-6e64-account-create-update-sddxq" Mar 09 14:28:03 crc kubenswrapper[4722]: I0309 14:28:03.224863 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cb4ac952-30e4-4b7d-866f-5bd6c9825bb2-operator-scripts\") pod \"aodh-6e64-account-create-update-sddxq\" (UID: \"cb4ac952-30e4-4b7d-866f-5bd6c9825bb2\") " pod="openstack/aodh-6e64-account-create-update-sddxq" Mar 09 14:28:03 crc kubenswrapper[4722]: I0309 14:28:03.225929 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cb4ac952-30e4-4b7d-866f-5bd6c9825bb2-operator-scripts\") pod \"aodh-6e64-account-create-update-sddxq\" (UID: \"cb4ac952-30e4-4b7d-866f-5bd6c9825bb2\") " pod="openstack/aodh-6e64-account-create-update-sddxq" Mar 09 14:28:03 crc kubenswrapper[4722]: I0309 14:28:03.226180 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f48afe9d-4a0b-4d74-b9c4-a67f1fd098f3-operator-scripts\") pod \"aodh-db-create-kf5wr\" (UID: \"f48afe9d-4a0b-4d74-b9c4-a67f1fd098f3\") " pod="openstack/aodh-db-create-kf5wr" Mar 09 14:28:03 crc kubenswrapper[4722]: I0309 14:28:03.243868 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x72ld\" (UniqueName: \"kubernetes.io/projected/cb4ac952-30e4-4b7d-866f-5bd6c9825bb2-kube-api-access-x72ld\") pod \"aodh-6e64-account-create-update-sddxq\" (UID: \"cb4ac952-30e4-4b7d-866f-5bd6c9825bb2\") " pod="openstack/aodh-6e64-account-create-update-sddxq" Mar 09 14:28:03 crc kubenswrapper[4722]: I0309 14:28:03.244079 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z75zx\" (UniqueName: \"kubernetes.io/projected/f48afe9d-4a0b-4d74-b9c4-a67f1fd098f3-kube-api-access-z75zx\") pod \"aodh-db-create-kf5wr\" (UID: \"f48afe9d-4a0b-4d74-b9c4-a67f1fd098f3\") " pod="openstack/aodh-db-create-kf5wr" Mar 09 14:28:03 crc kubenswrapper[4722]: I0309 14:28:03.257397 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-kf5wr" Mar 09 14:28:03 crc kubenswrapper[4722]: I0309 14:28:03.356440 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-6e64-account-create-update-sddxq" Mar 09 14:28:03 crc kubenswrapper[4722]: I0309 14:28:03.492521 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551108-xnm76" event={"ID":"6a5f5ad3-9957-4077-b5a1-7a53d5c4caaf","Type":"ContainerStarted","Data":"231703ca880ff1453e2218d6219ba5ae4b08bd9416a34771d6be6aabef221562"} Mar 09 14:28:03 crc kubenswrapper[4722]: I0309 14:28:03.525522 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551108-xnm76" podStartSLOduration=2.320378592 podStartE2EDuration="3.525500745s" podCreationTimestamp="2026-03-09 14:28:00 +0000 UTC" firstStartedPulling="2026-03-09 14:28:01.008355993 +0000 UTC m=+1521.563924569" lastFinishedPulling="2026-03-09 14:28:02.213478146 +0000 UTC m=+1522.769046722" observedRunningTime="2026-03-09 14:28:03.512903417 +0000 UTC m=+1524.068472003" watchObservedRunningTime="2026-03-09 14:28:03.525500745 +0000 UTC m=+1524.081069321" Mar 09 14:28:03 crc kubenswrapper[4722]: I0309 14:28:03.714676 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-kf5wr"] Mar 09 14:28:03 crc kubenswrapper[4722]: I0309 14:28:03.893597 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-6e64-account-create-update-sddxq"] Mar 09 14:28:04 crc kubenswrapper[4722]: I0309 14:28:04.504884 4722 generic.go:334] "Generic (PLEG): container finished" podID="f48afe9d-4a0b-4d74-b9c4-a67f1fd098f3" containerID="d634f9a55d0ddc3d3c58d5a7f5aa0b15dcc0956bd1f9c086901e198d8e8e3d3e" exitCode=0 Mar 09 14:28:04 crc kubenswrapper[4722]: I0309 14:28:04.504934 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-kf5wr" event={"ID":"f48afe9d-4a0b-4d74-b9c4-a67f1fd098f3","Type":"ContainerDied","Data":"d634f9a55d0ddc3d3c58d5a7f5aa0b15dcc0956bd1f9c086901e198d8e8e3d3e"} Mar 09 14:28:04 crc kubenswrapper[4722]: I0309 14:28:04.505297 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-kf5wr" event={"ID":"f48afe9d-4a0b-4d74-b9c4-a67f1fd098f3","Type":"ContainerStarted","Data":"543727dec0f2fe15ac4b814fa208a782fdbcd4b775b23721d7c2e90354e63daf"} Mar 09 14:28:04 crc kubenswrapper[4722]: I0309 14:28:04.507005 4722 generic.go:334] "Generic (PLEG): container finished" podID="6a5f5ad3-9957-4077-b5a1-7a53d5c4caaf" containerID="231703ca880ff1453e2218d6219ba5ae4b08bd9416a34771d6be6aabef221562" exitCode=0 Mar 09 14:28:04 crc kubenswrapper[4722]: I0309 14:28:04.507100 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551108-xnm76" event={"ID":"6a5f5ad3-9957-4077-b5a1-7a53d5c4caaf","Type":"ContainerDied","Data":"231703ca880ff1453e2218d6219ba5ae4b08bd9416a34771d6be6aabef221562"} Mar 09 14:28:04 crc kubenswrapper[4722]: I0309 14:28:04.508840 4722 generic.go:334] "Generic (PLEG): container finished" podID="cb4ac952-30e4-4b7d-866f-5bd6c9825bb2" containerID="a76ff0649e5c91ef5ff4b1a327c4c8abcf7793a48f4877104cf10541009801e1" exitCode=0 Mar 09 14:28:04 crc kubenswrapper[4722]: I0309 14:28:04.508895 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-6e64-account-create-update-sddxq" event={"ID":"cb4ac952-30e4-4b7d-866f-5bd6c9825bb2","Type":"ContainerDied","Data":"a76ff0649e5c91ef5ff4b1a327c4c8abcf7793a48f4877104cf10541009801e1"} Mar 09 14:28:04 crc kubenswrapper[4722]: I0309 14:28:04.508927 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/aodh-6e64-account-create-update-sddxq" event={"ID":"cb4ac952-30e4-4b7d-866f-5bd6c9825bb2","Type":"ContainerStarted","Data":"45d652304944749910a264c486d08fbb70e5a7087c6e86ae802b6ab3cf9bb959"} Mar 09 14:28:04 crc kubenswrapper[4722]: I0309 14:28:04.866183 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 09 14:28:05 crc kubenswrapper[4722]: I0309 14:28:05.460985 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-8qttf"] Mar 09 14:28:05 crc kubenswrapper[4722]: I0309 14:28:05.463033 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-8qttf" Mar 09 14:28:05 crc kubenswrapper[4722]: I0309 14:28:05.466591 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 09 14:28:05 crc kubenswrapper[4722]: I0309 14:28:05.466822 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 09 14:28:05 crc kubenswrapper[4722]: I0309 14:28:05.475757 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-8qttf"] Mar 09 14:28:05 crc kubenswrapper[4722]: I0309 14:28:05.589670 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa375409-8285-4709-8abd-1916c59a6566-scripts\") pod \"nova-cell0-cell-mapping-8qttf\" (UID: \"aa375409-8285-4709-8abd-1916c59a6566\") " pod="openstack/nova-cell0-cell-mapping-8qttf" Mar 09 14:28:05 crc kubenswrapper[4722]: I0309 14:28:05.589773 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa375409-8285-4709-8abd-1916c59a6566-config-data\") pod \"nova-cell0-cell-mapping-8qttf\" (UID: \"aa375409-8285-4709-8abd-1916c59a6566\") " pod="openstack/nova-cell0-cell-mapping-8qttf" Mar 09 14:28:05 crc kubenswrapper[4722]: I0309 14:28:05.589842 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa375409-8285-4709-8abd-1916c59a6566-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-8qttf\" (UID: \"aa375409-8285-4709-8abd-1916c59a6566\") " pod="openstack/nova-cell0-cell-mapping-8qttf" Mar 09 14:28:05 crc kubenswrapper[4722]: I0309 14:28:05.589899 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkkpb\" (UniqueName: \"kubernetes.io/projected/aa375409-8285-4709-8abd-1916c59a6566-kube-api-access-hkkpb\") pod \"nova-cell0-cell-mapping-8qttf\" (UID: \"aa375409-8285-4709-8abd-1916c59a6566\") " pod="openstack/nova-cell0-cell-mapping-8qttf" Mar 09 14:28:05 crc kubenswrapper[4722]: I0309 14:28:05.667350 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 09 14:28:05 crc kubenswrapper[4722]: I0309 14:28:05.668830 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 09 14:28:05 crc kubenswrapper[4722]: I0309 14:28:05.673827 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 09 14:28:05 crc kubenswrapper[4722]: I0309 14:28:05.692609 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa375409-8285-4709-8abd-1916c59a6566-scripts\") pod \"nova-cell0-cell-mapping-8qttf\" (UID: \"aa375409-8285-4709-8abd-1916c59a6566\") " pod="openstack/nova-cell0-cell-mapping-8qttf" Mar 09 14:28:05 crc kubenswrapper[4722]: I0309 14:28:05.692691 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa375409-8285-4709-8abd-1916c59a6566-config-data\") pod \"nova-cell0-cell-mapping-8qttf\" (UID: \"aa375409-8285-4709-8abd-1916c59a6566\") " pod="openstack/nova-cell0-cell-mapping-8qttf" Mar 09 14:28:05 crc kubenswrapper[4722]: I0309 14:28:05.692735 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa375409-8285-4709-8abd-1916c59a6566-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-8qttf\" (UID: \"aa375409-8285-4709-8abd-1916c59a6566\") " pod="openstack/nova-cell0-cell-mapping-8qttf" Mar 09 14:28:05 crc kubenswrapper[4722]: I0309 14:28:05.692778 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkkpb\" (UniqueName: \"kubernetes.io/projected/aa375409-8285-4709-8abd-1916c59a6566-kube-api-access-hkkpb\") pod \"nova-cell0-cell-mapping-8qttf\" (UID: \"aa375409-8285-4709-8abd-1916c59a6566\") " pod="openstack/nova-cell0-cell-mapping-8qttf" Mar 09 14:28:05 crc kubenswrapper[4722]: I0309 14:28:05.696084 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 09 14:28:05 crc kubenswrapper[4722]: I0309 14:28:05.711195 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa375409-8285-4709-8abd-1916c59a6566-scripts\") pod \"nova-cell0-cell-mapping-8qttf\" (UID: \"aa375409-8285-4709-8abd-1916c59a6566\") " pod="openstack/nova-cell0-cell-mapping-8qttf" Mar 09 14:28:05 crc kubenswrapper[4722]: I0309 14:28:05.711429 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa375409-8285-4709-8abd-1916c59a6566-config-data\") pod \"nova-cell0-cell-mapping-8qttf\" (UID: \"aa375409-8285-4709-8abd-1916c59a6566\") " pod="openstack/nova-cell0-cell-mapping-8qttf" Mar 09 14:28:05 crc kubenswrapper[4722]: I0309 14:28:05.722823 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa375409-8285-4709-8abd-1916c59a6566-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-8qttf\" (UID: \"aa375409-8285-4709-8abd-1916c59a6566\") " pod="openstack/nova-cell0-cell-mapping-8qttf" Mar 09 14:28:05 crc kubenswrapper[4722]: I0309 14:28:05.752519 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkkpb\" (UniqueName: \"kubernetes.io/projected/aa375409-8285-4709-8abd-1916c59a6566-kube-api-access-hkkpb\") pod \"nova-cell0-cell-mapping-8qttf\" (UID: \"aa375409-8285-4709-8abd-1916c59a6566\") " pod="openstack/nova-cell0-cell-mapping-8qttf" Mar 09 14:28:05 crc kubenswrapper[4722]: I0309 14:28:05.795422 4722 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70dbd0c7-a6dc-4f0c-aff3-1de023a615f4-config-data\") pod \"nova-scheduler-0\" (UID: \"70dbd0c7-a6dc-4f0c-aff3-1de023a615f4\") " pod="openstack/nova-scheduler-0" Mar 09 14:28:05 crc kubenswrapper[4722]: I0309 14:28:05.795838 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70dbd0c7-a6dc-4f0c-aff3-1de023a615f4-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"70dbd0c7-a6dc-4f0c-aff3-1de023a615f4\") " pod="openstack/nova-scheduler-0" Mar 09 14:28:05 crc kubenswrapper[4722]: I0309 14:28:05.795870 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kds6z\" (UniqueName: \"kubernetes.io/projected/70dbd0c7-a6dc-4f0c-aff3-1de023a615f4-kube-api-access-kds6z\") pod \"nova-scheduler-0\" (UID: \"70dbd0c7-a6dc-4f0c-aff3-1de023a615f4\") " pod="openstack/nova-scheduler-0" Mar 09 14:28:05 crc kubenswrapper[4722]: I0309 14:28:05.796164 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-8qttf" Mar 09 14:28:05 crc kubenswrapper[4722]: I0309 14:28:05.799660 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 09 14:28:05 crc kubenswrapper[4722]: I0309 14:28:05.803728 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 09 14:28:05 crc kubenswrapper[4722]: I0309 14:28:05.832964 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 09 14:28:05 crc kubenswrapper[4722]: I0309 14:28:05.836482 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 09 14:28:05 crc kubenswrapper[4722]: I0309 14:28:05.898325 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70dbd0c7-a6dc-4f0c-aff3-1de023a615f4-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"70dbd0c7-a6dc-4f0c-aff3-1de023a615f4\") " pod="openstack/nova-scheduler-0" Mar 09 14:28:05 crc kubenswrapper[4722]: I0309 14:28:05.898405 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kds6z\" (UniqueName: \"kubernetes.io/projected/70dbd0c7-a6dc-4f0c-aff3-1de023a615f4-kube-api-access-kds6z\") pod \"nova-scheduler-0\" (UID: \"70dbd0c7-a6dc-4f0c-aff3-1de023a615f4\") " pod="openstack/nova-scheduler-0" Mar 09 14:28:05 crc kubenswrapper[4722]: I0309 14:28:05.898554 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70dbd0c7-a6dc-4f0c-aff3-1de023a615f4-config-data\") pod \"nova-scheduler-0\" (UID: \"70dbd0c7-a6dc-4f0c-aff3-1de023a615f4\") " pod="openstack/nova-scheduler-0" Mar 09 14:28:05 crc kubenswrapper[4722]: I0309 14:28:05.909147 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70dbd0c7-a6dc-4f0c-aff3-1de023a615f4-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"70dbd0c7-a6dc-4f0c-aff3-1de023a615f4\") " pod="openstack/nova-scheduler-0" Mar 09 14:28:05 crc kubenswrapper[4722]: I0309 14:28:05.941231 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kds6z\" (UniqueName: 
\"kubernetes.io/projected/70dbd0c7-a6dc-4f0c-aff3-1de023a615f4-kube-api-access-kds6z\") pod \"nova-scheduler-0\" (UID: \"70dbd0c7-a6dc-4f0c-aff3-1de023a615f4\") " pod="openstack/nova-scheduler-0" Mar 09 14:28:05 crc kubenswrapper[4722]: I0309 14:28:05.966013 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70dbd0c7-a6dc-4f0c-aff3-1de023a615f4-config-data\") pod \"nova-scheduler-0\" (UID: \"70dbd0c7-a6dc-4f0c-aff3-1de023a615f4\") " pod="openstack/nova-scheduler-0" Mar 09 14:28:05 crc kubenswrapper[4722]: I0309 14:28:05.973281 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 09 14:28:05 crc kubenswrapper[4722]: I0309 14:28:05.976077 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 09 14:28:05 crc kubenswrapper[4722]: I0309 14:28:05.996313 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 09 14:28:06 crc kubenswrapper[4722]: I0309 14:28:06.000856 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/035524e8-b163-4660-8e2d-b90ebcbef082-config-data\") pod \"nova-api-0\" (UID: \"035524e8-b163-4660-8e2d-b90ebcbef082\") " pod="openstack/nova-api-0" Mar 09 14:28:06 crc kubenswrapper[4722]: I0309 14:28:06.000938 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tl947\" (UniqueName: \"kubernetes.io/projected/035524e8-b163-4660-8e2d-b90ebcbef082-kube-api-access-tl947\") pod \"nova-api-0\" (UID: \"035524e8-b163-4660-8e2d-b90ebcbef082\") " pod="openstack/nova-api-0" Mar 09 14:28:06 crc kubenswrapper[4722]: I0309 14:28:06.000961 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/035524e8-b163-4660-8e2d-b90ebcbef082-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"035524e8-b163-4660-8e2d-b90ebcbef082\") " pod="openstack/nova-api-0" Mar 09 14:28:06 crc kubenswrapper[4722]: I0309 14:28:06.001048 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/035524e8-b163-4660-8e2d-b90ebcbef082-logs\") pod \"nova-api-0\" (UID: \"035524e8-b163-4660-8e2d-b90ebcbef082\") " pod="openstack/nova-api-0" Mar 09 14:28:06 crc kubenswrapper[4722]: I0309 14:28:06.047603 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 14:28:06 crc kubenswrapper[4722]: I0309 14:28:06.103266 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce2e955c-c37e-4d29-9887-0daf8cb6ceea-logs\") pod \"nova-metadata-0\" (UID: \"ce2e955c-c37e-4d29-9887-0daf8cb6ceea\") " pod="openstack/nova-metadata-0" Mar 09 14:28:06 crc kubenswrapper[4722]: I0309 14:28:06.103353 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7tp4\" (UniqueName: \"kubernetes.io/projected/ce2e955c-c37e-4d29-9887-0daf8cb6ceea-kube-api-access-p7tp4\") pod \"nova-metadata-0\" (UID: \"ce2e955c-c37e-4d29-9887-0daf8cb6ceea\") " pod="openstack/nova-metadata-0" Mar 09 14:28:06 crc kubenswrapper[4722]: I0309 14:28:06.103441 4722 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/035524e8-b163-4660-8e2d-b90ebcbef082-logs\") pod \"nova-api-0\" (UID: \"035524e8-b163-4660-8e2d-b90ebcbef082\") " pod="openstack/nova-api-0" Mar 09 14:28:06 crc kubenswrapper[4722]: I0309 14:28:06.103538 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce2e955c-c37e-4d29-9887-0daf8cb6ceea-config-data\") pod \"nova-metadata-0\" (UID: \"ce2e955c-c37e-4d29-9887-0daf8cb6ceea\") " pod="openstack/nova-metadata-0" Mar 09 14:28:06 crc kubenswrapper[4722]: I0309 14:28:06.107635 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/035524e8-b163-4660-8e2d-b90ebcbef082-logs\") pod \"nova-api-0\" (UID: \"035524e8-b163-4660-8e2d-b90ebcbef082\") " pod="openstack/nova-api-0" Mar 09 14:28:06 crc kubenswrapper[4722]: I0309 14:28:06.114330 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/035524e8-b163-4660-8e2d-b90ebcbef082-config-data\") pod \"nova-api-0\" (UID: \"035524e8-b163-4660-8e2d-b90ebcbef082\") " pod="openstack/nova-api-0" Mar 09 14:28:06 crc kubenswrapper[4722]: I0309 14:28:06.114478 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce2e955c-c37e-4d29-9887-0daf8cb6ceea-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ce2e955c-c37e-4d29-9887-0daf8cb6ceea\") " pod="openstack/nova-metadata-0" Mar 09 14:28:06 crc kubenswrapper[4722]: I0309 14:28:06.114563 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/035524e8-b163-4660-8e2d-b90ebcbef082-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"035524e8-b163-4660-8e2d-b90ebcbef082\") " pod="openstack/nova-api-0" Mar 09 14:28:06 crc kubenswrapper[4722]: I0309 14:28:06.114597 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tl947\" (UniqueName: \"kubernetes.io/projected/035524e8-b163-4660-8e2d-b90ebcbef082-kube-api-access-tl947\") pod \"nova-api-0\" (UID: \"035524e8-b163-4660-8e2d-b90ebcbef082\") " pod="openstack/nova-api-0" Mar 09 14:28:06 crc kubenswrapper[4722]: I0309 14:28:06.139558 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/035524e8-b163-4660-8e2d-b90ebcbef082-config-data\") pod \"nova-api-0\" (UID: \"035524e8-b163-4660-8e2d-b90ebcbef082\") " pod="openstack/nova-api-0" Mar 09 14:28:06 crc kubenswrapper[4722]: I0309 14:28:06.140919 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/035524e8-b163-4660-8e2d-b90ebcbef082-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"035524e8-b163-4660-8e2d-b90ebcbef082\") " pod="openstack/nova-api-0" Mar 09 14:28:06 crc kubenswrapper[4722]: I0309 14:28:06.145906 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 09 14:28:06 crc kubenswrapper[4722]: I0309 14:28:06.147799 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 09 14:28:06 crc kubenswrapper[4722]: I0309 14:28:06.160668 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 09 14:28:06 crc kubenswrapper[4722]: I0309 14:28:06.163169 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 09 14:28:06 crc kubenswrapper[4722]: I0309 14:28:06.173519 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tl947\" (UniqueName: \"kubernetes.io/projected/035524e8-b163-4660-8e2d-b90ebcbef082-kube-api-access-tl947\") pod \"nova-api-0\" (UID: \"035524e8-b163-4660-8e2d-b90ebcbef082\") " pod="openstack/nova-api-0" Mar 09 14:28:06 crc kubenswrapper[4722]: I0309 14:28:06.193696 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 09 14:28:06 crc kubenswrapper[4722]: I0309 14:28:06.204706 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 09 14:28:06 crc kubenswrapper[4722]: I0309 14:28:06.216434 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7tp4\" (UniqueName: \"kubernetes.io/projected/ce2e955c-c37e-4d29-9887-0daf8cb6ceea-kube-api-access-p7tp4\") pod \"nova-metadata-0\" (UID: \"ce2e955c-c37e-4d29-9887-0daf8cb6ceea\") " pod="openstack/nova-metadata-0" Mar 09 14:28:06 crc kubenswrapper[4722]: I0309 14:28:06.216570 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce2e955c-c37e-4d29-9887-0daf8cb6ceea-config-data\") pod \"nova-metadata-0\" (UID: \"ce2e955c-c37e-4d29-9887-0daf8cb6ceea\") " pod="openstack/nova-metadata-0" Mar 09 14:28:06 crc kubenswrapper[4722]: I0309 14:28:06.216663 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce2e955c-c37e-4d29-9887-0daf8cb6ceea-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ce2e955c-c37e-4d29-9887-0daf8cb6ceea\") " pod="openstack/nova-metadata-0" Mar 09 14:28:06 crc kubenswrapper[4722]: I0309 14:28:06.216706 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce2e955c-c37e-4d29-9887-0daf8cb6ceea-logs\") pod \"nova-metadata-0\" (UID: \"ce2e955c-c37e-4d29-9887-0daf8cb6ceea\") " pod="openstack/nova-metadata-0" Mar 09 14:28:06 crc kubenswrapper[4722]: I0309 14:28:06.224093 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-s8lxr"] Mar 09 14:28:06 crc kubenswrapper[4722]: I0309 14:28:06.224612 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce2e955c-c37e-4d29-9887-0daf8cb6ceea-logs\") pod \"nova-metadata-0\" (UID: \"ce2e955c-c37e-4d29-9887-0daf8cb6ceea\") " pod="openstack/nova-metadata-0" Mar 09 14:28:06 crc kubenswrapper[4722]: I0309 14:28:06.226488 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9b86998b5-s8lxr" Mar 09 14:28:06 crc kubenswrapper[4722]: I0309 14:28:06.246839 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce2e955c-c37e-4d29-9887-0daf8cb6ceea-config-data\") pod \"nova-metadata-0\" (UID: \"ce2e955c-c37e-4d29-9887-0daf8cb6ceea\") " pod="openstack/nova-metadata-0" Mar 09 14:28:06 crc kubenswrapper[4722]: I0309 14:28:06.249362 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce2e955c-c37e-4d29-9887-0daf8cb6ceea-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ce2e955c-c37e-4d29-9887-0daf8cb6ceea\") " pod="openstack/nova-metadata-0" Mar 09 14:28:06 crc kubenswrapper[4722]: I0309 14:28:06.255840 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-s8lxr"] Mar 09 14:28:06 crc kubenswrapper[4722]: I0309 14:28:06.280632 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7tp4\" (UniqueName: \"kubernetes.io/projected/ce2e955c-c37e-4d29-9887-0daf8cb6ceea-kube-api-access-p7tp4\") pod \"nova-metadata-0\" (UID: \"ce2e955c-c37e-4d29-9887-0daf8cb6ceea\") " pod="openstack/nova-metadata-0" Mar 09 14:28:06 crc kubenswrapper[4722]: I0309 14:28:06.325862 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/181d67a9-9459-4457-a22f-ba515776d23c-dns-swift-storage-0\") pod \"dnsmasq-dns-9b86998b5-s8lxr\" (UID: \"181d67a9-9459-4457-a22f-ba515776d23c\") " pod="openstack/dnsmasq-dns-9b86998b5-s8lxr" Mar 09 14:28:06 crc kubenswrapper[4722]: I0309 14:28:06.325919 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/181d67a9-9459-4457-a22f-ba515776d23c-ovsdbserver-nb\") pod \"dnsmasq-dns-9b86998b5-s8lxr\" (UID: \"181d67a9-9459-4457-a22f-ba515776d23c\") " pod="openstack/dnsmasq-dns-9b86998b5-s8lxr" Mar 09 14:28:06 crc kubenswrapper[4722]: I0309 14:28:06.325968 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfdb4207-6d78-4641-809a-83fc15cd750e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"cfdb4207-6d78-4641-809a-83fc15cd750e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 14:28:06 crc kubenswrapper[4722]: I0309 14:28:06.326013 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/181d67a9-9459-4457-a22f-ba515776d23c-config\") pod \"dnsmasq-dns-9b86998b5-s8lxr\" (UID: \"181d67a9-9459-4457-a22f-ba515776d23c\") " pod="openstack/dnsmasq-dns-9b86998b5-s8lxr" Mar 09 14:28:06 crc kubenswrapper[4722]: I0309 14:28:06.326138 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/181d67a9-9459-4457-a22f-ba515776d23c-dns-svc\") pod \"dnsmasq-dns-9b86998b5-s8lxr\" (UID: \"181d67a9-9459-4457-a22f-ba515776d23c\") " pod="openstack/dnsmasq-dns-9b86998b5-s8lxr" Mar 09 14:28:06 crc kubenswrapper[4722]: I0309 14:28:06.327006 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/181d67a9-9459-4457-a22f-ba515776d23c-ovsdbserver-sb\") pod \"dnsmasq-dns-9b86998b5-s8lxr\" (UID: \"181d67a9-9459-4457-a22f-ba515776d23c\") " pod="openstack/dnsmasq-dns-9b86998b5-s8lxr" Mar 09 14:28:06 crc kubenswrapper[4722]: I0309 14:28:06.327112 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xslrb\" (UniqueName: \"kubernetes.io/projected/181d67a9-9459-4457-a22f-ba515776d23c-kube-api-access-xslrb\") pod \"dnsmasq-dns-9b86998b5-s8lxr\" (UID: \"181d67a9-9459-4457-a22f-ba515776d23c\") " pod="openstack/dnsmasq-dns-9b86998b5-s8lxr" Mar 09 14:28:06 crc kubenswrapper[4722]: I0309 14:28:06.327226 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kj2pp\" (UniqueName: \"kubernetes.io/projected/cfdb4207-6d78-4641-809a-83fc15cd750e-kube-api-access-kj2pp\") pod \"nova-cell1-novncproxy-0\" (UID: \"cfdb4207-6d78-4641-809a-83fc15cd750e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 14:28:06 crc kubenswrapper[4722]: I0309 14:28:06.327325 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfdb4207-6d78-4641-809a-83fc15cd750e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"cfdb4207-6d78-4641-809a-83fc15cd750e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 14:28:06 crc kubenswrapper[4722]: I0309 14:28:06.429125 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kj2pp\" (UniqueName: \"kubernetes.io/projected/cfdb4207-6d78-4641-809a-83fc15cd750e-kube-api-access-kj2pp\") pod \"nova-cell1-novncproxy-0\" (UID: \"cfdb4207-6d78-4641-809a-83fc15cd750e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 14:28:06 crc kubenswrapper[4722]: I0309 14:28:06.429436 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfdb4207-6d78-4641-809a-83fc15cd750e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"cfdb4207-6d78-4641-809a-83fc15cd750e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 14:28:06 crc kubenswrapper[4722]: I0309 14:28:06.429537 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/181d67a9-9459-4457-a22f-ba515776d23c-dns-swift-storage-0\") pod \"dnsmasq-dns-9b86998b5-s8lxr\" (UID: \"181d67a9-9459-4457-a22f-ba515776d23c\") " pod="openstack/dnsmasq-dns-9b86998b5-s8lxr" Mar 09 14:28:06 crc kubenswrapper[4722]: I0309 14:28:06.429555 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/181d67a9-9459-4457-a22f-ba515776d23c-ovsdbserver-nb\") pod \"dnsmasq-dns-9b86998b5-s8lxr\" (UID: \"181d67a9-9459-4457-a22f-ba515776d23c\") " pod="openstack/dnsmasq-dns-9b86998b5-s8lxr" Mar 09 14:28:06 crc kubenswrapper[4722]: I0309 14:28:06.429577 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfdb4207-6d78-4641-809a-83fc15cd750e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"cfdb4207-6d78-4641-809a-83fc15cd750e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 14:28:06 crc kubenswrapper[4722]: I0309 14:28:06.429597 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/181d67a9-9459-4457-a22f-ba515776d23c-config\") pod \"dnsmasq-dns-9b86998b5-s8lxr\" (UID: \"181d67a9-9459-4457-a22f-ba515776d23c\") " pod="openstack/dnsmasq-dns-9b86998b5-s8lxr" Mar 09 14:28:06 crc kubenswrapper[4722]: I0309 14:28:06.429639 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/181d67a9-9459-4457-a22f-ba515776d23c-dns-svc\") pod \"dnsmasq-dns-9b86998b5-s8lxr\" (UID: \"181d67a9-9459-4457-a22f-ba515776d23c\") " pod="openstack/dnsmasq-dns-9b86998b5-s8lxr" Mar 09 14:28:06 crc kubenswrapper[4722]: I0309 14:28:06.429714 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/181d67a9-9459-4457-a22f-ba515776d23c-ovsdbserver-sb\") pod \"dnsmasq-dns-9b86998b5-s8lxr\" (UID: \"181d67a9-9459-4457-a22f-ba515776d23c\") " pod="openstack/dnsmasq-dns-9b86998b5-s8lxr" Mar 09 14:28:06 crc kubenswrapper[4722]: I0309 14:28:06.429748 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xslrb\" (UniqueName: \"kubernetes.io/projected/181d67a9-9459-4457-a22f-ba515776d23c-kube-api-access-xslrb\") pod \"dnsmasq-dns-9b86998b5-s8lxr\" (UID: \"181d67a9-9459-4457-a22f-ba515776d23c\") " pod="openstack/dnsmasq-dns-9b86998b5-s8lxr" Mar 09 14:28:06 crc kubenswrapper[4722]: I0309 14:28:06.430396 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/181d67a9-9459-4457-a22f-ba515776d23c-dns-swift-storage-0\") pod \"dnsmasq-dns-9b86998b5-s8lxr\" (UID: \"181d67a9-9459-4457-a22f-ba515776d23c\") " pod="openstack/dnsmasq-dns-9b86998b5-s8lxr" Mar 09 14:28:06 crc kubenswrapper[4722]: I0309 14:28:06.432452 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/181d67a9-9459-4457-a22f-ba515776d23c-config\") pod \"dnsmasq-dns-9b86998b5-s8lxr\" (UID: \"181d67a9-9459-4457-a22f-ba515776d23c\") " pod="openstack/dnsmasq-dns-9b86998b5-s8lxr" Mar 09 14:28:06 crc kubenswrapper[4722]: I0309 14:28:06.433425 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/181d67a9-9459-4457-a22f-ba515776d23c-ovsdbserver-nb\") pod \"dnsmasq-dns-9b86998b5-s8lxr\" (UID: \"181d67a9-9459-4457-a22f-ba515776d23c\") " pod="openstack/dnsmasq-dns-9b86998b5-s8lxr" Mar 09 14:28:06 crc kubenswrapper[4722]: I0309 14:28:06.433953 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/181d67a9-9459-4457-a22f-ba515776d23c-dns-svc\") pod \"dnsmasq-dns-9b86998b5-s8lxr\" (UID: \"181d67a9-9459-4457-a22f-ba515776d23c\") " pod="openstack/dnsmasq-dns-9b86998b5-s8lxr" Mar 09 14:28:06 crc kubenswrapper[4722]: I0309 14:28:06.434475 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/181d67a9-9459-4457-a22f-ba515776d23c-ovsdbserver-sb\") pod \"dnsmasq-dns-9b86998b5-s8lxr\" (UID: \"181d67a9-9459-4457-a22f-ba515776d23c\") " pod="openstack/dnsmasq-dns-9b86998b5-s8lxr" Mar 09 14:28:06 crc kubenswrapper[4722]: I0309 14:28:06.435312 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfdb4207-6d78-4641-809a-83fc15cd750e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"cfdb4207-6d78-4641-809a-83fc15cd750e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 14:28:06 crc kubenswrapper[4722]: I0309 14:28:06.442754 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfdb4207-6d78-4641-809a-83fc15cd750e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"cfdb4207-6d78-4641-809a-83fc15cd750e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 14:28:06 crc kubenswrapper[4722]: I0309 14:28:06.446475 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-6e64-account-create-update-sddxq" Mar 09 14:28:06 crc kubenswrapper[4722]: I0309 14:28:06.452077 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kj2pp\" (UniqueName: \"kubernetes.io/projected/cfdb4207-6d78-4641-809a-83fc15cd750e-kube-api-access-kj2pp\") pod \"nova-cell1-novncproxy-0\" (UID: \"cfdb4207-6d78-4641-809a-83fc15cd750e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 14:28:06 crc kubenswrapper[4722]: I0309 14:28:06.453160 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xslrb\" (UniqueName: \"kubernetes.io/projected/181d67a9-9459-4457-a22f-ba515776d23c-kube-api-access-xslrb\") pod \"dnsmasq-dns-9b86998b5-s8lxr\" (UID: \"181d67a9-9459-4457-a22f-ba515776d23c\") " pod="openstack/dnsmasq-dns-9b86998b5-s8lxr" Mar 09 14:28:06 crc kubenswrapper[4722]: I0309 14:28:06.464045 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 09 14:28:06 crc kubenswrapper[4722]: I0309 14:28:06.520245 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 09 14:28:06 crc kubenswrapper[4722]: I0309 14:28:06.531313 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x72ld\" (UniqueName: \"kubernetes.io/projected/cb4ac952-30e4-4b7d-866f-5bd6c9825bb2-kube-api-access-x72ld\") pod \"cb4ac952-30e4-4b7d-866f-5bd6c9825bb2\" (UID: \"cb4ac952-30e4-4b7d-866f-5bd6c9825bb2\") " Mar 09 14:28:06 crc kubenswrapper[4722]: I0309 14:28:06.531681 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cb4ac952-30e4-4b7d-866f-5bd6c9825bb2-operator-scripts\") pod \"cb4ac952-30e4-4b7d-866f-5bd6c9825bb2\" (UID: \"cb4ac952-30e4-4b7d-866f-5bd6c9825bb2\") " Mar 09 14:28:06 crc kubenswrapper[4722]: I0309 14:28:06.537980 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb4ac952-30e4-4b7d-866f-5bd6c9825bb2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cb4ac952-30e4-4b7d-866f-5bd6c9825bb2" (UID: "cb4ac952-30e4-4b7d-866f-5bd6c9825bb2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:28:06 crc kubenswrapper[4722]: I0309 14:28:06.541849 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb4ac952-30e4-4b7d-866f-5bd6c9825bb2-kube-api-access-x72ld" (OuterVolumeSpecName: "kube-api-access-x72ld") pod "cb4ac952-30e4-4b7d-866f-5bd6c9825bb2" (UID: "cb4ac952-30e4-4b7d-866f-5bd6c9825bb2"). InnerVolumeSpecName "kube-api-access-x72ld". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:28:06 crc kubenswrapper[4722]: I0309 14:28:06.594118 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9b86998b5-s8lxr" Mar 09 14:28:06 crc kubenswrapper[4722]: I0309 14:28:06.599486 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-6e64-account-create-update-sddxq" event={"ID":"cb4ac952-30e4-4b7d-866f-5bd6c9825bb2","Type":"ContainerDied","Data":"45d652304944749910a264c486d08fbb70e5a7087c6e86ae802b6ab3cf9bb959"} Mar 09 14:28:06 crc kubenswrapper[4722]: I0309 14:28:06.599544 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45d652304944749910a264c486d08fbb70e5a7087c6e86ae802b6ab3cf9bb959" Mar 09 14:28:06 crc kubenswrapper[4722]: I0309 14:28:06.599597 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-6e64-account-create-update-sddxq" Mar 09 14:28:06 crc kubenswrapper[4722]: I0309 14:28:06.636726 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cb4ac952-30e4-4b7d-866f-5bd6c9825bb2-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 14:28:06 crc kubenswrapper[4722]: I0309 14:28:06.636753 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x72ld\" (UniqueName: \"kubernetes.io/projected/cb4ac952-30e4-4b7d-866f-5bd6c9825bb2-kube-api-access-x72ld\") on node \"crc\" DevicePath \"\"" Mar 09 14:28:06 crc kubenswrapper[4722]: I0309 14:28:06.918778 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-kf5wr" Mar 09 14:28:06 crc kubenswrapper[4722]: I0309 14:28:06.927800 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551108-xnm76" Mar 09 14:28:07 crc kubenswrapper[4722]: I0309 14:28:07.053127 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z75zx\" (UniqueName: \"kubernetes.io/projected/f48afe9d-4a0b-4d74-b9c4-a67f1fd098f3-kube-api-access-z75zx\") pod \"f48afe9d-4a0b-4d74-b9c4-a67f1fd098f3\" (UID: \"f48afe9d-4a0b-4d74-b9c4-a67f1fd098f3\") " Mar 09 14:28:07 crc kubenswrapper[4722]: I0309 14:28:07.053387 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jz9p\" (UniqueName: \"kubernetes.io/projected/6a5f5ad3-9957-4077-b5a1-7a53d5c4caaf-kube-api-access-5jz9p\") pod \"6a5f5ad3-9957-4077-b5a1-7a53d5c4caaf\" (UID: \"6a5f5ad3-9957-4077-b5a1-7a53d5c4caaf\") " Mar 09 14:28:07 crc kubenswrapper[4722]: I0309 14:28:07.053458 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f48afe9d-4a0b-4d74-b9c4-a67f1fd098f3-operator-scripts\") pod \"f48afe9d-4a0b-4d74-b9c4-a67f1fd098f3\" (UID: \"f48afe9d-4a0b-4d74-b9c4-a67f1fd098f3\") " Mar 09 14:28:07 crc kubenswrapper[4722]: I0309 14:28:07.057394 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f48afe9d-4a0b-4d74-b9c4-a67f1fd098f3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f48afe9d-4a0b-4d74-b9c4-a67f1fd098f3" (UID: "f48afe9d-4a0b-4d74-b9c4-a67f1fd098f3"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:28:07 crc kubenswrapper[4722]: I0309 14:28:07.058767 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-cklld"] Mar 09 14:28:07 crc kubenswrapper[4722]: E0309 14:28:07.059325 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f48afe9d-4a0b-4d74-b9c4-a67f1fd098f3" containerName="mariadb-database-create" Mar 09 14:28:07 crc kubenswrapper[4722]: I0309 14:28:07.059343 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f48afe9d-4a0b-4d74-b9c4-a67f1fd098f3" containerName="mariadb-database-create" Mar 09 14:28:07 crc kubenswrapper[4722]: E0309 14:28:07.059362 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb4ac952-30e4-4b7d-866f-5bd6c9825bb2" containerName="mariadb-account-create-update" Mar 09 14:28:07 crc kubenswrapper[4722]: I0309 14:28:07.059368 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb4ac952-30e4-4b7d-866f-5bd6c9825bb2" containerName="mariadb-account-create-update" Mar 09 14:28:07 crc kubenswrapper[4722]: E0309 14:28:07.059389 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a5f5ad3-9957-4077-b5a1-7a53d5c4caaf" containerName="oc" Mar 09 14:28:07 crc kubenswrapper[4722]: I0309 14:28:07.059398 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a5f5ad3-9957-4077-b5a1-7a53d5c4caaf" containerName="oc" Mar 09 14:28:07 crc kubenswrapper[4722]: I0309 14:28:07.059598 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb4ac952-30e4-4b7d-866f-5bd6c9825bb2" containerName="mariadb-account-create-update" Mar 09 14:28:07 crc kubenswrapper[4722]: I0309 14:28:07.059618 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f48afe9d-4a0b-4d74-b9c4-a67f1fd098f3" containerName="mariadb-database-create" Mar 09 14:28:07 crc kubenswrapper[4722]: I0309 14:28:07.059632 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a5f5ad3-9957-4077-b5a1-7a53d5c4caaf" containerName="oc" Mar 09 14:28:07 crc kubenswrapper[4722]: I0309 14:28:07.060392 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-cklld" Mar 09 14:28:07 crc kubenswrapper[4722]: I0309 14:28:07.062653 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 09 14:28:07 crc kubenswrapper[4722]: I0309 14:28:07.063346 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a5f5ad3-9957-4077-b5a1-7a53d5c4caaf-kube-api-access-5jz9p" (OuterVolumeSpecName: "kube-api-access-5jz9p") pod "6a5f5ad3-9957-4077-b5a1-7a53d5c4caaf" (UID: "6a5f5ad3-9957-4077-b5a1-7a53d5c4caaf"). InnerVolumeSpecName "kube-api-access-5jz9p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:28:07 crc kubenswrapper[4722]: I0309 14:28:07.064635 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f48afe9d-4a0b-4d74-b9c4-a67f1fd098f3-kube-api-access-z75zx" (OuterVolumeSpecName: "kube-api-access-z75zx") pod "f48afe9d-4a0b-4d74-b9c4-a67f1fd098f3" (UID: "f48afe9d-4a0b-4d74-b9c4-a67f1fd098f3"). InnerVolumeSpecName "kube-api-access-z75zx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:28:07 crc kubenswrapper[4722]: I0309 14:28:07.064736 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 09 14:28:07 crc kubenswrapper[4722]: I0309 14:28:07.077446 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-cklld"] Mar 09 14:28:07 crc kubenswrapper[4722]: I0309 14:28:07.155998 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e25996e-4f6c-4135-b781-95eae749689e-scripts\") pod \"nova-cell1-conductor-db-sync-cklld\" (UID: \"8e25996e-4f6c-4135-b781-95eae749689e\") " pod="openstack/nova-cell1-conductor-db-sync-cklld" Mar 09 14:28:07 crc kubenswrapper[4722]: I0309 14:28:07.156060 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kx7rw\" (UniqueName: \"kubernetes.io/projected/8e25996e-4f6c-4135-b781-95eae749689e-kube-api-access-kx7rw\") pod \"nova-cell1-conductor-db-sync-cklld\" (UID: \"8e25996e-4f6c-4135-b781-95eae749689e\") " pod="openstack/nova-cell1-conductor-db-sync-cklld" Mar 09 14:28:07 crc kubenswrapper[4722]: I0309 14:28:07.156145 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e25996e-4f6c-4135-b781-95eae749689e-config-data\") pod \"nova-cell1-conductor-db-sync-cklld\" (UID: \"8e25996e-4f6c-4135-b781-95eae749689e\") " pod="openstack/nova-cell1-conductor-db-sync-cklld" Mar 09 14:28:07 crc kubenswrapper[4722]: I0309 14:28:07.156244 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e25996e-4f6c-4135-b781-95eae749689e-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-cklld\" (UID: \"8e25996e-4f6c-4135-b781-95eae749689e\") " pod="openstack/nova-cell1-conductor-db-sync-cklld" Mar 09 14:28:07 crc kubenswrapper[4722]: I0309 14:28:07.156329 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z75zx\" (UniqueName: \"kubernetes.io/projected/f48afe9d-4a0b-4d74-b9c4-a67f1fd098f3-kube-api-access-z75zx\") on node \"crc\" DevicePath \"\"" Mar 09 14:28:07 crc kubenswrapper[4722]: I0309 14:28:07.156342 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5jz9p\" (UniqueName: \"kubernetes.io/projected/6a5f5ad3-9957-4077-b5a1-7a53d5c4caaf-kube-api-access-5jz9p\") on node \"crc\" DevicePath \"\"" Mar 09 14:28:07 crc kubenswrapper[4722]: I0309 14:28:07.156350 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f48afe9d-4a0b-4d74-b9c4-a67f1fd098f3-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 14:28:07 crc kubenswrapper[4722]: I0309 14:28:07.257744 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e25996e-4f6c-4135-b781-95eae749689e-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-cklld\" (UID: \"8e25996e-4f6c-4135-b781-95eae749689e\") " pod="openstack/nova-cell1-conductor-db-sync-cklld" Mar 09 14:28:07 crc kubenswrapper[4722]: I0309 14:28:07.257859 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e25996e-4f6c-4135-b781-95eae749689e-scripts\") 
pod \"nova-cell1-conductor-db-sync-cklld\" (UID: \"8e25996e-4f6c-4135-b781-95eae749689e\") " pod="openstack/nova-cell1-conductor-db-sync-cklld" Mar 09 14:28:07 crc kubenswrapper[4722]: I0309 14:28:07.257901 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kx7rw\" (UniqueName: \"kubernetes.io/projected/8e25996e-4f6c-4135-b781-95eae749689e-kube-api-access-kx7rw\") pod \"nova-cell1-conductor-db-sync-cklld\" (UID: \"8e25996e-4f6c-4135-b781-95eae749689e\") " pod="openstack/nova-cell1-conductor-db-sync-cklld" Mar 09 14:28:07 crc kubenswrapper[4722]: I0309 14:28:07.257986 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e25996e-4f6c-4135-b781-95eae749689e-config-data\") pod \"nova-cell1-conductor-db-sync-cklld\" (UID: \"8e25996e-4f6c-4135-b781-95eae749689e\") " pod="openstack/nova-cell1-conductor-db-sync-cklld" Mar 09 14:28:07 crc kubenswrapper[4722]: I0309 14:28:07.264074 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e25996e-4f6c-4135-b781-95eae749689e-config-data\") pod \"nova-cell1-conductor-db-sync-cklld\" (UID: \"8e25996e-4f6c-4135-b781-95eae749689e\") " pod="openstack/nova-cell1-conductor-db-sync-cklld" Mar 09 14:28:07 crc kubenswrapper[4722]: I0309 14:28:07.267582 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e25996e-4f6c-4135-b781-95eae749689e-scripts\") pod \"nova-cell1-conductor-db-sync-cklld\" (UID: \"8e25996e-4f6c-4135-b781-95eae749689e\") " pod="openstack/nova-cell1-conductor-db-sync-cklld" Mar 09 14:28:07 crc kubenswrapper[4722]: I0309 14:28:07.277890 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kx7rw\" (UniqueName: \"kubernetes.io/projected/8e25996e-4f6c-4135-b781-95eae749689e-kube-api-access-kx7rw\") pod \"nova-cell1-conductor-db-sync-cklld\" (UID: \"8e25996e-4f6c-4135-b781-95eae749689e\") " pod="openstack/nova-cell1-conductor-db-sync-cklld" Mar 09 14:28:07 crc kubenswrapper[4722]: I0309 14:28:07.278886 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e25996e-4f6c-4135-b781-95eae749689e-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-cklld\" (UID: \"8e25996e-4f6c-4135-b781-95eae749689e\") " pod="openstack/nova-cell1-conductor-db-sync-cklld" Mar 09 14:28:07 crc kubenswrapper[4722]: I0309 14:28:07.311491 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-8qttf"] Mar 09 14:28:07 crc kubenswrapper[4722]: I0309 14:28:07.332369 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 09 14:28:07 crc kubenswrapper[4722]: I0309 14:28:07.360866 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 09 14:28:07 crc kubenswrapper[4722]: I0309 14:28:07.509484 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-cklld" Mar 09 14:28:07 crc kubenswrapper[4722]: I0309 14:28:07.616928 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"035524e8-b163-4660-8e2d-b90ebcbef082","Type":"ContainerStarted","Data":"98176ee23c37a0c155ddec66dae6ea64825e99b9f337ab7714bbea101eb5d619"} Mar 09 14:28:07 crc kubenswrapper[4722]: I0309 14:28:07.619282 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-8qttf" event={"ID":"aa375409-8285-4709-8abd-1916c59a6566","Type":"ContainerStarted","Data":"866dea47edd6b6df9c06f06c280717c12a33b978612ab76a0a4e30d589b49aba"} Mar 09 14:28:07 crc kubenswrapper[4722]: I0309 14:28:07.619350 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-8qttf" event={"ID":"aa375409-8285-4709-8abd-1916c59a6566","Type":"ContainerStarted","Data":"ae96543132aca98dba4f1c455bce164458c3cedb30e6001a7a085837574ffa87"} Mar 09 14:28:07 crc kubenswrapper[4722]: I0309 14:28:07.637987 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-kf5wr" Mar 09 14:28:07 crc kubenswrapper[4722]: I0309 14:28:07.638522 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-kf5wr" event={"ID":"f48afe9d-4a0b-4d74-b9c4-a67f1fd098f3","Type":"ContainerDied","Data":"543727dec0f2fe15ac4b814fa208a782fdbcd4b775b23721d7c2e90354e63daf"} Mar 09 14:28:07 crc kubenswrapper[4722]: I0309 14:28:07.638559 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="543727dec0f2fe15ac4b814fa208a782fdbcd4b775b23721d7c2e90354e63daf" Mar 09 14:28:07 crc kubenswrapper[4722]: I0309 14:28:07.654145 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 14:28:07 crc kubenswrapper[4722]: I0309 14:28:07.660710 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551108-xnm76" Mar 09 14:28:07 crc kubenswrapper[4722]: I0309 14:28:07.660713 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551108-xnm76" event={"ID":"6a5f5ad3-9957-4077-b5a1-7a53d5c4caaf","Type":"ContainerDied","Data":"a493da62570c0f0e2c23d299ea6128c2cac67a61892543dafe7b98e304247713"} Mar 09 14:28:07 crc kubenswrapper[4722]: I0309 14:28:07.662922 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a493da62570c0f0e2c23d299ea6128c2cac67a61892543dafe7b98e304247713" Mar 09 14:28:07 crc kubenswrapper[4722]: I0309 14:28:07.664732 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"70dbd0c7-a6dc-4f0c-aff3-1de023a615f4","Type":"ContainerStarted","Data":"0dc9834d0c1e16fa5fe0bde5d4f1b3ef1fe9b6e534c09558f91edacc2341613c"} Mar 09 14:28:07 crc kubenswrapper[4722]: I0309 14:28:07.677708 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-8qttf" podStartSLOduration=2.677686951 podStartE2EDuration="2.677686951s" podCreationTimestamp="2026-03-09 14:28:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:28:07.632254485 +0000 UTC m=+1528.187823061" watchObservedRunningTime="2026-03-09 14:28:07.677686951 +0000 UTC m=+1528.233255527" Mar 09 14:28:07 crc kubenswrapper[4722]: I0309 14:28:07.696043 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 09 14:28:07 crc kubenswrapper[4722]: I0309 14:28:07.717113 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-s8lxr"] Mar 09 14:28:08 crc kubenswrapper[4722]: I0309 14:28:08.032419 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551102-xr98b"] Mar 09 14:28:08 crc kubenswrapper[4722]: I0309 14:28:08.044848 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551102-xr98b"] Mar 09 14:28:08 crc kubenswrapper[4722]: W0309 14:28:08.079260 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e25996e_4f6c_4135_b781_95eae749689e.slice/crio-c3b462a6936d1680bfe184966493fe8ff2672dd835244f8e8202be25c9ffcb6b WatchSource:0}: Error finding container c3b462a6936d1680bfe184966493fe8ff2672dd835244f8e8202be25c9ffcb6b: Status 404 returned error can't find the container with id c3b462a6936d1680bfe184966493fe8ff2672dd835244f8e8202be25c9ffcb6b Mar 09 14:28:08 crc kubenswrapper[4722]: I0309 14:28:08.084739 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-cklld"] Mar 09 14:28:08 crc kubenswrapper[4722]: I0309 14:28:08.170399 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94a144d8-4427-4dd2-99df-825094bc5a4b" path="/var/lib/kubelet/pods/94a144d8-4427-4dd2-99df-825094bc5a4b/volumes" Mar 09 14:28:08 crc kubenswrapper[4722]: I0309 14:28:08.349906 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-vl8jp"] Mar 09 14:28:08 crc kubenswrapper[4722]: I0309 14:28:08.351512 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-vl8jp" Mar 09 14:28:08 crc kubenswrapper[4722]: I0309 14:28:08.356706 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Mar 09 14:28:08 crc kubenswrapper[4722]: I0309 14:28:08.356762 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-d98zq" Mar 09 14:28:08 crc kubenswrapper[4722]: I0309 14:28:08.356982 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 09 14:28:08 crc kubenswrapper[4722]: I0309 14:28:08.366810 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-vl8jp"] Mar 09 14:28:08 crc kubenswrapper[4722]: I0309 14:28:08.369291 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Mar 09 14:28:08 crc kubenswrapper[4722]: I0309 14:28:08.402769 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c58faa7-5e6b-4140-9859-a4729c6354d9-combined-ca-bundle\") pod \"aodh-db-sync-vl8jp\" (UID: \"1c58faa7-5e6b-4140-9859-a4729c6354d9\") " pod="openstack/aodh-db-sync-vl8jp" Mar 09 14:28:08 crc kubenswrapper[4722]: I0309 14:28:08.402893 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c58faa7-5e6b-4140-9859-a4729c6354d9-scripts\") pod \"aodh-db-sync-vl8jp\" (UID: \"1c58faa7-5e6b-4140-9859-a4729c6354d9\") " pod="openstack/aodh-db-sync-vl8jp" Mar 09 14:28:08 crc kubenswrapper[4722]: I0309 14:28:08.403098 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d277j\" (UniqueName: \"kubernetes.io/projected/1c58faa7-5e6b-4140-9859-a4729c6354d9-kube-api-access-d277j\") pod \"aodh-db-sync-vl8jp\" (UID: \"1c58faa7-5e6b-4140-9859-a4729c6354d9\") " pod="openstack/aodh-db-sync-vl8jp" Mar 09 14:28:08 crc kubenswrapper[4722]: I0309 14:28:08.403388 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c58faa7-5e6b-4140-9859-a4729c6354d9-config-data\") pod \"aodh-db-sync-vl8jp\" (UID: \"1c58faa7-5e6b-4140-9859-a4729c6354d9\") " pod="openstack/aodh-db-sync-vl8jp" Mar 09 14:28:08 crc kubenswrapper[4722]: I0309 14:28:08.506390 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c58faa7-5e6b-4140-9859-a4729c6354d9-combined-ca-bundle\") pod \"aodh-db-sync-vl8jp\" (UID: \"1c58faa7-5e6b-4140-9859-a4729c6354d9\") " pod="openstack/aodh-db-sync-vl8jp" Mar 09 14:28:08 crc kubenswrapper[4722]: I0309 14:28:08.506451 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c58faa7-5e6b-4140-9859-a4729c6354d9-scripts\") pod \"aodh-db-sync-vl8jp\" (UID: \"1c58faa7-5e6b-4140-9859-a4729c6354d9\") " pod="openstack/aodh-db-sync-vl8jp" Mar 09 14:28:08 crc kubenswrapper[4722]: I0309 14:28:08.506533 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d277j\" (UniqueName: \"kubernetes.io/projected/1c58faa7-5e6b-4140-9859-a4729c6354d9-kube-api-access-d277j\") pod \"aodh-db-sync-vl8jp\" (UID: \"1c58faa7-5e6b-4140-9859-a4729c6354d9\") " pod="openstack/aodh-db-sync-vl8jp" Mar 09 14:28:08 crc 
kubenswrapper[4722]: I0309 14:28:08.506637 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c58faa7-5e6b-4140-9859-a4729c6354d9-config-data\") pod \"aodh-db-sync-vl8jp\" (UID: \"1c58faa7-5e6b-4140-9859-a4729c6354d9\") " pod="openstack/aodh-db-sync-vl8jp" Mar 09 14:28:08 crc kubenswrapper[4722]: I0309 14:28:08.511122 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c58faa7-5e6b-4140-9859-a4729c6354d9-combined-ca-bundle\") pod \"aodh-db-sync-vl8jp\" (UID: \"1c58faa7-5e6b-4140-9859-a4729c6354d9\") " pod="openstack/aodh-db-sync-vl8jp" Mar 09 14:28:08 crc kubenswrapper[4722]: I0309 14:28:08.511554 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c58faa7-5e6b-4140-9859-a4729c6354d9-scripts\") pod \"aodh-db-sync-vl8jp\" (UID: \"1c58faa7-5e6b-4140-9859-a4729c6354d9\") " pod="openstack/aodh-db-sync-vl8jp" Mar 09 14:28:08 crc kubenswrapper[4722]: I0309 14:28:08.518044 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c58faa7-5e6b-4140-9859-a4729c6354d9-config-data\") pod \"aodh-db-sync-vl8jp\" (UID: \"1c58faa7-5e6b-4140-9859-a4729c6354d9\") " pod="openstack/aodh-db-sync-vl8jp" Mar 09 14:28:08 crc kubenswrapper[4722]: I0309 14:28:08.531681 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d277j\" (UniqueName: \"kubernetes.io/projected/1c58faa7-5e6b-4140-9859-a4729c6354d9-kube-api-access-d277j\") pod \"aodh-db-sync-vl8jp\" (UID: \"1c58faa7-5e6b-4140-9859-a4729c6354d9\") " pod="openstack/aodh-db-sync-vl8jp" Mar 09 14:28:08 crc kubenswrapper[4722]: I0309 14:28:08.681827 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-vl8jp" Mar 09 14:28:08 crc kubenswrapper[4722]: I0309 14:28:08.692811 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ce2e955c-c37e-4d29-9887-0daf8cb6ceea","Type":"ContainerStarted","Data":"18a2330040e053f0ee31d2a071720509237cd64ccdb83f7cbfdf96df05ae9563"} Mar 09 14:28:08 crc kubenswrapper[4722]: I0309 14:28:08.695734 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"cfdb4207-6d78-4641-809a-83fc15cd750e","Type":"ContainerStarted","Data":"039e016b6e9ff95306850e8701866536bb298ae048d98ffca3123162f845b539"} Mar 09 14:28:08 crc kubenswrapper[4722]: I0309 14:28:08.700039 4722 generic.go:334] "Generic (PLEG): container finished" podID="181d67a9-9459-4457-a22f-ba515776d23c" containerID="2aa35b1c19c07721a2680d9e41fc1cf9fc0342153c3d37d5aa637a1d689f054b" exitCode=0 Mar 09 14:28:08 crc kubenswrapper[4722]: I0309 14:28:08.700363 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-s8lxr" event={"ID":"181d67a9-9459-4457-a22f-ba515776d23c","Type":"ContainerDied","Data":"2aa35b1c19c07721a2680d9e41fc1cf9fc0342153c3d37d5aa637a1d689f054b"} Mar 09 14:28:08 crc kubenswrapper[4722]: I0309 14:28:08.700406 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-s8lxr" event={"ID":"181d67a9-9459-4457-a22f-ba515776d23c","Type":"ContainerStarted","Data":"444e390d24fa4e9b47fc85371b7faa0854056a86e3941d747490fcc3082da9b7"} Mar 09 14:28:08 crc kubenswrapper[4722]: I0309 14:28:08.703356 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-cklld" event={"ID":"8e25996e-4f6c-4135-b781-95eae749689e","Type":"ContainerStarted","Data":"d9ca2f07ecb9bf38ccd438169eed0c8afa14d0ac674a9f2f87af6aba969cea02"} Mar 09 14:28:08 crc kubenswrapper[4722]: I0309 14:28:08.703403 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-cklld" event={"ID":"8e25996e-4f6c-4135-b781-95eae749689e","Type":"ContainerStarted","Data":"c3b462a6936d1680bfe184966493fe8ff2672dd835244f8e8202be25c9ffcb6b"} Mar 09 14:28:08 crc kubenswrapper[4722]: I0309 14:28:08.748187 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-cklld" podStartSLOduration=1.7481673 podStartE2EDuration="1.7481673s" podCreationTimestamp="2026-03-09 14:28:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:28:08.741351642 +0000 UTC m=+1529.296920218" watchObservedRunningTime="2026-03-09 14:28:08.7481673 +0000 UTC m=+1529.303735876" Mar 09 14:28:09 crc kubenswrapper[4722]: I0309 14:28:09.324749 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 09 14:28:09 crc kubenswrapper[4722]: I0309 14:28:09.337574 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 14:28:12 crc kubenswrapper[4722]: I0309 14:28:12.446638 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-vl8jp"] Mar 09 14:28:12 crc kubenswrapper[4722]: I0309 14:28:12.764311 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-vl8jp" event={"ID":"1c58faa7-5e6b-4140-9859-a4729c6354d9","Type":"ContainerStarted","Data":"5a25995b091694fbd1ef50b1148bc415f507a2e827be482df2619798a694f294"} Mar 09 14:28:12 crc 
kubenswrapper[4722]: I0309 14:28:12.768429 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"035524e8-b163-4660-8e2d-b90ebcbef082","Type":"ContainerStarted","Data":"f8f1aa6c9a54cd18444527eb6152479842ac9eccf62f56097bdcfda05d0b3fe2"}
Mar 09 14:28:12 crc kubenswrapper[4722]: I0309 14:28:12.768479 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"035524e8-b163-4660-8e2d-b90ebcbef082","Type":"ContainerStarted","Data":"01f24812bf5b006c669261fd4a11b3af9c7cf9717f10922b664eee6f5e530995"}
Mar 09 14:28:12 crc kubenswrapper[4722]: I0309 14:28:12.771595 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"70dbd0c7-a6dc-4f0c-aff3-1de023a615f4","Type":"ContainerStarted","Data":"32b8ae9e0ba215d2db25700cc901e98b1cc9facc016ba3ee9ca38bb7d739f35f"}
Mar 09 14:28:12 crc kubenswrapper[4722]: I0309 14:28:12.775230 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-s8lxr" event={"ID":"181d67a9-9459-4457-a22f-ba515776d23c","Type":"ContainerStarted","Data":"5b641acbb31e09da34d87a916b159a499bec718235322dd7e3894c41b3d59c30"}
Mar 09 14:28:12 crc kubenswrapper[4722]: I0309 14:28:12.775419 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-9b86998b5-s8lxr"
Mar 09 14:28:12 crc kubenswrapper[4722]: I0309 14:28:12.780916 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ce2e955c-c37e-4d29-9887-0daf8cb6ceea","Type":"ContainerStarted","Data":"6e1646526fccbf44f84656b18ed84e13ad604b922833bd4566ac826242fa85c7"}
Mar 09 14:28:12 crc kubenswrapper[4722]: I0309 14:28:12.780969 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ce2e955c-c37e-4d29-9887-0daf8cb6ceea","Type":"ContainerStarted","Data":"29c23d7e883ee162938f3c346da5878bb9a55af0cf267b08f37f07a57ea1c3e3"}
Mar 09 14:28:12 crc kubenswrapper[4722]: I0309 14:28:12.781009 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="ce2e955c-c37e-4d29-9887-0daf8cb6ceea" containerName="nova-metadata-log" containerID="cri-o://29c23d7e883ee162938f3c346da5878bb9a55af0cf267b08f37f07a57ea1c3e3" gracePeriod=30
Mar 09 14:28:12 crc kubenswrapper[4722]: I0309 14:28:12.781042 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="ce2e955c-c37e-4d29-9887-0daf8cb6ceea" containerName="nova-metadata-metadata" containerID="cri-o://6e1646526fccbf44f84656b18ed84e13ad604b922833bd4566ac826242fa85c7" gracePeriod=30
Mar 09 14:28:12 crc kubenswrapper[4722]: I0309 14:28:12.791417 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.2546478 podStartE2EDuration="7.791394053s" podCreationTimestamp="2026-03-09 14:28:05 +0000 UTC" firstStartedPulling="2026-03-09 14:28:07.35476433 +0000 UTC m=+1527.910332906" lastFinishedPulling="2026-03-09 14:28:11.891510593 +0000 UTC m=+1532.447079159" observedRunningTime="2026-03-09 14:28:12.789004487 +0000 UTC m=+1533.344573083" watchObservedRunningTime="2026-03-09 14:28:12.791394053 +0000 UTC m=+1533.346962629"
Mar 09 14:28:12 crc kubenswrapper[4722]: I0309 14:28:12.794475 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"cfdb4207-6d78-4641-809a-83fc15cd750e","Type":"ContainerStarted","Data":"419a0ceea0f63127048ea992454ea646d8b399148a977e09dfa153c6e7a8c132"}
Mar 09 14:28:12 crc kubenswrapper[4722]: I0309 14:28:12.794636 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="cfdb4207-6d78-4641-809a-83fc15cd750e" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://419a0ceea0f63127048ea992454ea646d8b399148a977e09dfa153c6e7a8c132" gracePeriod=30
Mar 09 14:28:12 crc kubenswrapper[4722]: I0309 14:28:12.821650 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.249247631 podStartE2EDuration="7.821628949s" podCreationTimestamp="2026-03-09 14:28:05 +0000 UTC" firstStartedPulling="2026-03-09 14:28:07.302502485 +0000 UTC m=+1527.858071061" lastFinishedPulling="2026-03-09 14:28:11.874883803 +0000 UTC m=+1532.430452379" observedRunningTime="2026-03-09 14:28:12.810566913 +0000 UTC m=+1533.366135509" watchObservedRunningTime="2026-03-09 14:28:12.821628949 +0000 UTC m=+1533.377197525"
Mar 09 14:28:12 crc kubenswrapper[4722]: I0309 14:28:12.851382 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.664060935 podStartE2EDuration="7.851363592s" podCreationTimestamp="2026-03-09 14:28:05 +0000 UTC" firstStartedPulling="2026-03-09 14:28:07.703235258 +0000 UTC m=+1528.258803834" lastFinishedPulling="2026-03-09 14:28:11.890537895 +0000 UTC m=+1532.446106491" observedRunningTime="2026-03-09 14:28:12.828600302 +0000 UTC m=+1533.384168878" watchObservedRunningTime="2026-03-09 14:28:12.851363592 +0000 UTC m=+1533.406932168"
Mar 09 14:28:12 crc kubenswrapper[4722]: I0309 14:28:12.868641 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-9b86998b5-s8lxr" podStartSLOduration=6.868620519 podStartE2EDuration="6.868620519s" podCreationTimestamp="2026-03-09 14:28:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:28:12.865306048 +0000 UTC m=+1533.420874624" watchObservedRunningTime="2026-03-09 14:28:12.868620519 +0000 UTC m=+1533.424189095"
Mar 09 14:28:12 crc kubenswrapper[4722]: I0309 14:28:12.892256 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.726165103 podStartE2EDuration="7.892213892s" podCreationTimestamp="2026-03-09 14:28:05 +0000 UTC" firstStartedPulling="2026-03-09 14:28:07.708791182 +0000 UTC m=+1528.264359758" lastFinishedPulling="2026-03-09 14:28:11.874839971 +0000 UTC m=+1532.430408547" observedRunningTime="2026-03-09 14:28:12.882708839 +0000 UTC m=+1533.438277425" watchObservedRunningTime="2026-03-09 14:28:12.892213892 +0000 UTC m=+1533.447782468"
Mar 09 14:28:13 crc kubenswrapper[4722]: I0309 14:28:13.814818 4722 generic.go:334] "Generic (PLEG): container finished" podID="f7cd25c5-3f3b-4b28-b96f-0aa6114b498c" containerID="91cc55bbe0c15e0bda9e4b09eb94f00df004b0b14ebbb120410702aeca6523b4" exitCode=137
Mar 09 14:28:13 crc kubenswrapper[4722]: I0309 14:28:13.814864 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f7cd25c5-3f3b-4b28-b96f-0aa6114b498c","Type":"ContainerDied","Data":"91cc55bbe0c15e0bda9e4b09eb94f00df004b0b14ebbb120410702aeca6523b4"}
Mar 09 14:28:13 crc kubenswrapper[4722]: I0309 14:28:13.815195 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f7cd25c5-3f3b-4b28-b96f-0aa6114b498c","Type":"ContainerDied","Data":"eae54c922f9529589ee43329029d006bb384ce22859b3bc9778943f9e106d4a6"}
Mar 09 14:28:13 crc kubenswrapper[4722]: I0309 14:28:13.815251 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eae54c922f9529589ee43329029d006bb384ce22859b3bc9778943f9e106d4a6"
Mar 09 14:28:13 crc kubenswrapper[4722]: I0309 14:28:13.817239 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 09 14:28:13 crc kubenswrapper[4722]: I0309 14:28:13.819571 4722 generic.go:334] "Generic (PLEG): container finished" podID="ce2e955c-c37e-4d29-9887-0daf8cb6ceea" containerID="29c23d7e883ee162938f3c346da5878bb9a55af0cf267b08f37f07a57ea1c3e3" exitCode=143
Mar 09 14:28:13 crc kubenswrapper[4722]: I0309 14:28:13.820373 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ce2e955c-c37e-4d29-9887-0daf8cb6ceea","Type":"ContainerDied","Data":"29c23d7e883ee162938f3c346da5878bb9a55af0cf267b08f37f07a57ea1c3e3"}
Mar 09 14:28:13 crc kubenswrapper[4722]: I0309 14:28:13.856878 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7cd25c5-3f3b-4b28-b96f-0aa6114b498c-config-data\") pod \"f7cd25c5-3f3b-4b28-b96f-0aa6114b498c\" (UID: \"f7cd25c5-3f3b-4b28-b96f-0aa6114b498c\") "
Mar 09 14:28:13 crc kubenswrapper[4722]: I0309 14:28:13.963483 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f7cd25c5-3f3b-4b28-b96f-0aa6114b498c-log-httpd\") pod \"f7cd25c5-3f3b-4b28-b96f-0aa6114b498c\" (UID: \"f7cd25c5-3f3b-4b28-b96f-0aa6114b498c\") "
Mar 09 14:28:13 crc kubenswrapper[4722]: I0309 14:28:13.963865 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7cd25c5-3f3b-4b28-b96f-0aa6114b498c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f7cd25c5-3f3b-4b28-b96f-0aa6114b498c" (UID: "f7cd25c5-3f3b-4b28-b96f-0aa6114b498c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 14:28:13 crc kubenswrapper[4722]: I0309 14:28:13.963926 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7cd25c5-3f3b-4b28-b96f-0aa6114b498c-combined-ca-bundle\") pod \"f7cd25c5-3f3b-4b28-b96f-0aa6114b498c\" (UID: \"f7cd25c5-3f3b-4b28-b96f-0aa6114b498c\") "
Mar 09 14:28:13 crc kubenswrapper[4722]: I0309 14:28:13.963950 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzpms\" (UniqueName: \"kubernetes.io/projected/f7cd25c5-3f3b-4b28-b96f-0aa6114b498c-kube-api-access-kzpms\") pod \"f7cd25c5-3f3b-4b28-b96f-0aa6114b498c\" (UID: \"f7cd25c5-3f3b-4b28-b96f-0aa6114b498c\") "
Mar 09 14:28:13 crc kubenswrapper[4722]: I0309 14:28:13.964285 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f7cd25c5-3f3b-4b28-b96f-0aa6114b498c-sg-core-conf-yaml\") pod \"f7cd25c5-3f3b-4b28-b96f-0aa6114b498c\" (UID: \"f7cd25c5-3f3b-4b28-b96f-0aa6114b498c\") "
Mar 09 14:28:13 crc kubenswrapper[4722]: I0309 14:28:13.964389 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7cd25c5-3f3b-4b28-b96f-0aa6114b498c-scripts\") pod \"f7cd25c5-3f3b-4b28-b96f-0aa6114b498c\" (UID: \"f7cd25c5-3f3b-4b28-b96f-0aa6114b498c\") "
Mar 09 14:28:13 crc kubenswrapper[4722]: I0309 14:28:13.964414 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f7cd25c5-3f3b-4b28-b96f-0aa6114b498c-run-httpd\") pod \"f7cd25c5-3f3b-4b28-b96f-0aa6114b498c\" (UID: \"f7cd25c5-3f3b-4b28-b96f-0aa6114b498c\") "
Mar 09 14:28:13 crc kubenswrapper[4722]: I0309 14:28:13.965348 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7cd25c5-3f3b-4b28-b96f-0aa6114b498c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f7cd25c5-3f3b-4b28-b96f-0aa6114b498c" (UID: "f7cd25c5-3f3b-4b28-b96f-0aa6114b498c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 14:28:13 crc kubenswrapper[4722]: I0309 14:28:13.966086 4722 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f7cd25c5-3f3b-4b28-b96f-0aa6114b498c-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 09 14:28:13 crc kubenswrapper[4722]: I0309 14:28:13.966382 4722 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f7cd25c5-3f3b-4b28-b96f-0aa6114b498c-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 09 14:28:13 crc kubenswrapper[4722]: I0309 14:28:13.967473 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7cd25c5-3f3b-4b28-b96f-0aa6114b498c-kube-api-access-kzpms" (OuterVolumeSpecName: "kube-api-access-kzpms") pod "f7cd25c5-3f3b-4b28-b96f-0aa6114b498c" (UID: "f7cd25c5-3f3b-4b28-b96f-0aa6114b498c"). InnerVolumeSpecName "kube-api-access-kzpms". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:28:13 crc kubenswrapper[4722]: I0309 14:28:13.968627 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7cd25c5-3f3b-4b28-b96f-0aa6114b498c-scripts" (OuterVolumeSpecName: "scripts") pod "f7cd25c5-3f3b-4b28-b96f-0aa6114b498c" (UID: "f7cd25c5-3f3b-4b28-b96f-0aa6114b498c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:28:14 crc kubenswrapper[4722]: I0309 14:28:14.004604 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7cd25c5-3f3b-4b28-b96f-0aa6114b498c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f7cd25c5-3f3b-4b28-b96f-0aa6114b498c" (UID: "f7cd25c5-3f3b-4b28-b96f-0aa6114b498c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:28:14 crc kubenswrapper[4722]: I0309 14:28:14.016399 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7cd25c5-3f3b-4b28-b96f-0aa6114b498c-config-data" (OuterVolumeSpecName: "config-data") pod "f7cd25c5-3f3b-4b28-b96f-0aa6114b498c" (UID: "f7cd25c5-3f3b-4b28-b96f-0aa6114b498c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:28:14 crc kubenswrapper[4722]: I0309 14:28:14.054788 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7cd25c5-3f3b-4b28-b96f-0aa6114b498c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f7cd25c5-3f3b-4b28-b96f-0aa6114b498c" (UID: "f7cd25c5-3f3b-4b28-b96f-0aa6114b498c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:28:14 crc kubenswrapper[4722]: I0309 14:28:14.068499 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kzpms\" (UniqueName: \"kubernetes.io/projected/f7cd25c5-3f3b-4b28-b96f-0aa6114b498c-kube-api-access-kzpms\") on node \"crc\" DevicePath \"\""
Mar 09 14:28:14 crc kubenswrapper[4722]: I0309 14:28:14.068543 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7cd25c5-3f3b-4b28-b96f-0aa6114b498c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 14:28:14 crc kubenswrapper[4722]: I0309 14:28:14.068557 4722 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f7cd25c5-3f3b-4b28-b96f-0aa6114b498c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 09 14:28:14 crc kubenswrapper[4722]: I0309 14:28:14.068569 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7cd25c5-3f3b-4b28-b96f-0aa6114b498c-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 14:28:14 crc kubenswrapper[4722]: I0309 14:28:14.068582 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7cd25c5-3f3b-4b28-b96f-0aa6114b498c-config-data\") on node \"crc\" DevicePath \"\""
Mar 09 14:28:14 crc kubenswrapper[4722]: I0309 14:28:14.832457 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 09 14:28:14 crc kubenswrapper[4722]: I0309 14:28:14.864388 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 09 14:28:14 crc kubenswrapper[4722]: I0309 14:28:14.880558 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 09 14:28:14 crc kubenswrapper[4722]: I0309 14:28:14.903084 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 09 14:28:14 crc kubenswrapper[4722]: E0309 14:28:14.904016 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7cd25c5-3f3b-4b28-b96f-0aa6114b498c" containerName="ceilometer-notification-agent"
Mar 09 14:28:14 crc kubenswrapper[4722]: I0309 14:28:14.904091 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7cd25c5-3f3b-4b28-b96f-0aa6114b498c" containerName="ceilometer-notification-agent"
Mar 09 14:28:14 crc kubenswrapper[4722]: E0309 14:28:14.904188 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7cd25c5-3f3b-4b28-b96f-0aa6114b498c" containerName="ceilometer-central-agent"
Mar 09 14:28:14 crc kubenswrapper[4722]: I0309 14:28:14.904266 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7cd25c5-3f3b-4b28-b96f-0aa6114b498c" containerName="ceilometer-central-agent"
Mar 09 14:28:14 crc kubenswrapper[4722]: E0309 14:28:14.904328 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7cd25c5-3f3b-4b28-b96f-0aa6114b498c" containerName="sg-core"
Mar 09 14:28:14 crc kubenswrapper[4722]: I0309 14:28:14.904378 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7cd25c5-3f3b-4b28-b96f-0aa6114b498c" containerName="sg-core"
Mar 09 14:28:14 crc kubenswrapper[4722]: E0309 14:28:14.904440 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7cd25c5-3f3b-4b28-b96f-0aa6114b498c" containerName="proxy-httpd"
Mar 09 14:28:14 crc kubenswrapper[4722]: I0309 14:28:14.904490 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7cd25c5-3f3b-4b28-b96f-0aa6114b498c" containerName="proxy-httpd"
Mar 09 14:28:14 crc kubenswrapper[4722]: I0309 14:28:14.904820 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7cd25c5-3f3b-4b28-b96f-0aa6114b498c" containerName="proxy-httpd"
Mar 09 14:28:14 crc kubenswrapper[4722]: I0309 14:28:14.905073 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7cd25c5-3f3b-4b28-b96f-0aa6114b498c" containerName="ceilometer-central-agent"
Mar 09 14:28:14 crc kubenswrapper[4722]: I0309 14:28:14.905137 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7cd25c5-3f3b-4b28-b96f-0aa6114b498c" containerName="ceilometer-notification-agent"
Mar 09 14:28:14 crc kubenswrapper[4722]: I0309 14:28:14.905254 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7cd25c5-3f3b-4b28-b96f-0aa6114b498c" containerName="sg-core"
Mar 09 14:28:14 crc kubenswrapper[4722]: I0309 14:28:14.910064 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 09 14:28:14 crc kubenswrapper[4722]: I0309 14:28:14.912649 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 09 14:28:14 crc kubenswrapper[4722]: I0309 14:28:14.913039 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 09 14:28:14 crc kubenswrapper[4722]: I0309 14:28:14.922738 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 09 14:28:15 crc kubenswrapper[4722]: I0309 14:28:15.004903 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/483d53e3-10df-4bea-97df-8d72228035b9-config-data\") pod \"ceilometer-0\" (UID: \"483d53e3-10df-4bea-97df-8d72228035b9\") " pod="openstack/ceilometer-0"
Mar 09 14:28:15 crc kubenswrapper[4722]: I0309 14:28:15.004940 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/483d53e3-10df-4bea-97df-8d72228035b9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"483d53e3-10df-4bea-97df-8d72228035b9\") " pod="openstack/ceilometer-0"
Mar 09 14:28:15 crc kubenswrapper[4722]: I0309 14:28:15.004994 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/483d53e3-10df-4bea-97df-8d72228035b9-run-httpd\") pod \"ceilometer-0\" (UID: \"483d53e3-10df-4bea-97df-8d72228035b9\") " pod="openstack/ceilometer-0"
Mar 09 14:28:15 crc kubenswrapper[4722]: I0309 14:28:15.005218 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/483d53e3-10df-4bea-97df-8d72228035b9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"483d53e3-10df-4bea-97df-8d72228035b9\") " pod="openstack/ceilometer-0"
Mar 09 14:28:15 crc kubenswrapper[4722]: I0309 14:28:15.005241 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/483d53e3-10df-4bea-97df-8d72228035b9-scripts\") pod \"ceilometer-0\" (UID: \"483d53e3-10df-4bea-97df-8d72228035b9\") " pod="openstack/ceilometer-0"
Mar 09 14:28:15 crc kubenswrapper[4722]: I0309 14:28:15.005434 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/483d53e3-10df-4bea-97df-8d72228035b9-log-httpd\") pod \"ceilometer-0\" (UID: \"483d53e3-10df-4bea-97df-8d72228035b9\") " pod="openstack/ceilometer-0"
Mar 09 14:28:15 crc kubenswrapper[4722]: I0309 14:28:15.005532 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjqbx\" (UniqueName: \"kubernetes.io/projected/483d53e3-10df-4bea-97df-8d72228035b9-kube-api-access-mjqbx\") pod \"ceilometer-0\" (UID: \"483d53e3-10df-4bea-97df-8d72228035b9\") " pod="openstack/ceilometer-0"
Mar 09 14:28:15 crc kubenswrapper[4722]: I0309 14:28:15.107857 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/483d53e3-10df-4bea-97df-8d72228035b9-log-httpd\") pod \"ceilometer-0\" (UID: \"483d53e3-10df-4bea-97df-8d72228035b9\") " pod="openstack/ceilometer-0"
Mar 09 14:28:15 crc kubenswrapper[4722]: I0309 14:28:15.108154 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjqbx\" (UniqueName: \"kubernetes.io/projected/483d53e3-10df-4bea-97df-8d72228035b9-kube-api-access-mjqbx\") pod \"ceilometer-0\" (UID: \"483d53e3-10df-4bea-97df-8d72228035b9\") " pod="openstack/ceilometer-0"
Mar 09 14:28:15 crc kubenswrapper[4722]: I0309 14:28:15.108260 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/483d53e3-10df-4bea-97df-8d72228035b9-config-data\") pod \"ceilometer-0\" (UID: \"483d53e3-10df-4bea-97df-8d72228035b9\") " pod="openstack/ceilometer-0"
Mar 09 14:28:15 crc kubenswrapper[4722]: I0309 14:28:15.108278 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/483d53e3-10df-4bea-97df-8d72228035b9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"483d53e3-10df-4bea-97df-8d72228035b9\") " pod="openstack/ceilometer-0"
Mar 09 14:28:15 crc kubenswrapper[4722]: I0309 14:28:15.108318 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/483d53e3-10df-4bea-97df-8d72228035b9-run-httpd\") pod \"ceilometer-0\" (UID: \"483d53e3-10df-4bea-97df-8d72228035b9\") " pod="openstack/ceilometer-0"
Mar 09 14:28:15 crc kubenswrapper[4722]: I0309 14:28:15.108382 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/483d53e3-10df-4bea-97df-8d72228035b9-log-httpd\") pod \"ceilometer-0\" (UID: \"483d53e3-10df-4bea-97df-8d72228035b9\") " pod="openstack/ceilometer-0"
Mar 09 14:28:15 crc kubenswrapper[4722]: I0309 14:28:15.108641 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/483d53e3-10df-4bea-97df-8d72228035b9-run-httpd\") pod \"ceilometer-0\" (UID: \"483d53e3-10df-4bea-97df-8d72228035b9\") " pod="openstack/ceilometer-0"
Mar 09 14:28:15 crc kubenswrapper[4722]: I0309 14:28:15.108678 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/483d53e3-10df-4bea-97df-8d72228035b9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"483d53e3-10df-4bea-97df-8d72228035b9\") " pod="openstack/ceilometer-0"
Mar 09 14:28:15 crc kubenswrapper[4722]: I0309 14:28:15.108731 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/483d53e3-10df-4bea-97df-8d72228035b9-scripts\") pod \"ceilometer-0\" (UID: \"483d53e3-10df-4bea-97df-8d72228035b9\") " pod="openstack/ceilometer-0"
Mar 09 14:28:15 crc kubenswrapper[4722]: I0309 14:28:15.115018 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/483d53e3-10df-4bea-97df-8d72228035b9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"483d53e3-10df-4bea-97df-8d72228035b9\") " pod="openstack/ceilometer-0"
Mar 09 14:28:15 crc kubenswrapper[4722]: I0309 14:28:15.115684 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/483d53e3-10df-4bea-97df-8d72228035b9-config-data\") pod \"ceilometer-0\" (UID: \"483d53e3-10df-4bea-97df-8d72228035b9\") " pod="openstack/ceilometer-0"
Mar 09 14:28:15 crc kubenswrapper[4722]: I0309 14:28:15.115707 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/483d53e3-10df-4bea-97df-8d72228035b9-scripts\") pod \"ceilometer-0\" (UID: \"483d53e3-10df-4bea-97df-8d72228035b9\") " pod="openstack/ceilometer-0"
Mar 09 14:28:15 crc kubenswrapper[4722]: I0309 14:28:15.116562 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/483d53e3-10df-4bea-97df-8d72228035b9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"483d53e3-10df-4bea-97df-8d72228035b9\") " pod="openstack/ceilometer-0"
Mar 09 14:28:15 crc kubenswrapper[4722]: I0309 14:28:15.158101 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjqbx\" (UniqueName: \"kubernetes.io/projected/483d53e3-10df-4bea-97df-8d72228035b9-kube-api-access-mjqbx\") pod \"ceilometer-0\" (UID: \"483d53e3-10df-4bea-97df-8d72228035b9\") " pod="openstack/ceilometer-0"
Mar 09 14:28:15 crc kubenswrapper[4722]: I0309 14:28:15.278281 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 09 14:28:16 crc kubenswrapper[4722]: I0309 14:28:16.166442 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7cd25c5-3f3b-4b28-b96f-0aa6114b498c" path="/var/lib/kubelet/pods/f7cd25c5-3f3b-4b28-b96f-0aa6114b498c/volumes"
Mar 09 14:28:16 crc kubenswrapper[4722]: I0309 14:28:16.167493 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Mar 09 14:28:16 crc kubenswrapper[4722]: I0309 14:28:16.167520 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Mar 09 14:28:16 crc kubenswrapper[4722]: I0309 14:28:16.195280 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 09 14:28:16 crc kubenswrapper[4722]: I0309 14:28:16.195341 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 09 14:28:16 crc kubenswrapper[4722]: I0309 14:28:16.202629 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Mar 09 14:28:16 crc kubenswrapper[4722]: I0309 14:28:16.465623 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 09 14:28:16 crc kubenswrapper[4722]: I0309 14:28:16.466088 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 09 14:28:16 crc kubenswrapper[4722]: I0309 14:28:16.521327 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Mar 09 14:28:16 crc kubenswrapper[4722]: I0309 14:28:16.878028 4722 generic.go:334] "Generic (PLEG): container finished" podID="aa375409-8285-4709-8abd-1916c59a6566" containerID="866dea47edd6b6df9c06f06c280717c12a33b978612ab76a0a4e30d589b49aba" exitCode=0
Mar 09 14:28:16 crc kubenswrapper[4722]: I0309 14:28:16.878112 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-8qttf" event={"ID":"aa375409-8285-4709-8abd-1916c59a6566","Type":"ContainerDied","Data":"866dea47edd6b6df9c06f06c280717c12a33b978612ab76a0a4e30d589b49aba"}
Mar 09 14:28:16 crc kubenswrapper[4722]: I0309 14:28:16.916428 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Mar 09 14:28:17 crc kubenswrapper[4722]: I0309 14:28:17.277489 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="035524e8-b163-4660-8e2d-b90ebcbef082" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.247:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 14:28:17 crc kubenswrapper[4722]: I0309 14:28:17.277513 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="035524e8-b163-4660-8e2d-b90ebcbef082" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.247:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 14:28:18 crc kubenswrapper[4722]: I0309 14:28:18.875270 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-8qttf"
Mar 09 14:28:18 crc kubenswrapper[4722]: I0309 14:28:18.931233 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-vl8jp" event={"ID":"1c58faa7-5e6b-4140-9859-a4729c6354d9","Type":"ContainerStarted","Data":"1456c5b63dd83c218d01c12cae5566503438f38b2defeadc341d008a2ceecb6a"}
Mar 09 14:28:18 crc kubenswrapper[4722]: I0309 14:28:18.934123 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-8qttf" event={"ID":"aa375409-8285-4709-8abd-1916c59a6566","Type":"ContainerDied","Data":"ae96543132aca98dba4f1c455bce164458c3cedb30e6001a7a085837574ffa87"}
Mar 09 14:28:18 crc kubenswrapper[4722]: I0309 14:28:18.934154 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae96543132aca98dba4f1c455bce164458c3cedb30e6001a7a085837574ffa87"
Mar 09 14:28:18 crc kubenswrapper[4722]: I0309 14:28:18.934263 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-8qttf"
Mar 09 14:28:18 crc kubenswrapper[4722]: I0309 14:28:18.949498 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-vl8jp" podStartSLOduration=4.85057706 podStartE2EDuration="10.949479161s" podCreationTimestamp="2026-03-09 14:28:08 +0000 UTC" firstStartedPulling="2026-03-09 14:28:12.454896975 +0000 UTC m=+1533.010465541" lastFinishedPulling="2026-03-09 14:28:18.553799076 +0000 UTC m=+1539.109367642" observedRunningTime="2026-03-09 14:28:18.945154351 +0000 UTC m=+1539.500722937" watchObservedRunningTime="2026-03-09 14:28:18.949479161 +0000 UTC m=+1539.505047737"
Mar 09 14:28:19 crc kubenswrapper[4722]: I0309 14:28:19.002747 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hkkpb\" (UniqueName: \"kubernetes.io/projected/aa375409-8285-4709-8abd-1916c59a6566-kube-api-access-hkkpb\") pod \"aa375409-8285-4709-8abd-1916c59a6566\" (UID: \"aa375409-8285-4709-8abd-1916c59a6566\") "
Mar 09 14:28:19 crc kubenswrapper[4722]: I0309 14:28:19.002845 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa375409-8285-4709-8abd-1916c59a6566-config-data\") pod \"aa375409-8285-4709-8abd-1916c59a6566\" (UID: \"aa375409-8285-4709-8abd-1916c59a6566\") "
Mar 09 14:28:19 crc kubenswrapper[4722]: I0309 14:28:19.002885 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa375409-8285-4709-8abd-1916c59a6566-combined-ca-bundle\") pod \"aa375409-8285-4709-8abd-1916c59a6566\" (UID: \"aa375409-8285-4709-8abd-1916c59a6566\") "
Mar 09 14:28:19 crc kubenswrapper[4722]: I0309 14:28:19.002948 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa375409-8285-4709-8abd-1916c59a6566-scripts\") pod \"aa375409-8285-4709-8abd-1916c59a6566\" (UID: \"aa375409-8285-4709-8abd-1916c59a6566\") "
Mar 09 14:28:19 crc kubenswrapper[4722]: I0309 14:28:19.008784 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa375409-8285-4709-8abd-1916c59a6566-kube-api-access-hkkpb" (OuterVolumeSpecName: "kube-api-access-hkkpb") pod "aa375409-8285-4709-8abd-1916c59a6566" (UID: "aa375409-8285-4709-8abd-1916c59a6566"). InnerVolumeSpecName "kube-api-access-hkkpb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:28:19 crc kubenswrapper[4722]: I0309 14:28:19.009510 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa375409-8285-4709-8abd-1916c59a6566-scripts" (OuterVolumeSpecName: "scripts") pod "aa375409-8285-4709-8abd-1916c59a6566" (UID: "aa375409-8285-4709-8abd-1916c59a6566"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:28:19 crc kubenswrapper[4722]: I0309 14:28:19.034583 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa375409-8285-4709-8abd-1916c59a6566-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aa375409-8285-4709-8abd-1916c59a6566" (UID: "aa375409-8285-4709-8abd-1916c59a6566"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:28:19 crc kubenswrapper[4722]: I0309 14:28:19.036227 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa375409-8285-4709-8abd-1916c59a6566-config-data" (OuterVolumeSpecName: "config-data") pod "aa375409-8285-4709-8abd-1916c59a6566" (UID: "aa375409-8285-4709-8abd-1916c59a6566"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:28:19 crc kubenswrapper[4722]: I0309 14:28:19.087278 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 09 14:28:19 crc kubenswrapper[4722]: I0309 14:28:19.110016 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hkkpb\" (UniqueName: \"kubernetes.io/projected/aa375409-8285-4709-8abd-1916c59a6566-kube-api-access-hkkpb\") on node \"crc\" DevicePath \"\""
Mar 09 14:28:19 crc kubenswrapper[4722]: I0309 14:28:19.110053 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa375409-8285-4709-8abd-1916c59a6566-config-data\") on node \"crc\" DevicePath \"\""
Mar 09 14:28:19 crc kubenswrapper[4722]: I0309 14:28:19.110066 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa375409-8285-4709-8abd-1916c59a6566-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 14:28:19 crc kubenswrapper[4722]: I0309 14:28:19.110078 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa375409-8285-4709-8abd-1916c59a6566-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 14:28:19 crc kubenswrapper[4722]: I0309 14:28:19.116273 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 09 14:28:19 crc kubenswrapper[4722]: I0309 14:28:19.116552 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="035524e8-b163-4660-8e2d-b90ebcbef082" containerName="nova-api-log" containerID="cri-o://01f24812bf5b006c669261fd4a11b3af9c7cf9717f10922b664eee6f5e530995" gracePeriod=30
Mar 09 14:28:19 crc kubenswrapper[4722]: I0309 14:28:19.116723 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="035524e8-b163-4660-8e2d-b90ebcbef082" containerName="nova-api-api" containerID="cri-o://f8f1aa6c9a54cd18444527eb6152479842ac9eccf62f56097bdcfda05d0b3fe2" gracePeriod=30
Mar 09 14:28:19 crc kubenswrapper[4722]: I0309 14:28:19.134674 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 09 14:28:19 crc kubenswrapper[4722]: I0309 14:28:19.134878 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="70dbd0c7-a6dc-4f0c-aff3-1de023a615f4" containerName="nova-scheduler-scheduler" containerID="cri-o://32b8ae9e0ba215d2db25700cc901e98b1cc9facc016ba3ee9ca38bb7d739f35f" gracePeriod=30
Mar 09 14:28:19 crc kubenswrapper[4722]: I0309 14:28:19.952705 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"483d53e3-10df-4bea-97df-8d72228035b9","Type":"ContainerStarted","Data":"cb7f4f66a8c95faf66fe9599f65b1f6dcfc0d56b1d2d137055208101c01d8bd1"}
Mar 09 14:28:19 crc kubenswrapper[4722]: I0309 14:28:19.952961 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"483d53e3-10df-4bea-97df-8d72228035b9","Type":"ContainerStarted","Data":"d5f95064d66a6fb3eddaf5d28cec1c93924105afae37580608a47faacaf0d90a"}
Mar 09 14:28:19 crc kubenswrapper[4722]: I0309 14:28:19.961969 4722 generic.go:334] "Generic (PLEG): container finished" podID="035524e8-b163-4660-8e2d-b90ebcbef082" containerID="01f24812bf5b006c669261fd4a11b3af9c7cf9717f10922b664eee6f5e530995" exitCode=143
Mar 09 14:28:19 crc kubenswrapper[4722]: I0309 14:28:19.962028 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"035524e8-b163-4660-8e2d-b90ebcbef082","Type":"ContainerDied","Data":"01f24812bf5b006c669261fd4a11b3af9c7cf9717f10922b664eee6f5e530995"}
Mar 09 14:28:19 crc kubenswrapper[4722]: I0309 14:28:19.967500 4722 generic.go:334] "Generic (PLEG): container finished" podID="70dbd0c7-a6dc-4f0c-aff3-1de023a615f4" containerID="32b8ae9e0ba215d2db25700cc901e98b1cc9facc016ba3ee9ca38bb7d739f35f" exitCode=0
Mar 09 14:28:19 crc kubenswrapper[4722]: I0309 14:28:19.967558 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"70dbd0c7-a6dc-4f0c-aff3-1de023a615f4","Type":"ContainerDied","Data":"32b8ae9e0ba215d2db25700cc901e98b1cc9facc016ba3ee9ca38bb7d739f35f"}
Mar 09 14:28:19 crc kubenswrapper[4722]: I0309 14:28:19.969265 4722 generic.go:334] "Generic (PLEG): container finished" podID="8e25996e-4f6c-4135-b781-95eae749689e" containerID="d9ca2f07ecb9bf38ccd438169eed0c8afa14d0ac674a9f2f87af6aba969cea02" exitCode=0
Mar 09 14:28:19 crc kubenswrapper[4722]: I0309 14:28:19.970437 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-cklld" event={"ID":"8e25996e-4f6c-4135-b781-95eae749689e","Type":"ContainerDied","Data":"d9ca2f07ecb9bf38ccd438169eed0c8afa14d0ac674a9f2f87af6aba969cea02"}
Mar 09 14:28:20 crc kubenswrapper[4722]: E0309 14:28:20.039024 4722 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod70dbd0c7_a6dc_4f0c_aff3_1de023a615f4.slice/crio-conmon-32b8ae9e0ba215d2db25700cc901e98b1cc9facc016ba3ee9ca38bb7d739f35f.scope\": RecentStats: unable to find data in memory cache]"
Mar 09 14:28:20 crc kubenswrapper[4722]: I0309 14:28:20.262043 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 09 14:28:20 crc kubenswrapper[4722]: I0309 14:28:20.337932 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70dbd0c7-a6dc-4f0c-aff3-1de023a615f4-combined-ca-bundle\") pod \"70dbd0c7-a6dc-4f0c-aff3-1de023a615f4\" (UID: \"70dbd0c7-a6dc-4f0c-aff3-1de023a615f4\") "
Mar 09 14:28:20 crc kubenswrapper[4722]: I0309 14:28:20.338044 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kds6z\" (UniqueName: \"kubernetes.io/projected/70dbd0c7-a6dc-4f0c-aff3-1de023a615f4-kube-api-access-kds6z\") pod \"70dbd0c7-a6dc-4f0c-aff3-1de023a615f4\" (UID: \"70dbd0c7-a6dc-4f0c-aff3-1de023a615f4\") "
Mar 09 14:28:20 crc kubenswrapper[4722]: I0309 14:28:20.338513 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70dbd0c7-a6dc-4f0c-aff3-1de023a615f4-config-data\") pod \"70dbd0c7-a6dc-4f0c-aff3-1de023a615f4\" (UID: \"70dbd0c7-a6dc-4f0c-aff3-1de023a615f4\") "
Mar 09 14:28:20 crc kubenswrapper[4722]: I0309 14:28:20.346434 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70dbd0c7-a6dc-4f0c-aff3-1de023a615f4-kube-api-access-kds6z" (OuterVolumeSpecName: "kube-api-access-kds6z") pod "70dbd0c7-a6dc-4f0c-aff3-1de023a615f4" (UID: "70dbd0c7-a6dc-4f0c-aff3-1de023a615f4"). InnerVolumeSpecName "kube-api-access-kds6z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:28:20 crc kubenswrapper[4722]: I0309 14:28:20.371518 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70dbd0c7-a6dc-4f0c-aff3-1de023a615f4-config-data" (OuterVolumeSpecName: "config-data") pod "70dbd0c7-a6dc-4f0c-aff3-1de023a615f4" (UID: "70dbd0c7-a6dc-4f0c-aff3-1de023a615f4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:28:20 crc kubenswrapper[4722]: I0309 14:28:20.374329 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70dbd0c7-a6dc-4f0c-aff3-1de023a615f4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "70dbd0c7-a6dc-4f0c-aff3-1de023a615f4" (UID: "70dbd0c7-a6dc-4f0c-aff3-1de023a615f4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:28:20 crc kubenswrapper[4722]: I0309 14:28:20.441522 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/70dbd0c7-a6dc-4f0c-aff3-1de023a615f4-config-data\") on node \"crc\" DevicePath \"\""
Mar 09 14:28:20 crc kubenswrapper[4722]: I0309 14:28:20.441559 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70dbd0c7-a6dc-4f0c-aff3-1de023a615f4-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 14:28:20 crc kubenswrapper[4722]: I0309 14:28:20.441573 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kds6z\" (UniqueName: \"kubernetes.io/projected/70dbd0c7-a6dc-4f0c-aff3-1de023a615f4-kube-api-access-kds6z\") on node \"crc\" DevicePath \"\""
Mar 09 14:28:20 crc kubenswrapper[4722]: I0309 14:28:20.790797 4722 scope.go:117] "RemoveContainer" containerID="428579cc73e13fdc6466d3bc9654e1b74df863575b4241ab50e43a0a961ac2b1"
Mar 09 14:28:20 crc kubenswrapper[4722]: I0309 14:28:20.990186 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"70dbd0c7-a6dc-4f0c-aff3-1de023a615f4","Type":"ContainerDied","Data":"0dc9834d0c1e16fa5fe0bde5d4f1b3ef1fe9b6e534c09558f91edacc2341613c"}
Mar 09 14:28:20 crc kubenswrapper[4722]: I0309 14:28:20.990670 4722 scope.go:117] "RemoveContainer" containerID="32b8ae9e0ba215d2db25700cc901e98b1cc9facc016ba3ee9ca38bb7d739f35f"
Mar 09 14:28:20 crc kubenswrapper[4722]: I0309 14:28:20.990995 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 09 14:28:21 crc kubenswrapper[4722]: I0309 14:28:21.008879 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"483d53e3-10df-4bea-97df-8d72228035b9","Type":"ContainerStarted","Data":"58372d3991297b65f350e20d6923a3bdb3c68c4d52579b01ceaffcc5e05c2dfd"}
Mar 09 14:28:21 crc kubenswrapper[4722]: I0309 14:28:21.107374 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 09 14:28:21 crc kubenswrapper[4722]: I0309 14:28:21.110919 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 09 14:28:21 crc kubenswrapper[4722]: I0309 14:28:21.157519 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Mar 09 14:28:21 crc kubenswrapper[4722]: E0309 14:28:21.158182 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa375409-8285-4709-8abd-1916c59a6566" containerName="nova-manage"
Mar 09 14:28:21 crc kubenswrapper[4722]: I0309 14:28:21.158213 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa375409-8285-4709-8abd-1916c59a6566" containerName="nova-manage"
Mar 09 14:28:21 crc kubenswrapper[4722]: E0309 14:28:21.158244 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70dbd0c7-a6dc-4f0c-aff3-1de023a615f4" containerName="nova-scheduler-scheduler"
Mar 09 14:28:21 crc kubenswrapper[4722]: I0309 14:28:21.158252 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="70dbd0c7-a6dc-4f0c-aff3-1de023a615f4" containerName="nova-scheduler-scheduler"
Mar 09 14:28:21 crc kubenswrapper[4722]: I0309 14:28:21.158510 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="70dbd0c7-a6dc-4f0c-aff3-1de023a615f4" containerName="nova-scheduler-scheduler"
Mar 09 14:28:21 crc kubenswrapper[4722]: I0309 14:28:21.158528 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa375409-8285-4709-8abd-1916c59a6566" containerName="nova-manage"
Mar 09 14:28:21 crc kubenswrapper[4722]: I0309 14:28:21.159316 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 09 14:28:21 crc kubenswrapper[4722]: I0309 14:28:21.163943 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Mar 09 14:28:21 crc kubenswrapper[4722]: I0309 14:28:21.191040 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 09 14:28:21 crc kubenswrapper[4722]: I0309 14:28:21.258776 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fe185bd-3b13-48df-9ceb-df1ba08e277b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1fe185bd-3b13-48df-9ceb-df1ba08e277b\") " pod="openstack/nova-scheduler-0"
Mar 09 14:28:21 crc kubenswrapper[4722]: I0309 14:28:21.259110 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6frd5\" (UniqueName: \"kubernetes.io/projected/1fe185bd-3b13-48df-9ceb-df1ba08e277b-kube-api-access-6frd5\") pod \"nova-scheduler-0\" (UID: \"1fe185bd-3b13-48df-9ceb-df1ba08e277b\") " pod="openstack/nova-scheduler-0"
Mar 09 14:28:21 crc kubenswrapper[4722]: I0309 14:28:21.259522 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fe185bd-3b13-48df-9ceb-df1ba08e277b-config-data\") pod \"nova-scheduler-0\" (UID: \"1fe185bd-3b13-48df-9ceb-df1ba08e277b\") " pod="openstack/nova-scheduler-0"
Mar 09 14:28:21 crc kubenswrapper[4722]: I0309 14:28:21.362250 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fe185bd-3b13-48df-9ceb-df1ba08e277b-config-data\") pod \"nova-scheduler-0\" (UID: \"1fe185bd-3b13-48df-9ceb-df1ba08e277b\") " pod="openstack/nova-scheduler-0"
Mar 09 14:28:21 crc kubenswrapper[4722]: I0309 14:28:21.362466 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fe185bd-3b13-48df-9ceb-df1ba08e277b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1fe185bd-3b13-48df-9ceb-df1ba08e277b\") " pod="openstack/nova-scheduler-0"
Mar 09 14:28:21 crc kubenswrapper[4722]: I0309 14:28:21.362533 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6frd5\" (UniqueName: \"kubernetes.io/projected/1fe185bd-3b13-48df-9ceb-df1ba08e277b-kube-api-access-6frd5\") pod \"nova-scheduler-0\" (UID: \"1fe185bd-3b13-48df-9ceb-df1ba08e277b\") " pod="openstack/nova-scheduler-0"
Mar 09 14:28:21 crc kubenswrapper[4722]: I0309 14:28:21.371102 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fe185bd-3b13-48df-9ceb-df1ba08e277b-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"1fe185bd-3b13-48df-9ceb-df1ba08e277b\") " pod="openstack/nova-scheduler-0"
Mar 09 14:28:21 crc kubenswrapper[4722]: I0309 14:28:21.371881 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fe185bd-3b13-48df-9ceb-df1ba08e277b-config-data\") pod \"nova-scheduler-0\" (UID: \"1fe185bd-3b13-48df-9ceb-df1ba08e277b\") " pod="openstack/nova-scheduler-0"
Mar 09 14:28:21 crc kubenswrapper[4722]: I0309 14:28:21.378734 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6frd5\" (UniqueName: \"kubernetes.io/projected/1fe185bd-3b13-48df-9ceb-df1ba08e277b-kube-api-access-6frd5\") pod \"nova-scheduler-0\" (UID: \"1fe185bd-3b13-48df-9ceb-df1ba08e277b\") " pod="openstack/nova-scheduler-0"
Mar 09 14:28:21 crc kubenswrapper[4722]: I0309 14:28:21.488567 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 09 14:28:21 crc kubenswrapper[4722]: I0309 14:28:21.597456 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-9b86998b5-s8lxr"
Mar 09 14:28:21 crc kubenswrapper[4722]: I0309 14:28:21.685776 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-xpcxn"]
Mar 09 14:28:21 crc kubenswrapper[4722]: I0309 14:28:21.686541 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7756b9d78c-xpcxn" podUID="3c1bee46-a464-474e-ae9c-e333a0ef2190" containerName="dnsmasq-dns" containerID="cri-o://b4992f3e1f40480b19383ed9d7d471916f5e637a19de820f80b9af6999306965" gracePeriod=10
Mar 09 14:28:21 crc kubenswrapper[4722]: I0309 14:28:21.854593 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-cklld"
Mar 09 14:28:21 crc kubenswrapper[4722]: I0309 14:28:21.988803 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e25996e-4f6c-4135-b781-95eae749689e-combined-ca-bundle\") pod \"8e25996e-4f6c-4135-b781-95eae749689e\" (UID: \"8e25996e-4f6c-4135-b781-95eae749689e\") "
Mar 09 14:28:21 crc kubenswrapper[4722]: I0309 14:28:21.989108 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kx7rw\" (UniqueName: \"kubernetes.io/projected/8e25996e-4f6c-4135-b781-95eae749689e-kube-api-access-kx7rw\") pod \"8e25996e-4f6c-4135-b781-95eae749689e\" (UID: \"8e25996e-4f6c-4135-b781-95eae749689e\") "
Mar 09 14:28:21 crc kubenswrapper[4722]: I0309 14:28:21.989219 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e25996e-4f6c-4135-b781-95eae749689e-config-data\") pod \"8e25996e-4f6c-4135-b781-95eae749689e\" (UID: \"8e25996e-4f6c-4135-b781-95eae749689e\") "
Mar 09 14:28:21 crc kubenswrapper[4722]: I0309 14:28:21.989263 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e25996e-4f6c-4135-b781-95eae749689e-scripts\") pod \"8e25996e-4f6c-4135-b781-95eae749689e\" (UID: \"8e25996e-4f6c-4135-b781-95eae749689e\") "
Mar 09 14:28:21 crc kubenswrapper[4722]: I0309 14:28:21.996777 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e25996e-4f6c-4135-b781-95eae749689e-kube-api-access-kx7rw" (OuterVolumeSpecName: "kube-api-access-kx7rw") pod "8e25996e-4f6c-4135-b781-95eae749689e" (UID: "8e25996e-4f6c-4135-b781-95eae749689e"). InnerVolumeSpecName "kube-api-access-kx7rw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:28:22 crc kubenswrapper[4722]: I0309 14:28:22.001802 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e25996e-4f6c-4135-b781-95eae749689e-scripts" (OuterVolumeSpecName: "scripts") pod "8e25996e-4f6c-4135-b781-95eae749689e" (UID: "8e25996e-4f6c-4135-b781-95eae749689e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:28:22 crc kubenswrapper[4722]: I0309 14:28:22.060234 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e25996e-4f6c-4135-b781-95eae749689e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8e25996e-4f6c-4135-b781-95eae749689e" (UID: "8e25996e-4f6c-4135-b781-95eae749689e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:28:22 crc kubenswrapper[4722]: W0309 14:28:22.069279 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1fe185bd_3b13_48df_9ceb_df1ba08e277b.slice/crio-7e797916bf99fb06ad0a2846b0c3f3f33f9c999d2c43e165feedf9ff47a052ed WatchSource:0}: Error finding container 7e797916bf99fb06ad0a2846b0c3f3f33f9c999d2c43e165feedf9ff47a052ed: Status 404 returned error can't find the container with id 7e797916bf99fb06ad0a2846b0c3f3f33f9c999d2c43e165feedf9ff47a052ed
Mar 09 14:28:22 crc kubenswrapper[4722]: I0309 14:28:22.069313 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 09 14:28:22 crc kubenswrapper[4722]: I0309 14:28:22.071592 4722 generic.go:334] "Generic (PLEG): container finished" podID="3c1bee46-a464-474e-ae9c-e333a0ef2190" containerID="b4992f3e1f40480b19383ed9d7d471916f5e637a19de820f80b9af6999306965" exitCode=0
Mar 09 14:28:22 crc kubenswrapper[4722]: I0309 14:28:22.071669 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-xpcxn" event={"ID":"3c1bee46-a464-474e-ae9c-e333a0ef2190","Type":"ContainerDied","Data":"b4992f3e1f40480b19383ed9d7d471916f5e637a19de820f80b9af6999306965"}
Mar 09 14:28:22 crc kubenswrapper[4722]: I0309 14:28:22.076608 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e25996e-4f6c-4135-b781-95eae749689e-config-data" (OuterVolumeSpecName: "config-data") pod "8e25996e-4f6c-4135-b781-95eae749689e" (UID: "8e25996e-4f6c-4135-b781-95eae749689e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:28:22 crc kubenswrapper[4722]: I0309 14:28:22.083716 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 09 14:28:22 crc kubenswrapper[4722]: E0309 14:28:22.084346 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e25996e-4f6c-4135-b781-95eae749689e" containerName="nova-cell1-conductor-db-sync"
Mar 09 14:28:22 crc kubenswrapper[4722]: I0309 14:28:22.084374 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e25996e-4f6c-4135-b781-95eae749689e" containerName="nova-cell1-conductor-db-sync"
Mar 09 14:28:22 crc kubenswrapper[4722]: I0309 14:28:22.084664 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e25996e-4f6c-4135-b781-95eae749689e" containerName="nova-cell1-conductor-db-sync"
Mar 09 14:28:22 crc kubenswrapper[4722]: I0309 14:28:22.085513 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Mar 09 14:28:22 crc kubenswrapper[4722]: I0309 14:28:22.095359 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8e25996e-4f6c-4135-b781-95eae749689e-config-data\") on node \"crc\" DevicePath \"\""
Mar 09 14:28:22 crc kubenswrapper[4722]: I0309 14:28:22.095388 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8e25996e-4f6c-4135-b781-95eae749689e-scripts\") on node \"crc\" DevicePath \"\""
Mar 09 14:28:22 crc kubenswrapper[4722]: I0309 14:28:22.095400 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e25996e-4f6c-4135-b781-95eae749689e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 14:28:22 crc kubenswrapper[4722]: I0309 14:28:22.095413 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kx7rw\" (UniqueName: \"kubernetes.io/projected/8e25996e-4f6c-4135-b781-95eae749689e-kube-api-access-kx7rw\") on node \"crc\" DevicePath \"\""
Mar 09 14:28:22 crc kubenswrapper[4722]: I0309 14:28:22.098093 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 09 14:28:22 crc kubenswrapper[4722]: I0309 14:28:22.104394 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-cklld"
Mar 09 14:28:22 crc kubenswrapper[4722]: I0309 14:28:22.104383 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-cklld" event={"ID":"8e25996e-4f6c-4135-b781-95eae749689e","Type":"ContainerDied","Data":"c3b462a6936d1680bfe184966493fe8ff2672dd835244f8e8202be25c9ffcb6b"}
Mar 09 14:28:22 crc kubenswrapper[4722]: I0309 14:28:22.104502 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3b462a6936d1680bfe184966493fe8ff2672dd835244f8e8202be25c9ffcb6b"
Mar 09 14:28:22 crc kubenswrapper[4722]: I0309 14:28:22.115134 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"483d53e3-10df-4bea-97df-8d72228035b9","Type":"ContainerStarted","Data":"0009ddb824d3e48e1939ff0f61d50ec63d10781151ccaf88f95422c1b485e7e3"}
Mar 09 14:28:22 crc kubenswrapper[4722]: I0309 14:28:22.120864 4722 generic.go:334] "Generic (PLEG): container finished" podID="1c58faa7-5e6b-4140-9859-a4729c6354d9" containerID="1456c5b63dd83c218d01c12cae5566503438f38b2defeadc341d008a2ceecb6a" exitCode=0
Mar 09 14:28:22 crc kubenswrapper[4722]: I0309 14:28:22.120908 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-vl8jp" event={"ID":"1c58faa7-5e6b-4140-9859-a4729c6354d9","Type":"ContainerDied","Data":"1456c5b63dd83c218d01c12cae5566503438f38b2defeadc341d008a2ceecb6a"}
Mar 09 14:28:22 crc kubenswrapper[4722]: I0309 14:28:22.191873 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70dbd0c7-a6dc-4f0c-aff3-1de023a615f4" path="/var/lib/kubelet/pods/70dbd0c7-a6dc-4f0c-aff3-1de023a615f4/volumes"
Mar 09 14:28:22 crc kubenswrapper[4722]: I0309 14:28:22.198903 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b61a6cf-d8fb-40e9-ae4f-19441d83beed-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"6b61a6cf-d8fb-40e9-ae4f-19441d83beed\") " pod="openstack/nova-cell1-conductor-0"
Mar 09 14:28:22 crc kubenswrapper[4722]: I0309 14:28:22.199033 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b61a6cf-d8fb-40e9-ae4f-19441d83beed-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"6b61a6cf-d8fb-40e9-ae4f-19441d83beed\") " pod="openstack/nova-cell1-conductor-0"
Mar 09 14:28:22 crc kubenswrapper[4722]: I0309 14:28:22.199080 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6ctm\" (UniqueName: \"kubernetes.io/projected/6b61a6cf-d8fb-40e9-ae4f-19441d83beed-kube-api-access-w6ctm\") pod \"nova-cell1-conductor-0\" (UID: \"6b61a6cf-d8fb-40e9-ae4f-19441d83beed\") " pod="openstack/nova-cell1-conductor-0"
Mar 09 14:28:22 crc kubenswrapper[4722]: I0309 14:28:22.231814 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7756b9d78c-xpcxn"
Mar 09 14:28:22 crc kubenswrapper[4722]: I0309 14:28:22.304070 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3c1bee46-a464-474e-ae9c-e333a0ef2190-dns-svc\") pod \"3c1bee46-a464-474e-ae9c-e333a0ef2190\" (UID: \"3c1bee46-a464-474e-ae9c-e333a0ef2190\") "
Mar 09 14:28:22 crc kubenswrapper[4722]: I0309 14:28:22.304138 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3c1bee46-a464-474e-ae9c-e333a0ef2190-dns-swift-storage-0\") pod \"3c1bee46-a464-474e-ae9c-e333a0ef2190\" (UID: \"3c1bee46-a464-474e-ae9c-e333a0ef2190\") "
Mar 09 14:28:22 crc kubenswrapper[4722]: I0309 14:28:22.304368 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c1bee46-a464-474e-ae9c-e333a0ef2190-config\") pod \"3c1bee46-a464-474e-ae9c-e333a0ef2190\" (UID: \"3c1bee46-a464-474e-ae9c-e333a0ef2190\") "
Mar 09 14:28:22 crc kubenswrapper[4722]: I0309 14:28:22.304434 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zpqhb\" (UniqueName: \"kubernetes.io/projected/3c1bee46-a464-474e-ae9c-e333a0ef2190-kube-api-access-zpqhb\") pod \"3c1bee46-a464-474e-ae9c-e333a0ef2190\" (UID: \"3c1bee46-a464-474e-ae9c-e333a0ef2190\") "
Mar 09 14:28:22 crc kubenswrapper[4722]: I0309 14:28:22.304492 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3c1bee46-a464-474e-ae9c-e333a0ef2190-ovsdbserver-nb\") pod \"3c1bee46-a464-474e-ae9c-e333a0ef2190\" (UID: \"3c1bee46-a464-474e-ae9c-e333a0ef2190\") "
Mar 09 14:28:22 crc kubenswrapper[4722]: I0309 14:28:22.304523 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3c1bee46-a464-474e-ae9c-e333a0ef2190-ovsdbserver-sb\") pod \"3c1bee46-a464-474e-ae9c-e333a0ef2190\" (UID: \"3c1bee46-a464-474e-ae9c-e333a0ef2190\") "
Mar 09 14:28:22 crc kubenswrapper[4722]: I0309 14:28:22.305411 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b61a6cf-d8fb-40e9-ae4f-19441d83beed-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"6b61a6cf-d8fb-40e9-ae4f-19441d83beed\") " pod="openstack/nova-cell1-conductor-0"
Mar 09 14:28:22 crc kubenswrapper[4722]: I0309 14:28:22.305570 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b61a6cf-d8fb-40e9-ae4f-19441d83beed-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"6b61a6cf-d8fb-40e9-ae4f-19441d83beed\") " pod="openstack/nova-cell1-conductor-0"
Mar 09 14:28:22 crc kubenswrapper[4722]: I0309 14:28:22.305635 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6ctm\" (UniqueName: \"kubernetes.io/projected/6b61a6cf-d8fb-40e9-ae4f-19441d83beed-kube-api-access-w6ctm\") pod \"nova-cell1-conductor-0\" (UID: \"6b61a6cf-d8fb-40e9-ae4f-19441d83beed\") " pod="openstack/nova-cell1-conductor-0"
Mar 09 14:28:22 crc kubenswrapper[4722]: I0309 14:28:22.310742 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b61a6cf-d8fb-40e9-ae4f-19441d83beed-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"6b61a6cf-d8fb-40e9-ae4f-19441d83beed\") " pod="openstack/nova-cell1-conductor-0"
Mar 09 14:28:22 crc kubenswrapper[4722]: I0309 14:28:22.313033 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b61a6cf-d8fb-40e9-ae4f-19441d83beed-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"6b61a6cf-d8fb-40e9-ae4f-19441d83beed\") " pod="openstack/nova-cell1-conductor-0"
Mar 09 14:28:22 crc kubenswrapper[4722]: I0309 14:28:22.320531 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c1bee46-a464-474e-ae9c-e333a0ef2190-kube-api-access-zpqhb" (OuterVolumeSpecName: "kube-api-access-zpqhb") pod "3c1bee46-a464-474e-ae9c-e333a0ef2190" (UID: "3c1bee46-a464-474e-ae9c-e333a0ef2190"). InnerVolumeSpecName "kube-api-access-zpqhb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:28:22 crc kubenswrapper[4722]: I0309 14:28:22.322509 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6ctm\" (UniqueName: \"kubernetes.io/projected/6b61a6cf-d8fb-40e9-ae4f-19441d83beed-kube-api-access-w6ctm\") pod \"nova-cell1-conductor-0\" (UID: \"6b61a6cf-d8fb-40e9-ae4f-19441d83beed\") " pod="openstack/nova-cell1-conductor-0"
Mar 09 14:28:22 crc kubenswrapper[4722]: I0309 14:28:22.365774 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c1bee46-a464-474e-ae9c-e333a0ef2190-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3c1bee46-a464-474e-ae9c-e333a0ef2190" (UID: "3c1bee46-a464-474e-ae9c-e333a0ef2190"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 14:28:22 crc kubenswrapper[4722]: I0309 14:28:22.367156 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c1bee46-a464-474e-ae9c-e333a0ef2190-config" (OuterVolumeSpecName: "config") pod "3c1bee46-a464-474e-ae9c-e333a0ef2190" (UID: "3c1bee46-a464-474e-ae9c-e333a0ef2190"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 14:28:22 crc kubenswrapper[4722]: I0309 14:28:22.372927 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c1bee46-a464-474e-ae9c-e333a0ef2190-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3c1bee46-a464-474e-ae9c-e333a0ef2190" (UID: "3c1bee46-a464-474e-ae9c-e333a0ef2190"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 14:28:22 crc kubenswrapper[4722]: I0309 14:28:22.390948 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c1bee46-a464-474e-ae9c-e333a0ef2190-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3c1bee46-a464-474e-ae9c-e333a0ef2190" (UID: "3c1bee46-a464-474e-ae9c-e333a0ef2190"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 14:28:22 crc kubenswrapper[4722]: I0309 14:28:22.396881 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c1bee46-a464-474e-ae9c-e333a0ef2190-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3c1bee46-a464-474e-ae9c-e333a0ef2190" (UID: "3c1bee46-a464-474e-ae9c-e333a0ef2190"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 14:28:22 crc kubenswrapper[4722]: I0309 14:28:22.408427 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3c1bee46-a464-474e-ae9c-e333a0ef2190-config\") on node \"crc\" DevicePath \"\""
Mar 09 14:28:22 crc kubenswrapper[4722]: I0309 14:28:22.408462 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zpqhb\" (UniqueName: \"kubernetes.io/projected/3c1bee46-a464-474e-ae9c-e333a0ef2190-kube-api-access-zpqhb\") on node \"crc\" DevicePath \"\""
Mar 09 14:28:22 crc kubenswrapper[4722]: I0309 14:28:22.408474 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3c1bee46-a464-474e-ae9c-e333a0ef2190-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 09 14:28:22 crc kubenswrapper[4722]: I0309 14:28:22.408482 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3c1bee46-a464-474e-ae9c-e333a0ef2190-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 09 14:28:22 crc kubenswrapper[4722]: I0309 14:28:22.408491 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3c1bee46-a464-474e-ae9c-e333a0ef2190-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 09 14:28:22 crc kubenswrapper[4722]: I0309 14:28:22.408499 4722 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3c1bee46-a464-474e-ae9c-e333a0ef2190-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 09 14:28:22 crc kubenswrapper[4722]: I0309 14:28:22.458381 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Mar 09 14:28:22 crc kubenswrapper[4722]: I0309 14:28:22.944954 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 09 14:28:22 crc kubenswrapper[4722]: W0309 14:28:22.949838 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6b61a6cf_d8fb_40e9_ae4f_19441d83beed.slice/crio-ef151d8fc16aea718d11c5b680d41ea45b2aa8d6659d358e2c129d856a9a9573 WatchSource:0}: Error finding container ef151d8fc16aea718d11c5b680d41ea45b2aa8d6659d358e2c129d856a9a9573: Status 404 returned error can't find the container with id ef151d8fc16aea718d11c5b680d41ea45b2aa8d6659d358e2c129d856a9a9573
Mar 09 14:28:23 crc kubenswrapper[4722]: I0309 14:28:23.155510 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"6b61a6cf-d8fb-40e9-ae4f-19441d83beed","Type":"ContainerStarted","Data":"ef151d8fc16aea718d11c5b680d41ea45b2aa8d6659d358e2c129d856a9a9573"}
Mar 09 14:28:23 crc kubenswrapper[4722]: I0309 14:28:23.179688 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-xpcxn" event={"ID":"3c1bee46-a464-474e-ae9c-e333a0ef2190","Type":"ContainerDied","Data":"c9bfffe05c307b7c132099fcfe0f331fc42a2687812efbaa4accf54ae4e82945"}
Mar 09 14:28:23 crc kubenswrapper[4722]: I0309 14:28:23.179738 4722 scope.go:117] "RemoveContainer" containerID="b4992f3e1f40480b19383ed9d7d471916f5e637a19de820f80b9af6999306965"
Mar 09 14:28:23 crc kubenswrapper[4722]: I0309 14:28:23.179893 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7756b9d78c-xpcxn"
Mar 09 14:28:23 crc kubenswrapper[4722]: I0309 14:28:23.210592 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1fe185bd-3b13-48df-9ceb-df1ba08e277b","Type":"ContainerStarted","Data":"e181a3d1402638abb6a7fd52b392c6c4834488f8c071ebed96b611f31e146cdb"}
Mar 09 14:28:23 crc kubenswrapper[4722]: I0309 14:28:23.210642 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1fe185bd-3b13-48df-9ceb-df1ba08e277b","Type":"ContainerStarted","Data":"7e797916bf99fb06ad0a2846b0c3f3f33f9c999d2c43e165feedf9ff47a052ed"}
Mar 09 14:28:23 crc kubenswrapper[4722]: I0309 14:28:23.258110 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.258077853 podStartE2EDuration="2.258077853s" podCreationTimestamp="2026-03-09 14:28:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:28:23.241652609 +0000 UTC m=+1543.797221185" watchObservedRunningTime="2026-03-09 14:28:23.258077853 +0000 UTC m=+1543.813646429"
Mar 09 14:28:23 crc kubenswrapper[4722]: I0309 14:28:23.301036 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-xpcxn"]
Mar 09 14:28:23 crc kubenswrapper[4722]: I0309 14:28:23.314779 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-xpcxn"]
Mar 09 14:28:23 crc kubenswrapper[4722]: I0309 14:28:23.327834 4722 scope.go:117] "RemoveContainer" containerID="86c3469bd786421a67af04f30ed7e5cdf03b6d03b83ddff2bae2a5ea53f862f4"
Mar 09 14:28:23 crc kubenswrapper[4722]: I0309 14:28:23.730443 4722 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/aodh-db-sync-vl8jp" Mar 09 14:28:23 crc kubenswrapper[4722]: I0309 14:28:23.859555 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c58faa7-5e6b-4140-9859-a4729c6354d9-combined-ca-bundle\") pod \"1c58faa7-5e6b-4140-9859-a4729c6354d9\" (UID: \"1c58faa7-5e6b-4140-9859-a4729c6354d9\") " Mar 09 14:28:23 crc kubenswrapper[4722]: I0309 14:28:23.859620 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c58faa7-5e6b-4140-9859-a4729c6354d9-scripts\") pod \"1c58faa7-5e6b-4140-9859-a4729c6354d9\" (UID: \"1c58faa7-5e6b-4140-9859-a4729c6354d9\") " Mar 09 14:28:23 crc kubenswrapper[4722]: I0309 14:28:23.859795 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c58faa7-5e6b-4140-9859-a4729c6354d9-config-data\") pod \"1c58faa7-5e6b-4140-9859-a4729c6354d9\" (UID: \"1c58faa7-5e6b-4140-9859-a4729c6354d9\") " Mar 09 14:28:23 crc kubenswrapper[4722]: I0309 14:28:23.860001 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d277j\" (UniqueName: \"kubernetes.io/projected/1c58faa7-5e6b-4140-9859-a4729c6354d9-kube-api-access-d277j\") pod \"1c58faa7-5e6b-4140-9859-a4729c6354d9\" (UID: \"1c58faa7-5e6b-4140-9859-a4729c6354d9\") " Mar 09 14:28:23 crc kubenswrapper[4722]: I0309 14:28:23.865585 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c58faa7-5e6b-4140-9859-a4729c6354d9-kube-api-access-d277j" (OuterVolumeSpecName: "kube-api-access-d277j") pod "1c58faa7-5e6b-4140-9859-a4729c6354d9" (UID: "1c58faa7-5e6b-4140-9859-a4729c6354d9"). InnerVolumeSpecName "kube-api-access-d277j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:28:23 crc kubenswrapper[4722]: I0309 14:28:23.866376 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c58faa7-5e6b-4140-9859-a4729c6354d9-scripts" (OuterVolumeSpecName: "scripts") pod "1c58faa7-5e6b-4140-9859-a4729c6354d9" (UID: "1c58faa7-5e6b-4140-9859-a4729c6354d9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:28:23 crc kubenswrapper[4722]: I0309 14:28:23.931962 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c58faa7-5e6b-4140-9859-a4729c6354d9-config-data" (OuterVolumeSpecName: "config-data") pod "1c58faa7-5e6b-4140-9859-a4729c6354d9" (UID: "1c58faa7-5e6b-4140-9859-a4729c6354d9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:28:23 crc kubenswrapper[4722]: I0309 14:28:23.937347 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c58faa7-5e6b-4140-9859-a4729c6354d9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1c58faa7-5e6b-4140-9859-a4729c6354d9" (UID: "1c58faa7-5e6b-4140-9859-a4729c6354d9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:28:23 crc kubenswrapper[4722]: I0309 14:28:23.964506 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d277j\" (UniqueName: \"kubernetes.io/projected/1c58faa7-5e6b-4140-9859-a4729c6354d9-kube-api-access-d277j\") on node \"crc\" DevicePath \"\"" Mar 09 14:28:23 crc kubenswrapper[4722]: I0309 14:28:23.964541 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c58faa7-5e6b-4140-9859-a4729c6354d9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:28:23 crc kubenswrapper[4722]: I0309 14:28:23.964550 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c58faa7-5e6b-4140-9859-a4729c6354d9-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 14:28:23 crc kubenswrapper[4722]: I0309 14:28:23.964560 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c58faa7-5e6b-4140-9859-a4729c6354d9-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 14:28:24 crc kubenswrapper[4722]: I0309 14:28:24.052747 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 09 14:28:24 crc kubenswrapper[4722]: I0309 14:28:24.161428 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c1bee46-a464-474e-ae9c-e333a0ef2190" path="/var/lib/kubelet/pods/3c1bee46-a464-474e-ae9c-e333a0ef2190/volumes" Mar 09 14:28:24 crc kubenswrapper[4722]: I0309 14:28:24.167066 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/035524e8-b163-4660-8e2d-b90ebcbef082-combined-ca-bundle\") pod \"035524e8-b163-4660-8e2d-b90ebcbef082\" (UID: \"035524e8-b163-4660-8e2d-b90ebcbef082\") " Mar 09 14:28:24 crc kubenswrapper[4722]: I0309 14:28:24.167273 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tl947\" (UniqueName: \"kubernetes.io/projected/035524e8-b163-4660-8e2d-b90ebcbef082-kube-api-access-tl947\") pod \"035524e8-b163-4660-8e2d-b90ebcbef082\" (UID: \"035524e8-b163-4660-8e2d-b90ebcbef082\") " Mar 09 14:28:24 crc kubenswrapper[4722]: I0309 14:28:24.167295 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/035524e8-b163-4660-8e2d-b90ebcbef082-logs\") pod \"035524e8-b163-4660-8e2d-b90ebcbef082\" (UID: \"035524e8-b163-4660-8e2d-b90ebcbef082\") " Mar 09 14:28:24 crc kubenswrapper[4722]: I0309 14:28:24.167344 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/035524e8-b163-4660-8e2d-b90ebcbef082-config-data\") pod \"035524e8-b163-4660-8e2d-b90ebcbef082\" (UID: \"035524e8-b163-4660-8e2d-b90ebcbef082\") " Mar 09 14:28:24 crc kubenswrapper[4722]: I0309 14:28:24.167831 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/035524e8-b163-4660-8e2d-b90ebcbef082-logs" (OuterVolumeSpecName: "logs") pod "035524e8-b163-4660-8e2d-b90ebcbef082" (UID: "035524e8-b163-4660-8e2d-b90ebcbef082"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:28:24 crc kubenswrapper[4722]: I0309 14:28:24.168234 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/035524e8-b163-4660-8e2d-b90ebcbef082-logs\") on node \"crc\" DevicePath \"\"" Mar 09 14:28:24 crc kubenswrapper[4722]: I0309 14:28:24.172897 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/035524e8-b163-4660-8e2d-b90ebcbef082-kube-api-access-tl947" (OuterVolumeSpecName: "kube-api-access-tl947") pod "035524e8-b163-4660-8e2d-b90ebcbef082" (UID: "035524e8-b163-4660-8e2d-b90ebcbef082"). InnerVolumeSpecName "kube-api-access-tl947". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:28:24 crc kubenswrapper[4722]: I0309 14:28:24.222041 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/035524e8-b163-4660-8e2d-b90ebcbef082-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "035524e8-b163-4660-8e2d-b90ebcbef082" (UID: "035524e8-b163-4660-8e2d-b90ebcbef082"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:28:24 crc kubenswrapper[4722]: I0309 14:28:24.228104 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-vl8jp" event={"ID":"1c58faa7-5e6b-4140-9859-a4729c6354d9","Type":"ContainerDied","Data":"5a25995b091694fbd1ef50b1148bc415f507a2e827be482df2619798a694f294"} Mar 09 14:28:24 crc kubenswrapper[4722]: I0309 14:28:24.228137 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a25995b091694fbd1ef50b1148bc415f507a2e827be482df2619798a694f294" Mar 09 14:28:24 crc kubenswrapper[4722]: I0309 14:28:24.228182 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-vl8jp" Mar 09 14:28:24 crc kubenswrapper[4722]: I0309 14:28:24.231283 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 09 14:28:24 crc kubenswrapper[4722]: I0309 14:28:24.231334 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"035524e8-b163-4660-8e2d-b90ebcbef082","Type":"ContainerDied","Data":"f8f1aa6c9a54cd18444527eb6152479842ac9eccf62f56097bdcfda05d0b3fe2"} Mar 09 14:28:24 crc kubenswrapper[4722]: I0309 14:28:24.231416 4722 scope.go:117] "RemoveContainer" containerID="f8f1aa6c9a54cd18444527eb6152479842ac9eccf62f56097bdcfda05d0b3fe2" Mar 09 14:28:24 crc kubenswrapper[4722]: I0309 14:28:24.231043 4722 generic.go:334] "Generic (PLEG): container finished" podID="035524e8-b163-4660-8e2d-b90ebcbef082" containerID="f8f1aa6c9a54cd18444527eb6152479842ac9eccf62f56097bdcfda05d0b3fe2" exitCode=0 Mar 09 14:28:24 crc kubenswrapper[4722]: I0309 14:28:24.232355 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"035524e8-b163-4660-8e2d-b90ebcbef082","Type":"ContainerDied","Data":"98176ee23c37a0c155ddec66dae6ea64825e99b9f337ab7714bbea101eb5d619"} Mar 09 14:28:24 crc kubenswrapper[4722]: I0309 14:28:24.233786 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/035524e8-b163-4660-8e2d-b90ebcbef082-config-data" (OuterVolumeSpecName: "config-data") pod "035524e8-b163-4660-8e2d-b90ebcbef082" (UID: "035524e8-b163-4660-8e2d-b90ebcbef082"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:28:24 crc kubenswrapper[4722]: I0309 14:28:24.235638 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"6b61a6cf-d8fb-40e9-ae4f-19441d83beed","Type":"ContainerStarted","Data":"20de9ec346e60d04a5692697a87142821707b77186631ac304c4ff50f24ece25"} Mar 09 14:28:24 crc kubenswrapper[4722]: I0309 14:28:24.236020 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 09 14:28:24 crc kubenswrapper[4722]: I0309 14:28:24.254116 4722 scope.go:117] "RemoveContainer" containerID="01f24812bf5b006c669261fd4a11b3af9c7cf9717f10922b664eee6f5e530995" Mar 09 14:28:24 crc kubenswrapper[4722]: I0309 14:28:24.260506 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.260485429 podStartE2EDuration="2.260485429s" podCreationTimestamp="2026-03-09 14:28:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:28:24.251160921 +0000 UTC m=+1544.806729497" watchObservedRunningTime="2026-03-09 14:28:24.260485429 +0000 UTC m=+1544.816054005" Mar 09 14:28:24 crc kubenswrapper[4722]: I0309 14:28:24.270607 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/035524e8-b163-4660-8e2d-b90ebcbef082-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:28:24 crc kubenswrapper[4722]: I0309 14:28:24.270642 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tl947\" (UniqueName: \"kubernetes.io/projected/035524e8-b163-4660-8e2d-b90ebcbef082-kube-api-access-tl947\") on node \"crc\" DevicePath \"\"" Mar 09 14:28:24 crc kubenswrapper[4722]: I0309 14:28:24.270652 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/035524e8-b163-4660-8e2d-b90ebcbef082-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 14:28:24 crc kubenswrapper[4722]: I0309 14:28:24.277265 4722 scope.go:117] "RemoveContainer" containerID="f8f1aa6c9a54cd18444527eb6152479842ac9eccf62f56097bdcfda05d0b3fe2" Mar 09 14:28:24 crc kubenswrapper[4722]: E0309 14:28:24.279312 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8f1aa6c9a54cd18444527eb6152479842ac9eccf62f56097bdcfda05d0b3fe2\": container with ID starting with f8f1aa6c9a54cd18444527eb6152479842ac9eccf62f56097bdcfda05d0b3fe2 not found: ID does not exist" containerID="f8f1aa6c9a54cd18444527eb6152479842ac9eccf62f56097bdcfda05d0b3fe2" Mar 09 14:28:24 crc kubenswrapper[4722]: I0309 14:28:24.279343 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8f1aa6c9a54cd18444527eb6152479842ac9eccf62f56097bdcfda05d0b3fe2"} err="failed to get container status \"f8f1aa6c9a54cd18444527eb6152479842ac9eccf62f56097bdcfda05d0b3fe2\": rpc error: code = NotFound desc = could not find container \"f8f1aa6c9a54cd18444527eb6152479842ac9eccf62f56097bdcfda05d0b3fe2\": container with ID starting with f8f1aa6c9a54cd18444527eb6152479842ac9eccf62f56097bdcfda05d0b3fe2 not found: ID does not exist" Mar 09 14:28:24 crc kubenswrapper[4722]: I0309 14:28:24.279362 4722 scope.go:117] "RemoveContainer" containerID="01f24812bf5b006c669261fd4a11b3af9c7cf9717f10922b664eee6f5e530995" Mar 09 14:28:24 crc kubenswrapper[4722]: E0309 
14:28:24.279696 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01f24812bf5b006c669261fd4a11b3af9c7cf9717f10922b664eee6f5e530995\": container with ID starting with 01f24812bf5b006c669261fd4a11b3af9c7cf9717f10922b664eee6f5e530995 not found: ID does not exist" containerID="01f24812bf5b006c669261fd4a11b3af9c7cf9717f10922b664eee6f5e530995" Mar 09 14:28:24 crc kubenswrapper[4722]: I0309 14:28:24.279724 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01f24812bf5b006c669261fd4a11b3af9c7cf9717f10922b664eee6f5e530995"} err="failed to get container status \"01f24812bf5b006c669261fd4a11b3af9c7cf9717f10922b664eee6f5e530995\": rpc error: code = NotFound desc = could not find container \"01f24812bf5b006c669261fd4a11b3af9c7cf9717f10922b664eee6f5e530995\": container with ID starting with 01f24812bf5b006c669261fd4a11b3af9c7cf9717f10922b664eee6f5e530995 not found: ID does not exist" Mar 09 14:28:24 crc kubenswrapper[4722]: I0309 14:28:24.699273 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 09 14:28:24 crc kubenswrapper[4722]: I0309 14:28:24.725221 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 09 14:28:24 crc kubenswrapper[4722]: I0309 14:28:24.741274 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 09 14:28:24 crc kubenswrapper[4722]: E0309 14:28:24.742036 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c58faa7-5e6b-4140-9859-a4729c6354d9" containerName="aodh-db-sync" Mar 09 14:28:24 crc kubenswrapper[4722]: I0309 14:28:24.742061 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c58faa7-5e6b-4140-9859-a4729c6354d9" containerName="aodh-db-sync" Mar 09 14:28:24 crc kubenswrapper[4722]: E0309 14:28:24.742075 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="035524e8-b163-4660-8e2d-b90ebcbef082" containerName="nova-api-api" Mar 09 14:28:24 crc kubenswrapper[4722]: I0309 14:28:24.742083 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="035524e8-b163-4660-8e2d-b90ebcbef082" containerName="nova-api-api" Mar 09 14:28:24 crc kubenswrapper[4722]: E0309 14:28:24.742112 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="035524e8-b163-4660-8e2d-b90ebcbef082" containerName="nova-api-log" Mar 09 14:28:24 crc kubenswrapper[4722]: I0309 14:28:24.742124 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="035524e8-b163-4660-8e2d-b90ebcbef082" containerName="nova-api-log" Mar 09 14:28:24 crc kubenswrapper[4722]: E0309 14:28:24.742145 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c1bee46-a464-474e-ae9c-e333a0ef2190" containerName="dnsmasq-dns" Mar 09 14:28:24 crc kubenswrapper[4722]: I0309 14:28:24.742151 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c1bee46-a464-474e-ae9c-e333a0ef2190" containerName="dnsmasq-dns" Mar 09 14:28:24 crc kubenswrapper[4722]: E0309 14:28:24.742164 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c1bee46-a464-474e-ae9c-e333a0ef2190" containerName="init" Mar 09 14:28:24 crc kubenswrapper[4722]: I0309 14:28:24.742170 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c1bee46-a464-474e-ae9c-e333a0ef2190" containerName="init" Mar 09 14:28:24 crc kubenswrapper[4722]: I0309 14:28:24.742459 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c58faa7-5e6b-4140-9859-a4729c6354d9" 
containerName="aodh-db-sync" Mar 09 14:28:24 crc kubenswrapper[4722]: I0309 14:28:24.742485 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c1bee46-a464-474e-ae9c-e333a0ef2190" containerName="dnsmasq-dns" Mar 09 14:28:24 crc kubenswrapper[4722]: I0309 14:28:24.742504 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="035524e8-b163-4660-8e2d-b90ebcbef082" containerName="nova-api-api" Mar 09 14:28:24 crc kubenswrapper[4722]: I0309 14:28:24.742534 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="035524e8-b163-4660-8e2d-b90ebcbef082" containerName="nova-api-log" Mar 09 14:28:24 crc kubenswrapper[4722]: I0309 14:28:24.743986 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 09 14:28:24 crc kubenswrapper[4722]: I0309 14:28:24.746919 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 09 14:28:24 crc kubenswrapper[4722]: I0309 14:28:24.750867 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 09 14:28:24 crc kubenswrapper[4722]: I0309 14:28:24.888601 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c58393a-f27d-4e46-87ac-352b1370c295-logs\") pod \"nova-api-0\" (UID: \"9c58393a-f27d-4e46-87ac-352b1370c295\") " pod="openstack/nova-api-0" Mar 09 14:28:24 crc kubenswrapper[4722]: I0309 14:28:24.889437 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drp7d\" (UniqueName: \"kubernetes.io/projected/9c58393a-f27d-4e46-87ac-352b1370c295-kube-api-access-drp7d\") pod \"nova-api-0\" (UID: \"9c58393a-f27d-4e46-87ac-352b1370c295\") " pod="openstack/nova-api-0" Mar 09 14:28:24 crc kubenswrapper[4722]: I0309 14:28:24.889777 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c58393a-f27d-4e46-87ac-352b1370c295-config-data\") pod \"nova-api-0\" (UID: \"9c58393a-f27d-4e46-87ac-352b1370c295\") " pod="openstack/nova-api-0" Mar 09 14:28:24 crc kubenswrapper[4722]: I0309 14:28:24.889992 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c58393a-f27d-4e46-87ac-352b1370c295-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9c58393a-f27d-4e46-87ac-352b1370c295\") " pod="openstack/nova-api-0" Mar 09 14:28:24 crc kubenswrapper[4722]: I0309 14:28:24.993961 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c58393a-f27d-4e46-87ac-352b1370c295-config-data\") pod \"nova-api-0\" (UID: \"9c58393a-f27d-4e46-87ac-352b1370c295\") " pod="openstack/nova-api-0" Mar 09 14:28:24 crc kubenswrapper[4722]: I0309 14:28:24.994088 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c58393a-f27d-4e46-87ac-352b1370c295-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9c58393a-f27d-4e46-87ac-352b1370c295\") " pod="openstack/nova-api-0" Mar 09 14:28:24 crc kubenswrapper[4722]: I0309 14:28:24.994439 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c58393a-f27d-4e46-87ac-352b1370c295-logs\") pod \"nova-api-0\" (UID: 
\"9c58393a-f27d-4e46-87ac-352b1370c295\") " pod="openstack/nova-api-0" Mar 09 14:28:24 crc kubenswrapper[4722]: I0309 14:28:24.994875 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drp7d\" (UniqueName: \"kubernetes.io/projected/9c58393a-f27d-4e46-87ac-352b1370c295-kube-api-access-drp7d\") pod \"nova-api-0\" (UID: \"9c58393a-f27d-4e46-87ac-352b1370c295\") " pod="openstack/nova-api-0" Mar 09 14:28:24 crc kubenswrapper[4722]: I0309 14:28:24.995000 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c58393a-f27d-4e46-87ac-352b1370c295-logs\") pod \"nova-api-0\" (UID: \"9c58393a-f27d-4e46-87ac-352b1370c295\") " pod="openstack/nova-api-0" Mar 09 14:28:25 crc kubenswrapper[4722]: I0309 14:28:25.000447 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c58393a-f27d-4e46-87ac-352b1370c295-config-data\") pod \"nova-api-0\" (UID: \"9c58393a-f27d-4e46-87ac-352b1370c295\") " pod="openstack/nova-api-0" Mar 09 14:28:25 crc kubenswrapper[4722]: I0309 14:28:25.008848 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c58393a-f27d-4e46-87ac-352b1370c295-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9c58393a-f27d-4e46-87ac-352b1370c295\") " pod="openstack/nova-api-0" Mar 09 14:28:25 crc kubenswrapper[4722]: I0309 14:28:25.016777 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drp7d\" (UniqueName: \"kubernetes.io/projected/9c58393a-f27d-4e46-87ac-352b1370c295-kube-api-access-drp7d\") pod \"nova-api-0\" (UID: \"9c58393a-f27d-4e46-87ac-352b1370c295\") " pod="openstack/nova-api-0" Mar 09 14:28:25 crc kubenswrapper[4722]: I0309 14:28:25.105409 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 09 14:28:25 crc kubenswrapper[4722]: I0309 14:28:25.316142 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"483d53e3-10df-4bea-97df-8d72228035b9","Type":"ContainerStarted","Data":"2da874e6414bc043795d2fe45330c4ff876d3402e208b61378c7a30e50c18a61"} Mar 09 14:28:25 crc kubenswrapper[4722]: I0309 14:28:25.316228 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 09 14:28:25 crc kubenswrapper[4722]: I0309 14:28:25.373883 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=5.966528271 podStartE2EDuration="11.373859354s" podCreationTimestamp="2026-03-09 14:28:14 +0000 UTC" firstStartedPulling="2026-03-09 14:28:19.086805049 +0000 UTC m=+1539.642373625" lastFinishedPulling="2026-03-09 14:28:24.494136132 +0000 UTC m=+1545.049704708" observedRunningTime="2026-03-09 14:28:25.355737403 +0000 UTC m=+1545.911305979" watchObservedRunningTime="2026-03-09 14:28:25.373859354 +0000 UTC m=+1545.929427930" Mar 09 14:28:25 crc kubenswrapper[4722]: W0309 14:28:25.697559 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c58393a_f27d_4e46_87ac_352b1370c295.slice/crio-b4be2528460b9ad671efaa3af9bb9b811d4976fd9bafc8d84a430b0099c5d19e WatchSource:0}: Error finding container b4be2528460b9ad671efaa3af9bb9b811d4976fd9bafc8d84a430b0099c5d19e: Status 404 returned error can't find the container with id b4be2528460b9ad671efaa3af9bb9b811d4976fd9bafc8d84a430b0099c5d19e Mar 09 14:28:25 crc kubenswrapper[4722]: I0309 14:28:25.707177 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 09 14:28:26 crc kubenswrapper[4722]: I0309 14:28:26.164603 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="035524e8-b163-4660-8e2d-b90ebcbef082" path="/var/lib/kubelet/pods/035524e8-b163-4660-8e2d-b90ebcbef082/volumes" Mar 09 14:28:26 crc kubenswrapper[4722]: I0309 14:28:26.325503 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9c58393a-f27d-4e46-87ac-352b1370c295","Type":"ContainerStarted","Data":"3c8f0ecc6e288a2c45b7eecaa7fbb8df96e687d06d2ecda8b11d40e75a734c21"} Mar 09 14:28:26 crc kubenswrapper[4722]: I0309 14:28:26.325872 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9c58393a-f27d-4e46-87ac-352b1370c295","Type":"ContainerStarted","Data":"9cf4b538321d3fa703583336e99c0ac5959c1b9751f95fd0ea88e2cc3ab41b80"} Mar 09 14:28:26 crc kubenswrapper[4722]: I0309 14:28:26.325887 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9c58393a-f27d-4e46-87ac-352b1370c295","Type":"ContainerStarted","Data":"b4be2528460b9ad671efaa3af9bb9b811d4976fd9bafc8d84a430b0099c5d19e"} Mar 09 14:28:26 crc kubenswrapper[4722]: I0309 14:28:26.352756 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.352737079 podStartE2EDuration="2.352737079s" podCreationTimestamp="2026-03-09 14:28:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:28:26.341470678 +0000 UTC m=+1546.897039254" watchObservedRunningTime="2026-03-09 14:28:26.352737079 +0000 UTC m=+1546.908305655" Mar 09 14:28:26 crc kubenswrapper[4722]: I0309 
14:28:26.488822 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 09 14:28:28 crc kubenswrapper[4722]: I0309 14:28:28.118001 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Mar 09 14:28:28 crc kubenswrapper[4722]: I0309 14:28:28.122330 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Mar 09 14:28:28 crc kubenswrapper[4722]: I0309 14:28:28.128049 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Mar 09 14:28:28 crc kubenswrapper[4722]: I0309 14:28:28.128160 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Mar 09 14:28:28 crc kubenswrapper[4722]: I0309 14:28:28.128049 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-d98zq" Mar 09 14:28:28 crc kubenswrapper[4722]: I0309 14:28:28.141567 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 09 14:28:28 crc kubenswrapper[4722]: I0309 14:28:28.183601 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9226bef9-b3b3-4f10-8e02-6fbc710bc4f3-config-data\") pod \"aodh-0\" (UID: \"9226bef9-b3b3-4f10-8e02-6fbc710bc4f3\") " pod="openstack/aodh-0" Mar 09 14:28:28 crc kubenswrapper[4722]: I0309 14:28:28.183840 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9226bef9-b3b3-4f10-8e02-6fbc710bc4f3-combined-ca-bundle\") pod \"aodh-0\" (UID: \"9226bef9-b3b3-4f10-8e02-6fbc710bc4f3\") " pod="openstack/aodh-0" Mar 09 14:28:28 crc kubenswrapper[4722]: I0309 14:28:28.184080 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9226bef9-b3b3-4f10-8e02-6fbc710bc4f3-scripts\") pod \"aodh-0\" (UID: \"9226bef9-b3b3-4f10-8e02-6fbc710bc4f3\") " pod="openstack/aodh-0" Mar 09 14:28:28 crc kubenswrapper[4722]: I0309 14:28:28.184266 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnkzm\" (UniqueName: \"kubernetes.io/projected/9226bef9-b3b3-4f10-8e02-6fbc710bc4f3-kube-api-access-qnkzm\") pod \"aodh-0\" (UID: \"9226bef9-b3b3-4f10-8e02-6fbc710bc4f3\") " pod="openstack/aodh-0" Mar 09 14:28:28 crc kubenswrapper[4722]: I0309 14:28:28.286370 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9226bef9-b3b3-4f10-8e02-6fbc710bc4f3-scripts\") pod \"aodh-0\" (UID: \"9226bef9-b3b3-4f10-8e02-6fbc710bc4f3\") " pod="openstack/aodh-0" Mar 09 14:28:28 crc kubenswrapper[4722]: I0309 14:28:28.286462 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnkzm\" (UniqueName: \"kubernetes.io/projected/9226bef9-b3b3-4f10-8e02-6fbc710bc4f3-kube-api-access-qnkzm\") pod \"aodh-0\" (UID: \"9226bef9-b3b3-4f10-8e02-6fbc710bc4f3\") " pod="openstack/aodh-0" Mar 09 14:28:28 crc kubenswrapper[4722]: I0309 14:28:28.286630 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9226bef9-b3b3-4f10-8e02-6fbc710bc4f3-config-data\") pod \"aodh-0\" (UID: \"9226bef9-b3b3-4f10-8e02-6fbc710bc4f3\") " pod="openstack/aodh-0" Mar 09 14:28:28 crc 
kubenswrapper[4722]: I0309 14:28:28.286668 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9226bef9-b3b3-4f10-8e02-6fbc710bc4f3-combined-ca-bundle\") pod \"aodh-0\" (UID: \"9226bef9-b3b3-4f10-8e02-6fbc710bc4f3\") " pod="openstack/aodh-0" Mar 09 14:28:28 crc kubenswrapper[4722]: I0309 14:28:28.291975 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9226bef9-b3b3-4f10-8e02-6fbc710bc4f3-scripts\") pod \"aodh-0\" (UID: \"9226bef9-b3b3-4f10-8e02-6fbc710bc4f3\") " pod="openstack/aodh-0" Mar 09 14:28:28 crc kubenswrapper[4722]: I0309 14:28:28.292403 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9226bef9-b3b3-4f10-8e02-6fbc710bc4f3-config-data\") pod \"aodh-0\" (UID: \"9226bef9-b3b3-4f10-8e02-6fbc710bc4f3\") " pod="openstack/aodh-0" Mar 09 14:28:28 crc kubenswrapper[4722]: I0309 14:28:28.293333 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9226bef9-b3b3-4f10-8e02-6fbc710bc4f3-combined-ca-bundle\") pod \"aodh-0\" (UID: \"9226bef9-b3b3-4f10-8e02-6fbc710bc4f3\") " pod="openstack/aodh-0" Mar 09 14:28:28 crc kubenswrapper[4722]: I0309 14:28:28.309371 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnkzm\" (UniqueName: \"kubernetes.io/projected/9226bef9-b3b3-4f10-8e02-6fbc710bc4f3-kube-api-access-qnkzm\") pod \"aodh-0\" (UID: \"9226bef9-b3b3-4f10-8e02-6fbc710bc4f3\") " pod="openstack/aodh-0" Mar 09 14:28:28 crc kubenswrapper[4722]: I0309 14:28:28.457944 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Mar 09 14:28:29 crc kubenswrapper[4722]: I0309 14:28:29.192847 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 09 14:28:29 crc kubenswrapper[4722]: I0309 14:28:29.375870 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"9226bef9-b3b3-4f10-8e02-6fbc710bc4f3","Type":"ContainerStarted","Data":"b178e6de68f3d0a7ba0e3dc493d804391749665d2f57fa337c4435a67a78a420"} Mar 09 14:28:30 crc kubenswrapper[4722]: I0309 14:28:30.388731 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"9226bef9-b3b3-4f10-8e02-6fbc710bc4f3","Type":"ContainerStarted","Data":"6d9fb16100b3f52f35ea0198c84baf23497e15348c942b0f5dadd6b8d8a1b0de"} Mar 09 14:28:31 crc kubenswrapper[4722]: I0309 14:28:31.489013 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 09 14:28:31 crc kubenswrapper[4722]: I0309 14:28:31.532894 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 09 14:28:31 crc kubenswrapper[4722]: I0309 14:28:31.622643 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Mar 09 14:28:31 crc kubenswrapper[4722]: I0309 14:28:31.678747 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 14:28:31 crc kubenswrapper[4722]: I0309 14:28:31.679028 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="483d53e3-10df-4bea-97df-8d72228035b9" containerName="ceilometer-central-agent" containerID="cri-o://cb7f4f66a8c95faf66fe9599f65b1f6dcfc0d56b1d2d137055208101c01d8bd1" gracePeriod=30 Mar 09 14:28:31 crc kubenswrapper[4722]: I0309 14:28:31.679156 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="483d53e3-10df-4bea-97df-8d72228035b9" containerName="ceilometer-notification-agent" containerID="cri-o://58372d3991297b65f350e20d6923a3bdb3c68c4d52579b01ceaffcc5e05c2dfd" gracePeriod=30 Mar 09 14:28:31 crc kubenswrapper[4722]: I0309 14:28:31.679157 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="483d53e3-10df-4bea-97df-8d72228035b9" containerName="sg-core" containerID="cri-o://0009ddb824d3e48e1939ff0f61d50ec63d10781151ccaf88f95422c1b485e7e3" gracePeriod=30 Mar 09 14:28:31 crc kubenswrapper[4722]: I0309 14:28:31.679320 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="483d53e3-10df-4bea-97df-8d72228035b9" containerName="proxy-httpd" containerID="cri-o://2da874e6414bc043795d2fe45330c4ff876d3402e208b61378c7a30e50c18a61" gracePeriod=30 Mar 09 14:28:32 crc kubenswrapper[4722]: I0309 14:28:32.417080 4722 generic.go:334] "Generic (PLEG): container finished" podID="483d53e3-10df-4bea-97df-8d72228035b9" containerID="2da874e6414bc043795d2fe45330c4ff876d3402e208b61378c7a30e50c18a61" exitCode=0 Mar 09 14:28:32 crc kubenswrapper[4722]: I0309 14:28:32.417479 4722 generic.go:334] "Generic (PLEG): container finished" podID="483d53e3-10df-4bea-97df-8d72228035b9" containerID="0009ddb824d3e48e1939ff0f61d50ec63d10781151ccaf88f95422c1b485e7e3" exitCode=2 Mar 09 14:28:32 crc kubenswrapper[4722]: I0309 14:28:32.417494 4722 generic.go:334] "Generic (PLEG): container finished" podID="483d53e3-10df-4bea-97df-8d72228035b9" 
containerID="58372d3991297b65f350e20d6923a3bdb3c68c4d52579b01ceaffcc5e05c2dfd" exitCode=0 Mar 09 14:28:32 crc kubenswrapper[4722]: I0309 14:28:32.417114 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"483d53e3-10df-4bea-97df-8d72228035b9","Type":"ContainerDied","Data":"2da874e6414bc043795d2fe45330c4ff876d3402e208b61378c7a30e50c18a61"} Mar 09 14:28:32 crc kubenswrapper[4722]: I0309 14:28:32.417594 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"483d53e3-10df-4bea-97df-8d72228035b9","Type":"ContainerDied","Data":"0009ddb824d3e48e1939ff0f61d50ec63d10781151ccaf88f95422c1b485e7e3"} Mar 09 14:28:32 crc kubenswrapper[4722]: I0309 14:28:32.417627 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"483d53e3-10df-4bea-97df-8d72228035b9","Type":"ContainerDied","Data":"58372d3991297b65f350e20d6923a3bdb3c68c4d52579b01ceaffcc5e05c2dfd"} Mar 09 14:28:32 crc kubenswrapper[4722]: I0309 14:28:32.420987 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"9226bef9-b3b3-4f10-8e02-6fbc710bc4f3","Type":"ContainerStarted","Data":"e054dead3688eeb85afccf93d975a6293fdf7fdde17ec109bf4977b2b4adb687"} Mar 09 14:28:32 crc kubenswrapper[4722]: I0309 14:28:32.450228 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 09 14:28:32 crc kubenswrapper[4722]: I0309 14:28:32.495680 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 09 14:28:33 crc kubenswrapper[4722]: I0309 14:28:33.259877 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 14:28:33 crc kubenswrapper[4722]: I0309 14:28:33.286641 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/483d53e3-10df-4bea-97df-8d72228035b9-combined-ca-bundle\") pod \"483d53e3-10df-4bea-97df-8d72228035b9\" (UID: \"483d53e3-10df-4bea-97df-8d72228035b9\") " Mar 09 14:28:33 crc kubenswrapper[4722]: I0309 14:28:33.286765 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/483d53e3-10df-4bea-97df-8d72228035b9-run-httpd\") pod \"483d53e3-10df-4bea-97df-8d72228035b9\" (UID: \"483d53e3-10df-4bea-97df-8d72228035b9\") " Mar 09 14:28:33 crc kubenswrapper[4722]: I0309 14:28:33.286865 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/483d53e3-10df-4bea-97df-8d72228035b9-log-httpd\") pod \"483d53e3-10df-4bea-97df-8d72228035b9\" (UID: \"483d53e3-10df-4bea-97df-8d72228035b9\") " Mar 09 14:28:33 crc kubenswrapper[4722]: I0309 14:28:33.286918 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/483d53e3-10df-4bea-97df-8d72228035b9-sg-core-conf-yaml\") pod \"483d53e3-10df-4bea-97df-8d72228035b9\" (UID: \"483d53e3-10df-4bea-97df-8d72228035b9\") " Mar 09 14:28:33 crc kubenswrapper[4722]: I0309 14:28:33.286986 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/483d53e3-10df-4bea-97df-8d72228035b9-config-data\") pod \"483d53e3-10df-4bea-97df-8d72228035b9\" (UID: \"483d53e3-10df-4bea-97df-8d72228035b9\") " Mar 09 
14:28:33 crc kubenswrapper[4722]: I0309 14:28:33.287032 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/483d53e3-10df-4bea-97df-8d72228035b9-scripts\") pod \"483d53e3-10df-4bea-97df-8d72228035b9\" (UID: \"483d53e3-10df-4bea-97df-8d72228035b9\") " Mar 09 14:28:33 crc kubenswrapper[4722]: I0309 14:28:33.287251 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjqbx\" (UniqueName: \"kubernetes.io/projected/483d53e3-10df-4bea-97df-8d72228035b9-kube-api-access-mjqbx\") pod \"483d53e3-10df-4bea-97df-8d72228035b9\" (UID: \"483d53e3-10df-4bea-97df-8d72228035b9\") " Mar 09 14:28:33 crc kubenswrapper[4722]: I0309 14:28:33.287638 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/483d53e3-10df-4bea-97df-8d72228035b9-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "483d53e3-10df-4bea-97df-8d72228035b9" (UID: "483d53e3-10df-4bea-97df-8d72228035b9"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:28:33 crc kubenswrapper[4722]: I0309 14:28:33.287821 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/483d53e3-10df-4bea-97df-8d72228035b9-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "483d53e3-10df-4bea-97df-8d72228035b9" (UID: "483d53e3-10df-4bea-97df-8d72228035b9"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:28:33 crc kubenswrapper[4722]: I0309 14:28:33.288193 4722 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/483d53e3-10df-4bea-97df-8d72228035b9-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 14:28:33 crc kubenswrapper[4722]: I0309 14:28:33.288235 4722 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/483d53e3-10df-4bea-97df-8d72228035b9-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 14:28:33 crc kubenswrapper[4722]: I0309 14:28:33.302344 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/483d53e3-10df-4bea-97df-8d72228035b9-scripts" (OuterVolumeSpecName: "scripts") pod "483d53e3-10df-4bea-97df-8d72228035b9" (UID: "483d53e3-10df-4bea-97df-8d72228035b9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:28:33 crc kubenswrapper[4722]: I0309 14:28:33.339686 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/483d53e3-10df-4bea-97df-8d72228035b9-kube-api-access-mjqbx" (OuterVolumeSpecName: "kube-api-access-mjqbx") pod "483d53e3-10df-4bea-97df-8d72228035b9" (UID: "483d53e3-10df-4bea-97df-8d72228035b9"). InnerVolumeSpecName "kube-api-access-mjqbx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:28:33 crc kubenswrapper[4722]: I0309 14:28:33.362648 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/483d53e3-10df-4bea-97df-8d72228035b9-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "483d53e3-10df-4bea-97df-8d72228035b9" (UID: "483d53e3-10df-4bea-97df-8d72228035b9"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:28:33 crc kubenswrapper[4722]: I0309 14:28:33.395640 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjqbx\" (UniqueName: \"kubernetes.io/projected/483d53e3-10df-4bea-97df-8d72228035b9-kube-api-access-mjqbx\") on node \"crc\" DevicePath \"\"" Mar 09 14:28:33 crc kubenswrapper[4722]: I0309 14:28:33.395671 4722 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/483d53e3-10df-4bea-97df-8d72228035b9-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 09 14:28:33 crc kubenswrapper[4722]: I0309 14:28:33.395680 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/483d53e3-10df-4bea-97df-8d72228035b9-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 14:28:33 crc kubenswrapper[4722]: I0309 14:28:33.429506 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/483d53e3-10df-4bea-97df-8d72228035b9-config-data" (OuterVolumeSpecName: "config-data") pod "483d53e3-10df-4bea-97df-8d72228035b9" (UID: "483d53e3-10df-4bea-97df-8d72228035b9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:28:33 crc kubenswrapper[4722]: I0309 14:28:33.431336 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/483d53e3-10df-4bea-97df-8d72228035b9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "483d53e3-10df-4bea-97df-8d72228035b9" (UID: "483d53e3-10df-4bea-97df-8d72228035b9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:28:33 crc kubenswrapper[4722]: I0309 14:28:33.443140 4722 generic.go:334] "Generic (PLEG): container finished" podID="483d53e3-10df-4bea-97df-8d72228035b9" containerID="cb7f4f66a8c95faf66fe9599f65b1f6dcfc0d56b1d2d137055208101c01d8bd1" exitCode=0 Mar 09 14:28:33 crc kubenswrapper[4722]: I0309 14:28:33.443461 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 14:28:33 crc kubenswrapper[4722]: I0309 14:28:33.443822 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"483d53e3-10df-4bea-97df-8d72228035b9","Type":"ContainerDied","Data":"cb7f4f66a8c95faf66fe9599f65b1f6dcfc0d56b1d2d137055208101c01d8bd1"} Mar 09 14:28:33 crc kubenswrapper[4722]: I0309 14:28:33.443895 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"483d53e3-10df-4bea-97df-8d72228035b9","Type":"ContainerDied","Data":"d5f95064d66a6fb3eddaf5d28cec1c93924105afae37580608a47faacaf0d90a"} Mar 09 14:28:33 crc kubenswrapper[4722]: I0309 14:28:33.443917 4722 scope.go:117] "RemoveContainer" containerID="2da874e6414bc043795d2fe45330c4ff876d3402e208b61378c7a30e50c18a61" Mar 09 14:28:33 crc kubenswrapper[4722]: I0309 14:28:33.483244 4722 scope.go:117] "RemoveContainer" containerID="0009ddb824d3e48e1939ff0f61d50ec63d10781151ccaf88f95422c1b485e7e3" Mar 09 14:28:33 crc kubenswrapper[4722]: I0309 14:28:33.491261 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 14:28:33 crc kubenswrapper[4722]: I0309 14:28:33.498387 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/483d53e3-10df-4bea-97df-8d72228035b9-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 14:28:33 crc kubenswrapper[4722]: I0309 14:28:33.498432 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/483d53e3-10df-4bea-97df-8d72228035b9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:28:33 crc kubenswrapper[4722]: I0309 14:28:33.508771 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 09 14:28:33 crc kubenswrapper[4722]: I0309 14:28:33.526309 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 09 14:28:33 crc kubenswrapper[4722]: E0309 14:28:33.526791 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="483d53e3-10df-4bea-97df-8d72228035b9" containerName="ceilometer-notification-agent" Mar 09 14:28:33 crc kubenswrapper[4722]: I0309 14:28:33.526810 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="483d53e3-10df-4bea-97df-8d72228035b9" containerName="ceilometer-notification-agent" Mar 09 14:28:33 crc kubenswrapper[4722]: E0309 14:28:33.526839 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="483d53e3-10df-4bea-97df-8d72228035b9" containerName="proxy-httpd" Mar 09 14:28:33 crc kubenswrapper[4722]: I0309 14:28:33.526846 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="483d53e3-10df-4bea-97df-8d72228035b9" containerName="proxy-httpd" Mar 09 14:28:33 crc kubenswrapper[4722]: E0309 14:28:33.526855 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="483d53e3-10df-4bea-97df-8d72228035b9" containerName="ceilometer-central-agent" Mar 09 14:28:33 crc kubenswrapper[4722]: I0309 14:28:33.526861 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="483d53e3-10df-4bea-97df-8d72228035b9" containerName="ceilometer-central-agent" Mar 09 14:28:33 crc kubenswrapper[4722]: E0309 14:28:33.526872 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="483d53e3-10df-4bea-97df-8d72228035b9" containerName="sg-core" Mar 09 14:28:33 crc kubenswrapper[4722]: I0309 14:28:33.526878 4722 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="483d53e3-10df-4bea-97df-8d72228035b9" containerName="sg-core" Mar 09 14:28:33 crc kubenswrapper[4722]: I0309 14:28:33.527104 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="483d53e3-10df-4bea-97df-8d72228035b9" containerName="ceilometer-central-agent" Mar 09 14:28:33 crc kubenswrapper[4722]: I0309 14:28:33.527121 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="483d53e3-10df-4bea-97df-8d72228035b9" containerName="ceilometer-notification-agent" Mar 09 14:28:33 crc kubenswrapper[4722]: I0309 14:28:33.527143 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="483d53e3-10df-4bea-97df-8d72228035b9" containerName="proxy-httpd" Mar 09 14:28:33 crc kubenswrapper[4722]: I0309 14:28:33.527153 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="483d53e3-10df-4bea-97df-8d72228035b9" containerName="sg-core" Mar 09 14:28:33 crc kubenswrapper[4722]: I0309 14:28:33.529094 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 14:28:33 crc kubenswrapper[4722]: I0309 14:28:33.529740 4722 scope.go:117] "RemoveContainer" containerID="58372d3991297b65f350e20d6923a3bdb3c68c4d52579b01ceaffcc5e05c2dfd" Mar 09 14:28:33 crc kubenswrapper[4722]: I0309 14:28:33.532217 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 09 14:28:33 crc kubenswrapper[4722]: I0309 14:28:33.532396 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 09 14:28:33 crc kubenswrapper[4722]: I0309 14:28:33.566222 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 14:28:33 crc kubenswrapper[4722]: I0309 14:28:33.576147 4722 scope.go:117] "RemoveContainer" containerID="cb7f4f66a8c95faf66fe9599f65b1f6dcfc0d56b1d2d137055208101c01d8bd1" Mar 09 14:28:33 crc kubenswrapper[4722]: I0309 14:28:33.600892 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43c9ed1c-0628-4235-b8ff-31392e2215d8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"43c9ed1c-0628-4235-b8ff-31392e2215d8\") " pod="openstack/ceilometer-0" Mar 09 14:28:33 crc kubenswrapper[4722]: I0309 14:28:33.600969 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43c9ed1c-0628-4235-b8ff-31392e2215d8-scripts\") pod \"ceilometer-0\" (UID: \"43c9ed1c-0628-4235-b8ff-31392e2215d8\") " pod="openstack/ceilometer-0" Mar 09 14:28:33 crc kubenswrapper[4722]: I0309 14:28:33.601074 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/43c9ed1c-0628-4235-b8ff-31392e2215d8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"43c9ed1c-0628-4235-b8ff-31392e2215d8\") " pod="openstack/ceilometer-0" Mar 09 14:28:33 crc kubenswrapper[4722]: I0309 14:28:33.601112 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43c9ed1c-0628-4235-b8ff-31392e2215d8-config-data\") pod \"ceilometer-0\" (UID: \"43c9ed1c-0628-4235-b8ff-31392e2215d8\") " pod="openstack/ceilometer-0" Mar 09 14:28:33 crc kubenswrapper[4722]: I0309 14:28:33.601148 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/43c9ed1c-0628-4235-b8ff-31392e2215d8-log-httpd\") pod \"ceilometer-0\" (UID: \"43c9ed1c-0628-4235-b8ff-31392e2215d8\") " pod="openstack/ceilometer-0" Mar 09 14:28:33 crc kubenswrapper[4722]: I0309 14:28:33.601222 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xss6j\" (UniqueName: \"kubernetes.io/projected/43c9ed1c-0628-4235-b8ff-31392e2215d8-kube-api-access-xss6j\") pod \"ceilometer-0\" (UID: \"43c9ed1c-0628-4235-b8ff-31392e2215d8\") " pod="openstack/ceilometer-0" Mar 09 14:28:33 crc kubenswrapper[4722]: I0309 14:28:33.601248 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/43c9ed1c-0628-4235-b8ff-31392e2215d8-run-httpd\") pod \"ceilometer-0\" (UID: \"43c9ed1c-0628-4235-b8ff-31392e2215d8\") " pod="openstack/ceilometer-0" Mar 09 14:28:33 crc kubenswrapper[4722]: I0309 14:28:33.603327 4722 scope.go:117] "RemoveContainer" containerID="2da874e6414bc043795d2fe45330c4ff876d3402e208b61378c7a30e50c18a61" Mar 09 14:28:33 crc kubenswrapper[4722]: E0309 14:28:33.603788 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2da874e6414bc043795d2fe45330c4ff876d3402e208b61378c7a30e50c18a61\": container with ID starting with 2da874e6414bc043795d2fe45330c4ff876d3402e208b61378c7a30e50c18a61 not found: ID does not exist" containerID="2da874e6414bc043795d2fe45330c4ff876d3402e208b61378c7a30e50c18a61" Mar 09 14:28:33 crc kubenswrapper[4722]: I0309 14:28:33.603827 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2da874e6414bc043795d2fe45330c4ff876d3402e208b61378c7a30e50c18a61"} err="failed to get container status \"2da874e6414bc043795d2fe45330c4ff876d3402e208b61378c7a30e50c18a61\": rpc error: code = NotFound desc = could not find container \"2da874e6414bc043795d2fe45330c4ff876d3402e208b61378c7a30e50c18a61\": container with ID starting with 2da874e6414bc043795d2fe45330c4ff876d3402e208b61378c7a30e50c18a61 not found: ID does not exist" Mar 09 14:28:33 crc kubenswrapper[4722]: I0309 14:28:33.603849 4722 scope.go:117] "RemoveContainer" containerID="0009ddb824d3e48e1939ff0f61d50ec63d10781151ccaf88f95422c1b485e7e3" Mar 09 14:28:33 crc kubenswrapper[4722]: E0309 14:28:33.604058 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0009ddb824d3e48e1939ff0f61d50ec63d10781151ccaf88f95422c1b485e7e3\": container with ID starting with 0009ddb824d3e48e1939ff0f61d50ec63d10781151ccaf88f95422c1b485e7e3 not found: ID does not exist" containerID="0009ddb824d3e48e1939ff0f61d50ec63d10781151ccaf88f95422c1b485e7e3" Mar 09 14:28:33 crc kubenswrapper[4722]: I0309 14:28:33.604080 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0009ddb824d3e48e1939ff0f61d50ec63d10781151ccaf88f95422c1b485e7e3"} err="failed to get container status \"0009ddb824d3e48e1939ff0f61d50ec63d10781151ccaf88f95422c1b485e7e3\": rpc error: code = NotFound desc = could not find container \"0009ddb824d3e48e1939ff0f61d50ec63d10781151ccaf88f95422c1b485e7e3\": container with ID starting with 0009ddb824d3e48e1939ff0f61d50ec63d10781151ccaf88f95422c1b485e7e3 not found: ID does not exist" Mar 09 14:28:33 crc kubenswrapper[4722]: I0309 14:28:33.604092 4722 scope.go:117] "RemoveContainer" 
containerID="58372d3991297b65f350e20d6923a3bdb3c68c4d52579b01ceaffcc5e05c2dfd" Mar 09 14:28:33 crc kubenswrapper[4722]: E0309 14:28:33.604297 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58372d3991297b65f350e20d6923a3bdb3c68c4d52579b01ceaffcc5e05c2dfd\": container with ID starting with 58372d3991297b65f350e20d6923a3bdb3c68c4d52579b01ceaffcc5e05c2dfd not found: ID does not exist" containerID="58372d3991297b65f350e20d6923a3bdb3c68c4d52579b01ceaffcc5e05c2dfd" Mar 09 14:28:33 crc kubenswrapper[4722]: I0309 14:28:33.604319 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58372d3991297b65f350e20d6923a3bdb3c68c4d52579b01ceaffcc5e05c2dfd"} err="failed to get container status \"58372d3991297b65f350e20d6923a3bdb3c68c4d52579b01ceaffcc5e05c2dfd\": rpc error: code = NotFound desc = could not find container \"58372d3991297b65f350e20d6923a3bdb3c68c4d52579b01ceaffcc5e05c2dfd\": container with ID starting with 58372d3991297b65f350e20d6923a3bdb3c68c4d52579b01ceaffcc5e05c2dfd not found: ID does not exist" Mar 09 14:28:33 crc kubenswrapper[4722]: I0309 14:28:33.604332 4722 scope.go:117] "RemoveContainer" containerID="cb7f4f66a8c95faf66fe9599f65b1f6dcfc0d56b1d2d137055208101c01d8bd1" Mar 09 14:28:33 crc kubenswrapper[4722]: E0309 14:28:33.604768 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb7f4f66a8c95faf66fe9599f65b1f6dcfc0d56b1d2d137055208101c01d8bd1\": container with ID starting with cb7f4f66a8c95faf66fe9599f65b1f6dcfc0d56b1d2d137055208101c01d8bd1 not found: ID does not exist" containerID="cb7f4f66a8c95faf66fe9599f65b1f6dcfc0d56b1d2d137055208101c01d8bd1" Mar 09 14:28:33 crc kubenswrapper[4722]: I0309 14:28:33.604799 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb7f4f66a8c95faf66fe9599f65b1f6dcfc0d56b1d2d137055208101c01d8bd1"} err="failed to get container status \"cb7f4f66a8c95faf66fe9599f65b1f6dcfc0d56b1d2d137055208101c01d8bd1\": rpc error: code = NotFound desc = could not find container \"cb7f4f66a8c95faf66fe9599f65b1f6dcfc0d56b1d2d137055208101c01d8bd1\": container with ID starting with cb7f4f66a8c95faf66fe9599f65b1f6dcfc0d56b1d2d137055208101c01d8bd1 not found: ID does not exist" Mar 09 14:28:33 crc kubenswrapper[4722]: I0309 14:28:33.703907 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43c9ed1c-0628-4235-b8ff-31392e2215d8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"43c9ed1c-0628-4235-b8ff-31392e2215d8\") " pod="openstack/ceilometer-0" Mar 09 14:28:33 crc kubenswrapper[4722]: I0309 14:28:33.703999 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43c9ed1c-0628-4235-b8ff-31392e2215d8-scripts\") pod \"ceilometer-0\" (UID: \"43c9ed1c-0628-4235-b8ff-31392e2215d8\") " pod="openstack/ceilometer-0" Mar 09 14:28:33 crc kubenswrapper[4722]: I0309 14:28:33.704581 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/43c9ed1c-0628-4235-b8ff-31392e2215d8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"43c9ed1c-0628-4235-b8ff-31392e2215d8\") " pod="openstack/ceilometer-0" Mar 09 14:28:33 crc kubenswrapper[4722]: I0309 14:28:33.704623 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43c9ed1c-0628-4235-b8ff-31392e2215d8-config-data\") pod \"ceilometer-0\" (UID: \"43c9ed1c-0628-4235-b8ff-31392e2215d8\") " pod="openstack/ceilometer-0" Mar 09 14:28:33 crc kubenswrapper[4722]: I0309 14:28:33.704667 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/43c9ed1c-0628-4235-b8ff-31392e2215d8-log-httpd\") pod \"ceilometer-0\" (UID: \"43c9ed1c-0628-4235-b8ff-31392e2215d8\") " pod="openstack/ceilometer-0" Mar 09 14:28:33 crc kubenswrapper[4722]: I0309 14:28:33.704704 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xss6j\" (UniqueName: \"kubernetes.io/projected/43c9ed1c-0628-4235-b8ff-31392e2215d8-kube-api-access-xss6j\") pod \"ceilometer-0\" (UID: \"43c9ed1c-0628-4235-b8ff-31392e2215d8\") " pod="openstack/ceilometer-0" Mar 09 14:28:33 crc kubenswrapper[4722]: I0309 14:28:33.704729 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/43c9ed1c-0628-4235-b8ff-31392e2215d8-run-httpd\") pod \"ceilometer-0\" (UID: \"43c9ed1c-0628-4235-b8ff-31392e2215d8\") " pod="openstack/ceilometer-0" Mar 09 14:28:33 crc kubenswrapper[4722]: I0309 14:28:33.705325 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/43c9ed1c-0628-4235-b8ff-31392e2215d8-run-httpd\") pod \"ceilometer-0\" (UID: \"43c9ed1c-0628-4235-b8ff-31392e2215d8\") " pod="openstack/ceilometer-0" Mar 09 14:28:33 crc kubenswrapper[4722]: I0309 14:28:33.705647 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/43c9ed1c-0628-4235-b8ff-31392e2215d8-log-httpd\") pod \"ceilometer-0\" (UID: \"43c9ed1c-0628-4235-b8ff-31392e2215d8\") " pod="openstack/ceilometer-0" Mar 09 14:28:33 crc kubenswrapper[4722]: I0309 14:28:33.708980 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43c9ed1c-0628-4235-b8ff-31392e2215d8-config-data\") pod \"ceilometer-0\" (UID: \"43c9ed1c-0628-4235-b8ff-31392e2215d8\") " pod="openstack/ceilometer-0" Mar 09 14:28:33 crc kubenswrapper[4722]: I0309 14:28:33.709317 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43c9ed1c-0628-4235-b8ff-31392e2215d8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"43c9ed1c-0628-4235-b8ff-31392e2215d8\") " pod="openstack/ceilometer-0" Mar 09 14:28:33 crc kubenswrapper[4722]: I0309 14:28:33.710162 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43c9ed1c-0628-4235-b8ff-31392e2215d8-scripts\") pod \"ceilometer-0\" (UID: \"43c9ed1c-0628-4235-b8ff-31392e2215d8\") " pod="openstack/ceilometer-0" Mar 09 14:28:33 crc kubenswrapper[4722]: I0309 14:28:33.715727 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/43c9ed1c-0628-4235-b8ff-31392e2215d8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"43c9ed1c-0628-4235-b8ff-31392e2215d8\") " pod="openstack/ceilometer-0" Mar 09 14:28:33 crc kubenswrapper[4722]: I0309 14:28:33.726920 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xss6j\" (UniqueName: 
\"kubernetes.io/projected/43c9ed1c-0628-4235-b8ff-31392e2215d8-kube-api-access-xss6j\") pod \"ceilometer-0\" (UID: \"43c9ed1c-0628-4235-b8ff-31392e2215d8\") " pod="openstack/ceilometer-0" Mar 09 14:28:33 crc kubenswrapper[4722]: I0309 14:28:33.857888 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 14:28:34 crc kubenswrapper[4722]: I0309 14:28:34.175675 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="483d53e3-10df-4bea-97df-8d72228035b9" path="/var/lib/kubelet/pods/483d53e3-10df-4bea-97df-8d72228035b9/volumes" Mar 09 14:28:34 crc kubenswrapper[4722]: I0309 14:28:34.434013 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 14:28:34 crc kubenswrapper[4722]: I0309 14:28:34.555332 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 14:28:34 crc kubenswrapper[4722]: W0309 14:28:34.563217 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43c9ed1c_0628_4235_b8ff_31392e2215d8.slice/crio-30fdc62be671364d698d195ad4eb2a48ff3f0fc8b7a9e8ad2215a7923d9567f4 WatchSource:0}: Error finding container 30fdc62be671364d698d195ad4eb2a48ff3f0fc8b7a9e8ad2215a7923d9567f4: Status 404 returned error can't find the container with id 30fdc62be671364d698d195ad4eb2a48ff3f0fc8b7a9e8ad2215a7923d9567f4 Mar 09 14:28:35 crc kubenswrapper[4722]: I0309 14:28:35.106128 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 09 14:28:35 crc kubenswrapper[4722]: I0309 14:28:35.106792 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 09 14:28:35 crc kubenswrapper[4722]: I0309 14:28:35.468590 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"9226bef9-b3b3-4f10-8e02-6fbc710bc4f3","Type":"ContainerStarted","Data":"f57ad0ec10555687fa3af79581c9163214066df556cd0b5525e8a43d7e191299"} Mar 09 14:28:35 crc kubenswrapper[4722]: I0309 14:28:35.471336 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"43c9ed1c-0628-4235-b8ff-31392e2215d8","Type":"ContainerStarted","Data":"d463ee9894c1202165df79fd362814ec517ef527a2efd4cdd8ef55fba94b2b31"} Mar 09 14:28:35 crc kubenswrapper[4722]: I0309 14:28:35.471371 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"43c9ed1c-0628-4235-b8ff-31392e2215d8","Type":"ContainerStarted","Data":"30fdc62be671364d698d195ad4eb2a48ff3f0fc8b7a9e8ad2215a7923d9567f4"} Mar 09 14:28:36 crc kubenswrapper[4722]: I0309 14:28:36.188672 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="9c58393a-f27d-4e46-87ac-352b1370c295" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.0:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 14:28:36 crc kubenswrapper[4722]: I0309 14:28:36.189585 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="9c58393a-f27d-4e46-87ac-352b1370c295" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.0:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 14:28:37 crc kubenswrapper[4722]: I0309 14:28:37.502938 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" 
event={"ID":"9226bef9-b3b3-4f10-8e02-6fbc710bc4f3","Type":"ContainerStarted","Data":"7d7d2b8b808b7e55d128afce9189980d53bf001ea1c56ed1149a3958838025b3"} Mar 09 14:28:37 crc kubenswrapper[4722]: I0309 14:28:37.503090 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="9226bef9-b3b3-4f10-8e02-6fbc710bc4f3" containerName="aodh-api" containerID="cri-o://6d9fb16100b3f52f35ea0198c84baf23497e15348c942b0f5dadd6b8d8a1b0de" gracePeriod=30 Mar 09 14:28:37 crc kubenswrapper[4722]: I0309 14:28:37.503101 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="9226bef9-b3b3-4f10-8e02-6fbc710bc4f3" containerName="aodh-notifier" containerID="cri-o://f57ad0ec10555687fa3af79581c9163214066df556cd0b5525e8a43d7e191299" gracePeriod=30 Mar 09 14:28:37 crc kubenswrapper[4722]: I0309 14:28:37.503119 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="9226bef9-b3b3-4f10-8e02-6fbc710bc4f3" containerName="aodh-evaluator" containerID="cri-o://e054dead3688eeb85afccf93d975a6293fdf7fdde17ec109bf4977b2b4adb687" gracePeriod=30 Mar 09 14:28:37 crc kubenswrapper[4722]: I0309 14:28:37.503164 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="9226bef9-b3b3-4f10-8e02-6fbc710bc4f3" containerName="aodh-listener" containerID="cri-o://7d7d2b8b808b7e55d128afce9189980d53bf001ea1c56ed1149a3958838025b3" gracePeriod=30 Mar 09 14:28:37 crc kubenswrapper[4722]: I0309 14:28:37.513545 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"43c9ed1c-0628-4235-b8ff-31392e2215d8","Type":"ContainerStarted","Data":"1e85a6c73337ea17be4fffea4fa219e03d81e53c4166b213198e319fcc3c288d"} Mar 09 14:28:37 crc kubenswrapper[4722]: I0309 14:28:37.554686 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.215276391 podStartE2EDuration="9.554664243s" podCreationTimestamp="2026-03-09 14:28:28 +0000 UTC" firstStartedPulling="2026-03-09 14:28:29.214053382 +0000 UTC m=+1549.769621958" lastFinishedPulling="2026-03-09 14:28:36.553441234 +0000 UTC m=+1557.109009810" observedRunningTime="2026-03-09 14:28:37.531741371 +0000 UTC m=+1558.087309947" watchObservedRunningTime="2026-03-09 14:28:37.554664243 +0000 UTC m=+1558.110232829" Mar 09 14:28:38 crc kubenswrapper[4722]: I0309 14:28:38.528953 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"43c9ed1c-0628-4235-b8ff-31392e2215d8","Type":"ContainerStarted","Data":"515ccb7dd4ea0672c37ef29269279da8c7fefac77a047757893eb4540d26fe2f"} Mar 09 14:28:38 crc kubenswrapper[4722]: I0309 14:28:38.534233 4722 generic.go:334] "Generic (PLEG): container finished" podID="9226bef9-b3b3-4f10-8e02-6fbc710bc4f3" containerID="f57ad0ec10555687fa3af79581c9163214066df556cd0b5525e8a43d7e191299" exitCode=0 Mar 09 14:28:38 crc kubenswrapper[4722]: I0309 14:28:38.534269 4722 generic.go:334] "Generic (PLEG): container finished" podID="9226bef9-b3b3-4f10-8e02-6fbc710bc4f3" containerID="e054dead3688eeb85afccf93d975a6293fdf7fdde17ec109bf4977b2b4adb687" exitCode=0 Mar 09 14:28:38 crc kubenswrapper[4722]: I0309 14:28:38.534281 4722 generic.go:334] "Generic (PLEG): container finished" podID="9226bef9-b3b3-4f10-8e02-6fbc710bc4f3" containerID="6d9fb16100b3f52f35ea0198c84baf23497e15348c942b0f5dadd6b8d8a1b0de" exitCode=0 Mar 09 14:28:38 crc kubenswrapper[4722]: I0309 14:28:38.534301 4722 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"9226bef9-b3b3-4f10-8e02-6fbc710bc4f3","Type":"ContainerDied","Data":"f57ad0ec10555687fa3af79581c9163214066df556cd0b5525e8a43d7e191299"} Mar 09 14:28:38 crc kubenswrapper[4722]: I0309 14:28:38.534329 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"9226bef9-b3b3-4f10-8e02-6fbc710bc4f3","Type":"ContainerDied","Data":"e054dead3688eeb85afccf93d975a6293fdf7fdde17ec109bf4977b2b4adb687"} Mar 09 14:28:38 crc kubenswrapper[4722]: I0309 14:28:38.534340 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"9226bef9-b3b3-4f10-8e02-6fbc710bc4f3","Type":"ContainerDied","Data":"6d9fb16100b3f52f35ea0198c84baf23497e15348c942b0f5dadd6b8d8a1b0de"} Mar 09 14:28:40 crc kubenswrapper[4722]: I0309 14:28:40.566704 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"43c9ed1c-0628-4235-b8ff-31392e2215d8","Type":"ContainerStarted","Data":"f3a99dc4529ead0773284c49cdc3ea4372abdd3c78fec592daed2f4423c027b5"} Mar 09 14:28:40 crc kubenswrapper[4722]: I0309 14:28:40.566999 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="43c9ed1c-0628-4235-b8ff-31392e2215d8" containerName="proxy-httpd" containerID="cri-o://f3a99dc4529ead0773284c49cdc3ea4372abdd3c78fec592daed2f4423c027b5" gracePeriod=30 Mar 09 14:28:40 crc kubenswrapper[4722]: I0309 14:28:40.567211 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="43c9ed1c-0628-4235-b8ff-31392e2215d8" containerName="sg-core" containerID="cri-o://515ccb7dd4ea0672c37ef29269279da8c7fefac77a047757893eb4540d26fe2f" gracePeriod=30 Mar 09 14:28:40 crc kubenswrapper[4722]: I0309 14:28:40.567226 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="43c9ed1c-0628-4235-b8ff-31392e2215d8" containerName="ceilometer-notification-agent" containerID="cri-o://1e85a6c73337ea17be4fffea4fa219e03d81e53c4166b213198e319fcc3c288d" gracePeriod=30 Mar 09 14:28:40 crc kubenswrapper[4722]: I0309 14:28:40.566971 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="43c9ed1c-0628-4235-b8ff-31392e2215d8" containerName="ceilometer-central-agent" containerID="cri-o://d463ee9894c1202165df79fd362814ec517ef527a2efd4cdd8ef55fba94b2b31" gracePeriod=30 Mar 09 14:28:40 crc kubenswrapper[4722]: I0309 14:28:40.569992 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 09 14:28:40 crc kubenswrapper[4722]: I0309 14:28:40.609330 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.627294838 podStartE2EDuration="7.609305685s" podCreationTimestamp="2026-03-09 14:28:33 +0000 UTC" firstStartedPulling="2026-03-09 14:28:34.565719434 +0000 UTC m=+1555.121288010" lastFinishedPulling="2026-03-09 14:28:39.547730241 +0000 UTC m=+1560.103298857" observedRunningTime="2026-03-09 14:28:40.600230654 +0000 UTC m=+1561.155799230" watchObservedRunningTime="2026-03-09 14:28:40.609305685 +0000 UTC m=+1561.164874261" Mar 09 14:28:41 crc kubenswrapper[4722]: I0309 14:28:41.601968 4722 generic.go:334] "Generic (PLEG): container finished" podID="43c9ed1c-0628-4235-b8ff-31392e2215d8" containerID="f3a99dc4529ead0773284c49cdc3ea4372abdd3c78fec592daed2f4423c027b5" exitCode=0 Mar 09 14:28:41 crc 
kubenswrapper[4722]: I0309 14:28:41.603131 4722 generic.go:334] "Generic (PLEG): container finished" podID="43c9ed1c-0628-4235-b8ff-31392e2215d8" containerID="515ccb7dd4ea0672c37ef29269279da8c7fefac77a047757893eb4540d26fe2f" exitCode=2 Mar 09 14:28:41 crc kubenswrapper[4722]: I0309 14:28:41.603256 4722 generic.go:334] "Generic (PLEG): container finished" podID="43c9ed1c-0628-4235-b8ff-31392e2215d8" containerID="1e85a6c73337ea17be4fffea4fa219e03d81e53c4166b213198e319fcc3c288d" exitCode=0 Mar 09 14:28:41 crc kubenswrapper[4722]: I0309 14:28:41.602059 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"43c9ed1c-0628-4235-b8ff-31392e2215d8","Type":"ContainerDied","Data":"f3a99dc4529ead0773284c49cdc3ea4372abdd3c78fec592daed2f4423c027b5"} Mar 09 14:28:41 crc kubenswrapper[4722]: I0309 14:28:41.603421 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"43c9ed1c-0628-4235-b8ff-31392e2215d8","Type":"ContainerDied","Data":"515ccb7dd4ea0672c37ef29269279da8c7fefac77a047757893eb4540d26fe2f"} Mar 09 14:28:41 crc kubenswrapper[4722]: I0309 14:28:41.603488 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"43c9ed1c-0628-4235-b8ff-31392e2215d8","Type":"ContainerDied","Data":"1e85a6c73337ea17be4fffea4fa219e03d81e53c4166b213198e319fcc3c288d"} Mar 09 14:28:42 crc kubenswrapper[4722]: I0309 14:28:42.485348 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 14:28:42 crc kubenswrapper[4722]: I0309 14:28:42.644002 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xss6j\" (UniqueName: \"kubernetes.io/projected/43c9ed1c-0628-4235-b8ff-31392e2215d8-kube-api-access-xss6j\") pod \"43c9ed1c-0628-4235-b8ff-31392e2215d8\" (UID: \"43c9ed1c-0628-4235-b8ff-31392e2215d8\") " Mar 09 14:28:42 crc kubenswrapper[4722]: I0309 14:28:42.644062 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/43c9ed1c-0628-4235-b8ff-31392e2215d8-sg-core-conf-yaml\") pod \"43c9ed1c-0628-4235-b8ff-31392e2215d8\" (UID: \"43c9ed1c-0628-4235-b8ff-31392e2215d8\") " Mar 09 14:28:42 crc kubenswrapper[4722]: I0309 14:28:42.644146 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/43c9ed1c-0628-4235-b8ff-31392e2215d8-run-httpd\") pod \"43c9ed1c-0628-4235-b8ff-31392e2215d8\" (UID: \"43c9ed1c-0628-4235-b8ff-31392e2215d8\") " Mar 09 14:28:42 crc kubenswrapper[4722]: I0309 14:28:42.644222 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/43c9ed1c-0628-4235-b8ff-31392e2215d8-log-httpd\") pod \"43c9ed1c-0628-4235-b8ff-31392e2215d8\" (UID: \"43c9ed1c-0628-4235-b8ff-31392e2215d8\") " Mar 09 14:28:42 crc kubenswrapper[4722]: I0309 14:28:42.644450 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43c9ed1c-0628-4235-b8ff-31392e2215d8-scripts\") pod \"43c9ed1c-0628-4235-b8ff-31392e2215d8\" (UID: \"43c9ed1c-0628-4235-b8ff-31392e2215d8\") " Mar 09 14:28:42 crc kubenswrapper[4722]: I0309 14:28:42.644577 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/43c9ed1c-0628-4235-b8ff-31392e2215d8-combined-ca-bundle\") pod \"43c9ed1c-0628-4235-b8ff-31392e2215d8\" (UID: \"43c9ed1c-0628-4235-b8ff-31392e2215d8\") " Mar 09 14:28:42 crc kubenswrapper[4722]: I0309 14:28:42.644615 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43c9ed1c-0628-4235-b8ff-31392e2215d8-config-data\") pod \"43c9ed1c-0628-4235-b8ff-31392e2215d8\" (UID: \"43c9ed1c-0628-4235-b8ff-31392e2215d8\") " Mar 09 14:28:42 crc kubenswrapper[4722]: I0309 14:28:42.644978 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43c9ed1c-0628-4235-b8ff-31392e2215d8-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "43c9ed1c-0628-4235-b8ff-31392e2215d8" (UID: "43c9ed1c-0628-4235-b8ff-31392e2215d8"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:28:42 crc kubenswrapper[4722]: I0309 14:28:42.646862 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43c9ed1c-0628-4235-b8ff-31392e2215d8-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "43c9ed1c-0628-4235-b8ff-31392e2215d8" (UID: "43c9ed1c-0628-4235-b8ff-31392e2215d8"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:28:42 crc kubenswrapper[4722]: I0309 14:28:42.650733 4722 generic.go:334] "Generic (PLEG): container finished" podID="43c9ed1c-0628-4235-b8ff-31392e2215d8" containerID="d463ee9894c1202165df79fd362814ec517ef527a2efd4cdd8ef55fba94b2b31" exitCode=0 Mar 09 14:28:42 crc kubenswrapper[4722]: I0309 14:28:42.650784 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"43c9ed1c-0628-4235-b8ff-31392e2215d8","Type":"ContainerDied","Data":"d463ee9894c1202165df79fd362814ec517ef527a2efd4cdd8ef55fba94b2b31"} Mar 09 14:28:42 crc kubenswrapper[4722]: I0309 14:28:42.650850 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"43c9ed1c-0628-4235-b8ff-31392e2215d8","Type":"ContainerDied","Data":"30fdc62be671364d698d195ad4eb2a48ff3f0fc8b7a9e8ad2215a7923d9567f4"} Mar 09 14:28:42 crc kubenswrapper[4722]: I0309 14:28:42.650875 4722 scope.go:117] "RemoveContainer" containerID="f3a99dc4529ead0773284c49cdc3ea4372abdd3c78fec592daed2f4423c027b5" Mar 09 14:28:42 crc kubenswrapper[4722]: I0309 14:28:42.651108 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 14:28:42 crc kubenswrapper[4722]: I0309 14:28:42.675914 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43c9ed1c-0628-4235-b8ff-31392e2215d8-kube-api-access-xss6j" (OuterVolumeSpecName: "kube-api-access-xss6j") pod "43c9ed1c-0628-4235-b8ff-31392e2215d8" (UID: "43c9ed1c-0628-4235-b8ff-31392e2215d8"). InnerVolumeSpecName "kube-api-access-xss6j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:28:42 crc kubenswrapper[4722]: I0309 14:28:42.705155 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43c9ed1c-0628-4235-b8ff-31392e2215d8-scripts" (OuterVolumeSpecName: "scripts") pod "43c9ed1c-0628-4235-b8ff-31392e2215d8" (UID: "43c9ed1c-0628-4235-b8ff-31392e2215d8"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:28:42 crc kubenswrapper[4722]: I0309 14:28:42.737092 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43c9ed1c-0628-4235-b8ff-31392e2215d8-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "43c9ed1c-0628-4235-b8ff-31392e2215d8" (UID: "43c9ed1c-0628-4235-b8ff-31392e2215d8"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:28:42 crc kubenswrapper[4722]: I0309 14:28:42.747278 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43c9ed1c-0628-4235-b8ff-31392e2215d8-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 14:28:42 crc kubenswrapper[4722]: I0309 14:28:42.747315 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xss6j\" (UniqueName: \"kubernetes.io/projected/43c9ed1c-0628-4235-b8ff-31392e2215d8-kube-api-access-xss6j\") on node \"crc\" DevicePath \"\"" Mar 09 14:28:42 crc kubenswrapper[4722]: I0309 14:28:42.747327 4722 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/43c9ed1c-0628-4235-b8ff-31392e2215d8-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 09 14:28:42 crc kubenswrapper[4722]: I0309 14:28:42.747335 4722 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/43c9ed1c-0628-4235-b8ff-31392e2215d8-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 14:28:42 crc kubenswrapper[4722]: I0309 14:28:42.747344 4722 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/43c9ed1c-0628-4235-b8ff-31392e2215d8-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 14:28:42 crc kubenswrapper[4722]: I0309 14:28:42.790292 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43c9ed1c-0628-4235-b8ff-31392e2215d8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "43c9ed1c-0628-4235-b8ff-31392e2215d8" (UID: "43c9ed1c-0628-4235-b8ff-31392e2215d8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:28:42 crc kubenswrapper[4722]: I0309 14:28:42.809347 4722 scope.go:117] "RemoveContainer" containerID="515ccb7dd4ea0672c37ef29269279da8c7fefac77a047757893eb4540d26fe2f" Mar 09 14:28:42 crc kubenswrapper[4722]: I0309 14:28:42.833305 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43c9ed1c-0628-4235-b8ff-31392e2215d8-config-data" (OuterVolumeSpecName: "config-data") pod "43c9ed1c-0628-4235-b8ff-31392e2215d8" (UID: "43c9ed1c-0628-4235-b8ff-31392e2215d8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:28:42 crc kubenswrapper[4722]: I0309 14:28:42.835965 4722 scope.go:117] "RemoveContainer" containerID="1e85a6c73337ea17be4fffea4fa219e03d81e53c4166b213198e319fcc3c288d" Mar 09 14:28:42 crc kubenswrapper[4722]: I0309 14:28:42.851062 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43c9ed1c-0628-4235-b8ff-31392e2215d8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:28:42 crc kubenswrapper[4722]: I0309 14:28:42.851097 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43c9ed1c-0628-4235-b8ff-31392e2215d8-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 14:28:42 crc kubenswrapper[4722]: I0309 14:28:42.989843 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 14:28:42 crc kubenswrapper[4722]: I0309 14:28:42.990498 4722 scope.go:117] "RemoveContainer" containerID="d463ee9894c1202165df79fd362814ec517ef527a2efd4cdd8ef55fba94b2b31" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.004929 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.048292 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 09 14:28:43 crc kubenswrapper[4722]: E0309 14:28:43.048988 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43c9ed1c-0628-4235-b8ff-31392e2215d8" containerName="ceilometer-notification-agent" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.049015 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="43c9ed1c-0628-4235-b8ff-31392e2215d8" containerName="ceilometer-notification-agent" Mar 09 14:28:43 crc kubenswrapper[4722]: E0309 14:28:43.049039 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43c9ed1c-0628-4235-b8ff-31392e2215d8" containerName="ceilometer-central-agent" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.049047 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="43c9ed1c-0628-4235-b8ff-31392e2215d8" containerName="ceilometer-central-agent" Mar 09 14:28:43 crc kubenswrapper[4722]: E0309 14:28:43.049077 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43c9ed1c-0628-4235-b8ff-31392e2215d8" containerName="proxy-httpd" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.049086 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="43c9ed1c-0628-4235-b8ff-31392e2215d8" containerName="proxy-httpd" Mar 09 14:28:43 crc kubenswrapper[4722]: E0309 14:28:43.049113 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43c9ed1c-0628-4235-b8ff-31392e2215d8" containerName="sg-core" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.049121 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="43c9ed1c-0628-4235-b8ff-31392e2215d8" containerName="sg-core" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.049417 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="43c9ed1c-0628-4235-b8ff-31392e2215d8" containerName="ceilometer-notification-agent" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.049447 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="43c9ed1c-0628-4235-b8ff-31392e2215d8" containerName="ceilometer-central-agent" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.049471 4722 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="43c9ed1c-0628-4235-b8ff-31392e2215d8" containerName="proxy-httpd" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.049487 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="43c9ed1c-0628-4235-b8ff-31392e2215d8" containerName="sg-core" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.051986 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.053826 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.053872 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.064046 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.065826 4722 scope.go:117] "RemoveContainer" containerID="f3a99dc4529ead0773284c49cdc3ea4372abdd3c78fec592daed2f4423c027b5" Mar 09 14:28:43 crc kubenswrapper[4722]: E0309 14:28:43.066717 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3a99dc4529ead0773284c49cdc3ea4372abdd3c78fec592daed2f4423c027b5\": container with ID starting with f3a99dc4529ead0773284c49cdc3ea4372abdd3c78fec592daed2f4423c027b5 not found: ID does not exist" containerID="f3a99dc4529ead0773284c49cdc3ea4372abdd3c78fec592daed2f4423c027b5" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.066753 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3a99dc4529ead0773284c49cdc3ea4372abdd3c78fec592daed2f4423c027b5"} err="failed to get container status \"f3a99dc4529ead0773284c49cdc3ea4372abdd3c78fec592daed2f4423c027b5\": rpc error: code = NotFound desc = could not find container \"f3a99dc4529ead0773284c49cdc3ea4372abdd3c78fec592daed2f4423c027b5\": container with ID starting with f3a99dc4529ead0773284c49cdc3ea4372abdd3c78fec592daed2f4423c027b5 not found: ID does not exist" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.066772 4722 scope.go:117] "RemoveContainer" containerID="515ccb7dd4ea0672c37ef29269279da8c7fefac77a047757893eb4540d26fe2f" Mar 09 14:28:43 crc kubenswrapper[4722]: E0309 14:28:43.067490 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"515ccb7dd4ea0672c37ef29269279da8c7fefac77a047757893eb4540d26fe2f\": container with ID starting with 515ccb7dd4ea0672c37ef29269279da8c7fefac77a047757893eb4540d26fe2f not found: ID does not exist" containerID="515ccb7dd4ea0672c37ef29269279da8c7fefac77a047757893eb4540d26fe2f" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.067521 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"515ccb7dd4ea0672c37ef29269279da8c7fefac77a047757893eb4540d26fe2f"} err="failed to get container status \"515ccb7dd4ea0672c37ef29269279da8c7fefac77a047757893eb4540d26fe2f\": rpc error: code = NotFound desc = could not find container \"515ccb7dd4ea0672c37ef29269279da8c7fefac77a047757893eb4540d26fe2f\": container with ID starting with 515ccb7dd4ea0672c37ef29269279da8c7fefac77a047757893eb4540d26fe2f not found: ID does not exist" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.067535 4722 scope.go:117] "RemoveContainer" 
containerID="1e85a6c73337ea17be4fffea4fa219e03d81e53c4166b213198e319fcc3c288d" Mar 09 14:28:43 crc kubenswrapper[4722]: E0309 14:28:43.068467 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e85a6c73337ea17be4fffea4fa219e03d81e53c4166b213198e319fcc3c288d\": container with ID starting with 1e85a6c73337ea17be4fffea4fa219e03d81e53c4166b213198e319fcc3c288d not found: ID does not exist" containerID="1e85a6c73337ea17be4fffea4fa219e03d81e53c4166b213198e319fcc3c288d" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.068523 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e85a6c73337ea17be4fffea4fa219e03d81e53c4166b213198e319fcc3c288d"} err="failed to get container status \"1e85a6c73337ea17be4fffea4fa219e03d81e53c4166b213198e319fcc3c288d\": rpc error: code = NotFound desc = could not find container \"1e85a6c73337ea17be4fffea4fa219e03d81e53c4166b213198e319fcc3c288d\": container with ID starting with 1e85a6c73337ea17be4fffea4fa219e03d81e53c4166b213198e319fcc3c288d not found: ID does not exist" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.068561 4722 scope.go:117] "RemoveContainer" containerID="d463ee9894c1202165df79fd362814ec517ef527a2efd4cdd8ef55fba94b2b31" Mar 09 14:28:43 crc kubenswrapper[4722]: E0309 14:28:43.070047 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d463ee9894c1202165df79fd362814ec517ef527a2efd4cdd8ef55fba94b2b31\": container with ID starting with d463ee9894c1202165df79fd362814ec517ef527a2efd4cdd8ef55fba94b2b31 not found: ID does not exist" containerID="d463ee9894c1202165df79fd362814ec517ef527a2efd4cdd8ef55fba94b2b31" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.070090 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d463ee9894c1202165df79fd362814ec517ef527a2efd4cdd8ef55fba94b2b31"} err="failed to get container status \"d463ee9894c1202165df79fd362814ec517ef527a2efd4cdd8ef55fba94b2b31\": rpc error: code = NotFound desc = could not find container \"d463ee9894c1202165df79fd362814ec517ef527a2efd4cdd8ef55fba94b2b31\": container with ID starting with d463ee9894c1202165df79fd362814ec517ef527a2efd4cdd8ef55fba94b2b31 not found: ID does not exist" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.158032 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08c4782a-e6b2-459f-94ff-a215e159f733-config-data\") pod \"ceilometer-0\" (UID: \"08c4782a-e6b2-459f-94ff-a215e159f733\") " pod="openstack/ceilometer-0" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.158595 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08c4782a-e6b2-459f-94ff-a215e159f733-scripts\") pod \"ceilometer-0\" (UID: \"08c4782a-e6b2-459f-94ff-a215e159f733\") " pod="openstack/ceilometer-0" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.158690 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45qg7\" (UniqueName: \"kubernetes.io/projected/08c4782a-e6b2-459f-94ff-a215e159f733-kube-api-access-45qg7\") pod \"ceilometer-0\" (UID: \"08c4782a-e6b2-459f-94ff-a215e159f733\") " pod="openstack/ceilometer-0" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.158868 
4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/08c4782a-e6b2-459f-94ff-a215e159f733-log-httpd\") pod \"ceilometer-0\" (UID: \"08c4782a-e6b2-459f-94ff-a215e159f733\") " pod="openstack/ceilometer-0" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.159019 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08c4782a-e6b2-459f-94ff-a215e159f733-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"08c4782a-e6b2-459f-94ff-a215e159f733\") " pod="openstack/ceilometer-0" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.159068 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/08c4782a-e6b2-459f-94ff-a215e159f733-run-httpd\") pod \"ceilometer-0\" (UID: \"08c4782a-e6b2-459f-94ff-a215e159f733\") " pod="openstack/ceilometer-0" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.159356 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/08c4782a-e6b2-459f-94ff-a215e159f733-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"08c4782a-e6b2-459f-94ff-a215e159f733\") " pod="openstack/ceilometer-0" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.262744 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/08c4782a-e6b2-459f-94ff-a215e159f733-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"08c4782a-e6b2-459f-94ff-a215e159f733\") " pod="openstack/ceilometer-0" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.262980 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08c4782a-e6b2-459f-94ff-a215e159f733-config-data\") pod \"ceilometer-0\" (UID: \"08c4782a-e6b2-459f-94ff-a215e159f733\") " pod="openstack/ceilometer-0" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.263237 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08c4782a-e6b2-459f-94ff-a215e159f733-scripts\") pod \"ceilometer-0\" (UID: \"08c4782a-e6b2-459f-94ff-a215e159f733\") " pod="openstack/ceilometer-0" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.263298 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45qg7\" (UniqueName: \"kubernetes.io/projected/08c4782a-e6b2-459f-94ff-a215e159f733-kube-api-access-45qg7\") pod \"ceilometer-0\" (UID: \"08c4782a-e6b2-459f-94ff-a215e159f733\") " pod="openstack/ceilometer-0" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.263398 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/08c4782a-e6b2-459f-94ff-a215e159f733-log-httpd\") pod \"ceilometer-0\" (UID: \"08c4782a-e6b2-459f-94ff-a215e159f733\") " pod="openstack/ceilometer-0" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.263486 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08c4782a-e6b2-459f-94ff-a215e159f733-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"08c4782a-e6b2-459f-94ff-a215e159f733\") " 
pod="openstack/ceilometer-0" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.263521 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/08c4782a-e6b2-459f-94ff-a215e159f733-run-httpd\") pod \"ceilometer-0\" (UID: \"08c4782a-e6b2-459f-94ff-a215e159f733\") " pod="openstack/ceilometer-0" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.264118 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/08c4782a-e6b2-459f-94ff-a215e159f733-log-httpd\") pod \"ceilometer-0\" (UID: \"08c4782a-e6b2-459f-94ff-a215e159f733\") " pod="openstack/ceilometer-0" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.264537 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/08c4782a-e6b2-459f-94ff-a215e159f733-run-httpd\") pod \"ceilometer-0\" (UID: \"08c4782a-e6b2-459f-94ff-a215e159f733\") " pod="openstack/ceilometer-0" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.266788 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/08c4782a-e6b2-459f-94ff-a215e159f733-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"08c4782a-e6b2-459f-94ff-a215e159f733\") " pod="openstack/ceilometer-0" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.268140 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08c4782a-e6b2-459f-94ff-a215e159f733-config-data\") pod \"ceilometer-0\" (UID: \"08c4782a-e6b2-459f-94ff-a215e159f733\") " pod="openstack/ceilometer-0" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.270123 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08c4782a-e6b2-459f-94ff-a215e159f733-scripts\") pod \"ceilometer-0\" (UID: \"08c4782a-e6b2-459f-94ff-a215e159f733\") " pod="openstack/ceilometer-0" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.271123 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08c4782a-e6b2-459f-94ff-a215e159f733-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"08c4782a-e6b2-459f-94ff-a215e159f733\") " pod="openstack/ceilometer-0" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.296484 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45qg7\" (UniqueName: \"kubernetes.io/projected/08c4782a-e6b2-459f-94ff-a215e159f733-kube-api-access-45qg7\") pod \"ceilometer-0\" (UID: \"08c4782a-e6b2-459f-94ff-a215e159f733\") " pod="openstack/ceilometer-0" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.374028 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.392506 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.393187 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.467860 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce2e955c-c37e-4d29-9887-0daf8cb6ceea-config-data\") pod \"ce2e955c-c37e-4d29-9887-0daf8cb6ceea\" (UID: \"ce2e955c-c37e-4d29-9887-0daf8cb6ceea\") " Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.467918 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7tp4\" (UniqueName: \"kubernetes.io/projected/ce2e955c-c37e-4d29-9887-0daf8cb6ceea-kube-api-access-p7tp4\") pod \"ce2e955c-c37e-4d29-9887-0daf8cb6ceea\" (UID: \"ce2e955c-c37e-4d29-9887-0daf8cb6ceea\") " Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.467943 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce2e955c-c37e-4d29-9887-0daf8cb6ceea-logs\") pod \"ce2e955c-c37e-4d29-9887-0daf8cb6ceea\" (UID: \"ce2e955c-c37e-4d29-9887-0daf8cb6ceea\") " Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.468006 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfdb4207-6d78-4641-809a-83fc15cd750e-config-data\") pod \"cfdb4207-6d78-4641-809a-83fc15cd750e\" (UID: \"cfdb4207-6d78-4641-809a-83fc15cd750e\") " Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.468179 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce2e955c-c37e-4d29-9887-0daf8cb6ceea-combined-ca-bundle\") pod \"ce2e955c-c37e-4d29-9887-0daf8cb6ceea\" (UID: \"ce2e955c-c37e-4d29-9887-0daf8cb6ceea\") " Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.468218 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfdb4207-6d78-4641-809a-83fc15cd750e-combined-ca-bundle\") pod \"cfdb4207-6d78-4641-809a-83fc15cd750e\" (UID: \"cfdb4207-6d78-4641-809a-83fc15cd750e\") " Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.468249 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kj2pp\" (UniqueName: \"kubernetes.io/projected/cfdb4207-6d78-4641-809a-83fc15cd750e-kube-api-access-kj2pp\") pod \"cfdb4207-6d78-4641-809a-83fc15cd750e\" (UID: \"cfdb4207-6d78-4641-809a-83fc15cd750e\") " Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.470330 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce2e955c-c37e-4d29-9887-0daf8cb6ceea-logs" (OuterVolumeSpecName: "logs") pod "ce2e955c-c37e-4d29-9887-0daf8cb6ceea" (UID: "ce2e955c-c37e-4d29-9887-0daf8cb6ceea"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.473158 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfdb4207-6d78-4641-809a-83fc15cd750e-kube-api-access-kj2pp" (OuterVolumeSpecName: "kube-api-access-kj2pp") pod "cfdb4207-6d78-4641-809a-83fc15cd750e" (UID: "cfdb4207-6d78-4641-809a-83fc15cd750e"). InnerVolumeSpecName "kube-api-access-kj2pp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.479986 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce2e955c-c37e-4d29-9887-0daf8cb6ceea-kube-api-access-p7tp4" (OuterVolumeSpecName: "kube-api-access-p7tp4") pod "ce2e955c-c37e-4d29-9887-0daf8cb6ceea" (UID: "ce2e955c-c37e-4d29-9887-0daf8cb6ceea"). InnerVolumeSpecName "kube-api-access-p7tp4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.515404 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce2e955c-c37e-4d29-9887-0daf8cb6ceea-config-data" (OuterVolumeSpecName: "config-data") pod "ce2e955c-c37e-4d29-9887-0daf8cb6ceea" (UID: "ce2e955c-c37e-4d29-9887-0daf8cb6ceea"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.532647 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfdb4207-6d78-4641-809a-83fc15cd750e-config-data" (OuterVolumeSpecName: "config-data") pod "cfdb4207-6d78-4641-809a-83fc15cd750e" (UID: "cfdb4207-6d78-4641-809a-83fc15cd750e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.533832 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce2e955c-c37e-4d29-9887-0daf8cb6ceea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ce2e955c-c37e-4d29-9887-0daf8cb6ceea" (UID: "ce2e955c-c37e-4d29-9887-0daf8cb6ceea"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.536474 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfdb4207-6d78-4641-809a-83fc15cd750e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cfdb4207-6d78-4641-809a-83fc15cd750e" (UID: "cfdb4207-6d78-4641-809a-83fc15cd750e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.570982 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce2e955c-c37e-4d29-9887-0daf8cb6ceea-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.571854 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7tp4\" (UniqueName: \"kubernetes.io/projected/ce2e955c-c37e-4d29-9887-0daf8cb6ceea-kube-api-access-p7tp4\") on node \"crc\" DevicePath \"\"" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.571960 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ce2e955c-c37e-4d29-9887-0daf8cb6ceea-logs\") on node \"crc\" DevicePath \"\"" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.572022 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cfdb4207-6d78-4641-809a-83fc15cd750e-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.572077 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce2e955c-c37e-4d29-9887-0daf8cb6ceea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.572156 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cfdb4207-6d78-4641-809a-83fc15cd750e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.572246 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kj2pp\" (UniqueName: \"kubernetes.io/projected/cfdb4207-6d78-4641-809a-83fc15cd750e-kube-api-access-kj2pp\") on node \"crc\" DevicePath \"\"" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.676708 4722 generic.go:334] "Generic (PLEG): container finished" podID="ce2e955c-c37e-4d29-9887-0daf8cb6ceea" containerID="6e1646526fccbf44f84656b18ed84e13ad604b922833bd4566ac826242fa85c7" exitCode=137 Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.676819 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.677331 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ce2e955c-c37e-4d29-9887-0daf8cb6ceea","Type":"ContainerDied","Data":"6e1646526fccbf44f84656b18ed84e13ad604b922833bd4566ac826242fa85c7"} Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.677899 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ce2e955c-c37e-4d29-9887-0daf8cb6ceea","Type":"ContainerDied","Data":"18a2330040e053f0ee31d2a071720509237cd64ccdb83f7cbfdf96df05ae9563"} Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.677921 4722 scope.go:117] "RemoveContainer" containerID="6e1646526fccbf44f84656b18ed84e13ad604b922833bd4566ac826242fa85c7" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.685265 4722 generic.go:334] "Generic (PLEG): container finished" podID="cfdb4207-6d78-4641-809a-83fc15cd750e" containerID="419a0ceea0f63127048ea992454ea646d8b399148a977e09dfa153c6e7a8c132" exitCode=137 Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.685313 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"cfdb4207-6d78-4641-809a-83fc15cd750e","Type":"ContainerDied","Data":"419a0ceea0f63127048ea992454ea646d8b399148a977e09dfa153c6e7a8c132"} Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.685345 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"cfdb4207-6d78-4641-809a-83fc15cd750e","Type":"ContainerDied","Data":"039e016b6e9ff95306850e8701866536bb298ae048d98ffca3123162f845b539"} Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.685595 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.714774 4722 scope.go:117] "RemoveContainer" containerID="29c23d7e883ee162938f3c346da5878bb9a55af0cf267b08f37f07a57ea1c3e3" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.733147 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.758713 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.772221 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.775375 4722 scope.go:117] "RemoveContainer" containerID="6e1646526fccbf44f84656b18ed84e13ad604b922833bd4566ac826242fa85c7" Mar 09 14:28:43 crc kubenswrapper[4722]: E0309 14:28:43.782807 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e1646526fccbf44f84656b18ed84e13ad604b922833bd4566ac826242fa85c7\": container with ID starting with 6e1646526fccbf44f84656b18ed84e13ad604b922833bd4566ac826242fa85c7 not found: ID does not exist" containerID="6e1646526fccbf44f84656b18ed84e13ad604b922833bd4566ac826242fa85c7" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.782952 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e1646526fccbf44f84656b18ed84e13ad604b922833bd4566ac826242fa85c7"} err="failed to get container status \"6e1646526fccbf44f84656b18ed84e13ad604b922833bd4566ac826242fa85c7\": rpc error: code = NotFound desc = could not find container \"6e1646526fccbf44f84656b18ed84e13ad604b922833bd4566ac826242fa85c7\": container with ID starting with 6e1646526fccbf44f84656b18ed84e13ad604b922833bd4566ac826242fa85c7 not found: ID does not exist" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.782986 4722 scope.go:117] "RemoveContainer" containerID="29c23d7e883ee162938f3c346da5878bb9a55af0cf267b08f37f07a57ea1c3e3" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.783685 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 09 14:28:43 crc kubenswrapper[4722]: E0309 14:28:43.788688 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29c23d7e883ee162938f3c346da5878bb9a55af0cf267b08f37f07a57ea1c3e3\": container with ID starting with 29c23d7e883ee162938f3c346da5878bb9a55af0cf267b08f37f07a57ea1c3e3 not found: ID does not exist" containerID="29c23d7e883ee162938f3c346da5878bb9a55af0cf267b08f37f07a57ea1c3e3" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.788851 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29c23d7e883ee162938f3c346da5878bb9a55af0cf267b08f37f07a57ea1c3e3"} err="failed to get container status \"29c23d7e883ee162938f3c346da5878bb9a55af0cf267b08f37f07a57ea1c3e3\": rpc error: code = NotFound desc = could not find container \"29c23d7e883ee162938f3c346da5878bb9a55af0cf267b08f37f07a57ea1c3e3\": container with ID starting with 29c23d7e883ee162938f3c346da5878bb9a55af0cf267b08f37f07a57ea1c3e3 not found: ID does not exist" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.788931 4722 scope.go:117] "RemoveContainer" containerID="419a0ceea0f63127048ea992454ea646d8b399148a977e09dfa153c6e7a8c132" Mar 09 14:28:43 crc 
kubenswrapper[4722]: I0309 14:28:43.799248 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 09 14:28:43 crc kubenswrapper[4722]: E0309 14:28:43.799866 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce2e955c-c37e-4d29-9887-0daf8cb6ceea" containerName="nova-metadata-log" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.799889 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce2e955c-c37e-4d29-9887-0daf8cb6ceea" containerName="nova-metadata-log" Mar 09 14:28:43 crc kubenswrapper[4722]: E0309 14:28:43.799951 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfdb4207-6d78-4641-809a-83fc15cd750e" containerName="nova-cell1-novncproxy-novncproxy" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.799958 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfdb4207-6d78-4641-809a-83fc15cd750e" containerName="nova-cell1-novncproxy-novncproxy" Mar 09 14:28:43 crc kubenswrapper[4722]: E0309 14:28:43.799979 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce2e955c-c37e-4d29-9887-0daf8cb6ceea" containerName="nova-metadata-metadata" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.799986 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce2e955c-c37e-4d29-9887-0daf8cb6ceea" containerName="nova-metadata-metadata" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.800238 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfdb4207-6d78-4641-809a-83fc15cd750e" containerName="nova-cell1-novncproxy-novncproxy" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.800257 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce2e955c-c37e-4d29-9887-0daf8cb6ceea" containerName="nova-metadata-metadata" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.800269 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce2e955c-c37e-4d29-9887-0daf8cb6ceea" containerName="nova-metadata-log" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.801709 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.805367 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.805861 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.814307 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.824727 4722 scope.go:117] "RemoveContainer" containerID="419a0ceea0f63127048ea992454ea646d8b399148a977e09dfa153c6e7a8c132" Mar 09 14:28:43 crc kubenswrapper[4722]: E0309 14:28:43.825338 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"419a0ceea0f63127048ea992454ea646d8b399148a977e09dfa153c6e7a8c132\": container with ID starting with 419a0ceea0f63127048ea992454ea646d8b399148a977e09dfa153c6e7a8c132 not found: ID does not exist" containerID="419a0ceea0f63127048ea992454ea646d8b399148a977e09dfa153c6e7a8c132" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.825400 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"419a0ceea0f63127048ea992454ea646d8b399148a977e09dfa153c6e7a8c132"} err="failed to get container status \"419a0ceea0f63127048ea992454ea646d8b399148a977e09dfa153c6e7a8c132\": rpc error: code = NotFound desc = could not find container \"419a0ceea0f63127048ea992454ea646d8b399148a977e09dfa153c6e7a8c132\": container with ID starting with 419a0ceea0f63127048ea992454ea646d8b399148a977e09dfa153c6e7a8c132 not found: ID does not exist" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.841969 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.848290 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.851818 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.862634 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.881744 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.883503 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clvgn\" (UniqueName: \"kubernetes.io/projected/b58c3f1a-21b8-4a75-bfb5-970122e20db6-kube-api-access-clvgn\") pod \"nova-metadata-0\" (UID: \"b58c3f1a-21b8-4a75-bfb5-970122e20db6\") " pod="openstack/nova-metadata-0" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.883598 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b58c3f1a-21b8-4a75-bfb5-970122e20db6-config-data\") pod \"nova-metadata-0\" (UID: \"b58c3f1a-21b8-4a75-bfb5-970122e20db6\") " pod="openstack/nova-metadata-0" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.883649 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b58c3f1a-21b8-4a75-bfb5-970122e20db6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b58c3f1a-21b8-4a75-bfb5-970122e20db6\") " pod="openstack/nova-metadata-0" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.883690 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b58c3f1a-21b8-4a75-bfb5-970122e20db6-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b58c3f1a-21b8-4a75-bfb5-970122e20db6\") " pod="openstack/nova-metadata-0" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.883731 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b58c3f1a-21b8-4a75-bfb5-970122e20db6-logs\") pod \"nova-metadata-0\" (UID: \"b58c3f1a-21b8-4a75-bfb5-970122e20db6\") " pod="openstack/nova-metadata-0" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.901395 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.928523 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.985571 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a4409e4-5fc9-4f5b-ad06-af09ea87f10b-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"3a4409e4-5fc9-4f5b-ad06-af09ea87f10b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.985651 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a4409e4-5fc9-4f5b-ad06-af09ea87f10b-vencrypt-tls-certs\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"3a4409e4-5fc9-4f5b-ad06-af09ea87f10b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.985673 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a4409e4-5fc9-4f5b-ad06-af09ea87f10b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"3a4409e4-5fc9-4f5b-ad06-af09ea87f10b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.985724 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgkpx\" (UniqueName: \"kubernetes.io/projected/3a4409e4-5fc9-4f5b-ad06-af09ea87f10b-kube-api-access-wgkpx\") pod \"nova-cell1-novncproxy-0\" (UID: \"3a4409e4-5fc9-4f5b-ad06-af09ea87f10b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.985754 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clvgn\" (UniqueName: \"kubernetes.io/projected/b58c3f1a-21b8-4a75-bfb5-970122e20db6-kube-api-access-clvgn\") pod \"nova-metadata-0\" (UID: \"b58c3f1a-21b8-4a75-bfb5-970122e20db6\") " pod="openstack/nova-metadata-0" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.985823 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b58c3f1a-21b8-4a75-bfb5-970122e20db6-config-data\") pod \"nova-metadata-0\" (UID: \"b58c3f1a-21b8-4a75-bfb5-970122e20db6\") " pod="openstack/nova-metadata-0" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.985867 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b58c3f1a-21b8-4a75-bfb5-970122e20db6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b58c3f1a-21b8-4a75-bfb5-970122e20db6\") " pod="openstack/nova-metadata-0" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.985902 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b58c3f1a-21b8-4a75-bfb5-970122e20db6-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b58c3f1a-21b8-4a75-bfb5-970122e20db6\") " pod="openstack/nova-metadata-0" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.985936 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a4409e4-5fc9-4f5b-ad06-af09ea87f10b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"3a4409e4-5fc9-4f5b-ad06-af09ea87f10b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.985952 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b58c3f1a-21b8-4a75-bfb5-970122e20db6-logs\") pod \"nova-metadata-0\" (UID: \"b58c3f1a-21b8-4a75-bfb5-970122e20db6\") " pod="openstack/nova-metadata-0" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.986481 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b58c3f1a-21b8-4a75-bfb5-970122e20db6-logs\") pod \"nova-metadata-0\" (UID: \"b58c3f1a-21b8-4a75-bfb5-970122e20db6\") " pod="openstack/nova-metadata-0" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.990041 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b58c3f1a-21b8-4a75-bfb5-970122e20db6-config-data\") pod \"nova-metadata-0\" (UID: \"b58c3f1a-21b8-4a75-bfb5-970122e20db6\") " pod="openstack/nova-metadata-0" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.992691 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b58c3f1a-21b8-4a75-bfb5-970122e20db6-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b58c3f1a-21b8-4a75-bfb5-970122e20db6\") " pod="openstack/nova-metadata-0" Mar 09 14:28:43 crc kubenswrapper[4722]: I0309 14:28:43.993986 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b58c3f1a-21b8-4a75-bfb5-970122e20db6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b58c3f1a-21b8-4a75-bfb5-970122e20db6\") " pod="openstack/nova-metadata-0" Mar 09 14:28:44 crc kubenswrapper[4722]: I0309 14:28:44.001929 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clvgn\" (UniqueName: \"kubernetes.io/projected/b58c3f1a-21b8-4a75-bfb5-970122e20db6-kube-api-access-clvgn\") pod \"nova-metadata-0\" (UID: \"b58c3f1a-21b8-4a75-bfb5-970122e20db6\") " pod="openstack/nova-metadata-0" Mar 09 14:28:44 crc kubenswrapper[4722]: I0309 14:28:44.088150 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a4409e4-5fc9-4f5b-ad06-af09ea87f10b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"3a4409e4-5fc9-4f5b-ad06-af09ea87f10b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 14:28:44 crc kubenswrapper[4722]: I0309 14:28:44.088471 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a4409e4-5fc9-4f5b-ad06-af09ea87f10b-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"3a4409e4-5fc9-4f5b-ad06-af09ea87f10b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 14:28:44 crc kubenswrapper[4722]: I0309 14:28:44.088624 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a4409e4-5fc9-4f5b-ad06-af09ea87f10b-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"3a4409e4-5fc9-4f5b-ad06-af09ea87f10b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 14:28:44 crc kubenswrapper[4722]: I0309 14:28:44.088880 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a4409e4-5fc9-4f5b-ad06-af09ea87f10b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"3a4409e4-5fc9-4f5b-ad06-af09ea87f10b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 14:28:44 crc kubenswrapper[4722]: I0309 14:28:44.089089 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgkpx\" (UniqueName: \"kubernetes.io/projected/3a4409e4-5fc9-4f5b-ad06-af09ea87f10b-kube-api-access-wgkpx\") pod \"nova-cell1-novncproxy-0\" (UID: \"3a4409e4-5fc9-4f5b-ad06-af09ea87f10b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 14:28:44 crc kubenswrapper[4722]: I0309 14:28:44.091627 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/3a4409e4-5fc9-4f5b-ad06-af09ea87f10b-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"3a4409e4-5fc9-4f5b-ad06-af09ea87f10b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 14:28:44 crc kubenswrapper[4722]: I0309 14:28:44.092025 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a4409e4-5fc9-4f5b-ad06-af09ea87f10b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"3a4409e4-5fc9-4f5b-ad06-af09ea87f10b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 14:28:44 crc kubenswrapper[4722]: I0309 14:28:44.092418 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a4409e4-5fc9-4f5b-ad06-af09ea87f10b-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"3a4409e4-5fc9-4f5b-ad06-af09ea87f10b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 14:28:44 crc kubenswrapper[4722]: I0309 14:28:44.093002 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a4409e4-5fc9-4f5b-ad06-af09ea87f10b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"3a4409e4-5fc9-4f5b-ad06-af09ea87f10b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 14:28:44 crc kubenswrapper[4722]: I0309 14:28:44.107871 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgkpx\" (UniqueName: \"kubernetes.io/projected/3a4409e4-5fc9-4f5b-ad06-af09ea87f10b-kube-api-access-wgkpx\") pod \"nova-cell1-novncproxy-0\" (UID: \"3a4409e4-5fc9-4f5b-ad06-af09ea87f10b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 09 14:28:44 crc kubenswrapper[4722]: I0309 14:28:44.129043 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 09 14:28:44 crc kubenswrapper[4722]: I0309 14:28:44.164214 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43c9ed1c-0628-4235-b8ff-31392e2215d8" path="/var/lib/kubelet/pods/43c9ed1c-0628-4235-b8ff-31392e2215d8/volumes" Mar 09 14:28:44 crc kubenswrapper[4722]: I0309 14:28:44.165311 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce2e955c-c37e-4d29-9887-0daf8cb6ceea" path="/var/lib/kubelet/pods/ce2e955c-c37e-4d29-9887-0daf8cb6ceea/volumes" Mar 09 14:28:44 crc kubenswrapper[4722]: I0309 14:28:44.165914 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfdb4207-6d78-4641-809a-83fc15cd750e" path="/var/lib/kubelet/pods/cfdb4207-6d78-4641-809a-83fc15cd750e/volumes" Mar 09 14:28:44 crc kubenswrapper[4722]: I0309 14:28:44.201330 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 09 14:28:44 crc kubenswrapper[4722]: I0309 14:28:44.617856 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 14:28:44 crc kubenswrapper[4722]: I0309 14:28:44.696842 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"08c4782a-e6b2-459f-94ff-a215e159f733","Type":"ContainerStarted","Data":"15785102d818b457b8b65ef39cc7363deeacfce8e259b9b0591aa912e605ce8a"} Mar 09 14:28:44 crc kubenswrapper[4722]: I0309 14:28:44.696900 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"08c4782a-e6b2-459f-94ff-a215e159f733","Type":"ContainerStarted","Data":"504be224c93d1fe325130ad91ab5ae5ac7775d2b985d05f32e75d9fcbe36db77"} Mar 09 14:28:44 crc kubenswrapper[4722]: I0309 14:28:44.701015 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b58c3f1a-21b8-4a75-bfb5-970122e20db6","Type":"ContainerStarted","Data":"fa2d300a8b9c0a65b4a54981c995c94cee16cae4cfb386064d800981c5814874"} Mar 09 14:28:44 crc kubenswrapper[4722]: W0309 14:28:44.775740 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3a4409e4_5fc9_4f5b_ad06_af09ea87f10b.slice/crio-0e18532cb7d782b54484e0fc4b2f4e2be4c023a68255d9118b09cb61eac8cd02 WatchSource:0}: Error finding container 0e18532cb7d782b54484e0fc4b2f4e2be4c023a68255d9118b09cb61eac8cd02: Status 404 returned error can't find the container with id 0e18532cb7d782b54484e0fc4b2f4e2be4c023a68255d9118b09cb61eac8cd02 Mar 09 14:28:44 crc kubenswrapper[4722]: I0309 14:28:44.776098 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 09 14:28:45 crc kubenswrapper[4722]: I0309 14:28:45.113846 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 09 14:28:45 crc kubenswrapper[4722]: I0309 14:28:45.114579 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 09 14:28:45 crc kubenswrapper[4722]: I0309 14:28:45.120314 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 09 14:28:45 crc kubenswrapper[4722]: I0309 14:28:45.120732 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 09 14:28:45 crc kubenswrapper[4722]: I0309 14:28:45.723712 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"3a4409e4-5fc9-4f5b-ad06-af09ea87f10b","Type":"ContainerStarted","Data":"5b59239d79c9e211fbf4f3e301ba7e659dd56fcb1a26960b997ad578827baaa7"} Mar 09 14:28:45 crc kubenswrapper[4722]: I0309 14:28:45.723995 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"3a4409e4-5fc9-4f5b-ad06-af09ea87f10b","Type":"ContainerStarted","Data":"0e18532cb7d782b54484e0fc4b2f4e2be4c023a68255d9118b09cb61eac8cd02"} Mar 09 14:28:45 crc kubenswrapper[4722]: I0309 14:28:45.727286 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b58c3f1a-21b8-4a75-bfb5-970122e20db6","Type":"ContainerStarted","Data":"5891fa687a19a92c246a30786d3a0fe3fca15abe1ec15a94af6e37d35fff007a"} Mar 09 14:28:45 crc kubenswrapper[4722]: I0309 14:28:45.727333 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"b58c3f1a-21b8-4a75-bfb5-970122e20db6","Type":"ContainerStarted","Data":"a1b1a60a9d6e55a96d906a241559260ee757706a4d44f98b4b1268ac1a05ec9c"} Mar 09 14:28:45 crc kubenswrapper[4722]: I0309 14:28:45.731251 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"08c4782a-e6b2-459f-94ff-a215e159f733","Type":"ContainerStarted","Data":"773088410ddce829ec1c0f2ca1e39fa81f72f6394348e408b4c3cd80df3feb86"} Mar 09 14:28:45 crc kubenswrapper[4722]: I0309 14:28:45.731618 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 09 14:28:45 crc kubenswrapper[4722]: I0309 14:28:45.739245 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 09 14:28:45 crc kubenswrapper[4722]: I0309 14:28:45.767862 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.767843963 podStartE2EDuration="2.767843963s" podCreationTimestamp="2026-03-09 14:28:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:28:45.755426741 +0000 UTC m=+1566.310995317" watchObservedRunningTime="2026-03-09 14:28:45.767843963 +0000 UTC m=+1566.323412539" Mar 09 14:28:45 crc kubenswrapper[4722]: I0309 14:28:45.808476 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.808428313 podStartE2EDuration="2.808428313s" podCreationTimestamp="2026-03-09 14:28:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:28:45.796316878 +0000 UTC m=+1566.351885464" watchObservedRunningTime="2026-03-09 14:28:45.808428313 +0000 UTC m=+1566.363996889" Mar 09 14:28:46 crc kubenswrapper[4722]: I0309 14:28:46.014300 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-k62sg"] Mar 09 14:28:46 crc kubenswrapper[4722]: I0309 14:28:46.016634 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7bbf7cf9-k62sg" Mar 09 14:28:46 crc kubenswrapper[4722]: I0309 14:28:46.030759 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-k62sg"] Mar 09 14:28:46 crc kubenswrapper[4722]: I0309 14:28:46.151071 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/33c8ed47-f7d5-485b-b413-8c80c3a5b276-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7bbf7cf9-k62sg\" (UID: \"33c8ed47-f7d5-485b-b413-8c80c3a5b276\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-k62sg" Mar 09 14:28:46 crc kubenswrapper[4722]: I0309 14:28:46.151116 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33c8ed47-f7d5-485b-b413-8c80c3a5b276-dns-svc\") pod \"dnsmasq-dns-6b7bbf7cf9-k62sg\" (UID: \"33c8ed47-f7d5-485b-b413-8c80c3a5b276\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-k62sg" Mar 09 14:28:46 crc kubenswrapper[4722]: I0309 14:28:46.151146 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmc68\" (UniqueName: \"kubernetes.io/projected/33c8ed47-f7d5-485b-b413-8c80c3a5b276-kube-api-access-hmc68\") pod \"dnsmasq-dns-6b7bbf7cf9-k62sg\" (UID: \"33c8ed47-f7d5-485b-b413-8c80c3a5b276\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-k62sg" Mar 09 14:28:46 crc kubenswrapper[4722]: I0309 14:28:46.151166 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/33c8ed47-f7d5-485b-b413-8c80c3a5b276-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7bbf7cf9-k62sg\" (UID: \"33c8ed47-f7d5-485b-b413-8c80c3a5b276\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-k62sg" Mar 09 14:28:46 crc kubenswrapper[4722]: I0309 14:28:46.151342 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/33c8ed47-f7d5-485b-b413-8c80c3a5b276-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7bbf7cf9-k62sg\" (UID: \"33c8ed47-f7d5-485b-b413-8c80c3a5b276\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-k62sg" Mar 09 14:28:46 crc kubenswrapper[4722]: I0309 14:28:46.151377 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33c8ed47-f7d5-485b-b413-8c80c3a5b276-config\") pod \"dnsmasq-dns-6b7bbf7cf9-k62sg\" (UID: \"33c8ed47-f7d5-485b-b413-8c80c3a5b276\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-k62sg" Mar 09 14:28:46 crc kubenswrapper[4722]: I0309 14:28:46.253134 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/33c8ed47-f7d5-485b-b413-8c80c3a5b276-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7bbf7cf9-k62sg\" (UID: \"33c8ed47-f7d5-485b-b413-8c80c3a5b276\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-k62sg" Mar 09 14:28:46 crc kubenswrapper[4722]: I0309 14:28:46.254117 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33c8ed47-f7d5-485b-b413-8c80c3a5b276-dns-svc\") pod \"dnsmasq-dns-6b7bbf7cf9-k62sg\" (UID: \"33c8ed47-f7d5-485b-b413-8c80c3a5b276\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-k62sg" Mar 09 14:28:46 crc kubenswrapper[4722]: I0309 14:28:46.254141 4722 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/33c8ed47-f7d5-485b-b413-8c80c3a5b276-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7bbf7cf9-k62sg\" (UID: \"33c8ed47-f7d5-485b-b413-8c80c3a5b276\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-k62sg" Mar 09 14:28:46 crc kubenswrapper[4722]: I0309 14:28:46.254231 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmc68\" (UniqueName: \"kubernetes.io/projected/33c8ed47-f7d5-485b-b413-8c80c3a5b276-kube-api-access-hmc68\") pod \"dnsmasq-dns-6b7bbf7cf9-k62sg\" (UID: \"33c8ed47-f7d5-485b-b413-8c80c3a5b276\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-k62sg" Mar 09 14:28:46 crc kubenswrapper[4722]: I0309 14:28:46.254443 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/33c8ed47-f7d5-485b-b413-8c80c3a5b276-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7bbf7cf9-k62sg\" (UID: \"33c8ed47-f7d5-485b-b413-8c80c3a5b276\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-k62sg" Mar 09 14:28:46 crc kubenswrapper[4722]: I0309 14:28:46.255038 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/33c8ed47-f7d5-485b-b413-8c80c3a5b276-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7bbf7cf9-k62sg\" (UID: \"33c8ed47-f7d5-485b-b413-8c80c3a5b276\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-k62sg" Mar 09 14:28:46 crc kubenswrapper[4722]: I0309 14:28:46.255110 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33c8ed47-f7d5-485b-b413-8c80c3a5b276-config\") pod \"dnsmasq-dns-6b7bbf7cf9-k62sg\" (UID: \"33c8ed47-f7d5-485b-b413-8c80c3a5b276\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-k62sg" Mar 09 14:28:46 crc kubenswrapper[4722]: I0309 14:28:46.255330 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33c8ed47-f7d5-485b-b413-8c80c3a5b276-dns-svc\") pod \"dnsmasq-dns-6b7bbf7cf9-k62sg\" (UID: \"33c8ed47-f7d5-485b-b413-8c80c3a5b276\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-k62sg" Mar 09 14:28:46 crc kubenswrapper[4722]: I0309 14:28:46.255734 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/33c8ed47-f7d5-485b-b413-8c80c3a5b276-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7bbf7cf9-k62sg\" (UID: \"33c8ed47-f7d5-485b-b413-8c80c3a5b276\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-k62sg" Mar 09 14:28:46 crc kubenswrapper[4722]: I0309 14:28:46.256188 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/33c8ed47-f7d5-485b-b413-8c80c3a5b276-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7bbf7cf9-k62sg\" (UID: \"33c8ed47-f7d5-485b-b413-8c80c3a5b276\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-k62sg" Mar 09 14:28:46 crc kubenswrapper[4722]: I0309 14:28:46.256669 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33c8ed47-f7d5-485b-b413-8c80c3a5b276-config\") pod \"dnsmasq-dns-6b7bbf7cf9-k62sg\" (UID: \"33c8ed47-f7d5-485b-b413-8c80c3a5b276\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-k62sg" Mar 09 14:28:46 crc kubenswrapper[4722]: I0309 14:28:46.283498 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmc68\" (UniqueName: 
\"kubernetes.io/projected/33c8ed47-f7d5-485b-b413-8c80c3a5b276-kube-api-access-hmc68\") pod \"dnsmasq-dns-6b7bbf7cf9-k62sg\" (UID: \"33c8ed47-f7d5-485b-b413-8c80c3a5b276\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-k62sg" Mar 09 14:28:46 crc kubenswrapper[4722]: I0309 14:28:46.420175 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7bbf7cf9-k62sg" Mar 09 14:28:46 crc kubenswrapper[4722]: I0309 14:28:46.763244 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"08c4782a-e6b2-459f-94ff-a215e159f733","Type":"ContainerStarted","Data":"d52fcd4ad5c64c2027b5d5db825b2ac0478c6a967da7cf3fc75c22fcc8853049"} Mar 09 14:28:47 crc kubenswrapper[4722]: I0309 14:28:47.198737 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-k62sg"] Mar 09 14:28:47 crc kubenswrapper[4722]: I0309 14:28:47.776643 4722 generic.go:334] "Generic (PLEG): container finished" podID="33c8ed47-f7d5-485b-b413-8c80c3a5b276" containerID="3e5c937c1579aa97f73dcf384a185ff4c132b3765ccff35b93d939c8ea6a1f88" exitCode=0 Mar 09 14:28:47 crc kubenswrapper[4722]: I0309 14:28:47.776688 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7bbf7cf9-k62sg" event={"ID":"33c8ed47-f7d5-485b-b413-8c80c3a5b276","Type":"ContainerDied","Data":"3e5c937c1579aa97f73dcf384a185ff4c132b3765ccff35b93d939c8ea6a1f88"} Mar 09 14:28:47 crc kubenswrapper[4722]: I0309 14:28:47.777372 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7bbf7cf9-k62sg" event={"ID":"33c8ed47-f7d5-485b-b413-8c80c3a5b276","Type":"ContainerStarted","Data":"38da260493b48d3c384d023462fac6c726901df598b1f1d4cfb644db64839d27"} Mar 09 14:28:48 crc kubenswrapper[4722]: I0309 14:28:48.542841 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 14:28:48 crc kubenswrapper[4722]: I0309 14:28:48.793281 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"08c4782a-e6b2-459f-94ff-a215e159f733","Type":"ContainerStarted","Data":"2a42a105b3cf3309a0a3c938af50d1d16511cf8be2274058f502f6a7745f0653"} Mar 09 14:28:48 crc kubenswrapper[4722]: I0309 14:28:48.793461 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 09 14:28:48 crc kubenswrapper[4722]: I0309 14:28:48.796065 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7bbf7cf9-k62sg" event={"ID":"33c8ed47-f7d5-485b-b413-8c80c3a5b276","Type":"ContainerStarted","Data":"c9b3f6e72a54b5597dc47652c56be9a758f995531c5fbf3fb7265ef6524329f9"} Mar 09 14:28:48 crc kubenswrapper[4722]: I0309 14:28:48.796257 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b7bbf7cf9-k62sg" Mar 09 14:28:48 crc kubenswrapper[4722]: I0309 14:28:48.824470 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.7933064399999998 podStartE2EDuration="5.824448799s" podCreationTimestamp="2026-03-09 14:28:43 +0000 UTC" firstStartedPulling="2026-03-09 14:28:43.858659448 +0000 UTC m=+1564.414228014" lastFinishedPulling="2026-03-09 14:28:47.889801797 +0000 UTC m=+1568.445370373" observedRunningTime="2026-03-09 14:28:48.817631411 +0000 UTC m=+1569.373199977" watchObservedRunningTime="2026-03-09 14:28:48.824448799 +0000 UTC m=+1569.380017375" Mar 09 14:28:48 crc kubenswrapper[4722]: I0309 14:28:48.866158 4722 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 09 14:28:48 crc kubenswrapper[4722]: I0309 14:28:48.866740 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="9c58393a-f27d-4e46-87ac-352b1370c295" containerName="nova-api-log" containerID="cri-o://9cf4b538321d3fa703583336e99c0ac5959c1b9751f95fd0ea88e2cc3ab41b80" gracePeriod=30 Mar 09 14:28:48 crc kubenswrapper[4722]: I0309 14:28:48.867326 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="9c58393a-f27d-4e46-87ac-352b1370c295" containerName="nova-api-api" containerID="cri-o://3c8f0ecc6e288a2c45b7eecaa7fbb8df96e687d06d2ecda8b11d40e75a734c21" gracePeriod=30 Mar 09 14:28:48 crc kubenswrapper[4722]: I0309 14:28:48.874431 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b7bbf7cf9-k62sg" podStartSLOduration=3.874407348 podStartE2EDuration="3.874407348s" podCreationTimestamp="2026-03-09 14:28:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:28:48.84800984 +0000 UTC m=+1569.403578426" watchObservedRunningTime="2026-03-09 14:28:48.874407348 +0000 UTC m=+1569.429975924" Mar 09 14:28:49 crc kubenswrapper[4722]: I0309 14:28:49.129340 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 09 14:28:49 crc kubenswrapper[4722]: I0309 14:28:49.129411 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 09 14:28:49 crc kubenswrapper[4722]: I0309 14:28:49.202292 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 09 14:28:49 crc kubenswrapper[4722]: I0309 14:28:49.807799 4722 generic.go:334] "Generic (PLEG): container finished" podID="9c58393a-f27d-4e46-87ac-352b1370c295" containerID="9cf4b538321d3fa703583336e99c0ac5959c1b9751f95fd0ea88e2cc3ab41b80" exitCode=143 Mar 09 14:28:49 crc kubenswrapper[4722]: I0309 14:28:49.807931 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9c58393a-f27d-4e46-87ac-352b1370c295","Type":"ContainerDied","Data":"9cf4b538321d3fa703583336e99c0ac5959c1b9751f95fd0ea88e2cc3ab41b80"} Mar 09 14:28:49 crc kubenswrapper[4722]: I0309 14:28:49.809783 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="08c4782a-e6b2-459f-94ff-a215e159f733" containerName="ceilometer-central-agent" containerID="cri-o://15785102d818b457b8b65ef39cc7363deeacfce8e259b9b0591aa912e605ce8a" gracePeriod=30 Mar 09 14:28:49 crc kubenswrapper[4722]: I0309 14:28:49.809891 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="08c4782a-e6b2-459f-94ff-a215e159f733" containerName="proxy-httpd" containerID="cri-o://2a42a105b3cf3309a0a3c938af50d1d16511cf8be2274058f502f6a7745f0653" gracePeriod=30 Mar 09 14:28:49 crc kubenswrapper[4722]: I0309 14:28:49.809826 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="08c4782a-e6b2-459f-94ff-a215e159f733" containerName="sg-core" containerID="cri-o://d52fcd4ad5c64c2027b5d5db825b2ac0478c6a967da7cf3fc75c22fcc8853049" gracePeriod=30 Mar 09 14:28:49 crc kubenswrapper[4722]: I0309 14:28:49.809837 4722 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="08c4782a-e6b2-459f-94ff-a215e159f733" containerName="ceilometer-notification-agent" containerID="cri-o://773088410ddce829ec1c0f2ca1e39fa81f72f6394348e408b4c3cd80df3feb86" gracePeriod=30 Mar 09 14:28:50 crc kubenswrapper[4722]: I0309 14:28:50.825466 4722 generic.go:334] "Generic (PLEG): container finished" podID="08c4782a-e6b2-459f-94ff-a215e159f733" containerID="2a42a105b3cf3309a0a3c938af50d1d16511cf8be2274058f502f6a7745f0653" exitCode=0 Mar 09 14:28:50 crc kubenswrapper[4722]: I0309 14:28:50.825731 4722 generic.go:334] "Generic (PLEG): container finished" podID="08c4782a-e6b2-459f-94ff-a215e159f733" containerID="d52fcd4ad5c64c2027b5d5db825b2ac0478c6a967da7cf3fc75c22fcc8853049" exitCode=2 Mar 09 14:28:50 crc kubenswrapper[4722]: I0309 14:28:50.825741 4722 generic.go:334] "Generic (PLEG): container finished" podID="08c4782a-e6b2-459f-94ff-a215e159f733" containerID="773088410ddce829ec1c0f2ca1e39fa81f72f6394348e408b4c3cd80df3feb86" exitCode=0 Mar 09 14:28:50 crc kubenswrapper[4722]: I0309 14:28:50.825563 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"08c4782a-e6b2-459f-94ff-a215e159f733","Type":"ContainerDied","Data":"2a42a105b3cf3309a0a3c938af50d1d16511cf8be2274058f502f6a7745f0653"} Mar 09 14:28:50 crc kubenswrapper[4722]: I0309 14:28:50.825776 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"08c4782a-e6b2-459f-94ff-a215e159f733","Type":"ContainerDied","Data":"d52fcd4ad5c64c2027b5d5db825b2ac0478c6a967da7cf3fc75c22fcc8853049"} Mar 09 14:28:50 crc kubenswrapper[4722]: I0309 14:28:50.825794 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"08c4782a-e6b2-459f-94ff-a215e159f733","Type":"ContainerDied","Data":"773088410ddce829ec1c0f2ca1e39fa81f72f6394348e408b4c3cd80df3feb86"} Mar 09 14:28:51 crc kubenswrapper[4722]: I0309 14:28:51.527880 4722 patch_prober.go:28] interesting pod/machine-config-daemon-hjrrb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 14:28:51 crc kubenswrapper[4722]: I0309 14:28:51.529134 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 14:28:52 crc kubenswrapper[4722]: I0309 14:28:52.854054 4722 generic.go:334] "Generic (PLEG): container finished" podID="9c58393a-f27d-4e46-87ac-352b1370c295" containerID="3c8f0ecc6e288a2c45b7eecaa7fbb8df96e687d06d2ecda8b11d40e75a734c21" exitCode=0 Mar 09 14:28:52 crc kubenswrapper[4722]: I0309 14:28:52.854270 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9c58393a-f27d-4e46-87ac-352b1370c295","Type":"ContainerDied","Data":"3c8f0ecc6e288a2c45b7eecaa7fbb8df96e687d06d2ecda8b11d40e75a734c21"} Mar 09 14:28:52 crc kubenswrapper[4722]: I0309 14:28:52.854399 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9c58393a-f27d-4e46-87ac-352b1370c295","Type":"ContainerDied","Data":"b4be2528460b9ad671efaa3af9bb9b811d4976fd9bafc8d84a430b0099c5d19e"} Mar 09 14:28:52 crc kubenswrapper[4722]: I0309 14:28:52.854416 4722 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b4be2528460b9ad671efaa3af9bb9b811d4976fd9bafc8d84a430b0099c5d19e" Mar 09 14:28:52 crc kubenswrapper[4722]: I0309 14:28:52.864041 4722 generic.go:334] "Generic (PLEG): container finished" podID="08c4782a-e6b2-459f-94ff-a215e159f733" containerID="15785102d818b457b8b65ef39cc7363deeacfce8e259b9b0591aa912e605ce8a" exitCode=0 Mar 09 14:28:52 crc kubenswrapper[4722]: I0309 14:28:52.864080 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"08c4782a-e6b2-459f-94ff-a215e159f733","Type":"ContainerDied","Data":"15785102d818b457b8b65ef39cc7363deeacfce8e259b9b0591aa912e605ce8a"} Mar 09 14:28:52 crc kubenswrapper[4722]: I0309 14:28:52.864394 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 09 14:28:52 crc kubenswrapper[4722]: I0309 14:28:52.963913 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drp7d\" (UniqueName: \"kubernetes.io/projected/9c58393a-f27d-4e46-87ac-352b1370c295-kube-api-access-drp7d\") pod \"9c58393a-f27d-4e46-87ac-352b1370c295\" (UID: \"9c58393a-f27d-4e46-87ac-352b1370c295\") " Mar 09 14:28:52 crc kubenswrapper[4722]: I0309 14:28:52.964064 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c58393a-f27d-4e46-87ac-352b1370c295-config-data\") pod \"9c58393a-f27d-4e46-87ac-352b1370c295\" (UID: \"9c58393a-f27d-4e46-87ac-352b1370c295\") " Mar 09 14:28:52 crc kubenswrapper[4722]: I0309 14:28:52.964110 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c58393a-f27d-4e46-87ac-352b1370c295-combined-ca-bundle\") pod \"9c58393a-f27d-4e46-87ac-352b1370c295\" (UID: \"9c58393a-f27d-4e46-87ac-352b1370c295\") " Mar 09 14:28:52 crc kubenswrapper[4722]: I0309 14:28:52.964231 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c58393a-f27d-4e46-87ac-352b1370c295-logs\") pod \"9c58393a-f27d-4e46-87ac-352b1370c295\" (UID: \"9c58393a-f27d-4e46-87ac-352b1370c295\") " Mar 09 14:28:52 crc kubenswrapper[4722]: I0309 14:28:52.965371 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c58393a-f27d-4e46-87ac-352b1370c295-logs" (OuterVolumeSpecName: "logs") pod "9c58393a-f27d-4e46-87ac-352b1370c295" (UID: "9c58393a-f27d-4e46-87ac-352b1370c295"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:28:52 crc kubenswrapper[4722]: I0309 14:28:52.970803 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c58393a-f27d-4e46-87ac-352b1370c295-kube-api-access-drp7d" (OuterVolumeSpecName: "kube-api-access-drp7d") pod "9c58393a-f27d-4e46-87ac-352b1370c295" (UID: "9c58393a-f27d-4e46-87ac-352b1370c295"). InnerVolumeSpecName "kube-api-access-drp7d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:28:53 crc kubenswrapper[4722]: I0309 14:28:53.011782 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c58393a-f27d-4e46-87ac-352b1370c295-config-data" (OuterVolumeSpecName: "config-data") pod "9c58393a-f27d-4e46-87ac-352b1370c295" (UID: "9c58393a-f27d-4e46-87ac-352b1370c295"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:28:53 crc kubenswrapper[4722]: I0309 14:28:53.065715 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c58393a-f27d-4e46-87ac-352b1370c295-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9c58393a-f27d-4e46-87ac-352b1370c295" (UID: "9c58393a-f27d-4e46-87ac-352b1370c295"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:28:53 crc kubenswrapper[4722]: I0309 14:28:53.067328 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9c58393a-f27d-4e46-87ac-352b1370c295-logs\") on node \"crc\" DevicePath \"\"" Mar 09 14:28:53 crc kubenswrapper[4722]: I0309 14:28:53.067368 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drp7d\" (UniqueName: \"kubernetes.io/projected/9c58393a-f27d-4e46-87ac-352b1370c295-kube-api-access-drp7d\") on node \"crc\" DevicePath \"\"" Mar 09 14:28:53 crc kubenswrapper[4722]: I0309 14:28:53.067381 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9c58393a-f27d-4e46-87ac-352b1370c295-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 14:28:53 crc kubenswrapper[4722]: I0309 14:28:53.067393 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c58393a-f27d-4e46-87ac-352b1370c295-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:28:53 crc kubenswrapper[4722]: I0309 14:28:53.165266 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 14:28:53 crc kubenswrapper[4722]: I0309 14:28:53.271691 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08c4782a-e6b2-459f-94ff-a215e159f733-combined-ca-bundle\") pod \"08c4782a-e6b2-459f-94ff-a215e159f733\" (UID: \"08c4782a-e6b2-459f-94ff-a215e159f733\") " Mar 09 14:28:53 crc kubenswrapper[4722]: I0309 14:28:53.271799 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/08c4782a-e6b2-459f-94ff-a215e159f733-run-httpd\") pod \"08c4782a-e6b2-459f-94ff-a215e159f733\" (UID: \"08c4782a-e6b2-459f-94ff-a215e159f733\") " Mar 09 14:28:53 crc kubenswrapper[4722]: I0309 14:28:53.271929 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08c4782a-e6b2-459f-94ff-a215e159f733-config-data\") pod \"08c4782a-e6b2-459f-94ff-a215e159f733\" (UID: \"08c4782a-e6b2-459f-94ff-a215e159f733\") " Mar 09 14:28:53 crc kubenswrapper[4722]: I0309 14:28:53.271960 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/08c4782a-e6b2-459f-94ff-a215e159f733-log-httpd\") pod \"08c4782a-e6b2-459f-94ff-a215e159f733\" (UID: \"08c4782a-e6b2-459f-94ff-a215e159f733\") " Mar 09 14:28:53 crc kubenswrapper[4722]: I0309 14:28:53.272077 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/08c4782a-e6b2-459f-94ff-a215e159f733-sg-core-conf-yaml\") pod \"08c4782a-e6b2-459f-94ff-a215e159f733\" (UID: \"08c4782a-e6b2-459f-94ff-a215e159f733\") " Mar 09 14:28:53 crc kubenswrapper[4722]: I0309 14:28:53.272101 4722 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08c4782a-e6b2-459f-94ff-a215e159f733-scripts\") pod \"08c4782a-e6b2-459f-94ff-a215e159f733\" (UID: \"08c4782a-e6b2-459f-94ff-a215e159f733\") " Mar 09 14:28:53 crc kubenswrapper[4722]: I0309 14:28:53.272177 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45qg7\" (UniqueName: \"kubernetes.io/projected/08c4782a-e6b2-459f-94ff-a215e159f733-kube-api-access-45qg7\") pod \"08c4782a-e6b2-459f-94ff-a215e159f733\" (UID: \"08c4782a-e6b2-459f-94ff-a215e159f733\") " Mar 09 14:28:53 crc kubenswrapper[4722]: I0309 14:28:53.272545 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08c4782a-e6b2-459f-94ff-a215e159f733-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "08c4782a-e6b2-459f-94ff-a215e159f733" (UID: "08c4782a-e6b2-459f-94ff-a215e159f733"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:28:53 crc kubenswrapper[4722]: I0309 14:28:53.272729 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08c4782a-e6b2-459f-94ff-a215e159f733-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "08c4782a-e6b2-459f-94ff-a215e159f733" (UID: "08c4782a-e6b2-459f-94ff-a215e159f733"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:28:53 crc kubenswrapper[4722]: I0309 14:28:53.273253 4722 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/08c4782a-e6b2-459f-94ff-a215e159f733-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 14:28:53 crc kubenswrapper[4722]: I0309 14:28:53.273277 4722 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/08c4782a-e6b2-459f-94ff-a215e159f733-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 14:28:53 crc kubenswrapper[4722]: I0309 14:28:53.277508 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08c4782a-e6b2-459f-94ff-a215e159f733-scripts" (OuterVolumeSpecName: "scripts") pod "08c4782a-e6b2-459f-94ff-a215e159f733" (UID: "08c4782a-e6b2-459f-94ff-a215e159f733"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:28:53 crc kubenswrapper[4722]: I0309 14:28:53.279958 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08c4782a-e6b2-459f-94ff-a215e159f733-kube-api-access-45qg7" (OuterVolumeSpecName: "kube-api-access-45qg7") pod "08c4782a-e6b2-459f-94ff-a215e159f733" (UID: "08c4782a-e6b2-459f-94ff-a215e159f733"). InnerVolumeSpecName "kube-api-access-45qg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:28:53 crc kubenswrapper[4722]: I0309 14:28:53.323655 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08c4782a-e6b2-459f-94ff-a215e159f733-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "08c4782a-e6b2-459f-94ff-a215e159f733" (UID: "08c4782a-e6b2-459f-94ff-a215e159f733"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:28:53 crc kubenswrapper[4722]: I0309 14:28:53.375178 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08c4782a-e6b2-459f-94ff-a215e159f733-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "08c4782a-e6b2-459f-94ff-a215e159f733" (UID: "08c4782a-e6b2-459f-94ff-a215e159f733"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:28:53 crc kubenswrapper[4722]: I0309 14:28:53.379184 4722 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/08c4782a-e6b2-459f-94ff-a215e159f733-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 09 14:28:53 crc kubenswrapper[4722]: I0309 14:28:53.379304 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08c4782a-e6b2-459f-94ff-a215e159f733-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 14:28:53 crc kubenswrapper[4722]: I0309 14:28:53.379314 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45qg7\" (UniqueName: \"kubernetes.io/projected/08c4782a-e6b2-459f-94ff-a215e159f733-kube-api-access-45qg7\") on node \"crc\" DevicePath \"\"" Mar 09 14:28:53 crc kubenswrapper[4722]: I0309 14:28:53.379327 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08c4782a-e6b2-459f-94ff-a215e159f733-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:28:53 crc kubenswrapper[4722]: I0309 14:28:53.400005 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08c4782a-e6b2-459f-94ff-a215e159f733-config-data" (OuterVolumeSpecName: "config-data") pod "08c4782a-e6b2-459f-94ff-a215e159f733" (UID: "08c4782a-e6b2-459f-94ff-a215e159f733"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:28:53 crc kubenswrapper[4722]: I0309 14:28:53.481259 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08c4782a-e6b2-459f-94ff-a215e159f733-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 14:28:53 crc kubenswrapper[4722]: I0309 14:28:53.878889 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"08c4782a-e6b2-459f-94ff-a215e159f733","Type":"ContainerDied","Data":"504be224c93d1fe325130ad91ab5ae5ac7775d2b985d05f32e75d9fcbe36db77"} Mar 09 14:28:53 crc kubenswrapper[4722]: I0309 14:28:53.879255 4722 scope.go:117] "RemoveContainer" containerID="2a42a105b3cf3309a0a3c938af50d1d16511cf8be2274058f502f6a7745f0653" Mar 09 14:28:53 crc kubenswrapper[4722]: I0309 14:28:53.878962 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 14:28:53 crc kubenswrapper[4722]: I0309 14:28:53.878933 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 09 14:28:53 crc kubenswrapper[4722]: I0309 14:28:53.918013 4722 scope.go:117] "RemoveContainer" containerID="d52fcd4ad5c64c2027b5d5db825b2ac0478c6a967da7cf3fc75c22fcc8853049" Mar 09 14:28:53 crc kubenswrapper[4722]: I0309 14:28:53.929023 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 14:28:53 crc kubenswrapper[4722]: I0309 14:28:53.947748 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 09 14:28:53 crc kubenswrapper[4722]: I0309 14:28:53.947899 4722 scope.go:117] "RemoveContainer" containerID="773088410ddce829ec1c0f2ca1e39fa81f72f6394348e408b4c3cd80df3feb86" Mar 09 14:28:53 crc kubenswrapper[4722]: I0309 14:28:53.979272 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 09 14:28:53 crc kubenswrapper[4722]: E0309 14:28:53.979860 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c58393a-f27d-4e46-87ac-352b1370c295" containerName="nova-api-api" Mar 09 14:28:53 crc kubenswrapper[4722]: I0309 14:28:53.979872 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c58393a-f27d-4e46-87ac-352b1370c295" containerName="nova-api-api" Mar 09 14:28:53 crc kubenswrapper[4722]: E0309 14:28:53.979890 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08c4782a-e6b2-459f-94ff-a215e159f733" containerName="ceilometer-central-agent" Mar 09 14:28:53 crc kubenswrapper[4722]: I0309 14:28:53.979896 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="08c4782a-e6b2-459f-94ff-a215e159f733" containerName="ceilometer-central-agent" Mar 09 14:28:53 crc kubenswrapper[4722]: E0309 14:28:53.979908 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08c4782a-e6b2-459f-94ff-a215e159f733" containerName="ceilometer-notification-agent" Mar 09 14:28:53 crc kubenswrapper[4722]: I0309 14:28:53.979919 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="08c4782a-e6b2-459f-94ff-a215e159f733" containerName="ceilometer-notification-agent" Mar 09 14:28:53 crc kubenswrapper[4722]: E0309 14:28:53.979929 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08c4782a-e6b2-459f-94ff-a215e159f733" containerName="proxy-httpd" Mar 09 14:28:53 crc kubenswrapper[4722]: I0309 14:28:53.979934 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="08c4782a-e6b2-459f-94ff-a215e159f733" containerName="proxy-httpd" Mar 09 14:28:53 crc kubenswrapper[4722]: E0309 14:28:53.979960 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c58393a-f27d-4e46-87ac-352b1370c295" containerName="nova-api-log" Mar 09 14:28:53 crc kubenswrapper[4722]: I0309 14:28:53.979965 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c58393a-f27d-4e46-87ac-352b1370c295" containerName="nova-api-log" Mar 09 14:28:53 crc kubenswrapper[4722]: E0309 14:28:53.979991 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08c4782a-e6b2-459f-94ff-a215e159f733" containerName="sg-core" Mar 09 14:28:53 crc kubenswrapper[4722]: I0309 14:28:53.979997 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="08c4782a-e6b2-459f-94ff-a215e159f733" containerName="sg-core" Mar 09 14:28:53 crc kubenswrapper[4722]: I0309 14:28:53.980195 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="08c4782a-e6b2-459f-94ff-a215e159f733" containerName="proxy-httpd" Mar 09 14:28:53 crc kubenswrapper[4722]: I0309 14:28:53.980224 4722 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="08c4782a-e6b2-459f-94ff-a215e159f733" containerName="ceilometer-central-agent" Mar 09 14:28:53 crc kubenswrapper[4722]: I0309 14:28:53.980235 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="08c4782a-e6b2-459f-94ff-a215e159f733" containerName="ceilometer-notification-agent" Mar 09 14:28:53 crc kubenswrapper[4722]: I0309 14:28:53.980253 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="08c4782a-e6b2-459f-94ff-a215e159f733" containerName="sg-core" Mar 09 14:28:53 crc kubenswrapper[4722]: I0309 14:28:53.980260 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c58393a-f27d-4e46-87ac-352b1370c295" containerName="nova-api-api" Mar 09 14:28:53 crc kubenswrapper[4722]: I0309 14:28:53.980278 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c58393a-f27d-4e46-87ac-352b1370c295" containerName="nova-api-log" Mar 09 14:28:53 crc kubenswrapper[4722]: I0309 14:28:53.982353 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 14:28:53 crc kubenswrapper[4722]: I0309 14:28:53.986553 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 09 14:28:53 crc kubenswrapper[4722]: I0309 14:28:53.987943 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 09 14:28:53 crc kubenswrapper[4722]: I0309 14:28:53.996953 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 09 14:28:53 crc kubenswrapper[4722]: I0309 14:28:53.997931 4722 scope.go:117] "RemoveContainer" containerID="15785102d818b457b8b65ef39cc7363deeacfce8e259b9b0591aa912e605ce8a" Mar 09 14:28:54 crc kubenswrapper[4722]: I0309 14:28:54.010363 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 14:28:54 crc kubenswrapper[4722]: I0309 14:28:54.021303 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 09 14:28:54 crc kubenswrapper[4722]: I0309 14:28:54.033348 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 09 14:28:54 crc kubenswrapper[4722]: I0309 14:28:54.035664 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 09 14:28:54 crc kubenswrapper[4722]: I0309 14:28:54.037992 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 09 14:28:54 crc kubenswrapper[4722]: I0309 14:28:54.038078 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 09 14:28:54 crc kubenswrapper[4722]: I0309 14:28:54.039100 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 09 14:28:54 crc kubenswrapper[4722]: I0309 14:28:54.044750 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 09 14:28:54 crc kubenswrapper[4722]: I0309 14:28:54.097322 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d791fb7b-d6f9-40c5-97e2-c306a91059f0-run-httpd\") pod \"ceilometer-0\" (UID: \"d791fb7b-d6f9-40c5-97e2-c306a91059f0\") " pod="openstack/ceilometer-0" Mar 09 14:28:54 crc kubenswrapper[4722]: I0309 14:28:54.097375 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfjc4\" (UniqueName: \"kubernetes.io/projected/d791fb7b-d6f9-40c5-97e2-c306a91059f0-kube-api-access-vfjc4\") pod \"ceilometer-0\" (UID: \"d791fb7b-d6f9-40c5-97e2-c306a91059f0\") " pod="openstack/ceilometer-0" Mar 09 14:28:54 crc kubenswrapper[4722]: I0309 14:28:54.097489 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d791fb7b-d6f9-40c5-97e2-c306a91059f0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d791fb7b-d6f9-40c5-97e2-c306a91059f0\") " pod="openstack/ceilometer-0" Mar 09 14:28:54 crc kubenswrapper[4722]: I0309 14:28:54.097555 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d791fb7b-d6f9-40c5-97e2-c306a91059f0-scripts\") pod \"ceilometer-0\" (UID: \"d791fb7b-d6f9-40c5-97e2-c306a91059f0\") " pod="openstack/ceilometer-0" Mar 09 14:28:54 crc kubenswrapper[4722]: I0309 14:28:54.097583 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d791fb7b-d6f9-40c5-97e2-c306a91059f0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d791fb7b-d6f9-40c5-97e2-c306a91059f0\") " pod="openstack/ceilometer-0" Mar 09 14:28:54 crc kubenswrapper[4722]: I0309 14:28:54.097607 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d791fb7b-d6f9-40c5-97e2-c306a91059f0-log-httpd\") pod \"ceilometer-0\" (UID: \"d791fb7b-d6f9-40c5-97e2-c306a91059f0\") " pod="openstack/ceilometer-0" Mar 09 14:28:54 crc kubenswrapper[4722]: I0309 14:28:54.097638 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d791fb7b-d6f9-40c5-97e2-c306a91059f0-config-data\") pod \"ceilometer-0\" (UID: \"d791fb7b-d6f9-40c5-97e2-c306a91059f0\") " pod="openstack/ceilometer-0" Mar 09 14:28:54 crc kubenswrapper[4722]: I0309 14:28:54.129901 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 09 14:28:54 crc kubenswrapper[4722]: I0309 14:28:54.129939 4722 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 09 14:28:54 crc kubenswrapper[4722]: I0309 14:28:54.163380 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08c4782a-e6b2-459f-94ff-a215e159f733" path="/var/lib/kubelet/pods/08c4782a-e6b2-459f-94ff-a215e159f733/volumes" Mar 09 14:28:54 crc kubenswrapper[4722]: I0309 14:28:54.164136 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c58393a-f27d-4e46-87ac-352b1370c295" path="/var/lib/kubelet/pods/9c58393a-f27d-4e46-87ac-352b1370c295/volumes" Mar 09 14:28:54 crc kubenswrapper[4722]: I0309 14:28:54.199748 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d791fb7b-d6f9-40c5-97e2-c306a91059f0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d791fb7b-d6f9-40c5-97e2-c306a91059f0\") " pod="openstack/ceilometer-0" Mar 09 14:28:54 crc kubenswrapper[4722]: I0309 14:28:54.199810 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff90b60b-bdfe-41ca-b835-b911cec10a8f-logs\") pod \"nova-api-0\" (UID: \"ff90b60b-bdfe-41ca-b835-b911cec10a8f\") " pod="openstack/nova-api-0" Mar 09 14:28:54 crc kubenswrapper[4722]: I0309 14:28:54.199865 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d791fb7b-d6f9-40c5-97e2-c306a91059f0-scripts\") pod \"ceilometer-0\" (UID: \"d791fb7b-d6f9-40c5-97e2-c306a91059f0\") " pod="openstack/ceilometer-0" Mar 09 14:28:54 crc kubenswrapper[4722]: I0309 14:28:54.199881 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff90b60b-bdfe-41ca-b835-b911cec10a8f-config-data\") pod \"nova-api-0\" (UID: \"ff90b60b-bdfe-41ca-b835-b911cec10a8f\") " pod="openstack/nova-api-0" Mar 09 14:28:54 crc kubenswrapper[4722]: I0309 14:28:54.199904 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d791fb7b-d6f9-40c5-97e2-c306a91059f0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d791fb7b-d6f9-40c5-97e2-c306a91059f0\") " pod="openstack/ceilometer-0" Mar 09 14:28:54 crc kubenswrapper[4722]: I0309 14:28:54.199924 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d791fb7b-d6f9-40c5-97e2-c306a91059f0-log-httpd\") pod \"ceilometer-0\" (UID: \"d791fb7b-d6f9-40c5-97e2-c306a91059f0\") " pod="openstack/ceilometer-0" Mar 09 14:28:54 crc kubenswrapper[4722]: I0309 14:28:54.199953 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d791fb7b-d6f9-40c5-97e2-c306a91059f0-config-data\") pod \"ceilometer-0\" (UID: \"d791fb7b-d6f9-40c5-97e2-c306a91059f0\") " pod="openstack/ceilometer-0" Mar 09 14:28:54 crc kubenswrapper[4722]: I0309 14:28:54.200012 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff90b60b-bdfe-41ca-b835-b911cec10a8f-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ff90b60b-bdfe-41ca-b835-b911cec10a8f\") " pod="openstack/nova-api-0" Mar 09 14:28:54 crc kubenswrapper[4722]: I0309 14:28:54.200036 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d791fb7b-d6f9-40c5-97e2-c306a91059f0-run-httpd\") pod \"ceilometer-0\" (UID: \"d791fb7b-d6f9-40c5-97e2-c306a91059f0\") " pod="openstack/ceilometer-0" Mar 09 14:28:54 crc kubenswrapper[4722]: I0309 14:28:54.200058 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfjc4\" (UniqueName: \"kubernetes.io/projected/d791fb7b-d6f9-40c5-97e2-c306a91059f0-kube-api-access-vfjc4\") pod \"ceilometer-0\" (UID: \"d791fb7b-d6f9-40c5-97e2-c306a91059f0\") " pod="openstack/ceilometer-0" Mar 09 14:28:54 crc kubenswrapper[4722]: I0309 14:28:54.200083 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff90b60b-bdfe-41ca-b835-b911cec10a8f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ff90b60b-bdfe-41ca-b835-b911cec10a8f\") " pod="openstack/nova-api-0" Mar 09 14:28:54 crc kubenswrapper[4722]: I0309 14:28:54.200100 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff90b60b-bdfe-41ca-b835-b911cec10a8f-public-tls-certs\") pod \"nova-api-0\" (UID: \"ff90b60b-bdfe-41ca-b835-b911cec10a8f\") " pod="openstack/nova-api-0" Mar 09 14:28:54 crc kubenswrapper[4722]: I0309 14:28:54.200136 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czxwv\" (UniqueName: \"kubernetes.io/projected/ff90b60b-bdfe-41ca-b835-b911cec10a8f-kube-api-access-czxwv\") pod \"nova-api-0\" (UID: \"ff90b60b-bdfe-41ca-b835-b911cec10a8f\") " pod="openstack/nova-api-0" Mar 09 14:28:54 crc kubenswrapper[4722]: I0309 14:28:54.201487 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d791fb7b-d6f9-40c5-97e2-c306a91059f0-log-httpd\") pod \"ceilometer-0\" (UID: \"d791fb7b-d6f9-40c5-97e2-c306a91059f0\") " pod="openstack/ceilometer-0" Mar 09 14:28:54 crc kubenswrapper[4722]: I0309 14:28:54.201556 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 09 14:28:54 crc kubenswrapper[4722]: I0309 14:28:54.201703 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d791fb7b-d6f9-40c5-97e2-c306a91059f0-run-httpd\") pod \"ceilometer-0\" (UID: \"d791fb7b-d6f9-40c5-97e2-c306a91059f0\") " pod="openstack/ceilometer-0" Mar 09 14:28:54 crc kubenswrapper[4722]: I0309 14:28:54.206371 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d791fb7b-d6f9-40c5-97e2-c306a91059f0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d791fb7b-d6f9-40c5-97e2-c306a91059f0\") " pod="openstack/ceilometer-0" Mar 09 14:28:54 crc kubenswrapper[4722]: I0309 14:28:54.206810 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d791fb7b-d6f9-40c5-97e2-c306a91059f0-scripts\") pod \"ceilometer-0\" (UID: \"d791fb7b-d6f9-40c5-97e2-c306a91059f0\") " pod="openstack/ceilometer-0" Mar 09 14:28:54 crc kubenswrapper[4722]: I0309 14:28:54.207146 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d791fb7b-d6f9-40c5-97e2-c306a91059f0-config-data\") pod 
\"ceilometer-0\" (UID: \"d791fb7b-d6f9-40c5-97e2-c306a91059f0\") " pod="openstack/ceilometer-0" Mar 09 14:28:54 crc kubenswrapper[4722]: I0309 14:28:54.207224 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d791fb7b-d6f9-40c5-97e2-c306a91059f0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d791fb7b-d6f9-40c5-97e2-c306a91059f0\") " pod="openstack/ceilometer-0" Mar 09 14:28:54 crc kubenswrapper[4722]: I0309 14:28:54.219957 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfjc4\" (UniqueName: \"kubernetes.io/projected/d791fb7b-d6f9-40c5-97e2-c306a91059f0-kube-api-access-vfjc4\") pod \"ceilometer-0\" (UID: \"d791fb7b-d6f9-40c5-97e2-c306a91059f0\") " pod="openstack/ceilometer-0" Mar 09 14:28:54 crc kubenswrapper[4722]: I0309 14:28:54.222974 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 09 14:28:54 crc kubenswrapper[4722]: I0309 14:28:54.302080 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czxwv\" (UniqueName: \"kubernetes.io/projected/ff90b60b-bdfe-41ca-b835-b911cec10a8f-kube-api-access-czxwv\") pod \"nova-api-0\" (UID: \"ff90b60b-bdfe-41ca-b835-b911cec10a8f\") " pod="openstack/nova-api-0" Mar 09 14:28:54 crc kubenswrapper[4722]: I0309 14:28:54.302271 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff90b60b-bdfe-41ca-b835-b911cec10a8f-logs\") pod \"nova-api-0\" (UID: \"ff90b60b-bdfe-41ca-b835-b911cec10a8f\") " pod="openstack/nova-api-0" Mar 09 14:28:54 crc kubenswrapper[4722]: I0309 14:28:54.302375 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff90b60b-bdfe-41ca-b835-b911cec10a8f-config-data\") pod \"nova-api-0\" (UID: \"ff90b60b-bdfe-41ca-b835-b911cec10a8f\") " pod="openstack/nova-api-0" Mar 09 14:28:54 crc kubenswrapper[4722]: I0309 14:28:54.302540 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff90b60b-bdfe-41ca-b835-b911cec10a8f-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ff90b60b-bdfe-41ca-b835-b911cec10a8f\") " pod="openstack/nova-api-0" Mar 09 14:28:54 crc kubenswrapper[4722]: I0309 14:28:54.302610 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff90b60b-bdfe-41ca-b835-b911cec10a8f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ff90b60b-bdfe-41ca-b835-b911cec10a8f\") " pod="openstack/nova-api-0" Mar 09 14:28:54 crc kubenswrapper[4722]: I0309 14:28:54.302635 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff90b60b-bdfe-41ca-b835-b911cec10a8f-public-tls-certs\") pod \"nova-api-0\" (UID: \"ff90b60b-bdfe-41ca-b835-b911cec10a8f\") " pod="openstack/nova-api-0" Mar 09 14:28:54 crc kubenswrapper[4722]: I0309 14:28:54.302647 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff90b60b-bdfe-41ca-b835-b911cec10a8f-logs\") pod \"nova-api-0\" (UID: \"ff90b60b-bdfe-41ca-b835-b911cec10a8f\") " pod="openstack/nova-api-0" Mar 09 14:28:54 crc kubenswrapper[4722]: I0309 14:28:54.306403 4722 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff90b60b-bdfe-41ca-b835-b911cec10a8f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ff90b60b-bdfe-41ca-b835-b911cec10a8f\") " pod="openstack/nova-api-0" Mar 09 14:28:54 crc kubenswrapper[4722]: I0309 14:28:54.306560 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff90b60b-bdfe-41ca-b835-b911cec10a8f-public-tls-certs\") pod \"nova-api-0\" (UID: \"ff90b60b-bdfe-41ca-b835-b911cec10a8f\") " pod="openstack/nova-api-0" Mar 09 14:28:54 crc kubenswrapper[4722]: I0309 14:28:54.306673 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff90b60b-bdfe-41ca-b835-b911cec10a8f-config-data\") pod \"nova-api-0\" (UID: \"ff90b60b-bdfe-41ca-b835-b911cec10a8f\") " pod="openstack/nova-api-0" Mar 09 14:28:54 crc kubenswrapper[4722]: I0309 14:28:54.306985 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff90b60b-bdfe-41ca-b835-b911cec10a8f-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ff90b60b-bdfe-41ca-b835-b911cec10a8f\") " pod="openstack/nova-api-0" Mar 09 14:28:54 crc kubenswrapper[4722]: I0309 14:28:54.316539 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 14:28:54 crc kubenswrapper[4722]: I0309 14:28:54.318975 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czxwv\" (UniqueName: \"kubernetes.io/projected/ff90b60b-bdfe-41ca-b835-b911cec10a8f-kube-api-access-czxwv\") pod \"nova-api-0\" (UID: \"ff90b60b-bdfe-41ca-b835-b911cec10a8f\") " pod="openstack/nova-api-0" Mar 09 14:28:54 crc kubenswrapper[4722]: I0309 14:28:54.359544 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 09 14:28:54 crc kubenswrapper[4722]: I0309 14:28:54.894752 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 14:28:54 crc kubenswrapper[4722]: I0309 14:28:54.922749 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 09 14:28:55 crc kubenswrapper[4722]: I0309 14:28:55.021420 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 09 14:28:55 crc kubenswrapper[4722]: I0309 14:28:55.144390 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="b58c3f1a-21b8-4a75-bfb5-970122e20db6" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.4:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 14:28:55 crc kubenswrapper[4722]: I0309 14:28:55.144430 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="b58c3f1a-21b8-4a75-bfb5-970122e20db6" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.4:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 14:28:55 crc kubenswrapper[4722]: I0309 14:28:55.187573 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-rll7l"] Mar 09 14:28:55 crc kubenswrapper[4722]: I0309 14:28:55.189407 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-rll7l" Mar 09 14:28:55 crc kubenswrapper[4722]: I0309 14:28:55.191664 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 09 14:28:55 crc kubenswrapper[4722]: I0309 14:28:55.193259 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 09 14:28:55 crc kubenswrapper[4722]: I0309 14:28:55.208989 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-rll7l"] Mar 09 14:28:55 crc kubenswrapper[4722]: I0309 14:28:55.332181 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25a37fbc-0819-4a2b-b782-d566e3f765d7-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-rll7l\" (UID: \"25a37fbc-0819-4a2b-b782-d566e3f765d7\") " pod="openstack/nova-cell1-cell-mapping-rll7l" Mar 09 14:28:55 crc kubenswrapper[4722]: I0309 14:28:55.332480 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25a37fbc-0819-4a2b-b782-d566e3f765d7-config-data\") pod \"nova-cell1-cell-mapping-rll7l\" (UID: \"25a37fbc-0819-4a2b-b782-d566e3f765d7\") " pod="openstack/nova-cell1-cell-mapping-rll7l" Mar 09 14:28:55 crc kubenswrapper[4722]: I0309 14:28:55.332594 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gc55d\" (UniqueName: \"kubernetes.io/projected/25a37fbc-0819-4a2b-b782-d566e3f765d7-kube-api-access-gc55d\") pod \"nova-cell1-cell-mapping-rll7l\" (UID: \"25a37fbc-0819-4a2b-b782-d566e3f765d7\") " pod="openstack/nova-cell1-cell-mapping-rll7l" Mar 09 14:28:55 crc kubenswrapper[4722]: I0309 14:28:55.332671 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25a37fbc-0819-4a2b-b782-d566e3f765d7-scripts\") pod \"nova-cell1-cell-mapping-rll7l\" (UID: \"25a37fbc-0819-4a2b-b782-d566e3f765d7\") " pod="openstack/nova-cell1-cell-mapping-rll7l" Mar 09 14:28:55 crc kubenswrapper[4722]: I0309 14:28:55.435763 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25a37fbc-0819-4a2b-b782-d566e3f765d7-config-data\") pod \"nova-cell1-cell-mapping-rll7l\" (UID: \"25a37fbc-0819-4a2b-b782-d566e3f765d7\") " pod="openstack/nova-cell1-cell-mapping-rll7l" Mar 09 14:28:55 crc kubenswrapper[4722]: I0309 14:28:55.435852 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gc55d\" (UniqueName: \"kubernetes.io/projected/25a37fbc-0819-4a2b-b782-d566e3f765d7-kube-api-access-gc55d\") pod \"nova-cell1-cell-mapping-rll7l\" (UID: \"25a37fbc-0819-4a2b-b782-d566e3f765d7\") " pod="openstack/nova-cell1-cell-mapping-rll7l" Mar 09 14:28:55 crc kubenswrapper[4722]: I0309 14:28:55.435907 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25a37fbc-0819-4a2b-b782-d566e3f765d7-scripts\") pod \"nova-cell1-cell-mapping-rll7l\" (UID: \"25a37fbc-0819-4a2b-b782-d566e3f765d7\") " pod="openstack/nova-cell1-cell-mapping-rll7l" Mar 09 14:28:55 crc kubenswrapper[4722]: I0309 14:28:55.436094 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/25a37fbc-0819-4a2b-b782-d566e3f765d7-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-rll7l\" (UID: \"25a37fbc-0819-4a2b-b782-d566e3f765d7\") " pod="openstack/nova-cell1-cell-mapping-rll7l" Mar 09 14:28:55 crc kubenswrapper[4722]: I0309 14:28:55.442038 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25a37fbc-0819-4a2b-b782-d566e3f765d7-config-data\") pod \"nova-cell1-cell-mapping-rll7l\" (UID: \"25a37fbc-0819-4a2b-b782-d566e3f765d7\") " pod="openstack/nova-cell1-cell-mapping-rll7l" Mar 09 14:28:55 crc kubenswrapper[4722]: I0309 14:28:55.443817 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25a37fbc-0819-4a2b-b782-d566e3f765d7-scripts\") pod \"nova-cell1-cell-mapping-rll7l\" (UID: \"25a37fbc-0819-4a2b-b782-d566e3f765d7\") " pod="openstack/nova-cell1-cell-mapping-rll7l" Mar 09 14:28:55 crc kubenswrapper[4722]: I0309 14:28:55.447898 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25a37fbc-0819-4a2b-b782-d566e3f765d7-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-rll7l\" (UID: \"25a37fbc-0819-4a2b-b782-d566e3f765d7\") " pod="openstack/nova-cell1-cell-mapping-rll7l" Mar 09 14:28:55 crc kubenswrapper[4722]: I0309 14:28:55.456430 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gc55d\" (UniqueName: \"kubernetes.io/projected/25a37fbc-0819-4a2b-b782-d566e3f765d7-kube-api-access-gc55d\") pod \"nova-cell1-cell-mapping-rll7l\" (UID: \"25a37fbc-0819-4a2b-b782-d566e3f765d7\") " pod="openstack/nova-cell1-cell-mapping-rll7l" Mar 09 14:28:55 crc kubenswrapper[4722]: I0309 14:28:55.510039 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-rll7l" Mar 09 14:28:55 crc kubenswrapper[4722]: I0309 14:28:55.914385 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d791fb7b-d6f9-40c5-97e2-c306a91059f0","Type":"ContainerStarted","Data":"0994e76809996dc7b440acf904594d46c2893329840b9b3b1aa81f3e949aba94"} Mar 09 14:28:55 crc kubenswrapper[4722]: I0309 14:28:55.914654 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d791fb7b-d6f9-40c5-97e2-c306a91059f0","Type":"ContainerStarted","Data":"c6f6ee38a3c6e5345aae2d72f7d934cd2d719cbeeeaa8b67bd428ffecc7aa0d9"} Mar 09 14:28:55 crc kubenswrapper[4722]: I0309 14:28:55.918118 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ff90b60b-bdfe-41ca-b835-b911cec10a8f","Type":"ContainerStarted","Data":"b6a9dcfb7b1ff1a237bf93656d6cacdcbb190251911cbe42a6665b553842c150"} Mar 09 14:28:55 crc kubenswrapper[4722]: I0309 14:28:55.918150 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ff90b60b-bdfe-41ca-b835-b911cec10a8f","Type":"ContainerStarted","Data":"6d1c7a52c957bb592000666b12ec3180ee75d47bc71576eff7f609bf0eae7457"} Mar 09 14:28:55 crc kubenswrapper[4722]: I0309 14:28:55.918325 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ff90b60b-bdfe-41ca-b835-b911cec10a8f","Type":"ContainerStarted","Data":"d57a3b2667c0d5e27f9e6a085a4ad26fadd28a884a3dbf14c55bc9f5b7d59011"} Mar 09 14:28:55 crc kubenswrapper[4722]: I0309 14:28:55.941954 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.941933235 podStartE2EDuration="2.941933235s" podCreationTimestamp="2026-03-09 14:28:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:28:55.941411691 +0000 UTC m=+1576.496980267" watchObservedRunningTime="2026-03-09 14:28:55.941933235 +0000 UTC m=+1576.497501801" Mar 09 14:28:56 crc kubenswrapper[4722]: I0309 14:28:56.017125 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-rll7l"] Mar 09 14:28:56 crc kubenswrapper[4722]: I0309 14:28:56.422176 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b7bbf7cf9-k62sg" Mar 09 14:28:56 crc kubenswrapper[4722]: I0309 14:28:56.500655 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-s8lxr"] Mar 09 14:28:56 crc kubenswrapper[4722]: I0309 14:28:56.500938 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-9b86998b5-s8lxr" podUID="181d67a9-9459-4457-a22f-ba515776d23c" containerName="dnsmasq-dns" containerID="cri-o://5b641acbb31e09da34d87a916b159a499bec718235322dd7e3894c41b3d59c30" gracePeriod=10 Mar 09 14:28:56 crc kubenswrapper[4722]: I0309 14:28:56.595314 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-9b86998b5-s8lxr" podUID="181d67a9-9459-4457-a22f-ba515776d23c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.250:5353: connect: connection refused" Mar 09 14:28:56 crc kubenswrapper[4722]: I0309 14:28:56.979321 4722 generic.go:334] "Generic (PLEG): container finished" podID="181d67a9-9459-4457-a22f-ba515776d23c" containerID="5b641acbb31e09da34d87a916b159a499bec718235322dd7e3894c41b3d59c30" 
exitCode=0 Mar 09 14:28:56 crc kubenswrapper[4722]: I0309 14:28:56.979418 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-s8lxr" event={"ID":"181d67a9-9459-4457-a22f-ba515776d23c","Type":"ContainerDied","Data":"5b641acbb31e09da34d87a916b159a499bec718235322dd7e3894c41b3d59c30"} Mar 09 14:28:57 crc kubenswrapper[4722]: I0309 14:28:57.002587 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-rll7l" event={"ID":"25a37fbc-0819-4a2b-b782-d566e3f765d7","Type":"ContainerStarted","Data":"8df963eaf8b6f3a867ce207588dc0c93e76df2b1e21bf269210fef574c2aaf34"} Mar 09 14:28:57 crc kubenswrapper[4722]: I0309 14:28:57.002628 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-rll7l" event={"ID":"25a37fbc-0819-4a2b-b782-d566e3f765d7","Type":"ContainerStarted","Data":"04afb94ceaa27e80c7e901e3d1121f253b83a61b31be74934ec054faf6b93ab5"} Mar 09 14:28:57 crc kubenswrapper[4722]: I0309 14:28:57.216591 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9b86998b5-s8lxr" Mar 09 14:28:57 crc kubenswrapper[4722]: I0309 14:28:57.241871 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-rll7l" podStartSLOduration=2.241852033 podStartE2EDuration="2.241852033s" podCreationTimestamp="2026-03-09 14:28:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:28:57.048249393 +0000 UTC m=+1577.603817969" watchObservedRunningTime="2026-03-09 14:28:57.241852033 +0000 UTC m=+1577.797420609" Mar 09 14:28:57 crc kubenswrapper[4722]: I0309 14:28:57.310652 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/181d67a9-9459-4457-a22f-ba515776d23c-ovsdbserver-sb\") pod \"181d67a9-9459-4457-a22f-ba515776d23c\" (UID: \"181d67a9-9459-4457-a22f-ba515776d23c\") " Mar 09 14:28:57 crc kubenswrapper[4722]: I0309 14:28:57.310739 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/181d67a9-9459-4457-a22f-ba515776d23c-ovsdbserver-nb\") pod \"181d67a9-9459-4457-a22f-ba515776d23c\" (UID: \"181d67a9-9459-4457-a22f-ba515776d23c\") " Mar 09 14:28:57 crc kubenswrapper[4722]: I0309 14:28:57.310763 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/181d67a9-9459-4457-a22f-ba515776d23c-dns-svc\") pod \"181d67a9-9459-4457-a22f-ba515776d23c\" (UID: \"181d67a9-9459-4457-a22f-ba515776d23c\") " Mar 09 14:28:57 crc kubenswrapper[4722]: I0309 14:28:57.310840 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xslrb\" (UniqueName: \"kubernetes.io/projected/181d67a9-9459-4457-a22f-ba515776d23c-kube-api-access-xslrb\") pod \"181d67a9-9459-4457-a22f-ba515776d23c\" (UID: \"181d67a9-9459-4457-a22f-ba515776d23c\") " Mar 09 14:28:57 crc kubenswrapper[4722]: I0309 14:28:57.310873 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/181d67a9-9459-4457-a22f-ba515776d23c-dns-swift-storage-0\") pod \"181d67a9-9459-4457-a22f-ba515776d23c\" (UID: \"181d67a9-9459-4457-a22f-ba515776d23c\") " Mar 09 14:28:57 crc kubenswrapper[4722]: I0309 
14:28:57.310970 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/181d67a9-9459-4457-a22f-ba515776d23c-config\") pod \"181d67a9-9459-4457-a22f-ba515776d23c\" (UID: \"181d67a9-9459-4457-a22f-ba515776d23c\") " Mar 09 14:28:57 crc kubenswrapper[4722]: I0309 14:28:57.319053 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/181d67a9-9459-4457-a22f-ba515776d23c-kube-api-access-xslrb" (OuterVolumeSpecName: "kube-api-access-xslrb") pod "181d67a9-9459-4457-a22f-ba515776d23c" (UID: "181d67a9-9459-4457-a22f-ba515776d23c"). InnerVolumeSpecName "kube-api-access-xslrb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:28:57 crc kubenswrapper[4722]: I0309 14:28:57.375014 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/181d67a9-9459-4457-a22f-ba515776d23c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "181d67a9-9459-4457-a22f-ba515776d23c" (UID: "181d67a9-9459-4457-a22f-ba515776d23c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:28:57 crc kubenswrapper[4722]: I0309 14:28:57.390041 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/181d67a9-9459-4457-a22f-ba515776d23c-config" (OuterVolumeSpecName: "config") pod "181d67a9-9459-4457-a22f-ba515776d23c" (UID: "181d67a9-9459-4457-a22f-ba515776d23c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:28:57 crc kubenswrapper[4722]: I0309 14:28:57.394454 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/181d67a9-9459-4457-a22f-ba515776d23c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "181d67a9-9459-4457-a22f-ba515776d23c" (UID: "181d67a9-9459-4457-a22f-ba515776d23c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:28:57 crc kubenswrapper[4722]: I0309 14:28:57.414817 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/181d67a9-9459-4457-a22f-ba515776d23c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 09 14:28:57 crc kubenswrapper[4722]: I0309 14:28:57.414858 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xslrb\" (UniqueName: \"kubernetes.io/projected/181d67a9-9459-4457-a22f-ba515776d23c-kube-api-access-xslrb\") on node \"crc\" DevicePath \"\"" Mar 09 14:28:57 crc kubenswrapper[4722]: I0309 14:28:57.414878 4722 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/181d67a9-9459-4457-a22f-ba515776d23c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 09 14:28:57 crc kubenswrapper[4722]: I0309 14:28:57.414893 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/181d67a9-9459-4457-a22f-ba515776d23c-config\") on node \"crc\" DevicePath \"\"" Mar 09 14:28:57 crc kubenswrapper[4722]: I0309 14:28:57.421738 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/181d67a9-9459-4457-a22f-ba515776d23c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "181d67a9-9459-4457-a22f-ba515776d23c" (UID: "181d67a9-9459-4457-a22f-ba515776d23c"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:28:57 crc kubenswrapper[4722]: I0309 14:28:57.427321 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/181d67a9-9459-4457-a22f-ba515776d23c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "181d67a9-9459-4457-a22f-ba515776d23c" (UID: "181d67a9-9459-4457-a22f-ba515776d23c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:28:57 crc kubenswrapper[4722]: I0309 14:28:57.517302 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/181d67a9-9459-4457-a22f-ba515776d23c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 09 14:28:57 crc kubenswrapper[4722]: I0309 14:28:57.517335 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/181d67a9-9459-4457-a22f-ba515776d23c-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 14:28:58 crc kubenswrapper[4722]: I0309 14:28:58.015025 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9b86998b5-s8lxr" Mar 09 14:28:58 crc kubenswrapper[4722]: I0309 14:28:58.015025 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-s8lxr" event={"ID":"181d67a9-9459-4457-a22f-ba515776d23c","Type":"ContainerDied","Data":"444e390d24fa4e9b47fc85371b7faa0854056a86e3941d747490fcc3082da9b7"} Mar 09 14:28:58 crc kubenswrapper[4722]: I0309 14:28:58.015486 4722 scope.go:117] "RemoveContainer" containerID="5b641acbb31e09da34d87a916b159a499bec718235322dd7e3894c41b3d59c30" Mar 09 14:28:58 crc kubenswrapper[4722]: I0309 14:28:58.017973 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d791fb7b-d6f9-40c5-97e2-c306a91059f0","Type":"ContainerStarted","Data":"3d69b67b5c4eb8b6756a282ac7b9c516b6ced6c1227a394399484cc65a0277fe"} Mar 09 14:28:58 crc kubenswrapper[4722]: I0309 14:28:58.018005 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d791fb7b-d6f9-40c5-97e2-c306a91059f0","Type":"ContainerStarted","Data":"3e7961719fd0f860de9af5fcedcd4ba6553fdc5c1b4547e6b414a69456a8d0bd"} Mar 09 14:28:58 crc kubenswrapper[4722]: I0309 14:28:58.059580 4722 scope.go:117] "RemoveContainer" containerID="2aa35b1c19c07721a2680d9e41fc1cf9fc0342153c3d37d5aa637a1d689f054b" Mar 09 14:28:58 crc kubenswrapper[4722]: I0309 14:28:58.069584 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-s8lxr"] Mar 09 14:28:58 crc kubenswrapper[4722]: I0309 14:28:58.090581 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-s8lxr"] Mar 09 14:28:58 crc kubenswrapper[4722]: I0309 14:28:58.167053 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="181d67a9-9459-4457-a22f-ba515776d23c" path="/var/lib/kubelet/pods/181d67a9-9459-4457-a22f-ba515776d23c/volumes" Mar 09 14:29:00 crc kubenswrapper[4722]: I0309 14:29:00.045711 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d791fb7b-d6f9-40c5-97e2-c306a91059f0","Type":"ContainerStarted","Data":"f786b00170a945de94d0f4f3fa7223c609b5a257826d9fde4fd1d13c213df85b"} Mar 09 14:29:00 crc kubenswrapper[4722]: I0309 14:29:00.046627 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 09 14:29:00 crc kubenswrapper[4722]: I0309 14:29:00.073073 
4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.3858329449999998 podStartE2EDuration="7.073055962s" podCreationTimestamp="2026-03-09 14:28:53 +0000 UTC" firstStartedPulling="2026-03-09 14:28:54.903459729 +0000 UTC m=+1575.459028305" lastFinishedPulling="2026-03-09 14:28:59.590682756 +0000 UTC m=+1580.146251322" observedRunningTime="2026-03-09 14:29:00.072927499 +0000 UTC m=+1580.628496085" watchObservedRunningTime="2026-03-09 14:29:00.073055962 +0000 UTC m=+1580.628624538" Mar 09 14:29:02 crc kubenswrapper[4722]: I0309 14:29:02.072245 4722 generic.go:334] "Generic (PLEG): container finished" podID="25a37fbc-0819-4a2b-b782-d566e3f765d7" containerID="8df963eaf8b6f3a867ce207588dc0c93e76df2b1e21bf269210fef574c2aaf34" exitCode=0 Mar 09 14:29:02 crc kubenswrapper[4722]: I0309 14:29:02.072347 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-rll7l" event={"ID":"25a37fbc-0819-4a2b-b782-d566e3f765d7","Type":"ContainerDied","Data":"8df963eaf8b6f3a867ce207588dc0c93e76df2b1e21bf269210fef574c2aaf34"} Mar 09 14:29:03 crc kubenswrapper[4722]: I0309 14:29:03.612726 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-rll7l" Mar 09 14:29:03 crc kubenswrapper[4722]: I0309 14:29:03.672809 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25a37fbc-0819-4a2b-b782-d566e3f765d7-config-data\") pod \"25a37fbc-0819-4a2b-b782-d566e3f765d7\" (UID: \"25a37fbc-0819-4a2b-b782-d566e3f765d7\") " Mar 09 14:29:03 crc kubenswrapper[4722]: I0309 14:29:03.673653 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25a37fbc-0819-4a2b-b782-d566e3f765d7-combined-ca-bundle\") pod \"25a37fbc-0819-4a2b-b782-d566e3f765d7\" (UID: \"25a37fbc-0819-4a2b-b782-d566e3f765d7\") " Mar 09 14:29:03 crc kubenswrapper[4722]: I0309 14:29:03.673710 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gc55d\" (UniqueName: \"kubernetes.io/projected/25a37fbc-0819-4a2b-b782-d566e3f765d7-kube-api-access-gc55d\") pod \"25a37fbc-0819-4a2b-b782-d566e3f765d7\" (UID: \"25a37fbc-0819-4a2b-b782-d566e3f765d7\") " Mar 09 14:29:03 crc kubenswrapper[4722]: I0309 14:29:03.673774 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25a37fbc-0819-4a2b-b782-d566e3f765d7-scripts\") pod \"25a37fbc-0819-4a2b-b782-d566e3f765d7\" (UID: \"25a37fbc-0819-4a2b-b782-d566e3f765d7\") " Mar 09 14:29:03 crc kubenswrapper[4722]: I0309 14:29:03.678603 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25a37fbc-0819-4a2b-b782-d566e3f765d7-scripts" (OuterVolumeSpecName: "scripts") pod "25a37fbc-0819-4a2b-b782-d566e3f765d7" (UID: "25a37fbc-0819-4a2b-b782-d566e3f765d7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:29:03 crc kubenswrapper[4722]: I0309 14:29:03.683165 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25a37fbc-0819-4a2b-b782-d566e3f765d7-kube-api-access-gc55d" (OuterVolumeSpecName: "kube-api-access-gc55d") pod "25a37fbc-0819-4a2b-b782-d566e3f765d7" (UID: "25a37fbc-0819-4a2b-b782-d566e3f765d7"). InnerVolumeSpecName "kube-api-access-gc55d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:29:03 crc kubenswrapper[4722]: I0309 14:29:03.708917 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25a37fbc-0819-4a2b-b782-d566e3f765d7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "25a37fbc-0819-4a2b-b782-d566e3f765d7" (UID: "25a37fbc-0819-4a2b-b782-d566e3f765d7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:29:03 crc kubenswrapper[4722]: I0309 14:29:03.719975 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25a37fbc-0819-4a2b-b782-d566e3f765d7-config-data" (OuterVolumeSpecName: "config-data") pod "25a37fbc-0819-4a2b-b782-d566e3f765d7" (UID: "25a37fbc-0819-4a2b-b782-d566e3f765d7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:29:03 crc kubenswrapper[4722]: I0309 14:29:03.776574 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25a37fbc-0819-4a2b-b782-d566e3f765d7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:29:03 crc kubenswrapper[4722]: I0309 14:29:03.776778 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gc55d\" (UniqueName: \"kubernetes.io/projected/25a37fbc-0819-4a2b-b782-d566e3f765d7-kube-api-access-gc55d\") on node \"crc\" DevicePath \"\"" Mar 09 14:29:03 crc kubenswrapper[4722]: I0309 14:29:03.776890 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25a37fbc-0819-4a2b-b782-d566e3f765d7-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 14:29:03 crc kubenswrapper[4722]: I0309 14:29:03.776947 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25a37fbc-0819-4a2b-b782-d566e3f765d7-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 14:29:04 crc kubenswrapper[4722]: I0309 14:29:04.097227 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-rll7l" event={"ID":"25a37fbc-0819-4a2b-b782-d566e3f765d7","Type":"ContainerDied","Data":"04afb94ceaa27e80c7e901e3d1121f253b83a61b31be74934ec054faf6b93ab5"} Mar 09 14:29:04 crc kubenswrapper[4722]: I0309 14:29:04.097590 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04afb94ceaa27e80c7e901e3d1121f253b83a61b31be74934ec054faf6b93ab5" Mar 09 14:29:04 crc kubenswrapper[4722]: I0309 14:29:04.097331 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-rll7l" Mar 09 14:29:04 crc kubenswrapper[4722]: I0309 14:29:04.145482 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 09 14:29:04 crc kubenswrapper[4722]: I0309 14:29:04.145916 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 09 14:29:04 crc kubenswrapper[4722]: I0309 14:29:04.179382 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 09 14:29:04 crc kubenswrapper[4722]: I0309 14:29:04.294922 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 09 14:29:04 crc kubenswrapper[4722]: I0309 14:29:04.295286 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="1fe185bd-3b13-48df-9ceb-df1ba08e277b" containerName="nova-scheduler-scheduler" containerID="cri-o://e181a3d1402638abb6a7fd52b392c6c4834488f8c071ebed96b611f31e146cdb" gracePeriod=30 Mar 09 14:29:04 crc kubenswrapper[4722]: I0309 14:29:04.315275 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 09 14:29:04 crc kubenswrapper[4722]: I0309 14:29:04.315649 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ff90b60b-bdfe-41ca-b835-b911cec10a8f" containerName="nova-api-log" containerID="cri-o://6d1c7a52c957bb592000666b12ec3180ee75d47bc71576eff7f609bf0eae7457" gracePeriod=30 Mar 09 14:29:04 crc kubenswrapper[4722]: I0309 14:29:04.316006 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ff90b60b-bdfe-41ca-b835-b911cec10a8f" containerName="nova-api-api" containerID="cri-o://b6a9dcfb7b1ff1a237bf93656d6cacdcbb190251911cbe42a6665b553842c150" gracePeriod=30 Mar 09 14:29:04 crc kubenswrapper[4722]: I0309 14:29:04.332141 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 14:29:04 crc kubenswrapper[4722]: I0309 14:29:04.941102 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 09 14:29:05 crc kubenswrapper[4722]: I0309 14:29:05.017365 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff90b60b-bdfe-41ca-b835-b911cec10a8f-logs\") pod \"ff90b60b-bdfe-41ca-b835-b911cec10a8f\" (UID: \"ff90b60b-bdfe-41ca-b835-b911cec10a8f\") " Mar 09 14:29:05 crc kubenswrapper[4722]: I0309 14:29:05.017456 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff90b60b-bdfe-41ca-b835-b911cec10a8f-public-tls-certs\") pod \"ff90b60b-bdfe-41ca-b835-b911cec10a8f\" (UID: \"ff90b60b-bdfe-41ca-b835-b911cec10a8f\") " Mar 09 14:29:05 crc kubenswrapper[4722]: I0309 14:29:05.017502 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff90b60b-bdfe-41ca-b835-b911cec10a8f-internal-tls-certs\") pod \"ff90b60b-bdfe-41ca-b835-b911cec10a8f\" (UID: \"ff90b60b-bdfe-41ca-b835-b911cec10a8f\") " Mar 09 14:29:05 crc kubenswrapper[4722]: I0309 14:29:05.017519 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff90b60b-bdfe-41ca-b835-b911cec10a8f-config-data\") pod \"ff90b60b-bdfe-41ca-b835-b911cec10a8f\" (UID: \"ff90b60b-bdfe-41ca-b835-b911cec10a8f\") " Mar 09 14:29:05 crc kubenswrapper[4722]: I0309 14:29:05.017548 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czxwv\" (UniqueName: \"kubernetes.io/projected/ff90b60b-bdfe-41ca-b835-b911cec10a8f-kube-api-access-czxwv\") pod \"ff90b60b-bdfe-41ca-b835-b911cec10a8f\" (UID: \"ff90b60b-bdfe-41ca-b835-b911cec10a8f\") " Mar 09 14:29:05 crc kubenswrapper[4722]: I0309 14:29:05.017598 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff90b60b-bdfe-41ca-b835-b911cec10a8f-combined-ca-bundle\") pod \"ff90b60b-bdfe-41ca-b835-b911cec10a8f\" (UID: \"ff90b60b-bdfe-41ca-b835-b911cec10a8f\") " Mar 09 14:29:05 crc kubenswrapper[4722]: I0309 14:29:05.018646 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff90b60b-bdfe-41ca-b835-b911cec10a8f-logs" (OuterVolumeSpecName: "logs") pod "ff90b60b-bdfe-41ca-b835-b911cec10a8f" (UID: "ff90b60b-bdfe-41ca-b835-b911cec10a8f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:29:05 crc kubenswrapper[4722]: I0309 14:29:05.024638 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff90b60b-bdfe-41ca-b835-b911cec10a8f-kube-api-access-czxwv" (OuterVolumeSpecName: "kube-api-access-czxwv") pod "ff90b60b-bdfe-41ca-b835-b911cec10a8f" (UID: "ff90b60b-bdfe-41ca-b835-b911cec10a8f"). InnerVolumeSpecName "kube-api-access-czxwv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:29:05 crc kubenswrapper[4722]: I0309 14:29:05.052990 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff90b60b-bdfe-41ca-b835-b911cec10a8f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ff90b60b-bdfe-41ca-b835-b911cec10a8f" (UID: "ff90b60b-bdfe-41ca-b835-b911cec10a8f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:29:05 crc kubenswrapper[4722]: I0309 14:29:05.069132 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff90b60b-bdfe-41ca-b835-b911cec10a8f-config-data" (OuterVolumeSpecName: "config-data") pod "ff90b60b-bdfe-41ca-b835-b911cec10a8f" (UID: "ff90b60b-bdfe-41ca-b835-b911cec10a8f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:29:05 crc kubenswrapper[4722]: I0309 14:29:05.084936 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff90b60b-bdfe-41ca-b835-b911cec10a8f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ff90b60b-bdfe-41ca-b835-b911cec10a8f" (UID: "ff90b60b-bdfe-41ca-b835-b911cec10a8f"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:29:05 crc kubenswrapper[4722]: I0309 14:29:05.085755 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff90b60b-bdfe-41ca-b835-b911cec10a8f-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ff90b60b-bdfe-41ca-b835-b911cec10a8f" (UID: "ff90b60b-bdfe-41ca-b835-b911cec10a8f"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:29:05 crc kubenswrapper[4722]: I0309 14:29:05.115905 4722 generic.go:334] "Generic (PLEG): container finished" podID="ff90b60b-bdfe-41ca-b835-b911cec10a8f" containerID="b6a9dcfb7b1ff1a237bf93656d6cacdcbb190251911cbe42a6665b553842c150" exitCode=0 Mar 09 14:29:05 crc kubenswrapper[4722]: I0309 14:29:05.115933 4722 generic.go:334] "Generic (PLEG): container finished" podID="ff90b60b-bdfe-41ca-b835-b911cec10a8f" containerID="6d1c7a52c957bb592000666b12ec3180ee75d47bc71576eff7f609bf0eae7457" exitCode=143 Mar 09 14:29:05 crc kubenswrapper[4722]: I0309 14:29:05.117381 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 09 14:29:05 crc kubenswrapper[4722]: I0309 14:29:05.117687 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ff90b60b-bdfe-41ca-b835-b911cec10a8f","Type":"ContainerDied","Data":"b6a9dcfb7b1ff1a237bf93656d6cacdcbb190251911cbe42a6665b553842c150"} Mar 09 14:29:05 crc kubenswrapper[4722]: I0309 14:29:05.117738 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ff90b60b-bdfe-41ca-b835-b911cec10a8f","Type":"ContainerDied","Data":"6d1c7a52c957bb592000666b12ec3180ee75d47bc71576eff7f609bf0eae7457"} Mar 09 14:29:05 crc kubenswrapper[4722]: I0309 14:29:05.117770 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ff90b60b-bdfe-41ca-b835-b911cec10a8f","Type":"ContainerDied","Data":"d57a3b2667c0d5e27f9e6a085a4ad26fadd28a884a3dbf14c55bc9f5b7d59011"} Mar 09 14:29:05 crc kubenswrapper[4722]: I0309 14:29:05.117788 4722 scope.go:117] "RemoveContainer" containerID="b6a9dcfb7b1ff1a237bf93656d6cacdcbb190251911cbe42a6665b553842c150" Mar 09 14:29:05 crc kubenswrapper[4722]: I0309 14:29:05.125219 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff90b60b-bdfe-41ca-b835-b911cec10a8f-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 14:29:05 crc kubenswrapper[4722]: I0309 14:29:05.125254 4722 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff90b60b-bdfe-41ca-b835-b911cec10a8f-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 14:29:05 crc kubenswrapper[4722]: I0309 14:29:05.125284 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czxwv\" (UniqueName: \"kubernetes.io/projected/ff90b60b-bdfe-41ca-b835-b911cec10a8f-kube-api-access-czxwv\") on node \"crc\" DevicePath \"\"" Mar 09 14:29:05 crc kubenswrapper[4722]: I0309 14:29:05.125299 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff90b60b-bdfe-41ca-b835-b911cec10a8f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:29:05 crc kubenswrapper[4722]: I0309 14:29:05.125311 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff90b60b-bdfe-41ca-b835-b911cec10a8f-logs\") on node \"crc\" DevicePath \"\"" Mar 09 14:29:05 crc kubenswrapper[4722]: I0309 14:29:05.125369 4722 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff90b60b-bdfe-41ca-b835-b911cec10a8f-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 14:29:05 crc kubenswrapper[4722]: I0309 14:29:05.127183 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 09 14:29:05 crc kubenswrapper[4722]: I0309 14:29:05.193398 4722 scope.go:117] "RemoveContainer" containerID="6d1c7a52c957bb592000666b12ec3180ee75d47bc71576eff7f609bf0eae7457" Mar 09 14:29:05 crc kubenswrapper[4722]: I0309 14:29:05.203764 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 09 14:29:05 crc kubenswrapper[4722]: I0309 14:29:05.216035 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 09 14:29:05 crc kubenswrapper[4722]: I0309 14:29:05.251959 4722 scope.go:117] "RemoveContainer" containerID="b6a9dcfb7b1ff1a237bf93656d6cacdcbb190251911cbe42a6665b553842c150" Mar 09 14:29:05 
crc kubenswrapper[4722]: I0309 14:29:05.252081 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 09 14:29:05 crc kubenswrapper[4722]: E0309 14:29:05.252714 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="181d67a9-9459-4457-a22f-ba515776d23c" containerName="dnsmasq-dns" Mar 09 14:29:05 crc kubenswrapper[4722]: I0309 14:29:05.252737 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="181d67a9-9459-4457-a22f-ba515776d23c" containerName="dnsmasq-dns" Mar 09 14:29:05 crc kubenswrapper[4722]: E0309 14:29:05.252771 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff90b60b-bdfe-41ca-b835-b911cec10a8f" containerName="nova-api-api" Mar 09 14:29:05 crc kubenswrapper[4722]: I0309 14:29:05.252781 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff90b60b-bdfe-41ca-b835-b911cec10a8f" containerName="nova-api-api" Mar 09 14:29:05 crc kubenswrapper[4722]: E0309 14:29:05.252821 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff90b60b-bdfe-41ca-b835-b911cec10a8f" containerName="nova-api-log" Mar 09 14:29:05 crc kubenswrapper[4722]: I0309 14:29:05.252828 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff90b60b-bdfe-41ca-b835-b911cec10a8f" containerName="nova-api-log" Mar 09 14:29:05 crc kubenswrapper[4722]: E0309 14:29:05.252853 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25a37fbc-0819-4a2b-b782-d566e3f765d7" containerName="nova-manage" Mar 09 14:29:05 crc kubenswrapper[4722]: I0309 14:29:05.252864 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="25a37fbc-0819-4a2b-b782-d566e3f765d7" containerName="nova-manage" Mar 09 14:29:05 crc kubenswrapper[4722]: E0309 14:29:05.252893 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="181d67a9-9459-4457-a22f-ba515776d23c" containerName="init" Mar 09 14:29:05 crc kubenswrapper[4722]: I0309 14:29:05.252901 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="181d67a9-9459-4457-a22f-ba515776d23c" containerName="init" Mar 09 14:29:05 crc kubenswrapper[4722]: I0309 14:29:05.253147 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff90b60b-bdfe-41ca-b835-b911cec10a8f" containerName="nova-api-log" Mar 09 14:29:05 crc kubenswrapper[4722]: I0309 14:29:05.253190 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="25a37fbc-0819-4a2b-b782-d566e3f765d7" containerName="nova-manage" Mar 09 14:29:05 crc kubenswrapper[4722]: I0309 14:29:05.253235 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="181d67a9-9459-4457-a22f-ba515776d23c" containerName="dnsmasq-dns" Mar 09 14:29:05 crc kubenswrapper[4722]: I0309 14:29:05.253250 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff90b60b-bdfe-41ca-b835-b911cec10a8f" containerName="nova-api-api" Mar 09 14:29:05 crc kubenswrapper[4722]: E0309 14:29:05.253890 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6a9dcfb7b1ff1a237bf93656d6cacdcbb190251911cbe42a6665b553842c150\": container with ID starting with b6a9dcfb7b1ff1a237bf93656d6cacdcbb190251911cbe42a6665b553842c150 not found: ID does not exist" containerID="b6a9dcfb7b1ff1a237bf93656d6cacdcbb190251911cbe42a6665b553842c150" Mar 09 14:29:05 crc kubenswrapper[4722]: I0309 14:29:05.254038 4722 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b6a9dcfb7b1ff1a237bf93656d6cacdcbb190251911cbe42a6665b553842c150"} err="failed to get container status \"b6a9dcfb7b1ff1a237bf93656d6cacdcbb190251911cbe42a6665b553842c150\": rpc error: code = NotFound desc = could not find container \"b6a9dcfb7b1ff1a237bf93656d6cacdcbb190251911cbe42a6665b553842c150\": container with ID starting with b6a9dcfb7b1ff1a237bf93656d6cacdcbb190251911cbe42a6665b553842c150 not found: ID does not exist" Mar 09 14:29:05 crc kubenswrapper[4722]: I0309 14:29:05.254120 4722 scope.go:117] "RemoveContainer" containerID="6d1c7a52c957bb592000666b12ec3180ee75d47bc71576eff7f609bf0eae7457" Mar 09 14:29:05 crc kubenswrapper[4722]: E0309 14:29:05.254490 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d1c7a52c957bb592000666b12ec3180ee75d47bc71576eff7f609bf0eae7457\": container with ID starting with 6d1c7a52c957bb592000666b12ec3180ee75d47bc71576eff7f609bf0eae7457 not found: ID does not exist" containerID="6d1c7a52c957bb592000666b12ec3180ee75d47bc71576eff7f609bf0eae7457" Mar 09 14:29:05 crc kubenswrapper[4722]: I0309 14:29:05.254589 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d1c7a52c957bb592000666b12ec3180ee75d47bc71576eff7f609bf0eae7457"} err="failed to get container status \"6d1c7a52c957bb592000666b12ec3180ee75d47bc71576eff7f609bf0eae7457\": rpc error: code = NotFound desc = could not find container \"6d1c7a52c957bb592000666b12ec3180ee75d47bc71576eff7f609bf0eae7457\": container with ID starting with 6d1c7a52c957bb592000666b12ec3180ee75d47bc71576eff7f609bf0eae7457 not found: ID does not exist" Mar 09 14:29:05 crc kubenswrapper[4722]: I0309 14:29:05.254674 4722 scope.go:117] "RemoveContainer" containerID="b6a9dcfb7b1ff1a237bf93656d6cacdcbb190251911cbe42a6665b553842c150" Mar 09 14:29:05 crc kubenswrapper[4722]: I0309 14:29:05.254785 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 09 14:29:05 crc kubenswrapper[4722]: I0309 14:29:05.256146 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 09 14:29:05 crc kubenswrapper[4722]: I0309 14:29:05.258890 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6a9dcfb7b1ff1a237bf93656d6cacdcbb190251911cbe42a6665b553842c150"} err="failed to get container status \"b6a9dcfb7b1ff1a237bf93656d6cacdcbb190251911cbe42a6665b553842c150\": rpc error: code = NotFound desc = could not find container \"b6a9dcfb7b1ff1a237bf93656d6cacdcbb190251911cbe42a6665b553842c150\": container with ID starting with b6a9dcfb7b1ff1a237bf93656d6cacdcbb190251911cbe42a6665b553842c150 not found: ID does not exist" Mar 09 14:29:05 crc kubenswrapper[4722]: I0309 14:29:05.258992 4722 scope.go:117] "RemoveContainer" containerID="6d1c7a52c957bb592000666b12ec3180ee75d47bc71576eff7f609bf0eae7457" Mar 09 14:29:05 crc kubenswrapper[4722]: I0309 14:29:05.259327 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 09 14:29:05 crc kubenswrapper[4722]: I0309 14:29:05.259610 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 09 14:29:05 crc kubenswrapper[4722]: I0309 14:29:05.259990 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d1c7a52c957bb592000666b12ec3180ee75d47bc71576eff7f609bf0eae7457"} err="failed to get container status \"6d1c7a52c957bb592000666b12ec3180ee75d47bc71576eff7f609bf0eae7457\": rpc error: code = NotFound desc = could not find container \"6d1c7a52c957bb592000666b12ec3180ee75d47bc71576eff7f609bf0eae7457\": container with ID starting with 6d1c7a52c957bb592000666b12ec3180ee75d47bc71576eff7f609bf0eae7457 not found: ID does not exist" Mar 09 14:29:05 crc kubenswrapper[4722]: I0309 14:29:05.268654 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 09 14:29:05 crc kubenswrapper[4722]: I0309 14:29:05.331056 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2znlm\" (UniqueName: \"kubernetes.io/projected/b915b224-7fbf-4ec6-be9a-7205dd818ed4-kube-api-access-2znlm\") pod \"nova-api-0\" (UID: \"b915b224-7fbf-4ec6-be9a-7205dd818ed4\") " pod="openstack/nova-api-0" Mar 09 14:29:05 crc kubenswrapper[4722]: I0309 14:29:05.331116 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b915b224-7fbf-4ec6-be9a-7205dd818ed4-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b915b224-7fbf-4ec6-be9a-7205dd818ed4\") " pod="openstack/nova-api-0" Mar 09 14:29:05 crc kubenswrapper[4722]: I0309 14:29:05.331181 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b915b224-7fbf-4ec6-be9a-7205dd818ed4-config-data\") pod \"nova-api-0\" (UID: \"b915b224-7fbf-4ec6-be9a-7205dd818ed4\") " pod="openstack/nova-api-0" Mar 09 14:29:05 crc kubenswrapper[4722]: I0309 14:29:05.331333 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b915b224-7fbf-4ec6-be9a-7205dd818ed4-public-tls-certs\") pod \"nova-api-0\" (UID: \"b915b224-7fbf-4ec6-be9a-7205dd818ed4\") " 
pod="openstack/nova-api-0" Mar 09 14:29:05 crc kubenswrapper[4722]: I0309 14:29:05.331379 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b915b224-7fbf-4ec6-be9a-7205dd818ed4-logs\") pod \"nova-api-0\" (UID: \"b915b224-7fbf-4ec6-be9a-7205dd818ed4\") " pod="openstack/nova-api-0" Mar 09 14:29:05 crc kubenswrapper[4722]: I0309 14:29:05.331444 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b915b224-7fbf-4ec6-be9a-7205dd818ed4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b915b224-7fbf-4ec6-be9a-7205dd818ed4\") " pod="openstack/nova-api-0" Mar 09 14:29:05 crc kubenswrapper[4722]: I0309 14:29:05.433943 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b915b224-7fbf-4ec6-be9a-7205dd818ed4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b915b224-7fbf-4ec6-be9a-7205dd818ed4\") " pod="openstack/nova-api-0" Mar 09 14:29:05 crc kubenswrapper[4722]: I0309 14:29:05.434057 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2znlm\" (UniqueName: \"kubernetes.io/projected/b915b224-7fbf-4ec6-be9a-7205dd818ed4-kube-api-access-2znlm\") pod \"nova-api-0\" (UID: \"b915b224-7fbf-4ec6-be9a-7205dd818ed4\") " pod="openstack/nova-api-0" Mar 09 14:29:05 crc kubenswrapper[4722]: I0309 14:29:05.434081 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b915b224-7fbf-4ec6-be9a-7205dd818ed4-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b915b224-7fbf-4ec6-be9a-7205dd818ed4\") " pod="openstack/nova-api-0" Mar 09 14:29:05 crc kubenswrapper[4722]: I0309 14:29:05.434119 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b915b224-7fbf-4ec6-be9a-7205dd818ed4-config-data\") pod \"nova-api-0\" (UID: \"b915b224-7fbf-4ec6-be9a-7205dd818ed4\") " pod="openstack/nova-api-0" Mar 09 14:29:05 crc kubenswrapper[4722]: I0309 14:29:05.434191 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b915b224-7fbf-4ec6-be9a-7205dd818ed4-public-tls-certs\") pod \"nova-api-0\" (UID: \"b915b224-7fbf-4ec6-be9a-7205dd818ed4\") " pod="openstack/nova-api-0" Mar 09 14:29:05 crc kubenswrapper[4722]: I0309 14:29:05.434226 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b915b224-7fbf-4ec6-be9a-7205dd818ed4-logs\") pod \"nova-api-0\" (UID: \"b915b224-7fbf-4ec6-be9a-7205dd818ed4\") " pod="openstack/nova-api-0" Mar 09 14:29:05 crc kubenswrapper[4722]: I0309 14:29:05.434631 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b915b224-7fbf-4ec6-be9a-7205dd818ed4-logs\") pod \"nova-api-0\" (UID: \"b915b224-7fbf-4ec6-be9a-7205dd818ed4\") " pod="openstack/nova-api-0" Mar 09 14:29:05 crc kubenswrapper[4722]: I0309 14:29:05.438713 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b915b224-7fbf-4ec6-be9a-7205dd818ed4-config-data\") pod \"nova-api-0\" (UID: \"b915b224-7fbf-4ec6-be9a-7205dd818ed4\") " pod="openstack/nova-api-0" Mar 09 14:29:05 crc 
kubenswrapper[4722]: I0309 14:29:05.438967 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b915b224-7fbf-4ec6-be9a-7205dd818ed4-public-tls-certs\") pod \"nova-api-0\" (UID: \"b915b224-7fbf-4ec6-be9a-7205dd818ed4\") " pod="openstack/nova-api-0" Mar 09 14:29:05 crc kubenswrapper[4722]: I0309 14:29:05.439423 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b915b224-7fbf-4ec6-be9a-7205dd818ed4-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b915b224-7fbf-4ec6-be9a-7205dd818ed4\") " pod="openstack/nova-api-0" Mar 09 14:29:05 crc kubenswrapper[4722]: I0309 14:29:05.439665 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b915b224-7fbf-4ec6-be9a-7205dd818ed4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b915b224-7fbf-4ec6-be9a-7205dd818ed4\") " pod="openstack/nova-api-0" Mar 09 14:29:05 crc kubenswrapper[4722]: I0309 14:29:05.450941 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2znlm\" (UniqueName: \"kubernetes.io/projected/b915b224-7fbf-4ec6-be9a-7205dd818ed4-kube-api-access-2znlm\") pod \"nova-api-0\" (UID: \"b915b224-7fbf-4ec6-be9a-7205dd818ed4\") " pod="openstack/nova-api-0" Mar 09 14:29:05 crc kubenswrapper[4722]: I0309 14:29:05.581258 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 09 14:29:06 crc kubenswrapper[4722]: I0309 14:29:06.101798 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 09 14:29:06 crc kubenswrapper[4722]: W0309 14:29:06.103577 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb915b224_7fbf_4ec6_be9a_7205dd818ed4.slice/crio-7574ee73c63f176c83df665cc2bcf5ee7539d27665ef3d5b005ff2bca3645b93 WatchSource:0}: Error finding container 7574ee73c63f176c83df665cc2bcf5ee7539d27665ef3d5b005ff2bca3645b93: Status 404 returned error can't find the container with id 7574ee73c63f176c83df665cc2bcf5ee7539d27665ef3d5b005ff2bca3645b93 Mar 09 14:29:06 crc kubenswrapper[4722]: I0309 14:29:06.135670 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b58c3f1a-21b8-4a75-bfb5-970122e20db6" containerName="nova-metadata-log" containerID="cri-o://a1b1a60a9d6e55a96d906a241559260ee757706a4d44f98b4b1268ac1a05ec9c" gracePeriod=30 Mar 09 14:29:06 crc kubenswrapper[4722]: I0309 14:29:06.135956 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b915b224-7fbf-4ec6-be9a-7205dd818ed4","Type":"ContainerStarted","Data":"7574ee73c63f176c83df665cc2bcf5ee7539d27665ef3d5b005ff2bca3645b93"} Mar 09 14:29:06 crc kubenswrapper[4722]: I0309 14:29:06.136177 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b58c3f1a-21b8-4a75-bfb5-970122e20db6" containerName="nova-metadata-metadata" containerID="cri-o://5891fa687a19a92c246a30786d3a0fe3fca15abe1ec15a94af6e37d35fff007a" gracePeriod=30 Mar 09 14:29:06 crc kubenswrapper[4722]: I0309 14:29:06.169586 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff90b60b-bdfe-41ca-b835-b911cec10a8f" path="/var/lib/kubelet/pods/ff90b60b-bdfe-41ca-b835-b911cec10a8f/volumes" Mar 09 14:29:06 crc kubenswrapper[4722]: E0309 
14:29:06.491220 4722 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e181a3d1402638abb6a7fd52b392c6c4834488f8c071ebed96b611f31e146cdb" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 09 14:29:06 crc kubenswrapper[4722]: E0309 14:29:06.492989 4722 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e181a3d1402638abb6a7fd52b392c6c4834488f8c071ebed96b611f31e146cdb" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 09 14:29:06 crc kubenswrapper[4722]: E0309 14:29:06.495284 4722 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e181a3d1402638abb6a7fd52b392c6c4834488f8c071ebed96b611f31e146cdb" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 09 14:29:06 crc kubenswrapper[4722]: E0309 14:29:06.495336 4722 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="1fe185bd-3b13-48df-9ceb-df1ba08e277b" containerName="nova-scheduler-scheduler" Mar 09 14:29:07 crc kubenswrapper[4722]: I0309 14:29:07.142831 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 09 14:29:07 crc kubenswrapper[4722]: I0309 14:29:07.148287 4722 generic.go:334] "Generic (PLEG): container finished" podID="b58c3f1a-21b8-4a75-bfb5-970122e20db6" containerID="a1b1a60a9d6e55a96d906a241559260ee757706a4d44f98b4b1268ac1a05ec9c" exitCode=143 Mar 09 14:29:07 crc kubenswrapper[4722]: I0309 14:29:07.148366 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b58c3f1a-21b8-4a75-bfb5-970122e20db6","Type":"ContainerDied","Data":"a1b1a60a9d6e55a96d906a241559260ee757706a4d44f98b4b1268ac1a05ec9c"} Mar 09 14:29:07 crc kubenswrapper[4722]: I0309 14:29:07.150753 4722 generic.go:334] "Generic (PLEG): container finished" podID="1fe185bd-3b13-48df-9ceb-df1ba08e277b" containerID="e181a3d1402638abb6a7fd52b392c6c4834488f8c071ebed96b611f31e146cdb" exitCode=0 Mar 09 14:29:07 crc kubenswrapper[4722]: I0309 14:29:07.150856 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1fe185bd-3b13-48df-9ceb-df1ba08e277b","Type":"ContainerDied","Data":"e181a3d1402638abb6a7fd52b392c6c4834488f8c071ebed96b611f31e146cdb"} Mar 09 14:29:07 crc kubenswrapper[4722]: I0309 14:29:07.150891 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"1fe185bd-3b13-48df-9ceb-df1ba08e277b","Type":"ContainerDied","Data":"7e797916bf99fb06ad0a2846b0c3f3f33f9c999d2c43e165feedf9ff47a052ed"} Mar 09 14:29:07 crc kubenswrapper[4722]: I0309 14:29:07.150920 4722 scope.go:117] "RemoveContainer" containerID="e181a3d1402638abb6a7fd52b392c6c4834488f8c071ebed96b611f31e146cdb" Mar 09 14:29:07 crc kubenswrapper[4722]: I0309 14:29:07.151059 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 09 14:29:07 crc kubenswrapper[4722]: I0309 14:29:07.154791 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b915b224-7fbf-4ec6-be9a-7205dd818ed4","Type":"ContainerStarted","Data":"cd5bff184b9716c18a3f00010672f3779a4774062e4f74de100dfc5a86d7ef2a"} Mar 09 14:29:07 crc kubenswrapper[4722]: I0309 14:29:07.154842 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b915b224-7fbf-4ec6-be9a-7205dd818ed4","Type":"ContainerStarted","Data":"e3aaa9c5cba0d2a6fb4aa3196d16cd4e52edfac0095d4c749ceb00f88e49afcd"} Mar 09 14:29:07 crc kubenswrapper[4722]: I0309 14:29:07.190598 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.190579599 podStartE2EDuration="2.190579599s" podCreationTimestamp="2026-03-09 14:29:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:29:07.189724666 +0000 UTC m=+1587.745293232" watchObservedRunningTime="2026-03-09 14:29:07.190579599 +0000 UTC m=+1587.746148165" Mar 09 14:29:07 crc kubenswrapper[4722]: I0309 14:29:07.191897 4722 scope.go:117] "RemoveContainer" containerID="e181a3d1402638abb6a7fd52b392c6c4834488f8c071ebed96b611f31e146cdb" Mar 09 14:29:07 crc kubenswrapper[4722]: I0309 14:29:07.190740 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fe185bd-3b13-48df-9ceb-df1ba08e277b-combined-ca-bundle\") pod \"1fe185bd-3b13-48df-9ceb-df1ba08e277b\" (UID: \"1fe185bd-3b13-48df-9ceb-df1ba08e277b\") " Mar 09 14:29:07 crc kubenswrapper[4722]: I0309 14:29:07.192792 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6frd5\" (UniqueName: \"kubernetes.io/projected/1fe185bd-3b13-48df-9ceb-df1ba08e277b-kube-api-access-6frd5\") pod \"1fe185bd-3b13-48df-9ceb-df1ba08e277b\" (UID: \"1fe185bd-3b13-48df-9ceb-df1ba08e277b\") " Mar 09 14:29:07 crc kubenswrapper[4722]: I0309 14:29:07.192881 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fe185bd-3b13-48df-9ceb-df1ba08e277b-config-data\") pod \"1fe185bd-3b13-48df-9ceb-df1ba08e277b\" (UID: \"1fe185bd-3b13-48df-9ceb-df1ba08e277b\") " Mar 09 14:29:07 crc kubenswrapper[4722]: E0309 14:29:07.197338 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e181a3d1402638abb6a7fd52b392c6c4834488f8c071ebed96b611f31e146cdb\": container with ID starting with e181a3d1402638abb6a7fd52b392c6c4834488f8c071ebed96b611f31e146cdb not found: ID does not exist" containerID="e181a3d1402638abb6a7fd52b392c6c4834488f8c071ebed96b611f31e146cdb" Mar 09 14:29:07 crc kubenswrapper[4722]: I0309 14:29:07.197392 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e181a3d1402638abb6a7fd52b392c6c4834488f8c071ebed96b611f31e146cdb"} err="failed to get container status \"e181a3d1402638abb6a7fd52b392c6c4834488f8c071ebed96b611f31e146cdb\": rpc error: code = NotFound desc = could not find container \"e181a3d1402638abb6a7fd52b392c6c4834488f8c071ebed96b611f31e146cdb\": container with ID starting with e181a3d1402638abb6a7fd52b392c6c4834488f8c071ebed96b611f31e146cdb not found: ID does not exist" Mar 09 14:29:07 crc kubenswrapper[4722]: 
I0309 14:29:07.220894 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fe185bd-3b13-48df-9ceb-df1ba08e277b-kube-api-access-6frd5" (OuterVolumeSpecName: "kube-api-access-6frd5") pod "1fe185bd-3b13-48df-9ceb-df1ba08e277b" (UID: "1fe185bd-3b13-48df-9ceb-df1ba08e277b"). InnerVolumeSpecName "kube-api-access-6frd5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:29:07 crc kubenswrapper[4722]: I0309 14:29:07.249274 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fe185bd-3b13-48df-9ceb-df1ba08e277b-config-data" (OuterVolumeSpecName: "config-data") pod "1fe185bd-3b13-48df-9ceb-df1ba08e277b" (UID: "1fe185bd-3b13-48df-9ceb-df1ba08e277b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:29:07 crc kubenswrapper[4722]: I0309 14:29:07.271178 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fe185bd-3b13-48df-9ceb-df1ba08e277b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1fe185bd-3b13-48df-9ceb-df1ba08e277b" (UID: "1fe185bd-3b13-48df-9ceb-df1ba08e277b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:29:07 crc kubenswrapper[4722]: I0309 14:29:07.299735 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fe185bd-3b13-48df-9ceb-df1ba08e277b-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 14:29:07 crc kubenswrapper[4722]: I0309 14:29:07.299769 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fe185bd-3b13-48df-9ceb-df1ba08e277b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:29:07 crc kubenswrapper[4722]: I0309 14:29:07.299781 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6frd5\" (UniqueName: \"kubernetes.io/projected/1fe185bd-3b13-48df-9ceb-df1ba08e277b-kube-api-access-6frd5\") on node \"crc\" DevicePath \"\"" Mar 09 14:29:07 crc kubenswrapper[4722]: I0309 14:29:07.491515 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 09 14:29:07 crc kubenswrapper[4722]: I0309 14:29:07.518342 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 09 14:29:07 crc kubenswrapper[4722]: I0309 14:29:07.543374 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 09 14:29:07 crc kubenswrapper[4722]: E0309 14:29:07.544038 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fe185bd-3b13-48df-9ceb-df1ba08e277b" containerName="nova-scheduler-scheduler" Mar 09 14:29:07 crc kubenswrapper[4722]: I0309 14:29:07.544058 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fe185bd-3b13-48df-9ceb-df1ba08e277b" containerName="nova-scheduler-scheduler" Mar 09 14:29:07 crc kubenswrapper[4722]: I0309 14:29:07.544376 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fe185bd-3b13-48df-9ceb-df1ba08e277b" containerName="nova-scheduler-scheduler" Mar 09 14:29:07 crc kubenswrapper[4722]: I0309 14:29:07.545405 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 09 14:29:07 crc kubenswrapper[4722]: I0309 14:29:07.549528 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 09 14:29:07 crc kubenswrapper[4722]: I0309 14:29:07.549901 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 09 14:29:07 crc kubenswrapper[4722]: I0309 14:29:07.614121 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbc84d58-b38f-49b4-b80f-f3377b43d7a4-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bbc84d58-b38f-49b4-b80f-f3377b43d7a4\") " pod="openstack/nova-scheduler-0" Mar 09 14:29:07 crc kubenswrapper[4722]: I0309 14:29:07.614235 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbc84d58-b38f-49b4-b80f-f3377b43d7a4-config-data\") pod \"nova-scheduler-0\" (UID: \"bbc84d58-b38f-49b4-b80f-f3377b43d7a4\") " pod="openstack/nova-scheduler-0" Mar 09 14:29:07 crc kubenswrapper[4722]: I0309 14:29:07.614302 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glqzs\" (UniqueName: \"kubernetes.io/projected/bbc84d58-b38f-49b4-b80f-f3377b43d7a4-kube-api-access-glqzs\") pod \"nova-scheduler-0\" (UID: \"bbc84d58-b38f-49b4-b80f-f3377b43d7a4\") " pod="openstack/nova-scheduler-0" Mar 09 14:29:07 crc kubenswrapper[4722]: I0309 14:29:07.716470 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbc84d58-b38f-49b4-b80f-f3377b43d7a4-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bbc84d58-b38f-49b4-b80f-f3377b43d7a4\") " pod="openstack/nova-scheduler-0" Mar 09 14:29:07 crc kubenswrapper[4722]: I0309 14:29:07.716577 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbc84d58-b38f-49b4-b80f-f3377b43d7a4-config-data\") pod \"nova-scheduler-0\" (UID: \"bbc84d58-b38f-49b4-b80f-f3377b43d7a4\") " pod="openstack/nova-scheduler-0" Mar 09 14:29:07 crc kubenswrapper[4722]: I0309 14:29:07.716647 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glqzs\" (UniqueName: \"kubernetes.io/projected/bbc84d58-b38f-49b4-b80f-f3377b43d7a4-kube-api-access-glqzs\") pod \"nova-scheduler-0\" (UID: \"bbc84d58-b38f-49b4-b80f-f3377b43d7a4\") " pod="openstack/nova-scheduler-0" Mar 09 14:29:07 crc kubenswrapper[4722]: I0309 14:29:07.720768 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbc84d58-b38f-49b4-b80f-f3377b43d7a4-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bbc84d58-b38f-49b4-b80f-f3377b43d7a4\") " pod="openstack/nova-scheduler-0" Mar 09 14:29:07 crc kubenswrapper[4722]: I0309 14:29:07.721043 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbc84d58-b38f-49b4-b80f-f3377b43d7a4-config-data\") pod \"nova-scheduler-0\" (UID: \"bbc84d58-b38f-49b4-b80f-f3377b43d7a4\") " pod="openstack/nova-scheduler-0" Mar 09 14:29:07 crc kubenswrapper[4722]: I0309 14:29:07.735553 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glqzs\" (UniqueName: 
\"kubernetes.io/projected/bbc84d58-b38f-49b4-b80f-f3377b43d7a4-kube-api-access-glqzs\") pod \"nova-scheduler-0\" (UID: \"bbc84d58-b38f-49b4-b80f-f3377b43d7a4\") " pod="openstack/nova-scheduler-0" Mar 09 14:29:07 crc kubenswrapper[4722]: I0309 14:29:07.859621 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 09 14:29:08 crc kubenswrapper[4722]: I0309 14:29:08.157687 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Mar 09 14:29:08 crc kubenswrapper[4722]: I0309 14:29:08.171313 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fe185bd-3b13-48df-9ceb-df1ba08e277b" path="/var/lib/kubelet/pods/1fe185bd-3b13-48df-9ceb-df1ba08e277b/volumes" Mar 09 14:29:08 crc kubenswrapper[4722]: I0309 14:29:08.171703 4722 generic.go:334] "Generic (PLEG): container finished" podID="9226bef9-b3b3-4f10-8e02-6fbc710bc4f3" containerID="7d7d2b8b808b7e55d128afce9189980d53bf001ea1c56ed1149a3958838025b3" exitCode=137 Mar 09 14:29:08 crc kubenswrapper[4722]: I0309 14:29:08.171888 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Mar 09 14:29:08 crc kubenswrapper[4722]: I0309 14:29:08.172618 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"9226bef9-b3b3-4f10-8e02-6fbc710bc4f3","Type":"ContainerDied","Data":"7d7d2b8b808b7e55d128afce9189980d53bf001ea1c56ed1149a3958838025b3"} Mar 09 14:29:08 crc kubenswrapper[4722]: I0309 14:29:08.172658 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"9226bef9-b3b3-4f10-8e02-6fbc710bc4f3","Type":"ContainerDied","Data":"b178e6de68f3d0a7ba0e3dc493d804391749665d2f57fa337c4435a67a78a420"} Mar 09 14:29:08 crc kubenswrapper[4722]: I0309 14:29:08.172681 4722 scope.go:117] "RemoveContainer" containerID="7d7d2b8b808b7e55d128afce9189980d53bf001ea1c56ed1149a3958838025b3" Mar 09 14:29:08 crc kubenswrapper[4722]: I0309 14:29:08.226073 4722 scope.go:117] "RemoveContainer" containerID="f57ad0ec10555687fa3af79581c9163214066df556cd0b5525e8a43d7e191299" Mar 09 14:29:08 crc kubenswrapper[4722]: I0309 14:29:08.227102 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9226bef9-b3b3-4f10-8e02-6fbc710bc4f3-config-data\") pod \"9226bef9-b3b3-4f10-8e02-6fbc710bc4f3\" (UID: \"9226bef9-b3b3-4f10-8e02-6fbc710bc4f3\") " Mar 09 14:29:08 crc kubenswrapper[4722]: I0309 14:29:08.227151 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qnkzm\" (UniqueName: \"kubernetes.io/projected/9226bef9-b3b3-4f10-8e02-6fbc710bc4f3-kube-api-access-qnkzm\") pod \"9226bef9-b3b3-4f10-8e02-6fbc710bc4f3\" (UID: \"9226bef9-b3b3-4f10-8e02-6fbc710bc4f3\") " Mar 09 14:29:08 crc kubenswrapper[4722]: I0309 14:29:08.227632 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9226bef9-b3b3-4f10-8e02-6fbc710bc4f3-combined-ca-bundle\") pod \"9226bef9-b3b3-4f10-8e02-6fbc710bc4f3\" (UID: \"9226bef9-b3b3-4f10-8e02-6fbc710bc4f3\") " Mar 09 14:29:08 crc kubenswrapper[4722]: I0309 14:29:08.227808 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9226bef9-b3b3-4f10-8e02-6fbc710bc4f3-scripts\") pod \"9226bef9-b3b3-4f10-8e02-6fbc710bc4f3\" (UID: 
\"9226bef9-b3b3-4f10-8e02-6fbc710bc4f3\") " Mar 09 14:29:08 crc kubenswrapper[4722]: I0309 14:29:08.233143 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9226bef9-b3b3-4f10-8e02-6fbc710bc4f3-kube-api-access-qnkzm" (OuterVolumeSpecName: "kube-api-access-qnkzm") pod "9226bef9-b3b3-4f10-8e02-6fbc710bc4f3" (UID: "9226bef9-b3b3-4f10-8e02-6fbc710bc4f3"). InnerVolumeSpecName "kube-api-access-qnkzm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:29:08 crc kubenswrapper[4722]: I0309 14:29:08.248010 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9226bef9-b3b3-4f10-8e02-6fbc710bc4f3-scripts" (OuterVolumeSpecName: "scripts") pod "9226bef9-b3b3-4f10-8e02-6fbc710bc4f3" (UID: "9226bef9-b3b3-4f10-8e02-6fbc710bc4f3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:29:08 crc kubenswrapper[4722]: I0309 14:29:08.262782 4722 scope.go:117] "RemoveContainer" containerID="e054dead3688eeb85afccf93d975a6293fdf7fdde17ec109bf4977b2b4adb687" Mar 09 14:29:08 crc kubenswrapper[4722]: I0309 14:29:08.292683 4722 scope.go:117] "RemoveContainer" containerID="6d9fb16100b3f52f35ea0198c84baf23497e15348c942b0f5dadd6b8d8a1b0de" Mar 09 14:29:08 crc kubenswrapper[4722]: I0309 14:29:08.335975 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 09 14:29:08 crc kubenswrapper[4722]: I0309 14:29:08.339763 4722 scope.go:117] "RemoveContainer" containerID="7d7d2b8b808b7e55d128afce9189980d53bf001ea1c56ed1149a3958838025b3" Mar 09 14:29:08 crc kubenswrapper[4722]: I0309 14:29:08.340031 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9226bef9-b3b3-4f10-8e02-6fbc710bc4f3-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 14:29:08 crc kubenswrapper[4722]: I0309 14:29:08.340060 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qnkzm\" (UniqueName: \"kubernetes.io/projected/9226bef9-b3b3-4f10-8e02-6fbc710bc4f3-kube-api-access-qnkzm\") on node \"crc\" DevicePath \"\"" Mar 09 14:29:08 crc kubenswrapper[4722]: E0309 14:29:08.341820 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d7d2b8b808b7e55d128afce9189980d53bf001ea1c56ed1149a3958838025b3\": container with ID starting with 7d7d2b8b808b7e55d128afce9189980d53bf001ea1c56ed1149a3958838025b3 not found: ID does not exist" containerID="7d7d2b8b808b7e55d128afce9189980d53bf001ea1c56ed1149a3958838025b3" Mar 09 14:29:08 crc kubenswrapper[4722]: I0309 14:29:08.341950 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d7d2b8b808b7e55d128afce9189980d53bf001ea1c56ed1149a3958838025b3"} err="failed to get container status \"7d7d2b8b808b7e55d128afce9189980d53bf001ea1c56ed1149a3958838025b3\": rpc error: code = NotFound desc = could not find container \"7d7d2b8b808b7e55d128afce9189980d53bf001ea1c56ed1149a3958838025b3\": container with ID starting with 7d7d2b8b808b7e55d128afce9189980d53bf001ea1c56ed1149a3958838025b3 not found: ID does not exist" Mar 09 14:29:08 crc kubenswrapper[4722]: I0309 14:29:08.342070 4722 scope.go:117] "RemoveContainer" containerID="f57ad0ec10555687fa3af79581c9163214066df556cd0b5525e8a43d7e191299" Mar 09 14:29:08 crc kubenswrapper[4722]: E0309 14:29:08.343083 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not 
find container \"f57ad0ec10555687fa3af79581c9163214066df556cd0b5525e8a43d7e191299\": container with ID starting with f57ad0ec10555687fa3af79581c9163214066df556cd0b5525e8a43d7e191299 not found: ID does not exist" containerID="f57ad0ec10555687fa3af79581c9163214066df556cd0b5525e8a43d7e191299" Mar 09 14:29:08 crc kubenswrapper[4722]: I0309 14:29:08.343116 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f57ad0ec10555687fa3af79581c9163214066df556cd0b5525e8a43d7e191299"} err="failed to get container status \"f57ad0ec10555687fa3af79581c9163214066df556cd0b5525e8a43d7e191299\": rpc error: code = NotFound desc = could not find container \"f57ad0ec10555687fa3af79581c9163214066df556cd0b5525e8a43d7e191299\": container with ID starting with f57ad0ec10555687fa3af79581c9163214066df556cd0b5525e8a43d7e191299 not found: ID does not exist" Mar 09 14:29:08 crc kubenswrapper[4722]: I0309 14:29:08.343143 4722 scope.go:117] "RemoveContainer" containerID="e054dead3688eeb85afccf93d975a6293fdf7fdde17ec109bf4977b2b4adb687" Mar 09 14:29:08 crc kubenswrapper[4722]: E0309 14:29:08.343473 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e054dead3688eeb85afccf93d975a6293fdf7fdde17ec109bf4977b2b4adb687\": container with ID starting with e054dead3688eeb85afccf93d975a6293fdf7fdde17ec109bf4977b2b4adb687 not found: ID does not exist" containerID="e054dead3688eeb85afccf93d975a6293fdf7fdde17ec109bf4977b2b4adb687" Mar 09 14:29:08 crc kubenswrapper[4722]: I0309 14:29:08.343499 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e054dead3688eeb85afccf93d975a6293fdf7fdde17ec109bf4977b2b4adb687"} err="failed to get container status \"e054dead3688eeb85afccf93d975a6293fdf7fdde17ec109bf4977b2b4adb687\": rpc error: code = NotFound desc = could not find container \"e054dead3688eeb85afccf93d975a6293fdf7fdde17ec109bf4977b2b4adb687\": container with ID starting with e054dead3688eeb85afccf93d975a6293fdf7fdde17ec109bf4977b2b4adb687 not found: ID does not exist" Mar 09 14:29:08 crc kubenswrapper[4722]: I0309 14:29:08.343513 4722 scope.go:117] "RemoveContainer" containerID="6d9fb16100b3f52f35ea0198c84baf23497e15348c942b0f5dadd6b8d8a1b0de" Mar 09 14:29:08 crc kubenswrapper[4722]: E0309 14:29:08.343812 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d9fb16100b3f52f35ea0198c84baf23497e15348c942b0f5dadd6b8d8a1b0de\": container with ID starting with 6d9fb16100b3f52f35ea0198c84baf23497e15348c942b0f5dadd6b8d8a1b0de not found: ID does not exist" containerID="6d9fb16100b3f52f35ea0198c84baf23497e15348c942b0f5dadd6b8d8a1b0de" Mar 09 14:29:08 crc kubenswrapper[4722]: I0309 14:29:08.343834 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d9fb16100b3f52f35ea0198c84baf23497e15348c942b0f5dadd6b8d8a1b0de"} err="failed to get container status \"6d9fb16100b3f52f35ea0198c84baf23497e15348c942b0f5dadd6b8d8a1b0de\": rpc error: code = NotFound desc = could not find container \"6d9fb16100b3f52f35ea0198c84baf23497e15348c942b0f5dadd6b8d8a1b0de\": container with ID starting with 6d9fb16100b3f52f35ea0198c84baf23497e15348c942b0f5dadd6b8d8a1b0de not found: ID does not exist" Mar 09 14:29:08 crc kubenswrapper[4722]: W0309 14:29:08.346572 4722 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbbc84d58_b38f_49b4_b80f_f3377b43d7a4.slice/crio-51b7eb3950ba3892fb35a9c3c5a860aef10ceb840f59ec27782a057df61bde3b WatchSource:0}: Error finding container 51b7eb3950ba3892fb35a9c3c5a860aef10ceb840f59ec27782a057df61bde3b: Status 404 returned error can't find the container with id 51b7eb3950ba3892fb35a9c3c5a860aef10ceb840f59ec27782a057df61bde3b Mar 09 14:29:08 crc kubenswrapper[4722]: I0309 14:29:08.379687 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9226bef9-b3b3-4f10-8e02-6fbc710bc4f3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9226bef9-b3b3-4f10-8e02-6fbc710bc4f3" (UID: "9226bef9-b3b3-4f10-8e02-6fbc710bc4f3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:29:08 crc kubenswrapper[4722]: I0309 14:29:08.415417 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9226bef9-b3b3-4f10-8e02-6fbc710bc4f3-config-data" (OuterVolumeSpecName: "config-data") pod "9226bef9-b3b3-4f10-8e02-6fbc710bc4f3" (UID: "9226bef9-b3b3-4f10-8e02-6fbc710bc4f3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:29:08 crc kubenswrapper[4722]: I0309 14:29:08.442706 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9226bef9-b3b3-4f10-8e02-6fbc710bc4f3-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 14:29:08 crc kubenswrapper[4722]: I0309 14:29:08.442745 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9226bef9-b3b3-4f10-8e02-6fbc710bc4f3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:29:08 crc kubenswrapper[4722]: I0309 14:29:08.507033 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Mar 09 14:29:08 crc kubenswrapper[4722]: I0309 14:29:08.519145 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"] Mar 09 14:29:08 crc kubenswrapper[4722]: I0309 14:29:08.545954 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Mar 09 14:29:08 crc kubenswrapper[4722]: E0309 14:29:08.546662 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9226bef9-b3b3-4f10-8e02-6fbc710bc4f3" containerName="aodh-notifier" Mar 09 14:29:08 crc kubenswrapper[4722]: I0309 14:29:08.546688 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="9226bef9-b3b3-4f10-8e02-6fbc710bc4f3" containerName="aodh-notifier" Mar 09 14:29:08 crc kubenswrapper[4722]: E0309 14:29:08.546710 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9226bef9-b3b3-4f10-8e02-6fbc710bc4f3" containerName="aodh-evaluator" Mar 09 14:29:08 crc kubenswrapper[4722]: I0309 14:29:08.546721 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="9226bef9-b3b3-4f10-8e02-6fbc710bc4f3" containerName="aodh-evaluator" Mar 09 14:29:08 crc kubenswrapper[4722]: E0309 14:29:08.546756 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9226bef9-b3b3-4f10-8e02-6fbc710bc4f3" containerName="aodh-listener" Mar 09 14:29:08 crc kubenswrapper[4722]: I0309 14:29:08.546765 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="9226bef9-b3b3-4f10-8e02-6fbc710bc4f3" containerName="aodh-listener" Mar 09 14:29:08 crc kubenswrapper[4722]: E0309 14:29:08.546791 4722 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9226bef9-b3b3-4f10-8e02-6fbc710bc4f3" containerName="aodh-api" Mar 09 14:29:08 crc kubenswrapper[4722]: I0309 14:29:08.546799 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="9226bef9-b3b3-4f10-8e02-6fbc710bc4f3" containerName="aodh-api" Mar 09 14:29:08 crc kubenswrapper[4722]: I0309 14:29:08.547068 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="9226bef9-b3b3-4f10-8e02-6fbc710bc4f3" containerName="aodh-listener" Mar 09 14:29:08 crc kubenswrapper[4722]: I0309 14:29:08.547094 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="9226bef9-b3b3-4f10-8e02-6fbc710bc4f3" containerName="aodh-evaluator" Mar 09 14:29:08 crc kubenswrapper[4722]: I0309 14:29:08.547112 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="9226bef9-b3b3-4f10-8e02-6fbc710bc4f3" containerName="aodh-api" Mar 09 14:29:08 crc kubenswrapper[4722]: I0309 14:29:08.547154 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="9226bef9-b3b3-4f10-8e02-6fbc710bc4f3" containerName="aodh-notifier" Mar 09 14:29:08 crc kubenswrapper[4722]: I0309 14:29:08.550466 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Mar 09 14:29:08 crc kubenswrapper[4722]: I0309 14:29:08.552754 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Mar 09 14:29:08 crc kubenswrapper[4722]: I0309 14:29:08.553160 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc" Mar 09 14:29:08 crc kubenswrapper[4722]: I0309 14:29:08.553470 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Mar 09 14:29:08 crc kubenswrapper[4722]: I0309 14:29:08.553590 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-d98zq" Mar 09 14:29:08 crc kubenswrapper[4722]: I0309 14:29:08.554473 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc" Mar 09 14:29:08 crc kubenswrapper[4722]: I0309 14:29:08.560500 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 09 14:29:08 crc kubenswrapper[4722]: I0309 14:29:08.647161 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32835244-16f5-407f-87fe-8e9f860879a1-config-data\") pod \"aodh-0\" (UID: \"32835244-16f5-407f-87fe-8e9f860879a1\") " pod="openstack/aodh-0" Mar 09 14:29:08 crc kubenswrapper[4722]: I0309 14:29:08.647547 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/32835244-16f5-407f-87fe-8e9f860879a1-public-tls-certs\") pod \"aodh-0\" (UID: \"32835244-16f5-407f-87fe-8e9f860879a1\") " pod="openstack/aodh-0" Mar 09 14:29:08 crc kubenswrapper[4722]: I0309 14:29:08.647579 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32835244-16f5-407f-87fe-8e9f860879a1-combined-ca-bundle\") pod \"aodh-0\" (UID: \"32835244-16f5-407f-87fe-8e9f860879a1\") " pod="openstack/aodh-0" Mar 09 14:29:08 crc kubenswrapper[4722]: I0309 14:29:08.647679 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32835244-16f5-407f-87fe-8e9f860879a1-scripts\") pod \"aodh-0\" 
(UID: \"32835244-16f5-407f-87fe-8e9f860879a1\") " pod="openstack/aodh-0" Mar 09 14:29:08 crc kubenswrapper[4722]: I0309 14:29:08.647840 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/32835244-16f5-407f-87fe-8e9f860879a1-internal-tls-certs\") pod \"aodh-0\" (UID: \"32835244-16f5-407f-87fe-8e9f860879a1\") " pod="openstack/aodh-0" Mar 09 14:29:08 crc kubenswrapper[4722]: I0309 14:29:08.648133 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbqpr\" (UniqueName: \"kubernetes.io/projected/32835244-16f5-407f-87fe-8e9f860879a1-kube-api-access-jbqpr\") pod \"aodh-0\" (UID: \"32835244-16f5-407f-87fe-8e9f860879a1\") " pod="openstack/aodh-0" Mar 09 14:29:08 crc kubenswrapper[4722]: I0309 14:29:08.750747 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32835244-16f5-407f-87fe-8e9f860879a1-config-data\") pod \"aodh-0\" (UID: \"32835244-16f5-407f-87fe-8e9f860879a1\") " pod="openstack/aodh-0" Mar 09 14:29:08 crc kubenswrapper[4722]: I0309 14:29:08.750881 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/32835244-16f5-407f-87fe-8e9f860879a1-public-tls-certs\") pod \"aodh-0\" (UID: \"32835244-16f5-407f-87fe-8e9f860879a1\") " pod="openstack/aodh-0" Mar 09 14:29:08 crc kubenswrapper[4722]: I0309 14:29:08.750921 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32835244-16f5-407f-87fe-8e9f860879a1-combined-ca-bundle\") pod \"aodh-0\" (UID: \"32835244-16f5-407f-87fe-8e9f860879a1\") " pod="openstack/aodh-0" Mar 09 14:29:08 crc kubenswrapper[4722]: I0309 14:29:08.751056 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32835244-16f5-407f-87fe-8e9f860879a1-scripts\") pod \"aodh-0\" (UID: \"32835244-16f5-407f-87fe-8e9f860879a1\") " pod="openstack/aodh-0" Mar 09 14:29:08 crc kubenswrapper[4722]: I0309 14:29:08.751121 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/32835244-16f5-407f-87fe-8e9f860879a1-internal-tls-certs\") pod \"aodh-0\" (UID: \"32835244-16f5-407f-87fe-8e9f860879a1\") " pod="openstack/aodh-0" Mar 09 14:29:08 crc kubenswrapper[4722]: I0309 14:29:08.751273 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbqpr\" (UniqueName: \"kubernetes.io/projected/32835244-16f5-407f-87fe-8e9f860879a1-kube-api-access-jbqpr\") pod \"aodh-0\" (UID: \"32835244-16f5-407f-87fe-8e9f860879a1\") " pod="openstack/aodh-0" Mar 09 14:29:08 crc kubenswrapper[4722]: I0309 14:29:08.759459 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32835244-16f5-407f-87fe-8e9f860879a1-config-data\") pod \"aodh-0\" (UID: \"32835244-16f5-407f-87fe-8e9f860879a1\") " pod="openstack/aodh-0" Mar 09 14:29:08 crc kubenswrapper[4722]: I0309 14:29:08.760395 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/32835244-16f5-407f-87fe-8e9f860879a1-internal-tls-certs\") pod \"aodh-0\" (UID: \"32835244-16f5-407f-87fe-8e9f860879a1\") " 
pod="openstack/aodh-0" Mar 09 14:29:08 crc kubenswrapper[4722]: I0309 14:29:08.763415 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/32835244-16f5-407f-87fe-8e9f860879a1-public-tls-certs\") pod \"aodh-0\" (UID: \"32835244-16f5-407f-87fe-8e9f860879a1\") " pod="openstack/aodh-0" Mar 09 14:29:08 crc kubenswrapper[4722]: I0309 14:29:08.763592 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32835244-16f5-407f-87fe-8e9f860879a1-combined-ca-bundle\") pod \"aodh-0\" (UID: \"32835244-16f5-407f-87fe-8e9f860879a1\") " pod="openstack/aodh-0" Mar 09 14:29:08 crc kubenswrapper[4722]: I0309 14:29:08.764483 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32835244-16f5-407f-87fe-8e9f860879a1-scripts\") pod \"aodh-0\" (UID: \"32835244-16f5-407f-87fe-8e9f860879a1\") " pod="openstack/aodh-0" Mar 09 14:29:08 crc kubenswrapper[4722]: I0309 14:29:08.778735 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbqpr\" (UniqueName: \"kubernetes.io/projected/32835244-16f5-407f-87fe-8e9f860879a1-kube-api-access-jbqpr\") pod \"aodh-0\" (UID: \"32835244-16f5-407f-87fe-8e9f860879a1\") " pod="openstack/aodh-0" Mar 09 14:29:08 crc kubenswrapper[4722]: I0309 14:29:08.940797 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Mar 09 14:29:09 crc kubenswrapper[4722]: I0309 14:29:09.185861 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bbc84d58-b38f-49b4-b80f-f3377b43d7a4","Type":"ContainerStarted","Data":"c230acf6d8613cf0ce68ce7952b5b6fe837b320e4759e079b57f2cb5c2e51df0"} Mar 09 14:29:09 crc kubenswrapper[4722]: I0309 14:29:09.186211 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bbc84d58-b38f-49b4-b80f-f3377b43d7a4","Type":"ContainerStarted","Data":"51b7eb3950ba3892fb35a9c3c5a860aef10ceb840f59ec27782a057df61bde3b"} Mar 09 14:29:09 crc kubenswrapper[4722]: I0309 14:29:09.216771 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.216750591 podStartE2EDuration="2.216750591s" podCreationTimestamp="2026-03-09 14:29:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:29:09.210685674 +0000 UTC m=+1589.766254260" watchObservedRunningTime="2026-03-09 14:29:09.216750591 +0000 UTC m=+1589.772319167" Mar 09 14:29:09 crc kubenswrapper[4722]: I0309 14:29:09.455447 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 09 14:29:09 crc kubenswrapper[4722]: I0309 14:29:09.499377 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="b58c3f1a-21b8-4a75-bfb5-970122e20db6" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.4:8775/\": read tcp 10.217.0.2:55258->10.217.1.4:8775: read: connection reset by peer" Mar 09 14:29:09 crc kubenswrapper[4722]: I0309 14:29:09.500120 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="b58c3f1a-21b8-4a75-bfb5-970122e20db6" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.4:8775/\": read tcp 
10.217.0.2:55272->10.217.1.4:8775: read: connection reset by peer" Mar 09 14:29:10 crc kubenswrapper[4722]: I0309 14:29:10.167943 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9226bef9-b3b3-4f10-8e02-6fbc710bc4f3" path="/var/lib/kubelet/pods/9226bef9-b3b3-4f10-8e02-6fbc710bc4f3/volumes" Mar 09 14:29:10 crc kubenswrapper[4722]: I0309 14:29:10.231788 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"32835244-16f5-407f-87fe-8e9f860879a1","Type":"ContainerStarted","Data":"cd1baac69ca27665f23f54b2848a483ab2e3e2d3ff6969486aaef6d610fb4820"} Mar 09 14:29:10 crc kubenswrapper[4722]: I0309 14:29:10.239445 4722 generic.go:334] "Generic (PLEG): container finished" podID="b58c3f1a-21b8-4a75-bfb5-970122e20db6" containerID="5891fa687a19a92c246a30786d3a0fe3fca15abe1ec15a94af6e37d35fff007a" exitCode=0 Mar 09 14:29:10 crc kubenswrapper[4722]: I0309 14:29:10.239535 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b58c3f1a-21b8-4a75-bfb5-970122e20db6","Type":"ContainerDied","Data":"5891fa687a19a92c246a30786d3a0fe3fca15abe1ec15a94af6e37d35fff007a"} Mar 09 14:29:10 crc kubenswrapper[4722]: I0309 14:29:10.715804 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 09 14:29:10 crc kubenswrapper[4722]: I0309 14:29:10.842988 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-clvgn\" (UniqueName: \"kubernetes.io/projected/b58c3f1a-21b8-4a75-bfb5-970122e20db6-kube-api-access-clvgn\") pod \"b58c3f1a-21b8-4a75-bfb5-970122e20db6\" (UID: \"b58c3f1a-21b8-4a75-bfb5-970122e20db6\") " Mar 09 14:29:10 crc kubenswrapper[4722]: I0309 14:29:10.843101 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b58c3f1a-21b8-4a75-bfb5-970122e20db6-nova-metadata-tls-certs\") pod \"b58c3f1a-21b8-4a75-bfb5-970122e20db6\" (UID: \"b58c3f1a-21b8-4a75-bfb5-970122e20db6\") " Mar 09 14:29:10 crc kubenswrapper[4722]: I0309 14:29:10.843294 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b58c3f1a-21b8-4a75-bfb5-970122e20db6-logs\") pod \"b58c3f1a-21b8-4a75-bfb5-970122e20db6\" (UID: \"b58c3f1a-21b8-4a75-bfb5-970122e20db6\") " Mar 09 14:29:10 crc kubenswrapper[4722]: I0309 14:29:10.843392 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b58c3f1a-21b8-4a75-bfb5-970122e20db6-combined-ca-bundle\") pod \"b58c3f1a-21b8-4a75-bfb5-970122e20db6\" (UID: \"b58c3f1a-21b8-4a75-bfb5-970122e20db6\") " Mar 09 14:29:10 crc kubenswrapper[4722]: I0309 14:29:10.843424 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b58c3f1a-21b8-4a75-bfb5-970122e20db6-config-data\") pod \"b58c3f1a-21b8-4a75-bfb5-970122e20db6\" (UID: \"b58c3f1a-21b8-4a75-bfb5-970122e20db6\") " Mar 09 14:29:10 crc kubenswrapper[4722]: I0309 14:29:10.853474 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b58c3f1a-21b8-4a75-bfb5-970122e20db6-logs" (OuterVolumeSpecName: "logs") pod "b58c3f1a-21b8-4a75-bfb5-970122e20db6" (UID: "b58c3f1a-21b8-4a75-bfb5-970122e20db6"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:29:10 crc kubenswrapper[4722]: I0309 14:29:10.861460 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b58c3f1a-21b8-4a75-bfb5-970122e20db6-kube-api-access-clvgn" (OuterVolumeSpecName: "kube-api-access-clvgn") pod "b58c3f1a-21b8-4a75-bfb5-970122e20db6" (UID: "b58c3f1a-21b8-4a75-bfb5-970122e20db6"). InnerVolumeSpecName "kube-api-access-clvgn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:29:10 crc kubenswrapper[4722]: I0309 14:29:10.895831 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b58c3f1a-21b8-4a75-bfb5-970122e20db6-config-data" (OuterVolumeSpecName: "config-data") pod "b58c3f1a-21b8-4a75-bfb5-970122e20db6" (UID: "b58c3f1a-21b8-4a75-bfb5-970122e20db6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:29:10 crc kubenswrapper[4722]: I0309 14:29:10.898122 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b58c3f1a-21b8-4a75-bfb5-970122e20db6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b58c3f1a-21b8-4a75-bfb5-970122e20db6" (UID: "b58c3f1a-21b8-4a75-bfb5-970122e20db6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:29:10 crc kubenswrapper[4722]: I0309 14:29:10.946389 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b58c3f1a-21b8-4a75-bfb5-970122e20db6-logs\") on node \"crc\" DevicePath \"\"" Mar 09 14:29:10 crc kubenswrapper[4722]: I0309 14:29:10.946733 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b58c3f1a-21b8-4a75-bfb5-970122e20db6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:29:10 crc kubenswrapper[4722]: I0309 14:29:10.946748 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b58c3f1a-21b8-4a75-bfb5-970122e20db6-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 14:29:10 crc kubenswrapper[4722]: I0309 14:29:10.946762 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-clvgn\" (UniqueName: \"kubernetes.io/projected/b58c3f1a-21b8-4a75-bfb5-970122e20db6-kube-api-access-clvgn\") on node \"crc\" DevicePath \"\"" Mar 09 14:29:10 crc kubenswrapper[4722]: I0309 14:29:10.952326 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b58c3f1a-21b8-4a75-bfb5-970122e20db6-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "b58c3f1a-21b8-4a75-bfb5-970122e20db6" (UID: "b58c3f1a-21b8-4a75-bfb5-970122e20db6"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:29:11 crc kubenswrapper[4722]: I0309 14:29:11.048877 4722 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b58c3f1a-21b8-4a75-bfb5-970122e20db6-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 14:29:11 crc kubenswrapper[4722]: I0309 14:29:11.252709 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 09 14:29:11 crc kubenswrapper[4722]: I0309 14:29:11.252722 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b58c3f1a-21b8-4a75-bfb5-970122e20db6","Type":"ContainerDied","Data":"fa2d300a8b9c0a65b4a54981c995c94cee16cae4cfb386064d800981c5814874"} Mar 09 14:29:11 crc kubenswrapper[4722]: I0309 14:29:11.252773 4722 scope.go:117] "RemoveContainer" containerID="5891fa687a19a92c246a30786d3a0fe3fca15abe1ec15a94af6e37d35fff007a" Mar 09 14:29:11 crc kubenswrapper[4722]: I0309 14:29:11.260383 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"32835244-16f5-407f-87fe-8e9f860879a1","Type":"ContainerStarted","Data":"11e3c915b7b5f964267f5dd40b1448790d2c0b4d7f3c65d0c4a79ef035fc7b12"} Mar 09 14:29:11 crc kubenswrapper[4722]: I0309 14:29:11.301095 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 14:29:11 crc kubenswrapper[4722]: I0309 14:29:11.306153 4722 scope.go:117] "RemoveContainer" containerID="a1b1a60a9d6e55a96d906a241559260ee757706a4d44f98b4b1268ac1a05ec9c" Mar 09 14:29:11 crc kubenswrapper[4722]: I0309 14:29:11.317119 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 14:29:11 crc kubenswrapper[4722]: I0309 14:29:11.330007 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 09 14:29:11 crc kubenswrapper[4722]: E0309 14:29:11.330675 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b58c3f1a-21b8-4a75-bfb5-970122e20db6" containerName="nova-metadata-log" Mar 09 14:29:11 crc kubenswrapper[4722]: I0309 14:29:11.330689 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="b58c3f1a-21b8-4a75-bfb5-970122e20db6" containerName="nova-metadata-log" Mar 09 14:29:11 crc kubenswrapper[4722]: E0309 14:29:11.330706 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b58c3f1a-21b8-4a75-bfb5-970122e20db6" containerName="nova-metadata-metadata" Mar 09 14:29:11 crc kubenswrapper[4722]: I0309 14:29:11.330712 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="b58c3f1a-21b8-4a75-bfb5-970122e20db6" containerName="nova-metadata-metadata" Mar 09 14:29:11 crc kubenswrapper[4722]: I0309 14:29:11.330918 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="b58c3f1a-21b8-4a75-bfb5-970122e20db6" containerName="nova-metadata-log" Mar 09 14:29:11 crc kubenswrapper[4722]: I0309 14:29:11.330935 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="b58c3f1a-21b8-4a75-bfb5-970122e20db6" containerName="nova-metadata-metadata" Mar 09 14:29:11 crc kubenswrapper[4722]: I0309 14:29:11.332191 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 09 14:29:11 crc kubenswrapper[4722]: I0309 14:29:11.335166 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 09 14:29:11 crc kubenswrapper[4722]: I0309 14:29:11.336448 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 09 14:29:11 crc kubenswrapper[4722]: I0309 14:29:11.355075 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 14:29:11 crc kubenswrapper[4722]: I0309 14:29:11.458346 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5a79b9f-1947-4307-bc83-cba88e1e00cf-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f5a79b9f-1947-4307-bc83-cba88e1e00cf\") " pod="openstack/nova-metadata-0" Mar 09 14:29:11 crc kubenswrapper[4722]: I0309 14:29:11.458409 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5a79b9f-1947-4307-bc83-cba88e1e00cf-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f5a79b9f-1947-4307-bc83-cba88e1e00cf\") " pod="openstack/nova-metadata-0" Mar 09 14:29:11 crc kubenswrapper[4722]: I0309 14:29:11.458509 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffflj\" (UniqueName: \"kubernetes.io/projected/f5a79b9f-1947-4307-bc83-cba88e1e00cf-kube-api-access-ffflj\") pod \"nova-metadata-0\" (UID: \"f5a79b9f-1947-4307-bc83-cba88e1e00cf\") " pod="openstack/nova-metadata-0" Mar 09 14:29:11 crc kubenswrapper[4722]: I0309 14:29:11.458597 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5a79b9f-1947-4307-bc83-cba88e1e00cf-config-data\") pod \"nova-metadata-0\" (UID: \"f5a79b9f-1947-4307-bc83-cba88e1e00cf\") " pod="openstack/nova-metadata-0" Mar 09 14:29:11 crc kubenswrapper[4722]: I0309 14:29:11.458635 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f5a79b9f-1947-4307-bc83-cba88e1e00cf-logs\") pod \"nova-metadata-0\" (UID: \"f5a79b9f-1947-4307-bc83-cba88e1e00cf\") " pod="openstack/nova-metadata-0" Mar 09 14:29:11 crc kubenswrapper[4722]: I0309 14:29:11.560413 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5a79b9f-1947-4307-bc83-cba88e1e00cf-config-data\") pod \"nova-metadata-0\" (UID: \"f5a79b9f-1947-4307-bc83-cba88e1e00cf\") " pod="openstack/nova-metadata-0" Mar 09 14:29:11 crc kubenswrapper[4722]: I0309 14:29:11.560859 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f5a79b9f-1947-4307-bc83-cba88e1e00cf-logs\") pod \"nova-metadata-0\" (UID: \"f5a79b9f-1947-4307-bc83-cba88e1e00cf\") " pod="openstack/nova-metadata-0" Mar 09 14:29:11 crc kubenswrapper[4722]: I0309 14:29:11.561077 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5a79b9f-1947-4307-bc83-cba88e1e00cf-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f5a79b9f-1947-4307-bc83-cba88e1e00cf\") " pod="openstack/nova-metadata-0" Mar 09 
14:29:11 crc kubenswrapper[4722]: I0309 14:29:11.561114 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5a79b9f-1947-4307-bc83-cba88e1e00cf-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f5a79b9f-1947-4307-bc83-cba88e1e00cf\") " pod="openstack/nova-metadata-0" Mar 09 14:29:11 crc kubenswrapper[4722]: I0309 14:29:11.561236 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffflj\" (UniqueName: \"kubernetes.io/projected/f5a79b9f-1947-4307-bc83-cba88e1e00cf-kube-api-access-ffflj\") pod \"nova-metadata-0\" (UID: \"f5a79b9f-1947-4307-bc83-cba88e1e00cf\") " pod="openstack/nova-metadata-0" Mar 09 14:29:11 crc kubenswrapper[4722]: I0309 14:29:11.561375 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f5a79b9f-1947-4307-bc83-cba88e1e00cf-logs\") pod \"nova-metadata-0\" (UID: \"f5a79b9f-1947-4307-bc83-cba88e1e00cf\") " pod="openstack/nova-metadata-0" Mar 09 14:29:11 crc kubenswrapper[4722]: I0309 14:29:11.565422 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5a79b9f-1947-4307-bc83-cba88e1e00cf-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f5a79b9f-1947-4307-bc83-cba88e1e00cf\") " pod="openstack/nova-metadata-0" Mar 09 14:29:11 crc kubenswrapper[4722]: I0309 14:29:11.565677 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5a79b9f-1947-4307-bc83-cba88e1e00cf-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f5a79b9f-1947-4307-bc83-cba88e1e00cf\") " pod="openstack/nova-metadata-0" Mar 09 14:29:11 crc kubenswrapper[4722]: I0309 14:29:11.566306 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5a79b9f-1947-4307-bc83-cba88e1e00cf-config-data\") pod \"nova-metadata-0\" (UID: \"f5a79b9f-1947-4307-bc83-cba88e1e00cf\") " pod="openstack/nova-metadata-0" Mar 09 14:29:11 crc kubenswrapper[4722]: I0309 14:29:11.589967 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffflj\" (UniqueName: \"kubernetes.io/projected/f5a79b9f-1947-4307-bc83-cba88e1e00cf-kube-api-access-ffflj\") pod \"nova-metadata-0\" (UID: \"f5a79b9f-1947-4307-bc83-cba88e1e00cf\") " pod="openstack/nova-metadata-0" Mar 09 14:29:11 crc kubenswrapper[4722]: I0309 14:29:11.665513 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 09 14:29:12 crc kubenswrapper[4722]: I0309 14:29:12.213323 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b58c3f1a-21b8-4a75-bfb5-970122e20db6" path="/var/lib/kubelet/pods/b58c3f1a-21b8-4a75-bfb5-970122e20db6/volumes" Mar 09 14:29:12 crc kubenswrapper[4722]: I0309 14:29:12.214602 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 09 14:29:12 crc kubenswrapper[4722]: I0309 14:29:12.277557 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"32835244-16f5-407f-87fe-8e9f860879a1","Type":"ContainerStarted","Data":"5ee6f3fb3854cc97d72b1225b0810ee97b98bfdbbaf067f7d20c95620ac51434"} Mar 09 14:29:12 crc kubenswrapper[4722]: I0309 14:29:12.278905 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f5a79b9f-1947-4307-bc83-cba88e1e00cf","Type":"ContainerStarted","Data":"8256b0fcf322c8567f1fd40a74dc0069e487bfe3897dcd171dfdf1af78b085a5"} Mar 09 14:29:12 crc kubenswrapper[4722]: I0309 14:29:12.860146 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 09 14:29:13 crc kubenswrapper[4722]: I0309 14:29:13.304160 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f5a79b9f-1947-4307-bc83-cba88e1e00cf","Type":"ContainerStarted","Data":"ce31e80b125763d6b2b1601b9bd271eb3c47381c84fbed83c8393b7f3be113aa"} Mar 09 14:29:13 crc kubenswrapper[4722]: I0309 14:29:13.304624 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f5a79b9f-1947-4307-bc83-cba88e1e00cf","Type":"ContainerStarted","Data":"73b26d958ec4b13889c24461e01934a5577920bfdd3b4d1f5ca095aa1b3deec7"} Mar 09 14:29:13 crc kubenswrapper[4722]: I0309 14:29:13.330856 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.330832368 podStartE2EDuration="2.330832368s" podCreationTimestamp="2026-03-09 14:29:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:29:13.324898194 +0000 UTC m=+1593.880466760" watchObservedRunningTime="2026-03-09 14:29:13.330832368 +0000 UTC m=+1593.886400964" Mar 09 14:29:14 crc kubenswrapper[4722]: I0309 14:29:14.324019 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"32835244-16f5-407f-87fe-8e9f860879a1","Type":"ContainerStarted","Data":"33efd07beecb95d9c5a7c65c557ef23d3fc4f7096265860a26df35d8467ea987"} Mar 09 14:29:15 crc kubenswrapper[4722]: I0309 14:29:15.339859 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"32835244-16f5-407f-87fe-8e9f860879a1","Type":"ContainerStarted","Data":"17a0c0808de0ddcab290ce82e6cc5add34970fa0f13ed7e0572596829e5f0e53"} Mar 09 14:29:15 crc kubenswrapper[4722]: I0309 14:29:15.381596 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.162551301 podStartE2EDuration="7.381572227s" podCreationTimestamp="2026-03-09 14:29:08 +0000 UTC" firstStartedPulling="2026-03-09 14:29:09.460716391 +0000 UTC m=+1590.016284957" lastFinishedPulling="2026-03-09 14:29:14.679737307 +0000 UTC m=+1595.235305883" observedRunningTime="2026-03-09 14:29:15.363710835 +0000 UTC m=+1595.919279421" watchObservedRunningTime="2026-03-09 14:29:15.381572227 +0000 UTC 
m=+1595.937140803" Mar 09 14:29:15 crc kubenswrapper[4722]: I0309 14:29:15.582022 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 09 14:29:15 crc kubenswrapper[4722]: I0309 14:29:15.582088 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 09 14:29:16 crc kubenswrapper[4722]: I0309 14:29:16.606471 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b915b224-7fbf-4ec6-be9a-7205dd818ed4" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.10:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 14:29:16 crc kubenswrapper[4722]: I0309 14:29:16.606593 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b915b224-7fbf-4ec6-be9a-7205dd818ed4" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.10:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 14:29:16 crc kubenswrapper[4722]: I0309 14:29:16.666456 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 09 14:29:16 crc kubenswrapper[4722]: I0309 14:29:16.666510 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 09 14:29:17 crc kubenswrapper[4722]: I0309 14:29:17.860217 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 09 14:29:17 crc kubenswrapper[4722]: I0309 14:29:17.895636 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 09 14:29:18 crc kubenswrapper[4722]: I0309 14:29:18.433709 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 09 14:29:21 crc kubenswrapper[4722]: I0309 14:29:21.528101 4722 patch_prober.go:28] interesting pod/machine-config-daemon-hjrrb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 14:29:21 crc kubenswrapper[4722]: I0309 14:29:21.528792 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 14:29:21 crc kubenswrapper[4722]: I0309 14:29:21.666665 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 09 14:29:21 crc kubenswrapper[4722]: I0309 14:29:21.666744 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 09 14:29:22 crc kubenswrapper[4722]: I0309 14:29:22.683375 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="f5a79b9f-1947-4307-bc83-cba88e1e00cf" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.13:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 14:29:22 crc kubenswrapper[4722]: I0309 14:29:22.683459 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" 
podUID="f5a79b9f-1947-4307-bc83-cba88e1e00cf" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.13:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 14:29:23 crc kubenswrapper[4722]: I0309 14:29:23.316278 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rq8bb"] Mar 09 14:29:23 crc kubenswrapper[4722]: I0309 14:29:23.319700 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rq8bb" Mar 09 14:29:23 crc kubenswrapper[4722]: I0309 14:29:23.371838 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5800ac8e-62cf-446c-a10f-cde22ba63b77-catalog-content\") pod \"certified-operators-rq8bb\" (UID: \"5800ac8e-62cf-446c-a10f-cde22ba63b77\") " pod="openshift-marketplace/certified-operators-rq8bb" Mar 09 14:29:23 crc kubenswrapper[4722]: I0309 14:29:23.371886 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5800ac8e-62cf-446c-a10f-cde22ba63b77-utilities\") pod \"certified-operators-rq8bb\" (UID: \"5800ac8e-62cf-446c-a10f-cde22ba63b77\") " pod="openshift-marketplace/certified-operators-rq8bb" Mar 09 14:29:23 crc kubenswrapper[4722]: I0309 14:29:23.371926 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s79j6\" (UniqueName: \"kubernetes.io/projected/5800ac8e-62cf-446c-a10f-cde22ba63b77-kube-api-access-s79j6\") pod \"certified-operators-rq8bb\" (UID: \"5800ac8e-62cf-446c-a10f-cde22ba63b77\") " pod="openshift-marketplace/certified-operators-rq8bb" Mar 09 14:29:23 crc kubenswrapper[4722]: I0309 14:29:23.402743 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rq8bb"] Mar 09 14:29:23 crc kubenswrapper[4722]: I0309 14:29:23.474022 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5800ac8e-62cf-446c-a10f-cde22ba63b77-catalog-content\") pod \"certified-operators-rq8bb\" (UID: \"5800ac8e-62cf-446c-a10f-cde22ba63b77\") " pod="openshift-marketplace/certified-operators-rq8bb" Mar 09 14:29:23 crc kubenswrapper[4722]: I0309 14:29:23.474077 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5800ac8e-62cf-446c-a10f-cde22ba63b77-utilities\") pod \"certified-operators-rq8bb\" (UID: \"5800ac8e-62cf-446c-a10f-cde22ba63b77\") " pod="openshift-marketplace/certified-operators-rq8bb" Mar 09 14:29:23 crc kubenswrapper[4722]: I0309 14:29:23.474123 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s79j6\" (UniqueName: \"kubernetes.io/projected/5800ac8e-62cf-446c-a10f-cde22ba63b77-kube-api-access-s79j6\") pod \"certified-operators-rq8bb\" (UID: \"5800ac8e-62cf-446c-a10f-cde22ba63b77\") " pod="openshift-marketplace/certified-operators-rq8bb" Mar 09 14:29:23 crc kubenswrapper[4722]: I0309 14:29:23.474593 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5800ac8e-62cf-446c-a10f-cde22ba63b77-catalog-content\") pod \"certified-operators-rq8bb\" (UID: \"5800ac8e-62cf-446c-a10f-cde22ba63b77\") " 
pod="openshift-marketplace/certified-operators-rq8bb" Mar 09 14:29:23 crc kubenswrapper[4722]: I0309 14:29:23.474699 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5800ac8e-62cf-446c-a10f-cde22ba63b77-utilities\") pod \"certified-operators-rq8bb\" (UID: \"5800ac8e-62cf-446c-a10f-cde22ba63b77\") " pod="openshift-marketplace/certified-operators-rq8bb" Mar 09 14:29:23 crc kubenswrapper[4722]: I0309 14:29:23.494151 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s79j6\" (UniqueName: \"kubernetes.io/projected/5800ac8e-62cf-446c-a10f-cde22ba63b77-kube-api-access-s79j6\") pod \"certified-operators-rq8bb\" (UID: \"5800ac8e-62cf-446c-a10f-cde22ba63b77\") " pod="openshift-marketplace/certified-operators-rq8bb" Mar 09 14:29:23 crc kubenswrapper[4722]: I0309 14:29:23.640015 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rq8bb" Mar 09 14:29:24 crc kubenswrapper[4722]: I0309 14:29:24.432486 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 09 14:29:24 crc kubenswrapper[4722]: W0309 14:29:24.561087 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5800ac8e_62cf_446c_a10f_cde22ba63b77.slice/crio-14cebcfa0a7815770dfba9ca0141d618db06737b5cb19d49238ced6a758882a1 WatchSource:0}: Error finding container 14cebcfa0a7815770dfba9ca0141d618db06737b5cb19d49238ced6a758882a1: Status 404 returned error can't find the container with id 14cebcfa0a7815770dfba9ca0141d618db06737b5cb19d49238ced6a758882a1 Mar 09 14:29:24 crc kubenswrapper[4722]: I0309 14:29:24.579058 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rq8bb"] Mar 09 14:29:25 crc kubenswrapper[4722]: I0309 14:29:25.556509 4722 generic.go:334] "Generic (PLEG): container finished" podID="5800ac8e-62cf-446c-a10f-cde22ba63b77" containerID="3c9dbdb8c53c63a4bff3d9da22a0b2f4b9b2c92a688eba5dda7d994cf3b3f1ee" exitCode=0 Mar 09 14:29:25 crc kubenswrapper[4722]: I0309 14:29:25.556598 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rq8bb" event={"ID":"5800ac8e-62cf-446c-a10f-cde22ba63b77","Type":"ContainerDied","Data":"3c9dbdb8c53c63a4bff3d9da22a0b2f4b9b2c92a688eba5dda7d994cf3b3f1ee"} Mar 09 14:29:25 crc kubenswrapper[4722]: I0309 14:29:25.557076 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rq8bb" event={"ID":"5800ac8e-62cf-446c-a10f-cde22ba63b77","Type":"ContainerStarted","Data":"14cebcfa0a7815770dfba9ca0141d618db06737b5cb19d49238ced6a758882a1"} Mar 09 14:29:25 crc kubenswrapper[4722]: I0309 14:29:25.597948 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 09 14:29:25 crc kubenswrapper[4722]: I0309 14:29:25.598789 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 09 14:29:25 crc kubenswrapper[4722]: I0309 14:29:25.607704 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 09 14:29:25 crc kubenswrapper[4722]: I0309 14:29:25.615756 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 09 14:29:26 crc kubenswrapper[4722]: I0309 14:29:26.568103 4722 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 09 14:29:26 crc kubenswrapper[4722]: I0309 14:29:26.575393 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 09 14:29:27 crc kubenswrapper[4722]: I0309 14:29:27.604366 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rq8bb" event={"ID":"5800ac8e-62cf-446c-a10f-cde22ba63b77","Type":"ContainerStarted","Data":"dcce2dd22bcf030f8f9685f1d097bf581b742cac2b013e44234239c9a446ba32"} Mar 09 14:29:28 crc kubenswrapper[4722]: E0309 14:29:28.957798 4722 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5800ac8e_62cf_446c_a10f_cde22ba63b77.slice/crio-dcce2dd22bcf030f8f9685f1d097bf581b742cac2b013e44234239c9a446ba32.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5800ac8e_62cf_446c_a10f_cde22ba63b77.slice/crio-conmon-dcce2dd22bcf030f8f9685f1d097bf581b742cac2b013e44234239c9a446ba32.scope\": RecentStats: unable to find data in memory cache]" Mar 09 14:29:29 crc kubenswrapper[4722]: I0309 14:29:29.488883 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 09 14:29:29 crc kubenswrapper[4722]: I0309 14:29:29.489124 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="4d8e49ae-5932-472c-a714-7872980c5a9b" containerName="kube-state-metrics" containerID="cri-o://f46a83809e16de46291a90c86e9dec05353037c7e623d1559e8436af46ab3f4a" gracePeriod=30 Mar 09 14:29:29 crc kubenswrapper[4722]: I0309 14:29:29.631592 4722 generic.go:334] "Generic (PLEG): container finished" podID="4d8e49ae-5932-472c-a714-7872980c5a9b" containerID="f46a83809e16de46291a90c86e9dec05353037c7e623d1559e8436af46ab3f4a" exitCode=2 Mar 09 14:29:29 crc kubenswrapper[4722]: I0309 14:29:29.631723 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4d8e49ae-5932-472c-a714-7872980c5a9b","Type":"ContainerDied","Data":"f46a83809e16de46291a90c86e9dec05353037c7e623d1559e8436af46ab3f4a"} Mar 09 14:29:29 crc kubenswrapper[4722]: I0309 14:29:29.633968 4722 generic.go:334] "Generic (PLEG): container finished" podID="5800ac8e-62cf-446c-a10f-cde22ba63b77" containerID="dcce2dd22bcf030f8f9685f1d097bf581b742cac2b013e44234239c9a446ba32" exitCode=0 Mar 09 14:29:29 crc kubenswrapper[4722]: I0309 14:29:29.634007 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rq8bb" event={"ID":"5800ac8e-62cf-446c-a10f-cde22ba63b77","Type":"ContainerDied","Data":"dcce2dd22bcf030f8f9685f1d097bf581b742cac2b013e44234239c9a446ba32"} Mar 09 14:29:29 crc kubenswrapper[4722]: I0309 14:29:29.702022 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-0"] Mar 09 14:29:29 crc kubenswrapper[4722]: I0309 14:29:29.702909 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mysqld-exporter-0" podUID="cdb981de-3376-4da4-834d-ae2446f02b8e" containerName="mysqld-exporter" containerID="cri-o://a923ea13e4da33d77768149e9e6f1a1366d9961f621630001d7b5d50fdd6ce08" gracePeriod=30 Mar 09 14:29:30 crc kubenswrapper[4722]: I0309 14:29:30.179576 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 09 14:29:30 crc kubenswrapper[4722]: I0309 14:29:30.285079 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcfrm\" (UniqueName: \"kubernetes.io/projected/4d8e49ae-5932-472c-a714-7872980c5a9b-kube-api-access-xcfrm\") pod \"4d8e49ae-5932-472c-a714-7872980c5a9b\" (UID: \"4d8e49ae-5932-472c-a714-7872980c5a9b\") " Mar 09 14:29:30 crc kubenswrapper[4722]: I0309 14:29:30.297575 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d8e49ae-5932-472c-a714-7872980c5a9b-kube-api-access-xcfrm" (OuterVolumeSpecName: "kube-api-access-xcfrm") pod "4d8e49ae-5932-472c-a714-7872980c5a9b" (UID: "4d8e49ae-5932-472c-a714-7872980c5a9b"). InnerVolumeSpecName "kube-api-access-xcfrm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:29:30 crc kubenswrapper[4722]: I0309 14:29:30.304119 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Mar 09 14:29:30 crc kubenswrapper[4722]: I0309 14:29:30.386945 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdb981de-3376-4da4-834d-ae2446f02b8e-config-data\") pod \"cdb981de-3376-4da4-834d-ae2446f02b8e\" (UID: \"cdb981de-3376-4da4-834d-ae2446f02b8e\") " Mar 09 14:29:30 crc kubenswrapper[4722]: I0309 14:29:30.387412 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9v7s7\" (UniqueName: \"kubernetes.io/projected/cdb981de-3376-4da4-834d-ae2446f02b8e-kube-api-access-9v7s7\") pod \"cdb981de-3376-4da4-834d-ae2446f02b8e\" (UID: \"cdb981de-3376-4da4-834d-ae2446f02b8e\") " Mar 09 14:29:30 crc kubenswrapper[4722]: I0309 14:29:30.387518 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdb981de-3376-4da4-834d-ae2446f02b8e-combined-ca-bundle\") pod \"cdb981de-3376-4da4-834d-ae2446f02b8e\" (UID: \"cdb981de-3376-4da4-834d-ae2446f02b8e\") " Mar 09 14:29:30 crc kubenswrapper[4722]: I0309 14:29:30.388374 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcfrm\" (UniqueName: \"kubernetes.io/projected/4d8e49ae-5932-472c-a714-7872980c5a9b-kube-api-access-xcfrm\") on node \"crc\" DevicePath \"\"" Mar 09 14:29:30 crc kubenswrapper[4722]: I0309 14:29:30.392695 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdb981de-3376-4da4-834d-ae2446f02b8e-kube-api-access-9v7s7" (OuterVolumeSpecName: "kube-api-access-9v7s7") pod "cdb981de-3376-4da4-834d-ae2446f02b8e" (UID: "cdb981de-3376-4da4-834d-ae2446f02b8e"). InnerVolumeSpecName "kube-api-access-9v7s7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:29:30 crc kubenswrapper[4722]: I0309 14:29:30.429461 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdb981de-3376-4da4-834d-ae2446f02b8e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cdb981de-3376-4da4-834d-ae2446f02b8e" (UID: "cdb981de-3376-4da4-834d-ae2446f02b8e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:29:30 crc kubenswrapper[4722]: I0309 14:29:30.453905 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdb981de-3376-4da4-834d-ae2446f02b8e-config-data" (OuterVolumeSpecName: "config-data") pod "cdb981de-3376-4da4-834d-ae2446f02b8e" (UID: "cdb981de-3376-4da4-834d-ae2446f02b8e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:29:30 crc kubenswrapper[4722]: I0309 14:29:30.491010 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9v7s7\" (UniqueName: \"kubernetes.io/projected/cdb981de-3376-4da4-834d-ae2446f02b8e-kube-api-access-9v7s7\") on node \"crc\" DevicePath \"\"" Mar 09 14:29:30 crc kubenswrapper[4722]: I0309 14:29:30.491046 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdb981de-3376-4da4-834d-ae2446f02b8e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:29:30 crc kubenswrapper[4722]: I0309 14:29:30.491057 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdb981de-3376-4da4-834d-ae2446f02b8e-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 14:29:30 crc kubenswrapper[4722]: I0309 14:29:30.647971 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4d8e49ae-5932-472c-a714-7872980c5a9b","Type":"ContainerDied","Data":"2dcc0b6e482a4bf6c4d23c45dbd2fab04460373e1df31b072003bbb064cfafc1"} Mar 09 14:29:30 crc kubenswrapper[4722]: I0309 14:29:30.648032 4722 scope.go:117] "RemoveContainer" containerID="f46a83809e16de46291a90c86e9dec05353037c7e623d1559e8436af46ab3f4a" Mar 09 14:29:30 crc kubenswrapper[4722]: I0309 14:29:30.647984 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 09 14:29:30 crc kubenswrapper[4722]: I0309 14:29:30.664048 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rq8bb" event={"ID":"5800ac8e-62cf-446c-a10f-cde22ba63b77","Type":"ContainerStarted","Data":"7b04ddba55acff2e8e3982d2527b79aad7aceabe390d82575421248e849731a4"} Mar 09 14:29:30 crc kubenswrapper[4722]: I0309 14:29:30.672867 4722 generic.go:334] "Generic (PLEG): container finished" podID="cdb981de-3376-4da4-834d-ae2446f02b8e" containerID="a923ea13e4da33d77768149e9e6f1a1366d9961f621630001d7b5d50fdd6ce08" exitCode=2 Mar 09 14:29:30 crc kubenswrapper[4722]: I0309 14:29:30.672916 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"cdb981de-3376-4da4-834d-ae2446f02b8e","Type":"ContainerDied","Data":"a923ea13e4da33d77768149e9e6f1a1366d9961f621630001d7b5d50fdd6ce08"} Mar 09 14:29:30 crc kubenswrapper[4722]: I0309 14:29:30.672947 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"cdb981de-3376-4da4-834d-ae2446f02b8e","Type":"ContainerDied","Data":"acb467c49c3d1959f49b618ffd68b5618a2943275f9f2a4bc3d5ebb9d50257f3"} Mar 09 14:29:30 crc kubenswrapper[4722]: I0309 14:29:30.672987 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-0" Mar 09 14:29:30 crc kubenswrapper[4722]: I0309 14:29:30.693892 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rq8bb" podStartSLOduration=3.143918456 podStartE2EDuration="7.693873237s" podCreationTimestamp="2026-03-09 14:29:23 +0000 UTC" firstStartedPulling="2026-03-09 14:29:25.560174514 +0000 UTC m=+1606.115743090" lastFinishedPulling="2026-03-09 14:29:30.110129285 +0000 UTC m=+1610.665697871" observedRunningTime="2026-03-09 14:29:30.686998688 +0000 UTC m=+1611.242567274" watchObservedRunningTime="2026-03-09 14:29:30.693873237 +0000 UTC m=+1611.249441813" Mar 09 14:29:30 crc kubenswrapper[4722]: I0309 14:29:30.746280 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-0"] Mar 09 14:29:30 crc kubenswrapper[4722]: I0309 14:29:30.753434 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-0"] Mar 09 14:29:30 crc kubenswrapper[4722]: I0309 14:29:30.764906 4722 scope.go:117] "RemoveContainer" containerID="a923ea13e4da33d77768149e9e6f1a1366d9961f621630001d7b5d50fdd6ce08" Mar 09 14:29:30 crc kubenswrapper[4722]: I0309 14:29:30.774581 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 09 14:29:30 crc kubenswrapper[4722]: I0309 14:29:30.790236 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 09 14:29:30 crc kubenswrapper[4722]: I0309 14:29:30.801712 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-0"] Mar 09 14:29:30 crc kubenswrapper[4722]: E0309 14:29:30.802267 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdb981de-3376-4da4-834d-ae2446f02b8e" containerName="mysqld-exporter" Mar 09 14:29:30 crc kubenswrapper[4722]: I0309 14:29:30.802282 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdb981de-3376-4da4-834d-ae2446f02b8e" containerName="mysqld-exporter" Mar 09 14:29:30 crc kubenswrapper[4722]: E0309 14:29:30.802332 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d8e49ae-5932-472c-a714-7872980c5a9b" containerName="kube-state-metrics" Mar 09 14:29:30 crc kubenswrapper[4722]: I0309 14:29:30.802339 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d8e49ae-5932-472c-a714-7872980c5a9b" containerName="kube-state-metrics" Mar 09 14:29:30 crc kubenswrapper[4722]: I0309 14:29:30.802566 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d8e49ae-5932-472c-a714-7872980c5a9b" containerName="kube-state-metrics" Mar 09 14:29:30 crc kubenswrapper[4722]: I0309 14:29:30.802580 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdb981de-3376-4da4-834d-ae2446f02b8e" containerName="mysqld-exporter" Mar 09 14:29:30 crc kubenswrapper[4722]: I0309 14:29:30.803683 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-0" Mar 09 14:29:30 crc kubenswrapper[4722]: I0309 14:29:30.803686 4722 scope.go:117] "RemoveContainer" containerID="a923ea13e4da33d77768149e9e6f1a1366d9961f621630001d7b5d50fdd6ce08" Mar 09 14:29:30 crc kubenswrapper[4722]: E0309 14:29:30.805064 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a923ea13e4da33d77768149e9e6f1a1366d9961f621630001d7b5d50fdd6ce08\": container with ID starting with a923ea13e4da33d77768149e9e6f1a1366d9961f621630001d7b5d50fdd6ce08 not found: ID does not exist" containerID="a923ea13e4da33d77768149e9e6f1a1366d9961f621630001d7b5d50fdd6ce08" Mar 09 14:29:30 crc kubenswrapper[4722]: I0309 14:29:30.805157 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a923ea13e4da33d77768149e9e6f1a1366d9961f621630001d7b5d50fdd6ce08"} err="failed to get container status \"a923ea13e4da33d77768149e9e6f1a1366d9961f621630001d7b5d50fdd6ce08\": rpc error: code = NotFound desc = could not find container \"a923ea13e4da33d77768149e9e6f1a1366d9961f621630001d7b5d50fdd6ce08\": container with ID starting with a923ea13e4da33d77768149e9e6f1a1366d9961f621630001d7b5d50fdd6ce08 not found: ID does not exist" Mar 09 14:29:30 crc kubenswrapper[4722]: I0309 14:29:30.805966 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-config-data" Mar 09 14:29:30 crc kubenswrapper[4722]: I0309 14:29:30.806827 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-mysqld-exporter-svc" Mar 09 14:29:30 crc kubenswrapper[4722]: I0309 14:29:30.814226 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 09 14:29:30 crc kubenswrapper[4722]: I0309 14:29:30.816850 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 09 14:29:30 crc kubenswrapper[4722]: I0309 14:29:30.820215 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Mar 09 14:29:30 crc kubenswrapper[4722]: I0309 14:29:30.822046 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Mar 09 14:29:30 crc kubenswrapper[4722]: I0309 14:29:30.829362 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Mar 09 14:29:30 crc kubenswrapper[4722]: I0309 14:29:30.841994 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 09 14:29:30 crc kubenswrapper[4722]: I0309 14:29:30.905033 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/652c419b-2a86-4c6f-ac7a-c2d7818ef55f-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"652c419b-2a86-4c6f-ac7a-c2d7818ef55f\") " pod="openstack/kube-state-metrics-0" Mar 09 14:29:30 crc kubenswrapper[4722]: I0309 14:29:30.905108 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwwnh\" (UniqueName: \"kubernetes.io/projected/652c419b-2a86-4c6f-ac7a-c2d7818ef55f-kube-api-access-zwwnh\") pod \"kube-state-metrics-0\" (UID: \"652c419b-2a86-4c6f-ac7a-c2d7818ef55f\") " pod="openstack/kube-state-metrics-0" Mar 09 14:29:30 crc kubenswrapper[4722]: I0309 14:29:30.905227 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f04dc47-34bc-4124-b129-f0c643f73284-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"5f04dc47-34bc-4124-b129-f0c643f73284\") " pod="openstack/mysqld-exporter-0" Mar 09 14:29:30 crc kubenswrapper[4722]: I0309 14:29:30.905447 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f04dc47-34bc-4124-b129-f0c643f73284-config-data\") pod \"mysqld-exporter-0\" (UID: \"5f04dc47-34bc-4124-b129-f0c643f73284\") " pod="openstack/mysqld-exporter-0" Mar 09 14:29:30 crc kubenswrapper[4722]: I0309 14:29:30.905593 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/652c419b-2a86-4c6f-ac7a-c2d7818ef55f-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"652c419b-2a86-4c6f-ac7a-c2d7818ef55f\") " pod="openstack/kube-state-metrics-0" Mar 09 14:29:30 crc kubenswrapper[4722]: I0309 14:29:30.905692 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/652c419b-2a86-4c6f-ac7a-c2d7818ef55f-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"652c419b-2a86-4c6f-ac7a-c2d7818ef55f\") " pod="openstack/kube-state-metrics-0" Mar 09 14:29:30 crc kubenswrapper[4722]: I0309 14:29:30.905739 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f04dc47-34bc-4124-b129-f0c643f73284-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"5f04dc47-34bc-4124-b129-f0c643f73284\") " pod="openstack/mysqld-exporter-0" Mar 09 14:29:30 crc 
kubenswrapper[4722]: I0309 14:29:30.905798 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqlhh\" (UniqueName: \"kubernetes.io/projected/5f04dc47-34bc-4124-b129-f0c643f73284-kube-api-access-kqlhh\") pod \"mysqld-exporter-0\" (UID: \"5f04dc47-34bc-4124-b129-f0c643f73284\") " pod="openstack/mysqld-exporter-0" Mar 09 14:29:31 crc kubenswrapper[4722]: I0309 14:29:31.007634 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f04dc47-34bc-4124-b129-f0c643f73284-config-data\") pod \"mysqld-exporter-0\" (UID: \"5f04dc47-34bc-4124-b129-f0c643f73284\") " pod="openstack/mysqld-exporter-0" Mar 09 14:29:31 crc kubenswrapper[4722]: I0309 14:29:31.007925 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/652c419b-2a86-4c6f-ac7a-c2d7818ef55f-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"652c419b-2a86-4c6f-ac7a-c2d7818ef55f\") " pod="openstack/kube-state-metrics-0" Mar 09 14:29:31 crc kubenswrapper[4722]: I0309 14:29:31.007974 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/652c419b-2a86-4c6f-ac7a-c2d7818ef55f-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"652c419b-2a86-4c6f-ac7a-c2d7818ef55f\") " pod="openstack/kube-state-metrics-0" Mar 09 14:29:31 crc kubenswrapper[4722]: I0309 14:29:31.007993 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f04dc47-34bc-4124-b129-f0c643f73284-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"5f04dc47-34bc-4124-b129-f0c643f73284\") " pod="openstack/mysqld-exporter-0" Mar 09 14:29:31 crc kubenswrapper[4722]: I0309 14:29:31.008013 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqlhh\" (UniqueName: \"kubernetes.io/projected/5f04dc47-34bc-4124-b129-f0c643f73284-kube-api-access-kqlhh\") pod \"mysqld-exporter-0\" (UID: \"5f04dc47-34bc-4124-b129-f0c643f73284\") " pod="openstack/mysqld-exporter-0" Mar 09 14:29:31 crc kubenswrapper[4722]: I0309 14:29:31.008082 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/652c419b-2a86-4c6f-ac7a-c2d7818ef55f-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"652c419b-2a86-4c6f-ac7a-c2d7818ef55f\") " pod="openstack/kube-state-metrics-0" Mar 09 14:29:31 crc kubenswrapper[4722]: I0309 14:29:31.008108 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwwnh\" (UniqueName: \"kubernetes.io/projected/652c419b-2a86-4c6f-ac7a-c2d7818ef55f-kube-api-access-zwwnh\") pod \"kube-state-metrics-0\" (UID: \"652c419b-2a86-4c6f-ac7a-c2d7818ef55f\") " pod="openstack/kube-state-metrics-0" Mar 09 14:29:31 crc kubenswrapper[4722]: I0309 14:29:31.008159 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f04dc47-34bc-4124-b129-f0c643f73284-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"5f04dc47-34bc-4124-b129-f0c643f73284\") " pod="openstack/mysqld-exporter-0" Mar 09 14:29:31 crc kubenswrapper[4722]: I0309 14:29:31.013384 4722 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/652c419b-2a86-4c6f-ac7a-c2d7818ef55f-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"652c419b-2a86-4c6f-ac7a-c2d7818ef55f\") " pod="openstack/kube-state-metrics-0" Mar 09 14:29:31 crc kubenswrapper[4722]: I0309 14:29:31.013711 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f04dc47-34bc-4124-b129-f0c643f73284-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"5f04dc47-34bc-4124-b129-f0c643f73284\") " pod="openstack/mysqld-exporter-0" Mar 09 14:29:31 crc kubenswrapper[4722]: I0309 14:29:31.014648 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f04dc47-34bc-4124-b129-f0c643f73284-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"5f04dc47-34bc-4124-b129-f0c643f73284\") " pod="openstack/mysqld-exporter-0" Mar 09 14:29:31 crc kubenswrapper[4722]: I0309 14:29:31.015089 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/652c419b-2a86-4c6f-ac7a-c2d7818ef55f-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"652c419b-2a86-4c6f-ac7a-c2d7818ef55f\") " pod="openstack/kube-state-metrics-0" Mar 09 14:29:31 crc kubenswrapper[4722]: I0309 14:29:31.015741 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f04dc47-34bc-4124-b129-f0c643f73284-config-data\") pod \"mysqld-exporter-0\" (UID: \"5f04dc47-34bc-4124-b129-f0c643f73284\") " pod="openstack/mysqld-exporter-0" Mar 09 14:29:31 crc kubenswrapper[4722]: I0309 14:29:31.019005 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/652c419b-2a86-4c6f-ac7a-c2d7818ef55f-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"652c419b-2a86-4c6f-ac7a-c2d7818ef55f\") " pod="openstack/kube-state-metrics-0" Mar 09 14:29:31 crc kubenswrapper[4722]: I0309 14:29:31.030020 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwwnh\" (UniqueName: \"kubernetes.io/projected/652c419b-2a86-4c6f-ac7a-c2d7818ef55f-kube-api-access-zwwnh\") pod \"kube-state-metrics-0\" (UID: \"652c419b-2a86-4c6f-ac7a-c2d7818ef55f\") " pod="openstack/kube-state-metrics-0" Mar 09 14:29:31 crc kubenswrapper[4722]: I0309 14:29:31.037939 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqlhh\" (UniqueName: \"kubernetes.io/projected/5f04dc47-34bc-4124-b129-f0c643f73284-kube-api-access-kqlhh\") pod \"mysqld-exporter-0\" (UID: \"5f04dc47-34bc-4124-b129-f0c643f73284\") " pod="openstack/mysqld-exporter-0" Mar 09 14:29:31 crc kubenswrapper[4722]: I0309 14:29:31.136235 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Mar 09 14:29:31 crc kubenswrapper[4722]: I0309 14:29:31.155591 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 09 14:29:31 crc kubenswrapper[4722]: I0309 14:29:31.678267 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 09 14:29:31 crc kubenswrapper[4722]: I0309 14:29:31.681405 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 09 14:29:31 crc kubenswrapper[4722]: I0309 14:29:31.705666 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 09 14:29:31 crc kubenswrapper[4722]: I0309 14:29:31.749613 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 09 14:29:31 crc kubenswrapper[4722]: W0309 14:29:31.759549 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f04dc47_34bc_4124_b129_f0c643f73284.slice/crio-07904fff5268543c6b385ee416c06b447b5e2d22541a78cef3ddd39bbf4cbf88 WatchSource:0}: Error finding container 07904fff5268543c6b385ee416c06b447b5e2d22541a78cef3ddd39bbf4cbf88: Status 404 returned error can't find the container with id 07904fff5268543c6b385ee416c06b447b5e2d22541a78cef3ddd39bbf4cbf88 Mar 09 14:29:31 crc kubenswrapper[4722]: I0309 14:29:31.786237 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Mar 09 14:29:32 crc kubenswrapper[4722]: I0309 14:29:32.132768 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 14:29:32 crc kubenswrapper[4722]: I0309 14:29:32.133075 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d791fb7b-d6f9-40c5-97e2-c306a91059f0" containerName="ceilometer-central-agent" containerID="cri-o://0994e76809996dc7b440acf904594d46c2893329840b9b3b1aa81f3e949aba94" gracePeriod=30 Mar 09 14:29:32 crc kubenswrapper[4722]: I0309 14:29:32.133238 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d791fb7b-d6f9-40c5-97e2-c306a91059f0" containerName="proxy-httpd" containerID="cri-o://f786b00170a945de94d0f4f3fa7223c609b5a257826d9fde4fd1d13c213df85b" gracePeriod=30 Mar 09 14:29:32 crc kubenswrapper[4722]: I0309 14:29:32.133308 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d791fb7b-d6f9-40c5-97e2-c306a91059f0" containerName="sg-core" containerID="cri-o://3d69b67b5c4eb8b6756a282ac7b9c516b6ced6c1227a394399484cc65a0277fe" gracePeriod=30 Mar 09 14:29:32 crc kubenswrapper[4722]: I0309 14:29:32.133349 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d791fb7b-d6f9-40c5-97e2-c306a91059f0" containerName="ceilometer-notification-agent" containerID="cri-o://3e7961719fd0f860de9af5fcedcd4ba6553fdc5c1b4547e6b414a69456a8d0bd" gracePeriod=30 Mar 09 14:29:32 crc kubenswrapper[4722]: I0309 14:29:32.162896 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d8e49ae-5932-472c-a714-7872980c5a9b" path="/var/lib/kubelet/pods/4d8e49ae-5932-472c-a714-7872980c5a9b/volumes" Mar 09 14:29:32 crc kubenswrapper[4722]: I0309 14:29:32.164028 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdb981de-3376-4da4-834d-ae2446f02b8e" path="/var/lib/kubelet/pods/cdb981de-3376-4da4-834d-ae2446f02b8e/volumes" Mar 09 14:29:32 crc kubenswrapper[4722]: I0309 14:29:32.699767 4722 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"652c419b-2a86-4c6f-ac7a-c2d7818ef55f","Type":"ContainerStarted","Data":"095d76880e80c386aa9f7fca9391735b55ff0971e29fbda2b8126c70011b1679"} Mar 09 14:29:32 crc kubenswrapper[4722]: I0309 14:29:32.700343 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 09 14:29:32 crc kubenswrapper[4722]: I0309 14:29:32.700388 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"652c419b-2a86-4c6f-ac7a-c2d7818ef55f","Type":"ContainerStarted","Data":"6d37253bf26d78cc16e7d9d1dea5a9d3d03e09337f0bfb77b03abaf4452ce642"} Mar 09 14:29:32 crc kubenswrapper[4722]: I0309 14:29:32.703141 4722 generic.go:334] "Generic (PLEG): container finished" podID="d791fb7b-d6f9-40c5-97e2-c306a91059f0" containerID="f786b00170a945de94d0f4f3fa7223c609b5a257826d9fde4fd1d13c213df85b" exitCode=0 Mar 09 14:29:32 crc kubenswrapper[4722]: I0309 14:29:32.703175 4722 generic.go:334] "Generic (PLEG): container finished" podID="d791fb7b-d6f9-40c5-97e2-c306a91059f0" containerID="3d69b67b5c4eb8b6756a282ac7b9c516b6ced6c1227a394399484cc65a0277fe" exitCode=2 Mar 09 14:29:32 crc kubenswrapper[4722]: I0309 14:29:32.703184 4722 generic.go:334] "Generic (PLEG): container finished" podID="d791fb7b-d6f9-40c5-97e2-c306a91059f0" containerID="0994e76809996dc7b440acf904594d46c2893329840b9b3b1aa81f3e949aba94" exitCode=0 Mar 09 14:29:32 crc kubenswrapper[4722]: I0309 14:29:32.703245 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d791fb7b-d6f9-40c5-97e2-c306a91059f0","Type":"ContainerDied","Data":"f786b00170a945de94d0f4f3fa7223c609b5a257826d9fde4fd1d13c213df85b"} Mar 09 14:29:32 crc kubenswrapper[4722]: I0309 14:29:32.703277 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d791fb7b-d6f9-40c5-97e2-c306a91059f0","Type":"ContainerDied","Data":"3d69b67b5c4eb8b6756a282ac7b9c516b6ced6c1227a394399484cc65a0277fe"} Mar 09 14:29:32 crc kubenswrapper[4722]: I0309 14:29:32.703301 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d791fb7b-d6f9-40c5-97e2-c306a91059f0","Type":"ContainerDied","Data":"0994e76809996dc7b440acf904594d46c2893329840b9b3b1aa81f3e949aba94"} Mar 09 14:29:32 crc kubenswrapper[4722]: I0309 14:29:32.704937 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"5f04dc47-34bc-4124-b129-f0c643f73284","Type":"ContainerStarted","Data":"07904fff5268543c6b385ee416c06b447b5e2d22541a78cef3ddd39bbf4cbf88"} Mar 09 14:29:32 crc kubenswrapper[4722]: I0309 14:29:32.719938 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.269056689 podStartE2EDuration="2.719920926s" podCreationTimestamp="2026-03-09 14:29:30 +0000 UTC" firstStartedPulling="2026-03-09 14:29:31.750805182 +0000 UTC m=+1612.306373758" lastFinishedPulling="2026-03-09 14:29:32.201669419 +0000 UTC m=+1612.757237995" observedRunningTime="2026-03-09 14:29:32.715482833 +0000 UTC m=+1613.271051409" watchObservedRunningTime="2026-03-09 14:29:32.719920926 +0000 UTC m=+1613.275489502" Mar 09 14:29:32 crc kubenswrapper[4722]: I0309 14:29:32.722758 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 09 14:29:33 crc kubenswrapper[4722]: I0309 14:29:33.640627 4722 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/certified-operators-rq8bb" Mar 09 14:29:33 crc kubenswrapper[4722]: I0309 14:29:33.642314 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rq8bb" Mar 09 14:29:33 crc kubenswrapper[4722]: I0309 14:29:33.723720 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"5f04dc47-34bc-4124-b129-f0c643f73284","Type":"ContainerStarted","Data":"af57c7bd0749d3ee0ba582c35a504ab6024489d0c091853b0616d51211951d24"} Mar 09 14:29:34 crc kubenswrapper[4722]: I0309 14:29:34.506051 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 14:29:34 crc kubenswrapper[4722]: I0309 14:29:34.540955 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-0" podStartSLOduration=3.677935272 podStartE2EDuration="4.540932098s" podCreationTimestamp="2026-03-09 14:29:30 +0000 UTC" firstStartedPulling="2026-03-09 14:29:31.767574245 +0000 UTC m=+1612.323142821" lastFinishedPulling="2026-03-09 14:29:32.630571071 +0000 UTC m=+1613.186139647" observedRunningTime="2026-03-09 14:29:33.743988954 +0000 UTC m=+1614.299557530" watchObservedRunningTime="2026-03-09 14:29:34.540932098 +0000 UTC m=+1615.096500674" Mar 09 14:29:34 crc kubenswrapper[4722]: I0309 14:29:34.642998 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d791fb7b-d6f9-40c5-97e2-c306a91059f0-config-data\") pod \"d791fb7b-d6f9-40c5-97e2-c306a91059f0\" (UID: \"d791fb7b-d6f9-40c5-97e2-c306a91059f0\") " Mar 09 14:29:34 crc kubenswrapper[4722]: I0309 14:29:34.643058 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d791fb7b-d6f9-40c5-97e2-c306a91059f0-sg-core-conf-yaml\") pod \"d791fb7b-d6f9-40c5-97e2-c306a91059f0\" (UID: \"d791fb7b-d6f9-40c5-97e2-c306a91059f0\") " Mar 09 14:29:34 crc kubenswrapper[4722]: I0309 14:29:34.643176 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vfjc4\" (UniqueName: \"kubernetes.io/projected/d791fb7b-d6f9-40c5-97e2-c306a91059f0-kube-api-access-vfjc4\") pod \"d791fb7b-d6f9-40c5-97e2-c306a91059f0\" (UID: \"d791fb7b-d6f9-40c5-97e2-c306a91059f0\") " Mar 09 14:29:34 crc kubenswrapper[4722]: I0309 14:29:34.643227 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d791fb7b-d6f9-40c5-97e2-c306a91059f0-scripts\") pod \"d791fb7b-d6f9-40c5-97e2-c306a91059f0\" (UID: \"d791fb7b-d6f9-40c5-97e2-c306a91059f0\") " Mar 09 14:29:34 crc kubenswrapper[4722]: I0309 14:29:34.643285 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d791fb7b-d6f9-40c5-97e2-c306a91059f0-combined-ca-bundle\") pod \"d791fb7b-d6f9-40c5-97e2-c306a91059f0\" (UID: \"d791fb7b-d6f9-40c5-97e2-c306a91059f0\") " Mar 09 14:29:34 crc kubenswrapper[4722]: I0309 14:29:34.643327 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d791fb7b-d6f9-40c5-97e2-c306a91059f0-log-httpd\") pod \"d791fb7b-d6f9-40c5-97e2-c306a91059f0\" (UID: \"d791fb7b-d6f9-40c5-97e2-c306a91059f0\") " Mar 09 14:29:34 crc kubenswrapper[4722]: I0309 14:29:34.643380 4722 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d791fb7b-d6f9-40c5-97e2-c306a91059f0-run-httpd\") pod \"d791fb7b-d6f9-40c5-97e2-c306a91059f0\" (UID: \"d791fb7b-d6f9-40c5-97e2-c306a91059f0\") " Mar 09 14:29:34 crc kubenswrapper[4722]: I0309 14:29:34.644256 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d791fb7b-d6f9-40c5-97e2-c306a91059f0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d791fb7b-d6f9-40c5-97e2-c306a91059f0" (UID: "d791fb7b-d6f9-40c5-97e2-c306a91059f0"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:29:34 crc kubenswrapper[4722]: I0309 14:29:34.644414 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d791fb7b-d6f9-40c5-97e2-c306a91059f0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d791fb7b-d6f9-40c5-97e2-c306a91059f0" (UID: "d791fb7b-d6f9-40c5-97e2-c306a91059f0"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:29:34 crc kubenswrapper[4722]: I0309 14:29:34.644707 4722 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d791fb7b-d6f9-40c5-97e2-c306a91059f0-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 14:29:34 crc kubenswrapper[4722]: I0309 14:29:34.656725 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d791fb7b-d6f9-40c5-97e2-c306a91059f0-kube-api-access-vfjc4" (OuterVolumeSpecName: "kube-api-access-vfjc4") pod "d791fb7b-d6f9-40c5-97e2-c306a91059f0" (UID: "d791fb7b-d6f9-40c5-97e2-c306a91059f0"). InnerVolumeSpecName "kube-api-access-vfjc4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:29:34 crc kubenswrapper[4722]: I0309 14:29:34.658117 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d791fb7b-d6f9-40c5-97e2-c306a91059f0-scripts" (OuterVolumeSpecName: "scripts") pod "d791fb7b-d6f9-40c5-97e2-c306a91059f0" (UID: "d791fb7b-d6f9-40c5-97e2-c306a91059f0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:29:34 crc kubenswrapper[4722]: I0309 14:29:34.681692 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d791fb7b-d6f9-40c5-97e2-c306a91059f0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d791fb7b-d6f9-40c5-97e2-c306a91059f0" (UID: "d791fb7b-d6f9-40c5-97e2-c306a91059f0"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:29:34 crc kubenswrapper[4722]: I0309 14:29:34.699803 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-rq8bb" podUID="5800ac8e-62cf-446c-a10f-cde22ba63b77" containerName="registry-server" probeResult="failure" output=< Mar 09 14:29:34 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s Mar 09 14:29:34 crc kubenswrapper[4722]: > Mar 09 14:29:34 crc kubenswrapper[4722]: I0309 14:29:34.739640 4722 generic.go:334] "Generic (PLEG): container finished" podID="d791fb7b-d6f9-40c5-97e2-c306a91059f0" containerID="3e7961719fd0f860de9af5fcedcd4ba6553fdc5c1b4547e6b414a69456a8d0bd" exitCode=0 Mar 09 14:29:34 crc kubenswrapper[4722]: I0309 14:29:34.739762 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 14:29:34 crc kubenswrapper[4722]: I0309 14:29:34.739791 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d791fb7b-d6f9-40c5-97e2-c306a91059f0","Type":"ContainerDied","Data":"3e7961719fd0f860de9af5fcedcd4ba6553fdc5c1b4547e6b414a69456a8d0bd"} Mar 09 14:29:34 crc kubenswrapper[4722]: I0309 14:29:34.739909 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d791fb7b-d6f9-40c5-97e2-c306a91059f0","Type":"ContainerDied","Data":"c6f6ee38a3c6e5345aae2d72f7d934cd2d719cbeeeaa8b67bd428ffecc7aa0d9"} Mar 09 14:29:34 crc kubenswrapper[4722]: I0309 14:29:34.739932 4722 scope.go:117] "RemoveContainer" containerID="f786b00170a945de94d0f4f3fa7223c609b5a257826d9fde4fd1d13c213df85b" Mar 09 14:29:34 crc kubenswrapper[4722]: I0309 14:29:34.746299 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d791fb7b-d6f9-40c5-97e2-c306a91059f0-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 14:29:34 crc kubenswrapper[4722]: I0309 14:29:34.746328 4722 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d791fb7b-d6f9-40c5-97e2-c306a91059f0-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 14:29:34 crc kubenswrapper[4722]: I0309 14:29:34.746337 4722 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d791fb7b-d6f9-40c5-97e2-c306a91059f0-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 09 14:29:34 crc kubenswrapper[4722]: I0309 14:29:34.746346 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vfjc4\" (UniqueName: \"kubernetes.io/projected/d791fb7b-d6f9-40c5-97e2-c306a91059f0-kube-api-access-vfjc4\") on node \"crc\" DevicePath \"\"" Mar 09 14:29:34 crc kubenswrapper[4722]: I0309 14:29:34.747340 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d791fb7b-d6f9-40c5-97e2-c306a91059f0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d791fb7b-d6f9-40c5-97e2-c306a91059f0" (UID: "d791fb7b-d6f9-40c5-97e2-c306a91059f0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:29:34 crc kubenswrapper[4722]: I0309 14:29:34.773097 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d791fb7b-d6f9-40c5-97e2-c306a91059f0-config-data" (OuterVolumeSpecName: "config-data") pod "d791fb7b-d6f9-40c5-97e2-c306a91059f0" (UID: "d791fb7b-d6f9-40c5-97e2-c306a91059f0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:29:34 crc kubenswrapper[4722]: I0309 14:29:34.825886 4722 scope.go:117] "RemoveContainer" containerID="3d69b67b5c4eb8b6756a282ac7b9c516b6ced6c1227a394399484cc65a0277fe" Mar 09 14:29:34 crc kubenswrapper[4722]: I0309 14:29:34.851245 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d791fb7b-d6f9-40c5-97e2-c306a91059f0-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 14:29:34 crc kubenswrapper[4722]: I0309 14:29:34.851282 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d791fb7b-d6f9-40c5-97e2-c306a91059f0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:29:34 crc kubenswrapper[4722]: I0309 14:29:34.860933 4722 scope.go:117] "RemoveContainer" containerID="3e7961719fd0f860de9af5fcedcd4ba6553fdc5c1b4547e6b414a69456a8d0bd" Mar 09 14:29:34 crc kubenswrapper[4722]: I0309 14:29:34.890108 4722 scope.go:117] "RemoveContainer" containerID="0994e76809996dc7b440acf904594d46c2893329840b9b3b1aa81f3e949aba94" Mar 09 14:29:34 crc kubenswrapper[4722]: I0309 14:29:34.934103 4722 scope.go:117] "RemoveContainer" containerID="f786b00170a945de94d0f4f3fa7223c609b5a257826d9fde4fd1d13c213df85b" Mar 09 14:29:34 crc kubenswrapper[4722]: E0309 14:29:34.934886 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f786b00170a945de94d0f4f3fa7223c609b5a257826d9fde4fd1d13c213df85b\": container with ID starting with f786b00170a945de94d0f4f3fa7223c609b5a257826d9fde4fd1d13c213df85b not found: ID does not exist" containerID="f786b00170a945de94d0f4f3fa7223c609b5a257826d9fde4fd1d13c213df85b" Mar 09 14:29:34 crc kubenswrapper[4722]: I0309 14:29:34.934935 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f786b00170a945de94d0f4f3fa7223c609b5a257826d9fde4fd1d13c213df85b"} err="failed to get container status \"f786b00170a945de94d0f4f3fa7223c609b5a257826d9fde4fd1d13c213df85b\": rpc error: code = NotFound desc = could not find container \"f786b00170a945de94d0f4f3fa7223c609b5a257826d9fde4fd1d13c213df85b\": container with ID starting with f786b00170a945de94d0f4f3fa7223c609b5a257826d9fde4fd1d13c213df85b not found: ID does not exist" Mar 09 14:29:34 crc kubenswrapper[4722]: I0309 14:29:34.934967 4722 scope.go:117] "RemoveContainer" containerID="3d69b67b5c4eb8b6756a282ac7b9c516b6ced6c1227a394399484cc65a0277fe" Mar 09 14:29:34 crc kubenswrapper[4722]: E0309 14:29:34.935351 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d69b67b5c4eb8b6756a282ac7b9c516b6ced6c1227a394399484cc65a0277fe\": container with ID starting with 3d69b67b5c4eb8b6756a282ac7b9c516b6ced6c1227a394399484cc65a0277fe not found: ID does not exist" containerID="3d69b67b5c4eb8b6756a282ac7b9c516b6ced6c1227a394399484cc65a0277fe" Mar 09 14:29:34 crc kubenswrapper[4722]: I0309 14:29:34.935384 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d69b67b5c4eb8b6756a282ac7b9c516b6ced6c1227a394399484cc65a0277fe"} err="failed to get container status \"3d69b67b5c4eb8b6756a282ac7b9c516b6ced6c1227a394399484cc65a0277fe\": rpc error: code = NotFound desc = could not find container \"3d69b67b5c4eb8b6756a282ac7b9c516b6ced6c1227a394399484cc65a0277fe\": container with ID starting with 
Mar 09 14:29:34 crc kubenswrapper[4722]: I0309 14:29:34.935413 4722 scope.go:117] "RemoveContainer" containerID="3e7961719fd0f860de9af5fcedcd4ba6553fdc5c1b4547e6b414a69456a8d0bd"
Mar 09 14:29:34 crc kubenswrapper[4722]: E0309 14:29:34.935662 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e7961719fd0f860de9af5fcedcd4ba6553fdc5c1b4547e6b414a69456a8d0bd\": container with ID starting with 3e7961719fd0f860de9af5fcedcd4ba6553fdc5c1b4547e6b414a69456a8d0bd not found: ID does not exist" containerID="3e7961719fd0f860de9af5fcedcd4ba6553fdc5c1b4547e6b414a69456a8d0bd"
Mar 09 14:29:34 crc kubenswrapper[4722]: I0309 14:29:34.935683 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e7961719fd0f860de9af5fcedcd4ba6553fdc5c1b4547e6b414a69456a8d0bd"} err="failed to get container status \"3e7961719fd0f860de9af5fcedcd4ba6553fdc5c1b4547e6b414a69456a8d0bd\": rpc error: code = NotFound desc = could not find container \"3e7961719fd0f860de9af5fcedcd4ba6553fdc5c1b4547e6b414a69456a8d0bd\": container with ID starting with 3e7961719fd0f860de9af5fcedcd4ba6553fdc5c1b4547e6b414a69456a8d0bd not found: ID does not exist"
Mar 09 14:29:34 crc kubenswrapper[4722]: I0309 14:29:34.935698 4722 scope.go:117] "RemoveContainer" containerID="0994e76809996dc7b440acf904594d46c2893329840b9b3b1aa81f3e949aba94"
Mar 09 14:29:34 crc kubenswrapper[4722]: E0309 14:29:34.935905 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0994e76809996dc7b440acf904594d46c2893329840b9b3b1aa81f3e949aba94\": container with ID starting with 0994e76809996dc7b440acf904594d46c2893329840b9b3b1aa81f3e949aba94 not found: ID does not exist" containerID="0994e76809996dc7b440acf904594d46c2893329840b9b3b1aa81f3e949aba94"
Mar 09 14:29:34 crc kubenswrapper[4722]: I0309 14:29:34.935933 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0994e76809996dc7b440acf904594d46c2893329840b9b3b1aa81f3e949aba94"} err="failed to get container status \"0994e76809996dc7b440acf904594d46c2893329840b9b3b1aa81f3e949aba94\": rpc error: code = NotFound desc = could not find container \"0994e76809996dc7b440acf904594d46c2893329840b9b3b1aa81f3e949aba94\": container with ID starting with 0994e76809996dc7b440acf904594d46c2893329840b9b3b1aa81f3e949aba94 not found: ID does not exist"
Mar 09 14:29:35 crc kubenswrapper[4722]: I0309 14:29:35.084769 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 09 14:29:35 crc kubenswrapper[4722]: I0309 14:29:35.105493 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 09 14:29:35 crc kubenswrapper[4722]: I0309 14:29:35.121241 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 09 14:29:35 crc kubenswrapper[4722]: E0309 14:29:35.122127 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d791fb7b-d6f9-40c5-97e2-c306a91059f0" containerName="sg-core"
Mar 09 14:29:35 crc kubenswrapper[4722]: I0309 14:29:35.122141 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="d791fb7b-d6f9-40c5-97e2-c306a91059f0" containerName="sg-core"
Mar 09 14:29:35 crc kubenswrapper[4722]: E0309 14:29:35.122158 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d791fb7b-d6f9-40c5-97e2-c306a91059f0" containerName="proxy-httpd"
podUID="d791fb7b-d6f9-40c5-97e2-c306a91059f0" containerName="proxy-httpd" Mar 09 14:29:35 crc kubenswrapper[4722]: I0309 14:29:35.122164 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="d791fb7b-d6f9-40c5-97e2-c306a91059f0" containerName="proxy-httpd" Mar 09 14:29:35 crc kubenswrapper[4722]: E0309 14:29:35.122173 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d791fb7b-d6f9-40c5-97e2-c306a91059f0" containerName="ceilometer-central-agent" Mar 09 14:29:35 crc kubenswrapper[4722]: I0309 14:29:35.122189 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="d791fb7b-d6f9-40c5-97e2-c306a91059f0" containerName="ceilometer-central-agent" Mar 09 14:29:35 crc kubenswrapper[4722]: E0309 14:29:35.122229 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d791fb7b-d6f9-40c5-97e2-c306a91059f0" containerName="ceilometer-notification-agent" Mar 09 14:29:35 crc kubenswrapper[4722]: I0309 14:29:35.122236 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="d791fb7b-d6f9-40c5-97e2-c306a91059f0" containerName="ceilometer-notification-agent" Mar 09 14:29:35 crc kubenswrapper[4722]: I0309 14:29:35.122453 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="d791fb7b-d6f9-40c5-97e2-c306a91059f0" containerName="sg-core" Mar 09 14:29:35 crc kubenswrapper[4722]: I0309 14:29:35.122483 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="d791fb7b-d6f9-40c5-97e2-c306a91059f0" containerName="ceilometer-notification-agent" Mar 09 14:29:35 crc kubenswrapper[4722]: I0309 14:29:35.122496 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="d791fb7b-d6f9-40c5-97e2-c306a91059f0" containerName="proxy-httpd" Mar 09 14:29:35 crc kubenswrapper[4722]: I0309 14:29:35.122508 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="d791fb7b-d6f9-40c5-97e2-c306a91059f0" containerName="ceilometer-central-agent" Mar 09 14:29:35 crc kubenswrapper[4722]: I0309 14:29:35.124691 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 14:29:35 crc kubenswrapper[4722]: I0309 14:29:35.135540 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 14:29:35 crc kubenswrapper[4722]: I0309 14:29:35.137651 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 09 14:29:35 crc kubenswrapper[4722]: I0309 14:29:35.137801 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 09 14:29:35 crc kubenswrapper[4722]: I0309 14:29:35.138606 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 09 14:29:35 crc kubenswrapper[4722]: I0309 14:29:35.261498 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2597\" (UniqueName: \"kubernetes.io/projected/2fd9ab47-8790-4f12-b211-50b37603c5f0-kube-api-access-c2597\") pod \"ceilometer-0\" (UID: \"2fd9ab47-8790-4f12-b211-50b37603c5f0\") " pod="openstack/ceilometer-0" Mar 09 14:29:35 crc kubenswrapper[4722]: I0309 14:29:35.261570 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fd9ab47-8790-4f12-b211-50b37603c5f0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2fd9ab47-8790-4f12-b211-50b37603c5f0\") " pod="openstack/ceilometer-0" Mar 09 14:29:35 crc kubenswrapper[4722]: I0309 14:29:35.261701 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2fd9ab47-8790-4f12-b211-50b37603c5f0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2fd9ab47-8790-4f12-b211-50b37603c5f0\") " pod="openstack/ceilometer-0" Mar 09 14:29:35 crc kubenswrapper[4722]: I0309 14:29:35.261746 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2fd9ab47-8790-4f12-b211-50b37603c5f0-scripts\") pod \"ceilometer-0\" (UID: \"2fd9ab47-8790-4f12-b211-50b37603c5f0\") " pod="openstack/ceilometer-0" Mar 09 14:29:35 crc kubenswrapper[4722]: I0309 14:29:35.261785 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fd9ab47-8790-4f12-b211-50b37603c5f0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2fd9ab47-8790-4f12-b211-50b37603c5f0\") " pod="openstack/ceilometer-0" Mar 09 14:29:35 crc kubenswrapper[4722]: I0309 14:29:35.261935 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2fd9ab47-8790-4f12-b211-50b37603c5f0-log-httpd\") pod \"ceilometer-0\" (UID: \"2fd9ab47-8790-4f12-b211-50b37603c5f0\") " pod="openstack/ceilometer-0" Mar 09 14:29:35 crc kubenswrapper[4722]: I0309 14:29:35.263012 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2fd9ab47-8790-4f12-b211-50b37603c5f0-run-httpd\") pod \"ceilometer-0\" (UID: \"2fd9ab47-8790-4f12-b211-50b37603c5f0\") " pod="openstack/ceilometer-0" Mar 09 14:29:35 crc kubenswrapper[4722]: I0309 14:29:35.263078 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/2fd9ab47-8790-4f12-b211-50b37603c5f0-config-data\") pod \"ceilometer-0\" (UID: \"2fd9ab47-8790-4f12-b211-50b37603c5f0\") " pod="openstack/ceilometer-0" Mar 09 14:29:35 crc kubenswrapper[4722]: I0309 14:29:35.365286 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2fd9ab47-8790-4f12-b211-50b37603c5f0-run-httpd\") pod \"ceilometer-0\" (UID: \"2fd9ab47-8790-4f12-b211-50b37603c5f0\") " pod="openstack/ceilometer-0" Mar 09 14:29:35 crc kubenswrapper[4722]: I0309 14:29:35.365850 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fd9ab47-8790-4f12-b211-50b37603c5f0-config-data\") pod \"ceilometer-0\" (UID: \"2fd9ab47-8790-4f12-b211-50b37603c5f0\") " pod="openstack/ceilometer-0" Mar 09 14:29:35 crc kubenswrapper[4722]: I0309 14:29:35.366151 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2fd9ab47-8790-4f12-b211-50b37603c5f0-run-httpd\") pod \"ceilometer-0\" (UID: \"2fd9ab47-8790-4f12-b211-50b37603c5f0\") " pod="openstack/ceilometer-0" Mar 09 14:29:35 crc kubenswrapper[4722]: I0309 14:29:35.366857 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2597\" (UniqueName: \"kubernetes.io/projected/2fd9ab47-8790-4f12-b211-50b37603c5f0-kube-api-access-c2597\") pod \"ceilometer-0\" (UID: \"2fd9ab47-8790-4f12-b211-50b37603c5f0\") " pod="openstack/ceilometer-0" Mar 09 14:29:35 crc kubenswrapper[4722]: I0309 14:29:35.366920 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fd9ab47-8790-4f12-b211-50b37603c5f0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2fd9ab47-8790-4f12-b211-50b37603c5f0\") " pod="openstack/ceilometer-0" Mar 09 14:29:35 crc kubenswrapper[4722]: I0309 14:29:35.367014 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2fd9ab47-8790-4f12-b211-50b37603c5f0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2fd9ab47-8790-4f12-b211-50b37603c5f0\") " pod="openstack/ceilometer-0" Mar 09 14:29:35 crc kubenswrapper[4722]: I0309 14:29:35.367064 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2fd9ab47-8790-4f12-b211-50b37603c5f0-scripts\") pod \"ceilometer-0\" (UID: \"2fd9ab47-8790-4f12-b211-50b37603c5f0\") " pod="openstack/ceilometer-0" Mar 09 14:29:35 crc kubenswrapper[4722]: I0309 14:29:35.367097 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fd9ab47-8790-4f12-b211-50b37603c5f0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2fd9ab47-8790-4f12-b211-50b37603c5f0\") " pod="openstack/ceilometer-0" Mar 09 14:29:35 crc kubenswrapper[4722]: I0309 14:29:35.367142 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2fd9ab47-8790-4f12-b211-50b37603c5f0-log-httpd\") pod \"ceilometer-0\" (UID: \"2fd9ab47-8790-4f12-b211-50b37603c5f0\") " pod="openstack/ceilometer-0" Mar 09 14:29:35 crc kubenswrapper[4722]: I0309 14:29:35.367626 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/2fd9ab47-8790-4f12-b211-50b37603c5f0-log-httpd\") pod \"ceilometer-0\" (UID: \"2fd9ab47-8790-4f12-b211-50b37603c5f0\") " pod="openstack/ceilometer-0" Mar 09 14:29:35 crc kubenswrapper[4722]: I0309 14:29:35.370937 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2fd9ab47-8790-4f12-b211-50b37603c5f0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"2fd9ab47-8790-4f12-b211-50b37603c5f0\") " pod="openstack/ceilometer-0" Mar 09 14:29:35 crc kubenswrapper[4722]: I0309 14:29:35.371896 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fd9ab47-8790-4f12-b211-50b37603c5f0-config-data\") pod \"ceilometer-0\" (UID: \"2fd9ab47-8790-4f12-b211-50b37603c5f0\") " pod="openstack/ceilometer-0" Mar 09 14:29:35 crc kubenswrapper[4722]: I0309 14:29:35.372798 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2fd9ab47-8790-4f12-b211-50b37603c5f0-scripts\") pod \"ceilometer-0\" (UID: \"2fd9ab47-8790-4f12-b211-50b37603c5f0\") " pod="openstack/ceilometer-0" Mar 09 14:29:35 crc kubenswrapper[4722]: I0309 14:29:35.374845 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fd9ab47-8790-4f12-b211-50b37603c5f0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"2fd9ab47-8790-4f12-b211-50b37603c5f0\") " pod="openstack/ceilometer-0" Mar 09 14:29:35 crc kubenswrapper[4722]: I0309 14:29:35.377448 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fd9ab47-8790-4f12-b211-50b37603c5f0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"2fd9ab47-8790-4f12-b211-50b37603c5f0\") " pod="openstack/ceilometer-0" Mar 09 14:29:35 crc kubenswrapper[4722]: I0309 14:29:35.385380 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2597\" (UniqueName: \"kubernetes.io/projected/2fd9ab47-8790-4f12-b211-50b37603c5f0-kube-api-access-c2597\") pod \"ceilometer-0\" (UID: \"2fd9ab47-8790-4f12-b211-50b37603c5f0\") " pod="openstack/ceilometer-0" Mar 09 14:29:35 crc kubenswrapper[4722]: I0309 14:29:35.454823 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 14:29:35 crc kubenswrapper[4722]: W0309 14:29:35.904267 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2fd9ab47_8790_4f12_b211_50b37603c5f0.slice/crio-2b5be031e43cbbe7fb4b68aca7706f203c51ef71c8aaa9edb387aa9f30dd9c9c WatchSource:0}: Error finding container 2b5be031e43cbbe7fb4b68aca7706f203c51ef71c8aaa9edb387aa9f30dd9c9c: Status 404 returned error can't find the container with id 2b5be031e43cbbe7fb4b68aca7706f203c51ef71c8aaa9edb387aa9f30dd9c9c Mar 09 14:29:35 crc kubenswrapper[4722]: I0309 14:29:35.905823 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 14:29:36 crc kubenswrapper[4722]: I0309 14:29:36.201353 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d791fb7b-d6f9-40c5-97e2-c306a91059f0" path="/var/lib/kubelet/pods/d791fb7b-d6f9-40c5-97e2-c306a91059f0/volumes" Mar 09 14:29:36 crc kubenswrapper[4722]: I0309 14:29:36.764906 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2fd9ab47-8790-4f12-b211-50b37603c5f0","Type":"ContainerStarted","Data":"c995a7d038526114a350fd8db45d236d3ee4b7d0a02da7a4e6a1661cdf7c15d7"} Mar 09 14:29:36 crc kubenswrapper[4722]: I0309 14:29:36.765263 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2fd9ab47-8790-4f12-b211-50b37603c5f0","Type":"ContainerStarted","Data":"2b5be031e43cbbe7fb4b68aca7706f203c51ef71c8aaa9edb387aa9f30dd9c9c"} Mar 09 14:29:37 crc kubenswrapper[4722]: I0309 14:29:37.783744 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2fd9ab47-8790-4f12-b211-50b37603c5f0","Type":"ContainerStarted","Data":"21864526935aae9127c5f43aa061b8a9eeaefc7e517417536934bcabb25c6c10"} Mar 09 14:29:38 crc kubenswrapper[4722]: I0309 14:29:38.800703 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2fd9ab47-8790-4f12-b211-50b37603c5f0","Type":"ContainerStarted","Data":"e213af05c1a6a7bd1d06fdef0f34ec4fa1b384260d859b5d91c2a8066f6466be"} Mar 09 14:29:40 crc kubenswrapper[4722]: I0309 14:29:40.830463 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2fd9ab47-8790-4f12-b211-50b37603c5f0","Type":"ContainerStarted","Data":"5c2264d76f370fc5c07d44de97df5c7aeb2646eb6cb15fcb5cfb9ea56f6eaadd"} Mar 09 14:29:40 crc kubenswrapper[4722]: I0309 14:29:40.833673 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 09 14:29:40 crc kubenswrapper[4722]: I0309 14:29:40.870705 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.6463682579999999 podStartE2EDuration="5.870663645s" podCreationTimestamp="2026-03-09 14:29:35 +0000 UTC" firstStartedPulling="2026-03-09 14:29:35.910824328 +0000 UTC m=+1616.466392944" lastFinishedPulling="2026-03-09 14:29:40.135119755 +0000 UTC m=+1620.690688331" observedRunningTime="2026-03-09 14:29:40.858882751 +0000 UTC m=+1621.414451337" watchObservedRunningTime="2026-03-09 14:29:40.870663645 +0000 UTC m=+1621.426232241" Mar 09 14:29:41 crc kubenswrapper[4722]: I0309 14:29:41.183667 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 09 14:29:44 crc kubenswrapper[4722]: I0309 14:29:44.703406 4722 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-marketplace/certified-operators-rq8bb" podUID="5800ac8e-62cf-446c-a10f-cde22ba63b77" containerName="registry-server" probeResult="failure" output=< Mar 09 14:29:44 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s Mar 09 14:29:44 crc kubenswrapper[4722]: > Mar 09 14:29:51 crc kubenswrapper[4722]: I0309 14:29:51.528544 4722 patch_prober.go:28] interesting pod/machine-config-daemon-hjrrb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 14:29:51 crc kubenswrapper[4722]: I0309 14:29:51.529355 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 14:29:51 crc kubenswrapper[4722]: I0309 14:29:51.529427 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" Mar 09 14:29:51 crc kubenswrapper[4722]: I0309 14:29:51.530643 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"86a6dcdf7dcf4d48b8bdfb71d9a2c30b89f235700c424312c7e16580a86021d6"} pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 14:29:51 crc kubenswrapper[4722]: I0309 14:29:51.530741 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerName="machine-config-daemon" containerID="cri-o://86a6dcdf7dcf4d48b8bdfb71d9a2c30b89f235700c424312c7e16580a86021d6" gracePeriod=600 Mar 09 14:29:51 crc kubenswrapper[4722]: E0309 14:29:51.655840 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 14:29:51 crc kubenswrapper[4722]: I0309 14:29:51.958599 4722 generic.go:334] "Generic (PLEG): container finished" podID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerID="86a6dcdf7dcf4d48b8bdfb71d9a2c30b89f235700c424312c7e16580a86021d6" exitCode=0 Mar 09 14:29:51 crc kubenswrapper[4722]: I0309 14:29:51.958645 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" event={"ID":"dac2aaf5-653b-4b2a-8efe-ed26bac8d648","Type":"ContainerDied","Data":"86a6dcdf7dcf4d48b8bdfb71d9a2c30b89f235700c424312c7e16580a86021d6"} Mar 09 14:29:51 crc kubenswrapper[4722]: I0309 14:29:51.958683 4722 scope.go:117] "RemoveContainer" containerID="2d686d3e92fab7cd0f339e5d57afd546181543a3a9585b91ecf278050136cecb" Mar 09 14:29:51 crc kubenswrapper[4722]: I0309 14:29:51.959831 4722 scope.go:117] "RemoveContainer" containerID="86a6dcdf7dcf4d48b8bdfb71d9a2c30b89f235700c424312c7e16580a86021d6" Mar 09 
Mar 09 14:29:53 crc kubenswrapper[4722]: I0309 14:29:53.722842 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rq8bb"
Mar 09 14:29:53 crc kubenswrapper[4722]: I0309 14:29:53.794781 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rq8bb"
Mar 09 14:29:54 crc kubenswrapper[4722]: I0309 14:29:54.517509 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rq8bb"]
Mar 09 14:29:54 crc kubenswrapper[4722]: I0309 14:29:54.993310 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rq8bb" podUID="5800ac8e-62cf-446c-a10f-cde22ba63b77" containerName="registry-server" containerID="cri-o://7b04ddba55acff2e8e3982d2527b79aad7aceabe390d82575421248e849731a4" gracePeriod=2
Mar 09 14:29:55 crc kubenswrapper[4722]: I0309 14:29:55.650736 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rq8bb"
Mar 09 14:29:55 crc kubenswrapper[4722]: I0309 14:29:55.701620 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5800ac8e-62cf-446c-a10f-cde22ba63b77-catalog-content\") pod \"5800ac8e-62cf-446c-a10f-cde22ba63b77\" (UID: \"5800ac8e-62cf-446c-a10f-cde22ba63b77\") "
Mar 09 14:29:55 crc kubenswrapper[4722]: I0309 14:29:55.701943 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5800ac8e-62cf-446c-a10f-cde22ba63b77-utilities\") pod \"5800ac8e-62cf-446c-a10f-cde22ba63b77\" (UID: \"5800ac8e-62cf-446c-a10f-cde22ba63b77\") "
Mar 09 14:29:55 crc kubenswrapper[4722]: I0309 14:29:55.702009 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s79j6\" (UniqueName: \"kubernetes.io/projected/5800ac8e-62cf-446c-a10f-cde22ba63b77-kube-api-access-s79j6\") pod \"5800ac8e-62cf-446c-a10f-cde22ba63b77\" (UID: \"5800ac8e-62cf-446c-a10f-cde22ba63b77\") "
Mar 09 14:29:55 crc kubenswrapper[4722]: I0309 14:29:55.703365 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5800ac8e-62cf-446c-a10f-cde22ba63b77-utilities" (OuterVolumeSpecName: "utilities") pod "5800ac8e-62cf-446c-a10f-cde22ba63b77" (UID: "5800ac8e-62cf-446c-a10f-cde22ba63b77"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 14:29:55 crc kubenswrapper[4722]: I0309 14:29:55.712125 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5800ac8e-62cf-446c-a10f-cde22ba63b77-kube-api-access-s79j6" (OuterVolumeSpecName: "kube-api-access-s79j6") pod "5800ac8e-62cf-446c-a10f-cde22ba63b77" (UID: "5800ac8e-62cf-446c-a10f-cde22ba63b77"). InnerVolumeSpecName "kube-api-access-s79j6". PluginName "kubernetes.io/projected", VolumeGidValue ""
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:29:55 crc kubenswrapper[4722]: I0309 14:29:55.758058 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5800ac8e-62cf-446c-a10f-cde22ba63b77-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5800ac8e-62cf-446c-a10f-cde22ba63b77" (UID: "5800ac8e-62cf-446c-a10f-cde22ba63b77"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:29:55 crc kubenswrapper[4722]: I0309 14:29:55.805809 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5800ac8e-62cf-446c-a10f-cde22ba63b77-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 14:29:55 crc kubenswrapper[4722]: I0309 14:29:55.805857 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5800ac8e-62cf-446c-a10f-cde22ba63b77-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 14:29:55 crc kubenswrapper[4722]: I0309 14:29:55.805871 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s79j6\" (UniqueName: \"kubernetes.io/projected/5800ac8e-62cf-446c-a10f-cde22ba63b77-kube-api-access-s79j6\") on node \"crc\" DevicePath \"\"" Mar 09 14:29:56 crc kubenswrapper[4722]: I0309 14:29:56.006794 4722 generic.go:334] "Generic (PLEG): container finished" podID="5800ac8e-62cf-446c-a10f-cde22ba63b77" containerID="7b04ddba55acff2e8e3982d2527b79aad7aceabe390d82575421248e849731a4" exitCode=0 Mar 09 14:29:56 crc kubenswrapper[4722]: I0309 14:29:56.006837 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rq8bb" event={"ID":"5800ac8e-62cf-446c-a10f-cde22ba63b77","Type":"ContainerDied","Data":"7b04ddba55acff2e8e3982d2527b79aad7aceabe390d82575421248e849731a4"} Mar 09 14:29:56 crc kubenswrapper[4722]: I0309 14:29:56.006863 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rq8bb" event={"ID":"5800ac8e-62cf-446c-a10f-cde22ba63b77","Type":"ContainerDied","Data":"14cebcfa0a7815770dfba9ca0141d618db06737b5cb19d49238ced6a758882a1"} Mar 09 14:29:56 crc kubenswrapper[4722]: I0309 14:29:56.006867 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rq8bb" Mar 09 14:29:56 crc kubenswrapper[4722]: I0309 14:29:56.006880 4722 scope.go:117] "RemoveContainer" containerID="7b04ddba55acff2e8e3982d2527b79aad7aceabe390d82575421248e849731a4" Mar 09 14:29:56 crc kubenswrapper[4722]: I0309 14:29:56.030611 4722 scope.go:117] "RemoveContainer" containerID="dcce2dd22bcf030f8f9685f1d097bf581b742cac2b013e44234239c9a446ba32" Mar 09 14:29:56 crc kubenswrapper[4722]: I0309 14:29:56.048119 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rq8bb"] Mar 09 14:29:56 crc kubenswrapper[4722]: I0309 14:29:56.060901 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rq8bb"] Mar 09 14:29:56 crc kubenswrapper[4722]: I0309 14:29:56.067027 4722 scope.go:117] "RemoveContainer" containerID="3c9dbdb8c53c63a4bff3d9da22a0b2f4b9b2c92a688eba5dda7d994cf3b3f1ee" Mar 09 14:29:56 crc kubenswrapper[4722]: I0309 14:29:56.130059 4722 scope.go:117] "RemoveContainer" containerID="7b04ddba55acff2e8e3982d2527b79aad7aceabe390d82575421248e849731a4" Mar 09 14:29:56 crc kubenswrapper[4722]: E0309 14:29:56.130725 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b04ddba55acff2e8e3982d2527b79aad7aceabe390d82575421248e849731a4\": container with ID starting with 7b04ddba55acff2e8e3982d2527b79aad7aceabe390d82575421248e849731a4 not found: ID does not exist" containerID="7b04ddba55acff2e8e3982d2527b79aad7aceabe390d82575421248e849731a4" Mar 09 14:29:56 crc kubenswrapper[4722]: I0309 14:29:56.130805 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b04ddba55acff2e8e3982d2527b79aad7aceabe390d82575421248e849731a4"} err="failed to get container status \"7b04ddba55acff2e8e3982d2527b79aad7aceabe390d82575421248e849731a4\": rpc error: code = NotFound desc = could not find container \"7b04ddba55acff2e8e3982d2527b79aad7aceabe390d82575421248e849731a4\": container with ID starting with 7b04ddba55acff2e8e3982d2527b79aad7aceabe390d82575421248e849731a4 not found: ID does not exist" Mar 09 14:29:56 crc kubenswrapper[4722]: I0309 14:29:56.130868 4722 scope.go:117] "RemoveContainer" containerID="dcce2dd22bcf030f8f9685f1d097bf581b742cac2b013e44234239c9a446ba32" Mar 09 14:29:56 crc kubenswrapper[4722]: E0309 14:29:56.131612 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dcce2dd22bcf030f8f9685f1d097bf581b742cac2b013e44234239c9a446ba32\": container with ID starting with dcce2dd22bcf030f8f9685f1d097bf581b742cac2b013e44234239c9a446ba32 not found: ID does not exist" containerID="dcce2dd22bcf030f8f9685f1d097bf581b742cac2b013e44234239c9a446ba32" Mar 09 14:29:56 crc kubenswrapper[4722]: I0309 14:29:56.131645 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcce2dd22bcf030f8f9685f1d097bf581b742cac2b013e44234239c9a446ba32"} err="failed to get container status \"dcce2dd22bcf030f8f9685f1d097bf581b742cac2b013e44234239c9a446ba32\": rpc error: code = NotFound desc = could not find container \"dcce2dd22bcf030f8f9685f1d097bf581b742cac2b013e44234239c9a446ba32\": container with ID starting with dcce2dd22bcf030f8f9685f1d097bf581b742cac2b013e44234239c9a446ba32 not found: ID does not exist" Mar 09 14:29:56 crc kubenswrapper[4722]: I0309 14:29:56.131669 4722 scope.go:117] "RemoveContainer" 
containerID="3c9dbdb8c53c63a4bff3d9da22a0b2f4b9b2c92a688eba5dda7d994cf3b3f1ee" Mar 09 14:29:56 crc kubenswrapper[4722]: E0309 14:29:56.131937 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c9dbdb8c53c63a4bff3d9da22a0b2f4b9b2c92a688eba5dda7d994cf3b3f1ee\": container with ID starting with 3c9dbdb8c53c63a4bff3d9da22a0b2f4b9b2c92a688eba5dda7d994cf3b3f1ee not found: ID does not exist" containerID="3c9dbdb8c53c63a4bff3d9da22a0b2f4b9b2c92a688eba5dda7d994cf3b3f1ee" Mar 09 14:29:56 crc kubenswrapper[4722]: I0309 14:29:56.131963 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c9dbdb8c53c63a4bff3d9da22a0b2f4b9b2c92a688eba5dda7d994cf3b3f1ee"} err="failed to get container status \"3c9dbdb8c53c63a4bff3d9da22a0b2f4b9b2c92a688eba5dda7d994cf3b3f1ee\": rpc error: code = NotFound desc = could not find container \"3c9dbdb8c53c63a4bff3d9da22a0b2f4b9b2c92a688eba5dda7d994cf3b3f1ee\": container with ID starting with 3c9dbdb8c53c63a4bff3d9da22a0b2f4b9b2c92a688eba5dda7d994cf3b3f1ee not found: ID does not exist" Mar 09 14:29:56 crc kubenswrapper[4722]: I0309 14:29:56.161624 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5800ac8e-62cf-446c-a10f-cde22ba63b77" path="/var/lib/kubelet/pods/5800ac8e-62cf-446c-a10f-cde22ba63b77/volumes" Mar 09 14:30:00 crc kubenswrapper[4722]: I0309 14:30:00.186909 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551110-bsfw4"] Mar 09 14:30:00 crc kubenswrapper[4722]: E0309 14:30:00.188581 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5800ac8e-62cf-446c-a10f-cde22ba63b77" containerName="registry-server" Mar 09 14:30:00 crc kubenswrapper[4722]: I0309 14:30:00.188609 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="5800ac8e-62cf-446c-a10f-cde22ba63b77" containerName="registry-server" Mar 09 14:30:00 crc kubenswrapper[4722]: E0309 14:30:00.188649 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5800ac8e-62cf-446c-a10f-cde22ba63b77" containerName="extract-utilities" Mar 09 14:30:00 crc kubenswrapper[4722]: I0309 14:30:00.188662 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="5800ac8e-62cf-446c-a10f-cde22ba63b77" containerName="extract-utilities" Mar 09 14:30:00 crc kubenswrapper[4722]: E0309 14:30:00.188715 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5800ac8e-62cf-446c-a10f-cde22ba63b77" containerName="extract-content" Mar 09 14:30:00 crc kubenswrapper[4722]: I0309 14:30:00.188726 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="5800ac8e-62cf-446c-a10f-cde22ba63b77" containerName="extract-content" Mar 09 14:30:00 crc kubenswrapper[4722]: I0309 14:30:00.189136 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="5800ac8e-62cf-446c-a10f-cde22ba63b77" containerName="registry-server" Mar 09 14:30:00 crc kubenswrapper[4722]: I0309 14:30:00.190766 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551110-bsfw4" Mar 09 14:30:00 crc kubenswrapper[4722]: I0309 14:30:00.193439 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 14:30:00 crc kubenswrapper[4722]: I0309 14:30:00.193487 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 14:30:00 crc kubenswrapper[4722]: I0309 14:30:00.194032 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-dr2x6" Mar 09 14:30:00 crc kubenswrapper[4722]: I0309 14:30:00.200775 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551110-8j24g"] Mar 09 14:30:00 crc kubenswrapper[4722]: I0309 14:30:00.203041 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551110-8j24g" Mar 09 14:30:00 crc kubenswrapper[4722]: I0309 14:30:00.208675 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 09 14:30:00 crc kubenswrapper[4722]: I0309 14:30:00.208688 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 09 14:30:00 crc kubenswrapper[4722]: I0309 14:30:00.229097 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551110-8j24g"] Mar 09 14:30:00 crc kubenswrapper[4722]: I0309 14:30:00.253869 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551110-bsfw4"] Mar 09 14:30:00 crc kubenswrapper[4722]: I0309 14:30:00.319916 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/673ece17-a82b-4e82-b811-5c704ab9a1b4-secret-volume\") pod \"collect-profiles-29551110-8j24g\" (UID: \"673ece17-a82b-4e82-b811-5c704ab9a1b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551110-8j24g" Mar 09 14:30:00 crc kubenswrapper[4722]: I0309 14:30:00.320353 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/673ece17-a82b-4e82-b811-5c704ab9a1b4-config-volume\") pod \"collect-profiles-29551110-8j24g\" (UID: \"673ece17-a82b-4e82-b811-5c704ab9a1b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551110-8j24g" Mar 09 14:30:00 crc kubenswrapper[4722]: I0309 14:30:00.320461 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbgh8\" (UniqueName: \"kubernetes.io/projected/673ece17-a82b-4e82-b811-5c704ab9a1b4-kube-api-access-sbgh8\") pod \"collect-profiles-29551110-8j24g\" (UID: \"673ece17-a82b-4e82-b811-5c704ab9a1b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551110-8j24g" Mar 09 14:30:00 crc kubenswrapper[4722]: I0309 14:30:00.321678 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzw24\" (UniqueName: \"kubernetes.io/projected/399309e2-cc0e-4ecb-a9ed-ae5efd75dfb5-kube-api-access-bzw24\") pod \"auto-csr-approver-29551110-bsfw4\" (UID: \"399309e2-cc0e-4ecb-a9ed-ae5efd75dfb5\") " pod="openshift-infra/auto-csr-approver-29551110-bsfw4" Mar 
Mar 09 14:30:00 crc kubenswrapper[4722]: I0309 14:30:00.423475 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzw24\" (UniqueName: \"kubernetes.io/projected/399309e2-cc0e-4ecb-a9ed-ae5efd75dfb5-kube-api-access-bzw24\") pod \"auto-csr-approver-29551110-bsfw4\" (UID: \"399309e2-cc0e-4ecb-a9ed-ae5efd75dfb5\") " pod="openshift-infra/auto-csr-approver-29551110-bsfw4"
Mar 09 14:30:00 crc kubenswrapper[4722]: I0309 14:30:00.423570 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/673ece17-a82b-4e82-b811-5c704ab9a1b4-secret-volume\") pod \"collect-profiles-29551110-8j24g\" (UID: \"673ece17-a82b-4e82-b811-5c704ab9a1b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551110-8j24g"
Mar 09 14:30:00 crc kubenswrapper[4722]: I0309 14:30:00.423635 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/673ece17-a82b-4e82-b811-5c704ab9a1b4-config-volume\") pod \"collect-profiles-29551110-8j24g\" (UID: \"673ece17-a82b-4e82-b811-5c704ab9a1b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551110-8j24g"
Mar 09 14:30:00 crc kubenswrapper[4722]: I0309 14:30:00.424401 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/673ece17-a82b-4e82-b811-5c704ab9a1b4-config-volume\") pod \"collect-profiles-29551110-8j24g\" (UID: \"673ece17-a82b-4e82-b811-5c704ab9a1b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551110-8j24g"
Mar 09 14:30:00 crc kubenswrapper[4722]: I0309 14:30:00.431843 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/673ece17-a82b-4e82-b811-5c704ab9a1b4-secret-volume\") pod \"collect-profiles-29551110-8j24g\" (UID: \"673ece17-a82b-4e82-b811-5c704ab9a1b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551110-8j24g"
Mar 09 14:30:00 crc kubenswrapper[4722]: I0309 14:30:00.440390 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbgh8\" (UniqueName: \"kubernetes.io/projected/673ece17-a82b-4e82-b811-5c704ab9a1b4-kube-api-access-sbgh8\") pod \"collect-profiles-29551110-8j24g\" (UID: \"673ece17-a82b-4e82-b811-5c704ab9a1b4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551110-8j24g"
Mar 09 14:30:00 crc kubenswrapper[4722]: I0309 14:30:00.448598 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzw24\" (UniqueName: \"kubernetes.io/projected/399309e2-cc0e-4ecb-a9ed-ae5efd75dfb5-kube-api-access-bzw24\") pod \"auto-csr-approver-29551110-bsfw4\" (UID: \"399309e2-cc0e-4ecb-a9ed-ae5efd75dfb5\") " pod="openshift-infra/auto-csr-approver-29551110-bsfw4"
Mar 09 14:30:00 crc kubenswrapper[4722]: I0309 14:30:00.525122 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551110-bsfw4"
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551110-bsfw4" Mar 09 14:30:00 crc kubenswrapper[4722]: I0309 14:30:00.545391 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551110-8j24g" Mar 09 14:30:01 crc kubenswrapper[4722]: W0309 14:30:01.037858 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod399309e2_cc0e_4ecb_a9ed_ae5efd75dfb5.slice/crio-69f0338eec16f6a2867b40ca879d90523d95094a1551acc9480e66fca9f98bb3 WatchSource:0}: Error finding container 69f0338eec16f6a2867b40ca879d90523d95094a1551acc9480e66fca9f98bb3: Status 404 returned error can't find the container with id 69f0338eec16f6a2867b40ca879d90523d95094a1551acc9480e66fca9f98bb3 Mar 09 14:30:01 crc kubenswrapper[4722]: I0309 14:30:01.038389 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551110-bsfw4"] Mar 09 14:30:01 crc kubenswrapper[4722]: I0309 14:30:01.096473 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551110-bsfw4" event={"ID":"399309e2-cc0e-4ecb-a9ed-ae5efd75dfb5","Type":"ContainerStarted","Data":"69f0338eec16f6a2867b40ca879d90523d95094a1551acc9480e66fca9f98bb3"} Mar 09 14:30:01 crc kubenswrapper[4722]: I0309 14:30:01.172969 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551110-8j24g"] Mar 09 14:30:01 crc kubenswrapper[4722]: W0309 14:30:01.173453 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod673ece17_a82b_4e82_b811_5c704ab9a1b4.slice/crio-735162e244b3e74a3f4e8be60b771b930fb29ca9c1dbc0cd34feec83bb838b86 WatchSource:0}: Error finding container 735162e244b3e74a3f4e8be60b771b930fb29ca9c1dbc0cd34feec83bb838b86: Status 404 returned error can't find the container with id 735162e244b3e74a3f4e8be60b771b930fb29ca9c1dbc0cd34feec83bb838b86 Mar 09 14:30:02 crc kubenswrapper[4722]: I0309 14:30:02.106810 4722 generic.go:334] "Generic (PLEG): container finished" podID="673ece17-a82b-4e82-b811-5c704ab9a1b4" containerID="d6113c52887efd5fe65d1135756dbb3f60c61fb6a4dcfe637d9efc547c003933" exitCode=0 Mar 09 14:30:02 crc kubenswrapper[4722]: I0309 14:30:02.106852 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551110-8j24g" event={"ID":"673ece17-a82b-4e82-b811-5c704ab9a1b4","Type":"ContainerDied","Data":"d6113c52887efd5fe65d1135756dbb3f60c61fb6a4dcfe637d9efc547c003933"} Mar 09 14:30:02 crc kubenswrapper[4722]: I0309 14:30:02.106874 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551110-8j24g" event={"ID":"673ece17-a82b-4e82-b811-5c704ab9a1b4","Type":"ContainerStarted","Data":"735162e244b3e74a3f4e8be60b771b930fb29ca9c1dbc0cd34feec83bb838b86"} Mar 09 14:30:03 crc kubenswrapper[4722]: I0309 14:30:03.124538 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551110-bsfw4" event={"ID":"399309e2-cc0e-4ecb-a9ed-ae5efd75dfb5","Type":"ContainerStarted","Data":"785c9b3fd4f944246425859922e4b9f248364ff8e2049eee438ad214e9c731f4"} Mar 09 14:30:03 crc kubenswrapper[4722]: I0309 14:30:03.163126 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551110-bsfw4" 
podStartSLOduration=1.5196989890000001 podStartE2EDuration="3.163094951s" podCreationTimestamp="2026-03-09 14:30:00 +0000 UTC" firstStartedPulling="2026-03-09 14:30:01.040779798 +0000 UTC m=+1641.596348384" lastFinishedPulling="2026-03-09 14:30:02.68417577 +0000 UTC m=+1643.239744346" observedRunningTime="2026-03-09 14:30:03.143272714 +0000 UTC m=+1643.698841310" watchObservedRunningTime="2026-03-09 14:30:03.163094951 +0000 UTC m=+1643.718663527" Mar 09 14:30:03 crc kubenswrapper[4722]: I0309 14:30:03.615167 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551110-8j24g" Mar 09 14:30:03 crc kubenswrapper[4722]: I0309 14:30:03.718579 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/673ece17-a82b-4e82-b811-5c704ab9a1b4-config-volume\") pod \"673ece17-a82b-4e82-b811-5c704ab9a1b4\" (UID: \"673ece17-a82b-4e82-b811-5c704ab9a1b4\") " Mar 09 14:30:03 crc kubenswrapper[4722]: I0309 14:30:03.718653 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/673ece17-a82b-4e82-b811-5c704ab9a1b4-secret-volume\") pod \"673ece17-a82b-4e82-b811-5c704ab9a1b4\" (UID: \"673ece17-a82b-4e82-b811-5c704ab9a1b4\") " Mar 09 14:30:03 crc kubenswrapper[4722]: I0309 14:30:03.718701 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbgh8\" (UniqueName: \"kubernetes.io/projected/673ece17-a82b-4e82-b811-5c704ab9a1b4-kube-api-access-sbgh8\") pod \"673ece17-a82b-4e82-b811-5c704ab9a1b4\" (UID: \"673ece17-a82b-4e82-b811-5c704ab9a1b4\") " Mar 09 14:30:03 crc kubenswrapper[4722]: I0309 14:30:03.719681 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/673ece17-a82b-4e82-b811-5c704ab9a1b4-config-volume" (OuterVolumeSpecName: "config-volume") pod "673ece17-a82b-4e82-b811-5c704ab9a1b4" (UID: "673ece17-a82b-4e82-b811-5c704ab9a1b4"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:30:03 crc kubenswrapper[4722]: I0309 14:30:03.732867 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/673ece17-a82b-4e82-b811-5c704ab9a1b4-kube-api-access-sbgh8" (OuterVolumeSpecName: "kube-api-access-sbgh8") pod "673ece17-a82b-4e82-b811-5c704ab9a1b4" (UID: "673ece17-a82b-4e82-b811-5c704ab9a1b4"). InnerVolumeSpecName "kube-api-access-sbgh8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:30:03 crc kubenswrapper[4722]: I0309 14:30:03.734119 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/673ece17-a82b-4e82-b811-5c704ab9a1b4-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "673ece17-a82b-4e82-b811-5c704ab9a1b4" (UID: "673ece17-a82b-4e82-b811-5c704ab9a1b4"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:30:03 crc kubenswrapper[4722]: I0309 14:30:03.823122 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sbgh8\" (UniqueName: \"kubernetes.io/projected/673ece17-a82b-4e82-b811-5c704ab9a1b4-kube-api-access-sbgh8\") on node \"crc\" DevicePath \"\"" Mar 09 14:30:03 crc kubenswrapper[4722]: I0309 14:30:03.823171 4722 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/673ece17-a82b-4e82-b811-5c704ab9a1b4-config-volume\") on node \"crc\" DevicePath \"\"" Mar 09 14:30:03 crc kubenswrapper[4722]: I0309 14:30:03.823184 4722 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/673ece17-a82b-4e82-b811-5c704ab9a1b4-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 09 14:30:04 crc kubenswrapper[4722]: I0309 14:30:04.136811 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551110-8j24g" Mar 09 14:30:04 crc kubenswrapper[4722]: I0309 14:30:04.136925 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551110-8j24g" event={"ID":"673ece17-a82b-4e82-b811-5c704ab9a1b4","Type":"ContainerDied","Data":"735162e244b3e74a3f4e8be60b771b930fb29ca9c1dbc0cd34feec83bb838b86"} Mar 09 14:30:04 crc kubenswrapper[4722]: I0309 14:30:04.136958 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="735162e244b3e74a3f4e8be60b771b930fb29ca9c1dbc0cd34feec83bb838b86" Mar 09 14:30:05 crc kubenswrapper[4722]: I0309 14:30:05.149495 4722 scope.go:117] "RemoveContainer" containerID="86a6dcdf7dcf4d48b8bdfb71d9a2c30b89f235700c424312c7e16580a86021d6" Mar 09 14:30:05 crc kubenswrapper[4722]: E0309 14:30:05.150486 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 14:30:05 crc kubenswrapper[4722]: I0309 14:30:05.152918 4722 generic.go:334] "Generic (PLEG): container finished" podID="399309e2-cc0e-4ecb-a9ed-ae5efd75dfb5" containerID="785c9b3fd4f944246425859922e4b9f248364ff8e2049eee438ad214e9c731f4" exitCode=0 Mar 09 14:30:05 crc kubenswrapper[4722]: I0309 14:30:05.152961 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551110-bsfw4" event={"ID":"399309e2-cc0e-4ecb-a9ed-ae5efd75dfb5","Type":"ContainerDied","Data":"785c9b3fd4f944246425859922e4b9f248364ff8e2049eee438ad214e9c731f4"} Mar 09 14:30:05 crc kubenswrapper[4722]: I0309 14:30:05.469509 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 09 14:30:06 crc kubenswrapper[4722]: I0309 14:30:06.599238 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551110-bsfw4" Mar 09 14:30:06 crc kubenswrapper[4722]: I0309 14:30:06.694027 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzw24\" (UniqueName: \"kubernetes.io/projected/399309e2-cc0e-4ecb-a9ed-ae5efd75dfb5-kube-api-access-bzw24\") pod \"399309e2-cc0e-4ecb-a9ed-ae5efd75dfb5\" (UID: \"399309e2-cc0e-4ecb-a9ed-ae5efd75dfb5\") " Mar 09 14:30:06 crc kubenswrapper[4722]: I0309 14:30:06.699291 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/399309e2-cc0e-4ecb-a9ed-ae5efd75dfb5-kube-api-access-bzw24" (OuterVolumeSpecName: "kube-api-access-bzw24") pod "399309e2-cc0e-4ecb-a9ed-ae5efd75dfb5" (UID: "399309e2-cc0e-4ecb-a9ed-ae5efd75dfb5"). InnerVolumeSpecName "kube-api-access-bzw24". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:30:06 crc kubenswrapper[4722]: I0309 14:30:06.797458 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bzw24\" (UniqueName: \"kubernetes.io/projected/399309e2-cc0e-4ecb-a9ed-ae5efd75dfb5-kube-api-access-bzw24\") on node \"crc\" DevicePath \"\"" Mar 09 14:30:07 crc kubenswrapper[4722]: I0309 14:30:07.178392 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551110-bsfw4" event={"ID":"399309e2-cc0e-4ecb-a9ed-ae5efd75dfb5","Type":"ContainerDied","Data":"69f0338eec16f6a2867b40ca879d90523d95094a1551acc9480e66fca9f98bb3"} Mar 09 14:30:07 crc kubenswrapper[4722]: I0309 14:30:07.178436 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69f0338eec16f6a2867b40ca879d90523d95094a1551acc9480e66fca9f98bb3" Mar 09 14:30:07 crc kubenswrapper[4722]: I0309 14:30:07.178493 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551110-bsfw4" Mar 09 14:30:07 crc kubenswrapper[4722]: I0309 14:30:07.254336 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551104-vdth8"] Mar 09 14:30:07 crc kubenswrapper[4722]: I0309 14:30:07.266782 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551104-vdth8"] Mar 09 14:30:08 crc kubenswrapper[4722]: I0309 14:30:08.162645 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2ae19a8-dc3a-4368-94d0-a8be2f8ed011" path="/var/lib/kubelet/pods/b2ae19a8-dc3a-4368-94d0-a8be2f8ed011/volumes" Mar 09 14:30:17 crc kubenswrapper[4722]: I0309 14:30:17.113740 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-j4lgt"] Mar 09 14:30:17 crc kubenswrapper[4722]: I0309 14:30:17.126449 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-j4lgt"] Mar 09 14:30:17 crc kubenswrapper[4722]: I0309 14:30:17.201359 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-cf85b"] Mar 09 14:30:17 crc kubenswrapper[4722]: E0309 14:30:17.201837 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="399309e2-cc0e-4ecb-a9ed-ae5efd75dfb5" containerName="oc" Mar 09 14:30:17 crc kubenswrapper[4722]: I0309 14:30:17.201859 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="399309e2-cc0e-4ecb-a9ed-ae5efd75dfb5" containerName="oc" Mar 09 14:30:17 crc kubenswrapper[4722]: E0309 14:30:17.201910 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="673ece17-a82b-4e82-b811-5c704ab9a1b4" containerName="collect-profiles" Mar 09 14:30:17 crc kubenswrapper[4722]: I0309 14:30:17.201919 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="673ece17-a82b-4e82-b811-5c704ab9a1b4" containerName="collect-profiles" Mar 09 14:30:17 crc kubenswrapper[4722]: I0309 14:30:17.202146 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="673ece17-a82b-4e82-b811-5c704ab9a1b4" containerName="collect-profiles" Mar 09 14:30:17 crc kubenswrapper[4722]: I0309 14:30:17.202165 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="399309e2-cc0e-4ecb-a9ed-ae5efd75dfb5" containerName="oc" Mar 09 14:30:17 crc kubenswrapper[4722]: I0309 14:30:17.203052 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-cf85b" Mar 09 14:30:17 crc kubenswrapper[4722]: I0309 14:30:17.217404 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-cf85b"] Mar 09 14:30:17 crc kubenswrapper[4722]: I0309 14:30:17.322512 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4t68\" (UniqueName: \"kubernetes.io/projected/c83de432-8e6a-45bf-9395-215f28461090-kube-api-access-p4t68\") pod \"heat-db-sync-cf85b\" (UID: \"c83de432-8e6a-45bf-9395-215f28461090\") " pod="openstack/heat-db-sync-cf85b" Mar 09 14:30:17 crc kubenswrapper[4722]: I0309 14:30:17.322833 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c83de432-8e6a-45bf-9395-215f28461090-config-data\") pod \"heat-db-sync-cf85b\" (UID: \"c83de432-8e6a-45bf-9395-215f28461090\") " pod="openstack/heat-db-sync-cf85b" Mar 09 14:30:17 crc kubenswrapper[4722]: I0309 14:30:17.322879 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c83de432-8e6a-45bf-9395-215f28461090-combined-ca-bundle\") pod \"heat-db-sync-cf85b\" (UID: \"c83de432-8e6a-45bf-9395-215f28461090\") " pod="openstack/heat-db-sync-cf85b" Mar 09 14:30:17 crc kubenswrapper[4722]: I0309 14:30:17.424688 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c83de432-8e6a-45bf-9395-215f28461090-config-data\") pod \"heat-db-sync-cf85b\" (UID: \"c83de432-8e6a-45bf-9395-215f28461090\") " pod="openstack/heat-db-sync-cf85b" Mar 09 14:30:17 crc kubenswrapper[4722]: I0309 14:30:17.424750 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c83de432-8e6a-45bf-9395-215f28461090-combined-ca-bundle\") pod \"heat-db-sync-cf85b\" (UID: \"c83de432-8e6a-45bf-9395-215f28461090\") " pod="openstack/heat-db-sync-cf85b" Mar 09 14:30:17 crc kubenswrapper[4722]: I0309 14:30:17.424807 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4t68\" (UniqueName: \"kubernetes.io/projected/c83de432-8e6a-45bf-9395-215f28461090-kube-api-access-p4t68\") pod \"heat-db-sync-cf85b\" (UID: \"c83de432-8e6a-45bf-9395-215f28461090\") " pod="openstack/heat-db-sync-cf85b" Mar 09 14:30:17 crc kubenswrapper[4722]: I0309 14:30:17.431580 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c83de432-8e6a-45bf-9395-215f28461090-config-data\") pod \"heat-db-sync-cf85b\" (UID: \"c83de432-8e6a-45bf-9395-215f28461090\") " pod="openstack/heat-db-sync-cf85b" Mar 09 14:30:17 crc kubenswrapper[4722]: I0309 14:30:17.434035 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c83de432-8e6a-45bf-9395-215f28461090-combined-ca-bundle\") pod \"heat-db-sync-cf85b\" (UID: \"c83de432-8e6a-45bf-9395-215f28461090\") " pod="openstack/heat-db-sync-cf85b" Mar 09 14:30:17 crc kubenswrapper[4722]: I0309 14:30:17.448813 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4t68\" (UniqueName: \"kubernetes.io/projected/c83de432-8e6a-45bf-9395-215f28461090-kube-api-access-p4t68\") pod \"heat-db-sync-cf85b\" (UID: 
\"c83de432-8e6a-45bf-9395-215f28461090\") " pod="openstack/heat-db-sync-cf85b" Mar 09 14:30:17 crc kubenswrapper[4722]: I0309 14:30:17.554297 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-cf85b" Mar 09 14:30:18 crc kubenswrapper[4722]: I0309 14:30:18.087617 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-cf85b"] Mar 09 14:30:18 crc kubenswrapper[4722]: I0309 14:30:18.151323 4722 scope.go:117] "RemoveContainer" containerID="86a6dcdf7dcf4d48b8bdfb71d9a2c30b89f235700c424312c7e16580a86021d6" Mar 09 14:30:18 crc kubenswrapper[4722]: E0309 14:30:18.151802 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 14:30:18 crc kubenswrapper[4722]: I0309 14:30:18.172167 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f51218c-6b15-4f4a-ad49-1ba0ccd5e292" path="/var/lib/kubelet/pods/7f51218c-6b15-4f4a-ad49-1ba0ccd5e292/volumes" Mar 09 14:30:18 crc kubenswrapper[4722]: I0309 14:30:18.336149 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-cf85b" event={"ID":"c83de432-8e6a-45bf-9395-215f28461090","Type":"ContainerStarted","Data":"e8751321bb092c628355ca9b01adf206a34271923881a7149af061d890d52f18"} Mar 09 14:30:19 crc kubenswrapper[4722]: I0309 14:30:19.558723 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-2"] Mar 09 14:30:19 crc kubenswrapper[4722]: I0309 14:30:19.623293 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 14:30:19 crc kubenswrapper[4722]: I0309 14:30:19.623787 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2fd9ab47-8790-4f12-b211-50b37603c5f0" containerName="ceilometer-central-agent" containerID="cri-o://c995a7d038526114a350fd8db45d236d3ee4b7d0a02da7a4e6a1661cdf7c15d7" gracePeriod=30 Mar 09 14:30:19 crc kubenswrapper[4722]: I0309 14:30:19.624253 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2fd9ab47-8790-4f12-b211-50b37603c5f0" containerName="proxy-httpd" containerID="cri-o://5c2264d76f370fc5c07d44de97df5c7aeb2646eb6cb15fcb5cfb9ea56f6eaadd" gracePeriod=30 Mar 09 14:30:19 crc kubenswrapper[4722]: I0309 14:30:19.624301 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2fd9ab47-8790-4f12-b211-50b37603c5f0" containerName="sg-core" containerID="cri-o://e213af05c1a6a7bd1d06fdef0f34ec4fa1b384260d859b5d91c2a8066f6466be" gracePeriod=30 Mar 09 14:30:19 crc kubenswrapper[4722]: I0309 14:30:19.624335 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="2fd9ab47-8790-4f12-b211-50b37603c5f0" containerName="ceilometer-notification-agent" containerID="cri-o://21864526935aae9127c5f43aa061b8a9eeaefc7e517417536934bcabb25c6c10" gracePeriod=30 Mar 09 14:30:20 crc kubenswrapper[4722]: I0309 14:30:20.385793 4722 generic.go:334] "Generic (PLEG): container finished" podID="2fd9ab47-8790-4f12-b211-50b37603c5f0" 
containerID="5c2264d76f370fc5c07d44de97df5c7aeb2646eb6cb15fcb5cfb9ea56f6eaadd" exitCode=0 Mar 09 14:30:20 crc kubenswrapper[4722]: I0309 14:30:20.385823 4722 generic.go:334] "Generic (PLEG): container finished" podID="2fd9ab47-8790-4f12-b211-50b37603c5f0" containerID="e213af05c1a6a7bd1d06fdef0f34ec4fa1b384260d859b5d91c2a8066f6466be" exitCode=2 Mar 09 14:30:20 crc kubenswrapper[4722]: I0309 14:30:20.385833 4722 generic.go:334] "Generic (PLEG): container finished" podID="2fd9ab47-8790-4f12-b211-50b37603c5f0" containerID="c995a7d038526114a350fd8db45d236d3ee4b7d0a02da7a4e6a1661cdf7c15d7" exitCode=0 Mar 09 14:30:20 crc kubenswrapper[4722]: I0309 14:30:20.385854 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2fd9ab47-8790-4f12-b211-50b37603c5f0","Type":"ContainerDied","Data":"5c2264d76f370fc5c07d44de97df5c7aeb2646eb6cb15fcb5cfb9ea56f6eaadd"} Mar 09 14:30:20 crc kubenswrapper[4722]: I0309 14:30:20.385879 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2fd9ab47-8790-4f12-b211-50b37603c5f0","Type":"ContainerDied","Data":"e213af05c1a6a7bd1d06fdef0f34ec4fa1b384260d859b5d91c2a8066f6466be"} Mar 09 14:30:20 crc kubenswrapper[4722]: I0309 14:30:20.385889 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2fd9ab47-8790-4f12-b211-50b37603c5f0","Type":"ContainerDied","Data":"c995a7d038526114a350fd8db45d236d3ee4b7d0a02da7a4e6a1661cdf7c15d7"} Mar 09 14:30:20 crc kubenswrapper[4722]: I0309 14:30:20.821867 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 09 14:30:21 crc kubenswrapper[4722]: I0309 14:30:21.074882 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 14:30:21 crc kubenswrapper[4722]: I0309 14:30:21.142644 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2fd9ab47-8790-4f12-b211-50b37603c5f0-scripts\") pod \"2fd9ab47-8790-4f12-b211-50b37603c5f0\" (UID: \"2fd9ab47-8790-4f12-b211-50b37603c5f0\") " Mar 09 14:30:21 crc kubenswrapper[4722]: I0309 14:30:21.142736 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2fd9ab47-8790-4f12-b211-50b37603c5f0-log-httpd\") pod \"2fd9ab47-8790-4f12-b211-50b37603c5f0\" (UID: \"2fd9ab47-8790-4f12-b211-50b37603c5f0\") " Mar 09 14:30:21 crc kubenswrapper[4722]: I0309 14:30:21.142791 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fd9ab47-8790-4f12-b211-50b37603c5f0-combined-ca-bundle\") pod \"2fd9ab47-8790-4f12-b211-50b37603c5f0\" (UID: \"2fd9ab47-8790-4f12-b211-50b37603c5f0\") " Mar 09 14:30:21 crc kubenswrapper[4722]: I0309 14:30:21.142870 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2fd9ab47-8790-4f12-b211-50b37603c5f0-run-httpd\") pod \"2fd9ab47-8790-4f12-b211-50b37603c5f0\" (UID: \"2fd9ab47-8790-4f12-b211-50b37603c5f0\") " Mar 09 14:30:21 crc kubenswrapper[4722]: I0309 14:30:21.142963 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fd9ab47-8790-4f12-b211-50b37603c5f0-config-data\") pod \"2fd9ab47-8790-4f12-b211-50b37603c5f0\" (UID: \"2fd9ab47-8790-4f12-b211-50b37603c5f0\") " 
Mar 09 14:30:21 crc kubenswrapper[4722]: I0309 14:30:21.142984 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2597\" (UniqueName: \"kubernetes.io/projected/2fd9ab47-8790-4f12-b211-50b37603c5f0-kube-api-access-c2597\") pod \"2fd9ab47-8790-4f12-b211-50b37603c5f0\" (UID: \"2fd9ab47-8790-4f12-b211-50b37603c5f0\") " Mar 09 14:30:21 crc kubenswrapper[4722]: I0309 14:30:21.143074 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fd9ab47-8790-4f12-b211-50b37603c5f0-ceilometer-tls-certs\") pod \"2fd9ab47-8790-4f12-b211-50b37603c5f0\" (UID: \"2fd9ab47-8790-4f12-b211-50b37603c5f0\") " Mar 09 14:30:21 crc kubenswrapper[4722]: I0309 14:30:21.143140 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2fd9ab47-8790-4f12-b211-50b37603c5f0-sg-core-conf-yaml\") pod \"2fd9ab47-8790-4f12-b211-50b37603c5f0\" (UID: \"2fd9ab47-8790-4f12-b211-50b37603c5f0\") " Mar 09 14:30:21 crc kubenswrapper[4722]: I0309 14:30:21.143845 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2fd9ab47-8790-4f12-b211-50b37603c5f0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2fd9ab47-8790-4f12-b211-50b37603c5f0" (UID: "2fd9ab47-8790-4f12-b211-50b37603c5f0"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:30:21 crc kubenswrapper[4722]: I0309 14:30:21.144070 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2fd9ab47-8790-4f12-b211-50b37603c5f0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2fd9ab47-8790-4f12-b211-50b37603c5f0" (UID: "2fd9ab47-8790-4f12-b211-50b37603c5f0"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:30:21 crc kubenswrapper[4722]: I0309 14:30:21.157481 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fd9ab47-8790-4f12-b211-50b37603c5f0-kube-api-access-c2597" (OuterVolumeSpecName: "kube-api-access-c2597") pod "2fd9ab47-8790-4f12-b211-50b37603c5f0" (UID: "2fd9ab47-8790-4f12-b211-50b37603c5f0"). InnerVolumeSpecName "kube-api-access-c2597". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:30:21 crc kubenswrapper[4722]: I0309 14:30:21.162334 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fd9ab47-8790-4f12-b211-50b37603c5f0-scripts" (OuterVolumeSpecName: "scripts") pod "2fd9ab47-8790-4f12-b211-50b37603c5f0" (UID: "2fd9ab47-8790-4f12-b211-50b37603c5f0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:30:21 crc kubenswrapper[4722]: I0309 14:30:21.254403 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fd9ab47-8790-4f12-b211-50b37603c5f0-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "2fd9ab47-8790-4f12-b211-50b37603c5f0" (UID: "2fd9ab47-8790-4f12-b211-50b37603c5f0"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:30:21 crc kubenswrapper[4722]: I0309 14:30:21.271571 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fd9ab47-8790-4f12-b211-50b37603c5f0-ceilometer-tls-certs\") pod \"2fd9ab47-8790-4f12-b211-50b37603c5f0\" (UID: \"2fd9ab47-8790-4f12-b211-50b37603c5f0\") " Mar 09 14:30:21 crc kubenswrapper[4722]: I0309 14:30:21.273817 4722 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2fd9ab47-8790-4f12-b211-50b37603c5f0-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 14:30:21 crc kubenswrapper[4722]: I0309 14:30:21.273841 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2597\" (UniqueName: \"kubernetes.io/projected/2fd9ab47-8790-4f12-b211-50b37603c5f0-kube-api-access-c2597\") on node \"crc\" DevicePath \"\"" Mar 09 14:30:21 crc kubenswrapper[4722]: I0309 14:30:21.273853 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2fd9ab47-8790-4f12-b211-50b37603c5f0-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 14:30:21 crc kubenswrapper[4722]: I0309 14:30:21.273861 4722 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2fd9ab47-8790-4f12-b211-50b37603c5f0-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 14:30:21 crc kubenswrapper[4722]: W0309 14:30:21.274279 4722 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/2fd9ab47-8790-4f12-b211-50b37603c5f0/volumes/kubernetes.io~secret/ceilometer-tls-certs Mar 09 14:30:21 crc kubenswrapper[4722]: I0309 14:30:21.274305 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fd9ab47-8790-4f12-b211-50b37603c5f0-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "2fd9ab47-8790-4f12-b211-50b37603c5f0" (UID: "2fd9ab47-8790-4f12-b211-50b37603c5f0"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:30:21 crc kubenswrapper[4722]: I0309 14:30:21.378975 4722 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fd9ab47-8790-4f12-b211-50b37603c5f0-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 14:30:21 crc kubenswrapper[4722]: I0309 14:30:21.383817 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fd9ab47-8790-4f12-b211-50b37603c5f0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "2fd9ab47-8790-4f12-b211-50b37603c5f0" (UID: "2fd9ab47-8790-4f12-b211-50b37603c5f0"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:30:21 crc kubenswrapper[4722]: I0309 14:30:21.400880 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fd9ab47-8790-4f12-b211-50b37603c5f0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2fd9ab47-8790-4f12-b211-50b37603c5f0" (UID: "2fd9ab47-8790-4f12-b211-50b37603c5f0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:30:21 crc kubenswrapper[4722]: I0309 14:30:21.406098 4722 generic.go:334] "Generic (PLEG): container finished" podID="2fd9ab47-8790-4f12-b211-50b37603c5f0" containerID="21864526935aae9127c5f43aa061b8a9eeaefc7e517417536934bcabb25c6c10" exitCode=0 Mar 09 14:30:21 crc kubenswrapper[4722]: I0309 14:30:21.406143 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2fd9ab47-8790-4f12-b211-50b37603c5f0","Type":"ContainerDied","Data":"21864526935aae9127c5f43aa061b8a9eeaefc7e517417536934bcabb25c6c10"} Mar 09 14:30:21 crc kubenswrapper[4722]: I0309 14:30:21.406172 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"2fd9ab47-8790-4f12-b211-50b37603c5f0","Type":"ContainerDied","Data":"2b5be031e43cbbe7fb4b68aca7706f203c51ef71c8aaa9edb387aa9f30dd9c9c"} Mar 09 14:30:21 crc kubenswrapper[4722]: I0309 14:30:21.406193 4722 scope.go:117] "RemoveContainer" containerID="5c2264d76f370fc5c07d44de97df5c7aeb2646eb6cb15fcb5cfb9ea56f6eaadd" Mar 09 14:30:21 crc kubenswrapper[4722]: I0309 14:30:21.406453 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 14:30:21 crc kubenswrapper[4722]: I0309 14:30:21.459992 4722 scope.go:117] "RemoveContainer" containerID="e213af05c1a6a7bd1d06fdef0f34ec4fa1b384260d859b5d91c2a8066f6466be" Mar 09 14:30:21 crc kubenswrapper[4722]: I0309 14:30:21.483482 4722 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/2fd9ab47-8790-4f12-b211-50b37603c5f0-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 09 14:30:21 crc kubenswrapper[4722]: I0309 14:30:21.483510 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fd9ab47-8790-4f12-b211-50b37603c5f0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:30:21 crc kubenswrapper[4722]: I0309 14:30:21.486693 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fd9ab47-8790-4f12-b211-50b37603c5f0-config-data" (OuterVolumeSpecName: "config-data") pod "2fd9ab47-8790-4f12-b211-50b37603c5f0" (UID: "2fd9ab47-8790-4f12-b211-50b37603c5f0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:30:21 crc kubenswrapper[4722]: I0309 14:30:21.547696 4722 scope.go:117] "RemoveContainer" containerID="21864526935aae9127c5f43aa061b8a9eeaefc7e517417536934bcabb25c6c10" Mar 09 14:30:21 crc kubenswrapper[4722]: I0309 14:30:21.586365 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2fd9ab47-8790-4f12-b211-50b37603c5f0-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 14:30:21 crc kubenswrapper[4722]: I0309 14:30:21.626673 4722 scope.go:117] "RemoveContainer" containerID="c995a7d038526114a350fd8db45d236d3ee4b7d0a02da7a4e6a1661cdf7c15d7" Mar 09 14:30:21 crc kubenswrapper[4722]: I0309 14:30:21.676118 4722 scope.go:117] "RemoveContainer" containerID="5c2264d76f370fc5c07d44de97df5c7aeb2646eb6cb15fcb5cfb9ea56f6eaadd" Mar 09 14:30:21 crc kubenswrapper[4722]: E0309 14:30:21.678703 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c2264d76f370fc5c07d44de97df5c7aeb2646eb6cb15fcb5cfb9ea56f6eaadd\": container with ID starting with 5c2264d76f370fc5c07d44de97df5c7aeb2646eb6cb15fcb5cfb9ea56f6eaadd not found: ID does not exist" containerID="5c2264d76f370fc5c07d44de97df5c7aeb2646eb6cb15fcb5cfb9ea56f6eaadd" Mar 09 14:30:21 crc kubenswrapper[4722]: I0309 14:30:21.678736 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c2264d76f370fc5c07d44de97df5c7aeb2646eb6cb15fcb5cfb9ea56f6eaadd"} err="failed to get container status \"5c2264d76f370fc5c07d44de97df5c7aeb2646eb6cb15fcb5cfb9ea56f6eaadd\": rpc error: code = NotFound desc = could not find container \"5c2264d76f370fc5c07d44de97df5c7aeb2646eb6cb15fcb5cfb9ea56f6eaadd\": container with ID starting with 5c2264d76f370fc5c07d44de97df5c7aeb2646eb6cb15fcb5cfb9ea56f6eaadd not found: ID does not exist" Mar 09 14:30:21 crc kubenswrapper[4722]: I0309 14:30:21.678758 4722 scope.go:117] "RemoveContainer" containerID="e213af05c1a6a7bd1d06fdef0f34ec4fa1b384260d859b5d91c2a8066f6466be" Mar 09 14:30:21 crc kubenswrapper[4722]: E0309 14:30:21.682627 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e213af05c1a6a7bd1d06fdef0f34ec4fa1b384260d859b5d91c2a8066f6466be\": container with ID starting with e213af05c1a6a7bd1d06fdef0f34ec4fa1b384260d859b5d91c2a8066f6466be not found: ID does not exist" containerID="e213af05c1a6a7bd1d06fdef0f34ec4fa1b384260d859b5d91c2a8066f6466be" Mar 09 14:30:21 crc kubenswrapper[4722]: I0309 14:30:21.682661 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e213af05c1a6a7bd1d06fdef0f34ec4fa1b384260d859b5d91c2a8066f6466be"} err="failed to get container status \"e213af05c1a6a7bd1d06fdef0f34ec4fa1b384260d859b5d91c2a8066f6466be\": rpc error: code = NotFound desc = could not find container \"e213af05c1a6a7bd1d06fdef0f34ec4fa1b384260d859b5d91c2a8066f6466be\": container with ID starting with e213af05c1a6a7bd1d06fdef0f34ec4fa1b384260d859b5d91c2a8066f6466be not found: ID does not exist" Mar 09 14:30:21 crc kubenswrapper[4722]: I0309 14:30:21.682678 4722 scope.go:117] "RemoveContainer" containerID="21864526935aae9127c5f43aa061b8a9eeaefc7e517417536934bcabb25c6c10" Mar 09 14:30:21 crc kubenswrapper[4722]: E0309 14:30:21.683077 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"21864526935aae9127c5f43aa061b8a9eeaefc7e517417536934bcabb25c6c10\": container with ID starting with 21864526935aae9127c5f43aa061b8a9eeaefc7e517417536934bcabb25c6c10 not found: ID does not exist" containerID="21864526935aae9127c5f43aa061b8a9eeaefc7e517417536934bcabb25c6c10" Mar 09 14:30:21 crc kubenswrapper[4722]: I0309 14:30:21.683126 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21864526935aae9127c5f43aa061b8a9eeaefc7e517417536934bcabb25c6c10"} err="failed to get container status \"21864526935aae9127c5f43aa061b8a9eeaefc7e517417536934bcabb25c6c10\": rpc error: code = NotFound desc = could not find container \"21864526935aae9127c5f43aa061b8a9eeaefc7e517417536934bcabb25c6c10\": container with ID starting with 21864526935aae9127c5f43aa061b8a9eeaefc7e517417536934bcabb25c6c10 not found: ID does not exist" Mar 09 14:30:21 crc kubenswrapper[4722]: I0309 14:30:21.683156 4722 scope.go:117] "RemoveContainer" containerID="c995a7d038526114a350fd8db45d236d3ee4b7d0a02da7a4e6a1661cdf7c15d7" Mar 09 14:30:21 crc kubenswrapper[4722]: E0309 14:30:21.683429 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c995a7d038526114a350fd8db45d236d3ee4b7d0a02da7a4e6a1661cdf7c15d7\": container with ID starting with c995a7d038526114a350fd8db45d236d3ee4b7d0a02da7a4e6a1661cdf7c15d7 not found: ID does not exist" containerID="c995a7d038526114a350fd8db45d236d3ee4b7d0a02da7a4e6a1661cdf7c15d7" Mar 09 14:30:21 crc kubenswrapper[4722]: I0309 14:30:21.683451 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c995a7d038526114a350fd8db45d236d3ee4b7d0a02da7a4e6a1661cdf7c15d7"} err="failed to get container status \"c995a7d038526114a350fd8db45d236d3ee4b7d0a02da7a4e6a1661cdf7c15d7\": rpc error: code = NotFound desc = could not find container \"c995a7d038526114a350fd8db45d236d3ee4b7d0a02da7a4e6a1661cdf7c15d7\": container with ID starting with c995a7d038526114a350fd8db45d236d3ee4b7d0a02da7a4e6a1661cdf7c15d7 not found: ID does not exist" Mar 09 14:30:21 crc kubenswrapper[4722]: I0309 14:30:21.767044 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 14:30:21 crc kubenswrapper[4722]: I0309 14:30:21.778401 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 09 14:30:21 crc kubenswrapper[4722]: I0309 14:30:21.820850 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 09 14:30:21 crc kubenswrapper[4722]: E0309 14:30:21.821378 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fd9ab47-8790-4f12-b211-50b37603c5f0" containerName="sg-core" Mar 09 14:30:21 crc kubenswrapper[4722]: I0309 14:30:21.821396 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fd9ab47-8790-4f12-b211-50b37603c5f0" containerName="sg-core" Mar 09 14:30:21 crc kubenswrapper[4722]: E0309 14:30:21.821428 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fd9ab47-8790-4f12-b211-50b37603c5f0" containerName="proxy-httpd" Mar 09 14:30:21 crc kubenswrapper[4722]: I0309 14:30:21.821436 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fd9ab47-8790-4f12-b211-50b37603c5f0" containerName="proxy-httpd" Mar 09 14:30:21 crc kubenswrapper[4722]: E0309 14:30:21.821448 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fd9ab47-8790-4f12-b211-50b37603c5f0" containerName="ceilometer-notification-agent" Mar 09 14:30:21 
crc kubenswrapper[4722]: I0309 14:30:21.821454 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fd9ab47-8790-4f12-b211-50b37603c5f0" containerName="ceilometer-notification-agent" Mar 09 14:30:21 crc kubenswrapper[4722]: E0309 14:30:21.821468 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fd9ab47-8790-4f12-b211-50b37603c5f0" containerName="ceilometer-central-agent" Mar 09 14:30:21 crc kubenswrapper[4722]: I0309 14:30:21.821474 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fd9ab47-8790-4f12-b211-50b37603c5f0" containerName="ceilometer-central-agent" Mar 09 14:30:21 crc kubenswrapper[4722]: I0309 14:30:21.821677 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fd9ab47-8790-4f12-b211-50b37603c5f0" containerName="sg-core" Mar 09 14:30:21 crc kubenswrapper[4722]: I0309 14:30:21.821699 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fd9ab47-8790-4f12-b211-50b37603c5f0" containerName="ceilometer-notification-agent" Mar 09 14:30:21 crc kubenswrapper[4722]: I0309 14:30:21.821712 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fd9ab47-8790-4f12-b211-50b37603c5f0" containerName="proxy-httpd" Mar 09 14:30:21 crc kubenswrapper[4722]: I0309 14:30:21.821726 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fd9ab47-8790-4f12-b211-50b37603c5f0" containerName="ceilometer-central-agent" Mar 09 14:30:21 crc kubenswrapper[4722]: I0309 14:30:21.823781 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 14:30:21 crc kubenswrapper[4722]: I0309 14:30:21.831057 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 09 14:30:21 crc kubenswrapper[4722]: I0309 14:30:21.831237 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 09 14:30:21 crc kubenswrapper[4722]: I0309 14:30:21.835969 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 09 14:30:21 crc kubenswrapper[4722]: I0309 14:30:21.849924 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 14:30:21 crc kubenswrapper[4722]: I0309 14:30:21.997949 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4a22f8c-ed38-47cf-8238-baf804f573a1-log-httpd\") pod \"ceilometer-0\" (UID: \"e4a22f8c-ed38-47cf-8238-baf804f573a1\") " pod="openstack/ceilometer-0" Mar 09 14:30:21 crc kubenswrapper[4722]: I0309 14:30:21.997990 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4a22f8c-ed38-47cf-8238-baf804f573a1-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e4a22f8c-ed38-47cf-8238-baf804f573a1\") " pod="openstack/ceilometer-0" Mar 09 14:30:21 crc kubenswrapper[4722]: I0309 14:30:21.998023 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4a22f8c-ed38-47cf-8238-baf804f573a1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e4a22f8c-ed38-47cf-8238-baf804f573a1\") " pod="openstack/ceilometer-0" Mar 09 14:30:21 crc kubenswrapper[4722]: I0309 14:30:21.998130 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-ff2j9\" (UniqueName: \"kubernetes.io/projected/e4a22f8c-ed38-47cf-8238-baf804f573a1-kube-api-access-ff2j9\") pod \"ceilometer-0\" (UID: \"e4a22f8c-ed38-47cf-8238-baf804f573a1\") " pod="openstack/ceilometer-0" Mar 09 14:30:21 crc kubenswrapper[4722]: I0309 14:30:21.998158 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e4a22f8c-ed38-47cf-8238-baf804f573a1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e4a22f8c-ed38-47cf-8238-baf804f573a1\") " pod="openstack/ceilometer-0" Mar 09 14:30:21 crc kubenswrapper[4722]: I0309 14:30:21.998178 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4a22f8c-ed38-47cf-8238-baf804f573a1-config-data\") pod \"ceilometer-0\" (UID: \"e4a22f8c-ed38-47cf-8238-baf804f573a1\") " pod="openstack/ceilometer-0" Mar 09 14:30:21 crc kubenswrapper[4722]: I0309 14:30:21.998463 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4a22f8c-ed38-47cf-8238-baf804f573a1-run-httpd\") pod \"ceilometer-0\" (UID: \"e4a22f8c-ed38-47cf-8238-baf804f573a1\") " pod="openstack/ceilometer-0" Mar 09 14:30:21 crc kubenswrapper[4722]: I0309 14:30:21.998559 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4a22f8c-ed38-47cf-8238-baf804f573a1-scripts\") pod \"ceilometer-0\" (UID: \"e4a22f8c-ed38-47cf-8238-baf804f573a1\") " pod="openstack/ceilometer-0" Mar 09 14:30:22 crc kubenswrapper[4722]: I0309 14:30:22.101235 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e4a22f8c-ed38-47cf-8238-baf804f573a1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e4a22f8c-ed38-47cf-8238-baf804f573a1\") " pod="openstack/ceilometer-0" Mar 09 14:30:22 crc kubenswrapper[4722]: I0309 14:30:22.101284 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4a22f8c-ed38-47cf-8238-baf804f573a1-config-data\") pod \"ceilometer-0\" (UID: \"e4a22f8c-ed38-47cf-8238-baf804f573a1\") " pod="openstack/ceilometer-0" Mar 09 14:30:22 crc kubenswrapper[4722]: I0309 14:30:22.101379 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4a22f8c-ed38-47cf-8238-baf804f573a1-run-httpd\") pod \"ceilometer-0\" (UID: \"e4a22f8c-ed38-47cf-8238-baf804f573a1\") " pod="openstack/ceilometer-0" Mar 09 14:30:22 crc kubenswrapper[4722]: I0309 14:30:22.101407 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4a22f8c-ed38-47cf-8238-baf804f573a1-scripts\") pod \"ceilometer-0\" (UID: \"e4a22f8c-ed38-47cf-8238-baf804f573a1\") " pod="openstack/ceilometer-0" Mar 09 14:30:22 crc kubenswrapper[4722]: I0309 14:30:22.101518 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4a22f8c-ed38-47cf-8238-baf804f573a1-log-httpd\") pod \"ceilometer-0\" (UID: \"e4a22f8c-ed38-47cf-8238-baf804f573a1\") " pod="openstack/ceilometer-0" Mar 09 14:30:22 crc kubenswrapper[4722]: I0309 14:30:22.101539 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4a22f8c-ed38-47cf-8238-baf804f573a1-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e4a22f8c-ed38-47cf-8238-baf804f573a1\") " pod="openstack/ceilometer-0" Mar 09 14:30:22 crc kubenswrapper[4722]: I0309 14:30:22.101564 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4a22f8c-ed38-47cf-8238-baf804f573a1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e4a22f8c-ed38-47cf-8238-baf804f573a1\") " pod="openstack/ceilometer-0" Mar 09 14:30:22 crc kubenswrapper[4722]: I0309 14:30:22.101636 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ff2j9\" (UniqueName: \"kubernetes.io/projected/e4a22f8c-ed38-47cf-8238-baf804f573a1-kube-api-access-ff2j9\") pod \"ceilometer-0\" (UID: \"e4a22f8c-ed38-47cf-8238-baf804f573a1\") " pod="openstack/ceilometer-0" Mar 09 14:30:22 crc kubenswrapper[4722]: I0309 14:30:22.102083 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4a22f8c-ed38-47cf-8238-baf804f573a1-run-httpd\") pod \"ceilometer-0\" (UID: \"e4a22f8c-ed38-47cf-8238-baf804f573a1\") " pod="openstack/ceilometer-0" Mar 09 14:30:22 crc kubenswrapper[4722]: I0309 14:30:22.104674 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4a22f8c-ed38-47cf-8238-baf804f573a1-log-httpd\") pod \"ceilometer-0\" (UID: \"e4a22f8c-ed38-47cf-8238-baf804f573a1\") " pod="openstack/ceilometer-0" Mar 09 14:30:22 crc kubenswrapper[4722]: I0309 14:30:22.105398 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4a22f8c-ed38-47cf-8238-baf804f573a1-scripts\") pod \"ceilometer-0\" (UID: \"e4a22f8c-ed38-47cf-8238-baf804f573a1\") " pod="openstack/ceilometer-0" Mar 09 14:30:22 crc kubenswrapper[4722]: I0309 14:30:22.106287 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e4a22f8c-ed38-47cf-8238-baf804f573a1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e4a22f8c-ed38-47cf-8238-baf804f573a1\") " pod="openstack/ceilometer-0" Mar 09 14:30:22 crc kubenswrapper[4722]: I0309 14:30:22.107679 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4a22f8c-ed38-47cf-8238-baf804f573a1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e4a22f8c-ed38-47cf-8238-baf804f573a1\") " pod="openstack/ceilometer-0" Mar 09 14:30:22 crc kubenswrapper[4722]: I0309 14:30:22.119513 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4a22f8c-ed38-47cf-8238-baf804f573a1-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"e4a22f8c-ed38-47cf-8238-baf804f573a1\") " pod="openstack/ceilometer-0" Mar 09 14:30:22 crc kubenswrapper[4722]: I0309 14:30:22.120880 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4a22f8c-ed38-47cf-8238-baf804f573a1-config-data\") pod \"ceilometer-0\" (UID: \"e4a22f8c-ed38-47cf-8238-baf804f573a1\") " pod="openstack/ceilometer-0" Mar 09 14:30:22 crc kubenswrapper[4722]: I0309 14:30:22.131701 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-ff2j9\" (UniqueName: \"kubernetes.io/projected/e4a22f8c-ed38-47cf-8238-baf804f573a1-kube-api-access-ff2j9\") pod \"ceilometer-0\" (UID: \"e4a22f8c-ed38-47cf-8238-baf804f573a1\") " pod="openstack/ceilometer-0" Mar 09 14:30:22 crc kubenswrapper[4722]: I0309 14:30:22.144391 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 14:30:22 crc kubenswrapper[4722]: I0309 14:30:22.165839 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fd9ab47-8790-4f12-b211-50b37603c5f0" path="/var/lib/kubelet/pods/2fd9ab47-8790-4f12-b211-50b37603c5f0/volumes" Mar 09 14:30:22 crc kubenswrapper[4722]: I0309 14:30:22.749382 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 14:30:23 crc kubenswrapper[4722]: I0309 14:30:23.474708 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4a22f8c-ed38-47cf-8238-baf804f573a1","Type":"ContainerStarted","Data":"63ae7fe34969d0cd121ab26d69678707d71b9e4e87f10d6776f6cdcad4b23266"} Mar 09 14:30:25 crc kubenswrapper[4722]: I0309 14:30:25.124245 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-2" podUID="17ce7999-f86f-45fa-ae07-785f70d797a1" containerName="rabbitmq" containerID="cri-o://269fab5abed3a65c1a525f2ce6fb7275926a484f18e769dddcc4268e414b80cd" gracePeriod=604795 Mar 09 14:30:25 crc kubenswrapper[4722]: I0309 14:30:25.793919 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="1c98e541-4b72-465d-8799-89e8c9791c3e" containerName="rabbitmq" containerID="cri-o://491a32d2b4f4528a2d9ac9ed69a4bc0a4f2ea0173c8712dc6ab51834b9601f38" gracePeriod=604796 Mar 09 14:30:28 crc kubenswrapper[4722]: I0309 14:30:28.256166 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="1c98e541-4b72-465d-8799-89e8c9791c3e" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.133:5671: connect: connection refused" Mar 09 14:30:28 crc kubenswrapper[4722]: I0309 14:30:28.413620 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-2" podUID="17ce7999-f86f-45fa-ae07-785f70d797a1" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.135:5671: connect: connection refused" Mar 09 14:30:32 crc kubenswrapper[4722]: I0309 14:30:32.149719 4722 scope.go:117] "RemoveContainer" containerID="86a6dcdf7dcf4d48b8bdfb71d9a2c30b89f235700c424312c7e16580a86021d6" Mar 09 14:30:32 crc kubenswrapper[4722]: E0309 14:30:32.150175 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 14:30:32 crc kubenswrapper[4722]: I0309 14:30:32.610157 4722 generic.go:334] "Generic (PLEG): container finished" podID="17ce7999-f86f-45fa-ae07-785f70d797a1" containerID="269fab5abed3a65c1a525f2ce6fb7275926a484f18e769dddcc4268e414b80cd" exitCode=0 Mar 09 14:30:32 crc kubenswrapper[4722]: I0309 14:30:32.610265 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" 
event={"ID":"17ce7999-f86f-45fa-ae07-785f70d797a1","Type":"ContainerDied","Data":"269fab5abed3a65c1a525f2ce6fb7275926a484f18e769dddcc4268e414b80cd"} Mar 09 14:30:33 crc kubenswrapper[4722]: I0309 14:30:33.624745 4722 generic.go:334] "Generic (PLEG): container finished" podID="1c98e541-4b72-465d-8799-89e8c9791c3e" containerID="491a32d2b4f4528a2d9ac9ed69a4bc0a4f2ea0173c8712dc6ab51834b9601f38" exitCode=0 Mar 09 14:30:33 crc kubenswrapper[4722]: I0309 14:30:33.624971 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1c98e541-4b72-465d-8799-89e8c9791c3e","Type":"ContainerDied","Data":"491a32d2b4f4528a2d9ac9ed69a4bc0a4f2ea0173c8712dc6ab51834b9601f38"} Mar 09 14:30:34 crc kubenswrapper[4722]: I0309 14:30:34.179371 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7d84b4d45c-wcc8n"] Mar 09 14:30:34 crc kubenswrapper[4722]: I0309 14:30:34.181500 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d84b4d45c-wcc8n" Mar 09 14:30:34 crc kubenswrapper[4722]: I0309 14:30:34.189160 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Mar 09 14:30:34 crc kubenswrapper[4722]: I0309 14:30:34.209130 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d84b4d45c-wcc8n"] Mar 09 14:30:34 crc kubenswrapper[4722]: I0309 14:30:34.330652 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4f573d9-eabb-492d-b4ab-25c64166e91f-config\") pod \"dnsmasq-dns-7d84b4d45c-wcc8n\" (UID: \"c4f573d9-eabb-492d-b4ab-25c64166e91f\") " pod="openstack/dnsmasq-dns-7d84b4d45c-wcc8n" Mar 09 14:30:34 crc kubenswrapper[4722]: I0309 14:30:34.330698 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c4f573d9-eabb-492d-b4ab-25c64166e91f-ovsdbserver-nb\") pod \"dnsmasq-dns-7d84b4d45c-wcc8n\" (UID: \"c4f573d9-eabb-492d-b4ab-25c64166e91f\") " pod="openstack/dnsmasq-dns-7d84b4d45c-wcc8n" Mar 09 14:30:34 crc kubenswrapper[4722]: I0309 14:30:34.330758 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c4f573d9-eabb-492d-b4ab-25c64166e91f-dns-swift-storage-0\") pod \"dnsmasq-dns-7d84b4d45c-wcc8n\" (UID: \"c4f573d9-eabb-492d-b4ab-25c64166e91f\") " pod="openstack/dnsmasq-dns-7d84b4d45c-wcc8n" Mar 09 14:30:34 crc kubenswrapper[4722]: I0309 14:30:34.330989 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c4f573d9-eabb-492d-b4ab-25c64166e91f-openstack-edpm-ipam\") pod \"dnsmasq-dns-7d84b4d45c-wcc8n\" (UID: \"c4f573d9-eabb-492d-b4ab-25c64166e91f\") " pod="openstack/dnsmasq-dns-7d84b4d45c-wcc8n" Mar 09 14:30:34 crc kubenswrapper[4722]: I0309 14:30:34.331140 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4f573d9-eabb-492d-b4ab-25c64166e91f-dns-svc\") pod \"dnsmasq-dns-7d84b4d45c-wcc8n\" (UID: \"c4f573d9-eabb-492d-b4ab-25c64166e91f\") " pod="openstack/dnsmasq-dns-7d84b4d45c-wcc8n" Mar 09 14:30:34 crc kubenswrapper[4722]: I0309 14:30:34.331553 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c4f573d9-eabb-492d-b4ab-25c64166e91f-ovsdbserver-sb\") pod \"dnsmasq-dns-7d84b4d45c-wcc8n\" (UID: \"c4f573d9-eabb-492d-b4ab-25c64166e91f\") " pod="openstack/dnsmasq-dns-7d84b4d45c-wcc8n" Mar 09 14:30:34 crc kubenswrapper[4722]: I0309 14:30:34.331622 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfgbv\" (UniqueName: \"kubernetes.io/projected/c4f573d9-eabb-492d-b4ab-25c64166e91f-kube-api-access-xfgbv\") pod \"dnsmasq-dns-7d84b4d45c-wcc8n\" (UID: \"c4f573d9-eabb-492d-b4ab-25c64166e91f\") " pod="openstack/dnsmasq-dns-7d84b4d45c-wcc8n" Mar 09 14:30:34 crc kubenswrapper[4722]: I0309 14:30:34.433774 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4f573d9-eabb-492d-b4ab-25c64166e91f-config\") pod \"dnsmasq-dns-7d84b4d45c-wcc8n\" (UID: \"c4f573d9-eabb-492d-b4ab-25c64166e91f\") " pod="openstack/dnsmasq-dns-7d84b4d45c-wcc8n" Mar 09 14:30:34 crc kubenswrapper[4722]: I0309 14:30:34.434094 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c4f573d9-eabb-492d-b4ab-25c64166e91f-ovsdbserver-nb\") pod \"dnsmasq-dns-7d84b4d45c-wcc8n\" (UID: \"c4f573d9-eabb-492d-b4ab-25c64166e91f\") " pod="openstack/dnsmasq-dns-7d84b4d45c-wcc8n" Mar 09 14:30:34 crc kubenswrapper[4722]: I0309 14:30:34.434145 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c4f573d9-eabb-492d-b4ab-25c64166e91f-dns-swift-storage-0\") pod \"dnsmasq-dns-7d84b4d45c-wcc8n\" (UID: \"c4f573d9-eabb-492d-b4ab-25c64166e91f\") " pod="openstack/dnsmasq-dns-7d84b4d45c-wcc8n" Mar 09 14:30:34 crc kubenswrapper[4722]: I0309 14:30:34.434182 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c4f573d9-eabb-492d-b4ab-25c64166e91f-openstack-edpm-ipam\") pod \"dnsmasq-dns-7d84b4d45c-wcc8n\" (UID: \"c4f573d9-eabb-492d-b4ab-25c64166e91f\") " pod="openstack/dnsmasq-dns-7d84b4d45c-wcc8n" Mar 09 14:30:34 crc kubenswrapper[4722]: I0309 14:30:34.434239 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4f573d9-eabb-492d-b4ab-25c64166e91f-dns-svc\") pod \"dnsmasq-dns-7d84b4d45c-wcc8n\" (UID: \"c4f573d9-eabb-492d-b4ab-25c64166e91f\") " pod="openstack/dnsmasq-dns-7d84b4d45c-wcc8n" Mar 09 14:30:34 crc kubenswrapper[4722]: I0309 14:30:34.434313 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c4f573d9-eabb-492d-b4ab-25c64166e91f-ovsdbserver-sb\") pod \"dnsmasq-dns-7d84b4d45c-wcc8n\" (UID: \"c4f573d9-eabb-492d-b4ab-25c64166e91f\") " pod="openstack/dnsmasq-dns-7d84b4d45c-wcc8n" Mar 09 14:30:34 crc kubenswrapper[4722]: I0309 14:30:34.434371 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfgbv\" (UniqueName: \"kubernetes.io/projected/c4f573d9-eabb-492d-b4ab-25c64166e91f-kube-api-access-xfgbv\") pod \"dnsmasq-dns-7d84b4d45c-wcc8n\" (UID: \"c4f573d9-eabb-492d-b4ab-25c64166e91f\") " pod="openstack/dnsmasq-dns-7d84b4d45c-wcc8n" Mar 09 14:30:34 crc kubenswrapper[4722]: I0309 14:30:34.434776 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4f573d9-eabb-492d-b4ab-25c64166e91f-config\") pod \"dnsmasq-dns-7d84b4d45c-wcc8n\" (UID: \"c4f573d9-eabb-492d-b4ab-25c64166e91f\") " pod="openstack/dnsmasq-dns-7d84b4d45c-wcc8n" Mar 09 14:30:34 crc kubenswrapper[4722]: I0309 14:30:34.435327 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c4f573d9-eabb-492d-b4ab-25c64166e91f-openstack-edpm-ipam\") pod \"dnsmasq-dns-7d84b4d45c-wcc8n\" (UID: \"c4f573d9-eabb-492d-b4ab-25c64166e91f\") " pod="openstack/dnsmasq-dns-7d84b4d45c-wcc8n" Mar 09 14:30:34 crc kubenswrapper[4722]: I0309 14:30:34.435674 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4f573d9-eabb-492d-b4ab-25c64166e91f-dns-svc\") pod \"dnsmasq-dns-7d84b4d45c-wcc8n\" (UID: \"c4f573d9-eabb-492d-b4ab-25c64166e91f\") " pod="openstack/dnsmasq-dns-7d84b4d45c-wcc8n" Mar 09 14:30:34 crc kubenswrapper[4722]: I0309 14:30:34.435893 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c4f573d9-eabb-492d-b4ab-25c64166e91f-dns-swift-storage-0\") pod \"dnsmasq-dns-7d84b4d45c-wcc8n\" (UID: \"c4f573d9-eabb-492d-b4ab-25c64166e91f\") " pod="openstack/dnsmasq-dns-7d84b4d45c-wcc8n" Mar 09 14:30:34 crc kubenswrapper[4722]: I0309 14:30:34.435915 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c4f573d9-eabb-492d-b4ab-25c64166e91f-ovsdbserver-nb\") pod \"dnsmasq-dns-7d84b4d45c-wcc8n\" (UID: \"c4f573d9-eabb-492d-b4ab-25c64166e91f\") " pod="openstack/dnsmasq-dns-7d84b4d45c-wcc8n" Mar 09 14:30:34 crc kubenswrapper[4722]: I0309 14:30:34.436136 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c4f573d9-eabb-492d-b4ab-25c64166e91f-ovsdbserver-sb\") pod \"dnsmasq-dns-7d84b4d45c-wcc8n\" (UID: \"c4f573d9-eabb-492d-b4ab-25c64166e91f\") " pod="openstack/dnsmasq-dns-7d84b4d45c-wcc8n" Mar 09 14:30:34 crc kubenswrapper[4722]: I0309 14:30:34.462961 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfgbv\" (UniqueName: \"kubernetes.io/projected/c4f573d9-eabb-492d-b4ab-25c64166e91f-kube-api-access-xfgbv\") pod \"dnsmasq-dns-7d84b4d45c-wcc8n\" (UID: \"c4f573d9-eabb-492d-b4ab-25c64166e91f\") " pod="openstack/dnsmasq-dns-7d84b4d45c-wcc8n" Mar 09 14:30:34 crc kubenswrapper[4722]: I0309 14:30:34.519140 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d84b4d45c-wcc8n" Mar 09 14:30:35 crc kubenswrapper[4722]: I0309 14:30:35.217278 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-2" Mar 09 14:30:35 crc kubenswrapper[4722]: I0309 14:30:35.370909 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/17ce7999-f86f-45fa-ae07-785f70d797a1-erlang-cookie-secret\") pod \"17ce7999-f86f-45fa-ae07-785f70d797a1\" (UID: \"17ce7999-f86f-45fa-ae07-785f70d797a1\") " Mar 09 14:30:35 crc kubenswrapper[4722]: I0309 14:30:35.372227 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-49d64f63-cdc8-4473-958f-e6d9024e45ab\") pod \"17ce7999-f86f-45fa-ae07-785f70d797a1\" (UID: \"17ce7999-f86f-45fa-ae07-785f70d797a1\") " Mar 09 14:30:35 crc kubenswrapper[4722]: I0309 14:30:35.372319 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/17ce7999-f86f-45fa-ae07-785f70d797a1-rabbitmq-tls\") pod \"17ce7999-f86f-45fa-ae07-785f70d797a1\" (UID: \"17ce7999-f86f-45fa-ae07-785f70d797a1\") " Mar 09 14:30:35 crc kubenswrapper[4722]: I0309 14:30:35.372438 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/17ce7999-f86f-45fa-ae07-785f70d797a1-rabbitmq-plugins\") pod \"17ce7999-f86f-45fa-ae07-785f70d797a1\" (UID: \"17ce7999-f86f-45fa-ae07-785f70d797a1\") " Mar 09 14:30:35 crc kubenswrapper[4722]: I0309 14:30:35.372467 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zpn7f\" (UniqueName: \"kubernetes.io/projected/17ce7999-f86f-45fa-ae07-785f70d797a1-kube-api-access-zpn7f\") pod \"17ce7999-f86f-45fa-ae07-785f70d797a1\" (UID: \"17ce7999-f86f-45fa-ae07-785f70d797a1\") " Mar 09 14:30:35 crc kubenswrapper[4722]: I0309 14:30:35.372531 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/17ce7999-f86f-45fa-ae07-785f70d797a1-pod-info\") pod \"17ce7999-f86f-45fa-ae07-785f70d797a1\" (UID: \"17ce7999-f86f-45fa-ae07-785f70d797a1\") " Mar 09 14:30:35 crc kubenswrapper[4722]: I0309 14:30:35.372596 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/17ce7999-f86f-45fa-ae07-785f70d797a1-config-data\") pod \"17ce7999-f86f-45fa-ae07-785f70d797a1\" (UID: \"17ce7999-f86f-45fa-ae07-785f70d797a1\") " Mar 09 14:30:35 crc kubenswrapper[4722]: I0309 14:30:35.372621 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/17ce7999-f86f-45fa-ae07-785f70d797a1-rabbitmq-erlang-cookie\") pod \"17ce7999-f86f-45fa-ae07-785f70d797a1\" (UID: \"17ce7999-f86f-45fa-ae07-785f70d797a1\") " Mar 09 14:30:35 crc kubenswrapper[4722]: I0309 14:30:35.372636 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/17ce7999-f86f-45fa-ae07-785f70d797a1-server-conf\") pod \"17ce7999-f86f-45fa-ae07-785f70d797a1\" (UID: \"17ce7999-f86f-45fa-ae07-785f70d797a1\") " Mar 09 14:30:35 crc kubenswrapper[4722]: I0309 14:30:35.372655 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/17ce7999-f86f-45fa-ae07-785f70d797a1-plugins-conf\") pod 
\"17ce7999-f86f-45fa-ae07-785f70d797a1\" (UID: \"17ce7999-f86f-45fa-ae07-785f70d797a1\") " Mar 09 14:30:35 crc kubenswrapper[4722]: I0309 14:30:35.372671 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/17ce7999-f86f-45fa-ae07-785f70d797a1-rabbitmq-confd\") pod \"17ce7999-f86f-45fa-ae07-785f70d797a1\" (UID: \"17ce7999-f86f-45fa-ae07-785f70d797a1\") " Mar 09 14:30:35 crc kubenswrapper[4722]: I0309 14:30:35.386082 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17ce7999-f86f-45fa-ae07-785f70d797a1-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "17ce7999-f86f-45fa-ae07-785f70d797a1" (UID: "17ce7999-f86f-45fa-ae07-785f70d797a1"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:30:35 crc kubenswrapper[4722]: I0309 14:30:35.386178 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17ce7999-f86f-45fa-ae07-785f70d797a1-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "17ce7999-f86f-45fa-ae07-785f70d797a1" (UID: "17ce7999-f86f-45fa-ae07-785f70d797a1"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:30:35 crc kubenswrapper[4722]: I0309 14:30:35.386939 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17ce7999-f86f-45fa-ae07-785f70d797a1-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "17ce7999-f86f-45fa-ae07-785f70d797a1" (UID: "17ce7999-f86f-45fa-ae07-785f70d797a1"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:30:35 crc kubenswrapper[4722]: I0309 14:30:35.387007 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17ce7999-f86f-45fa-ae07-785f70d797a1-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "17ce7999-f86f-45fa-ae07-785f70d797a1" (UID: "17ce7999-f86f-45fa-ae07-785f70d797a1"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:30:35 crc kubenswrapper[4722]: I0309 14:30:35.388102 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17ce7999-f86f-45fa-ae07-785f70d797a1-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "17ce7999-f86f-45fa-ae07-785f70d797a1" (UID: "17ce7999-f86f-45fa-ae07-785f70d797a1"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:30:35 crc kubenswrapper[4722]: I0309 14:30:35.391251 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17ce7999-f86f-45fa-ae07-785f70d797a1-kube-api-access-zpn7f" (OuterVolumeSpecName: "kube-api-access-zpn7f") pod "17ce7999-f86f-45fa-ae07-785f70d797a1" (UID: "17ce7999-f86f-45fa-ae07-785f70d797a1"). InnerVolumeSpecName "kube-api-access-zpn7f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:30:35 crc kubenswrapper[4722]: I0309 14:30:35.391671 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/17ce7999-f86f-45fa-ae07-785f70d797a1-pod-info" (OuterVolumeSpecName: "pod-info") pod "17ce7999-f86f-45fa-ae07-785f70d797a1" (UID: "17ce7999-f86f-45fa-ae07-785f70d797a1"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 09 14:30:35 crc kubenswrapper[4722]: I0309 14:30:35.433748 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-49d64f63-cdc8-4473-958f-e6d9024e45ab" (OuterVolumeSpecName: "persistence") pod "17ce7999-f86f-45fa-ae07-785f70d797a1" (UID: "17ce7999-f86f-45fa-ae07-785f70d797a1"). InnerVolumeSpecName "pvc-49d64f63-cdc8-4473-958f-e6d9024e45ab". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 09 14:30:35 crc kubenswrapper[4722]: I0309 14:30:35.466297 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17ce7999-f86f-45fa-ae07-785f70d797a1-config-data" (OuterVolumeSpecName: "config-data") pod "17ce7999-f86f-45fa-ae07-785f70d797a1" (UID: "17ce7999-f86f-45fa-ae07-785f70d797a1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:30:35 crc kubenswrapper[4722]: I0309 14:30:35.467840 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17ce7999-f86f-45fa-ae07-785f70d797a1-server-conf" (OuterVolumeSpecName: "server-conf") pod "17ce7999-f86f-45fa-ae07-785f70d797a1" (UID: "17ce7999-f86f-45fa-ae07-785f70d797a1"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:30:35 crc kubenswrapper[4722]: I0309 14:30:35.476174 4722 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/17ce7999-f86f-45fa-ae07-785f70d797a1-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 09 14:30:35 crc kubenswrapper[4722]: I0309 14:30:35.476246 4722 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-49d64f63-cdc8-4473-958f-e6d9024e45ab\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-49d64f63-cdc8-4473-958f-e6d9024e45ab\") on node \"crc\" " Mar 09 14:30:35 crc kubenswrapper[4722]: I0309 14:30:35.476260 4722 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/17ce7999-f86f-45fa-ae07-785f70d797a1-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 09 14:30:35 crc kubenswrapper[4722]: I0309 14:30:35.476269 4722 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/17ce7999-f86f-45fa-ae07-785f70d797a1-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 09 14:30:35 crc kubenswrapper[4722]: I0309 14:30:35.476279 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zpn7f\" (UniqueName: \"kubernetes.io/projected/17ce7999-f86f-45fa-ae07-785f70d797a1-kube-api-access-zpn7f\") on node \"crc\" DevicePath \"\"" Mar 09 14:30:35 crc kubenswrapper[4722]: I0309 14:30:35.476286 4722 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/17ce7999-f86f-45fa-ae07-785f70d797a1-pod-info\") on node \"crc\" DevicePath \"\"" Mar 09 14:30:35 crc kubenswrapper[4722]: I0309 14:30:35.476294 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/17ce7999-f86f-45fa-ae07-785f70d797a1-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 14:30:35 crc kubenswrapper[4722]: I0309 14:30:35.476304 4722 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/17ce7999-f86f-45fa-ae07-785f70d797a1-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 09 14:30:35 crc kubenswrapper[4722]: I0309 14:30:35.476312 4722 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/17ce7999-f86f-45fa-ae07-785f70d797a1-server-conf\") on node \"crc\" DevicePath \"\"" Mar 09 14:30:35 crc kubenswrapper[4722]: I0309 14:30:35.476320 4722 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/17ce7999-f86f-45fa-ae07-785f70d797a1-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 09 14:30:35 crc kubenswrapper[4722]: I0309 14:30:35.523260 4722 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 09 14:30:35 crc kubenswrapper[4722]: I0309 14:30:35.524183 4722 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-49d64f63-cdc8-4473-958f-e6d9024e45ab" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-49d64f63-cdc8-4473-958f-e6d9024e45ab") on node "crc" Mar 09 14:30:35 crc kubenswrapper[4722]: I0309 14:30:35.540180 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17ce7999-f86f-45fa-ae07-785f70d797a1-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "17ce7999-f86f-45fa-ae07-785f70d797a1" (UID: "17ce7999-f86f-45fa-ae07-785f70d797a1"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:30:35 crc kubenswrapper[4722]: I0309 14:30:35.579993 4722 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/17ce7999-f86f-45fa-ae07-785f70d797a1-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 09 14:30:35 crc kubenswrapper[4722]: I0309 14:30:35.580252 4722 reconciler_common.go:293] "Volume detached for volume \"pvc-49d64f63-cdc8-4473-958f-e6d9024e45ab\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-49d64f63-cdc8-4473-958f-e6d9024e45ab\") on node \"crc\" DevicePath \"\"" Mar 09 14:30:35 crc kubenswrapper[4722]: I0309 14:30:35.661131 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"17ce7999-f86f-45fa-ae07-785f70d797a1","Type":"ContainerDied","Data":"5bf93b12a7d905614f0f42183e3fc57fef84de58c9b870b252fb21aa59eeed03"} Mar 09 14:30:35 crc kubenswrapper[4722]: I0309 14:30:35.661450 4722 scope.go:117] "RemoveContainer" containerID="269fab5abed3a65c1a525f2ce6fb7275926a484f18e769dddcc4268e414b80cd" Mar 09 14:30:35 crc kubenswrapper[4722]: I0309 14:30:35.661167 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-2" Mar 09 14:30:35 crc kubenswrapper[4722]: I0309 14:30:35.726614 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-2"] Mar 09 14:30:35 crc kubenswrapper[4722]: I0309 14:30:35.753304 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-2"] Mar 09 14:30:35 crc kubenswrapper[4722]: I0309 14:30:35.773339 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-2"] Mar 09 14:30:35 crc kubenswrapper[4722]: E0309 14:30:35.773970 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17ce7999-f86f-45fa-ae07-785f70d797a1" containerName="rabbitmq" Mar 09 14:30:35 crc kubenswrapper[4722]: I0309 14:30:35.773993 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="17ce7999-f86f-45fa-ae07-785f70d797a1" containerName="rabbitmq" Mar 09 14:30:35 crc kubenswrapper[4722]: E0309 14:30:35.774028 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17ce7999-f86f-45fa-ae07-785f70d797a1" containerName="setup-container" Mar 09 14:30:35 crc kubenswrapper[4722]: I0309 14:30:35.774037 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="17ce7999-f86f-45fa-ae07-785f70d797a1" containerName="setup-container" Mar 09 14:30:35 crc kubenswrapper[4722]: I0309 14:30:35.774339 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="17ce7999-f86f-45fa-ae07-785f70d797a1" containerName="rabbitmq" Mar 09 14:30:35 crc kubenswrapper[4722]: I0309 14:30:35.775904 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-2" Mar 09 14:30:35 crc kubenswrapper[4722]: I0309 14:30:35.793762 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-2"] Mar 09 14:30:35 crc kubenswrapper[4722]: I0309 14:30:35.911070 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/fabf84f5-0f35-4400-b612-235235a21f3c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"fabf84f5-0f35-4400-b612-235235a21f3c\") " pod="openstack/rabbitmq-server-2" Mar 09 14:30:35 crc kubenswrapper[4722]: I0309 14:30:35.911419 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/fabf84f5-0f35-4400-b612-235235a21f3c-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"fabf84f5-0f35-4400-b612-235235a21f3c\") " pod="openstack/rabbitmq-server-2" Mar 09 14:30:35 crc kubenswrapper[4722]: I0309 14:30:35.911468 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/fabf84f5-0f35-4400-b612-235235a21f3c-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"fabf84f5-0f35-4400-b612-235235a21f3c\") " pod="openstack/rabbitmq-server-2" Mar 09 14:30:35 crc kubenswrapper[4722]: I0309 14:30:35.911541 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fabf84f5-0f35-4400-b612-235235a21f3c-config-data\") pod \"rabbitmq-server-2\" (UID: \"fabf84f5-0f35-4400-b612-235235a21f3c\") " pod="openstack/rabbitmq-server-2" Mar 09 14:30:35 crc kubenswrapper[4722]: I0309 14:30:35.911569 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"pod-info\" (UniqueName: \"kubernetes.io/downward-api/fabf84f5-0f35-4400-b612-235235a21f3c-pod-info\") pod \"rabbitmq-server-2\" (UID: \"fabf84f5-0f35-4400-b612-235235a21f3c\") " pod="openstack/rabbitmq-server-2" Mar 09 14:30:35 crc kubenswrapper[4722]: I0309 14:30:35.911598 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/fabf84f5-0f35-4400-b612-235235a21f3c-server-conf\") pod \"rabbitmq-server-2\" (UID: \"fabf84f5-0f35-4400-b612-235235a21f3c\") " pod="openstack/rabbitmq-server-2" Mar 09 14:30:35 crc kubenswrapper[4722]: I0309 14:30:35.911613 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/fabf84f5-0f35-4400-b612-235235a21f3c-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"fabf84f5-0f35-4400-b612-235235a21f3c\") " pod="openstack/rabbitmq-server-2" Mar 09 14:30:35 crc kubenswrapper[4722]: I0309 14:30:35.911642 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/fabf84f5-0f35-4400-b612-235235a21f3c-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"fabf84f5-0f35-4400-b612-235235a21f3c\") " pod="openstack/rabbitmq-server-2" Mar 09 14:30:35 crc kubenswrapper[4722]: I0309 14:30:35.911662 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/fabf84f5-0f35-4400-b612-235235a21f3c-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"fabf84f5-0f35-4400-b612-235235a21f3c\") " pod="openstack/rabbitmq-server-2" Mar 09 14:30:35 crc kubenswrapper[4722]: I0309 14:30:35.911699 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-49d64f63-cdc8-4473-958f-e6d9024e45ab\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-49d64f63-cdc8-4473-958f-e6d9024e45ab\") pod \"rabbitmq-server-2\" (UID: \"fabf84f5-0f35-4400-b612-235235a21f3c\") " pod="openstack/rabbitmq-server-2" Mar 09 14:30:35 crc kubenswrapper[4722]: I0309 14:30:35.911730 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nlgn\" (UniqueName: \"kubernetes.io/projected/fabf84f5-0f35-4400-b612-235235a21f3c-kube-api-access-8nlgn\") pod \"rabbitmq-server-2\" (UID: \"fabf84f5-0f35-4400-b612-235235a21f3c\") " pod="openstack/rabbitmq-server-2" Mar 09 14:30:36 crc kubenswrapper[4722]: I0309 14:30:36.014223 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/fabf84f5-0f35-4400-b612-235235a21f3c-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"fabf84f5-0f35-4400-b612-235235a21f3c\") " pod="openstack/rabbitmq-server-2" Mar 09 14:30:36 crc kubenswrapper[4722]: I0309 14:30:36.014271 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/fabf84f5-0f35-4400-b612-235235a21f3c-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"fabf84f5-0f35-4400-b612-235235a21f3c\") " pod="openstack/rabbitmq-server-2" Mar 09 14:30:36 crc kubenswrapper[4722]: I0309 14:30:36.014320 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-49d64f63-cdc8-4473-958f-e6d9024e45ab\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-49d64f63-cdc8-4473-958f-e6d9024e45ab\") pod \"rabbitmq-server-2\" (UID: \"fabf84f5-0f35-4400-b612-235235a21f3c\") " pod="openstack/rabbitmq-server-2" Mar 09 14:30:36 crc kubenswrapper[4722]: I0309 14:30:36.014359 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nlgn\" (UniqueName: \"kubernetes.io/projected/fabf84f5-0f35-4400-b612-235235a21f3c-kube-api-access-8nlgn\") pod \"rabbitmq-server-2\" (UID: \"fabf84f5-0f35-4400-b612-235235a21f3c\") " pod="openstack/rabbitmq-server-2" Mar 09 14:30:36 crc kubenswrapper[4722]: I0309 14:30:36.014427 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/fabf84f5-0f35-4400-b612-235235a21f3c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"fabf84f5-0f35-4400-b612-235235a21f3c\") " pod="openstack/rabbitmq-server-2" Mar 09 14:30:36 crc kubenswrapper[4722]: I0309 14:30:36.014452 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/fabf84f5-0f35-4400-b612-235235a21f3c-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"fabf84f5-0f35-4400-b612-235235a21f3c\") " pod="openstack/rabbitmq-server-2" Mar 09 14:30:36 crc kubenswrapper[4722]: I0309 14:30:36.014486 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/fabf84f5-0f35-4400-b612-235235a21f3c-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"fabf84f5-0f35-4400-b612-235235a21f3c\") " pod="openstack/rabbitmq-server-2" Mar 09 14:30:36 crc kubenswrapper[4722]: I0309 14:30:36.014552 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fabf84f5-0f35-4400-b612-235235a21f3c-config-data\") pod \"rabbitmq-server-2\" (UID: \"fabf84f5-0f35-4400-b612-235235a21f3c\") " pod="openstack/rabbitmq-server-2" Mar 09 14:30:36 crc kubenswrapper[4722]: I0309 14:30:36.014570 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/fabf84f5-0f35-4400-b612-235235a21f3c-pod-info\") pod \"rabbitmq-server-2\" (UID: \"fabf84f5-0f35-4400-b612-235235a21f3c\") " pod="openstack/rabbitmq-server-2" Mar 09 14:30:36 crc kubenswrapper[4722]: I0309 14:30:36.014599 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/fabf84f5-0f35-4400-b612-235235a21f3c-server-conf\") pod \"rabbitmq-server-2\" (UID: \"fabf84f5-0f35-4400-b612-235235a21f3c\") " pod="openstack/rabbitmq-server-2" Mar 09 14:30:36 crc kubenswrapper[4722]: I0309 14:30:36.014612 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/fabf84f5-0f35-4400-b612-235235a21f3c-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"fabf84f5-0f35-4400-b612-235235a21f3c\") " pod="openstack/rabbitmq-server-2" Mar 09 14:30:36 crc kubenswrapper[4722]: I0309 14:30:36.015098 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/fabf84f5-0f35-4400-b612-235235a21f3c-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"fabf84f5-0f35-4400-b612-235235a21f3c\") " pod="openstack/rabbitmq-server-2" Mar 09 14:30:36 crc 
kubenswrapper[4722]: I0309 14:30:36.015376 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/fabf84f5-0f35-4400-b612-235235a21f3c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"fabf84f5-0f35-4400-b612-235235a21f3c\") " pod="openstack/rabbitmq-server-2" Mar 09 14:30:36 crc kubenswrapper[4722]: I0309 14:30:36.015836 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fabf84f5-0f35-4400-b612-235235a21f3c-config-data\") pod \"rabbitmq-server-2\" (UID: \"fabf84f5-0f35-4400-b612-235235a21f3c\") " pod="openstack/rabbitmq-server-2" Mar 09 14:30:36 crc kubenswrapper[4722]: I0309 14:30:36.016221 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/fabf84f5-0f35-4400-b612-235235a21f3c-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"fabf84f5-0f35-4400-b612-235235a21f3c\") " pod="openstack/rabbitmq-server-2" Mar 09 14:30:36 crc kubenswrapper[4722]: I0309 14:30:36.018367 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/fabf84f5-0f35-4400-b612-235235a21f3c-server-conf\") pod \"rabbitmq-server-2\" (UID: \"fabf84f5-0f35-4400-b612-235235a21f3c\") " pod="openstack/rabbitmq-server-2" Mar 09 14:30:36 crc kubenswrapper[4722]: I0309 14:30:36.018490 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 09 14:30:36 crc kubenswrapper[4722]: I0309 14:30:36.018528 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-49d64f63-cdc8-4473-958f-e6d9024e45ab\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-49d64f63-cdc8-4473-958f-e6d9024e45ab\") pod \"rabbitmq-server-2\" (UID: \"fabf84f5-0f35-4400-b612-235235a21f3c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/0c27981484d45a0452f6bc7b25565dc834ac0db89d430ab9341cec8b8dfe57f8/globalmount\"" pod="openstack/rabbitmq-server-2" Mar 09 14:30:36 crc kubenswrapper[4722]: I0309 14:30:36.023950 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/fabf84f5-0f35-4400-b612-235235a21f3c-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"fabf84f5-0f35-4400-b612-235235a21f3c\") " pod="openstack/rabbitmq-server-2" Mar 09 14:30:36 crc kubenswrapper[4722]: I0309 14:30:36.024705 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/fabf84f5-0f35-4400-b612-235235a21f3c-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"fabf84f5-0f35-4400-b612-235235a21f3c\") " pod="openstack/rabbitmq-server-2" Mar 09 14:30:36 crc kubenswrapper[4722]: I0309 14:30:36.029280 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/fabf84f5-0f35-4400-b612-235235a21f3c-pod-info\") pod \"rabbitmq-server-2\" (UID: \"fabf84f5-0f35-4400-b612-235235a21f3c\") " pod="openstack/rabbitmq-server-2" Mar 09 14:30:36 crc kubenswrapper[4722]: I0309 14:30:36.031388 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nlgn\" (UniqueName: \"kubernetes.io/projected/fabf84f5-0f35-4400-b612-235235a21f3c-kube-api-access-8nlgn\") pod 
\"rabbitmq-server-2\" (UID: \"fabf84f5-0f35-4400-b612-235235a21f3c\") " pod="openstack/rabbitmq-server-2" Mar 09 14:30:36 crc kubenswrapper[4722]: I0309 14:30:36.041588 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/fabf84f5-0f35-4400-b612-235235a21f3c-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"fabf84f5-0f35-4400-b612-235235a21f3c\") " pod="openstack/rabbitmq-server-2" Mar 09 14:30:36 crc kubenswrapper[4722]: I0309 14:30:36.110732 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-49d64f63-cdc8-4473-958f-e6d9024e45ab\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-49d64f63-cdc8-4473-958f-e6d9024e45ab\") pod \"rabbitmq-server-2\" (UID: \"fabf84f5-0f35-4400-b612-235235a21f3c\") " pod="openstack/rabbitmq-server-2" Mar 09 14:30:36 crc kubenswrapper[4722]: I0309 14:30:36.138181 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-2" Mar 09 14:30:36 crc kubenswrapper[4722]: I0309 14:30:36.161848 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17ce7999-f86f-45fa-ae07-785f70d797a1" path="/var/lib/kubelet/pods/17ce7999-f86f-45fa-ae07-785f70d797a1/volumes" Mar 09 14:30:38 crc kubenswrapper[4722]: I0309 14:30:38.255654 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="1c98e541-4b72-465d-8799-89e8c9791c3e" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.133:5671: connect: connection refused" Mar 09 14:30:42 crc kubenswrapper[4722]: E0309 14:30:42.197962 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested" Mar 09 14:30:42 crc kubenswrapper[4722]: E0309 14:30:42.198620 4722 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested" Mar 09 14:30:42 crc kubenswrapper[4722]: E0309 14:30:42.198748 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n67h56fh665hdbh5bch57ch5d6hch567h694hcfh644h67h94h5bch697h95h5d4h7fh664h54dh64fh55h59bhb6h589h98h8dh5h585h5bh5b8q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ff2j9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 /var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(e4a22f8c-ed38-47cf-8238-baf804f573a1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 14:30:43 crc kubenswrapper[4722]: I0309 14:30:43.263586 4722 scope.go:117] "RemoveContainer" containerID="2ed1c447dbb8dfe73b7c01fa28b0e8e47079d52fb2e0e72560df64042052f747" Mar 09 14:30:43 crc kubenswrapper[4722]: E0309 14:30:43.266862 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested" Mar 09 14:30:43 crc kubenswrapper[4722]: E0309 14:30:43.266904 4722 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested" Mar 09 14:30:43 crc kubenswrapper[4722]: E0309 14:30:43.267052 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:heat-db-sync,Image:quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested,Command:[/bin/bash],Args:[-c /usr/bin/heat-manage --config-dir /etc/heat/heat.conf.d 
db_sync],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/heat/heat.conf.d/00-default.conf,SubPath:00-default.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/heat/heat.conf.d/01-custom.conf,SubPath:01-custom.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p4t68,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42418,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42418,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-db-sync-cf85b_openstack(c83de432-8e6a-45bf-9395-215f28461090): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 09 14:30:43 crc kubenswrapper[4722]: E0309 14:30:43.268235 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/heat-db-sync-cf85b" podUID="c83de432-8e6a-45bf-9395-215f28461090" Mar 09 14:30:43 crc kubenswrapper[4722]: I0309 14:30:43.473228 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 09 14:30:43 crc kubenswrapper[4722]: I0309 14:30:43.607234 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1c98e541-4b72-465d-8799-89e8c9791c3e-rabbitmq-tls\") pod \"1c98e541-4b72-465d-8799-89e8c9791c3e\" (UID: \"1c98e541-4b72-465d-8799-89e8c9791c3e\") " Mar 09 14:30:43 crc kubenswrapper[4722]: I0309 14:30:43.608653 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-35d3f19e-3c36-4038-9906-dc1e23e87cf8\") pod \"1c98e541-4b72-465d-8799-89e8c9791c3e\" (UID: \"1c98e541-4b72-465d-8799-89e8c9791c3e\") " Mar 09 14:30:43 crc kubenswrapper[4722]: I0309 14:30:43.608704 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1c98e541-4b72-465d-8799-89e8c9791c3e-rabbitmq-erlang-cookie\") pod \"1c98e541-4b72-465d-8799-89e8c9791c3e\" (UID: \"1c98e541-4b72-465d-8799-89e8c9791c3e\") " Mar 09 14:30:43 crc kubenswrapper[4722]: I0309 14:30:43.608767 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1c98e541-4b72-465d-8799-89e8c9791c3e-rabbitmq-plugins\") pod \"1c98e541-4b72-465d-8799-89e8c9791c3e\" (UID: \"1c98e541-4b72-465d-8799-89e8c9791c3e\") " Mar 09 14:30:43 crc kubenswrapper[4722]: I0309 14:30:43.608920 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1c98e541-4b72-465d-8799-89e8c9791c3e-erlang-cookie-secret\") pod \"1c98e541-4b72-465d-8799-89e8c9791c3e\" (UID: \"1c98e541-4b72-465d-8799-89e8c9791c3e\") " Mar 09 14:30:43 crc kubenswrapper[4722]: I0309 14:30:43.608969 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1c98e541-4b72-465d-8799-89e8c9791c3e-server-conf\") pod \"1c98e541-4b72-465d-8799-89e8c9791c3e\" (UID: \"1c98e541-4b72-465d-8799-89e8c9791c3e\") " Mar 09 14:30:43 crc kubenswrapper[4722]: I0309 14:30:43.609062 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1c98e541-4b72-465d-8799-89e8c9791c3e-config-data\") pod \"1c98e541-4b72-465d-8799-89e8c9791c3e\" (UID: \"1c98e541-4b72-465d-8799-89e8c9791c3e\") " Mar 09 14:30:43 crc kubenswrapper[4722]: I0309 14:30:43.609095 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1c98e541-4b72-465d-8799-89e8c9791c3e-plugins-conf\") pod \"1c98e541-4b72-465d-8799-89e8c9791c3e\" (UID: \"1c98e541-4b72-465d-8799-89e8c9791c3e\") " Mar 09 14:30:43 crc kubenswrapper[4722]: I0309 14:30:43.609164 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1c98e541-4b72-465d-8799-89e8c9791c3e-pod-info\") pod \"1c98e541-4b72-465d-8799-89e8c9791c3e\" (UID: \"1c98e541-4b72-465d-8799-89e8c9791c3e\") " Mar 09 14:30:43 crc kubenswrapper[4722]: I0309 14:30:43.609398 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xptdn\" (UniqueName: \"kubernetes.io/projected/1c98e541-4b72-465d-8799-89e8c9791c3e-kube-api-access-xptdn\") pod 
\"1c98e541-4b72-465d-8799-89e8c9791c3e\" (UID: \"1c98e541-4b72-465d-8799-89e8c9791c3e\") " Mar 09 14:30:43 crc kubenswrapper[4722]: I0309 14:30:43.609492 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1c98e541-4b72-465d-8799-89e8c9791c3e-rabbitmq-confd\") pod \"1c98e541-4b72-465d-8799-89e8c9791c3e\" (UID: \"1c98e541-4b72-465d-8799-89e8c9791c3e\") " Mar 09 14:30:43 crc kubenswrapper[4722]: I0309 14:30:43.616446 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c98e541-4b72-465d-8799-89e8c9791c3e-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "1c98e541-4b72-465d-8799-89e8c9791c3e" (UID: "1c98e541-4b72-465d-8799-89e8c9791c3e"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:30:43 crc kubenswrapper[4722]: I0309 14:30:43.616557 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c98e541-4b72-465d-8799-89e8c9791c3e-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "1c98e541-4b72-465d-8799-89e8c9791c3e" (UID: "1c98e541-4b72-465d-8799-89e8c9791c3e"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:30:43 crc kubenswrapper[4722]: I0309 14:30:43.617304 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c98e541-4b72-465d-8799-89e8c9791c3e-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "1c98e541-4b72-465d-8799-89e8c9791c3e" (UID: "1c98e541-4b72-465d-8799-89e8c9791c3e"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:30:43 crc kubenswrapper[4722]: I0309 14:30:43.619548 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c98e541-4b72-465d-8799-89e8c9791c3e-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "1c98e541-4b72-465d-8799-89e8c9791c3e" (UID: "1c98e541-4b72-465d-8799-89e8c9791c3e"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:30:43 crc kubenswrapper[4722]: I0309 14:30:43.623136 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c98e541-4b72-465d-8799-89e8c9791c3e-kube-api-access-xptdn" (OuterVolumeSpecName: "kube-api-access-xptdn") pod "1c98e541-4b72-465d-8799-89e8c9791c3e" (UID: "1c98e541-4b72-465d-8799-89e8c9791c3e"). InnerVolumeSpecName "kube-api-access-xptdn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:30:43 crc kubenswrapper[4722]: I0309 14:30:43.626849 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/1c98e541-4b72-465d-8799-89e8c9791c3e-pod-info" (OuterVolumeSpecName: "pod-info") pod "1c98e541-4b72-465d-8799-89e8c9791c3e" (UID: "1c98e541-4b72-465d-8799-89e8c9791c3e"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 09 14:30:43 crc kubenswrapper[4722]: I0309 14:30:43.649230 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c98e541-4b72-465d-8799-89e8c9791c3e-config-data" (OuterVolumeSpecName: "config-data") pod "1c98e541-4b72-465d-8799-89e8c9791c3e" (UID: "1c98e541-4b72-465d-8799-89e8c9791c3e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:30:43 crc kubenswrapper[4722]: I0309 14:30:43.657514 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c98e541-4b72-465d-8799-89e8c9791c3e-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "1c98e541-4b72-465d-8799-89e8c9791c3e" (UID: "1c98e541-4b72-465d-8799-89e8c9791c3e"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:30:43 crc kubenswrapper[4722]: I0309 14:30:43.662997 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-35d3f19e-3c36-4038-9906-dc1e23e87cf8" (OuterVolumeSpecName: "persistence") pod "1c98e541-4b72-465d-8799-89e8c9791c3e" (UID: "1c98e541-4b72-465d-8799-89e8c9791c3e"). InnerVolumeSpecName "pvc-35d3f19e-3c36-4038-9906-dc1e23e87cf8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 09 14:30:43 crc kubenswrapper[4722]: I0309 14:30:43.712504 4722 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1c98e541-4b72-465d-8799-89e8c9791c3e-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 09 14:30:43 crc kubenswrapper[4722]: I0309 14:30:43.712535 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1c98e541-4b72-465d-8799-89e8c9791c3e-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 14:30:43 crc kubenswrapper[4722]: I0309 14:30:43.712544 4722 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1c98e541-4b72-465d-8799-89e8c9791c3e-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 09 14:30:43 crc kubenswrapper[4722]: I0309 14:30:43.712553 4722 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1c98e541-4b72-465d-8799-89e8c9791c3e-pod-info\") on node \"crc\" DevicePath \"\"" Mar 09 14:30:43 crc kubenswrapper[4722]: I0309 14:30:43.712561 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xptdn\" (UniqueName: \"kubernetes.io/projected/1c98e541-4b72-465d-8799-89e8c9791c3e-kube-api-access-xptdn\") on node \"crc\" DevicePath \"\"" Mar 09 14:30:43 crc kubenswrapper[4722]: I0309 14:30:43.712570 4722 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1c98e541-4b72-465d-8799-89e8c9791c3e-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 09 14:30:43 crc kubenswrapper[4722]: I0309 14:30:43.712597 4722 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-35d3f19e-3c36-4038-9906-dc1e23e87cf8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-35d3f19e-3c36-4038-9906-dc1e23e87cf8\") on node \"crc\" " Mar 09 14:30:43 crc kubenswrapper[4722]: I0309 14:30:43.712606 4722 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1c98e541-4b72-465d-8799-89e8c9791c3e-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 09 14:30:43 crc kubenswrapper[4722]: I0309 14:30:43.712615 4722 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1c98e541-4b72-465d-8799-89e8c9791c3e-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 09 14:30:43 crc kubenswrapper[4722]: I0309 14:30:43.713502 4722 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c98e541-4b72-465d-8799-89e8c9791c3e-server-conf" (OuterVolumeSpecName: "server-conf") pod "1c98e541-4b72-465d-8799-89e8c9791c3e" (UID: "1c98e541-4b72-465d-8799-89e8c9791c3e"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:30:43 crc kubenswrapper[4722]: I0309 14:30:43.739568 4722 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 09 14:30:43 crc kubenswrapper[4722]: I0309 14:30:43.739778 4722 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-35d3f19e-3c36-4038-9906-dc1e23e87cf8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-35d3f19e-3c36-4038-9906-dc1e23e87cf8") on node "crc" Mar 09 14:30:43 crc kubenswrapper[4722]: I0309 14:30:43.780963 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"1c98e541-4b72-465d-8799-89e8c9791c3e","Type":"ContainerDied","Data":"07c39314f6fb722f15d56d57d5a45a08afff090e92aa0fd3ad5d75a4f3acf3b8"} Mar 09 14:30:43 crc kubenswrapper[4722]: I0309 14:30:43.781009 4722 scope.go:117] "RemoveContainer" containerID="491a32d2b4f4528a2d9ac9ed69a4bc0a4f2ea0173c8712dc6ab51834b9601f38" Mar 09 14:30:43 crc kubenswrapper[4722]: I0309 14:30:43.781129 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 09 14:30:43 crc kubenswrapper[4722]: E0309 14:30:43.784612 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested\\\"\"" pod="openstack/heat-db-sync-cf85b" podUID="c83de432-8e6a-45bf-9395-215f28461090" Mar 09 14:30:43 crc kubenswrapper[4722]: I0309 14:30:43.798242 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c98e541-4b72-465d-8799-89e8c9791c3e-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "1c98e541-4b72-465d-8799-89e8c9791c3e" (UID: "1c98e541-4b72-465d-8799-89e8c9791c3e"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:30:43 crc kubenswrapper[4722]: I0309 14:30:43.814789 4722 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1c98e541-4b72-465d-8799-89e8c9791c3e-server-conf\") on node \"crc\" DevicePath \"\"" Mar 09 14:30:43 crc kubenswrapper[4722]: I0309 14:30:43.814823 4722 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1c98e541-4b72-465d-8799-89e8c9791c3e-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 09 14:30:43 crc kubenswrapper[4722]: I0309 14:30:43.814836 4722 reconciler_common.go:293] "Volume detached for volume \"pvc-35d3f19e-3c36-4038-9906-dc1e23e87cf8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-35d3f19e-3c36-4038-9906-dc1e23e87cf8\") on node \"crc\" DevicePath \"\"" Mar 09 14:30:43 crc kubenswrapper[4722]: I0309 14:30:43.817538 4722 scope.go:117] "RemoveContainer" containerID="83b4abfa07a9f0cdc86b0978dab11bb5c16ae3ddce1e3930e50e2705f0aa51fa" Mar 09 14:30:43 crc kubenswrapper[4722]: I0309 14:30:43.907650 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-2"] Mar 09 14:30:43 crc kubenswrapper[4722]: W0309 14:30:43.948191 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc4f573d9_eabb_492d_b4ab_25c64166e91f.slice/crio-d7dfdaaedce5c451ed235c04d86fca1988baf3541eef23830d8404500151e945 WatchSource:0}: Error finding container d7dfdaaedce5c451ed235c04d86fca1988baf3541eef23830d8404500151e945: Status 404 returned error can't find the container with id d7dfdaaedce5c451ed235c04d86fca1988baf3541eef23830d8404500151e945 Mar 09 14:30:43 crc kubenswrapper[4722]: I0309 14:30:43.950264 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d84b4d45c-wcc8n"] Mar 09 14:30:44 crc kubenswrapper[4722]: I0309 14:30:44.132699 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 09 14:30:44 crc kubenswrapper[4722]: I0309 14:30:44.172817 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 09 14:30:44 crc kubenswrapper[4722]: I0309 14:30:44.180610 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 09 14:30:44 crc kubenswrapper[4722]: E0309 14:30:44.181555 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c98e541-4b72-465d-8799-89e8c9791c3e" containerName="setup-container" Mar 09 14:30:44 crc kubenswrapper[4722]: I0309 14:30:44.181581 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c98e541-4b72-465d-8799-89e8c9791c3e" containerName="setup-container" Mar 09 14:30:44 crc kubenswrapper[4722]: E0309 14:30:44.181614 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c98e541-4b72-465d-8799-89e8c9791c3e" containerName="rabbitmq" Mar 09 14:30:44 crc kubenswrapper[4722]: I0309 14:30:44.181620 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c98e541-4b72-465d-8799-89e8c9791c3e" containerName="rabbitmq" Mar 09 14:30:44 crc kubenswrapper[4722]: I0309 14:30:44.181861 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c98e541-4b72-465d-8799-89e8c9791c3e" containerName="rabbitmq" Mar 09 14:30:44 crc kubenswrapper[4722]: I0309 14:30:44.183559 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 09 14:30:44 crc kubenswrapper[4722]: I0309 14:30:44.186520 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 09 14:30:44 crc kubenswrapper[4722]: I0309 14:30:44.186653 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 09 14:30:44 crc kubenswrapper[4722]: I0309 14:30:44.186794 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 09 14:30:44 crc kubenswrapper[4722]: I0309 14:30:44.186903 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 09 14:30:44 crc kubenswrapper[4722]: I0309 14:30:44.187007 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-sj5h5" Mar 09 14:30:44 crc kubenswrapper[4722]: I0309 14:30:44.187121 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 09 14:30:44 crc kubenswrapper[4722]: I0309 14:30:44.189601 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 09 14:30:44 crc kubenswrapper[4722]: I0309 14:30:44.224094 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 09 14:30:44 crc kubenswrapper[4722]: I0309 14:30:44.329273 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8b6ee542-26e6-4126-8566-a34f7621d104-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8b6ee542-26e6-4126-8566-a34f7621d104\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 14:30:44 crc kubenswrapper[4722]: I0309 14:30:44.329343 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8b6ee542-26e6-4126-8566-a34f7621d104-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"8b6ee542-26e6-4126-8566-a34f7621d104\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 14:30:44 crc kubenswrapper[4722]: I0309 14:30:44.329387 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8b6ee542-26e6-4126-8566-a34f7621d104-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"8b6ee542-26e6-4126-8566-a34f7621d104\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 14:30:44 crc kubenswrapper[4722]: I0309 14:30:44.329434 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8b6ee542-26e6-4126-8566-a34f7621d104-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"8b6ee542-26e6-4126-8566-a34f7621d104\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 14:30:44 crc kubenswrapper[4722]: I0309 14:30:44.329458 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8b6ee542-26e6-4126-8566-a34f7621d104-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8b6ee542-26e6-4126-8566-a34f7621d104\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 14:30:44 crc kubenswrapper[4722]: I0309 14:30:44.329619 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8b6ee542-26e6-4126-8566-a34f7621d104-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"8b6ee542-26e6-4126-8566-a34f7621d104\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 14:30:44 crc kubenswrapper[4722]: I0309 14:30:44.329858 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjjz9\" (UniqueName: \"kubernetes.io/projected/8b6ee542-26e6-4126-8566-a34f7621d104-kube-api-access-wjjz9\") pod \"rabbitmq-cell1-server-0\" (UID: \"8b6ee542-26e6-4126-8566-a34f7621d104\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 14:30:44 crc kubenswrapper[4722]: I0309 14:30:44.329883 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8b6ee542-26e6-4126-8566-a34f7621d104-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"8b6ee542-26e6-4126-8566-a34f7621d104\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 14:30:44 crc kubenswrapper[4722]: I0309 14:30:44.329906 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8b6ee542-26e6-4126-8566-a34f7621d104-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"8b6ee542-26e6-4126-8566-a34f7621d104\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 14:30:44 crc kubenswrapper[4722]: I0309 14:30:44.329930 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8b6ee542-26e6-4126-8566-a34f7621d104-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"8b6ee542-26e6-4126-8566-a34f7621d104\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 14:30:44 crc kubenswrapper[4722]: I0309 14:30:44.329968 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-35d3f19e-3c36-4038-9906-dc1e23e87cf8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-35d3f19e-3c36-4038-9906-dc1e23e87cf8\") pod \"rabbitmq-cell1-server-0\" (UID: \"8b6ee542-26e6-4126-8566-a34f7621d104\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 14:30:44 crc kubenswrapper[4722]: I0309 14:30:44.438850 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8b6ee542-26e6-4126-8566-a34f7621d104-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"8b6ee542-26e6-4126-8566-a34f7621d104\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 14:30:44 crc kubenswrapper[4722]: I0309 14:30:44.439225 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8b6ee542-26e6-4126-8566-a34f7621d104-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8b6ee542-26e6-4126-8566-a34f7621d104\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 14:30:44 crc kubenswrapper[4722]: I0309 14:30:44.439566 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8b6ee542-26e6-4126-8566-a34f7621d104-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"8b6ee542-26e6-4126-8566-a34f7621d104\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 14:30:44 crc kubenswrapper[4722]: 
I0309 14:30:44.439635 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjjz9\" (UniqueName: \"kubernetes.io/projected/8b6ee542-26e6-4126-8566-a34f7621d104-kube-api-access-wjjz9\") pod \"rabbitmq-cell1-server-0\" (UID: \"8b6ee542-26e6-4126-8566-a34f7621d104\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 14:30:44 crc kubenswrapper[4722]: I0309 14:30:44.439677 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8b6ee542-26e6-4126-8566-a34f7621d104-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"8b6ee542-26e6-4126-8566-a34f7621d104\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 14:30:44 crc kubenswrapper[4722]: I0309 14:30:44.439686 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8b6ee542-26e6-4126-8566-a34f7621d104-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"8b6ee542-26e6-4126-8566-a34f7621d104\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 14:30:44 crc kubenswrapper[4722]: I0309 14:30:44.439961 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8b6ee542-26e6-4126-8566-a34f7621d104-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"8b6ee542-26e6-4126-8566-a34f7621d104\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 14:30:44 crc kubenswrapper[4722]: I0309 14:30:44.440025 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8b6ee542-26e6-4126-8566-a34f7621d104-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"8b6ee542-26e6-4126-8566-a34f7621d104\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 14:30:44 crc kubenswrapper[4722]: I0309 14:30:44.440103 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-35d3f19e-3c36-4038-9906-dc1e23e87cf8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-35d3f19e-3c36-4038-9906-dc1e23e87cf8\") pod \"rabbitmq-cell1-server-0\" (UID: \"8b6ee542-26e6-4126-8566-a34f7621d104\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 14:30:44 crc kubenswrapper[4722]: I0309 14:30:44.440230 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8b6ee542-26e6-4126-8566-a34f7621d104-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"8b6ee542-26e6-4126-8566-a34f7621d104\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 14:30:44 crc kubenswrapper[4722]: I0309 14:30:44.440295 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8b6ee542-26e6-4126-8566-a34f7621d104-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8b6ee542-26e6-4126-8566-a34f7621d104\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 14:30:44 crc kubenswrapper[4722]: I0309 14:30:44.440354 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8b6ee542-26e6-4126-8566-a34f7621d104-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"8b6ee542-26e6-4126-8566-a34f7621d104\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 14:30:44 crc kubenswrapper[4722]: I0309 14:30:44.440409 4722 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8b6ee542-26e6-4126-8566-a34f7621d104-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"8b6ee542-26e6-4126-8566-a34f7621d104\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 14:30:44 crc kubenswrapper[4722]: I0309 14:30:44.443182 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8b6ee542-26e6-4126-8566-a34f7621d104-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8b6ee542-26e6-4126-8566-a34f7621d104\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 14:30:44 crc kubenswrapper[4722]: I0309 14:30:44.443864 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8b6ee542-26e6-4126-8566-a34f7621d104-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8b6ee542-26e6-4126-8566-a34f7621d104\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 14:30:44 crc kubenswrapper[4722]: I0309 14:30:44.444304 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8b6ee542-26e6-4126-8566-a34f7621d104-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"8b6ee542-26e6-4126-8566-a34f7621d104\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 14:30:44 crc kubenswrapper[4722]: I0309 14:30:44.444776 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8b6ee542-26e6-4126-8566-a34f7621d104-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"8b6ee542-26e6-4126-8566-a34f7621d104\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 14:30:44 crc kubenswrapper[4722]: I0309 14:30:44.444807 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8b6ee542-26e6-4126-8566-a34f7621d104-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"8b6ee542-26e6-4126-8566-a34f7621d104\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 14:30:44 crc kubenswrapper[4722]: I0309 14:30:44.450660 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8b6ee542-26e6-4126-8566-a34f7621d104-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"8b6ee542-26e6-4126-8566-a34f7621d104\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 14:30:44 crc kubenswrapper[4722]: I0309 14:30:44.459634 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 09 14:30:44 crc kubenswrapper[4722]: I0309 14:30:44.459679 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-35d3f19e-3c36-4038-9906-dc1e23e87cf8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-35d3f19e-3c36-4038-9906-dc1e23e87cf8\") pod \"rabbitmq-cell1-server-0\" (UID: \"8b6ee542-26e6-4126-8566-a34f7621d104\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/bd40df8b19fd4b7f91f0427928b43bf8fc8992041a81fccf4003d9f0fcaf3986/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Mar 09 14:30:44 crc kubenswrapper[4722]: I0309 14:30:44.466999 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8b6ee542-26e6-4126-8566-a34f7621d104-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"8b6ee542-26e6-4126-8566-a34f7621d104\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 14:30:44 crc kubenswrapper[4722]: I0309 14:30:44.471502 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjjz9\" (UniqueName: \"kubernetes.io/projected/8b6ee542-26e6-4126-8566-a34f7621d104-kube-api-access-wjjz9\") pod \"rabbitmq-cell1-server-0\" (UID: \"8b6ee542-26e6-4126-8566-a34f7621d104\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 14:30:44 crc kubenswrapper[4722]: I0309 14:30:44.532769 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-35d3f19e-3c36-4038-9906-dc1e23e87cf8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-35d3f19e-3c36-4038-9906-dc1e23e87cf8\") pod \"rabbitmq-cell1-server-0\" (UID: \"8b6ee542-26e6-4126-8566-a34f7621d104\") " pod="openstack/rabbitmq-cell1-server-0" Mar 09 14:30:44 crc kubenswrapper[4722]: I0309 14:30:44.796679 4722 generic.go:334] "Generic (PLEG): container finished" podID="c4f573d9-eabb-492d-b4ab-25c64166e91f" containerID="3c7b4b5e492fcdaa97a5c34bfb3b3859d5c76e030d8c09cf584f324750986f00" exitCode=0 Mar 09 14:30:44 crc kubenswrapper[4722]: I0309 14:30:44.796982 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d84b4d45c-wcc8n" event={"ID":"c4f573d9-eabb-492d-b4ab-25c64166e91f","Type":"ContainerDied","Data":"3c7b4b5e492fcdaa97a5c34bfb3b3859d5c76e030d8c09cf584f324750986f00"} Mar 09 14:30:44 crc kubenswrapper[4722]: I0309 14:30:44.797012 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d84b4d45c-wcc8n" event={"ID":"c4f573d9-eabb-492d-b4ab-25c64166e91f","Type":"ContainerStarted","Data":"d7dfdaaedce5c451ed235c04d86fca1988baf3541eef23830d8404500151e945"} Mar 09 14:30:44 crc kubenswrapper[4722]: I0309 14:30:44.803524 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4a22f8c-ed38-47cf-8238-baf804f573a1","Type":"ContainerStarted","Data":"85414c6aa3d412ee02c488bb1d4b37cbe6610e47d25cb2cf02f72ee5191cff5d"} Mar 09 14:30:44 crc kubenswrapper[4722]: I0309 14:30:44.804626 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"fabf84f5-0f35-4400-b612-235235a21f3c","Type":"ContainerStarted","Data":"e0059184f8c67c7026ab4c78931151b68986c68dec94e1033596a8932a8491eb"} Mar 09 14:30:44 crc kubenswrapper[4722]: I0309 14:30:44.811812 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 09 14:30:45 crc kubenswrapper[4722]: I0309 14:30:45.352447 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 09 14:30:45 crc kubenswrapper[4722]: I0309 14:30:45.817093 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8b6ee542-26e6-4126-8566-a34f7621d104","Type":"ContainerStarted","Data":"80e9425ad286c3a67e84af2ad6b17a51de6f102353df05e0ad37663100e8ee50"} Mar 09 14:30:45 crc kubenswrapper[4722]: I0309 14:30:45.818895 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4a22f8c-ed38-47cf-8238-baf804f573a1","Type":"ContainerStarted","Data":"6287639f5110871948ab340f5fc565b05341b6ee17f9cfceba7413ea012ddad7"} Mar 09 14:30:45 crc kubenswrapper[4722]: I0309 14:30:45.820249 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"fabf84f5-0f35-4400-b612-235235a21f3c","Type":"ContainerStarted","Data":"17b99e8421029ea591318c368e93dea2f5a965ed586feebfbb3dc7acd8da617b"} Mar 09 14:30:45 crc kubenswrapper[4722]: I0309 14:30:45.823797 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d84b4d45c-wcc8n" event={"ID":"c4f573d9-eabb-492d-b4ab-25c64166e91f","Type":"ContainerStarted","Data":"daf014a3b39097408cc997070b424c8e3c13da8d994063d4f2655ccc78a1539e"} Mar 09 14:30:45 crc kubenswrapper[4722]: I0309 14:30:45.823965 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7d84b4d45c-wcc8n" Mar 09 14:30:45 crc kubenswrapper[4722]: I0309 14:30:45.922462 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7d84b4d45c-wcc8n" podStartSLOduration=11.922432498 podStartE2EDuration="11.922432498s" podCreationTimestamp="2026-03-09 14:30:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:30:45.910649902 +0000 UTC m=+1686.466218478" watchObservedRunningTime="2026-03-09 14:30:45.922432498 +0000 UTC m=+1686.478001084" Mar 09 14:30:46 crc kubenswrapper[4722]: I0309 14:30:46.149192 4722 scope.go:117] "RemoveContainer" containerID="86a6dcdf7dcf4d48b8bdfb71d9a2c30b89f235700c424312c7e16580a86021d6" Mar 09 14:30:46 crc kubenswrapper[4722]: E0309 14:30:46.150183 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 14:30:46 crc kubenswrapper[4722]: I0309 14:30:46.168025 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c98e541-4b72-465d-8799-89e8c9791c3e" path="/var/lib/kubelet/pods/1c98e541-4b72-465d-8799-89e8c9791c3e/volumes" Mar 09 14:30:47 crc kubenswrapper[4722]: E0309 14:30:47.521085 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="e4a22f8c-ed38-47cf-8238-baf804f573a1" Mar 09 14:30:47 crc kubenswrapper[4722]: I0309 14:30:47.861187 4722 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4a22f8c-ed38-47cf-8238-baf804f573a1","Type":"ContainerStarted","Data":"2227fa96630e194da4addb6cfc3a43b10b936a7d8ca77af3b21c6ddd6d69615d"} Mar 09 14:30:47 crc kubenswrapper[4722]: I0309 14:30:47.861704 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 09 14:30:47 crc kubenswrapper[4722]: E0309 14:30:47.863167 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested\\\"\"" pod="openstack/ceilometer-0" podUID="e4a22f8c-ed38-47cf-8238-baf804f573a1" Mar 09 14:30:47 crc kubenswrapper[4722]: I0309 14:30:47.863637 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8b6ee542-26e6-4126-8566-a34f7621d104","Type":"ContainerStarted","Data":"166693fcef0f0e5cf4fb139e3aa52ab299afb35da080f1e1a96788f921624861"} Mar 09 14:30:48 crc kubenswrapper[4722]: E0309 14:30:48.877381 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested\\\"\"" pod="openstack/ceilometer-0" podUID="e4a22f8c-ed38-47cf-8238-baf804f573a1" Mar 09 14:30:54 crc kubenswrapper[4722]: I0309 14:30:54.520330 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7d84b4d45c-wcc8n" Mar 09 14:30:54 crc kubenswrapper[4722]: I0309 14:30:54.606065 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-k62sg"] Mar 09 14:30:54 crc kubenswrapper[4722]: I0309 14:30:54.606348 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b7bbf7cf9-k62sg" podUID="33c8ed47-f7d5-485b-b413-8c80c3a5b276" containerName="dnsmasq-dns" containerID="cri-o://c9b3f6e72a54b5597dc47652c56be9a758f995531c5fbf3fb7265ef6524329f9" gracePeriod=10 Mar 09 14:30:54 crc kubenswrapper[4722]: I0309 14:30:54.760406 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6f6df4f56c-8q8pp"] Mar 09 14:30:54 crc kubenswrapper[4722]: I0309 14:30:54.763273 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f6df4f56c-8q8pp" Mar 09 14:30:54 crc kubenswrapper[4722]: I0309 14:30:54.784333 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f6df4f56c-8q8pp"] Mar 09 14:30:54 crc kubenswrapper[4722]: I0309 14:30:54.908769 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/868763d5-a256-477e-b82e-dd85f1e05dea-dns-swift-storage-0\") pod \"dnsmasq-dns-6f6df4f56c-8q8pp\" (UID: \"868763d5-a256-477e-b82e-dd85f1e05dea\") " pod="openstack/dnsmasq-dns-6f6df4f56c-8q8pp" Mar 09 14:30:54 crc kubenswrapper[4722]: I0309 14:30:54.908894 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/868763d5-a256-477e-b82e-dd85f1e05dea-ovsdbserver-nb\") pod \"dnsmasq-dns-6f6df4f56c-8q8pp\" (UID: \"868763d5-a256-477e-b82e-dd85f1e05dea\") " pod="openstack/dnsmasq-dns-6f6df4f56c-8q8pp" Mar 09 14:30:54 crc kubenswrapper[4722]: I0309 14:30:54.909102 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/868763d5-a256-477e-b82e-dd85f1e05dea-openstack-edpm-ipam\") pod \"dnsmasq-dns-6f6df4f56c-8q8pp\" (UID: \"868763d5-a256-477e-b82e-dd85f1e05dea\") " pod="openstack/dnsmasq-dns-6f6df4f56c-8q8pp" Mar 09 14:30:54 crc kubenswrapper[4722]: I0309 14:30:54.909374 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c428v\" (UniqueName: \"kubernetes.io/projected/868763d5-a256-477e-b82e-dd85f1e05dea-kube-api-access-c428v\") pod \"dnsmasq-dns-6f6df4f56c-8q8pp\" (UID: \"868763d5-a256-477e-b82e-dd85f1e05dea\") " pod="openstack/dnsmasq-dns-6f6df4f56c-8q8pp" Mar 09 14:30:54 crc kubenswrapper[4722]: I0309 14:30:54.909407 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/868763d5-a256-477e-b82e-dd85f1e05dea-config\") pod \"dnsmasq-dns-6f6df4f56c-8q8pp\" (UID: \"868763d5-a256-477e-b82e-dd85f1e05dea\") " pod="openstack/dnsmasq-dns-6f6df4f56c-8q8pp" Mar 09 14:30:54 crc kubenswrapper[4722]: I0309 14:30:54.909436 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/868763d5-a256-477e-b82e-dd85f1e05dea-dns-svc\") pod \"dnsmasq-dns-6f6df4f56c-8q8pp\" (UID: \"868763d5-a256-477e-b82e-dd85f1e05dea\") " pod="openstack/dnsmasq-dns-6f6df4f56c-8q8pp" Mar 09 14:30:54 crc kubenswrapper[4722]: I0309 14:30:54.909466 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/868763d5-a256-477e-b82e-dd85f1e05dea-ovsdbserver-sb\") pod \"dnsmasq-dns-6f6df4f56c-8q8pp\" (UID: \"868763d5-a256-477e-b82e-dd85f1e05dea\") " pod="openstack/dnsmasq-dns-6f6df4f56c-8q8pp" Mar 09 14:30:54 crc kubenswrapper[4722]: I0309 14:30:54.963433 4722 generic.go:334] "Generic (PLEG): container finished" podID="33c8ed47-f7d5-485b-b413-8c80c3a5b276" containerID="c9b3f6e72a54b5597dc47652c56be9a758f995531c5fbf3fb7265ef6524329f9" exitCode=0 Mar 09 14:30:54 crc kubenswrapper[4722]: I0309 14:30:54.963505 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7bbf7cf9-k62sg" 
event={"ID":"33c8ed47-f7d5-485b-b413-8c80c3a5b276","Type":"ContainerDied","Data":"c9b3f6e72a54b5597dc47652c56be9a758f995531c5fbf3fb7265ef6524329f9"} Mar 09 14:30:55 crc kubenswrapper[4722]: I0309 14:30:55.011563 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/868763d5-a256-477e-b82e-dd85f1e05dea-openstack-edpm-ipam\") pod \"dnsmasq-dns-6f6df4f56c-8q8pp\" (UID: \"868763d5-a256-477e-b82e-dd85f1e05dea\") " pod="openstack/dnsmasq-dns-6f6df4f56c-8q8pp" Mar 09 14:30:55 crc kubenswrapper[4722]: I0309 14:30:55.011672 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c428v\" (UniqueName: \"kubernetes.io/projected/868763d5-a256-477e-b82e-dd85f1e05dea-kube-api-access-c428v\") pod \"dnsmasq-dns-6f6df4f56c-8q8pp\" (UID: \"868763d5-a256-477e-b82e-dd85f1e05dea\") " pod="openstack/dnsmasq-dns-6f6df4f56c-8q8pp" Mar 09 14:30:55 crc kubenswrapper[4722]: I0309 14:30:55.011706 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/868763d5-a256-477e-b82e-dd85f1e05dea-config\") pod \"dnsmasq-dns-6f6df4f56c-8q8pp\" (UID: \"868763d5-a256-477e-b82e-dd85f1e05dea\") " pod="openstack/dnsmasq-dns-6f6df4f56c-8q8pp" Mar 09 14:30:55 crc kubenswrapper[4722]: I0309 14:30:55.011741 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/868763d5-a256-477e-b82e-dd85f1e05dea-dns-svc\") pod \"dnsmasq-dns-6f6df4f56c-8q8pp\" (UID: \"868763d5-a256-477e-b82e-dd85f1e05dea\") " pod="openstack/dnsmasq-dns-6f6df4f56c-8q8pp" Mar 09 14:30:55 crc kubenswrapper[4722]: I0309 14:30:55.011785 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/868763d5-a256-477e-b82e-dd85f1e05dea-ovsdbserver-sb\") pod \"dnsmasq-dns-6f6df4f56c-8q8pp\" (UID: \"868763d5-a256-477e-b82e-dd85f1e05dea\") " pod="openstack/dnsmasq-dns-6f6df4f56c-8q8pp" Mar 09 14:30:55 crc kubenswrapper[4722]: I0309 14:30:55.011898 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/868763d5-a256-477e-b82e-dd85f1e05dea-dns-swift-storage-0\") pod \"dnsmasq-dns-6f6df4f56c-8q8pp\" (UID: \"868763d5-a256-477e-b82e-dd85f1e05dea\") " pod="openstack/dnsmasq-dns-6f6df4f56c-8q8pp" Mar 09 14:30:55 crc kubenswrapper[4722]: I0309 14:30:55.012026 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/868763d5-a256-477e-b82e-dd85f1e05dea-ovsdbserver-nb\") pod \"dnsmasq-dns-6f6df4f56c-8q8pp\" (UID: \"868763d5-a256-477e-b82e-dd85f1e05dea\") " pod="openstack/dnsmasq-dns-6f6df4f56c-8q8pp" Mar 09 14:30:55 crc kubenswrapper[4722]: I0309 14:30:55.013803 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/868763d5-a256-477e-b82e-dd85f1e05dea-ovsdbserver-nb\") pod \"dnsmasq-dns-6f6df4f56c-8q8pp\" (UID: \"868763d5-a256-477e-b82e-dd85f1e05dea\") " pod="openstack/dnsmasq-dns-6f6df4f56c-8q8pp" Mar 09 14:30:55 crc kubenswrapper[4722]: I0309 14:30:55.014240 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/868763d5-a256-477e-b82e-dd85f1e05dea-openstack-edpm-ipam\") pod \"dnsmasq-dns-6f6df4f56c-8q8pp\" 
(UID: \"868763d5-a256-477e-b82e-dd85f1e05dea\") " pod="openstack/dnsmasq-dns-6f6df4f56c-8q8pp" Mar 09 14:30:55 crc kubenswrapper[4722]: I0309 14:30:55.014551 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/868763d5-a256-477e-b82e-dd85f1e05dea-config\") pod \"dnsmasq-dns-6f6df4f56c-8q8pp\" (UID: \"868763d5-a256-477e-b82e-dd85f1e05dea\") " pod="openstack/dnsmasq-dns-6f6df4f56c-8q8pp" Mar 09 14:30:55 crc kubenswrapper[4722]: I0309 14:30:55.014754 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/868763d5-a256-477e-b82e-dd85f1e05dea-dns-svc\") pod \"dnsmasq-dns-6f6df4f56c-8q8pp\" (UID: \"868763d5-a256-477e-b82e-dd85f1e05dea\") " pod="openstack/dnsmasq-dns-6f6df4f56c-8q8pp" Mar 09 14:30:55 crc kubenswrapper[4722]: I0309 14:30:55.014967 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/868763d5-a256-477e-b82e-dd85f1e05dea-ovsdbserver-sb\") pod \"dnsmasq-dns-6f6df4f56c-8q8pp\" (UID: \"868763d5-a256-477e-b82e-dd85f1e05dea\") " pod="openstack/dnsmasq-dns-6f6df4f56c-8q8pp" Mar 09 14:30:55 crc kubenswrapper[4722]: I0309 14:30:55.015391 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/868763d5-a256-477e-b82e-dd85f1e05dea-dns-swift-storage-0\") pod \"dnsmasq-dns-6f6df4f56c-8q8pp\" (UID: \"868763d5-a256-477e-b82e-dd85f1e05dea\") " pod="openstack/dnsmasq-dns-6f6df4f56c-8q8pp" Mar 09 14:30:55 crc kubenswrapper[4722]: I0309 14:30:55.043097 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c428v\" (UniqueName: \"kubernetes.io/projected/868763d5-a256-477e-b82e-dd85f1e05dea-kube-api-access-c428v\") pod \"dnsmasq-dns-6f6df4f56c-8q8pp\" (UID: \"868763d5-a256-477e-b82e-dd85f1e05dea\") " pod="openstack/dnsmasq-dns-6f6df4f56c-8q8pp" Mar 09 14:30:55 crc kubenswrapper[4722]: I0309 14:30:55.122967 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f6df4f56c-8q8pp" Mar 09 14:30:55 crc kubenswrapper[4722]: I0309 14:30:55.357919 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7bbf7cf9-k62sg" Mar 09 14:30:55 crc kubenswrapper[4722]: I0309 14:30:55.420584 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/33c8ed47-f7d5-485b-b413-8c80c3a5b276-ovsdbserver-nb\") pod \"33c8ed47-f7d5-485b-b413-8c80c3a5b276\" (UID: \"33c8ed47-f7d5-485b-b413-8c80c3a5b276\") " Mar 09 14:30:55 crc kubenswrapper[4722]: I0309 14:30:55.420652 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmc68\" (UniqueName: \"kubernetes.io/projected/33c8ed47-f7d5-485b-b413-8c80c3a5b276-kube-api-access-hmc68\") pod \"33c8ed47-f7d5-485b-b413-8c80c3a5b276\" (UID: \"33c8ed47-f7d5-485b-b413-8c80c3a5b276\") " Mar 09 14:30:55 crc kubenswrapper[4722]: I0309 14:30:55.420709 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/33c8ed47-f7d5-485b-b413-8c80c3a5b276-dns-swift-storage-0\") pod \"33c8ed47-f7d5-485b-b413-8c80c3a5b276\" (UID: \"33c8ed47-f7d5-485b-b413-8c80c3a5b276\") " Mar 09 14:30:55 crc kubenswrapper[4722]: I0309 14:30:55.420903 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33c8ed47-f7d5-485b-b413-8c80c3a5b276-dns-svc\") pod \"33c8ed47-f7d5-485b-b413-8c80c3a5b276\" (UID: \"33c8ed47-f7d5-485b-b413-8c80c3a5b276\") " Mar 09 14:30:55 crc kubenswrapper[4722]: I0309 14:30:55.420981 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/33c8ed47-f7d5-485b-b413-8c80c3a5b276-ovsdbserver-sb\") pod \"33c8ed47-f7d5-485b-b413-8c80c3a5b276\" (UID: \"33c8ed47-f7d5-485b-b413-8c80c3a5b276\") " Mar 09 14:30:55 crc kubenswrapper[4722]: I0309 14:30:55.421044 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33c8ed47-f7d5-485b-b413-8c80c3a5b276-config\") pod \"33c8ed47-f7d5-485b-b413-8c80c3a5b276\" (UID: \"33c8ed47-f7d5-485b-b413-8c80c3a5b276\") " Mar 09 14:30:55 crc kubenswrapper[4722]: I0309 14:30:55.424854 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33c8ed47-f7d5-485b-b413-8c80c3a5b276-kube-api-access-hmc68" (OuterVolumeSpecName: "kube-api-access-hmc68") pod "33c8ed47-f7d5-485b-b413-8c80c3a5b276" (UID: "33c8ed47-f7d5-485b-b413-8c80c3a5b276"). InnerVolumeSpecName "kube-api-access-hmc68". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:30:55 crc kubenswrapper[4722]: I0309 14:30:55.522123 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33c8ed47-f7d5-485b-b413-8c80c3a5b276-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "33c8ed47-f7d5-485b-b413-8c80c3a5b276" (UID: "33c8ed47-f7d5-485b-b413-8c80c3a5b276"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:30:55 crc kubenswrapper[4722]: I0309 14:30:55.527784 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmc68\" (UniqueName: \"kubernetes.io/projected/33c8ed47-f7d5-485b-b413-8c80c3a5b276-kube-api-access-hmc68\") on node \"crc\" DevicePath \"\"" Mar 09 14:30:55 crc kubenswrapper[4722]: I0309 14:30:55.527810 4722 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/33c8ed47-f7d5-485b-b413-8c80c3a5b276-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 09 14:30:55 crc kubenswrapper[4722]: I0309 14:30:55.530855 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33c8ed47-f7d5-485b-b413-8c80c3a5b276-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "33c8ed47-f7d5-485b-b413-8c80c3a5b276" (UID: "33c8ed47-f7d5-485b-b413-8c80c3a5b276"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:30:55 crc kubenswrapper[4722]: I0309 14:30:55.539773 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33c8ed47-f7d5-485b-b413-8c80c3a5b276-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "33c8ed47-f7d5-485b-b413-8c80c3a5b276" (UID: "33c8ed47-f7d5-485b-b413-8c80c3a5b276"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:30:55 crc kubenswrapper[4722]: I0309 14:30:55.550878 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33c8ed47-f7d5-485b-b413-8c80c3a5b276-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "33c8ed47-f7d5-485b-b413-8c80c3a5b276" (UID: "33c8ed47-f7d5-485b-b413-8c80c3a5b276"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:30:55 crc kubenswrapper[4722]: I0309 14:30:55.558880 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33c8ed47-f7d5-485b-b413-8c80c3a5b276-config" (OuterVolumeSpecName: "config") pod "33c8ed47-f7d5-485b-b413-8c80c3a5b276" (UID: "33c8ed47-f7d5-485b-b413-8c80c3a5b276"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:30:55 crc kubenswrapper[4722]: I0309 14:30:55.632317 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/33c8ed47-f7d5-485b-b413-8c80c3a5b276-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 09 14:30:55 crc kubenswrapper[4722]: I0309 14:30:55.634377 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33c8ed47-f7d5-485b-b413-8c80c3a5b276-config\") on node \"crc\" DevicePath \"\"" Mar 09 14:30:55 crc kubenswrapper[4722]: I0309 14:30:55.634425 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/33c8ed47-f7d5-485b-b413-8c80c3a5b276-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 09 14:30:55 crc kubenswrapper[4722]: I0309 14:30:55.634436 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33c8ed47-f7d5-485b-b413-8c80c3a5b276-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 14:30:55 crc kubenswrapper[4722]: I0309 14:30:55.702870 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f6df4f56c-8q8pp"] Mar 09 14:30:55 crc kubenswrapper[4722]: W0309 14:30:55.713238 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod868763d5_a256_477e_b82e_dd85f1e05dea.slice/crio-ccd62714ece2ed5428ab449c26e6d1aef26235b3d348964698fac9955bdcb0c4 WatchSource:0}: Error finding container ccd62714ece2ed5428ab449c26e6d1aef26235b3d348964698fac9955bdcb0c4: Status 404 returned error can't find the container with id ccd62714ece2ed5428ab449c26e6d1aef26235b3d348964698fac9955bdcb0c4 Mar 09 14:30:55 crc kubenswrapper[4722]: I0309 14:30:55.987546 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7bbf7cf9-k62sg" event={"ID":"33c8ed47-f7d5-485b-b413-8c80c3a5b276","Type":"ContainerDied","Data":"38da260493b48d3c384d023462fac6c726901df598b1f1d4cfb644db64839d27"} Mar 09 14:30:55 crc kubenswrapper[4722]: I0309 14:30:55.987777 4722 scope.go:117] "RemoveContainer" containerID="c9b3f6e72a54b5597dc47652c56be9a758f995531c5fbf3fb7265ef6524329f9" Mar 09 14:30:55 crc kubenswrapper[4722]: I0309 14:30:55.987998 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7bbf7cf9-k62sg" Mar 09 14:30:56 crc kubenswrapper[4722]: I0309 14:30:56.001126 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-cf85b" event={"ID":"c83de432-8e6a-45bf-9395-215f28461090","Type":"ContainerStarted","Data":"034af84defe4bf2ca490931a683a7b4b7d083656e7d3a918c374e1b08704433a"} Mar 09 14:30:56 crc kubenswrapper[4722]: I0309 14:30:56.009346 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f6df4f56c-8q8pp" event={"ID":"868763d5-a256-477e-b82e-dd85f1e05dea","Type":"ContainerStarted","Data":"ccd62714ece2ed5428ab449c26e6d1aef26235b3d348964698fac9955bdcb0c4"} Mar 09 14:30:56 crc kubenswrapper[4722]: I0309 14:30:56.042659 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-cf85b" podStartSLOduration=1.7283943609999999 podStartE2EDuration="39.042643884s" podCreationTimestamp="2026-03-09 14:30:17 +0000 UTC" firstStartedPulling="2026-03-09 14:30:18.088714585 +0000 UTC m=+1658.644283161" lastFinishedPulling="2026-03-09 14:30:55.402964118 +0000 UTC m=+1695.958532684" observedRunningTime="2026-03-09 14:30:56.024567175 +0000 UTC m=+1696.580135751" watchObservedRunningTime="2026-03-09 14:30:56.042643884 +0000 UTC m=+1696.598212460" Mar 09 14:30:56 crc kubenswrapper[4722]: I0309 14:30:56.056450 4722 scope.go:117] "RemoveContainer" containerID="3e5c937c1579aa97f73dcf384a185ff4c132b3765ccff35b93d939c8ea6a1f88" Mar 09 14:30:56 crc kubenswrapper[4722]: I0309 14:30:56.085267 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-k62sg"] Mar 09 14:30:56 crc kubenswrapper[4722]: I0309 14:30:56.104038 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-k62sg"] Mar 09 14:30:56 crc kubenswrapper[4722]: I0309 14:30:56.167924 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33c8ed47-f7d5-485b-b413-8c80c3a5b276" path="/var/lib/kubelet/pods/33c8ed47-f7d5-485b-b413-8c80c3a5b276/volumes" Mar 09 14:30:57 crc kubenswrapper[4722]: I0309 14:30:57.027726 4722 generic.go:334] "Generic (PLEG): container finished" podID="868763d5-a256-477e-b82e-dd85f1e05dea" containerID="be0363b2291ef68cfafca91e689f85b7c9fc0ad439e4476cd1238c51b79836cc" exitCode=0 Mar 09 14:30:57 crc kubenswrapper[4722]: I0309 14:30:57.027803 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f6df4f56c-8q8pp" event={"ID":"868763d5-a256-477e-b82e-dd85f1e05dea","Type":"ContainerDied","Data":"be0363b2291ef68cfafca91e689f85b7c9fc0ad439e4476cd1238c51b79836cc"} Mar 09 14:30:58 crc kubenswrapper[4722]: I0309 14:30:58.040291 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f6df4f56c-8q8pp" event={"ID":"868763d5-a256-477e-b82e-dd85f1e05dea","Type":"ContainerStarted","Data":"06e45cc3b45db2a89046dda333ea769615ac04b6f9a890af15bdef9fe4ec9b3f"} Mar 09 14:30:58 crc kubenswrapper[4722]: I0309 14:30:58.040558 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6f6df4f56c-8q8pp" Mar 09 14:30:58 crc kubenswrapper[4722]: I0309 14:30:58.070581 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6f6df4f56c-8q8pp" podStartSLOduration=4.070559463 podStartE2EDuration="4.070559463s" podCreationTimestamp="2026-03-09 14:30:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-09 14:30:58.059718684 +0000 UTC m=+1698.615287260" watchObservedRunningTime="2026-03-09 14:30:58.070559463 +0000 UTC m=+1698.626128039" Mar 09 14:30:59 crc kubenswrapper[4722]: I0309 14:30:59.070537 4722 generic.go:334] "Generic (PLEG): container finished" podID="c83de432-8e6a-45bf-9395-215f28461090" containerID="034af84defe4bf2ca490931a683a7b4b7d083656e7d3a918c374e1b08704433a" exitCode=0 Mar 09 14:30:59 crc kubenswrapper[4722]: I0309 14:30:59.070634 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-cf85b" event={"ID":"c83de432-8e6a-45bf-9395-215f28461090","Type":"ContainerDied","Data":"034af84defe4bf2ca490931a683a7b4b7d083656e7d3a918c374e1b08704433a"} Mar 09 14:30:59 crc kubenswrapper[4722]: I0309 14:30:59.150723 4722 scope.go:117] "RemoveContainer" containerID="86a6dcdf7dcf4d48b8bdfb71d9a2c30b89f235700c424312c7e16580a86021d6" Mar 09 14:30:59 crc kubenswrapper[4722]: E0309 14:30:59.151265 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 14:30:59 crc kubenswrapper[4722]: I0309 14:30:59.166136 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 09 14:31:00 crc kubenswrapper[4722]: I0309 14:31:00.084895 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4a22f8c-ed38-47cf-8238-baf804f573a1","Type":"ContainerStarted","Data":"cd582a254af464cea392ee9b3f2a8bbe909916d04f0053279129bca6b32f8102"} Mar 09 14:31:00 crc kubenswrapper[4722]: I0309 14:31:00.131813 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.164723526 podStartE2EDuration="39.131790752s" podCreationTimestamp="2026-03-09 14:30:21 +0000 UTC" firstStartedPulling="2026-03-09 14:30:22.761454682 +0000 UTC m=+1663.317023258" lastFinishedPulling="2026-03-09 14:30:59.728521908 +0000 UTC m=+1700.284090484" observedRunningTime="2026-03-09 14:31:00.123567275 +0000 UTC m=+1700.679135851" watchObservedRunningTime="2026-03-09 14:31:00.131790752 +0000 UTC m=+1700.687359328" Mar 09 14:31:00 crc kubenswrapper[4722]: I0309 14:31:00.671621 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-cf85b" Mar 09 14:31:00 crc kubenswrapper[4722]: I0309 14:31:00.773817 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c83de432-8e6a-45bf-9395-215f28461090-config-data\") pod \"c83de432-8e6a-45bf-9395-215f28461090\" (UID: \"c83de432-8e6a-45bf-9395-215f28461090\") " Mar 09 14:31:00 crc kubenswrapper[4722]: I0309 14:31:00.773982 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c83de432-8e6a-45bf-9395-215f28461090-combined-ca-bundle\") pod \"c83de432-8e6a-45bf-9395-215f28461090\" (UID: \"c83de432-8e6a-45bf-9395-215f28461090\") " Mar 09 14:31:00 crc kubenswrapper[4722]: I0309 14:31:00.774099 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4t68\" (UniqueName: \"kubernetes.io/projected/c83de432-8e6a-45bf-9395-215f28461090-kube-api-access-p4t68\") pod \"c83de432-8e6a-45bf-9395-215f28461090\" (UID: \"c83de432-8e6a-45bf-9395-215f28461090\") " Mar 09 14:31:00 crc kubenswrapper[4722]: I0309 14:31:00.800982 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c83de432-8e6a-45bf-9395-215f28461090-kube-api-access-p4t68" (OuterVolumeSpecName: "kube-api-access-p4t68") pod "c83de432-8e6a-45bf-9395-215f28461090" (UID: "c83de432-8e6a-45bf-9395-215f28461090"). InnerVolumeSpecName "kube-api-access-p4t68". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:31:00 crc kubenswrapper[4722]: I0309 14:31:00.827342 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c83de432-8e6a-45bf-9395-215f28461090-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c83de432-8e6a-45bf-9395-215f28461090" (UID: "c83de432-8e6a-45bf-9395-215f28461090"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:31:00 crc kubenswrapper[4722]: I0309 14:31:00.877992 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c83de432-8e6a-45bf-9395-215f28461090-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:31:00 crc kubenswrapper[4722]: I0309 14:31:00.878023 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4t68\" (UniqueName: \"kubernetes.io/projected/c83de432-8e6a-45bf-9395-215f28461090-kube-api-access-p4t68\") on node \"crc\" DevicePath \"\"" Mar 09 14:31:00 crc kubenswrapper[4722]: I0309 14:31:00.908396 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c83de432-8e6a-45bf-9395-215f28461090-config-data" (OuterVolumeSpecName: "config-data") pod "c83de432-8e6a-45bf-9395-215f28461090" (UID: "c83de432-8e6a-45bf-9395-215f28461090"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:31:00 crc kubenswrapper[4722]: I0309 14:31:00.980563 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c83de432-8e6a-45bf-9395-215f28461090-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 14:31:01 crc kubenswrapper[4722]: I0309 14:31:01.101555 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-cf85b" Mar 09 14:31:01 crc kubenswrapper[4722]: I0309 14:31:01.101580 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-cf85b" event={"ID":"c83de432-8e6a-45bf-9395-215f28461090","Type":"ContainerDied","Data":"e8751321bb092c628355ca9b01adf206a34271923881a7149af061d890d52f18"} Mar 09 14:31:01 crc kubenswrapper[4722]: I0309 14:31:01.101800 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e8751321bb092c628355ca9b01adf206a34271923881a7149af061d890d52f18" Mar 09 14:31:02 crc kubenswrapper[4722]: I0309 14:31:02.372244 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-775d545446-nrcd2"] Mar 09 14:31:02 crc kubenswrapper[4722]: E0309 14:31:02.373155 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c83de432-8e6a-45bf-9395-215f28461090" containerName="heat-db-sync" Mar 09 14:31:02 crc kubenswrapper[4722]: I0309 14:31:02.373175 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="c83de432-8e6a-45bf-9395-215f28461090" containerName="heat-db-sync" Mar 09 14:31:02 crc kubenswrapper[4722]: E0309 14:31:02.373253 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33c8ed47-f7d5-485b-b413-8c80c3a5b276" containerName="init" Mar 09 14:31:02 crc kubenswrapper[4722]: I0309 14:31:02.373264 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="33c8ed47-f7d5-485b-b413-8c80c3a5b276" containerName="init" Mar 09 14:31:02 crc kubenswrapper[4722]: E0309 14:31:02.373280 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33c8ed47-f7d5-485b-b413-8c80c3a5b276" containerName="dnsmasq-dns" Mar 09 14:31:02 crc kubenswrapper[4722]: I0309 14:31:02.373289 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="33c8ed47-f7d5-485b-b413-8c80c3a5b276" containerName="dnsmasq-dns" Mar 09 14:31:02 crc kubenswrapper[4722]: I0309 14:31:02.373602 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="33c8ed47-f7d5-485b-b413-8c80c3a5b276" containerName="dnsmasq-dns" Mar 09 14:31:02 crc kubenswrapper[4722]: I0309 14:31:02.373623 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="c83de432-8e6a-45bf-9395-215f28461090" containerName="heat-db-sync" Mar 09 14:31:02 crc kubenswrapper[4722]: I0309 14:31:02.374709 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-775d545446-nrcd2" Mar 09 14:31:02 crc kubenswrapper[4722]: I0309 14:31:02.394330 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-775d545446-nrcd2"] Mar 09 14:31:02 crc kubenswrapper[4722]: I0309 14:31:02.423617 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b78f1ea8-1fc2-4469-966c-4568370bfae9-combined-ca-bundle\") pod \"heat-engine-775d545446-nrcd2\" (UID: \"b78f1ea8-1fc2-4469-966c-4568370bfae9\") " pod="openstack/heat-engine-775d545446-nrcd2" Mar 09 14:31:02 crc kubenswrapper[4722]: I0309 14:31:02.423677 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvkjm\" (UniqueName: \"kubernetes.io/projected/b78f1ea8-1fc2-4469-966c-4568370bfae9-kube-api-access-qvkjm\") pod \"heat-engine-775d545446-nrcd2\" (UID: \"b78f1ea8-1fc2-4469-966c-4568370bfae9\") " pod="openstack/heat-engine-775d545446-nrcd2" Mar 09 14:31:02 crc kubenswrapper[4722]: I0309 14:31:02.423711 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b78f1ea8-1fc2-4469-966c-4568370bfae9-config-data\") pod \"heat-engine-775d545446-nrcd2\" (UID: \"b78f1ea8-1fc2-4469-966c-4568370bfae9\") " pod="openstack/heat-engine-775d545446-nrcd2" Mar 09 14:31:02 crc kubenswrapper[4722]: I0309 14:31:02.423793 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b78f1ea8-1fc2-4469-966c-4568370bfae9-config-data-custom\") pod \"heat-engine-775d545446-nrcd2\" (UID: \"b78f1ea8-1fc2-4469-966c-4568370bfae9\") " pod="openstack/heat-engine-775d545446-nrcd2" Mar 09 14:31:02 crc kubenswrapper[4722]: I0309 14:31:02.424144 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-867b85dcfc-rx6pp"] Mar 09 14:31:02 crc kubenswrapper[4722]: I0309 14:31:02.426159 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-867b85dcfc-rx6pp" Mar 09 14:31:02 crc kubenswrapper[4722]: I0309 14:31:02.453266 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-867b85dcfc-rx6pp"] Mar 09 14:31:02 crc kubenswrapper[4722]: I0309 14:31:02.465473 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-6df8cdc54c-dd8gs"] Mar 09 14:31:02 crc kubenswrapper[4722]: I0309 14:31:02.466994 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-6df8cdc54c-dd8gs" Mar 09 14:31:02 crc kubenswrapper[4722]: I0309 14:31:02.498688 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-6df8cdc54c-dd8gs"] Mar 09 14:31:02 crc kubenswrapper[4722]: I0309 14:31:02.526121 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4de9ab3-ed0f-4ad7-8fe6-e4e67aa9eabe-config-data\") pod \"heat-cfnapi-6df8cdc54c-dd8gs\" (UID: \"b4de9ab3-ed0f-4ad7-8fe6-e4e67aa9eabe\") " pod="openstack/heat-cfnapi-6df8cdc54c-dd8gs" Mar 09 14:31:02 crc kubenswrapper[4722]: I0309 14:31:02.526167 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b78f1ea8-1fc2-4469-966c-4568370bfae9-combined-ca-bundle\") pod \"heat-engine-775d545446-nrcd2\" (UID: \"b78f1ea8-1fc2-4469-966c-4568370bfae9\") " pod="openstack/heat-engine-775d545446-nrcd2" Mar 09 14:31:02 crc kubenswrapper[4722]: I0309 14:31:02.526194 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvkjm\" (UniqueName: \"kubernetes.io/projected/b78f1ea8-1fc2-4469-966c-4568370bfae9-kube-api-access-qvkjm\") pod \"heat-engine-775d545446-nrcd2\" (UID: \"b78f1ea8-1fc2-4469-966c-4568370bfae9\") " pod="openstack/heat-engine-775d545446-nrcd2" Mar 09 14:31:02 crc kubenswrapper[4722]: I0309 14:31:02.526242 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b78f1ea8-1fc2-4469-966c-4568370bfae9-config-data\") pod \"heat-engine-775d545446-nrcd2\" (UID: \"b78f1ea8-1fc2-4469-966c-4568370bfae9\") " pod="openstack/heat-engine-775d545446-nrcd2" Mar 09 14:31:02 crc kubenswrapper[4722]: I0309 14:31:02.526796 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b87d6ac1-cd7f-4e42-b8b3-7c70fdffda0d-config-data-custom\") pod \"heat-api-867b85dcfc-rx6pp\" (UID: \"b87d6ac1-cd7f-4e42-b8b3-7c70fdffda0d\") " pod="openstack/heat-api-867b85dcfc-rx6pp" Mar 09 14:31:02 crc kubenswrapper[4722]: I0309 14:31:02.526888 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94jtr\" (UniqueName: \"kubernetes.io/projected/b87d6ac1-cd7f-4e42-b8b3-7c70fdffda0d-kube-api-access-94jtr\") pod \"heat-api-867b85dcfc-rx6pp\" (UID: \"b87d6ac1-cd7f-4e42-b8b3-7c70fdffda0d\") " pod="openstack/heat-api-867b85dcfc-rx6pp" Mar 09 14:31:02 crc kubenswrapper[4722]: I0309 14:31:02.526925 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b87d6ac1-cd7f-4e42-b8b3-7c70fdffda0d-public-tls-certs\") pod \"heat-api-867b85dcfc-rx6pp\" (UID: \"b87d6ac1-cd7f-4e42-b8b3-7c70fdffda0d\") " pod="openstack/heat-api-867b85dcfc-rx6pp" Mar 09 14:31:02 crc kubenswrapper[4722]: I0309 14:31:02.526984 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4de9ab3-ed0f-4ad7-8fe6-e4e67aa9eabe-public-tls-certs\") pod \"heat-cfnapi-6df8cdc54c-dd8gs\" (UID: \"b4de9ab3-ed0f-4ad7-8fe6-e4e67aa9eabe\") " pod="openstack/heat-cfnapi-6df8cdc54c-dd8gs" Mar 09 14:31:02 crc kubenswrapper[4722]: I0309 14:31:02.534439 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b4de9ab3-ed0f-4ad7-8fe6-e4e67aa9eabe-config-data-custom\") pod \"heat-cfnapi-6df8cdc54c-dd8gs\" (UID: \"b4de9ab3-ed0f-4ad7-8fe6-e4e67aa9eabe\") " pod="openstack/heat-cfnapi-6df8cdc54c-dd8gs" Mar 09 14:31:02 crc kubenswrapper[4722]: I0309 14:31:02.534807 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b78f1ea8-1fc2-4469-966c-4568370bfae9-config-data-custom\") pod \"heat-engine-775d545446-nrcd2\" (UID: \"b78f1ea8-1fc2-4469-966c-4568370bfae9\") " pod="openstack/heat-engine-775d545446-nrcd2" Mar 09 14:31:02 crc kubenswrapper[4722]: I0309 14:31:02.535411 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4de9ab3-ed0f-4ad7-8fe6-e4e67aa9eabe-combined-ca-bundle\") pod \"heat-cfnapi-6df8cdc54c-dd8gs\" (UID: \"b4de9ab3-ed0f-4ad7-8fe6-e4e67aa9eabe\") " pod="openstack/heat-cfnapi-6df8cdc54c-dd8gs" Mar 09 14:31:02 crc kubenswrapper[4722]: I0309 14:31:02.535903 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b78f1ea8-1fc2-4469-966c-4568370bfae9-config-data\") pod \"heat-engine-775d545446-nrcd2\" (UID: \"b78f1ea8-1fc2-4469-966c-4568370bfae9\") " pod="openstack/heat-engine-775d545446-nrcd2" Mar 09 14:31:02 crc kubenswrapper[4722]: I0309 14:31:02.535921 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b87d6ac1-cd7f-4e42-b8b3-7c70fdffda0d-combined-ca-bundle\") pod \"heat-api-867b85dcfc-rx6pp\" (UID: \"b87d6ac1-cd7f-4e42-b8b3-7c70fdffda0d\") " pod="openstack/heat-api-867b85dcfc-rx6pp" Mar 09 14:31:02 crc kubenswrapper[4722]: I0309 14:31:02.536195 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b87d6ac1-cd7f-4e42-b8b3-7c70fdffda0d-config-data\") pod \"heat-api-867b85dcfc-rx6pp\" (UID: \"b87d6ac1-cd7f-4e42-b8b3-7c70fdffda0d\") " pod="openstack/heat-api-867b85dcfc-rx6pp" Mar 09 14:31:02 crc kubenswrapper[4722]: I0309 14:31:02.536344 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tt6v5\" (UniqueName: \"kubernetes.io/projected/b4de9ab3-ed0f-4ad7-8fe6-e4e67aa9eabe-kube-api-access-tt6v5\") pod \"heat-cfnapi-6df8cdc54c-dd8gs\" (UID: \"b4de9ab3-ed0f-4ad7-8fe6-e4e67aa9eabe\") " pod="openstack/heat-cfnapi-6df8cdc54c-dd8gs" Mar 09 14:31:02 crc kubenswrapper[4722]: I0309 14:31:02.536649 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b87d6ac1-cd7f-4e42-b8b3-7c70fdffda0d-internal-tls-certs\") pod \"heat-api-867b85dcfc-rx6pp\" (UID: \"b87d6ac1-cd7f-4e42-b8b3-7c70fdffda0d\") " pod="openstack/heat-api-867b85dcfc-rx6pp" Mar 09 14:31:02 crc kubenswrapper[4722]: I0309 14:31:02.536831 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4de9ab3-ed0f-4ad7-8fe6-e4e67aa9eabe-internal-tls-certs\") pod \"heat-cfnapi-6df8cdc54c-dd8gs\" (UID: \"b4de9ab3-ed0f-4ad7-8fe6-e4e67aa9eabe\") " pod="openstack/heat-cfnapi-6df8cdc54c-dd8gs" Mar 09 
Mar 09 14:31:02 crc kubenswrapper[4722]: I0309 14:31:02.542380 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b78f1ea8-1fc2-4469-966c-4568370bfae9-config-data-custom\") pod \"heat-engine-775d545446-nrcd2\" (UID: \"b78f1ea8-1fc2-4469-966c-4568370bfae9\") " pod="openstack/heat-engine-775d545446-nrcd2"
Mar 09 14:31:02 crc kubenswrapper[4722]: I0309 14:31:02.565975 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvkjm\" (UniqueName: \"kubernetes.io/projected/b78f1ea8-1fc2-4469-966c-4568370bfae9-kube-api-access-qvkjm\") pod \"heat-engine-775d545446-nrcd2\" (UID: \"b78f1ea8-1fc2-4469-966c-4568370bfae9\") " pod="openstack/heat-engine-775d545446-nrcd2"
Mar 09 14:31:02 crc kubenswrapper[4722]: I0309 14:31:02.566058 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b78f1ea8-1fc2-4469-966c-4568370bfae9-combined-ca-bundle\") pod \"heat-engine-775d545446-nrcd2\" (UID: \"b78f1ea8-1fc2-4469-966c-4568370bfae9\") " pod="openstack/heat-engine-775d545446-nrcd2"
Mar 09 14:31:02 crc kubenswrapper[4722]: I0309 14:31:02.639453 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b87d6ac1-cd7f-4e42-b8b3-7c70fdffda0d-config-data-custom\") pod \"heat-api-867b85dcfc-rx6pp\" (UID: \"b87d6ac1-cd7f-4e42-b8b3-7c70fdffda0d\") " pod="openstack/heat-api-867b85dcfc-rx6pp"
Mar 09 14:31:02 crc kubenswrapper[4722]: I0309 14:31:02.639515 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94jtr\" (UniqueName: \"kubernetes.io/projected/b87d6ac1-cd7f-4e42-b8b3-7c70fdffda0d-kube-api-access-94jtr\") pod \"heat-api-867b85dcfc-rx6pp\" (UID: \"b87d6ac1-cd7f-4e42-b8b3-7c70fdffda0d\") " pod="openstack/heat-api-867b85dcfc-rx6pp"
Mar 09 14:31:02 crc kubenswrapper[4722]: I0309 14:31:02.639540 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b87d6ac1-cd7f-4e42-b8b3-7c70fdffda0d-public-tls-certs\") pod \"heat-api-867b85dcfc-rx6pp\" (UID: \"b87d6ac1-cd7f-4e42-b8b3-7c70fdffda0d\") " pod="openstack/heat-api-867b85dcfc-rx6pp"
Mar 09 14:31:02 crc kubenswrapper[4722]: I0309 14:31:02.639559 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4de9ab3-ed0f-4ad7-8fe6-e4e67aa9eabe-public-tls-certs\") pod \"heat-cfnapi-6df8cdc54c-dd8gs\" (UID: \"b4de9ab3-ed0f-4ad7-8fe6-e4e67aa9eabe\") " pod="openstack/heat-cfnapi-6df8cdc54c-dd8gs"
Mar 09 14:31:02 crc kubenswrapper[4722]: I0309 14:31:02.639580 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b4de9ab3-ed0f-4ad7-8fe6-e4e67aa9eabe-config-data-custom\") pod \"heat-cfnapi-6df8cdc54c-dd8gs\" (UID: \"b4de9ab3-ed0f-4ad7-8fe6-e4e67aa9eabe\") " pod="openstack/heat-cfnapi-6df8cdc54c-dd8gs"
Mar 09 14:31:02 crc kubenswrapper[4722]: I0309 14:31:02.639623 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4de9ab3-ed0f-4ad7-8fe6-e4e67aa9eabe-combined-ca-bundle\") pod \"heat-cfnapi-6df8cdc54c-dd8gs\" (UID: \"b4de9ab3-ed0f-4ad7-8fe6-e4e67aa9eabe\") " pod="openstack/heat-cfnapi-6df8cdc54c-dd8gs"
Mar 09 14:31:02 crc kubenswrapper[4722]: I0309 14:31:02.639637 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b87d6ac1-cd7f-4e42-b8b3-7c70fdffda0d-combined-ca-bundle\") pod \"heat-api-867b85dcfc-rx6pp\" (UID: \"b87d6ac1-cd7f-4e42-b8b3-7c70fdffda0d\") " pod="openstack/heat-api-867b85dcfc-rx6pp"
Mar 09 14:31:02 crc kubenswrapper[4722]: I0309 14:31:02.639677 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b87d6ac1-cd7f-4e42-b8b3-7c70fdffda0d-config-data\") pod \"heat-api-867b85dcfc-rx6pp\" (UID: \"b87d6ac1-cd7f-4e42-b8b3-7c70fdffda0d\") " pod="openstack/heat-api-867b85dcfc-rx6pp"
Mar 09 14:31:02 crc kubenswrapper[4722]: I0309 14:31:02.639732 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tt6v5\" (UniqueName: \"kubernetes.io/projected/b4de9ab3-ed0f-4ad7-8fe6-e4e67aa9eabe-kube-api-access-tt6v5\") pod \"heat-cfnapi-6df8cdc54c-dd8gs\" (UID: \"b4de9ab3-ed0f-4ad7-8fe6-e4e67aa9eabe\") " pod="openstack/heat-cfnapi-6df8cdc54c-dd8gs"
Mar 09 14:31:02 crc kubenswrapper[4722]: I0309 14:31:02.639797 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b87d6ac1-cd7f-4e42-b8b3-7c70fdffda0d-internal-tls-certs\") pod \"heat-api-867b85dcfc-rx6pp\" (UID: \"b87d6ac1-cd7f-4e42-b8b3-7c70fdffda0d\") " pod="openstack/heat-api-867b85dcfc-rx6pp"
Mar 09 14:31:02 crc kubenswrapper[4722]: I0309 14:31:02.639830 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4de9ab3-ed0f-4ad7-8fe6-e4e67aa9eabe-internal-tls-certs\") pod \"heat-cfnapi-6df8cdc54c-dd8gs\" (UID: \"b4de9ab3-ed0f-4ad7-8fe6-e4e67aa9eabe\") " pod="openstack/heat-cfnapi-6df8cdc54c-dd8gs"
Mar 09 14:31:02 crc kubenswrapper[4722]: I0309 14:31:02.639899 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4de9ab3-ed0f-4ad7-8fe6-e4e67aa9eabe-config-data\") pod \"heat-cfnapi-6df8cdc54c-dd8gs\" (UID: \"b4de9ab3-ed0f-4ad7-8fe6-e4e67aa9eabe\") " pod="openstack/heat-cfnapi-6df8cdc54c-dd8gs"
Mar 09 14:31:02 crc kubenswrapper[4722]: I0309 14:31:02.647068 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4de9ab3-ed0f-4ad7-8fe6-e4e67aa9eabe-config-data\") pod \"heat-cfnapi-6df8cdc54c-dd8gs\" (UID: \"b4de9ab3-ed0f-4ad7-8fe6-e4e67aa9eabe\") " pod="openstack/heat-cfnapi-6df8cdc54c-dd8gs"
Mar 09 14:31:02 crc kubenswrapper[4722]: I0309 14:31:02.649630 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b87d6ac1-cd7f-4e42-b8b3-7c70fdffda0d-combined-ca-bundle\") pod \"heat-api-867b85dcfc-rx6pp\" (UID: \"b87d6ac1-cd7f-4e42-b8b3-7c70fdffda0d\") " pod="openstack/heat-api-867b85dcfc-rx6pp"
Mar 09 14:31:02 crc kubenswrapper[4722]: I0309 14:31:02.650979 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4de9ab3-ed0f-4ad7-8fe6-e4e67aa9eabe-public-tls-certs\") pod \"heat-cfnapi-6df8cdc54c-dd8gs\" (UID: \"b4de9ab3-ed0f-4ad7-8fe6-e4e67aa9eabe\") " pod="openstack/heat-cfnapi-6df8cdc54c-dd8gs"
Mar 09 14:31:02 crc kubenswrapper[4722]: I0309 14:31:02.658567 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b87d6ac1-cd7f-4e42-b8b3-7c70fdffda0d-public-tls-certs\") pod \"heat-api-867b85dcfc-rx6pp\" (UID: \"b87d6ac1-cd7f-4e42-b8b3-7c70fdffda0d\") " pod="openstack/heat-api-867b85dcfc-rx6pp"
Mar 09 14:31:02 crc kubenswrapper[4722]: I0309 14:31:02.659357 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4de9ab3-ed0f-4ad7-8fe6-e4e67aa9eabe-combined-ca-bundle\") pod \"heat-cfnapi-6df8cdc54c-dd8gs\" (UID: \"b4de9ab3-ed0f-4ad7-8fe6-e4e67aa9eabe\") " pod="openstack/heat-cfnapi-6df8cdc54c-dd8gs"
Mar 09 14:31:02 crc kubenswrapper[4722]: I0309 14:31:02.663959 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b87d6ac1-cd7f-4e42-b8b3-7c70fdffda0d-config-data-custom\") pod \"heat-api-867b85dcfc-rx6pp\" (UID: \"b87d6ac1-cd7f-4e42-b8b3-7c70fdffda0d\") " pod="openstack/heat-api-867b85dcfc-rx6pp"
Mar 09 14:31:02 crc kubenswrapper[4722]: I0309 14:31:02.664523 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94jtr\" (UniqueName: \"kubernetes.io/projected/b87d6ac1-cd7f-4e42-b8b3-7c70fdffda0d-kube-api-access-94jtr\") pod \"heat-api-867b85dcfc-rx6pp\" (UID: \"b87d6ac1-cd7f-4e42-b8b3-7c70fdffda0d\") " pod="openstack/heat-api-867b85dcfc-rx6pp"
Mar 09 14:31:02 crc kubenswrapper[4722]: I0309 14:31:02.664812 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b87d6ac1-cd7f-4e42-b8b3-7c70fdffda0d-config-data\") pod \"heat-api-867b85dcfc-rx6pp\" (UID: \"b87d6ac1-cd7f-4e42-b8b3-7c70fdffda0d\") " pod="openstack/heat-api-867b85dcfc-rx6pp"
Mar 09 14:31:02 crc kubenswrapper[4722]: I0309 14:31:02.665362 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b4de9ab3-ed0f-4ad7-8fe6-e4e67aa9eabe-config-data-custom\") pod \"heat-cfnapi-6df8cdc54c-dd8gs\" (UID: \"b4de9ab3-ed0f-4ad7-8fe6-e4e67aa9eabe\") " pod="openstack/heat-cfnapi-6df8cdc54c-dd8gs"
Mar 09 14:31:02 crc kubenswrapper[4722]: I0309 14:31:02.667022 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b4de9ab3-ed0f-4ad7-8fe6-e4e67aa9eabe-internal-tls-certs\") pod \"heat-cfnapi-6df8cdc54c-dd8gs\" (UID: \"b4de9ab3-ed0f-4ad7-8fe6-e4e67aa9eabe\") " pod="openstack/heat-cfnapi-6df8cdc54c-dd8gs"
Mar 09 14:31:02 crc kubenswrapper[4722]: I0309 14:31:02.667240 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tt6v5\" (UniqueName: \"kubernetes.io/projected/b4de9ab3-ed0f-4ad7-8fe6-e4e67aa9eabe-kube-api-access-tt6v5\") pod \"heat-cfnapi-6df8cdc54c-dd8gs\" (UID: \"b4de9ab3-ed0f-4ad7-8fe6-e4e67aa9eabe\") " pod="openstack/heat-cfnapi-6df8cdc54c-dd8gs"
Mar 09 14:31:02 crc kubenswrapper[4722]: I0309 14:31:02.672959 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b87d6ac1-cd7f-4e42-b8b3-7c70fdffda0d-internal-tls-certs\") pod \"heat-api-867b85dcfc-rx6pp\" (UID: \"b87d6ac1-cd7f-4e42-b8b3-7c70fdffda0d\") " pod="openstack/heat-api-867b85dcfc-rx6pp"
Mar 09 14:31:02 crc kubenswrapper[4722]: I0309 14:31:02.707131 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-775d545446-nrcd2"
Mar 09 14:31:02 crc kubenswrapper[4722]: I0309 14:31:02.749538 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-867b85dcfc-rx6pp"
Mar 09 14:31:02 crc kubenswrapper[4722]: I0309 14:31:02.812008 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-6df8cdc54c-dd8gs"
Mar 09 14:31:03 crc kubenswrapper[4722]: I0309 14:31:03.262480 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-775d545446-nrcd2"]
Mar 09 14:31:03 crc kubenswrapper[4722]: I0309 14:31:03.400841 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-867b85dcfc-rx6pp"]
Mar 09 14:31:03 crc kubenswrapper[4722]: I0309 14:31:03.642760 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-6df8cdc54c-dd8gs"]
Mar 09 14:31:04 crc kubenswrapper[4722]: I0309 14:31:04.170121 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-775d545446-nrcd2"
Mar 09 14:31:04 crc kubenswrapper[4722]: I0309 14:31:04.170727 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-775d545446-nrcd2" event={"ID":"b78f1ea8-1fc2-4469-966c-4568370bfae9","Type":"ContainerStarted","Data":"ebc3666d11938ac0dba7c978aee4e056d3043f246bda65677b26b4e6f2d739ed"}
Mar 09 14:31:04 crc kubenswrapper[4722]: I0309 14:31:04.170874 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-775d545446-nrcd2" event={"ID":"b78f1ea8-1fc2-4469-966c-4568370bfae9","Type":"ContainerStarted","Data":"61a26913103574b740700a7064b01b48e2a6be00f557ca5602c50480c1244bbb"}
Mar 09 14:31:04 crc kubenswrapper[4722]: I0309 14:31:04.170965 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-867b85dcfc-rx6pp" event={"ID":"b87d6ac1-cd7f-4e42-b8b3-7c70fdffda0d","Type":"ContainerStarted","Data":"4f78b7ab5737177c1561fba233381a8563566ebd44f09db4c53c7303f26da15b"}
Mar 09 14:31:04 crc kubenswrapper[4722]: I0309 14:31:04.171058 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6df8cdc54c-dd8gs" event={"ID":"b4de9ab3-ed0f-4ad7-8fe6-e4e67aa9eabe","Type":"ContainerStarted","Data":"19898aff1bb44f4c2d7ec071ec9e7a9785da605d3cfbf5afeef80ab23844a88e"}
Mar 09 14:31:04 crc kubenswrapper[4722]: I0309 14:31:04.179216 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-775d545446-nrcd2" podStartSLOduration=2.179177989 podStartE2EDuration="2.179177989s" podCreationTimestamp="2026-03-09 14:31:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:31:04.16581142 +0000 UTC m=+1704.721379996" watchObservedRunningTime="2026-03-09 14:31:04.179177989 +0000 UTC m=+1704.734746565"
Mar 09 14:31:05 crc kubenswrapper[4722]: I0309 14:31:05.125475 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6f6df4f56c-8q8pp"
Mar 09 14:31:05 crc kubenswrapper[4722]: I0309 14:31:05.234456 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d84b4d45c-wcc8n"]
Mar 09 14:31:05 crc kubenswrapper[4722]: I0309 14:31:05.234750 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7d84b4d45c-wcc8n" podUID="c4f573d9-eabb-492d-b4ab-25c64166e91f" containerName="dnsmasq-dns" containerID="cri-o://daf014a3b39097408cc997070b424c8e3c13da8d994063d4f2655ccc78a1539e" gracePeriod=10
containerID="cri-o://daf014a3b39097408cc997070b424c8e3c13da8d994063d4f2655ccc78a1539e" gracePeriod=10 Mar 09 14:31:06 crc kubenswrapper[4722]: I0309 14:31:06.224226 4722 generic.go:334] "Generic (PLEG): container finished" podID="c4f573d9-eabb-492d-b4ab-25c64166e91f" containerID="daf014a3b39097408cc997070b424c8e3c13da8d994063d4f2655ccc78a1539e" exitCode=0 Mar 09 14:31:06 crc kubenswrapper[4722]: I0309 14:31:06.224526 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d84b4d45c-wcc8n" event={"ID":"c4f573d9-eabb-492d-b4ab-25c64166e91f","Type":"ContainerDied","Data":"daf014a3b39097408cc997070b424c8e3c13da8d994063d4f2655ccc78a1539e"} Mar 09 14:31:06 crc kubenswrapper[4722]: I0309 14:31:06.359685 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d84b4d45c-wcc8n" Mar 09 14:31:06 crc kubenswrapper[4722]: I0309 14:31:06.469703 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4f573d9-eabb-492d-b4ab-25c64166e91f-config\") pod \"c4f573d9-eabb-492d-b4ab-25c64166e91f\" (UID: \"c4f573d9-eabb-492d-b4ab-25c64166e91f\") " Mar 09 14:31:06 crc kubenswrapper[4722]: I0309 14:31:06.469838 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c4f573d9-eabb-492d-b4ab-25c64166e91f-ovsdbserver-nb\") pod \"c4f573d9-eabb-492d-b4ab-25c64166e91f\" (UID: \"c4f573d9-eabb-492d-b4ab-25c64166e91f\") " Mar 09 14:31:06 crc kubenswrapper[4722]: I0309 14:31:06.469962 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c4f573d9-eabb-492d-b4ab-25c64166e91f-ovsdbserver-sb\") pod \"c4f573d9-eabb-492d-b4ab-25c64166e91f\" (UID: \"c4f573d9-eabb-492d-b4ab-25c64166e91f\") " Mar 09 14:31:06 crc kubenswrapper[4722]: I0309 14:31:06.470018 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c4f573d9-eabb-492d-b4ab-25c64166e91f-openstack-edpm-ipam\") pod \"c4f573d9-eabb-492d-b4ab-25c64166e91f\" (UID: \"c4f573d9-eabb-492d-b4ab-25c64166e91f\") " Mar 09 14:31:06 crc kubenswrapper[4722]: I0309 14:31:06.470500 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4f573d9-eabb-492d-b4ab-25c64166e91f-dns-svc\") pod \"c4f573d9-eabb-492d-b4ab-25c64166e91f\" (UID: \"c4f573d9-eabb-492d-b4ab-25c64166e91f\") " Mar 09 14:31:06 crc kubenswrapper[4722]: I0309 14:31:06.470544 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c4f573d9-eabb-492d-b4ab-25c64166e91f-dns-swift-storage-0\") pod \"c4f573d9-eabb-492d-b4ab-25c64166e91f\" (UID: \"c4f573d9-eabb-492d-b4ab-25c64166e91f\") " Mar 09 14:31:06 crc kubenswrapper[4722]: I0309 14:31:06.470572 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfgbv\" (UniqueName: \"kubernetes.io/projected/c4f573d9-eabb-492d-b4ab-25c64166e91f-kube-api-access-xfgbv\") pod \"c4f573d9-eabb-492d-b4ab-25c64166e91f\" (UID: \"c4f573d9-eabb-492d-b4ab-25c64166e91f\") " Mar 09 14:31:06 crc kubenswrapper[4722]: I0309 14:31:06.477276 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/c4f573d9-eabb-492d-b4ab-25c64166e91f-kube-api-access-xfgbv" (OuterVolumeSpecName: "kube-api-access-xfgbv") pod "c4f573d9-eabb-492d-b4ab-25c64166e91f" (UID: "c4f573d9-eabb-492d-b4ab-25c64166e91f"). InnerVolumeSpecName "kube-api-access-xfgbv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:31:06 crc kubenswrapper[4722]: I0309 14:31:06.547844 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4f573d9-eabb-492d-b4ab-25c64166e91f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c4f573d9-eabb-492d-b4ab-25c64166e91f" (UID: "c4f573d9-eabb-492d-b4ab-25c64166e91f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:31:06 crc kubenswrapper[4722]: I0309 14:31:06.552712 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4f573d9-eabb-492d-b4ab-25c64166e91f-config" (OuterVolumeSpecName: "config") pod "c4f573d9-eabb-492d-b4ab-25c64166e91f" (UID: "c4f573d9-eabb-492d-b4ab-25c64166e91f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:31:06 crc kubenswrapper[4722]: I0309 14:31:06.560170 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4f573d9-eabb-492d-b4ab-25c64166e91f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c4f573d9-eabb-492d-b4ab-25c64166e91f" (UID: "c4f573d9-eabb-492d-b4ab-25c64166e91f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:31:06 crc kubenswrapper[4722]: I0309 14:31:06.569378 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4f573d9-eabb-492d-b4ab-25c64166e91f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c4f573d9-eabb-492d-b4ab-25c64166e91f" (UID: "c4f573d9-eabb-492d-b4ab-25c64166e91f"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:31:06 crc kubenswrapper[4722]: I0309 14:31:06.574299 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4f573d9-eabb-492d-b4ab-25c64166e91f-config\") on node \"crc\" DevicePath \"\"" Mar 09 14:31:06 crc kubenswrapper[4722]: I0309 14:31:06.574339 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c4f573d9-eabb-492d-b4ab-25c64166e91f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 09 14:31:06 crc kubenswrapper[4722]: I0309 14:31:06.574352 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c4f573d9-eabb-492d-b4ab-25c64166e91f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 09 14:31:06 crc kubenswrapper[4722]: I0309 14:31:06.574361 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4f573d9-eabb-492d-b4ab-25c64166e91f-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 09 14:31:06 crc kubenswrapper[4722]: I0309 14:31:06.574371 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xfgbv\" (UniqueName: \"kubernetes.io/projected/c4f573d9-eabb-492d-b4ab-25c64166e91f-kube-api-access-xfgbv\") on node \"crc\" DevicePath \"\"" Mar 09 14:31:06 crc kubenswrapper[4722]: I0309 14:31:06.582577 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4f573d9-eabb-492d-b4ab-25c64166e91f-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "c4f573d9-eabb-492d-b4ab-25c64166e91f" (UID: "c4f573d9-eabb-492d-b4ab-25c64166e91f"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:31:06 crc kubenswrapper[4722]: I0309 14:31:06.596716 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4f573d9-eabb-492d-b4ab-25c64166e91f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c4f573d9-eabb-492d-b4ab-25c64166e91f" (UID: "c4f573d9-eabb-492d-b4ab-25c64166e91f"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:31:06 crc kubenswrapper[4722]: I0309 14:31:06.677246 4722 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c4f573d9-eabb-492d-b4ab-25c64166e91f-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 14:31:06 crc kubenswrapper[4722]: I0309 14:31:06.677275 4722 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c4f573d9-eabb-492d-b4ab-25c64166e91f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 09 14:31:07 crc kubenswrapper[4722]: I0309 14:31:07.251949 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d84b4d45c-wcc8n" event={"ID":"c4f573d9-eabb-492d-b4ab-25c64166e91f","Type":"ContainerDied","Data":"d7dfdaaedce5c451ed235c04d86fca1988baf3541eef23830d8404500151e945"} Mar 09 14:31:07 crc kubenswrapper[4722]: I0309 14:31:07.252191 4722 scope.go:117] "RemoveContainer" containerID="daf014a3b39097408cc997070b424c8e3c13da8d994063d4f2655ccc78a1539e" Mar 09 14:31:07 crc kubenswrapper[4722]: I0309 14:31:07.252361 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d84b4d45c-wcc8n" Mar 09 14:31:07 crc kubenswrapper[4722]: I0309 14:31:07.261856 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-867b85dcfc-rx6pp" event={"ID":"b87d6ac1-cd7f-4e42-b8b3-7c70fdffda0d","Type":"ContainerStarted","Data":"fded4a208ed7ea6424dba4b5bd696276fa3106967cec6d75b6f1960011694cb1"} Mar 09 14:31:07 crc kubenswrapper[4722]: I0309 14:31:07.262555 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-867b85dcfc-rx6pp" Mar 09 14:31:07 crc kubenswrapper[4722]: I0309 14:31:07.270272 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6df8cdc54c-dd8gs" event={"ID":"b4de9ab3-ed0f-4ad7-8fe6-e4e67aa9eabe","Type":"ContainerStarted","Data":"46d8981d6f45b78ace9797239157af9b06d151cb766d4e1157ae069e065edfcd"} Mar 09 14:31:07 crc kubenswrapper[4722]: I0309 14:31:07.270425 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-6df8cdc54c-dd8gs" Mar 09 14:31:07 crc kubenswrapper[4722]: I0309 14:31:07.288474 4722 scope.go:117] "RemoveContainer" containerID="3c7b4b5e492fcdaa97a5c34bfb3b3859d5c76e030d8c09cf584f324750986f00" Mar 09 14:31:07 crc kubenswrapper[4722]: I0309 14:31:07.298283 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-867b85dcfc-rx6pp" podStartSLOduration=2.793899167 podStartE2EDuration="5.29826936s" podCreationTimestamp="2026-03-09 14:31:02 +0000 UTC" firstStartedPulling="2026-03-09 14:31:03.40595528 +0000 UTC m=+1703.961523856" lastFinishedPulling="2026-03-09 14:31:05.910325473 +0000 UTC m=+1706.465894049" observedRunningTime="2026-03-09 14:31:07.282020831 +0000 UTC m=+1707.837589437" watchObservedRunningTime="2026-03-09 14:31:07.29826936 +0000 UTC m=+1707.853837936" Mar 09 14:31:07 crc kubenswrapper[4722]: I0309 14:31:07.320181 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-6df8cdc54c-dd8gs" podStartSLOduration=3.042299398 podStartE2EDuration="5.320160573s" podCreationTimestamp="2026-03-09 14:31:02 +0000 UTC" firstStartedPulling="2026-03-09 14:31:03.636076108 +0000 UTC m=+1704.191644684" lastFinishedPulling="2026-03-09 14:31:05.913937273 +0000 UTC m=+1706.469505859" observedRunningTime="2026-03-09 14:31:07.300063639 +0000 UTC m=+1707.855632215" watchObservedRunningTime="2026-03-09 14:31:07.320160573 +0000 UTC m=+1707.875729149" Mar 09 14:31:07 crc kubenswrapper[4722]: I0309 14:31:07.340408 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d84b4d45c-wcc8n"] Mar 09 14:31:07 crc kubenswrapper[4722]: I0309 14:31:07.348561 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7d84b4d45c-wcc8n"] Mar 09 14:31:08 crc kubenswrapper[4722]: I0309 14:31:08.169617 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4f573d9-eabb-492d-b4ab-25c64166e91f" path="/var/lib/kubelet/pods/c4f573d9-eabb-492d-b4ab-25c64166e91f/volumes" Mar 09 14:31:09 crc kubenswrapper[4722]: I0309 14:31:09.005560 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-ml5m4"] Mar 09 14:31:09 crc kubenswrapper[4722]: E0309 14:31:09.007844 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4f573d9-eabb-492d-b4ab-25c64166e91f" containerName="dnsmasq-dns" Mar 09 14:31:09 crc kubenswrapper[4722]: I0309 14:31:09.007991 4722 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c4f573d9-eabb-492d-b4ab-25c64166e91f" containerName="dnsmasq-dns" Mar 09 14:31:09 crc kubenswrapper[4722]: E0309 14:31:09.008101 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4f573d9-eabb-492d-b4ab-25c64166e91f" containerName="init" Mar 09 14:31:09 crc kubenswrapper[4722]: I0309 14:31:09.008176 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4f573d9-eabb-492d-b4ab-25c64166e91f" containerName="init" Mar 09 14:31:09 crc kubenswrapper[4722]: I0309 14:31:09.009248 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4f573d9-eabb-492d-b4ab-25c64166e91f" containerName="dnsmasq-dns" Mar 09 14:31:09 crc kubenswrapper[4722]: I0309 14:31:09.013680 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ml5m4" Mar 09 14:31:09 crc kubenswrapper[4722]: I0309 14:31:09.075868 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ml5m4"] Mar 09 14:31:09 crc kubenswrapper[4722]: I0309 14:31:09.150966 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10d0cd54-d5b4-4b4a-ae3b-4ad5f0154500-catalog-content\") pod \"redhat-marketplace-ml5m4\" (UID: \"10d0cd54-d5b4-4b4a-ae3b-4ad5f0154500\") " pod="openshift-marketplace/redhat-marketplace-ml5m4" Mar 09 14:31:09 crc kubenswrapper[4722]: I0309 14:31:09.151083 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10d0cd54-d5b4-4b4a-ae3b-4ad5f0154500-utilities\") pod \"redhat-marketplace-ml5m4\" (UID: \"10d0cd54-d5b4-4b4a-ae3b-4ad5f0154500\") " pod="openshift-marketplace/redhat-marketplace-ml5m4" Mar 09 14:31:09 crc kubenswrapper[4722]: I0309 14:31:09.151125 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlpx8\" (UniqueName: \"kubernetes.io/projected/10d0cd54-d5b4-4b4a-ae3b-4ad5f0154500-kube-api-access-hlpx8\") pod \"redhat-marketplace-ml5m4\" (UID: \"10d0cd54-d5b4-4b4a-ae3b-4ad5f0154500\") " pod="openshift-marketplace/redhat-marketplace-ml5m4" Mar 09 14:31:09 crc kubenswrapper[4722]: I0309 14:31:09.254908 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10d0cd54-d5b4-4b4a-ae3b-4ad5f0154500-utilities\") pod \"redhat-marketplace-ml5m4\" (UID: \"10d0cd54-d5b4-4b4a-ae3b-4ad5f0154500\") " pod="openshift-marketplace/redhat-marketplace-ml5m4" Mar 09 14:31:09 crc kubenswrapper[4722]: I0309 14:31:09.255443 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10d0cd54-d5b4-4b4a-ae3b-4ad5f0154500-utilities\") pod \"redhat-marketplace-ml5m4\" (UID: \"10d0cd54-d5b4-4b4a-ae3b-4ad5f0154500\") " pod="openshift-marketplace/redhat-marketplace-ml5m4" Mar 09 14:31:09 crc kubenswrapper[4722]: I0309 14:31:09.255765 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlpx8\" (UniqueName: \"kubernetes.io/projected/10d0cd54-d5b4-4b4a-ae3b-4ad5f0154500-kube-api-access-hlpx8\") pod \"redhat-marketplace-ml5m4\" (UID: \"10d0cd54-d5b4-4b4a-ae3b-4ad5f0154500\") " pod="openshift-marketplace/redhat-marketplace-ml5m4" Mar 09 14:31:09 crc kubenswrapper[4722]: I0309 14:31:09.257016 4722 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10d0cd54-d5b4-4b4a-ae3b-4ad5f0154500-catalog-content\") pod \"redhat-marketplace-ml5m4\" (UID: \"10d0cd54-d5b4-4b4a-ae3b-4ad5f0154500\") " pod="openshift-marketplace/redhat-marketplace-ml5m4" Mar 09 14:31:09 crc kubenswrapper[4722]: I0309 14:31:09.257710 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10d0cd54-d5b4-4b4a-ae3b-4ad5f0154500-catalog-content\") pod \"redhat-marketplace-ml5m4\" (UID: \"10d0cd54-d5b4-4b4a-ae3b-4ad5f0154500\") " pod="openshift-marketplace/redhat-marketplace-ml5m4" Mar 09 14:31:09 crc kubenswrapper[4722]: I0309 14:31:09.296769 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlpx8\" (UniqueName: \"kubernetes.io/projected/10d0cd54-d5b4-4b4a-ae3b-4ad5f0154500-kube-api-access-hlpx8\") pod \"redhat-marketplace-ml5m4\" (UID: \"10d0cd54-d5b4-4b4a-ae3b-4ad5f0154500\") " pod="openshift-marketplace/redhat-marketplace-ml5m4" Mar 09 14:31:09 crc kubenswrapper[4722]: I0309 14:31:09.347297 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ml5m4" Mar 09 14:31:09 crc kubenswrapper[4722]: I0309 14:31:09.910616 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-ml5m4"] Mar 09 14:31:10 crc kubenswrapper[4722]: I0309 14:31:10.314955 4722 generic.go:334] "Generic (PLEG): container finished" podID="10d0cd54-d5b4-4b4a-ae3b-4ad5f0154500" containerID="644ef690a5fed758daca5d89aa9cba20404323bb8bdd9a0dfaa8f28dea79074f" exitCode=0 Mar 09 14:31:10 crc kubenswrapper[4722]: I0309 14:31:10.315018 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ml5m4" event={"ID":"10d0cd54-d5b4-4b4a-ae3b-4ad5f0154500","Type":"ContainerDied","Data":"644ef690a5fed758daca5d89aa9cba20404323bb8bdd9a0dfaa8f28dea79074f"} Mar 09 14:31:10 crc kubenswrapper[4722]: I0309 14:31:10.315060 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ml5m4" event={"ID":"10d0cd54-d5b4-4b4a-ae3b-4ad5f0154500","Type":"ContainerStarted","Data":"426a42be2b1c9937c335eaeb272b0c3c2378530be8e65ef979668739314b8d56"} Mar 09 14:31:12 crc kubenswrapper[4722]: I0309 14:31:12.149556 4722 scope.go:117] "RemoveContainer" containerID="86a6dcdf7dcf4d48b8bdfb71d9a2c30b89f235700c424312c7e16580a86021d6" Mar 09 14:31:12 crc kubenswrapper[4722]: E0309 14:31:12.150399 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 14:31:12 crc kubenswrapper[4722]: I0309 14:31:12.344888 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ml5m4" event={"ID":"10d0cd54-d5b4-4b4a-ae3b-4ad5f0154500","Type":"ContainerStarted","Data":"57ccf516c4d79f8833b176942300148c4d7f8b7056bf9f6d6dd5d652971ba3ee"} Mar 09 14:31:13 crc kubenswrapper[4722]: I0309 14:31:13.370707 4722 generic.go:334] "Generic (PLEG): container finished" podID="10d0cd54-d5b4-4b4a-ae3b-4ad5f0154500" containerID="57ccf516c4d79f8833b176942300148c4d7f8b7056bf9f6d6dd5d652971ba3ee" 
Mar 09 14:31:13 crc kubenswrapper[4722]: I0309 14:31:13.370779 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ml5m4" event={"ID":"10d0cd54-d5b4-4b4a-ae3b-4ad5f0154500","Type":"ContainerDied","Data":"57ccf516c4d79f8833b176942300148c4d7f8b7056bf9f6d6dd5d652971ba3ee"}
Mar 09 14:31:14 crc kubenswrapper[4722]: I0309 14:31:14.386135 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ml5m4" event={"ID":"10d0cd54-d5b4-4b4a-ae3b-4ad5f0154500","Type":"ContainerStarted","Data":"b15b84a7f908329be743d8e4ca04d04202cb9f9ee93a41e038887a59debd406d"}
Mar 09 14:31:14 crc kubenswrapper[4722]: I0309 14:31:14.421400 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-ml5m4" podStartSLOduration=2.780387634 podStartE2EDuration="6.42138312s" podCreationTimestamp="2026-03-09 14:31:08 +0000 UTC" firstStartedPulling="2026-03-09 14:31:10.316872358 +0000 UTC m=+1710.872440944" lastFinishedPulling="2026-03-09 14:31:13.957867854 +0000 UTC m=+1714.513436430" observedRunningTime="2026-03-09 14:31:14.411404455 +0000 UTC m=+1714.966973031" watchObservedRunningTime="2026-03-09 14:31:14.42138312 +0000 UTC m=+1714.976951696"
Mar 09 14:31:14 crc kubenswrapper[4722]: I0309 14:31:14.774310 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-867b85dcfc-rx6pp"
Mar 09 14:31:14 crc kubenswrapper[4722]: I0309 14:31:14.868557 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-6957fcb6b8-8jmt4"]
Mar 09 14:31:14 crc kubenswrapper[4722]: I0309 14:31:14.871429 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-api-6957fcb6b8-8jmt4" podUID="9cd4ad6c-298e-47c9-8648-3cfa1f407aad" containerName="heat-api" containerID="cri-o://7a6d6410a31756df86ae1256bbe1b8693b637feb0999a0f93b3883be294fab8b" gracePeriod=60
Mar 09 14:31:15 crc kubenswrapper[4722]: I0309 14:31:15.263965 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-6df8cdc54c-dd8gs"
Mar 09 14:31:15 crc kubenswrapper[4722]: I0309 14:31:15.378152 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-5c766df7b4-znr5j"]
Mar 09 14:31:15 crc kubenswrapper[4722]: I0309 14:31:15.378418 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-cfnapi-5c766df7b4-znr5j" podUID="9213500f-da49-496b-b99e-1ec95658b48f" containerName="heat-cfnapi" containerID="cri-o://16ca1376d5e10d5ddff1520fbf35e02673d95dd3732d67bdc68c20e9647c98aa" gracePeriod=60
Mar 09 14:31:18 crc kubenswrapper[4722]: I0309 14:31:18.439586 4722 generic.go:334] "Generic (PLEG): container finished" podID="fabf84f5-0f35-4400-b612-235235a21f3c" containerID="17b99e8421029ea591318c368e93dea2f5a965ed586feebfbb3dc7acd8da617b" exitCode=0
Mar 09 14:31:18 crc kubenswrapper[4722]: I0309 14:31:18.440106 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"fabf84f5-0f35-4400-b612-235235a21f3c","Type":"ContainerDied","Data":"17b99e8421029ea591318c368e93dea2f5a965ed586feebfbb3dc7acd8da617b"}
Mar 09 14:31:18 crc kubenswrapper[4722]: I0309 14:31:18.448185 4722 generic.go:334] "Generic (PLEG): container finished" podID="9cd4ad6c-298e-47c9-8648-3cfa1f407aad" containerID="7a6d6410a31756df86ae1256bbe1b8693b637feb0999a0f93b3883be294fab8b" exitCode=0
Mar 09 14:31:18 crc kubenswrapper[4722]: I0309 14:31:18.448243 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6957fcb6b8-8jmt4" event={"ID":"9cd4ad6c-298e-47c9-8648-3cfa1f407aad","Type":"ContainerDied","Data":"7a6d6410a31756df86ae1256bbe1b8693b637feb0999a0f93b3883be294fab8b"}
Mar 09 14:31:18 crc kubenswrapper[4722]: I0309 14:31:18.861077 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-6957fcb6b8-8jmt4"
Mar 09 14:31:19 crc kubenswrapper[4722]: I0309 14:31:19.039548 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-5c766df7b4-znr5j"
Mar 09 14:31:19 crc kubenswrapper[4722]: I0309 14:31:19.066343 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cd4ad6c-298e-47c9-8648-3cfa1f407aad-internal-tls-certs\") pod \"9cd4ad6c-298e-47c9-8648-3cfa1f407aad\" (UID: \"9cd4ad6c-298e-47c9-8648-3cfa1f407aad\") "
Mar 09 14:31:19 crc kubenswrapper[4722]: I0309 14:31:19.066440 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9cd4ad6c-298e-47c9-8648-3cfa1f407aad-config-data-custom\") pod \"9cd4ad6c-298e-47c9-8648-3cfa1f407aad\" (UID: \"9cd4ad6c-298e-47c9-8648-3cfa1f407aad\") "
Mar 09 14:31:19 crc kubenswrapper[4722]: I0309 14:31:19.066515 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cd4ad6c-298e-47c9-8648-3cfa1f407aad-combined-ca-bundle\") pod \"9cd4ad6c-298e-47c9-8648-3cfa1f407aad\" (UID: \"9cd4ad6c-298e-47c9-8648-3cfa1f407aad\") "
Mar 09 14:31:19 crc kubenswrapper[4722]: I0309 14:31:19.066545 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cd4ad6c-298e-47c9-8648-3cfa1f407aad-config-data\") pod \"9cd4ad6c-298e-47c9-8648-3cfa1f407aad\" (UID: \"9cd4ad6c-298e-47c9-8648-3cfa1f407aad\") "
Mar 09 14:31:19 crc kubenswrapper[4722]: I0309 14:31:19.066581 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6j9b\" (UniqueName: \"kubernetes.io/projected/9cd4ad6c-298e-47c9-8648-3cfa1f407aad-kube-api-access-p6j9b\") pod \"9cd4ad6c-298e-47c9-8648-3cfa1f407aad\" (UID: \"9cd4ad6c-298e-47c9-8648-3cfa1f407aad\") "
Mar 09 14:31:19 crc kubenswrapper[4722]: I0309 14:31:19.066682 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cd4ad6c-298e-47c9-8648-3cfa1f407aad-public-tls-certs\") pod \"9cd4ad6c-298e-47c9-8648-3cfa1f407aad\" (UID: \"9cd4ad6c-298e-47c9-8648-3cfa1f407aad\") "
Mar 09 14:31:19 crc kubenswrapper[4722]: I0309 14:31:19.075054 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cd4ad6c-298e-47c9-8648-3cfa1f407aad-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "9cd4ad6c-298e-47c9-8648-3cfa1f407aad" (UID: "9cd4ad6c-298e-47c9-8648-3cfa1f407aad"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:31:19 crc kubenswrapper[4722]: I0309 14:31:19.075754 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cd4ad6c-298e-47c9-8648-3cfa1f407aad-kube-api-access-p6j9b" (OuterVolumeSpecName: "kube-api-access-p6j9b") pod "9cd4ad6c-298e-47c9-8648-3cfa1f407aad" (UID: "9cd4ad6c-298e-47c9-8648-3cfa1f407aad"). InnerVolumeSpecName "kube-api-access-p6j9b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:31:19 crc kubenswrapper[4722]: I0309 14:31:19.170111 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9213500f-da49-496b-b99e-1ec95658b48f-config-data-custom\") pod \"9213500f-da49-496b-b99e-1ec95658b48f\" (UID: \"9213500f-da49-496b-b99e-1ec95658b48f\") "
Mar 09 14:31:19 crc kubenswrapper[4722]: I0309 14:31:19.170526 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9213500f-da49-496b-b99e-1ec95658b48f-combined-ca-bundle\") pod \"9213500f-da49-496b-b99e-1ec95658b48f\" (UID: \"9213500f-da49-496b-b99e-1ec95658b48f\") "
Mar 09 14:31:19 crc kubenswrapper[4722]: I0309 14:31:19.170743 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9213500f-da49-496b-b99e-1ec95658b48f-internal-tls-certs\") pod \"9213500f-da49-496b-b99e-1ec95658b48f\" (UID: \"9213500f-da49-496b-b99e-1ec95658b48f\") "
Mar 09 14:31:19 crc kubenswrapper[4722]: I0309 14:31:19.171453 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9213500f-da49-496b-b99e-1ec95658b48f-public-tls-certs\") pod \"9213500f-da49-496b-b99e-1ec95658b48f\" (UID: \"9213500f-da49-496b-b99e-1ec95658b48f\") "
Mar 09 14:31:19 crc kubenswrapper[4722]: I0309 14:31:19.171518 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbwqz\" (UniqueName: \"kubernetes.io/projected/9213500f-da49-496b-b99e-1ec95658b48f-kube-api-access-wbwqz\") pod \"9213500f-da49-496b-b99e-1ec95658b48f\" (UID: \"9213500f-da49-496b-b99e-1ec95658b48f\") "
Mar 09 14:31:19 crc kubenswrapper[4722]: I0309 14:31:19.171620 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9213500f-da49-496b-b99e-1ec95658b48f-config-data\") pod \"9213500f-da49-496b-b99e-1ec95658b48f\" (UID: \"9213500f-da49-496b-b99e-1ec95658b48f\") "
Mar 09 14:31:19 crc kubenswrapper[4722]: I0309 14:31:19.172241 4722 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9cd4ad6c-298e-47c9-8648-3cfa1f407aad-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 09 14:31:19 crc kubenswrapper[4722]: I0309 14:31:19.172260 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6j9b\" (UniqueName: \"kubernetes.io/projected/9cd4ad6c-298e-47c9-8648-3cfa1f407aad-kube-api-access-p6j9b\") on node \"crc\" DevicePath \"\""
Mar 09 14:31:19 crc kubenswrapper[4722]: I0309 14:31:19.172351 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cd4ad6c-298e-47c9-8648-3cfa1f407aad-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "9cd4ad6c-298e-47c9-8648-3cfa1f407aad" (UID: "9cd4ad6c-298e-47c9-8648-3cfa1f407aad"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:31:19 crc kubenswrapper[4722]: I0309 14:31:19.178188 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9213500f-da49-496b-b99e-1ec95658b48f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "9213500f-da49-496b-b99e-1ec95658b48f" (UID: "9213500f-da49-496b-b99e-1ec95658b48f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:31:19 crc kubenswrapper[4722]: I0309 14:31:19.180677 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9213500f-da49-496b-b99e-1ec95658b48f-kube-api-access-wbwqz" (OuterVolumeSpecName: "kube-api-access-wbwqz") pod "9213500f-da49-496b-b99e-1ec95658b48f" (UID: "9213500f-da49-496b-b99e-1ec95658b48f"). InnerVolumeSpecName "kube-api-access-wbwqz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:31:19 crc kubenswrapper[4722]: I0309 14:31:19.194846 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cd4ad6c-298e-47c9-8648-3cfa1f407aad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9cd4ad6c-298e-47c9-8648-3cfa1f407aad" (UID: "9cd4ad6c-298e-47c9-8648-3cfa1f407aad"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:31:19 crc kubenswrapper[4722]: I0309 14:31:19.220264 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9213500f-da49-496b-b99e-1ec95658b48f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9213500f-da49-496b-b99e-1ec95658b48f" (UID: "9213500f-da49-496b-b99e-1ec95658b48f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:31:19 crc kubenswrapper[4722]: I0309 14:31:19.247448 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cd4ad6c-298e-47c9-8648-3cfa1f407aad-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "9cd4ad6c-298e-47c9-8648-3cfa1f407aad" (UID: "9cd4ad6c-298e-47c9-8648-3cfa1f407aad"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:31:19 crc kubenswrapper[4722]: I0309 14:31:19.255403 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9213500f-da49-496b-b99e-1ec95658b48f-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "9213500f-da49-496b-b99e-1ec95658b48f" (UID: "9213500f-da49-496b-b99e-1ec95658b48f"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:31:19 crc kubenswrapper[4722]: I0309 14:31:19.258299 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9213500f-da49-496b-b99e-1ec95658b48f-config-data" (OuterVolumeSpecName: "config-data") pod "9213500f-da49-496b-b99e-1ec95658b48f" (UID: "9213500f-da49-496b-b99e-1ec95658b48f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:31:19 crc kubenswrapper[4722]: I0309 14:31:19.272403 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cd4ad6c-298e-47c9-8648-3cfa1f407aad-config-data" (OuterVolumeSpecName: "config-data") pod "9cd4ad6c-298e-47c9-8648-3cfa1f407aad" (UID: "9cd4ad6c-298e-47c9-8648-3cfa1f407aad"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:31:19 crc kubenswrapper[4722]: I0309 14:31:19.274103 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wbwqz\" (UniqueName: \"kubernetes.io/projected/9213500f-da49-496b-b99e-1ec95658b48f-kube-api-access-wbwqz\") on node \"crc\" DevicePath \"\""
Mar 09 14:31:19 crc kubenswrapper[4722]: I0309 14:31:19.274125 4722 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cd4ad6c-298e-47c9-8648-3cfa1f407aad-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 09 14:31:19 crc kubenswrapper[4722]: I0309 14:31:19.274135 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9213500f-da49-496b-b99e-1ec95658b48f-config-data\") on node \"crc\" DevicePath \"\""
Mar 09 14:31:19 crc kubenswrapper[4722]: I0309 14:31:19.274145 4722 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9213500f-da49-496b-b99e-1ec95658b48f-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 09 14:31:19 crc kubenswrapper[4722]: I0309 14:31:19.274152 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9cd4ad6c-298e-47c9-8648-3cfa1f407aad-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 14:31:19 crc kubenswrapper[4722]: I0309 14:31:19.274160 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9cd4ad6c-298e-47c9-8648-3cfa1f407aad-config-data\") on node \"crc\" DevicePath \"\""
Mar 09 14:31:19 crc kubenswrapper[4722]: I0309 14:31:19.274168 4722 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9cd4ad6c-298e-47c9-8648-3cfa1f407aad-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 09 14:31:19 crc kubenswrapper[4722]: I0309 14:31:19.274177 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9213500f-da49-496b-b99e-1ec95658b48f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 14:31:19 crc kubenswrapper[4722]: I0309 14:31:19.274185 4722 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9213500f-da49-496b-b99e-1ec95658b48f-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 09 14:31:19 crc kubenswrapper[4722]: I0309 14:31:19.282966 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9213500f-da49-496b-b99e-1ec95658b48f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "9213500f-da49-496b-b99e-1ec95658b48f" (UID: "9213500f-da49-496b-b99e-1ec95658b48f"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:31:19 crc kubenswrapper[4722]: I0309 14:31:19.348372 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-ml5m4"
Mar 09 14:31:19 crc kubenswrapper[4722]: I0309 14:31:19.348450 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-ml5m4"
Mar 09 14:31:19 crc kubenswrapper[4722]: I0309 14:31:19.376295 4722 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9213500f-da49-496b-b99e-1ec95658b48f-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 09 14:31:19 crc kubenswrapper[4722]: I0309 14:31:19.410867 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-ml5m4"
Mar 09 14:31:19 crc kubenswrapper[4722]: I0309 14:31:19.461915 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"fabf84f5-0f35-4400-b612-235235a21f3c","Type":"ContainerStarted","Data":"aa10c897dacc74fa73423c321dbb7636fc428bf68942bf63df7a76c7719c43be"}
Mar 09 14:31:19 crc kubenswrapper[4722]: I0309 14:31:19.462125 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-2"
Mar 09 14:31:19 crc kubenswrapper[4722]: I0309 14:31:19.464417 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-6957fcb6b8-8jmt4"
Mar 09 14:31:19 crc kubenswrapper[4722]: I0309 14:31:19.465836 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6957fcb6b8-8jmt4" event={"ID":"9cd4ad6c-298e-47c9-8648-3cfa1f407aad","Type":"ContainerDied","Data":"14e1e2adbf020bfb97669fe703440cebd4aef6774ae12c997f83538f125d3769"}
Mar 09 14:31:19 crc kubenswrapper[4722]: I0309 14:31:19.465876 4722 scope.go:117] "RemoveContainer" containerID="7a6d6410a31756df86ae1256bbe1b8693b637feb0999a0f93b3883be294fab8b"
Mar 09 14:31:19 crc kubenswrapper[4722]: I0309 14:31:19.482070 4722 generic.go:334] "Generic (PLEG): container finished" podID="9213500f-da49-496b-b99e-1ec95658b48f" containerID="16ca1376d5e10d5ddff1520fbf35e02673d95dd3732d67bdc68c20e9647c98aa" exitCode=0
Mar 09 14:31:19 crc kubenswrapper[4722]: I0309 14:31:19.482240 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5c766df7b4-znr5j" event={"ID":"9213500f-da49-496b-b99e-1ec95658b48f","Type":"ContainerDied","Data":"16ca1376d5e10d5ddff1520fbf35e02673d95dd3732d67bdc68c20e9647c98aa"}
Mar 09 14:31:19 crc kubenswrapper[4722]: I0309 14:31:19.482288 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-5c766df7b4-znr5j" event={"ID":"9213500f-da49-496b-b99e-1ec95658b48f","Type":"ContainerDied","Data":"08b8f36076a9b9afbf23360ea871dec1d014a02f3bbc37147b10da79b806d583"}
Mar 09 14:31:19 crc kubenswrapper[4722]: I0309 14:31:19.483862 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-5c766df7b4-znr5j"
Mar 09 14:31:19 crc kubenswrapper[4722]: I0309 14:31:19.497559 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-2" podStartSLOduration=44.497537225 podStartE2EDuration="44.497537225s" podCreationTimestamp="2026-03-09 14:30:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:31:19.484590449 +0000 UTC m=+1720.040159025" watchObservedRunningTime="2026-03-09 14:31:19.497537225 +0000 UTC m=+1720.053105801"
Mar 09 14:31:19 crc kubenswrapper[4722]: I0309 14:31:19.536064 4722 scope.go:117] "RemoveContainer" containerID="16ca1376d5e10d5ddff1520fbf35e02673d95dd3732d67bdc68c20e9647c98aa"
Mar 09 14:31:19 crc kubenswrapper[4722]: I0309 14:31:19.556398 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-5c766df7b4-znr5j"]
Mar 09 14:31:19 crc kubenswrapper[4722]: I0309 14:31:19.568226 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-ml5m4"
Mar 09 14:31:19 crc kubenswrapper[4722]: I0309 14:31:19.571185 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-5c766df7b4-znr5j"]
Mar 09 14:31:19 crc kubenswrapper[4722]: I0309 14:31:19.582192 4722 scope.go:117] "RemoveContainer" containerID="16ca1376d5e10d5ddff1520fbf35e02673d95dd3732d67bdc68c20e9647c98aa"
Mar 09 14:31:19 crc kubenswrapper[4722]: E0309 14:31:19.582806 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16ca1376d5e10d5ddff1520fbf35e02673d95dd3732d67bdc68c20e9647c98aa\": container with ID starting with 16ca1376d5e10d5ddff1520fbf35e02673d95dd3732d67bdc68c20e9647c98aa not found: ID does not exist" containerID="16ca1376d5e10d5ddff1520fbf35e02673d95dd3732d67bdc68c20e9647c98aa"
Mar 09 14:31:19 crc kubenswrapper[4722]: I0309 14:31:19.582869 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16ca1376d5e10d5ddff1520fbf35e02673d95dd3732d67bdc68c20e9647c98aa"} err="failed to get container status \"16ca1376d5e10d5ddff1520fbf35e02673d95dd3732d67bdc68c20e9647c98aa\": rpc error: code = NotFound desc = could not find container \"16ca1376d5e10d5ddff1520fbf35e02673d95dd3732d67bdc68c20e9647c98aa\": container with ID starting with 16ca1376d5e10d5ddff1520fbf35e02673d95dd3732d67bdc68c20e9647c98aa not found: ID does not exist"
Mar 09 14:31:19 crc kubenswrapper[4722]: I0309 14:31:19.594510 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-6957fcb6b8-8jmt4"]
Mar 09 14:31:19 crc kubenswrapper[4722]: I0309 14:31:19.609625 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-6957fcb6b8-8jmt4"]
Mar 09 14:31:19 crc kubenswrapper[4722]: I0309 14:31:19.650106 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ml5m4"]
Mar 09 14:31:20 crc kubenswrapper[4722]: I0309 14:31:20.163087 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9213500f-da49-496b-b99e-1ec95658b48f" path="/var/lib/kubelet/pods/9213500f-da49-496b-b99e-1ec95658b48f/volumes"
Mar 09 14:31:20 crc kubenswrapper[4722]: I0309 14:31:20.164110 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9cd4ad6c-298e-47c9-8648-3cfa1f407aad" path="/var/lib/kubelet/pods/9cd4ad6c-298e-47c9-8648-3cfa1f407aad/volumes"
path="/var/lib/kubelet/pods/9cd4ad6c-298e-47c9-8648-3cfa1f407aad/volumes" Mar 09 14:31:20 crc kubenswrapper[4722]: I0309 14:31:20.496496 4722 generic.go:334] "Generic (PLEG): container finished" podID="8b6ee542-26e6-4126-8566-a34f7621d104" containerID="166693fcef0f0e5cf4fb139e3aa52ab299afb35da080f1e1a96788f921624861" exitCode=0 Mar 09 14:31:20 crc kubenswrapper[4722]: I0309 14:31:20.496600 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8b6ee542-26e6-4126-8566-a34f7621d104","Type":"ContainerDied","Data":"166693fcef0f0e5cf4fb139e3aa52ab299afb35da080f1e1a96788f921624861"} Mar 09 14:31:21 crc kubenswrapper[4722]: I0309 14:31:21.515054 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8b6ee542-26e6-4126-8566-a34f7621d104","Type":"ContainerStarted","Data":"dd4ed45b3954b071edc59a820a633fb028cc77e41d685f699a3c313f53223ebb"} Mar 09 14:31:21 crc kubenswrapper[4722]: I0309 14:31:21.515223 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-ml5m4" podUID="10d0cd54-d5b4-4b4a-ae3b-4ad5f0154500" containerName="registry-server" containerID="cri-o://b15b84a7f908329be743d8e4ca04d04202cb9f9ee93a41e038887a59debd406d" gracePeriod=2 Mar 09 14:31:21 crc kubenswrapper[4722]: I0309 14:31:21.515851 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 09 14:31:21 crc kubenswrapper[4722]: I0309 14:31:21.633083 4722 scope.go:117] "RemoveContainer" containerID="2fa001aeecbb2e4d0fbf27e36b5acac60963a831415d2c1be0bb7d705ff66e2e" Mar 09 14:31:21 crc kubenswrapper[4722]: I0309 14:31:21.741280 4722 scope.go:117] "RemoveContainer" containerID="f2225cc475fe79f373efd2edc2c53e8d89e8acca9f05facf3b97f00de7c31ca4" Mar 09 14:31:21 crc kubenswrapper[4722]: I0309 14:31:21.904084 4722 scope.go:117] "RemoveContainer" containerID="9914625d14d065d2c597a16fdbdc977022690005aca822fcfd2634b47fba082d" Mar 09 14:31:22 crc kubenswrapper[4722]: I0309 14:31:22.012885 4722 scope.go:117] "RemoveContainer" containerID="255b938bfd69ad600dd36a900edecc9cf5d23939501d8d23e0bda82feb284e0e" Mar 09 14:31:22 crc kubenswrapper[4722]: I0309 14:31:22.129662 4722 scope.go:117] "RemoveContainer" containerID="032f4d0bf04298bfee55256aec077d596bd22f1c59ec4dc749b6b9452d36eb42" Mar 09 14:31:22 crc kubenswrapper[4722]: I0309 14:31:22.283334 4722 scope.go:117] "RemoveContainer" containerID="f43ad056ea4fd9b6d874ac1b12dbe3d25d05f26348f873c90015860853bd08a4" Mar 09 14:31:22 crc kubenswrapper[4722]: I0309 14:31:22.341991 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ml5m4" Mar 09 14:31:22 crc kubenswrapper[4722]: I0309 14:31:22.361290 4722 scope.go:117] "RemoveContainer" containerID="4aa6764ae1c61b4084dcc5f0a098a411934735e696ad3102d63cf6771c505e59" Mar 09 14:31:22 crc kubenswrapper[4722]: I0309 14:31:22.367250 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=38.367233086 podStartE2EDuration="38.367233086s" podCreationTimestamp="2026-03-09 14:30:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:31:21.54701446 +0000 UTC m=+1722.102583046" watchObservedRunningTime="2026-03-09 14:31:22.367233086 +0000 UTC m=+1722.922801662" Mar 09 14:31:22 crc kubenswrapper[4722]: I0309 14:31:22.521724 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10d0cd54-d5b4-4b4a-ae3b-4ad5f0154500-catalog-content\") pod \"10d0cd54-d5b4-4b4a-ae3b-4ad5f0154500\" (UID: \"10d0cd54-d5b4-4b4a-ae3b-4ad5f0154500\") " Mar 09 14:31:22 crc kubenswrapper[4722]: I0309 14:31:22.521835 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10d0cd54-d5b4-4b4a-ae3b-4ad5f0154500-utilities\") pod \"10d0cd54-d5b4-4b4a-ae3b-4ad5f0154500\" (UID: \"10d0cd54-d5b4-4b4a-ae3b-4ad5f0154500\") " Mar 09 14:31:22 crc kubenswrapper[4722]: I0309 14:31:22.521871 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hlpx8\" (UniqueName: \"kubernetes.io/projected/10d0cd54-d5b4-4b4a-ae3b-4ad5f0154500-kube-api-access-hlpx8\") pod \"10d0cd54-d5b4-4b4a-ae3b-4ad5f0154500\" (UID: \"10d0cd54-d5b4-4b4a-ae3b-4ad5f0154500\") " Mar 09 14:31:22 crc kubenswrapper[4722]: I0309 14:31:22.522773 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10d0cd54-d5b4-4b4a-ae3b-4ad5f0154500-utilities" (OuterVolumeSpecName: "utilities") pod "10d0cd54-d5b4-4b4a-ae3b-4ad5f0154500" (UID: "10d0cd54-d5b4-4b4a-ae3b-4ad5f0154500"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:31:22 crc kubenswrapper[4722]: I0309 14:31:22.533519 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10d0cd54-d5b4-4b4a-ae3b-4ad5f0154500-kube-api-access-hlpx8" (OuterVolumeSpecName: "kube-api-access-hlpx8") pod "10d0cd54-d5b4-4b4a-ae3b-4ad5f0154500" (UID: "10d0cd54-d5b4-4b4a-ae3b-4ad5f0154500"). InnerVolumeSpecName "kube-api-access-hlpx8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:31:22 crc kubenswrapper[4722]: I0309 14:31:22.536420 4722 generic.go:334] "Generic (PLEG): container finished" podID="10d0cd54-d5b4-4b4a-ae3b-4ad5f0154500" containerID="b15b84a7f908329be743d8e4ca04d04202cb9f9ee93a41e038887a59debd406d" exitCode=0 Mar 09 14:31:22 crc kubenswrapper[4722]: I0309 14:31:22.536471 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ml5m4" event={"ID":"10d0cd54-d5b4-4b4a-ae3b-4ad5f0154500","Type":"ContainerDied","Data":"b15b84a7f908329be743d8e4ca04d04202cb9f9ee93a41e038887a59debd406d"} Mar 09 14:31:22 crc kubenswrapper[4722]: I0309 14:31:22.536497 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-ml5m4" event={"ID":"10d0cd54-d5b4-4b4a-ae3b-4ad5f0154500","Type":"ContainerDied","Data":"426a42be2b1c9937c335eaeb272b0c3c2378530be8e65ef979668739314b8d56"} Mar 09 14:31:22 crc kubenswrapper[4722]: I0309 14:31:22.536514 4722 scope.go:117] "RemoveContainer" containerID="b15b84a7f908329be743d8e4ca04d04202cb9f9ee93a41e038887a59debd406d" Mar 09 14:31:22 crc kubenswrapper[4722]: I0309 14:31:22.536655 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-ml5m4" Mar 09 14:31:22 crc kubenswrapper[4722]: I0309 14:31:22.562119 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10d0cd54-d5b4-4b4a-ae3b-4ad5f0154500-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "10d0cd54-d5b4-4b4a-ae3b-4ad5f0154500" (UID: "10d0cd54-d5b4-4b4a-ae3b-4ad5f0154500"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:31:22 crc kubenswrapper[4722]: I0309 14:31:22.601353 4722 scope.go:117] "RemoveContainer" containerID="57ccf516c4d79f8833b176942300148c4d7f8b7056bf9f6d6dd5d652971ba3ee" Mar 09 14:31:22 crc kubenswrapper[4722]: I0309 14:31:22.624920 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/10d0cd54-d5b4-4b4a-ae3b-4ad5f0154500-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 14:31:22 crc kubenswrapper[4722]: I0309 14:31:22.624955 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/10d0cd54-d5b4-4b4a-ae3b-4ad5f0154500-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 14:31:22 crc kubenswrapper[4722]: I0309 14:31:22.624965 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hlpx8\" (UniqueName: \"kubernetes.io/projected/10d0cd54-d5b4-4b4a-ae3b-4ad5f0154500-kube-api-access-hlpx8\") on node \"crc\" DevicePath \"\"" Mar 09 14:31:22 crc kubenswrapper[4722]: I0309 14:31:22.674108 4722 scope.go:117] "RemoveContainer" containerID="644ef690a5fed758daca5d89aa9cba20404323bb8bdd9a0dfaa8f28dea79074f" Mar 09 14:31:22 crc kubenswrapper[4722]: I0309 14:31:22.723243 4722 scope.go:117] "RemoveContainer" containerID="b15b84a7f908329be743d8e4ca04d04202cb9f9ee93a41e038887a59debd406d" Mar 09 14:31:22 crc kubenswrapper[4722]: E0309 14:31:22.723902 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b15b84a7f908329be743d8e4ca04d04202cb9f9ee93a41e038887a59debd406d\": container with ID starting with b15b84a7f908329be743d8e4ca04d04202cb9f9ee93a41e038887a59debd406d not found: ID does not exist" 
containerID="b15b84a7f908329be743d8e4ca04d04202cb9f9ee93a41e038887a59debd406d" Mar 09 14:31:22 crc kubenswrapper[4722]: I0309 14:31:22.723934 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b15b84a7f908329be743d8e4ca04d04202cb9f9ee93a41e038887a59debd406d"} err="failed to get container status \"b15b84a7f908329be743d8e4ca04d04202cb9f9ee93a41e038887a59debd406d\": rpc error: code = NotFound desc = could not find container \"b15b84a7f908329be743d8e4ca04d04202cb9f9ee93a41e038887a59debd406d\": container with ID starting with b15b84a7f908329be743d8e4ca04d04202cb9f9ee93a41e038887a59debd406d not found: ID does not exist" Mar 09 14:31:22 crc kubenswrapper[4722]: I0309 14:31:22.723954 4722 scope.go:117] "RemoveContainer" containerID="57ccf516c4d79f8833b176942300148c4d7f8b7056bf9f6d6dd5d652971ba3ee" Mar 09 14:31:22 crc kubenswrapper[4722]: E0309 14:31:22.724412 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57ccf516c4d79f8833b176942300148c4d7f8b7056bf9f6d6dd5d652971ba3ee\": container with ID starting with 57ccf516c4d79f8833b176942300148c4d7f8b7056bf9f6d6dd5d652971ba3ee not found: ID does not exist" containerID="57ccf516c4d79f8833b176942300148c4d7f8b7056bf9f6d6dd5d652971ba3ee" Mar 09 14:31:22 crc kubenswrapper[4722]: I0309 14:31:22.724458 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57ccf516c4d79f8833b176942300148c4d7f8b7056bf9f6d6dd5d652971ba3ee"} err="failed to get container status \"57ccf516c4d79f8833b176942300148c4d7f8b7056bf9f6d6dd5d652971ba3ee\": rpc error: code = NotFound desc = could not find container \"57ccf516c4d79f8833b176942300148c4d7f8b7056bf9f6d6dd5d652971ba3ee\": container with ID starting with 57ccf516c4d79f8833b176942300148c4d7f8b7056bf9f6d6dd5d652971ba3ee not found: ID does not exist" Mar 09 14:31:22 crc kubenswrapper[4722]: I0309 14:31:22.724487 4722 scope.go:117] "RemoveContainer" containerID="644ef690a5fed758daca5d89aa9cba20404323bb8bdd9a0dfaa8f28dea79074f" Mar 09 14:31:22 crc kubenswrapper[4722]: E0309 14:31:22.725314 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"644ef690a5fed758daca5d89aa9cba20404323bb8bdd9a0dfaa8f28dea79074f\": container with ID starting with 644ef690a5fed758daca5d89aa9cba20404323bb8bdd9a0dfaa8f28dea79074f not found: ID does not exist" containerID="644ef690a5fed758daca5d89aa9cba20404323bb8bdd9a0dfaa8f28dea79074f" Mar 09 14:31:22 crc kubenswrapper[4722]: I0309 14:31:22.725660 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"644ef690a5fed758daca5d89aa9cba20404323bb8bdd9a0dfaa8f28dea79074f"} err="failed to get container status \"644ef690a5fed758daca5d89aa9cba20404323bb8bdd9a0dfaa8f28dea79074f\": rpc error: code = NotFound desc = could not find container \"644ef690a5fed758daca5d89aa9cba20404323bb8bdd9a0dfaa8f28dea79074f\": container with ID starting with 644ef690a5fed758daca5d89aa9cba20404323bb8bdd9a0dfaa8f28dea79074f not found: ID does not exist" Mar 09 14:31:22 crc kubenswrapper[4722]: I0309 14:31:22.765966 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-775d545446-nrcd2" Mar 09 14:31:22 crc kubenswrapper[4722]: I0309 14:31:22.861075 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-678fb995f7-q5bfj"] Mar 09 14:31:22 crc kubenswrapper[4722]: I0309 14:31:22.863186 
4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-engine-678fb995f7-q5bfj" podUID="dfb53424-4444-464a-9be1-97e2e095c496" containerName="heat-engine" containerID="cri-o://d13db23d16ba06dabebfe79eacf52b9d98971ac551281b501b27011139739a15" gracePeriod=60 Mar 09 14:31:22 crc kubenswrapper[4722]: I0309 14:31:22.928795 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-ml5m4"] Mar 09 14:31:22 crc kubenswrapper[4722]: I0309 14:31:22.951316 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-ml5m4"] Mar 09 14:31:23 crc kubenswrapper[4722]: I0309 14:31:23.149635 4722 scope.go:117] "RemoveContainer" containerID="86a6dcdf7dcf4d48b8bdfb71d9a2c30b89f235700c424312c7e16580a86021d6" Mar 09 14:31:23 crc kubenswrapper[4722]: E0309 14:31:23.149926 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 14:31:24 crc kubenswrapper[4722]: I0309 14:31:24.165053 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10d0cd54-d5b4-4b4a-ae3b-4ad5f0154500" path="/var/lib/kubelet/pods/10d0cd54-d5b4-4b4a-ae3b-4ad5f0154500/volumes" Mar 09 14:31:24 crc kubenswrapper[4722]: I0309 14:31:24.682221 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-rct52"] Mar 09 14:31:24 crc kubenswrapper[4722]: E0309 14:31:24.682943 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cd4ad6c-298e-47c9-8648-3cfa1f407aad" containerName="heat-api" Mar 09 14:31:24 crc kubenswrapper[4722]: I0309 14:31:24.682968 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cd4ad6c-298e-47c9-8648-3cfa1f407aad" containerName="heat-api" Mar 09 14:31:24 crc kubenswrapper[4722]: E0309 14:31:24.683006 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10d0cd54-d5b4-4b4a-ae3b-4ad5f0154500" containerName="registry-server" Mar 09 14:31:24 crc kubenswrapper[4722]: I0309 14:31:24.683015 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="10d0cd54-d5b4-4b4a-ae3b-4ad5f0154500" containerName="registry-server" Mar 09 14:31:24 crc kubenswrapper[4722]: E0309 14:31:24.683046 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10d0cd54-d5b4-4b4a-ae3b-4ad5f0154500" containerName="extract-content" Mar 09 14:31:24 crc kubenswrapper[4722]: I0309 14:31:24.683055 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="10d0cd54-d5b4-4b4a-ae3b-4ad5f0154500" containerName="extract-content" Mar 09 14:31:24 crc kubenswrapper[4722]: E0309 14:31:24.683071 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10d0cd54-d5b4-4b4a-ae3b-4ad5f0154500" containerName="extract-utilities" Mar 09 14:31:24 crc kubenswrapper[4722]: I0309 14:31:24.683079 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="10d0cd54-d5b4-4b4a-ae3b-4ad5f0154500" containerName="extract-utilities" Mar 09 14:31:24 crc kubenswrapper[4722]: E0309 14:31:24.683095 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9213500f-da49-496b-b99e-1ec95658b48f" containerName="heat-cfnapi" Mar 09 14:31:24 crc 
kubenswrapper[4722]: I0309 14:31:24.683102 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="9213500f-da49-496b-b99e-1ec95658b48f" containerName="heat-cfnapi" Mar 09 14:31:24 crc kubenswrapper[4722]: I0309 14:31:24.683369 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cd4ad6c-298e-47c9-8648-3cfa1f407aad" containerName="heat-api" Mar 09 14:31:24 crc kubenswrapper[4722]: I0309 14:31:24.683391 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="9213500f-da49-496b-b99e-1ec95658b48f" containerName="heat-cfnapi" Mar 09 14:31:24 crc kubenswrapper[4722]: I0309 14:31:24.683399 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="10d0cd54-d5b4-4b4a-ae3b-4ad5f0154500" containerName="registry-server" Mar 09 14:31:24 crc kubenswrapper[4722]: I0309 14:31:24.684361 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-rct52" Mar 09 14:31:24 crc kubenswrapper[4722]: I0309 14:31:24.688720 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dwlbg" Mar 09 14:31:24 crc kubenswrapper[4722]: I0309 14:31:24.692766 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 14:31:24 crc kubenswrapper[4722]: I0309 14:31:24.692848 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 14:31:24 crc kubenswrapper[4722]: I0309 14:31:24.707218 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 14:31:24 crc kubenswrapper[4722]: I0309 14:31:24.712522 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-rct52"] Mar 09 14:31:24 crc kubenswrapper[4722]: I0309 14:31:24.774592 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5tgb\" (UniqueName: \"kubernetes.io/projected/5f8120f5-690a-4bb4-ba23-dead16d6946f-kube-api-access-r5tgb\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-rct52\" (UID: \"5f8120f5-690a-4bb4-ba23-dead16d6946f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-rct52" Mar 09 14:31:24 crc kubenswrapper[4722]: I0309 14:31:24.774665 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f8120f5-690a-4bb4-ba23-dead16d6946f-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-rct52\" (UID: \"5f8120f5-690a-4bb4-ba23-dead16d6946f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-rct52" Mar 09 14:31:24 crc kubenswrapper[4722]: I0309 14:31:24.774820 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5f8120f5-690a-4bb4-ba23-dead16d6946f-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-rct52\" (UID: \"5f8120f5-690a-4bb4-ba23-dead16d6946f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-rct52" Mar 09 14:31:24 crc kubenswrapper[4722]: I0309 14:31:24.774877 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/5f8120f5-690a-4bb4-ba23-dead16d6946f-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-rct52\" (UID: \"5f8120f5-690a-4bb4-ba23-dead16d6946f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-rct52" Mar 09 14:31:24 crc kubenswrapper[4722]: I0309 14:31:24.877889 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5f8120f5-690a-4bb4-ba23-dead16d6946f-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-rct52\" (UID: \"5f8120f5-690a-4bb4-ba23-dead16d6946f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-rct52" Mar 09 14:31:24 crc kubenswrapper[4722]: I0309 14:31:24.878342 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f8120f5-690a-4bb4-ba23-dead16d6946f-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-rct52\" (UID: \"5f8120f5-690a-4bb4-ba23-dead16d6946f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-rct52" Mar 09 14:31:24 crc kubenswrapper[4722]: I0309 14:31:24.878387 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5tgb\" (UniqueName: \"kubernetes.io/projected/5f8120f5-690a-4bb4-ba23-dead16d6946f-kube-api-access-r5tgb\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-rct52\" (UID: \"5f8120f5-690a-4bb4-ba23-dead16d6946f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-rct52" Mar 09 14:31:24 crc kubenswrapper[4722]: I0309 14:31:24.878447 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f8120f5-690a-4bb4-ba23-dead16d6946f-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-rct52\" (UID: \"5f8120f5-690a-4bb4-ba23-dead16d6946f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-rct52" Mar 09 14:31:24 crc kubenswrapper[4722]: I0309 14:31:24.883749 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f8120f5-690a-4bb4-ba23-dead16d6946f-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-rct52\" (UID: \"5f8120f5-690a-4bb4-ba23-dead16d6946f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-rct52" Mar 09 14:31:24 crc kubenswrapper[4722]: I0309 14:31:24.884524 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f8120f5-690a-4bb4-ba23-dead16d6946f-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-rct52\" (UID: \"5f8120f5-690a-4bb4-ba23-dead16d6946f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-rct52" Mar 09 14:31:24 crc kubenswrapper[4722]: I0309 14:31:24.895589 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5tgb\" (UniqueName: \"kubernetes.io/projected/5f8120f5-690a-4bb4-ba23-dead16d6946f-kube-api-access-r5tgb\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-rct52\" (UID: \"5f8120f5-690a-4bb4-ba23-dead16d6946f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-rct52" Mar 09 14:31:24 crc kubenswrapper[4722]: I0309 14:31:24.895871 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/5f8120f5-690a-4bb4-ba23-dead16d6946f-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-rct52\" (UID: \"5f8120f5-690a-4bb4-ba23-dead16d6946f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-rct52" Mar 09 14:31:25 crc kubenswrapper[4722]: I0309 14:31:25.003985 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-rct52" Mar 09 14:31:26 crc kubenswrapper[4722]: I0309 14:31:25.999936 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-rct52"] Mar 09 14:31:26 crc kubenswrapper[4722]: W0309 14:31:26.010339 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f8120f5_690a_4bb4_ba23_dead16d6946f.slice/crio-fa86e54ae662b8ada509c7012ce7a0bbe3973aedb3398e2fbcaa18191a085013 WatchSource:0}: Error finding container fa86e54ae662b8ada509c7012ce7a0bbe3973aedb3398e2fbcaa18191a085013: Status 404 returned error can't find the container with id fa86e54ae662b8ada509c7012ce7a0bbe3973aedb3398e2fbcaa18191a085013 Mar 09 14:31:26 crc kubenswrapper[4722]: I0309 14:31:26.641582 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-rct52" event={"ID":"5f8120f5-690a-4bb4-ba23-dead16d6946f","Type":"ContainerStarted","Data":"fa86e54ae662b8ada509c7012ce7a0bbe3973aedb3398e2fbcaa18191a085013"} Mar 09 14:31:27 crc kubenswrapper[4722]: E0309 14:31:27.229960 4722 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d13db23d16ba06dabebfe79eacf52b9d98971ac551281b501b27011139739a15" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 09 14:31:27 crc kubenswrapper[4722]: E0309 14:31:27.232512 4722 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d13db23d16ba06dabebfe79eacf52b9d98971ac551281b501b27011139739a15" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 09 14:31:27 crc kubenswrapper[4722]: E0309 14:31:27.233758 4722 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d13db23d16ba06dabebfe79eacf52b9d98971ac551281b501b27011139739a15" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 09 14:31:27 crc kubenswrapper[4722]: E0309 14:31:27.233826 4722 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-678fb995f7-q5bfj" podUID="dfb53424-4444-464a-9be1-97e2e095c496" containerName="heat-engine" Mar 09 14:31:27 crc kubenswrapper[4722]: I0309 14:31:27.328158 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-vl8jp"] Mar 09 14:31:27 crc kubenswrapper[4722]: I0309 14:31:27.339464 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-vl8jp"] Mar 09 14:31:27 crc kubenswrapper[4722]: I0309 14:31:27.487278 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-qn7dx"] 
Mar 09 14:31:27 crc kubenswrapper[4722]: I0309 14:31:27.488946 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-qn7dx" Mar 09 14:31:27 crc kubenswrapper[4722]: I0309 14:31:27.492246 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 09 14:31:27 crc kubenswrapper[4722]: I0309 14:31:27.534170 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-qn7dx"] Mar 09 14:31:27 crc kubenswrapper[4722]: I0309 14:31:27.550655 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40e6bb4d-411c-43c2-9959-0d1b9e005a11-scripts\") pod \"aodh-db-sync-qn7dx\" (UID: \"40e6bb4d-411c-43c2-9959-0d1b9e005a11\") " pod="openstack/aodh-db-sync-qn7dx" Mar 09 14:31:27 crc kubenswrapper[4722]: I0309 14:31:27.550722 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40e6bb4d-411c-43c2-9959-0d1b9e005a11-config-data\") pod \"aodh-db-sync-qn7dx\" (UID: \"40e6bb4d-411c-43c2-9959-0d1b9e005a11\") " pod="openstack/aodh-db-sync-qn7dx" Mar 09 14:31:27 crc kubenswrapper[4722]: I0309 14:31:27.550765 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmlbm\" (UniqueName: \"kubernetes.io/projected/40e6bb4d-411c-43c2-9959-0d1b9e005a11-kube-api-access-jmlbm\") pod \"aodh-db-sync-qn7dx\" (UID: \"40e6bb4d-411c-43c2-9959-0d1b9e005a11\") " pod="openstack/aodh-db-sync-qn7dx" Mar 09 14:31:27 crc kubenswrapper[4722]: I0309 14:31:27.550798 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40e6bb4d-411c-43c2-9959-0d1b9e005a11-combined-ca-bundle\") pod \"aodh-db-sync-qn7dx\" (UID: \"40e6bb4d-411c-43c2-9959-0d1b9e005a11\") " pod="openstack/aodh-db-sync-qn7dx" Mar 09 14:31:27 crc kubenswrapper[4722]: I0309 14:31:27.653105 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40e6bb4d-411c-43c2-9959-0d1b9e005a11-config-data\") pod \"aodh-db-sync-qn7dx\" (UID: \"40e6bb4d-411c-43c2-9959-0d1b9e005a11\") " pod="openstack/aodh-db-sync-qn7dx" Mar 09 14:31:27 crc kubenswrapper[4722]: I0309 14:31:27.653204 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmlbm\" (UniqueName: \"kubernetes.io/projected/40e6bb4d-411c-43c2-9959-0d1b9e005a11-kube-api-access-jmlbm\") pod \"aodh-db-sync-qn7dx\" (UID: \"40e6bb4d-411c-43c2-9959-0d1b9e005a11\") " pod="openstack/aodh-db-sync-qn7dx" Mar 09 14:31:27 crc kubenswrapper[4722]: I0309 14:31:27.653273 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40e6bb4d-411c-43c2-9959-0d1b9e005a11-combined-ca-bundle\") pod \"aodh-db-sync-qn7dx\" (UID: \"40e6bb4d-411c-43c2-9959-0d1b9e005a11\") " pod="openstack/aodh-db-sync-qn7dx" Mar 09 14:31:27 crc kubenswrapper[4722]: I0309 14:31:27.653531 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40e6bb4d-411c-43c2-9959-0d1b9e005a11-scripts\") pod \"aodh-db-sync-qn7dx\" (UID: \"40e6bb4d-411c-43c2-9959-0d1b9e005a11\") " pod="openstack/aodh-db-sync-qn7dx" Mar 09 14:31:27 crc kubenswrapper[4722]: I0309 
14:31:27.669178 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40e6bb4d-411c-43c2-9959-0d1b9e005a11-scripts\") pod \"aodh-db-sync-qn7dx\" (UID: \"40e6bb4d-411c-43c2-9959-0d1b9e005a11\") " pod="openstack/aodh-db-sync-qn7dx" Mar 09 14:31:27 crc kubenswrapper[4722]: I0309 14:31:27.672204 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40e6bb4d-411c-43c2-9959-0d1b9e005a11-config-data\") pod \"aodh-db-sync-qn7dx\" (UID: \"40e6bb4d-411c-43c2-9959-0d1b9e005a11\") " pod="openstack/aodh-db-sync-qn7dx" Mar 09 14:31:27 crc kubenswrapper[4722]: I0309 14:31:27.700805 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40e6bb4d-411c-43c2-9959-0d1b9e005a11-combined-ca-bundle\") pod \"aodh-db-sync-qn7dx\" (UID: \"40e6bb4d-411c-43c2-9959-0d1b9e005a11\") " pod="openstack/aodh-db-sync-qn7dx" Mar 09 14:31:27 crc kubenswrapper[4722]: I0309 14:31:27.705179 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmlbm\" (UniqueName: \"kubernetes.io/projected/40e6bb4d-411c-43c2-9959-0d1b9e005a11-kube-api-access-jmlbm\") pod \"aodh-db-sync-qn7dx\" (UID: \"40e6bb4d-411c-43c2-9959-0d1b9e005a11\") " pod="openstack/aodh-db-sync-qn7dx" Mar 09 14:31:27 crc kubenswrapper[4722]: I0309 14:31:27.850777 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-qn7dx" Mar 09 14:31:28 crc kubenswrapper[4722]: I0309 14:31:28.194766 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c58faa7-5e6b-4140-9859-a4729c6354d9" path="/var/lib/kubelet/pods/1c58faa7-5e6b-4140-9859-a4729c6354d9/volumes" Mar 09 14:31:28 crc kubenswrapper[4722]: I0309 14:31:28.487468 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-qn7dx"] Mar 09 14:31:28 crc kubenswrapper[4722]: I0309 14:31:28.698787 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-qn7dx" event={"ID":"40e6bb4d-411c-43c2-9959-0d1b9e005a11","Type":"ContainerStarted","Data":"c79b0b528bda7fe57228cfb11ed32304aa1de3a658e529ef733181f6399562a3"} Mar 09 14:31:31 crc kubenswrapper[4722]: I0309 14:31:31.165235 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rvtqn"] Mar 09 14:31:31 crc kubenswrapper[4722]: I0309 14:31:31.168932 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rvtqn" Mar 09 14:31:31 crc kubenswrapper[4722]: I0309 14:31:31.253647 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rvtqn"] Mar 09 14:31:31 crc kubenswrapper[4722]: I0309 14:31:31.276857 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jdtn\" (UniqueName: \"kubernetes.io/projected/3411289f-3e7c-4e43-b545-5e612822b18e-kube-api-access-8jdtn\") pod \"community-operators-rvtqn\" (UID: \"3411289f-3e7c-4e43-b545-5e612822b18e\") " pod="openshift-marketplace/community-operators-rvtqn" Mar 09 14:31:31 crc kubenswrapper[4722]: I0309 14:31:31.277038 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3411289f-3e7c-4e43-b545-5e612822b18e-catalog-content\") pod \"community-operators-rvtqn\" (UID: \"3411289f-3e7c-4e43-b545-5e612822b18e\") " pod="openshift-marketplace/community-operators-rvtqn" Mar 09 14:31:31 crc kubenswrapper[4722]: I0309 14:31:31.277106 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3411289f-3e7c-4e43-b545-5e612822b18e-utilities\") pod \"community-operators-rvtqn\" (UID: \"3411289f-3e7c-4e43-b545-5e612822b18e\") " pod="openshift-marketplace/community-operators-rvtqn" Mar 09 14:31:31 crc kubenswrapper[4722]: I0309 14:31:31.379082 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jdtn\" (UniqueName: \"kubernetes.io/projected/3411289f-3e7c-4e43-b545-5e612822b18e-kube-api-access-8jdtn\") pod \"community-operators-rvtqn\" (UID: \"3411289f-3e7c-4e43-b545-5e612822b18e\") " pod="openshift-marketplace/community-operators-rvtqn" Mar 09 14:31:31 crc kubenswrapper[4722]: I0309 14:31:31.379210 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3411289f-3e7c-4e43-b545-5e612822b18e-catalog-content\") pod \"community-operators-rvtqn\" (UID: \"3411289f-3e7c-4e43-b545-5e612822b18e\") " pod="openshift-marketplace/community-operators-rvtqn" Mar 09 14:31:31 crc kubenswrapper[4722]: I0309 14:31:31.379274 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3411289f-3e7c-4e43-b545-5e612822b18e-utilities\") pod \"community-operators-rvtqn\" (UID: \"3411289f-3e7c-4e43-b545-5e612822b18e\") " pod="openshift-marketplace/community-operators-rvtqn" Mar 09 14:31:31 crc kubenswrapper[4722]: I0309 14:31:31.380559 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3411289f-3e7c-4e43-b545-5e612822b18e-utilities\") pod \"community-operators-rvtqn\" (UID: \"3411289f-3e7c-4e43-b545-5e612822b18e\") " pod="openshift-marketplace/community-operators-rvtqn" Mar 09 14:31:31 crc kubenswrapper[4722]: I0309 14:31:31.381663 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3411289f-3e7c-4e43-b545-5e612822b18e-catalog-content\") pod \"community-operators-rvtqn\" (UID: \"3411289f-3e7c-4e43-b545-5e612822b18e\") " pod="openshift-marketplace/community-operators-rvtqn" Mar 09 14:31:31 crc kubenswrapper[4722]: I0309 14:31:31.425174 4722 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-8jdtn\" (UniqueName: \"kubernetes.io/projected/3411289f-3e7c-4e43-b545-5e612822b18e-kube-api-access-8jdtn\") pod \"community-operators-rvtqn\" (UID: \"3411289f-3e7c-4e43-b545-5e612822b18e\") " pod="openshift-marketplace/community-operators-rvtqn" Mar 09 14:31:31 crc kubenswrapper[4722]: I0309 14:31:31.547180 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rvtqn" Mar 09 14:31:34 crc kubenswrapper[4722]: I0309 14:31:34.818571 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 09 14:31:35 crc kubenswrapper[4722]: I0309 14:31:35.150016 4722 scope.go:117] "RemoveContainer" containerID="86a6dcdf7dcf4d48b8bdfb71d9a2c30b89f235700c424312c7e16580a86021d6" Mar 09 14:31:35 crc kubenswrapper[4722]: E0309 14:31:35.150695 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 14:31:36 crc kubenswrapper[4722]: I0309 14:31:36.144710 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-2" Mar 09 14:31:36 crc kubenswrapper[4722]: I0309 14:31:36.288646 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-1"] Mar 09 14:31:37 crc kubenswrapper[4722]: E0309 14:31:37.227513 4722 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d13db23d16ba06dabebfe79eacf52b9d98971ac551281b501b27011139739a15" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 09 14:31:37 crc kubenswrapper[4722]: E0309 14:31:37.229942 4722 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d13db23d16ba06dabebfe79eacf52b9d98971ac551281b501b27011139739a15" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 09 14:31:37 crc kubenswrapper[4722]: E0309 14:31:37.232489 4722 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d13db23d16ba06dabebfe79eacf52b9d98971ac551281b501b27011139739a15" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 09 14:31:37 crc kubenswrapper[4722]: E0309 14:31:37.232528 4722 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-678fb995f7-q5bfj" podUID="dfb53424-4444-464a-9be1-97e2e095c496" containerName="heat-engine" Mar 09 14:31:41 crc kubenswrapper[4722]: I0309 14:31:41.119744 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-1" podUID="c6ada086-becc-4f4a-a0a0-0aad894dc550" containerName="rabbitmq" containerID="cri-o://0cdebc0d2406fceecb9b5fd85c03284427aaea6f5c3f1ce6e67e86ed2f14dfed" 
gracePeriod=604796 Mar 09 14:31:42 crc kubenswrapper[4722]: E0309 14:31:42.748931 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest" Mar 09 14:31:42 crc kubenswrapper[4722]: E0309 14:31:42.749110 4722 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 09 14:31:42 crc kubenswrapper[4722]: container &Container{Name:repo-setup-edpm-deployment-openstack-edpm-ipam,Image:quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest,Command:[],Args:[ansible-runner run /runner -p playbook.yaml -i repo-setup-edpm-deployment-openstack-edpm-ipam],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ANSIBLE_VERBOSITY,Value:2,ValueFrom:nil,},EnvVar{Name:RUNNER_PLAYBOOK,Value: Mar 09 14:31:42 crc kubenswrapper[4722]: - hosts: all Mar 09 14:31:42 crc kubenswrapper[4722]: strategy: linear Mar 09 14:31:42 crc kubenswrapper[4722]: tasks: Mar 09 14:31:42 crc kubenswrapper[4722]: - name: Enable podified-repos Mar 09 14:31:42 crc kubenswrapper[4722]: become: true Mar 09 14:31:42 crc kubenswrapper[4722]: ansible.builtin.shell: | Mar 09 14:31:42 crc kubenswrapper[4722]: set -euxo pipefail Mar 09 14:31:42 crc kubenswrapper[4722]: pushd /var/tmp Mar 09 14:31:42 crc kubenswrapper[4722]: curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz Mar 09 14:31:42 crc kubenswrapper[4722]: pushd repo-setup-main Mar 09 14:31:42 crc kubenswrapper[4722]: python3 -m venv ./venv Mar 09 14:31:42 crc kubenswrapper[4722]: PBR_VERSION=0.0.0 ./venv/bin/pip install ./ Mar 09 14:31:42 crc kubenswrapper[4722]: ./venv/bin/repo-setup current-podified -b antelope Mar 09 14:31:42 crc kubenswrapper[4722]: popd Mar 09 14:31:42 crc kubenswrapper[4722]: rm -rf repo-setup-main Mar 09 14:31:42 crc kubenswrapper[4722]: Mar 09 14:31:42 crc kubenswrapper[4722]: Mar 09 14:31:42 crc kubenswrapper[4722]: ,ValueFrom:nil,},EnvVar{Name:RUNNER_EXTRA_VARS,Value: Mar 09 14:31:42 crc kubenswrapper[4722]: edpm_override_hosts: openstack-edpm-ipam Mar 09 14:31:42 crc kubenswrapper[4722]: edpm_service_type: repo-setup Mar 09 14:31:42 crc kubenswrapper[4722]: Mar 09 14:31:42 crc kubenswrapper[4722]: Mar 09 14:31:42 crc kubenswrapper[4722]: 
,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:repo-setup-combined-ca-bundle,ReadOnly:false,MountPath:/var/lib/openstack/cacerts/repo-setup,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key-openstack-edpm-ipam,ReadOnly:false,MountPath:/runner/env/ssh_key/ssh_key_openstack-edpm-ipam,SubPath:ssh_key_openstack-edpm-ipam,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:inventory,ReadOnly:false,MountPath:/runner/inventory/hosts,SubPath:inventory,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r5tgb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:openstack-aee-default-env,},Optional:*true,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod repo-setup-edpm-deployment-openstack-edpm-ipam-rct52_openstack(5f8120f5-690a-4bb4-ba23-dead16d6946f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled Mar 09 14:31:42 crc kubenswrapper[4722]: > logger="UnhandledError" Mar 09 14:31:42 crc kubenswrapper[4722]: E0309 14:31:42.750299 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"repo-setup-edpm-deployment-openstack-edpm-ipam\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-rct52" podUID="5f8120f5-690a-4bb4-ba23-dead16d6946f" Mar 09 14:31:42 crc kubenswrapper[4722]: I0309 14:31:42.850977 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 09 14:31:42 crc kubenswrapper[4722]: E0309 14:31:42.937601 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"repo-setup-edpm-deployment-openstack-edpm-ipam\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest\\\"\"" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-rct52" podUID="5f8120f5-690a-4bb4-ba23-dead16d6946f" Mar 09 14:31:43 crc kubenswrapper[4722]: I0309 14:31:43.385314 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rvtqn"] Mar 09 14:31:43 crc kubenswrapper[4722]: I0309 14:31:43.953762 4722 generic.go:334] "Generic (PLEG): container finished" podID="3411289f-3e7c-4e43-b545-5e612822b18e" containerID="e3f827c375c78b19fe91de16a29fab6e26e437a529663a3ab97b7c2d7119faa5" exitCode=0 Mar 09 14:31:43 crc kubenswrapper[4722]: I0309 14:31:43.953856 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rvtqn" 
event={"ID":"3411289f-3e7c-4e43-b545-5e612822b18e","Type":"ContainerDied","Data":"e3f827c375c78b19fe91de16a29fab6e26e437a529663a3ab97b7c2d7119faa5"} Mar 09 14:31:43 crc kubenswrapper[4722]: I0309 14:31:43.954231 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rvtqn" event={"ID":"3411289f-3e7c-4e43-b545-5e612822b18e","Type":"ContainerStarted","Data":"6fe82e28d9d814c5fe482ef20bb88c9bd26991c22f4e2503abc1819301fd3579"} Mar 09 14:31:43 crc kubenswrapper[4722]: I0309 14:31:43.956784 4722 generic.go:334] "Generic (PLEG): container finished" podID="dfb53424-4444-464a-9be1-97e2e095c496" containerID="d13db23d16ba06dabebfe79eacf52b9d98971ac551281b501b27011139739a15" exitCode=0 Mar 09 14:31:43 crc kubenswrapper[4722]: I0309 14:31:43.956852 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-678fb995f7-q5bfj" event={"ID":"dfb53424-4444-464a-9be1-97e2e095c496","Type":"ContainerDied","Data":"d13db23d16ba06dabebfe79eacf52b9d98971ac551281b501b27011139739a15"} Mar 09 14:31:43 crc kubenswrapper[4722]: I0309 14:31:43.958976 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-qn7dx" event={"ID":"40e6bb4d-411c-43c2-9959-0d1b9e005a11","Type":"ContainerStarted","Data":"e0d21535954f2715f616aa2539fb3b03570f619c92ebfbd782a330836605d26a"} Mar 09 14:31:43 crc kubenswrapper[4722]: E0309 14:31:43.979420 4722 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3411289f_3e7c_4e43_b545_5e612822b18e.slice/crio-conmon-e3f827c375c78b19fe91de16a29fab6e26e437a529663a3ab97b7c2d7119faa5.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3411289f_3e7c_4e43_b545_5e612822b18e.slice/crio-e3f827c375c78b19fe91de16a29fab6e26e437a529663a3ab97b7c2d7119faa5.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddfb53424_4444_464a_9be1_97e2e095c496.slice/crio-conmon-d13db23d16ba06dabebfe79eacf52b9d98971ac551281b501b27011139739a15.scope\": RecentStats: unable to find data in memory cache]" Mar 09 14:31:44 crc kubenswrapper[4722]: I0309 14:31:44.003110 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-qn7dx" podStartSLOduration=2.669003927 podStartE2EDuration="17.003084241s" podCreationTimestamp="2026-03-09 14:31:27 +0000 UTC" firstStartedPulling="2026-03-09 14:31:28.513346677 +0000 UTC m=+1729.068915243" lastFinishedPulling="2026-03-09 14:31:42.847426981 +0000 UTC m=+1743.402995557" observedRunningTime="2026-03-09 14:31:43.994745581 +0000 UTC m=+1744.550314167" watchObservedRunningTime="2026-03-09 14:31:44.003084241 +0000 UTC m=+1744.558652817" Mar 09 14:31:44 crc kubenswrapper[4722]: I0309 14:31:44.801504 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-678fb995f7-q5bfj" Mar 09 14:31:44 crc kubenswrapper[4722]: I0309 14:31:44.961999 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4pt2x\" (UniqueName: \"kubernetes.io/projected/dfb53424-4444-464a-9be1-97e2e095c496-kube-api-access-4pt2x\") pod \"dfb53424-4444-464a-9be1-97e2e095c496\" (UID: \"dfb53424-4444-464a-9be1-97e2e095c496\") " Mar 09 14:31:44 crc kubenswrapper[4722]: I0309 14:31:44.962407 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dfb53424-4444-464a-9be1-97e2e095c496-config-data-custom\") pod \"dfb53424-4444-464a-9be1-97e2e095c496\" (UID: \"dfb53424-4444-464a-9be1-97e2e095c496\") " Mar 09 14:31:44 crc kubenswrapper[4722]: I0309 14:31:44.962522 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfb53424-4444-464a-9be1-97e2e095c496-config-data\") pod \"dfb53424-4444-464a-9be1-97e2e095c496\" (UID: \"dfb53424-4444-464a-9be1-97e2e095c496\") " Mar 09 14:31:44 crc kubenswrapper[4722]: I0309 14:31:44.962560 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfb53424-4444-464a-9be1-97e2e095c496-combined-ca-bundle\") pod \"dfb53424-4444-464a-9be1-97e2e095c496\" (UID: \"dfb53424-4444-464a-9be1-97e2e095c496\") " Mar 09 14:31:44 crc kubenswrapper[4722]: I0309 14:31:44.969112 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfb53424-4444-464a-9be1-97e2e095c496-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "dfb53424-4444-464a-9be1-97e2e095c496" (UID: "dfb53424-4444-464a-9be1-97e2e095c496"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:31:44 crc kubenswrapper[4722]: I0309 14:31:44.976174 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfb53424-4444-464a-9be1-97e2e095c496-kube-api-access-4pt2x" (OuterVolumeSpecName: "kube-api-access-4pt2x") pod "dfb53424-4444-464a-9be1-97e2e095c496" (UID: "dfb53424-4444-464a-9be1-97e2e095c496"). InnerVolumeSpecName "kube-api-access-4pt2x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:31:44 crc kubenswrapper[4722]: I0309 14:31:44.979406 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-678fb995f7-q5bfj" Mar 09 14:31:44 crc kubenswrapper[4722]: I0309 14:31:44.980095 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-678fb995f7-q5bfj" event={"ID":"dfb53424-4444-464a-9be1-97e2e095c496","Type":"ContainerDied","Data":"8d548a7591e05f2c3428e1d40835e5da777fad9fe54741b13665685b21446a8c"} Mar 09 14:31:44 crc kubenswrapper[4722]: I0309 14:31:44.980135 4722 scope.go:117] "RemoveContainer" containerID="d13db23d16ba06dabebfe79eacf52b9d98971ac551281b501b27011139739a15" Mar 09 14:31:45 crc kubenswrapper[4722]: I0309 14:31:45.008810 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfb53424-4444-464a-9be1-97e2e095c496-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dfb53424-4444-464a-9be1-97e2e095c496" (UID: "dfb53424-4444-464a-9be1-97e2e095c496"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:31:45 crc kubenswrapper[4722]: I0309 14:31:45.058109 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfb53424-4444-464a-9be1-97e2e095c496-config-data" (OuterVolumeSpecName: "config-data") pod "dfb53424-4444-464a-9be1-97e2e095c496" (UID: "dfb53424-4444-464a-9be1-97e2e095c496"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:31:45 crc kubenswrapper[4722]: I0309 14:31:45.065569 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4pt2x\" (UniqueName: \"kubernetes.io/projected/dfb53424-4444-464a-9be1-97e2e095c496-kube-api-access-4pt2x\") on node \"crc\" DevicePath \"\"" Mar 09 14:31:45 crc kubenswrapper[4722]: I0309 14:31:45.065637 4722 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dfb53424-4444-464a-9be1-97e2e095c496-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 09 14:31:45 crc kubenswrapper[4722]: I0309 14:31:45.065647 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfb53424-4444-464a-9be1-97e2e095c496-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 14:31:45 crc kubenswrapper[4722]: I0309 14:31:45.065657 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfb53424-4444-464a-9be1-97e2e095c496-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:31:45 crc kubenswrapper[4722]: I0309 14:31:45.325827 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-678fb995f7-q5bfj"] Mar 09 14:31:45 crc kubenswrapper[4722]: I0309 14:31:45.340851 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-678fb995f7-q5bfj"] Mar 09 14:31:46 crc kubenswrapper[4722]: I0309 14:31:46.166622 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfb53424-4444-464a-9be1-97e2e095c496" path="/var/lib/kubelet/pods/dfb53424-4444-464a-9be1-97e2e095c496/volumes" Mar 09 14:31:47 crc kubenswrapper[4722]: I0309 14:31:47.014166 4722 generic.go:334] "Generic (PLEG): container finished" podID="40e6bb4d-411c-43c2-9959-0d1b9e005a11" containerID="e0d21535954f2715f616aa2539fb3b03570f619c92ebfbd782a330836605d26a" exitCode=0 Mar 09 14:31:47 crc kubenswrapper[4722]: I0309 14:31:47.014230 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-qn7dx" event={"ID":"40e6bb4d-411c-43c2-9959-0d1b9e005a11","Type":"ContainerDied","Data":"e0d21535954f2715f616aa2539fb3b03570f619c92ebfbd782a330836605d26a"} Mar 09 14:31:48 crc kubenswrapper[4722]: E0309 14:31:48.168942 4722 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6ada086_becc_4f4a_a0a0_0aad894dc550.slice/crio-conmon-0cdebc0d2406fceecb9b5fd85c03284427aaea6f5c3f1ce6e67e86ed2f14dfed.scope\": RecentStats: unable to find data in memory cache]" Mar 09 14:31:48 crc kubenswrapper[4722]: I0309 14:31:48.429594 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-1" podUID="c6ada086-becc-4f4a-a0a0-0aad894dc550" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.136:5671: connect: connection refused" Mar 09 14:31:49 crc kubenswrapper[4722]: I0309 14:31:49.067506 4722 generic.go:334] "Generic (PLEG): container 
finished" podID="c6ada086-becc-4f4a-a0a0-0aad894dc550" containerID="0cdebc0d2406fceecb9b5fd85c03284427aaea6f5c3f1ce6e67e86ed2f14dfed" exitCode=0 Mar 09 14:31:49 crc kubenswrapper[4722]: I0309 14:31:49.067570 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"c6ada086-becc-4f4a-a0a0-0aad894dc550","Type":"ContainerDied","Data":"0cdebc0d2406fceecb9b5fd85c03284427aaea6f5c3f1ce6e67e86ed2f14dfed"} Mar 09 14:31:49 crc kubenswrapper[4722]: I0309 14:31:49.150802 4722 scope.go:117] "RemoveContainer" containerID="86a6dcdf7dcf4d48b8bdfb71d9a2c30b89f235700c424312c7e16580a86021d6" Mar 09 14:31:49 crc kubenswrapper[4722]: E0309 14:31:49.151444 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 14:31:49 crc kubenswrapper[4722]: I0309 14:31:49.478365 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-qn7dx" Mar 09 14:31:49 crc kubenswrapper[4722]: I0309 14:31:49.549394 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmlbm\" (UniqueName: \"kubernetes.io/projected/40e6bb4d-411c-43c2-9959-0d1b9e005a11-kube-api-access-jmlbm\") pod \"40e6bb4d-411c-43c2-9959-0d1b9e005a11\" (UID: \"40e6bb4d-411c-43c2-9959-0d1b9e005a11\") " Mar 09 14:31:49 crc kubenswrapper[4722]: I0309 14:31:49.549815 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40e6bb4d-411c-43c2-9959-0d1b9e005a11-config-data\") pod \"40e6bb4d-411c-43c2-9959-0d1b9e005a11\" (UID: \"40e6bb4d-411c-43c2-9959-0d1b9e005a11\") " Mar 09 14:31:49 crc kubenswrapper[4722]: I0309 14:31:49.549900 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40e6bb4d-411c-43c2-9959-0d1b9e005a11-scripts\") pod \"40e6bb4d-411c-43c2-9959-0d1b9e005a11\" (UID: \"40e6bb4d-411c-43c2-9959-0d1b9e005a11\") " Mar 09 14:31:49 crc kubenswrapper[4722]: I0309 14:31:49.549952 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40e6bb4d-411c-43c2-9959-0d1b9e005a11-combined-ca-bundle\") pod \"40e6bb4d-411c-43c2-9959-0d1b9e005a11\" (UID: \"40e6bb4d-411c-43c2-9959-0d1b9e005a11\") " Mar 09 14:31:49 crc kubenswrapper[4722]: I0309 14:31:49.562964 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40e6bb4d-411c-43c2-9959-0d1b9e005a11-scripts" (OuterVolumeSpecName: "scripts") pod "40e6bb4d-411c-43c2-9959-0d1b9e005a11" (UID: "40e6bb4d-411c-43c2-9959-0d1b9e005a11"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:31:49 crc kubenswrapper[4722]: I0309 14:31:49.576046 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40e6bb4d-411c-43c2-9959-0d1b9e005a11-kube-api-access-jmlbm" (OuterVolumeSpecName: "kube-api-access-jmlbm") pod "40e6bb4d-411c-43c2-9959-0d1b9e005a11" (UID: "40e6bb4d-411c-43c2-9959-0d1b9e005a11"). InnerVolumeSpecName "kube-api-access-jmlbm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:31:49 crc kubenswrapper[4722]: I0309 14:31:49.615252 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40e6bb4d-411c-43c2-9959-0d1b9e005a11-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "40e6bb4d-411c-43c2-9959-0d1b9e005a11" (UID: "40e6bb4d-411c-43c2-9959-0d1b9e005a11"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:31:49 crc kubenswrapper[4722]: I0309 14:31:49.623662 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40e6bb4d-411c-43c2-9959-0d1b9e005a11-config-data" (OuterVolumeSpecName: "config-data") pod "40e6bb4d-411c-43c2-9959-0d1b9e005a11" (UID: "40e6bb4d-411c-43c2-9959-0d1b9e005a11"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:31:49 crc kubenswrapper[4722]: I0309 14:31:49.653185 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40e6bb4d-411c-43c2-9959-0d1b9e005a11-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 14:31:49 crc kubenswrapper[4722]: I0309 14:31:49.653237 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40e6bb4d-411c-43c2-9959-0d1b9e005a11-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 14:31:49 crc kubenswrapper[4722]: I0309 14:31:49.653246 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40e6bb4d-411c-43c2-9959-0d1b9e005a11-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:31:49 crc kubenswrapper[4722]: I0309 14:31:49.653257 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jmlbm\" (UniqueName: \"kubernetes.io/projected/40e6bb4d-411c-43c2-9959-0d1b9e005a11-kube-api-access-jmlbm\") on node \"crc\" DevicePath \"\"" Mar 09 14:31:49 crc kubenswrapper[4722]: I0309 14:31:49.748920 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-1" Mar 09 14:31:49 crc kubenswrapper[4722]: I0309 14:31:49.857337 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c6ada086-becc-4f4a-a0a0-0aad894dc550-pod-info\") pod \"c6ada086-becc-4f4a-a0a0-0aad894dc550\" (UID: \"c6ada086-becc-4f4a-a0a0-0aad894dc550\") " Mar 09 14:31:49 crc kubenswrapper[4722]: I0309 14:31:49.857404 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c6ada086-becc-4f4a-a0a0-0aad894dc550-rabbitmq-confd\") pod \"c6ada086-becc-4f4a-a0a0-0aad894dc550\" (UID: \"c6ada086-becc-4f4a-a0a0-0aad894dc550\") " Mar 09 14:31:49 crc kubenswrapper[4722]: I0309 14:31:49.857467 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c6ada086-becc-4f4a-a0a0-0aad894dc550-server-conf\") pod \"c6ada086-becc-4f4a-a0a0-0aad894dc550\" (UID: \"c6ada086-becc-4f4a-a0a0-0aad894dc550\") " Mar 09 14:31:49 crc kubenswrapper[4722]: I0309 14:31:49.857596 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c6ada086-becc-4f4a-a0a0-0aad894dc550-config-data\") pod \"c6ada086-becc-4f4a-a0a0-0aad894dc550\" (UID: \"c6ada086-becc-4f4a-a0a0-0aad894dc550\") " Mar 09 14:31:49 crc kubenswrapper[4722]: I0309 14:31:49.858464 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2885c5b9-d401-48f7-ae60-bfa289e45112\") pod \"c6ada086-becc-4f4a-a0a0-0aad894dc550\" (UID: \"c6ada086-becc-4f4a-a0a0-0aad894dc550\") " Mar 09 14:31:49 crc kubenswrapper[4722]: I0309 14:31:49.858501 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c6ada086-becc-4f4a-a0a0-0aad894dc550-rabbitmq-plugins\") pod \"c6ada086-becc-4f4a-a0a0-0aad894dc550\" (UID: \"c6ada086-becc-4f4a-a0a0-0aad894dc550\") " Mar 09 14:31:49 crc kubenswrapper[4722]: I0309 14:31:49.858579 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c6ada086-becc-4f4a-a0a0-0aad894dc550-rabbitmq-tls\") pod \"c6ada086-becc-4f4a-a0a0-0aad894dc550\" (UID: \"c6ada086-becc-4f4a-a0a0-0aad894dc550\") " Mar 09 14:31:49 crc kubenswrapper[4722]: I0309 14:31:49.858607 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c6ada086-becc-4f4a-a0a0-0aad894dc550-erlang-cookie-secret\") pod \"c6ada086-becc-4f4a-a0a0-0aad894dc550\" (UID: \"c6ada086-becc-4f4a-a0a0-0aad894dc550\") " Mar 09 14:31:49 crc kubenswrapper[4722]: I0309 14:31:49.858708 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4pvdb\" (UniqueName: \"kubernetes.io/projected/c6ada086-becc-4f4a-a0a0-0aad894dc550-kube-api-access-4pvdb\") pod \"c6ada086-becc-4f4a-a0a0-0aad894dc550\" (UID: \"c6ada086-becc-4f4a-a0a0-0aad894dc550\") " Mar 09 14:31:49 crc kubenswrapper[4722]: I0309 14:31:49.858733 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c6ada086-becc-4f4a-a0a0-0aad894dc550-plugins-conf\") pod 
\"c6ada086-becc-4f4a-a0a0-0aad894dc550\" (UID: \"c6ada086-becc-4f4a-a0a0-0aad894dc550\") " Mar 09 14:31:49 crc kubenswrapper[4722]: I0309 14:31:49.858761 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c6ada086-becc-4f4a-a0a0-0aad894dc550-rabbitmq-erlang-cookie\") pod \"c6ada086-becc-4f4a-a0a0-0aad894dc550\" (UID: \"c6ada086-becc-4f4a-a0a0-0aad894dc550\") " Mar 09 14:31:49 crc kubenswrapper[4722]: I0309 14:31:49.860905 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6ada086-becc-4f4a-a0a0-0aad894dc550-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "c6ada086-becc-4f4a-a0a0-0aad894dc550" (UID: "c6ada086-becc-4f4a-a0a0-0aad894dc550"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:31:49 crc kubenswrapper[4722]: I0309 14:31:49.862507 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6ada086-becc-4f4a-a0a0-0aad894dc550-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "c6ada086-becc-4f4a-a0a0-0aad894dc550" (UID: "c6ada086-becc-4f4a-a0a0-0aad894dc550"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:31:49 crc kubenswrapper[4722]: I0309 14:31:49.863212 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6ada086-becc-4f4a-a0a0-0aad894dc550-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "c6ada086-becc-4f4a-a0a0-0aad894dc550" (UID: "c6ada086-becc-4f4a-a0a0-0aad894dc550"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:31:49 crc kubenswrapper[4722]: I0309 14:31:49.878603 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/c6ada086-becc-4f4a-a0a0-0aad894dc550-pod-info" (OuterVolumeSpecName: "pod-info") pod "c6ada086-becc-4f4a-a0a0-0aad894dc550" (UID: "c6ada086-becc-4f4a-a0a0-0aad894dc550"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 09 14:31:49 crc kubenswrapper[4722]: I0309 14:31:49.879255 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6ada086-becc-4f4a-a0a0-0aad894dc550-kube-api-access-4pvdb" (OuterVolumeSpecName: "kube-api-access-4pvdb") pod "c6ada086-becc-4f4a-a0a0-0aad894dc550" (UID: "c6ada086-becc-4f4a-a0a0-0aad894dc550"). InnerVolumeSpecName "kube-api-access-4pvdb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:31:49 crc kubenswrapper[4722]: I0309 14:31:49.879346 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6ada086-becc-4f4a-a0a0-0aad894dc550-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "c6ada086-becc-4f4a-a0a0-0aad894dc550" (UID: "c6ada086-becc-4f4a-a0a0-0aad894dc550"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:31:49 crc kubenswrapper[4722]: I0309 14:31:49.880043 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6ada086-becc-4f4a-a0a0-0aad894dc550-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "c6ada086-becc-4f4a-a0a0-0aad894dc550" (UID: "c6ada086-becc-4f4a-a0a0-0aad894dc550"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:31:49 crc kubenswrapper[4722]: I0309 14:31:49.900313 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6ada086-becc-4f4a-a0a0-0aad894dc550-config-data" (OuterVolumeSpecName: "config-data") pod "c6ada086-becc-4f4a-a0a0-0aad894dc550" (UID: "c6ada086-becc-4f4a-a0a0-0aad894dc550"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:31:49 crc kubenswrapper[4722]: I0309 14:31:49.911541 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2885c5b9-d401-48f7-ae60-bfa289e45112" (OuterVolumeSpecName: "persistence") pod "c6ada086-becc-4f4a-a0a0-0aad894dc550" (UID: "c6ada086-becc-4f4a-a0a0-0aad894dc550"). InnerVolumeSpecName "pvc-2885c5b9-d401-48f7-ae60-bfa289e45112". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 09 14:31:49 crc kubenswrapper[4722]: I0309 14:31:49.962745 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4pvdb\" (UniqueName: \"kubernetes.io/projected/c6ada086-becc-4f4a-a0a0-0aad894dc550-kube-api-access-4pvdb\") on node \"crc\" DevicePath \"\"" Mar 09 14:31:49 crc kubenswrapper[4722]: I0309 14:31:49.962776 4722 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c6ada086-becc-4f4a-a0a0-0aad894dc550-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 09 14:31:49 crc kubenswrapper[4722]: I0309 14:31:49.962788 4722 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c6ada086-becc-4f4a-a0a0-0aad894dc550-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 09 14:31:49 crc kubenswrapper[4722]: I0309 14:31:49.962797 4722 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c6ada086-becc-4f4a-a0a0-0aad894dc550-pod-info\") on node \"crc\" DevicePath \"\"" Mar 09 14:31:49 crc kubenswrapper[4722]: I0309 14:31:49.962805 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c6ada086-becc-4f4a-a0a0-0aad894dc550-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 14:31:49 crc kubenswrapper[4722]: I0309 14:31:49.962830 4722 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-2885c5b9-d401-48f7-ae60-bfa289e45112\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2885c5b9-d401-48f7-ae60-bfa289e45112\") on node \"crc\" " Mar 09 14:31:49 crc kubenswrapper[4722]: I0309 14:31:49.962843 4722 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c6ada086-becc-4f4a-a0a0-0aad894dc550-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 09 14:31:49 crc kubenswrapper[4722]: I0309 14:31:49.962853 4722 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c6ada086-becc-4f4a-a0a0-0aad894dc550-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 09 14:31:49 crc kubenswrapper[4722]: I0309 14:31:49.962862 4722 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c6ada086-becc-4f4a-a0a0-0aad894dc550-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 09 14:31:49 crc kubenswrapper[4722]: I0309 14:31:49.966517 4722 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6ada086-becc-4f4a-a0a0-0aad894dc550-server-conf" (OuterVolumeSpecName: "server-conf") pod "c6ada086-becc-4f4a-a0a0-0aad894dc550" (UID: "c6ada086-becc-4f4a-a0a0-0aad894dc550"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:31:50 crc kubenswrapper[4722]: I0309 14:31:50.005777 4722 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 09 14:31:50 crc kubenswrapper[4722]: I0309 14:31:50.005984 4722 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-2885c5b9-d401-48f7-ae60-bfa289e45112" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2885c5b9-d401-48f7-ae60-bfa289e45112") on node "crc" Mar 09 14:31:50 crc kubenswrapper[4722]: I0309 14:31:50.014870 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6ada086-becc-4f4a-a0a0-0aad894dc550-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "c6ada086-becc-4f4a-a0a0-0aad894dc550" (UID: "c6ada086-becc-4f4a-a0a0-0aad894dc550"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:31:50 crc kubenswrapper[4722]: I0309 14:31:50.065063 4722 reconciler_common.go:293] "Volume detached for volume \"pvc-2885c5b9-d401-48f7-ae60-bfa289e45112\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2885c5b9-d401-48f7-ae60-bfa289e45112\") on node \"crc\" DevicePath \"\"" Mar 09 14:31:50 crc kubenswrapper[4722]: I0309 14:31:50.065099 4722 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c6ada086-becc-4f4a-a0a0-0aad894dc550-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 09 14:31:50 crc kubenswrapper[4722]: I0309 14:31:50.065110 4722 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c6ada086-becc-4f4a-a0a0-0aad894dc550-server-conf\") on node \"crc\" DevicePath \"\"" Mar 09 14:31:50 crc kubenswrapper[4722]: I0309 14:31:50.080135 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"c6ada086-becc-4f4a-a0a0-0aad894dc550","Type":"ContainerDied","Data":"7daf28ee40a43853387b92aa2acd493b5b17000dad0500d62280373a0f538481"} Mar 09 14:31:50 crc kubenswrapper[4722]: I0309 14:31:50.080182 4722 scope.go:117] "RemoveContainer" containerID="0cdebc0d2406fceecb9b5fd85c03284427aaea6f5c3f1ce6e67e86ed2f14dfed" Mar 09 14:31:50 crc kubenswrapper[4722]: I0309 14:31:50.080323 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-1" Mar 09 14:31:50 crc kubenswrapper[4722]: I0309 14:31:50.087933 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-qn7dx" event={"ID":"40e6bb4d-411c-43c2-9959-0d1b9e005a11","Type":"ContainerDied","Data":"c79b0b528bda7fe57228cfb11ed32304aa1de3a658e529ef733181f6399562a3"} Mar 09 14:31:50 crc kubenswrapper[4722]: I0309 14:31:50.087962 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-qn7dx" Mar 09 14:31:50 crc kubenswrapper[4722]: I0309 14:31:50.087984 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c79b0b528bda7fe57228cfb11ed32304aa1de3a658e529ef733181f6399562a3" Mar 09 14:31:50 crc kubenswrapper[4722]: I0309 14:31:50.097803 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rvtqn" event={"ID":"3411289f-3e7c-4e43-b545-5e612822b18e","Type":"ContainerStarted","Data":"7bf4b2058a43729ec5a619368f21ebceee1f551c95b6e9fcf9b86ed588420988"} Mar 09 14:31:50 crc kubenswrapper[4722]: I0309 14:31:50.118847 4722 scope.go:117] "RemoveContainer" containerID="c14c75a63d784852902e25e93e5a8cf7646ecf4eccaa9b3cddd0d364975c6f55" Mar 09 14:31:50 crc kubenswrapper[4722]: I0309 14:31:50.125921 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-1"] Mar 09 14:31:50 crc kubenswrapper[4722]: I0309 14:31:50.141748 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-1"] Mar 09 14:31:50 crc kubenswrapper[4722]: I0309 14:31:50.186737 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6ada086-becc-4f4a-a0a0-0aad894dc550" path="/var/lib/kubelet/pods/c6ada086-becc-4f4a-a0a0-0aad894dc550/volumes" Mar 09 14:31:50 crc kubenswrapper[4722]: I0309 14:31:50.189614 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-1"] Mar 09 14:31:50 crc kubenswrapper[4722]: E0309 14:31:50.189997 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40e6bb4d-411c-43c2-9959-0d1b9e005a11" containerName="aodh-db-sync" Mar 09 14:31:50 crc kubenswrapper[4722]: I0309 14:31:50.190012 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="40e6bb4d-411c-43c2-9959-0d1b9e005a11" containerName="aodh-db-sync" Mar 09 14:31:50 crc kubenswrapper[4722]: E0309 14:31:50.190034 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6ada086-becc-4f4a-a0a0-0aad894dc550" containerName="rabbitmq" Mar 09 14:31:50 crc kubenswrapper[4722]: I0309 14:31:50.190042 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6ada086-becc-4f4a-a0a0-0aad894dc550" containerName="rabbitmq" Mar 09 14:31:50 crc kubenswrapper[4722]: E0309 14:31:50.190067 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6ada086-becc-4f4a-a0a0-0aad894dc550" containerName="setup-container" Mar 09 14:31:50 crc kubenswrapper[4722]: I0309 14:31:50.190073 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6ada086-becc-4f4a-a0a0-0aad894dc550" containerName="setup-container" Mar 09 14:31:50 crc kubenswrapper[4722]: E0309 14:31:50.190102 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfb53424-4444-464a-9be1-97e2e095c496" containerName="heat-engine" Mar 09 14:31:50 crc kubenswrapper[4722]: I0309 14:31:50.190108 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfb53424-4444-464a-9be1-97e2e095c496" containerName="heat-engine" Mar 09 14:31:50 crc kubenswrapper[4722]: I0309 14:31:50.190313 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfb53424-4444-464a-9be1-97e2e095c496" containerName="heat-engine" Mar 09 14:31:50 crc kubenswrapper[4722]: I0309 14:31:50.190347 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6ada086-becc-4f4a-a0a0-0aad894dc550" containerName="rabbitmq" Mar 09 14:31:50 crc kubenswrapper[4722]: I0309 14:31:50.190355 4722 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="40e6bb4d-411c-43c2-9959-0d1b9e005a11" containerName="aodh-db-sync" Mar 09 14:31:50 crc kubenswrapper[4722]: I0309 14:31:50.191926 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-1" Mar 09 14:31:50 crc kubenswrapper[4722]: I0309 14:31:50.240877 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-1"] Mar 09 14:31:50 crc kubenswrapper[4722]: I0309 14:31:50.269785 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/694c0bbd-9c21-4de1-b82b-e79aa32feb6b-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"694c0bbd-9c21-4de1-b82b-e79aa32feb6b\") " pod="openstack/rabbitmq-server-1" Mar 09 14:31:50 crc kubenswrapper[4722]: I0309 14:31:50.269898 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/694c0bbd-9c21-4de1-b82b-e79aa32feb6b-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"694c0bbd-9c21-4de1-b82b-e79aa32feb6b\") " pod="openstack/rabbitmq-server-1" Mar 09 14:31:50 crc kubenswrapper[4722]: I0309 14:31:50.269924 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/694c0bbd-9c21-4de1-b82b-e79aa32feb6b-config-data\") pod \"rabbitmq-server-1\" (UID: \"694c0bbd-9c21-4de1-b82b-e79aa32feb6b\") " pod="openstack/rabbitmq-server-1" Mar 09 14:31:50 crc kubenswrapper[4722]: I0309 14:31:50.269940 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/694c0bbd-9c21-4de1-b82b-e79aa32feb6b-pod-info\") pod \"rabbitmq-server-1\" (UID: \"694c0bbd-9c21-4de1-b82b-e79aa32feb6b\") " pod="openstack/rabbitmq-server-1" Mar 09 14:31:50 crc kubenswrapper[4722]: I0309 14:31:50.269998 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/694c0bbd-9c21-4de1-b82b-e79aa32feb6b-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"694c0bbd-9c21-4de1-b82b-e79aa32feb6b\") " pod="openstack/rabbitmq-server-1" Mar 09 14:31:50 crc kubenswrapper[4722]: I0309 14:31:50.270021 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/694c0bbd-9c21-4de1-b82b-e79aa32feb6b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"694c0bbd-9c21-4de1-b82b-e79aa32feb6b\") " pod="openstack/rabbitmq-server-1" Mar 09 14:31:50 crc kubenswrapper[4722]: I0309 14:31:50.270039 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tqql\" (UniqueName: \"kubernetes.io/projected/694c0bbd-9c21-4de1-b82b-e79aa32feb6b-kube-api-access-7tqql\") pod \"rabbitmq-server-1\" (UID: \"694c0bbd-9c21-4de1-b82b-e79aa32feb6b\") " pod="openstack/rabbitmq-server-1" Mar 09 14:31:50 crc kubenswrapper[4722]: I0309 14:31:50.270059 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/694c0bbd-9c21-4de1-b82b-e79aa32feb6b-server-conf\") pod \"rabbitmq-server-1\" (UID: \"694c0bbd-9c21-4de1-b82b-e79aa32feb6b\") " pod="openstack/rabbitmq-server-1" Mar 09 14:31:50 crc 
kubenswrapper[4722]: I0309 14:31:50.270092 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/694c0bbd-9c21-4de1-b82b-e79aa32feb6b-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"694c0bbd-9c21-4de1-b82b-e79aa32feb6b\") " pod="openstack/rabbitmq-server-1" Mar 09 14:31:50 crc kubenswrapper[4722]: I0309 14:31:50.270886 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-2885c5b9-d401-48f7-ae60-bfa289e45112\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2885c5b9-d401-48f7-ae60-bfa289e45112\") pod \"rabbitmq-server-1\" (UID: \"694c0bbd-9c21-4de1-b82b-e79aa32feb6b\") " pod="openstack/rabbitmq-server-1" Mar 09 14:31:50 crc kubenswrapper[4722]: I0309 14:31:50.270939 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/694c0bbd-9c21-4de1-b82b-e79aa32feb6b-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"694c0bbd-9c21-4de1-b82b-e79aa32feb6b\") " pod="openstack/rabbitmq-server-1" Mar 09 14:31:50 crc kubenswrapper[4722]: I0309 14:31:50.373014 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/694c0bbd-9c21-4de1-b82b-e79aa32feb6b-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"694c0bbd-9c21-4de1-b82b-e79aa32feb6b\") " pod="openstack/rabbitmq-server-1" Mar 09 14:31:50 crc kubenswrapper[4722]: I0309 14:31:50.373112 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/694c0bbd-9c21-4de1-b82b-e79aa32feb6b-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"694c0bbd-9c21-4de1-b82b-e79aa32feb6b\") " pod="openstack/rabbitmq-server-1" Mar 09 14:31:50 crc kubenswrapper[4722]: I0309 14:31:50.373198 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/694c0bbd-9c21-4de1-b82b-e79aa32feb6b-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"694c0bbd-9c21-4de1-b82b-e79aa32feb6b\") " pod="openstack/rabbitmq-server-1" Mar 09 14:31:50 crc kubenswrapper[4722]: I0309 14:31:50.373240 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/694c0bbd-9c21-4de1-b82b-e79aa32feb6b-config-data\") pod \"rabbitmq-server-1\" (UID: \"694c0bbd-9c21-4de1-b82b-e79aa32feb6b\") " pod="openstack/rabbitmq-server-1" Mar 09 14:31:50 crc kubenswrapper[4722]: I0309 14:31:50.373267 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/694c0bbd-9c21-4de1-b82b-e79aa32feb6b-pod-info\") pod \"rabbitmq-server-1\" (UID: \"694c0bbd-9c21-4de1-b82b-e79aa32feb6b\") " pod="openstack/rabbitmq-server-1" Mar 09 14:31:50 crc kubenswrapper[4722]: I0309 14:31:50.373343 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/694c0bbd-9c21-4de1-b82b-e79aa32feb6b-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"694c0bbd-9c21-4de1-b82b-e79aa32feb6b\") " pod="openstack/rabbitmq-server-1" Mar 09 14:31:50 crc kubenswrapper[4722]: I0309 14:31:50.373410 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/694c0bbd-9c21-4de1-b82b-e79aa32feb6b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"694c0bbd-9c21-4de1-b82b-e79aa32feb6b\") " pod="openstack/rabbitmq-server-1" Mar 09 14:31:50 crc kubenswrapper[4722]: I0309 14:31:50.373439 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tqql\" (UniqueName: \"kubernetes.io/projected/694c0bbd-9c21-4de1-b82b-e79aa32feb6b-kube-api-access-7tqql\") pod \"rabbitmq-server-1\" (UID: \"694c0bbd-9c21-4de1-b82b-e79aa32feb6b\") " pod="openstack/rabbitmq-server-1" Mar 09 14:31:50 crc kubenswrapper[4722]: I0309 14:31:50.373466 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/694c0bbd-9c21-4de1-b82b-e79aa32feb6b-server-conf\") pod \"rabbitmq-server-1\" (UID: \"694c0bbd-9c21-4de1-b82b-e79aa32feb6b\") " pod="openstack/rabbitmq-server-1" Mar 09 14:31:50 crc kubenswrapper[4722]: I0309 14:31:50.373510 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/694c0bbd-9c21-4de1-b82b-e79aa32feb6b-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"694c0bbd-9c21-4de1-b82b-e79aa32feb6b\") " pod="openstack/rabbitmq-server-1" Mar 09 14:31:50 crc kubenswrapper[4722]: I0309 14:31:50.373609 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-2885c5b9-d401-48f7-ae60-bfa289e45112\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2885c5b9-d401-48f7-ae60-bfa289e45112\") pod \"rabbitmq-server-1\" (UID: \"694c0bbd-9c21-4de1-b82b-e79aa32feb6b\") " pod="openstack/rabbitmq-server-1" Mar 09 14:31:50 crc kubenswrapper[4722]: I0309 14:31:50.375904 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/694c0bbd-9c21-4de1-b82b-e79aa32feb6b-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"694c0bbd-9c21-4de1-b82b-e79aa32feb6b\") " pod="openstack/rabbitmq-server-1" Mar 09 14:31:50 crc kubenswrapper[4722]: I0309 14:31:50.376173 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/694c0bbd-9c21-4de1-b82b-e79aa32feb6b-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"694c0bbd-9c21-4de1-b82b-e79aa32feb6b\") " pod="openstack/rabbitmq-server-1" Mar 09 14:31:50 crc kubenswrapper[4722]: I0309 14:31:50.376862 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/694c0bbd-9c21-4de1-b82b-e79aa32feb6b-config-data\") pod \"rabbitmq-server-1\" (UID: \"694c0bbd-9c21-4de1-b82b-e79aa32feb6b\") " pod="openstack/rabbitmq-server-1" Mar 09 14:31:50 crc kubenswrapper[4722]: I0309 14:31:50.378012 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/694c0bbd-9c21-4de1-b82b-e79aa32feb6b-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"694c0bbd-9c21-4de1-b82b-e79aa32feb6b\") " pod="openstack/rabbitmq-server-1" Mar 09 14:31:50 crc kubenswrapper[4722]: I0309 14:31:50.378937 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/694c0bbd-9c21-4de1-b82b-e79aa32feb6b-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"694c0bbd-9c21-4de1-b82b-e79aa32feb6b\") " pod="openstack/rabbitmq-server-1" Mar 09 14:31:50 crc 
kubenswrapper[4722]: I0309 14:31:50.379274 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/694c0bbd-9c21-4de1-b82b-e79aa32feb6b-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"694c0bbd-9c21-4de1-b82b-e79aa32feb6b\") " pod="openstack/rabbitmq-server-1" Mar 09 14:31:50 crc kubenswrapper[4722]: I0309 14:31:50.379660 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/694c0bbd-9c21-4de1-b82b-e79aa32feb6b-server-conf\") pod \"rabbitmq-server-1\" (UID: \"694c0bbd-9c21-4de1-b82b-e79aa32feb6b\") " pod="openstack/rabbitmq-server-1" Mar 09 14:31:50 crc kubenswrapper[4722]: I0309 14:31:50.381159 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/694c0bbd-9c21-4de1-b82b-e79aa32feb6b-pod-info\") pod \"rabbitmq-server-1\" (UID: \"694c0bbd-9c21-4de1-b82b-e79aa32feb6b\") " pod="openstack/rabbitmq-server-1" Mar 09 14:31:50 crc kubenswrapper[4722]: I0309 14:31:50.381816 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/694c0bbd-9c21-4de1-b82b-e79aa32feb6b-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"694c0bbd-9c21-4de1-b82b-e79aa32feb6b\") " pod="openstack/rabbitmq-server-1" Mar 09 14:31:50 crc kubenswrapper[4722]: I0309 14:31:50.381862 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 09 14:31:50 crc kubenswrapper[4722]: I0309 14:31:50.381892 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-2885c5b9-d401-48f7-ae60-bfa289e45112\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2885c5b9-d401-48f7-ae60-bfa289e45112\") pod \"rabbitmq-server-1\" (UID: \"694c0bbd-9c21-4de1-b82b-e79aa32feb6b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/723a9ce05431756e03b00745520f00559943d81b66433ce834e3e67d95e138ab/globalmount\"" pod="openstack/rabbitmq-server-1" Mar 09 14:31:50 crc kubenswrapper[4722]: I0309 14:31:50.394822 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tqql\" (UniqueName: \"kubernetes.io/projected/694c0bbd-9c21-4de1-b82b-e79aa32feb6b-kube-api-access-7tqql\") pod \"rabbitmq-server-1\" (UID: \"694c0bbd-9c21-4de1-b82b-e79aa32feb6b\") " pod="openstack/rabbitmq-server-1" Mar 09 14:31:50 crc kubenswrapper[4722]: I0309 14:31:50.477930 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-2885c5b9-d401-48f7-ae60-bfa289e45112\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2885c5b9-d401-48f7-ae60-bfa289e45112\") pod \"rabbitmq-server-1\" (UID: \"694c0bbd-9c21-4de1-b82b-e79aa32feb6b\") " pod="openstack/rabbitmq-server-1" Mar 09 14:31:50 crc kubenswrapper[4722]: I0309 14:31:50.588037 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-1" Mar 09 14:31:50 crc kubenswrapper[4722]: I0309 14:31:50.934184 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-1"] Mar 09 14:31:50 crc kubenswrapper[4722]: W0309 14:31:50.936252 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod694c0bbd_9c21_4de1_b82b_e79aa32feb6b.slice/crio-2be5a4440a8c046ec7b8448b39b7b17a904af35ab5b191831dcaf5a224139055 WatchSource:0}: Error finding container 2be5a4440a8c046ec7b8448b39b7b17a904af35ab5b191831dcaf5a224139055: Status 404 returned error can't find the container with id 2be5a4440a8c046ec7b8448b39b7b17a904af35ab5b191831dcaf5a224139055 Mar 09 14:31:51 crc kubenswrapper[4722]: I0309 14:31:51.109131 4722 generic.go:334] "Generic (PLEG): container finished" podID="3411289f-3e7c-4e43-b545-5e612822b18e" containerID="7bf4b2058a43729ec5a619368f21ebceee1f551c95b6e9fcf9b86ed588420988" exitCode=0 Mar 09 14:31:51 crc kubenswrapper[4722]: I0309 14:31:51.109173 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rvtqn" event={"ID":"3411289f-3e7c-4e43-b545-5e612822b18e","Type":"ContainerDied","Data":"7bf4b2058a43729ec5a619368f21ebceee1f551c95b6e9fcf9b86ed588420988"} Mar 09 14:31:51 crc kubenswrapper[4722]: I0309 14:31:51.112350 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"694c0bbd-9c21-4de1-b82b-e79aa32feb6b","Type":"ContainerStarted","Data":"2be5a4440a8c046ec7b8448b39b7b17a904af35ab5b191831dcaf5a224139055"} Mar 09 14:31:52 crc kubenswrapper[4722]: I0309 14:31:52.126067 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rvtqn" event={"ID":"3411289f-3e7c-4e43-b545-5e612822b18e","Type":"ContainerStarted","Data":"6caf48dd254c82f3e713c8b7d146aa636738f0e702e7888634d4fbddac4cdbd7"} Mar 09 14:31:52 crc kubenswrapper[4722]: I0309 14:31:52.147868 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rvtqn" podStartSLOduration=13.577538647 podStartE2EDuration="21.147851894s" podCreationTimestamp="2026-03-09 14:31:31 +0000 UTC" firstStartedPulling="2026-03-09 14:31:43.957830512 +0000 UTC m=+1744.513399088" lastFinishedPulling="2026-03-09 14:31:51.528143759 +0000 UTC m=+1752.083712335" observedRunningTime="2026-03-09 14:31:52.145694104 +0000 UTC m=+1752.701262690" watchObservedRunningTime="2026-03-09 14:31:52.147851894 +0000 UTC m=+1752.703420470" Mar 09 14:31:52 crc kubenswrapper[4722]: I0309 14:31:52.433408 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Mar 09 14:31:52 crc kubenswrapper[4722]: I0309 14:31:52.433998 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="32835244-16f5-407f-87fe-8e9f860879a1" containerName="aodh-api" containerID="cri-o://11e3c915b7b5f964267f5dd40b1448790d2c0b4d7f3c65d0c4a79ef035fc7b12" gracePeriod=30 Mar 09 14:31:52 crc kubenswrapper[4722]: I0309 14:31:52.434088 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="32835244-16f5-407f-87fe-8e9f860879a1" containerName="aodh-evaluator" containerID="cri-o://5ee6f3fb3854cc97d72b1225b0810ee97b98bfdbbaf067f7d20c95620ac51434" gracePeriod=30 Mar 09 14:31:52 crc kubenswrapper[4722]: I0309 14:31:52.434087 4722 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/aodh-0" podUID="32835244-16f5-407f-87fe-8e9f860879a1" containerName="aodh-notifier" containerID="cri-o://33efd07beecb95d9c5a7c65c557ef23d3fc4f7096265860a26df35d8467ea987" gracePeriod=30 Mar 09 14:31:52 crc kubenswrapper[4722]: I0309 14:31:52.434117 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="32835244-16f5-407f-87fe-8e9f860879a1" containerName="aodh-listener" containerID="cri-o://17a0c0808de0ddcab290ce82e6cc5add34970fa0f13ed7e0572596829e5f0e53" gracePeriod=30 Mar 09 14:31:53 crc kubenswrapper[4722]: I0309 14:31:53.138532 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"694c0bbd-9c21-4de1-b82b-e79aa32feb6b","Type":"ContainerStarted","Data":"7d4245a328cb68f7c34eb728fe14d3b34a183a1f63704a6ea7f69e79391785f8"} Mar 09 14:31:53 crc kubenswrapper[4722]: I0309 14:31:53.141980 4722 generic.go:334] "Generic (PLEG): container finished" podID="32835244-16f5-407f-87fe-8e9f860879a1" containerID="5ee6f3fb3854cc97d72b1225b0810ee97b98bfdbbaf067f7d20c95620ac51434" exitCode=0 Mar 09 14:31:53 crc kubenswrapper[4722]: I0309 14:31:53.142021 4722 generic.go:334] "Generic (PLEG): container finished" podID="32835244-16f5-407f-87fe-8e9f860879a1" containerID="11e3c915b7b5f964267f5dd40b1448790d2c0b4d7f3c65d0c4a79ef035fc7b12" exitCode=0 Mar 09 14:31:53 crc kubenswrapper[4722]: I0309 14:31:53.142076 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"32835244-16f5-407f-87fe-8e9f860879a1","Type":"ContainerDied","Data":"5ee6f3fb3854cc97d72b1225b0810ee97b98bfdbbaf067f7d20c95620ac51434"} Mar 09 14:31:53 crc kubenswrapper[4722]: I0309 14:31:53.142143 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"32835244-16f5-407f-87fe-8e9f860879a1","Type":"ContainerDied","Data":"11e3c915b7b5f964267f5dd40b1448790d2c0b4d7f3c65d0c4a79ef035fc7b12"} Mar 09 14:31:55 crc kubenswrapper[4722]: I0309 14:31:55.913410 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 14:31:57 crc kubenswrapper[4722]: I0309 14:31:57.195196 4722 generic.go:334] "Generic (PLEG): container finished" podID="32835244-16f5-407f-87fe-8e9f860879a1" containerID="17a0c0808de0ddcab290ce82e6cc5add34970fa0f13ed7e0572596829e5f0e53" exitCode=0 Mar 09 14:31:57 crc kubenswrapper[4722]: I0309 14:31:57.195248 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"32835244-16f5-407f-87fe-8e9f860879a1","Type":"ContainerDied","Data":"17a0c0808de0ddcab290ce82e6cc5add34970fa0f13ed7e0572596829e5f0e53"} Mar 09 14:31:57 crc kubenswrapper[4722]: I0309 14:31:57.197586 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-rct52" event={"ID":"5f8120f5-690a-4bb4-ba23-dead16d6946f","Type":"ContainerStarted","Data":"6d48e426fd3eee9a37856fba92065a93a3336de7f393be272fd17621990f78b6"} Mar 09 14:31:57 crc kubenswrapper[4722]: I0309 14:31:57.226521 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-rct52" podStartSLOduration=3.329296514 podStartE2EDuration="33.226504458s" podCreationTimestamp="2026-03-09 14:31:24 +0000 UTC" firstStartedPulling="2026-03-09 14:31:26.013276642 +0000 UTC m=+1726.568845218" lastFinishedPulling="2026-03-09 14:31:55.910484586 +0000 UTC m=+1756.466053162" observedRunningTime="2026-03-09 14:31:57.215421702 +0000 UTC 
m=+1757.770990278" watchObservedRunningTime="2026-03-09 14:31:57.226504458 +0000 UTC m=+1757.782073034" Mar 09 14:31:58 crc kubenswrapper[4722]: I0309 14:31:58.065275 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Mar 09 14:31:58 crc kubenswrapper[4722]: I0309 14:31:58.175442 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32835244-16f5-407f-87fe-8e9f860879a1-combined-ca-bundle\") pod \"32835244-16f5-407f-87fe-8e9f860879a1\" (UID: \"32835244-16f5-407f-87fe-8e9f860879a1\") " Mar 09 14:31:58 crc kubenswrapper[4722]: I0309 14:31:58.175572 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbqpr\" (UniqueName: \"kubernetes.io/projected/32835244-16f5-407f-87fe-8e9f860879a1-kube-api-access-jbqpr\") pod \"32835244-16f5-407f-87fe-8e9f860879a1\" (UID: \"32835244-16f5-407f-87fe-8e9f860879a1\") " Mar 09 14:31:58 crc kubenswrapper[4722]: I0309 14:31:58.175783 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/32835244-16f5-407f-87fe-8e9f860879a1-public-tls-certs\") pod \"32835244-16f5-407f-87fe-8e9f860879a1\" (UID: \"32835244-16f5-407f-87fe-8e9f860879a1\") " Mar 09 14:31:58 crc kubenswrapper[4722]: I0309 14:31:58.175859 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/32835244-16f5-407f-87fe-8e9f860879a1-internal-tls-certs\") pod \"32835244-16f5-407f-87fe-8e9f860879a1\" (UID: \"32835244-16f5-407f-87fe-8e9f860879a1\") " Mar 09 14:31:58 crc kubenswrapper[4722]: I0309 14:31:58.175978 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32835244-16f5-407f-87fe-8e9f860879a1-scripts\") pod \"32835244-16f5-407f-87fe-8e9f860879a1\" (UID: \"32835244-16f5-407f-87fe-8e9f860879a1\") " Mar 09 14:31:58 crc kubenswrapper[4722]: I0309 14:31:58.176043 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32835244-16f5-407f-87fe-8e9f860879a1-config-data\") pod \"32835244-16f5-407f-87fe-8e9f860879a1\" (UID: \"32835244-16f5-407f-87fe-8e9f860879a1\") " Mar 09 14:31:58 crc kubenswrapper[4722]: I0309 14:31:58.184242 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32835244-16f5-407f-87fe-8e9f860879a1-scripts" (OuterVolumeSpecName: "scripts") pod "32835244-16f5-407f-87fe-8e9f860879a1" (UID: "32835244-16f5-407f-87fe-8e9f860879a1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:31:58 crc kubenswrapper[4722]: I0309 14:31:58.187409 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32835244-16f5-407f-87fe-8e9f860879a1-kube-api-access-jbqpr" (OuterVolumeSpecName: "kube-api-access-jbqpr") pod "32835244-16f5-407f-87fe-8e9f860879a1" (UID: "32835244-16f5-407f-87fe-8e9f860879a1"). InnerVolumeSpecName "kube-api-access-jbqpr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:31:58 crc kubenswrapper[4722]: I0309 14:31:58.261473 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32835244-16f5-407f-87fe-8e9f860879a1-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "32835244-16f5-407f-87fe-8e9f860879a1" (UID: "32835244-16f5-407f-87fe-8e9f860879a1"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:31:58 crc kubenswrapper[4722]: I0309 14:31:58.263798 4722 generic.go:334] "Generic (PLEG): container finished" podID="32835244-16f5-407f-87fe-8e9f860879a1" containerID="33efd07beecb95d9c5a7c65c557ef23d3fc4f7096265860a26df35d8467ea987" exitCode=0 Mar 09 14:31:58 crc kubenswrapper[4722]: I0309 14:31:58.263840 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"32835244-16f5-407f-87fe-8e9f860879a1","Type":"ContainerDied","Data":"33efd07beecb95d9c5a7c65c557ef23d3fc4f7096265860a26df35d8467ea987"} Mar 09 14:31:58 crc kubenswrapper[4722]: I0309 14:31:58.263866 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"32835244-16f5-407f-87fe-8e9f860879a1","Type":"ContainerDied","Data":"cd1baac69ca27665f23f54b2848a483ab2e3e2d3ff6969486aaef6d610fb4820"} Mar 09 14:31:58 crc kubenswrapper[4722]: I0309 14:31:58.263881 4722 scope.go:117] "RemoveContainer" containerID="17a0c0808de0ddcab290ce82e6cc5add34970fa0f13ed7e0572596829e5f0e53" Mar 09 14:31:58 crc kubenswrapper[4722]: I0309 14:31:58.264012 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Mar 09 14:31:58 crc kubenswrapper[4722]: I0309 14:31:58.279784 4722 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/32835244-16f5-407f-87fe-8e9f860879a1-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 14:31:58 crc kubenswrapper[4722]: I0309 14:31:58.279827 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32835244-16f5-407f-87fe-8e9f860879a1-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 14:31:58 crc kubenswrapper[4722]: I0309 14:31:58.279840 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jbqpr\" (UniqueName: \"kubernetes.io/projected/32835244-16f5-407f-87fe-8e9f860879a1-kube-api-access-jbqpr\") on node \"crc\" DevicePath \"\"" Mar 09 14:31:58 crc kubenswrapper[4722]: I0309 14:31:58.317749 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32835244-16f5-407f-87fe-8e9f860879a1-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "32835244-16f5-407f-87fe-8e9f860879a1" (UID: "32835244-16f5-407f-87fe-8e9f860879a1"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:31:58 crc kubenswrapper[4722]: I0309 14:31:58.354192 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32835244-16f5-407f-87fe-8e9f860879a1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "32835244-16f5-407f-87fe-8e9f860879a1" (UID: "32835244-16f5-407f-87fe-8e9f860879a1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:31:58 crc kubenswrapper[4722]: I0309 14:31:58.366277 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32835244-16f5-407f-87fe-8e9f860879a1-config-data" (OuterVolumeSpecName: "config-data") pod "32835244-16f5-407f-87fe-8e9f860879a1" (UID: "32835244-16f5-407f-87fe-8e9f860879a1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:31:58 crc kubenswrapper[4722]: I0309 14:31:58.382220 4722 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/32835244-16f5-407f-87fe-8e9f860879a1-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 14:31:58 crc kubenswrapper[4722]: I0309 14:31:58.382610 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32835244-16f5-407f-87fe-8e9f860879a1-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 14:31:58 crc kubenswrapper[4722]: I0309 14:31:58.382626 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32835244-16f5-407f-87fe-8e9f860879a1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:31:58 crc kubenswrapper[4722]: I0309 14:31:58.480372 4722 scope.go:117] "RemoveContainer" containerID="33efd07beecb95d9c5a7c65c557ef23d3fc4f7096265860a26df35d8467ea987" Mar 09 14:31:58 crc kubenswrapper[4722]: I0309 14:31:58.526801 4722 scope.go:117] "RemoveContainer" containerID="5ee6f3fb3854cc97d72b1225b0810ee97b98bfdbbaf067f7d20c95620ac51434" Mar 09 14:31:58 crc kubenswrapper[4722]: I0309 14:31:58.567036 4722 scope.go:117] "RemoveContainer" containerID="11e3c915b7b5f964267f5dd40b1448790d2c0b4d7f3c65d0c4a79ef035fc7b12" Mar 09 14:31:58 crc kubenswrapper[4722]: I0309 14:31:58.604629 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Mar 09 14:31:58 crc kubenswrapper[4722]: I0309 14:31:58.613623 4722 scope.go:117] "RemoveContainer" containerID="17a0c0808de0ddcab290ce82e6cc5add34970fa0f13ed7e0572596829e5f0e53" Mar 09 14:31:58 crc kubenswrapper[4722]: E0309 14:31:58.614374 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17a0c0808de0ddcab290ce82e6cc5add34970fa0f13ed7e0572596829e5f0e53\": container with ID starting with 17a0c0808de0ddcab290ce82e6cc5add34970fa0f13ed7e0572596829e5f0e53 not found: ID does not exist" containerID="17a0c0808de0ddcab290ce82e6cc5add34970fa0f13ed7e0572596829e5f0e53" Mar 09 14:31:58 crc kubenswrapper[4722]: I0309 14:31:58.614422 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17a0c0808de0ddcab290ce82e6cc5add34970fa0f13ed7e0572596829e5f0e53"} err="failed to get container status \"17a0c0808de0ddcab290ce82e6cc5add34970fa0f13ed7e0572596829e5f0e53\": rpc error: code = NotFound desc = could not find container \"17a0c0808de0ddcab290ce82e6cc5add34970fa0f13ed7e0572596829e5f0e53\": container with ID starting with 17a0c0808de0ddcab290ce82e6cc5add34970fa0f13ed7e0572596829e5f0e53 not found: ID does not exist" Mar 09 14:31:58 crc kubenswrapper[4722]: I0309 14:31:58.614457 4722 scope.go:117] "RemoveContainer" containerID="33efd07beecb95d9c5a7c65c557ef23d3fc4f7096265860a26df35d8467ea987" Mar 09 14:31:58 crc kubenswrapper[4722]: E0309 14:31:58.615002 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not 
find container \"33efd07beecb95d9c5a7c65c557ef23d3fc4f7096265860a26df35d8467ea987\": container with ID starting with 33efd07beecb95d9c5a7c65c557ef23d3fc4f7096265860a26df35d8467ea987 not found: ID does not exist" containerID="33efd07beecb95d9c5a7c65c557ef23d3fc4f7096265860a26df35d8467ea987" Mar 09 14:31:58 crc kubenswrapper[4722]: I0309 14:31:58.615051 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33efd07beecb95d9c5a7c65c557ef23d3fc4f7096265860a26df35d8467ea987"} err="failed to get container status \"33efd07beecb95d9c5a7c65c557ef23d3fc4f7096265860a26df35d8467ea987\": rpc error: code = NotFound desc = could not find container \"33efd07beecb95d9c5a7c65c557ef23d3fc4f7096265860a26df35d8467ea987\": container with ID starting with 33efd07beecb95d9c5a7c65c557ef23d3fc4f7096265860a26df35d8467ea987 not found: ID does not exist" Mar 09 14:31:58 crc kubenswrapper[4722]: I0309 14:31:58.615085 4722 scope.go:117] "RemoveContainer" containerID="5ee6f3fb3854cc97d72b1225b0810ee97b98bfdbbaf067f7d20c95620ac51434" Mar 09 14:31:58 crc kubenswrapper[4722]: E0309 14:31:58.615530 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ee6f3fb3854cc97d72b1225b0810ee97b98bfdbbaf067f7d20c95620ac51434\": container with ID starting with 5ee6f3fb3854cc97d72b1225b0810ee97b98bfdbbaf067f7d20c95620ac51434 not found: ID does not exist" containerID="5ee6f3fb3854cc97d72b1225b0810ee97b98bfdbbaf067f7d20c95620ac51434" Mar 09 14:31:58 crc kubenswrapper[4722]: I0309 14:31:58.615591 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ee6f3fb3854cc97d72b1225b0810ee97b98bfdbbaf067f7d20c95620ac51434"} err="failed to get container status \"5ee6f3fb3854cc97d72b1225b0810ee97b98bfdbbaf067f7d20c95620ac51434\": rpc error: code = NotFound desc = could not find container \"5ee6f3fb3854cc97d72b1225b0810ee97b98bfdbbaf067f7d20c95620ac51434\": container with ID starting with 5ee6f3fb3854cc97d72b1225b0810ee97b98bfdbbaf067f7d20c95620ac51434 not found: ID does not exist" Mar 09 14:31:58 crc kubenswrapper[4722]: I0309 14:31:58.615614 4722 scope.go:117] "RemoveContainer" containerID="11e3c915b7b5f964267f5dd40b1448790d2c0b4d7f3c65d0c4a79ef035fc7b12" Mar 09 14:31:58 crc kubenswrapper[4722]: E0309 14:31:58.616343 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11e3c915b7b5f964267f5dd40b1448790d2c0b4d7f3c65d0c4a79ef035fc7b12\": container with ID starting with 11e3c915b7b5f964267f5dd40b1448790d2c0b4d7f3c65d0c4a79ef035fc7b12 not found: ID does not exist" containerID="11e3c915b7b5f964267f5dd40b1448790d2c0b4d7f3c65d0c4a79ef035fc7b12" Mar 09 14:31:58 crc kubenswrapper[4722]: I0309 14:31:58.616531 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11e3c915b7b5f964267f5dd40b1448790d2c0b4d7f3c65d0c4a79ef035fc7b12"} err="failed to get container status \"11e3c915b7b5f964267f5dd40b1448790d2c0b4d7f3c65d0c4a79ef035fc7b12\": rpc error: code = NotFound desc = could not find container \"11e3c915b7b5f964267f5dd40b1448790d2c0b4d7f3c65d0c4a79ef035fc7b12\": container with ID starting with 11e3c915b7b5f964267f5dd40b1448790d2c0b4d7f3c65d0c4a79ef035fc7b12 not found: ID does not exist" Mar 09 14:31:58 crc kubenswrapper[4722]: I0309 14:31:58.643728 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"] Mar 09 14:31:58 crc kubenswrapper[4722]: I0309 
14:31:58.662174 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Mar 09 14:31:58 crc kubenswrapper[4722]: E0309 14:31:58.667647 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32835244-16f5-407f-87fe-8e9f860879a1" containerName="aodh-listener" Mar 09 14:31:58 crc kubenswrapper[4722]: I0309 14:31:58.667914 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="32835244-16f5-407f-87fe-8e9f860879a1" containerName="aodh-listener" Mar 09 14:31:58 crc kubenswrapper[4722]: E0309 14:31:58.668040 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32835244-16f5-407f-87fe-8e9f860879a1" containerName="aodh-evaluator" Mar 09 14:31:58 crc kubenswrapper[4722]: I0309 14:31:58.668126 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="32835244-16f5-407f-87fe-8e9f860879a1" containerName="aodh-evaluator" Mar 09 14:31:58 crc kubenswrapper[4722]: E0309 14:31:58.668250 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32835244-16f5-407f-87fe-8e9f860879a1" containerName="aodh-notifier" Mar 09 14:31:58 crc kubenswrapper[4722]: I0309 14:31:58.668343 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="32835244-16f5-407f-87fe-8e9f860879a1" containerName="aodh-notifier" Mar 09 14:31:58 crc kubenswrapper[4722]: E0309 14:31:58.668474 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32835244-16f5-407f-87fe-8e9f860879a1" containerName="aodh-api" Mar 09 14:31:58 crc kubenswrapper[4722]: I0309 14:31:58.668645 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="32835244-16f5-407f-87fe-8e9f860879a1" containerName="aodh-api" Mar 09 14:31:58 crc kubenswrapper[4722]: I0309 14:31:58.669239 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="32835244-16f5-407f-87fe-8e9f860879a1" containerName="aodh-api" Mar 09 14:31:58 crc kubenswrapper[4722]: I0309 14:31:58.669377 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="32835244-16f5-407f-87fe-8e9f860879a1" containerName="aodh-evaluator" Mar 09 14:31:58 crc kubenswrapper[4722]: I0309 14:31:58.669489 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="32835244-16f5-407f-87fe-8e9f860879a1" containerName="aodh-notifier" Mar 09 14:31:58 crc kubenswrapper[4722]: I0309 14:31:58.669588 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="32835244-16f5-407f-87fe-8e9f860879a1" containerName="aodh-listener" Mar 09 14:31:58 crc kubenswrapper[4722]: I0309 14:31:58.672721 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Mar 09 14:31:58 crc kubenswrapper[4722]: I0309 14:31:58.675283 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc" Mar 09 14:31:58 crc kubenswrapper[4722]: I0309 14:31:58.675510 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Mar 09 14:31:58 crc kubenswrapper[4722]: I0309 14:31:58.675916 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc" Mar 09 14:31:58 crc kubenswrapper[4722]: I0309 14:31:58.676084 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-d98zq" Mar 09 14:31:58 crc kubenswrapper[4722]: I0309 14:31:58.679587 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Mar 09 14:31:58 crc kubenswrapper[4722]: I0309 14:31:58.679677 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 09 14:31:58 crc kubenswrapper[4722]: I0309 14:31:58.791021 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0083dc56-7904-4b06-8c12-2429f8a7fe9a-internal-tls-certs\") pod \"aodh-0\" (UID: \"0083dc56-7904-4b06-8c12-2429f8a7fe9a\") " pod="openstack/aodh-0" Mar 09 14:31:58 crc kubenswrapper[4722]: I0309 14:31:58.792948 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0083dc56-7904-4b06-8c12-2429f8a7fe9a-config-data\") pod \"aodh-0\" (UID: \"0083dc56-7904-4b06-8c12-2429f8a7fe9a\") " pod="openstack/aodh-0" Mar 09 14:31:58 crc kubenswrapper[4722]: I0309 14:31:58.793231 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0083dc56-7904-4b06-8c12-2429f8a7fe9a-scripts\") pod \"aodh-0\" (UID: \"0083dc56-7904-4b06-8c12-2429f8a7fe9a\") " pod="openstack/aodh-0" Mar 09 14:31:58 crc kubenswrapper[4722]: I0309 14:31:58.793539 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0083dc56-7904-4b06-8c12-2429f8a7fe9a-public-tls-certs\") pod \"aodh-0\" (UID: \"0083dc56-7904-4b06-8c12-2429f8a7fe9a\") " pod="openstack/aodh-0" Mar 09 14:31:58 crc kubenswrapper[4722]: I0309 14:31:58.793689 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0083dc56-7904-4b06-8c12-2429f8a7fe9a-combined-ca-bundle\") pod \"aodh-0\" (UID: \"0083dc56-7904-4b06-8c12-2429f8a7fe9a\") " pod="openstack/aodh-0" Mar 09 14:31:58 crc kubenswrapper[4722]: I0309 14:31:58.793935 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-db6mg\" (UniqueName: \"kubernetes.io/projected/0083dc56-7904-4b06-8c12-2429f8a7fe9a-kube-api-access-db6mg\") pod \"aodh-0\" (UID: \"0083dc56-7904-4b06-8c12-2429f8a7fe9a\") " pod="openstack/aodh-0" Mar 09 14:31:58 crc kubenswrapper[4722]: I0309 14:31:58.896267 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0083dc56-7904-4b06-8c12-2429f8a7fe9a-public-tls-certs\") pod \"aodh-0\" (UID: \"0083dc56-7904-4b06-8c12-2429f8a7fe9a\") " pod="openstack/aodh-0" Mar 
09 14:31:58 crc kubenswrapper[4722]: I0309 14:31:58.896340 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0083dc56-7904-4b06-8c12-2429f8a7fe9a-combined-ca-bundle\") pod \"aodh-0\" (UID: \"0083dc56-7904-4b06-8c12-2429f8a7fe9a\") " pod="openstack/aodh-0" Mar 09 14:31:58 crc kubenswrapper[4722]: I0309 14:31:58.896424 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-db6mg\" (UniqueName: \"kubernetes.io/projected/0083dc56-7904-4b06-8c12-2429f8a7fe9a-kube-api-access-db6mg\") pod \"aodh-0\" (UID: \"0083dc56-7904-4b06-8c12-2429f8a7fe9a\") " pod="openstack/aodh-0" Mar 09 14:31:58 crc kubenswrapper[4722]: I0309 14:31:58.896545 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0083dc56-7904-4b06-8c12-2429f8a7fe9a-internal-tls-certs\") pod \"aodh-0\" (UID: \"0083dc56-7904-4b06-8c12-2429f8a7fe9a\") " pod="openstack/aodh-0" Mar 09 14:31:58 crc kubenswrapper[4722]: I0309 14:31:58.896574 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0083dc56-7904-4b06-8c12-2429f8a7fe9a-config-data\") pod \"aodh-0\" (UID: \"0083dc56-7904-4b06-8c12-2429f8a7fe9a\") " pod="openstack/aodh-0" Mar 09 14:31:58 crc kubenswrapper[4722]: I0309 14:31:58.896621 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0083dc56-7904-4b06-8c12-2429f8a7fe9a-scripts\") pod \"aodh-0\" (UID: \"0083dc56-7904-4b06-8c12-2429f8a7fe9a\") " pod="openstack/aodh-0" Mar 09 14:31:58 crc kubenswrapper[4722]: I0309 14:31:58.902323 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0083dc56-7904-4b06-8c12-2429f8a7fe9a-internal-tls-certs\") pod \"aodh-0\" (UID: \"0083dc56-7904-4b06-8c12-2429f8a7fe9a\") " pod="openstack/aodh-0" Mar 09 14:31:58 crc kubenswrapper[4722]: I0309 14:31:58.902468 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0083dc56-7904-4b06-8c12-2429f8a7fe9a-combined-ca-bundle\") pod \"aodh-0\" (UID: \"0083dc56-7904-4b06-8c12-2429f8a7fe9a\") " pod="openstack/aodh-0" Mar 09 14:31:58 crc kubenswrapper[4722]: I0309 14:31:58.902780 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0083dc56-7904-4b06-8c12-2429f8a7fe9a-public-tls-certs\") pod \"aodh-0\" (UID: \"0083dc56-7904-4b06-8c12-2429f8a7fe9a\") " pod="openstack/aodh-0" Mar 09 14:31:58 crc kubenswrapper[4722]: I0309 14:31:58.903260 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0083dc56-7904-4b06-8c12-2429f8a7fe9a-config-data\") pod \"aodh-0\" (UID: \"0083dc56-7904-4b06-8c12-2429f8a7fe9a\") " pod="openstack/aodh-0" Mar 09 14:31:58 crc kubenswrapper[4722]: I0309 14:31:58.904510 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0083dc56-7904-4b06-8c12-2429f8a7fe9a-scripts\") pod \"aodh-0\" (UID: \"0083dc56-7904-4b06-8c12-2429f8a7fe9a\") " pod="openstack/aodh-0" Mar 09 14:31:58 crc kubenswrapper[4722]: I0309 14:31:58.919659 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-db6mg\" 
(UniqueName: \"kubernetes.io/projected/0083dc56-7904-4b06-8c12-2429f8a7fe9a-kube-api-access-db6mg\") pod \"aodh-0\" (UID: \"0083dc56-7904-4b06-8c12-2429f8a7fe9a\") " pod="openstack/aodh-0" Mar 09 14:31:58 crc kubenswrapper[4722]: I0309 14:31:58.996164 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Mar 09 14:31:59 crc kubenswrapper[4722]: I0309 14:31:59.589487 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 09 14:32:00 crc kubenswrapper[4722]: I0309 14:32:00.147780 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551112-ptlnj"] Mar 09 14:32:00 crc kubenswrapper[4722]: I0309 14:32:00.155423 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551112-ptlnj" Mar 09 14:32:00 crc kubenswrapper[4722]: I0309 14:32:00.164015 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 14:32:00 crc kubenswrapper[4722]: I0309 14:32:00.164455 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 14:32:00 crc kubenswrapper[4722]: I0309 14:32:00.165068 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-dr2x6" Mar 09 14:32:00 crc kubenswrapper[4722]: I0309 14:32:00.173379 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32835244-16f5-407f-87fe-8e9f860879a1" path="/var/lib/kubelet/pods/32835244-16f5-407f-87fe-8e9f860879a1/volumes" Mar 09 14:32:00 crc kubenswrapper[4722]: I0309 14:32:00.175108 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551112-ptlnj"] Mar 09 14:32:00 crc kubenswrapper[4722]: I0309 14:32:00.235848 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vjs8\" (UniqueName: \"kubernetes.io/projected/ff07094b-d1ed-4466-b69f-457d3bbacfed-kube-api-access-7vjs8\") pod \"auto-csr-approver-29551112-ptlnj\" (UID: \"ff07094b-d1ed-4466-b69f-457d3bbacfed\") " pod="openshift-infra/auto-csr-approver-29551112-ptlnj" Mar 09 14:32:00 crc kubenswrapper[4722]: I0309 14:32:00.290501 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"0083dc56-7904-4b06-8c12-2429f8a7fe9a","Type":"ContainerStarted","Data":"2d75f65e61b40bee89edfddcadd6e114b3ab09318ead0ddb452c27603e0e0af2"} Mar 09 14:32:00 crc kubenswrapper[4722]: I0309 14:32:00.290542 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"0083dc56-7904-4b06-8c12-2429f8a7fe9a","Type":"ContainerStarted","Data":"ad79c8997725c69a498f6982e5eb5f870afc524627837e1acf14e3cac42c2b75"} Mar 09 14:32:00 crc kubenswrapper[4722]: I0309 14:32:00.338299 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vjs8\" (UniqueName: \"kubernetes.io/projected/ff07094b-d1ed-4466-b69f-457d3bbacfed-kube-api-access-7vjs8\") pod \"auto-csr-approver-29551112-ptlnj\" (UID: \"ff07094b-d1ed-4466-b69f-457d3bbacfed\") " pod="openshift-infra/auto-csr-approver-29551112-ptlnj" Mar 09 14:32:00 crc kubenswrapper[4722]: I0309 14:32:00.360939 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vjs8\" (UniqueName: \"kubernetes.io/projected/ff07094b-d1ed-4466-b69f-457d3bbacfed-kube-api-access-7vjs8\") pod \"auto-csr-approver-29551112-ptlnj\" 
(UID: \"ff07094b-d1ed-4466-b69f-457d3bbacfed\") " pod="openshift-infra/auto-csr-approver-29551112-ptlnj" Mar 09 14:32:00 crc kubenswrapper[4722]: I0309 14:32:00.496467 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551112-ptlnj" Mar 09 14:32:01 crc kubenswrapper[4722]: I0309 14:32:01.018279 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551112-ptlnj"] Mar 09 14:32:01 crc kubenswrapper[4722]: I0309 14:32:01.306791 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551112-ptlnj" event={"ID":"ff07094b-d1ed-4466-b69f-457d3bbacfed","Type":"ContainerStarted","Data":"6d690293b00393b554bf8ddfb5901141047df003bf3cf0cc72bea468645eb807"} Mar 09 14:32:01 crc kubenswrapper[4722]: I0309 14:32:01.547969 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rvtqn" Mar 09 14:32:01 crc kubenswrapper[4722]: I0309 14:32:01.548535 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rvtqn" Mar 09 14:32:01 crc kubenswrapper[4722]: I0309 14:32:01.648597 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rvtqn" Mar 09 14:32:02 crc kubenswrapper[4722]: I0309 14:32:02.152444 4722 scope.go:117] "RemoveContainer" containerID="86a6dcdf7dcf4d48b8bdfb71d9a2c30b89f235700c424312c7e16580a86021d6" Mar 09 14:32:02 crc kubenswrapper[4722]: E0309 14:32:02.152792 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 14:32:02 crc kubenswrapper[4722]: I0309 14:32:02.333872 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"0083dc56-7904-4b06-8c12-2429f8a7fe9a","Type":"ContainerStarted","Data":"7f7614ef6af26774b270349a04a153526bd544d2fbe27ea38db9c47ac1785b4e"} Mar 09 14:32:02 crc kubenswrapper[4722]: I0309 14:32:02.400278 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rvtqn" Mar 09 14:32:02 crc kubenswrapper[4722]: I0309 14:32:02.497163 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rvtqn"] Mar 09 14:32:02 crc kubenswrapper[4722]: I0309 14:32:02.552508 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hrrr6"] Mar 09 14:32:02 crc kubenswrapper[4722]: I0309 14:32:02.552866 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hrrr6" podUID="c85eb92e-6d30-4e52-9176-70140b518ce9" containerName="registry-server" containerID="cri-o://0b9fd0ff003c995530f76971084a2f2fb868bbb21a5f5dc686aa7c069616d9a4" gracePeriod=2 Mar 09 14:32:03 crc kubenswrapper[4722]: I0309 14:32:03.318286 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hrrr6" Mar 09 14:32:03 crc kubenswrapper[4722]: I0309 14:32:03.383543 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"0083dc56-7904-4b06-8c12-2429f8a7fe9a","Type":"ContainerStarted","Data":"3d43e921cf089562168d79c840df54f69deba21a67fe301103919ee694aced01"} Mar 09 14:32:03 crc kubenswrapper[4722]: I0309 14:32:03.394429 4722 generic.go:334] "Generic (PLEG): container finished" podID="c85eb92e-6d30-4e52-9176-70140b518ce9" containerID="0b9fd0ff003c995530f76971084a2f2fb868bbb21a5f5dc686aa7c069616d9a4" exitCode=0 Mar 09 14:32:03 crc kubenswrapper[4722]: I0309 14:32:03.394507 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hrrr6" event={"ID":"c85eb92e-6d30-4e52-9176-70140b518ce9","Type":"ContainerDied","Data":"0b9fd0ff003c995530f76971084a2f2fb868bbb21a5f5dc686aa7c069616d9a4"} Mar 09 14:32:03 crc kubenswrapper[4722]: I0309 14:32:03.394580 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hrrr6" event={"ID":"c85eb92e-6d30-4e52-9176-70140b518ce9","Type":"ContainerDied","Data":"efb2586f3d856f1ee84b0d6379cd480c55766ad7bd456633efe29691c223582b"} Mar 09 14:32:03 crc kubenswrapper[4722]: I0309 14:32:03.394603 4722 scope.go:117] "RemoveContainer" containerID="0b9fd0ff003c995530f76971084a2f2fb868bbb21a5f5dc686aa7c069616d9a4" Mar 09 14:32:03 crc kubenswrapper[4722]: I0309 14:32:03.395266 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hrrr6" Mar 09 14:32:03 crc kubenswrapper[4722]: I0309 14:32:03.457736 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4mnhk\" (UniqueName: \"kubernetes.io/projected/c85eb92e-6d30-4e52-9176-70140b518ce9-kube-api-access-4mnhk\") pod \"c85eb92e-6d30-4e52-9176-70140b518ce9\" (UID: \"c85eb92e-6d30-4e52-9176-70140b518ce9\") " Mar 09 14:32:03 crc kubenswrapper[4722]: I0309 14:32:03.458325 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c85eb92e-6d30-4e52-9176-70140b518ce9-catalog-content\") pod \"c85eb92e-6d30-4e52-9176-70140b518ce9\" (UID: \"c85eb92e-6d30-4e52-9176-70140b518ce9\") " Mar 09 14:32:03 crc kubenswrapper[4722]: I0309 14:32:03.458565 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c85eb92e-6d30-4e52-9176-70140b518ce9-utilities\") pod \"c85eb92e-6d30-4e52-9176-70140b518ce9\" (UID: \"c85eb92e-6d30-4e52-9176-70140b518ce9\") " Mar 09 14:32:03 crc kubenswrapper[4722]: I0309 14:32:03.464814 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c85eb92e-6d30-4e52-9176-70140b518ce9-utilities" (OuterVolumeSpecName: "utilities") pod "c85eb92e-6d30-4e52-9176-70140b518ce9" (UID: "c85eb92e-6d30-4e52-9176-70140b518ce9"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:32:03 crc kubenswrapper[4722]: I0309 14:32:03.471568 4722 scope.go:117] "RemoveContainer" containerID="b72d6c9c9baff3642f192cd7fe21a0d800fe78ebbb0e9106e4c2ff7389e60b81" Mar 09 14:32:03 crc kubenswrapper[4722]: I0309 14:32:03.477517 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c85eb92e-6d30-4e52-9176-70140b518ce9-kube-api-access-4mnhk" (OuterVolumeSpecName: "kube-api-access-4mnhk") pod "c85eb92e-6d30-4e52-9176-70140b518ce9" (UID: "c85eb92e-6d30-4e52-9176-70140b518ce9"). InnerVolumeSpecName "kube-api-access-4mnhk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:32:03 crc kubenswrapper[4722]: I0309 14:32:03.569697 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4mnhk\" (UniqueName: \"kubernetes.io/projected/c85eb92e-6d30-4e52-9176-70140b518ce9-kube-api-access-4mnhk\") on node \"crc\" DevicePath \"\"" Mar 09 14:32:03 crc kubenswrapper[4722]: I0309 14:32:03.569729 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c85eb92e-6d30-4e52-9176-70140b518ce9-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 14:32:03 crc kubenswrapper[4722]: I0309 14:32:03.617425 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c85eb92e-6d30-4e52-9176-70140b518ce9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c85eb92e-6d30-4e52-9176-70140b518ce9" (UID: "c85eb92e-6d30-4e52-9176-70140b518ce9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:32:03 crc kubenswrapper[4722]: I0309 14:32:03.624929 4722 scope.go:117] "RemoveContainer" containerID="974026ef92eb9be1a82222b8f60ee7e5839cd035daee5e85fb5ebb63c9f5c990" Mar 09 14:32:03 crc kubenswrapper[4722]: I0309 14:32:03.671878 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c85eb92e-6d30-4e52-9176-70140b518ce9-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 14:32:03 crc kubenswrapper[4722]: I0309 14:32:03.691696 4722 scope.go:117] "RemoveContainer" containerID="0b9fd0ff003c995530f76971084a2f2fb868bbb21a5f5dc686aa7c069616d9a4" Mar 09 14:32:03 crc kubenswrapper[4722]: E0309 14:32:03.692068 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b9fd0ff003c995530f76971084a2f2fb868bbb21a5f5dc686aa7c069616d9a4\": container with ID starting with 0b9fd0ff003c995530f76971084a2f2fb868bbb21a5f5dc686aa7c069616d9a4 not found: ID does not exist" containerID="0b9fd0ff003c995530f76971084a2f2fb868bbb21a5f5dc686aa7c069616d9a4" Mar 09 14:32:03 crc kubenswrapper[4722]: I0309 14:32:03.692097 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b9fd0ff003c995530f76971084a2f2fb868bbb21a5f5dc686aa7c069616d9a4"} err="failed to get container status \"0b9fd0ff003c995530f76971084a2f2fb868bbb21a5f5dc686aa7c069616d9a4\": rpc error: code = NotFound desc = could not find container \"0b9fd0ff003c995530f76971084a2f2fb868bbb21a5f5dc686aa7c069616d9a4\": container with ID starting with 0b9fd0ff003c995530f76971084a2f2fb868bbb21a5f5dc686aa7c069616d9a4 not found: ID does not exist" Mar 09 14:32:03 crc kubenswrapper[4722]: I0309 14:32:03.692118 4722 scope.go:117] "RemoveContainer" containerID="b72d6c9c9baff3642f192cd7fe21a0d800fe78ebbb0e9106e4c2ff7389e60b81" Mar 09 
14:32:03 crc kubenswrapper[4722]: E0309 14:32:03.692487 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b72d6c9c9baff3642f192cd7fe21a0d800fe78ebbb0e9106e4c2ff7389e60b81\": container with ID starting with b72d6c9c9baff3642f192cd7fe21a0d800fe78ebbb0e9106e4c2ff7389e60b81 not found: ID does not exist" containerID="b72d6c9c9baff3642f192cd7fe21a0d800fe78ebbb0e9106e4c2ff7389e60b81" Mar 09 14:32:03 crc kubenswrapper[4722]: I0309 14:32:03.692514 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b72d6c9c9baff3642f192cd7fe21a0d800fe78ebbb0e9106e4c2ff7389e60b81"} err="failed to get container status \"b72d6c9c9baff3642f192cd7fe21a0d800fe78ebbb0e9106e4c2ff7389e60b81\": rpc error: code = NotFound desc = could not find container \"b72d6c9c9baff3642f192cd7fe21a0d800fe78ebbb0e9106e4c2ff7389e60b81\": container with ID starting with b72d6c9c9baff3642f192cd7fe21a0d800fe78ebbb0e9106e4c2ff7389e60b81 not found: ID does not exist" Mar 09 14:32:03 crc kubenswrapper[4722]: I0309 14:32:03.692529 4722 scope.go:117] "RemoveContainer" containerID="974026ef92eb9be1a82222b8f60ee7e5839cd035daee5e85fb5ebb63c9f5c990" Mar 09 14:32:03 crc kubenswrapper[4722]: E0309 14:32:03.692712 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"974026ef92eb9be1a82222b8f60ee7e5839cd035daee5e85fb5ebb63c9f5c990\": container with ID starting with 974026ef92eb9be1a82222b8f60ee7e5839cd035daee5e85fb5ebb63c9f5c990 not found: ID does not exist" containerID="974026ef92eb9be1a82222b8f60ee7e5839cd035daee5e85fb5ebb63c9f5c990" Mar 09 14:32:03 crc kubenswrapper[4722]: I0309 14:32:03.692733 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"974026ef92eb9be1a82222b8f60ee7e5839cd035daee5e85fb5ebb63c9f5c990"} err="failed to get container status \"974026ef92eb9be1a82222b8f60ee7e5839cd035daee5e85fb5ebb63c9f5c990\": rpc error: code = NotFound desc = could not find container \"974026ef92eb9be1a82222b8f60ee7e5839cd035daee5e85fb5ebb63c9f5c990\": container with ID starting with 974026ef92eb9be1a82222b8f60ee7e5839cd035daee5e85fb5ebb63c9f5c990 not found: ID does not exist" Mar 09 14:32:03 crc kubenswrapper[4722]: I0309 14:32:03.745000 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hrrr6"] Mar 09 14:32:03 crc kubenswrapper[4722]: I0309 14:32:03.756896 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hrrr6"] Mar 09 14:32:04 crc kubenswrapper[4722]: I0309 14:32:04.169304 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c85eb92e-6d30-4e52-9176-70140b518ce9" path="/var/lib/kubelet/pods/c85eb92e-6d30-4e52-9176-70140b518ce9/volumes" Mar 09 14:32:04 crc kubenswrapper[4722]: I0309 14:32:04.520069 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551112-ptlnj" event={"ID":"ff07094b-d1ed-4466-b69f-457d3bbacfed","Type":"ContainerStarted","Data":"ab9bc4b07b84903b4fc451399f48da08c27672ca2db39820c74ff7fcc92fd841"} Mar 09 14:32:04 crc kubenswrapper[4722]: I0309 14:32:04.554632 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551112-ptlnj" podStartSLOduration=2.561533765 podStartE2EDuration="4.554609714s" podCreationTimestamp="2026-03-09 14:32:00 +0000 UTC" firstStartedPulling="2026-03-09 
14:32:01.106515358 +0000 UTC m=+1761.662083934" lastFinishedPulling="2026-03-09 14:32:03.099591307 +0000 UTC m=+1763.655159883" observedRunningTime="2026-03-09 14:32:04.534770097 +0000 UTC m=+1765.090338683" watchObservedRunningTime="2026-03-09 14:32:04.554609714 +0000 UTC m=+1765.110178290" Mar 09 14:32:05 crc kubenswrapper[4722]: I0309 14:32:05.532192 4722 generic.go:334] "Generic (PLEG): container finished" podID="ff07094b-d1ed-4466-b69f-457d3bbacfed" containerID="ab9bc4b07b84903b4fc451399f48da08c27672ca2db39820c74ff7fcc92fd841" exitCode=0 Mar 09 14:32:05 crc kubenswrapper[4722]: I0309 14:32:05.532310 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551112-ptlnj" event={"ID":"ff07094b-d1ed-4466-b69f-457d3bbacfed","Type":"ContainerDied","Data":"ab9bc4b07b84903b4fc451399f48da08c27672ca2db39820c74ff7fcc92fd841"} Mar 09 14:32:05 crc kubenswrapper[4722]: I0309 14:32:05.536555 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"0083dc56-7904-4b06-8c12-2429f8a7fe9a","Type":"ContainerStarted","Data":"2862b92e774c366ba182322ae8e651b3d2c4f846a33c5d61de728de72d236cc0"} Mar 09 14:32:05 crc kubenswrapper[4722]: I0309 14:32:05.568875 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.150865046 podStartE2EDuration="7.568854281s" podCreationTimestamp="2026-03-09 14:31:58 +0000 UTC" firstStartedPulling="2026-03-09 14:31:59.592975347 +0000 UTC m=+1760.148543913" lastFinishedPulling="2026-03-09 14:32:05.010964572 +0000 UTC m=+1765.566533148" observedRunningTime="2026-03-09 14:32:05.566929239 +0000 UTC m=+1766.122497825" watchObservedRunningTime="2026-03-09 14:32:05.568854281 +0000 UTC m=+1766.124422857" Mar 09 14:32:07 crc kubenswrapper[4722]: I0309 14:32:07.068495 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551112-ptlnj" Mar 09 14:32:07 crc kubenswrapper[4722]: I0309 14:32:07.178304 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vjs8\" (UniqueName: \"kubernetes.io/projected/ff07094b-d1ed-4466-b69f-457d3bbacfed-kube-api-access-7vjs8\") pod \"ff07094b-d1ed-4466-b69f-457d3bbacfed\" (UID: \"ff07094b-d1ed-4466-b69f-457d3bbacfed\") " Mar 09 14:32:07 crc kubenswrapper[4722]: I0309 14:32:07.187044 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff07094b-d1ed-4466-b69f-457d3bbacfed-kube-api-access-7vjs8" (OuterVolumeSpecName: "kube-api-access-7vjs8") pod "ff07094b-d1ed-4466-b69f-457d3bbacfed" (UID: "ff07094b-d1ed-4466-b69f-457d3bbacfed"). InnerVolumeSpecName "kube-api-access-7vjs8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:32:07 crc kubenswrapper[4722]: I0309 14:32:07.282144 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7vjs8\" (UniqueName: \"kubernetes.io/projected/ff07094b-d1ed-4466-b69f-457d3bbacfed-kube-api-access-7vjs8\") on node \"crc\" DevicePath \"\"" Mar 09 14:32:07 crc kubenswrapper[4722]: I0309 14:32:07.572668 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551112-ptlnj" event={"ID":"ff07094b-d1ed-4466-b69f-457d3bbacfed","Type":"ContainerDied","Data":"6d690293b00393b554bf8ddfb5901141047df003bf3cf0cc72bea468645eb807"} Mar 09 14:32:07 crc kubenswrapper[4722]: I0309 14:32:07.572719 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d690293b00393b554bf8ddfb5901141047df003bf3cf0cc72bea468645eb807" Mar 09 14:32:07 crc kubenswrapper[4722]: I0309 14:32:07.572785 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551112-ptlnj" Mar 09 14:32:07 crc kubenswrapper[4722]: I0309 14:32:07.581511 4722 generic.go:334] "Generic (PLEG): container finished" podID="5f8120f5-690a-4bb4-ba23-dead16d6946f" containerID="6d48e426fd3eee9a37856fba92065a93a3336de7f393be272fd17621990f78b6" exitCode=0 Mar 09 14:32:07 crc kubenswrapper[4722]: I0309 14:32:07.581582 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-rct52" event={"ID":"5f8120f5-690a-4bb4-ba23-dead16d6946f","Type":"ContainerDied","Data":"6d48e426fd3eee9a37856fba92065a93a3336de7f393be272fd17621990f78b6"} Mar 09 14:32:07 crc kubenswrapper[4722]: I0309 14:32:07.621884 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551106-tzqf7"] Mar 09 14:32:07 crc kubenswrapper[4722]: I0309 14:32:07.652951 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551106-tzqf7"] Mar 09 14:32:08 crc kubenswrapper[4722]: I0309 14:32:08.170177 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e7bbbcf-fe3a-4007-b98d-2692b6f6cbe5" path="/var/lib/kubelet/pods/6e7bbbcf-fe3a-4007-b98d-2692b6f6cbe5/volumes" Mar 09 14:32:09 crc kubenswrapper[4722]: I0309 14:32:09.247660 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-rct52" Mar 09 14:32:09 crc kubenswrapper[4722]: I0309 14:32:09.342294 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5tgb\" (UniqueName: \"kubernetes.io/projected/5f8120f5-690a-4bb4-ba23-dead16d6946f-kube-api-access-r5tgb\") pod \"5f8120f5-690a-4bb4-ba23-dead16d6946f\" (UID: \"5f8120f5-690a-4bb4-ba23-dead16d6946f\") " Mar 09 14:32:09 crc kubenswrapper[4722]: I0309 14:32:09.342640 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f8120f5-690a-4bb4-ba23-dead16d6946f-inventory\") pod \"5f8120f5-690a-4bb4-ba23-dead16d6946f\" (UID: \"5f8120f5-690a-4bb4-ba23-dead16d6946f\") " Mar 09 14:32:09 crc kubenswrapper[4722]: I0309 14:32:09.342704 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5f8120f5-690a-4bb4-ba23-dead16d6946f-ssh-key-openstack-edpm-ipam\") pod \"5f8120f5-690a-4bb4-ba23-dead16d6946f\" (UID: \"5f8120f5-690a-4bb4-ba23-dead16d6946f\") " Mar 09 14:32:09 crc kubenswrapper[4722]: I0309 14:32:09.342742 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f8120f5-690a-4bb4-ba23-dead16d6946f-repo-setup-combined-ca-bundle\") pod \"5f8120f5-690a-4bb4-ba23-dead16d6946f\" (UID: \"5f8120f5-690a-4bb4-ba23-dead16d6946f\") " Mar 09 14:32:09 crc kubenswrapper[4722]: I0309 14:32:09.353010 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f8120f5-690a-4bb4-ba23-dead16d6946f-kube-api-access-r5tgb" (OuterVolumeSpecName: "kube-api-access-r5tgb") pod "5f8120f5-690a-4bb4-ba23-dead16d6946f" (UID: "5f8120f5-690a-4bb4-ba23-dead16d6946f"). InnerVolumeSpecName "kube-api-access-r5tgb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:32:09 crc kubenswrapper[4722]: I0309 14:32:09.356346 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f8120f5-690a-4bb4-ba23-dead16d6946f-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "5f8120f5-690a-4bb4-ba23-dead16d6946f" (UID: "5f8120f5-690a-4bb4-ba23-dead16d6946f"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:32:09 crc kubenswrapper[4722]: I0309 14:32:09.377661 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f8120f5-690a-4bb4-ba23-dead16d6946f-inventory" (OuterVolumeSpecName: "inventory") pod "5f8120f5-690a-4bb4-ba23-dead16d6946f" (UID: "5f8120f5-690a-4bb4-ba23-dead16d6946f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:32:09 crc kubenswrapper[4722]: I0309 14:32:09.403033 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f8120f5-690a-4bb4-ba23-dead16d6946f-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "5f8120f5-690a-4bb4-ba23-dead16d6946f" (UID: "5f8120f5-690a-4bb4-ba23-dead16d6946f"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:32:09 crc kubenswrapper[4722]: I0309 14:32:09.446686 4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f8120f5-690a-4bb4-ba23-dead16d6946f-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 14:32:09 crc kubenswrapper[4722]: I0309 14:32:09.446733 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5f8120f5-690a-4bb4-ba23-dead16d6946f-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 14:32:09 crc kubenswrapper[4722]: I0309 14:32:09.446759 4722 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f8120f5-690a-4bb4-ba23-dead16d6946f-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:32:09 crc kubenswrapper[4722]: I0309 14:32:09.446778 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r5tgb\" (UniqueName: \"kubernetes.io/projected/5f8120f5-690a-4bb4-ba23-dead16d6946f-kube-api-access-r5tgb\") on node \"crc\" DevicePath \"\"" Mar 09 14:32:09 crc kubenswrapper[4722]: I0309 14:32:09.610388 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-rct52" event={"ID":"5f8120f5-690a-4bb4-ba23-dead16d6946f","Type":"ContainerDied","Data":"fa86e54ae662b8ada509c7012ce7a0bbe3973aedb3398e2fbcaa18191a085013"} Mar 09 14:32:09 crc kubenswrapper[4722]: I0309 14:32:09.610421 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa86e54ae662b8ada509c7012ce7a0bbe3973aedb3398e2fbcaa18191a085013" Mar 09 14:32:09 crc kubenswrapper[4722]: I0309 14:32:09.610455 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-rct52" Mar 09 14:32:09 crc kubenswrapper[4722]: I0309 14:32:09.712640 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-7k5vg"] Mar 09 14:32:09 crc kubenswrapper[4722]: E0309 14:32:09.713389 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c85eb92e-6d30-4e52-9176-70140b518ce9" containerName="registry-server" Mar 09 14:32:09 crc kubenswrapper[4722]: I0309 14:32:09.713414 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="c85eb92e-6d30-4e52-9176-70140b518ce9" containerName="registry-server" Mar 09 14:32:09 crc kubenswrapper[4722]: E0309 14:32:09.713433 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f8120f5-690a-4bb4-ba23-dead16d6946f" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 09 14:32:09 crc kubenswrapper[4722]: I0309 14:32:09.713443 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f8120f5-690a-4bb4-ba23-dead16d6946f" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 09 14:32:09 crc kubenswrapper[4722]: E0309 14:32:09.713466 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c85eb92e-6d30-4e52-9176-70140b518ce9" containerName="extract-utilities" Mar 09 14:32:09 crc kubenswrapper[4722]: I0309 14:32:09.713475 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="c85eb92e-6d30-4e52-9176-70140b518ce9" containerName="extract-utilities" Mar 09 14:32:09 crc kubenswrapper[4722]: E0309 14:32:09.713504 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff07094b-d1ed-4466-b69f-457d3bbacfed" containerName="oc" Mar 09 14:32:09 crc kubenswrapper[4722]: I0309 14:32:09.713519 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff07094b-d1ed-4466-b69f-457d3bbacfed" containerName="oc" Mar 09 14:32:09 crc kubenswrapper[4722]: E0309 14:32:09.713541 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c85eb92e-6d30-4e52-9176-70140b518ce9" containerName="extract-content" Mar 09 14:32:09 crc kubenswrapper[4722]: I0309 14:32:09.713551 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="c85eb92e-6d30-4e52-9176-70140b518ce9" containerName="extract-content" Mar 09 14:32:09 crc kubenswrapper[4722]: I0309 14:32:09.713924 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="c85eb92e-6d30-4e52-9176-70140b518ce9" containerName="registry-server" Mar 09 14:32:09 crc kubenswrapper[4722]: I0309 14:32:09.713949 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f8120f5-690a-4bb4-ba23-dead16d6946f" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 09 14:32:09 crc kubenswrapper[4722]: I0309 14:32:09.713964 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff07094b-d1ed-4466-b69f-457d3bbacfed" containerName="oc" Mar 09 14:32:09 crc kubenswrapper[4722]: I0309 14:32:09.725587 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7k5vg" Mar 09 14:32:09 crc kubenswrapper[4722]: I0309 14:32:09.736233 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dwlbg" Mar 09 14:32:09 crc kubenswrapper[4722]: I0309 14:32:09.736660 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 14:32:09 crc kubenswrapper[4722]: I0309 14:32:09.737095 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 14:32:09 crc kubenswrapper[4722]: I0309 14:32:09.737304 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 14:32:09 crc kubenswrapper[4722]: I0309 14:32:09.770190 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-7k5vg"] Mar 09 14:32:09 crc kubenswrapper[4722]: I0309 14:32:09.862732 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fd8102b7-59d6-4f93-8f71-43701a8b99ad-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-7k5vg\" (UID: \"fd8102b7-59d6-4f93-8f71-43701a8b99ad\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7k5vg" Mar 09 14:32:09 crc kubenswrapper[4722]: I0309 14:32:09.863212 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fd8102b7-59d6-4f93-8f71-43701a8b99ad-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-7k5vg\" (UID: \"fd8102b7-59d6-4f93-8f71-43701a8b99ad\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7k5vg" Mar 09 14:32:09 crc kubenswrapper[4722]: I0309 14:32:09.863248 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pss2f\" (UniqueName: \"kubernetes.io/projected/fd8102b7-59d6-4f93-8f71-43701a8b99ad-kube-api-access-pss2f\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-7k5vg\" (UID: \"fd8102b7-59d6-4f93-8f71-43701a8b99ad\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7k5vg" Mar 09 14:32:09 crc kubenswrapper[4722]: I0309 14:32:09.965674 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fd8102b7-59d6-4f93-8f71-43701a8b99ad-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-7k5vg\" (UID: \"fd8102b7-59d6-4f93-8f71-43701a8b99ad\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7k5vg" Mar 09 14:32:09 crc kubenswrapper[4722]: I0309 14:32:09.965730 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pss2f\" (UniqueName: \"kubernetes.io/projected/fd8102b7-59d6-4f93-8f71-43701a8b99ad-kube-api-access-pss2f\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-7k5vg\" (UID: \"fd8102b7-59d6-4f93-8f71-43701a8b99ad\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7k5vg" Mar 09 14:32:09 crc kubenswrapper[4722]: I0309 14:32:09.965925 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fd8102b7-59d6-4f93-8f71-43701a8b99ad-ssh-key-openstack-edpm-ipam\") pod 
\"redhat-edpm-deployment-openstack-edpm-ipam-7k5vg\" (UID: \"fd8102b7-59d6-4f93-8f71-43701a8b99ad\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7k5vg" Mar 09 14:32:09 crc kubenswrapper[4722]: I0309 14:32:09.969286 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fd8102b7-59d6-4f93-8f71-43701a8b99ad-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-7k5vg\" (UID: \"fd8102b7-59d6-4f93-8f71-43701a8b99ad\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7k5vg" Mar 09 14:32:09 crc kubenswrapper[4722]: I0309 14:32:09.969612 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fd8102b7-59d6-4f93-8f71-43701a8b99ad-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-7k5vg\" (UID: \"fd8102b7-59d6-4f93-8f71-43701a8b99ad\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7k5vg" Mar 09 14:32:09 crc kubenswrapper[4722]: I0309 14:32:09.981294 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pss2f\" (UniqueName: \"kubernetes.io/projected/fd8102b7-59d6-4f93-8f71-43701a8b99ad-kube-api-access-pss2f\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-7k5vg\" (UID: \"fd8102b7-59d6-4f93-8f71-43701a8b99ad\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7k5vg" Mar 09 14:32:10 crc kubenswrapper[4722]: I0309 14:32:10.053146 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7k5vg" Mar 09 14:32:10 crc kubenswrapper[4722]: I0309 14:32:10.758799 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-7k5vg"] Mar 09 14:32:11 crc kubenswrapper[4722]: I0309 14:32:11.633556 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7k5vg" event={"ID":"fd8102b7-59d6-4f93-8f71-43701a8b99ad","Type":"ContainerStarted","Data":"6d506229d11672f346c801381fb9b4cfa1d95b59b5c5b0ebb9359f4e76b7d9c7"} Mar 09 14:32:11 crc kubenswrapper[4722]: I0309 14:32:11.634350 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7k5vg" event={"ID":"fd8102b7-59d6-4f93-8f71-43701a8b99ad","Type":"ContainerStarted","Data":"212596b5b054cbc15f10fa642e86653c89a8b804f75526bcec52a1598057633e"} Mar 09 14:32:11 crc kubenswrapper[4722]: I0309 14:32:11.673649 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7k5vg" podStartSLOduration=2.2462673029999998 podStartE2EDuration="2.673629612s" podCreationTimestamp="2026-03-09 14:32:09 +0000 UTC" firstStartedPulling="2026-03-09 14:32:10.756261386 +0000 UTC m=+1771.311829962" lastFinishedPulling="2026-03-09 14:32:11.183623685 +0000 UTC m=+1771.739192271" observedRunningTime="2026-03-09 14:32:11.650810612 +0000 UTC m=+1772.206379188" watchObservedRunningTime="2026-03-09 14:32:11.673629612 +0000 UTC m=+1772.229198178" Mar 09 14:32:14 crc kubenswrapper[4722]: I0309 14:32:14.685700 4722 generic.go:334] "Generic (PLEG): container finished" podID="fd8102b7-59d6-4f93-8f71-43701a8b99ad" containerID="6d506229d11672f346c801381fb9b4cfa1d95b59b5c5b0ebb9359f4e76b7d9c7" exitCode=0 Mar 09 14:32:14 crc kubenswrapper[4722]: I0309 14:32:14.685810 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7k5vg" event={"ID":"fd8102b7-59d6-4f93-8f71-43701a8b99ad","Type":"ContainerDied","Data":"6d506229d11672f346c801381fb9b4cfa1d95b59b5c5b0ebb9359f4e76b7d9c7"} Mar 09 14:32:16 crc kubenswrapper[4722]: I0309 14:32:16.150027 4722 scope.go:117] "RemoveContainer" containerID="86a6dcdf7dcf4d48b8bdfb71d9a2c30b89f235700c424312c7e16580a86021d6" Mar 09 14:32:16 crc kubenswrapper[4722]: E0309 14:32:16.151030 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 14:32:16 crc kubenswrapper[4722]: I0309 14:32:16.314667 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7k5vg" Mar 09 14:32:16 crc kubenswrapper[4722]: I0309 14:32:16.422250 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pss2f\" (UniqueName: \"kubernetes.io/projected/fd8102b7-59d6-4f93-8f71-43701a8b99ad-kube-api-access-pss2f\") pod \"fd8102b7-59d6-4f93-8f71-43701a8b99ad\" (UID: \"fd8102b7-59d6-4f93-8f71-43701a8b99ad\") " Mar 09 14:32:16 crc kubenswrapper[4722]: I0309 14:32:16.422513 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fd8102b7-59d6-4f93-8f71-43701a8b99ad-inventory\") pod \"fd8102b7-59d6-4f93-8f71-43701a8b99ad\" (UID: \"fd8102b7-59d6-4f93-8f71-43701a8b99ad\") " Mar 09 14:32:16 crc kubenswrapper[4722]: I0309 14:32:16.422607 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fd8102b7-59d6-4f93-8f71-43701a8b99ad-ssh-key-openstack-edpm-ipam\") pod \"fd8102b7-59d6-4f93-8f71-43701a8b99ad\" (UID: \"fd8102b7-59d6-4f93-8f71-43701a8b99ad\") " Mar 09 14:32:16 crc kubenswrapper[4722]: I0309 14:32:16.427696 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd8102b7-59d6-4f93-8f71-43701a8b99ad-kube-api-access-pss2f" (OuterVolumeSpecName: "kube-api-access-pss2f") pod "fd8102b7-59d6-4f93-8f71-43701a8b99ad" (UID: "fd8102b7-59d6-4f93-8f71-43701a8b99ad"). InnerVolumeSpecName "kube-api-access-pss2f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:32:16 crc kubenswrapper[4722]: I0309 14:32:16.456555 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd8102b7-59d6-4f93-8f71-43701a8b99ad-inventory" (OuterVolumeSpecName: "inventory") pod "fd8102b7-59d6-4f93-8f71-43701a8b99ad" (UID: "fd8102b7-59d6-4f93-8f71-43701a8b99ad"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:32:16 crc kubenswrapper[4722]: I0309 14:32:16.463246 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd8102b7-59d6-4f93-8f71-43701a8b99ad-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "fd8102b7-59d6-4f93-8f71-43701a8b99ad" (UID: "fd8102b7-59d6-4f93-8f71-43701a8b99ad"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:32:16 crc kubenswrapper[4722]: I0309 14:32:16.532105 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pss2f\" (UniqueName: \"kubernetes.io/projected/fd8102b7-59d6-4f93-8f71-43701a8b99ad-kube-api-access-pss2f\") on node \"crc\" DevicePath \"\"" Mar 09 14:32:16 crc kubenswrapper[4722]: I0309 14:32:16.532164 4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fd8102b7-59d6-4f93-8f71-43701a8b99ad-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 14:32:16 crc kubenswrapper[4722]: I0309 14:32:16.532181 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fd8102b7-59d6-4f93-8f71-43701a8b99ad-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 14:32:16 crc kubenswrapper[4722]: I0309 14:32:16.712795 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7k5vg" event={"ID":"fd8102b7-59d6-4f93-8f71-43701a8b99ad","Type":"ContainerDied","Data":"212596b5b054cbc15f10fa642e86653c89a8b804f75526bcec52a1598057633e"} Mar 09 14:32:16 crc kubenswrapper[4722]: I0309 14:32:16.712839 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="212596b5b054cbc15f10fa642e86653c89a8b804f75526bcec52a1598057633e" Mar 09 14:32:16 crc kubenswrapper[4722]: I0309 14:32:16.712884 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-7k5vg" Mar 09 14:32:16 crc kubenswrapper[4722]: I0309 14:32:16.799701 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fsqlh"] Mar 09 14:32:16 crc kubenswrapper[4722]: E0309 14:32:16.800219 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd8102b7-59d6-4f93-8f71-43701a8b99ad" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 09 14:32:16 crc kubenswrapper[4722]: I0309 14:32:16.800237 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd8102b7-59d6-4f93-8f71-43701a8b99ad" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 09 14:32:16 crc kubenswrapper[4722]: I0309 14:32:16.800443 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd8102b7-59d6-4f93-8f71-43701a8b99ad" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 09 14:32:16 crc kubenswrapper[4722]: I0309 14:32:16.801263 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fsqlh" Mar 09 14:32:16 crc kubenswrapper[4722]: I0309 14:32:16.831528 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 14:32:16 crc kubenswrapper[4722]: I0309 14:32:16.832061 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dwlbg" Mar 09 14:32:16 crc kubenswrapper[4722]: I0309 14:32:16.832349 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 14:32:16 crc kubenswrapper[4722]: I0309 14:32:16.832545 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 14:32:16 crc kubenswrapper[4722]: I0309 14:32:16.841356 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6whl\" (UniqueName: \"kubernetes.io/projected/4a60c1ca-738f-48a9-b972-3eef08a28fc6-kube-api-access-f6whl\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fsqlh\" (UID: \"4a60c1ca-738f-48a9-b972-3eef08a28fc6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fsqlh" Mar 09 14:32:16 crc kubenswrapper[4722]: I0309 14:32:16.841631 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4a60c1ca-738f-48a9-b972-3eef08a28fc6-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fsqlh\" (UID: \"4a60c1ca-738f-48a9-b972-3eef08a28fc6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fsqlh" Mar 09 14:32:16 crc kubenswrapper[4722]: I0309 14:32:16.841720 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a60c1ca-738f-48a9-b972-3eef08a28fc6-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fsqlh\" (UID: \"4a60c1ca-738f-48a9-b972-3eef08a28fc6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fsqlh" Mar 09 14:32:16 crc kubenswrapper[4722]: I0309 14:32:16.841884 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4a60c1ca-738f-48a9-b972-3eef08a28fc6-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fsqlh\" (UID: \"4a60c1ca-738f-48a9-b972-3eef08a28fc6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fsqlh" Mar 09 14:32:16 crc kubenswrapper[4722]: I0309 14:32:16.853844 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fsqlh"] Mar 09 14:32:16 crc kubenswrapper[4722]: I0309 14:32:16.943121 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6whl\" (UniqueName: \"kubernetes.io/projected/4a60c1ca-738f-48a9-b972-3eef08a28fc6-kube-api-access-f6whl\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fsqlh\" (UID: \"4a60c1ca-738f-48a9-b972-3eef08a28fc6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fsqlh" Mar 09 14:32:16 crc kubenswrapper[4722]: I0309 14:32:16.943407 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/4a60c1ca-738f-48a9-b972-3eef08a28fc6-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fsqlh\" (UID: \"4a60c1ca-738f-48a9-b972-3eef08a28fc6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fsqlh" Mar 09 14:32:16 crc kubenswrapper[4722]: I0309 14:32:16.943533 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a60c1ca-738f-48a9-b972-3eef08a28fc6-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fsqlh\" (UID: \"4a60c1ca-738f-48a9-b972-3eef08a28fc6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fsqlh" Mar 09 14:32:16 crc kubenswrapper[4722]: I0309 14:32:16.943679 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4a60c1ca-738f-48a9-b972-3eef08a28fc6-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fsqlh\" (UID: \"4a60c1ca-738f-48a9-b972-3eef08a28fc6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fsqlh" Mar 09 14:32:16 crc kubenswrapper[4722]: I0309 14:32:16.954258 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a60c1ca-738f-48a9-b972-3eef08a28fc6-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fsqlh\" (UID: \"4a60c1ca-738f-48a9-b972-3eef08a28fc6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fsqlh" Mar 09 14:32:16 crc kubenswrapper[4722]: I0309 14:32:16.954906 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4a60c1ca-738f-48a9-b972-3eef08a28fc6-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fsqlh\" (UID: \"4a60c1ca-738f-48a9-b972-3eef08a28fc6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fsqlh" Mar 09 14:32:16 crc kubenswrapper[4722]: I0309 14:32:16.955835 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4a60c1ca-738f-48a9-b972-3eef08a28fc6-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fsqlh\" (UID: \"4a60c1ca-738f-48a9-b972-3eef08a28fc6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fsqlh" Mar 09 14:32:16 crc kubenswrapper[4722]: I0309 14:32:16.958428 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6whl\" (UniqueName: \"kubernetes.io/projected/4a60c1ca-738f-48a9-b972-3eef08a28fc6-kube-api-access-f6whl\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-fsqlh\" (UID: \"4a60c1ca-738f-48a9-b972-3eef08a28fc6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fsqlh" Mar 09 14:32:17 crc kubenswrapper[4722]: I0309 14:32:17.164103 4722 util.go:30] "No sandbox for pod can be found. 
Mar 09 14:32:17 crc kubenswrapper[4722]: I0309 14:32:17.778464 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fsqlh"]
Mar 09 14:32:18 crc kubenswrapper[4722]: I0309 14:32:18.740434 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fsqlh" event={"ID":"4a60c1ca-738f-48a9-b972-3eef08a28fc6","Type":"ContainerStarted","Data":"2179063018fc63a106d493df46f46ad2bd648830e45bdf5244a79f206eb9dc1b"}
Mar 09 14:32:18 crc kubenswrapper[4722]: I0309 14:32:18.740899 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fsqlh" event={"ID":"4a60c1ca-738f-48a9-b972-3eef08a28fc6","Type":"ContainerStarted","Data":"7d3d0a0344f13e515d04bc7e76c95ea37d872c7f12dcfd287aecc55e73a701be"}
Mar 09 14:32:18 crc kubenswrapper[4722]: I0309 14:32:18.764256 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fsqlh" podStartSLOduration=2.23988828 podStartE2EDuration="2.764234435s" podCreationTimestamp="2026-03-09 14:32:16 +0000 UTC" firstStartedPulling="2026-03-09 14:32:17.772666573 +0000 UTC m=+1778.328235139" lastFinishedPulling="2026-03-09 14:32:18.297012718 +0000 UTC m=+1778.852581294" observedRunningTime="2026-03-09 14:32:18.756595945 +0000 UTC m=+1779.312164511" watchObservedRunningTime="2026-03-09 14:32:18.764234435 +0000 UTC m=+1779.319803021"
Mar 09 14:32:22 crc kubenswrapper[4722]: I0309 14:32:22.829029 4722 scope.go:117] "RemoveContainer" containerID="730a73dac07e27476a0ac62dd9ca48a96be66fe6494762e1268c7950776a78d1"
Mar 09 14:32:22 crc kubenswrapper[4722]: I0309 14:32:22.885573 4722 scope.go:117] "RemoveContainer" containerID="7774a1b7beb6709cc7100d6b0e05365cd9498f94cb28bd5282e6c7b3a858a60d"
Mar 09 14:32:22 crc kubenswrapper[4722]: I0309 14:32:22.927497 4722 scope.go:117] "RemoveContainer" containerID="9fdfba0d1ad4b491c62ca71f25534c103428e138f08244b726136f8e29e87ad2"
Mar 09 14:32:23 crc kubenswrapper[4722]: I0309 14:32:23.005743 4722 scope.go:117] "RemoveContainer" containerID="f03fc0a6d2153285b68b06ee976c841e642a446a81e1e6eb709dce371d8eb9b3"
Mar 09 14:32:23 crc kubenswrapper[4722]: I0309 14:32:23.080546 4722 scope.go:117] "RemoveContainer" containerID="2b782e1d01d6f1e5d883408b56654cb190f65dda28ade9acdcc2b519429bd927"
Mar 09 14:32:25 crc kubenswrapper[4722]: E0309 14:32:25.349841 4722 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod694c0bbd_9c21_4de1_b82b_e79aa32feb6b.slice/crio-7d4245a328cb68f7c34eb728fe14d3b34a183a1f63704a6ea7f69e79391785f8.scope\": RecentStats: unable to find data in memory cache]"
Mar 09 14:32:25 crc kubenswrapper[4722]: I0309 14:32:25.830280 4722 generic.go:334] "Generic (PLEG): container finished" podID="694c0bbd-9c21-4de1-b82b-e79aa32feb6b" containerID="7d4245a328cb68f7c34eb728fe14d3b34a183a1f63704a6ea7f69e79391785f8" exitCode=0
Mar 09 14:32:25 crc kubenswrapper[4722]: I0309 14:32:25.830378 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"694c0bbd-9c21-4de1-b82b-e79aa32feb6b","Type":"ContainerDied","Data":"7d4245a328cb68f7c34eb728fe14d3b34a183a1f63704a6ea7f69e79391785f8"}
Mar 09 14:32:26 crc kubenswrapper[4722]: I0309 14:32:26.845521 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"694c0bbd-9c21-4de1-b82b-e79aa32feb6b","Type":"ContainerStarted","Data":"8eddf957de97e5de5e261fc2636b63bb1b9d891a03bb7d3272b3fc720476be45"}
Mar 09 14:32:26 crc kubenswrapper[4722]: I0309 14:32:26.846067 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-1"
Mar 09 14:32:26 crc kubenswrapper[4722]: I0309 14:32:26.871155 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-1" podStartSLOduration=36.871132334 podStartE2EDuration="36.871132334s" podCreationTimestamp="2026-03-09 14:31:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:32:26.868989665 +0000 UTC m=+1787.424558251" watchObservedRunningTime="2026-03-09 14:32:26.871132334 +0000 UTC m=+1787.426700920"
Mar 09 14:32:29 crc kubenswrapper[4722]: I0309 14:32:29.150945 4722 scope.go:117] "RemoveContainer" containerID="86a6dcdf7dcf4d48b8bdfb71d9a2c30b89f235700c424312c7e16580a86021d6"
Mar 09 14:32:29 crc kubenswrapper[4722]: E0309 14:32:29.151686 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648"
Mar 09 14:32:40 crc kubenswrapper[4722]: I0309 14:32:40.591425 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-1"
Mar 09 14:32:40 crc kubenswrapper[4722]: I0309 14:32:40.704194 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 09 14:32:44 crc kubenswrapper[4722]: I0309 14:32:44.150252 4722 scope.go:117] "RemoveContainer" containerID="86a6dcdf7dcf4d48b8bdfb71d9a2c30b89f235700c424312c7e16580a86021d6"
Mar 09 14:32:44 crc kubenswrapper[4722]: E0309 14:32:44.152224 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648"
Mar 09 14:32:45 crc kubenswrapper[4722]: I0309 14:32:45.432818 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="6f4e007a-4a18-40e6-bf96-4a751e00cd73" containerName="rabbitmq" containerID="cri-o://1d4f1f48ede004d37f49113410c9accaab4140f421e9ac96954fd4856a15b12f" gracePeriod=604796
Mar 09 14:32:48 crc kubenswrapper[4722]: I0309 14:32:48.361887 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="6f4e007a-4a18-40e6-bf96-4a751e00cd73" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.134:5671: connect: connection refused"
Mar 09 14:32:52 crc kubenswrapper[4722]: I0309 14:32:52.127713 4722 generic.go:334] "Generic (PLEG): container finished" podID="6f4e007a-4a18-40e6-bf96-4a751e00cd73" containerID="1d4f1f48ede004d37f49113410c9accaab4140f421e9ac96954fd4856a15b12f" exitCode=0
Mar 09 14:32:52 crc kubenswrapper[4722]: I0309 14:32:52.127875 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6f4e007a-4a18-40e6-bf96-4a751e00cd73","Type":"ContainerDied","Data":"1d4f1f48ede004d37f49113410c9accaab4140f421e9ac96954fd4856a15b12f"}
Mar 09 14:32:52 crc kubenswrapper[4722]: I0309 14:32:52.328146 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 09 14:32:52 crc kubenswrapper[4722]: I0309 14:32:52.373882 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8l5q\" (UniqueName: \"kubernetes.io/projected/6f4e007a-4a18-40e6-bf96-4a751e00cd73-kube-api-access-w8l5q\") pod \"6f4e007a-4a18-40e6-bf96-4a751e00cd73\" (UID: \"6f4e007a-4a18-40e6-bf96-4a751e00cd73\") "
Mar 09 14:32:52 crc kubenswrapper[4722]: I0309 14:32:52.374171 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6f4e007a-4a18-40e6-bf96-4a751e00cd73-pod-info\") pod \"6f4e007a-4a18-40e6-bf96-4a751e00cd73\" (UID: \"6f4e007a-4a18-40e6-bf96-4a751e00cd73\") "
Mar 09 14:32:52 crc kubenswrapper[4722]: I0309 14:32:52.374309 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6f4e007a-4a18-40e6-bf96-4a751e00cd73-erlang-cookie-secret\") pod \"6f4e007a-4a18-40e6-bf96-4a751e00cd73\" (UID: \"6f4e007a-4a18-40e6-bf96-4a751e00cd73\") "
Mar 09 14:32:52 crc kubenswrapper[4722]: I0309 14:32:52.374483 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6f4e007a-4a18-40e6-bf96-4a751e00cd73-plugins-conf\") pod \"6f4e007a-4a18-40e6-bf96-4a751e00cd73\" (UID: \"6f4e007a-4a18-40e6-bf96-4a751e00cd73\") "
Mar 09 14:32:52 crc kubenswrapper[4722]: I0309 14:32:52.374602 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6f4e007a-4a18-40e6-bf96-4a751e00cd73-server-conf\") pod \"6f4e007a-4a18-40e6-bf96-4a751e00cd73\" (UID: \"6f4e007a-4a18-40e6-bf96-4a751e00cd73\") "
Mar 09 14:32:52 crc kubenswrapper[4722]: I0309 14:32:52.374727 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6f4e007a-4a18-40e6-bf96-4a751e00cd73-rabbitmq-confd\") pod \"6f4e007a-4a18-40e6-bf96-4a751e00cd73\" (UID: \"6f4e007a-4a18-40e6-bf96-4a751e00cd73\") "
Mar 09 14:32:52 crc kubenswrapper[4722]: I0309 14:32:52.375281 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d921ac5a-690a-43f2-832b-27c969b7b3d5\") pod \"6f4e007a-4a18-40e6-bf96-4a751e00cd73\" (UID: \"6f4e007a-4a18-40e6-bf96-4a751e00cd73\") "
Mar 09 14:32:52 crc kubenswrapper[4722]: I0309 14:32:52.375465 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6f4e007a-4a18-40e6-bf96-4a751e00cd73-config-data\") pod \"6f4e007a-4a18-40e6-bf96-4a751e00cd73\" (UID: \"6f4e007a-4a18-40e6-bf96-4a751e00cd73\") "
Mar 09 14:32:52 crc kubenswrapper[4722]: I0309 14:32:52.375567 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6f4e007a-4a18-40e6-bf96-4a751e00cd73-rabbitmq-plugins\") pod \"6f4e007a-4a18-40e6-bf96-4a751e00cd73\" (UID: \"6f4e007a-4a18-40e6-bf96-4a751e00cd73\") "
Mar 09 14:32:52 crc kubenswrapper[4722]: I0309 14:32:52.376462 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6f4e007a-4a18-40e6-bf96-4a751e00cd73-rabbitmq-erlang-cookie\") pod \"6f4e007a-4a18-40e6-bf96-4a751e00cd73\" (UID: \"6f4e007a-4a18-40e6-bf96-4a751e00cd73\") "
Mar 09 14:32:52 crc kubenswrapper[4722]: I0309 14:32:52.376598 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6f4e007a-4a18-40e6-bf96-4a751e00cd73-rabbitmq-tls\") pod \"6f4e007a-4a18-40e6-bf96-4a751e00cd73\" (UID: \"6f4e007a-4a18-40e6-bf96-4a751e00cd73\") "
Mar 09 14:32:52 crc kubenswrapper[4722]: I0309 14:32:52.377108 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f4e007a-4a18-40e6-bf96-4a751e00cd73-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "6f4e007a-4a18-40e6-bf96-4a751e00cd73" (UID: "6f4e007a-4a18-40e6-bf96-4a751e00cd73"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 14:32:52 crc kubenswrapper[4722]: I0309 14:32:52.377636 4722 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6f4e007a-4a18-40e6-bf96-4a751e00cd73-plugins-conf\") on node \"crc\" DevicePath \"\""
Mar 09 14:32:52 crc kubenswrapper[4722]: I0309 14:32:52.378751 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f4e007a-4a18-40e6-bf96-4a751e00cd73-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "6f4e007a-4a18-40e6-bf96-4a751e00cd73" (UID: "6f4e007a-4a18-40e6-bf96-4a751e00cd73"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 14:32:52 crc kubenswrapper[4722]: I0309 14:32:52.379606 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f4e007a-4a18-40e6-bf96-4a751e00cd73-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "6f4e007a-4a18-40e6-bf96-4a751e00cd73" (UID: "6f4e007a-4a18-40e6-bf96-4a751e00cd73"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 14:32:52 crc kubenswrapper[4722]: I0309 14:32:52.382014 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f4e007a-4a18-40e6-bf96-4a751e00cd73-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "6f4e007a-4a18-40e6-bf96-4a751e00cd73" (UID: "6f4e007a-4a18-40e6-bf96-4a751e00cd73"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:32:52 crc kubenswrapper[4722]: I0309 14:32:52.384174 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/6f4e007a-4a18-40e6-bf96-4a751e00cd73-pod-info" (OuterVolumeSpecName: "pod-info") pod "6f4e007a-4a18-40e6-bf96-4a751e00cd73" (UID: "6f4e007a-4a18-40e6-bf96-4a751e00cd73"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 09 14:32:52 crc kubenswrapper[4722]: I0309 14:32:52.387848 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f4e007a-4a18-40e6-bf96-4a751e00cd73-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "6f4e007a-4a18-40e6-bf96-4a751e00cd73" (UID: "6f4e007a-4a18-40e6-bf96-4a751e00cd73"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:32:52 crc kubenswrapper[4722]: I0309 14:32:52.402760 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f4e007a-4a18-40e6-bf96-4a751e00cd73-kube-api-access-w8l5q" (OuterVolumeSpecName: "kube-api-access-w8l5q") pod "6f4e007a-4a18-40e6-bf96-4a751e00cd73" (UID: "6f4e007a-4a18-40e6-bf96-4a751e00cd73"). InnerVolumeSpecName "kube-api-access-w8l5q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:32:52 crc kubenswrapper[4722]: I0309 14:32:52.442628 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d921ac5a-690a-43f2-832b-27c969b7b3d5" (OuterVolumeSpecName: "persistence") pod "6f4e007a-4a18-40e6-bf96-4a751e00cd73" (UID: "6f4e007a-4a18-40e6-bf96-4a751e00cd73"). InnerVolumeSpecName "pvc-d921ac5a-690a-43f2-832b-27c969b7b3d5". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 09 14:32:52 crc kubenswrapper[4722]: I0309 14:32:52.481375 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f4e007a-4a18-40e6-bf96-4a751e00cd73-config-data" (OuterVolumeSpecName: "config-data") pod "6f4e007a-4a18-40e6-bf96-4a751e00cd73" (UID: "6f4e007a-4a18-40e6-bf96-4a751e00cd73"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:32:52 crc kubenswrapper[4722]: I0309 14:32:52.481848 4722 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6f4e007a-4a18-40e6-bf96-4a751e00cd73-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 09 14:32:52 crc kubenswrapper[4722]: I0309 14:32:52.481877 4722 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6f4e007a-4a18-40e6-bf96-4a751e00cd73-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 09 14:32:52 crc kubenswrapper[4722]: I0309 14:32:52.481889 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8l5q\" (UniqueName: \"kubernetes.io/projected/6f4e007a-4a18-40e6-bf96-4a751e00cd73-kube-api-access-w8l5q\") on node \"crc\" DevicePath \"\"" Mar 09 14:32:52 crc kubenswrapper[4722]: I0309 14:32:52.481898 4722 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6f4e007a-4a18-40e6-bf96-4a751e00cd73-pod-info\") on node \"crc\" DevicePath \"\"" Mar 09 14:32:52 crc kubenswrapper[4722]: I0309 14:32:52.481908 4722 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6f4e007a-4a18-40e6-bf96-4a751e00cd73-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 09 14:32:52 crc kubenswrapper[4722]: I0309 14:32:52.481939 4722 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-d921ac5a-690a-43f2-832b-27c969b7b3d5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d921ac5a-690a-43f2-832b-27c969b7b3d5\") on node \"crc\" " Mar 09 14:32:52 crc kubenswrapper[4722]: I0309 14:32:52.481950 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6f4e007a-4a18-40e6-bf96-4a751e00cd73-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 14:32:52 crc kubenswrapper[4722]: I0309 14:32:52.481959 4722 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6f4e007a-4a18-40e6-bf96-4a751e00cd73-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 09 14:32:52 crc kubenswrapper[4722]: I0309 14:32:52.511160 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f4e007a-4a18-40e6-bf96-4a751e00cd73-server-conf" (OuterVolumeSpecName: "server-conf") pod "6f4e007a-4a18-40e6-bf96-4a751e00cd73" (UID: "6f4e007a-4a18-40e6-bf96-4a751e00cd73"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:32:52 crc kubenswrapper[4722]: I0309 14:32:52.527904 4722 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Mar 09 14:32:52 crc kubenswrapper[4722]: I0309 14:32:52.528059 4722 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-d921ac5a-690a-43f2-832b-27c969b7b3d5" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d921ac5a-690a-43f2-832b-27c969b7b3d5") on node "crc"
Mar 09 14:32:52 crc kubenswrapper[4722]: I0309 14:32:52.584289 4722 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6f4e007a-4a18-40e6-bf96-4a751e00cd73-server-conf\") on node \"crc\" DevicePath \"\""
Mar 09 14:32:52 crc kubenswrapper[4722]: I0309 14:32:52.584569 4722 reconciler_common.go:293] "Volume detached for volume \"pvc-d921ac5a-690a-43f2-832b-27c969b7b3d5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d921ac5a-690a-43f2-832b-27c969b7b3d5\") on node \"crc\" DevicePath \"\""
Mar 09 14:32:52 crc kubenswrapper[4722]: I0309 14:32:52.597314 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f4e007a-4a18-40e6-bf96-4a751e00cd73-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "6f4e007a-4a18-40e6-bf96-4a751e00cd73" (UID: "6f4e007a-4a18-40e6-bf96-4a751e00cd73"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:32:52 crc kubenswrapper[4722]: I0309 14:32:52.686626 4722 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6f4e007a-4a18-40e6-bf96-4a751e00cd73-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Mar 09 14:32:53 crc kubenswrapper[4722]: I0309 14:32:53.154537 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6f4e007a-4a18-40e6-bf96-4a751e00cd73","Type":"ContainerDied","Data":"8826d7e3b5c4577b329e1668f6236ec0bcbdcc1ff604699b20e9e79150ba01d7"}
Mar 09 14:32:53 crc kubenswrapper[4722]: I0309 14:32:53.154612 4722 scope.go:117] "RemoveContainer" containerID="1d4f1f48ede004d37f49113410c9accaab4140f421e9ac96954fd4856a15b12f"
Mar 09 14:32:53 crc kubenswrapper[4722]: I0309 14:32:53.154805 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 09 14:32:53 crc kubenswrapper[4722]: I0309 14:32:53.198986 4722 scope.go:117] "RemoveContainer" containerID="89991879f6e59e858e98954d53f4101c5c7935bc1ad02ef1f93145110f421678"
Mar 09 14:32:53 crc kubenswrapper[4722]: I0309 14:32:53.219956 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 09 14:32:53 crc kubenswrapper[4722]: I0309 14:32:53.244856 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 09 14:32:53 crc kubenswrapper[4722]: I0309 14:32:53.269871 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 09 14:32:53 crc kubenswrapper[4722]: E0309 14:32:53.270391 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f4e007a-4a18-40e6-bf96-4a751e00cd73" containerName="setup-container"
Mar 09 14:32:53 crc kubenswrapper[4722]: I0309 14:32:53.270409 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f4e007a-4a18-40e6-bf96-4a751e00cd73" containerName="setup-container"
Mar 09 14:32:53 crc kubenswrapper[4722]: E0309 14:32:53.270444 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f4e007a-4a18-40e6-bf96-4a751e00cd73" containerName="rabbitmq"
Mar 09 14:32:53 crc kubenswrapper[4722]: I0309 14:32:53.270451 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f4e007a-4a18-40e6-bf96-4a751e00cd73" containerName="rabbitmq"
Mar 09 14:32:53 crc kubenswrapper[4722]: I0309 14:32:53.270719 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f4e007a-4a18-40e6-bf96-4a751e00cd73" containerName="rabbitmq"
Mar 09 14:32:53 crc kubenswrapper[4722]: I0309 14:32:53.272044 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 09 14:32:53 crc kubenswrapper[4722]: I0309 14:32:53.294815 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 09 14:32:53 crc kubenswrapper[4722]: I0309 14:32:53.405074 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/49886118-4852-41ba-bbed-a946764f4649-server-conf\") pod \"rabbitmq-server-0\" (UID: \"49886118-4852-41ba-bbed-a946764f4649\") " pod="openstack/rabbitmq-server-0" Mar 09 14:32:53 crc kubenswrapper[4722]: I0309 14:32:53.405470 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/49886118-4852-41ba-bbed-a946764f4649-config-data\") pod \"rabbitmq-server-0\" (UID: \"49886118-4852-41ba-bbed-a946764f4649\") " pod="openstack/rabbitmq-server-0" Mar 09 14:32:53 crc kubenswrapper[4722]: I0309 14:32:53.405658 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/49886118-4852-41ba-bbed-a946764f4649-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"49886118-4852-41ba-bbed-a946764f4649\") " pod="openstack/rabbitmq-server-0" Mar 09 14:32:53 crc kubenswrapper[4722]: I0309 14:32:53.405759 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/49886118-4852-41ba-bbed-a946764f4649-pod-info\") pod \"rabbitmq-server-0\" (UID: \"49886118-4852-41ba-bbed-a946764f4649\") " pod="openstack/rabbitmq-server-0" Mar 09 14:32:53 crc kubenswrapper[4722]: I0309 14:32:53.405848 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/49886118-4852-41ba-bbed-a946764f4649-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"49886118-4852-41ba-bbed-a946764f4649\") " pod="openstack/rabbitmq-server-0" Mar 09 14:32:53 crc kubenswrapper[4722]: I0309 14:32:53.406014 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/49886118-4852-41ba-bbed-a946764f4649-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"49886118-4852-41ba-bbed-a946764f4649\") " pod="openstack/rabbitmq-server-0" Mar 09 14:32:53 crc kubenswrapper[4722]: I0309 14:32:53.406068 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cq8fh\" (UniqueName: \"kubernetes.io/projected/49886118-4852-41ba-bbed-a946764f4649-kube-api-access-cq8fh\") pod \"rabbitmq-server-0\" (UID: \"49886118-4852-41ba-bbed-a946764f4649\") " pod="openstack/rabbitmq-server-0" Mar 09 14:32:53 crc kubenswrapper[4722]: I0309 14:32:53.406119 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d921ac5a-690a-43f2-832b-27c969b7b3d5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d921ac5a-690a-43f2-832b-27c969b7b3d5\") pod \"rabbitmq-server-0\" (UID: \"49886118-4852-41ba-bbed-a946764f4649\") " pod="openstack/rabbitmq-server-0" Mar 09 14:32:53 crc kubenswrapper[4722]: I0309 14:32:53.406141 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/49886118-4852-41ba-bbed-a946764f4649-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"49886118-4852-41ba-bbed-a946764f4649\") " pod="openstack/rabbitmq-server-0" Mar 09 14:32:53 crc kubenswrapper[4722]: I0309 14:32:53.406216 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/49886118-4852-41ba-bbed-a946764f4649-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"49886118-4852-41ba-bbed-a946764f4649\") " pod="openstack/rabbitmq-server-0" Mar 09 14:32:53 crc kubenswrapper[4722]: I0309 14:32:53.406354 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/49886118-4852-41ba-bbed-a946764f4649-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"49886118-4852-41ba-bbed-a946764f4649\") " pod="openstack/rabbitmq-server-0" Mar 09 14:32:53 crc kubenswrapper[4722]: I0309 14:32:53.508756 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/49886118-4852-41ba-bbed-a946764f4649-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"49886118-4852-41ba-bbed-a946764f4649\") " pod="openstack/rabbitmq-server-0" Mar 09 14:32:53 crc kubenswrapper[4722]: I0309 14:32:53.508813 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/49886118-4852-41ba-bbed-a946764f4649-pod-info\") pod \"rabbitmq-server-0\" (UID: \"49886118-4852-41ba-bbed-a946764f4649\") " pod="openstack/rabbitmq-server-0" Mar 09 14:32:53 crc kubenswrapper[4722]: I0309 14:32:53.508842 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/49886118-4852-41ba-bbed-a946764f4649-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"49886118-4852-41ba-bbed-a946764f4649\") " pod="openstack/rabbitmq-server-0" Mar 09 14:32:53 crc kubenswrapper[4722]: I0309 14:32:53.508873 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/49886118-4852-41ba-bbed-a946764f4649-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"49886118-4852-41ba-bbed-a946764f4649\") " pod="openstack/rabbitmq-server-0" Mar 09 14:32:53 crc kubenswrapper[4722]: I0309 14:32:53.508892 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cq8fh\" (UniqueName: \"kubernetes.io/projected/49886118-4852-41ba-bbed-a946764f4649-kube-api-access-cq8fh\") pod \"rabbitmq-server-0\" (UID: \"49886118-4852-41ba-bbed-a946764f4649\") " pod="openstack/rabbitmq-server-0" Mar 09 14:32:53 crc kubenswrapper[4722]: I0309 14:32:53.508926 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d921ac5a-690a-43f2-832b-27c969b7b3d5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d921ac5a-690a-43f2-832b-27c969b7b3d5\") pod \"rabbitmq-server-0\" (UID: \"49886118-4852-41ba-bbed-a946764f4649\") " pod="openstack/rabbitmq-server-0" Mar 09 14:32:53 crc kubenswrapper[4722]: I0309 14:32:53.508961 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/49886118-4852-41ba-bbed-a946764f4649-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: 
\"49886118-4852-41ba-bbed-a946764f4649\") " pod="openstack/rabbitmq-server-0" Mar 09 14:32:53 crc kubenswrapper[4722]: I0309 14:32:53.508990 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/49886118-4852-41ba-bbed-a946764f4649-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"49886118-4852-41ba-bbed-a946764f4649\") " pod="openstack/rabbitmq-server-0" Mar 09 14:32:53 crc kubenswrapper[4722]: I0309 14:32:53.509035 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/49886118-4852-41ba-bbed-a946764f4649-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"49886118-4852-41ba-bbed-a946764f4649\") " pod="openstack/rabbitmq-server-0" Mar 09 14:32:53 crc kubenswrapper[4722]: I0309 14:32:53.509123 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/49886118-4852-41ba-bbed-a946764f4649-server-conf\") pod \"rabbitmq-server-0\" (UID: \"49886118-4852-41ba-bbed-a946764f4649\") " pod="openstack/rabbitmq-server-0" Mar 09 14:32:53 crc kubenswrapper[4722]: I0309 14:32:53.509171 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/49886118-4852-41ba-bbed-a946764f4649-config-data\") pod \"rabbitmq-server-0\" (UID: \"49886118-4852-41ba-bbed-a946764f4649\") " pod="openstack/rabbitmq-server-0" Mar 09 14:32:53 crc kubenswrapper[4722]: I0309 14:32:53.509938 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/49886118-4852-41ba-bbed-a946764f4649-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"49886118-4852-41ba-bbed-a946764f4649\") " pod="openstack/rabbitmq-server-0" Mar 09 14:32:53 crc kubenswrapper[4722]: I0309 14:32:53.510154 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/49886118-4852-41ba-bbed-a946764f4649-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"49886118-4852-41ba-bbed-a946764f4649\") " pod="openstack/rabbitmq-server-0" Mar 09 14:32:53 crc kubenswrapper[4722]: I0309 14:32:53.510644 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/49886118-4852-41ba-bbed-a946764f4649-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"49886118-4852-41ba-bbed-a946764f4649\") " pod="openstack/rabbitmq-server-0" Mar 09 14:32:53 crc kubenswrapper[4722]: I0309 14:32:53.511053 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/49886118-4852-41ba-bbed-a946764f4649-server-conf\") pod \"rabbitmq-server-0\" (UID: \"49886118-4852-41ba-bbed-a946764f4649\") " pod="openstack/rabbitmq-server-0" Mar 09 14:32:53 crc kubenswrapper[4722]: I0309 14:32:53.511299 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/49886118-4852-41ba-bbed-a946764f4649-config-data\") pod \"rabbitmq-server-0\" (UID: \"49886118-4852-41ba-bbed-a946764f4649\") " pod="openstack/rabbitmq-server-0" Mar 09 14:32:53 crc kubenswrapper[4722]: I0309 14:32:53.512703 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 09 14:32:53 crc kubenswrapper[4722]: I0309 14:32:53.512743 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d921ac5a-690a-43f2-832b-27c969b7b3d5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d921ac5a-690a-43f2-832b-27c969b7b3d5\") pod \"rabbitmq-server-0\" (UID: \"49886118-4852-41ba-bbed-a946764f4649\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f27f5e88ec17f45c75ef399615f6a29b6f2764d2fa42d7f4a87340be7a71cdcc/globalmount\"" pod="openstack/rabbitmq-server-0" Mar 09 14:32:53 crc kubenswrapper[4722]: I0309 14:32:53.514256 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/49886118-4852-41ba-bbed-a946764f4649-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"49886118-4852-41ba-bbed-a946764f4649\") " pod="openstack/rabbitmq-server-0" Mar 09 14:32:53 crc kubenswrapper[4722]: I0309 14:32:53.515019 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/49886118-4852-41ba-bbed-a946764f4649-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"49886118-4852-41ba-bbed-a946764f4649\") " pod="openstack/rabbitmq-server-0" Mar 09 14:32:53 crc kubenswrapper[4722]: I0309 14:32:53.519339 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/49886118-4852-41ba-bbed-a946764f4649-pod-info\") pod \"rabbitmq-server-0\" (UID: \"49886118-4852-41ba-bbed-a946764f4649\") " pod="openstack/rabbitmq-server-0" Mar 09 14:32:53 crc kubenswrapper[4722]: I0309 14:32:53.520803 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/49886118-4852-41ba-bbed-a946764f4649-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"49886118-4852-41ba-bbed-a946764f4649\") " pod="openstack/rabbitmq-server-0" Mar 09 14:32:53 crc kubenswrapper[4722]: I0309 14:32:53.529636 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cq8fh\" (UniqueName: \"kubernetes.io/projected/49886118-4852-41ba-bbed-a946764f4649-kube-api-access-cq8fh\") pod \"rabbitmq-server-0\" (UID: \"49886118-4852-41ba-bbed-a946764f4649\") " pod="openstack/rabbitmq-server-0" Mar 09 14:32:53 crc kubenswrapper[4722]: I0309 14:32:53.603517 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d921ac5a-690a-43f2-832b-27c969b7b3d5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d921ac5a-690a-43f2-832b-27c969b7b3d5\") pod \"rabbitmq-server-0\" (UID: \"49886118-4852-41ba-bbed-a946764f4649\") " pod="openstack/rabbitmq-server-0" Mar 09 14:32:53 crc kubenswrapper[4722]: I0309 14:32:53.613074 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 09 14:32:54 crc kubenswrapper[4722]: I0309 14:32:54.169421 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f4e007a-4a18-40e6-bf96-4a751e00cd73" path="/var/lib/kubelet/pods/6f4e007a-4a18-40e6-bf96-4a751e00cd73/volumes" Mar 09 14:32:54 crc kubenswrapper[4722]: I0309 14:32:54.180119 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 09 14:32:55 crc kubenswrapper[4722]: I0309 14:32:55.185419 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"49886118-4852-41ba-bbed-a946764f4649","Type":"ContainerStarted","Data":"8844edec867b6a2be7539a5558d11875ee20c781ed47810a06ed0ff0396a19ed"} Mar 09 14:32:56 crc kubenswrapper[4722]: I0309 14:32:56.150285 4722 scope.go:117] "RemoveContainer" containerID="86a6dcdf7dcf4d48b8bdfb71d9a2c30b89f235700c424312c7e16580a86021d6" Mar 09 14:32:56 crc kubenswrapper[4722]: E0309 14:32:56.151263 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 14:32:57 crc kubenswrapper[4722]: I0309 14:32:57.209539 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"49886118-4852-41ba-bbed-a946764f4649","Type":"ContainerStarted","Data":"bd2f0453a9676ad2a653c487c4e9f45080ac64d7a55c532b919bebccef3dc027"} Mar 09 14:33:07 crc kubenswrapper[4722]: I0309 14:33:07.149121 4722 scope.go:117] "RemoveContainer" containerID="86a6dcdf7dcf4d48b8bdfb71d9a2c30b89f235700c424312c7e16580a86021d6" Mar 09 14:33:07 crc kubenswrapper[4722]: E0309 14:33:07.149943 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 14:33:21 crc kubenswrapper[4722]: I0309 14:33:21.150117 4722 scope.go:117] "RemoveContainer" containerID="86a6dcdf7dcf4d48b8bdfb71d9a2c30b89f235700c424312c7e16580a86021d6" Mar 09 14:33:21 crc kubenswrapper[4722]: E0309 14:33:21.151579 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 14:33:23 crc kubenswrapper[4722]: I0309 14:33:23.311703 4722 scope.go:117] "RemoveContainer" containerID="48eff71955332d3b52375f0a991f10f3e4374f91b7e99deb991ff46c62536e31" Mar 09 14:33:23 crc kubenswrapper[4722]: I0309 14:33:23.338655 4722 scope.go:117] "RemoveContainer" containerID="4c2160840bc5dd1aa87e9cf70ed2a592e6050a1f29dadce21d04f9229d43ba63" Mar 09 14:33:23 crc kubenswrapper[4722]: I0309 14:33:23.366257 4722 scope.go:117] "RemoveContainer" 
containerID="c24875009a9faa332cc97d55a86352afdf804d1f1652af3912b252e0ee1afaea" Mar 09 14:33:23 crc kubenswrapper[4722]: I0309 14:33:23.410252 4722 scope.go:117] "RemoveContainer" containerID="78edfd45b55f5b27e7faa8f4b105788a5b0ac9f34b23575abe46855f8cccbd18" Mar 09 14:33:23 crc kubenswrapper[4722]: I0309 14:33:23.439139 4722 scope.go:117] "RemoveContainer" containerID="47459de10f70446fb04c6ce8a3b18a01c64f884125b7c2303d821b0bc5b88e0d" Mar 09 14:33:23 crc kubenswrapper[4722]: I0309 14:33:23.472747 4722 scope.go:117] "RemoveContainer" containerID="876323b8a50b38036cbeb67f8682fe72e20d6ca089e54cdc4c767cfd67f20e7f" Mar 09 14:33:29 crc kubenswrapper[4722]: I0309 14:33:29.622238 4722 generic.go:334] "Generic (PLEG): container finished" podID="49886118-4852-41ba-bbed-a946764f4649" containerID="bd2f0453a9676ad2a653c487c4e9f45080ac64d7a55c532b919bebccef3dc027" exitCode=0 Mar 09 14:33:29 crc kubenswrapper[4722]: I0309 14:33:29.622316 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"49886118-4852-41ba-bbed-a946764f4649","Type":"ContainerDied","Data":"bd2f0453a9676ad2a653c487c4e9f45080ac64d7a55c532b919bebccef3dc027"} Mar 09 14:33:30 crc kubenswrapper[4722]: I0309 14:33:30.634474 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"49886118-4852-41ba-bbed-a946764f4649","Type":"ContainerStarted","Data":"b1d394062ae88bed520cc5ad9d24a108cb15a037bb4e50c741ecd91df253ecdb"} Mar 09 14:33:30 crc kubenswrapper[4722]: I0309 14:33:30.636039 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 09 14:33:30 crc kubenswrapper[4722]: I0309 14:33:30.672022 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.671999571 podStartE2EDuration="37.671999571s" podCreationTimestamp="2026-03-09 14:32:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:33:30.667488847 +0000 UTC m=+1851.223057443" watchObservedRunningTime="2026-03-09 14:33:30.671999571 +0000 UTC m=+1851.227568147" Mar 09 14:33:35 crc kubenswrapper[4722]: I0309 14:33:35.149567 4722 scope.go:117] "RemoveContainer" containerID="86a6dcdf7dcf4d48b8bdfb71d9a2c30b89f235700c424312c7e16580a86021d6" Mar 09 14:33:35 crc kubenswrapper[4722]: E0309 14:33:35.150662 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 14:33:43 crc kubenswrapper[4722]: I0309 14:33:43.617333 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 09 14:33:50 crc kubenswrapper[4722]: I0309 14:33:50.166392 4722 scope.go:117] "RemoveContainer" containerID="86a6dcdf7dcf4d48b8bdfb71d9a2c30b89f235700c424312c7e16580a86021d6" Mar 09 14:33:50 crc kubenswrapper[4722]: E0309 14:33:50.167752 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
Mar 09 14:34:00 crc kubenswrapper[4722]: I0309 14:34:00.172530 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551114-jfxkb"]
Mar 09 14:34:00 crc kubenswrapper[4722]: I0309 14:34:00.175705 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551114-jfxkb"
Mar 09 14:34:00 crc kubenswrapper[4722]: I0309 14:34:00.186305 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551114-jfxkb"]
Mar 09 14:34:00 crc kubenswrapper[4722]: I0309 14:34:00.211543 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 14:34:00 crc kubenswrapper[4722]: I0309 14:34:00.211777 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 14:34:00 crc kubenswrapper[4722]: I0309 14:34:00.211998 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-dr2x6"
Mar 09 14:34:00 crc kubenswrapper[4722]: I0309 14:34:00.270097 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnwkd\" (UniqueName: \"kubernetes.io/projected/066f98c7-83dc-4e2d-9e51-702dd17db261-kube-api-access-vnwkd\") pod \"auto-csr-approver-29551114-jfxkb\" (UID: \"066f98c7-83dc-4e2d-9e51-702dd17db261\") " pod="openshift-infra/auto-csr-approver-29551114-jfxkb"
Mar 09 14:34:00 crc kubenswrapper[4722]: I0309 14:34:00.373686 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnwkd\" (UniqueName: \"kubernetes.io/projected/066f98c7-83dc-4e2d-9e51-702dd17db261-kube-api-access-vnwkd\") pod \"auto-csr-approver-29551114-jfxkb\" (UID: \"066f98c7-83dc-4e2d-9e51-702dd17db261\") " pod="openshift-infra/auto-csr-approver-29551114-jfxkb"
Mar 09 14:34:00 crc kubenswrapper[4722]: I0309 14:34:00.395135 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnwkd\" (UniqueName: \"kubernetes.io/projected/066f98c7-83dc-4e2d-9e51-702dd17db261-kube-api-access-vnwkd\") pod \"auto-csr-approver-29551114-jfxkb\" (UID: \"066f98c7-83dc-4e2d-9e51-702dd17db261\") " pod="openshift-infra/auto-csr-approver-29551114-jfxkb"
Mar 09 14:34:00 crc kubenswrapper[4722]: I0309 14:34:00.537820 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551114-jfxkb"
Mar 09 14:34:01 crc kubenswrapper[4722]: I0309 14:34:01.057018 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551114-jfxkb"]
Mar 09 14:34:01 crc kubenswrapper[4722]: W0309 14:34:01.063763 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod066f98c7_83dc_4e2d_9e51_702dd17db261.slice/crio-4014a3c57e6b2359a96dd8575c9353e4daa2e16f3f405bc577ead280db6c07d9 WatchSource:0}: Error finding container 4014a3c57e6b2359a96dd8575c9353e4daa2e16f3f405bc577ead280db6c07d9: Status 404 returned error can't find the container with id 4014a3c57e6b2359a96dd8575c9353e4daa2e16f3f405bc577ead280db6c07d9
Mar 09 14:34:01 crc kubenswrapper[4722]: I0309 14:34:01.067593 4722 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 09 14:34:02 crc kubenswrapper[4722]: I0309 14:34:02.049481 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551114-jfxkb" event={"ID":"066f98c7-83dc-4e2d-9e51-702dd17db261","Type":"ContainerStarted","Data":"4014a3c57e6b2359a96dd8575c9353e4daa2e16f3f405bc577ead280db6c07d9"}
Mar 09 14:34:02 crc kubenswrapper[4722]: I0309 14:34:02.152443 4722 scope.go:117] "RemoveContainer" containerID="86a6dcdf7dcf4d48b8bdfb71d9a2c30b89f235700c424312c7e16580a86021d6"
Mar 09 14:34:02 crc kubenswrapper[4722]: E0309 14:34:02.156410 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648"
Mar 09 14:34:03 crc kubenswrapper[4722]: I0309 14:34:03.062397 4722 generic.go:334] "Generic (PLEG): container finished" podID="066f98c7-83dc-4e2d-9e51-702dd17db261" containerID="384ceb3cd98cac6077daedbd758a939b3c58af663dade4bf4e245ec91b3e624e" exitCode=0
Mar 09 14:34:03 crc kubenswrapper[4722]: I0309 14:34:03.062711 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551114-jfxkb" event={"ID":"066f98c7-83dc-4e2d-9e51-702dd17db261","Type":"ContainerDied","Data":"384ceb3cd98cac6077daedbd758a939b3c58af663dade4bf4e245ec91b3e624e"}
Mar 09 14:34:04 crc kubenswrapper[4722]: I0309 14:34:04.479659 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551114-jfxkb"
Mar 09 14:34:04 crc kubenswrapper[4722]: I0309 14:34:04.582218 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vnwkd\" (UniqueName: \"kubernetes.io/projected/066f98c7-83dc-4e2d-9e51-702dd17db261-kube-api-access-vnwkd\") pod \"066f98c7-83dc-4e2d-9e51-702dd17db261\" (UID: \"066f98c7-83dc-4e2d-9e51-702dd17db261\") "
Mar 09 14:34:04 crc kubenswrapper[4722]: I0309 14:34:04.589682 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/066f98c7-83dc-4e2d-9e51-702dd17db261-kube-api-access-vnwkd" (OuterVolumeSpecName: "kube-api-access-vnwkd") pod "066f98c7-83dc-4e2d-9e51-702dd17db261" (UID: "066f98c7-83dc-4e2d-9e51-702dd17db261"). InnerVolumeSpecName "kube-api-access-vnwkd". PluginName "kubernetes.io/projected", VolumeGidValue ""
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:34:04 crc kubenswrapper[4722]: I0309 14:34:04.686044 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vnwkd\" (UniqueName: \"kubernetes.io/projected/066f98c7-83dc-4e2d-9e51-702dd17db261-kube-api-access-vnwkd\") on node \"crc\" DevicePath \"\"" Mar 09 14:34:05 crc kubenswrapper[4722]: I0309 14:34:05.097744 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551114-jfxkb" event={"ID":"066f98c7-83dc-4e2d-9e51-702dd17db261","Type":"ContainerDied","Data":"4014a3c57e6b2359a96dd8575c9353e4daa2e16f3f405bc577ead280db6c07d9"} Mar 09 14:34:05 crc kubenswrapper[4722]: I0309 14:34:05.098697 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4014a3c57e6b2359a96dd8575c9353e4daa2e16f3f405bc577ead280db6c07d9" Mar 09 14:34:05 crc kubenswrapper[4722]: I0309 14:34:05.097890 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551114-jfxkb" Mar 09 14:34:05 crc kubenswrapper[4722]: I0309 14:34:05.577841 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551108-xnm76"] Mar 09 14:34:05 crc kubenswrapper[4722]: I0309 14:34:05.594765 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551108-xnm76"] Mar 09 14:34:06 crc kubenswrapper[4722]: I0309 14:34:06.168733 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a5f5ad3-9957-4077-b5a1-7a53d5c4caaf" path="/var/lib/kubelet/pods/6a5f5ad3-9957-4077-b5a1-7a53d5c4caaf/volumes" Mar 09 14:34:13 crc kubenswrapper[4722]: I0309 14:34:13.149977 4722 scope.go:117] "RemoveContainer" containerID="86a6dcdf7dcf4d48b8bdfb71d9a2c30b89f235700c424312c7e16580a86021d6" Mar 09 14:34:13 crc kubenswrapper[4722]: E0309 14:34:13.150717 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 14:34:23 crc kubenswrapper[4722]: I0309 14:34:23.662136 4722 scope.go:117] "RemoveContainer" containerID="31de818897cbe2c334e4d93aef21bd7aabc4a46e9f9ad4e25698a2e64984583b" Mar 09 14:34:23 crc kubenswrapper[4722]: I0309 14:34:23.711364 4722 scope.go:117] "RemoveContainer" containerID="04b2201652075c3dca2d3c2bdd5acf063c18205cade60221aa03618b3d270250" Mar 09 14:34:23 crc kubenswrapper[4722]: I0309 14:34:23.754219 4722 scope.go:117] "RemoveContainer" containerID="ed42696657e1eeb95d8b396c66e34829ba43892ddf5f42c7ce6aa4a57730253b" Mar 09 14:34:23 crc kubenswrapper[4722]: I0309 14:34:23.777467 4722 scope.go:117] "RemoveContainer" containerID="91cc55bbe0c15e0bda9e4b09eb94f00df004b0b14ebbb120410702aeca6523b4" Mar 09 14:34:23 crc kubenswrapper[4722]: I0309 14:34:23.806065 4722 scope.go:117] "RemoveContainer" containerID="a872ee97ebf499e50f3742bdc8cbbec5579a4732c47ec733caa75f4d83a96e13" Mar 09 14:34:23 crc kubenswrapper[4722]: I0309 14:34:23.836014 4722 scope.go:117] "RemoveContainer" containerID="1456c5b63dd83c218d01c12cae5566503438f38b2defeadc341d008a2ceecb6a" Mar 09 14:34:23 crc kubenswrapper[4722]: I0309 14:34:23.877344 4722 scope.go:117] "RemoveContainer" 
containerID="231703ca880ff1453e2218d6219ba5ae4b08bd9416a34771d6be6aabef221562" Mar 09 14:34:24 crc kubenswrapper[4722]: I0309 14:34:24.150557 4722 scope.go:117] "RemoveContainer" containerID="86a6dcdf7dcf4d48b8bdfb71d9a2c30b89f235700c424312c7e16580a86021d6" Mar 09 14:34:24 crc kubenswrapper[4722]: E0309 14:34:24.151275 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 14:34:36 crc kubenswrapper[4722]: I0309 14:34:36.097621 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-fd4g6"] Mar 09 14:34:36 crc kubenswrapper[4722]: I0309 14:34:36.114769 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-4qrms"] Mar 09 14:34:36 crc kubenswrapper[4722]: I0309 14:34:36.129560 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-33f7-account-create-update-tc7x6"] Mar 09 14:34:36 crc kubenswrapper[4722]: I0309 14:34:36.144674 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-23c6-account-create-update-nk68b"] Mar 09 14:34:36 crc kubenswrapper[4722]: I0309 14:34:36.162667 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-c437-account-create-update-mdmhs"] Mar 09 14:34:36 crc kubenswrapper[4722]: I0309 14:34:36.173298 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-33f7-account-create-update-tc7x6"] Mar 09 14:34:36 crc kubenswrapper[4722]: I0309 14:34:36.184576 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-4qrms"] Mar 09 14:34:36 crc kubenswrapper[4722]: I0309 14:34:36.195913 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-fd4g6"] Mar 09 14:34:36 crc kubenswrapper[4722]: I0309 14:34:36.206476 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-23c6-account-create-update-nk68b"] Mar 09 14:34:36 crc kubenswrapper[4722]: I0309 14:34:36.221374 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-c437-account-create-update-mdmhs"] Mar 09 14:34:37 crc kubenswrapper[4722]: I0309 14:34:37.029270 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-7dk6r"] Mar 09 14:34:37 crc kubenswrapper[4722]: I0309 14:34:37.040641 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-65cx2"] Mar 09 14:34:37 crc kubenswrapper[4722]: I0309 14:34:37.052213 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-7dk6r"] Mar 09 14:34:37 crc kubenswrapper[4722]: I0309 14:34:37.061928 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-65cx2"] Mar 09 14:34:38 crc kubenswrapper[4722]: I0309 14:34:38.033957 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-34cc-account-create-update-hqh5l"] Mar 09 14:34:38 crc kubenswrapper[4722]: I0309 14:34:38.047627 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-34cc-account-create-update-hqh5l"] Mar 09 14:34:38 crc kubenswrapper[4722]: I0309 14:34:38.164191 
Mar 09 14:34:38 crc kubenswrapper[4722]: I0309 14:34:38.165609 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5397205b-9c4a-4575-8bfa-8604e88784e9" path="/var/lib/kubelet/pods/5397205b-9c4a-4575-8bfa-8604e88784e9/volumes"
Mar 09 14:34:38 crc kubenswrapper[4722]: I0309 14:34:38.167418 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="759a8d7f-dc3f-4432-96ca-4adaf31331ae" path="/var/lib/kubelet/pods/759a8d7f-dc3f-4432-96ca-4adaf31331ae/volumes"
Mar 09 14:34:38 crc kubenswrapper[4722]: I0309 14:34:38.169649 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b730192a-1410-4860-a847-f5e5974fd728" path="/var/lib/kubelet/pods/b730192a-1410-4860-a847-f5e5974fd728/volumes"
Mar 09 14:34:38 crc kubenswrapper[4722]: I0309 14:34:38.171653 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d94802a7-3f7c-4172-b781-a1eac89761d6" path="/var/lib/kubelet/pods/d94802a7-3f7c-4172-b781-a1eac89761d6/volumes"
Mar 09 14:34:38 crc kubenswrapper[4722]: I0309 14:34:38.174425 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e55131a4-f18d-417f-8d87-408a7f3bb919" path="/var/lib/kubelet/pods/e55131a4-f18d-417f-8d87-408a7f3bb919/volumes"
Mar 09 14:34:38 crc kubenswrapper[4722]: I0309 14:34:38.179644 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9b4c63d-ce42-432e-bbcf-abe490c33d2e" path="/var/lib/kubelet/pods/e9b4c63d-ce42-432e-bbcf-abe490c33d2e/volumes"
Mar 09 14:34:38 crc kubenswrapper[4722]: I0309 14:34:38.184334 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9e36087-1859-42c2-bf99-100d32617755" path="/var/lib/kubelet/pods/f9e36087-1859-42c2-bf99-100d32617755/volumes"
Mar 09 14:34:39 crc kubenswrapper[4722]: I0309 14:34:39.150295 4722 scope.go:117] "RemoveContainer" containerID="86a6dcdf7dcf4d48b8bdfb71d9a2c30b89f235700c424312c7e16580a86021d6"
Mar 09 14:34:39 crc kubenswrapper[4722]: E0309 14:34:39.150556 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648"
Mar 09 14:34:48 crc kubenswrapper[4722]: I0309 14:34:48.040789 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-b04f-account-create-update-bmkn7"]
Mar 09 14:34:48 crc kubenswrapper[4722]: I0309 14:34:48.053913 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-sbfvj"]
Mar 09 14:34:48 crc kubenswrapper[4722]: I0309 14:34:48.064348 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-b04f-account-create-update-bmkn7"]
Mar 09 14:34:48 crc kubenswrapper[4722]: I0309 14:34:48.074641 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-sbfvj"]
Mar 09 14:34:48 crc kubenswrapper[4722]: I0309 14:34:48.163747 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b9c74d7-fa1d-4c29-8875-47d506627d77" path="/var/lib/kubelet/pods/2b9c74d7-fa1d-4c29-8875-47d506627d77/volumes"
Mar 09 14:34:48 crc kubenswrapper[4722]: I0309 14:34:48.165410 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d162f3c-13b0-4ec3-955b-5a1bf804c61c" path="/var/lib/kubelet/pods/4d162f3c-13b0-4ec3-955b-5a1bf804c61c/volumes"
Mar 09 14:34:50 crc kubenswrapper[4722]: I0309 14:34:50.160682 4722 scope.go:117] "RemoveContainer" containerID="86a6dcdf7dcf4d48b8bdfb71d9a2c30b89f235700c424312c7e16580a86021d6"
Mar 09 14:34:50 crc kubenswrapper[4722]: E0309 14:34:50.162323 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648"
Mar 09 14:35:02 crc kubenswrapper[4722]: I0309 14:35:02.036529 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-dxsxq"]
Mar 09 14:35:02 crc kubenswrapper[4722]: I0309 14:35:02.050024 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-dxsxq"]
Mar 09 14:35:02 crc kubenswrapper[4722]: I0309 14:35:02.162642 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10519e91-e280-418b-947a-114e2696e8a8" path="/var/lib/kubelet/pods/10519e91-e280-418b-947a-114e2696e8a8/volumes"
Mar 09 14:35:04 crc kubenswrapper[4722]: I0309 14:35:04.150864 4722 scope.go:117] "RemoveContainer" containerID="86a6dcdf7dcf4d48b8bdfb71d9a2c30b89f235700c424312c7e16580a86021d6"
Mar 09 14:35:04 crc kubenswrapper[4722]: I0309 14:35:04.833581 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" event={"ID":"dac2aaf5-653b-4b2a-8efe-ed26bac8d648","Type":"ContainerStarted","Data":"4efa58158532ee2b76f83ca6efe53930357d25040e1664917766707f5e03ced2"}
Mar 09 14:35:13 crc kubenswrapper[4722]: I0309 14:35:13.056224 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-rl9l9"]
Mar 09 14:35:13 crc kubenswrapper[4722]: I0309 14:35:13.067453 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-rl9l9"]
Mar 09 14:35:14 crc kubenswrapper[4722]: I0309 14:35:14.171558 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17f0c1f6-4aea-4ada-aaec-3493cec60053" path="/var/lib/kubelet/pods/17f0c1f6-4aea-4ada-aaec-3493cec60053/volumes"
Mar 09 14:35:24 crc kubenswrapper[4722]: I0309 14:35:24.071499 4722 scope.go:117] "RemoveContainer" containerID="9cf4b538321d3fa703583336e99c0ac5959c1b9751f95fd0ea88e2cc3ab41b80"
Mar 09 14:35:24 crc kubenswrapper[4722]: I0309 14:35:24.109072 4722 scope.go:117] "RemoveContainer" containerID="e7d19a62a82beca0226b21f1e83b0cce1b7119fc6bc4bdd57cf3ae7a89c8013d"
Mar 09 14:35:24 crc kubenswrapper[4722]: I0309 14:35:24.155116 4722 scope.go:117] "RemoveContainer" containerID="48c3518495d5e233291f49d08b4afd8f53185fc9cc71a1db82bf42a1673c4010"
Mar 09 14:35:24 crc kubenswrapper[4722]: I0309 14:35:24.211640 4722 scope.go:117] "RemoveContainer" containerID="05e4e023af02d9f6a33763b4fa8ad55942cdd55d767418bf2f9c9ebd94a63ce4"
Mar 09 14:35:24 crc kubenswrapper[4722]: I0309 14:35:24.705662 4722 scope.go:117] "RemoveContainer" containerID="ab1bc7f3f075346e0c3f08ec47a17a509a3b83aa2fd81823f6dbd4674ac6db74"
containerID="ab1bc7f3f075346e0c3f08ec47a17a509a3b83aa2fd81823f6dbd4674ac6db74" Mar 09 14:35:24 crc kubenswrapper[4722]: I0309 14:35:24.737035 4722 scope.go:117] "RemoveContainer" containerID="362a3baf5e0261fd5426cdde6dead9ccd0a206479e0b9484ca198bd9bce59b63" Mar 09 14:35:24 crc kubenswrapper[4722]: I0309 14:35:24.795084 4722 scope.go:117] "RemoveContainer" containerID="84a5442f49e47806ced5341e3762f4d3f9e499c6bee42c6d293f5c8c8d961f4d" Mar 09 14:35:24 crc kubenswrapper[4722]: I0309 14:35:24.852355 4722 scope.go:117] "RemoveContainer" containerID="9eb0caf05965c54430c17365b4f1c9c40b3e0003965ce27c87e41bfdeae42841" Mar 09 14:35:24 crc kubenswrapper[4722]: I0309 14:35:24.879890 4722 scope.go:117] "RemoveContainer" containerID="474844d6f470855b27765fc7bc63d527480411f21441a78f7ec486c1dc4a2e0b" Mar 09 14:35:24 crc kubenswrapper[4722]: I0309 14:35:24.904129 4722 scope.go:117] "RemoveContainer" containerID="8ba9f880a35b90799326c51b4ef8ca4c612c935b1f0676c9b692a22f589c5bf6" Mar 09 14:35:24 crc kubenswrapper[4722]: I0309 14:35:24.926981 4722 scope.go:117] "RemoveContainer" containerID="3c8f0ecc6e288a2c45b7eecaa7fbb8df96e687d06d2ecda8b11d40e75a734c21" Mar 09 14:35:24 crc kubenswrapper[4722]: I0309 14:35:24.952713 4722 scope.go:117] "RemoveContainer" containerID="79e69e9857ff6bcf1ff3a0ee629adedb582f12c826219910c63e7b4fe8b7365e" Mar 09 14:35:24 crc kubenswrapper[4722]: I0309 14:35:24.978969 4722 scope.go:117] "RemoveContainer" containerID="2d71a3ff9d06d6730989830c92086552aabfd7058ad9bdbfa73c47fa2069e148" Mar 09 14:35:25 crc kubenswrapper[4722]: I0309 14:35:25.011093 4722 scope.go:117] "RemoveContainer" containerID="bd82c164ee409dde9a58c38a0dd185a5560bf460d2e5290d5d5480ce344605d9" Mar 09 14:35:27 crc kubenswrapper[4722]: I0309 14:35:27.037722 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-b1e8-account-create-update-6xt7j"] Mar 09 14:35:27 crc kubenswrapper[4722]: I0309 14:35:27.050647 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-b1e8-account-create-update-6xt7j"] Mar 09 14:35:27 crc kubenswrapper[4722]: I0309 14:35:27.065623 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-b229-account-create-update-8qcbk"] Mar 09 14:35:27 crc kubenswrapper[4722]: I0309 14:35:27.079900 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-b229-account-create-update-8qcbk"] Mar 09 14:35:27 crc kubenswrapper[4722]: I0309 14:35:27.093916 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-d6qd7"] Mar 09 14:35:27 crc kubenswrapper[4722]: I0309 14:35:27.102416 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-28mvd"] Mar 09 14:35:27 crc kubenswrapper[4722]: I0309 14:35:27.112340 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-c159-account-create-update-wrg5t"] Mar 09 14:35:27 crc kubenswrapper[4722]: I0309 14:35:27.122925 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-d6qd7"] Mar 09 14:35:27 crc kubenswrapper[4722]: I0309 14:35:27.133829 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-28mvd"] Mar 09 14:35:27 crc kubenswrapper[4722]: I0309 14:35:27.144927 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-c159-account-create-update-wrg5t"] Mar 09 14:35:27 crc kubenswrapper[4722]: I0309 14:35:27.156621 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-fbldj"] Mar 09 14:35:27 crc 
kubenswrapper[4722]: I0309 14:35:27.166896 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-ea4e-account-create-update-vgttx"] Mar 09 14:35:27 crc kubenswrapper[4722]: I0309 14:35:27.177576 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-fbldj"] Mar 09 14:35:27 crc kubenswrapper[4722]: I0309 14:35:27.187968 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-2f2fz"] Mar 09 14:35:27 crc kubenswrapper[4722]: I0309 14:35:27.197589 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-ea4e-account-create-update-vgttx"] Mar 09 14:35:27 crc kubenswrapper[4722]: I0309 14:35:27.207755 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-2f2fz"] Mar 09 14:35:28 crc kubenswrapper[4722]: I0309 14:35:28.164940 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e8291dd-3577-4312-9ef5-fddf9a98b9db" path="/var/lib/kubelet/pods/4e8291dd-3577-4312-9ef5-fddf9a98b9db/volumes" Mar 09 14:35:28 crc kubenswrapper[4722]: I0309 14:35:28.167545 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6934dfa5-4f2a-4241-8cc2-20a87e4659e8" path="/var/lib/kubelet/pods/6934dfa5-4f2a-4241-8cc2-20a87e4659e8/volumes" Mar 09 14:35:28 crc kubenswrapper[4722]: I0309 14:35:28.171642 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69af744d-8cc5-460e-8a51-f867d17f8e49" path="/var/lib/kubelet/pods/69af744d-8cc5-460e-8a51-f867d17f8e49/volumes" Mar 09 14:35:28 crc kubenswrapper[4722]: I0309 14:35:28.173732 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c70c5f8-b667-4d94-96f4-7184d2fc36aa" path="/var/lib/kubelet/pods/7c70c5f8-b667-4d94-96f4-7184d2fc36aa/volumes" Mar 09 14:35:28 crc kubenswrapper[4722]: I0309 14:35:28.177573 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e2e23b8-e05f-4e51-b435-baba257c7a55" path="/var/lib/kubelet/pods/8e2e23b8-e05f-4e51-b435-baba257c7a55/volumes" Mar 09 14:35:28 crc kubenswrapper[4722]: I0309 14:35:28.180675 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6cddd9c-6855-4f24-bbe7-b628c2431354" path="/var/lib/kubelet/pods/d6cddd9c-6855-4f24-bbe7-b628c2431354/volumes" Mar 09 14:35:28 crc kubenswrapper[4722]: I0309 14:35:28.181903 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e291a0c4-8966-4c54-b4c9-fa2ed6c28b1b" path="/var/lib/kubelet/pods/e291a0c4-8966-4c54-b4c9-fa2ed6c28b1b/volumes" Mar 09 14:35:28 crc kubenswrapper[4722]: I0309 14:35:28.185240 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee1ac8e7-cf85-4fe4-9e02-7be1c6ac8119" path="/var/lib/kubelet/pods/ee1ac8e7-cf85-4fe4-9e02-7be1c6ac8119/volumes" Mar 09 14:35:30 crc kubenswrapper[4722]: I0309 14:35:30.841150 4722 generic.go:334] "Generic (PLEG): container finished" podID="4a60c1ca-738f-48a9-b972-3eef08a28fc6" containerID="2179063018fc63a106d493df46f46ad2bd648830e45bdf5244a79f206eb9dc1b" exitCode=0 Mar 09 14:35:30 crc kubenswrapper[4722]: I0309 14:35:30.841756 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fsqlh" event={"ID":"4a60c1ca-738f-48a9-b972-3eef08a28fc6","Type":"ContainerDied","Data":"2179063018fc63a106d493df46f46ad2bd648830e45bdf5244a79f206eb9dc1b"} Mar 09 14:35:31 crc kubenswrapper[4722]: I0309 14:35:31.055601 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/keystone-db-sync-gw98d"] Mar 09 14:35:31 crc kubenswrapper[4722]: I0309 14:35:31.068740 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-gw98d"] Mar 09 14:35:32 crc kubenswrapper[4722]: I0309 14:35:32.163801 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f457549-38ab-40d6-97ad-160ce234e8e5" path="/var/lib/kubelet/pods/1f457549-38ab-40d6-97ad-160ce234e8e5/volumes" Mar 09 14:35:32 crc kubenswrapper[4722]: I0309 14:35:32.329937 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fsqlh" Mar 09 14:35:32 crc kubenswrapper[4722]: I0309 14:35:32.525258 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4a60c1ca-738f-48a9-b972-3eef08a28fc6-inventory\") pod \"4a60c1ca-738f-48a9-b972-3eef08a28fc6\" (UID: \"4a60c1ca-738f-48a9-b972-3eef08a28fc6\") " Mar 09 14:35:32 crc kubenswrapper[4722]: I0309 14:35:32.525424 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6whl\" (UniqueName: \"kubernetes.io/projected/4a60c1ca-738f-48a9-b972-3eef08a28fc6-kube-api-access-f6whl\") pod \"4a60c1ca-738f-48a9-b972-3eef08a28fc6\" (UID: \"4a60c1ca-738f-48a9-b972-3eef08a28fc6\") " Mar 09 14:35:32 crc kubenswrapper[4722]: I0309 14:35:32.525516 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4a60c1ca-738f-48a9-b972-3eef08a28fc6-ssh-key-openstack-edpm-ipam\") pod \"4a60c1ca-738f-48a9-b972-3eef08a28fc6\" (UID: \"4a60c1ca-738f-48a9-b972-3eef08a28fc6\") " Mar 09 14:35:32 crc kubenswrapper[4722]: I0309 14:35:32.525794 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a60c1ca-738f-48a9-b972-3eef08a28fc6-bootstrap-combined-ca-bundle\") pod \"4a60c1ca-738f-48a9-b972-3eef08a28fc6\" (UID: \"4a60c1ca-738f-48a9-b972-3eef08a28fc6\") " Mar 09 14:35:32 crc kubenswrapper[4722]: I0309 14:35:32.535292 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a60c1ca-738f-48a9-b972-3eef08a28fc6-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "4a60c1ca-738f-48a9-b972-3eef08a28fc6" (UID: "4a60c1ca-738f-48a9-b972-3eef08a28fc6"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:35:32 crc kubenswrapper[4722]: I0309 14:35:32.540460 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a60c1ca-738f-48a9-b972-3eef08a28fc6-kube-api-access-f6whl" (OuterVolumeSpecName: "kube-api-access-f6whl") pod "4a60c1ca-738f-48a9-b972-3eef08a28fc6" (UID: "4a60c1ca-738f-48a9-b972-3eef08a28fc6"). InnerVolumeSpecName "kube-api-access-f6whl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:35:32 crc kubenswrapper[4722]: I0309 14:35:32.566851 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a60c1ca-738f-48a9-b972-3eef08a28fc6-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "4a60c1ca-738f-48a9-b972-3eef08a28fc6" (UID: "4a60c1ca-738f-48a9-b972-3eef08a28fc6"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:35:32 crc kubenswrapper[4722]: I0309 14:35:32.579430 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a60c1ca-738f-48a9-b972-3eef08a28fc6-inventory" (OuterVolumeSpecName: "inventory") pod "4a60c1ca-738f-48a9-b972-3eef08a28fc6" (UID: "4a60c1ca-738f-48a9-b972-3eef08a28fc6"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:35:32 crc kubenswrapper[4722]: I0309 14:35:32.628887 4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4a60c1ca-738f-48a9-b972-3eef08a28fc6-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 14:35:32 crc kubenswrapper[4722]: I0309 14:35:32.628928 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f6whl\" (UniqueName: \"kubernetes.io/projected/4a60c1ca-738f-48a9-b972-3eef08a28fc6-kube-api-access-f6whl\") on node \"crc\" DevicePath \"\"" Mar 09 14:35:32 crc kubenswrapper[4722]: I0309 14:35:32.628940 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4a60c1ca-738f-48a9-b972-3eef08a28fc6-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 14:35:32 crc kubenswrapper[4722]: I0309 14:35:32.628951 4722 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a60c1ca-738f-48a9-b972-3eef08a28fc6-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:35:32 crc kubenswrapper[4722]: I0309 14:35:32.884162 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fsqlh" event={"ID":"4a60c1ca-738f-48a9-b972-3eef08a28fc6","Type":"ContainerDied","Data":"7d3d0a0344f13e515d04bc7e76c95ea37d872c7f12dcfd287aecc55e73a701be"} Mar 09 14:35:32 crc kubenswrapper[4722]: I0309 14:35:32.884245 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d3d0a0344f13e515d04bc7e76c95ea37d872c7f12dcfd287aecc55e73a701be" Mar 09 14:35:32 crc kubenswrapper[4722]: I0309 14:35:32.884309 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-fsqlh" Mar 09 14:35:32 crc kubenswrapper[4722]: I0309 14:35:32.963884 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gk84b"] Mar 09 14:35:32 crc kubenswrapper[4722]: E0309 14:35:32.966461 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a60c1ca-738f-48a9-b972-3eef08a28fc6" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 09 14:35:32 crc kubenswrapper[4722]: I0309 14:35:32.966497 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a60c1ca-738f-48a9-b972-3eef08a28fc6" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 09 14:35:32 crc kubenswrapper[4722]: E0309 14:35:32.966578 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="066f98c7-83dc-4e2d-9e51-702dd17db261" containerName="oc" Mar 09 14:35:32 crc kubenswrapper[4722]: I0309 14:35:32.966588 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="066f98c7-83dc-4e2d-9e51-702dd17db261" containerName="oc" Mar 09 14:35:32 crc kubenswrapper[4722]: I0309 14:35:32.966882 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="066f98c7-83dc-4e2d-9e51-702dd17db261" containerName="oc" Mar 09 14:35:32 crc kubenswrapper[4722]: I0309 14:35:32.966914 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a60c1ca-738f-48a9-b972-3eef08a28fc6" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 09 14:35:32 crc kubenswrapper[4722]: I0309 14:35:32.968153 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gk84b" Mar 09 14:35:32 crc kubenswrapper[4722]: I0309 14:35:32.975181 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dwlbg" Mar 09 14:35:32 crc kubenswrapper[4722]: I0309 14:35:32.975351 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 14:35:32 crc kubenswrapper[4722]: I0309 14:35:32.975503 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 14:35:32 crc kubenswrapper[4722]: I0309 14:35:32.975644 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 14:35:32 crc kubenswrapper[4722]: I0309 14:35:32.979546 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gk84b"] Mar 09 14:35:33 crc kubenswrapper[4722]: I0309 14:35:33.038406 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ed3d0a76-d1d6-4c0b-afb3-e9a74176d95f-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-gk84b\" (UID: \"ed3d0a76-d1d6-4c0b-afb3-e9a74176d95f\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gk84b" Mar 09 14:35:33 crc kubenswrapper[4722]: I0309 14:35:33.038903 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68cjk\" (UniqueName: \"kubernetes.io/projected/ed3d0a76-d1d6-4c0b-afb3-e9a74176d95f-kube-api-access-68cjk\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-gk84b\" (UID: \"ed3d0a76-d1d6-4c0b-afb3-e9a74176d95f\") " 
pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gk84b" Mar 09 14:35:33 crc kubenswrapper[4722]: I0309 14:35:33.038992 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ed3d0a76-d1d6-4c0b-afb3-e9a74176d95f-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-gk84b\" (UID: \"ed3d0a76-d1d6-4c0b-afb3-e9a74176d95f\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gk84b" Mar 09 14:35:33 crc kubenswrapper[4722]: I0309 14:35:33.141882 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68cjk\" (UniqueName: \"kubernetes.io/projected/ed3d0a76-d1d6-4c0b-afb3-e9a74176d95f-kube-api-access-68cjk\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-gk84b\" (UID: \"ed3d0a76-d1d6-4c0b-afb3-e9a74176d95f\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gk84b" Mar 09 14:35:33 crc kubenswrapper[4722]: I0309 14:35:33.141992 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ed3d0a76-d1d6-4c0b-afb3-e9a74176d95f-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-gk84b\" (UID: \"ed3d0a76-d1d6-4c0b-afb3-e9a74176d95f\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gk84b" Mar 09 14:35:33 crc kubenswrapper[4722]: I0309 14:35:33.142126 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ed3d0a76-d1d6-4c0b-afb3-e9a74176d95f-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-gk84b\" (UID: \"ed3d0a76-d1d6-4c0b-afb3-e9a74176d95f\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gk84b" Mar 09 14:35:33 crc kubenswrapper[4722]: I0309 14:35:33.145938 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ed3d0a76-d1d6-4c0b-afb3-e9a74176d95f-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-gk84b\" (UID: \"ed3d0a76-d1d6-4c0b-afb3-e9a74176d95f\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gk84b" Mar 09 14:35:33 crc kubenswrapper[4722]: I0309 14:35:33.146171 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ed3d0a76-d1d6-4c0b-afb3-e9a74176d95f-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-gk84b\" (UID: \"ed3d0a76-d1d6-4c0b-afb3-e9a74176d95f\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gk84b" Mar 09 14:35:33 crc kubenswrapper[4722]: I0309 14:35:33.164839 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68cjk\" (UniqueName: \"kubernetes.io/projected/ed3d0a76-d1d6-4c0b-afb3-e9a74176d95f-kube-api-access-68cjk\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-gk84b\" (UID: \"ed3d0a76-d1d6-4c0b-afb3-e9a74176d95f\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gk84b" Mar 09 14:35:33 crc kubenswrapper[4722]: I0309 14:35:33.295544 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gk84b" Mar 09 14:35:33 crc kubenswrapper[4722]: I0309 14:35:33.903477 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gk84b"] Mar 09 14:35:34 crc kubenswrapper[4722]: I0309 14:35:34.908992 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gk84b" event={"ID":"ed3d0a76-d1d6-4c0b-afb3-e9a74176d95f","Type":"ContainerStarted","Data":"9933bd31732a9c4eda9b67b65f8c4003ccd6b37dacb799c45629a918579952e0"} Mar 09 14:35:34 crc kubenswrapper[4722]: I0309 14:35:34.909604 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gk84b" event={"ID":"ed3d0a76-d1d6-4c0b-afb3-e9a74176d95f","Type":"ContainerStarted","Data":"423cdaf51e1cc75157f7e586bff2158c565c6f95a49b2f25f7ef00e8c8c0b48b"} Mar 09 14:35:34 crc kubenswrapper[4722]: I0309 14:35:34.929521 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gk84b" podStartSLOduration=2.381371838 podStartE2EDuration="2.929504137s" podCreationTimestamp="2026-03-09 14:35:32 +0000 UTC" firstStartedPulling="2026-03-09 14:35:33.907174217 +0000 UTC m=+1974.462742793" lastFinishedPulling="2026-03-09 14:35:34.455306516 +0000 UTC m=+1975.010875092" observedRunningTime="2026-03-09 14:35:34.922961247 +0000 UTC m=+1975.478529823" watchObservedRunningTime="2026-03-09 14:35:34.929504137 +0000 UTC m=+1975.485072703" Mar 09 14:36:00 crc kubenswrapper[4722]: I0309 14:36:00.164002 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551116-czdm7"] Mar 09 14:36:00 crc kubenswrapper[4722]: I0309 14:36:00.166159 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551116-czdm7" Mar 09 14:36:00 crc kubenswrapper[4722]: I0309 14:36:00.169695 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 14:36:00 crc kubenswrapper[4722]: I0309 14:36:00.169833 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-dr2x6" Mar 09 14:36:00 crc kubenswrapper[4722]: I0309 14:36:00.169993 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 14:36:00 crc kubenswrapper[4722]: I0309 14:36:00.193224 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551116-czdm7"] Mar 09 14:36:00 crc kubenswrapper[4722]: I0309 14:36:00.329840 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6q8pl\" (UniqueName: \"kubernetes.io/projected/dd7ebfdc-0261-4646-9601-cd3367751122-kube-api-access-6q8pl\") pod \"auto-csr-approver-29551116-czdm7\" (UID: \"dd7ebfdc-0261-4646-9601-cd3367751122\") " pod="openshift-infra/auto-csr-approver-29551116-czdm7" Mar 09 14:36:00 crc kubenswrapper[4722]: I0309 14:36:00.435474 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6q8pl\" (UniqueName: \"kubernetes.io/projected/dd7ebfdc-0261-4646-9601-cd3367751122-kube-api-access-6q8pl\") pod \"auto-csr-approver-29551116-czdm7\" (UID: \"dd7ebfdc-0261-4646-9601-cd3367751122\") " pod="openshift-infra/auto-csr-approver-29551116-czdm7" Mar 09 14:36:00 crc kubenswrapper[4722]: I0309 14:36:00.463406 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6q8pl\" (UniqueName: \"kubernetes.io/projected/dd7ebfdc-0261-4646-9601-cd3367751122-kube-api-access-6q8pl\") pod \"auto-csr-approver-29551116-czdm7\" (UID: \"dd7ebfdc-0261-4646-9601-cd3367751122\") " pod="openshift-infra/auto-csr-approver-29551116-czdm7" Mar 09 14:36:00 crc kubenswrapper[4722]: I0309 14:36:00.504913 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551116-czdm7" Mar 09 14:36:00 crc kubenswrapper[4722]: I0309 14:36:00.994397 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551116-czdm7"] Mar 09 14:36:01 crc kubenswrapper[4722]: I0309 14:36:01.199770 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551116-czdm7" event={"ID":"dd7ebfdc-0261-4646-9601-cd3367751122","Type":"ContainerStarted","Data":"3104cd95117c91fec93845d6a236175d8b499446970347a7bd70dbb1a0dcdce0"} Mar 09 14:36:02 crc kubenswrapper[4722]: I0309 14:36:02.059601 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-l9wtc"] Mar 09 14:36:02 crc kubenswrapper[4722]: I0309 14:36:02.072553 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-l9wtc"] Mar 09 14:36:02 crc kubenswrapper[4722]: I0309 14:36:02.162607 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69530d68-be96-4605-be46-8053083aa178" path="/var/lib/kubelet/pods/69530d68-be96-4605-be46-8053083aa178/volumes" Mar 09 14:36:03 crc kubenswrapper[4722]: I0309 14:36:03.223274 4722 generic.go:334] "Generic (PLEG): container finished" podID="dd7ebfdc-0261-4646-9601-cd3367751122" containerID="756e3aaa32e5086309c4a2d9889d64503e81017d2539d8f1b301b840339d420f" exitCode=0 Mar 09 14:36:03 crc kubenswrapper[4722]: I0309 14:36:03.224403 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551116-czdm7" event={"ID":"dd7ebfdc-0261-4646-9601-cd3367751122","Type":"ContainerDied","Data":"756e3aaa32e5086309c4a2d9889d64503e81017d2539d8f1b301b840339d420f"} Mar 09 14:36:04 crc kubenswrapper[4722]: I0309 14:36:04.658328 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551116-czdm7" Mar 09 14:36:04 crc kubenswrapper[4722]: I0309 14:36:04.765313 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6q8pl\" (UniqueName: \"kubernetes.io/projected/dd7ebfdc-0261-4646-9601-cd3367751122-kube-api-access-6q8pl\") pod \"dd7ebfdc-0261-4646-9601-cd3367751122\" (UID: \"dd7ebfdc-0261-4646-9601-cd3367751122\") " Mar 09 14:36:04 crc kubenswrapper[4722]: I0309 14:36:04.772264 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd7ebfdc-0261-4646-9601-cd3367751122-kube-api-access-6q8pl" (OuterVolumeSpecName: "kube-api-access-6q8pl") pod "dd7ebfdc-0261-4646-9601-cd3367751122" (UID: "dd7ebfdc-0261-4646-9601-cd3367751122"). InnerVolumeSpecName "kube-api-access-6q8pl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:36:04 crc kubenswrapper[4722]: I0309 14:36:04.868731 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6q8pl\" (UniqueName: \"kubernetes.io/projected/dd7ebfdc-0261-4646-9601-cd3367751122-kube-api-access-6q8pl\") on node \"crc\" DevicePath \"\"" Mar 09 14:36:05 crc kubenswrapper[4722]: I0309 14:36:05.252680 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551116-czdm7" event={"ID":"dd7ebfdc-0261-4646-9601-cd3367751122","Type":"ContainerDied","Data":"3104cd95117c91fec93845d6a236175d8b499446970347a7bd70dbb1a0dcdce0"} Mar 09 14:36:05 crc kubenswrapper[4722]: I0309 14:36:05.252936 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3104cd95117c91fec93845d6a236175d8b499446970347a7bd70dbb1a0dcdce0" Mar 09 14:36:05 crc kubenswrapper[4722]: I0309 14:36:05.252759 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551116-czdm7" Mar 09 14:36:05 crc kubenswrapper[4722]: I0309 14:36:05.743130 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551110-bsfw4"] Mar 09 14:36:05 crc kubenswrapper[4722]: I0309 14:36:05.758417 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551110-bsfw4"] Mar 09 14:36:06 crc kubenswrapper[4722]: I0309 14:36:06.162007 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="399309e2-cc0e-4ecb-a9ed-ae5efd75dfb5" path="/var/lib/kubelet/pods/399309e2-cc0e-4ecb-a9ed-ae5efd75dfb5/volumes" Mar 09 14:36:13 crc kubenswrapper[4722]: I0309 14:36:13.034380 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-9llgc"] Mar 09 14:36:13 crc kubenswrapper[4722]: I0309 14:36:13.047599 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-9llgc"] Mar 09 14:36:14 crc kubenswrapper[4722]: I0309 14:36:14.161451 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1e56f45-1812-4ee1-aacc-0b012cf07111" path="/var/lib/kubelet/pods/d1e56f45-1812-4ee1-aacc-0b012cf07111/volumes" Mar 09 14:36:17 crc kubenswrapper[4722]: I0309 14:36:17.034371 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-tlbnm"] Mar 09 14:36:17 crc kubenswrapper[4722]: I0309 14:36:17.050543 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-fx7ks"] Mar 09 14:36:17 crc kubenswrapper[4722]: I0309 14:36:17.062801 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-tlbnm"] Mar 09 14:36:17 crc kubenswrapper[4722]: I0309 14:36:17.075099 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-fx7ks"] Mar 09 14:36:18 crc kubenswrapper[4722]: I0309 14:36:18.164942 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b63a3af5-a347-40ed-b9bc-52ad70e7ff13" path="/var/lib/kubelet/pods/b63a3af5-a347-40ed-b9bc-52ad70e7ff13/volumes" Mar 09 14:36:18 crc kubenswrapper[4722]: I0309 14:36:18.167311 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b" path="/var/lib/kubelet/pods/cdb62cc9-3fd6-41fd-bbbd-de3aaee2073b/volumes" Mar 09 14:36:25 crc kubenswrapper[4722]: I0309 14:36:25.421055 4722 scope.go:117] "RemoveContainer" 
containerID="24a467e4c3a886fcf22d058f6004b174454c597a6959d3e90b0c7b0a148e24d1" Mar 09 14:36:25 crc kubenswrapper[4722]: I0309 14:36:25.505834 4722 scope.go:117] "RemoveContainer" containerID="e6aa7414333ac95def3ea245d3435a51edd711e5e6e631810dfb12614097070f" Mar 09 14:36:25 crc kubenswrapper[4722]: I0309 14:36:25.553681 4722 scope.go:117] "RemoveContainer" containerID="ae0f0a10b39149567e274974d05e02450f5f2fa41fba87a0a9d831ebf28b999e" Mar 09 14:36:25 crc kubenswrapper[4722]: I0309 14:36:25.616574 4722 scope.go:117] "RemoveContainer" containerID="942a11c95cd02763b3e8064661fec1d9adf3cea22dc10a549c6a28ccb2e72f87" Mar 09 14:36:25 crc kubenswrapper[4722]: I0309 14:36:25.683905 4722 scope.go:117] "RemoveContainer" containerID="785c9b3fd4f944246425859922e4b9f248364ff8e2049eee438ad214e9c731f4" Mar 09 14:36:25 crc kubenswrapper[4722]: I0309 14:36:25.780618 4722 scope.go:117] "RemoveContainer" containerID="8146c5ee9c87f326a93c865b0b24bb207155822d151e5247f424dbf90262a039" Mar 09 14:36:25 crc kubenswrapper[4722]: I0309 14:36:25.812444 4722 scope.go:117] "RemoveContainer" containerID="e343ca6f06216a8b36f8366941533ba1cb1fd88a064c2a90674afaf0e6169fee" Mar 09 14:36:25 crc kubenswrapper[4722]: I0309 14:36:25.841075 4722 scope.go:117] "RemoveContainer" containerID="3a6f8c5ed200b2e304dfa5940d263adcd0ac9b8b79a760183d551dd9d0eee91d" Mar 09 14:36:25 crc kubenswrapper[4722]: I0309 14:36:25.874974 4722 scope.go:117] "RemoveContainer" containerID="df3dbbc38c06144873dfadfa1c59d99738ef935ead1458109e138b25317b0454" Mar 09 14:36:25 crc kubenswrapper[4722]: I0309 14:36:25.915600 4722 scope.go:117] "RemoveContainer" containerID="096316f8c04dfff3c49fef10ecd7a9fa77ee78f8e756c864851cd88a0c5c6a66" Mar 09 14:36:25 crc kubenswrapper[4722]: I0309 14:36:25.938852 4722 scope.go:117] "RemoveContainer" containerID="911525eabae91f320ecd486d8c9736832ea7f1256cdd8b97a7ae5e490d7c6c14" Mar 09 14:36:25 crc kubenswrapper[4722]: I0309 14:36:25.968487 4722 scope.go:117] "RemoveContainer" containerID="0818377d7e7c8f400a6f0a5a2ec3ea62dc6867105f8b49f85ba04d8ffe1a915d" Mar 09 14:36:25 crc kubenswrapper[4722]: I0309 14:36:25.994266 4722 scope.go:117] "RemoveContainer" containerID="ae82c8c817df06b4b73dcd2172abb119c668da4ef69be56ed5bb917ed7198e43" Mar 09 14:36:26 crc kubenswrapper[4722]: I0309 14:36:26.022303 4722 scope.go:117] "RemoveContainer" containerID="3ffd34a8cc592524621e7a2905633a55ed45699ded92c0dbfb682c1d36cd612c" Mar 09 14:36:35 crc kubenswrapper[4722]: I0309 14:36:35.049355 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-fzj6s"] Mar 09 14:36:35 crc kubenswrapper[4722]: I0309 14:36:35.078755 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-fzj6s"] Mar 09 14:36:36 crc kubenswrapper[4722]: I0309 14:36:36.172049 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5035bd54-0aaa-4ff3-b90a-6145145fe95c" path="/var/lib/kubelet/pods/5035bd54-0aaa-4ff3-b90a-6145145fe95c/volumes" Mar 09 14:37:19 crc kubenswrapper[4722]: I0309 14:37:19.110094 4722 generic.go:334] "Generic (PLEG): container finished" podID="ed3d0a76-d1d6-4c0b-afb3-e9a74176d95f" containerID="9933bd31732a9c4eda9b67b65f8c4003ccd6b37dacb799c45629a918579952e0" exitCode=0 Mar 09 14:37:19 crc kubenswrapper[4722]: I0309 14:37:19.110168 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gk84b" 
event={"ID":"ed3d0a76-d1d6-4c0b-afb3-e9a74176d95f","Type":"ContainerDied","Data":"9933bd31732a9c4eda9b67b65f8c4003ccd6b37dacb799c45629a918579952e0"} Mar 09 14:37:20 crc kubenswrapper[4722]: I0309 14:37:20.050893 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-48rrj"] Mar 09 14:37:20 crc kubenswrapper[4722]: I0309 14:37:20.087511 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-48rrj"] Mar 09 14:37:20 crc kubenswrapper[4722]: I0309 14:37:20.102908 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-wtg6v"] Mar 09 14:37:20 crc kubenswrapper[4722]: I0309 14:37:20.122263 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-b4fe-account-create-update-jrh5p"] Mar 09 14:37:20 crc kubenswrapper[4722]: I0309 14:37:20.141276 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-wtg6v"] Mar 09 14:37:20 crc kubenswrapper[4722]: I0309 14:37:20.185007 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22868fd9-09e2-4ad6-b923-ab373da94453" path="/var/lib/kubelet/pods/22868fd9-09e2-4ad6-b923-ab373da94453/volumes" Mar 09 14:37:20 crc kubenswrapper[4722]: I0309 14:37:20.185645 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bcd1ab8-574f-4e9e-8b60-058524c8be9f" path="/var/lib/kubelet/pods/8bcd1ab8-574f-4e9e-8b60-058524c8be9f/volumes" Mar 09 14:37:20 crc kubenswrapper[4722]: I0309 14:37:20.186872 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-25f0-account-create-update-cmbgw"] Mar 09 14:37:20 crc kubenswrapper[4722]: I0309 14:37:20.189426 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-b4fe-account-create-update-jrh5p"] Mar 09 14:37:20 crc kubenswrapper[4722]: I0309 14:37:20.207041 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-25f0-account-create-update-cmbgw"] Mar 09 14:37:20 crc kubenswrapper[4722]: I0309 14:37:20.744260 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gk84b" Mar 09 14:37:20 crc kubenswrapper[4722]: I0309 14:37:20.861276 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ed3d0a76-d1d6-4c0b-afb3-e9a74176d95f-ssh-key-openstack-edpm-ipam\") pod \"ed3d0a76-d1d6-4c0b-afb3-e9a74176d95f\" (UID: \"ed3d0a76-d1d6-4c0b-afb3-e9a74176d95f\") " Mar 09 14:37:20 crc kubenswrapper[4722]: I0309 14:37:20.861603 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68cjk\" (UniqueName: \"kubernetes.io/projected/ed3d0a76-d1d6-4c0b-afb3-e9a74176d95f-kube-api-access-68cjk\") pod \"ed3d0a76-d1d6-4c0b-afb3-e9a74176d95f\" (UID: \"ed3d0a76-d1d6-4c0b-afb3-e9a74176d95f\") " Mar 09 14:37:20 crc kubenswrapper[4722]: I0309 14:37:20.861822 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ed3d0a76-d1d6-4c0b-afb3-e9a74176d95f-inventory\") pod \"ed3d0a76-d1d6-4c0b-afb3-e9a74176d95f\" (UID: \"ed3d0a76-d1d6-4c0b-afb3-e9a74176d95f\") " Mar 09 14:37:20 crc kubenswrapper[4722]: I0309 14:37:20.867116 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed3d0a76-d1d6-4c0b-afb3-e9a74176d95f-kube-api-access-68cjk" (OuterVolumeSpecName: "kube-api-access-68cjk") pod "ed3d0a76-d1d6-4c0b-afb3-e9a74176d95f" (UID: "ed3d0a76-d1d6-4c0b-afb3-e9a74176d95f"). InnerVolumeSpecName "kube-api-access-68cjk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:37:20 crc kubenswrapper[4722]: I0309 14:37:20.892287 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed3d0a76-d1d6-4c0b-afb3-e9a74176d95f-inventory" (OuterVolumeSpecName: "inventory") pod "ed3d0a76-d1d6-4c0b-afb3-e9a74176d95f" (UID: "ed3d0a76-d1d6-4c0b-afb3-e9a74176d95f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:37:20 crc kubenswrapper[4722]: I0309 14:37:20.892642 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed3d0a76-d1d6-4c0b-afb3-e9a74176d95f-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ed3d0a76-d1d6-4c0b-afb3-e9a74176d95f" (UID: "ed3d0a76-d1d6-4c0b-afb3-e9a74176d95f"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:37:20 crc kubenswrapper[4722]: I0309 14:37:20.964100 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68cjk\" (UniqueName: \"kubernetes.io/projected/ed3d0a76-d1d6-4c0b-afb3-e9a74176d95f-kube-api-access-68cjk\") on node \"crc\" DevicePath \"\"" Mar 09 14:37:20 crc kubenswrapper[4722]: I0309 14:37:20.964129 4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ed3d0a76-d1d6-4c0b-afb3-e9a74176d95f-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 14:37:20 crc kubenswrapper[4722]: I0309 14:37:20.964140 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ed3d0a76-d1d6-4c0b-afb3-e9a74176d95f-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 14:37:21 crc kubenswrapper[4722]: I0309 14:37:21.039409 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-9wrb2"] Mar 09 14:37:21 crc kubenswrapper[4722]: I0309 14:37:21.053739 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-9wrb2"] Mar 09 14:37:21 crc kubenswrapper[4722]: I0309 14:37:21.063917 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-489a-account-create-update-z4jfp"] Mar 09 14:37:21 crc kubenswrapper[4722]: I0309 14:37:21.074176 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-489a-account-create-update-z4jfp"] Mar 09 14:37:21 crc kubenswrapper[4722]: I0309 14:37:21.133561 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gk84b" event={"ID":"ed3d0a76-d1d6-4c0b-afb3-e9a74176d95f","Type":"ContainerDied","Data":"423cdaf51e1cc75157f7e586bff2158c565c6f95a49b2f25f7ef00e8c8c0b48b"} Mar 09 14:37:21 crc kubenswrapper[4722]: I0309 14:37:21.133605 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="423cdaf51e1cc75157f7e586bff2158c565c6f95a49b2f25f7ef00e8c8c0b48b" Mar 09 14:37:21 crc kubenswrapper[4722]: I0309 14:37:21.133619 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-gk84b" Mar 09 14:37:21 crc kubenswrapper[4722]: I0309 14:37:21.229846 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rtjd6"] Mar 09 14:37:21 crc kubenswrapper[4722]: E0309 14:37:21.230576 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd7ebfdc-0261-4646-9601-cd3367751122" containerName="oc" Mar 09 14:37:21 crc kubenswrapper[4722]: I0309 14:37:21.232313 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd7ebfdc-0261-4646-9601-cd3367751122" containerName="oc" Mar 09 14:37:21 crc kubenswrapper[4722]: E0309 14:37:21.232382 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed3d0a76-d1d6-4c0b-afb3-e9a74176d95f" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 09 14:37:21 crc kubenswrapper[4722]: I0309 14:37:21.232395 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed3d0a76-d1d6-4c0b-afb3-e9a74176d95f" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 09 14:37:21 crc kubenswrapper[4722]: I0309 14:37:21.232719 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed3d0a76-d1d6-4c0b-afb3-e9a74176d95f" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 09 14:37:21 crc kubenswrapper[4722]: I0309 14:37:21.232748 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd7ebfdc-0261-4646-9601-cd3367751122" containerName="oc" Mar 09 14:37:21 crc kubenswrapper[4722]: I0309 14:37:21.233594 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rtjd6" Mar 09 14:37:21 crc kubenswrapper[4722]: I0309 14:37:21.236417 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 14:37:21 crc kubenswrapper[4722]: I0309 14:37:21.236541 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 14:37:21 crc kubenswrapper[4722]: I0309 14:37:21.236864 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dwlbg" Mar 09 14:37:21 crc kubenswrapper[4722]: I0309 14:37:21.237360 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 14:37:21 crc kubenswrapper[4722]: I0309 14:37:21.242679 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rtjd6"] Mar 09 14:37:21 crc kubenswrapper[4722]: I0309 14:37:21.374139 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ecc0ba5d-ebce-4f74-b55c-284c3c6edc05-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-rtjd6\" (UID: \"ecc0ba5d-ebce-4f74-b55c-284c3c6edc05\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rtjd6" Mar 09 14:37:21 crc kubenswrapper[4722]: I0309 14:37:21.374511 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tvjf\" (UniqueName: \"kubernetes.io/projected/ecc0ba5d-ebce-4f74-b55c-284c3c6edc05-kube-api-access-6tvjf\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-rtjd6\" (UID: 
\"ecc0ba5d-ebce-4f74-b55c-284c3c6edc05\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rtjd6" Mar 09 14:37:21 crc kubenswrapper[4722]: I0309 14:37:21.374719 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ecc0ba5d-ebce-4f74-b55c-284c3c6edc05-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-rtjd6\" (UID: \"ecc0ba5d-ebce-4f74-b55c-284c3c6edc05\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rtjd6" Mar 09 14:37:21 crc kubenswrapper[4722]: I0309 14:37:21.476922 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ecc0ba5d-ebce-4f74-b55c-284c3c6edc05-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-rtjd6\" (UID: \"ecc0ba5d-ebce-4f74-b55c-284c3c6edc05\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rtjd6" Mar 09 14:37:21 crc kubenswrapper[4722]: I0309 14:37:21.477528 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ecc0ba5d-ebce-4f74-b55c-284c3c6edc05-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-rtjd6\" (UID: \"ecc0ba5d-ebce-4f74-b55c-284c3c6edc05\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rtjd6" Mar 09 14:37:21 crc kubenswrapper[4722]: I0309 14:37:21.477717 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tvjf\" (UniqueName: \"kubernetes.io/projected/ecc0ba5d-ebce-4f74-b55c-284c3c6edc05-kube-api-access-6tvjf\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-rtjd6\" (UID: \"ecc0ba5d-ebce-4f74-b55c-284c3c6edc05\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rtjd6" Mar 09 14:37:21 crc kubenswrapper[4722]: I0309 14:37:21.482582 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ecc0ba5d-ebce-4f74-b55c-284c3c6edc05-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-rtjd6\" (UID: \"ecc0ba5d-ebce-4f74-b55c-284c3c6edc05\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rtjd6" Mar 09 14:37:21 crc kubenswrapper[4722]: I0309 14:37:21.488807 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ecc0ba5d-ebce-4f74-b55c-284c3c6edc05-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-rtjd6\" (UID: \"ecc0ba5d-ebce-4f74-b55c-284c3c6edc05\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rtjd6" Mar 09 14:37:21 crc kubenswrapper[4722]: I0309 14:37:21.499932 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tvjf\" (UniqueName: \"kubernetes.io/projected/ecc0ba5d-ebce-4f74-b55c-284c3c6edc05-kube-api-access-6tvjf\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-rtjd6\" (UID: \"ecc0ba5d-ebce-4f74-b55c-284c3c6edc05\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rtjd6" Mar 09 14:37:21 crc kubenswrapper[4722]: I0309 14:37:21.528282 4722 patch_prober.go:28] interesting pod/machine-config-daemon-hjrrb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 14:37:21 crc kubenswrapper[4722]: I0309 14:37:21.528337 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 14:37:21 crc kubenswrapper[4722]: I0309 14:37:21.558288 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rtjd6" Mar 09 14:37:22 crc kubenswrapper[4722]: I0309 14:37:22.109469 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rtjd6"] Mar 09 14:37:22 crc kubenswrapper[4722]: I0309 14:37:22.147212 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rtjd6" event={"ID":"ecc0ba5d-ebce-4f74-b55c-284c3c6edc05","Type":"ContainerStarted","Data":"b36fb27f1bbcab3d82b5a87a325f6c093dc8fc8f7474fbd63d061b3950655652"} Mar 09 14:37:22 crc kubenswrapper[4722]: I0309 14:37:22.166496 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bebeafd-3acc-450b-85b1-145cb598ac05" path="/var/lib/kubelet/pods/3bebeafd-3acc-450b-85b1-145cb598ac05/volumes" Mar 09 14:37:22 crc kubenswrapper[4722]: I0309 14:37:22.168008 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87133d68-d972-4b38-a6b9-f88733004c17" path="/var/lib/kubelet/pods/87133d68-d972-4b38-a6b9-f88733004c17/volumes" Mar 09 14:37:22 crc kubenswrapper[4722]: I0309 14:37:22.170370 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ce279d8-a769-4e43-89f9-18c598b6f207" path="/var/lib/kubelet/pods/9ce279d8-a769-4e43-89f9-18c598b6f207/volumes" Mar 09 14:37:22 crc kubenswrapper[4722]: I0309 14:37:22.171001 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff53300e-f89f-4204-82ce-5100fa8b10be" path="/var/lib/kubelet/pods/ff53300e-f89f-4204-82ce-5100fa8b10be/volumes" Mar 09 14:37:23 crc kubenswrapper[4722]: I0309 14:37:23.158962 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rtjd6" event={"ID":"ecc0ba5d-ebce-4f74-b55c-284c3c6edc05","Type":"ContainerStarted","Data":"043362c79d87f10f332ad746412ed1837f196a4dc2d99deccdd0a7c1d1cb1dd6"} Mar 09 14:37:23 crc kubenswrapper[4722]: I0309 14:37:23.200535 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rtjd6" podStartSLOduration=1.762239232 podStartE2EDuration="2.200514762s" podCreationTimestamp="2026-03-09 14:37:21 +0000 UTC" firstStartedPulling="2026-03-09 14:37:22.115472847 +0000 UTC m=+2082.671041423" lastFinishedPulling="2026-03-09 14:37:22.553748377 +0000 UTC m=+2083.109316953" observedRunningTime="2026-03-09 14:37:23.182506702 +0000 UTC m=+2083.738075278" watchObservedRunningTime="2026-03-09 14:37:23.200514762 +0000 UTC m=+2083.756083338" Mar 09 14:37:26 crc kubenswrapper[4722]: I0309 14:37:26.348091 4722 scope.go:117] "RemoveContainer" containerID="bc4d4d8cb0425cb7086646156e49b032cf304d0fd45000968132c6a706f3fc6e" Mar 09 14:37:26 crc kubenswrapper[4722]: I0309 14:37:26.409395 4722 scope.go:117] "RemoveContainer" 
containerID="fdcbebcda0784849d010d9f351b3cab1c65a75030b21be0704a612a41d13ddc4" Mar 09 14:37:26 crc kubenswrapper[4722]: I0309 14:37:26.437497 4722 scope.go:117] "RemoveContainer" containerID="be95b2bc7740c453215243f49acbc33a603da69895f058d18be9c4f1ce1acf40" Mar 09 14:37:26 crc kubenswrapper[4722]: I0309 14:37:26.500274 4722 scope.go:117] "RemoveContainer" containerID="264abab60d8e40cc9ad26af630c68c524c74d4168716437d47ead93f83091391" Mar 09 14:37:26 crc kubenswrapper[4722]: I0309 14:37:26.558356 4722 scope.go:117] "RemoveContainer" containerID="92e1b081619a85f67e51a0724fade902503af8f7f9809e6c3d63fba6df4a4861" Mar 09 14:37:26 crc kubenswrapper[4722]: I0309 14:37:26.608308 4722 scope.go:117] "RemoveContainer" containerID="2360808bc57a87b6b2fa7f5fc11ab102736e19dbbb32712fc1d07735f9404fa8" Mar 09 14:37:26 crc kubenswrapper[4722]: I0309 14:37:26.665027 4722 scope.go:117] "RemoveContainer" containerID="e70b1eca4a308b4ae963e0695763ed8e18e59f63b05e049fc322abf1067f5f46" Mar 09 14:37:48 crc kubenswrapper[4722]: I0309 14:37:48.534146 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gjk8k"] Mar 09 14:37:48 crc kubenswrapper[4722]: I0309 14:37:48.537557 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gjk8k" Mar 09 14:37:48 crc kubenswrapper[4722]: I0309 14:37:48.554434 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gjk8k"] Mar 09 14:37:48 crc kubenswrapper[4722]: I0309 14:37:48.629814 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4a919cf-468a-4e2a-af70-435ec58bb60d-utilities\") pod \"redhat-operators-gjk8k\" (UID: \"c4a919cf-468a-4e2a-af70-435ec58bb60d\") " pod="openshift-marketplace/redhat-operators-gjk8k" Mar 09 14:37:48 crc kubenswrapper[4722]: I0309 14:37:48.629974 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmz84\" (UniqueName: \"kubernetes.io/projected/c4a919cf-468a-4e2a-af70-435ec58bb60d-kube-api-access-rmz84\") pod \"redhat-operators-gjk8k\" (UID: \"c4a919cf-468a-4e2a-af70-435ec58bb60d\") " pod="openshift-marketplace/redhat-operators-gjk8k" Mar 09 14:37:48 crc kubenswrapper[4722]: I0309 14:37:48.630028 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4a919cf-468a-4e2a-af70-435ec58bb60d-catalog-content\") pod \"redhat-operators-gjk8k\" (UID: \"c4a919cf-468a-4e2a-af70-435ec58bb60d\") " pod="openshift-marketplace/redhat-operators-gjk8k" Mar 09 14:37:48 crc kubenswrapper[4722]: I0309 14:37:48.731815 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4a919cf-468a-4e2a-af70-435ec58bb60d-utilities\") pod \"redhat-operators-gjk8k\" (UID: \"c4a919cf-468a-4e2a-af70-435ec58bb60d\") " pod="openshift-marketplace/redhat-operators-gjk8k" Mar 09 14:37:48 crc kubenswrapper[4722]: I0309 14:37:48.731970 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmz84\" (UniqueName: \"kubernetes.io/projected/c4a919cf-468a-4e2a-af70-435ec58bb60d-kube-api-access-rmz84\") pod \"redhat-operators-gjk8k\" (UID: \"c4a919cf-468a-4e2a-af70-435ec58bb60d\") " pod="openshift-marketplace/redhat-operators-gjk8k" Mar 09 14:37:48 crc kubenswrapper[4722]: 
I0309 14:37:48.732027 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4a919cf-468a-4e2a-af70-435ec58bb60d-catalog-content\") pod \"redhat-operators-gjk8k\" (UID: \"c4a919cf-468a-4e2a-af70-435ec58bb60d\") " pod="openshift-marketplace/redhat-operators-gjk8k" Mar 09 14:37:48 crc kubenswrapper[4722]: I0309 14:37:48.732614 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4a919cf-468a-4e2a-af70-435ec58bb60d-catalog-content\") pod \"redhat-operators-gjk8k\" (UID: \"c4a919cf-468a-4e2a-af70-435ec58bb60d\") " pod="openshift-marketplace/redhat-operators-gjk8k" Mar 09 14:37:48 crc kubenswrapper[4722]: I0309 14:37:48.732624 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4a919cf-468a-4e2a-af70-435ec58bb60d-utilities\") pod \"redhat-operators-gjk8k\" (UID: \"c4a919cf-468a-4e2a-af70-435ec58bb60d\") " pod="openshift-marketplace/redhat-operators-gjk8k" Mar 09 14:37:48 crc kubenswrapper[4722]: I0309 14:37:48.753343 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmz84\" (UniqueName: \"kubernetes.io/projected/c4a919cf-468a-4e2a-af70-435ec58bb60d-kube-api-access-rmz84\") pod \"redhat-operators-gjk8k\" (UID: \"c4a919cf-468a-4e2a-af70-435ec58bb60d\") " pod="openshift-marketplace/redhat-operators-gjk8k" Mar 09 14:37:48 crc kubenswrapper[4722]: I0309 14:37:48.864739 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gjk8k" Mar 09 14:37:49 crc kubenswrapper[4722]: I0309 14:37:49.363028 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gjk8k"] Mar 09 14:37:49 crc kubenswrapper[4722]: I0309 14:37:49.478487 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gjk8k" event={"ID":"c4a919cf-468a-4e2a-af70-435ec58bb60d","Type":"ContainerStarted","Data":"67d7d7e43d9275bbd56035ceece14b4340455f8d681896d31d8faed5003aeeab"} Mar 09 14:37:50 crc kubenswrapper[4722]: I0309 14:37:50.493597 4722 generic.go:334] "Generic (PLEG): container finished" podID="c4a919cf-468a-4e2a-af70-435ec58bb60d" containerID="9503aef5a873cc8d98aa5afc6457f31c43855dbb5acae55935f466757d8277cf" exitCode=0 Mar 09 14:37:50 crc kubenswrapper[4722]: I0309 14:37:50.493701 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gjk8k" event={"ID":"c4a919cf-468a-4e2a-af70-435ec58bb60d","Type":"ContainerDied","Data":"9503aef5a873cc8d98aa5afc6457f31c43855dbb5acae55935f466757d8277cf"} Mar 09 14:37:51 crc kubenswrapper[4722]: I0309 14:37:51.528672 4722 patch_prober.go:28] interesting pod/machine-config-daemon-hjrrb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 14:37:51 crc kubenswrapper[4722]: I0309 14:37:51.529754 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 14:37:52 crc kubenswrapper[4722]: I0309 
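Annotation (not log output): the reconciler_common/operation_generator records above trace the kubelet's standard VerifyControllerAttachedVolume -> MountVolume -> SetUp sequence for the catalog pod's three volumes. As a minimal sketch only — assuming the k8s.io/api/core/v1 types and not taken from this cluster's manifests — the two "kubernetes.io/empty-dir" volumes named in those UniqueName strings correspond to a spec like the following; the projected "kube-api-access-rmz84" token volume is injected by the API server rather than declared by the pod author:

    package main

    import (
    	"fmt"

    	corev1 "k8s.io/api/core/v1"
    )

    func main() {
    	// Ephemeral scratch volumes matching the "kubernetes.io/empty-dir"
    	// plugin prefix in the mount records above (names taken from the log).
    	vols := []corev1.Volume{
    		{Name: "utilities", VolumeSource: corev1.VolumeSource{EmptyDir: &corev1.EmptyDirVolumeSource{}}},
    		{Name: "catalog-content", VolumeSource: corev1.VolumeSource{EmptyDir: &corev1.EmptyDirVolumeSource{}}},
    		// "kube-api-access-rmz84" (plugin "kubernetes.io/projected") is the
    		// service-account token volume added automatically by the API server.
    	}
    	for _, v := range vols {
    		fmt.Println("volume:", v.Name)
    	}
    }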
Mar 09 14:37:52 crc kubenswrapper[4722]: I0309 14:37:52.523280 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gjk8k" event={"ID":"c4a919cf-468a-4e2a-af70-435ec58bb60d","Type":"ContainerStarted","Data":"503147cf596933e09726925d8c48836d90d5f5e12f8617ad228899ef488a351d"}
Mar 09 14:37:54 crc kubenswrapper[4722]: I0309 14:37:54.093333 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-dxrwp"]
Mar 09 14:37:54 crc kubenswrapper[4722]: I0309 14:37:54.101709 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-dxrwp"]
Mar 09 14:37:54 crc kubenswrapper[4722]: I0309 14:37:54.163086 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8345b50-e7db-4e96-8ed3-1c4593079a7e" path="/var/lib/kubelet/pods/d8345b50-e7db-4e96-8ed3-1c4593079a7e/volumes"
Mar 09 14:37:58 crc kubenswrapper[4722]: I0309 14:37:58.591158 4722 generic.go:334] "Generic (PLEG): container finished" podID="c4a919cf-468a-4e2a-af70-435ec58bb60d" containerID="503147cf596933e09726925d8c48836d90d5f5e12f8617ad228899ef488a351d" exitCode=0
Mar 09 14:37:58 crc kubenswrapper[4722]: I0309 14:37:58.591459 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gjk8k" event={"ID":"c4a919cf-468a-4e2a-af70-435ec58bb60d","Type":"ContainerDied","Data":"503147cf596933e09726925d8c48836d90d5f5e12f8617ad228899ef488a351d"}
Mar 09 14:38:00 crc kubenswrapper[4722]: I0309 14:38:00.164441 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551118-j6srk"]
Mar 09 14:38:00 crc kubenswrapper[4722]: I0309 14:38:00.166334 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551118-j6srk"
Mar 09 14:38:00 crc kubenswrapper[4722]: I0309 14:38:00.167611 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551118-j6srk"]
Mar 09 14:38:00 crc kubenswrapper[4722]: I0309 14:38:00.168761 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 14:38:00 crc kubenswrapper[4722]: I0309 14:38:00.169114 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-dr2x6"
Mar 09 14:38:00 crc kubenswrapper[4722]: I0309 14:38:00.173094 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 14:38:00 crc kubenswrapper[4722]: I0309 14:38:00.199922 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sf4tq\" (UniqueName: \"kubernetes.io/projected/a081f1f1-e065-4d4e-87b8-878c36dd0582-kube-api-access-sf4tq\") pod \"auto-csr-approver-29551118-j6srk\" (UID: \"a081f1f1-e065-4d4e-87b8-878c36dd0582\") " pod="openshift-infra/auto-csr-approver-29551118-j6srk"
Mar 09 14:38:00 crc kubenswrapper[4722]: I0309 14:38:00.302675 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sf4tq\" (UniqueName: \"kubernetes.io/projected/a081f1f1-e065-4d4e-87b8-878c36dd0582-kube-api-access-sf4tq\") pod \"auto-csr-approver-29551118-j6srk\" (UID: \"a081f1f1-e065-4d4e-87b8-878c36dd0582\") " pod="openshift-infra/auto-csr-approver-29551118-j6srk"
Mar 09 14:38:00 crc kubenswrapper[4722]: I0309 14:38:00.325893 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sf4tq\" (UniqueName: \"kubernetes.io/projected/a081f1f1-e065-4d4e-87b8-878c36dd0582-kube-api-access-sf4tq\") pod \"auto-csr-approver-29551118-j6srk\" (UID: \"a081f1f1-e065-4d4e-87b8-878c36dd0582\") " pod="openshift-infra/auto-csr-approver-29551118-j6srk"
Mar 09 14:38:00 crc kubenswrapper[4722]: I0309 14:38:00.491684 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551118-j6srk"
Mar 09 14:38:01 crc kubenswrapper[4722]: I0309 14:38:01.224086 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551118-j6srk"]
Mar 09 14:38:01 crc kubenswrapper[4722]: I0309 14:38:01.629410 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551118-j6srk" event={"ID":"a081f1f1-e065-4d4e-87b8-878c36dd0582","Type":"ContainerStarted","Data":"4e0d8eb9817655cce707994d2f718b90f19fe652243ad0b9683a45dc1d09e108"}
Mar 09 14:38:01 crc kubenswrapper[4722]: I0309 14:38:01.632495 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gjk8k" event={"ID":"c4a919cf-468a-4e2a-af70-435ec58bb60d","Type":"ContainerStarted","Data":"3b8350b366504445d85bd5584989bafdd74b15b7fda2591796d68625c013d986"}
Mar 09 14:38:01 crc kubenswrapper[4722]: I0309 14:38:01.658843 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gjk8k" podStartSLOduration=3.77644292 podStartE2EDuration="13.658826209s" podCreationTimestamp="2026-03-09 14:37:48 +0000 UTC" firstStartedPulling="2026-03-09 14:37:50.495696458 +0000 UTC m=+2111.051265034" lastFinishedPulling="2026-03-09 14:38:00.378079747 +0000 UTC m=+2120.933648323" observedRunningTime="2026-03-09 14:38:01.651560531 +0000 UTC m=+2122.207129117" watchObservedRunningTime="2026-03-09 14:38:01.658826209 +0000 UTC m=+2122.214394785"
Mar 09 14:38:03 crc kubenswrapper[4722]: I0309 14:38:03.682960 4722 generic.go:334] "Generic (PLEG): container finished" podID="a081f1f1-e065-4d4e-87b8-878c36dd0582" containerID="cfee5a6fa0116f9ffffb52b0c238fd511806c2adf6e486bfe1c7a705d6c669df" exitCode=0
Mar 09 14:38:03 crc kubenswrapper[4722]: I0309 14:38:03.683077 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551118-j6srk" event={"ID":"a081f1f1-e065-4d4e-87b8-878c36dd0582","Type":"ContainerDied","Data":"cfee5a6fa0116f9ffffb52b0c238fd511806c2adf6e486bfe1c7a705d6c669df"}
Mar 09 14:38:05 crc kubenswrapper[4722]: I0309 14:38:05.124599 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551118-j6srk"
Mar 09 14:38:05 crc kubenswrapper[4722]: I0309 14:38:05.246768 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sf4tq\" (UniqueName: \"kubernetes.io/projected/a081f1f1-e065-4d4e-87b8-878c36dd0582-kube-api-access-sf4tq\") pod \"a081f1f1-e065-4d4e-87b8-878c36dd0582\" (UID: \"a081f1f1-e065-4d4e-87b8-878c36dd0582\") "
Mar 09 14:38:05 crc kubenswrapper[4722]: I0309 14:38:05.260760 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a081f1f1-e065-4d4e-87b8-878c36dd0582-kube-api-access-sf4tq" (OuterVolumeSpecName: "kube-api-access-sf4tq") pod "a081f1f1-e065-4d4e-87b8-878c36dd0582" (UID: "a081f1f1-e065-4d4e-87b8-878c36dd0582"). InnerVolumeSpecName "kube-api-access-sf4tq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:38:05 crc kubenswrapper[4722]: I0309 14:38:05.350343 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sf4tq\" (UniqueName: \"kubernetes.io/projected/a081f1f1-e065-4d4e-87b8-878c36dd0582-kube-api-access-sf4tq\") on node \"crc\" DevicePath \"\""
Mar 09 14:38:05 crc kubenswrapper[4722]: I0309 14:38:05.708421 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551118-j6srk" event={"ID":"a081f1f1-e065-4d4e-87b8-878c36dd0582","Type":"ContainerDied","Data":"4e0d8eb9817655cce707994d2f718b90f19fe652243ad0b9683a45dc1d09e108"}
Mar 09 14:38:05 crc kubenswrapper[4722]: I0309 14:38:05.708915 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e0d8eb9817655cce707994d2f718b90f19fe652243ad0b9683a45dc1d09e108"
Mar 09 14:38:05 crc kubenswrapper[4722]: I0309 14:38:05.708459 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551118-j6srk"
Mar 09 14:38:06 crc kubenswrapper[4722]: I0309 14:38:06.211092 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551112-ptlnj"]
Mar 09 14:38:06 crc kubenswrapper[4722]: I0309 14:38:06.223084 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551112-ptlnj"]
Mar 09 14:38:07 crc kubenswrapper[4722]: I0309 14:38:07.038756 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-6e64-account-create-update-sddxq"]
Mar 09 14:38:07 crc kubenswrapper[4722]: I0309 14:38:07.056859 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-kf5wr"]
Mar 09 14:38:07 crc kubenswrapper[4722]: I0309 14:38:07.071691 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-6e64-account-create-update-sddxq"]
Mar 09 14:38:07 crc kubenswrapper[4722]: I0309 14:38:07.084521 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-kf5wr"]
Mar 09 14:38:08 crc kubenswrapper[4722]: I0309 14:38:08.164883 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb4ac952-30e4-4b7d-866f-5bd6c9825bb2" path="/var/lib/kubelet/pods/cb4ac952-30e4-4b7d-866f-5bd6c9825bb2/volumes"
Mar 09 14:38:08 crc kubenswrapper[4722]: I0309 14:38:08.165754 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f48afe9d-4a0b-4d74-b9c4-a67f1fd098f3" path="/var/lib/kubelet/pods/f48afe9d-4a0b-4d74-b9c4-a67f1fd098f3/volumes"
Mar 09 14:38:08 crc kubenswrapper[4722]: I0309 14:38:08.167104 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff07094b-d1ed-4466-b69f-457d3bbacfed" path="/var/lib/kubelet/pods/ff07094b-d1ed-4466-b69f-457d3bbacfed/volumes"
Mar 09 14:38:08 crc kubenswrapper[4722]: I0309 14:38:08.865727 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gjk8k"
Mar 09 14:38:08 crc kubenswrapper[4722]: I0309 14:38:08.866103 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gjk8k"
Mar 09 14:38:09 crc kubenswrapper[4722]: I0309 14:38:09.924298 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gjk8k" podUID="c4a919cf-468a-4e2a-af70-435ec58bb60d" containerName="registry-server" probeResult="failure" output=<
service ":50051" within 1s Mar 09 14:38:09 crc kubenswrapper[4722]: > Mar 09 14:38:18 crc kubenswrapper[4722]: I0309 14:38:18.033187 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-8qttf"] Mar 09 14:38:18 crc kubenswrapper[4722]: I0309 14:38:18.045745 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-8qttf"] Mar 09 14:38:18 crc kubenswrapper[4722]: I0309 14:38:18.163828 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa375409-8285-4709-8abd-1916c59a6566" path="/var/lib/kubelet/pods/aa375409-8285-4709-8abd-1916c59a6566/volumes" Mar 09 14:38:19 crc kubenswrapper[4722]: I0309 14:38:19.928869 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gjk8k" podUID="c4a919cf-468a-4e2a-af70-435ec58bb60d" containerName="registry-server" probeResult="failure" output=< Mar 09 14:38:19 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s Mar 09 14:38:19 crc kubenswrapper[4722]: > Mar 09 14:38:21 crc kubenswrapper[4722]: I0309 14:38:21.527654 4722 patch_prober.go:28] interesting pod/machine-config-daemon-hjrrb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 14:38:21 crc kubenswrapper[4722]: I0309 14:38:21.528097 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 14:38:21 crc kubenswrapper[4722]: I0309 14:38:21.528159 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" Mar 09 14:38:21 crc kubenswrapper[4722]: I0309 14:38:21.529264 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4efa58158532ee2b76f83ca6efe53930357d25040e1664917766707f5e03ced2"} pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 14:38:21 crc kubenswrapper[4722]: I0309 14:38:21.529343 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerName="machine-config-daemon" containerID="cri-o://4efa58158532ee2b76f83ca6efe53930357d25040e1664917766707f5e03ced2" gracePeriod=600 Mar 09 14:38:21 crc kubenswrapper[4722]: I0309 14:38:21.934990 4722 generic.go:334] "Generic (PLEG): container finished" podID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerID="4efa58158532ee2b76f83ca6efe53930357d25040e1664917766707f5e03ced2" exitCode=0 Mar 09 14:38:21 crc kubenswrapper[4722]: I0309 14:38:21.935094 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" event={"ID":"dac2aaf5-653b-4b2a-8efe-ed26bac8d648","Type":"ContainerDied","Data":"4efa58158532ee2b76f83ca6efe53930357d25040e1664917766707f5e03ced2"} Mar 09 14:38:21 crc kubenswrapper[4722]: I0309 14:38:21.935949 4722 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" event={"ID":"dac2aaf5-653b-4b2a-8efe-ed26bac8d648","Type":"ContainerStarted","Data":"931310cf17f93e3962cdbbdf531169aefb3915147cff22375dd9c06d9d6b2906"} Mar 09 14:38:21 crc kubenswrapper[4722]: I0309 14:38:21.935979 4722 scope.go:117] "RemoveContainer" containerID="86a6dcdf7dcf4d48b8bdfb71d9a2c30b89f235700c424312c7e16580a86021d6" Mar 09 14:38:22 crc kubenswrapper[4722]: I0309 14:38:22.045864 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-cklld"] Mar 09 14:38:22 crc kubenswrapper[4722]: I0309 14:38:22.058059 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-cklld"] Mar 09 14:38:22 crc kubenswrapper[4722]: I0309 14:38:22.178383 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e25996e-4f6c-4135-b781-95eae749689e" path="/var/lib/kubelet/pods/8e25996e-4f6c-4135-b781-95eae749689e/volumes" Mar 09 14:38:26 crc kubenswrapper[4722]: I0309 14:38:26.851192 4722 scope.go:117] "RemoveContainer" containerID="866dea47edd6b6df9c06f06c280717c12a33b978612ab76a0a4e30d589b49aba" Mar 09 14:38:26 crc kubenswrapper[4722]: I0309 14:38:26.906614 4722 scope.go:117] "RemoveContainer" containerID="ab9bc4b07b84903b4fc451399f48da08c27672ca2db39820c74ff7fcc92fd841" Mar 09 14:38:26 crc kubenswrapper[4722]: I0309 14:38:26.972632 4722 scope.go:117] "RemoveContainer" containerID="d9ca2f07ecb9bf38ccd438169eed0c8afa14d0ac674a9f2f87af6aba969cea02" Mar 09 14:38:27 crc kubenswrapper[4722]: I0309 14:38:27.031189 4722 scope.go:117] "RemoveContainer" containerID="a76ff0649e5c91ef5ff4b1a327c4c8abcf7793a48f4877104cf10541009801e1" Mar 09 14:38:27 crc kubenswrapper[4722]: I0309 14:38:27.076770 4722 scope.go:117] "RemoveContainer" containerID="d634f9a55d0ddc3d3c58d5a7f5aa0b15dcc0956bd1f9c086901e198d8e8e3d3e" Mar 09 14:38:27 crc kubenswrapper[4722]: I0309 14:38:27.131321 4722 scope.go:117] "RemoveContainer" containerID="5cfcdbdecc8b8ae4a8263e25cb31ca5a7a3a52f2aea68b74ff60f3033bbdbc44" Mar 09 14:38:29 crc kubenswrapper[4722]: I0309 14:38:29.935470 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gjk8k" podUID="c4a919cf-468a-4e2a-af70-435ec58bb60d" containerName="registry-server" probeResult="failure" output=< Mar 09 14:38:29 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s Mar 09 14:38:29 crc kubenswrapper[4722]: > Mar 09 14:38:31 crc kubenswrapper[4722]: I0309 14:38:31.815292 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="4159e308-3ccf-45d9-a97b-8133542007a8" containerName="galera" probeResult="failure" output="command timed out" Mar 09 14:38:31 crc kubenswrapper[4722]: I0309 14:38:31.815910 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-cell1-galera-0" podUID="4159e308-3ccf-45d9-a97b-8133542007a8" containerName="galera" probeResult="failure" output="command timed out" Mar 09 14:38:31 crc kubenswrapper[4722]: I0309 14:38:31.871857 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-lvfgg" podUID="7a62b98d-e9d4-4cbc-bea8-0da13fcc4467" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.112:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 14:38:31 crc kubenswrapper[4722]: I0309 14:38:31.961177 4722 generic.go:334] "Generic (PLEG): 
container finished" podID="ecc0ba5d-ebce-4f74-b55c-284c3c6edc05" containerID="043362c79d87f10f332ad746412ed1837f196a4dc2d99deccdd0a7c1d1cb1dd6" exitCode=0 Mar 09 14:38:31 crc kubenswrapper[4722]: I0309 14:38:31.961504 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rtjd6" event={"ID":"ecc0ba5d-ebce-4f74-b55c-284c3c6edc05","Type":"ContainerDied","Data":"043362c79d87f10f332ad746412ed1837f196a4dc2d99deccdd0a7c1d1cb1dd6"} Mar 09 14:38:33 crc kubenswrapper[4722]: I0309 14:38:33.509530 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rtjd6" Mar 09 14:38:33 crc kubenswrapper[4722]: I0309 14:38:33.584808 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ecc0ba5d-ebce-4f74-b55c-284c3c6edc05-ssh-key-openstack-edpm-ipam\") pod \"ecc0ba5d-ebce-4f74-b55c-284c3c6edc05\" (UID: \"ecc0ba5d-ebce-4f74-b55c-284c3c6edc05\") " Mar 09 14:38:33 crc kubenswrapper[4722]: I0309 14:38:33.584930 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tvjf\" (UniqueName: \"kubernetes.io/projected/ecc0ba5d-ebce-4f74-b55c-284c3c6edc05-kube-api-access-6tvjf\") pod \"ecc0ba5d-ebce-4f74-b55c-284c3c6edc05\" (UID: \"ecc0ba5d-ebce-4f74-b55c-284c3c6edc05\") " Mar 09 14:38:33 crc kubenswrapper[4722]: I0309 14:38:33.585048 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ecc0ba5d-ebce-4f74-b55c-284c3c6edc05-inventory\") pod \"ecc0ba5d-ebce-4f74-b55c-284c3c6edc05\" (UID: \"ecc0ba5d-ebce-4f74-b55c-284c3c6edc05\") " Mar 09 14:38:33 crc kubenswrapper[4722]: I0309 14:38:33.593576 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecc0ba5d-ebce-4f74-b55c-284c3c6edc05-kube-api-access-6tvjf" (OuterVolumeSpecName: "kube-api-access-6tvjf") pod "ecc0ba5d-ebce-4f74-b55c-284c3c6edc05" (UID: "ecc0ba5d-ebce-4f74-b55c-284c3c6edc05"). InnerVolumeSpecName "kube-api-access-6tvjf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:38:33 crc kubenswrapper[4722]: I0309 14:38:33.636361 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecc0ba5d-ebce-4f74-b55c-284c3c6edc05-inventory" (OuterVolumeSpecName: "inventory") pod "ecc0ba5d-ebce-4f74-b55c-284c3c6edc05" (UID: "ecc0ba5d-ebce-4f74-b55c-284c3c6edc05"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:38:33 crc kubenswrapper[4722]: I0309 14:38:33.648771 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecc0ba5d-ebce-4f74-b55c-284c3c6edc05-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ecc0ba5d-ebce-4f74-b55c-284c3c6edc05" (UID: "ecc0ba5d-ebce-4f74-b55c-284c3c6edc05"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:38:33 crc kubenswrapper[4722]: I0309 14:38:33.686775 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6tvjf\" (UniqueName: \"kubernetes.io/projected/ecc0ba5d-ebce-4f74-b55c-284c3c6edc05-kube-api-access-6tvjf\") on node \"crc\" DevicePath \"\"" Mar 09 14:38:33 crc kubenswrapper[4722]: I0309 14:38:33.686820 4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ecc0ba5d-ebce-4f74-b55c-284c3c6edc05-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 14:38:33 crc kubenswrapper[4722]: I0309 14:38:33.686835 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ecc0ba5d-ebce-4f74-b55c-284c3c6edc05-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 14:38:33 crc kubenswrapper[4722]: I0309 14:38:33.982391 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rtjd6" event={"ID":"ecc0ba5d-ebce-4f74-b55c-284c3c6edc05","Type":"ContainerDied","Data":"b36fb27f1bbcab3d82b5a87a325f6c093dc8fc8f7474fbd63d061b3950655652"} Mar 09 14:38:33 crc kubenswrapper[4722]: I0309 14:38:33.982443 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b36fb27f1bbcab3d82b5a87a325f6c093dc8fc8f7474fbd63d061b3950655652" Mar 09 14:38:33 crc kubenswrapper[4722]: I0309 14:38:33.982452 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-rtjd6" Mar 09 14:38:34 crc kubenswrapper[4722]: I0309 14:38:34.085715 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tf98x"] Mar 09 14:38:34 crc kubenswrapper[4722]: E0309 14:38:34.086299 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a081f1f1-e065-4d4e-87b8-878c36dd0582" containerName="oc" Mar 09 14:38:34 crc kubenswrapper[4722]: I0309 14:38:34.086321 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a081f1f1-e065-4d4e-87b8-878c36dd0582" containerName="oc" Mar 09 14:38:34 crc kubenswrapper[4722]: E0309 14:38:34.086384 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecc0ba5d-ebce-4f74-b55c-284c3c6edc05" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 09 14:38:34 crc kubenswrapper[4722]: I0309 14:38:34.086394 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecc0ba5d-ebce-4f74-b55c-284c3c6edc05" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 09 14:38:34 crc kubenswrapper[4722]: I0309 14:38:34.086649 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecc0ba5d-ebce-4f74-b55c-284c3c6edc05" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 09 14:38:34 crc kubenswrapper[4722]: I0309 14:38:34.086681 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="a081f1f1-e065-4d4e-87b8-878c36dd0582" containerName="oc" Mar 09 14:38:34 crc kubenswrapper[4722]: I0309 14:38:34.087704 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tf98x" Mar 09 14:38:34 crc kubenswrapper[4722]: I0309 14:38:34.091507 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 14:38:34 crc kubenswrapper[4722]: I0309 14:38:34.091578 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 14:38:34 crc kubenswrapper[4722]: I0309 14:38:34.091945 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dwlbg" Mar 09 14:38:34 crc kubenswrapper[4722]: I0309 14:38:34.093173 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 14:38:34 crc kubenswrapper[4722]: I0309 14:38:34.102111 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tf98x"] Mar 09 14:38:34 crc kubenswrapper[4722]: I0309 14:38:34.197523 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfxnx\" (UniqueName: \"kubernetes.io/projected/0b760182-c6d3-4f80-8f18-89b16c3c480d-kube-api-access-lfxnx\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-tf98x\" (UID: \"0b760182-c6d3-4f80-8f18-89b16c3c480d\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tf98x" Mar 09 14:38:34 crc kubenswrapper[4722]: I0309 14:38:34.199678 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b760182-c6d3-4f80-8f18-89b16c3c480d-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-tf98x\" (UID: \"0b760182-c6d3-4f80-8f18-89b16c3c480d\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tf98x" Mar 09 14:38:34 crc kubenswrapper[4722]: I0309 14:38:34.199807 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0b760182-c6d3-4f80-8f18-89b16c3c480d-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-tf98x\" (UID: \"0b760182-c6d3-4f80-8f18-89b16c3c480d\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tf98x" Mar 09 14:38:34 crc kubenswrapper[4722]: I0309 14:38:34.300936 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0b760182-c6d3-4f80-8f18-89b16c3c480d-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-tf98x\" (UID: \"0b760182-c6d3-4f80-8f18-89b16c3c480d\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tf98x" Mar 09 14:38:34 crc kubenswrapper[4722]: I0309 14:38:34.302140 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfxnx\" (UniqueName: \"kubernetes.io/projected/0b760182-c6d3-4f80-8f18-89b16c3c480d-kube-api-access-lfxnx\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-tf98x\" (UID: \"0b760182-c6d3-4f80-8f18-89b16c3c480d\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tf98x" Mar 09 14:38:34 crc kubenswrapper[4722]: I0309 14:38:34.302712 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/0b760182-c6d3-4f80-8f18-89b16c3c480d-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-tf98x\" (UID: \"0b760182-c6d3-4f80-8f18-89b16c3c480d\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tf98x" Mar 09 14:38:34 crc kubenswrapper[4722]: I0309 14:38:34.306288 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0b760182-c6d3-4f80-8f18-89b16c3c480d-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-tf98x\" (UID: \"0b760182-c6d3-4f80-8f18-89b16c3c480d\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tf98x" Mar 09 14:38:34 crc kubenswrapper[4722]: I0309 14:38:34.306531 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b760182-c6d3-4f80-8f18-89b16c3c480d-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-tf98x\" (UID: \"0b760182-c6d3-4f80-8f18-89b16c3c480d\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tf98x" Mar 09 14:38:34 crc kubenswrapper[4722]: I0309 14:38:34.320658 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfxnx\" (UniqueName: \"kubernetes.io/projected/0b760182-c6d3-4f80-8f18-89b16c3c480d-kube-api-access-lfxnx\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-tf98x\" (UID: \"0b760182-c6d3-4f80-8f18-89b16c3c480d\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tf98x" Mar 09 14:38:34 crc kubenswrapper[4722]: I0309 14:38:34.411687 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tf98x" Mar 09 14:38:35 crc kubenswrapper[4722]: I0309 14:38:35.115145 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tf98x"] Mar 09 14:38:36 crc kubenswrapper[4722]: I0309 14:38:36.022447 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tf98x" event={"ID":"0b760182-c6d3-4f80-8f18-89b16c3c480d","Type":"ContainerStarted","Data":"5871dfc61981a21acddd5d5180a9cab949733e9813323761295203f9217d32fe"} Mar 09 14:38:37 crc kubenswrapper[4722]: I0309 14:38:37.036424 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tf98x" event={"ID":"0b760182-c6d3-4f80-8f18-89b16c3c480d","Type":"ContainerStarted","Data":"d43768c67e3cdf5cf17c0375c7cebccba2eef1761939dabb8a6e0e5896c7665e"} Mar 09 14:38:39 crc kubenswrapper[4722]: I0309 14:38:39.953734 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gjk8k" podUID="c4a919cf-468a-4e2a-af70-435ec58bb60d" containerName="registry-server" probeResult="failure" output=< Mar 09 14:38:39 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s Mar 09 14:38:39 crc kubenswrapper[4722]: > Mar 09 14:38:41 crc kubenswrapper[4722]: I0309 14:38:41.077545 4722 generic.go:334] "Generic (PLEG): container finished" podID="0b760182-c6d3-4f80-8f18-89b16c3c480d" containerID="d43768c67e3cdf5cf17c0375c7cebccba2eef1761939dabb8a6e0e5896c7665e" exitCode=0 Mar 09 14:38:41 crc kubenswrapper[4722]: I0309 14:38:41.077635 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tf98x" event={"ID":"0b760182-c6d3-4f80-8f18-89b16c3c480d","Type":"ContainerDied","Data":"d43768c67e3cdf5cf17c0375c7cebccba2eef1761939dabb8a6e0e5896c7665e"} Mar 09 14:38:42 crc kubenswrapper[4722]: I0309 14:38:42.629578 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tf98x" Mar 09 14:38:42 crc kubenswrapper[4722]: I0309 14:38:42.653395 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lfxnx\" (UniqueName: \"kubernetes.io/projected/0b760182-c6d3-4f80-8f18-89b16c3c480d-kube-api-access-lfxnx\") pod \"0b760182-c6d3-4f80-8f18-89b16c3c480d\" (UID: \"0b760182-c6d3-4f80-8f18-89b16c3c480d\") " Mar 09 14:38:42 crc kubenswrapper[4722]: I0309 14:38:42.655547 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0b760182-c6d3-4f80-8f18-89b16c3c480d-ssh-key-openstack-edpm-ipam\") pod \"0b760182-c6d3-4f80-8f18-89b16c3c480d\" (UID: \"0b760182-c6d3-4f80-8f18-89b16c3c480d\") " Mar 09 14:38:42 crc kubenswrapper[4722]: I0309 14:38:42.655871 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b760182-c6d3-4f80-8f18-89b16c3c480d-inventory\") pod \"0b760182-c6d3-4f80-8f18-89b16c3c480d\" (UID: \"0b760182-c6d3-4f80-8f18-89b16c3c480d\") " Mar 09 14:38:42 crc kubenswrapper[4722]: I0309 14:38:42.713075 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b760182-c6d3-4f80-8f18-89b16c3c480d-kube-api-access-lfxnx" (OuterVolumeSpecName: "kube-api-access-lfxnx") pod "0b760182-c6d3-4f80-8f18-89b16c3c480d" (UID: "0b760182-c6d3-4f80-8f18-89b16c3c480d"). InnerVolumeSpecName "kube-api-access-lfxnx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:38:42 crc kubenswrapper[4722]: I0309 14:38:42.719471 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b760182-c6d3-4f80-8f18-89b16c3c480d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "0b760182-c6d3-4f80-8f18-89b16c3c480d" (UID: "0b760182-c6d3-4f80-8f18-89b16c3c480d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:38:42 crc kubenswrapper[4722]: I0309 14:38:42.719603 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b760182-c6d3-4f80-8f18-89b16c3c480d-inventory" (OuterVolumeSpecName: "inventory") pod "0b760182-c6d3-4f80-8f18-89b16c3c480d" (UID: "0b760182-c6d3-4f80-8f18-89b16c3c480d"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:38:42 crc kubenswrapper[4722]: I0309 14:38:42.811364 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0b760182-c6d3-4f80-8f18-89b16c3c480d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 14:38:42 crc kubenswrapper[4722]: I0309 14:38:42.811946 4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0b760182-c6d3-4f80-8f18-89b16c3c480d-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 14:38:42 crc kubenswrapper[4722]: I0309 14:38:42.812025 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lfxnx\" (UniqueName: \"kubernetes.io/projected/0b760182-c6d3-4f80-8f18-89b16c3c480d-kube-api-access-lfxnx\") on node \"crc\" DevicePath \"\"" Mar 09 14:38:43 crc kubenswrapper[4722]: I0309 14:38:43.103706 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tf98x" event={"ID":"0b760182-c6d3-4f80-8f18-89b16c3c480d","Type":"ContainerDied","Data":"5871dfc61981a21acddd5d5180a9cab949733e9813323761295203f9217d32fe"} Mar 09 14:38:43 crc kubenswrapper[4722]: I0309 14:38:43.103754 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5871dfc61981a21acddd5d5180a9cab949733e9813323761295203f9217d32fe" Mar 09 14:38:43 crc kubenswrapper[4722]: I0309 14:38:43.103871 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-tf98x" Mar 09 14:38:43 crc kubenswrapper[4722]: I0309 14:38:43.243301 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-745rg"] Mar 09 14:38:43 crc kubenswrapper[4722]: E0309 14:38:43.244079 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b760182-c6d3-4f80-8f18-89b16c3c480d" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 09 14:38:43 crc kubenswrapper[4722]: I0309 14:38:43.244106 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b760182-c6d3-4f80-8f18-89b16c3c480d" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 09 14:38:43 crc kubenswrapper[4722]: I0309 14:38:43.244434 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b760182-c6d3-4f80-8f18-89b16c3c480d" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 09 14:38:43 crc kubenswrapper[4722]: I0309 14:38:43.245563 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-745rg" Mar 09 14:38:43 crc kubenswrapper[4722]: I0309 14:38:43.252322 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 14:38:43 crc kubenswrapper[4722]: I0309 14:38:43.252599 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 14:38:43 crc kubenswrapper[4722]: I0309 14:38:43.252848 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 14:38:43 crc kubenswrapper[4722]: I0309 14:38:43.253119 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dwlbg" Mar 09 14:38:43 crc kubenswrapper[4722]: I0309 14:38:43.263226 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-745rg"] Mar 09 14:38:43 crc kubenswrapper[4722]: I0309 14:38:43.324738 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fb7d79c1-83d1-4e4a-aab1-7f9e069e6442-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-745rg\" (UID: \"fb7d79c1-83d1-4e4a-aab1-7f9e069e6442\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-745rg" Mar 09 14:38:43 crc kubenswrapper[4722]: I0309 14:38:43.324824 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fb7d79c1-83d1-4e4a-aab1-7f9e069e6442-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-745rg\" (UID: \"fb7d79c1-83d1-4e4a-aab1-7f9e069e6442\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-745rg" Mar 09 14:38:43 crc kubenswrapper[4722]: I0309 14:38:43.325049 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzzkx\" (UniqueName: \"kubernetes.io/projected/fb7d79c1-83d1-4e4a-aab1-7f9e069e6442-kube-api-access-xzzkx\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-745rg\" (UID: \"fb7d79c1-83d1-4e4a-aab1-7f9e069e6442\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-745rg" Mar 09 14:38:43 crc kubenswrapper[4722]: I0309 14:38:43.428073 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzzkx\" (UniqueName: \"kubernetes.io/projected/fb7d79c1-83d1-4e4a-aab1-7f9e069e6442-kube-api-access-xzzkx\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-745rg\" (UID: \"fb7d79c1-83d1-4e4a-aab1-7f9e069e6442\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-745rg" Mar 09 14:38:43 crc kubenswrapper[4722]: I0309 14:38:43.428526 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fb7d79c1-83d1-4e4a-aab1-7f9e069e6442-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-745rg\" (UID: \"fb7d79c1-83d1-4e4a-aab1-7f9e069e6442\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-745rg" Mar 09 14:38:43 crc kubenswrapper[4722]: I0309 14:38:43.428597 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fb7d79c1-83d1-4e4a-aab1-7f9e069e6442-ssh-key-openstack-edpm-ipam\") pod 
\"install-os-edpm-deployment-openstack-edpm-ipam-745rg\" (UID: \"fb7d79c1-83d1-4e4a-aab1-7f9e069e6442\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-745rg" Mar 09 14:38:43 crc kubenswrapper[4722]: I0309 14:38:43.437147 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fb7d79c1-83d1-4e4a-aab1-7f9e069e6442-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-745rg\" (UID: \"fb7d79c1-83d1-4e4a-aab1-7f9e069e6442\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-745rg" Mar 09 14:38:43 crc kubenswrapper[4722]: I0309 14:38:43.437519 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fb7d79c1-83d1-4e4a-aab1-7f9e069e6442-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-745rg\" (UID: \"fb7d79c1-83d1-4e4a-aab1-7f9e069e6442\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-745rg" Mar 09 14:38:43 crc kubenswrapper[4722]: I0309 14:38:43.455173 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzzkx\" (UniqueName: \"kubernetes.io/projected/fb7d79c1-83d1-4e4a-aab1-7f9e069e6442-kube-api-access-xzzkx\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-745rg\" (UID: \"fb7d79c1-83d1-4e4a-aab1-7f9e069e6442\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-745rg" Mar 09 14:38:43 crc kubenswrapper[4722]: I0309 14:38:43.570810 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-745rg" Mar 09 14:38:44 crc kubenswrapper[4722]: I0309 14:38:44.223687 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-745rg"] Mar 09 14:38:45 crc kubenswrapper[4722]: I0309 14:38:45.129650 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-745rg" event={"ID":"fb7d79c1-83d1-4e4a-aab1-7f9e069e6442","Type":"ContainerStarted","Data":"4413d6b12a9a564b582fe4895b60b4aa986e75b72e6467448fa77b9259910fe1"} Mar 09 14:38:46 crc kubenswrapper[4722]: I0309 14:38:46.145299 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-745rg" event={"ID":"fb7d79c1-83d1-4e4a-aab1-7f9e069e6442","Type":"ContainerStarted","Data":"0d8e596faa553e520fc12ea9668b6f8c252ee15bfb126e0fbc1e105b23f49f2e"} Mar 09 14:38:46 crc kubenswrapper[4722]: I0309 14:38:46.169190 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-745rg" podStartSLOduration=2.6370053159999998 podStartE2EDuration="3.169170242s" podCreationTimestamp="2026-03-09 14:38:43 +0000 UTC" firstStartedPulling="2026-03-09 14:38:44.234277934 +0000 UTC m=+2164.789846510" lastFinishedPulling="2026-03-09 14:38:44.76644285 +0000 UTC m=+2165.322011436" observedRunningTime="2026-03-09 14:38:46.16543941 +0000 UTC m=+2166.721008016" watchObservedRunningTime="2026-03-09 14:38:46.169170242 +0000 UTC m=+2166.724738818" Mar 09 14:38:48 crc kubenswrapper[4722]: I0309 14:38:48.925860 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gjk8k" Mar 09 14:38:48 crc kubenswrapper[4722]: I0309 14:38:48.983758 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-operators-gjk8k" Mar 09 14:38:49 crc kubenswrapper[4722]: I0309 14:38:49.819679 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gjk8k"] Mar 09 14:38:50 crc kubenswrapper[4722]: I0309 14:38:50.185412 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gjk8k" podUID="c4a919cf-468a-4e2a-af70-435ec58bb60d" containerName="registry-server" containerID="cri-o://3b8350b366504445d85bd5584989bafdd74b15b7fda2591796d68625c013d986" gracePeriod=2 Mar 09 14:38:51 crc kubenswrapper[4722]: I0309 14:38:51.188169 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gjk8k" Mar 09 14:38:51 crc kubenswrapper[4722]: I0309 14:38:51.199182 4722 generic.go:334] "Generic (PLEG): container finished" podID="c4a919cf-468a-4e2a-af70-435ec58bb60d" containerID="3b8350b366504445d85bd5584989bafdd74b15b7fda2591796d68625c013d986" exitCode=0 Mar 09 14:38:51 crc kubenswrapper[4722]: I0309 14:38:51.199277 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gjk8k" Mar 09 14:38:51 crc kubenswrapper[4722]: I0309 14:38:51.199275 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gjk8k" event={"ID":"c4a919cf-468a-4e2a-af70-435ec58bb60d","Type":"ContainerDied","Data":"3b8350b366504445d85bd5584989bafdd74b15b7fda2591796d68625c013d986"} Mar 09 14:38:51 crc kubenswrapper[4722]: I0309 14:38:51.199380 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gjk8k" event={"ID":"c4a919cf-468a-4e2a-af70-435ec58bb60d","Type":"ContainerDied","Data":"67d7d7e43d9275bbd56035ceece14b4340455f8d681896d31d8faed5003aeeab"} Mar 09 14:38:51 crc kubenswrapper[4722]: I0309 14:38:51.199409 4722 scope.go:117] "RemoveContainer" containerID="3b8350b366504445d85bd5584989bafdd74b15b7fda2591796d68625c013d986" Mar 09 14:38:51 crc kubenswrapper[4722]: I0309 14:38:51.233732 4722 scope.go:117] "RemoveContainer" containerID="503147cf596933e09726925d8c48836d90d5f5e12f8617ad228899ef488a351d" Mar 09 14:38:51 crc kubenswrapper[4722]: I0309 14:38:51.270463 4722 scope.go:117] "RemoveContainer" containerID="9503aef5a873cc8d98aa5afc6457f31c43855dbb5acae55935f466757d8277cf" Mar 09 14:38:51 crc kubenswrapper[4722]: I0309 14:38:51.318892 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmz84\" (UniqueName: \"kubernetes.io/projected/c4a919cf-468a-4e2a-af70-435ec58bb60d-kube-api-access-rmz84\") pod \"c4a919cf-468a-4e2a-af70-435ec58bb60d\" (UID: \"c4a919cf-468a-4e2a-af70-435ec58bb60d\") " Mar 09 14:38:51 crc kubenswrapper[4722]: I0309 14:38:51.318972 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4a919cf-468a-4e2a-af70-435ec58bb60d-catalog-content\") pod \"c4a919cf-468a-4e2a-af70-435ec58bb60d\" (UID: \"c4a919cf-468a-4e2a-af70-435ec58bb60d\") " Mar 09 14:38:51 crc kubenswrapper[4722]: I0309 14:38:51.319042 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4a919cf-468a-4e2a-af70-435ec58bb60d-utilities\") pod \"c4a919cf-468a-4e2a-af70-435ec58bb60d\" (UID: \"c4a919cf-468a-4e2a-af70-435ec58bb60d\") " Mar 09 14:38:51 crc kubenswrapper[4722]: I0309 14:38:51.322051 4722 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4a919cf-468a-4e2a-af70-435ec58bb60d-utilities" (OuterVolumeSpecName: "utilities") pod "c4a919cf-468a-4e2a-af70-435ec58bb60d" (UID: "c4a919cf-468a-4e2a-af70-435ec58bb60d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:38:51 crc kubenswrapper[4722]: I0309 14:38:51.328477 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4a919cf-468a-4e2a-af70-435ec58bb60d-kube-api-access-rmz84" (OuterVolumeSpecName: "kube-api-access-rmz84") pod "c4a919cf-468a-4e2a-af70-435ec58bb60d" (UID: "c4a919cf-468a-4e2a-af70-435ec58bb60d"). InnerVolumeSpecName "kube-api-access-rmz84". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:38:51 crc kubenswrapper[4722]: I0309 14:38:51.332571 4722 scope.go:117] "RemoveContainer" containerID="3b8350b366504445d85bd5584989bafdd74b15b7fda2591796d68625c013d986" Mar 09 14:38:51 crc kubenswrapper[4722]: E0309 14:38:51.333278 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b8350b366504445d85bd5584989bafdd74b15b7fda2591796d68625c013d986\": container with ID starting with 3b8350b366504445d85bd5584989bafdd74b15b7fda2591796d68625c013d986 not found: ID does not exist" containerID="3b8350b366504445d85bd5584989bafdd74b15b7fda2591796d68625c013d986" Mar 09 14:38:51 crc kubenswrapper[4722]: I0309 14:38:51.333327 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b8350b366504445d85bd5584989bafdd74b15b7fda2591796d68625c013d986"} err="failed to get container status \"3b8350b366504445d85bd5584989bafdd74b15b7fda2591796d68625c013d986\": rpc error: code = NotFound desc = could not find container \"3b8350b366504445d85bd5584989bafdd74b15b7fda2591796d68625c013d986\": container with ID starting with 3b8350b366504445d85bd5584989bafdd74b15b7fda2591796d68625c013d986 not found: ID does not exist" Mar 09 14:38:51 crc kubenswrapper[4722]: I0309 14:38:51.333353 4722 scope.go:117] "RemoveContainer" containerID="503147cf596933e09726925d8c48836d90d5f5e12f8617ad228899ef488a351d" Mar 09 14:38:51 crc kubenswrapper[4722]: E0309 14:38:51.334385 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"503147cf596933e09726925d8c48836d90d5f5e12f8617ad228899ef488a351d\": container with ID starting with 503147cf596933e09726925d8c48836d90d5f5e12f8617ad228899ef488a351d not found: ID does not exist" containerID="503147cf596933e09726925d8c48836d90d5f5e12f8617ad228899ef488a351d" Mar 09 14:38:51 crc kubenswrapper[4722]: I0309 14:38:51.334446 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"503147cf596933e09726925d8c48836d90d5f5e12f8617ad228899ef488a351d"} err="failed to get container status \"503147cf596933e09726925d8c48836d90d5f5e12f8617ad228899ef488a351d\": rpc error: code = NotFound desc = could not find container \"503147cf596933e09726925d8c48836d90d5f5e12f8617ad228899ef488a351d\": container with ID starting with 503147cf596933e09726925d8c48836d90d5f5e12f8617ad228899ef488a351d not found: ID does not exist" Mar 09 14:38:51 crc kubenswrapper[4722]: I0309 14:38:51.334478 4722 scope.go:117] "RemoveContainer" containerID="9503aef5a873cc8d98aa5afc6457f31c43855dbb5acae55935f466757d8277cf" Mar 09 14:38:51 crc kubenswrapper[4722]: E0309 14:38:51.335031 4722 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9503aef5a873cc8d98aa5afc6457f31c43855dbb5acae55935f466757d8277cf\": container with ID starting with 9503aef5a873cc8d98aa5afc6457f31c43855dbb5acae55935f466757d8277cf not found: ID does not exist" containerID="9503aef5a873cc8d98aa5afc6457f31c43855dbb5acae55935f466757d8277cf" Mar 09 14:38:51 crc kubenswrapper[4722]: I0309 14:38:51.335088 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9503aef5a873cc8d98aa5afc6457f31c43855dbb5acae55935f466757d8277cf"} err="failed to get container status \"9503aef5a873cc8d98aa5afc6457f31c43855dbb5acae55935f466757d8277cf\": rpc error: code = NotFound desc = could not find container \"9503aef5a873cc8d98aa5afc6457f31c43855dbb5acae55935f466757d8277cf\": container with ID starting with 9503aef5a873cc8d98aa5afc6457f31c43855dbb5acae55935f466757d8277cf not found: ID does not exist" Mar 09 14:38:51 crc kubenswrapper[4722]: I0309 14:38:51.424263 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmz84\" (UniqueName: \"kubernetes.io/projected/c4a919cf-468a-4e2a-af70-435ec58bb60d-kube-api-access-rmz84\") on node \"crc\" DevicePath \"\"" Mar 09 14:38:51 crc kubenswrapper[4722]: I0309 14:38:51.424563 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4a919cf-468a-4e2a-af70-435ec58bb60d-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 14:38:51 crc kubenswrapper[4722]: I0309 14:38:51.455773 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4a919cf-468a-4e2a-af70-435ec58bb60d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c4a919cf-468a-4e2a-af70-435ec58bb60d" (UID: "c4a919cf-468a-4e2a-af70-435ec58bb60d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:38:51 crc kubenswrapper[4722]: I0309 14:38:51.527559 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4a919cf-468a-4e2a-af70-435ec58bb60d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 14:38:51 crc kubenswrapper[4722]: I0309 14:38:51.541136 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gjk8k"] Mar 09 14:38:51 crc kubenswrapper[4722]: I0309 14:38:51.552734 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gjk8k"] Mar 09 14:38:52 crc kubenswrapper[4722]: I0309 14:38:52.176109 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4a919cf-468a-4e2a-af70-435ec58bb60d" path="/var/lib/kubelet/pods/c4a919cf-468a-4e2a-af70-435ec58bb60d/volumes" Mar 09 14:39:04 crc kubenswrapper[4722]: I0309 14:39:04.053170 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-rll7l"] Mar 09 14:39:04 crc kubenswrapper[4722]: I0309 14:39:04.066872 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-rll7l"] Mar 09 14:39:04 crc kubenswrapper[4722]: I0309 14:39:04.173220 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25a37fbc-0819-4a2b-b782-d566e3f765d7" path="/var/lib/kubelet/pods/25a37fbc-0819-4a2b-b782-d566e3f765d7/volumes" Mar 09 14:39:22 crc kubenswrapper[4722]: I0309 14:39:22.564448 4722 generic.go:334] "Generic (PLEG): container finished" podID="fb7d79c1-83d1-4e4a-aab1-7f9e069e6442" containerID="0d8e596faa553e520fc12ea9668b6f8c252ee15bfb126e0fbc1e105b23f49f2e" exitCode=0 Mar 09 14:39:22 crc kubenswrapper[4722]: I0309 14:39:22.564521 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-745rg" event={"ID":"fb7d79c1-83d1-4e4a-aab1-7f9e069e6442","Type":"ContainerDied","Data":"0d8e596faa553e520fc12ea9668b6f8c252ee15bfb126e0fbc1e105b23f49f2e"} Mar 09 14:39:24 crc kubenswrapper[4722]: I0309 14:39:24.157983 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-745rg" Mar 09 14:39:24 crc kubenswrapper[4722]: I0309 14:39:24.289405 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzzkx\" (UniqueName: \"kubernetes.io/projected/fb7d79c1-83d1-4e4a-aab1-7f9e069e6442-kube-api-access-xzzkx\") pod \"fb7d79c1-83d1-4e4a-aab1-7f9e069e6442\" (UID: \"fb7d79c1-83d1-4e4a-aab1-7f9e069e6442\") " Mar 09 14:39:24 crc kubenswrapper[4722]: I0309 14:39:24.289579 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fb7d79c1-83d1-4e4a-aab1-7f9e069e6442-ssh-key-openstack-edpm-ipam\") pod \"fb7d79c1-83d1-4e4a-aab1-7f9e069e6442\" (UID: \"fb7d79c1-83d1-4e4a-aab1-7f9e069e6442\") " Mar 09 14:39:24 crc kubenswrapper[4722]: I0309 14:39:24.289737 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fb7d79c1-83d1-4e4a-aab1-7f9e069e6442-inventory\") pod \"fb7d79c1-83d1-4e4a-aab1-7f9e069e6442\" (UID: \"fb7d79c1-83d1-4e4a-aab1-7f9e069e6442\") " Mar 09 14:39:24 crc kubenswrapper[4722]: I0309 14:39:24.301467 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb7d79c1-83d1-4e4a-aab1-7f9e069e6442-kube-api-access-xzzkx" (OuterVolumeSpecName: "kube-api-access-xzzkx") pod "fb7d79c1-83d1-4e4a-aab1-7f9e069e6442" (UID: "fb7d79c1-83d1-4e4a-aab1-7f9e069e6442"). InnerVolumeSpecName "kube-api-access-xzzkx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:39:24 crc kubenswrapper[4722]: I0309 14:39:24.328337 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb7d79c1-83d1-4e4a-aab1-7f9e069e6442-inventory" (OuterVolumeSpecName: "inventory") pod "fb7d79c1-83d1-4e4a-aab1-7f9e069e6442" (UID: "fb7d79c1-83d1-4e4a-aab1-7f9e069e6442"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:39:24 crc kubenswrapper[4722]: I0309 14:39:24.328753 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb7d79c1-83d1-4e4a-aab1-7f9e069e6442-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "fb7d79c1-83d1-4e4a-aab1-7f9e069e6442" (UID: "fb7d79c1-83d1-4e4a-aab1-7f9e069e6442"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
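Volume teardown for the finished install-os pod runs through a fixed sequence: reconciler_common logs "UnmountVolume started", operation_generator logs "UnmountVolume.TearDown succeeded", and reconciler_common closes with "Volume detached". A sketch that checks the endpoints of that sequence and reports volumes whose unmount started but never reached detached (stuck_unmounts is an illustrative name; note the quotes are backslash-escaped in the raw log):

import re

STARTED  = re.compile(r'UnmountVolume started for volume \\"(.+?)\\"')
DETACHED = re.compile(r'Volume detached for volume \\"(.+?)\\"')

def stuck_unmounts(lines):
    """Volumes whose unmount started but never reached 'detached'."""
    pending = set()
    for line in lines:
        m = STARTED.search(line)
        if m:
            pending.add(m.group(1))
        m = DETACHED.search(line)
        if m:
            pending.discard(m.group(1))
    return pending
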
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:39:24 crc kubenswrapper[4722]: I0309 14:39:24.393417 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xzzkx\" (UniqueName: \"kubernetes.io/projected/fb7d79c1-83d1-4e4a-aab1-7f9e069e6442-kube-api-access-xzzkx\") on node \"crc\" DevicePath \"\"" Mar 09 14:39:24 crc kubenswrapper[4722]: I0309 14:39:24.393641 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fb7d79c1-83d1-4e4a-aab1-7f9e069e6442-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 14:39:24 crc kubenswrapper[4722]: I0309 14:39:24.393700 4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fb7d79c1-83d1-4e4a-aab1-7f9e069e6442-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 14:39:24 crc kubenswrapper[4722]: I0309 14:39:24.589320 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-745rg" event={"ID":"fb7d79c1-83d1-4e4a-aab1-7f9e069e6442","Type":"ContainerDied","Data":"4413d6b12a9a564b582fe4895b60b4aa986e75b72e6467448fa77b9259910fe1"} Mar 09 14:39:24 crc kubenswrapper[4722]: I0309 14:39:24.589710 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4413d6b12a9a564b582fe4895b60b4aa986e75b72e6467448fa77b9259910fe1" Mar 09 14:39:24 crc kubenswrapper[4722]: I0309 14:39:24.589485 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-745rg" Mar 09 14:39:24 crc kubenswrapper[4722]: I0309 14:39:24.700694 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rpdzb"] Mar 09 14:39:24 crc kubenswrapper[4722]: E0309 14:39:24.701241 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4a919cf-468a-4e2a-af70-435ec58bb60d" containerName="registry-server" Mar 09 14:39:24 crc kubenswrapper[4722]: I0309 14:39:24.701260 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4a919cf-468a-4e2a-af70-435ec58bb60d" containerName="registry-server" Mar 09 14:39:24 crc kubenswrapper[4722]: E0309 14:39:24.701283 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4a919cf-468a-4e2a-af70-435ec58bb60d" containerName="extract-content" Mar 09 14:39:24 crc kubenswrapper[4722]: I0309 14:39:24.701292 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4a919cf-468a-4e2a-af70-435ec58bb60d" containerName="extract-content" Mar 09 14:39:24 crc kubenswrapper[4722]: E0309 14:39:24.701334 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4a919cf-468a-4e2a-af70-435ec58bb60d" containerName="extract-utilities" Mar 09 14:39:24 crc kubenswrapper[4722]: I0309 14:39:24.701353 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4a919cf-468a-4e2a-af70-435ec58bb60d" containerName="extract-utilities" Mar 09 14:39:24 crc kubenswrapper[4722]: E0309 14:39:24.701385 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb7d79c1-83d1-4e4a-aab1-7f9e069e6442" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 09 14:39:24 crc kubenswrapper[4722]: I0309 14:39:24.701394 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb7d79c1-83d1-4e4a-aab1-7f9e069e6442" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 09 14:39:24 crc kubenswrapper[4722]: I0309 14:39:24.701714 
4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4a919cf-468a-4e2a-af70-435ec58bb60d" containerName="registry-server" Mar 09 14:39:24 crc kubenswrapper[4722]: I0309 14:39:24.701756 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb7d79c1-83d1-4e4a-aab1-7f9e069e6442" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 09 14:39:24 crc kubenswrapper[4722]: I0309 14:39:24.702796 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rpdzb" Mar 09 14:39:24 crc kubenswrapper[4722]: I0309 14:39:24.706565 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 14:39:24 crc kubenswrapper[4722]: I0309 14:39:24.706831 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dwlbg" Mar 09 14:39:24 crc kubenswrapper[4722]: I0309 14:39:24.706989 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 14:39:24 crc kubenswrapper[4722]: I0309 14:39:24.708534 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 14:39:24 crc kubenswrapper[4722]: I0309 14:39:24.717657 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rpdzb"] Mar 09 14:39:24 crc kubenswrapper[4722]: I0309 14:39:24.803149 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjxxw\" (UniqueName: \"kubernetes.io/projected/1ece0886-09c8-4a9e-a309-5d538fefed94-kube-api-access-pjxxw\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-rpdzb\" (UID: \"1ece0886-09c8-4a9e-a309-5d538fefed94\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rpdzb" Mar 09 14:39:24 crc kubenswrapper[4722]: I0309 14:39:24.803438 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1ece0886-09c8-4a9e-a309-5d538fefed94-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-rpdzb\" (UID: \"1ece0886-09c8-4a9e-a309-5d538fefed94\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rpdzb" Mar 09 14:39:24 crc kubenswrapper[4722]: I0309 14:39:24.803479 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1ece0886-09c8-4a9e-a309-5d538fefed94-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-rpdzb\" (UID: \"1ece0886-09c8-4a9e-a309-5d538fefed94\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rpdzb" Mar 09 14:39:24 crc kubenswrapper[4722]: I0309 14:39:24.906817 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1ece0886-09c8-4a9e-a309-5d538fefed94-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-rpdzb\" (UID: \"1ece0886-09c8-4a9e-a309-5d538fefed94\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rpdzb" Mar 09 14:39:24 crc kubenswrapper[4722]: I0309 14:39:24.906893 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
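On the mount side the pipeline mirrors the unmount one: "VerifyControllerAttachedVolume started" for each declared volume, then "MountVolume started", then "MountVolume.SetUp succeeded". A sketch that measures the time each volume spends in that pipeline using the microsecond klog timestamps (mount_latency is an illustrative name; in real use the key should include the pod UID, since names like "inventory" recur across pods):

import re
from datetime import datetime

PHASE = re.compile(
    r'I(\d{4}) (\d{2}:\d{2}:\d{2}\.\d{6}) .*?'
    r'(VerifyControllerAttachedVolume started|MountVolume\.SetUp succeeded)'
    r' for volume \\"(.+?)\\"')

def mount_latency(lines, year=2026):
    start, done = {}, {}
    for line in lines:
        m = PHASE.search(line)
        if not m:
            continue
        mmdd, hms, phase, vol = m.groups()
        ts = datetime.strptime(f'{year}{mmdd} {hms}', '%Y%m%d %H:%M:%S.%f')
        (start if phase.startswith('Verify') else done)[vol] = ts
    return {v: (done[v] - start[v]).total_seconds()
            for v in start.keys() & done.keys()}

# For kube-api-access-pjxxw above: 24.803149 -> 24.929146, about 0.126 s.
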
\"kubernetes.io/secret/1ece0886-09c8-4a9e-a309-5d538fefed94-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-rpdzb\" (UID: \"1ece0886-09c8-4a9e-a309-5d538fefed94\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rpdzb" Mar 09 14:39:24 crc kubenswrapper[4722]: I0309 14:39:24.907136 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjxxw\" (UniqueName: \"kubernetes.io/projected/1ece0886-09c8-4a9e-a309-5d538fefed94-kube-api-access-pjxxw\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-rpdzb\" (UID: \"1ece0886-09c8-4a9e-a309-5d538fefed94\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rpdzb" Mar 09 14:39:24 crc kubenswrapper[4722]: I0309 14:39:24.911754 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1ece0886-09c8-4a9e-a309-5d538fefed94-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-rpdzb\" (UID: \"1ece0886-09c8-4a9e-a309-5d538fefed94\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rpdzb" Mar 09 14:39:24 crc kubenswrapper[4722]: I0309 14:39:24.911792 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1ece0886-09c8-4a9e-a309-5d538fefed94-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-rpdzb\" (UID: \"1ece0886-09c8-4a9e-a309-5d538fefed94\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rpdzb" Mar 09 14:39:24 crc kubenswrapper[4722]: I0309 14:39:24.929146 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjxxw\" (UniqueName: \"kubernetes.io/projected/1ece0886-09c8-4a9e-a309-5d538fefed94-kube-api-access-pjxxw\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-rpdzb\" (UID: \"1ece0886-09c8-4a9e-a309-5d538fefed94\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rpdzb" Mar 09 14:39:25 crc kubenswrapper[4722]: I0309 14:39:25.036535 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rpdzb" Mar 09 14:39:25 crc kubenswrapper[4722]: I0309 14:39:25.610543 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rpdzb"] Mar 09 14:39:25 crc kubenswrapper[4722]: I0309 14:39:25.614990 4722 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 14:39:26 crc kubenswrapper[4722]: I0309 14:39:26.629657 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rpdzb" event={"ID":"1ece0886-09c8-4a9e-a309-5d538fefed94","Type":"ContainerStarted","Data":"927cc2c73eb4055b3288e4409ac00fcbad18548d072efc8d999df6dc348ef71c"} Mar 09 14:39:27 crc kubenswrapper[4722]: I0309 14:39:27.368969 4722 scope.go:117] "RemoveContainer" containerID="8df963eaf8b6f3a867ce207588dc0c93e76df2b1e21bf269210fef574c2aaf34" Mar 09 14:39:27 crc kubenswrapper[4722]: I0309 14:39:27.656418 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rpdzb" event={"ID":"1ece0886-09c8-4a9e-a309-5d538fefed94","Type":"ContainerStarted","Data":"e0051e1edd3c1de1520d5bee29e0f312b9549f965f403e4d2efd6bbf575cfd43"} Mar 09 14:39:27 crc kubenswrapper[4722]: I0309 14:39:27.678501 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rpdzb" podStartSLOduration=2.736377785 podStartE2EDuration="3.678480169s" podCreationTimestamp="2026-03-09 14:39:24 +0000 UTC" firstStartedPulling="2026-03-09 14:39:25.614756184 +0000 UTC m=+2206.170324760" lastFinishedPulling="2026-03-09 14:39:26.556858568 +0000 UTC m=+2207.112427144" observedRunningTime="2026-03-09 14:39:27.674837639 +0000 UTC m=+2208.230406215" watchObservedRunningTime="2026-03-09 14:39:27.678480169 +0000 UTC m=+2208.234048745" Mar 09 14:39:43 crc kubenswrapper[4722]: I0309 14:39:43.925042 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-69rk7"] Mar 09 14:39:43 crc kubenswrapper[4722]: I0309 14:39:43.928255 4722 util.go:30] "No sandbox for pod can be found. 
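The pod_startup_latency_tracker entry above reports two figures: podStartE2EDuration, the wall time from podCreationTimestamp to watchObservedRunningTime, and podStartSLOduration, which the logged numbers show to be the same interval minus the image-pull window (lastFinishedPulling minus firstStartedPulling). A check using the values logged for configure-os-edpm-deployment-openstack-edpm-ipam-rpdzb:

# Values copied from the tracker entry above (m=+ offsets, seconds).
e2e  = 3.678480169                       # podStartE2EDuration
pull = 2207.112427144 - 2206.170324760   # lastFinishedPulling - firstStartedPulling
slo  = e2e - pull
print(f'{slo:.9f}')                      # 2.736377785, matching podStartSLOduration
assert abs(slo - 2.736377785) < 1e-9
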
Need to start a new one" pod="openshift-marketplace/certified-operators-69rk7" Mar 09 14:39:43 crc kubenswrapper[4722]: I0309 14:39:43.936844 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-69rk7"] Mar 09 14:39:44 crc kubenswrapper[4722]: I0309 14:39:44.009791 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnxhx\" (UniqueName: \"kubernetes.io/projected/1ad65c2c-6314-4a03-b373-bc1280fd2256-kube-api-access-vnxhx\") pod \"certified-operators-69rk7\" (UID: \"1ad65c2c-6314-4a03-b373-bc1280fd2256\") " pod="openshift-marketplace/certified-operators-69rk7" Mar 09 14:39:44 crc kubenswrapper[4722]: I0309 14:39:44.010125 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ad65c2c-6314-4a03-b373-bc1280fd2256-utilities\") pod \"certified-operators-69rk7\" (UID: \"1ad65c2c-6314-4a03-b373-bc1280fd2256\") " pod="openshift-marketplace/certified-operators-69rk7" Mar 09 14:39:44 crc kubenswrapper[4722]: I0309 14:39:44.010169 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ad65c2c-6314-4a03-b373-bc1280fd2256-catalog-content\") pod \"certified-operators-69rk7\" (UID: \"1ad65c2c-6314-4a03-b373-bc1280fd2256\") " pod="openshift-marketplace/certified-operators-69rk7" Mar 09 14:39:44 crc kubenswrapper[4722]: I0309 14:39:44.112727 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnxhx\" (UniqueName: \"kubernetes.io/projected/1ad65c2c-6314-4a03-b373-bc1280fd2256-kube-api-access-vnxhx\") pod \"certified-operators-69rk7\" (UID: \"1ad65c2c-6314-4a03-b373-bc1280fd2256\") " pod="openshift-marketplace/certified-operators-69rk7" Mar 09 14:39:44 crc kubenswrapper[4722]: I0309 14:39:44.112817 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ad65c2c-6314-4a03-b373-bc1280fd2256-utilities\") pod \"certified-operators-69rk7\" (UID: \"1ad65c2c-6314-4a03-b373-bc1280fd2256\") " pod="openshift-marketplace/certified-operators-69rk7" Mar 09 14:39:44 crc kubenswrapper[4722]: I0309 14:39:44.112862 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ad65c2c-6314-4a03-b373-bc1280fd2256-catalog-content\") pod \"certified-operators-69rk7\" (UID: \"1ad65c2c-6314-4a03-b373-bc1280fd2256\") " pod="openshift-marketplace/certified-operators-69rk7" Mar 09 14:39:44 crc kubenswrapper[4722]: I0309 14:39:44.113412 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ad65c2c-6314-4a03-b373-bc1280fd2256-catalog-content\") pod \"certified-operators-69rk7\" (UID: \"1ad65c2c-6314-4a03-b373-bc1280fd2256\") " pod="openshift-marketplace/certified-operators-69rk7" Mar 09 14:39:44 crc kubenswrapper[4722]: I0309 14:39:44.113580 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ad65c2c-6314-4a03-b373-bc1280fd2256-utilities\") pod \"certified-operators-69rk7\" (UID: \"1ad65c2c-6314-4a03-b373-bc1280fd2256\") " pod="openshift-marketplace/certified-operators-69rk7" Mar 09 14:39:44 crc kubenswrapper[4722]: I0309 14:39:44.135399 4722 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-vnxhx\" (UniqueName: \"kubernetes.io/projected/1ad65c2c-6314-4a03-b373-bc1280fd2256-kube-api-access-vnxhx\") pod \"certified-operators-69rk7\" (UID: \"1ad65c2c-6314-4a03-b373-bc1280fd2256\") " pod="openshift-marketplace/certified-operators-69rk7" Mar 09 14:39:44 crc kubenswrapper[4722]: I0309 14:39:44.275563 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-69rk7" Mar 09 14:39:44 crc kubenswrapper[4722]: I0309 14:39:44.856215 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-69rk7"] Mar 09 14:39:44 crc kubenswrapper[4722]: W0309 14:39:44.866610 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ad65c2c_6314_4a03_b373_bc1280fd2256.slice/crio-45e8ae03ae0bdc5a3ac6a0e7405be2f4b3cebbd8b454cf65786559d53f7fc9e4 WatchSource:0}: Error finding container 45e8ae03ae0bdc5a3ac6a0e7405be2f4b3cebbd8b454cf65786559d53f7fc9e4: Status 404 returned error can't find the container with id 45e8ae03ae0bdc5a3ac6a0e7405be2f4b3cebbd8b454cf65786559d53f7fc9e4 Mar 09 14:39:45 crc kubenswrapper[4722]: I0309 14:39:45.848579 4722 generic.go:334] "Generic (PLEG): container finished" podID="1ad65c2c-6314-4a03-b373-bc1280fd2256" containerID="ca67d9e6008c875a898f1e4a251776069bd8cd06c315e369bc8137724f84f528" exitCode=0 Mar 09 14:39:45 crc kubenswrapper[4722]: I0309 14:39:45.848651 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-69rk7" event={"ID":"1ad65c2c-6314-4a03-b373-bc1280fd2256","Type":"ContainerDied","Data":"ca67d9e6008c875a898f1e4a251776069bd8cd06c315e369bc8137724f84f528"} Mar 09 14:39:45 crc kubenswrapper[4722]: I0309 14:39:45.848902 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-69rk7" event={"ID":"1ad65c2c-6314-4a03-b373-bc1280fd2256","Type":"ContainerStarted","Data":"45e8ae03ae0bdc5a3ac6a0e7405be2f4b3cebbd8b454cf65786559d53f7fc9e4"} Mar 09 14:39:47 crc kubenswrapper[4722]: I0309 14:39:47.874039 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-69rk7" event={"ID":"1ad65c2c-6314-4a03-b373-bc1280fd2256","Type":"ContainerStarted","Data":"7dcce7ef3c86a1600b9163cdd26df8e73b370fc4c144693e8fd833db07f3395d"} Mar 09 14:39:48 crc kubenswrapper[4722]: I0309 14:39:48.887112 4722 generic.go:334] "Generic (PLEG): container finished" podID="1ad65c2c-6314-4a03-b373-bc1280fd2256" containerID="7dcce7ef3c86a1600b9163cdd26df8e73b370fc4c144693e8fd833db07f3395d" exitCode=0 Mar 09 14:39:48 crc kubenswrapper[4722]: I0309 14:39:48.887284 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-69rk7" event={"ID":"1ad65c2c-6314-4a03-b373-bc1280fd2256","Type":"ContainerDied","Data":"7dcce7ef3c86a1600b9163cdd26df8e73b370fc4c144693e8fd833db07f3395d"} Mar 09 14:39:49 crc kubenswrapper[4722]: I0309 14:39:49.899130 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-69rk7" event={"ID":"1ad65c2c-6314-4a03-b373-bc1280fd2256","Type":"ContainerStarted","Data":"920a33001406f98288321fadbaf3e8a3d41b8259ebd6d3e14cb6f3419840a24a"} Mar 09 14:39:49 crc kubenswrapper[4722]: I0309 14:39:49.927949 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-69rk7" 
podStartSLOduration=3.177248578 podStartE2EDuration="6.927931382s" podCreationTimestamp="2026-03-09 14:39:43 +0000 UTC" firstStartedPulling="2026-03-09 14:39:45.85073761 +0000 UTC m=+2226.406306186" lastFinishedPulling="2026-03-09 14:39:49.601420414 +0000 UTC m=+2230.156988990" observedRunningTime="2026-03-09 14:39:49.919139833 +0000 UTC m=+2230.474708409" watchObservedRunningTime="2026-03-09 14:39:49.927931382 +0000 UTC m=+2230.483499958" Mar 09 14:39:54 crc kubenswrapper[4722]: I0309 14:39:54.276995 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-69rk7" Mar 09 14:39:54 crc kubenswrapper[4722]: I0309 14:39:54.278611 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-69rk7" Mar 09 14:39:54 crc kubenswrapper[4722]: I0309 14:39:54.330183 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-69rk7" Mar 09 14:39:54 crc kubenswrapper[4722]: I0309 14:39:54.995491 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-69rk7" Mar 09 14:39:55 crc kubenswrapper[4722]: I0309 14:39:55.057386 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-69rk7"] Mar 09 14:39:56 crc kubenswrapper[4722]: I0309 14:39:56.972091 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-69rk7" podUID="1ad65c2c-6314-4a03-b373-bc1280fd2256" containerName="registry-server" containerID="cri-o://920a33001406f98288321fadbaf3e8a3d41b8259ebd6d3e14cb6f3419840a24a" gracePeriod=2 Mar 09 14:39:57 crc kubenswrapper[4722]: I0309 14:39:57.529006 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-69rk7" Mar 09 14:39:57 crc kubenswrapper[4722]: I0309 14:39:57.575701 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ad65c2c-6314-4a03-b373-bc1280fd2256-utilities\") pod \"1ad65c2c-6314-4a03-b373-bc1280fd2256\" (UID: \"1ad65c2c-6314-4a03-b373-bc1280fd2256\") " Mar 09 14:39:57 crc kubenswrapper[4722]: I0309 14:39:57.575762 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vnxhx\" (UniqueName: \"kubernetes.io/projected/1ad65c2c-6314-4a03-b373-bc1280fd2256-kube-api-access-vnxhx\") pod \"1ad65c2c-6314-4a03-b373-bc1280fd2256\" (UID: \"1ad65c2c-6314-4a03-b373-bc1280fd2256\") " Mar 09 14:39:57 crc kubenswrapper[4722]: I0309 14:39:57.575847 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ad65c2c-6314-4a03-b373-bc1280fd2256-catalog-content\") pod \"1ad65c2c-6314-4a03-b373-bc1280fd2256\" (UID: \"1ad65c2c-6314-4a03-b373-bc1280fd2256\") " Mar 09 14:39:57 crc kubenswrapper[4722]: I0309 14:39:57.577030 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ad65c2c-6314-4a03-b373-bc1280fd2256-utilities" (OuterVolumeSpecName: "utilities") pod "1ad65c2c-6314-4a03-b373-bc1280fd2256" (UID: "1ad65c2c-6314-4a03-b373-bc1280fd2256"). InnerVolumeSpecName "utilities". 
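The probe entries above trace the registry-server container through its startup probe (unhealthy, then started) and readiness probe (empty, then ready) before the API DELETE triggers a kill with gracePeriod=2. A sketch extracting those transitions per pod and probe type (probe_transitions is an illustrative name):

import re
from collections import defaultdict

PROBE = re.compile(r'"SyncLoop \(probe\)" probe="(\w+)" status="([^"]*)" pod="([^"]+)"')

def probe_transitions(lines):
    seen = defaultdict(list)             # (pod, probe) -> [status, ...]
    for line in lines:
        m = PROBE.search(line)
        if m:
            probe, status, pod = m.groups()
            key = (pod, probe)
            if not seen[key] or seen[key][-1] != (status or '<empty>'):
                seen[key].append(status or '<empty>')
    return dict(seen)
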
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:39:57 crc kubenswrapper[4722]: I0309 14:39:57.581515 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ad65c2c-6314-4a03-b373-bc1280fd2256-kube-api-access-vnxhx" (OuterVolumeSpecName: "kube-api-access-vnxhx") pod "1ad65c2c-6314-4a03-b373-bc1280fd2256" (UID: "1ad65c2c-6314-4a03-b373-bc1280fd2256"). InnerVolumeSpecName "kube-api-access-vnxhx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:39:57 crc kubenswrapper[4722]: I0309 14:39:57.678477 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ad65c2c-6314-4a03-b373-bc1280fd2256-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 14:39:57 crc kubenswrapper[4722]: I0309 14:39:57.678507 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vnxhx\" (UniqueName: \"kubernetes.io/projected/1ad65c2c-6314-4a03-b373-bc1280fd2256-kube-api-access-vnxhx\") on node \"crc\" DevicePath \"\"" Mar 09 14:39:57 crc kubenswrapper[4722]: I0309 14:39:57.987805 4722 generic.go:334] "Generic (PLEG): container finished" podID="1ad65c2c-6314-4a03-b373-bc1280fd2256" containerID="920a33001406f98288321fadbaf3e8a3d41b8259ebd6d3e14cb6f3419840a24a" exitCode=0 Mar 09 14:39:57 crc kubenswrapper[4722]: I0309 14:39:57.987887 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-69rk7" event={"ID":"1ad65c2c-6314-4a03-b373-bc1280fd2256","Type":"ContainerDied","Data":"920a33001406f98288321fadbaf3e8a3d41b8259ebd6d3e14cb6f3419840a24a"} Mar 09 14:39:57 crc kubenswrapper[4722]: I0309 14:39:57.987922 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-69rk7" event={"ID":"1ad65c2c-6314-4a03-b373-bc1280fd2256","Type":"ContainerDied","Data":"45e8ae03ae0bdc5a3ac6a0e7405be2f4b3cebbd8b454cf65786559d53f7fc9e4"} Mar 09 14:39:57 crc kubenswrapper[4722]: I0309 14:39:57.987944 4722 scope.go:117] "RemoveContainer" containerID="920a33001406f98288321fadbaf3e8a3d41b8259ebd6d3e14cb6f3419840a24a" Mar 09 14:39:57 crc kubenswrapper[4722]: I0309 14:39:57.988195 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-69rk7" Mar 09 14:39:58 crc kubenswrapper[4722]: I0309 14:39:58.015418 4722 scope.go:117] "RemoveContainer" containerID="7dcce7ef3c86a1600b9163cdd26df8e73b370fc4c144693e8fd833db07f3395d" Mar 09 14:39:58 crc kubenswrapper[4722]: I0309 14:39:58.041154 4722 scope.go:117] "RemoveContainer" containerID="ca67d9e6008c875a898f1e4a251776069bd8cd06c315e369bc8137724f84f528" Mar 09 14:39:58 crc kubenswrapper[4722]: I0309 14:39:58.091547 4722 scope.go:117] "RemoveContainer" containerID="920a33001406f98288321fadbaf3e8a3d41b8259ebd6d3e14cb6f3419840a24a" Mar 09 14:39:58 crc kubenswrapper[4722]: E0309 14:39:58.092008 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"920a33001406f98288321fadbaf3e8a3d41b8259ebd6d3e14cb6f3419840a24a\": container with ID starting with 920a33001406f98288321fadbaf3e8a3d41b8259ebd6d3e14cb6f3419840a24a not found: ID does not exist" containerID="920a33001406f98288321fadbaf3e8a3d41b8259ebd6d3e14cb6f3419840a24a" Mar 09 14:39:58 crc kubenswrapper[4722]: I0309 14:39:58.092040 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"920a33001406f98288321fadbaf3e8a3d41b8259ebd6d3e14cb6f3419840a24a"} err="failed to get container status \"920a33001406f98288321fadbaf3e8a3d41b8259ebd6d3e14cb6f3419840a24a\": rpc error: code = NotFound desc = could not find container \"920a33001406f98288321fadbaf3e8a3d41b8259ebd6d3e14cb6f3419840a24a\": container with ID starting with 920a33001406f98288321fadbaf3e8a3d41b8259ebd6d3e14cb6f3419840a24a not found: ID does not exist" Mar 09 14:39:58 crc kubenswrapper[4722]: I0309 14:39:58.092062 4722 scope.go:117] "RemoveContainer" containerID="7dcce7ef3c86a1600b9163cdd26df8e73b370fc4c144693e8fd833db07f3395d" Mar 09 14:39:58 crc kubenswrapper[4722]: E0309 14:39:58.092387 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7dcce7ef3c86a1600b9163cdd26df8e73b370fc4c144693e8fd833db07f3395d\": container with ID starting with 7dcce7ef3c86a1600b9163cdd26df8e73b370fc4c144693e8fd833db07f3395d not found: ID does not exist" containerID="7dcce7ef3c86a1600b9163cdd26df8e73b370fc4c144693e8fd833db07f3395d" Mar 09 14:39:58 crc kubenswrapper[4722]: I0309 14:39:58.092414 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7dcce7ef3c86a1600b9163cdd26df8e73b370fc4c144693e8fd833db07f3395d"} err="failed to get container status \"7dcce7ef3c86a1600b9163cdd26df8e73b370fc4c144693e8fd833db07f3395d\": rpc error: code = NotFound desc = could not find container \"7dcce7ef3c86a1600b9163cdd26df8e73b370fc4c144693e8fd833db07f3395d\": container with ID starting with 7dcce7ef3c86a1600b9163cdd26df8e73b370fc4c144693e8fd833db07f3395d not found: ID does not exist" Mar 09 14:39:58 crc kubenswrapper[4722]: I0309 14:39:58.092427 4722 scope.go:117] "RemoveContainer" containerID="ca67d9e6008c875a898f1e4a251776069bd8cd06c315e369bc8137724f84f528" Mar 09 14:39:58 crc kubenswrapper[4722]: E0309 14:39:58.092666 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca67d9e6008c875a898f1e4a251776069bd8cd06c315e369bc8137724f84f528\": container with ID starting with ca67d9e6008c875a898f1e4a251776069bd8cd06c315e369bc8137724f84f528 not found: ID does not exist" containerID="ca67d9e6008c875a898f1e4a251776069bd8cd06c315e369bc8137724f84f528" 
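The E-severity NotFound errors above are expected rather than a fault: scope.go logs "RemoveContainer" for each ID, and the follow-up ContainerStatus lookup against CRI-O then fails precisely because the removal already succeeded. A sketch that separates these benign errors from NotFound errors with no preceding removal (classify is an illustrative name):

import re

REMOVE   = re.compile(r'"RemoveContainer" containerID="([0-9a-f]{64})"')
NOTFOUND = re.compile(r'"ContainerStatus from runtime service failed".*'
                      r'NotFound.*containerID="([0-9a-f]{64})"')

def classify(lines):
    removed, benign, suspicious = set(), [], []
    for line in lines:
        m = REMOVE.search(line)
        if m:
            removed.add(m.group(1))
        m = NOTFOUND.search(line)
        if m:
            (benign if m.group(1) in removed else suspicious).append(m.group(1))
    return benign, suspicious
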
Mar 09 14:39:58 crc kubenswrapper[4722]: I0309 14:39:58.092698 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca67d9e6008c875a898f1e4a251776069bd8cd06c315e369bc8137724f84f528"} err="failed to get container status \"ca67d9e6008c875a898f1e4a251776069bd8cd06c315e369bc8137724f84f528\": rpc error: code = NotFound desc = could not find container \"ca67d9e6008c875a898f1e4a251776069bd8cd06c315e369bc8137724f84f528\": container with ID starting with ca67d9e6008c875a898f1e4a251776069bd8cd06c315e369bc8137724f84f528 not found: ID does not exist" Mar 09 14:39:58 crc kubenswrapper[4722]: I0309 14:39:58.415813 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ad65c2c-6314-4a03-b373-bc1280fd2256-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1ad65c2c-6314-4a03-b373-bc1280fd2256" (UID: "1ad65c2c-6314-4a03-b373-bc1280fd2256"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:39:58 crc kubenswrapper[4722]: I0309 14:39:58.503632 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ad65c2c-6314-4a03-b373-bc1280fd2256-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 14:39:58 crc kubenswrapper[4722]: I0309 14:39:58.629638 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-69rk7"] Mar 09 14:39:58 crc kubenswrapper[4722]: I0309 14:39:58.641391 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-69rk7"] Mar 09 14:40:00 crc kubenswrapper[4722]: I0309 14:40:00.161157 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ad65c2c-6314-4a03-b373-bc1280fd2256" path="/var/lib/kubelet/pods/1ad65c2c-6314-4a03-b373-bc1280fd2256/volumes" Mar 09 14:40:00 crc kubenswrapper[4722]: I0309 14:40:00.162111 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551120-q9k6x"] Mar 09 14:40:00 crc kubenswrapper[4722]: E0309 14:40:00.162491 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ad65c2c-6314-4a03-b373-bc1280fd2256" containerName="extract-utilities" Mar 09 14:40:00 crc kubenswrapper[4722]: I0309 14:40:00.162503 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ad65c2c-6314-4a03-b373-bc1280fd2256" containerName="extract-utilities" Mar 09 14:40:00 crc kubenswrapper[4722]: E0309 14:40:00.162524 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ad65c2c-6314-4a03-b373-bc1280fd2256" containerName="extract-content" Mar 09 14:40:00 crc kubenswrapper[4722]: I0309 14:40:00.162530 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ad65c2c-6314-4a03-b373-bc1280fd2256" containerName="extract-content" Mar 09 14:40:00 crc kubenswrapper[4722]: E0309 14:40:00.162540 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ad65c2c-6314-4a03-b373-bc1280fd2256" containerName="registry-server" Mar 09 14:40:00 crc kubenswrapper[4722]: I0309 14:40:00.162547 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ad65c2c-6314-4a03-b373-bc1280fd2256" containerName="registry-server" Mar 09 14:40:00 crc kubenswrapper[4722]: I0309 14:40:00.162825 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ad65c2c-6314-4a03-b373-bc1280fd2256" containerName="registry-server" Mar 09 14:40:00 crc kubenswrapper[4722]: I0309 14:40:00.164021 4722 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551120-q9k6x" Mar 09 14:40:00 crc kubenswrapper[4722]: I0309 14:40:00.165991 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551120-q9k6x"] Mar 09 14:40:00 crc kubenswrapper[4722]: I0309 14:40:00.166342 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 14:40:00 crc kubenswrapper[4722]: I0309 14:40:00.166377 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-dr2x6" Mar 09 14:40:00 crc kubenswrapper[4722]: I0309 14:40:00.166403 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 14:40:00 crc kubenswrapper[4722]: I0309 14:40:00.343183 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbw42\" (UniqueName: \"kubernetes.io/projected/4324c08a-d9aa-4d6e-ad2a-900c2124d167-kube-api-access-rbw42\") pod \"auto-csr-approver-29551120-q9k6x\" (UID: \"4324c08a-d9aa-4d6e-ad2a-900c2124d167\") " pod="openshift-infra/auto-csr-approver-29551120-q9k6x" Mar 09 14:40:00 crc kubenswrapper[4722]: I0309 14:40:00.446097 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbw42\" (UniqueName: \"kubernetes.io/projected/4324c08a-d9aa-4d6e-ad2a-900c2124d167-kube-api-access-rbw42\") pod \"auto-csr-approver-29551120-q9k6x\" (UID: \"4324c08a-d9aa-4d6e-ad2a-900c2124d167\") " pod="openshift-infra/auto-csr-approver-29551120-q9k6x" Mar 09 14:40:00 crc kubenswrapper[4722]: I0309 14:40:00.471912 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbw42\" (UniqueName: \"kubernetes.io/projected/4324c08a-d9aa-4d6e-ad2a-900c2124d167-kube-api-access-rbw42\") pod \"auto-csr-approver-29551120-q9k6x\" (UID: \"4324c08a-d9aa-4d6e-ad2a-900c2124d167\") " pod="openshift-infra/auto-csr-approver-29551120-q9k6x" Mar 09 14:40:00 crc kubenswrapper[4722]: I0309 14:40:00.486049 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551120-q9k6x" Mar 09 14:40:01 crc kubenswrapper[4722]: I0309 14:40:01.001655 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551120-q9k6x"] Mar 09 14:40:01 crc kubenswrapper[4722]: I0309 14:40:01.023137 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551120-q9k6x" event={"ID":"4324c08a-d9aa-4d6e-ad2a-900c2124d167","Type":"ContainerStarted","Data":"2eb4656f90041ffc3a0af352951124ecc4b76cb452db8c8b955971e5c5726fd9"} Mar 09 14:40:03 crc kubenswrapper[4722]: I0309 14:40:03.055831 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551120-q9k6x" event={"ID":"4324c08a-d9aa-4d6e-ad2a-900c2124d167","Type":"ContainerStarted","Data":"2d0b302dca238f52c5be1715bb7e4345bc1005ab14c32ff54a20c4baf6b3d79a"} Mar 09 14:40:03 crc kubenswrapper[4722]: I0309 14:40:03.078482 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551120-q9k6x" podStartSLOduration=1.556617334 podStartE2EDuration="3.078453639s" podCreationTimestamp="2026-03-09 14:40:00 +0000 UTC" firstStartedPulling="2026-03-09 14:40:01.004306031 +0000 UTC m=+2241.559874607" lastFinishedPulling="2026-03-09 14:40:02.526142336 +0000 UTC m=+2243.081710912" observedRunningTime="2026-03-09 14:40:03.071622043 +0000 UTC m=+2243.627190619" watchObservedRunningTime="2026-03-09 14:40:03.078453639 +0000 UTC m=+2243.634022245" Mar 09 14:40:04 crc kubenswrapper[4722]: I0309 14:40:04.066900 4722 generic.go:334] "Generic (PLEG): container finished" podID="4324c08a-d9aa-4d6e-ad2a-900c2124d167" containerID="2d0b302dca238f52c5be1715bb7e4345bc1005ab14c32ff54a20c4baf6b3d79a" exitCode=0 Mar 09 14:40:04 crc kubenswrapper[4722]: I0309 14:40:04.067019 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551120-q9k6x" event={"ID":"4324c08a-d9aa-4d6e-ad2a-900c2124d167","Type":"ContainerDied","Data":"2d0b302dca238f52c5be1715bb7e4345bc1005ab14c32ff54a20c4baf6b3d79a"} Mar 09 14:40:05 crc kubenswrapper[4722]: I0309 14:40:05.516015 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551120-q9k6x" Mar 09 14:40:05 crc kubenswrapper[4722]: I0309 14:40:05.684730 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbw42\" (UniqueName: \"kubernetes.io/projected/4324c08a-d9aa-4d6e-ad2a-900c2124d167-kube-api-access-rbw42\") pod \"4324c08a-d9aa-4d6e-ad2a-900c2124d167\" (UID: \"4324c08a-d9aa-4d6e-ad2a-900c2124d167\") " Mar 09 14:40:05 crc kubenswrapper[4722]: I0309 14:40:05.692523 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4324c08a-d9aa-4d6e-ad2a-900c2124d167-kube-api-access-rbw42" (OuterVolumeSpecName: "kube-api-access-rbw42") pod "4324c08a-d9aa-4d6e-ad2a-900c2124d167" (UID: "4324c08a-d9aa-4d6e-ad2a-900c2124d167"). InnerVolumeSpecName "kube-api-access-rbw42". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:40:05 crc kubenswrapper[4722]: I0309 14:40:05.789368 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbw42\" (UniqueName: \"kubernetes.io/projected/4324c08a-d9aa-4d6e-ad2a-900c2124d167-kube-api-access-rbw42\") on node \"crc\" DevicePath \"\"" Mar 09 14:40:06 crc kubenswrapper[4722]: I0309 14:40:06.091683 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551120-q9k6x" event={"ID":"4324c08a-d9aa-4d6e-ad2a-900c2124d167","Type":"ContainerDied","Data":"2eb4656f90041ffc3a0af352951124ecc4b76cb452db8c8b955971e5c5726fd9"} Mar 09 14:40:06 crc kubenswrapper[4722]: I0309 14:40:06.091724 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2eb4656f90041ffc3a0af352951124ecc4b76cb452db8c8b955971e5c5726fd9" Mar 09 14:40:06 crc kubenswrapper[4722]: I0309 14:40:06.091789 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551120-q9k6x" Mar 09 14:40:06 crc kubenswrapper[4722]: I0309 14:40:06.163507 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551114-jfxkb"] Mar 09 14:40:06 crc kubenswrapper[4722]: I0309 14:40:06.163878 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551114-jfxkb"] Mar 09 14:40:08 crc kubenswrapper[4722]: I0309 14:40:08.163364 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="066f98c7-83dc-4e2d-9e51-702dd17db261" path="/var/lib/kubelet/pods/066f98c7-83dc-4e2d-9e51-702dd17db261/volumes" Mar 09 14:40:12 crc kubenswrapper[4722]: I0309 14:40:12.158063 4722 generic.go:334] "Generic (PLEG): container finished" podID="1ece0886-09c8-4a9e-a309-5d538fefed94" containerID="e0051e1edd3c1de1520d5bee29e0f312b9549f965f403e4d2efd6bbf575cfd43" exitCode=0 Mar 09 14:40:12 crc kubenswrapper[4722]: I0309 14:40:12.167922 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rpdzb" event={"ID":"1ece0886-09c8-4a9e-a309-5d538fefed94","Type":"ContainerDied","Data":"e0051e1edd3c1de1520d5bee29e0f312b9549f965f403e4d2efd6bbf575cfd43"} Mar 09 14:40:13 crc kubenswrapper[4722]: I0309 14:40:13.697621 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rpdzb" Mar 09 14:40:13 crc kubenswrapper[4722]: I0309 14:40:13.758506 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1ece0886-09c8-4a9e-a309-5d538fefed94-inventory\") pod \"1ece0886-09c8-4a9e-a309-5d538fefed94\" (UID: \"1ece0886-09c8-4a9e-a309-5d538fefed94\") " Mar 09 14:40:13 crc kubenswrapper[4722]: I0309 14:40:13.800639 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ece0886-09c8-4a9e-a309-5d538fefed94-inventory" (OuterVolumeSpecName: "inventory") pod "1ece0886-09c8-4a9e-a309-5d538fefed94" (UID: "1ece0886-09c8-4a9e-a309-5d538fefed94"). InnerVolumeSpecName "inventory". 
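The m=+… suffixes on the timestamps above are Go's monotonic-clock readings, in seconds since the kubelet process started (approximately). Wall time minus the offset therefore lands on the same instant for every entry; for this log it comes out at roughly 14:02:39.444 UTC, the kubelet's start (kubelet_start is an illustrative name):

from datetime import datetime, timedelta, timezone

def kubelet_start(stamp: str) -> datetime:
    """stamp like '2026-03-09 14:40:03.071622043 +0000 UTC m=+2243.627190619'"""
    wall_s, mono = stamp.split(' m=+')
    # Truncate nanoseconds to microseconds for strptime's %f.
    wall = datetime.strptime(wall_s[:26], '%Y-%m-%d %H:%M:%S.%f')
    return wall.replace(tzinfo=timezone.utc) - timedelta(seconds=float(mono))

print(kubelet_start('2026-03-09 14:40:03.071622043 +0000 UTC m=+2243.627190619'))
# 2026-03-09 14:02:39.444431+00:00 — the same value falls out of every entry.
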
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:40:13 crc kubenswrapper[4722]: I0309 14:40:13.860966 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1ece0886-09c8-4a9e-a309-5d538fefed94-ssh-key-openstack-edpm-ipam\") pod \"1ece0886-09c8-4a9e-a309-5d538fefed94\" (UID: \"1ece0886-09c8-4a9e-a309-5d538fefed94\") " Mar 09 14:40:13 crc kubenswrapper[4722]: I0309 14:40:13.861028 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjxxw\" (UniqueName: \"kubernetes.io/projected/1ece0886-09c8-4a9e-a309-5d538fefed94-kube-api-access-pjxxw\") pod \"1ece0886-09c8-4a9e-a309-5d538fefed94\" (UID: \"1ece0886-09c8-4a9e-a309-5d538fefed94\") " Mar 09 14:40:13 crc kubenswrapper[4722]: I0309 14:40:13.862741 4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1ece0886-09c8-4a9e-a309-5d538fefed94-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 14:40:13 crc kubenswrapper[4722]: I0309 14:40:13.865103 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ece0886-09c8-4a9e-a309-5d538fefed94-kube-api-access-pjxxw" (OuterVolumeSpecName: "kube-api-access-pjxxw") pod "1ece0886-09c8-4a9e-a309-5d538fefed94" (UID: "1ece0886-09c8-4a9e-a309-5d538fefed94"). InnerVolumeSpecName "kube-api-access-pjxxw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:40:13 crc kubenswrapper[4722]: I0309 14:40:13.897213 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ece0886-09c8-4a9e-a309-5d538fefed94-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "1ece0886-09c8-4a9e-a309-5d538fefed94" (UID: "1ece0886-09c8-4a9e-a309-5d538fefed94"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:40:13 crc kubenswrapper[4722]: I0309 14:40:13.964780 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1ece0886-09c8-4a9e-a309-5d538fefed94-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 14:40:13 crc kubenswrapper[4722]: I0309 14:40:13.964815 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjxxw\" (UniqueName: \"kubernetes.io/projected/1ece0886-09c8-4a9e-a309-5d538fefed94-kube-api-access-pjxxw\") on node \"crc\" DevicePath \"\"" Mar 09 14:40:14 crc kubenswrapper[4722]: I0309 14:40:14.199806 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rpdzb" event={"ID":"1ece0886-09c8-4a9e-a309-5d538fefed94","Type":"ContainerDied","Data":"927cc2c73eb4055b3288e4409ac00fcbad18548d072efc8d999df6dc348ef71c"} Mar 09 14:40:14 crc kubenswrapper[4722]: I0309 14:40:14.199906 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="927cc2c73eb4055b3288e4409ac00fcbad18548d072efc8d999df6dc348ef71c" Mar 09 14:40:14 crc kubenswrapper[4722]: I0309 14:40:14.200067 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-rpdzb" Mar 09 14:40:14 crc kubenswrapper[4722]: I0309 14:40:14.304318 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-hj7ts"] Mar 09 14:40:14 crc kubenswrapper[4722]: E0309 14:40:14.304851 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ece0886-09c8-4a9e-a309-5d538fefed94" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 09 14:40:14 crc kubenswrapper[4722]: I0309 14:40:14.304870 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ece0886-09c8-4a9e-a309-5d538fefed94" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 09 14:40:14 crc kubenswrapper[4722]: E0309 14:40:14.304887 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4324c08a-d9aa-4d6e-ad2a-900c2124d167" containerName="oc" Mar 09 14:40:14 crc kubenswrapper[4722]: I0309 14:40:14.304894 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="4324c08a-d9aa-4d6e-ad2a-900c2124d167" containerName="oc" Mar 09 14:40:14 crc kubenswrapper[4722]: I0309 14:40:14.305107 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="4324c08a-d9aa-4d6e-ad2a-900c2124d167" containerName="oc" Mar 09 14:40:14 crc kubenswrapper[4722]: I0309 14:40:14.305127 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ece0886-09c8-4a9e-a309-5d538fefed94" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 09 14:40:14 crc kubenswrapper[4722]: I0309 14:40:14.305919 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-hj7ts" Mar 09 14:40:14 crc kubenswrapper[4722]: I0309 14:40:14.308040 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 14:40:14 crc kubenswrapper[4722]: I0309 14:40:14.308360 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 14:40:14 crc kubenswrapper[4722]: I0309 14:40:14.308745 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 14:40:14 crc kubenswrapper[4722]: I0309 14:40:14.309280 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dwlbg" Mar 09 14:40:14 crc kubenswrapper[4722]: I0309 14:40:14.330779 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-hj7ts"] Mar 09 14:40:14 crc kubenswrapper[4722]: I0309 14:40:14.490482 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/0104a7c4-89e8-4e4e-a184-d514fb780bb0-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-hj7ts\" (UID: \"0104a7c4-89e8-4e4e-a184-d514fb780bb0\") " pod="openstack/ssh-known-hosts-edpm-deployment-hj7ts" Mar 09 14:40:14 crc kubenswrapper[4722]: I0309 14:40:14.491137 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvrw8\" (UniqueName: \"kubernetes.io/projected/0104a7c4-89e8-4e4e-a184-d514fb780bb0-kube-api-access-vvrw8\") pod \"ssh-known-hosts-edpm-deployment-hj7ts\" (UID: \"0104a7c4-89e8-4e4e-a184-d514fb780bb0\") " pod="openstack/ssh-known-hosts-edpm-deployment-hj7ts" Mar 09 14:40:14 crc kubenswrapper[4722]: I0309 14:40:14.491226 4722 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0104a7c4-89e8-4e4e-a184-d514fb780bb0-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-hj7ts\" (UID: \"0104a7c4-89e8-4e4e-a184-d514fb780bb0\") " pod="openstack/ssh-known-hosts-edpm-deployment-hj7ts" Mar 09 14:40:14 crc kubenswrapper[4722]: I0309 14:40:14.593739 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvrw8\" (UniqueName: \"kubernetes.io/projected/0104a7c4-89e8-4e4e-a184-d514fb780bb0-kube-api-access-vvrw8\") pod \"ssh-known-hosts-edpm-deployment-hj7ts\" (UID: \"0104a7c4-89e8-4e4e-a184-d514fb780bb0\") " pod="openstack/ssh-known-hosts-edpm-deployment-hj7ts" Mar 09 14:40:14 crc kubenswrapper[4722]: I0309 14:40:14.593862 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0104a7c4-89e8-4e4e-a184-d514fb780bb0-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-hj7ts\" (UID: \"0104a7c4-89e8-4e4e-a184-d514fb780bb0\") " pod="openstack/ssh-known-hosts-edpm-deployment-hj7ts" Mar 09 14:40:14 crc kubenswrapper[4722]: I0309 14:40:14.593991 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/0104a7c4-89e8-4e4e-a184-d514fb780bb0-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-hj7ts\" (UID: \"0104a7c4-89e8-4e4e-a184-d514fb780bb0\") " pod="openstack/ssh-known-hosts-edpm-deployment-hj7ts" Mar 09 14:40:14 crc kubenswrapper[4722]: I0309 14:40:14.598449 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/0104a7c4-89e8-4e4e-a184-d514fb780bb0-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-hj7ts\" (UID: \"0104a7c4-89e8-4e4e-a184-d514fb780bb0\") " pod="openstack/ssh-known-hosts-edpm-deployment-hj7ts" Mar 09 14:40:14 crc kubenswrapper[4722]: I0309 14:40:14.601048 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0104a7c4-89e8-4e4e-a184-d514fb780bb0-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-hj7ts\" (UID: \"0104a7c4-89e8-4e4e-a184-d514fb780bb0\") " pod="openstack/ssh-known-hosts-edpm-deployment-hj7ts" Mar 09 14:40:14 crc kubenswrapper[4722]: I0309 14:40:14.612622 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvrw8\" (UniqueName: \"kubernetes.io/projected/0104a7c4-89e8-4e4e-a184-d514fb780bb0-kube-api-access-vvrw8\") pod \"ssh-known-hosts-edpm-deployment-hj7ts\" (UID: \"0104a7c4-89e8-4e4e-a184-d514fb780bb0\") " pod="openstack/ssh-known-hosts-edpm-deployment-hj7ts" Mar 09 14:40:14 crc kubenswrapper[4722]: I0309 14:40:14.623813 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-hj7ts" Mar 09 14:40:15 crc kubenswrapper[4722]: I0309 14:40:15.195846 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-hj7ts"] Mar 09 14:40:15 crc kubenswrapper[4722]: I0309 14:40:15.225747 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-hj7ts" event={"ID":"0104a7c4-89e8-4e4e-a184-d514fb780bb0","Type":"ContainerStarted","Data":"292fae6be17c3356282799bdbe26b6e92e509f325c6b6a9f67a568a5f7f1694c"} Mar 09 14:40:16 crc kubenswrapper[4722]: I0309 14:40:16.235769 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-hj7ts" event={"ID":"0104a7c4-89e8-4e4e-a184-d514fb780bb0","Type":"ContainerStarted","Data":"77b7cdb0609b8591cf1cce96ff9a0d1eb8fd0b3306ffb35e19feeeb0b35e56de"} Mar 09 14:40:16 crc kubenswrapper[4722]: I0309 14:40:16.258380 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-hj7ts" podStartSLOduration=1.709850008 podStartE2EDuration="2.258358328s" podCreationTimestamp="2026-03-09 14:40:14 +0000 UTC" firstStartedPulling="2026-03-09 14:40:15.197310516 +0000 UTC m=+2255.752879092" lastFinishedPulling="2026-03-09 14:40:15.745818816 +0000 UTC m=+2256.301387412" observedRunningTime="2026-03-09 14:40:16.255676545 +0000 UTC m=+2256.811245131" watchObservedRunningTime="2026-03-09 14:40:16.258358328 +0000 UTC m=+2256.813926914" Mar 09 14:40:21 crc kubenswrapper[4722]: I0309 14:40:21.528070 4722 patch_prober.go:28] interesting pod/machine-config-daemon-hjrrb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 14:40:21 crc kubenswrapper[4722]: I0309 14:40:21.528706 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 14:40:27 crc kubenswrapper[4722]: I0309 14:40:27.401011 4722 generic.go:334] "Generic (PLEG): container finished" podID="0104a7c4-89e8-4e4e-a184-d514fb780bb0" containerID="77b7cdb0609b8591cf1cce96ff9a0d1eb8fd0b3306ffb35e19feeeb0b35e56de" exitCode=0 Mar 09 14:40:27 crc kubenswrapper[4722]: I0309 14:40:27.401090 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-hj7ts" event={"ID":"0104a7c4-89e8-4e4e-a184-d514fb780bb0","Type":"ContainerDied","Data":"77b7cdb0609b8591cf1cce96ff9a0d1eb8fd0b3306ffb35e19feeeb0b35e56de"} Mar 09 14:40:27 crc kubenswrapper[4722]: I0309 14:40:27.508134 4722 scope.go:117] "RemoveContainer" containerID="384ceb3cd98cac6077daedbd758a939b3c58af663dade4bf4e245ec91b3e624e" Mar 09 14:40:28 crc kubenswrapper[4722]: I0309 14:40:28.909622 4722 util.go:48] "No ready sandbox for pod can be found. 
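The liveness failure above is the kubelet's HTTP prober getting connection-refused on the machine-config-daemon health endpoint, i.e. nothing was listening on 127.0.0.1:8798 at that moment. A rough stand-in for the same check (URL taken from the log; http_probe is an illustrative name, not the kubelet's implementation):

import urllib.error
import urllib.request

def http_probe(url: str, timeout: float = 1.0) -> str:
    """HTTP GET; 2xx/3xx counts as success, anything else as failure."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            ok = 200 <= resp.status < 400
            return 'success' if ok else f'failure: HTTP {resp.status}'
    except (urllib.error.URLError, OSError) as exc:
        return f'failure: {exc}'

print(http_probe('http://127.0.0.1:8798/health'))
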
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-hj7ts" Mar 09 14:40:29 crc kubenswrapper[4722]: I0309 14:40:29.062219 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0104a7c4-89e8-4e4e-a184-d514fb780bb0-ssh-key-openstack-edpm-ipam\") pod \"0104a7c4-89e8-4e4e-a184-d514fb780bb0\" (UID: \"0104a7c4-89e8-4e4e-a184-d514fb780bb0\") " Mar 09 14:40:29 crc kubenswrapper[4722]: I0309 14:40:29.062442 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/0104a7c4-89e8-4e4e-a184-d514fb780bb0-inventory-0\") pod \"0104a7c4-89e8-4e4e-a184-d514fb780bb0\" (UID: \"0104a7c4-89e8-4e4e-a184-d514fb780bb0\") " Mar 09 14:40:29 crc kubenswrapper[4722]: I0309 14:40:29.062591 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvrw8\" (UniqueName: \"kubernetes.io/projected/0104a7c4-89e8-4e4e-a184-d514fb780bb0-kube-api-access-vvrw8\") pod \"0104a7c4-89e8-4e4e-a184-d514fb780bb0\" (UID: \"0104a7c4-89e8-4e4e-a184-d514fb780bb0\") " Mar 09 14:40:29 crc kubenswrapper[4722]: I0309 14:40:29.072531 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0104a7c4-89e8-4e4e-a184-d514fb780bb0-kube-api-access-vvrw8" (OuterVolumeSpecName: "kube-api-access-vvrw8") pod "0104a7c4-89e8-4e4e-a184-d514fb780bb0" (UID: "0104a7c4-89e8-4e4e-a184-d514fb780bb0"). InnerVolumeSpecName "kube-api-access-vvrw8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:40:29 crc kubenswrapper[4722]: I0309 14:40:29.165304 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvrw8\" (UniqueName: \"kubernetes.io/projected/0104a7c4-89e8-4e4e-a184-d514fb780bb0-kube-api-access-vvrw8\") on node \"crc\" DevicePath \"\"" Mar 09 14:40:29 crc kubenswrapper[4722]: I0309 14:40:29.178152 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0104a7c4-89e8-4e4e-a184-d514fb780bb0-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "0104a7c4-89e8-4e4e-a184-d514fb780bb0" (UID: "0104a7c4-89e8-4e4e-a184-d514fb780bb0"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:40:29 crc kubenswrapper[4722]: I0309 14:40:29.204713 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0104a7c4-89e8-4e4e-a184-d514fb780bb0-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "0104a7c4-89e8-4e4e-a184-d514fb780bb0" (UID: "0104a7c4-89e8-4e4e-a184-d514fb780bb0"). InnerVolumeSpecName "inventory-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:40:29 crc kubenswrapper[4722]: I0309 14:40:29.267862 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0104a7c4-89e8-4e4e-a184-d514fb780bb0-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 14:40:29 crc kubenswrapper[4722]: I0309 14:40:29.268033 4722 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/0104a7c4-89e8-4e4e-a184-d514fb780bb0-inventory-0\") on node \"crc\" DevicePath \"\"" Mar 09 14:40:29 crc kubenswrapper[4722]: I0309 14:40:29.423887 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-hj7ts" event={"ID":"0104a7c4-89e8-4e4e-a184-d514fb780bb0","Type":"ContainerDied","Data":"292fae6be17c3356282799bdbe26b6e92e509f325c6b6a9f67a568a5f7f1694c"} Mar 09 14:40:29 crc kubenswrapper[4722]: I0309 14:40:29.423928 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="292fae6be17c3356282799bdbe26b6e92e509f325c6b6a9f67a568a5f7f1694c" Mar 09 14:40:29 crc kubenswrapper[4722]: I0309 14:40:29.423966 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-hj7ts" Mar 09 14:40:29 crc kubenswrapper[4722]: I0309 14:40:29.506446 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-7fht9"] Mar 09 14:40:29 crc kubenswrapper[4722]: E0309 14:40:29.507151 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0104a7c4-89e8-4e4e-a184-d514fb780bb0" containerName="ssh-known-hosts-edpm-deployment" Mar 09 14:40:29 crc kubenswrapper[4722]: I0309 14:40:29.507178 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="0104a7c4-89e8-4e4e-a184-d514fb780bb0" containerName="ssh-known-hosts-edpm-deployment" Mar 09 14:40:29 crc kubenswrapper[4722]: I0309 14:40:29.507610 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="0104a7c4-89e8-4e4e-a184-d514fb780bb0" containerName="ssh-known-hosts-edpm-deployment" Mar 09 14:40:29 crc kubenswrapper[4722]: I0309 14:40:29.509010 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7fht9" Mar 09 14:40:29 crc kubenswrapper[4722]: I0309 14:40:29.512042 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 14:40:29 crc kubenswrapper[4722]: I0309 14:40:29.512138 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 14:40:29 crc kubenswrapper[4722]: I0309 14:40:29.513589 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 14:40:29 crc kubenswrapper[4722]: I0309 14:40:29.514285 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dwlbg" Mar 09 14:40:29 crc kubenswrapper[4722]: I0309 14:40:29.521329 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-7fht9"] Mar 09 14:40:29 crc kubenswrapper[4722]: I0309 14:40:29.677465 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/36900a74-39f3-4977-9151-b3b4bdc64554-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7fht9\" (UID: \"36900a74-39f3-4977-9151-b3b4bdc64554\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7fht9" Mar 09 14:40:29 crc kubenswrapper[4722]: I0309 14:40:29.677863 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36900a74-39f3-4977-9151-b3b4bdc64554-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7fht9\" (UID: \"36900a74-39f3-4977-9151-b3b4bdc64554\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7fht9" Mar 09 14:40:29 crc kubenswrapper[4722]: I0309 14:40:29.678229 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jq6zv\" (UniqueName: \"kubernetes.io/projected/36900a74-39f3-4977-9151-b3b4bdc64554-kube-api-access-jq6zv\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7fht9\" (UID: \"36900a74-39f3-4977-9151-b3b4bdc64554\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7fht9" Mar 09 14:40:29 crc kubenswrapper[4722]: I0309 14:40:29.780391 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/36900a74-39f3-4977-9151-b3b4bdc64554-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7fht9\" (UID: \"36900a74-39f3-4977-9151-b3b4bdc64554\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7fht9" Mar 09 14:40:29 crc kubenswrapper[4722]: I0309 14:40:29.780544 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36900a74-39f3-4977-9151-b3b4bdc64554-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7fht9\" (UID: \"36900a74-39f3-4977-9151-b3b4bdc64554\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7fht9" Mar 09 14:40:29 crc kubenswrapper[4722]: I0309 14:40:29.780652 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jq6zv\" (UniqueName: \"kubernetes.io/projected/36900a74-39f3-4977-9151-b3b4bdc64554-kube-api-access-jq6zv\") pod 
\"run-os-edpm-deployment-openstack-edpm-ipam-7fht9\" (UID: \"36900a74-39f3-4977-9151-b3b4bdc64554\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7fht9" Mar 09 14:40:29 crc kubenswrapper[4722]: I0309 14:40:29.785744 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/36900a74-39f3-4977-9151-b3b4bdc64554-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7fht9\" (UID: \"36900a74-39f3-4977-9151-b3b4bdc64554\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7fht9" Mar 09 14:40:29 crc kubenswrapper[4722]: I0309 14:40:29.785813 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36900a74-39f3-4977-9151-b3b4bdc64554-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7fht9\" (UID: \"36900a74-39f3-4977-9151-b3b4bdc64554\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7fht9" Mar 09 14:40:29 crc kubenswrapper[4722]: I0309 14:40:29.803120 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jq6zv\" (UniqueName: \"kubernetes.io/projected/36900a74-39f3-4977-9151-b3b4bdc64554-kube-api-access-jq6zv\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-7fht9\" (UID: \"36900a74-39f3-4977-9151-b3b4bdc64554\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7fht9" Mar 09 14:40:29 crc kubenswrapper[4722]: I0309 14:40:29.827618 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7fht9" Mar 09 14:40:30 crc kubenswrapper[4722]: I0309 14:40:30.425797 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-7fht9"] Mar 09 14:40:30 crc kubenswrapper[4722]: W0309 14:40:30.431470 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod36900a74_39f3_4977_9151_b3b4bdc64554.slice/crio-2969150cbbf7d41ef0848fbbcf61f5bdfa1a3e11aac93e2df25147b369e8035b WatchSource:0}: Error finding container 2969150cbbf7d41ef0848fbbcf61f5bdfa1a3e11aac93e2df25147b369e8035b: Status 404 returned error can't find the container with id 2969150cbbf7d41ef0848fbbcf61f5bdfa1a3e11aac93e2df25147b369e8035b Mar 09 14:40:31 crc kubenswrapper[4722]: I0309 14:40:31.449135 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7fht9" event={"ID":"36900a74-39f3-4977-9151-b3b4bdc64554","Type":"ContainerStarted","Data":"f0eff49ab96437b87e5192666050f4a58866b0d186673365b90d11e7c5879770"} Mar 09 14:40:31 crc kubenswrapper[4722]: I0309 14:40:31.450017 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7fht9" event={"ID":"36900a74-39f3-4977-9151-b3b4bdc64554","Type":"ContainerStarted","Data":"2969150cbbf7d41ef0848fbbcf61f5bdfa1a3e11aac93e2df25147b369e8035b"} Mar 09 14:40:31 crc kubenswrapper[4722]: I0309 14:40:31.473414 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7fht9" podStartSLOduration=1.885252633 podStartE2EDuration="2.473385272s" podCreationTimestamp="2026-03-09 14:40:29 +0000 UTC" firstStartedPulling="2026-03-09 14:40:30.435030917 +0000 UTC m=+2270.990599493" lastFinishedPulling="2026-03-09 14:40:31.023163556 +0000 UTC m=+2271.578732132" 
observedRunningTime="2026-03-09 14:40:31.47114377 +0000 UTC m=+2272.026712346" watchObservedRunningTime="2026-03-09 14:40:31.473385272 +0000 UTC m=+2272.028953848" Mar 09 14:40:41 crc kubenswrapper[4722]: I0309 14:40:41.553637 4722 generic.go:334] "Generic (PLEG): container finished" podID="36900a74-39f3-4977-9151-b3b4bdc64554" containerID="f0eff49ab96437b87e5192666050f4a58866b0d186673365b90d11e7c5879770" exitCode=0 Mar 09 14:40:41 crc kubenswrapper[4722]: I0309 14:40:41.553727 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7fht9" event={"ID":"36900a74-39f3-4977-9151-b3b4bdc64554","Type":"ContainerDied","Data":"f0eff49ab96437b87e5192666050f4a58866b0d186673365b90d11e7c5879770"} Mar 09 14:40:43 crc kubenswrapper[4722]: I0309 14:40:43.138465 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7fht9" Mar 09 14:40:43 crc kubenswrapper[4722]: I0309 14:40:43.222602 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jq6zv\" (UniqueName: \"kubernetes.io/projected/36900a74-39f3-4977-9151-b3b4bdc64554-kube-api-access-jq6zv\") pod \"36900a74-39f3-4977-9151-b3b4bdc64554\" (UID: \"36900a74-39f3-4977-9151-b3b4bdc64554\") " Mar 09 14:40:43 crc kubenswrapper[4722]: I0309 14:40:43.222704 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/36900a74-39f3-4977-9151-b3b4bdc64554-ssh-key-openstack-edpm-ipam\") pod \"36900a74-39f3-4977-9151-b3b4bdc64554\" (UID: \"36900a74-39f3-4977-9151-b3b4bdc64554\") " Mar 09 14:40:43 crc kubenswrapper[4722]: I0309 14:40:43.222979 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36900a74-39f3-4977-9151-b3b4bdc64554-inventory\") pod \"36900a74-39f3-4977-9151-b3b4bdc64554\" (UID: \"36900a74-39f3-4977-9151-b3b4bdc64554\") " Mar 09 14:40:43 crc kubenswrapper[4722]: I0309 14:40:43.238426 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36900a74-39f3-4977-9151-b3b4bdc64554-kube-api-access-jq6zv" (OuterVolumeSpecName: "kube-api-access-jq6zv") pod "36900a74-39f3-4977-9151-b3b4bdc64554" (UID: "36900a74-39f3-4977-9151-b3b4bdc64554"). InnerVolumeSpecName "kube-api-access-jq6zv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:40:43 crc kubenswrapper[4722]: I0309 14:40:43.253870 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36900a74-39f3-4977-9151-b3b4bdc64554-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "36900a74-39f3-4977-9151-b3b4bdc64554" (UID: "36900a74-39f3-4977-9151-b3b4bdc64554"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:40:43 crc kubenswrapper[4722]: I0309 14:40:43.258084 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36900a74-39f3-4977-9151-b3b4bdc64554-inventory" (OuterVolumeSpecName: "inventory") pod "36900a74-39f3-4977-9151-b3b4bdc64554" (UID: "36900a74-39f3-4977-9151-b3b4bdc64554"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:40:43 crc kubenswrapper[4722]: I0309 14:40:43.326172 4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/36900a74-39f3-4977-9151-b3b4bdc64554-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 14:40:43 crc kubenswrapper[4722]: I0309 14:40:43.326223 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jq6zv\" (UniqueName: \"kubernetes.io/projected/36900a74-39f3-4977-9151-b3b4bdc64554-kube-api-access-jq6zv\") on node \"crc\" DevicePath \"\"" Mar 09 14:40:43 crc kubenswrapper[4722]: I0309 14:40:43.326257 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/36900a74-39f3-4977-9151-b3b4bdc64554-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 14:40:43 crc kubenswrapper[4722]: I0309 14:40:43.574969 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7fht9" event={"ID":"36900a74-39f3-4977-9151-b3b4bdc64554","Type":"ContainerDied","Data":"2969150cbbf7d41ef0848fbbcf61f5bdfa1a3e11aac93e2df25147b369e8035b"} Mar 09 14:40:43 crc kubenswrapper[4722]: I0309 14:40:43.575415 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2969150cbbf7d41ef0848fbbcf61f5bdfa1a3e11aac93e2df25147b369e8035b" Mar 09 14:40:43 crc kubenswrapper[4722]: I0309 14:40:43.575012 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-7fht9" Mar 09 14:40:43 crc kubenswrapper[4722]: I0309 14:40:43.658031 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2hn5t"] Mar 09 14:40:43 crc kubenswrapper[4722]: E0309 14:40:43.658617 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36900a74-39f3-4977-9151-b3b4bdc64554" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 09 14:40:43 crc kubenswrapper[4722]: I0309 14:40:43.658641 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="36900a74-39f3-4977-9151-b3b4bdc64554" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 09 14:40:43 crc kubenswrapper[4722]: I0309 14:40:43.658888 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="36900a74-39f3-4977-9151-b3b4bdc64554" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 09 14:40:43 crc kubenswrapper[4722]: I0309 14:40:43.659936 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2hn5t" Mar 09 14:40:43 crc kubenswrapper[4722]: I0309 14:40:43.662178 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 14:40:43 crc kubenswrapper[4722]: I0309 14:40:43.662466 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 14:40:43 crc kubenswrapper[4722]: I0309 14:40:43.662524 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dwlbg" Mar 09 14:40:43 crc kubenswrapper[4722]: I0309 14:40:43.663259 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 14:40:43 crc kubenswrapper[4722]: I0309 14:40:43.674582 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2hn5t"] Mar 09 14:40:43 crc kubenswrapper[4722]: I0309 14:40:43.735225 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsjvt\" (UniqueName: \"kubernetes.io/projected/cfb8fcd2-1bf9-44f5-a82c-6049951ea321-kube-api-access-wsjvt\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2hn5t\" (UID: \"cfb8fcd2-1bf9-44f5-a82c-6049951ea321\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2hn5t" Mar 09 14:40:43 crc kubenswrapper[4722]: I0309 14:40:43.735352 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cfb8fcd2-1bf9-44f5-a82c-6049951ea321-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2hn5t\" (UID: \"cfb8fcd2-1bf9-44f5-a82c-6049951ea321\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2hn5t" Mar 09 14:40:43 crc kubenswrapper[4722]: I0309 14:40:43.735442 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cfb8fcd2-1bf9-44f5-a82c-6049951ea321-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2hn5t\" (UID: \"cfb8fcd2-1bf9-44f5-a82c-6049951ea321\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2hn5t" Mar 09 14:40:43 crc kubenswrapper[4722]: I0309 14:40:43.837561 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsjvt\" (UniqueName: \"kubernetes.io/projected/cfb8fcd2-1bf9-44f5-a82c-6049951ea321-kube-api-access-wsjvt\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2hn5t\" (UID: \"cfb8fcd2-1bf9-44f5-a82c-6049951ea321\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2hn5t" Mar 09 14:40:43 crc kubenswrapper[4722]: I0309 14:40:43.837685 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cfb8fcd2-1bf9-44f5-a82c-6049951ea321-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2hn5t\" (UID: \"cfb8fcd2-1bf9-44f5-a82c-6049951ea321\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2hn5t" Mar 09 14:40:43 crc kubenswrapper[4722]: I0309 14:40:43.837769 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cfb8fcd2-1bf9-44f5-a82c-6049951ea321-inventory\") pod 
\"reboot-os-edpm-deployment-openstack-edpm-ipam-2hn5t\" (UID: \"cfb8fcd2-1bf9-44f5-a82c-6049951ea321\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2hn5t" Mar 09 14:40:43 crc kubenswrapper[4722]: I0309 14:40:43.843238 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cfb8fcd2-1bf9-44f5-a82c-6049951ea321-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2hn5t\" (UID: \"cfb8fcd2-1bf9-44f5-a82c-6049951ea321\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2hn5t" Mar 09 14:40:43 crc kubenswrapper[4722]: I0309 14:40:43.843524 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cfb8fcd2-1bf9-44f5-a82c-6049951ea321-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2hn5t\" (UID: \"cfb8fcd2-1bf9-44f5-a82c-6049951ea321\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2hn5t" Mar 09 14:40:43 crc kubenswrapper[4722]: I0309 14:40:43.870034 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsjvt\" (UniqueName: \"kubernetes.io/projected/cfb8fcd2-1bf9-44f5-a82c-6049951ea321-kube-api-access-wsjvt\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2hn5t\" (UID: \"cfb8fcd2-1bf9-44f5-a82c-6049951ea321\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2hn5t" Mar 09 14:40:43 crc kubenswrapper[4722]: I0309 14:40:43.977882 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2hn5t" Mar 09 14:40:44 crc kubenswrapper[4722]: W0309 14:40:44.515787 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcfb8fcd2_1bf9_44f5_a82c_6049951ea321.slice/crio-664576324d6250128b1746fb85ed6b0292ad0ecc15f88f36f0f27e4590a3e2d9 WatchSource:0}: Error finding container 664576324d6250128b1746fb85ed6b0292ad0ecc15f88f36f0f27e4590a3e2d9: Status 404 returned error can't find the container with id 664576324d6250128b1746fb85ed6b0292ad0ecc15f88f36f0f27e4590a3e2d9 Mar 09 14:40:44 crc kubenswrapper[4722]: I0309 14:40:44.522882 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2hn5t"] Mar 09 14:40:44 crc kubenswrapper[4722]: I0309 14:40:44.588309 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2hn5t" event={"ID":"cfb8fcd2-1bf9-44f5-a82c-6049951ea321","Type":"ContainerStarted","Data":"664576324d6250128b1746fb85ed6b0292ad0ecc15f88f36f0f27e4590a3e2d9"} Mar 09 14:40:45 crc kubenswrapper[4722]: I0309 14:40:45.600501 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2hn5t" event={"ID":"cfb8fcd2-1bf9-44f5-a82c-6049951ea321","Type":"ContainerStarted","Data":"b35dde210c35580c2d750eb78358b24c8bfd129f41e9d64586e489577e1aa85c"} Mar 09 14:40:45 crc kubenswrapper[4722]: I0309 14:40:45.626061 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2hn5t" podStartSLOduration=2.204278637 podStartE2EDuration="2.625480622s" podCreationTimestamp="2026-03-09 14:40:43 +0000 UTC" firstStartedPulling="2026-03-09 14:40:44.518415148 +0000 UTC m=+2285.073983724" lastFinishedPulling="2026-03-09 14:40:44.939617133 +0000 UTC 
m=+2285.495185709" observedRunningTime="2026-03-09 14:40:45.619772007 +0000 UTC m=+2286.175340583" watchObservedRunningTime="2026-03-09 14:40:45.625480622 +0000 UTC m=+2286.181049198" Mar 09 14:40:51 crc kubenswrapper[4722]: I0309 14:40:51.527651 4722 patch_prober.go:28] interesting pod/machine-config-daemon-hjrrb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 14:40:51 crc kubenswrapper[4722]: I0309 14:40:51.528283 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 14:40:54 crc kubenswrapper[4722]: I0309 14:40:54.687136 4722 generic.go:334] "Generic (PLEG): container finished" podID="cfb8fcd2-1bf9-44f5-a82c-6049951ea321" containerID="b35dde210c35580c2d750eb78358b24c8bfd129f41e9d64586e489577e1aa85c" exitCode=0 Mar 09 14:40:54 crc kubenswrapper[4722]: I0309 14:40:54.687238 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2hn5t" event={"ID":"cfb8fcd2-1bf9-44f5-a82c-6049951ea321","Type":"ContainerDied","Data":"b35dde210c35580c2d750eb78358b24c8bfd129f41e9d64586e489577e1aa85c"} Mar 09 14:40:56 crc kubenswrapper[4722]: I0309 14:40:56.197162 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2hn5t" Mar 09 14:40:56 crc kubenswrapper[4722]: I0309 14:40:56.242286 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cfb8fcd2-1bf9-44f5-a82c-6049951ea321-inventory\") pod \"cfb8fcd2-1bf9-44f5-a82c-6049951ea321\" (UID: \"cfb8fcd2-1bf9-44f5-a82c-6049951ea321\") " Mar 09 14:40:56 crc kubenswrapper[4722]: I0309 14:40:56.242716 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cfb8fcd2-1bf9-44f5-a82c-6049951ea321-ssh-key-openstack-edpm-ipam\") pod \"cfb8fcd2-1bf9-44f5-a82c-6049951ea321\" (UID: \"cfb8fcd2-1bf9-44f5-a82c-6049951ea321\") " Mar 09 14:40:56 crc kubenswrapper[4722]: I0309 14:40:56.242895 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wsjvt\" (UniqueName: \"kubernetes.io/projected/cfb8fcd2-1bf9-44f5-a82c-6049951ea321-kube-api-access-wsjvt\") pod \"cfb8fcd2-1bf9-44f5-a82c-6049951ea321\" (UID: \"cfb8fcd2-1bf9-44f5-a82c-6049951ea321\") " Mar 09 14:40:56 crc kubenswrapper[4722]: I0309 14:40:56.256664 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfb8fcd2-1bf9-44f5-a82c-6049951ea321-kube-api-access-wsjvt" (OuterVolumeSpecName: "kube-api-access-wsjvt") pod "cfb8fcd2-1bf9-44f5-a82c-6049951ea321" (UID: "cfb8fcd2-1bf9-44f5-a82c-6049951ea321"). InnerVolumeSpecName "kube-api-access-wsjvt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:40:56 crc kubenswrapper[4722]: I0309 14:40:56.278721 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfb8fcd2-1bf9-44f5-a82c-6049951ea321-inventory" (OuterVolumeSpecName: "inventory") pod "cfb8fcd2-1bf9-44f5-a82c-6049951ea321" (UID: "cfb8fcd2-1bf9-44f5-a82c-6049951ea321"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:40:56 crc kubenswrapper[4722]: I0309 14:40:56.279538 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cfb8fcd2-1bf9-44f5-a82c-6049951ea321-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "cfb8fcd2-1bf9-44f5-a82c-6049951ea321" (UID: "cfb8fcd2-1bf9-44f5-a82c-6049951ea321"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:40:56 crc kubenswrapper[4722]: I0309 14:40:56.345738 4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cfb8fcd2-1bf9-44f5-a82c-6049951ea321-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 14:40:56 crc kubenswrapper[4722]: I0309 14:40:56.345766 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cfb8fcd2-1bf9-44f5-a82c-6049951ea321-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 14:40:56 crc kubenswrapper[4722]: I0309 14:40:56.345777 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wsjvt\" (UniqueName: \"kubernetes.io/projected/cfb8fcd2-1bf9-44f5-a82c-6049951ea321-kube-api-access-wsjvt\") on node \"crc\" DevicePath \"\"" Mar 09 14:40:56 crc kubenswrapper[4722]: I0309 14:40:56.716064 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2hn5t" event={"ID":"cfb8fcd2-1bf9-44f5-a82c-6049951ea321","Type":"ContainerDied","Data":"664576324d6250128b1746fb85ed6b0292ad0ecc15f88f36f0f27e4590a3e2d9"} Mar 09 14:40:56 crc kubenswrapper[4722]: I0309 14:40:56.716123 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="664576324d6250128b1746fb85ed6b0292ad0ecc15f88f36f0f27e4590a3e2d9" Mar 09 14:40:56 crc kubenswrapper[4722]: I0309 14:40:56.716142 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2hn5t" Mar 09 14:40:56 crc kubenswrapper[4722]: I0309 14:40:56.828375 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh"] Mar 09 14:40:56 crc kubenswrapper[4722]: E0309 14:40:56.828974 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfb8fcd2-1bf9-44f5-a82c-6049951ea321" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 09 14:40:56 crc kubenswrapper[4722]: I0309 14:40:56.828990 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfb8fcd2-1bf9-44f5-a82c-6049951ea321" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 09 14:40:56 crc kubenswrapper[4722]: I0309 14:40:56.829314 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfb8fcd2-1bf9-44f5-a82c-6049951ea321" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 09 14:40:56 crc kubenswrapper[4722]: I0309 14:40:56.830473 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh" Mar 09 14:40:56 crc kubenswrapper[4722]: I0309 14:40:56.835861 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Mar 09 14:40:56 crc kubenswrapper[4722]: I0309 14:40:56.836289 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 14:40:56 crc kubenswrapper[4722]: I0309 14:40:56.836822 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 14:40:56 crc kubenswrapper[4722]: I0309 14:40:56.836859 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Mar 09 14:40:56 crc kubenswrapper[4722]: I0309 14:40:56.837005 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Mar 09 14:40:56 crc kubenswrapper[4722]: I0309 14:40:56.837428 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dwlbg" Mar 09 14:40:56 crc kubenswrapper[4722]: I0309 14:40:56.837606 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0" Mar 09 14:40:56 crc kubenswrapper[4722]: I0309 14:40:56.837764 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 14:40:56 crc kubenswrapper[4722]: I0309 14:40:56.838766 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Mar 09 14:40:56 crc kubenswrapper[4722]: I0309 14:40:56.842067 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh"] Mar 09 14:40:56 crc kubenswrapper[4722]: I0309 14:40:56.968306 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e37df29-c0ce-40d1-bd50-50936c544bb0-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh\" (UID: \"7e37df29-c0ce-40d1-bd50-50936c544bb0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh" Mar 09 14:40:56 crc kubenswrapper[4722]: I0309 14:40:56.968353 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e37df29-c0ce-40d1-bd50-50936c544bb0-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh\" (UID: \"7e37df29-c0ce-40d1-bd50-50936c544bb0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh" Mar 09 14:40:56 crc kubenswrapper[4722]: I0309 14:40:56.968423 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e37df29-c0ce-40d1-bd50-50936c544bb0-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh\" (UID: \"7e37df29-c0ce-40d1-bd50-50936c544bb0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh" Mar 09 14:40:56 crc kubenswrapper[4722]: I0309 14:40:56.968448 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e37df29-c0ce-40d1-bd50-50936c544bb0-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh\" (UID: \"7e37df29-c0ce-40d1-bd50-50936c544bb0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh" Mar 09 14:40:56 crc kubenswrapper[4722]: I0309 14:40:56.968472 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e37df29-c0ce-40d1-bd50-50936c544bb0-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh\" (UID: \"7e37df29-c0ce-40d1-bd50-50936c544bb0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh" Mar 09 14:40:56 crc kubenswrapper[4722]: I0309 14:40:56.968645 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e37df29-c0ce-40d1-bd50-50936c544bb0-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh\" (UID: \"7e37df29-c0ce-40d1-bd50-50936c544bb0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh" Mar 09 14:40:56 crc kubenswrapper[4722]: I0309 14:40:56.968828 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e37df29-c0ce-40d1-bd50-50936c544bb0-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh\" (UID: \"7e37df29-c0ce-40d1-bd50-50936c544bb0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh" Mar 09 14:40:56 crc kubenswrapper[4722]: I0309 14:40:56.968871 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vndpj\" (UniqueName: \"kubernetes.io/projected/7e37df29-c0ce-40d1-bd50-50936c544bb0-kube-api-access-vndpj\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh\" (UID: \"7e37df29-c0ce-40d1-bd50-50936c544bb0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh" Mar 09 14:40:56 crc kubenswrapper[4722]: I0309 14:40:56.969066 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e37df29-c0ce-40d1-bd50-50936c544bb0-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh\" (UID: \"7e37df29-c0ce-40d1-bd50-50936c544bb0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh" Mar 09 14:40:56 crc kubenswrapper[4722]: I0309 14:40:56.969246 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e37df29-c0ce-40d1-bd50-50936c544bb0-telemetry-power-monitoring-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh\" (UID: \"7e37df29-c0ce-40d1-bd50-50936c544bb0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh" Mar 09 14:40:56 crc kubenswrapper[4722]: I0309 14:40:56.969349 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/7e37df29-c0ce-40d1-bd50-50936c544bb0-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh\" (UID: \"7e37df29-c0ce-40d1-bd50-50936c544bb0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh" Mar 09 14:40:56 crc kubenswrapper[4722]: I0309 14:40:56.969414 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e37df29-c0ce-40d1-bd50-50936c544bb0-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh\" (UID: \"7e37df29-c0ce-40d1-bd50-50936c544bb0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh" Mar 09 14:40:56 crc kubenswrapper[4722]: I0309 14:40:56.969521 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e37df29-c0ce-40d1-bd50-50936c544bb0-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh\" (UID: \"7e37df29-c0ce-40d1-bd50-50936c544bb0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh" Mar 09 14:40:56 crc kubenswrapper[4722]: I0309 14:40:56.969632 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7e37df29-c0ce-40d1-bd50-50936c544bb0-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh\" (UID: \"7e37df29-c0ce-40d1-bd50-50936c544bb0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh" Mar 09 14:40:56 crc kubenswrapper[4722]: I0309 14:40:56.969707 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e37df29-c0ce-40d1-bd50-50936c544bb0-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh\" (UID: \"7e37df29-c0ce-40d1-bd50-50936c544bb0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh" Mar 09 14:40:56 crc kubenswrapper[4722]: I0309 14:40:56.969856 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e37df29-c0ce-40d1-bd50-50936c544bb0-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh\" (UID: \"7e37df29-c0ce-40d1-bd50-50936c544bb0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh" Mar 09 14:40:57 crc kubenswrapper[4722]: I0309 14:40:57.071694 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e37df29-c0ce-40d1-bd50-50936c544bb0-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh\" (UID: \"7e37df29-c0ce-40d1-bd50-50936c544bb0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh" Mar 09 14:40:57 crc kubenswrapper[4722]: I0309 14:40:57.071758 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7e37df29-c0ce-40d1-bd50-50936c544bb0-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh\" (UID: \"7e37df29-c0ce-40d1-bd50-50936c544bb0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh" Mar 09 14:40:57 crc kubenswrapper[4722]: I0309 14:40:57.072798 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e37df29-c0ce-40d1-bd50-50936c544bb0-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh\" (UID: \"7e37df29-c0ce-40d1-bd50-50936c544bb0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh" Mar 09 14:40:57 crc kubenswrapper[4722]: I0309 14:40:57.072837 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e37df29-c0ce-40d1-bd50-50936c544bb0-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh\" (UID: \"7e37df29-c0ce-40d1-bd50-50936c544bb0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh" Mar 09 14:40:57 crc kubenswrapper[4722]: I0309 14:40:57.072909 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e37df29-c0ce-40d1-bd50-50936c544bb0-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh\" (UID: \"7e37df29-c0ce-40d1-bd50-50936c544bb0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh" Mar 09 14:40:57 crc kubenswrapper[4722]: I0309 14:40:57.072946 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e37df29-c0ce-40d1-bd50-50936c544bb0-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh\" (UID: \"7e37df29-c0ce-40d1-bd50-50936c544bb0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh" Mar 09 14:40:57 crc kubenswrapper[4722]: I0309 14:40:57.073429 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e37df29-c0ce-40d1-bd50-50936c544bb0-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh\" (UID: \"7e37df29-c0ce-40d1-bd50-50936c544bb0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh" Mar 09 14:40:57 crc kubenswrapper[4722]: I0309 14:40:57.073458 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vndpj\" (UniqueName: \"kubernetes.io/projected/7e37df29-c0ce-40d1-bd50-50936c544bb0-kube-api-access-vndpj\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh\" (UID: \"7e37df29-c0ce-40d1-bd50-50936c544bb0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh" Mar 09 14:40:57 crc kubenswrapper[4722]: I0309 14:40:57.073532 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e37df29-c0ce-40d1-bd50-50936c544bb0-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh\" (UID: \"7e37df29-c0ce-40d1-bd50-50936c544bb0\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh" Mar 09 14:40:57 crc kubenswrapper[4722]: I0309 14:40:57.073579 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e37df29-c0ce-40d1-bd50-50936c544bb0-telemetry-power-monitoring-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh\" (UID: \"7e37df29-c0ce-40d1-bd50-50936c544bb0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh" Mar 09 14:40:57 crc kubenswrapper[4722]: I0309 14:40:57.073619 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e37df29-c0ce-40d1-bd50-50936c544bb0-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh\" (UID: \"7e37df29-c0ce-40d1-bd50-50936c544bb0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh" Mar 09 14:40:57 crc kubenswrapper[4722]: I0309 14:40:57.073651 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e37df29-c0ce-40d1-bd50-50936c544bb0-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh\" (UID: \"7e37df29-c0ce-40d1-bd50-50936c544bb0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh" Mar 09 14:40:57 crc kubenswrapper[4722]: I0309 14:40:57.073691 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e37df29-c0ce-40d1-bd50-50936c544bb0-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh\" (UID: \"7e37df29-c0ce-40d1-bd50-50936c544bb0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh" Mar 09 14:40:57 crc kubenswrapper[4722]: I0309 14:40:57.073728 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7e37df29-c0ce-40d1-bd50-50936c544bb0-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh\" (UID: \"7e37df29-c0ce-40d1-bd50-50936c544bb0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh" Mar 09 14:40:57 crc kubenswrapper[4722]: I0309 14:40:57.073760 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e37df29-c0ce-40d1-bd50-50936c544bb0-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh\" (UID: \"7e37df29-c0ce-40d1-bd50-50936c544bb0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh" Mar 09 14:40:57 crc kubenswrapper[4722]: I0309 14:40:57.073813 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e37df29-c0ce-40d1-bd50-50936c544bb0-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh\" (UID: \"7e37df29-c0ce-40d1-bd50-50936c544bb0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh" Mar 09 14:40:57 crc kubenswrapper[4722]: I0309 
Mar 09 14:40:57 crc kubenswrapper[4722]: I0309 14:40:57.078946 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e37df29-c0ce-40d1-bd50-50936c544bb0-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh\" (UID: \"7e37df29-c0ce-40d1-bd50-50936c544bb0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh"
Mar 09 14:40:57 crc kubenswrapper[4722]: I0309 14:40:57.079442 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e37df29-c0ce-40d1-bd50-50936c544bb0-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh\" (UID: \"7e37df29-c0ce-40d1-bd50-50936c544bb0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh"
Mar 09 14:40:57 crc kubenswrapper[4722]: I0309 14:40:57.080325 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e37df29-c0ce-40d1-bd50-50936c544bb0-telemetry-power-monitoring-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh\" (UID: \"7e37df29-c0ce-40d1-bd50-50936c544bb0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh"
Mar 09 14:40:57 crc kubenswrapper[4722]: I0309 14:40:57.080898 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e37df29-c0ce-40d1-bd50-50936c544bb0-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh\" (UID: \"7e37df29-c0ce-40d1-bd50-50936c544bb0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh"
Mar 09 14:40:57 crc kubenswrapper[4722]: I0309 14:40:57.081364 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e37df29-c0ce-40d1-bd50-50936c544bb0-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh\" (UID: \"7e37df29-c0ce-40d1-bd50-50936c544bb0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh"
Mar 09 14:40:57 crc kubenswrapper[4722]: I0309 14:40:57.081933 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e37df29-c0ce-40d1-bd50-50936c544bb0-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh\" (UID: \"7e37df29-c0ce-40d1-bd50-50936c544bb0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh"
Mar 09 14:40:57 crc kubenswrapper[4722]: I0309 14:40:57.082385 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7e37df29-c0ce-40d1-bd50-50936c544bb0-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh\" (UID: \"7e37df29-c0ce-40d1-bd50-50936c544bb0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh"
Mar 09 14:40:57 crc kubenswrapper[4722]: I0309 14:40:57.082527 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e37df29-c0ce-40d1-bd50-50936c544bb0-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh\" (UID: \"7e37df29-c0ce-40d1-bd50-50936c544bb0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh"
Mar 09 14:40:57 crc kubenswrapper[4722]: I0309 14:40:57.082672 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e37df29-c0ce-40d1-bd50-50936c544bb0-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh\" (UID: \"7e37df29-c0ce-40d1-bd50-50936c544bb0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh"
Mar 09 14:40:57 crc kubenswrapper[4722]: I0309 14:40:57.084734 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e37df29-c0ce-40d1-bd50-50936c544bb0-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh\" (UID: \"7e37df29-c0ce-40d1-bd50-50936c544bb0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh"
Mar 09 14:40:57 crc kubenswrapper[4722]: I0309 14:40:57.086151 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e37df29-c0ce-40d1-bd50-50936c544bb0-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh\" (UID: \"7e37df29-c0ce-40d1-bd50-50936c544bb0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh"
Mar 09 14:40:57 crc kubenswrapper[4722]: I0309 14:40:57.087364 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e37df29-c0ce-40d1-bd50-50936c544bb0-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh\" (UID: \"7e37df29-c0ce-40d1-bd50-50936c544bb0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh"
Mar 09 14:40:57 crc kubenswrapper[4722]: I0309 14:40:57.089759 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e37df29-c0ce-40d1-bd50-50936c544bb0-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh\" (UID: \"7e37df29-c0ce-40d1-bd50-50936c544bb0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh"
Mar 09 14:40:57 crc kubenswrapper[4722]: I0309 14:40:57.091175 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e37df29-c0ce-40d1-bd50-50936c544bb0-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh\" (UID: \"7e37df29-c0ce-40d1-bd50-50936c544bb0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh"
Mar 09 14:40:57 crc kubenswrapper[4722]: I0309 14:40:57.091626 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e37df29-c0ce-40d1-bd50-50936c544bb0-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh\" (UID: \"7e37df29-c0ce-40d1-bd50-50936c544bb0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh"
Mar 09 14:40:57 crc kubenswrapper[4722]: I0309 14:40:57.097813 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vndpj\" (UniqueName: \"kubernetes.io/projected/7e37df29-c0ce-40d1-bd50-50936c544bb0-kube-api-access-vndpj\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh\" (UID: \"7e37df29-c0ce-40d1-bd50-50936c544bb0\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh"
Mar 09 14:40:57 crc kubenswrapper[4722]: I0309 14:40:57.153150 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh"
Mar 09 14:40:57 crc kubenswrapper[4722]: W0309 14:40:57.726045 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e37df29_c0ce_40d1_bd50_50936c544bb0.slice/crio-6a2e755093f92342e5349892ae250c885f416fd8c29c9dbdc116a0de55b8ad40 WatchSource:0}: Error finding container 6a2e755093f92342e5349892ae250c885f416fd8c29c9dbdc116a0de55b8ad40: Status 404 returned error can't find the container with id 6a2e755093f92342e5349892ae250c885f416fd8c29c9dbdc116a0de55b8ad40
Mar 09 14:40:57 crc kubenswrapper[4722]: I0309 14:40:57.730767 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh"]
Mar 09 14:40:58 crc kubenswrapper[4722]: I0309 14:40:58.739024 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh" event={"ID":"7e37df29-c0ce-40d1-bd50-50936c544bb0","Type":"ContainerStarted","Data":"e5eb1f7ccec345efed1b67a86b43e0fd13f8eed47b874e832784af163d054ef5"}
Mar 09 14:40:58 crc kubenswrapper[4722]: I0309 14:40:58.739426 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh" event={"ID":"7e37df29-c0ce-40d1-bd50-50936c544bb0","Type":"ContainerStarted","Data":"6a2e755093f92342e5349892ae250c885f416fd8c29c9dbdc116a0de55b8ad40"}
Mar 09 14:40:58 crc kubenswrapper[4722]: I0309 14:40:58.768527 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh" podStartSLOduration=2.309128593 podStartE2EDuration="2.768504997s" podCreationTimestamp="2026-03-09 14:40:56 +0000 UTC" firstStartedPulling="2026-03-09 14:40:57.733071732 +0000 UTC m=+2298.288640298" lastFinishedPulling="2026-03-09 14:40:58.192448126 +0000 UTC m=+2298.748016702" observedRunningTime="2026-03-09 14:40:58.763541482 +0000 UTC m=+2299.319110058" watchObservedRunningTime="2026-03-09 14:40:58.768504997 +0000 UTC m=+2299.324073573"
Mar 09 14:41:01 crc kubenswrapper[4722]: I0309 14:41:01.073718 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-cf85b"]
Mar 09 14:41:01 crc kubenswrapper[4722]: I0309 14:41:01.083509 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-cf85b"]
Mar 09 14:41:02 crc kubenswrapper[4722]: I0309 14:41:02.163590 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c83de432-8e6a-45bf-9395-215f28461090" path="/var/lib/kubelet/pods/c83de432-8e6a-45bf-9395-215f28461090/volumes"
Mar 09 14:41:21 crc kubenswrapper[4722]: I0309 14:41:21.528493 4722 patch_prober.go:28] interesting pod/machine-config-daemon-hjrrb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 14:41:21 crc kubenswrapper[4722]: I0309 14:41:21.529290 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 14:41:21 crc kubenswrapper[4722]: I0309 14:41:21.529348 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb"
Mar 09 14:41:21 crc kubenswrapper[4722]: I0309 14:41:21.530236 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"931310cf17f93e3962cdbbdf531169aefb3915147cff22375dd9c06d9d6b2906"} pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 09 14:41:21 crc kubenswrapper[4722]: I0309 14:41:21.530309 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerName="machine-config-daemon" containerID="cri-o://931310cf17f93e3962cdbbdf531169aefb3915147cff22375dd9c06d9d6b2906" gracePeriod=600
Mar 09 14:41:21 crc kubenswrapper[4722]: E0309 14:41:21.653826 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648"
Mar 09 14:41:21 crc kubenswrapper[4722]: I0309 14:41:21.973303 4722 generic.go:334] "Generic (PLEG): container finished" podID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerID="931310cf17f93e3962cdbbdf531169aefb3915147cff22375dd9c06d9d6b2906" exitCode=0
Mar 09 14:41:21 crc kubenswrapper[4722]: I0309 14:41:21.973436 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" event={"ID":"dac2aaf5-653b-4b2a-8efe-ed26bac8d648","Type":"ContainerDied","Data":"931310cf17f93e3962cdbbdf531169aefb3915147cff22375dd9c06d9d6b2906"}
Mar 09 14:41:21 crc kubenswrapper[4722]: I0309 14:41:21.973677 4722 scope.go:117] "RemoveContainer" containerID="4efa58158532ee2b76f83ca6efe53930357d25040e1664917766707f5e03ced2"
Mar 09 14:41:21 crc kubenswrapper[4722]: I0309 14:41:21.974811 4722 scope.go:117] "RemoveContainer" containerID="931310cf17f93e3962cdbbdf531169aefb3915147cff22375dd9c06d9d6b2906"
Mar 09 14:41:21 crc kubenswrapper[4722]: E0309 14:41:21.975519 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648"
"RemoveContainer" containerID="034af84defe4bf2ca490931a683a7b4b7d083656e7d3a918c374e1b08704433a" Mar 09 14:41:34 crc kubenswrapper[4722]: I0309 14:41:34.149287 4722 scope.go:117] "RemoveContainer" containerID="931310cf17f93e3962cdbbdf531169aefb3915147cff22375dd9c06d9d6b2906" Mar 09 14:41:34 crc kubenswrapper[4722]: E0309 14:41:34.150142 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 14:41:48 crc kubenswrapper[4722]: I0309 14:41:48.151139 4722 scope.go:117] "RemoveContainer" containerID="931310cf17f93e3962cdbbdf531169aefb3915147cff22375dd9c06d9d6b2906" Mar 09 14:41:48 crc kubenswrapper[4722]: E0309 14:41:48.152054 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 14:41:50 crc kubenswrapper[4722]: I0309 14:41:50.047679 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-qn7dx"] Mar 09 14:41:50 crc kubenswrapper[4722]: I0309 14:41:50.058791 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-qn7dx"] Mar 09 14:41:50 crc kubenswrapper[4722]: I0309 14:41:50.164543 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40e6bb4d-411c-43c2-9959-0d1b9e005a11" path="/var/lib/kubelet/pods/40e6bb4d-411c-43c2-9959-0d1b9e005a11/volumes" Mar 09 14:41:50 crc kubenswrapper[4722]: I0309 14:41:50.279985 4722 generic.go:334] "Generic (PLEG): container finished" podID="7e37df29-c0ce-40d1-bd50-50936c544bb0" containerID="e5eb1f7ccec345efed1b67a86b43e0fd13f8eed47b874e832784af163d054ef5" exitCode=0 Mar 09 14:41:50 crc kubenswrapper[4722]: I0309 14:41:50.280057 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh" event={"ID":"7e37df29-c0ce-40d1-bd50-50936c544bb0","Type":"ContainerDied","Data":"e5eb1f7ccec345efed1b67a86b43e0fd13f8eed47b874e832784af163d054ef5"} Mar 09 14:41:51 crc kubenswrapper[4722]: I0309 14:41:51.779715 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh" Mar 09 14:41:51 crc kubenswrapper[4722]: I0309 14:41:51.862347 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e37df29-c0ce-40d1-bd50-50936c544bb0-ovn-combined-ca-bundle\") pod \"7e37df29-c0ce-40d1-bd50-50936c544bb0\" (UID: \"7e37df29-c0ce-40d1-bd50-50936c544bb0\") " Mar 09 14:41:51 crc kubenswrapper[4722]: I0309 14:41:51.862403 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e37df29-c0ce-40d1-bd50-50936c544bb0-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"7e37df29-c0ce-40d1-bd50-50936c544bb0\" (UID: \"7e37df29-c0ce-40d1-bd50-50936c544bb0\") " Mar 09 14:41:51 crc kubenswrapper[4722]: I0309 14:41:51.862434 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e37df29-c0ce-40d1-bd50-50936c544bb0-nova-combined-ca-bundle\") pod \"7e37df29-c0ce-40d1-bd50-50936c544bb0\" (UID: \"7e37df29-c0ce-40d1-bd50-50936c544bb0\") " Mar 09 14:41:51 crc kubenswrapper[4722]: I0309 14:41:51.862463 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e37df29-c0ce-40d1-bd50-50936c544bb0-bootstrap-combined-ca-bundle\") pod \"7e37df29-c0ce-40d1-bd50-50936c544bb0\" (UID: \"7e37df29-c0ce-40d1-bd50-50936c544bb0\") " Mar 09 14:41:51 crc kubenswrapper[4722]: I0309 14:41:51.862487 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e37df29-c0ce-40d1-bd50-50936c544bb0-telemetry-combined-ca-bundle\") pod \"7e37df29-c0ce-40d1-bd50-50936c544bb0\" (UID: \"7e37df29-c0ce-40d1-bd50-50936c544bb0\") " Mar 09 14:41:51 crc kubenswrapper[4722]: I0309 14:41:51.868698 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e37df29-c0ce-40d1-bd50-50936c544bb0-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "7e37df29-c0ce-40d1-bd50-50936c544bb0" (UID: "7e37df29-c0ce-40d1-bd50-50936c544bb0"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:41:51 crc kubenswrapper[4722]: I0309 14:41:51.870080 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e37df29-c0ce-40d1-bd50-50936c544bb0-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "7e37df29-c0ce-40d1-bd50-50936c544bb0" (UID: "7e37df29-c0ce-40d1-bd50-50936c544bb0"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:41:51 crc kubenswrapper[4722]: I0309 14:41:51.870529 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e37df29-c0ce-40d1-bd50-50936c544bb0-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "7e37df29-c0ce-40d1-bd50-50936c544bb0" (UID: "7e37df29-c0ce-40d1-bd50-50936c544bb0"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:41:51 crc kubenswrapper[4722]: I0309 14:41:51.870574 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e37df29-c0ce-40d1-bd50-50936c544bb0-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "7e37df29-c0ce-40d1-bd50-50936c544bb0" (UID: "7e37df29-c0ce-40d1-bd50-50936c544bb0"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:41:51 crc kubenswrapper[4722]: I0309 14:41:51.881737 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e37df29-c0ce-40d1-bd50-50936c544bb0-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "7e37df29-c0ce-40d1-bd50-50936c544bb0" (UID: "7e37df29-c0ce-40d1-bd50-50936c544bb0"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:41:51 crc kubenswrapper[4722]: I0309 14:41:51.963971 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e37df29-c0ce-40d1-bd50-50936c544bb0-openstack-edpm-ipam-ovn-default-certs-0\") pod \"7e37df29-c0ce-40d1-bd50-50936c544bb0\" (UID: \"7e37df29-c0ce-40d1-bd50-50936c544bb0\") " Mar 09 14:41:51 crc kubenswrapper[4722]: I0309 14:41:51.964040 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e37df29-c0ce-40d1-bd50-50936c544bb0-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"7e37df29-c0ce-40d1-bd50-50936c544bb0\" (UID: \"7e37df29-c0ce-40d1-bd50-50936c544bb0\") " Mar 09 14:41:51 crc kubenswrapper[4722]: I0309 14:41:51.964069 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e37df29-c0ce-40d1-bd50-50936c544bb0-neutron-metadata-combined-ca-bundle\") pod \"7e37df29-c0ce-40d1-bd50-50936c544bb0\" (UID: \"7e37df29-c0ce-40d1-bd50-50936c544bb0\") " Mar 09 14:41:51 crc kubenswrapper[4722]: I0309 14:41:51.964097 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7e37df29-c0ce-40d1-bd50-50936c544bb0-ssh-key-openstack-edpm-ipam\") pod \"7e37df29-c0ce-40d1-bd50-50936c544bb0\" (UID: \"7e37df29-c0ce-40d1-bd50-50936c544bb0\") " Mar 09 14:41:51 crc kubenswrapper[4722]: I0309 14:41:51.964632 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e37df29-c0ce-40d1-bd50-50936c544bb0-repo-setup-combined-ca-bundle\") pod \"7e37df29-c0ce-40d1-bd50-50936c544bb0\" (UID: \"7e37df29-c0ce-40d1-bd50-50936c544bb0\") " Mar 09 14:41:51 crc kubenswrapper[4722]: I0309 14:41:51.964719 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vndpj\" (UniqueName: \"kubernetes.io/projected/7e37df29-c0ce-40d1-bd50-50936c544bb0-kube-api-access-vndpj\") pod \"7e37df29-c0ce-40d1-bd50-50936c544bb0\" (UID: \"7e37df29-c0ce-40d1-bd50-50936c544bb0\") " Mar 09 14:41:51 crc kubenswrapper[4722]: I0309 14:41:51.964747 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e37df29-c0ce-40d1-bd50-50936c544bb0-telemetry-power-monitoring-combined-ca-bundle\") pod \"7e37df29-c0ce-40d1-bd50-50936c544bb0\" (UID: \"7e37df29-c0ce-40d1-bd50-50936c544bb0\") " Mar 09 14:41:51 crc kubenswrapper[4722]: I0309 14:41:51.964772 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e37df29-c0ce-40d1-bd50-50936c544bb0-inventory\") pod \"7e37df29-c0ce-40d1-bd50-50936c544bb0\" (UID: \"7e37df29-c0ce-40d1-bd50-50936c544bb0\") " Mar 09 14:41:51 crc kubenswrapper[4722]: I0309 14:41:51.964792 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e37df29-c0ce-40d1-bd50-50936c544bb0-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"7e37df29-c0ce-40d1-bd50-50936c544bb0\" (UID: \"7e37df29-c0ce-40d1-bd50-50936c544bb0\") " Mar 09 14:41:51 crc kubenswrapper[4722]: I0309 14:41:51.964813 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e37df29-c0ce-40d1-bd50-50936c544bb0-libvirt-combined-ca-bundle\") pod \"7e37df29-c0ce-40d1-bd50-50936c544bb0\" (UID: \"7e37df29-c0ce-40d1-bd50-50936c544bb0\") " Mar 09 14:41:51 crc kubenswrapper[4722]: I0309 14:41:51.964833 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e37df29-c0ce-40d1-bd50-50936c544bb0-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"7e37df29-c0ce-40d1-bd50-50936c544bb0\" (UID: \"7e37df29-c0ce-40d1-bd50-50936c544bb0\") " Mar 09 14:41:51 crc kubenswrapper[4722]: I0309 14:41:51.965403 4722 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e37df29-c0ce-40d1-bd50-50936c544bb0-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:41:51 crc kubenswrapper[4722]: I0309 14:41:51.965442 4722 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e37df29-c0ce-40d1-bd50-50936c544bb0-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 09 14:41:51 crc kubenswrapper[4722]: I0309 14:41:51.965452 4722 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e37df29-c0ce-40d1-bd50-50936c544bb0-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:41:51 crc kubenswrapper[4722]: I0309 14:41:51.965464 4722 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e37df29-c0ce-40d1-bd50-50936c544bb0-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:41:51 crc kubenswrapper[4722]: I0309 14:41:51.965477 4722 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e37df29-c0ce-40d1-bd50-50936c544bb0-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:41:51 crc kubenswrapper[4722]: I0309 14:41:51.969042 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e37df29-c0ce-40d1-bd50-50936c544bb0-repo-setup-combined-ca-bundle" 
(OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "7e37df29-c0ce-40d1-bd50-50936c544bb0" (UID: "7e37df29-c0ce-40d1-bd50-50936c544bb0"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:41:51 crc kubenswrapper[4722]: I0309 14:41:51.969658 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e37df29-c0ce-40d1-bd50-50936c544bb0-telemetry-power-monitoring-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-power-monitoring-combined-ca-bundle") pod "7e37df29-c0ce-40d1-bd50-50936c544bb0" (UID: "7e37df29-c0ce-40d1-bd50-50936c544bb0"). InnerVolumeSpecName "telemetry-power-monitoring-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:41:51 crc kubenswrapper[4722]: I0309 14:41:51.969748 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e37df29-c0ce-40d1-bd50-50936c544bb0-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "7e37df29-c0ce-40d1-bd50-50936c544bb0" (UID: "7e37df29-c0ce-40d1-bd50-50936c544bb0"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:41:51 crc kubenswrapper[4722]: I0309 14:41:51.970743 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e37df29-c0ce-40d1-bd50-50936c544bb0-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "7e37df29-c0ce-40d1-bd50-50936c544bb0" (UID: "7e37df29-c0ce-40d1-bd50-50936c544bb0"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:41:51 crc kubenswrapper[4722]: I0309 14:41:51.970775 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e37df29-c0ce-40d1-bd50-50936c544bb0-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0") pod "7e37df29-c0ce-40d1-bd50-50936c544bb0" (UID: "7e37df29-c0ce-40d1-bd50-50936c544bb0"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:41:51 crc kubenswrapper[4722]: I0309 14:41:51.970965 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e37df29-c0ce-40d1-bd50-50936c544bb0-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "7e37df29-c0ce-40d1-bd50-50936c544bb0" (UID: "7e37df29-c0ce-40d1-bd50-50936c544bb0"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:41:51 crc kubenswrapper[4722]: I0309 14:41:51.972355 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e37df29-c0ce-40d1-bd50-50936c544bb0-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "7e37df29-c0ce-40d1-bd50-50936c544bb0" (UID: "7e37df29-c0ce-40d1-bd50-50936c544bb0"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:41:51 crc kubenswrapper[4722]: I0309 14:41:51.973228 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e37df29-c0ce-40d1-bd50-50936c544bb0-kube-api-access-vndpj" (OuterVolumeSpecName: "kube-api-access-vndpj") pod "7e37df29-c0ce-40d1-bd50-50936c544bb0" (UID: "7e37df29-c0ce-40d1-bd50-50936c544bb0"). InnerVolumeSpecName "kube-api-access-vndpj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:41:51 crc kubenswrapper[4722]: I0309 14:41:51.973573 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e37df29-c0ce-40d1-bd50-50936c544bb0-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "7e37df29-c0ce-40d1-bd50-50936c544bb0" (UID: "7e37df29-c0ce-40d1-bd50-50936c544bb0"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:41:52 crc kubenswrapper[4722]: I0309 14:41:52.012116 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e37df29-c0ce-40d1-bd50-50936c544bb0-inventory" (OuterVolumeSpecName: "inventory") pod "7e37df29-c0ce-40d1-bd50-50936c544bb0" (UID: "7e37df29-c0ce-40d1-bd50-50936c544bb0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:41:52 crc kubenswrapper[4722]: I0309 14:41:52.022546 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e37df29-c0ce-40d1-bd50-50936c544bb0-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7e37df29-c0ce-40d1-bd50-50936c544bb0" (UID: "7e37df29-c0ce-40d1-bd50-50936c544bb0"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:41:52 crc kubenswrapper[4722]: I0309 14:41:52.067996 4722 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e37df29-c0ce-40d1-bd50-50936c544bb0-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 09 14:41:52 crc kubenswrapper[4722]: I0309 14:41:52.068046 4722 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e37df29-c0ce-40d1-bd50-50936c544bb0-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 09 14:41:52 crc kubenswrapper[4722]: I0309 14:41:52.068101 4722 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e37df29-c0ce-40d1-bd50-50936c544bb0-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:41:52 crc kubenswrapper[4722]: I0309 14:41:52.068122 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7e37df29-c0ce-40d1-bd50-50936c544bb0-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 14:41:52 crc kubenswrapper[4722]: I0309 14:41:52.068153 4722 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e37df29-c0ce-40d1-bd50-50936c544bb0-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:41:52 crc kubenswrapper[4722]: I0309 14:41:52.068182 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vndpj\" (UniqueName: \"kubernetes.io/projected/7e37df29-c0ce-40d1-bd50-50936c544bb0-kube-api-access-vndpj\") on node \"crc\" DevicePath \"\"" Mar 09 14:41:52 crc kubenswrapper[4722]: I0309 14:41:52.068198 4722 reconciler_common.go:293] "Volume detached for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e37df29-c0ce-40d1-bd50-50936c544bb0-telemetry-power-monitoring-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:41:52 crc kubenswrapper[4722]: I0309 14:41:52.068257 4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e37df29-c0ce-40d1-bd50-50936c544bb0-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 14:41:52 crc kubenswrapper[4722]: I0309 14:41:52.068270 4722 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e37df29-c0ce-40d1-bd50-50936c544bb0-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 09 14:41:52 crc kubenswrapper[4722]: I0309 14:41:52.068288 4722 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e37df29-c0ce-40d1-bd50-50936c544bb0-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:41:52 crc kubenswrapper[4722]: I0309 14:41:52.068309 4722 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/7e37df29-c0ce-40d1-bd50-50936c544bb0-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 09 14:41:52 crc kubenswrapper[4722]: I0309 14:41:52.303049 4722 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh" event={"ID":"7e37df29-c0ce-40d1-bd50-50936c544bb0","Type":"ContainerDied","Data":"6a2e755093f92342e5349892ae250c885f416fd8c29c9dbdc116a0de55b8ad40"} Mar 09 14:41:52 crc kubenswrapper[4722]: I0309 14:41:52.303118 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh" Mar 09 14:41:52 crc kubenswrapper[4722]: I0309 14:41:52.303141 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a2e755093f92342e5349892ae250c885f416fd8c29c9dbdc116a0de55b8ad40" Mar 09 14:41:52 crc kubenswrapper[4722]: I0309 14:41:52.428421 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-c6hrn"] Mar 09 14:41:52 crc kubenswrapper[4722]: E0309 14:41:52.434177 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e37df29-c0ce-40d1-bd50-50936c544bb0" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 09 14:41:52 crc kubenswrapper[4722]: I0309 14:41:52.434246 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e37df29-c0ce-40d1-bd50-50936c544bb0" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 09 14:41:52 crc kubenswrapper[4722]: I0309 14:41:52.435348 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e37df29-c0ce-40d1-bd50-50936c544bb0" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 09 14:41:52 crc kubenswrapper[4722]: I0309 14:41:52.438786 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c6hrn" Mar 09 14:41:52 crc kubenswrapper[4722]: I0309 14:41:52.445323 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Mar 09 14:41:52 crc kubenswrapper[4722]: I0309 14:41:52.445356 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 14:41:52 crc kubenswrapper[4722]: I0309 14:41:52.445435 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 14:41:52 crc kubenswrapper[4722]: I0309 14:41:52.445512 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 14:41:52 crc kubenswrapper[4722]: I0309 14:41:52.445361 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dwlbg" Mar 09 14:41:52 crc kubenswrapper[4722]: I0309 14:41:52.453878 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-c6hrn"] Mar 09 14:41:52 crc kubenswrapper[4722]: I0309 14:41:52.475845 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6d67f5f0-866e-4de7-ba1a-9a1b4a4086e5-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-c6hrn\" (UID: \"6d67f5f0-866e-4de7-ba1a-9a1b4a4086e5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c6hrn" Mar 09 14:41:52 crc kubenswrapper[4722]: I0309 14:41:52.475899 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6d67f5f0-866e-4de7-ba1a-9a1b4a4086e5-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-c6hrn\" (UID: \"6d67f5f0-866e-4de7-ba1a-9a1b4a4086e5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c6hrn" Mar 09 14:41:52 crc kubenswrapper[4722]: I0309 14:41:52.476042 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/6d67f5f0-866e-4de7-ba1a-9a1b4a4086e5-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-c6hrn\" (UID: \"6d67f5f0-866e-4de7-ba1a-9a1b4a4086e5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c6hrn" Mar 09 14:41:52 crc kubenswrapper[4722]: I0309 14:41:52.476071 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6d67f5f0-866e-4de7-ba1a-9a1b4a4086e5-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-c6hrn\" (UID: \"6d67f5f0-866e-4de7-ba1a-9a1b4a4086e5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c6hrn" Mar 09 14:41:52 crc kubenswrapper[4722]: I0309 14:41:52.476143 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wks28\" (UniqueName: \"kubernetes.io/projected/6d67f5f0-866e-4de7-ba1a-9a1b4a4086e5-kube-api-access-wks28\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-c6hrn\" (UID: \"6d67f5f0-866e-4de7-ba1a-9a1b4a4086e5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c6hrn" Mar 09 14:41:52 crc kubenswrapper[4722]: I0309 14:41:52.577296 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wks28\" (UniqueName: \"kubernetes.io/projected/6d67f5f0-866e-4de7-ba1a-9a1b4a4086e5-kube-api-access-wks28\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-c6hrn\" (UID: \"6d67f5f0-866e-4de7-ba1a-9a1b4a4086e5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c6hrn" Mar 09 14:41:52 crc kubenswrapper[4722]: I0309 14:41:52.577510 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6d67f5f0-866e-4de7-ba1a-9a1b4a4086e5-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-c6hrn\" (UID: \"6d67f5f0-866e-4de7-ba1a-9a1b4a4086e5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c6hrn" Mar 09 14:41:52 crc kubenswrapper[4722]: I0309 14:41:52.577556 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d67f5f0-866e-4de7-ba1a-9a1b4a4086e5-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-c6hrn\" (UID: \"6d67f5f0-866e-4de7-ba1a-9a1b4a4086e5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c6hrn" Mar 09 14:41:52 crc kubenswrapper[4722]: I0309 14:41:52.577657 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/6d67f5f0-866e-4de7-ba1a-9a1b4a4086e5-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-c6hrn\" (UID: \"6d67f5f0-866e-4de7-ba1a-9a1b4a4086e5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c6hrn" Mar 09 14:41:52 crc kubenswrapper[4722]: I0309 14:41:52.577694 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/6d67f5f0-866e-4de7-ba1a-9a1b4a4086e5-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-c6hrn\" (UID: \"6d67f5f0-866e-4de7-ba1a-9a1b4a4086e5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c6hrn" Mar 09 14:41:52 crc kubenswrapper[4722]: I0309 14:41:52.578700 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/6d67f5f0-866e-4de7-ba1a-9a1b4a4086e5-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-c6hrn\" (UID: \"6d67f5f0-866e-4de7-ba1a-9a1b4a4086e5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c6hrn" Mar 09 14:41:52 crc kubenswrapper[4722]: I0309 14:41:52.581580 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d67f5f0-866e-4de7-ba1a-9a1b4a4086e5-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-c6hrn\" (UID: \"6d67f5f0-866e-4de7-ba1a-9a1b4a4086e5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c6hrn" Mar 09 14:41:52 crc kubenswrapper[4722]: I0309 14:41:52.581960 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6d67f5f0-866e-4de7-ba1a-9a1b4a4086e5-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-c6hrn\" (UID: \"6d67f5f0-866e-4de7-ba1a-9a1b4a4086e5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c6hrn" Mar 09 14:41:52 crc kubenswrapper[4722]: I0309 14:41:52.583287 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6d67f5f0-866e-4de7-ba1a-9a1b4a4086e5-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-c6hrn\" (UID: \"6d67f5f0-866e-4de7-ba1a-9a1b4a4086e5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c6hrn" Mar 09 14:41:52 crc kubenswrapper[4722]: I0309 14:41:52.592831 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wks28\" (UniqueName: \"kubernetes.io/projected/6d67f5f0-866e-4de7-ba1a-9a1b4a4086e5-kube-api-access-wks28\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-c6hrn\" (UID: \"6d67f5f0-866e-4de7-ba1a-9a1b4a4086e5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c6hrn" Mar 09 14:41:52 crc kubenswrapper[4722]: I0309 14:41:52.768503 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c6hrn" Mar 09 14:41:53 crc kubenswrapper[4722]: I0309 14:41:53.397556 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-c6hrn"] Mar 09 14:41:54 crc kubenswrapper[4722]: I0309 14:41:54.327113 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c6hrn" event={"ID":"6d67f5f0-866e-4de7-ba1a-9a1b4a4086e5","Type":"ContainerStarted","Data":"86b46de24b4145a911f677af74c1c58f401ccc27ee3d0ecc5670204c8a50b8fc"} Mar 09 14:41:54 crc kubenswrapper[4722]: I0309 14:41:54.327703 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c6hrn" event={"ID":"6d67f5f0-866e-4de7-ba1a-9a1b4a4086e5","Type":"ContainerStarted","Data":"5929fd13f354315a834a08b1440260d188088232638c24ea4dec16f3f6fff704"} Mar 09 14:41:54 crc kubenswrapper[4722]: I0309 14:41:54.350809 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c6hrn" podStartSLOduration=1.9003316730000002 podStartE2EDuration="2.350780262s" podCreationTimestamp="2026-03-09 14:41:52 +0000 UTC" firstStartedPulling="2026-03-09 14:41:53.399978822 +0000 UTC m=+2353.955547418" lastFinishedPulling="2026-03-09 14:41:53.850427421 +0000 UTC m=+2354.405996007" observedRunningTime="2026-03-09 14:41:54.347224536 +0000 UTC m=+2354.902793122" watchObservedRunningTime="2026-03-09 14:41:54.350780262 +0000 UTC m=+2354.906348878" Mar 09 14:42:00 crc kubenswrapper[4722]: I0309 14:42:00.197777 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551122-lg5w9"] Mar 09 14:42:00 crc kubenswrapper[4722]: I0309 14:42:00.206986 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551122-lg5w9"] Mar 09 14:42:00 crc kubenswrapper[4722]: I0309 14:42:00.207193 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551122-lg5w9" Mar 09 14:42:00 crc kubenswrapper[4722]: I0309 14:42:00.212096 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 14:42:00 crc kubenswrapper[4722]: I0309 14:42:00.212257 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-dr2x6" Mar 09 14:42:00 crc kubenswrapper[4722]: I0309 14:42:00.212411 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 14:42:00 crc kubenswrapper[4722]: I0309 14:42:00.314479 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkmn8\" (UniqueName: \"kubernetes.io/projected/a7fb0dbe-1b62-48eb-9d4a-8df7727dd63a-kube-api-access-kkmn8\") pod \"auto-csr-approver-29551122-lg5w9\" (UID: \"a7fb0dbe-1b62-48eb-9d4a-8df7727dd63a\") " pod="openshift-infra/auto-csr-approver-29551122-lg5w9" Mar 09 14:42:00 crc kubenswrapper[4722]: I0309 14:42:00.417164 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkmn8\" (UniqueName: \"kubernetes.io/projected/a7fb0dbe-1b62-48eb-9d4a-8df7727dd63a-kube-api-access-kkmn8\") pod \"auto-csr-approver-29551122-lg5w9\" (UID: \"a7fb0dbe-1b62-48eb-9d4a-8df7727dd63a\") " pod="openshift-infra/auto-csr-approver-29551122-lg5w9" Mar 09 14:42:00 crc kubenswrapper[4722]: I0309 14:42:00.439836 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkmn8\" (UniqueName: \"kubernetes.io/projected/a7fb0dbe-1b62-48eb-9d4a-8df7727dd63a-kube-api-access-kkmn8\") pod \"auto-csr-approver-29551122-lg5w9\" (UID: \"a7fb0dbe-1b62-48eb-9d4a-8df7727dd63a\") " pod="openshift-infra/auto-csr-approver-29551122-lg5w9" Mar 09 14:42:00 crc kubenswrapper[4722]: I0309 14:42:00.531726 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551122-lg5w9" Mar 09 14:42:01 crc kubenswrapper[4722]: I0309 14:42:01.012149 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551122-lg5w9"] Mar 09 14:42:01 crc kubenswrapper[4722]: I0309 14:42:01.410167 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551122-lg5w9" event={"ID":"a7fb0dbe-1b62-48eb-9d4a-8df7727dd63a","Type":"ContainerStarted","Data":"761a5b87bef26f4eb7c5c5eda5f4a3f070e82ed4b586151a91436c0f4409c2f0"} Mar 09 14:42:02 crc kubenswrapper[4722]: I0309 14:42:02.151823 4722 scope.go:117] "RemoveContainer" containerID="931310cf17f93e3962cdbbdf531169aefb3915147cff22375dd9c06d9d6b2906" Mar 09 14:42:02 crc kubenswrapper[4722]: E0309 14:42:02.152400 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 14:42:03 crc kubenswrapper[4722]: I0309 14:42:03.449097 4722 generic.go:334] "Generic (PLEG): container finished" podID="a7fb0dbe-1b62-48eb-9d4a-8df7727dd63a" containerID="9ed9a3e3b723ec09b4d6338af042e6c6ba44ee25eed470dab7629e29774a7323" exitCode=0 Mar 09 14:42:03 crc kubenswrapper[4722]: I0309 14:42:03.449383 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551122-lg5w9" event={"ID":"a7fb0dbe-1b62-48eb-9d4a-8df7727dd63a","Type":"ContainerDied","Data":"9ed9a3e3b723ec09b4d6338af042e6c6ba44ee25eed470dab7629e29774a7323"} Mar 09 14:42:04 crc kubenswrapper[4722]: I0309 14:42:04.958247 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551122-lg5w9" Mar 09 14:42:05 crc kubenswrapper[4722]: I0309 14:42:05.043373 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkmn8\" (UniqueName: \"kubernetes.io/projected/a7fb0dbe-1b62-48eb-9d4a-8df7727dd63a-kube-api-access-kkmn8\") pod \"a7fb0dbe-1b62-48eb-9d4a-8df7727dd63a\" (UID: \"a7fb0dbe-1b62-48eb-9d4a-8df7727dd63a\") " Mar 09 14:42:05 crc kubenswrapper[4722]: I0309 14:42:05.049222 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7fb0dbe-1b62-48eb-9d4a-8df7727dd63a-kube-api-access-kkmn8" (OuterVolumeSpecName: "kube-api-access-kkmn8") pod "a7fb0dbe-1b62-48eb-9d4a-8df7727dd63a" (UID: "a7fb0dbe-1b62-48eb-9d4a-8df7727dd63a"). InnerVolumeSpecName "kube-api-access-kkmn8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:42:05 crc kubenswrapper[4722]: I0309 14:42:05.146607 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkmn8\" (UniqueName: \"kubernetes.io/projected/a7fb0dbe-1b62-48eb-9d4a-8df7727dd63a-kube-api-access-kkmn8\") on node \"crc\" DevicePath \"\"" Mar 09 14:42:05 crc kubenswrapper[4722]: I0309 14:42:05.470372 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551122-lg5w9" event={"ID":"a7fb0dbe-1b62-48eb-9d4a-8df7727dd63a","Type":"ContainerDied","Data":"761a5b87bef26f4eb7c5c5eda5f4a3f070e82ed4b586151a91436c0f4409c2f0"} Mar 09 14:42:05 crc kubenswrapper[4722]: I0309 14:42:05.470727 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="761a5b87bef26f4eb7c5c5eda5f4a3f070e82ed4b586151a91436c0f4409c2f0" Mar 09 14:42:05 crc kubenswrapper[4722]: I0309 14:42:05.470421 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551122-lg5w9" Mar 09 14:42:06 crc kubenswrapper[4722]: I0309 14:42:06.039407 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551116-czdm7"] Mar 09 14:42:06 crc kubenswrapper[4722]: I0309 14:42:06.053256 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551116-czdm7"] Mar 09 14:42:06 crc kubenswrapper[4722]: I0309 14:42:06.164980 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd7ebfdc-0261-4646-9601-cd3367751122" path="/var/lib/kubelet/pods/dd7ebfdc-0261-4646-9601-cd3367751122/volumes" Mar 09 14:42:15 crc kubenswrapper[4722]: I0309 14:42:15.149961 4722 scope.go:117] "RemoveContainer" containerID="931310cf17f93e3962cdbbdf531169aefb3915147cff22375dd9c06d9d6b2906" Mar 09 14:42:15 crc kubenswrapper[4722]: E0309 14:42:15.150700 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 14:42:27 crc kubenswrapper[4722]: I0309 14:42:27.689904 4722 scope.go:117] "RemoveContainer" containerID="756e3aaa32e5086309c4a2d9889d64503e81017d2539d8f1b301b840339d420f" Mar 09 14:42:27 crc kubenswrapper[4722]: I0309 14:42:27.753472 4722 scope.go:117] "RemoveContainer" containerID="e0d21535954f2715f616aa2539fb3b03570f619c92ebfbd782a330836605d26a" Mar 09 14:42:29 crc kubenswrapper[4722]: I0309 14:42:29.149037 4722 scope.go:117] "RemoveContainer" containerID="931310cf17f93e3962cdbbdf531169aefb3915147cff22375dd9c06d9d6b2906" Mar 09 14:42:29 crc kubenswrapper[4722]: E0309 14:42:29.149665 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 14:42:44 crc kubenswrapper[4722]: I0309 14:42:44.150736 4722 scope.go:117] "RemoveContainer" 
containerID="931310cf17f93e3962cdbbdf531169aefb3915147cff22375dd9c06d9d6b2906" Mar 09 14:42:44 crc kubenswrapper[4722]: E0309 14:42:44.151539 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 14:42:52 crc kubenswrapper[4722]: I0309 14:42:52.964968 4722 generic.go:334] "Generic (PLEG): container finished" podID="6d67f5f0-866e-4de7-ba1a-9a1b4a4086e5" containerID="86b46de24b4145a911f677af74c1c58f401ccc27ee3d0ecc5670204c8a50b8fc" exitCode=0 Mar 09 14:42:52 crc kubenswrapper[4722]: I0309 14:42:52.965176 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c6hrn" event={"ID":"6d67f5f0-866e-4de7-ba1a-9a1b4a4086e5","Type":"ContainerDied","Data":"86b46de24b4145a911f677af74c1c58f401ccc27ee3d0ecc5670204c8a50b8fc"} Mar 09 14:42:53 crc kubenswrapper[4722]: I0309 14:42:53.889666 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8tvtb"] Mar 09 14:42:53 crc kubenswrapper[4722]: E0309 14:42:53.890466 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7fb0dbe-1b62-48eb-9d4a-8df7727dd63a" containerName="oc" Mar 09 14:42:53 crc kubenswrapper[4722]: I0309 14:42:53.890483 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7fb0dbe-1b62-48eb-9d4a-8df7727dd63a" containerName="oc" Mar 09 14:42:53 crc kubenswrapper[4722]: I0309 14:42:53.890701 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7fb0dbe-1b62-48eb-9d4a-8df7727dd63a" containerName="oc" Mar 09 14:42:53 crc kubenswrapper[4722]: I0309 14:42:53.895515 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8tvtb" Mar 09 14:42:53 crc kubenswrapper[4722]: I0309 14:42:53.904861 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8tvtb"] Mar 09 14:42:54 crc kubenswrapper[4722]: I0309 14:42:54.086769 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75e8821e-13f2-4b95-b427-5aa473f37dd8-catalog-content\") pod \"community-operators-8tvtb\" (UID: \"75e8821e-13f2-4b95-b427-5aa473f37dd8\") " pod="openshift-marketplace/community-operators-8tvtb" Mar 09 14:42:54 crc kubenswrapper[4722]: I0309 14:42:54.086898 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75e8821e-13f2-4b95-b427-5aa473f37dd8-utilities\") pod \"community-operators-8tvtb\" (UID: \"75e8821e-13f2-4b95-b427-5aa473f37dd8\") " pod="openshift-marketplace/community-operators-8tvtb" Mar 09 14:42:54 crc kubenswrapper[4722]: I0309 14:42:54.087034 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v89d9\" (UniqueName: \"kubernetes.io/projected/75e8821e-13f2-4b95-b427-5aa473f37dd8-kube-api-access-v89d9\") pod \"community-operators-8tvtb\" (UID: \"75e8821e-13f2-4b95-b427-5aa473f37dd8\") " pod="openshift-marketplace/community-operators-8tvtb" Mar 09 14:42:54 crc kubenswrapper[4722]: I0309 14:42:54.188737 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75e8821e-13f2-4b95-b427-5aa473f37dd8-utilities\") pod \"community-operators-8tvtb\" (UID: \"75e8821e-13f2-4b95-b427-5aa473f37dd8\") " pod="openshift-marketplace/community-operators-8tvtb" Mar 09 14:42:54 crc kubenswrapper[4722]: I0309 14:42:54.188812 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v89d9\" (UniqueName: \"kubernetes.io/projected/75e8821e-13f2-4b95-b427-5aa473f37dd8-kube-api-access-v89d9\") pod \"community-operators-8tvtb\" (UID: \"75e8821e-13f2-4b95-b427-5aa473f37dd8\") " pod="openshift-marketplace/community-operators-8tvtb" Mar 09 14:42:54 crc kubenswrapper[4722]: I0309 14:42:54.188952 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75e8821e-13f2-4b95-b427-5aa473f37dd8-catalog-content\") pod \"community-operators-8tvtb\" (UID: \"75e8821e-13f2-4b95-b427-5aa473f37dd8\") " pod="openshift-marketplace/community-operators-8tvtb" Mar 09 14:42:54 crc kubenswrapper[4722]: I0309 14:42:54.189621 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75e8821e-13f2-4b95-b427-5aa473f37dd8-catalog-content\") pod \"community-operators-8tvtb\" (UID: \"75e8821e-13f2-4b95-b427-5aa473f37dd8\") " pod="openshift-marketplace/community-operators-8tvtb" Mar 09 14:42:54 crc kubenswrapper[4722]: I0309 14:42:54.189621 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75e8821e-13f2-4b95-b427-5aa473f37dd8-utilities\") pod \"community-operators-8tvtb\" (UID: \"75e8821e-13f2-4b95-b427-5aa473f37dd8\") " pod="openshift-marketplace/community-operators-8tvtb" Mar 09 14:42:54 crc kubenswrapper[4722]: I0309 14:42:54.210892 4722 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-v89d9\" (UniqueName: \"kubernetes.io/projected/75e8821e-13f2-4b95-b427-5aa473f37dd8-kube-api-access-v89d9\") pod \"community-operators-8tvtb\" (UID: \"75e8821e-13f2-4b95-b427-5aa473f37dd8\") " pod="openshift-marketplace/community-operators-8tvtb" Mar 09 14:42:54 crc kubenswrapper[4722]: I0309 14:42:54.216460 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8tvtb" Mar 09 14:42:54 crc kubenswrapper[4722]: I0309 14:42:54.566347 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c6hrn" Mar 09 14:42:54 crc kubenswrapper[4722]: I0309 14:42:54.714731 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6d67f5f0-866e-4de7-ba1a-9a1b4a4086e5-inventory\") pod \"6d67f5f0-866e-4de7-ba1a-9a1b4a4086e5\" (UID: \"6d67f5f0-866e-4de7-ba1a-9a1b4a4086e5\") " Mar 09 14:42:54 crc kubenswrapper[4722]: I0309 14:42:54.714870 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6d67f5f0-866e-4de7-ba1a-9a1b4a4086e5-ssh-key-openstack-edpm-ipam\") pod \"6d67f5f0-866e-4de7-ba1a-9a1b4a4086e5\" (UID: \"6d67f5f0-866e-4de7-ba1a-9a1b4a4086e5\") " Mar 09 14:42:54 crc kubenswrapper[4722]: I0309 14:42:54.715050 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/6d67f5f0-866e-4de7-ba1a-9a1b4a4086e5-ovncontroller-config-0\") pod \"6d67f5f0-866e-4de7-ba1a-9a1b4a4086e5\" (UID: \"6d67f5f0-866e-4de7-ba1a-9a1b4a4086e5\") " Mar 09 14:42:54 crc kubenswrapper[4722]: I0309 14:42:54.715113 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d67f5f0-866e-4de7-ba1a-9a1b4a4086e5-ovn-combined-ca-bundle\") pod \"6d67f5f0-866e-4de7-ba1a-9a1b4a4086e5\" (UID: \"6d67f5f0-866e-4de7-ba1a-9a1b4a4086e5\") " Mar 09 14:42:54 crc kubenswrapper[4722]: I0309 14:42:54.715168 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wks28\" (UniqueName: \"kubernetes.io/projected/6d67f5f0-866e-4de7-ba1a-9a1b4a4086e5-kube-api-access-wks28\") pod \"6d67f5f0-866e-4de7-ba1a-9a1b4a4086e5\" (UID: \"6d67f5f0-866e-4de7-ba1a-9a1b4a4086e5\") " Mar 09 14:42:54 crc kubenswrapper[4722]: I0309 14:42:54.768890 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d67f5f0-866e-4de7-ba1a-9a1b4a4086e5-kube-api-access-wks28" (OuterVolumeSpecName: "kube-api-access-wks28") pod "6d67f5f0-866e-4de7-ba1a-9a1b4a4086e5" (UID: "6d67f5f0-866e-4de7-ba1a-9a1b4a4086e5"). InnerVolumeSpecName "kube-api-access-wks28". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:42:54 crc kubenswrapper[4722]: I0309 14:42:54.769182 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d67f5f0-866e-4de7-ba1a-9a1b4a4086e5-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "6d67f5f0-866e-4de7-ba1a-9a1b4a4086e5" (UID: "6d67f5f0-866e-4de7-ba1a-9a1b4a4086e5"). InnerVolumeSpecName "ovn-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:42:54 crc kubenswrapper[4722]: I0309 14:42:54.774765 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d67f5f0-866e-4de7-ba1a-9a1b4a4086e5-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "6d67f5f0-866e-4de7-ba1a-9a1b4a4086e5" (UID: "6d67f5f0-866e-4de7-ba1a-9a1b4a4086e5"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:42:54 crc kubenswrapper[4722]: I0309 14:42:54.775401 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d67f5f0-866e-4de7-ba1a-9a1b4a4086e5-inventory" (OuterVolumeSpecName: "inventory") pod "6d67f5f0-866e-4de7-ba1a-9a1b4a4086e5" (UID: "6d67f5f0-866e-4de7-ba1a-9a1b4a4086e5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:42:54 crc kubenswrapper[4722]: I0309 14:42:54.786599 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d67f5f0-866e-4de7-ba1a-9a1b4a4086e5-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "6d67f5f0-866e-4de7-ba1a-9a1b4a4086e5" (UID: "6d67f5f0-866e-4de7-ba1a-9a1b4a4086e5"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:42:54 crc kubenswrapper[4722]: I0309 14:42:54.817666 4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6d67f5f0-866e-4de7-ba1a-9a1b4a4086e5-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 14:42:54 crc kubenswrapper[4722]: I0309 14:42:54.817974 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6d67f5f0-866e-4de7-ba1a-9a1b4a4086e5-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 14:42:54 crc kubenswrapper[4722]: I0309 14:42:54.817987 4722 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/6d67f5f0-866e-4de7-ba1a-9a1b4a4086e5-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Mar 09 14:42:54 crc kubenswrapper[4722]: I0309 14:42:54.817997 4722 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d67f5f0-866e-4de7-ba1a-9a1b4a4086e5-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:42:54 crc kubenswrapper[4722]: I0309 14:42:54.818007 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wks28\" (UniqueName: \"kubernetes.io/projected/6d67f5f0-866e-4de7-ba1a-9a1b4a4086e5-kube-api-access-wks28\") on node \"crc\" DevicePath \"\"" Mar 09 14:42:54 crc kubenswrapper[4722]: I0309 14:42:54.824709 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8tvtb"] Mar 09 14:42:55 crc kubenswrapper[4722]: I0309 14:42:55.001142 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c6hrn" event={"ID":"6d67f5f0-866e-4de7-ba1a-9a1b4a4086e5","Type":"ContainerDied","Data":"5929fd13f354315a834a08b1440260d188088232638c24ea4dec16f3f6fff704"} Mar 09 14:42:55 crc kubenswrapper[4722]: I0309 14:42:55.001235 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5929fd13f354315a834a08b1440260d188088232638c24ea4dec16f3f6fff704" Mar 09 14:42:55 crc 
kubenswrapper[4722]: I0309 14:42:55.001166 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-c6hrn" Mar 09 14:42:55 crc kubenswrapper[4722]: I0309 14:42:55.002800 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8tvtb" event={"ID":"75e8821e-13f2-4b95-b427-5aa473f37dd8","Type":"ContainerStarted","Data":"b90bc8b4c861ca5ca3c7b4545da619b5777e2e3399914a6ce7da64e5b9f250a4"} Mar 09 14:42:55 crc kubenswrapper[4722]: I0309 14:42:55.088537 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-26wzs"] Mar 09 14:42:55 crc kubenswrapper[4722]: E0309 14:42:55.089109 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d67f5f0-866e-4de7-ba1a-9a1b4a4086e5" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 09 14:42:55 crc kubenswrapper[4722]: I0309 14:42:55.089127 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d67f5f0-866e-4de7-ba1a-9a1b4a4086e5" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 09 14:42:55 crc kubenswrapper[4722]: I0309 14:42:55.089490 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d67f5f0-866e-4de7-ba1a-9a1b4a4086e5" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 09 14:42:55 crc kubenswrapper[4722]: I0309 14:42:55.090552 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-26wzs" Mar 09 14:42:55 crc kubenswrapper[4722]: I0309 14:42:55.096851 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 14:42:55 crc kubenswrapper[4722]: I0309 14:42:55.096899 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Mar 09 14:42:55 crc kubenswrapper[4722]: I0309 14:42:55.096902 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 14:42:55 crc kubenswrapper[4722]: I0309 14:42:55.096956 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Mar 09 14:42:55 crc kubenswrapper[4722]: I0309 14:42:55.098109 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 14:42:55 crc kubenswrapper[4722]: I0309 14:42:55.098427 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dwlbg" Mar 09 14:42:55 crc kubenswrapper[4722]: I0309 14:42:55.109750 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-26wzs"] Mar 09 14:42:55 crc kubenswrapper[4722]: I0309 14:42:55.127998 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38ca037e-8b49-4fa3-a8e2-edbfacecdaf5-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-26wzs\" (UID: \"38ca037e-8b49-4fa3-a8e2-edbfacecdaf5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-26wzs" Mar 09 14:42:55 crc kubenswrapper[4722]: I0309 14:42:55.128048 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/38ca037e-8b49-4fa3-a8e2-edbfacecdaf5-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-26wzs\" (UID: \"38ca037e-8b49-4fa3-a8e2-edbfacecdaf5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-26wzs" Mar 09 14:42:55 crc kubenswrapper[4722]: I0309 14:42:55.128297 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/38ca037e-8b49-4fa3-a8e2-edbfacecdaf5-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-26wzs\" (UID: \"38ca037e-8b49-4fa3-a8e2-edbfacecdaf5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-26wzs" Mar 09 14:42:55 crc kubenswrapper[4722]: I0309 14:42:55.128349 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7nww\" (UniqueName: \"kubernetes.io/projected/38ca037e-8b49-4fa3-a8e2-edbfacecdaf5-kube-api-access-z7nww\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-26wzs\" (UID: \"38ca037e-8b49-4fa3-a8e2-edbfacecdaf5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-26wzs" Mar 09 14:42:55 crc kubenswrapper[4722]: I0309 14:42:55.128461 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/38ca037e-8b49-4fa3-a8e2-edbfacecdaf5-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-26wzs\" (UID: \"38ca037e-8b49-4fa3-a8e2-edbfacecdaf5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-26wzs" Mar 09 14:42:55 crc kubenswrapper[4722]: I0309 14:42:55.128489 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/38ca037e-8b49-4fa3-a8e2-edbfacecdaf5-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-26wzs\" (UID: \"38ca037e-8b49-4fa3-a8e2-edbfacecdaf5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-26wzs" Mar 09 14:42:55 crc kubenswrapper[4722]: I0309 14:42:55.230991 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/38ca037e-8b49-4fa3-a8e2-edbfacecdaf5-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-26wzs\" (UID: \"38ca037e-8b49-4fa3-a8e2-edbfacecdaf5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-26wzs" Mar 09 14:42:55 crc kubenswrapper[4722]: I0309 14:42:55.231051 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/38ca037e-8b49-4fa3-a8e2-edbfacecdaf5-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-26wzs\" (UID: \"38ca037e-8b49-4fa3-a8e2-edbfacecdaf5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-26wzs" Mar 09 14:42:55 crc kubenswrapper[4722]: I0309 14:42:55.231243 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38ca037e-8b49-4fa3-a8e2-edbfacecdaf5-neutron-metadata-combined-ca-bundle\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-26wzs\" (UID: \"38ca037e-8b49-4fa3-a8e2-edbfacecdaf5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-26wzs" Mar 09 14:42:55 crc kubenswrapper[4722]: I0309 14:42:55.231279 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/38ca037e-8b49-4fa3-a8e2-edbfacecdaf5-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-26wzs\" (UID: \"38ca037e-8b49-4fa3-a8e2-edbfacecdaf5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-26wzs" Mar 09 14:42:55 crc kubenswrapper[4722]: I0309 14:42:55.231313 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/38ca037e-8b49-4fa3-a8e2-edbfacecdaf5-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-26wzs\" (UID: \"38ca037e-8b49-4fa3-a8e2-edbfacecdaf5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-26wzs" Mar 09 14:42:55 crc kubenswrapper[4722]: I0309 14:42:55.231377 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7nww\" (UniqueName: \"kubernetes.io/projected/38ca037e-8b49-4fa3-a8e2-edbfacecdaf5-kube-api-access-z7nww\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-26wzs\" (UID: \"38ca037e-8b49-4fa3-a8e2-edbfacecdaf5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-26wzs" Mar 09 14:42:55 crc kubenswrapper[4722]: I0309 14:42:55.235373 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/38ca037e-8b49-4fa3-a8e2-edbfacecdaf5-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-26wzs\" (UID: \"38ca037e-8b49-4fa3-a8e2-edbfacecdaf5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-26wzs" Mar 09 14:42:55 crc kubenswrapper[4722]: I0309 14:42:55.235813 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/38ca037e-8b49-4fa3-a8e2-edbfacecdaf5-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-26wzs\" (UID: \"38ca037e-8b49-4fa3-a8e2-edbfacecdaf5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-26wzs" Mar 09 14:42:55 crc kubenswrapper[4722]: I0309 14:42:55.236531 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/38ca037e-8b49-4fa3-a8e2-edbfacecdaf5-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-26wzs\" (UID: \"38ca037e-8b49-4fa3-a8e2-edbfacecdaf5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-26wzs" Mar 09 14:42:55 crc kubenswrapper[4722]: I0309 14:42:55.237541 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38ca037e-8b49-4fa3-a8e2-edbfacecdaf5-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-26wzs\" (UID: \"38ca037e-8b49-4fa3-a8e2-edbfacecdaf5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-26wzs" Mar 09 14:42:55 crc kubenswrapper[4722]: I0309 14:42:55.239487 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/38ca037e-8b49-4fa3-a8e2-edbfacecdaf5-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-26wzs\" (UID: \"38ca037e-8b49-4fa3-a8e2-edbfacecdaf5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-26wzs" Mar 09 14:42:55 crc kubenswrapper[4722]: I0309 14:42:55.246710 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7nww\" (UniqueName: \"kubernetes.io/projected/38ca037e-8b49-4fa3-a8e2-edbfacecdaf5-kube-api-access-z7nww\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-26wzs\" (UID: \"38ca037e-8b49-4fa3-a8e2-edbfacecdaf5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-26wzs" Mar 09 14:42:55 crc kubenswrapper[4722]: I0309 14:42:55.433825 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-26wzs" Mar 09 14:42:56 crc kubenswrapper[4722]: I0309 14:42:56.016808 4722 generic.go:334] "Generic (PLEG): container finished" podID="75e8821e-13f2-4b95-b427-5aa473f37dd8" containerID="3029d322bec3ec8408e96d0a4733cbdfda1da8c468d5a372088c3b928cde62d5" exitCode=0 Mar 09 14:42:56 crc kubenswrapper[4722]: I0309 14:42:56.016892 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8tvtb" event={"ID":"75e8821e-13f2-4b95-b427-5aa473f37dd8","Type":"ContainerDied","Data":"3029d322bec3ec8408e96d0a4733cbdfda1da8c468d5a372088c3b928cde62d5"} Mar 09 14:42:56 crc kubenswrapper[4722]: I0309 14:42:56.060402 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-26wzs"] Mar 09 14:42:56 crc kubenswrapper[4722]: W0309 14:42:56.068117 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod38ca037e_8b49_4fa3_a8e2_edbfacecdaf5.slice/crio-cf5bad262f5591cda984b1b0d293461fff6c713d4598ae19d6b7f41b6a1bb32e WatchSource:0}: Error finding container cf5bad262f5591cda984b1b0d293461fff6c713d4598ae19d6b7f41b6a1bb32e: Status 404 returned error can't find the container with id cf5bad262f5591cda984b1b0d293461fff6c713d4598ae19d6b7f41b6a1bb32e Mar 09 14:42:57 crc kubenswrapper[4722]: I0309 14:42:57.030274 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8tvtb" event={"ID":"75e8821e-13f2-4b95-b427-5aa473f37dd8","Type":"ContainerStarted","Data":"1e5cf952b5694ca0e543d217e7d5e41f185358b06b77c61ea8b1d4f9caa3371c"} Mar 09 14:42:57 crc kubenswrapper[4722]: I0309 14:42:57.033284 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-26wzs" event={"ID":"38ca037e-8b49-4fa3-a8e2-edbfacecdaf5","Type":"ContainerStarted","Data":"a800d727f8fc02c45f8682dbfb6bb15c8bd4bfe90837018060ed4c8788658704"} Mar 09 14:42:57 crc kubenswrapper[4722]: I0309 14:42:57.033341 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-26wzs" event={"ID":"38ca037e-8b49-4fa3-a8e2-edbfacecdaf5","Type":"ContainerStarted","Data":"cf5bad262f5591cda984b1b0d293461fff6c713d4598ae19d6b7f41b6a1bb32e"} Mar 09 14:42:57 crc kubenswrapper[4722]: I0309 14:42:57.103501 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-26wzs" podStartSLOduration=1.637066376 podStartE2EDuration="2.103478277s" podCreationTimestamp="2026-03-09 14:42:55 +0000 UTC" firstStartedPulling="2026-03-09 14:42:56.072124594 +0000 UTC m=+2416.627693170" lastFinishedPulling="2026-03-09 14:42:56.538536495 +0000 UTC m=+2417.094105071" observedRunningTime="2026-03-09 14:42:57.09063576 +0000 UTC m=+2417.646204366" watchObservedRunningTime="2026-03-09 14:42:57.103478277 +0000 UTC m=+2417.659046853" Mar 09 14:42:58 crc kubenswrapper[4722]: I0309 14:42:58.155374 4722 scope.go:117] "RemoveContainer" containerID="931310cf17f93e3962cdbbdf531169aefb3915147cff22375dd9c06d9d6b2906" Mar 09 14:42:58 crc kubenswrapper[4722]: E0309 14:42:58.155953 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 14:42:59 crc kubenswrapper[4722]: I0309 14:42:59.054688 4722 generic.go:334] "Generic (PLEG): container finished" podID="75e8821e-13f2-4b95-b427-5aa473f37dd8" containerID="1e5cf952b5694ca0e543d217e7d5e41f185358b06b77c61ea8b1d4f9caa3371c" exitCode=0 Mar 09 14:42:59 crc kubenswrapper[4722]: I0309 14:42:59.055020 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8tvtb" event={"ID":"75e8821e-13f2-4b95-b427-5aa473f37dd8","Type":"ContainerDied","Data":"1e5cf952b5694ca0e543d217e7d5e41f185358b06b77c61ea8b1d4f9caa3371c"} Mar 09 14:43:00 crc kubenswrapper[4722]: I0309 14:43:00.066371 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8tvtb" event={"ID":"75e8821e-13f2-4b95-b427-5aa473f37dd8","Type":"ContainerStarted","Data":"49d2aa1bf7edfb2075f74f98f04d141a730e6ccd10b6dc6d8a129eaa0e3f1295"} Mar 09 14:43:00 crc kubenswrapper[4722]: I0309 14:43:00.104269 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8tvtb" podStartSLOduration=3.610259999 podStartE2EDuration="7.104227968s" podCreationTimestamp="2026-03-09 14:42:53 +0000 UTC" firstStartedPulling="2026-03-09 14:42:56.020985528 +0000 UTC m=+2416.576554104" lastFinishedPulling="2026-03-09 14:42:59.514953497 +0000 UTC m=+2420.070522073" observedRunningTime="2026-03-09 14:43:00.084410381 +0000 UTC m=+2420.639978957" watchObservedRunningTime="2026-03-09 14:43:00.104227968 +0000 UTC m=+2420.659796544" Mar 09 14:43:04 crc kubenswrapper[4722]: I0309 14:43:04.217593 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8tvtb" Mar 09 14:43:04 crc kubenswrapper[4722]: I0309 14:43:04.218189 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8tvtb" Mar 09 14:43:04 crc kubenswrapper[4722]: I0309 14:43:04.273729 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8tvtb" Mar 09 14:43:05 crc kubenswrapper[4722]: I0309 14:43:05.189897 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8tvtb" Mar 09 14:43:05 crc kubenswrapper[4722]: I0309 14:43:05.252031 
4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8tvtb"] Mar 09 14:43:07 crc kubenswrapper[4722]: I0309 14:43:07.163744 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8tvtb" podUID="75e8821e-13f2-4b95-b427-5aa473f37dd8" containerName="registry-server" containerID="cri-o://49d2aa1bf7edfb2075f74f98f04d141a730e6ccd10b6dc6d8a129eaa0e3f1295" gracePeriod=2 Mar 09 14:43:07 crc kubenswrapper[4722]: I0309 14:43:07.683800 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8tvtb" Mar 09 14:43:07 crc kubenswrapper[4722]: I0309 14:43:07.777111 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v89d9\" (UniqueName: \"kubernetes.io/projected/75e8821e-13f2-4b95-b427-5aa473f37dd8-kube-api-access-v89d9\") pod \"75e8821e-13f2-4b95-b427-5aa473f37dd8\" (UID: \"75e8821e-13f2-4b95-b427-5aa473f37dd8\") " Mar 09 14:43:07 crc kubenswrapper[4722]: I0309 14:43:07.777367 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75e8821e-13f2-4b95-b427-5aa473f37dd8-utilities\") pod \"75e8821e-13f2-4b95-b427-5aa473f37dd8\" (UID: \"75e8821e-13f2-4b95-b427-5aa473f37dd8\") " Mar 09 14:43:07 crc kubenswrapper[4722]: I0309 14:43:07.777446 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75e8821e-13f2-4b95-b427-5aa473f37dd8-catalog-content\") pod \"75e8821e-13f2-4b95-b427-5aa473f37dd8\" (UID: \"75e8821e-13f2-4b95-b427-5aa473f37dd8\") " Mar 09 14:43:07 crc kubenswrapper[4722]: I0309 14:43:07.778657 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75e8821e-13f2-4b95-b427-5aa473f37dd8-utilities" (OuterVolumeSpecName: "utilities") pod "75e8821e-13f2-4b95-b427-5aa473f37dd8" (UID: "75e8821e-13f2-4b95-b427-5aa473f37dd8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:43:07 crc kubenswrapper[4722]: I0309 14:43:07.781819 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75e8821e-13f2-4b95-b427-5aa473f37dd8-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 14:43:07 crc kubenswrapper[4722]: I0309 14:43:07.786548 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75e8821e-13f2-4b95-b427-5aa473f37dd8-kube-api-access-v89d9" (OuterVolumeSpecName: "kube-api-access-v89d9") pod "75e8821e-13f2-4b95-b427-5aa473f37dd8" (UID: "75e8821e-13f2-4b95-b427-5aa473f37dd8"). InnerVolumeSpecName "kube-api-access-v89d9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:43:07 crc kubenswrapper[4722]: I0309 14:43:07.850917 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/75e8821e-13f2-4b95-b427-5aa473f37dd8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "75e8821e-13f2-4b95-b427-5aa473f37dd8" (UID: "75e8821e-13f2-4b95-b427-5aa473f37dd8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:43:07 crc kubenswrapper[4722]: I0309 14:43:07.884747 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75e8821e-13f2-4b95-b427-5aa473f37dd8-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 14:43:07 crc kubenswrapper[4722]: I0309 14:43:07.884789 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v89d9\" (UniqueName: \"kubernetes.io/projected/75e8821e-13f2-4b95-b427-5aa473f37dd8-kube-api-access-v89d9\") on node \"crc\" DevicePath \"\"" Mar 09 14:43:08 crc kubenswrapper[4722]: I0309 14:43:08.182869 4722 generic.go:334] "Generic (PLEG): container finished" podID="75e8821e-13f2-4b95-b427-5aa473f37dd8" containerID="49d2aa1bf7edfb2075f74f98f04d141a730e6ccd10b6dc6d8a129eaa0e3f1295" exitCode=0 Mar 09 14:43:08 crc kubenswrapper[4722]: I0309 14:43:08.182911 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8tvtb" event={"ID":"75e8821e-13f2-4b95-b427-5aa473f37dd8","Type":"ContainerDied","Data":"49d2aa1bf7edfb2075f74f98f04d141a730e6ccd10b6dc6d8a129eaa0e3f1295"} Mar 09 14:43:08 crc kubenswrapper[4722]: I0309 14:43:08.182941 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8tvtb" event={"ID":"75e8821e-13f2-4b95-b427-5aa473f37dd8","Type":"ContainerDied","Data":"b90bc8b4c861ca5ca3c7b4545da619b5777e2e3399914a6ce7da64e5b9f250a4"} Mar 09 14:43:08 crc kubenswrapper[4722]: I0309 14:43:08.182959 4722 scope.go:117] "RemoveContainer" containerID="49d2aa1bf7edfb2075f74f98f04d141a730e6ccd10b6dc6d8a129eaa0e3f1295" Mar 09 14:43:08 crc kubenswrapper[4722]: I0309 14:43:08.182985 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8tvtb" Mar 09 14:43:08 crc kubenswrapper[4722]: I0309 14:43:08.218845 4722 scope.go:117] "RemoveContainer" containerID="1e5cf952b5694ca0e543d217e7d5e41f185358b06b77c61ea8b1d4f9caa3371c" Mar 09 14:43:08 crc kubenswrapper[4722]: I0309 14:43:08.223016 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8tvtb"] Mar 09 14:43:08 crc kubenswrapper[4722]: I0309 14:43:08.240930 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8tvtb"] Mar 09 14:43:08 crc kubenswrapper[4722]: I0309 14:43:08.241948 4722 scope.go:117] "RemoveContainer" containerID="3029d322bec3ec8408e96d0a4733cbdfda1da8c468d5a372088c3b928cde62d5" Mar 09 14:43:08 crc kubenswrapper[4722]: I0309 14:43:08.306092 4722 scope.go:117] "RemoveContainer" containerID="49d2aa1bf7edfb2075f74f98f04d141a730e6ccd10b6dc6d8a129eaa0e3f1295" Mar 09 14:43:08 crc kubenswrapper[4722]: E0309 14:43:08.306768 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49d2aa1bf7edfb2075f74f98f04d141a730e6ccd10b6dc6d8a129eaa0e3f1295\": container with ID starting with 49d2aa1bf7edfb2075f74f98f04d141a730e6ccd10b6dc6d8a129eaa0e3f1295 not found: ID does not exist" containerID="49d2aa1bf7edfb2075f74f98f04d141a730e6ccd10b6dc6d8a129eaa0e3f1295" Mar 09 14:43:08 crc kubenswrapper[4722]: I0309 14:43:08.306840 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49d2aa1bf7edfb2075f74f98f04d141a730e6ccd10b6dc6d8a129eaa0e3f1295"} err="failed to get container status \"49d2aa1bf7edfb2075f74f98f04d141a730e6ccd10b6dc6d8a129eaa0e3f1295\": rpc error: code = NotFound desc = could not find container \"49d2aa1bf7edfb2075f74f98f04d141a730e6ccd10b6dc6d8a129eaa0e3f1295\": container with ID starting with 49d2aa1bf7edfb2075f74f98f04d141a730e6ccd10b6dc6d8a129eaa0e3f1295 not found: ID does not exist" Mar 09 14:43:08 crc kubenswrapper[4722]: I0309 14:43:08.306892 4722 scope.go:117] "RemoveContainer" containerID="1e5cf952b5694ca0e543d217e7d5e41f185358b06b77c61ea8b1d4f9caa3371c" Mar 09 14:43:08 crc kubenswrapper[4722]: E0309 14:43:08.307811 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e5cf952b5694ca0e543d217e7d5e41f185358b06b77c61ea8b1d4f9caa3371c\": container with ID starting with 1e5cf952b5694ca0e543d217e7d5e41f185358b06b77c61ea8b1d4f9caa3371c not found: ID does not exist" containerID="1e5cf952b5694ca0e543d217e7d5e41f185358b06b77c61ea8b1d4f9caa3371c" Mar 09 14:43:08 crc kubenswrapper[4722]: I0309 14:43:08.307893 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e5cf952b5694ca0e543d217e7d5e41f185358b06b77c61ea8b1d4f9caa3371c"} err="failed to get container status \"1e5cf952b5694ca0e543d217e7d5e41f185358b06b77c61ea8b1d4f9caa3371c\": rpc error: code = NotFound desc = could not find container \"1e5cf952b5694ca0e543d217e7d5e41f185358b06b77c61ea8b1d4f9caa3371c\": container with ID starting with 1e5cf952b5694ca0e543d217e7d5e41f185358b06b77c61ea8b1d4f9caa3371c not found: ID does not exist" Mar 09 14:43:08 crc kubenswrapper[4722]: I0309 14:43:08.307927 4722 scope.go:117] "RemoveContainer" containerID="3029d322bec3ec8408e96d0a4733cbdfda1da8c468d5a372088c3b928cde62d5" Mar 09 14:43:08 crc kubenswrapper[4722]: E0309 14:43:08.308254 4722 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"3029d322bec3ec8408e96d0a4733cbdfda1da8c468d5a372088c3b928cde62d5\": container with ID starting with 3029d322bec3ec8408e96d0a4733cbdfda1da8c468d5a372088c3b928cde62d5 not found: ID does not exist" containerID="3029d322bec3ec8408e96d0a4733cbdfda1da8c468d5a372088c3b928cde62d5" Mar 09 14:43:08 crc kubenswrapper[4722]: I0309 14:43:08.308285 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3029d322bec3ec8408e96d0a4733cbdfda1da8c468d5a372088c3b928cde62d5"} err="failed to get container status \"3029d322bec3ec8408e96d0a4733cbdfda1da8c468d5a372088c3b928cde62d5\": rpc error: code = NotFound desc = could not find container \"3029d322bec3ec8408e96d0a4733cbdfda1da8c468d5a372088c3b928cde62d5\": container with ID starting with 3029d322bec3ec8408e96d0a4733cbdfda1da8c468d5a372088c3b928cde62d5 not found: ID does not exist" Mar 09 14:43:08 crc kubenswrapper[4722]: E0309 14:43:08.372227 4722 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75e8821e_13f2_4b95_b427_5aa473f37dd8.slice/crio-b90bc8b4c861ca5ca3c7b4545da619b5777e2e3399914a6ce7da64e5b9f250a4\": RecentStats: unable to find data in memory cache]" Mar 09 14:43:10 crc kubenswrapper[4722]: I0309 14:43:10.161290 4722 scope.go:117] "RemoveContainer" containerID="931310cf17f93e3962cdbbdf531169aefb3915147cff22375dd9c06d9d6b2906" Mar 09 14:43:10 crc kubenswrapper[4722]: E0309 14:43:10.161860 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 14:43:10 crc kubenswrapper[4722]: I0309 14:43:10.164057 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75e8821e-13f2-4b95-b427-5aa473f37dd8" path="/var/lib/kubelet/pods/75e8821e-13f2-4b95-b427-5aa473f37dd8/volumes" Mar 09 14:43:25 crc kubenswrapper[4722]: I0309 14:43:25.148629 4722 scope.go:117] "RemoveContainer" containerID="931310cf17f93e3962cdbbdf531169aefb3915147cff22375dd9c06d9d6b2906" Mar 09 14:43:25 crc kubenswrapper[4722]: E0309 14:43:25.149388 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 14:43:36 crc kubenswrapper[4722]: I0309 14:43:36.149521 4722 scope.go:117] "RemoveContainer" containerID="931310cf17f93e3962cdbbdf531169aefb3915147cff22375dd9c06d9d6b2906" Mar 09 14:43:36 crc kubenswrapper[4722]: E0309 14:43:36.150454 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 14:43:47 crc kubenswrapper[4722]: I0309 14:43:47.149421 4722 scope.go:117] "RemoveContainer" containerID="931310cf17f93e3962cdbbdf531169aefb3915147cff22375dd9c06d9d6b2906" Mar 09 14:43:47 crc kubenswrapper[4722]: E0309 14:43:47.150299 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 14:43:50 crc kubenswrapper[4722]: I0309 14:43:50.702975 4722 generic.go:334] "Generic (PLEG): container finished" podID="38ca037e-8b49-4fa3-a8e2-edbfacecdaf5" containerID="a800d727f8fc02c45f8682dbfb6bb15c8bd4bfe90837018060ed4c8788658704" exitCode=0 Mar 09 14:43:50 crc kubenswrapper[4722]: I0309 14:43:50.703070 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-26wzs" event={"ID":"38ca037e-8b49-4fa3-a8e2-edbfacecdaf5","Type":"ContainerDied","Data":"a800d727f8fc02c45f8682dbfb6bb15c8bd4bfe90837018060ed4c8788658704"} Mar 09 14:43:50 crc kubenswrapper[4722]: I0309 14:43:50.729639 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-srb2n"] Mar 09 14:43:50 crc kubenswrapper[4722]: E0309 14:43:50.730325 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75e8821e-13f2-4b95-b427-5aa473f37dd8" containerName="extract-content" Mar 09 14:43:50 crc kubenswrapper[4722]: I0309 14:43:50.730348 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="75e8821e-13f2-4b95-b427-5aa473f37dd8" containerName="extract-content" Mar 09 14:43:50 crc kubenswrapper[4722]: E0309 14:43:50.730381 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75e8821e-13f2-4b95-b427-5aa473f37dd8" containerName="extract-utilities" Mar 09 14:43:50 crc kubenswrapper[4722]: I0309 14:43:50.730389 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="75e8821e-13f2-4b95-b427-5aa473f37dd8" containerName="extract-utilities" Mar 09 14:43:50 crc kubenswrapper[4722]: E0309 14:43:50.730417 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75e8821e-13f2-4b95-b427-5aa473f37dd8" containerName="registry-server" Mar 09 14:43:50 crc kubenswrapper[4722]: I0309 14:43:50.730423 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="75e8821e-13f2-4b95-b427-5aa473f37dd8" containerName="registry-server" Mar 09 14:43:50 crc kubenswrapper[4722]: I0309 14:43:50.730647 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="75e8821e-13f2-4b95-b427-5aa473f37dd8" containerName="registry-server" Mar 09 14:43:50 crc kubenswrapper[4722]: I0309 14:43:50.732890 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-srb2n" Mar 09 14:43:50 crc kubenswrapper[4722]: I0309 14:43:50.760079 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-srb2n"] Mar 09 14:43:50 crc kubenswrapper[4722]: I0309 14:43:50.777015 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/064f14fc-c287-473b-a503-bc35661e31b2-utilities\") pod \"redhat-marketplace-srb2n\" (UID: \"064f14fc-c287-473b-a503-bc35661e31b2\") " pod="openshift-marketplace/redhat-marketplace-srb2n" Mar 09 14:43:50 crc kubenswrapper[4722]: I0309 14:43:50.777108 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/064f14fc-c287-473b-a503-bc35661e31b2-catalog-content\") pod \"redhat-marketplace-srb2n\" (UID: \"064f14fc-c287-473b-a503-bc35661e31b2\") " pod="openshift-marketplace/redhat-marketplace-srb2n" Mar 09 14:43:50 crc kubenswrapper[4722]: I0309 14:43:50.777144 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzcmg\" (UniqueName: \"kubernetes.io/projected/064f14fc-c287-473b-a503-bc35661e31b2-kube-api-access-qzcmg\") pod \"redhat-marketplace-srb2n\" (UID: \"064f14fc-c287-473b-a503-bc35661e31b2\") " pod="openshift-marketplace/redhat-marketplace-srb2n" Mar 09 14:43:50 crc kubenswrapper[4722]: I0309 14:43:50.881244 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/064f14fc-c287-473b-a503-bc35661e31b2-catalog-content\") pod \"redhat-marketplace-srb2n\" (UID: \"064f14fc-c287-473b-a503-bc35661e31b2\") " pod="openshift-marketplace/redhat-marketplace-srb2n" Mar 09 14:43:50 crc kubenswrapper[4722]: I0309 14:43:50.881366 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzcmg\" (UniqueName: \"kubernetes.io/projected/064f14fc-c287-473b-a503-bc35661e31b2-kube-api-access-qzcmg\") pod \"redhat-marketplace-srb2n\" (UID: \"064f14fc-c287-473b-a503-bc35661e31b2\") " pod="openshift-marketplace/redhat-marketplace-srb2n" Mar 09 14:43:50 crc kubenswrapper[4722]: I0309 14:43:50.881675 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/064f14fc-c287-473b-a503-bc35661e31b2-utilities\") pod \"redhat-marketplace-srb2n\" (UID: \"064f14fc-c287-473b-a503-bc35661e31b2\") " pod="openshift-marketplace/redhat-marketplace-srb2n" Mar 09 14:43:50 crc kubenswrapper[4722]: I0309 14:43:50.881858 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/064f14fc-c287-473b-a503-bc35661e31b2-catalog-content\") pod \"redhat-marketplace-srb2n\" (UID: \"064f14fc-c287-473b-a503-bc35661e31b2\") " pod="openshift-marketplace/redhat-marketplace-srb2n" Mar 09 14:43:50 crc kubenswrapper[4722]: I0309 14:43:50.882172 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/064f14fc-c287-473b-a503-bc35661e31b2-utilities\") pod \"redhat-marketplace-srb2n\" (UID: \"064f14fc-c287-473b-a503-bc35661e31b2\") " pod="openshift-marketplace/redhat-marketplace-srb2n" Mar 09 14:43:50 crc kubenswrapper[4722]: I0309 14:43:50.903726 4722 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-qzcmg\" (UniqueName: \"kubernetes.io/projected/064f14fc-c287-473b-a503-bc35661e31b2-kube-api-access-qzcmg\") pod \"redhat-marketplace-srb2n\" (UID: \"064f14fc-c287-473b-a503-bc35661e31b2\") " pod="openshift-marketplace/redhat-marketplace-srb2n" Mar 09 14:43:51 crc kubenswrapper[4722]: I0309 14:43:51.071364 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-srb2n" Mar 09 14:43:51 crc kubenswrapper[4722]: I0309 14:43:51.674076 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-srb2n"] Mar 09 14:43:51 crc kubenswrapper[4722]: I0309 14:43:51.718263 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-srb2n" event={"ID":"064f14fc-c287-473b-a503-bc35661e31b2","Type":"ContainerStarted","Data":"ec0d755cacb9c01fea044a230644be3cceac07251f7269b0af9d3704af941557"} Mar 09 14:43:52 crc kubenswrapper[4722]: I0309 14:43:52.256337 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-26wzs" Mar 09 14:43:52 crc kubenswrapper[4722]: I0309 14:43:52.422056 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38ca037e-8b49-4fa3-a8e2-edbfacecdaf5-neutron-metadata-combined-ca-bundle\") pod \"38ca037e-8b49-4fa3-a8e2-edbfacecdaf5\" (UID: \"38ca037e-8b49-4fa3-a8e2-edbfacecdaf5\") " Mar 09 14:43:52 crc kubenswrapper[4722]: I0309 14:43:52.422352 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7nww\" (UniqueName: \"kubernetes.io/projected/38ca037e-8b49-4fa3-a8e2-edbfacecdaf5-kube-api-access-z7nww\") pod \"38ca037e-8b49-4fa3-a8e2-edbfacecdaf5\" (UID: \"38ca037e-8b49-4fa3-a8e2-edbfacecdaf5\") " Mar 09 14:43:52 crc kubenswrapper[4722]: I0309 14:43:52.422568 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/38ca037e-8b49-4fa3-a8e2-edbfacecdaf5-inventory\") pod \"38ca037e-8b49-4fa3-a8e2-edbfacecdaf5\" (UID: \"38ca037e-8b49-4fa3-a8e2-edbfacecdaf5\") " Mar 09 14:43:52 crc kubenswrapper[4722]: I0309 14:43:52.422735 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/38ca037e-8b49-4fa3-a8e2-edbfacecdaf5-nova-metadata-neutron-config-0\") pod \"38ca037e-8b49-4fa3-a8e2-edbfacecdaf5\" (UID: \"38ca037e-8b49-4fa3-a8e2-edbfacecdaf5\") " Mar 09 14:43:52 crc kubenswrapper[4722]: I0309 14:43:52.422822 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/38ca037e-8b49-4fa3-a8e2-edbfacecdaf5-neutron-ovn-metadata-agent-neutron-config-0\") pod \"38ca037e-8b49-4fa3-a8e2-edbfacecdaf5\" (UID: \"38ca037e-8b49-4fa3-a8e2-edbfacecdaf5\") " Mar 09 14:43:52 crc kubenswrapper[4722]: I0309 14:43:52.422954 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/38ca037e-8b49-4fa3-a8e2-edbfacecdaf5-ssh-key-openstack-edpm-ipam\") pod \"38ca037e-8b49-4fa3-a8e2-edbfacecdaf5\" (UID: \"38ca037e-8b49-4fa3-a8e2-edbfacecdaf5\") " Mar 09 14:43:52 crc kubenswrapper[4722]: I0309 14:43:52.428910 4722 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38ca037e-8b49-4fa3-a8e2-edbfacecdaf5-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "38ca037e-8b49-4fa3-a8e2-edbfacecdaf5" (UID: "38ca037e-8b49-4fa3-a8e2-edbfacecdaf5"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:43:52 crc kubenswrapper[4722]: I0309 14:43:52.429233 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38ca037e-8b49-4fa3-a8e2-edbfacecdaf5-kube-api-access-z7nww" (OuterVolumeSpecName: "kube-api-access-z7nww") pod "38ca037e-8b49-4fa3-a8e2-edbfacecdaf5" (UID: "38ca037e-8b49-4fa3-a8e2-edbfacecdaf5"). InnerVolumeSpecName "kube-api-access-z7nww". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:43:52 crc kubenswrapper[4722]: I0309 14:43:52.454263 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38ca037e-8b49-4fa3-a8e2-edbfacecdaf5-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "38ca037e-8b49-4fa3-a8e2-edbfacecdaf5" (UID: "38ca037e-8b49-4fa3-a8e2-edbfacecdaf5"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:43:52 crc kubenswrapper[4722]: I0309 14:43:52.455926 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38ca037e-8b49-4fa3-a8e2-edbfacecdaf5-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "38ca037e-8b49-4fa3-a8e2-edbfacecdaf5" (UID: "38ca037e-8b49-4fa3-a8e2-edbfacecdaf5"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:43:52 crc kubenswrapper[4722]: I0309 14:43:52.458070 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38ca037e-8b49-4fa3-a8e2-edbfacecdaf5-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "38ca037e-8b49-4fa3-a8e2-edbfacecdaf5" (UID: "38ca037e-8b49-4fa3-a8e2-edbfacecdaf5"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:43:52 crc kubenswrapper[4722]: I0309 14:43:52.469038 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38ca037e-8b49-4fa3-a8e2-edbfacecdaf5-inventory" (OuterVolumeSpecName: "inventory") pod "38ca037e-8b49-4fa3-a8e2-edbfacecdaf5" (UID: "38ca037e-8b49-4fa3-a8e2-edbfacecdaf5"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:43:52 crc kubenswrapper[4722]: I0309 14:43:52.526769 4722 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38ca037e-8b49-4fa3-a8e2-edbfacecdaf5-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:43:52 crc kubenswrapper[4722]: I0309 14:43:52.526802 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7nww\" (UniqueName: \"kubernetes.io/projected/38ca037e-8b49-4fa3-a8e2-edbfacecdaf5-kube-api-access-z7nww\") on node \"crc\" DevicePath \"\"" Mar 09 14:43:52 crc kubenswrapper[4722]: I0309 14:43:52.526814 4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/38ca037e-8b49-4fa3-a8e2-edbfacecdaf5-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 14:43:52 crc kubenswrapper[4722]: I0309 14:43:52.526827 4722 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/38ca037e-8b49-4fa3-a8e2-edbfacecdaf5-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 09 14:43:52 crc kubenswrapper[4722]: I0309 14:43:52.526840 4722 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/38ca037e-8b49-4fa3-a8e2-edbfacecdaf5-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 09 14:43:52 crc kubenswrapper[4722]: I0309 14:43:52.526851 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/38ca037e-8b49-4fa3-a8e2-edbfacecdaf5-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 14:43:52 crc kubenswrapper[4722]: I0309 14:43:52.732722 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-26wzs" event={"ID":"38ca037e-8b49-4fa3-a8e2-edbfacecdaf5","Type":"ContainerDied","Data":"cf5bad262f5591cda984b1b0d293461fff6c713d4598ae19d6b7f41b6a1bb32e"} Mar 09 14:43:52 crc kubenswrapper[4722]: I0309 14:43:52.732770 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf5bad262f5591cda984b1b0d293461fff6c713d4598ae19d6b7f41b6a1bb32e" Mar 09 14:43:52 crc kubenswrapper[4722]: I0309 14:43:52.732771 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-26wzs" Mar 09 14:43:52 crc kubenswrapper[4722]: I0309 14:43:52.734985 4722 generic.go:334] "Generic (PLEG): container finished" podID="064f14fc-c287-473b-a503-bc35661e31b2" containerID="2f6242d7b866783b7ee72187d602482f86d45d7a5b42637d6220fc322f986302" exitCode=0 Mar 09 14:43:52 crc kubenswrapper[4722]: I0309 14:43:52.735025 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-srb2n" event={"ID":"064f14fc-c287-473b-a503-bc35661e31b2","Type":"ContainerDied","Data":"2f6242d7b866783b7ee72187d602482f86d45d7a5b42637d6220fc322f986302"} Mar 09 14:43:52 crc kubenswrapper[4722]: I0309 14:43:52.867357 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wqbn2"] Mar 09 14:43:52 crc kubenswrapper[4722]: E0309 14:43:52.867932 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38ca037e-8b49-4fa3-a8e2-edbfacecdaf5" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 09 14:43:52 crc kubenswrapper[4722]: I0309 14:43:52.867957 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="38ca037e-8b49-4fa3-a8e2-edbfacecdaf5" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 09 14:43:52 crc kubenswrapper[4722]: I0309 14:43:52.868371 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="38ca037e-8b49-4fa3-a8e2-edbfacecdaf5" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 09 14:43:52 crc kubenswrapper[4722]: I0309 14:43:52.869765 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wqbn2" Mar 09 14:43:52 crc kubenswrapper[4722]: I0309 14:43:52.873367 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Mar 09 14:43:52 crc kubenswrapper[4722]: I0309 14:43:52.873543 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 14:43:52 crc kubenswrapper[4722]: I0309 14:43:52.873664 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dwlbg" Mar 09 14:43:52 crc kubenswrapper[4722]: I0309 14:43:52.879951 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 14:43:52 crc kubenswrapper[4722]: I0309 14:43:52.880342 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 14:43:52 crc kubenswrapper[4722]: I0309 14:43:52.917936 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wqbn2"] Mar 09 14:43:53 crc kubenswrapper[4722]: I0309 14:43:53.038410 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9k6q\" (UniqueName: \"kubernetes.io/projected/b0bc3b99-5368-4287-8a9d-7b19b8b33e40-kube-api-access-b9k6q\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wqbn2\" (UID: \"b0bc3b99-5368-4287-8a9d-7b19b8b33e40\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wqbn2" Mar 09 14:43:53 crc kubenswrapper[4722]: I0309 14:43:53.038479 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/b0bc3b99-5368-4287-8a9d-7b19b8b33e40-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wqbn2\" (UID: \"b0bc3b99-5368-4287-8a9d-7b19b8b33e40\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wqbn2" Mar 09 14:43:53 crc kubenswrapper[4722]: I0309 14:43:53.038598 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b0bc3b99-5368-4287-8a9d-7b19b8b33e40-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wqbn2\" (UID: \"b0bc3b99-5368-4287-8a9d-7b19b8b33e40\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wqbn2" Mar 09 14:43:53 crc kubenswrapper[4722]: I0309 14:43:53.038632 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0bc3b99-5368-4287-8a9d-7b19b8b33e40-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wqbn2\" (UID: \"b0bc3b99-5368-4287-8a9d-7b19b8b33e40\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wqbn2" Mar 09 14:43:53 crc kubenswrapper[4722]: I0309 14:43:53.038698 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b0bc3b99-5368-4287-8a9d-7b19b8b33e40-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wqbn2\" (UID: \"b0bc3b99-5368-4287-8a9d-7b19b8b33e40\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wqbn2" Mar 09 14:43:53 crc kubenswrapper[4722]: I0309 14:43:53.140745 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9k6q\" (UniqueName: \"kubernetes.io/projected/b0bc3b99-5368-4287-8a9d-7b19b8b33e40-kube-api-access-b9k6q\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wqbn2\" (UID: \"b0bc3b99-5368-4287-8a9d-7b19b8b33e40\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wqbn2" Mar 09 14:43:53 crc kubenswrapper[4722]: I0309 14:43:53.140804 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0bc3b99-5368-4287-8a9d-7b19b8b33e40-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wqbn2\" (UID: \"b0bc3b99-5368-4287-8a9d-7b19b8b33e40\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wqbn2" Mar 09 14:43:53 crc kubenswrapper[4722]: I0309 14:43:53.140891 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b0bc3b99-5368-4287-8a9d-7b19b8b33e40-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wqbn2\" (UID: \"b0bc3b99-5368-4287-8a9d-7b19b8b33e40\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wqbn2" Mar 09 14:43:53 crc kubenswrapper[4722]: I0309 14:43:53.140921 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0bc3b99-5368-4287-8a9d-7b19b8b33e40-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wqbn2\" (UID: \"b0bc3b99-5368-4287-8a9d-7b19b8b33e40\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wqbn2" Mar 09 14:43:53 crc kubenswrapper[4722]: I0309 14:43:53.140983 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b0bc3b99-5368-4287-8a9d-7b19b8b33e40-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wqbn2\" (UID: \"b0bc3b99-5368-4287-8a9d-7b19b8b33e40\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wqbn2" Mar 09 14:43:53 crc kubenswrapper[4722]: I0309 14:43:53.145797 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0bc3b99-5368-4287-8a9d-7b19b8b33e40-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wqbn2\" (UID: \"b0bc3b99-5368-4287-8a9d-7b19b8b33e40\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wqbn2" Mar 09 14:43:53 crc kubenswrapper[4722]: I0309 14:43:53.145797 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b0bc3b99-5368-4287-8a9d-7b19b8b33e40-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wqbn2\" (UID: \"b0bc3b99-5368-4287-8a9d-7b19b8b33e40\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wqbn2" Mar 09 14:43:53 crc kubenswrapper[4722]: I0309 14:43:53.146601 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b0bc3b99-5368-4287-8a9d-7b19b8b33e40-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wqbn2\" (UID: \"b0bc3b99-5368-4287-8a9d-7b19b8b33e40\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wqbn2" Mar 09 14:43:53 crc kubenswrapper[4722]: I0309 14:43:53.158566 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0bc3b99-5368-4287-8a9d-7b19b8b33e40-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wqbn2\" (UID: \"b0bc3b99-5368-4287-8a9d-7b19b8b33e40\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wqbn2" Mar 09 14:43:53 crc kubenswrapper[4722]: I0309 14:43:53.161553 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9k6q\" (UniqueName: \"kubernetes.io/projected/b0bc3b99-5368-4287-8a9d-7b19b8b33e40-kube-api-access-b9k6q\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-wqbn2\" (UID: \"b0bc3b99-5368-4287-8a9d-7b19b8b33e40\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wqbn2" Mar 09 14:43:53 crc kubenswrapper[4722]: I0309 14:43:53.204964 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wqbn2" Mar 09 14:43:53 crc kubenswrapper[4722]: I0309 14:43:53.804801 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wqbn2"] Mar 09 14:43:54 crc kubenswrapper[4722]: I0309 14:43:54.753659 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wqbn2" event={"ID":"b0bc3b99-5368-4287-8a9d-7b19b8b33e40","Type":"ContainerStarted","Data":"ca7729f7380f7b4585532e01d2ac2f6ecfa27c49b0a89574cc5163d0692a65a7"} Mar 09 14:43:54 crc kubenswrapper[4722]: I0309 14:43:54.756178 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-srb2n" event={"ID":"064f14fc-c287-473b-a503-bc35661e31b2","Type":"ContainerStarted","Data":"604020f6aa67457521d257e377086347d5738882aca6eaaeba41f22224559c93"} Mar 09 14:43:55 crc kubenswrapper[4722]: I0309 14:43:55.770696 4722 generic.go:334] "Generic (PLEG): container finished" podID="064f14fc-c287-473b-a503-bc35661e31b2" containerID="604020f6aa67457521d257e377086347d5738882aca6eaaeba41f22224559c93" exitCode=0 Mar 09 14:43:55 crc kubenswrapper[4722]: I0309 14:43:55.771097 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-srb2n" event={"ID":"064f14fc-c287-473b-a503-bc35661e31b2","Type":"ContainerDied","Data":"604020f6aa67457521d257e377086347d5738882aca6eaaeba41f22224559c93"} Mar 09 14:43:55 crc kubenswrapper[4722]: I0309 14:43:55.773562 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wqbn2" event={"ID":"b0bc3b99-5368-4287-8a9d-7b19b8b33e40","Type":"ContainerStarted","Data":"341d5625aff9f0cd0efb9bef91d2bc9e3c10a9467d9d33de0ae3b0e0e1c396f6"} Mar 09 14:43:55 crc kubenswrapper[4722]: I0309 14:43:55.806133 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wqbn2" podStartSLOduration=3.137111321 podStartE2EDuration="3.806116082s" podCreationTimestamp="2026-03-09 14:43:52 +0000 UTC" firstStartedPulling="2026-03-09 14:43:53.80778389 +0000 UTC m=+2474.363352466" lastFinishedPulling="2026-03-09 14:43:54.476788641 +0000 UTC m=+2475.032357227" observedRunningTime="2026-03-09 14:43:55.804010764 +0000 UTC m=+2476.359579330" watchObservedRunningTime="2026-03-09 14:43:55.806116082 +0000 UTC m=+2476.361684658" Mar 09 14:43:56 crc kubenswrapper[4722]: I0309 14:43:56.791986 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-srb2n" event={"ID":"064f14fc-c287-473b-a503-bc35661e31b2","Type":"ContainerStarted","Data":"d055cbf058a735fa6a48bdd2fcf0cbd3b055f837fd1dbe56289eacb23b0b6c1c"} Mar 09 14:43:56 crc kubenswrapper[4722]: I0309 14:43:56.840506 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-srb2n" podStartSLOduration=3.129827785 podStartE2EDuration="6.840482727s" podCreationTimestamp="2026-03-09 14:43:50 +0000 UTC" firstStartedPulling="2026-03-09 14:43:52.737955574 +0000 UTC m=+2473.293524150" lastFinishedPulling="2026-03-09 14:43:56.448610516 +0000 UTC m=+2477.004179092" observedRunningTime="2026-03-09 14:43:56.823004753 +0000 UTC m=+2477.378573329" watchObservedRunningTime="2026-03-09 14:43:56.840482727 +0000 UTC m=+2477.396051303" Mar 09 14:44:00 crc kubenswrapper[4722]: I0309 14:44:00.131096 4722 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-infra/auto-csr-approver-29551124-rgb7c"] Mar 09 14:44:00 crc kubenswrapper[4722]: I0309 14:44:00.133252 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551124-rgb7c" Mar 09 14:44:00 crc kubenswrapper[4722]: I0309 14:44:00.142651 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551124-rgb7c"] Mar 09 14:44:00 crc kubenswrapper[4722]: I0309 14:44:00.165076 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 14:44:00 crc kubenswrapper[4722]: I0309 14:44:00.165319 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 14:44:00 crc kubenswrapper[4722]: I0309 14:44:00.165407 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-dr2x6" Mar 09 14:44:00 crc kubenswrapper[4722]: I0309 14:44:00.166710 4722 scope.go:117] "RemoveContainer" containerID="931310cf17f93e3962cdbbdf531169aefb3915147cff22375dd9c06d9d6b2906" Mar 09 14:44:00 crc kubenswrapper[4722]: E0309 14:44:00.182405 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 14:44:00 crc kubenswrapper[4722]: I0309 14:44:00.327192 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2npq\" (UniqueName: \"kubernetes.io/projected/2837a381-5705-491b-81d2-62e8a19e50b4-kube-api-access-t2npq\") pod \"auto-csr-approver-29551124-rgb7c\" (UID: \"2837a381-5705-491b-81d2-62e8a19e50b4\") " pod="openshift-infra/auto-csr-approver-29551124-rgb7c" Mar 09 14:44:00 crc kubenswrapper[4722]: I0309 14:44:00.429505 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2npq\" (UniqueName: \"kubernetes.io/projected/2837a381-5705-491b-81d2-62e8a19e50b4-kube-api-access-t2npq\") pod \"auto-csr-approver-29551124-rgb7c\" (UID: \"2837a381-5705-491b-81d2-62e8a19e50b4\") " pod="openshift-infra/auto-csr-approver-29551124-rgb7c" Mar 09 14:44:00 crc kubenswrapper[4722]: I0309 14:44:00.447494 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2npq\" (UniqueName: \"kubernetes.io/projected/2837a381-5705-491b-81d2-62e8a19e50b4-kube-api-access-t2npq\") pod \"auto-csr-approver-29551124-rgb7c\" (UID: \"2837a381-5705-491b-81d2-62e8a19e50b4\") " pod="openshift-infra/auto-csr-approver-29551124-rgb7c" Mar 09 14:44:00 crc kubenswrapper[4722]: I0309 14:44:00.506429 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551124-rgb7c" Mar 09 14:44:00 crc kubenswrapper[4722]: I0309 14:44:00.988511 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551124-rgb7c"] Mar 09 14:44:01 crc kubenswrapper[4722]: I0309 14:44:01.072393 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-srb2n" Mar 09 14:44:01 crc kubenswrapper[4722]: I0309 14:44:01.073467 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-srb2n" Mar 09 14:44:01 crc kubenswrapper[4722]: I0309 14:44:01.123611 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-srb2n" Mar 09 14:44:01 crc kubenswrapper[4722]: I0309 14:44:01.841472 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551124-rgb7c" event={"ID":"2837a381-5705-491b-81d2-62e8a19e50b4","Type":"ContainerStarted","Data":"7028e5146802960a85188eb67ef7f66d9aaea7ef57906a5fa335df96ee5d3597"} Mar 09 14:44:01 crc kubenswrapper[4722]: I0309 14:44:01.896839 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-srb2n" Mar 09 14:44:01 crc kubenswrapper[4722]: I0309 14:44:01.956091 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-srb2n"] Mar 09 14:44:02 crc kubenswrapper[4722]: I0309 14:44:02.872036 4722 generic.go:334] "Generic (PLEG): container finished" podID="2837a381-5705-491b-81d2-62e8a19e50b4" containerID="064ff878f52dc16d37f76a0e2885da26d610dd057b9931999269f6530097eeb0" exitCode=0 Mar 09 14:44:02 crc kubenswrapper[4722]: I0309 14:44:02.872124 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551124-rgb7c" event={"ID":"2837a381-5705-491b-81d2-62e8a19e50b4","Type":"ContainerDied","Data":"064ff878f52dc16d37f76a0e2885da26d610dd057b9931999269f6530097eeb0"} Mar 09 14:44:03 crc kubenswrapper[4722]: I0309 14:44:03.881726 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-srb2n" podUID="064f14fc-c287-473b-a503-bc35661e31b2" containerName="registry-server" containerID="cri-o://d055cbf058a735fa6a48bdd2fcf0cbd3b055f837fd1dbe56289eacb23b0b6c1c" gracePeriod=2 Mar 09 14:44:04 crc kubenswrapper[4722]: I0309 14:44:04.307140 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551124-rgb7c" Mar 09 14:44:04 crc kubenswrapper[4722]: I0309 14:44:04.444000 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2npq\" (UniqueName: \"kubernetes.io/projected/2837a381-5705-491b-81d2-62e8a19e50b4-kube-api-access-t2npq\") pod \"2837a381-5705-491b-81d2-62e8a19e50b4\" (UID: \"2837a381-5705-491b-81d2-62e8a19e50b4\") " Mar 09 14:44:04 crc kubenswrapper[4722]: I0309 14:44:04.471343 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2837a381-5705-491b-81d2-62e8a19e50b4-kube-api-access-t2npq" (OuterVolumeSpecName: "kube-api-access-t2npq") pod "2837a381-5705-491b-81d2-62e8a19e50b4" (UID: "2837a381-5705-491b-81d2-62e8a19e50b4"). InnerVolumeSpecName "kube-api-access-t2npq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:44:04 crc kubenswrapper[4722]: I0309 14:44:04.482973 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-srb2n" Mar 09 14:44:04 crc kubenswrapper[4722]: I0309 14:44:04.547783 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2npq\" (UniqueName: \"kubernetes.io/projected/2837a381-5705-491b-81d2-62e8a19e50b4-kube-api-access-t2npq\") on node \"crc\" DevicePath \"\"" Mar 09 14:44:04 crc kubenswrapper[4722]: I0309 14:44:04.649663 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/064f14fc-c287-473b-a503-bc35661e31b2-catalog-content\") pod \"064f14fc-c287-473b-a503-bc35661e31b2\" (UID: \"064f14fc-c287-473b-a503-bc35661e31b2\") " Mar 09 14:44:04 crc kubenswrapper[4722]: I0309 14:44:04.649865 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzcmg\" (UniqueName: \"kubernetes.io/projected/064f14fc-c287-473b-a503-bc35661e31b2-kube-api-access-qzcmg\") pod \"064f14fc-c287-473b-a503-bc35661e31b2\" (UID: \"064f14fc-c287-473b-a503-bc35661e31b2\") " Mar 09 14:44:04 crc kubenswrapper[4722]: I0309 14:44:04.649959 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/064f14fc-c287-473b-a503-bc35661e31b2-utilities\") pod \"064f14fc-c287-473b-a503-bc35661e31b2\" (UID: \"064f14fc-c287-473b-a503-bc35661e31b2\") " Mar 09 14:44:04 crc kubenswrapper[4722]: I0309 14:44:04.651296 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/064f14fc-c287-473b-a503-bc35661e31b2-utilities" (OuterVolumeSpecName: "utilities") pod "064f14fc-c287-473b-a503-bc35661e31b2" (UID: "064f14fc-c287-473b-a503-bc35661e31b2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:44:04 crc kubenswrapper[4722]: I0309 14:44:04.653369 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/064f14fc-c287-473b-a503-bc35661e31b2-kube-api-access-qzcmg" (OuterVolumeSpecName: "kube-api-access-qzcmg") pod "064f14fc-c287-473b-a503-bc35661e31b2" (UID: "064f14fc-c287-473b-a503-bc35661e31b2"). InnerVolumeSpecName "kube-api-access-qzcmg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:44:04 crc kubenswrapper[4722]: I0309 14:44:04.674954 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/064f14fc-c287-473b-a503-bc35661e31b2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "064f14fc-c287-473b-a503-bc35661e31b2" (UID: "064f14fc-c287-473b-a503-bc35661e31b2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:44:04 crc kubenswrapper[4722]: I0309 14:44:04.754174 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qzcmg\" (UniqueName: \"kubernetes.io/projected/064f14fc-c287-473b-a503-bc35661e31b2-kube-api-access-qzcmg\") on node \"crc\" DevicePath \"\"" Mar 09 14:44:04 crc kubenswrapper[4722]: I0309 14:44:04.754236 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/064f14fc-c287-473b-a503-bc35661e31b2-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 14:44:04 crc kubenswrapper[4722]: I0309 14:44:04.754251 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/064f14fc-c287-473b-a503-bc35661e31b2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 14:44:04 crc kubenswrapper[4722]: I0309 14:44:04.926717 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551124-rgb7c" Mar 09 14:44:04 crc kubenswrapper[4722]: I0309 14:44:04.926714 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551124-rgb7c" event={"ID":"2837a381-5705-491b-81d2-62e8a19e50b4","Type":"ContainerDied","Data":"7028e5146802960a85188eb67ef7f66d9aaea7ef57906a5fa335df96ee5d3597"} Mar 09 14:44:04 crc kubenswrapper[4722]: I0309 14:44:04.926926 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7028e5146802960a85188eb67ef7f66d9aaea7ef57906a5fa335df96ee5d3597" Mar 09 14:44:04 crc kubenswrapper[4722]: I0309 14:44:04.929465 4722 generic.go:334] "Generic (PLEG): container finished" podID="064f14fc-c287-473b-a503-bc35661e31b2" containerID="d055cbf058a735fa6a48bdd2fcf0cbd3b055f837fd1dbe56289eacb23b0b6c1c" exitCode=0 Mar 09 14:44:04 crc kubenswrapper[4722]: I0309 14:44:04.929507 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-srb2n" event={"ID":"064f14fc-c287-473b-a503-bc35661e31b2","Type":"ContainerDied","Data":"d055cbf058a735fa6a48bdd2fcf0cbd3b055f837fd1dbe56289eacb23b0b6c1c"} Mar 09 14:44:04 crc kubenswrapper[4722]: I0309 14:44:04.929536 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-srb2n" event={"ID":"064f14fc-c287-473b-a503-bc35661e31b2","Type":"ContainerDied","Data":"ec0d755cacb9c01fea044a230644be3cceac07251f7269b0af9d3704af941557"} Mar 09 14:44:04 crc kubenswrapper[4722]: I0309 14:44:04.929555 4722 scope.go:117] "RemoveContainer" containerID="d055cbf058a735fa6a48bdd2fcf0cbd3b055f837fd1dbe56289eacb23b0b6c1c" Mar 09 14:44:04 crc kubenswrapper[4722]: I0309 14:44:04.929590 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-srb2n" Mar 09 14:44:04 crc kubenswrapper[4722]: I0309 14:44:04.949577 4722 scope.go:117] "RemoveContainer" containerID="604020f6aa67457521d257e377086347d5738882aca6eaaeba41f22224559c93" Mar 09 14:44:04 crc kubenswrapper[4722]: I0309 14:44:04.981360 4722 scope.go:117] "RemoveContainer" containerID="2f6242d7b866783b7ee72187d602482f86d45d7a5b42637d6220fc322f986302" Mar 09 14:44:04 crc kubenswrapper[4722]: I0309 14:44:04.982599 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-srb2n"] Mar 09 14:44:04 crc kubenswrapper[4722]: I0309 14:44:04.994701 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-srb2n"] Mar 09 14:44:05 crc kubenswrapper[4722]: I0309 14:44:05.011265 4722 scope.go:117] "RemoveContainer" containerID="d055cbf058a735fa6a48bdd2fcf0cbd3b055f837fd1dbe56289eacb23b0b6c1c" Mar 09 14:44:05 crc kubenswrapper[4722]: E0309 14:44:05.011629 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d055cbf058a735fa6a48bdd2fcf0cbd3b055f837fd1dbe56289eacb23b0b6c1c\": container with ID starting with d055cbf058a735fa6a48bdd2fcf0cbd3b055f837fd1dbe56289eacb23b0b6c1c not found: ID does not exist" containerID="d055cbf058a735fa6a48bdd2fcf0cbd3b055f837fd1dbe56289eacb23b0b6c1c" Mar 09 14:44:05 crc kubenswrapper[4722]: I0309 14:44:05.011671 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d055cbf058a735fa6a48bdd2fcf0cbd3b055f837fd1dbe56289eacb23b0b6c1c"} err="failed to get container status \"d055cbf058a735fa6a48bdd2fcf0cbd3b055f837fd1dbe56289eacb23b0b6c1c\": rpc error: code = NotFound desc = could not find container \"d055cbf058a735fa6a48bdd2fcf0cbd3b055f837fd1dbe56289eacb23b0b6c1c\": container with ID starting with d055cbf058a735fa6a48bdd2fcf0cbd3b055f837fd1dbe56289eacb23b0b6c1c not found: ID does not exist" Mar 09 14:44:05 crc kubenswrapper[4722]: I0309 14:44:05.011701 4722 scope.go:117] "RemoveContainer" containerID="604020f6aa67457521d257e377086347d5738882aca6eaaeba41f22224559c93" Mar 09 14:44:05 crc kubenswrapper[4722]: E0309 14:44:05.011984 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"604020f6aa67457521d257e377086347d5738882aca6eaaeba41f22224559c93\": container with ID starting with 604020f6aa67457521d257e377086347d5738882aca6eaaeba41f22224559c93 not found: ID does not exist" containerID="604020f6aa67457521d257e377086347d5738882aca6eaaeba41f22224559c93" Mar 09 14:44:05 crc kubenswrapper[4722]: I0309 14:44:05.012008 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"604020f6aa67457521d257e377086347d5738882aca6eaaeba41f22224559c93"} err="failed to get container status \"604020f6aa67457521d257e377086347d5738882aca6eaaeba41f22224559c93\": rpc error: code = NotFound desc = could not find container \"604020f6aa67457521d257e377086347d5738882aca6eaaeba41f22224559c93\": container with ID starting with 604020f6aa67457521d257e377086347d5738882aca6eaaeba41f22224559c93 not found: ID does not exist" Mar 09 14:44:05 crc kubenswrapper[4722]: I0309 14:44:05.012026 4722 scope.go:117] "RemoveContainer" containerID="2f6242d7b866783b7ee72187d602482f86d45d7a5b42637d6220fc322f986302" Mar 09 14:44:05 crc kubenswrapper[4722]: E0309 14:44:05.012291 4722 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"2f6242d7b866783b7ee72187d602482f86d45d7a5b42637d6220fc322f986302\": container with ID starting with 2f6242d7b866783b7ee72187d602482f86d45d7a5b42637d6220fc322f986302 not found: ID does not exist" containerID="2f6242d7b866783b7ee72187d602482f86d45d7a5b42637d6220fc322f986302" Mar 09 14:44:05 crc kubenswrapper[4722]: I0309 14:44:05.012338 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f6242d7b866783b7ee72187d602482f86d45d7a5b42637d6220fc322f986302"} err="failed to get container status \"2f6242d7b866783b7ee72187d602482f86d45d7a5b42637d6220fc322f986302\": rpc error: code = NotFound desc = could not find container \"2f6242d7b866783b7ee72187d602482f86d45d7a5b42637d6220fc322f986302\": container with ID starting with 2f6242d7b866783b7ee72187d602482f86d45d7a5b42637d6220fc322f986302 not found: ID does not exist" Mar 09 14:44:05 crc kubenswrapper[4722]: I0309 14:44:05.407648 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551118-j6srk"] Mar 09 14:44:05 crc kubenswrapper[4722]: I0309 14:44:05.417816 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551118-j6srk"] Mar 09 14:44:06 crc kubenswrapper[4722]: I0309 14:44:06.163009 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="064f14fc-c287-473b-a503-bc35661e31b2" path="/var/lib/kubelet/pods/064f14fc-c287-473b-a503-bc35661e31b2/volumes" Mar 09 14:44:06 crc kubenswrapper[4722]: I0309 14:44:06.164150 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a081f1f1-e065-4d4e-87b8-878c36dd0582" path="/var/lib/kubelet/pods/a081f1f1-e065-4d4e-87b8-878c36dd0582/volumes" Mar 09 14:44:15 crc kubenswrapper[4722]: I0309 14:44:15.150274 4722 scope.go:117] "RemoveContainer" containerID="931310cf17f93e3962cdbbdf531169aefb3915147cff22375dd9c06d9d6b2906" Mar 09 14:44:15 crc kubenswrapper[4722]: E0309 14:44:15.151160 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 14:44:27 crc kubenswrapper[4722]: I0309 14:44:27.931853 4722 scope.go:117] "RemoveContainer" containerID="cfee5a6fa0116f9ffffb52b0c238fd511806c2adf6e486bfe1c7a705d6c669df" Mar 09 14:44:29 crc kubenswrapper[4722]: I0309 14:44:29.151621 4722 scope.go:117] "RemoveContainer" containerID="931310cf17f93e3962cdbbdf531169aefb3915147cff22375dd9c06d9d6b2906" Mar 09 14:44:29 crc kubenswrapper[4722]: E0309 14:44:29.152592 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 14:44:42 crc kubenswrapper[4722]: I0309 14:44:42.149108 4722 scope.go:117] "RemoveContainer" containerID="931310cf17f93e3962cdbbdf531169aefb3915147cff22375dd9c06d9d6b2906" Mar 09 14:44:42 crc kubenswrapper[4722]: E0309 14:44:42.150060 
4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 14:44:55 crc kubenswrapper[4722]: I0309 14:44:55.150460 4722 scope.go:117] "RemoveContainer" containerID="931310cf17f93e3962cdbbdf531169aefb3915147cff22375dd9c06d9d6b2906" Mar 09 14:44:55 crc kubenswrapper[4722]: E0309 14:44:55.151606 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 14:45:00 crc kubenswrapper[4722]: I0309 14:45:00.167300 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551125-s9v26"] Mar 09 14:45:00 crc kubenswrapper[4722]: E0309 14:45:00.168513 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="064f14fc-c287-473b-a503-bc35661e31b2" containerName="registry-server" Mar 09 14:45:00 crc kubenswrapper[4722]: I0309 14:45:00.168535 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="064f14fc-c287-473b-a503-bc35661e31b2" containerName="registry-server" Mar 09 14:45:00 crc kubenswrapper[4722]: E0309 14:45:00.168579 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2837a381-5705-491b-81d2-62e8a19e50b4" containerName="oc" Mar 09 14:45:00 crc kubenswrapper[4722]: I0309 14:45:00.168593 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="2837a381-5705-491b-81d2-62e8a19e50b4" containerName="oc" Mar 09 14:45:00 crc kubenswrapper[4722]: E0309 14:45:00.168638 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="064f14fc-c287-473b-a503-bc35661e31b2" containerName="extract-content" Mar 09 14:45:00 crc kubenswrapper[4722]: I0309 14:45:00.168647 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="064f14fc-c287-473b-a503-bc35661e31b2" containerName="extract-content" Mar 09 14:45:00 crc kubenswrapper[4722]: E0309 14:45:00.168663 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="064f14fc-c287-473b-a503-bc35661e31b2" containerName="extract-utilities" Mar 09 14:45:00 crc kubenswrapper[4722]: I0309 14:45:00.168671 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="064f14fc-c287-473b-a503-bc35661e31b2" containerName="extract-utilities" Mar 09 14:45:00 crc kubenswrapper[4722]: I0309 14:45:00.169042 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="2837a381-5705-491b-81d2-62e8a19e50b4" containerName="oc" Mar 09 14:45:00 crc kubenswrapper[4722]: I0309 14:45:00.169093 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="064f14fc-c287-473b-a503-bc35661e31b2" containerName="registry-server" Mar 09 14:45:00 crc kubenswrapper[4722]: I0309 14:45:00.170295 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551125-s9v26" Mar 09 14:45:00 crc kubenswrapper[4722]: I0309 14:45:00.173461 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 09 14:45:00 crc kubenswrapper[4722]: I0309 14:45:00.173705 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 09 14:45:00 crc kubenswrapper[4722]: I0309 14:45:00.206406 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551125-s9v26"] Mar 09 14:45:00 crc kubenswrapper[4722]: I0309 14:45:00.315978 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9db8477f-7668-4dee-8ffd-8ceec067e99f-secret-volume\") pod \"collect-profiles-29551125-s9v26\" (UID: \"9db8477f-7668-4dee-8ffd-8ceec067e99f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551125-s9v26" Mar 09 14:45:00 crc kubenswrapper[4722]: I0309 14:45:00.317146 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9db8477f-7668-4dee-8ffd-8ceec067e99f-config-volume\") pod \"collect-profiles-29551125-s9v26\" (UID: \"9db8477f-7668-4dee-8ffd-8ceec067e99f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551125-s9v26" Mar 09 14:45:00 crc kubenswrapper[4722]: I0309 14:45:00.317588 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnp6z\" (UniqueName: \"kubernetes.io/projected/9db8477f-7668-4dee-8ffd-8ceec067e99f-kube-api-access-rnp6z\") pod \"collect-profiles-29551125-s9v26\" (UID: \"9db8477f-7668-4dee-8ffd-8ceec067e99f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551125-s9v26" Mar 09 14:45:00 crc kubenswrapper[4722]: I0309 14:45:00.420896 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9db8477f-7668-4dee-8ffd-8ceec067e99f-secret-volume\") pod \"collect-profiles-29551125-s9v26\" (UID: \"9db8477f-7668-4dee-8ffd-8ceec067e99f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551125-s9v26" Mar 09 14:45:00 crc kubenswrapper[4722]: I0309 14:45:00.421190 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9db8477f-7668-4dee-8ffd-8ceec067e99f-config-volume\") pod \"collect-profiles-29551125-s9v26\" (UID: \"9db8477f-7668-4dee-8ffd-8ceec067e99f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551125-s9v26" Mar 09 14:45:00 crc kubenswrapper[4722]: I0309 14:45:00.422234 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9db8477f-7668-4dee-8ffd-8ceec067e99f-config-volume\") pod \"collect-profiles-29551125-s9v26\" (UID: \"9db8477f-7668-4dee-8ffd-8ceec067e99f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551125-s9v26" Mar 09 14:45:00 crc kubenswrapper[4722]: I0309 14:45:00.422542 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnp6z\" (UniqueName: \"kubernetes.io/projected/9db8477f-7668-4dee-8ffd-8ceec067e99f-kube-api-access-rnp6z\") pod 
\"collect-profiles-29551125-s9v26\" (UID: \"9db8477f-7668-4dee-8ffd-8ceec067e99f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551125-s9v26" Mar 09 14:45:00 crc kubenswrapper[4722]: I0309 14:45:00.429136 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9db8477f-7668-4dee-8ffd-8ceec067e99f-secret-volume\") pod \"collect-profiles-29551125-s9v26\" (UID: \"9db8477f-7668-4dee-8ffd-8ceec067e99f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551125-s9v26" Mar 09 14:45:00 crc kubenswrapper[4722]: I0309 14:45:00.446308 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnp6z\" (UniqueName: \"kubernetes.io/projected/9db8477f-7668-4dee-8ffd-8ceec067e99f-kube-api-access-rnp6z\") pod \"collect-profiles-29551125-s9v26\" (UID: \"9db8477f-7668-4dee-8ffd-8ceec067e99f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551125-s9v26" Mar 09 14:45:00 crc kubenswrapper[4722]: I0309 14:45:00.503400 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551125-s9v26" Mar 09 14:45:01 crc kubenswrapper[4722]: I0309 14:45:01.031720 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551125-s9v26"] Mar 09 14:45:01 crc kubenswrapper[4722]: I0309 14:45:01.593584 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551125-s9v26" event={"ID":"9db8477f-7668-4dee-8ffd-8ceec067e99f","Type":"ContainerStarted","Data":"686f78f6fd8badc955a3137b91963993db8394a75fbc1f22cbbf6617559d2bfe"} Mar 09 14:45:01 crc kubenswrapper[4722]: I0309 14:45:01.593946 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551125-s9v26" event={"ID":"9db8477f-7668-4dee-8ffd-8ceec067e99f","Type":"ContainerStarted","Data":"73360a68b57fa8d38600759a03a4549d7b4c0cb667a3869ce01c8a8e643027c6"} Mar 09 14:45:01 crc kubenswrapper[4722]: I0309 14:45:01.635965 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29551125-s9v26" podStartSLOduration=1.635942738 podStartE2EDuration="1.635942738s" podCreationTimestamp="2026-03-09 14:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 14:45:01.614519608 +0000 UTC m=+2542.170088184" watchObservedRunningTime="2026-03-09 14:45:01.635942738 +0000 UTC m=+2542.191511314" Mar 09 14:45:02 crc kubenswrapper[4722]: I0309 14:45:02.603531 4722 generic.go:334] "Generic (PLEG): container finished" podID="9db8477f-7668-4dee-8ffd-8ceec067e99f" containerID="686f78f6fd8badc955a3137b91963993db8394a75fbc1f22cbbf6617559d2bfe" exitCode=0 Mar 09 14:45:02 crc kubenswrapper[4722]: I0309 14:45:02.603576 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551125-s9v26" event={"ID":"9db8477f-7668-4dee-8ffd-8ceec067e99f","Type":"ContainerDied","Data":"686f78f6fd8badc955a3137b91963993db8394a75fbc1f22cbbf6617559d2bfe"} Mar 09 14:45:04 crc kubenswrapper[4722]: I0309 14:45:04.049164 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551125-s9v26" Mar 09 14:45:04 crc kubenswrapper[4722]: I0309 14:45:04.234894 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9db8477f-7668-4dee-8ffd-8ceec067e99f-config-volume\") pod \"9db8477f-7668-4dee-8ffd-8ceec067e99f\" (UID: \"9db8477f-7668-4dee-8ffd-8ceec067e99f\") " Mar 09 14:45:04 crc kubenswrapper[4722]: I0309 14:45:04.235435 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnp6z\" (UniqueName: \"kubernetes.io/projected/9db8477f-7668-4dee-8ffd-8ceec067e99f-kube-api-access-rnp6z\") pod \"9db8477f-7668-4dee-8ffd-8ceec067e99f\" (UID: \"9db8477f-7668-4dee-8ffd-8ceec067e99f\") " Mar 09 14:45:04 crc kubenswrapper[4722]: I0309 14:45:04.235612 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9db8477f-7668-4dee-8ffd-8ceec067e99f-config-volume" (OuterVolumeSpecName: "config-volume") pod "9db8477f-7668-4dee-8ffd-8ceec067e99f" (UID: "9db8477f-7668-4dee-8ffd-8ceec067e99f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 14:45:04 crc kubenswrapper[4722]: I0309 14:45:04.235662 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9db8477f-7668-4dee-8ffd-8ceec067e99f-secret-volume\") pod \"9db8477f-7668-4dee-8ffd-8ceec067e99f\" (UID: \"9db8477f-7668-4dee-8ffd-8ceec067e99f\") " Mar 09 14:45:04 crc kubenswrapper[4722]: I0309 14:45:04.238018 4722 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9db8477f-7668-4dee-8ffd-8ceec067e99f-config-volume\") on node \"crc\" DevicePath \"\"" Mar 09 14:45:04 crc kubenswrapper[4722]: I0309 14:45:04.241027 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9db8477f-7668-4dee-8ffd-8ceec067e99f-kube-api-access-rnp6z" (OuterVolumeSpecName: "kube-api-access-rnp6z") pod "9db8477f-7668-4dee-8ffd-8ceec067e99f" (UID: "9db8477f-7668-4dee-8ffd-8ceec067e99f"). InnerVolumeSpecName "kube-api-access-rnp6z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:45:04 crc kubenswrapper[4722]: I0309 14:45:04.248891 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9db8477f-7668-4dee-8ffd-8ceec067e99f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "9db8477f-7668-4dee-8ffd-8ceec067e99f" (UID: "9db8477f-7668-4dee-8ffd-8ceec067e99f"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:45:04 crc kubenswrapper[4722]: I0309 14:45:04.340896 4722 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9db8477f-7668-4dee-8ffd-8ceec067e99f-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 09 14:45:04 crc kubenswrapper[4722]: I0309 14:45:04.340934 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnp6z\" (UniqueName: \"kubernetes.io/projected/9db8477f-7668-4dee-8ffd-8ceec067e99f-kube-api-access-rnp6z\") on node \"crc\" DevicePath \"\"" Mar 09 14:45:04 crc kubenswrapper[4722]: I0309 14:45:04.637551 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551125-s9v26" event={"ID":"9db8477f-7668-4dee-8ffd-8ceec067e99f","Type":"ContainerDied","Data":"73360a68b57fa8d38600759a03a4549d7b4c0cb667a3869ce01c8a8e643027c6"} Mar 09 14:45:04 crc kubenswrapper[4722]: I0309 14:45:04.637593 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="73360a68b57fa8d38600759a03a4549d7b4c0cb667a3869ce01c8a8e643027c6" Mar 09 14:45:04 crc kubenswrapper[4722]: I0309 14:45:04.637632 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551125-s9v26" Mar 09 14:45:04 crc kubenswrapper[4722]: I0309 14:45:04.708690 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551080-mdzx2"] Mar 09 14:45:04 crc kubenswrapper[4722]: I0309 14:45:04.723525 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551080-mdzx2"] Mar 09 14:45:06 crc kubenswrapper[4722]: I0309 14:45:06.167888 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e1fdc12-aac5-4f72-9b22-0212c2f3988e" path="/var/lib/kubelet/pods/7e1fdc12-aac5-4f72-9b22-0212c2f3988e/volumes" Mar 09 14:45:07 crc kubenswrapper[4722]: I0309 14:45:07.149638 4722 scope.go:117] "RemoveContainer" containerID="931310cf17f93e3962cdbbdf531169aefb3915147cff22375dd9c06d9d6b2906" Mar 09 14:45:07 crc kubenswrapper[4722]: E0309 14:45:07.150573 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 14:45:21 crc kubenswrapper[4722]: I0309 14:45:21.149649 4722 scope.go:117] "RemoveContainer" containerID="931310cf17f93e3962cdbbdf531169aefb3915147cff22375dd9c06d9d6b2906" Mar 09 14:45:21 crc kubenswrapper[4722]: E0309 14:45:21.150526 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 14:45:28 crc kubenswrapper[4722]: I0309 14:45:28.056231 4722 scope.go:117] "RemoveContainer" containerID="dc947cec8c36a2871a868e6175dd1a21b31edd8e69d4e4e07cedf44eaf8a8a73" Mar 09 14:45:34 
crc kubenswrapper[4722]: I0309 14:45:34.149766 4722 scope.go:117] "RemoveContainer" containerID="931310cf17f93e3962cdbbdf531169aefb3915147cff22375dd9c06d9d6b2906" Mar 09 14:45:34 crc kubenswrapper[4722]: E0309 14:45:34.150812 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 14:45:48 crc kubenswrapper[4722]: I0309 14:45:48.153561 4722 scope.go:117] "RemoveContainer" containerID="931310cf17f93e3962cdbbdf531169aefb3915147cff22375dd9c06d9d6b2906" Mar 09 14:45:48 crc kubenswrapper[4722]: E0309 14:45:48.154710 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 14:46:00 crc kubenswrapper[4722]: I0309 14:46:00.184109 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551126-zh8m5"] Mar 09 14:46:00 crc kubenswrapper[4722]: E0309 14:46:00.185320 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9db8477f-7668-4dee-8ffd-8ceec067e99f" containerName="collect-profiles" Mar 09 14:46:00 crc kubenswrapper[4722]: I0309 14:46:00.185338 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="9db8477f-7668-4dee-8ffd-8ceec067e99f" containerName="collect-profiles" Mar 09 14:46:00 crc kubenswrapper[4722]: I0309 14:46:00.185641 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="9db8477f-7668-4dee-8ffd-8ceec067e99f" containerName="collect-profiles" Mar 09 14:46:00 crc kubenswrapper[4722]: I0309 14:46:00.187034 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551126-zh8m5" Mar 09 14:46:00 crc kubenswrapper[4722]: I0309 14:46:00.189779 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 14:46:00 crc kubenswrapper[4722]: I0309 14:46:00.189931 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-dr2x6" Mar 09 14:46:00 crc kubenswrapper[4722]: I0309 14:46:00.190007 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 14:46:00 crc kubenswrapper[4722]: I0309 14:46:00.197059 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551126-zh8m5"] Mar 09 14:46:00 crc kubenswrapper[4722]: I0309 14:46:00.281799 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hzcr\" (UniqueName: \"kubernetes.io/projected/b3326142-7932-42ca-918c-dc1afd1a443b-kube-api-access-5hzcr\") pod \"auto-csr-approver-29551126-zh8m5\" (UID: \"b3326142-7932-42ca-918c-dc1afd1a443b\") " pod="openshift-infra/auto-csr-approver-29551126-zh8m5" Mar 09 14:46:00 crc kubenswrapper[4722]: I0309 14:46:00.384231 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hzcr\" (UniqueName: \"kubernetes.io/projected/b3326142-7932-42ca-918c-dc1afd1a443b-kube-api-access-5hzcr\") pod \"auto-csr-approver-29551126-zh8m5\" (UID: \"b3326142-7932-42ca-918c-dc1afd1a443b\") " pod="openshift-infra/auto-csr-approver-29551126-zh8m5" Mar 09 14:46:00 crc kubenswrapper[4722]: I0309 14:46:00.402583 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hzcr\" (UniqueName: \"kubernetes.io/projected/b3326142-7932-42ca-918c-dc1afd1a443b-kube-api-access-5hzcr\") pod \"auto-csr-approver-29551126-zh8m5\" (UID: \"b3326142-7932-42ca-918c-dc1afd1a443b\") " pod="openshift-infra/auto-csr-approver-29551126-zh8m5" Mar 09 14:46:00 crc kubenswrapper[4722]: I0309 14:46:00.508655 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551126-zh8m5" Mar 09 14:46:00 crc kubenswrapper[4722]: I0309 14:46:00.992012 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551126-zh8m5"] Mar 09 14:46:00 crc kubenswrapper[4722]: I0309 14:46:00.996107 4722 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 14:46:01 crc kubenswrapper[4722]: I0309 14:46:01.348370 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551126-zh8m5" event={"ID":"b3326142-7932-42ca-918c-dc1afd1a443b","Type":"ContainerStarted","Data":"89641bd2face53c022880bbd047ba78e343f66eb6e662d1229187303c498bdf5"} Mar 09 14:46:02 crc kubenswrapper[4722]: I0309 14:46:02.157424 4722 scope.go:117] "RemoveContainer" containerID="931310cf17f93e3962cdbbdf531169aefb3915147cff22375dd9c06d9d6b2906" Mar 09 14:46:02 crc kubenswrapper[4722]: E0309 14:46:02.157997 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 14:46:03 crc kubenswrapper[4722]: I0309 14:46:03.368006 4722 generic.go:334] "Generic (PLEG): container finished" podID="b3326142-7932-42ca-918c-dc1afd1a443b" containerID="764c2630b59c9872d02ac080b51ce0ed88d52a8d0b0ef3f3b0ac1f44a6ec0e31" exitCode=0 Mar 09 14:46:03 crc kubenswrapper[4722]: I0309 14:46:03.368104 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551126-zh8m5" event={"ID":"b3326142-7932-42ca-918c-dc1afd1a443b","Type":"ContainerDied","Data":"764c2630b59c9872d02ac080b51ce0ed88d52a8d0b0ef3f3b0ac1f44a6ec0e31"} Mar 09 14:46:04 crc kubenswrapper[4722]: I0309 14:46:04.770842 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551126-zh8m5" Mar 09 14:46:04 crc kubenswrapper[4722]: I0309 14:46:04.932636 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hzcr\" (UniqueName: \"kubernetes.io/projected/b3326142-7932-42ca-918c-dc1afd1a443b-kube-api-access-5hzcr\") pod \"b3326142-7932-42ca-918c-dc1afd1a443b\" (UID: \"b3326142-7932-42ca-918c-dc1afd1a443b\") " Mar 09 14:46:04 crc kubenswrapper[4722]: I0309 14:46:04.939426 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3326142-7932-42ca-918c-dc1afd1a443b-kube-api-access-5hzcr" (OuterVolumeSpecName: "kube-api-access-5hzcr") pod "b3326142-7932-42ca-918c-dc1afd1a443b" (UID: "b3326142-7932-42ca-918c-dc1afd1a443b"). InnerVolumeSpecName "kube-api-access-5hzcr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:46:05 crc kubenswrapper[4722]: I0309 14:46:05.036122 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5hzcr\" (UniqueName: \"kubernetes.io/projected/b3326142-7932-42ca-918c-dc1afd1a443b-kube-api-access-5hzcr\") on node \"crc\" DevicePath \"\"" Mar 09 14:46:05 crc kubenswrapper[4722]: I0309 14:46:05.396989 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551126-zh8m5" event={"ID":"b3326142-7932-42ca-918c-dc1afd1a443b","Type":"ContainerDied","Data":"89641bd2face53c022880bbd047ba78e343f66eb6e662d1229187303c498bdf5"} Mar 09 14:46:05 crc kubenswrapper[4722]: I0309 14:46:05.397384 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="89641bd2face53c022880bbd047ba78e343f66eb6e662d1229187303c498bdf5" Mar 09 14:46:05 crc kubenswrapper[4722]: I0309 14:46:05.397152 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551126-zh8m5" Mar 09 14:46:05 crc kubenswrapper[4722]: I0309 14:46:05.850995 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551120-q9k6x"] Mar 09 14:46:05 crc kubenswrapper[4722]: I0309 14:46:05.863336 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551120-q9k6x"] Mar 09 14:46:06 crc kubenswrapper[4722]: I0309 14:46:06.162735 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4324c08a-d9aa-4d6e-ad2a-900c2124d167" path="/var/lib/kubelet/pods/4324c08a-d9aa-4d6e-ad2a-900c2124d167/volumes" Mar 09 14:46:17 crc kubenswrapper[4722]: I0309 14:46:17.149812 4722 scope.go:117] "RemoveContainer" containerID="931310cf17f93e3962cdbbdf531169aefb3915147cff22375dd9c06d9d6b2906" Mar 09 14:46:17 crc kubenswrapper[4722]: E0309 14:46:17.150852 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 14:46:28 crc kubenswrapper[4722]: I0309 14:46:28.133901 4722 scope.go:117] "RemoveContainer" containerID="2d0b302dca238f52c5be1715bb7e4345bc1005ab14c32ff54a20c4baf6b3d79a" Mar 09 14:46:32 crc kubenswrapper[4722]: I0309 14:46:32.150102 4722 scope.go:117] "RemoveContainer" containerID="931310cf17f93e3962cdbbdf531169aefb3915147cff22375dd9c06d9d6b2906" Mar 09 14:46:32 crc kubenswrapper[4722]: I0309 14:46:32.731124 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" event={"ID":"dac2aaf5-653b-4b2a-8efe-ed26bac8d648","Type":"ContainerStarted","Data":"ac01e46e450ae3d431cf312a0286617c799422de23e62d8665856d6ee9c42ab7"} Mar 09 14:47:53 crc kubenswrapper[4722]: I0309 14:47:53.671728 4722 generic.go:334] "Generic (PLEG): container finished" podID="b0bc3b99-5368-4287-8a9d-7b19b8b33e40" containerID="341d5625aff9f0cd0efb9bef91d2bc9e3c10a9467d9d33de0ae3b0e0e1c396f6" exitCode=0 Mar 09 14:47:53 crc kubenswrapper[4722]: I0309 14:47:53.672366 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wqbn2" 
event={"ID":"b0bc3b99-5368-4287-8a9d-7b19b8b33e40","Type":"ContainerDied","Data":"341d5625aff9f0cd0efb9bef91d2bc9e3c10a9467d9d33de0ae3b0e0e1c396f6"} Mar 09 14:47:55 crc kubenswrapper[4722]: I0309 14:47:55.241877 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wqbn2" Mar 09 14:47:55 crc kubenswrapper[4722]: I0309 14:47:55.376262 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0bc3b99-5368-4287-8a9d-7b19b8b33e40-inventory\") pod \"b0bc3b99-5368-4287-8a9d-7b19b8b33e40\" (UID: \"b0bc3b99-5368-4287-8a9d-7b19b8b33e40\") " Mar 09 14:47:55 crc kubenswrapper[4722]: I0309 14:47:55.376307 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9k6q\" (UniqueName: \"kubernetes.io/projected/b0bc3b99-5368-4287-8a9d-7b19b8b33e40-kube-api-access-b9k6q\") pod \"b0bc3b99-5368-4287-8a9d-7b19b8b33e40\" (UID: \"b0bc3b99-5368-4287-8a9d-7b19b8b33e40\") " Mar 09 14:47:55 crc kubenswrapper[4722]: I0309 14:47:55.376407 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b0bc3b99-5368-4287-8a9d-7b19b8b33e40-libvirt-secret-0\") pod \"b0bc3b99-5368-4287-8a9d-7b19b8b33e40\" (UID: \"b0bc3b99-5368-4287-8a9d-7b19b8b33e40\") " Mar 09 14:47:55 crc kubenswrapper[4722]: I0309 14:47:55.376755 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b0bc3b99-5368-4287-8a9d-7b19b8b33e40-ssh-key-openstack-edpm-ipam\") pod \"b0bc3b99-5368-4287-8a9d-7b19b8b33e40\" (UID: \"b0bc3b99-5368-4287-8a9d-7b19b8b33e40\") " Mar 09 14:47:55 crc kubenswrapper[4722]: I0309 14:47:55.376938 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0bc3b99-5368-4287-8a9d-7b19b8b33e40-libvirt-combined-ca-bundle\") pod \"b0bc3b99-5368-4287-8a9d-7b19b8b33e40\" (UID: \"b0bc3b99-5368-4287-8a9d-7b19b8b33e40\") " Mar 09 14:47:55 crc kubenswrapper[4722]: I0309 14:47:55.388738 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0bc3b99-5368-4287-8a9d-7b19b8b33e40-kube-api-access-b9k6q" (OuterVolumeSpecName: "kube-api-access-b9k6q") pod "b0bc3b99-5368-4287-8a9d-7b19b8b33e40" (UID: "b0bc3b99-5368-4287-8a9d-7b19b8b33e40"). InnerVolumeSpecName "kube-api-access-b9k6q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:47:55 crc kubenswrapper[4722]: I0309 14:47:55.398374 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0bc3b99-5368-4287-8a9d-7b19b8b33e40-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "b0bc3b99-5368-4287-8a9d-7b19b8b33e40" (UID: "b0bc3b99-5368-4287-8a9d-7b19b8b33e40"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:47:55 crc kubenswrapper[4722]: I0309 14:47:55.410377 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0bc3b99-5368-4287-8a9d-7b19b8b33e40-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b0bc3b99-5368-4287-8a9d-7b19b8b33e40" (UID: "b0bc3b99-5368-4287-8a9d-7b19b8b33e40"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:47:55 crc kubenswrapper[4722]: I0309 14:47:55.415507 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0bc3b99-5368-4287-8a9d-7b19b8b33e40-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "b0bc3b99-5368-4287-8a9d-7b19b8b33e40" (UID: "b0bc3b99-5368-4287-8a9d-7b19b8b33e40"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:47:55 crc kubenswrapper[4722]: I0309 14:47:55.420776 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0bc3b99-5368-4287-8a9d-7b19b8b33e40-inventory" (OuterVolumeSpecName: "inventory") pod "b0bc3b99-5368-4287-8a9d-7b19b8b33e40" (UID: "b0bc3b99-5368-4287-8a9d-7b19b8b33e40"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:47:55 crc kubenswrapper[4722]: I0309 14:47:55.481940 4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0bc3b99-5368-4287-8a9d-7b19b8b33e40-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 14:47:55 crc kubenswrapper[4722]: I0309 14:47:55.482001 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b9k6q\" (UniqueName: \"kubernetes.io/projected/b0bc3b99-5368-4287-8a9d-7b19b8b33e40-kube-api-access-b9k6q\") on node \"crc\" DevicePath \"\"" Mar 09 14:47:55 crc kubenswrapper[4722]: I0309 14:47:55.482018 4722 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/b0bc3b99-5368-4287-8a9d-7b19b8b33e40-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Mar 09 14:47:55 crc kubenswrapper[4722]: I0309 14:47:55.482032 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b0bc3b99-5368-4287-8a9d-7b19b8b33e40-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 14:47:55 crc kubenswrapper[4722]: I0309 14:47:55.482047 4722 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0bc3b99-5368-4287-8a9d-7b19b8b33e40-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:47:55 crc kubenswrapper[4722]: I0309 14:47:55.709817 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wqbn2" event={"ID":"b0bc3b99-5368-4287-8a9d-7b19b8b33e40","Type":"ContainerDied","Data":"ca7729f7380f7b4585532e01d2ac2f6ecfa27c49b0a89574cc5163d0692a65a7"} Mar 09 14:47:55 crc kubenswrapper[4722]: I0309 14:47:55.709896 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca7729f7380f7b4585532e01d2ac2f6ecfa27c49b0a89574cc5163d0692a65a7" Mar 09 14:47:55 crc kubenswrapper[4722]: I0309 14:47:55.709974 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-wqbn2" Mar 09 14:47:55 crc kubenswrapper[4722]: I0309 14:47:55.814215 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-kbmxh"] Mar 09 14:47:55 crc kubenswrapper[4722]: E0309 14:47:55.814724 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0bc3b99-5368-4287-8a9d-7b19b8b33e40" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 09 14:47:55 crc kubenswrapper[4722]: I0309 14:47:55.814745 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0bc3b99-5368-4287-8a9d-7b19b8b33e40" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 09 14:47:55 crc kubenswrapper[4722]: E0309 14:47:55.814800 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3326142-7932-42ca-918c-dc1afd1a443b" containerName="oc" Mar 09 14:47:55 crc kubenswrapper[4722]: I0309 14:47:55.814811 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3326142-7932-42ca-918c-dc1afd1a443b" containerName="oc" Mar 09 14:47:55 crc kubenswrapper[4722]: I0309 14:47:55.815098 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3326142-7932-42ca-918c-dc1afd1a443b" containerName="oc" Mar 09 14:47:55 crc kubenswrapper[4722]: I0309 14:47:55.815123 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0bc3b99-5368-4287-8a9d-7b19b8b33e40" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 09 14:47:55 crc kubenswrapper[4722]: I0309 14:47:55.816009 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kbmxh" Mar 09 14:47:55 crc kubenswrapper[4722]: I0309 14:47:55.823160 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 14:47:55 crc kubenswrapper[4722]: I0309 14:47:55.823412 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Mar 09 14:47:55 crc kubenswrapper[4722]: I0309 14:47:55.823622 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Mar 09 14:47:55 crc kubenswrapper[4722]: I0309 14:47:55.823752 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 14:47:55 crc kubenswrapper[4722]: I0309 14:47:55.828385 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Mar 09 14:47:55 crc kubenswrapper[4722]: I0309 14:47:55.828614 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dwlbg" Mar 09 14:47:55 crc kubenswrapper[4722]: I0309 14:47:55.828699 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 14:47:55 crc kubenswrapper[4722]: I0309 14:47:55.857929 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-kbmxh"] Mar 09 14:47:55 crc kubenswrapper[4722]: I0309 14:47:55.994245 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d10fe7e3-dd24-40b8-a94c-63dce2cb64ca-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kbmxh\" (UID: \"d10fe7e3-dd24-40b8-a94c-63dce2cb64ca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kbmxh" Mar 
09 14:47:55 crc kubenswrapper[4722]: I0309 14:47:55.995194 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d10fe7e3-dd24-40b8-a94c-63dce2cb64ca-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kbmxh\" (UID: \"d10fe7e3-dd24-40b8-a94c-63dce2cb64ca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kbmxh" Mar 09 14:47:55 crc kubenswrapper[4722]: I0309 14:47:55.995297 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/d10fe7e3-dd24-40b8-a94c-63dce2cb64ca-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kbmxh\" (UID: \"d10fe7e3-dd24-40b8-a94c-63dce2cb64ca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kbmxh" Mar 09 14:47:55 crc kubenswrapper[4722]: I0309 14:47:55.995386 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/d10fe7e3-dd24-40b8-a94c-63dce2cb64ca-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kbmxh\" (UID: \"d10fe7e3-dd24-40b8-a94c-63dce2cb64ca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kbmxh" Mar 09 14:47:55 crc kubenswrapper[4722]: I0309 14:47:55.995613 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/d10fe7e3-dd24-40b8-a94c-63dce2cb64ca-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kbmxh\" (UID: \"d10fe7e3-dd24-40b8-a94c-63dce2cb64ca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kbmxh" Mar 09 14:47:55 crc kubenswrapper[4722]: I0309 14:47:55.995705 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxvqp\" (UniqueName: \"kubernetes.io/projected/d10fe7e3-dd24-40b8-a94c-63dce2cb64ca-kube-api-access-xxvqp\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kbmxh\" (UID: \"d10fe7e3-dd24-40b8-a94c-63dce2cb64ca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kbmxh" Mar 09 14:47:55 crc kubenswrapper[4722]: I0309 14:47:55.995933 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/d10fe7e3-dd24-40b8-a94c-63dce2cb64ca-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kbmxh\" (UID: \"d10fe7e3-dd24-40b8-a94c-63dce2cb64ca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kbmxh" Mar 09 14:47:55 crc kubenswrapper[4722]: I0309 14:47:55.996045 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/d10fe7e3-dd24-40b8-a94c-63dce2cb64ca-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kbmxh\" (UID: \"d10fe7e3-dd24-40b8-a94c-63dce2cb64ca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kbmxh" Mar 09 14:47:55 crc kubenswrapper[4722]: I0309 14:47:55.996134 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d10fe7e3-dd24-40b8-a94c-63dce2cb64ca-nova-combined-ca-bundle\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-kbmxh\" (UID: \"d10fe7e3-dd24-40b8-a94c-63dce2cb64ca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kbmxh" Mar 09 14:47:55 crc kubenswrapper[4722]: I0309 14:47:55.996328 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/d10fe7e3-dd24-40b8-a94c-63dce2cb64ca-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kbmxh\" (UID: \"d10fe7e3-dd24-40b8-a94c-63dce2cb64ca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kbmxh" Mar 09 14:47:55 crc kubenswrapper[4722]: I0309 14:47:55.996448 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/d10fe7e3-dd24-40b8-a94c-63dce2cb64ca-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kbmxh\" (UID: \"d10fe7e3-dd24-40b8-a94c-63dce2cb64ca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kbmxh" Mar 09 14:47:56 crc kubenswrapper[4722]: I0309 14:47:56.098152 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/d10fe7e3-dd24-40b8-a94c-63dce2cb64ca-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kbmxh\" (UID: \"d10fe7e3-dd24-40b8-a94c-63dce2cb64ca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kbmxh" Mar 09 14:47:56 crc kubenswrapper[4722]: I0309 14:47:56.098344 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/d10fe7e3-dd24-40b8-a94c-63dce2cb64ca-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kbmxh\" (UID: \"d10fe7e3-dd24-40b8-a94c-63dce2cb64ca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kbmxh" Mar 09 14:47:56 crc kubenswrapper[4722]: I0309 14:47:56.098380 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxvqp\" (UniqueName: \"kubernetes.io/projected/d10fe7e3-dd24-40b8-a94c-63dce2cb64ca-kube-api-access-xxvqp\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kbmxh\" (UID: \"d10fe7e3-dd24-40b8-a94c-63dce2cb64ca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kbmxh" Mar 09 14:47:56 crc kubenswrapper[4722]: I0309 14:47:56.098662 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/d10fe7e3-dd24-40b8-a94c-63dce2cb64ca-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kbmxh\" (UID: \"d10fe7e3-dd24-40b8-a94c-63dce2cb64ca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kbmxh" Mar 09 14:47:56 crc kubenswrapper[4722]: I0309 14:47:56.098727 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/d10fe7e3-dd24-40b8-a94c-63dce2cb64ca-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kbmxh\" (UID: \"d10fe7e3-dd24-40b8-a94c-63dce2cb64ca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kbmxh" Mar 09 14:47:56 crc kubenswrapper[4722]: I0309 14:47:56.098771 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d10fe7e3-dd24-40b8-a94c-63dce2cb64ca-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kbmxh\" (UID: \"d10fe7e3-dd24-40b8-a94c-63dce2cb64ca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kbmxh" Mar 09 14:47:56 crc kubenswrapper[4722]: I0309 14:47:56.098954 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/d10fe7e3-dd24-40b8-a94c-63dce2cb64ca-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kbmxh\" (UID: \"d10fe7e3-dd24-40b8-a94c-63dce2cb64ca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kbmxh" Mar 09 14:47:56 crc kubenswrapper[4722]: I0309 14:47:56.099002 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/d10fe7e3-dd24-40b8-a94c-63dce2cb64ca-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kbmxh\" (UID: \"d10fe7e3-dd24-40b8-a94c-63dce2cb64ca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kbmxh" Mar 09 14:47:56 crc kubenswrapper[4722]: I0309 14:47:56.099052 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d10fe7e3-dd24-40b8-a94c-63dce2cb64ca-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kbmxh\" (UID: \"d10fe7e3-dd24-40b8-a94c-63dce2cb64ca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kbmxh" Mar 09 14:47:56 crc kubenswrapper[4722]: I0309 14:47:56.099083 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d10fe7e3-dd24-40b8-a94c-63dce2cb64ca-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kbmxh\" (UID: \"d10fe7e3-dd24-40b8-a94c-63dce2cb64ca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kbmxh" Mar 09 14:47:56 crc kubenswrapper[4722]: I0309 14:47:56.099149 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/d10fe7e3-dd24-40b8-a94c-63dce2cb64ca-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kbmxh\" (UID: \"d10fe7e3-dd24-40b8-a94c-63dce2cb64ca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kbmxh" Mar 09 14:47:56 crc kubenswrapper[4722]: I0309 14:47:56.100054 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/d10fe7e3-dd24-40b8-a94c-63dce2cb64ca-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kbmxh\" (UID: \"d10fe7e3-dd24-40b8-a94c-63dce2cb64ca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kbmxh" Mar 09 14:47:56 crc kubenswrapper[4722]: I0309 14:47:56.105197 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/d10fe7e3-dd24-40b8-a94c-63dce2cb64ca-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kbmxh\" (UID: \"d10fe7e3-dd24-40b8-a94c-63dce2cb64ca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kbmxh" Mar 09 14:47:56 crc kubenswrapper[4722]: I0309 14:47:56.105196 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: 
\"kubernetes.io/secret/d10fe7e3-dd24-40b8-a94c-63dce2cb64ca-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kbmxh\" (UID: \"d10fe7e3-dd24-40b8-a94c-63dce2cb64ca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kbmxh" Mar 09 14:47:56 crc kubenswrapper[4722]: I0309 14:47:56.105254 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d10fe7e3-dd24-40b8-a94c-63dce2cb64ca-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kbmxh\" (UID: \"d10fe7e3-dd24-40b8-a94c-63dce2cb64ca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kbmxh" Mar 09 14:47:56 crc kubenswrapper[4722]: I0309 14:47:56.106979 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/d10fe7e3-dd24-40b8-a94c-63dce2cb64ca-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kbmxh\" (UID: \"d10fe7e3-dd24-40b8-a94c-63dce2cb64ca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kbmxh" Mar 09 14:47:56 crc kubenswrapper[4722]: I0309 14:47:56.107257 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/d10fe7e3-dd24-40b8-a94c-63dce2cb64ca-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kbmxh\" (UID: \"d10fe7e3-dd24-40b8-a94c-63dce2cb64ca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kbmxh" Mar 09 14:47:56 crc kubenswrapper[4722]: I0309 14:47:56.107485 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d10fe7e3-dd24-40b8-a94c-63dce2cb64ca-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kbmxh\" (UID: \"d10fe7e3-dd24-40b8-a94c-63dce2cb64ca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kbmxh" Mar 09 14:47:56 crc kubenswrapper[4722]: I0309 14:47:56.107796 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/d10fe7e3-dd24-40b8-a94c-63dce2cb64ca-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kbmxh\" (UID: \"d10fe7e3-dd24-40b8-a94c-63dce2cb64ca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kbmxh" Mar 09 14:47:56 crc kubenswrapper[4722]: I0309 14:47:56.107998 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d10fe7e3-dd24-40b8-a94c-63dce2cb64ca-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kbmxh\" (UID: \"d10fe7e3-dd24-40b8-a94c-63dce2cb64ca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kbmxh" Mar 09 14:47:56 crc kubenswrapper[4722]: I0309 14:47:56.108078 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/d10fe7e3-dd24-40b8-a94c-63dce2cb64ca-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-kbmxh\" (UID: \"d10fe7e3-dd24-40b8-a94c-63dce2cb64ca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kbmxh" Mar 09 14:47:56 crc kubenswrapper[4722]: I0309 14:47:56.117882 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxvqp\" (UniqueName: \"kubernetes.io/projected/d10fe7e3-dd24-40b8-a94c-63dce2cb64ca-kube-api-access-xxvqp\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-kbmxh\" (UID: \"d10fe7e3-dd24-40b8-a94c-63dce2cb64ca\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kbmxh" Mar 09 14:47:56 crc kubenswrapper[4722]: I0309 14:47:56.139862 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kbmxh" Mar 09 14:47:56 crc kubenswrapper[4722]: I0309 14:47:56.683720 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-kbmxh"] Mar 09 14:47:56 crc kubenswrapper[4722]: I0309 14:47:56.727589 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kbmxh" event={"ID":"d10fe7e3-dd24-40b8-a94c-63dce2cb64ca","Type":"ContainerStarted","Data":"f126cd4bdf935d82d34881d7c739f2056f518b0b4576d11ff3d63b173b9e5a8d"} Mar 09 14:47:57 crc kubenswrapper[4722]: I0309 14:47:57.738129 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kbmxh" event={"ID":"d10fe7e3-dd24-40b8-a94c-63dce2cb64ca","Type":"ContainerStarted","Data":"abf9ea5603e8e2b2f3d80d9d21ac342c5c41cfcc94014e9bf1957728074a0359"} Mar 09 14:47:57 crc kubenswrapper[4722]: I0309 14:47:57.756027 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kbmxh" podStartSLOduration=2.199669163 podStartE2EDuration="2.756010672s" podCreationTimestamp="2026-03-09 14:47:55 +0000 UTC" firstStartedPulling="2026-03-09 14:47:56.68847411 +0000 UTC m=+2717.244042716" lastFinishedPulling="2026-03-09 14:47:57.244815649 +0000 UTC m=+2717.800384225" observedRunningTime="2026-03-09 14:47:57.753923936 +0000 UTC m=+2718.309492522" watchObservedRunningTime="2026-03-09 14:47:57.756010672 +0000 UTC m=+2718.311579248" Mar 09 14:48:00 crc kubenswrapper[4722]: I0309 14:48:00.135341 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551128-wxwp8"] Mar 09 14:48:00 crc kubenswrapper[4722]: I0309 14:48:00.138406 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551128-wxwp8" Mar 09 14:48:00 crc kubenswrapper[4722]: I0309 14:48:00.141151 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 14:48:00 crc kubenswrapper[4722]: I0309 14:48:00.141480 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-dr2x6" Mar 09 14:48:00 crc kubenswrapper[4722]: I0309 14:48:00.141736 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 14:48:00 crc kubenswrapper[4722]: I0309 14:48:00.170326 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551128-wxwp8"] Mar 09 14:48:00 crc kubenswrapper[4722]: I0309 14:48:00.213935 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfpmh\" (UniqueName: \"kubernetes.io/projected/0356803a-3ad1-40f7-895e-25dbf882fe52-kube-api-access-kfpmh\") pod \"auto-csr-approver-29551128-wxwp8\" (UID: \"0356803a-3ad1-40f7-895e-25dbf882fe52\") " pod="openshift-infra/auto-csr-approver-29551128-wxwp8" Mar 09 14:48:00 crc kubenswrapper[4722]: I0309 14:48:00.315847 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfpmh\" (UniqueName: \"kubernetes.io/projected/0356803a-3ad1-40f7-895e-25dbf882fe52-kube-api-access-kfpmh\") pod \"auto-csr-approver-29551128-wxwp8\" (UID: \"0356803a-3ad1-40f7-895e-25dbf882fe52\") " pod="openshift-infra/auto-csr-approver-29551128-wxwp8" Mar 09 14:48:00 crc kubenswrapper[4722]: I0309 14:48:00.336848 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfpmh\" (UniqueName: \"kubernetes.io/projected/0356803a-3ad1-40f7-895e-25dbf882fe52-kube-api-access-kfpmh\") pod \"auto-csr-approver-29551128-wxwp8\" (UID: \"0356803a-3ad1-40f7-895e-25dbf882fe52\") " pod="openshift-infra/auto-csr-approver-29551128-wxwp8" Mar 09 14:48:00 crc kubenswrapper[4722]: I0309 14:48:00.461285 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551128-wxwp8" Mar 09 14:48:00 crc kubenswrapper[4722]: I0309 14:48:00.977109 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551128-wxwp8"] Mar 09 14:48:01 crc kubenswrapper[4722]: I0309 14:48:01.782287 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551128-wxwp8" event={"ID":"0356803a-3ad1-40f7-895e-25dbf882fe52","Type":"ContainerStarted","Data":"43d1866a6cc1e0b1ea06d65bbe93fb3dd7319fd7d67ede918972f834555a065f"} Mar 09 14:48:03 crc kubenswrapper[4722]: I0309 14:48:03.813056 4722 generic.go:334] "Generic (PLEG): container finished" podID="0356803a-3ad1-40f7-895e-25dbf882fe52" containerID="b4ade2f4b120f2502bbad4f900fab6bf72bca83c8ebfd5d6b2463474b904a76c" exitCode=0 Mar 09 14:48:03 crc kubenswrapper[4722]: I0309 14:48:03.813649 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551128-wxwp8" event={"ID":"0356803a-3ad1-40f7-895e-25dbf882fe52","Type":"ContainerDied","Data":"b4ade2f4b120f2502bbad4f900fab6bf72bca83c8ebfd5d6b2463474b904a76c"} Mar 09 14:48:05 crc kubenswrapper[4722]: I0309 14:48:05.259441 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551128-wxwp8" Mar 09 14:48:05 crc kubenswrapper[4722]: I0309 14:48:05.391773 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfpmh\" (UniqueName: \"kubernetes.io/projected/0356803a-3ad1-40f7-895e-25dbf882fe52-kube-api-access-kfpmh\") pod \"0356803a-3ad1-40f7-895e-25dbf882fe52\" (UID: \"0356803a-3ad1-40f7-895e-25dbf882fe52\") " Mar 09 14:48:05 crc kubenswrapper[4722]: I0309 14:48:05.400517 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0356803a-3ad1-40f7-895e-25dbf882fe52-kube-api-access-kfpmh" (OuterVolumeSpecName: "kube-api-access-kfpmh") pod "0356803a-3ad1-40f7-895e-25dbf882fe52" (UID: "0356803a-3ad1-40f7-895e-25dbf882fe52"). InnerVolumeSpecName "kube-api-access-kfpmh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:48:05 crc kubenswrapper[4722]: I0309 14:48:05.495801 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfpmh\" (UniqueName: \"kubernetes.io/projected/0356803a-3ad1-40f7-895e-25dbf882fe52-kube-api-access-kfpmh\") on node \"crc\" DevicePath \"\"" Mar 09 14:48:05 crc kubenswrapper[4722]: I0309 14:48:05.840990 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551128-wxwp8" event={"ID":"0356803a-3ad1-40f7-895e-25dbf882fe52","Type":"ContainerDied","Data":"43d1866a6cc1e0b1ea06d65bbe93fb3dd7319fd7d67ede918972f834555a065f"} Mar 09 14:48:05 crc kubenswrapper[4722]: I0309 14:48:05.841423 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="43d1866a6cc1e0b1ea06d65bbe93fb3dd7319fd7d67ede918972f834555a065f" Mar 09 14:48:05 crc kubenswrapper[4722]: I0309 14:48:05.841066 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551128-wxwp8" Mar 09 14:48:06 crc kubenswrapper[4722]: I0309 14:48:06.334324 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551122-lg5w9"] Mar 09 14:48:06 crc kubenswrapper[4722]: I0309 14:48:06.362116 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551122-lg5w9"] Mar 09 14:48:08 crc kubenswrapper[4722]: I0309 14:48:08.160972 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7fb0dbe-1b62-48eb-9d4a-8df7727dd63a" path="/var/lib/kubelet/pods/a7fb0dbe-1b62-48eb-9d4a-8df7727dd63a/volumes" Mar 09 14:48:28 crc kubenswrapper[4722]: I0309 14:48:28.257315 4722 scope.go:117] "RemoveContainer" containerID="9ed9a3e3b723ec09b4d6338af042e6c6ba44ee25eed470dab7629e29774a7323" Mar 09 14:48:51 crc kubenswrapper[4722]: I0309 14:48:51.528168 4722 patch_prober.go:28] interesting pod/machine-config-daemon-hjrrb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 14:48:51 crc kubenswrapper[4722]: I0309 14:48:51.528788 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 14:48:52 crc kubenswrapper[4722]: I0309 14:48:52.020438 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fp9pz"] Mar 09 14:48:52 crc kubenswrapper[4722]: E0309 14:48:52.021074 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0356803a-3ad1-40f7-895e-25dbf882fe52" containerName="oc" Mar 09 14:48:52 crc kubenswrapper[4722]: I0309 14:48:52.021100 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="0356803a-3ad1-40f7-895e-25dbf882fe52" containerName="oc" Mar 09 14:48:52 crc kubenswrapper[4722]: I0309 14:48:52.021506 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="0356803a-3ad1-40f7-895e-25dbf882fe52" containerName="oc" Mar 09 14:48:52 crc kubenswrapper[4722]: I0309 14:48:52.023631 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fp9pz" Mar 09 14:48:52 crc kubenswrapper[4722]: I0309 14:48:52.035895 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fp9pz"] Mar 09 14:48:52 crc kubenswrapper[4722]: I0309 14:48:52.061022 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6db8891-a11c-471f-89bf-918891a4f8d8-utilities\") pod \"redhat-operators-fp9pz\" (UID: \"f6db8891-a11c-471f-89bf-918891a4f8d8\") " pod="openshift-marketplace/redhat-operators-fp9pz" Mar 09 14:48:52 crc kubenswrapper[4722]: I0309 14:48:52.061071 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6db8891-a11c-471f-89bf-918891a4f8d8-catalog-content\") pod \"redhat-operators-fp9pz\" (UID: \"f6db8891-a11c-471f-89bf-918891a4f8d8\") " pod="openshift-marketplace/redhat-operators-fp9pz" Mar 09 14:48:52 crc kubenswrapper[4722]: I0309 14:48:52.061333 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4zt5\" (UniqueName: \"kubernetes.io/projected/f6db8891-a11c-471f-89bf-918891a4f8d8-kube-api-access-d4zt5\") pod \"redhat-operators-fp9pz\" (UID: \"f6db8891-a11c-471f-89bf-918891a4f8d8\") " pod="openshift-marketplace/redhat-operators-fp9pz" Mar 09 14:48:52 crc kubenswrapper[4722]: I0309 14:48:52.164951 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4zt5\" (UniqueName: \"kubernetes.io/projected/f6db8891-a11c-471f-89bf-918891a4f8d8-kube-api-access-d4zt5\") pod \"redhat-operators-fp9pz\" (UID: \"f6db8891-a11c-471f-89bf-918891a4f8d8\") " pod="openshift-marketplace/redhat-operators-fp9pz" Mar 09 14:48:52 crc kubenswrapper[4722]: I0309 14:48:52.165468 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6db8891-a11c-471f-89bf-918891a4f8d8-utilities\") pod \"redhat-operators-fp9pz\" (UID: \"f6db8891-a11c-471f-89bf-918891a4f8d8\") " pod="openshift-marketplace/redhat-operators-fp9pz" Mar 09 14:48:52 crc kubenswrapper[4722]: I0309 14:48:52.165511 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6db8891-a11c-471f-89bf-918891a4f8d8-catalog-content\") pod \"redhat-operators-fp9pz\" (UID: \"f6db8891-a11c-471f-89bf-918891a4f8d8\") " pod="openshift-marketplace/redhat-operators-fp9pz" Mar 09 14:48:52 crc kubenswrapper[4722]: I0309 14:48:52.166329 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6db8891-a11c-471f-89bf-918891a4f8d8-catalog-content\") pod \"redhat-operators-fp9pz\" (UID: \"f6db8891-a11c-471f-89bf-918891a4f8d8\") " pod="openshift-marketplace/redhat-operators-fp9pz" Mar 09 14:48:52 crc kubenswrapper[4722]: I0309 14:48:52.167113 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6db8891-a11c-471f-89bf-918891a4f8d8-utilities\") pod \"redhat-operators-fp9pz\" (UID: \"f6db8891-a11c-471f-89bf-918891a4f8d8\") " pod="openshift-marketplace/redhat-operators-fp9pz" Mar 09 14:48:52 crc kubenswrapper[4722]: I0309 14:48:52.187713 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-d4zt5\" (UniqueName: \"kubernetes.io/projected/f6db8891-a11c-471f-89bf-918891a4f8d8-kube-api-access-d4zt5\") pod \"redhat-operators-fp9pz\" (UID: \"f6db8891-a11c-471f-89bf-918891a4f8d8\") " pod="openshift-marketplace/redhat-operators-fp9pz" Mar 09 14:48:52 crc kubenswrapper[4722]: I0309 14:48:52.353704 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fp9pz" Mar 09 14:48:52 crc kubenswrapper[4722]: I0309 14:48:52.852439 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fp9pz"] Mar 09 14:48:53 crc kubenswrapper[4722]: I0309 14:48:53.680436 4722 generic.go:334] "Generic (PLEG): container finished" podID="f6db8891-a11c-471f-89bf-918891a4f8d8" containerID="506f7fd87e734c636ac2a9061f46d3c676b083451dd042098666153962a90069" exitCode=0 Mar 09 14:48:53 crc kubenswrapper[4722]: I0309 14:48:53.680555 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fp9pz" event={"ID":"f6db8891-a11c-471f-89bf-918891a4f8d8","Type":"ContainerDied","Data":"506f7fd87e734c636ac2a9061f46d3c676b083451dd042098666153962a90069"} Mar 09 14:48:53 crc kubenswrapper[4722]: I0309 14:48:53.680679 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fp9pz" event={"ID":"f6db8891-a11c-471f-89bf-918891a4f8d8","Type":"ContainerStarted","Data":"20d9f2f494a6d8a397ed80f739e4c560f489f388320ca31af31b0c275dabb83d"} Mar 09 14:48:54 crc kubenswrapper[4722]: I0309 14:48:54.698231 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fp9pz" event={"ID":"f6db8891-a11c-471f-89bf-918891a4f8d8","Type":"ContainerStarted","Data":"810f515f8895b088eb83f0ff65d86eba94ef49e4530ffa57beb46fa7938bf080"} Mar 09 14:49:08 crc kubenswrapper[4722]: I0309 14:49:08.872788 4722 generic.go:334] "Generic (PLEG): container finished" podID="f6db8891-a11c-471f-89bf-918891a4f8d8" containerID="810f515f8895b088eb83f0ff65d86eba94ef49e4530ffa57beb46fa7938bf080" exitCode=0 Mar 09 14:49:08 crc kubenswrapper[4722]: I0309 14:49:08.872843 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fp9pz" event={"ID":"f6db8891-a11c-471f-89bf-918891a4f8d8","Type":"ContainerDied","Data":"810f515f8895b088eb83f0ff65d86eba94ef49e4530ffa57beb46fa7938bf080"} Mar 09 14:49:10 crc kubenswrapper[4722]: I0309 14:49:10.898535 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fp9pz" event={"ID":"f6db8891-a11c-471f-89bf-918891a4f8d8","Type":"ContainerStarted","Data":"2e30ad90927e4aea8a7717492fc1b58f8f4b2f55e7b062beee4b3e3317475fe3"} Mar 09 14:49:10 crc kubenswrapper[4722]: I0309 14:49:10.921668 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fp9pz" podStartSLOduration=3.660778747 podStartE2EDuration="19.921645599s" podCreationTimestamp="2026-03-09 14:48:51 +0000 UTC" firstStartedPulling="2026-03-09 14:48:53.682629692 +0000 UTC m=+2774.238198268" lastFinishedPulling="2026-03-09 14:49:09.943496524 +0000 UTC m=+2790.499065120" observedRunningTime="2026-03-09 14:49:10.915142181 +0000 UTC m=+2791.470710757" watchObservedRunningTime="2026-03-09 14:49:10.921645599 +0000 UTC m=+2791.477214175" Mar 09 14:49:12 crc kubenswrapper[4722]: I0309 14:49:12.354102 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fp9pz" 
Mar 09 14:49:12 crc kubenswrapper[4722]: I0309 14:49:12.354496 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fp9pz" Mar 09 14:49:13 crc kubenswrapper[4722]: I0309 14:49:13.403373 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fp9pz" podUID="f6db8891-a11c-471f-89bf-918891a4f8d8" containerName="registry-server" probeResult="failure" output=< Mar 09 14:49:13 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s Mar 09 14:49:13 crc kubenswrapper[4722]: > Mar 09 14:49:21 crc kubenswrapper[4722]: I0309 14:49:21.528164 4722 patch_prober.go:28] interesting pod/machine-config-daemon-hjrrb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 14:49:21 crc kubenswrapper[4722]: I0309 14:49:21.528868 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 14:49:23 crc kubenswrapper[4722]: I0309 14:49:23.408595 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fp9pz" podUID="f6db8891-a11c-471f-89bf-918891a4f8d8" containerName="registry-server" probeResult="failure" output=< Mar 09 14:49:23 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s Mar 09 14:49:23 crc kubenswrapper[4722]: > Mar 09 14:49:32 crc kubenswrapper[4722]: I0309 14:49:32.440698 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fp9pz" Mar 09 14:49:32 crc kubenswrapper[4722]: I0309 14:49:32.501002 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fp9pz" Mar 09 14:49:32 crc kubenswrapper[4722]: I0309 14:49:32.684802 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fp9pz"] Mar 09 14:49:34 crc kubenswrapper[4722]: I0309 14:49:34.156078 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fp9pz" podUID="f6db8891-a11c-471f-89bf-918891a4f8d8" containerName="registry-server" containerID="cri-o://2e30ad90927e4aea8a7717492fc1b58f8f4b2f55e7b062beee4b3e3317475fe3" gracePeriod=2 Mar 09 14:49:34 crc kubenswrapper[4722]: I0309 14:49:34.724139 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fp9pz" Mar 09 14:49:34 crc kubenswrapper[4722]: I0309 14:49:34.818917 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6db8891-a11c-471f-89bf-918891a4f8d8-catalog-content\") pod \"f6db8891-a11c-471f-89bf-918891a4f8d8\" (UID: \"f6db8891-a11c-471f-89bf-918891a4f8d8\") " Mar 09 14:49:34 crc kubenswrapper[4722]: I0309 14:49:34.819192 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4zt5\" (UniqueName: \"kubernetes.io/projected/f6db8891-a11c-471f-89bf-918891a4f8d8-kube-api-access-d4zt5\") pod \"f6db8891-a11c-471f-89bf-918891a4f8d8\" (UID: \"f6db8891-a11c-471f-89bf-918891a4f8d8\") " Mar 09 14:49:34 crc kubenswrapper[4722]: I0309 14:49:34.819267 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6db8891-a11c-471f-89bf-918891a4f8d8-utilities\") pod \"f6db8891-a11c-471f-89bf-918891a4f8d8\" (UID: \"f6db8891-a11c-471f-89bf-918891a4f8d8\") " Mar 09 14:49:34 crc kubenswrapper[4722]: I0309 14:49:34.819993 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6db8891-a11c-471f-89bf-918891a4f8d8-utilities" (OuterVolumeSpecName: "utilities") pod "f6db8891-a11c-471f-89bf-918891a4f8d8" (UID: "f6db8891-a11c-471f-89bf-918891a4f8d8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:49:34 crc kubenswrapper[4722]: I0309 14:49:34.829684 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6db8891-a11c-471f-89bf-918891a4f8d8-kube-api-access-d4zt5" (OuterVolumeSpecName: "kube-api-access-d4zt5") pod "f6db8891-a11c-471f-89bf-918891a4f8d8" (UID: "f6db8891-a11c-471f-89bf-918891a4f8d8"). InnerVolumeSpecName "kube-api-access-d4zt5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:49:34 crc kubenswrapper[4722]: I0309 14:49:34.922353 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f6db8891-a11c-471f-89bf-918891a4f8d8-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 14:49:34 crc kubenswrapper[4722]: I0309 14:49:34.922399 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4zt5\" (UniqueName: \"kubernetes.io/projected/f6db8891-a11c-471f-89bf-918891a4f8d8-kube-api-access-d4zt5\") on node \"crc\" DevicePath \"\"" Mar 09 14:49:34 crc kubenswrapper[4722]: I0309 14:49:34.968638 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f6db8891-a11c-471f-89bf-918891a4f8d8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f6db8891-a11c-471f-89bf-918891a4f8d8" (UID: "f6db8891-a11c-471f-89bf-918891a4f8d8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:49:35 crc kubenswrapper[4722]: I0309 14:49:35.024834 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f6db8891-a11c-471f-89bf-918891a4f8d8-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 14:49:35 crc kubenswrapper[4722]: I0309 14:49:35.169780 4722 generic.go:334] "Generic (PLEG): container finished" podID="f6db8891-a11c-471f-89bf-918891a4f8d8" containerID="2e30ad90927e4aea8a7717492fc1b58f8f4b2f55e7b062beee4b3e3317475fe3" exitCode=0 Mar 09 14:49:35 crc kubenswrapper[4722]: I0309 14:49:35.169856 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fp9pz" Mar 09 14:49:35 crc kubenswrapper[4722]: I0309 14:49:35.169863 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fp9pz" event={"ID":"f6db8891-a11c-471f-89bf-918891a4f8d8","Type":"ContainerDied","Data":"2e30ad90927e4aea8a7717492fc1b58f8f4b2f55e7b062beee4b3e3317475fe3"} Mar 09 14:49:35 crc kubenswrapper[4722]: I0309 14:49:35.170300 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fp9pz" event={"ID":"f6db8891-a11c-471f-89bf-918891a4f8d8","Type":"ContainerDied","Data":"20d9f2f494a6d8a397ed80f739e4c560f489f388320ca31af31b0c275dabb83d"} Mar 09 14:49:35 crc kubenswrapper[4722]: I0309 14:49:35.170336 4722 scope.go:117] "RemoveContainer" containerID="2e30ad90927e4aea8a7717492fc1b58f8f4b2f55e7b062beee4b3e3317475fe3" Mar 09 14:49:35 crc kubenswrapper[4722]: I0309 14:49:35.200669 4722 scope.go:117] "RemoveContainer" containerID="810f515f8895b088eb83f0ff65d86eba94ef49e4530ffa57beb46fa7938bf080" Mar 09 14:49:35 crc kubenswrapper[4722]: I0309 14:49:35.250010 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fp9pz"] Mar 09 14:49:35 crc kubenswrapper[4722]: I0309 14:49:35.263882 4722 scope.go:117] "RemoveContainer" containerID="506f7fd87e734c636ac2a9061f46d3c676b083451dd042098666153962a90069" Mar 09 14:49:35 crc kubenswrapper[4722]: I0309 14:49:35.268872 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fp9pz"] Mar 09 14:49:35 crc kubenswrapper[4722]: I0309 14:49:35.295999 4722 scope.go:117] "RemoveContainer" containerID="2e30ad90927e4aea8a7717492fc1b58f8f4b2f55e7b062beee4b3e3317475fe3" Mar 09 14:49:35 crc kubenswrapper[4722]: E0309 14:49:35.296496 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e30ad90927e4aea8a7717492fc1b58f8f4b2f55e7b062beee4b3e3317475fe3\": container with ID starting with 2e30ad90927e4aea8a7717492fc1b58f8f4b2f55e7b062beee4b3e3317475fe3 not found: ID does not exist" containerID="2e30ad90927e4aea8a7717492fc1b58f8f4b2f55e7b062beee4b3e3317475fe3" Mar 09 14:49:35 crc kubenswrapper[4722]: I0309 14:49:35.296576 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e30ad90927e4aea8a7717492fc1b58f8f4b2f55e7b062beee4b3e3317475fe3"} err="failed to get container status \"2e30ad90927e4aea8a7717492fc1b58f8f4b2f55e7b062beee4b3e3317475fe3\": rpc error: code = NotFound desc = could not find container \"2e30ad90927e4aea8a7717492fc1b58f8f4b2f55e7b062beee4b3e3317475fe3\": container with ID starting with 2e30ad90927e4aea8a7717492fc1b58f8f4b2f55e7b062beee4b3e3317475fe3 not found: ID does not exist" Mar 09 14:49:35 crc 
kubenswrapper[4722]: I0309 14:49:35.296620 4722 scope.go:117] "RemoveContainer" containerID="810f515f8895b088eb83f0ff65d86eba94ef49e4530ffa57beb46fa7938bf080" Mar 09 14:49:35 crc kubenswrapper[4722]: E0309 14:49:35.297246 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"810f515f8895b088eb83f0ff65d86eba94ef49e4530ffa57beb46fa7938bf080\": container with ID starting with 810f515f8895b088eb83f0ff65d86eba94ef49e4530ffa57beb46fa7938bf080 not found: ID does not exist" containerID="810f515f8895b088eb83f0ff65d86eba94ef49e4530ffa57beb46fa7938bf080" Mar 09 14:49:35 crc kubenswrapper[4722]: I0309 14:49:35.297280 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"810f515f8895b088eb83f0ff65d86eba94ef49e4530ffa57beb46fa7938bf080"} err="failed to get container status \"810f515f8895b088eb83f0ff65d86eba94ef49e4530ffa57beb46fa7938bf080\": rpc error: code = NotFound desc = could not find container \"810f515f8895b088eb83f0ff65d86eba94ef49e4530ffa57beb46fa7938bf080\": container with ID starting with 810f515f8895b088eb83f0ff65d86eba94ef49e4530ffa57beb46fa7938bf080 not found: ID does not exist" Mar 09 14:49:35 crc kubenswrapper[4722]: I0309 14:49:35.297302 4722 scope.go:117] "RemoveContainer" containerID="506f7fd87e734c636ac2a9061f46d3c676b083451dd042098666153962a90069" Mar 09 14:49:35 crc kubenswrapper[4722]: E0309 14:49:35.297637 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"506f7fd87e734c636ac2a9061f46d3c676b083451dd042098666153962a90069\": container with ID starting with 506f7fd87e734c636ac2a9061f46d3c676b083451dd042098666153962a90069 not found: ID does not exist" containerID="506f7fd87e734c636ac2a9061f46d3c676b083451dd042098666153962a90069" Mar 09 14:49:35 crc kubenswrapper[4722]: I0309 14:49:35.297676 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"506f7fd87e734c636ac2a9061f46d3c676b083451dd042098666153962a90069"} err="failed to get container status \"506f7fd87e734c636ac2a9061f46d3c676b083451dd042098666153962a90069\": rpc error: code = NotFound desc = could not find container \"506f7fd87e734c636ac2a9061f46d3c676b083451dd042098666153962a90069\": container with ID starting with 506f7fd87e734c636ac2a9061f46d3c676b083451dd042098666153962a90069 not found: ID does not exist" Mar 09 14:49:36 crc kubenswrapper[4722]: I0309 14:49:36.161311 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6db8891-a11c-471f-89bf-918891a4f8d8" path="/var/lib/kubelet/pods/f6db8891-a11c-471f-89bf-918891a4f8d8/volumes" Mar 09 14:49:51 crc kubenswrapper[4722]: I0309 14:49:51.527513 4722 patch_prober.go:28] interesting pod/machine-config-daemon-hjrrb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 14:49:51 crc kubenswrapper[4722]: I0309 14:49:51.528147 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 14:49:51 crc kubenswrapper[4722]: I0309 14:49:51.528203 4722 kubelet.go:2542] "SyncLoop (probe)" 
probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" Mar 09 14:49:51 crc kubenswrapper[4722]: I0309 14:49:51.529152 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ac01e46e450ae3d431cf312a0286617c799422de23e62d8665856d6ee9c42ab7"} pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 14:49:51 crc kubenswrapper[4722]: I0309 14:49:51.529243 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerName="machine-config-daemon" containerID="cri-o://ac01e46e450ae3d431cf312a0286617c799422de23e62d8665856d6ee9c42ab7" gracePeriod=600 Mar 09 14:49:52 crc kubenswrapper[4722]: I0309 14:49:52.380075 4722 generic.go:334] "Generic (PLEG): container finished" podID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerID="ac01e46e450ae3d431cf312a0286617c799422de23e62d8665856d6ee9c42ab7" exitCode=0 Mar 09 14:49:52 crc kubenswrapper[4722]: I0309 14:49:52.380127 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" event={"ID":"dac2aaf5-653b-4b2a-8efe-ed26bac8d648","Type":"ContainerDied","Data":"ac01e46e450ae3d431cf312a0286617c799422de23e62d8665856d6ee9c42ab7"} Mar 09 14:49:52 crc kubenswrapper[4722]: I0309 14:49:52.380639 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" event={"ID":"dac2aaf5-653b-4b2a-8efe-ed26bac8d648","Type":"ContainerStarted","Data":"e93d9830f7a284f5574f07e5fff41b2ae62d84fa8023d799598acc091cbd5a28"} Mar 09 14:49:52 crc kubenswrapper[4722]: I0309 14:49:52.380661 4722 scope.go:117] "RemoveContainer" containerID="931310cf17f93e3962cdbbdf531169aefb3915147cff22375dd9c06d9d6b2906" Mar 09 14:50:00 crc kubenswrapper[4722]: I0309 14:50:00.162325 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551130-kvkpt"] Mar 09 14:50:00 crc kubenswrapper[4722]: E0309 14:50:00.163246 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6db8891-a11c-471f-89bf-918891a4f8d8" containerName="extract-content" Mar 09 14:50:00 crc kubenswrapper[4722]: I0309 14:50:00.163258 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6db8891-a11c-471f-89bf-918891a4f8d8" containerName="extract-content" Mar 09 14:50:00 crc kubenswrapper[4722]: E0309 14:50:00.163270 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6db8891-a11c-471f-89bf-918891a4f8d8" containerName="registry-server" Mar 09 14:50:00 crc kubenswrapper[4722]: I0309 14:50:00.163276 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6db8891-a11c-471f-89bf-918891a4f8d8" containerName="registry-server" Mar 09 14:50:00 crc kubenswrapper[4722]: E0309 14:50:00.163299 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6db8891-a11c-471f-89bf-918891a4f8d8" containerName="extract-utilities" Mar 09 14:50:00 crc kubenswrapper[4722]: I0309 14:50:00.163305 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6db8891-a11c-471f-89bf-918891a4f8d8" containerName="extract-utilities" Mar 09 14:50:00 crc kubenswrapper[4722]: I0309 14:50:00.163526 4722 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f6db8891-a11c-471f-89bf-918891a4f8d8" containerName="registry-server" Mar 09 14:50:00 crc kubenswrapper[4722]: I0309 14:50:00.164438 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551130-kvkpt" Mar 09 14:50:00 crc kubenswrapper[4722]: I0309 14:50:00.166770 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 14:50:00 crc kubenswrapper[4722]: I0309 14:50:00.166949 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 14:50:00 crc kubenswrapper[4722]: I0309 14:50:00.168309 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-dr2x6" Mar 09 14:50:00 crc kubenswrapper[4722]: I0309 14:50:00.174513 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551130-kvkpt"] Mar 09 14:50:00 crc kubenswrapper[4722]: I0309 14:50:00.311134 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q84qf\" (UniqueName: \"kubernetes.io/projected/0b65eeff-ebce-4cd9-bd1f-d7ede07a1e53-kube-api-access-q84qf\") pod \"auto-csr-approver-29551130-kvkpt\" (UID: \"0b65eeff-ebce-4cd9-bd1f-d7ede07a1e53\") " pod="openshift-infra/auto-csr-approver-29551130-kvkpt" Mar 09 14:50:00 crc kubenswrapper[4722]: I0309 14:50:00.413270 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q84qf\" (UniqueName: \"kubernetes.io/projected/0b65eeff-ebce-4cd9-bd1f-d7ede07a1e53-kube-api-access-q84qf\") pod \"auto-csr-approver-29551130-kvkpt\" (UID: \"0b65eeff-ebce-4cd9-bd1f-d7ede07a1e53\") " pod="openshift-infra/auto-csr-approver-29551130-kvkpt" Mar 09 14:50:00 crc kubenswrapper[4722]: I0309 14:50:00.433428 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q84qf\" (UniqueName: \"kubernetes.io/projected/0b65eeff-ebce-4cd9-bd1f-d7ede07a1e53-kube-api-access-q84qf\") pod \"auto-csr-approver-29551130-kvkpt\" (UID: \"0b65eeff-ebce-4cd9-bd1f-d7ede07a1e53\") " pod="openshift-infra/auto-csr-approver-29551130-kvkpt" Mar 09 14:50:00 crc kubenswrapper[4722]: I0309 14:50:00.528741 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551130-kvkpt" Mar 09 14:50:01 crc kubenswrapper[4722]: I0309 14:50:01.040794 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551130-kvkpt"] Mar 09 14:50:01 crc kubenswrapper[4722]: I0309 14:50:01.508521 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551130-kvkpt" event={"ID":"0b65eeff-ebce-4cd9-bd1f-d7ede07a1e53","Type":"ContainerStarted","Data":"bb525e30880ca4e8a7cfd876472aa1ec2360176f933966d1a217cb14ac18056c"} Mar 09 14:50:02 crc kubenswrapper[4722]: I0309 14:50:02.519277 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551130-kvkpt" event={"ID":"0b65eeff-ebce-4cd9-bd1f-d7ede07a1e53","Type":"ContainerStarted","Data":"0f9a1828d93c00b7f8016a0df10a321083cac201e49d8f70c90d7c9e7fc8a675"} Mar 09 14:50:02 crc kubenswrapper[4722]: I0309 14:50:02.540420 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551130-kvkpt" podStartSLOduration=1.458457595 podStartE2EDuration="2.540398062s" podCreationTimestamp="2026-03-09 14:50:00 +0000 UTC" firstStartedPulling="2026-03-09 14:50:01.044179704 +0000 UTC m=+2841.599748270" lastFinishedPulling="2026-03-09 14:50:02.126120161 +0000 UTC m=+2842.681688737" observedRunningTime="2026-03-09 14:50:02.531879268 +0000 UTC m=+2843.087447844" watchObservedRunningTime="2026-03-09 14:50:02.540398062 +0000 UTC m=+2843.095966638" Mar 09 14:50:03 crc kubenswrapper[4722]: I0309 14:50:03.530915 4722 generic.go:334] "Generic (PLEG): container finished" podID="0b65eeff-ebce-4cd9-bd1f-d7ede07a1e53" containerID="0f9a1828d93c00b7f8016a0df10a321083cac201e49d8f70c90d7c9e7fc8a675" exitCode=0 Mar 09 14:50:03 crc kubenswrapper[4722]: I0309 14:50:03.530962 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551130-kvkpt" event={"ID":"0b65eeff-ebce-4cd9-bd1f-d7ede07a1e53","Type":"ContainerDied","Data":"0f9a1828d93c00b7f8016a0df10a321083cac201e49d8f70c90d7c9e7fc8a675"} Mar 09 14:50:04 crc kubenswrapper[4722]: I0309 14:50:04.997089 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551130-kvkpt" Mar 09 14:50:05 crc kubenswrapper[4722]: I0309 14:50:05.155676 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q84qf\" (UniqueName: \"kubernetes.io/projected/0b65eeff-ebce-4cd9-bd1f-d7ede07a1e53-kube-api-access-q84qf\") pod \"0b65eeff-ebce-4cd9-bd1f-d7ede07a1e53\" (UID: \"0b65eeff-ebce-4cd9-bd1f-d7ede07a1e53\") " Mar 09 14:50:05 crc kubenswrapper[4722]: I0309 14:50:05.163106 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b65eeff-ebce-4cd9-bd1f-d7ede07a1e53-kube-api-access-q84qf" (OuterVolumeSpecName: "kube-api-access-q84qf") pod "0b65eeff-ebce-4cd9-bd1f-d7ede07a1e53" (UID: "0b65eeff-ebce-4cd9-bd1f-d7ede07a1e53"). InnerVolumeSpecName "kube-api-access-q84qf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:50:05 crc kubenswrapper[4722]: I0309 14:50:05.259630 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q84qf\" (UniqueName: \"kubernetes.io/projected/0b65eeff-ebce-4cd9-bd1f-d7ede07a1e53-kube-api-access-q84qf\") on node \"crc\" DevicePath \"\"" Mar 09 14:50:05 crc kubenswrapper[4722]: I0309 14:50:05.562384 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551130-kvkpt" event={"ID":"0b65eeff-ebce-4cd9-bd1f-d7ede07a1e53","Type":"ContainerDied","Data":"bb525e30880ca4e8a7cfd876472aa1ec2360176f933966d1a217cb14ac18056c"} Mar 09 14:50:05 crc kubenswrapper[4722]: I0309 14:50:05.562465 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb525e30880ca4e8a7cfd876472aa1ec2360176f933966d1a217cb14ac18056c" Mar 09 14:50:05 crc kubenswrapper[4722]: I0309 14:50:05.562529 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551130-kvkpt" Mar 09 14:50:05 crc kubenswrapper[4722]: I0309 14:50:05.638494 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551124-rgb7c"] Mar 09 14:50:05 crc kubenswrapper[4722]: I0309 14:50:05.653756 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551124-rgb7c"] Mar 09 14:50:06 crc kubenswrapper[4722]: I0309 14:50:06.162300 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2837a381-5705-491b-81d2-62e8a19e50b4" path="/var/lib/kubelet/pods/2837a381-5705-491b-81d2-62e8a19e50b4/volumes" Mar 09 14:50:28 crc kubenswrapper[4722]: I0309 14:50:28.366395 4722 scope.go:117] "RemoveContainer" containerID="064ff878f52dc16d37f76a0e2885da26d610dd057b9931999269f6530097eeb0" Mar 09 14:50:32 crc kubenswrapper[4722]: I0309 14:50:32.959527 4722 generic.go:334] "Generic (PLEG): container finished" podID="d10fe7e3-dd24-40b8-a94c-63dce2cb64ca" containerID="abf9ea5603e8e2b2f3d80d9d21ac342c5c41cfcc94014e9bf1957728074a0359" exitCode=0 Mar 09 14:50:32 crc kubenswrapper[4722]: I0309 14:50:32.959616 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kbmxh" event={"ID":"d10fe7e3-dd24-40b8-a94c-63dce2cb64ca","Type":"ContainerDied","Data":"abf9ea5603e8e2b2f3d80d9d21ac342c5c41cfcc94014e9bf1957728074a0359"} Mar 09 14:50:34 crc kubenswrapper[4722]: I0309 14:50:34.480922 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kbmxh"
Mar 09 14:50:34 crc kubenswrapper[4722]: I0309 14:50:34.602417 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/d10fe7e3-dd24-40b8-a94c-63dce2cb64ca-nova-migration-ssh-key-1\") pod \"d10fe7e3-dd24-40b8-a94c-63dce2cb64ca\" (UID: \"d10fe7e3-dd24-40b8-a94c-63dce2cb64ca\") "
Mar 09 14:50:34 crc kubenswrapper[4722]: I0309 14:50:34.602503 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/d10fe7e3-dd24-40b8-a94c-63dce2cb64ca-nova-migration-ssh-key-0\") pod \"d10fe7e3-dd24-40b8-a94c-63dce2cb64ca\" (UID: \"d10fe7e3-dd24-40b8-a94c-63dce2cb64ca\") "
Mar 09 14:50:34 crc kubenswrapper[4722]: I0309 14:50:34.602637 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d10fe7e3-dd24-40b8-a94c-63dce2cb64ca-inventory\") pod \"d10fe7e3-dd24-40b8-a94c-63dce2cb64ca\" (UID: \"d10fe7e3-dd24-40b8-a94c-63dce2cb64ca\") "
Mar 09 14:50:34 crc kubenswrapper[4722]: I0309 14:50:34.602684 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d10fe7e3-dd24-40b8-a94c-63dce2cb64ca-nova-combined-ca-bundle\") pod \"d10fe7e3-dd24-40b8-a94c-63dce2cb64ca\" (UID: \"d10fe7e3-dd24-40b8-a94c-63dce2cb64ca\") "
Mar 09 14:50:34 crc kubenswrapper[4722]: I0309 14:50:34.602756 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d10fe7e3-dd24-40b8-a94c-63dce2cb64ca-ssh-key-openstack-edpm-ipam\") pod \"d10fe7e3-dd24-40b8-a94c-63dce2cb64ca\" (UID: \"d10fe7e3-dd24-40b8-a94c-63dce2cb64ca\") "
Mar 09 14:50:34 crc kubenswrapper[4722]: I0309 14:50:34.602836 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/d10fe7e3-dd24-40b8-a94c-63dce2cb64ca-nova-cell1-compute-config-1\") pod \"d10fe7e3-dd24-40b8-a94c-63dce2cb64ca\" (UID: \"d10fe7e3-dd24-40b8-a94c-63dce2cb64ca\") "
Mar 09 14:50:34 crc kubenswrapper[4722]: I0309 14:50:34.602895 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/d10fe7e3-dd24-40b8-a94c-63dce2cb64ca-nova-cell1-compute-config-0\") pod \"d10fe7e3-dd24-40b8-a94c-63dce2cb64ca\" (UID: \"d10fe7e3-dd24-40b8-a94c-63dce2cb64ca\") "
Mar 09 14:50:34 crc kubenswrapper[4722]: I0309 14:50:34.602917 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/d10fe7e3-dd24-40b8-a94c-63dce2cb64ca-nova-cell1-compute-config-3\") pod \"d10fe7e3-dd24-40b8-a94c-63dce2cb64ca\" (UID: \"d10fe7e3-dd24-40b8-a94c-63dce2cb64ca\") "
Mar 09 14:50:34 crc kubenswrapper[4722]: I0309 14:50:34.602962 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/d10fe7e3-dd24-40b8-a94c-63dce2cb64ca-nova-cell1-compute-config-2\") pod \"d10fe7e3-dd24-40b8-a94c-63dce2cb64ca\" (UID: \"d10fe7e3-dd24-40b8-a94c-63dce2cb64ca\") "
Mar 09 14:50:34 crc kubenswrapper[4722]: I0309 14:50:34.603010 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/d10fe7e3-dd24-40b8-a94c-63dce2cb64ca-nova-extra-config-0\") pod \"d10fe7e3-dd24-40b8-a94c-63dce2cb64ca\" (UID: \"d10fe7e3-dd24-40b8-a94c-63dce2cb64ca\") "
Mar 09 14:50:34 crc kubenswrapper[4722]: I0309 14:50:34.603063 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxvqp\" (UniqueName: \"kubernetes.io/projected/d10fe7e3-dd24-40b8-a94c-63dce2cb64ca-kube-api-access-xxvqp\") pod \"d10fe7e3-dd24-40b8-a94c-63dce2cb64ca\" (UID: \"d10fe7e3-dd24-40b8-a94c-63dce2cb64ca\") "
Mar 09 14:50:34 crc kubenswrapper[4722]: I0309 14:50:34.609770 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d10fe7e3-dd24-40b8-a94c-63dce2cb64ca-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "d10fe7e3-dd24-40b8-a94c-63dce2cb64ca" (UID: "d10fe7e3-dd24-40b8-a94c-63dce2cb64ca"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:50:34 crc kubenswrapper[4722]: I0309 14:50:34.628179 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d10fe7e3-dd24-40b8-a94c-63dce2cb64ca-kube-api-access-xxvqp" (OuterVolumeSpecName: "kube-api-access-xxvqp") pod "d10fe7e3-dd24-40b8-a94c-63dce2cb64ca" (UID: "d10fe7e3-dd24-40b8-a94c-63dce2cb64ca"). InnerVolumeSpecName "kube-api-access-xxvqp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:50:34 crc kubenswrapper[4722]: I0309 14:50:34.653040 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d10fe7e3-dd24-40b8-a94c-63dce2cb64ca-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "d10fe7e3-dd24-40b8-a94c-63dce2cb64ca" (UID: "d10fe7e3-dd24-40b8-a94c-63dce2cb64ca"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:50:34 crc kubenswrapper[4722]: I0309 14:50:34.654976 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d10fe7e3-dd24-40b8-a94c-63dce2cb64ca-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "d10fe7e3-dd24-40b8-a94c-63dce2cb64ca" (UID: "d10fe7e3-dd24-40b8-a94c-63dce2cb64ca"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:50:34 crc kubenswrapper[4722]: I0309 14:50:34.662401 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d10fe7e3-dd24-40b8-a94c-63dce2cb64ca-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "d10fe7e3-dd24-40b8-a94c-63dce2cb64ca" (UID: "d10fe7e3-dd24-40b8-a94c-63dce2cb64ca"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:50:34 crc kubenswrapper[4722]: I0309 14:50:34.665508 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d10fe7e3-dd24-40b8-a94c-63dce2cb64ca-inventory" (OuterVolumeSpecName: "inventory") pod "d10fe7e3-dd24-40b8-a94c-63dce2cb64ca" (UID: "d10fe7e3-dd24-40b8-a94c-63dce2cb64ca"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:50:34 crc kubenswrapper[4722]: I0309 14:50:34.665769 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d10fe7e3-dd24-40b8-a94c-63dce2cb64ca-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "d10fe7e3-dd24-40b8-a94c-63dce2cb64ca" (UID: "d10fe7e3-dd24-40b8-a94c-63dce2cb64ca"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:50:34 crc kubenswrapper[4722]: I0309 14:50:34.672079 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d10fe7e3-dd24-40b8-a94c-63dce2cb64ca-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "d10fe7e3-dd24-40b8-a94c-63dce2cb64ca" (UID: "d10fe7e3-dd24-40b8-a94c-63dce2cb64ca"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:50:34 crc kubenswrapper[4722]: I0309 14:50:34.682126 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d10fe7e3-dd24-40b8-a94c-63dce2cb64ca-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d10fe7e3-dd24-40b8-a94c-63dce2cb64ca" (UID: "d10fe7e3-dd24-40b8-a94c-63dce2cb64ca"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:50:34 crc kubenswrapper[4722]: I0309 14:50:34.690672 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d10fe7e3-dd24-40b8-a94c-63dce2cb64ca-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "d10fe7e3-dd24-40b8-a94c-63dce2cb64ca" (UID: "d10fe7e3-dd24-40b8-a94c-63dce2cb64ca"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 14:50:34 crc kubenswrapper[4722]: I0309 14:50:34.702481 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d10fe7e3-dd24-40b8-a94c-63dce2cb64ca-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "d10fe7e3-dd24-40b8-a94c-63dce2cb64ca" (UID: "d10fe7e3-dd24-40b8-a94c-63dce2cb64ca"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:50:34 crc kubenswrapper[4722]: I0309 14:50:34.706965 4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d10fe7e3-dd24-40b8-a94c-63dce2cb64ca-inventory\") on node \"crc\" DevicePath \"\""
Mar 09 14:50:34 crc kubenswrapper[4722]: I0309 14:50:34.706999 4722 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d10fe7e3-dd24-40b8-a94c-63dce2cb64ca-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 14:50:34 crc kubenswrapper[4722]: I0309 14:50:34.707010 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d10fe7e3-dd24-40b8-a94c-63dce2cb64ca-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 09 14:50:34 crc kubenswrapper[4722]: I0309 14:50:34.707021 4722 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/d10fe7e3-dd24-40b8-a94c-63dce2cb64ca-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\""
Mar 09 14:50:34 crc kubenswrapper[4722]: I0309 14:50:34.707029 4722 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/d10fe7e3-dd24-40b8-a94c-63dce2cb64ca-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\""
Mar 09 14:50:34 crc kubenswrapper[4722]: I0309 14:50:34.707039 4722 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/d10fe7e3-dd24-40b8-a94c-63dce2cb64ca-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\""
Mar 09 14:50:34 crc kubenswrapper[4722]: I0309 14:50:34.707047 4722 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/d10fe7e3-dd24-40b8-a94c-63dce2cb64ca-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\""
Mar 09 14:50:34 crc kubenswrapper[4722]: I0309 14:50:34.707056 4722 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/d10fe7e3-dd24-40b8-a94c-63dce2cb64ca-nova-extra-config-0\") on node \"crc\" DevicePath \"\""
Mar 09 14:50:34 crc kubenswrapper[4722]: I0309 14:50:34.707064 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxvqp\" (UniqueName: \"kubernetes.io/projected/d10fe7e3-dd24-40b8-a94c-63dce2cb64ca-kube-api-access-xxvqp\") on node \"crc\" DevicePath \"\""
Mar 09 14:50:34 crc kubenswrapper[4722]: I0309 14:50:34.707072 4722 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/d10fe7e3-dd24-40b8-a94c-63dce2cb64ca-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\""
Mar 09 14:50:34 crc kubenswrapper[4722]: I0309 14:50:34.707080 4722 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/d10fe7e3-dd24-40b8-a94c-63dce2cb64ca-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\""
Mar 09 14:50:34 crc kubenswrapper[4722]: I0309 14:50:34.981689 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kbmxh" event={"ID":"d10fe7e3-dd24-40b8-a94c-63dce2cb64ca","Type":"ContainerDied","Data":"f126cd4bdf935d82d34881d7c739f2056f518b0b4576d11ff3d63b173b9e5a8d"}
Mar 09 14:50:34 crc kubenswrapper[4722]: I0309 14:50:34.981976 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f126cd4bdf935d82d34881d7c739f2056f518b0b4576d11ff3d63b173b9e5a8d"
Mar 09 14:50:34 crc kubenswrapper[4722]: I0309 14:50:34.981744 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-kbmxh"
Mar 09 14:50:35 crc kubenswrapper[4722]: I0309 14:50:35.092715 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7lzxs"]
Mar 09 14:50:35 crc kubenswrapper[4722]: E0309 14:50:35.093195 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d10fe7e3-dd24-40b8-a94c-63dce2cb64ca" containerName="nova-edpm-deployment-openstack-edpm-ipam"
Mar 09 14:50:35 crc kubenswrapper[4722]: I0309 14:50:35.093226 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="d10fe7e3-dd24-40b8-a94c-63dce2cb64ca" containerName="nova-edpm-deployment-openstack-edpm-ipam"
Mar 09 14:50:35 crc kubenswrapper[4722]: E0309 14:50:35.093245 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b65eeff-ebce-4cd9-bd1f-d7ede07a1e53" containerName="oc"
Mar 09 14:50:35 crc kubenswrapper[4722]: I0309 14:50:35.093251 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b65eeff-ebce-4cd9-bd1f-d7ede07a1e53" containerName="oc"
Mar 09 14:50:35 crc kubenswrapper[4722]: I0309 14:50:35.093504 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b65eeff-ebce-4cd9-bd1f-d7ede07a1e53" containerName="oc"
Mar 09 14:50:35 crc kubenswrapper[4722]: I0309 14:50:35.093524 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="d10fe7e3-dd24-40b8-a94c-63dce2cb64ca" containerName="nova-edpm-deployment-openstack-edpm-ipam"
Mar 09 14:50:35 crc kubenswrapper[4722]: I0309 14:50:35.094324 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7lzxs"
Mar 09 14:50:35 crc kubenswrapper[4722]: I0309 14:50:35.097516 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 09 14:50:35 crc kubenswrapper[4722]: I0309 14:50:35.098486 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 09 14:50:35 crc kubenswrapper[4722]: I0309 14:50:35.099196 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dwlbg"
Mar 09 14:50:35 crc kubenswrapper[4722]: I0309 14:50:35.104667 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data"
Mar 09 14:50:35 crc kubenswrapper[4722]: I0309 14:50:35.105319 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 09 14:50:35 crc kubenswrapper[4722]: I0309 14:50:35.110385 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7lzxs"]
Mar 09 14:50:35 crc kubenswrapper[4722]: I0309 14:50:35.216828 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/713bd472-187b-47a0-9094-5ac6496d7830-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7lzxs\" (UID: \"713bd472-187b-47a0-9094-5ac6496d7830\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7lzxs"
Mar 09 14:50:35 crc kubenswrapper[4722]: I0309 14:50:35.216899 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/713bd472-187b-47a0-9094-5ac6496d7830-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7lzxs\" (UID: \"713bd472-187b-47a0-9094-5ac6496d7830\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7lzxs"
Mar 09 14:50:35 crc kubenswrapper[4722]: I0309 14:50:35.217017 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/713bd472-187b-47a0-9094-5ac6496d7830-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7lzxs\" (UID: \"713bd472-187b-47a0-9094-5ac6496d7830\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7lzxs"
Mar 09 14:50:35 crc kubenswrapper[4722]: I0309 14:50:35.217110 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/713bd472-187b-47a0-9094-5ac6496d7830-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7lzxs\" (UID: \"713bd472-187b-47a0-9094-5ac6496d7830\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7lzxs"
Mar 09 14:50:35 crc kubenswrapper[4722]: I0309 14:50:35.217231 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/713bd472-187b-47a0-9094-5ac6496d7830-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7lzxs\" (UID: \"713bd472-187b-47a0-9094-5ac6496d7830\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7lzxs"
Mar 09 14:50:35 crc kubenswrapper[4722]: I0309 14:50:35.217307 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhmmr\" (UniqueName: \"kubernetes.io/projected/713bd472-187b-47a0-9094-5ac6496d7830-kube-api-access-jhmmr\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7lzxs\" (UID: \"713bd472-187b-47a0-9094-5ac6496d7830\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7lzxs"
Mar 09 14:50:35 crc kubenswrapper[4722]: I0309 14:50:35.217345 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/713bd472-187b-47a0-9094-5ac6496d7830-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7lzxs\" (UID: \"713bd472-187b-47a0-9094-5ac6496d7830\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7lzxs"
Mar 09 14:50:35 crc kubenswrapper[4722]: I0309 14:50:35.319432 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/713bd472-187b-47a0-9094-5ac6496d7830-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7lzxs\" (UID: \"713bd472-187b-47a0-9094-5ac6496d7830\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7lzxs"
Mar 09 14:50:35 crc kubenswrapper[4722]: I0309 14:50:35.319561 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/713bd472-187b-47a0-9094-5ac6496d7830-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7lzxs\" (UID: \"713bd472-187b-47a0-9094-5ac6496d7830\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7lzxs"
Mar 09 14:50:35 crc kubenswrapper[4722]: I0309 14:50:35.319585 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/713bd472-187b-47a0-9094-5ac6496d7830-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7lzxs\" (UID: \"713bd472-187b-47a0-9094-5ac6496d7830\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7lzxs"
Mar 09 14:50:35 crc kubenswrapper[4722]: I0309 14:50:35.319720 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/713bd472-187b-47a0-9094-5ac6496d7830-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7lzxs\" (UID: \"713bd472-187b-47a0-9094-5ac6496d7830\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7lzxs"
Mar 09 14:50:35 crc kubenswrapper[4722]: I0309 14:50:35.319813 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/713bd472-187b-47a0-9094-5ac6496d7830-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7lzxs\" (UID: \"713bd472-187b-47a0-9094-5ac6496d7830\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7lzxs"
Mar 09 14:50:35 crc kubenswrapper[4722]: I0309 14:50:35.319883 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/713bd472-187b-47a0-9094-5ac6496d7830-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7lzxs\" (UID: \"713bd472-187b-47a0-9094-5ac6496d7830\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7lzxs"
Mar 09 14:50:35 crc kubenswrapper[4722]: I0309 14:50:35.319912 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhmmr\" (UniqueName: \"kubernetes.io/projected/713bd472-187b-47a0-9094-5ac6496d7830-kube-api-access-jhmmr\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7lzxs\" (UID: \"713bd472-187b-47a0-9094-5ac6496d7830\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7lzxs"
Mar 09 14:50:35 crc kubenswrapper[4722]: I0309 14:50:35.326300 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/713bd472-187b-47a0-9094-5ac6496d7830-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7lzxs\" (UID: \"713bd472-187b-47a0-9094-5ac6496d7830\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7lzxs"
Mar 09 14:50:35 crc kubenswrapper[4722]: I0309 14:50:35.326307 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/713bd472-187b-47a0-9094-5ac6496d7830-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7lzxs\" (UID: \"713bd472-187b-47a0-9094-5ac6496d7830\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7lzxs"
Mar 09 14:50:35 crc kubenswrapper[4722]: I0309 14:50:35.326735 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/713bd472-187b-47a0-9094-5ac6496d7830-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7lzxs\" (UID: \"713bd472-187b-47a0-9094-5ac6496d7830\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7lzxs"
Mar 09 14:50:35 crc kubenswrapper[4722]: I0309 14:50:35.328771 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/713bd472-187b-47a0-9094-5ac6496d7830-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7lzxs\" (UID: \"713bd472-187b-47a0-9094-5ac6496d7830\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7lzxs"
Mar 09 14:50:35 crc kubenswrapper[4722]: I0309 14:50:35.329753 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/713bd472-187b-47a0-9094-5ac6496d7830-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7lzxs\" (UID: \"713bd472-187b-47a0-9094-5ac6496d7830\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7lzxs"
Mar 09 14:50:35 crc kubenswrapper[4722]: I0309 14:50:35.330159 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/713bd472-187b-47a0-9094-5ac6496d7830-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7lzxs\" (UID: \"713bd472-187b-47a0-9094-5ac6496d7830\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7lzxs"
Mar 09 14:50:35 crc kubenswrapper[4722]: I0309 14:50:35.347275 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhmmr\" (UniqueName: \"kubernetes.io/projected/713bd472-187b-47a0-9094-5ac6496d7830-kube-api-access-jhmmr\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-7lzxs\" (UID: \"713bd472-187b-47a0-9094-5ac6496d7830\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7lzxs"
Mar 09 14:50:35 crc kubenswrapper[4722]: I0309 14:50:35.414989 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7lzxs"
Mar 09 14:50:35 crc kubenswrapper[4722]: I0309 14:50:35.989841 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7lzxs"]
Mar 09 14:50:37 crc kubenswrapper[4722]: I0309 14:50:37.000525 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7lzxs" event={"ID":"713bd472-187b-47a0-9094-5ac6496d7830","Type":"ContainerStarted","Data":"dcf7fbcac336052aba16d3041b828159477feae850f5e2792a944c6ced106c4f"}
Mar 09 14:50:37 crc kubenswrapper[4722]: I0309 14:50:37.000930 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7lzxs" event={"ID":"713bd472-187b-47a0-9094-5ac6496d7830","Type":"ContainerStarted","Data":"3005db55014da8fbc02d778ca74d5daa895ef727f32abd67eaa8f1c6c719b9e0"}
Mar 09 14:50:37 crc kubenswrapper[4722]: I0309 14:50:37.028948 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7lzxs" podStartSLOduration=1.573745063 podStartE2EDuration="2.028930324s" podCreationTimestamp="2026-03-09 14:50:35 +0000 UTC" firstStartedPulling="2026-03-09 14:50:35.998894047 +0000 UTC m=+2876.554462623" lastFinishedPulling="2026-03-09 14:50:36.454079298 +0000 UTC m=+2877.009647884" observedRunningTime="2026-03-09 14:50:37.021547382 +0000 UTC m=+2877.577115958" watchObservedRunningTime="2026-03-09 14:50:37.028930324 +0000 UTC m=+2877.584498900"
Mar 09 14:51:51 crc kubenswrapper[4722]: I0309 14:51:51.527529 4722 patch_prober.go:28] interesting pod/machine-config-daemon-hjrrb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 14:51:51 crc kubenswrapper[4722]: I0309 14:51:51.528293 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 14:52:00 crc kubenswrapper[4722]: I0309 14:52:00.172608 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551132-8shq7"]
Mar 09 14:52:00 crc kubenswrapper[4722]: I0309 14:52:00.175387 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551132-8shq7"]
Mar 09 14:52:00 crc kubenswrapper[4722]: I0309 14:52:00.175496 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551132-8shq7"
Mar 09 14:52:00 crc kubenswrapper[4722]: I0309 14:52:00.177846 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-dr2x6"
Mar 09 14:52:00 crc kubenswrapper[4722]: I0309 14:52:00.178080 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 14:52:00 crc kubenswrapper[4722]: I0309 14:52:00.178448 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 14:52:00 crc kubenswrapper[4722]: I0309 14:52:00.326860 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcsxz\" (UniqueName: \"kubernetes.io/projected/1e304347-ef41-4512-be35-420e7cb2a398-kube-api-access-wcsxz\") pod \"auto-csr-approver-29551132-8shq7\" (UID: \"1e304347-ef41-4512-be35-420e7cb2a398\") " pod="openshift-infra/auto-csr-approver-29551132-8shq7"
Mar 09 14:52:00 crc kubenswrapper[4722]: I0309 14:52:00.429861 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcsxz\" (UniqueName: \"kubernetes.io/projected/1e304347-ef41-4512-be35-420e7cb2a398-kube-api-access-wcsxz\") pod \"auto-csr-approver-29551132-8shq7\" (UID: \"1e304347-ef41-4512-be35-420e7cb2a398\") " pod="openshift-infra/auto-csr-approver-29551132-8shq7"
Mar 09 14:52:00 crc kubenswrapper[4722]: I0309 14:52:00.453518 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcsxz\" (UniqueName: \"kubernetes.io/projected/1e304347-ef41-4512-be35-420e7cb2a398-kube-api-access-wcsxz\") pod \"auto-csr-approver-29551132-8shq7\" (UID: \"1e304347-ef41-4512-be35-420e7cb2a398\") " pod="openshift-infra/auto-csr-approver-29551132-8shq7"
Mar 09 14:52:00 crc kubenswrapper[4722]: I0309 14:52:00.499366 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551132-8shq7"
Mar 09 14:52:01 crc kubenswrapper[4722]: I0309 14:52:01.031283 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551132-8shq7"]
Mar 09 14:52:01 crc kubenswrapper[4722]: I0309 14:52:01.039142 4722 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 09 14:52:01 crc kubenswrapper[4722]: I0309 14:52:01.980495 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551132-8shq7" event={"ID":"1e304347-ef41-4512-be35-420e7cb2a398","Type":"ContainerStarted","Data":"0141bfce4c2cf1db01b650e9a50c4343ed62e234558f738c0cf920470ae83dc7"}
Mar 09 14:52:02 crc kubenswrapper[4722]: I0309 14:52:02.996183 4722 generic.go:334] "Generic (PLEG): container finished" podID="1e304347-ef41-4512-be35-420e7cb2a398" containerID="5b5df56e2c0e7d3aa67f0b6fa0e003691d28f5bf74f4a326692d95347a4613e7" exitCode=0
Mar 09 14:52:02 crc kubenswrapper[4722]: I0309 14:52:02.996271 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551132-8shq7" event={"ID":"1e304347-ef41-4512-be35-420e7cb2a398","Type":"ContainerDied","Data":"5b5df56e2c0e7d3aa67f0b6fa0e003691d28f5bf74f4a326692d95347a4613e7"}
Mar 09 14:52:04 crc kubenswrapper[4722]: I0309 14:52:04.462436 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551132-8shq7"
Mar 09 14:52:04 crc kubenswrapper[4722]: I0309 14:52:04.555024 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wcsxz\" (UniqueName: \"kubernetes.io/projected/1e304347-ef41-4512-be35-420e7cb2a398-kube-api-access-wcsxz\") pod \"1e304347-ef41-4512-be35-420e7cb2a398\" (UID: \"1e304347-ef41-4512-be35-420e7cb2a398\") "
Mar 09 14:52:04 crc kubenswrapper[4722]: I0309 14:52:04.563264 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e304347-ef41-4512-be35-420e7cb2a398-kube-api-access-wcsxz" (OuterVolumeSpecName: "kube-api-access-wcsxz") pod "1e304347-ef41-4512-be35-420e7cb2a398" (UID: "1e304347-ef41-4512-be35-420e7cb2a398"). InnerVolumeSpecName "kube-api-access-wcsxz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:52:04 crc kubenswrapper[4722]: I0309 14:52:04.658280 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wcsxz\" (UniqueName: \"kubernetes.io/projected/1e304347-ef41-4512-be35-420e7cb2a398-kube-api-access-wcsxz\") on node \"crc\" DevicePath \"\""
Mar 09 14:52:05 crc kubenswrapper[4722]: I0309 14:52:05.022418 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551132-8shq7" event={"ID":"1e304347-ef41-4512-be35-420e7cb2a398","Type":"ContainerDied","Data":"0141bfce4c2cf1db01b650e9a50c4343ed62e234558f738c0cf920470ae83dc7"}
Mar 09 14:52:05 crc kubenswrapper[4722]: I0309 14:52:05.022466 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0141bfce4c2cf1db01b650e9a50c4343ed62e234558f738c0cf920470ae83dc7"
Mar 09 14:52:05 crc kubenswrapper[4722]: I0309 14:52:05.022511 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551132-8shq7"
Mar 09 14:52:05 crc kubenswrapper[4722]: I0309 14:52:05.550144 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551126-zh8m5"]
Mar 09 14:52:05 crc kubenswrapper[4722]: I0309 14:52:05.560589 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551126-zh8m5"]
Mar 09 14:52:06 crc kubenswrapper[4722]: I0309 14:52:06.169253 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3326142-7932-42ca-918c-dc1afd1a443b" path="/var/lib/kubelet/pods/b3326142-7932-42ca-918c-dc1afd1a443b/volumes"
Mar 09 14:52:21 crc kubenswrapper[4722]: I0309 14:52:21.528369 4722 patch_prober.go:28] interesting pod/machine-config-daemon-hjrrb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 14:52:21 crc kubenswrapper[4722]: I0309 14:52:21.529003 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 14:52:28 crc kubenswrapper[4722]: I0309 14:52:28.519774 4722 scope.go:117] "RemoveContainer" containerID="764c2630b59c9872d02ac080b51ce0ed88d52a8d0b0ef3f3b0ac1f44a6ec0e31"
Mar 09 14:52:51 crc kubenswrapper[4722]: I0309 14:52:51.528045 4722 patch_prober.go:28] interesting pod/machine-config-daemon-hjrrb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 14:52:51 crc kubenswrapper[4722]: I0309 14:52:51.528687 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 14:52:51 crc kubenswrapper[4722]: I0309 14:52:51.528738 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb"
Mar 09 14:52:51 crc kubenswrapper[4722]: I0309 14:52:51.529693 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e93d9830f7a284f5574f07e5fff41b2ae62d84fa8023d799598acc091cbd5a28"} pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 09 14:52:51 crc kubenswrapper[4722]: I0309 14:52:51.529750 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerName="machine-config-daemon" containerID="cri-o://e93d9830f7a284f5574f07e5fff41b2ae62d84fa8023d799598acc091cbd5a28" gracePeriod=600
Mar 09 14:52:51 crc kubenswrapper[4722]: E0309 14:52:51.652978 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648"
Mar 09 14:52:51 crc kubenswrapper[4722]: I0309 14:52:51.807156 4722 generic.go:334] "Generic (PLEG): container finished" podID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerID="e93d9830f7a284f5574f07e5fff41b2ae62d84fa8023d799598acc091cbd5a28" exitCode=0
Mar 09 14:52:51 crc kubenswrapper[4722]: I0309 14:52:51.807250 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" event={"ID":"dac2aaf5-653b-4b2a-8efe-ed26bac8d648","Type":"ContainerDied","Data":"e93d9830f7a284f5574f07e5fff41b2ae62d84fa8023d799598acc091cbd5a28"}
Mar 09 14:52:51 crc kubenswrapper[4722]: I0309 14:52:51.807348 4722 scope.go:117] "RemoveContainer" containerID="ac01e46e450ae3d431cf312a0286617c799422de23e62d8665856d6ee9c42ab7"
Mar 09 14:52:51 crc kubenswrapper[4722]: I0309 14:52:51.807786 4722 scope.go:117] "RemoveContainer" containerID="e93d9830f7a284f5574f07e5fff41b2ae62d84fa8023d799598acc091cbd5a28"
Mar 09 14:52:51 crc kubenswrapper[4722]: E0309 14:52:51.808153 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648"
Mar 09 14:53:03 crc kubenswrapper[4722]: I0309 14:53:03.150548 4722 scope.go:117] "RemoveContainer" containerID="e93d9830f7a284f5574f07e5fff41b2ae62d84fa8023d799598acc091cbd5a28"
Mar 09 14:53:03 crc kubenswrapper[4722]: E0309 14:53:03.151331 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648"
Mar 09 14:53:03 crc kubenswrapper[4722]: I0309 14:53:03.941768 4722 generic.go:334] "Generic (PLEG): container finished" podID="713bd472-187b-47a0-9094-5ac6496d7830" containerID="dcf7fbcac336052aba16d3041b828159477feae850f5e2792a944c6ced106c4f" exitCode=0
Mar 09 14:53:03 crc kubenswrapper[4722]: I0309 14:53:03.941955 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7lzxs" event={"ID":"713bd472-187b-47a0-9094-5ac6496d7830","Type":"ContainerDied","Data":"dcf7fbcac336052aba16d3041b828159477feae850f5e2792a944c6ced106c4f"}
Mar 09 14:53:05 crc kubenswrapper[4722]: I0309 14:53:05.468599 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7lzxs"
Mar 09 14:53:05 crc kubenswrapper[4722]: I0309 14:53:05.557738 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/713bd472-187b-47a0-9094-5ac6496d7830-inventory\") pod \"713bd472-187b-47a0-9094-5ac6496d7830\" (UID: \"713bd472-187b-47a0-9094-5ac6496d7830\") "
Mar 09 14:53:05 crc kubenswrapper[4722]: I0309 14:53:05.557866 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/713bd472-187b-47a0-9094-5ac6496d7830-ceilometer-compute-config-data-2\") pod \"713bd472-187b-47a0-9094-5ac6496d7830\" (UID: \"713bd472-187b-47a0-9094-5ac6496d7830\") "
Mar 09 14:53:05 crc kubenswrapper[4722]: I0309 14:53:05.557904 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/713bd472-187b-47a0-9094-5ac6496d7830-ceilometer-compute-config-data-0\") pod \"713bd472-187b-47a0-9094-5ac6496d7830\" (UID: \"713bd472-187b-47a0-9094-5ac6496d7830\") "
Mar 09 14:53:05 crc kubenswrapper[4722]: I0309 14:53:05.558072 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/713bd472-187b-47a0-9094-5ac6496d7830-telemetry-combined-ca-bundle\") pod \"713bd472-187b-47a0-9094-5ac6496d7830\" (UID: \"713bd472-187b-47a0-9094-5ac6496d7830\") "
Mar 09 14:53:05 crc kubenswrapper[4722]: I0309 14:53:05.558307 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhmmr\" (UniqueName: \"kubernetes.io/projected/713bd472-187b-47a0-9094-5ac6496d7830-kube-api-access-jhmmr\") pod \"713bd472-187b-47a0-9094-5ac6496d7830\" (UID: \"713bd472-187b-47a0-9094-5ac6496d7830\") "
Mar 09 14:53:05 crc kubenswrapper[4722]: I0309 14:53:05.558358 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/713bd472-187b-47a0-9094-5ac6496d7830-ssh-key-openstack-edpm-ipam\") pod \"713bd472-187b-47a0-9094-5ac6496d7830\" (UID: \"713bd472-187b-47a0-9094-5ac6496d7830\") "
Mar 09 14:53:05 crc kubenswrapper[4722]: I0309 14:53:05.558439 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/713bd472-187b-47a0-9094-5ac6496d7830-ceilometer-compute-config-data-1\") pod \"713bd472-187b-47a0-9094-5ac6496d7830\" (UID: \"713bd472-187b-47a0-9094-5ac6496d7830\") "
Mar 09 14:53:05 crc kubenswrapper[4722]: I0309 14:53:05.564414 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/713bd472-187b-47a0-9094-5ac6496d7830-kube-api-access-jhmmr" (OuterVolumeSpecName: "kube-api-access-jhmmr") pod "713bd472-187b-47a0-9094-5ac6496d7830" (UID: "713bd472-187b-47a0-9094-5ac6496d7830"). InnerVolumeSpecName "kube-api-access-jhmmr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 14:53:05 crc kubenswrapper[4722]: I0309 14:53:05.573399 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/713bd472-187b-47a0-9094-5ac6496d7830-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "713bd472-187b-47a0-9094-5ac6496d7830" (UID: "713bd472-187b-47a0-9094-5ac6496d7830"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:53:05 crc kubenswrapper[4722]: I0309 14:53:05.597482 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/713bd472-187b-47a0-9094-5ac6496d7830-inventory" (OuterVolumeSpecName: "inventory") pod "713bd472-187b-47a0-9094-5ac6496d7830" (UID: "713bd472-187b-47a0-9094-5ac6496d7830"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:53:05 crc kubenswrapper[4722]: I0309 14:53:05.600521 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/713bd472-187b-47a0-9094-5ac6496d7830-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "713bd472-187b-47a0-9094-5ac6496d7830" (UID: "713bd472-187b-47a0-9094-5ac6496d7830"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:53:05 crc kubenswrapper[4722]: I0309 14:53:05.603491 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/713bd472-187b-47a0-9094-5ac6496d7830-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "713bd472-187b-47a0-9094-5ac6496d7830" (UID: "713bd472-187b-47a0-9094-5ac6496d7830"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:53:05 crc kubenswrapper[4722]: I0309 14:53:05.614083 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/713bd472-187b-47a0-9094-5ac6496d7830-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "713bd472-187b-47a0-9094-5ac6496d7830" (UID: "713bd472-187b-47a0-9094-5ac6496d7830"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:53:05 crc kubenswrapper[4722]: I0309 14:53:05.624030 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/713bd472-187b-47a0-9094-5ac6496d7830-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "713bd472-187b-47a0-9094-5ac6496d7830" (UID: "713bd472-187b-47a0-9094-5ac6496d7830"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 14:53:05 crc kubenswrapper[4722]: I0309 14:53:05.661152 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhmmr\" (UniqueName: \"kubernetes.io/projected/713bd472-187b-47a0-9094-5ac6496d7830-kube-api-access-jhmmr\") on node \"crc\" DevicePath \"\""
Mar 09 14:53:05 crc kubenswrapper[4722]: I0309 14:53:05.661184 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/713bd472-187b-47a0-9094-5ac6496d7830-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 09 14:53:05 crc kubenswrapper[4722]: I0309 14:53:05.661194 4722 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/713bd472-187b-47a0-9094-5ac6496d7830-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\""
Mar 09 14:53:05 crc kubenswrapper[4722]: I0309 14:53:05.661224 4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/713bd472-187b-47a0-9094-5ac6496d7830-inventory\") on node \"crc\" DevicePath \"\""
Mar 09 14:53:05 crc kubenswrapper[4722]: I0309 14:53:05.661235 4722 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/713bd472-187b-47a0-9094-5ac6496d7830-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\""
Mar 09 14:53:05 crc kubenswrapper[4722]: I0309 14:53:05.661246 4722 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/713bd472-187b-47a0-9094-5ac6496d7830-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\""
Mar 09 14:53:05 crc kubenswrapper[4722]: I0309 14:53:05.661256 4722 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/713bd472-187b-47a0-9094-5ac6496d7830-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 09 14:53:05 crc kubenswrapper[4722]: I0309 14:53:05.967655 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7lzxs" event={"ID":"713bd472-187b-47a0-9094-5ac6496d7830","Type":"ContainerDied","Data":"3005db55014da8fbc02d778ca74d5daa895ef727f32abd67eaa8f1c6c719b9e0"}
Mar 09 14:53:05 crc kubenswrapper[4722]: I0309 14:53:05.967709 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3005db55014da8fbc02d778ca74d5daa895ef727f32abd67eaa8f1c6c719b9e0"
Mar 09 14:53:05 crc kubenswrapper[4722]: I0309 14:53:05.967734 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-7lzxs"
Mar 09 14:53:06 crc kubenswrapper[4722]: I0309 14:53:06.052548 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-9sbwv"]
Mar 09 14:53:06 crc kubenswrapper[4722]: E0309 14:53:06.053158 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="713bd472-187b-47a0-9094-5ac6496d7830" containerName="telemetry-edpm-deployment-openstack-edpm-ipam"
Mar 09 14:53:06 crc kubenswrapper[4722]: I0309 14:53:06.053186 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="713bd472-187b-47a0-9094-5ac6496d7830" containerName="telemetry-edpm-deployment-openstack-edpm-ipam"
Mar 09 14:53:06 crc kubenswrapper[4722]: E0309 14:53:06.053303 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e304347-ef41-4512-be35-420e7cb2a398" containerName="oc"
Mar 09 14:53:06 crc kubenswrapper[4722]: I0309 14:53:06.053324 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e304347-ef41-4512-be35-420e7cb2a398" containerName="oc"
Mar 09 14:53:06 crc kubenswrapper[4722]: I0309 14:53:06.053615 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="713bd472-187b-47a0-9094-5ac6496d7830" containerName="telemetry-edpm-deployment-openstack-edpm-ipam"
Mar 09 14:53:06 crc kubenswrapper[4722]: I0309 14:53:06.053662 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e304347-ef41-4512-be35-420e7cb2a398" containerName="oc"
Mar 09 14:53:06 crc kubenswrapper[4722]: I0309 14:53:06.054979 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-9sbwv"
Mar 09 14:53:06 crc kubenswrapper[4722]: I0309 14:53:06.058328 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dwlbg"
Mar 09 14:53:06 crc kubenswrapper[4722]: I0309 14:53:06.058382 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 09 14:53:06 crc kubenswrapper[4722]: I0309 14:53:06.058440 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 09 14:53:06 crc kubenswrapper[4722]: I0309 14:53:06.058601 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 09 14:53:06 crc kubenswrapper[4722]: I0309 14:53:06.058646 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-ipmi-config-data"
Mar 09 14:53:06 crc kubenswrapper[4722]: I0309 14:53:06.080143 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-9sbwv"]
Mar 09 14:53:06 crc kubenswrapper[4722]: I0309 14:53:06.173248 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/6c87731f-8737-43d6-ba7a-e1427fc96fd4-ceilometer-ipmi-config-data-0\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-9sbwv\" (UID: \"6c87731f-8737-43d6-ba7a-e1427fc96fd4\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-9sbwv"
Mar 09 14:53:06 crc kubenswrapper[4722]: I0309 14:53:06.173343 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/6c87731f-8737-43d6-ba7a-e1427fc96fd4-ceilometer-ipmi-config-data-1\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-9sbwv\" (UID: \"6c87731f-8737-43d6-ba7a-e1427fc96fd4\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-9sbwv"
Mar 09 14:53:06 crc kubenswrapper[4722]: I0309 14:53:06.173369 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/6c87731f-8737-43d6-ba7a-e1427fc96fd4-ceilometer-ipmi-config-data-2\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-9sbwv\" (UID: \"6c87731f-8737-43d6-ba7a-e1427fc96fd4\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-9sbwv"
Mar 09 14:53:06 crc kubenswrapper[4722]: I0309 14:53:06.173507 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c87731f-8737-43d6-ba7a-e1427fc96fd4-telemetry-power-monitoring-combined-ca-bundle\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-9sbwv\" (UID: \"6c87731f-8737-43d6-ba7a-e1427fc96fd4\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-9sbwv"
Mar 09 14:53:06 crc kubenswrapper[4722]: I0309 14:53:06.173559 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qpcp\" (UniqueName: \"kubernetes.io/projected/6c87731f-8737-43d6-ba7a-e1427fc96fd4-kube-api-access-4qpcp\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-9sbwv\" (UID: \"6c87731f-8737-43d6-ba7a-e1427fc96fd4\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-9sbwv"
Mar 09 14:53:06 crc kubenswrapper[4722]: I0309 14:53:06.173580 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6c87731f-8737-43d6-ba7a-e1427fc96fd4-ssh-key-openstack-edpm-ipam\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-9sbwv\" (UID: \"6c87731f-8737-43d6-ba7a-e1427fc96fd4\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-9sbwv"
Mar 09 14:53:06 crc kubenswrapper[4722]: I0309 14:53:06.173688 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6c87731f-8737-43d6-ba7a-e1427fc96fd4-inventory\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-9sbwv\" (UID: \"6c87731f-8737-43d6-ba7a-e1427fc96fd4\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-9sbwv"
Mar 09 14:53:06 crc kubenswrapper[4722]: I0309 14:53:06.277449 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qpcp\" (UniqueName: \"kubernetes.io/projected/6c87731f-8737-43d6-ba7a-e1427fc96fd4-kube-api-access-4qpcp\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-9sbwv\" (UID: \"6c87731f-8737-43d6-ba7a-e1427fc96fd4\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-9sbwv"
Mar 09 14:53:06 crc kubenswrapper[4722]: I0309 14:53:06.277501 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6c87731f-8737-43d6-ba7a-e1427fc96fd4-ssh-key-openstack-edpm-ipam\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-9sbwv\" (UID: \"6c87731f-8737-43d6-ba7a-e1427fc96fd4\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-9sbwv"
Mar 09 14:53:06 crc kubenswrapper[4722]: I0309 14:53:06.277612 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6c87731f-8737-43d6-ba7a-e1427fc96fd4-inventory\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-9sbwv\" (UID: \"6c87731f-8737-43d6-ba7a-e1427fc96fd4\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-9sbwv"
Mar 09 14:53:06 crc kubenswrapper[4722]: I0309 14:53:06.277690 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/6c87731f-8737-43d6-ba7a-e1427fc96fd4-ceilometer-ipmi-config-data-0\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-9sbwv\" (UID: \"6c87731f-8737-43d6-ba7a-e1427fc96fd4\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-9sbwv"
Mar 09 14:53:06 crc kubenswrapper[4722]: I0309 14:53:06.277811 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/6c87731f-8737-43d6-ba7a-e1427fc96fd4-ceilometer-ipmi-config-data-1\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-9sbwv\" (UID: \"6c87731f-8737-43d6-ba7a-e1427fc96fd4\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-9sbwv"
Mar 09 14:53:06 crc kubenswrapper[4722]: I0309 14:53:06.277839 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/6c87731f-8737-43d6-ba7a-e1427fc96fd4-ceilometer-ipmi-config-data-2\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-9sbwv\" (UID: \"6c87731f-8737-43d6-ba7a-e1427fc96fd4\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-9sbwv"
Mar 09 14:53:06 crc kubenswrapper[4722]: I0309 14:53:06.277946 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c87731f-8737-43d6-ba7a-e1427fc96fd4-telemetry-power-monitoring-combined-ca-bundle\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-9sbwv\" (UID: \"6c87731f-8737-43d6-ba7a-e1427fc96fd4\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-9sbwv"
Mar 09 14:53:06 crc kubenswrapper[4722]: I0309 14:53:06.282708 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/6c87731f-8737-43d6-ba7a-e1427fc96fd4-ceilometer-ipmi-config-data-1\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-9sbwv\" (UID: \"6c87731f-8737-43d6-ba7a-e1427fc96fd4\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-9sbwv"
Mar 09 14:53:06 crc kubenswrapper[4722]: I0309 14:53:06.283100 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6c87731f-8737-43d6-ba7a-e1427fc96fd4-inventory\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-9sbwv\" (UID: \"6c87731f-8737-43d6-ba7a-e1427fc96fd4\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-9sbwv"
Mar 09 14:53:06 crc kubenswrapper[4722]: I0309 14:53:06.283304 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/6c87731f-8737-43d6-ba7a-e1427fc96fd4-ceilometer-ipmi-config-data-0\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-9sbwv\" (UID: \"6c87731f-8737-43d6-ba7a-e1427fc96fd4\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-9sbwv"
Mar 09 14:53:06 crc kubenswrapper[4722]: I0309 14:53:06.291845 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/6c87731f-8737-43d6-ba7a-e1427fc96fd4-ceilometer-ipmi-config-data-2\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-9sbwv\" (UID: \"6c87731f-8737-43d6-ba7a-e1427fc96fd4\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-9sbwv"
Mar 09 14:53:06 crc kubenswrapper[4722]: I0309 14:53:06.292640 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6c87731f-8737-43d6-ba7a-e1427fc96fd4-ssh-key-openstack-edpm-ipam\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-9sbwv\" (UID: \"6c87731f-8737-43d6-ba7a-e1427fc96fd4\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-9sbwv"
Mar 09 14:53:06 crc kubenswrapper[4722]: I0309 14:53:06.293648 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qpcp\" (UniqueName: \"kubernetes.io/projected/6c87731f-8737-43d6-ba7a-e1427fc96fd4-kube-api-access-4qpcp\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-9sbwv\" (UID: \"6c87731f-8737-43d6-ba7a-e1427fc96fd4\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-9sbwv"
Mar 09 14:53:06 crc kubenswrapper[4722]: I0309 14:53:06.295009 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c87731f-8737-43d6-ba7a-e1427fc96fd4-telemetry-power-monitoring-combined-ca-bundle\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-9sbwv\" (UID: \"6c87731f-8737-43d6-ba7a-e1427fc96fd4\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-9sbwv"
Mar 09 14:53:06 crc kubenswrapper[4722]: I0309 14:53:06.380241 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-9sbwv"
Mar 09 14:53:06 crc kubenswrapper[4722]: I0309 14:53:06.957743 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-9sbwv"]
Mar 09 14:53:06 crc kubenswrapper[4722]: I0309 14:53:06.980714 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-9sbwv" event={"ID":"6c87731f-8737-43d6-ba7a-e1427fc96fd4","Type":"ContainerStarted","Data":"2b0e55c9df234920660aa368805bc28a72388377467e75426456056397e561b2"}
Mar 09 14:53:07 crc kubenswrapper[4722]: I0309 14:53:07.993524 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-9sbwv" event={"ID":"6c87731f-8737-43d6-ba7a-e1427fc96fd4","Type":"ContainerStarted","Data":"1117034e2a5360dfc41fe5549446274eaec1bebec5f53bad035a16bf3ae24f9f"}
Mar 09 14:53:09 crc kubenswrapper[4722]: I0309 14:53:09.033870 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-9sbwv" podStartSLOduration=2.338395098 podStartE2EDuration="3.033849666s" podCreationTimestamp="2026-03-09 14:53:06 +0000 UTC" firstStartedPulling="2026-03-09 14:53:06.970427332 +0000 UTC m=+3027.525995908" lastFinishedPulling="2026-03-09 14:53:07.66588189 +0000 UTC m=+3028.221450476" observedRunningTime="2026-03-09 14:53:09.026296139 +0000 UTC m=+3029.581864715" watchObservedRunningTime="2026-03-09 14:53:09.033849666 +0000 UTC m=+3029.589418242"
Mar 09 14:53:11 crc kubenswrapper[4722]: I0309 14:53:11.587673 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-t6c69"]
Mar 09 14:53:11 crc kubenswrapper[4722]: I0309 14:53:11.590704 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-t6c69"
Mar 09 14:53:11 crc kubenswrapper[4722]: I0309 14:53:11.598906 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-t6c69"]
Mar 09 14:53:11 crc kubenswrapper[4722]: I0309 14:53:11.723489 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c04049d5-5281-4fbc-ac5b-83b91da12fdf-utilities\") pod \"community-operators-t6c69\" (UID: \"c04049d5-5281-4fbc-ac5b-83b91da12fdf\") " pod="openshift-marketplace/community-operators-t6c69"
Mar 09 14:53:11 crc kubenswrapper[4722]: I0309 14:53:11.723795 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jwpr\" (UniqueName: \"kubernetes.io/projected/c04049d5-5281-4fbc-ac5b-83b91da12fdf-kube-api-access-9jwpr\") pod \"community-operators-t6c69\" (UID: \"c04049d5-5281-4fbc-ac5b-83b91da12fdf\") " pod="openshift-marketplace/community-operators-t6c69"
Mar 09 14:53:11 crc kubenswrapper[4722]: I0309 14:53:11.724143 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c04049d5-5281-4fbc-ac5b-83b91da12fdf-catalog-content\") pod \"community-operators-t6c69\" (UID: \"c04049d5-5281-4fbc-ac5b-83b91da12fdf\") " pod="openshift-marketplace/community-operators-t6c69"
Mar 09 14:53:11 crc kubenswrapper[4722]: I0309 14:53:11.826608 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jwpr\" (UniqueName: \"kubernetes.io/projected/c04049d5-5281-4fbc-ac5b-83b91da12fdf-kube-api-access-9jwpr\") pod \"community-operators-t6c69\" (UID: \"c04049d5-5281-4fbc-ac5b-83b91da12fdf\") " pod="openshift-marketplace/community-operators-t6c69"
Mar 09 14:53:11 crc kubenswrapper[4722]: I0309 14:53:11.826700 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c04049d5-5281-4fbc-ac5b-83b91da12fdf-catalog-content\") pod \"community-operators-t6c69\" (UID: \"c04049d5-5281-4fbc-ac5b-83b91da12fdf\") " pod="openshift-marketplace/community-operators-t6c69"
Mar 09 14:53:11 crc kubenswrapper[4722]: I0309 14:53:11.826764 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c04049d5-5281-4fbc-ac5b-83b91da12fdf-utilities\") pod \"community-operators-t6c69\" (UID: \"c04049d5-5281-4fbc-ac5b-83b91da12fdf\") " pod="openshift-marketplace/community-operators-t6c69"
Mar 09 14:53:11 crc kubenswrapper[4722]: I0309 14:53:11.827459 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c04049d5-5281-4fbc-ac5b-83b91da12fdf-utilities\") pod \"community-operators-t6c69\" (UID: \"c04049d5-5281-4fbc-ac5b-83b91da12fdf\") " pod="openshift-marketplace/community-operators-t6c69"
Mar 09 14:53:11 crc kubenswrapper[4722]: I0309 14:53:11.827513 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c04049d5-5281-4fbc-ac5b-83b91da12fdf-catalog-content\") pod \"community-operators-t6c69\" (UID: \"c04049d5-5281-4fbc-ac5b-83b91da12fdf\") " pod="openshift-marketplace/community-operators-t6c69"
Mar 09 14:53:11 crc kubenswrapper[4722]: I0309 14:53:11.853690 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jwpr\" (UniqueName: \"kubernetes.io/projected/c04049d5-5281-4fbc-ac5b-83b91da12fdf-kube-api-access-9jwpr\") pod \"community-operators-t6c69\" (UID: \"c04049d5-5281-4fbc-ac5b-83b91da12fdf\") " pod="openshift-marketplace/community-operators-t6c69"
Mar 09 14:53:11 crc kubenswrapper[4722]: I0309 14:53:11.929027 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-t6c69"
Mar 09 14:53:12 crc kubenswrapper[4722]: I0309 14:53:12.519558 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-t6c69"]
Mar 09 14:53:13 crc kubenswrapper[4722]: I0309 14:53:13.063770 4722 generic.go:334] "Generic (PLEG): container finished" podID="c04049d5-5281-4fbc-ac5b-83b91da12fdf" containerID="311928bdb466b695e5cb507938f8e51491e480b69d3d5a22189dbe13676eba55" exitCode=0
Mar 09 14:53:13 crc kubenswrapper[4722]: I0309 14:53:13.063906 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t6c69" event={"ID":"c04049d5-5281-4fbc-ac5b-83b91da12fdf","Type":"ContainerDied","Data":"311928bdb466b695e5cb507938f8e51491e480b69d3d5a22189dbe13676eba55"}
Mar 09 14:53:13 crc kubenswrapper[4722]: I0309 14:53:13.064813 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t6c69" event={"ID":"c04049d5-5281-4fbc-ac5b-83b91da12fdf","Type":"ContainerStarted","Data":"13d45d9d8e416e76cc01bf96056cfc6fe1830fedef309f73b6a035383e2ecee9"}
Mar 09 14:53:14 crc kubenswrapper[4722]: I0309 14:53:14.077186 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t6c69" event={"ID":"c04049d5-5281-4fbc-ac5b-83b91da12fdf","Type":"ContainerStarted","Data":"132bf7226cde11e7ab39edb5a6b1f371f2560ddc7e870cdf216ac33b3257a605"}
Mar 09 14:53:14 crc kubenswrapper[4722]: I0309 14:53:14.151042 4722 scope.go:117] "RemoveContainer" containerID="e93d9830f7a284f5574f07e5fff41b2ae62d84fa8023d799598acc091cbd5a28"
Mar 09 14:53:14 crc kubenswrapper[4722]: E0309 14:53:14.151452 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648"
Mar 09 14:53:16 crc kubenswrapper[4722]: I0309 14:53:16.099476 4722 generic.go:334] "Generic (PLEG): container finished" podID="c04049d5-5281-4fbc-ac5b-83b91da12fdf" containerID="132bf7226cde11e7ab39edb5a6b1f371f2560ddc7e870cdf216ac33b3257a605" exitCode=0
Mar 09 14:53:16 crc kubenswrapper[4722]: I0309 14:53:16.099554 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t6c69" event={"ID":"c04049d5-5281-4fbc-ac5b-83b91da12fdf","Type":"ContainerDied","Data":"132bf7226cde11e7ab39edb5a6b1f371f2560ddc7e870cdf216ac33b3257a605"}
Mar 09 14:53:17 crc kubenswrapper[4722]: I0309 14:53:17.112728 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t6c69" event={"ID":"c04049d5-5281-4fbc-ac5b-83b91da12fdf","Type":"ContainerStarted","Data":"3ef0c40cba3118336ce96a537427d1d6724bc7f2dda4519cb1f2bb9d95d4a7c5"}
Mar 09 14:53:17 crc kubenswrapper[4722]: I0309 14:53:17.146131 4722
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-t6c69" podStartSLOduration=2.6492413089999998 podStartE2EDuration="6.146108651s" podCreationTimestamp="2026-03-09 14:53:11 +0000 UTC" firstStartedPulling="2026-03-09 14:53:13.068995124 +0000 UTC m=+3033.624563730" lastFinishedPulling="2026-03-09 14:53:16.565862496 +0000 UTC m=+3037.121431072" observedRunningTime="2026-03-09 14:53:17.137474943 +0000 UTC m=+3037.693043529" watchObservedRunningTime="2026-03-09 14:53:17.146108651 +0000 UTC m=+3037.701677247" Mar 09 14:53:21 crc kubenswrapper[4722]: I0309 14:53:21.929589 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-t6c69" Mar 09 14:53:21 crc kubenswrapper[4722]: I0309 14:53:21.930238 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-t6c69" Mar 09 14:53:21 crc kubenswrapper[4722]: I0309 14:53:21.977373 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-t6c69" Mar 09 14:53:22 crc kubenswrapper[4722]: I0309 14:53:22.226159 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-t6c69" Mar 09 14:53:22 crc kubenswrapper[4722]: I0309 14:53:22.290491 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-t6c69"] Mar 09 14:53:24 crc kubenswrapper[4722]: I0309 14:53:24.198981 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-t6c69" podUID="c04049d5-5281-4fbc-ac5b-83b91da12fdf" containerName="registry-server" containerID="cri-o://3ef0c40cba3118336ce96a537427d1d6724bc7f2dda4519cb1f2bb9d95d4a7c5" gracePeriod=2 Mar 09 14:53:25 crc kubenswrapper[4722]: I0309 14:53:25.150173 4722 scope.go:117] "RemoveContainer" containerID="e93d9830f7a284f5574f07e5fff41b2ae62d84fa8023d799598acc091cbd5a28" Mar 09 14:53:25 crc kubenswrapper[4722]: E0309 14:53:25.151029 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 14:53:25 crc kubenswrapper[4722]: I0309 14:53:25.211241 4722 generic.go:334] "Generic (PLEG): container finished" podID="c04049d5-5281-4fbc-ac5b-83b91da12fdf" containerID="3ef0c40cba3118336ce96a537427d1d6724bc7f2dda4519cb1f2bb9d95d4a7c5" exitCode=0 Mar 09 14:53:25 crc kubenswrapper[4722]: I0309 14:53:25.211280 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t6c69" event={"ID":"c04049d5-5281-4fbc-ac5b-83b91da12fdf","Type":"ContainerDied","Data":"3ef0c40cba3118336ce96a537427d1d6724bc7f2dda4519cb1f2bb9d95d4a7c5"} Mar 09 14:53:25 crc kubenswrapper[4722]: I0309 14:53:25.341300 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-t6c69" Mar 09 14:53:25 crc kubenswrapper[4722]: I0309 14:53:25.469552 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c04049d5-5281-4fbc-ac5b-83b91da12fdf-utilities\") pod \"c04049d5-5281-4fbc-ac5b-83b91da12fdf\" (UID: \"c04049d5-5281-4fbc-ac5b-83b91da12fdf\") " Mar 09 14:53:25 crc kubenswrapper[4722]: I0309 14:53:25.469628 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jwpr\" (UniqueName: \"kubernetes.io/projected/c04049d5-5281-4fbc-ac5b-83b91da12fdf-kube-api-access-9jwpr\") pod \"c04049d5-5281-4fbc-ac5b-83b91da12fdf\" (UID: \"c04049d5-5281-4fbc-ac5b-83b91da12fdf\") " Mar 09 14:53:25 crc kubenswrapper[4722]: I0309 14:53:25.469727 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c04049d5-5281-4fbc-ac5b-83b91da12fdf-catalog-content\") pod \"c04049d5-5281-4fbc-ac5b-83b91da12fdf\" (UID: \"c04049d5-5281-4fbc-ac5b-83b91da12fdf\") " Mar 09 14:53:25 crc kubenswrapper[4722]: I0309 14:53:25.470416 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c04049d5-5281-4fbc-ac5b-83b91da12fdf-utilities" (OuterVolumeSpecName: "utilities") pod "c04049d5-5281-4fbc-ac5b-83b91da12fdf" (UID: "c04049d5-5281-4fbc-ac5b-83b91da12fdf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:53:25 crc kubenswrapper[4722]: I0309 14:53:25.477463 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c04049d5-5281-4fbc-ac5b-83b91da12fdf-kube-api-access-9jwpr" (OuterVolumeSpecName: "kube-api-access-9jwpr") pod "c04049d5-5281-4fbc-ac5b-83b91da12fdf" (UID: "c04049d5-5281-4fbc-ac5b-83b91da12fdf"). InnerVolumeSpecName "kube-api-access-9jwpr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:53:25 crc kubenswrapper[4722]: I0309 14:53:25.527164 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c04049d5-5281-4fbc-ac5b-83b91da12fdf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c04049d5-5281-4fbc-ac5b-83b91da12fdf" (UID: "c04049d5-5281-4fbc-ac5b-83b91da12fdf"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:53:25 crc kubenswrapper[4722]: I0309 14:53:25.572946 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c04049d5-5281-4fbc-ac5b-83b91da12fdf-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 14:53:25 crc kubenswrapper[4722]: I0309 14:53:25.572982 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c04049d5-5281-4fbc-ac5b-83b91da12fdf-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 14:53:25 crc kubenswrapper[4722]: I0309 14:53:25.572997 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jwpr\" (UniqueName: \"kubernetes.io/projected/c04049d5-5281-4fbc-ac5b-83b91da12fdf-kube-api-access-9jwpr\") on node \"crc\" DevicePath \"\"" Mar 09 14:53:26 crc kubenswrapper[4722]: I0309 14:53:26.234583 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-t6c69" event={"ID":"c04049d5-5281-4fbc-ac5b-83b91da12fdf","Type":"ContainerDied","Data":"13d45d9d8e416e76cc01bf96056cfc6fe1830fedef309f73b6a035383e2ecee9"} Mar 09 14:53:26 crc kubenswrapper[4722]: I0309 14:53:26.236075 4722 scope.go:117] "RemoveContainer" containerID="3ef0c40cba3118336ce96a537427d1d6724bc7f2dda4519cb1f2bb9d95d4a7c5" Mar 09 14:53:26 crc kubenswrapper[4722]: I0309 14:53:26.234699 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-t6c69" Mar 09 14:53:26 crc kubenswrapper[4722]: I0309 14:53:26.280263 4722 scope.go:117] "RemoveContainer" containerID="132bf7226cde11e7ab39edb5a6b1f371f2560ddc7e870cdf216ac33b3257a605" Mar 09 14:53:26 crc kubenswrapper[4722]: I0309 14:53:26.282346 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-t6c69"] Mar 09 14:53:26 crc kubenswrapper[4722]: I0309 14:53:26.297548 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-t6c69"] Mar 09 14:53:26 crc kubenswrapper[4722]: I0309 14:53:26.340897 4722 scope.go:117] "RemoveContainer" containerID="311928bdb466b695e5cb507938f8e51491e480b69d3d5a22189dbe13676eba55" Mar 09 14:53:28 crc kubenswrapper[4722]: I0309 14:53:28.165058 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c04049d5-5281-4fbc-ac5b-83b91da12fdf" path="/var/lib/kubelet/pods/c04049d5-5281-4fbc-ac5b-83b91da12fdf/volumes" Mar 09 14:53:37 crc kubenswrapper[4722]: I0309 14:53:37.149819 4722 scope.go:117] "RemoveContainer" containerID="e93d9830f7a284f5574f07e5fff41b2ae62d84fa8023d799598acc091cbd5a28" Mar 09 14:53:37 crc kubenswrapper[4722]: E0309 14:53:37.151276 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 14:53:51 crc kubenswrapper[4722]: I0309 14:53:51.150084 4722 scope.go:117] "RemoveContainer" containerID="e93d9830f7a284f5574f07e5fff41b2ae62d84fa8023d799598acc091cbd5a28" Mar 09 14:53:51 crc kubenswrapper[4722]: E0309 14:53:51.151057 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 14:54:00 crc kubenswrapper[4722]: I0309 14:54:00.143796 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551134-l25dd"] Mar 09 14:54:00 crc kubenswrapper[4722]: E0309 14:54:00.144895 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c04049d5-5281-4fbc-ac5b-83b91da12fdf" containerName="registry-server" Mar 09 14:54:00 crc kubenswrapper[4722]: I0309 14:54:00.144909 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="c04049d5-5281-4fbc-ac5b-83b91da12fdf" containerName="registry-server" Mar 09 14:54:00 crc kubenswrapper[4722]: E0309 14:54:00.144938 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c04049d5-5281-4fbc-ac5b-83b91da12fdf" containerName="extract-content" Mar 09 14:54:00 crc kubenswrapper[4722]: I0309 14:54:00.144944 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="c04049d5-5281-4fbc-ac5b-83b91da12fdf" containerName="extract-content" Mar 09 14:54:00 crc kubenswrapper[4722]: E0309 14:54:00.144958 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c04049d5-5281-4fbc-ac5b-83b91da12fdf" containerName="extract-utilities" Mar 09 14:54:00 crc kubenswrapper[4722]: I0309 14:54:00.144964 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="c04049d5-5281-4fbc-ac5b-83b91da12fdf" containerName="extract-utilities" Mar 09 14:54:00 crc kubenswrapper[4722]: I0309 14:54:00.145222 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="c04049d5-5281-4fbc-ac5b-83b91da12fdf" containerName="registry-server" Mar 09 14:54:00 crc kubenswrapper[4722]: I0309 14:54:00.146061 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551134-l25dd" Mar 09 14:54:00 crc kubenswrapper[4722]: I0309 14:54:00.148036 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 14:54:00 crc kubenswrapper[4722]: I0309 14:54:00.148448 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-dr2x6" Mar 09 14:54:00 crc kubenswrapper[4722]: I0309 14:54:00.149620 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 14:54:00 crc kubenswrapper[4722]: I0309 14:54:00.163477 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551134-l25dd"] Mar 09 14:54:00 crc kubenswrapper[4722]: I0309 14:54:00.262376 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zs9pp\" (UniqueName: \"kubernetes.io/projected/9853cd61-1429-4805-bcb1-bf9771928787-kube-api-access-zs9pp\") pod \"auto-csr-approver-29551134-l25dd\" (UID: \"9853cd61-1429-4805-bcb1-bf9771928787\") " pod="openshift-infra/auto-csr-approver-29551134-l25dd" Mar 09 14:54:00 crc kubenswrapper[4722]: I0309 14:54:00.364178 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zs9pp\" (UniqueName: \"kubernetes.io/projected/9853cd61-1429-4805-bcb1-bf9771928787-kube-api-access-zs9pp\") pod \"auto-csr-approver-29551134-l25dd\" (UID: \"9853cd61-1429-4805-bcb1-bf9771928787\") " pod="openshift-infra/auto-csr-approver-29551134-l25dd" Mar 09 14:54:00 crc kubenswrapper[4722]: I0309 14:54:00.381658 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zs9pp\" (UniqueName: \"kubernetes.io/projected/9853cd61-1429-4805-bcb1-bf9771928787-kube-api-access-zs9pp\") pod \"auto-csr-approver-29551134-l25dd\" (UID: \"9853cd61-1429-4805-bcb1-bf9771928787\") " pod="openshift-infra/auto-csr-approver-29551134-l25dd" Mar 09 14:54:00 crc kubenswrapper[4722]: I0309 14:54:00.469602 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551134-l25dd" Mar 09 14:54:00 crc kubenswrapper[4722]: I0309 14:54:00.930678 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551134-l25dd"] Mar 09 14:54:01 crc kubenswrapper[4722]: I0309 14:54:01.625561 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551134-l25dd" event={"ID":"9853cd61-1429-4805-bcb1-bf9771928787","Type":"ContainerStarted","Data":"686a1c72c53f1ca4eac44f1b571680fd745e843792c682fe6a1b5b14dbbcced7"} Mar 09 14:54:02 crc kubenswrapper[4722]: I0309 14:54:02.150723 4722 scope.go:117] "RemoveContainer" containerID="e93d9830f7a284f5574f07e5fff41b2ae62d84fa8023d799598acc091cbd5a28" Mar 09 14:54:02 crc kubenswrapper[4722]: E0309 14:54:02.151038 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 14:54:02 crc kubenswrapper[4722]: I0309 14:54:02.643221 4722 generic.go:334] "Generic (PLEG): container finished" podID="9853cd61-1429-4805-bcb1-bf9771928787" containerID="a98436cf88d27eb9f246f9faa7c15af2430870edf3d068a921cfe4ca0724b0c8" exitCode=0 Mar 09 14:54:02 crc kubenswrapper[4722]: I0309 14:54:02.643289 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551134-l25dd" event={"ID":"9853cd61-1429-4805-bcb1-bf9771928787","Type":"ContainerDied","Data":"a98436cf88d27eb9f246f9faa7c15af2430870edf3d068a921cfe4ca0724b0c8"} Mar 09 14:54:04 crc kubenswrapper[4722]: I0309 14:54:04.103610 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551134-l25dd" Mar 09 14:54:04 crc kubenswrapper[4722]: I0309 14:54:04.162504 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zs9pp\" (UniqueName: \"kubernetes.io/projected/9853cd61-1429-4805-bcb1-bf9771928787-kube-api-access-zs9pp\") pod \"9853cd61-1429-4805-bcb1-bf9771928787\" (UID: \"9853cd61-1429-4805-bcb1-bf9771928787\") " Mar 09 14:54:04 crc kubenswrapper[4722]: I0309 14:54:04.167918 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9853cd61-1429-4805-bcb1-bf9771928787-kube-api-access-zs9pp" (OuterVolumeSpecName: "kube-api-access-zs9pp") pod "9853cd61-1429-4805-bcb1-bf9771928787" (UID: "9853cd61-1429-4805-bcb1-bf9771928787"). InnerVolumeSpecName "kube-api-access-zs9pp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:54:04 crc kubenswrapper[4722]: I0309 14:54:04.266184 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zs9pp\" (UniqueName: \"kubernetes.io/projected/9853cd61-1429-4805-bcb1-bf9771928787-kube-api-access-zs9pp\") on node \"crc\" DevicePath \"\"" Mar 09 14:54:04 crc kubenswrapper[4722]: I0309 14:54:04.678536 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551134-l25dd" event={"ID":"9853cd61-1429-4805-bcb1-bf9771928787","Type":"ContainerDied","Data":"686a1c72c53f1ca4eac44f1b571680fd745e843792c682fe6a1b5b14dbbcced7"} Mar 09 14:54:04 crc kubenswrapper[4722]: I0309 14:54:04.678579 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="686a1c72c53f1ca4eac44f1b571680fd745e843792c682fe6a1b5b14dbbcced7" Mar 09 14:54:04 crc kubenswrapper[4722]: I0309 14:54:04.678599 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551134-l25dd" Mar 09 14:54:05 crc kubenswrapper[4722]: I0309 14:54:05.187825 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551128-wxwp8"] Mar 09 14:54:05 crc kubenswrapper[4722]: I0309 14:54:05.225318 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551128-wxwp8"] Mar 09 14:54:06 crc kubenswrapper[4722]: I0309 14:54:06.169822 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0356803a-3ad1-40f7-895e-25dbf882fe52" path="/var/lib/kubelet/pods/0356803a-3ad1-40f7-895e-25dbf882fe52/volumes" Mar 09 14:54:13 crc kubenswrapper[4722]: I0309 14:54:13.253628 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jpg5d"] Mar 09 14:54:13 crc kubenswrapper[4722]: E0309 14:54:13.254786 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9853cd61-1429-4805-bcb1-bf9771928787" containerName="oc" Mar 09 14:54:13 crc kubenswrapper[4722]: I0309 14:54:13.254802 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="9853cd61-1429-4805-bcb1-bf9771928787" containerName="oc" Mar 09 14:54:13 crc kubenswrapper[4722]: I0309 14:54:13.255037 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="9853cd61-1429-4805-bcb1-bf9771928787" containerName="oc" Mar 09 14:54:13 crc kubenswrapper[4722]: I0309 14:54:13.256727 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jpg5d" Mar 09 14:54:13 crc kubenswrapper[4722]: I0309 14:54:13.283603 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jpg5d"] Mar 09 14:54:13 crc kubenswrapper[4722]: I0309 14:54:13.395421 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1ba0d92-32ea-463d-a0d6-910a9f9c9e37-utilities\") pod \"redhat-marketplace-jpg5d\" (UID: \"e1ba0d92-32ea-463d-a0d6-910a9f9c9e37\") " pod="openshift-marketplace/redhat-marketplace-jpg5d" Mar 09 14:54:13 crc kubenswrapper[4722]: I0309 14:54:13.395508 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bf2gp\" (UniqueName: \"kubernetes.io/projected/e1ba0d92-32ea-463d-a0d6-910a9f9c9e37-kube-api-access-bf2gp\") pod \"redhat-marketplace-jpg5d\" (UID: \"e1ba0d92-32ea-463d-a0d6-910a9f9c9e37\") " pod="openshift-marketplace/redhat-marketplace-jpg5d" Mar 09 14:54:13 crc kubenswrapper[4722]: I0309 14:54:13.395583 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1ba0d92-32ea-463d-a0d6-910a9f9c9e37-catalog-content\") pod \"redhat-marketplace-jpg5d\" (UID: \"e1ba0d92-32ea-463d-a0d6-910a9f9c9e37\") " pod="openshift-marketplace/redhat-marketplace-jpg5d" Mar 09 14:54:13 crc kubenswrapper[4722]: I0309 14:54:13.498095 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bf2gp\" (UniqueName: \"kubernetes.io/projected/e1ba0d92-32ea-463d-a0d6-910a9f9c9e37-kube-api-access-bf2gp\") pod \"redhat-marketplace-jpg5d\" (UID: \"e1ba0d92-32ea-463d-a0d6-910a9f9c9e37\") " pod="openshift-marketplace/redhat-marketplace-jpg5d" Mar 09 14:54:13 crc kubenswrapper[4722]: I0309 14:54:13.498868 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1ba0d92-32ea-463d-a0d6-910a9f9c9e37-catalog-content\") pod \"redhat-marketplace-jpg5d\" (UID: \"e1ba0d92-32ea-463d-a0d6-910a9f9c9e37\") " pod="openshift-marketplace/redhat-marketplace-jpg5d" Mar 09 14:54:13 crc kubenswrapper[4722]: I0309 14:54:13.499223 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1ba0d92-32ea-463d-a0d6-910a9f9c9e37-catalog-content\") pod \"redhat-marketplace-jpg5d\" (UID: \"e1ba0d92-32ea-463d-a0d6-910a9f9c9e37\") " pod="openshift-marketplace/redhat-marketplace-jpg5d" Mar 09 14:54:13 crc kubenswrapper[4722]: I0309 14:54:13.499866 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1ba0d92-32ea-463d-a0d6-910a9f9c9e37-utilities\") pod \"redhat-marketplace-jpg5d\" (UID: \"e1ba0d92-32ea-463d-a0d6-910a9f9c9e37\") " pod="openshift-marketplace/redhat-marketplace-jpg5d" Mar 09 14:54:13 crc kubenswrapper[4722]: I0309 14:54:13.500121 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1ba0d92-32ea-463d-a0d6-910a9f9c9e37-utilities\") pod \"redhat-marketplace-jpg5d\" (UID: \"e1ba0d92-32ea-463d-a0d6-910a9f9c9e37\") " pod="openshift-marketplace/redhat-marketplace-jpg5d" Mar 09 14:54:13 crc kubenswrapper[4722]: I0309 14:54:13.521262 4722 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-bf2gp\" (UniqueName: \"kubernetes.io/projected/e1ba0d92-32ea-463d-a0d6-910a9f9c9e37-kube-api-access-bf2gp\") pod \"redhat-marketplace-jpg5d\" (UID: \"e1ba0d92-32ea-463d-a0d6-910a9f9c9e37\") " pod="openshift-marketplace/redhat-marketplace-jpg5d" Mar 09 14:54:13 crc kubenswrapper[4722]: I0309 14:54:13.581300 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jpg5d" Mar 09 14:54:14 crc kubenswrapper[4722]: I0309 14:54:14.095637 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jpg5d"] Mar 09 14:54:14 crc kubenswrapper[4722]: I0309 14:54:14.150298 4722 scope.go:117] "RemoveContainer" containerID="e93d9830f7a284f5574f07e5fff41b2ae62d84fa8023d799598acc091cbd5a28" Mar 09 14:54:14 crc kubenswrapper[4722]: E0309 14:54:14.150799 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 14:54:14 crc kubenswrapper[4722]: I0309 14:54:14.792860 4722 generic.go:334] "Generic (PLEG): container finished" podID="e1ba0d92-32ea-463d-a0d6-910a9f9c9e37" containerID="4ce37836b4da9dfd0f3dde19e5a799b8c30ff66f686836d5b1644c3259292e54" exitCode=0 Mar 09 14:54:14 crc kubenswrapper[4722]: I0309 14:54:14.793184 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jpg5d" event={"ID":"e1ba0d92-32ea-463d-a0d6-910a9f9c9e37","Type":"ContainerDied","Data":"4ce37836b4da9dfd0f3dde19e5a799b8c30ff66f686836d5b1644c3259292e54"} Mar 09 14:54:14 crc kubenswrapper[4722]: I0309 14:54:14.793292 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jpg5d" event={"ID":"e1ba0d92-32ea-463d-a0d6-910a9f9c9e37","Type":"ContainerStarted","Data":"3be6de7628c667cc3d2b397479cfa5ca3b8415850f1cdda8cb114dbab69c0d47"} Mar 09 14:54:17 crc kubenswrapper[4722]: I0309 14:54:17.826775 4722 generic.go:334] "Generic (PLEG): container finished" podID="e1ba0d92-32ea-463d-a0d6-910a9f9c9e37" containerID="39503447920c3153bbd5c1d70e24cd85aff07eecdaa8108b771066a7e5bb8c9d" exitCode=0 Mar 09 14:54:17 crc kubenswrapper[4722]: I0309 14:54:17.826906 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jpg5d" event={"ID":"e1ba0d92-32ea-463d-a0d6-910a9f9c9e37","Type":"ContainerDied","Data":"39503447920c3153bbd5c1d70e24cd85aff07eecdaa8108b771066a7e5bb8c9d"} Mar 09 14:54:18 crc kubenswrapper[4722]: I0309 14:54:18.839495 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jpg5d" event={"ID":"e1ba0d92-32ea-463d-a0d6-910a9f9c9e37","Type":"ContainerStarted","Data":"4b8497c0fb0bbf2f1446b00bef0a71399d3e232ac5f781bd5d58c1f3050d523a"} Mar 09 14:54:23 crc kubenswrapper[4722]: I0309 14:54:23.582461 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jpg5d" Mar 09 14:54:23 crc kubenswrapper[4722]: I0309 14:54:23.582984 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jpg5d" Mar 09 14:54:23 crc kubenswrapper[4722]: I0309 
14:54:23.631476 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jpg5d" Mar 09 14:54:23 crc kubenswrapper[4722]: I0309 14:54:23.651072 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jpg5d" podStartSLOduration=7.197727582 podStartE2EDuration="10.651052783s" podCreationTimestamp="2026-03-09 14:54:13 +0000 UTC" firstStartedPulling="2026-03-09 14:54:14.796711986 +0000 UTC m=+3095.352280572" lastFinishedPulling="2026-03-09 14:54:18.250037197 +0000 UTC m=+3098.805605773" observedRunningTime="2026-03-09 14:54:18.866614765 +0000 UTC m=+3099.422183341" watchObservedRunningTime="2026-03-09 14:54:23.651052783 +0000 UTC m=+3104.206621349" Mar 09 14:54:23 crc kubenswrapper[4722]: I0309 14:54:23.959045 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jpg5d" Mar 09 14:54:24 crc kubenswrapper[4722]: I0309 14:54:24.017284 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jpg5d"] Mar 09 14:54:25 crc kubenswrapper[4722]: I0309 14:54:25.150575 4722 scope.go:117] "RemoveContainer" containerID="e93d9830f7a284f5574f07e5fff41b2ae62d84fa8023d799598acc091cbd5a28" Mar 09 14:54:25 crc kubenswrapper[4722]: E0309 14:54:25.152099 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 14:54:25 crc kubenswrapper[4722]: I0309 14:54:25.913239 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jpg5d" podUID="e1ba0d92-32ea-463d-a0d6-910a9f9c9e37" containerName="registry-server" containerID="cri-o://4b8497c0fb0bbf2f1446b00bef0a71399d3e232ac5f781bd5d58c1f3050d523a" gracePeriod=2 Mar 09 14:54:26 crc kubenswrapper[4722]: I0309 14:54:26.479401 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jpg5d" Mar 09 14:54:26 crc kubenswrapper[4722]: I0309 14:54:26.546380 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2gp\" (UniqueName: \"kubernetes.io/projected/e1ba0d92-32ea-463d-a0d6-910a9f9c9e37-kube-api-access-bf2gp\") pod \"e1ba0d92-32ea-463d-a0d6-910a9f9c9e37\" (UID: \"e1ba0d92-32ea-463d-a0d6-910a9f9c9e37\") " Mar 09 14:54:26 crc kubenswrapper[4722]: I0309 14:54:26.546733 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1ba0d92-32ea-463d-a0d6-910a9f9c9e37-utilities\") pod \"e1ba0d92-32ea-463d-a0d6-910a9f9c9e37\" (UID: \"e1ba0d92-32ea-463d-a0d6-910a9f9c9e37\") " Mar 09 14:54:26 crc kubenswrapper[4722]: I0309 14:54:26.546767 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1ba0d92-32ea-463d-a0d6-910a9f9c9e37-catalog-content\") pod \"e1ba0d92-32ea-463d-a0d6-910a9f9c9e37\" (UID: \"e1ba0d92-32ea-463d-a0d6-910a9f9c9e37\") " Mar 09 14:54:26 crc kubenswrapper[4722]: I0309 14:54:26.547886 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1ba0d92-32ea-463d-a0d6-910a9f9c9e37-utilities" (OuterVolumeSpecName: "utilities") pod "e1ba0d92-32ea-463d-a0d6-910a9f9c9e37" (UID: "e1ba0d92-32ea-463d-a0d6-910a9f9c9e37"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:54:26 crc kubenswrapper[4722]: I0309 14:54:26.557459 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1ba0d92-32ea-463d-a0d6-910a9f9c9e37-kube-api-access-bf2gp" (OuterVolumeSpecName: "kube-api-access-bf2gp") pod "e1ba0d92-32ea-463d-a0d6-910a9f9c9e37" (UID: "e1ba0d92-32ea-463d-a0d6-910a9f9c9e37"). InnerVolumeSpecName "kube-api-access-bf2gp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:54:26 crc kubenswrapper[4722]: I0309 14:54:26.577080 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1ba0d92-32ea-463d-a0d6-910a9f9c9e37-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e1ba0d92-32ea-463d-a0d6-910a9f9c9e37" (UID: "e1ba0d92-32ea-463d-a0d6-910a9f9c9e37"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 14:54:26 crc kubenswrapper[4722]: I0309 14:54:26.650429 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1ba0d92-32ea-463d-a0d6-910a9f9c9e37-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 14:54:26 crc kubenswrapper[4722]: I0309 14:54:26.650467 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1ba0d92-32ea-463d-a0d6-910a9f9c9e37-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 14:54:26 crc kubenswrapper[4722]: I0309 14:54:26.650478 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2gp\" (UniqueName: \"kubernetes.io/projected/e1ba0d92-32ea-463d-a0d6-910a9f9c9e37-kube-api-access-bf2gp\") on node \"crc\" DevicePath \"\"" Mar 09 14:54:26 crc kubenswrapper[4722]: I0309 14:54:26.926371 4722 generic.go:334] "Generic (PLEG): container finished" podID="e1ba0d92-32ea-463d-a0d6-910a9f9c9e37" containerID="4b8497c0fb0bbf2f1446b00bef0a71399d3e232ac5f781bd5d58c1f3050d523a" exitCode=0 Mar 09 14:54:26 crc kubenswrapper[4722]: I0309 14:54:26.926449 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jpg5d" Mar 09 14:54:26 crc kubenswrapper[4722]: I0309 14:54:26.926462 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jpg5d" event={"ID":"e1ba0d92-32ea-463d-a0d6-910a9f9c9e37","Type":"ContainerDied","Data":"4b8497c0fb0bbf2f1446b00bef0a71399d3e232ac5f781bd5d58c1f3050d523a"} Mar 09 14:54:26 crc kubenswrapper[4722]: I0309 14:54:26.926933 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jpg5d" event={"ID":"e1ba0d92-32ea-463d-a0d6-910a9f9c9e37","Type":"ContainerDied","Data":"3be6de7628c667cc3d2b397479cfa5ca3b8415850f1cdda8cb114dbab69c0d47"} Mar 09 14:54:26 crc kubenswrapper[4722]: I0309 14:54:26.926963 4722 scope.go:117] "RemoveContainer" containerID="4b8497c0fb0bbf2f1446b00bef0a71399d3e232ac5f781bd5d58c1f3050d523a" Mar 09 14:54:26 crc kubenswrapper[4722]: I0309 14:54:26.952840 4722 scope.go:117] "RemoveContainer" containerID="39503447920c3153bbd5c1d70e24cd85aff07eecdaa8108b771066a7e5bb8c9d" Mar 09 14:54:26 crc kubenswrapper[4722]: I0309 14:54:26.963296 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jpg5d"] Mar 09 14:54:26 crc kubenswrapper[4722]: I0309 14:54:26.975517 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jpg5d"] Mar 09 14:54:26 crc kubenswrapper[4722]: I0309 14:54:26.978082 4722 scope.go:117] "RemoveContainer" containerID="4ce37836b4da9dfd0f3dde19e5a799b8c30ff66f686836d5b1644c3259292e54" Mar 09 14:54:27 crc kubenswrapper[4722]: I0309 14:54:27.044269 4722 scope.go:117] "RemoveContainer" containerID="4b8497c0fb0bbf2f1446b00bef0a71399d3e232ac5f781bd5d58c1f3050d523a" Mar 09 14:54:27 crc kubenswrapper[4722]: E0309 14:54:27.045266 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b8497c0fb0bbf2f1446b00bef0a71399d3e232ac5f781bd5d58c1f3050d523a\": container with ID starting with 4b8497c0fb0bbf2f1446b00bef0a71399d3e232ac5f781bd5d58c1f3050d523a not found: ID does not exist" containerID="4b8497c0fb0bbf2f1446b00bef0a71399d3e232ac5f781bd5d58c1f3050d523a" Mar 09 14:54:27 crc kubenswrapper[4722]: I0309 14:54:27.045305 4722 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b8497c0fb0bbf2f1446b00bef0a71399d3e232ac5f781bd5d58c1f3050d523a"} err="failed to get container status \"4b8497c0fb0bbf2f1446b00bef0a71399d3e232ac5f781bd5d58c1f3050d523a\": rpc error: code = NotFound desc = could not find container \"4b8497c0fb0bbf2f1446b00bef0a71399d3e232ac5f781bd5d58c1f3050d523a\": container with ID starting with 4b8497c0fb0bbf2f1446b00bef0a71399d3e232ac5f781bd5d58c1f3050d523a not found: ID does not exist" Mar 09 14:54:27 crc kubenswrapper[4722]: I0309 14:54:27.045333 4722 scope.go:117] "RemoveContainer" containerID="39503447920c3153bbd5c1d70e24cd85aff07eecdaa8108b771066a7e5bb8c9d" Mar 09 14:54:27 crc kubenswrapper[4722]: E0309 14:54:27.047045 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39503447920c3153bbd5c1d70e24cd85aff07eecdaa8108b771066a7e5bb8c9d\": container with ID starting with 39503447920c3153bbd5c1d70e24cd85aff07eecdaa8108b771066a7e5bb8c9d not found: ID does not exist" containerID="39503447920c3153bbd5c1d70e24cd85aff07eecdaa8108b771066a7e5bb8c9d" Mar 09 14:54:27 crc kubenswrapper[4722]: I0309 14:54:27.047085 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39503447920c3153bbd5c1d70e24cd85aff07eecdaa8108b771066a7e5bb8c9d"} err="failed to get container status \"39503447920c3153bbd5c1d70e24cd85aff07eecdaa8108b771066a7e5bb8c9d\": rpc error: code = NotFound desc = could not find container \"39503447920c3153bbd5c1d70e24cd85aff07eecdaa8108b771066a7e5bb8c9d\": container with ID starting with 39503447920c3153bbd5c1d70e24cd85aff07eecdaa8108b771066a7e5bb8c9d not found: ID does not exist" Mar 09 14:54:27 crc kubenswrapper[4722]: I0309 14:54:27.047110 4722 scope.go:117] "RemoveContainer" containerID="4ce37836b4da9dfd0f3dde19e5a799b8c30ff66f686836d5b1644c3259292e54" Mar 09 14:54:27 crc kubenswrapper[4722]: E0309 14:54:27.047575 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ce37836b4da9dfd0f3dde19e5a799b8c30ff66f686836d5b1644c3259292e54\": container with ID starting with 4ce37836b4da9dfd0f3dde19e5a799b8c30ff66f686836d5b1644c3259292e54 not found: ID does not exist" containerID="4ce37836b4da9dfd0f3dde19e5a799b8c30ff66f686836d5b1644c3259292e54" Mar 09 14:54:27 crc kubenswrapper[4722]: I0309 14:54:27.047598 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ce37836b4da9dfd0f3dde19e5a799b8c30ff66f686836d5b1644c3259292e54"} err="failed to get container status \"4ce37836b4da9dfd0f3dde19e5a799b8c30ff66f686836d5b1644c3259292e54\": rpc error: code = NotFound desc = could not find container \"4ce37836b4da9dfd0f3dde19e5a799b8c30ff66f686836d5b1644c3259292e54\": container with ID starting with 4ce37836b4da9dfd0f3dde19e5a799b8c30ff66f686836d5b1644c3259292e54 not found: ID does not exist" Mar 09 14:54:28 crc kubenswrapper[4722]: I0309 14:54:28.162562 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1ba0d92-32ea-463d-a0d6-910a9f9c9e37" path="/var/lib/kubelet/pods/e1ba0d92-32ea-463d-a0d6-910a9f9c9e37/volumes" Mar 09 14:54:28 crc kubenswrapper[4722]: I0309 14:54:28.646262 4722 scope.go:117] "RemoveContainer" containerID="b4ade2f4b120f2502bbad4f900fab6bf72bca83c8ebfd5d6b2463474b904a76c" Mar 09 14:54:37 crc kubenswrapper[4722]: I0309 14:54:37.149969 4722 scope.go:117] "RemoveContainer" 
containerID="e93d9830f7a284f5574f07e5fff41b2ae62d84fa8023d799598acc091cbd5a28" Mar 09 14:54:37 crc kubenswrapper[4722]: E0309 14:54:37.150986 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 14:54:51 crc kubenswrapper[4722]: I0309 14:54:51.149578 4722 scope.go:117] "RemoveContainer" containerID="e93d9830f7a284f5574f07e5fff41b2ae62d84fa8023d799598acc091cbd5a28" Mar 09 14:54:51 crc kubenswrapper[4722]: E0309 14:54:51.150417 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 14:55:04 crc kubenswrapper[4722]: I0309 14:55:04.200433 4722 scope.go:117] "RemoveContainer" containerID="e93d9830f7a284f5574f07e5fff41b2ae62d84fa8023d799598acc091cbd5a28" Mar 09 14:55:04 crc kubenswrapper[4722]: E0309 14:55:04.201403 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 14:55:08 crc kubenswrapper[4722]: I0309 14:55:08.417066 4722 generic.go:334] "Generic (PLEG): container finished" podID="6c87731f-8737-43d6-ba7a-e1427fc96fd4" containerID="1117034e2a5360dfc41fe5549446274eaec1bebec5f53bad035a16bf3ae24f9f" exitCode=0 Mar 09 14:55:08 crc kubenswrapper[4722]: I0309 14:55:08.417550 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-9sbwv" event={"ID":"6c87731f-8737-43d6-ba7a-e1427fc96fd4","Type":"ContainerDied","Data":"1117034e2a5360dfc41fe5549446274eaec1bebec5f53bad035a16bf3ae24f9f"} Mar 09 14:55:10 crc kubenswrapper[4722]: I0309 14:55:10.015684 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-9sbwv" Mar 09 14:55:10 crc kubenswrapper[4722]: I0309 14:55:10.105077 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/6c87731f-8737-43d6-ba7a-e1427fc96fd4-ceilometer-ipmi-config-data-1\") pod \"6c87731f-8737-43d6-ba7a-e1427fc96fd4\" (UID: \"6c87731f-8737-43d6-ba7a-e1427fc96fd4\") " Mar 09 14:55:10 crc kubenswrapper[4722]: I0309 14:55:10.105131 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6c87731f-8737-43d6-ba7a-e1427fc96fd4-ssh-key-openstack-edpm-ipam\") pod \"6c87731f-8737-43d6-ba7a-e1427fc96fd4\" (UID: \"6c87731f-8737-43d6-ba7a-e1427fc96fd4\") " Mar 09 14:55:10 crc kubenswrapper[4722]: I0309 14:55:10.105272 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qpcp\" (UniqueName: \"kubernetes.io/projected/6c87731f-8737-43d6-ba7a-e1427fc96fd4-kube-api-access-4qpcp\") pod \"6c87731f-8737-43d6-ba7a-e1427fc96fd4\" (UID: \"6c87731f-8737-43d6-ba7a-e1427fc96fd4\") " Mar 09 14:55:10 crc kubenswrapper[4722]: I0309 14:55:10.105404 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c87731f-8737-43d6-ba7a-e1427fc96fd4-telemetry-power-monitoring-combined-ca-bundle\") pod \"6c87731f-8737-43d6-ba7a-e1427fc96fd4\" (UID: \"6c87731f-8737-43d6-ba7a-e1427fc96fd4\") " Mar 09 14:55:10 crc kubenswrapper[4722]: I0309 14:55:10.105442 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/6c87731f-8737-43d6-ba7a-e1427fc96fd4-ceilometer-ipmi-config-data-0\") pod \"6c87731f-8737-43d6-ba7a-e1427fc96fd4\" (UID: \"6c87731f-8737-43d6-ba7a-e1427fc96fd4\") " Mar 09 14:55:10 crc kubenswrapper[4722]: I0309 14:55:10.105708 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/6c87731f-8737-43d6-ba7a-e1427fc96fd4-ceilometer-ipmi-config-data-2\") pod \"6c87731f-8737-43d6-ba7a-e1427fc96fd4\" (UID: \"6c87731f-8737-43d6-ba7a-e1427fc96fd4\") " Mar 09 14:55:10 crc kubenswrapper[4722]: I0309 14:55:10.105785 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6c87731f-8737-43d6-ba7a-e1427fc96fd4-inventory\") pod \"6c87731f-8737-43d6-ba7a-e1427fc96fd4\" (UID: \"6c87731f-8737-43d6-ba7a-e1427fc96fd4\") " Mar 09 14:55:10 crc kubenswrapper[4722]: I0309 14:55:10.129855 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c87731f-8737-43d6-ba7a-e1427fc96fd4-telemetry-power-monitoring-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-power-monitoring-combined-ca-bundle") pod "6c87731f-8737-43d6-ba7a-e1427fc96fd4" (UID: "6c87731f-8737-43d6-ba7a-e1427fc96fd4"). InnerVolumeSpecName "telemetry-power-monitoring-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:55:10 crc kubenswrapper[4722]: I0309 14:55:10.130145 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c87731f-8737-43d6-ba7a-e1427fc96fd4-kube-api-access-4qpcp" (OuterVolumeSpecName: "kube-api-access-4qpcp") pod "6c87731f-8737-43d6-ba7a-e1427fc96fd4" (UID: "6c87731f-8737-43d6-ba7a-e1427fc96fd4"). InnerVolumeSpecName "kube-api-access-4qpcp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:55:10 crc kubenswrapper[4722]: I0309 14:55:10.139453 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c87731f-8737-43d6-ba7a-e1427fc96fd4-ceilometer-ipmi-config-data-2" (OuterVolumeSpecName: "ceilometer-ipmi-config-data-2") pod "6c87731f-8737-43d6-ba7a-e1427fc96fd4" (UID: "6c87731f-8737-43d6-ba7a-e1427fc96fd4"). InnerVolumeSpecName "ceilometer-ipmi-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:55:10 crc kubenswrapper[4722]: I0309 14:55:10.140833 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c87731f-8737-43d6-ba7a-e1427fc96fd4-inventory" (OuterVolumeSpecName: "inventory") pod "6c87731f-8737-43d6-ba7a-e1427fc96fd4" (UID: "6c87731f-8737-43d6-ba7a-e1427fc96fd4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:55:10 crc kubenswrapper[4722]: I0309 14:55:10.146232 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c87731f-8737-43d6-ba7a-e1427fc96fd4-ceilometer-ipmi-config-data-0" (OuterVolumeSpecName: "ceilometer-ipmi-config-data-0") pod "6c87731f-8737-43d6-ba7a-e1427fc96fd4" (UID: "6c87731f-8737-43d6-ba7a-e1427fc96fd4"). InnerVolumeSpecName "ceilometer-ipmi-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:55:10 crc kubenswrapper[4722]: I0309 14:55:10.147061 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c87731f-8737-43d6-ba7a-e1427fc96fd4-ceilometer-ipmi-config-data-1" (OuterVolumeSpecName: "ceilometer-ipmi-config-data-1") pod "6c87731f-8737-43d6-ba7a-e1427fc96fd4" (UID: "6c87731f-8737-43d6-ba7a-e1427fc96fd4"). InnerVolumeSpecName "ceilometer-ipmi-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:55:10 crc kubenswrapper[4722]: I0309 14:55:10.171895 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c87731f-8737-43d6-ba7a-e1427fc96fd4-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "6c87731f-8737-43d6-ba7a-e1427fc96fd4" (UID: "6c87731f-8737-43d6-ba7a-e1427fc96fd4"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:55:10 crc kubenswrapper[4722]: I0309 14:55:10.211354 4722 reconciler_common.go:293] "Volume detached for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/6c87731f-8737-43d6-ba7a-e1427fc96fd4-ceilometer-ipmi-config-data-1\") on node \"crc\" DevicePath \"\"" Mar 09 14:55:10 crc kubenswrapper[4722]: I0309 14:55:10.211395 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6c87731f-8737-43d6-ba7a-e1427fc96fd4-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 14:55:10 crc kubenswrapper[4722]: I0309 14:55:10.211408 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4qpcp\" (UniqueName: \"kubernetes.io/projected/6c87731f-8737-43d6-ba7a-e1427fc96fd4-kube-api-access-4qpcp\") on node \"crc\" DevicePath \"\"" Mar 09 14:55:10 crc kubenswrapper[4722]: I0309 14:55:10.211424 4722 reconciler_common.go:293] "Volume detached for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c87731f-8737-43d6-ba7a-e1427fc96fd4-telemetry-power-monitoring-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 14:55:10 crc kubenswrapper[4722]: I0309 14:55:10.211438 4722 reconciler_common.go:293] "Volume detached for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/6c87731f-8737-43d6-ba7a-e1427fc96fd4-ceilometer-ipmi-config-data-0\") on node \"crc\" DevicePath \"\"" Mar 09 14:55:10 crc kubenswrapper[4722]: I0309 14:55:10.211454 4722 reconciler_common.go:293] "Volume detached for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/6c87731f-8737-43d6-ba7a-e1427fc96fd4-ceilometer-ipmi-config-data-2\") on node \"crc\" DevicePath \"\"" Mar 09 14:55:10 crc kubenswrapper[4722]: I0309 14:55:10.211468 4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6c87731f-8737-43d6-ba7a-e1427fc96fd4-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 14:55:10 crc kubenswrapper[4722]: I0309 14:55:10.449144 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-9sbwv" event={"ID":"6c87731f-8737-43d6-ba7a-e1427fc96fd4","Type":"ContainerDied","Data":"2b0e55c9df234920660aa368805bc28a72388377467e75426456056397e561b2"} Mar 09 14:55:10 crc kubenswrapper[4722]: I0309 14:55:10.449778 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b0e55c9df234920660aa368805bc28a72388377467e75426456056397e561b2" Mar 09 14:55:10 crc kubenswrapper[4722]: I0309 14:55:10.449483 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-9sbwv" Mar 09 14:55:10 crc kubenswrapper[4722]: I0309 14:55:10.580819 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-vv7zp"] Mar 09 14:55:10 crc kubenswrapper[4722]: E0309 14:55:10.581435 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1ba0d92-32ea-463d-a0d6-910a9f9c9e37" containerName="registry-server" Mar 09 14:55:10 crc kubenswrapper[4722]: I0309 14:55:10.581457 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1ba0d92-32ea-463d-a0d6-910a9f9c9e37" containerName="registry-server" Mar 09 14:55:10 crc kubenswrapper[4722]: E0309 14:55:10.581485 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1ba0d92-32ea-463d-a0d6-910a9f9c9e37" containerName="extract-content" Mar 09 14:55:10 crc kubenswrapper[4722]: I0309 14:55:10.581493 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1ba0d92-32ea-463d-a0d6-910a9f9c9e37" containerName="extract-content" Mar 09 14:55:10 crc kubenswrapper[4722]: E0309 14:55:10.581541 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1ba0d92-32ea-463d-a0d6-910a9f9c9e37" containerName="extract-utilities" Mar 09 14:55:10 crc kubenswrapper[4722]: I0309 14:55:10.581550 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1ba0d92-32ea-463d-a0d6-910a9f9c9e37" containerName="extract-utilities" Mar 09 14:55:10 crc kubenswrapper[4722]: E0309 14:55:10.581566 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c87731f-8737-43d6-ba7a-e1427fc96fd4" containerName="telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam" Mar 09 14:55:10 crc kubenswrapper[4722]: I0309 14:55:10.581575 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c87731f-8737-43d6-ba7a-e1427fc96fd4" containerName="telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam" Mar 09 14:55:10 crc kubenswrapper[4722]: I0309 14:55:10.581842 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c87731f-8737-43d6-ba7a-e1427fc96fd4" containerName="telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam" Mar 09 14:55:10 crc kubenswrapper[4722]: I0309 14:55:10.581884 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1ba0d92-32ea-463d-a0d6-910a9f9c9e37" containerName="registry-server" Mar 09 14:55:10 crc kubenswrapper[4722]: I0309 14:55:10.582865 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-vv7zp" Mar 09 14:55:10 crc kubenswrapper[4722]: I0309 14:55:10.585608 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 09 14:55:10 crc kubenswrapper[4722]: I0309 14:55:10.586709 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"logging-compute-config-data" Mar 09 14:55:10 crc kubenswrapper[4722]: I0309 14:55:10.586902 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 09 14:55:10 crc kubenswrapper[4722]: I0309 14:55:10.587116 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 09 14:55:10 crc kubenswrapper[4722]: I0309 14:55:10.590828 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-dwlbg" Mar 09 14:55:10 crc kubenswrapper[4722]: I0309 14:55:10.597359 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-vv7zp"] Mar 09 14:55:10 crc kubenswrapper[4722]: I0309 14:55:10.623400 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74rrx\" (UniqueName: \"kubernetes.io/projected/be833819-c229-4d0f-b489-a733e1b26a68-kube-api-access-74rrx\") pod \"logging-edpm-deployment-openstack-edpm-ipam-vv7zp\" (UID: \"be833819-c229-4d0f-b489-a733e1b26a68\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-vv7zp" Mar 09 14:55:10 crc kubenswrapper[4722]: I0309 14:55:10.623653 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/be833819-c229-4d0f-b489-a733e1b26a68-ssh-key-openstack-edpm-ipam\") pod \"logging-edpm-deployment-openstack-edpm-ipam-vv7zp\" (UID: \"be833819-c229-4d0f-b489-a733e1b26a68\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-vv7zp" Mar 09 14:55:10 crc kubenswrapper[4722]: I0309 14:55:10.623864 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/be833819-c229-4d0f-b489-a733e1b26a68-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-vv7zp\" (UID: \"be833819-c229-4d0f-b489-a733e1b26a68\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-vv7zp" Mar 09 14:55:10 crc kubenswrapper[4722]: I0309 14:55:10.624082 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/be833819-c229-4d0f-b489-a733e1b26a68-inventory\") pod \"logging-edpm-deployment-openstack-edpm-ipam-vv7zp\" (UID: \"be833819-c229-4d0f-b489-a733e1b26a68\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-vv7zp" Mar 09 14:55:10 crc kubenswrapper[4722]: I0309 14:55:10.624291 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/be833819-c229-4d0f-b489-a733e1b26a68-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-vv7zp\" (UID: \"be833819-c229-4d0f-b489-a733e1b26a68\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-vv7zp" Mar 09 14:55:10 crc kubenswrapper[4722]: I0309 
14:55:10.726177 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74rrx\" (UniqueName: \"kubernetes.io/projected/be833819-c229-4d0f-b489-a733e1b26a68-kube-api-access-74rrx\") pod \"logging-edpm-deployment-openstack-edpm-ipam-vv7zp\" (UID: \"be833819-c229-4d0f-b489-a733e1b26a68\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-vv7zp" Mar 09 14:55:10 crc kubenswrapper[4722]: I0309 14:55:10.726317 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/be833819-c229-4d0f-b489-a733e1b26a68-ssh-key-openstack-edpm-ipam\") pod \"logging-edpm-deployment-openstack-edpm-ipam-vv7zp\" (UID: \"be833819-c229-4d0f-b489-a733e1b26a68\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-vv7zp" Mar 09 14:55:10 crc kubenswrapper[4722]: I0309 14:55:10.726404 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/be833819-c229-4d0f-b489-a733e1b26a68-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-vv7zp\" (UID: \"be833819-c229-4d0f-b489-a733e1b26a68\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-vv7zp" Mar 09 14:55:10 crc kubenswrapper[4722]: I0309 14:55:10.726527 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/be833819-c229-4d0f-b489-a733e1b26a68-inventory\") pod \"logging-edpm-deployment-openstack-edpm-ipam-vv7zp\" (UID: \"be833819-c229-4d0f-b489-a733e1b26a68\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-vv7zp" Mar 09 14:55:10 crc kubenswrapper[4722]: I0309 14:55:10.726642 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/be833819-c229-4d0f-b489-a733e1b26a68-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-vv7zp\" (UID: \"be833819-c229-4d0f-b489-a733e1b26a68\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-vv7zp" Mar 09 14:55:10 crc kubenswrapper[4722]: I0309 14:55:10.731438 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/be833819-c229-4d0f-b489-a733e1b26a68-ssh-key-openstack-edpm-ipam\") pod \"logging-edpm-deployment-openstack-edpm-ipam-vv7zp\" (UID: \"be833819-c229-4d0f-b489-a733e1b26a68\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-vv7zp" Mar 09 14:55:10 crc kubenswrapper[4722]: I0309 14:55:10.731550 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/be833819-c229-4d0f-b489-a733e1b26a68-inventory\") pod \"logging-edpm-deployment-openstack-edpm-ipam-vv7zp\" (UID: \"be833819-c229-4d0f-b489-a733e1b26a68\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-vv7zp" Mar 09 14:55:10 crc kubenswrapper[4722]: I0309 14:55:10.731675 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/be833819-c229-4d0f-b489-a733e1b26a68-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-vv7zp\" (UID: \"be833819-c229-4d0f-b489-a733e1b26a68\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-vv7zp" Mar 09 14:55:10 crc kubenswrapper[4722]: I0309 14:55:10.732678 
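
The VerifyControllerAttachedVolume -> "MountVolume started" -> "MountVolume.SetUp succeeded" progression above is the kubelet volume manager reconciling desired state (the volumes the new pod's spec references) against actual state (what is mounted on the node). A minimal Go sketch of that desired-vs-actual loop; the names are invented for illustration and this is not the volumemanager API.

// reconciler_sketch.go — desired-vs-actual volume reconciliation, sketch only.
package main

import "fmt"

func reconcile(desired, mounted map[string]bool) {
	// Mount anything the pod spec wants that is not mounted yet.
	for vol := range desired {
		if !mounted[vol] {
			fmt.Printf("MountVolume started for volume %q\n", vol)
			mounted[vol] = true // stands in for the real SetUp call
			fmt.Printf("MountVolume.SetUp succeeded for volume %q\n", vol)
		}
	}
	// Unmount anything still mounted that no pod wants any more.
	for vol := range mounted {
		if !desired[vol] {
			fmt.Printf("UnmountVolume started for volume %q\n", vol)
			delete(mounted, vol)
		}
	}
}

func main() {
	desired := map[string]bool{"inventory": true, "ssh-key-openstack-edpm-ipam": true}
	mounted := map[string]bool{}
	reconcile(desired, mounted)           // pod added: mounts both volumes
	reconcile(map[string]bool{}, mounted) // pod deleted: unmounts both
}
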
Mar 09 14:55:10 crc kubenswrapper[4722]: I0309 14:55:10.732678 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/be833819-c229-4d0f-b489-a733e1b26a68-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-vv7zp\" (UID: \"be833819-c229-4d0f-b489-a733e1b26a68\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-vv7zp"
Mar 09 14:55:10 crc kubenswrapper[4722]: I0309 14:55:10.750975 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74rrx\" (UniqueName: \"kubernetes.io/projected/be833819-c229-4d0f-b489-a733e1b26a68-kube-api-access-74rrx\") pod \"logging-edpm-deployment-openstack-edpm-ipam-vv7zp\" (UID: \"be833819-c229-4d0f-b489-a733e1b26a68\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-vv7zp"
Mar 09 14:55:10 crc kubenswrapper[4722]: I0309 14:55:10.908780 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-vv7zp"
Mar 09 14:55:11 crc kubenswrapper[4722]: I0309 14:55:11.458950 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-vv7zp"]
Mar 09 14:55:12 crc kubenswrapper[4722]: I0309 14:55:12.495311 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-vv7zp" event={"ID":"be833819-c229-4d0f-b489-a733e1b26a68","Type":"ContainerStarted","Data":"47e2ebf5d0dfe2f0104992c13bf8d586a939da29bf075c819a2fdc97a3412603"}
Mar 09 14:55:12 crc kubenswrapper[4722]: I0309 14:55:12.495753 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-vv7zp" event={"ID":"be833819-c229-4d0f-b489-a733e1b26a68","Type":"ContainerStarted","Data":"c0007524cde567d97433b4b762dccc6ed8bb6b61187d5de796e2d7d4e61dd568"}
Mar 09 14:55:12 crc kubenswrapper[4722]: I0309 14:55:12.530138 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-vv7zp" podStartSLOduration=1.989344965 podStartE2EDuration="2.530114822s" podCreationTimestamp="2026-03-09 14:55:10 +0000 UTC" firstStartedPulling="2026-03-09 14:55:11.467668166 +0000 UTC m=+3152.023236762" lastFinishedPulling="2026-03-09 14:55:12.008438013 +0000 UTC m=+3152.564006619" observedRunningTime="2026-03-09 14:55:12.515813264 +0000 UTC m=+3153.071381860" watchObservedRunningTime="2026-03-09 14:55:12.530114822 +0000 UTC m=+3153.085683408"
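
The startup-latency record checks out by hand: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration appears to be that figure minus the image-pull window (lastFinishedPulling - firstStartedPulling). This is inferred from the numbers, which reproduce exactly:

// latency_sketch.go — reproduces the arithmetic of the
// pod_startup_latency_tracker record above, using the monotonic m= offsets.
package main

import "fmt"

func main() {
	const (
		firstStartedPulling = 3152.023236762 // m= offsets from the record, in seconds
		lastFinishedPulling = 3152.564006619
		podStartE2E         = 2.530114822
	)
	pull := lastFinishedPulling - firstStartedPulling
	fmt.Printf("image pull took %.9fs\n", pull)                   // 0.540769857s
	fmt.Printf("podStartSLOduration = %.9fs\n", podStartE2E-pull) // 1.989344965s, as logged
}
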
"SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-vv7zp" event={"ID":"be833819-c229-4d0f-b489-a733e1b26a68","Type":"ContainerDied","Data":"47e2ebf5d0dfe2f0104992c13bf8d586a939da29bf075c819a2fdc97a3412603"} Mar 09 14:55:30 crc kubenswrapper[4722]: I0309 14:55:30.232697 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-vv7zp" Mar 09 14:55:30 crc kubenswrapper[4722]: I0309 14:55:30.373641 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/be833819-c229-4d0f-b489-a733e1b26a68-ssh-key-openstack-edpm-ipam\") pod \"be833819-c229-4d0f-b489-a733e1b26a68\" (UID: \"be833819-c229-4d0f-b489-a733e1b26a68\") " Mar 09 14:55:30 crc kubenswrapper[4722]: I0309 14:55:30.374100 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/be833819-c229-4d0f-b489-a733e1b26a68-inventory\") pod \"be833819-c229-4d0f-b489-a733e1b26a68\" (UID: \"be833819-c229-4d0f-b489-a733e1b26a68\") " Mar 09 14:55:30 crc kubenswrapper[4722]: I0309 14:55:30.374327 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74rrx\" (UniqueName: \"kubernetes.io/projected/be833819-c229-4d0f-b489-a733e1b26a68-kube-api-access-74rrx\") pod \"be833819-c229-4d0f-b489-a733e1b26a68\" (UID: \"be833819-c229-4d0f-b489-a733e1b26a68\") " Mar 09 14:55:30 crc kubenswrapper[4722]: I0309 14:55:30.374557 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/be833819-c229-4d0f-b489-a733e1b26a68-logging-compute-config-data-0\") pod \"be833819-c229-4d0f-b489-a733e1b26a68\" (UID: \"be833819-c229-4d0f-b489-a733e1b26a68\") " Mar 09 14:55:30 crc kubenswrapper[4722]: I0309 14:55:30.374904 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/be833819-c229-4d0f-b489-a733e1b26a68-logging-compute-config-data-1\") pod \"be833819-c229-4d0f-b489-a733e1b26a68\" (UID: \"be833819-c229-4d0f-b489-a733e1b26a68\") " Mar 09 14:55:30 crc kubenswrapper[4722]: I0309 14:55:30.379414 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be833819-c229-4d0f-b489-a733e1b26a68-kube-api-access-74rrx" (OuterVolumeSpecName: "kube-api-access-74rrx") pod "be833819-c229-4d0f-b489-a733e1b26a68" (UID: "be833819-c229-4d0f-b489-a733e1b26a68"). InnerVolumeSpecName "kube-api-access-74rrx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:55:30 crc kubenswrapper[4722]: I0309 14:55:30.406246 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be833819-c229-4d0f-b489-a733e1b26a68-logging-compute-config-data-0" (OuterVolumeSpecName: "logging-compute-config-data-0") pod "be833819-c229-4d0f-b489-a733e1b26a68" (UID: "be833819-c229-4d0f-b489-a733e1b26a68"). InnerVolumeSpecName "logging-compute-config-data-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:55:30 crc kubenswrapper[4722]: I0309 14:55:30.409015 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be833819-c229-4d0f-b489-a733e1b26a68-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "be833819-c229-4d0f-b489-a733e1b26a68" (UID: "be833819-c229-4d0f-b489-a733e1b26a68"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:55:30 crc kubenswrapper[4722]: I0309 14:55:30.428140 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be833819-c229-4d0f-b489-a733e1b26a68-logging-compute-config-data-1" (OuterVolumeSpecName: "logging-compute-config-data-1") pod "be833819-c229-4d0f-b489-a733e1b26a68" (UID: "be833819-c229-4d0f-b489-a733e1b26a68"). InnerVolumeSpecName "logging-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:55:30 crc kubenswrapper[4722]: I0309 14:55:30.438355 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be833819-c229-4d0f-b489-a733e1b26a68-inventory" (OuterVolumeSpecName: "inventory") pod "be833819-c229-4d0f-b489-a733e1b26a68" (UID: "be833819-c229-4d0f-b489-a733e1b26a68"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 14:55:30 crc kubenswrapper[4722]: I0309 14:55:30.478129 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/be833819-c229-4d0f-b489-a733e1b26a68-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 09 14:55:30 crc kubenswrapper[4722]: I0309 14:55:30.478493 4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/be833819-c229-4d0f-b489-a733e1b26a68-inventory\") on node \"crc\" DevicePath \"\"" Mar 09 14:55:30 crc kubenswrapper[4722]: I0309 14:55:30.478503 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74rrx\" (UniqueName: \"kubernetes.io/projected/be833819-c229-4d0f-b489-a733e1b26a68-kube-api-access-74rrx\") on node \"crc\" DevicePath \"\"" Mar 09 14:55:30 crc kubenswrapper[4722]: I0309 14:55:30.478513 4722 reconciler_common.go:293] "Volume detached for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/be833819-c229-4d0f-b489-a733e1b26a68-logging-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Mar 09 14:55:30 crc kubenswrapper[4722]: I0309 14:55:30.478524 4722 reconciler_common.go:293] "Volume detached for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/be833819-c229-4d0f-b489-a733e1b26a68-logging-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Mar 09 14:55:30 crc kubenswrapper[4722]: I0309 14:55:30.761641 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-vv7zp" event={"ID":"be833819-c229-4d0f-b489-a733e1b26a68","Type":"ContainerDied","Data":"c0007524cde567d97433b4b762dccc6ed8bb6b61187d5de796e2d7d4e61dd568"} Mar 09 14:55:30 crc kubenswrapper[4722]: I0309 14:55:30.761694 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c0007524cde567d97433b4b762dccc6ed8bb6b61187d5de796e2d7d4e61dd568" Mar 09 14:55:30 crc kubenswrapper[4722]: I0309 14:55:30.762059 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-vv7zp" Mar 09 14:55:32 crc kubenswrapper[4722]: I0309 14:55:32.150316 4722 scope.go:117] "RemoveContainer" containerID="e93d9830f7a284f5574f07e5fff41b2ae62d84fa8023d799598acc091cbd5a28" Mar 09 14:55:32 crc kubenswrapper[4722]: E0309 14:55:32.150844 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 14:55:43 crc kubenswrapper[4722]: I0309 14:55:43.150043 4722 scope.go:117] "RemoveContainer" containerID="e93d9830f7a284f5574f07e5fff41b2ae62d84fa8023d799598acc091cbd5a28" Mar 09 14:55:43 crc kubenswrapper[4722]: E0309 14:55:43.150745 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 14:55:55 crc kubenswrapper[4722]: I0309 14:55:55.149127 4722 scope.go:117] "RemoveContainer" containerID="e93d9830f7a284f5574f07e5fff41b2ae62d84fa8023d799598acc091cbd5a28" Mar 09 14:55:55 crc kubenswrapper[4722]: E0309 14:55:55.150005 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 14:56:00 crc kubenswrapper[4722]: I0309 14:56:00.169263 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551136-2mrlb"] Mar 09 14:56:00 crc kubenswrapper[4722]: E0309 14:56:00.170425 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be833819-c229-4d0f-b489-a733e1b26a68" containerName="logging-edpm-deployment-openstack-edpm-ipam" Mar 09 14:56:00 crc kubenswrapper[4722]: I0309 14:56:00.170447 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="be833819-c229-4d0f-b489-a733e1b26a68" containerName="logging-edpm-deployment-openstack-edpm-ipam" Mar 09 14:56:00 crc kubenswrapper[4722]: I0309 14:56:00.170892 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="be833819-c229-4d0f-b489-a733e1b26a68" containerName="logging-edpm-deployment-openstack-edpm-ipam" Mar 09 14:56:00 crc kubenswrapper[4722]: I0309 14:56:00.172253 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551136-2mrlb" Mar 09 14:56:00 crc kubenswrapper[4722]: I0309 14:56:00.174359 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 14:56:00 crc kubenswrapper[4722]: I0309 14:56:00.174670 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 14:56:00 crc kubenswrapper[4722]: I0309 14:56:00.177608 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-dr2x6" Mar 09 14:56:00 crc kubenswrapper[4722]: I0309 14:56:00.181429 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551136-2mrlb"] Mar 09 14:56:00 crc kubenswrapper[4722]: I0309 14:56:00.234400 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qr9vk\" (UniqueName: \"kubernetes.io/projected/35324614-f6de-45e7-817e-2df79b732b87-kube-api-access-qr9vk\") pod \"auto-csr-approver-29551136-2mrlb\" (UID: \"35324614-f6de-45e7-817e-2df79b732b87\") " pod="openshift-infra/auto-csr-approver-29551136-2mrlb" Mar 09 14:56:00 crc kubenswrapper[4722]: I0309 14:56:00.336634 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qr9vk\" (UniqueName: \"kubernetes.io/projected/35324614-f6de-45e7-817e-2df79b732b87-kube-api-access-qr9vk\") pod \"auto-csr-approver-29551136-2mrlb\" (UID: \"35324614-f6de-45e7-817e-2df79b732b87\") " pod="openshift-infra/auto-csr-approver-29551136-2mrlb" Mar 09 14:56:00 crc kubenswrapper[4722]: I0309 14:56:00.363559 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qr9vk\" (UniqueName: \"kubernetes.io/projected/35324614-f6de-45e7-817e-2df79b732b87-kube-api-access-qr9vk\") pod \"auto-csr-approver-29551136-2mrlb\" (UID: \"35324614-f6de-45e7-817e-2df79b732b87\") " pod="openshift-infra/auto-csr-approver-29551136-2mrlb" Mar 09 14:56:00 crc kubenswrapper[4722]: I0309 14:56:00.496755 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551136-2mrlb" Mar 09 14:56:01 crc kubenswrapper[4722]: I0309 14:56:01.014843 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551136-2mrlb"] Mar 09 14:56:01 crc kubenswrapper[4722]: I0309 14:56:01.124274 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551136-2mrlb" event={"ID":"35324614-f6de-45e7-817e-2df79b732b87","Type":"ContainerStarted","Data":"11fbe82fa56360273b243049fab6b94a642a69dcf0a57bf1a4186c5917cd37e7"} Mar 09 14:56:03 crc kubenswrapper[4722]: I0309 14:56:03.144502 4722 generic.go:334] "Generic (PLEG): container finished" podID="35324614-f6de-45e7-817e-2df79b732b87" containerID="56a727c2c234062d5084ac4bf17b8cadc5b1f074f388fe0c4ce72cf5e063078e" exitCode=0 Mar 09 14:56:03 crc kubenswrapper[4722]: I0309 14:56:03.144811 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551136-2mrlb" event={"ID":"35324614-f6de-45e7-817e-2df79b732b87","Type":"ContainerDied","Data":"56a727c2c234062d5084ac4bf17b8cadc5b1f074f388fe0c4ce72cf5e063078e"} Mar 09 14:56:04 crc kubenswrapper[4722]: I0309 14:56:04.559866 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551136-2mrlb" Mar 09 14:56:04 crc kubenswrapper[4722]: I0309 14:56:04.666007 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qr9vk\" (UniqueName: \"kubernetes.io/projected/35324614-f6de-45e7-817e-2df79b732b87-kube-api-access-qr9vk\") pod \"35324614-f6de-45e7-817e-2df79b732b87\" (UID: \"35324614-f6de-45e7-817e-2df79b732b87\") " Mar 09 14:56:04 crc kubenswrapper[4722]: I0309 14:56:04.673990 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35324614-f6de-45e7-817e-2df79b732b87-kube-api-access-qr9vk" (OuterVolumeSpecName: "kube-api-access-qr9vk") pod "35324614-f6de-45e7-817e-2df79b732b87" (UID: "35324614-f6de-45e7-817e-2df79b732b87"). InnerVolumeSpecName "kube-api-access-qr9vk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:56:04 crc kubenswrapper[4722]: I0309 14:56:04.769262 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qr9vk\" (UniqueName: \"kubernetes.io/projected/35324614-f6de-45e7-817e-2df79b732b87-kube-api-access-qr9vk\") on node \"crc\" DevicePath \"\"" Mar 09 14:56:05 crc kubenswrapper[4722]: I0309 14:56:05.169599 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551136-2mrlb" event={"ID":"35324614-f6de-45e7-817e-2df79b732b87","Type":"ContainerDied","Data":"11fbe82fa56360273b243049fab6b94a642a69dcf0a57bf1a4186c5917cd37e7"} Mar 09 14:56:05 crc kubenswrapper[4722]: I0309 14:56:05.169911 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11fbe82fa56360273b243049fab6b94a642a69dcf0a57bf1a4186c5917cd37e7" Mar 09 14:56:05 crc kubenswrapper[4722]: I0309 14:56:05.169699 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551136-2mrlb" Mar 09 14:56:05 crc kubenswrapper[4722]: I0309 14:56:05.643238 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551130-kvkpt"] Mar 09 14:56:05 crc kubenswrapper[4722]: I0309 14:56:05.654433 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551130-kvkpt"] Mar 09 14:56:06 crc kubenswrapper[4722]: I0309 14:56:06.168941 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b65eeff-ebce-4cd9-bd1f-d7ede07a1e53" path="/var/lib/kubelet/pods/0b65eeff-ebce-4cd9-bd1f-d7ede07a1e53/volumes" Mar 09 14:56:08 crc kubenswrapper[4722]: I0309 14:56:08.149479 4722 scope.go:117] "RemoveContainer" containerID="e93d9830f7a284f5574f07e5fff41b2ae62d84fa8023d799598acc091cbd5a28" Mar 09 14:56:08 crc kubenswrapper[4722]: E0309 14:56:08.150432 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 14:56:20 crc kubenswrapper[4722]: I0309 14:56:20.163828 4722 scope.go:117] "RemoveContainer" containerID="e93d9830f7a284f5574f07e5fff41b2ae62d84fa8023d799598acc091cbd5a28" Mar 09 14:56:20 crc kubenswrapper[4722]: E0309 14:56:20.164928 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 14:56:28 crc kubenswrapper[4722]: I0309 14:56:28.797533 4722 scope.go:117] "RemoveContainer" containerID="0f9a1828d93c00b7f8016a0df10a321083cac201e49d8f70c90d7c9e7fc8a675" Mar 09 14:56:34 crc kubenswrapper[4722]: I0309 14:56:34.149784 4722 scope.go:117] "RemoveContainer" containerID="e93d9830f7a284f5574f07e5fff41b2ae62d84fa8023d799598acc091cbd5a28" Mar 09 14:56:34 crc kubenswrapper[4722]: E0309 14:56:34.150683 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 14:56:48 crc kubenswrapper[4722]: I0309 14:56:48.150953 4722 scope.go:117] "RemoveContainer" containerID="e93d9830f7a284f5574f07e5fff41b2ae62d84fa8023d799598acc091cbd5a28" Mar 09 14:56:48 crc kubenswrapper[4722]: E0309 14:56:48.151950 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 
14:57:00 crc kubenswrapper[4722]: I0309 14:57:00.157478 4722 scope.go:117] "RemoveContainer" containerID="e93d9830f7a284f5574f07e5fff41b2ae62d84fa8023d799598acc091cbd5a28" Mar 09 14:57:00 crc kubenswrapper[4722]: E0309 14:57:00.158224 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 14:57:13 crc kubenswrapper[4722]: I0309 14:57:13.149952 4722 scope.go:117] "RemoveContainer" containerID="e93d9830f7a284f5574f07e5fff41b2ae62d84fa8023d799598acc091cbd5a28" Mar 09 14:57:13 crc kubenswrapper[4722]: E0309 14:57:13.151002 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 14:57:27 crc kubenswrapper[4722]: I0309 14:57:27.150801 4722 scope.go:117] "RemoveContainer" containerID="e93d9830f7a284f5574f07e5fff41b2ae62d84fa8023d799598acc091cbd5a28" Mar 09 14:57:27 crc kubenswrapper[4722]: E0309 14:57:27.152404 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 14:57:40 crc kubenswrapper[4722]: I0309 14:57:40.161275 4722 scope.go:117] "RemoveContainer" containerID="e93d9830f7a284f5574f07e5fff41b2ae62d84fa8023d799598acc091cbd5a28" Mar 09 14:57:40 crc kubenswrapper[4722]: E0309 14:57:40.162426 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 14:57:54 crc kubenswrapper[4722]: I0309 14:57:54.150432 4722 scope.go:117] "RemoveContainer" containerID="e93d9830f7a284f5574f07e5fff41b2ae62d84fa8023d799598acc091cbd5a28" Mar 09 14:57:54 crc kubenswrapper[4722]: I0309 14:57:54.636995 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" event={"ID":"dac2aaf5-653b-4b2a-8efe-ed26bac8d648","Type":"ContainerStarted","Data":"3368343976f6a5d643fb9637cdcd4c447f53ede4dfe3a3fd016450bf969037e2"} Mar 09 14:58:00 crc kubenswrapper[4722]: I0309 14:58:00.180895 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551138-25c88"] Mar 09 14:58:00 crc kubenswrapper[4722]: E0309 14:58:00.181944 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35324614-f6de-45e7-817e-2df79b732b87" 
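
From 14:55:19 through 14:57:40, every periodic pod resync re-attempts StartContainer for machine-config-daemon and is skipped because the container is still inside its crash back-off window; the restart finally lands at 14:57:54, about five minutes after the back-off was first reported. A sketch of the doubling-with-cap rule implied by "back-off 5m0s" (the 10s initial delay and 5m cap are kubelet defaults; this is a sketch, not kubelet source):

// backoff_sketch.go — crash-loop back-off: double the delay per crash, cap at 5m.
package main

import (
	"fmt"
	"time"
)

func main() {
	const maxDelay = 5 * time.Minute
	delay := 10 * time.Second
	for crash := 1; crash <= 7; crash++ {
		fmt.Printf("crash #%d: next restart attempt in %v\n", crash, delay)
		delay *= 2
		if delay > maxDelay {
			delay = maxDelay
		}
	}
	// From crash #6 on the delay pins at 5m0s; every resync inside that window
	// logs an "Error syncing pod, skipping ... CrashLoopBackOff" line as above.
}
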
containerName="oc" Mar 09 14:58:00 crc kubenswrapper[4722]: I0309 14:58:00.181961 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="35324614-f6de-45e7-817e-2df79b732b87" containerName="oc" Mar 09 14:58:00 crc kubenswrapper[4722]: I0309 14:58:00.182341 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="35324614-f6de-45e7-817e-2df79b732b87" containerName="oc" Mar 09 14:58:00 crc kubenswrapper[4722]: I0309 14:58:00.183360 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551138-25c88" Mar 09 14:58:00 crc kubenswrapper[4722]: I0309 14:58:00.187177 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 14:58:00 crc kubenswrapper[4722]: I0309 14:58:00.187467 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 14:58:00 crc kubenswrapper[4722]: I0309 14:58:00.187637 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-dr2x6" Mar 09 14:58:00 crc kubenswrapper[4722]: I0309 14:58:00.193633 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551138-25c88"] Mar 09 14:58:00 crc kubenswrapper[4722]: I0309 14:58:00.353308 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbfwl\" (UniqueName: \"kubernetes.io/projected/d8319fe6-90bd-445e-98c1-0b4b5b0e5888-kube-api-access-kbfwl\") pod \"auto-csr-approver-29551138-25c88\" (UID: \"d8319fe6-90bd-445e-98c1-0b4b5b0e5888\") " pod="openshift-infra/auto-csr-approver-29551138-25c88" Mar 09 14:58:00 crc kubenswrapper[4722]: I0309 14:58:00.455630 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbfwl\" (UniqueName: \"kubernetes.io/projected/d8319fe6-90bd-445e-98c1-0b4b5b0e5888-kube-api-access-kbfwl\") pod \"auto-csr-approver-29551138-25c88\" (UID: \"d8319fe6-90bd-445e-98c1-0b4b5b0e5888\") " pod="openshift-infra/auto-csr-approver-29551138-25c88" Mar 09 14:58:00 crc kubenswrapper[4722]: I0309 14:58:00.479905 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbfwl\" (UniqueName: \"kubernetes.io/projected/d8319fe6-90bd-445e-98c1-0b4b5b0e5888-kube-api-access-kbfwl\") pod \"auto-csr-approver-29551138-25c88\" (UID: \"d8319fe6-90bd-445e-98c1-0b4b5b0e5888\") " pod="openshift-infra/auto-csr-approver-29551138-25c88" Mar 09 14:58:00 crc kubenswrapper[4722]: I0309 14:58:00.526242 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551138-25c88" Mar 09 14:58:01 crc kubenswrapper[4722]: I0309 14:58:01.080462 4722 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 14:58:01 crc kubenswrapper[4722]: I0309 14:58:01.086154 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551138-25c88"] Mar 09 14:58:01 crc kubenswrapper[4722]: I0309 14:58:01.731769 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551138-25c88" event={"ID":"d8319fe6-90bd-445e-98c1-0b4b5b0e5888","Type":"ContainerStarted","Data":"88275b6ae28e23f1a61bdfe37247d9417340daceb8d1bba693b7d4326759dd6a"} Mar 09 14:58:02 crc kubenswrapper[4722]: I0309 14:58:02.746335 4722 generic.go:334] "Generic (PLEG): container finished" podID="d8319fe6-90bd-445e-98c1-0b4b5b0e5888" containerID="210edfc086b973d8e86c1b119dd75b6cf900542ae5a5b9792d6f2787cca5c2d3" exitCode=0 Mar 09 14:58:02 crc kubenswrapper[4722]: I0309 14:58:02.746416 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551138-25c88" event={"ID":"d8319fe6-90bd-445e-98c1-0b4b5b0e5888","Type":"ContainerDied","Data":"210edfc086b973d8e86c1b119dd75b6cf900542ae5a5b9792d6f2787cca5c2d3"} Mar 09 14:58:04 crc kubenswrapper[4722]: I0309 14:58:04.199666 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551138-25c88" Mar 09 14:58:04 crc kubenswrapper[4722]: I0309 14:58:04.356348 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbfwl\" (UniqueName: \"kubernetes.io/projected/d8319fe6-90bd-445e-98c1-0b4b5b0e5888-kube-api-access-kbfwl\") pod \"d8319fe6-90bd-445e-98c1-0b4b5b0e5888\" (UID: \"d8319fe6-90bd-445e-98c1-0b4b5b0e5888\") " Mar 09 14:58:04 crc kubenswrapper[4722]: I0309 14:58:04.367581 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8319fe6-90bd-445e-98c1-0b4b5b0e5888-kube-api-access-kbfwl" (OuterVolumeSpecName: "kube-api-access-kbfwl") pod "d8319fe6-90bd-445e-98c1-0b4b5b0e5888" (UID: "d8319fe6-90bd-445e-98c1-0b4b5b0e5888"). InnerVolumeSpecName "kube-api-access-kbfwl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 14:58:04 crc kubenswrapper[4722]: I0309 14:58:04.459551 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kbfwl\" (UniqueName: \"kubernetes.io/projected/d8319fe6-90bd-445e-98c1-0b4b5b0e5888-kube-api-access-kbfwl\") on node \"crc\" DevicePath \"\"" Mar 09 14:58:04 crc kubenswrapper[4722]: I0309 14:58:04.769378 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551138-25c88" event={"ID":"d8319fe6-90bd-445e-98c1-0b4b5b0e5888","Type":"ContainerDied","Data":"88275b6ae28e23f1a61bdfe37247d9417340daceb8d1bba693b7d4326759dd6a"} Mar 09 14:58:04 crc kubenswrapper[4722]: I0309 14:58:04.769703 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="88275b6ae28e23f1a61bdfe37247d9417340daceb8d1bba693b7d4326759dd6a" Mar 09 14:58:04 crc kubenswrapper[4722]: I0309 14:58:04.769511 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551138-25c88" Mar 09 14:58:05 crc kubenswrapper[4722]: I0309 14:58:05.278818 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551132-8shq7"] Mar 09 14:58:05 crc kubenswrapper[4722]: I0309 14:58:05.291526 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551132-8shq7"] Mar 09 14:58:06 crc kubenswrapper[4722]: I0309 14:58:06.165594 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e304347-ef41-4512-be35-420e7cb2a398" path="/var/lib/kubelet/pods/1e304347-ef41-4512-be35-420e7cb2a398/volumes" Mar 09 14:58:28 crc kubenswrapper[4722]: I0309 14:58:28.918161 4722 scope.go:117] "RemoveContainer" containerID="5b5df56e2c0e7d3aa67f0b6fa0e003691d28f5bf74f4a326692d95347a4613e7" Mar 09 14:59:43 crc kubenswrapper[4722]: I0309 14:59:43.083614 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-t98zg"] Mar 09 14:59:43 crc kubenswrapper[4722]: E0309 14:59:43.089103 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8319fe6-90bd-445e-98c1-0b4b5b0e5888" containerName="oc" Mar 09 14:59:43 crc kubenswrapper[4722]: I0309 14:59:43.089643 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8319fe6-90bd-445e-98c1-0b4b5b0e5888" containerName="oc" Mar 09 14:59:43 crc kubenswrapper[4722]: I0309 14:59:43.089973 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8319fe6-90bd-445e-98c1-0b4b5b0e5888" containerName="oc" Mar 09 14:59:43 crc kubenswrapper[4722]: I0309 14:59:43.091844 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t98zg" Mar 09 14:59:43 crc kubenswrapper[4722]: I0309 14:59:43.179700 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-t98zg"] Mar 09 14:59:43 crc kubenswrapper[4722]: I0309 14:59:43.243408 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea0036f4-ef02-4deb-9740-172a36372998-catalog-content\") pod \"certified-operators-t98zg\" (UID: \"ea0036f4-ef02-4deb-9740-172a36372998\") " pod="openshift-marketplace/certified-operators-t98zg" Mar 09 14:59:43 crc kubenswrapper[4722]: I0309 14:59:43.243556 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dqhd\" (UniqueName: \"kubernetes.io/projected/ea0036f4-ef02-4deb-9740-172a36372998-kube-api-access-9dqhd\") pod \"certified-operators-t98zg\" (UID: \"ea0036f4-ef02-4deb-9740-172a36372998\") " pod="openshift-marketplace/certified-operators-t98zg" Mar 09 14:59:43 crc kubenswrapper[4722]: I0309 14:59:43.243597 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea0036f4-ef02-4deb-9740-172a36372998-utilities\") pod \"certified-operators-t98zg\" (UID: \"ea0036f4-ef02-4deb-9740-172a36372998\") " pod="openshift-marketplace/certified-operators-t98zg" Mar 09 14:59:43 crc kubenswrapper[4722]: I0309 14:59:43.345604 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea0036f4-ef02-4deb-9740-172a36372998-catalog-content\") pod \"certified-operators-t98zg\" (UID: \"ea0036f4-ef02-4deb-9740-172a36372998\") " 
pod="openshift-marketplace/certified-operators-t98zg" Mar 09 14:59:43 crc kubenswrapper[4722]: I0309 14:59:43.345708 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dqhd\" (UniqueName: \"kubernetes.io/projected/ea0036f4-ef02-4deb-9740-172a36372998-kube-api-access-9dqhd\") pod \"certified-operators-t98zg\" (UID: \"ea0036f4-ef02-4deb-9740-172a36372998\") " pod="openshift-marketplace/certified-operators-t98zg" Mar 09 14:59:43 crc kubenswrapper[4722]: I0309 14:59:43.345756 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea0036f4-ef02-4deb-9740-172a36372998-utilities\") pod \"certified-operators-t98zg\" (UID: \"ea0036f4-ef02-4deb-9740-172a36372998\") " pod="openshift-marketplace/certified-operators-t98zg" Mar 09 14:59:43 crc kubenswrapper[4722]: I0309 14:59:43.346128 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea0036f4-ef02-4deb-9740-172a36372998-catalog-content\") pod \"certified-operators-t98zg\" (UID: \"ea0036f4-ef02-4deb-9740-172a36372998\") " pod="openshift-marketplace/certified-operators-t98zg" Mar 09 14:59:43 crc kubenswrapper[4722]: I0309 14:59:43.346264 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea0036f4-ef02-4deb-9740-172a36372998-utilities\") pod \"certified-operators-t98zg\" (UID: \"ea0036f4-ef02-4deb-9740-172a36372998\") " pod="openshift-marketplace/certified-operators-t98zg" Mar 09 14:59:43 crc kubenswrapper[4722]: I0309 14:59:43.367622 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dqhd\" (UniqueName: \"kubernetes.io/projected/ea0036f4-ef02-4deb-9740-172a36372998-kube-api-access-9dqhd\") pod \"certified-operators-t98zg\" (UID: \"ea0036f4-ef02-4deb-9740-172a36372998\") " pod="openshift-marketplace/certified-operators-t98zg" Mar 09 14:59:43 crc kubenswrapper[4722]: I0309 14:59:43.410883 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-t98zg" Mar 09 14:59:44 crc kubenswrapper[4722]: I0309 14:59:44.043405 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-t98zg"] Mar 09 14:59:44 crc kubenswrapper[4722]: I0309 14:59:44.943819 4722 generic.go:334] "Generic (PLEG): container finished" podID="ea0036f4-ef02-4deb-9740-172a36372998" containerID="3510dec19e69f8515ec34d9d5579a57865ee9e15295e705c0426290bc81475a8" exitCode=0 Mar 09 14:59:44 crc kubenswrapper[4722]: I0309 14:59:44.943867 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t98zg" event={"ID":"ea0036f4-ef02-4deb-9740-172a36372998","Type":"ContainerDied","Data":"3510dec19e69f8515ec34d9d5579a57865ee9e15295e705c0426290bc81475a8"} Mar 09 14:59:44 crc kubenswrapper[4722]: I0309 14:59:44.944114 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t98zg" event={"ID":"ea0036f4-ef02-4deb-9740-172a36372998","Type":"ContainerStarted","Data":"cd2d39d185599ee8c563ff055f2f39c08f15e21cca499417471355b34c9a3578"} Mar 09 14:59:46 crc kubenswrapper[4722]: I0309 14:59:46.967753 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t98zg" event={"ID":"ea0036f4-ef02-4deb-9740-172a36372998","Type":"ContainerStarted","Data":"713724574f5201ee92829239ffa99ffefa5681c0de1004e445966beb59c3a7d1"} Mar 09 14:59:48 crc kubenswrapper[4722]: I0309 14:59:48.992646 4722 generic.go:334] "Generic (PLEG): container finished" podID="ea0036f4-ef02-4deb-9740-172a36372998" containerID="713724574f5201ee92829239ffa99ffefa5681c0de1004e445966beb59c3a7d1" exitCode=0 Mar 09 14:59:48 crc kubenswrapper[4722]: I0309 14:59:48.992742 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t98zg" event={"ID":"ea0036f4-ef02-4deb-9740-172a36372998","Type":"ContainerDied","Data":"713724574f5201ee92829239ffa99ffefa5681c0de1004e445966beb59c3a7d1"} Mar 09 14:59:50 crc kubenswrapper[4722]: I0309 14:59:50.020034 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t98zg" event={"ID":"ea0036f4-ef02-4deb-9740-172a36372998","Type":"ContainerStarted","Data":"15723a47e6c5358a0f29fd7a8b7d05c40b0b0847d3b14717d130c5e06a9afae3"} Mar 09 14:59:50 crc kubenswrapper[4722]: I0309 14:59:50.042813 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-t98zg" podStartSLOduration=2.317792274 podStartE2EDuration="7.042791305s" podCreationTimestamp="2026-03-09 14:59:43 +0000 UTC" firstStartedPulling="2026-03-09 14:59:44.945918855 +0000 UTC m=+3425.501487431" lastFinishedPulling="2026-03-09 14:59:49.670917886 +0000 UTC m=+3430.226486462" observedRunningTime="2026-03-09 14:59:50.038878499 +0000 UTC m=+3430.594447085" watchObservedRunningTime="2026-03-09 14:59:50.042791305 +0000 UTC m=+3430.598359891" Mar 09 14:59:53 crc kubenswrapper[4722]: I0309 14:59:53.411793 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-t98zg" Mar 09 14:59:53 crc kubenswrapper[4722]: I0309 14:59:53.412404 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-t98zg" Mar 09 14:59:54 crc kubenswrapper[4722]: I0309 14:59:54.464069 4722 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/certified-operators-t98zg" podUID="ea0036f4-ef02-4deb-9740-172a36372998" containerName="registry-server" probeResult="failure" output=< Mar 09 14:59:54 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s Mar 09 14:59:54 crc kubenswrapper[4722]: > Mar 09 15:00:00 crc kubenswrapper[4722]: I0309 15:00:00.163364 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551140-sgbwp"] Mar 09 15:00:00 crc kubenswrapper[4722]: I0309 15:00:00.166138 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551140-hm98x"] Mar 09 15:00:00 crc kubenswrapper[4722]: I0309 15:00:00.167012 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551140-sgbwp" Mar 09 15:00:00 crc kubenswrapper[4722]: I0309 15:00:00.167902 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551140-hm98x" Mar 09 15:00:00 crc kubenswrapper[4722]: I0309 15:00:00.169774 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 15:00:00 crc kubenswrapper[4722]: I0309 15:00:00.170662 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 09 15:00:00 crc kubenswrapper[4722]: I0309 15:00:00.171009 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 09 15:00:00 crc kubenswrapper[4722]: I0309 15:00:00.171257 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 15:00:00 crc kubenswrapper[4722]: I0309 15:00:00.171590 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-dr2x6" Mar 09 15:00:00 crc kubenswrapper[4722]: I0309 15:00:00.174639 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551140-sgbwp"] Mar 09 15:00:00 crc kubenswrapper[4722]: I0309 15:00:00.198020 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551140-hm98x"] Mar 09 15:00:00 crc kubenswrapper[4722]: I0309 15:00:00.267471 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b1a536c7-a5a3-4082-9493-692bdc2bd660-secret-volume\") pod \"collect-profiles-29551140-hm98x\" (UID: \"b1a536c7-a5a3-4082-9493-692bdc2bd660\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551140-hm98x" Mar 09 15:00:00 crc kubenswrapper[4722]: I0309 15:00:00.267660 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjqpm\" (UniqueName: \"kubernetes.io/projected/c493907c-5318-4159-930b-2c5108e7b026-kube-api-access-rjqpm\") pod \"auto-csr-approver-29551140-sgbwp\" (UID: \"c493907c-5318-4159-930b-2c5108e7b026\") " pod="openshift-infra/auto-csr-approver-29551140-sgbwp" Mar 09 15:00:00 crc kubenswrapper[4722]: I0309 15:00:00.267779 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b1a536c7-a5a3-4082-9493-692bdc2bd660-config-volume\") pod 
\"collect-profiles-29551140-hm98x\" (UID: \"b1a536c7-a5a3-4082-9493-692bdc2bd660\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551140-hm98x" Mar 09 15:00:00 crc kubenswrapper[4722]: I0309 15:00:00.267872 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8f6m\" (UniqueName: \"kubernetes.io/projected/b1a536c7-a5a3-4082-9493-692bdc2bd660-kube-api-access-j8f6m\") pod \"collect-profiles-29551140-hm98x\" (UID: \"b1a536c7-a5a3-4082-9493-692bdc2bd660\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551140-hm98x" Mar 09 15:00:00 crc kubenswrapper[4722]: I0309 15:00:00.370433 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b1a536c7-a5a3-4082-9493-692bdc2bd660-config-volume\") pod \"collect-profiles-29551140-hm98x\" (UID: \"b1a536c7-a5a3-4082-9493-692bdc2bd660\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551140-hm98x" Mar 09 15:00:00 crc kubenswrapper[4722]: I0309 15:00:00.370524 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8f6m\" (UniqueName: \"kubernetes.io/projected/b1a536c7-a5a3-4082-9493-692bdc2bd660-kube-api-access-j8f6m\") pod \"collect-profiles-29551140-hm98x\" (UID: \"b1a536c7-a5a3-4082-9493-692bdc2bd660\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551140-hm98x" Mar 09 15:00:00 crc kubenswrapper[4722]: I0309 15:00:00.370600 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b1a536c7-a5a3-4082-9493-692bdc2bd660-secret-volume\") pod \"collect-profiles-29551140-hm98x\" (UID: \"b1a536c7-a5a3-4082-9493-692bdc2bd660\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551140-hm98x" Mar 09 15:00:00 crc kubenswrapper[4722]: I0309 15:00:00.370670 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjqpm\" (UniqueName: \"kubernetes.io/projected/c493907c-5318-4159-930b-2c5108e7b026-kube-api-access-rjqpm\") pod \"auto-csr-approver-29551140-sgbwp\" (UID: \"c493907c-5318-4159-930b-2c5108e7b026\") " pod="openshift-infra/auto-csr-approver-29551140-sgbwp" Mar 09 15:00:00 crc kubenswrapper[4722]: I0309 15:00:00.371399 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b1a536c7-a5a3-4082-9493-692bdc2bd660-config-volume\") pod \"collect-profiles-29551140-hm98x\" (UID: \"b1a536c7-a5a3-4082-9493-692bdc2bd660\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551140-hm98x" Mar 09 15:00:00 crc kubenswrapper[4722]: I0309 15:00:00.377817 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b1a536c7-a5a3-4082-9493-692bdc2bd660-secret-volume\") pod \"collect-profiles-29551140-hm98x\" (UID: \"b1a536c7-a5a3-4082-9493-692bdc2bd660\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551140-hm98x" Mar 09 15:00:00 crc kubenswrapper[4722]: I0309 15:00:00.392309 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8f6m\" (UniqueName: \"kubernetes.io/projected/b1a536c7-a5a3-4082-9493-692bdc2bd660-kube-api-access-j8f6m\") pod \"collect-profiles-29551140-hm98x\" (UID: \"b1a536c7-a5a3-4082-9493-692bdc2bd660\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551140-hm98x" 
Mar 09 15:00:00 crc kubenswrapper[4722]: I0309 15:00:00.395297 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjqpm\" (UniqueName: \"kubernetes.io/projected/c493907c-5318-4159-930b-2c5108e7b026-kube-api-access-rjqpm\") pod \"auto-csr-approver-29551140-sgbwp\" (UID: \"c493907c-5318-4159-930b-2c5108e7b026\") " pod="openshift-infra/auto-csr-approver-29551140-sgbwp" Mar 09 15:00:00 crc kubenswrapper[4722]: I0309 15:00:00.504767 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551140-sgbwp" Mar 09 15:00:00 crc kubenswrapper[4722]: I0309 15:00:00.514159 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551140-hm98x" Mar 09 15:00:01 crc kubenswrapper[4722]: I0309 15:00:01.313133 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551140-hm98x"] Mar 09 15:00:01 crc kubenswrapper[4722]: I0309 15:00:01.330881 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551140-sgbwp"] Mar 09 15:00:01 crc kubenswrapper[4722]: W0309 15:00:01.334864 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1a536c7_a5a3_4082_9493_692bdc2bd660.slice/crio-59712a3ab41f1f33ab76e518ca079b01b91f98c42e763e1dfafc3a4f031d6776 WatchSource:0}: Error finding container 59712a3ab41f1f33ab76e518ca079b01b91f98c42e763e1dfafc3a4f031d6776: Status 404 returned error can't find the container with id 59712a3ab41f1f33ab76e518ca079b01b91f98c42e763e1dfafc3a4f031d6776 Mar 09 15:00:01 crc kubenswrapper[4722]: W0309 15:00:01.338514 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc493907c_5318_4159_930b_2c5108e7b026.slice/crio-57e2e8154250b169ac195e2a006f46772945775e600aefd58458620e952d0675 WatchSource:0}: Error finding container 57e2e8154250b169ac195e2a006f46772945775e600aefd58458620e952d0675: Status 404 returned error can't find the container with id 57e2e8154250b169ac195e2a006f46772945775e600aefd58458620e952d0675 Mar 09 15:00:02 crc kubenswrapper[4722]: I0309 15:00:02.149926 4722 generic.go:334] "Generic (PLEG): container finished" podID="b1a536c7-a5a3-4082-9493-692bdc2bd660" containerID="d6e5586c31071b754232c222e4d9c9c2dcc92c6db1148fd92ddf7024fb02194f" exitCode=0 Mar 09 15:00:02 crc kubenswrapper[4722]: I0309 15:00:02.161106 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551140-hm98x" event={"ID":"b1a536c7-a5a3-4082-9493-692bdc2bd660","Type":"ContainerDied","Data":"d6e5586c31071b754232c222e4d9c9c2dcc92c6db1148fd92ddf7024fb02194f"} Mar 09 15:00:02 crc kubenswrapper[4722]: I0309 15:00:02.161147 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551140-hm98x" event={"ID":"b1a536c7-a5a3-4082-9493-692bdc2bd660","Type":"ContainerStarted","Data":"59712a3ab41f1f33ab76e518ca079b01b91f98c42e763e1dfafc3a4f031d6776"} Mar 09 15:00:02 crc kubenswrapper[4722]: I0309 15:00:02.161158 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551140-sgbwp" event={"ID":"c493907c-5318-4159-930b-2c5108e7b026","Type":"ContainerStarted","Data":"57e2e8154250b169ac195e2a006f46772945775e600aefd58458620e952d0675"} Mar 09 15:00:03 crc 
kubenswrapper[4722]: I0309 15:00:03.491556 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-t98zg" Mar 09 15:00:03 crc kubenswrapper[4722]: I0309 15:00:03.581844 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-t98zg" Mar 09 15:00:03 crc kubenswrapper[4722]: I0309 15:00:03.729397 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-t98zg"] Mar 09 15:00:03 crc kubenswrapper[4722]: I0309 15:00:03.771100 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551140-hm98x" Mar 09 15:00:03 crc kubenswrapper[4722]: I0309 15:00:03.828676 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b1a536c7-a5a3-4082-9493-692bdc2bd660-secret-volume\") pod \"b1a536c7-a5a3-4082-9493-692bdc2bd660\" (UID: \"b1a536c7-a5a3-4082-9493-692bdc2bd660\") " Mar 09 15:00:03 crc kubenswrapper[4722]: I0309 15:00:03.828835 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b1a536c7-a5a3-4082-9493-692bdc2bd660-config-volume\") pod \"b1a536c7-a5a3-4082-9493-692bdc2bd660\" (UID: \"b1a536c7-a5a3-4082-9493-692bdc2bd660\") " Mar 09 15:00:03 crc kubenswrapper[4722]: I0309 15:00:03.828880 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8f6m\" (UniqueName: \"kubernetes.io/projected/b1a536c7-a5a3-4082-9493-692bdc2bd660-kube-api-access-j8f6m\") pod \"b1a536c7-a5a3-4082-9493-692bdc2bd660\" (UID: \"b1a536c7-a5a3-4082-9493-692bdc2bd660\") " Mar 09 15:00:03 crc kubenswrapper[4722]: I0309 15:00:03.831973 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1a536c7-a5a3-4082-9493-692bdc2bd660-config-volume" (OuterVolumeSpecName: "config-volume") pod "b1a536c7-a5a3-4082-9493-692bdc2bd660" (UID: "b1a536c7-a5a3-4082-9493-692bdc2bd660"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 15:00:03 crc kubenswrapper[4722]: I0309 15:00:03.841833 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1a536c7-a5a3-4082-9493-692bdc2bd660-kube-api-access-j8f6m" (OuterVolumeSpecName: "kube-api-access-j8f6m") pod "b1a536c7-a5a3-4082-9493-692bdc2bd660" (UID: "b1a536c7-a5a3-4082-9493-692bdc2bd660"). InnerVolumeSpecName "kube-api-access-j8f6m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 15:00:03 crc kubenswrapper[4722]: I0309 15:00:03.847166 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1a536c7-a5a3-4082-9493-692bdc2bd660-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b1a536c7-a5a3-4082-9493-692bdc2bd660" (UID: "b1a536c7-a5a3-4082-9493-692bdc2bd660"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 15:00:03 crc kubenswrapper[4722]: I0309 15:00:03.931925 4722 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b1a536c7-a5a3-4082-9493-692bdc2bd660-config-volume\") on node \"crc\" DevicePath \"\"" Mar 09 15:00:03 crc kubenswrapper[4722]: I0309 15:00:03.931998 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j8f6m\" (UniqueName: \"kubernetes.io/projected/b1a536c7-a5a3-4082-9493-692bdc2bd660-kube-api-access-j8f6m\") on node \"crc\" DevicePath \"\"" Mar 09 15:00:03 crc kubenswrapper[4722]: I0309 15:00:03.932020 4722 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b1a536c7-a5a3-4082-9493-692bdc2bd660-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 09 15:00:04 crc kubenswrapper[4722]: I0309 15:00:04.179162 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551140-hm98x" Mar 09 15:00:04 crc kubenswrapper[4722]: I0309 15:00:04.181254 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551140-hm98x" event={"ID":"b1a536c7-a5a3-4082-9493-692bdc2bd660","Type":"ContainerDied","Data":"59712a3ab41f1f33ab76e518ca079b01b91f98c42e763e1dfafc3a4f031d6776"} Mar 09 15:00:04 crc kubenswrapper[4722]: I0309 15:00:04.181320 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="59712a3ab41f1f33ab76e518ca079b01b91f98c42e763e1dfafc3a4f031d6776" Mar 09 15:00:04 crc kubenswrapper[4722]: I0309 15:00:04.868071 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551095-h8w2n"] Mar 09 15:00:04 crc kubenswrapper[4722]: I0309 15:00:04.880363 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551095-h8w2n"] Mar 09 15:00:05 crc kubenswrapper[4722]: I0309 15:00:05.191599 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-t98zg" podUID="ea0036f4-ef02-4deb-9740-172a36372998" containerName="registry-server" containerID="cri-o://15723a47e6c5358a0f29fd7a8b7d05c40b0b0847d3b14717d130c5e06a9afae3" gracePeriod=2 Mar 09 15:00:05 crc kubenswrapper[4722]: I0309 15:00:05.815057 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-t98zg" Mar 09 15:00:05 crc kubenswrapper[4722]: I0309 15:00:05.994468 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea0036f4-ef02-4deb-9740-172a36372998-utilities\") pod \"ea0036f4-ef02-4deb-9740-172a36372998\" (UID: \"ea0036f4-ef02-4deb-9740-172a36372998\") " Mar 09 15:00:05 crc kubenswrapper[4722]: I0309 15:00:05.994651 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea0036f4-ef02-4deb-9740-172a36372998-catalog-content\") pod \"ea0036f4-ef02-4deb-9740-172a36372998\" (UID: \"ea0036f4-ef02-4deb-9740-172a36372998\") " Mar 09 15:00:05 crc kubenswrapper[4722]: I0309 15:00:05.994850 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9dqhd\" (UniqueName: \"kubernetes.io/projected/ea0036f4-ef02-4deb-9740-172a36372998-kube-api-access-9dqhd\") pod \"ea0036f4-ef02-4deb-9740-172a36372998\" (UID: \"ea0036f4-ef02-4deb-9740-172a36372998\") " Mar 09 15:00:05 crc kubenswrapper[4722]: I0309 15:00:05.995497 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea0036f4-ef02-4deb-9740-172a36372998-utilities" (OuterVolumeSpecName: "utilities") pod "ea0036f4-ef02-4deb-9740-172a36372998" (UID: "ea0036f4-ef02-4deb-9740-172a36372998"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 15:00:06 crc kubenswrapper[4722]: I0309 15:00:06.001312 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea0036f4-ef02-4deb-9740-172a36372998-kube-api-access-9dqhd" (OuterVolumeSpecName: "kube-api-access-9dqhd") pod "ea0036f4-ef02-4deb-9740-172a36372998" (UID: "ea0036f4-ef02-4deb-9740-172a36372998"). InnerVolumeSpecName "kube-api-access-9dqhd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 15:00:06 crc kubenswrapper[4722]: I0309 15:00:06.049718 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea0036f4-ef02-4deb-9740-172a36372998-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ea0036f4-ef02-4deb-9740-172a36372998" (UID: "ea0036f4-ef02-4deb-9740-172a36372998"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 15:00:06 crc kubenswrapper[4722]: I0309 15:00:06.097384 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9dqhd\" (UniqueName: \"kubernetes.io/projected/ea0036f4-ef02-4deb-9740-172a36372998-kube-api-access-9dqhd\") on node \"crc\" DevicePath \"\"" Mar 09 15:00:06 crc kubenswrapper[4722]: I0309 15:00:06.097415 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea0036f4-ef02-4deb-9740-172a36372998-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 15:00:06 crc kubenswrapper[4722]: I0309 15:00:06.097425 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea0036f4-ef02-4deb-9740-172a36372998-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 15:00:06 crc kubenswrapper[4722]: I0309 15:00:06.164181 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00a50aa1-a705-42cb-a3f7-e90cb7212a19" path="/var/lib/kubelet/pods/00a50aa1-a705-42cb-a3f7-e90cb7212a19/volumes" Mar 09 15:00:06 crc kubenswrapper[4722]: I0309 15:00:06.204618 4722 generic.go:334] "Generic (PLEG): container finished" podID="ea0036f4-ef02-4deb-9740-172a36372998" containerID="15723a47e6c5358a0f29fd7a8b7d05c40b0b0847d3b14717d130c5e06a9afae3" exitCode=0 Mar 09 15:00:06 crc kubenswrapper[4722]: I0309 15:00:06.204670 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t98zg" event={"ID":"ea0036f4-ef02-4deb-9740-172a36372998","Type":"ContainerDied","Data":"15723a47e6c5358a0f29fd7a8b7d05c40b0b0847d3b14717d130c5e06a9afae3"} Mar 09 15:00:06 crc kubenswrapper[4722]: I0309 15:00:06.204682 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-t98zg" Mar 09 15:00:06 crc kubenswrapper[4722]: I0309 15:00:06.204720 4722 scope.go:117] "RemoveContainer" containerID="15723a47e6c5358a0f29fd7a8b7d05c40b0b0847d3b14717d130c5e06a9afae3" Mar 09 15:00:06 crc kubenswrapper[4722]: I0309 15:00:06.204703 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t98zg" event={"ID":"ea0036f4-ef02-4deb-9740-172a36372998","Type":"ContainerDied","Data":"cd2d39d185599ee8c563ff055f2f39c08f15e21cca499417471355b34c9a3578"} Mar 09 15:00:06 crc kubenswrapper[4722]: I0309 15:00:06.231307 4722 scope.go:117] "RemoveContainer" containerID="713724574f5201ee92829239ffa99ffefa5681c0de1004e445966beb59c3a7d1" Mar 09 15:00:06 crc kubenswrapper[4722]: I0309 15:00:06.235894 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-t98zg"] Mar 09 15:00:06 crc kubenswrapper[4722]: I0309 15:00:06.249292 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-t98zg"] Mar 09 15:00:06 crc kubenswrapper[4722]: I0309 15:00:06.269499 4722 scope.go:117] "RemoveContainer" containerID="3510dec19e69f8515ec34d9d5579a57865ee9e15295e705c0426290bc81475a8" Mar 09 15:00:06 crc kubenswrapper[4722]: I0309 15:00:06.331558 4722 scope.go:117] "RemoveContainer" containerID="15723a47e6c5358a0f29fd7a8b7d05c40b0b0847d3b14717d130c5e06a9afae3" Mar 09 15:00:06 crc kubenswrapper[4722]: E0309 15:00:06.332319 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15723a47e6c5358a0f29fd7a8b7d05c40b0b0847d3b14717d130c5e06a9afae3\": container with ID starting with 15723a47e6c5358a0f29fd7a8b7d05c40b0b0847d3b14717d130c5e06a9afae3 not found: ID does not exist" containerID="15723a47e6c5358a0f29fd7a8b7d05c40b0b0847d3b14717d130c5e06a9afae3" Mar 09 15:00:06 crc kubenswrapper[4722]: I0309 15:00:06.332428 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15723a47e6c5358a0f29fd7a8b7d05c40b0b0847d3b14717d130c5e06a9afae3"} err="failed to get container status \"15723a47e6c5358a0f29fd7a8b7d05c40b0b0847d3b14717d130c5e06a9afae3\": rpc error: code = NotFound desc = could not find container \"15723a47e6c5358a0f29fd7a8b7d05c40b0b0847d3b14717d130c5e06a9afae3\": container with ID starting with 15723a47e6c5358a0f29fd7a8b7d05c40b0b0847d3b14717d130c5e06a9afae3 not found: ID does not exist" Mar 09 15:00:06 crc kubenswrapper[4722]: I0309 15:00:06.332512 4722 scope.go:117] "RemoveContainer" containerID="713724574f5201ee92829239ffa99ffefa5681c0de1004e445966beb59c3a7d1" Mar 09 15:00:06 crc kubenswrapper[4722]: E0309 15:00:06.333087 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"713724574f5201ee92829239ffa99ffefa5681c0de1004e445966beb59c3a7d1\": container with ID starting with 713724574f5201ee92829239ffa99ffefa5681c0de1004e445966beb59c3a7d1 not found: ID does not exist" containerID="713724574f5201ee92829239ffa99ffefa5681c0de1004e445966beb59c3a7d1" Mar 09 15:00:06 crc kubenswrapper[4722]: I0309 15:00:06.333128 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"713724574f5201ee92829239ffa99ffefa5681c0de1004e445966beb59c3a7d1"} err="failed to get container status \"713724574f5201ee92829239ffa99ffefa5681c0de1004e445966beb59c3a7d1\": rpc error: code = NotFound desc = could not find 
container \"713724574f5201ee92829239ffa99ffefa5681c0de1004e445966beb59c3a7d1\": container with ID starting with 713724574f5201ee92829239ffa99ffefa5681c0de1004e445966beb59c3a7d1 not found: ID does not exist" Mar 09 15:00:06 crc kubenswrapper[4722]: I0309 15:00:06.333155 4722 scope.go:117] "RemoveContainer" containerID="3510dec19e69f8515ec34d9d5579a57865ee9e15295e705c0426290bc81475a8" Mar 09 15:00:06 crc kubenswrapper[4722]: E0309 15:00:06.333591 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3510dec19e69f8515ec34d9d5579a57865ee9e15295e705c0426290bc81475a8\": container with ID starting with 3510dec19e69f8515ec34d9d5579a57865ee9e15295e705c0426290bc81475a8 not found: ID does not exist" containerID="3510dec19e69f8515ec34d9d5579a57865ee9e15295e705c0426290bc81475a8" Mar 09 15:00:06 crc kubenswrapper[4722]: I0309 15:00:06.333646 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3510dec19e69f8515ec34d9d5579a57865ee9e15295e705c0426290bc81475a8"} err="failed to get container status \"3510dec19e69f8515ec34d9d5579a57865ee9e15295e705c0426290bc81475a8\": rpc error: code = NotFound desc = could not find container \"3510dec19e69f8515ec34d9d5579a57865ee9e15295e705c0426290bc81475a8\": container with ID starting with 3510dec19e69f8515ec34d9d5579a57865ee9e15295e705c0426290bc81475a8 not found: ID does not exist" Mar 09 15:00:08 crc kubenswrapper[4722]: I0309 15:00:08.162378 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea0036f4-ef02-4deb-9740-172a36372998" path="/var/lib/kubelet/pods/ea0036f4-ef02-4deb-9740-172a36372998/volumes" Mar 09 15:00:17 crc kubenswrapper[4722]: I0309 15:00:17.350591 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551140-sgbwp" event={"ID":"c493907c-5318-4159-930b-2c5108e7b026","Type":"ContainerStarted","Data":"8a743afe9645604bd6b4abac660fce26be70e597a537f1522d327b50577f37c5"} Mar 09 15:00:17 crc kubenswrapper[4722]: I0309 15:00:17.374614 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551140-sgbwp" podStartSLOduration=1.808233515 podStartE2EDuration="17.374591629s" podCreationTimestamp="2026-03-09 15:00:00 +0000 UTC" firstStartedPulling="2026-03-09 15:00:01.340944909 +0000 UTC m=+3441.896513485" lastFinishedPulling="2026-03-09 15:00:16.907303023 +0000 UTC m=+3457.462871599" observedRunningTime="2026-03-09 15:00:17.365255755 +0000 UTC m=+3457.920824331" watchObservedRunningTime="2026-03-09 15:00:17.374591629 +0000 UTC m=+3457.930160205" Mar 09 15:00:18 crc kubenswrapper[4722]: I0309 15:00:18.365985 4722 generic.go:334] "Generic (PLEG): container finished" podID="c493907c-5318-4159-930b-2c5108e7b026" containerID="8a743afe9645604bd6b4abac660fce26be70e597a537f1522d327b50577f37c5" exitCode=0 Mar 09 15:00:18 crc kubenswrapper[4722]: I0309 15:00:18.366071 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551140-sgbwp" event={"ID":"c493907c-5318-4159-930b-2c5108e7b026","Type":"ContainerDied","Data":"8a743afe9645604bd6b4abac660fce26be70e597a537f1522d327b50577f37c5"} Mar 09 15:00:19 crc kubenswrapper[4722]: I0309 15:00:19.909328 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551140-sgbwp" Mar 09 15:00:19 crc kubenswrapper[4722]: I0309 15:00:19.977739 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjqpm\" (UniqueName: \"kubernetes.io/projected/c493907c-5318-4159-930b-2c5108e7b026-kube-api-access-rjqpm\") pod \"c493907c-5318-4159-930b-2c5108e7b026\" (UID: \"c493907c-5318-4159-930b-2c5108e7b026\") " Mar 09 15:00:19 crc kubenswrapper[4722]: I0309 15:00:19.985671 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c493907c-5318-4159-930b-2c5108e7b026-kube-api-access-rjqpm" (OuterVolumeSpecName: "kube-api-access-rjqpm") pod "c493907c-5318-4159-930b-2c5108e7b026" (UID: "c493907c-5318-4159-930b-2c5108e7b026"). InnerVolumeSpecName "kube-api-access-rjqpm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 15:00:20 crc kubenswrapper[4722]: I0309 15:00:20.080157 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rjqpm\" (UniqueName: \"kubernetes.io/projected/c493907c-5318-4159-930b-2c5108e7b026-kube-api-access-rjqpm\") on node \"crc\" DevicePath \"\"" Mar 09 15:00:20 crc kubenswrapper[4722]: I0309 15:00:20.404636 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551140-sgbwp" event={"ID":"c493907c-5318-4159-930b-2c5108e7b026","Type":"ContainerDied","Data":"57e2e8154250b169ac195e2a006f46772945775e600aefd58458620e952d0675"} Mar 09 15:00:20 crc kubenswrapper[4722]: I0309 15:00:20.404687 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="57e2e8154250b169ac195e2a006f46772945775e600aefd58458620e952d0675" Mar 09 15:00:20 crc kubenswrapper[4722]: I0309 15:00:20.404699 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551140-sgbwp" Mar 09 15:00:20 crc kubenswrapper[4722]: I0309 15:00:20.430102 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551134-l25dd"] Mar 09 15:00:20 crc kubenswrapper[4722]: I0309 15:00:20.443735 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551134-l25dd"] Mar 09 15:00:21 crc kubenswrapper[4722]: I0309 15:00:21.527606 4722 patch_prober.go:28] interesting pod/machine-config-daemon-hjrrb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 15:00:21 crc kubenswrapper[4722]: I0309 15:00:21.529087 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 15:00:22 crc kubenswrapper[4722]: I0309 15:00:22.166647 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9853cd61-1429-4805-bcb1-bf9771928787" path="/var/lib/kubelet/pods/9853cd61-1429-4805-bcb1-bf9771928787/volumes" Mar 09 15:00:29 crc kubenswrapper[4722]: I0309 15:00:29.049013 4722 scope.go:117] "RemoveContainer" containerID="a98436cf88d27eb9f246f9faa7c15af2430870edf3d068a921cfe4ca0724b0c8" Mar 09 15:00:29 crc kubenswrapper[4722]: I0309 15:00:29.090488 4722 scope.go:117] "RemoveContainer" containerID="0649f7b51acfb6bb022ac48b78a5bcee767b4f61a5aebbe40c11419852d44da7" Mar 09 15:00:34 crc kubenswrapper[4722]: I0309 15:00:34.583786 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8zxl8"] Mar 09 15:00:34 crc kubenswrapper[4722]: E0309 15:00:34.584959 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1a536c7-a5a3-4082-9493-692bdc2bd660" containerName="collect-profiles" Mar 09 15:00:34 crc kubenswrapper[4722]: I0309 15:00:34.584980 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1a536c7-a5a3-4082-9493-692bdc2bd660" containerName="collect-profiles" Mar 09 15:00:34 crc kubenswrapper[4722]: E0309 15:00:34.585006 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea0036f4-ef02-4deb-9740-172a36372998" containerName="extract-content" Mar 09 15:00:34 crc kubenswrapper[4722]: I0309 15:00:34.585014 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea0036f4-ef02-4deb-9740-172a36372998" containerName="extract-content" Mar 09 15:00:34 crc kubenswrapper[4722]: E0309 15:00:34.585031 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea0036f4-ef02-4deb-9740-172a36372998" containerName="extract-utilities" Mar 09 15:00:34 crc kubenswrapper[4722]: I0309 15:00:34.585039 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea0036f4-ef02-4deb-9740-172a36372998" containerName="extract-utilities" Mar 09 15:00:34 crc kubenswrapper[4722]: E0309 15:00:34.585093 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c493907c-5318-4159-930b-2c5108e7b026" containerName="oc" Mar 09 15:00:34 crc kubenswrapper[4722]: I0309 15:00:34.585100 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="c493907c-5318-4159-930b-2c5108e7b026" containerName="oc" Mar 09 15:00:34 crc 
kubenswrapper[4722]: E0309 15:00:34.585115 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea0036f4-ef02-4deb-9740-172a36372998" containerName="registry-server" Mar 09 15:00:34 crc kubenswrapper[4722]: I0309 15:00:34.585122 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea0036f4-ef02-4deb-9740-172a36372998" containerName="registry-server" Mar 09 15:00:34 crc kubenswrapper[4722]: I0309 15:00:34.585386 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="c493907c-5318-4159-930b-2c5108e7b026" containerName="oc" Mar 09 15:00:34 crc kubenswrapper[4722]: I0309 15:00:34.585403 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea0036f4-ef02-4deb-9740-172a36372998" containerName="registry-server" Mar 09 15:00:34 crc kubenswrapper[4722]: I0309 15:00:34.585418 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1a536c7-a5a3-4082-9493-692bdc2bd660" containerName="collect-profiles" Mar 09 15:00:34 crc kubenswrapper[4722]: I0309 15:00:34.589449 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8zxl8" Mar 09 15:00:34 crc kubenswrapper[4722]: I0309 15:00:34.616748 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8zxl8"] Mar 09 15:00:34 crc kubenswrapper[4722]: I0309 15:00:34.748594 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzwvs\" (UniqueName: \"kubernetes.io/projected/74f38523-a6a4-4915-9307-6a770aa0c4e0-kube-api-access-fzwvs\") pod \"redhat-operators-8zxl8\" (UID: \"74f38523-a6a4-4915-9307-6a770aa0c4e0\") " pod="openshift-marketplace/redhat-operators-8zxl8" Mar 09 15:00:34 crc kubenswrapper[4722]: I0309 15:00:34.749147 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74f38523-a6a4-4915-9307-6a770aa0c4e0-utilities\") pod \"redhat-operators-8zxl8\" (UID: \"74f38523-a6a4-4915-9307-6a770aa0c4e0\") " pod="openshift-marketplace/redhat-operators-8zxl8" Mar 09 15:00:34 crc kubenswrapper[4722]: I0309 15:00:34.749194 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74f38523-a6a4-4915-9307-6a770aa0c4e0-catalog-content\") pod \"redhat-operators-8zxl8\" (UID: \"74f38523-a6a4-4915-9307-6a770aa0c4e0\") " pod="openshift-marketplace/redhat-operators-8zxl8" Mar 09 15:00:34 crc kubenswrapper[4722]: I0309 15:00:34.852776 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74f38523-a6a4-4915-9307-6a770aa0c4e0-utilities\") pod \"redhat-operators-8zxl8\" (UID: \"74f38523-a6a4-4915-9307-6a770aa0c4e0\") " pod="openshift-marketplace/redhat-operators-8zxl8" Mar 09 15:00:34 crc kubenswrapper[4722]: I0309 15:00:34.852852 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74f38523-a6a4-4915-9307-6a770aa0c4e0-catalog-content\") pod \"redhat-operators-8zxl8\" (UID: \"74f38523-a6a4-4915-9307-6a770aa0c4e0\") " pod="openshift-marketplace/redhat-operators-8zxl8" Mar 09 15:00:34 crc kubenswrapper[4722]: I0309 15:00:34.852998 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzwvs\" (UniqueName: 
\"kubernetes.io/projected/74f38523-a6a4-4915-9307-6a770aa0c4e0-kube-api-access-fzwvs\") pod \"redhat-operators-8zxl8\" (UID: \"74f38523-a6a4-4915-9307-6a770aa0c4e0\") " pod="openshift-marketplace/redhat-operators-8zxl8" Mar 09 15:00:34 crc kubenswrapper[4722]: I0309 15:00:34.853406 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74f38523-a6a4-4915-9307-6a770aa0c4e0-catalog-content\") pod \"redhat-operators-8zxl8\" (UID: \"74f38523-a6a4-4915-9307-6a770aa0c4e0\") " pod="openshift-marketplace/redhat-operators-8zxl8" Mar 09 15:00:34 crc kubenswrapper[4722]: I0309 15:00:34.853481 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74f38523-a6a4-4915-9307-6a770aa0c4e0-utilities\") pod \"redhat-operators-8zxl8\" (UID: \"74f38523-a6a4-4915-9307-6a770aa0c4e0\") " pod="openshift-marketplace/redhat-operators-8zxl8" Mar 09 15:00:34 crc kubenswrapper[4722]: I0309 15:00:34.873106 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzwvs\" (UniqueName: \"kubernetes.io/projected/74f38523-a6a4-4915-9307-6a770aa0c4e0-kube-api-access-fzwvs\") pod \"redhat-operators-8zxl8\" (UID: \"74f38523-a6a4-4915-9307-6a770aa0c4e0\") " pod="openshift-marketplace/redhat-operators-8zxl8" Mar 09 15:00:34 crc kubenswrapper[4722]: I0309 15:00:34.935345 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8zxl8" Mar 09 15:00:35 crc kubenswrapper[4722]: I0309 15:00:35.477555 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8zxl8"] Mar 09 15:00:35 crc kubenswrapper[4722]: W0309 15:00:35.478373 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74f38523_a6a4_4915_9307_6a770aa0c4e0.slice/crio-b9b05c2a824e6b7a05394778dae57e8a7fdb43467229fc3a7b59dd7fb415ba3a WatchSource:0}: Error finding container b9b05c2a824e6b7a05394778dae57e8a7fdb43467229fc3a7b59dd7fb415ba3a: Status 404 returned error can't find the container with id b9b05c2a824e6b7a05394778dae57e8a7fdb43467229fc3a7b59dd7fb415ba3a Mar 09 15:00:35 crc kubenswrapper[4722]: I0309 15:00:35.575732 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8zxl8" event={"ID":"74f38523-a6a4-4915-9307-6a770aa0c4e0","Type":"ContainerStarted","Data":"b9b05c2a824e6b7a05394778dae57e8a7fdb43467229fc3a7b59dd7fb415ba3a"} Mar 09 15:00:36 crc kubenswrapper[4722]: I0309 15:00:36.589004 4722 generic.go:334] "Generic (PLEG): container finished" podID="74f38523-a6a4-4915-9307-6a770aa0c4e0" containerID="85038d70b42465db5ac8d341064693eead791d56d7aaf2d1ec991b8e2cdf9909" exitCode=0 Mar 09 15:00:36 crc kubenswrapper[4722]: I0309 15:00:36.589107 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8zxl8" event={"ID":"74f38523-a6a4-4915-9307-6a770aa0c4e0","Type":"ContainerDied","Data":"85038d70b42465db5ac8d341064693eead791d56d7aaf2d1ec991b8e2cdf9909"} Mar 09 15:00:37 crc kubenswrapper[4722]: I0309 15:00:37.602088 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8zxl8" event={"ID":"74f38523-a6a4-4915-9307-6a770aa0c4e0","Type":"ContainerStarted","Data":"d80f616e5b21038eef385b1f12127cdae8f84f9be2b50247ffee8965af803ae4"} Mar 09 15:00:43 crc kubenswrapper[4722]: I0309 15:00:43.667476 4722 
generic.go:334] "Generic (PLEG): container finished" podID="74f38523-a6a4-4915-9307-6a770aa0c4e0" containerID="d80f616e5b21038eef385b1f12127cdae8f84f9be2b50247ffee8965af803ae4" exitCode=0 Mar 09 15:00:43 crc kubenswrapper[4722]: I0309 15:00:43.667556 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8zxl8" event={"ID":"74f38523-a6a4-4915-9307-6a770aa0c4e0","Type":"ContainerDied","Data":"d80f616e5b21038eef385b1f12127cdae8f84f9be2b50247ffee8965af803ae4"} Mar 09 15:00:44 crc kubenswrapper[4722]: I0309 15:00:44.682758 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8zxl8" event={"ID":"74f38523-a6a4-4915-9307-6a770aa0c4e0","Type":"ContainerStarted","Data":"e840c7f8f2755a2dc54ee6fe4e10758f8245f9ae0bd52461a7ee4143a61e4460"} Mar 09 15:00:44 crc kubenswrapper[4722]: I0309 15:00:44.709188 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8zxl8" podStartSLOduration=3.131427898 podStartE2EDuration="10.709168537s" podCreationTimestamp="2026-03-09 15:00:34 +0000 UTC" firstStartedPulling="2026-03-09 15:00:36.591804423 +0000 UTC m=+3477.147372999" lastFinishedPulling="2026-03-09 15:00:44.169545062 +0000 UTC m=+3484.725113638" observedRunningTime="2026-03-09 15:00:44.701679984 +0000 UTC m=+3485.257248560" watchObservedRunningTime="2026-03-09 15:00:44.709168537 +0000 UTC m=+3485.264737133" Mar 09 15:00:44 crc kubenswrapper[4722]: I0309 15:00:44.935852 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8zxl8" Mar 09 15:00:44 crc kubenswrapper[4722]: I0309 15:00:44.936028 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8zxl8" Mar 09 15:00:46 crc kubenswrapper[4722]: I0309 15:00:46.018710 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8zxl8" podUID="74f38523-a6a4-4915-9307-6a770aa0c4e0" containerName="registry-server" probeResult="failure" output=< Mar 09 15:00:46 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s Mar 09 15:00:46 crc kubenswrapper[4722]: > Mar 09 15:00:51 crc kubenswrapper[4722]: I0309 15:00:51.527928 4722 patch_prober.go:28] interesting pod/machine-config-daemon-hjrrb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 15:00:51 crc kubenswrapper[4722]: I0309 15:00:51.528560 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 15:00:55 crc kubenswrapper[4722]: I0309 15:00:55.985030 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8zxl8" podUID="74f38523-a6a4-4915-9307-6a770aa0c4e0" containerName="registry-server" probeResult="failure" output=< Mar 09 15:00:55 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s Mar 09 15:00:55 crc kubenswrapper[4722]: > Mar 09 15:01:00 crc kubenswrapper[4722]: I0309 15:01:00.163842 4722 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/keystone-cron-29551141-nnsmc"] Mar 09 15:01:00 crc kubenswrapper[4722]: I0309 15:01:00.166569 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29551141-nnsmc" Mar 09 15:01:00 crc kubenswrapper[4722]: I0309 15:01:00.177786 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29551141-nnsmc"] Mar 09 15:01:00 crc kubenswrapper[4722]: I0309 15:01:00.318277 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tq656\" (UniqueName: \"kubernetes.io/projected/a88cb554-d13a-427d-959b-70271e698efe-kube-api-access-tq656\") pod \"keystone-cron-29551141-nnsmc\" (UID: \"a88cb554-d13a-427d-959b-70271e698efe\") " pod="openstack/keystone-cron-29551141-nnsmc" Mar 09 15:01:00 crc kubenswrapper[4722]: I0309 15:01:00.320716 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a88cb554-d13a-427d-959b-70271e698efe-fernet-keys\") pod \"keystone-cron-29551141-nnsmc\" (UID: \"a88cb554-d13a-427d-959b-70271e698efe\") " pod="openstack/keystone-cron-29551141-nnsmc" Mar 09 15:01:00 crc kubenswrapper[4722]: I0309 15:01:00.321149 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a88cb554-d13a-427d-959b-70271e698efe-combined-ca-bundle\") pod \"keystone-cron-29551141-nnsmc\" (UID: \"a88cb554-d13a-427d-959b-70271e698efe\") " pod="openstack/keystone-cron-29551141-nnsmc" Mar 09 15:01:00 crc kubenswrapper[4722]: I0309 15:01:00.321291 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a88cb554-d13a-427d-959b-70271e698efe-config-data\") pod \"keystone-cron-29551141-nnsmc\" (UID: \"a88cb554-d13a-427d-959b-70271e698efe\") " pod="openstack/keystone-cron-29551141-nnsmc" Mar 09 15:01:00 crc kubenswrapper[4722]: I0309 15:01:00.423857 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tq656\" (UniqueName: \"kubernetes.io/projected/a88cb554-d13a-427d-959b-70271e698efe-kube-api-access-tq656\") pod \"keystone-cron-29551141-nnsmc\" (UID: \"a88cb554-d13a-427d-959b-70271e698efe\") " pod="openstack/keystone-cron-29551141-nnsmc" Mar 09 15:01:00 crc kubenswrapper[4722]: I0309 15:01:00.424033 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a88cb554-d13a-427d-959b-70271e698efe-fernet-keys\") pod \"keystone-cron-29551141-nnsmc\" (UID: \"a88cb554-d13a-427d-959b-70271e698efe\") " pod="openstack/keystone-cron-29551141-nnsmc" Mar 09 15:01:00 crc kubenswrapper[4722]: I0309 15:01:00.424109 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a88cb554-d13a-427d-959b-70271e698efe-combined-ca-bundle\") pod \"keystone-cron-29551141-nnsmc\" (UID: \"a88cb554-d13a-427d-959b-70271e698efe\") " pod="openstack/keystone-cron-29551141-nnsmc" Mar 09 15:01:00 crc kubenswrapper[4722]: I0309 15:01:00.424147 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a88cb554-d13a-427d-959b-70271e698efe-config-data\") pod \"keystone-cron-29551141-nnsmc\" (UID: \"a88cb554-d13a-427d-959b-70271e698efe\") " 
pod="openstack/keystone-cron-29551141-nnsmc" Mar 09 15:01:00 crc kubenswrapper[4722]: I0309 15:01:00.432385 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a88cb554-d13a-427d-959b-70271e698efe-fernet-keys\") pod \"keystone-cron-29551141-nnsmc\" (UID: \"a88cb554-d13a-427d-959b-70271e698efe\") " pod="openstack/keystone-cron-29551141-nnsmc" Mar 09 15:01:00 crc kubenswrapper[4722]: I0309 15:01:00.436105 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a88cb554-d13a-427d-959b-70271e698efe-combined-ca-bundle\") pod \"keystone-cron-29551141-nnsmc\" (UID: \"a88cb554-d13a-427d-959b-70271e698efe\") " pod="openstack/keystone-cron-29551141-nnsmc" Mar 09 15:01:00 crc kubenswrapper[4722]: I0309 15:01:00.436814 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a88cb554-d13a-427d-959b-70271e698efe-config-data\") pod \"keystone-cron-29551141-nnsmc\" (UID: \"a88cb554-d13a-427d-959b-70271e698efe\") " pod="openstack/keystone-cron-29551141-nnsmc" Mar 09 15:01:00 crc kubenswrapper[4722]: I0309 15:01:00.450190 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tq656\" (UniqueName: \"kubernetes.io/projected/a88cb554-d13a-427d-959b-70271e698efe-kube-api-access-tq656\") pod \"keystone-cron-29551141-nnsmc\" (UID: \"a88cb554-d13a-427d-959b-70271e698efe\") " pod="openstack/keystone-cron-29551141-nnsmc" Mar 09 15:01:00 crc kubenswrapper[4722]: I0309 15:01:00.500465 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29551141-nnsmc" Mar 09 15:01:01 crc kubenswrapper[4722]: I0309 15:01:01.069377 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29551141-nnsmc"] Mar 09 15:01:01 crc kubenswrapper[4722]: I0309 15:01:01.881841 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29551141-nnsmc" event={"ID":"a88cb554-d13a-427d-959b-70271e698efe","Type":"ContainerStarted","Data":"476ca45bf61b1766d8ebc1eb7a30a197a06968207a24ee97a84326dc75310dcb"} Mar 09 15:01:01 crc kubenswrapper[4722]: I0309 15:01:01.882333 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29551141-nnsmc" event={"ID":"a88cb554-d13a-427d-959b-70271e698efe","Type":"ContainerStarted","Data":"c2abbbf361e81f2375df1c360c47aa1f4516c403300799fb0726af584aadea6f"} Mar 09 15:01:01 crc kubenswrapper[4722]: I0309 15:01:01.907185 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29551141-nnsmc" podStartSLOduration=1.9071631359999999 podStartE2EDuration="1.907163136s" podCreationTimestamp="2026-03-09 15:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 15:01:01.898379498 +0000 UTC m=+3502.453948074" watchObservedRunningTime="2026-03-09 15:01:01.907163136 +0000 UTC m=+3502.462731712" Mar 09 15:01:04 crc kubenswrapper[4722]: I0309 15:01:04.919913 4722 generic.go:334] "Generic (PLEG): container finished" podID="a88cb554-d13a-427d-959b-70271e698efe" containerID="476ca45bf61b1766d8ebc1eb7a30a197a06968207a24ee97a84326dc75310dcb" exitCode=0 Mar 09 15:01:04 crc kubenswrapper[4722]: I0309 15:01:04.920454 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29551141-nnsmc" 
event={"ID":"a88cb554-d13a-427d-959b-70271e698efe","Type":"ContainerDied","Data":"476ca45bf61b1766d8ebc1eb7a30a197a06968207a24ee97a84326dc75310dcb"} Mar 09 15:01:05 crc kubenswrapper[4722]: I0309 15:01:05.031587 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8zxl8" Mar 09 15:01:05 crc kubenswrapper[4722]: I0309 15:01:05.088009 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8zxl8" Mar 09 15:01:05 crc kubenswrapper[4722]: I0309 15:01:05.787989 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8zxl8"] Mar 09 15:01:06 crc kubenswrapper[4722]: I0309 15:01:06.478401 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29551141-nnsmc" Mar 09 15:01:06 crc kubenswrapper[4722]: I0309 15:01:06.625257 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a88cb554-d13a-427d-959b-70271e698efe-combined-ca-bundle\") pod \"a88cb554-d13a-427d-959b-70271e698efe\" (UID: \"a88cb554-d13a-427d-959b-70271e698efe\") " Mar 09 15:01:06 crc kubenswrapper[4722]: I0309 15:01:06.625560 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tq656\" (UniqueName: \"kubernetes.io/projected/a88cb554-d13a-427d-959b-70271e698efe-kube-api-access-tq656\") pod \"a88cb554-d13a-427d-959b-70271e698efe\" (UID: \"a88cb554-d13a-427d-959b-70271e698efe\") " Mar 09 15:01:06 crc kubenswrapper[4722]: I0309 15:01:06.625616 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a88cb554-d13a-427d-959b-70271e698efe-fernet-keys\") pod \"a88cb554-d13a-427d-959b-70271e698efe\" (UID: \"a88cb554-d13a-427d-959b-70271e698efe\") " Mar 09 15:01:06 crc kubenswrapper[4722]: I0309 15:01:06.625667 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a88cb554-d13a-427d-959b-70271e698efe-config-data\") pod \"a88cb554-d13a-427d-959b-70271e698efe\" (UID: \"a88cb554-d13a-427d-959b-70271e698efe\") " Mar 09 15:01:06 crc kubenswrapper[4722]: I0309 15:01:06.634992 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a88cb554-d13a-427d-959b-70271e698efe-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "a88cb554-d13a-427d-959b-70271e698efe" (UID: "a88cb554-d13a-427d-959b-70271e698efe"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 15:01:06 crc kubenswrapper[4722]: I0309 15:01:06.645637 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a88cb554-d13a-427d-959b-70271e698efe-kube-api-access-tq656" (OuterVolumeSpecName: "kube-api-access-tq656") pod "a88cb554-d13a-427d-959b-70271e698efe" (UID: "a88cb554-d13a-427d-959b-70271e698efe"). InnerVolumeSpecName "kube-api-access-tq656". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 15:01:06 crc kubenswrapper[4722]: I0309 15:01:06.663894 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a88cb554-d13a-427d-959b-70271e698efe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a88cb554-d13a-427d-959b-70271e698efe" (UID: "a88cb554-d13a-427d-959b-70271e698efe"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 15:01:06 crc kubenswrapper[4722]: I0309 15:01:06.728932 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tq656\" (UniqueName: \"kubernetes.io/projected/a88cb554-d13a-427d-959b-70271e698efe-kube-api-access-tq656\") on node \"crc\" DevicePath \"\"" Mar 09 15:01:06 crc kubenswrapper[4722]: I0309 15:01:06.728970 4722 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a88cb554-d13a-427d-959b-70271e698efe-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 09 15:01:06 crc kubenswrapper[4722]: I0309 15:01:06.728984 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a88cb554-d13a-427d-959b-70271e698efe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 15:01:06 crc kubenswrapper[4722]: I0309 15:01:06.739567 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a88cb554-d13a-427d-959b-70271e698efe-config-data" (OuterVolumeSpecName: "config-data") pod "a88cb554-d13a-427d-959b-70271e698efe" (UID: "a88cb554-d13a-427d-959b-70271e698efe"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 15:01:06 crc kubenswrapper[4722]: I0309 15:01:06.833951 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a88cb554-d13a-427d-959b-70271e698efe-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 15:01:06 crc kubenswrapper[4722]: I0309 15:01:06.943329 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29551141-nnsmc" event={"ID":"a88cb554-d13a-427d-959b-70271e698efe","Type":"ContainerDied","Data":"c2abbbf361e81f2375df1c360c47aa1f4516c403300799fb0726af584aadea6f"} Mar 09 15:01:06 crc kubenswrapper[4722]: I0309 15:01:06.943390 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2abbbf361e81f2375df1c360c47aa1f4516c403300799fb0726af584aadea6f" Mar 09 15:01:06 crc kubenswrapper[4722]: I0309 15:01:06.943349 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29551141-nnsmc" Mar 09 15:01:06 crc kubenswrapper[4722]: I0309 15:01:06.943462 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8zxl8" podUID="74f38523-a6a4-4915-9307-6a770aa0c4e0" containerName="registry-server" containerID="cri-o://e840c7f8f2755a2dc54ee6fe4e10758f8245f9ae0bd52461a7ee4143a61e4460" gracePeriod=2 Mar 09 15:01:07 crc kubenswrapper[4722]: I0309 15:01:07.489773 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8zxl8" Mar 09 15:01:07 crc kubenswrapper[4722]: I0309 15:01:07.651995 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74f38523-a6a4-4915-9307-6a770aa0c4e0-utilities\") pod \"74f38523-a6a4-4915-9307-6a770aa0c4e0\" (UID: \"74f38523-a6a4-4915-9307-6a770aa0c4e0\") " Mar 09 15:01:07 crc kubenswrapper[4722]: I0309 15:01:07.652118 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74f38523-a6a4-4915-9307-6a770aa0c4e0-catalog-content\") pod \"74f38523-a6a4-4915-9307-6a770aa0c4e0\" (UID: \"74f38523-a6a4-4915-9307-6a770aa0c4e0\") " Mar 09 15:01:07 crc kubenswrapper[4722]: I0309 15:01:07.652268 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzwvs\" (UniqueName: \"kubernetes.io/projected/74f38523-a6a4-4915-9307-6a770aa0c4e0-kube-api-access-fzwvs\") pod \"74f38523-a6a4-4915-9307-6a770aa0c4e0\" (UID: \"74f38523-a6a4-4915-9307-6a770aa0c4e0\") " Mar 09 15:01:07 crc kubenswrapper[4722]: I0309 15:01:07.653821 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74f38523-a6a4-4915-9307-6a770aa0c4e0-utilities" (OuterVolumeSpecName: "utilities") pod "74f38523-a6a4-4915-9307-6a770aa0c4e0" (UID: "74f38523-a6a4-4915-9307-6a770aa0c4e0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 15:01:07 crc kubenswrapper[4722]: I0309 15:01:07.658066 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74f38523-a6a4-4915-9307-6a770aa0c4e0-kube-api-access-fzwvs" (OuterVolumeSpecName: "kube-api-access-fzwvs") pod "74f38523-a6a4-4915-9307-6a770aa0c4e0" (UID: "74f38523-a6a4-4915-9307-6a770aa0c4e0"). InnerVolumeSpecName "kube-api-access-fzwvs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 15:01:07 crc kubenswrapper[4722]: I0309 15:01:07.754767 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/74f38523-a6a4-4915-9307-6a770aa0c4e0-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 15:01:07 crc kubenswrapper[4722]: I0309 15:01:07.754802 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzwvs\" (UniqueName: \"kubernetes.io/projected/74f38523-a6a4-4915-9307-6a770aa0c4e0-kube-api-access-fzwvs\") on node \"crc\" DevicePath \"\"" Mar 09 15:01:07 crc kubenswrapper[4722]: I0309 15:01:07.792599 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74f38523-a6a4-4915-9307-6a770aa0c4e0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "74f38523-a6a4-4915-9307-6a770aa0c4e0" (UID: "74f38523-a6a4-4915-9307-6a770aa0c4e0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 15:01:07 crc kubenswrapper[4722]: I0309 15:01:07.857254 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/74f38523-a6a4-4915-9307-6a770aa0c4e0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 15:01:07 crc kubenswrapper[4722]: I0309 15:01:07.954781 4722 generic.go:334] "Generic (PLEG): container finished" podID="74f38523-a6a4-4915-9307-6a770aa0c4e0" containerID="e840c7f8f2755a2dc54ee6fe4e10758f8245f9ae0bd52461a7ee4143a61e4460" exitCode=0 Mar 09 15:01:07 crc kubenswrapper[4722]: I0309 15:01:07.954820 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8zxl8" event={"ID":"74f38523-a6a4-4915-9307-6a770aa0c4e0","Type":"ContainerDied","Data":"e840c7f8f2755a2dc54ee6fe4e10758f8245f9ae0bd52461a7ee4143a61e4460"} Mar 09 15:01:07 crc kubenswrapper[4722]: I0309 15:01:07.954846 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8zxl8" event={"ID":"74f38523-a6a4-4915-9307-6a770aa0c4e0","Type":"ContainerDied","Data":"b9b05c2a824e6b7a05394778dae57e8a7fdb43467229fc3a7b59dd7fb415ba3a"} Mar 09 15:01:07 crc kubenswrapper[4722]: I0309 15:01:07.954863 4722 scope.go:117] "RemoveContainer" containerID="e840c7f8f2755a2dc54ee6fe4e10758f8245f9ae0bd52461a7ee4143a61e4460" Mar 09 15:01:07 crc kubenswrapper[4722]: I0309 15:01:07.954998 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8zxl8" Mar 09 15:01:07 crc kubenswrapper[4722]: I0309 15:01:07.987486 4722 scope.go:117] "RemoveContainer" containerID="d80f616e5b21038eef385b1f12127cdae8f84f9be2b50247ffee8965af803ae4" Mar 09 15:01:07 crc kubenswrapper[4722]: I0309 15:01:07.995408 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8zxl8"] Mar 09 15:01:08 crc kubenswrapper[4722]: I0309 15:01:08.007057 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8zxl8"] Mar 09 15:01:08 crc kubenswrapper[4722]: I0309 15:01:08.012667 4722 scope.go:117] "RemoveContainer" containerID="85038d70b42465db5ac8d341064693eead791d56d7aaf2d1ec991b8e2cdf9909" Mar 09 15:01:08 crc kubenswrapper[4722]: I0309 15:01:08.070845 4722 scope.go:117] "RemoveContainer" containerID="e840c7f8f2755a2dc54ee6fe4e10758f8245f9ae0bd52461a7ee4143a61e4460" Mar 09 15:01:08 crc kubenswrapper[4722]: E0309 15:01:08.071302 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e840c7f8f2755a2dc54ee6fe4e10758f8245f9ae0bd52461a7ee4143a61e4460\": container with ID starting with e840c7f8f2755a2dc54ee6fe4e10758f8245f9ae0bd52461a7ee4143a61e4460 not found: ID does not exist" containerID="e840c7f8f2755a2dc54ee6fe4e10758f8245f9ae0bd52461a7ee4143a61e4460" Mar 09 15:01:08 crc kubenswrapper[4722]: I0309 15:01:08.071346 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e840c7f8f2755a2dc54ee6fe4e10758f8245f9ae0bd52461a7ee4143a61e4460"} err="failed to get container status \"e840c7f8f2755a2dc54ee6fe4e10758f8245f9ae0bd52461a7ee4143a61e4460\": rpc error: code = NotFound desc = could not find container \"e840c7f8f2755a2dc54ee6fe4e10758f8245f9ae0bd52461a7ee4143a61e4460\": container with ID starting with e840c7f8f2755a2dc54ee6fe4e10758f8245f9ae0bd52461a7ee4143a61e4460 not found: ID does not exist" Mar 09 15:01:08 crc 
kubenswrapper[4722]: I0309 15:01:08.071377 4722 scope.go:117] "RemoveContainer" containerID="d80f616e5b21038eef385b1f12127cdae8f84f9be2b50247ffee8965af803ae4" Mar 09 15:01:08 crc kubenswrapper[4722]: E0309 15:01:08.071772 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d80f616e5b21038eef385b1f12127cdae8f84f9be2b50247ffee8965af803ae4\": container with ID starting with d80f616e5b21038eef385b1f12127cdae8f84f9be2b50247ffee8965af803ae4 not found: ID does not exist" containerID="d80f616e5b21038eef385b1f12127cdae8f84f9be2b50247ffee8965af803ae4" Mar 09 15:01:08 crc kubenswrapper[4722]: I0309 15:01:08.071802 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d80f616e5b21038eef385b1f12127cdae8f84f9be2b50247ffee8965af803ae4"} err="failed to get container status \"d80f616e5b21038eef385b1f12127cdae8f84f9be2b50247ffee8965af803ae4\": rpc error: code = NotFound desc = could not find container \"d80f616e5b21038eef385b1f12127cdae8f84f9be2b50247ffee8965af803ae4\": container with ID starting with d80f616e5b21038eef385b1f12127cdae8f84f9be2b50247ffee8965af803ae4 not found: ID does not exist" Mar 09 15:01:08 crc kubenswrapper[4722]: I0309 15:01:08.071824 4722 scope.go:117] "RemoveContainer" containerID="85038d70b42465db5ac8d341064693eead791d56d7aaf2d1ec991b8e2cdf9909" Mar 09 15:01:08 crc kubenswrapper[4722]: E0309 15:01:08.072126 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85038d70b42465db5ac8d341064693eead791d56d7aaf2d1ec991b8e2cdf9909\": container with ID starting with 85038d70b42465db5ac8d341064693eead791d56d7aaf2d1ec991b8e2cdf9909 not found: ID does not exist" containerID="85038d70b42465db5ac8d341064693eead791d56d7aaf2d1ec991b8e2cdf9909" Mar 09 15:01:08 crc kubenswrapper[4722]: I0309 15:01:08.072146 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85038d70b42465db5ac8d341064693eead791d56d7aaf2d1ec991b8e2cdf9909"} err="failed to get container status \"85038d70b42465db5ac8d341064693eead791d56d7aaf2d1ec991b8e2cdf9909\": rpc error: code = NotFound desc = could not find container \"85038d70b42465db5ac8d341064693eead791d56d7aaf2d1ec991b8e2cdf9909\": container with ID starting with 85038d70b42465db5ac8d341064693eead791d56d7aaf2d1ec991b8e2cdf9909 not found: ID does not exist" Mar 09 15:01:08 crc kubenswrapper[4722]: I0309 15:01:08.164122 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74f38523-a6a4-4915-9307-6a770aa0c4e0" path="/var/lib/kubelet/pods/74f38523-a6a4-4915-9307-6a770aa0c4e0/volumes" Mar 09 15:01:21 crc kubenswrapper[4722]: I0309 15:01:21.527770 4722 patch_prober.go:28] interesting pod/machine-config-daemon-hjrrb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 15:01:21 crc kubenswrapper[4722]: I0309 15:01:21.528431 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 15:01:21 crc kubenswrapper[4722]: I0309 15:01:21.528488 4722 kubelet.go:2542] "SyncLoop (probe)" 
probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" Mar 09 15:01:21 crc kubenswrapper[4722]: I0309 15:01:21.529486 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3368343976f6a5d643fb9637cdcd4c447f53ede4dfe3a3fd016450bf969037e2"} pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 15:01:21 crc kubenswrapper[4722]: I0309 15:01:21.529544 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerName="machine-config-daemon" containerID="cri-o://3368343976f6a5d643fb9637cdcd4c447f53ede4dfe3a3fd016450bf969037e2" gracePeriod=600 Mar 09 15:01:22 crc kubenswrapper[4722]: I0309 15:01:22.123005 4722 generic.go:334] "Generic (PLEG): container finished" podID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerID="3368343976f6a5d643fb9637cdcd4c447f53ede4dfe3a3fd016450bf969037e2" exitCode=0 Mar 09 15:01:22 crc kubenswrapper[4722]: I0309 15:01:22.123076 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" event={"ID":"dac2aaf5-653b-4b2a-8efe-ed26bac8d648","Type":"ContainerDied","Data":"3368343976f6a5d643fb9637cdcd4c447f53ede4dfe3a3fd016450bf969037e2"} Mar 09 15:01:22 crc kubenswrapper[4722]: I0309 15:01:22.123638 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" event={"ID":"dac2aaf5-653b-4b2a-8efe-ed26bac8d648","Type":"ContainerStarted","Data":"cd181ecc4152a977f7835e27eed2a00ac92eaff69a01c8a79c0452e664e36b0f"} Mar 09 15:01:22 crc kubenswrapper[4722]: I0309 15:01:22.123666 4722 scope.go:117] "RemoveContainer" containerID="e93d9830f7a284f5574f07e5fff41b2ae62d84fa8023d799598acc091cbd5a28" Mar 09 15:02:00 crc kubenswrapper[4722]: I0309 15:02:00.173966 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551142-r7rrz"] Mar 09 15:02:00 crc kubenswrapper[4722]: E0309 15:02:00.174875 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74f38523-a6a4-4915-9307-6a770aa0c4e0" containerName="registry-server" Mar 09 15:02:00 crc kubenswrapper[4722]: I0309 15:02:00.174887 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="74f38523-a6a4-4915-9307-6a770aa0c4e0" containerName="registry-server" Mar 09 15:02:00 crc kubenswrapper[4722]: E0309 15:02:00.174922 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a88cb554-d13a-427d-959b-70271e698efe" containerName="keystone-cron" Mar 09 15:02:00 crc kubenswrapper[4722]: I0309 15:02:00.174929 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a88cb554-d13a-427d-959b-70271e698efe" containerName="keystone-cron" Mar 09 15:02:00 crc kubenswrapper[4722]: E0309 15:02:00.174945 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74f38523-a6a4-4915-9307-6a770aa0c4e0" containerName="extract-utilities" Mar 09 15:02:00 crc kubenswrapper[4722]: I0309 15:02:00.174951 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="74f38523-a6a4-4915-9307-6a770aa0c4e0" containerName="extract-utilities" Mar 09 15:02:00 crc kubenswrapper[4722]: E0309 15:02:00.174965 4722 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="74f38523-a6a4-4915-9307-6a770aa0c4e0" containerName="extract-content" Mar 09 15:02:00 crc kubenswrapper[4722]: I0309 15:02:00.174971 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="74f38523-a6a4-4915-9307-6a770aa0c4e0" containerName="extract-content" Mar 09 15:02:00 crc kubenswrapper[4722]: I0309 15:02:00.175184 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="74f38523-a6a4-4915-9307-6a770aa0c4e0" containerName="registry-server" Mar 09 15:02:00 crc kubenswrapper[4722]: I0309 15:02:00.175214 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="a88cb554-d13a-427d-959b-70271e698efe" containerName="keystone-cron" Mar 09 15:02:00 crc kubenswrapper[4722]: I0309 15:02:00.175963 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551142-r7rrz" Mar 09 15:02:00 crc kubenswrapper[4722]: I0309 15:02:00.179299 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-dr2x6" Mar 09 15:02:00 crc kubenswrapper[4722]: I0309 15:02:00.179438 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 15:02:00 crc kubenswrapper[4722]: I0309 15:02:00.179477 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 15:02:00 crc kubenswrapper[4722]: I0309 15:02:00.184788 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551142-r7rrz"] Mar 09 15:02:00 crc kubenswrapper[4722]: I0309 15:02:00.242395 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42c4z\" (UniqueName: \"kubernetes.io/projected/2e3cbe3a-4b01-4955-a466-64aeca1350d4-kube-api-access-42c4z\") pod \"auto-csr-approver-29551142-r7rrz\" (UID: \"2e3cbe3a-4b01-4955-a466-64aeca1350d4\") " pod="openshift-infra/auto-csr-approver-29551142-r7rrz" Mar 09 15:02:00 crc kubenswrapper[4722]: I0309 15:02:00.345305 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42c4z\" (UniqueName: \"kubernetes.io/projected/2e3cbe3a-4b01-4955-a466-64aeca1350d4-kube-api-access-42c4z\") pod \"auto-csr-approver-29551142-r7rrz\" (UID: \"2e3cbe3a-4b01-4955-a466-64aeca1350d4\") " pod="openshift-infra/auto-csr-approver-29551142-r7rrz" Mar 09 15:02:00 crc kubenswrapper[4722]: I0309 15:02:00.365003 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42c4z\" (UniqueName: \"kubernetes.io/projected/2e3cbe3a-4b01-4955-a466-64aeca1350d4-kube-api-access-42c4z\") pod \"auto-csr-approver-29551142-r7rrz\" (UID: \"2e3cbe3a-4b01-4955-a466-64aeca1350d4\") " pod="openshift-infra/auto-csr-approver-29551142-r7rrz" Mar 09 15:02:00 crc kubenswrapper[4722]: I0309 15:02:00.514566 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551142-r7rrz" Mar 09 15:02:01 crc kubenswrapper[4722]: I0309 15:02:01.021768 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551142-r7rrz"] Mar 09 15:02:01 crc kubenswrapper[4722]: I0309 15:02:01.621787 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551142-r7rrz" event={"ID":"2e3cbe3a-4b01-4955-a466-64aeca1350d4","Type":"ContainerStarted","Data":"dea0c1ab1383252c20893a72c5aa33f30c721db0705f052af6cde50c2cced9bf"} Mar 09 15:02:03 crc kubenswrapper[4722]: I0309 15:02:03.648926 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551142-r7rrz" event={"ID":"2e3cbe3a-4b01-4955-a466-64aeca1350d4","Type":"ContainerStarted","Data":"ca2e5d2663ad9afb088fbb5e128106528854c30f7dfa3e0d597c7fecedeafbf9"} Mar 09 15:02:03 crc kubenswrapper[4722]: I0309 15:02:03.674892 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551142-r7rrz" podStartSLOduration=1.498513641 podStartE2EDuration="3.674843985s" podCreationTimestamp="2026-03-09 15:02:00 +0000 UTC" firstStartedPulling="2026-03-09 15:02:01.019410735 +0000 UTC m=+3561.574979321" lastFinishedPulling="2026-03-09 15:02:03.195741089 +0000 UTC m=+3563.751309665" observedRunningTime="2026-03-09 15:02:03.667965479 +0000 UTC m=+3564.223534055" watchObservedRunningTime="2026-03-09 15:02:03.674843985 +0000 UTC m=+3564.230412561" Mar 09 15:02:04 crc kubenswrapper[4722]: I0309 15:02:04.661405 4722 generic.go:334] "Generic (PLEG): container finished" podID="2e3cbe3a-4b01-4955-a466-64aeca1350d4" containerID="ca2e5d2663ad9afb088fbb5e128106528854c30f7dfa3e0d597c7fecedeafbf9" exitCode=0 Mar 09 15:02:04 crc kubenswrapper[4722]: I0309 15:02:04.661529 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551142-r7rrz" event={"ID":"2e3cbe3a-4b01-4955-a466-64aeca1350d4","Type":"ContainerDied","Data":"ca2e5d2663ad9afb088fbb5e128106528854c30f7dfa3e0d597c7fecedeafbf9"} Mar 09 15:02:06 crc kubenswrapper[4722]: I0309 15:02:06.138575 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551142-r7rrz" Mar 09 15:02:06 crc kubenswrapper[4722]: I0309 15:02:06.221467 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42c4z\" (UniqueName: \"kubernetes.io/projected/2e3cbe3a-4b01-4955-a466-64aeca1350d4-kube-api-access-42c4z\") pod \"2e3cbe3a-4b01-4955-a466-64aeca1350d4\" (UID: \"2e3cbe3a-4b01-4955-a466-64aeca1350d4\") " Mar 09 15:02:06 crc kubenswrapper[4722]: I0309 15:02:06.227701 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e3cbe3a-4b01-4955-a466-64aeca1350d4-kube-api-access-42c4z" (OuterVolumeSpecName: "kube-api-access-42c4z") pod "2e3cbe3a-4b01-4955-a466-64aeca1350d4" (UID: "2e3cbe3a-4b01-4955-a466-64aeca1350d4"). InnerVolumeSpecName "kube-api-access-42c4z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 15:02:06 crc kubenswrapper[4722]: I0309 15:02:06.230494 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42c4z\" (UniqueName: \"kubernetes.io/projected/2e3cbe3a-4b01-4955-a466-64aeca1350d4-kube-api-access-42c4z\") on node \"crc\" DevicePath \"\"" Mar 09 15:02:06 crc kubenswrapper[4722]: I0309 15:02:06.690380 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551142-r7rrz" event={"ID":"2e3cbe3a-4b01-4955-a466-64aeca1350d4","Type":"ContainerDied","Data":"dea0c1ab1383252c20893a72c5aa33f30c721db0705f052af6cde50c2cced9bf"} Mar 09 15:02:06 crc kubenswrapper[4722]: I0309 15:02:06.690686 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dea0c1ab1383252c20893a72c5aa33f30c721db0705f052af6cde50c2cced9bf" Mar 09 15:02:06 crc kubenswrapper[4722]: I0309 15:02:06.690623 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551142-r7rrz" Mar 09 15:02:06 crc kubenswrapper[4722]: I0309 15:02:06.751021 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551136-2mrlb"] Mar 09 15:02:06 crc kubenswrapper[4722]: I0309 15:02:06.771030 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551136-2mrlb"] Mar 09 15:02:08 crc kubenswrapper[4722]: I0309 15:02:08.167569 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35324614-f6de-45e7-817e-2df79b732b87" path="/var/lib/kubelet/pods/35324614-f6de-45e7-817e-2df79b732b87/volumes" Mar 09 15:02:29 crc kubenswrapper[4722]: I0309 15:02:29.291310 4722 scope.go:117] "RemoveContainer" containerID="56a727c2c234062d5084ac4bf17b8cadc5b1f074f388fe0c4ce72cf5e063078e" Mar 09 15:03:21 crc kubenswrapper[4722]: I0309 15:03:21.528083 4722 patch_prober.go:28] interesting pod/machine-config-daemon-hjrrb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 15:03:21 crc kubenswrapper[4722]: I0309 15:03:21.528749 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 15:03:51 crc kubenswrapper[4722]: I0309 15:03:51.527985 4722 patch_prober.go:28] interesting pod/machine-config-daemon-hjrrb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 15:03:51 crc kubenswrapper[4722]: I0309 15:03:51.528626 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 15:04:00 crc kubenswrapper[4722]: I0309 15:04:00.145185 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551144-jtsn5"] Mar 09 15:04:00 crc 
kubenswrapper[4722]: E0309 15:04:00.146237 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e3cbe3a-4b01-4955-a466-64aeca1350d4" containerName="oc" Mar 09 15:04:00 crc kubenswrapper[4722]: I0309 15:04:00.146250 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e3cbe3a-4b01-4955-a466-64aeca1350d4" containerName="oc" Mar 09 15:04:00 crc kubenswrapper[4722]: I0309 15:04:00.146486 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e3cbe3a-4b01-4955-a466-64aeca1350d4" containerName="oc" Mar 09 15:04:00 crc kubenswrapper[4722]: I0309 15:04:00.147295 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551144-jtsn5" Mar 09 15:04:00 crc kubenswrapper[4722]: I0309 15:04:00.150648 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-dr2x6" Mar 09 15:04:00 crc kubenswrapper[4722]: I0309 15:04:00.150932 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 15:04:00 crc kubenswrapper[4722]: I0309 15:04:00.151319 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 15:04:00 crc kubenswrapper[4722]: I0309 15:04:00.170647 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551144-jtsn5"] Mar 09 15:04:00 crc kubenswrapper[4722]: I0309 15:04:00.183738 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9ghl\" (UniqueName: \"kubernetes.io/projected/b51890c5-640c-41fc-9fa5-46b27e01a70f-kube-api-access-x9ghl\") pod \"auto-csr-approver-29551144-jtsn5\" (UID: \"b51890c5-640c-41fc-9fa5-46b27e01a70f\") " pod="openshift-infra/auto-csr-approver-29551144-jtsn5" Mar 09 15:04:00 crc kubenswrapper[4722]: I0309 15:04:00.286741 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9ghl\" (UniqueName: \"kubernetes.io/projected/b51890c5-640c-41fc-9fa5-46b27e01a70f-kube-api-access-x9ghl\") pod \"auto-csr-approver-29551144-jtsn5\" (UID: \"b51890c5-640c-41fc-9fa5-46b27e01a70f\") " pod="openshift-infra/auto-csr-approver-29551144-jtsn5" Mar 09 15:04:00 crc kubenswrapper[4722]: I0309 15:04:00.307358 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9ghl\" (UniqueName: \"kubernetes.io/projected/b51890c5-640c-41fc-9fa5-46b27e01a70f-kube-api-access-x9ghl\") pod \"auto-csr-approver-29551144-jtsn5\" (UID: \"b51890c5-640c-41fc-9fa5-46b27e01a70f\") " pod="openshift-infra/auto-csr-approver-29551144-jtsn5" Mar 09 15:04:00 crc kubenswrapper[4722]: I0309 15:04:00.469850 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551144-jtsn5" Mar 09 15:04:00 crc kubenswrapper[4722]: I0309 15:04:00.957843 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551144-jtsn5"] Mar 09 15:04:00 crc kubenswrapper[4722]: I0309 15:04:00.967131 4722 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 15:04:01 crc kubenswrapper[4722]: I0309 15:04:01.084939 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551144-jtsn5" event={"ID":"b51890c5-640c-41fc-9fa5-46b27e01a70f","Type":"ContainerStarted","Data":"8869b2bbd46e3b4fdc7f34963753997cc129e3dc2af893e458b6a167d3bd694a"} Mar 09 15:04:03 crc kubenswrapper[4722]: I0309 15:04:03.115779 4722 generic.go:334] "Generic (PLEG): container finished" podID="b51890c5-640c-41fc-9fa5-46b27e01a70f" containerID="64c22c60d207466957d4cb3128b0cd7668f4bfbb95f46ac446ec5f41037ff3d7" exitCode=0 Mar 09 15:04:03 crc kubenswrapper[4722]: I0309 15:04:03.115831 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551144-jtsn5" event={"ID":"b51890c5-640c-41fc-9fa5-46b27e01a70f","Type":"ContainerDied","Data":"64c22c60d207466957d4cb3128b0cd7668f4bfbb95f46ac446ec5f41037ff3d7"} Mar 09 15:04:04 crc kubenswrapper[4722]: I0309 15:04:04.551584 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551144-jtsn5" Mar 09 15:04:04 crc kubenswrapper[4722]: I0309 15:04:04.596515 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x9ghl\" (UniqueName: \"kubernetes.io/projected/b51890c5-640c-41fc-9fa5-46b27e01a70f-kube-api-access-x9ghl\") pod \"b51890c5-640c-41fc-9fa5-46b27e01a70f\" (UID: \"b51890c5-640c-41fc-9fa5-46b27e01a70f\") " Mar 09 15:04:04 crc kubenswrapper[4722]: I0309 15:04:04.604273 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b51890c5-640c-41fc-9fa5-46b27e01a70f-kube-api-access-x9ghl" (OuterVolumeSpecName: "kube-api-access-x9ghl") pod "b51890c5-640c-41fc-9fa5-46b27e01a70f" (UID: "b51890c5-640c-41fc-9fa5-46b27e01a70f"). InnerVolumeSpecName "kube-api-access-x9ghl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 15:04:04 crc kubenswrapper[4722]: I0309 15:04:04.699953 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9ghl\" (UniqueName: \"kubernetes.io/projected/b51890c5-640c-41fc-9fa5-46b27e01a70f-kube-api-access-x9ghl\") on node \"crc\" DevicePath \"\"" Mar 09 15:04:05 crc kubenswrapper[4722]: I0309 15:04:05.137581 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551144-jtsn5" event={"ID":"b51890c5-640c-41fc-9fa5-46b27e01a70f","Type":"ContainerDied","Data":"8869b2bbd46e3b4fdc7f34963753997cc129e3dc2af893e458b6a167d3bd694a"} Mar 09 15:04:05 crc kubenswrapper[4722]: I0309 15:04:05.137962 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8869b2bbd46e3b4fdc7f34963753997cc129e3dc2af893e458b6a167d3bd694a" Mar 09 15:04:05 crc kubenswrapper[4722]: I0309 15:04:05.137793 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551144-jtsn5" Mar 09 15:04:05 crc kubenswrapper[4722]: I0309 15:04:05.627309 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551138-25c88"] Mar 09 15:04:05 crc kubenswrapper[4722]: I0309 15:04:05.645552 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551138-25c88"] Mar 09 15:04:06 crc kubenswrapper[4722]: I0309 15:04:06.163763 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8319fe6-90bd-445e-98c1-0b4b5b0e5888" path="/var/lib/kubelet/pods/d8319fe6-90bd-445e-98c1-0b4b5b0e5888/volumes" Mar 09 15:04:21 crc kubenswrapper[4722]: I0309 15:04:21.527683 4722 patch_prober.go:28] interesting pod/machine-config-daemon-hjrrb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 15:04:21 crc kubenswrapper[4722]: I0309 15:04:21.528580 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 15:04:21 crc kubenswrapper[4722]: I0309 15:04:21.528637 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" Mar 09 15:04:21 crc kubenswrapper[4722]: I0309 15:04:21.529777 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cd181ecc4152a977f7835e27eed2a00ac92eaff69a01c8a79c0452e664e36b0f"} pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 15:04:21 crc kubenswrapper[4722]: I0309 15:04:21.529841 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerName="machine-config-daemon" containerID="cri-o://cd181ecc4152a977f7835e27eed2a00ac92eaff69a01c8a79c0452e664e36b0f" gracePeriod=600 Mar 09 15:04:21 crc kubenswrapper[4722]: E0309 15:04:21.670219 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 15:04:22 crc kubenswrapper[4722]: I0309 15:04:22.319927 4722 generic.go:334] "Generic (PLEG): container finished" podID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerID="cd181ecc4152a977f7835e27eed2a00ac92eaff69a01c8a79c0452e664e36b0f" exitCode=0 Mar 09 15:04:22 crc kubenswrapper[4722]: I0309 15:04:22.320023 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" event={"ID":"dac2aaf5-653b-4b2a-8efe-ed26bac8d648","Type":"ContainerDied","Data":"cd181ecc4152a977f7835e27eed2a00ac92eaff69a01c8a79c0452e664e36b0f"} Mar 09 
15:04:22 crc kubenswrapper[4722]: I0309 15:04:22.320213 4722 scope.go:117] "RemoveContainer" containerID="3368343976f6a5d643fb9637cdcd4c447f53ede4dfe3a3fd016450bf969037e2" Mar 09 15:04:22 crc kubenswrapper[4722]: I0309 15:04:22.321102 4722 scope.go:117] "RemoveContainer" containerID="cd181ecc4152a977f7835e27eed2a00ac92eaff69a01c8a79c0452e664e36b0f" Mar 09 15:04:22 crc kubenswrapper[4722]: E0309 15:04:22.321506 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 15:04:29 crc kubenswrapper[4722]: I0309 15:04:29.402252 4722 scope.go:117] "RemoveContainer" containerID="210edfc086b973d8e86c1b119dd75b6cf900542ae5a5b9792d6f2787cca5c2d3" Mar 09 15:04:37 crc kubenswrapper[4722]: I0309 15:04:37.149149 4722 scope.go:117] "RemoveContainer" containerID="cd181ecc4152a977f7835e27eed2a00ac92eaff69a01c8a79c0452e664e36b0f" Mar 09 15:04:37 crc kubenswrapper[4722]: E0309 15:04:37.149988 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 15:04:50 crc kubenswrapper[4722]: I0309 15:04:50.150091 4722 scope.go:117] "RemoveContainer" containerID="cd181ecc4152a977f7835e27eed2a00ac92eaff69a01c8a79c0452e664e36b0f" Mar 09 15:04:50 crc kubenswrapper[4722]: E0309 15:04:50.151411 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 15:05:01 crc kubenswrapper[4722]: I0309 15:05:01.150043 4722 scope.go:117] "RemoveContainer" containerID="cd181ecc4152a977f7835e27eed2a00ac92eaff69a01c8a79c0452e664e36b0f" Mar 09 15:05:01 crc kubenswrapper[4722]: E0309 15:05:01.150913 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 15:05:12 crc kubenswrapper[4722]: I0309 15:05:12.149497 4722 scope.go:117] "RemoveContainer" containerID="cd181ecc4152a977f7835e27eed2a00ac92eaff69a01c8a79c0452e664e36b0f" Mar 09 15:05:12 crc kubenswrapper[4722]: E0309 15:05:12.150569 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 15:05:26 crc kubenswrapper[4722]: I0309 15:05:26.149598 4722 scope.go:117] "RemoveContainer" containerID="cd181ecc4152a977f7835e27eed2a00ac92eaff69a01c8a79c0452e664e36b0f" Mar 09 15:05:26 crc kubenswrapper[4722]: E0309 15:05:26.150620 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 15:05:39 crc kubenswrapper[4722]: I0309 15:05:39.150287 4722 scope.go:117] "RemoveContainer" containerID="cd181ecc4152a977f7835e27eed2a00ac92eaff69a01c8a79c0452e664e36b0f" Mar 09 15:05:39 crc kubenswrapper[4722]: E0309 15:05:39.151094 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 15:05:54 crc kubenswrapper[4722]: I0309 15:05:54.149074 4722 scope.go:117] "RemoveContainer" containerID="cd181ecc4152a977f7835e27eed2a00ac92eaff69a01c8a79c0452e664e36b0f" Mar 09 15:05:54 crc kubenswrapper[4722]: E0309 15:05:54.150025 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 15:06:00 crc kubenswrapper[4722]: I0309 15:06:00.182333 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551146-tbrjq"] Mar 09 15:06:00 crc kubenswrapper[4722]: E0309 15:06:00.183624 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b51890c5-640c-41fc-9fa5-46b27e01a70f" containerName="oc" Mar 09 15:06:00 crc kubenswrapper[4722]: I0309 15:06:00.183640 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="b51890c5-640c-41fc-9fa5-46b27e01a70f" containerName="oc" Mar 09 15:06:00 crc kubenswrapper[4722]: I0309 15:06:00.183903 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="b51890c5-640c-41fc-9fa5-46b27e01a70f" containerName="oc" Mar 09 15:06:00 crc kubenswrapper[4722]: I0309 15:06:00.184847 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551146-tbrjq" Mar 09 15:06:00 crc kubenswrapper[4722]: I0309 15:06:00.192929 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 15:06:00 crc kubenswrapper[4722]: I0309 15:06:00.193185 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-dr2x6" Mar 09 15:06:00 crc kubenswrapper[4722]: I0309 15:06:00.206602 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 15:06:00 crc kubenswrapper[4722]: I0309 15:06:00.210740 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551146-tbrjq"] Mar 09 15:06:00 crc kubenswrapper[4722]: I0309 15:06:00.220293 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6j7j\" (UniqueName: \"kubernetes.io/projected/d1279f77-ea98-417c-9845-cbbeec2536ac-kube-api-access-r6j7j\") pod \"auto-csr-approver-29551146-tbrjq\" (UID: \"d1279f77-ea98-417c-9845-cbbeec2536ac\") " pod="openshift-infra/auto-csr-approver-29551146-tbrjq" Mar 09 15:06:00 crc kubenswrapper[4722]: I0309 15:06:00.322604 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6j7j\" (UniqueName: \"kubernetes.io/projected/d1279f77-ea98-417c-9845-cbbeec2536ac-kube-api-access-r6j7j\") pod \"auto-csr-approver-29551146-tbrjq\" (UID: \"d1279f77-ea98-417c-9845-cbbeec2536ac\") " pod="openshift-infra/auto-csr-approver-29551146-tbrjq" Mar 09 15:06:00 crc kubenswrapper[4722]: I0309 15:06:00.342060 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6j7j\" (UniqueName: \"kubernetes.io/projected/d1279f77-ea98-417c-9845-cbbeec2536ac-kube-api-access-r6j7j\") pod \"auto-csr-approver-29551146-tbrjq\" (UID: \"d1279f77-ea98-417c-9845-cbbeec2536ac\") " pod="openshift-infra/auto-csr-approver-29551146-tbrjq" Mar 09 15:06:00 crc kubenswrapper[4722]: I0309 15:06:00.515220 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551146-tbrjq" Mar 09 15:06:01 crc kubenswrapper[4722]: I0309 15:06:01.030003 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551146-tbrjq"] Mar 09 15:06:01 crc kubenswrapper[4722]: I0309 15:06:01.492288 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551146-tbrjq" event={"ID":"d1279f77-ea98-417c-9845-cbbeec2536ac","Type":"ContainerStarted","Data":"5d5b8af47ff9de3f9665f0d514a9a9a3c9060d8e203c3d97631f69d7fed29517"} Mar 09 15:06:02 crc kubenswrapper[4722]: I0309 15:06:02.514168 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551146-tbrjq" event={"ID":"d1279f77-ea98-417c-9845-cbbeec2536ac","Type":"ContainerStarted","Data":"1d6155c10421ca5dc174dea3ca1101056b3b3d28a85cf7edab09591ca6af7ff8"} Mar 09 15:06:02 crc kubenswrapper[4722]: I0309 15:06:02.544421 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551146-tbrjq" podStartSLOduration=1.5111415990000001 podStartE2EDuration="2.544394541s" podCreationTimestamp="2026-03-09 15:06:00 +0000 UTC" firstStartedPulling="2026-03-09 15:06:01.034131683 +0000 UTC m=+3801.589700259" lastFinishedPulling="2026-03-09 15:06:02.067384585 +0000 UTC m=+3802.622953201" observedRunningTime="2026-03-09 15:06:02.527038822 +0000 UTC m=+3803.082607398" watchObservedRunningTime="2026-03-09 15:06:02.544394541 +0000 UTC m=+3803.099963117" Mar 09 15:06:03 crc kubenswrapper[4722]: I0309 15:06:03.527087 4722 generic.go:334] "Generic (PLEG): container finished" podID="d1279f77-ea98-417c-9845-cbbeec2536ac" containerID="1d6155c10421ca5dc174dea3ca1101056b3b3d28a85cf7edab09591ca6af7ff8" exitCode=0 Mar 09 15:06:03 crc kubenswrapper[4722]: I0309 15:06:03.527444 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551146-tbrjq" event={"ID":"d1279f77-ea98-417c-9845-cbbeec2536ac","Type":"ContainerDied","Data":"1d6155c10421ca5dc174dea3ca1101056b3b3d28a85cf7edab09591ca6af7ff8"} Mar 09 15:06:04 crc kubenswrapper[4722]: I0309 15:06:04.974033 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551146-tbrjq" Mar 09 15:06:05 crc kubenswrapper[4722]: I0309 15:06:05.041842 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6j7j\" (UniqueName: \"kubernetes.io/projected/d1279f77-ea98-417c-9845-cbbeec2536ac-kube-api-access-r6j7j\") pod \"d1279f77-ea98-417c-9845-cbbeec2536ac\" (UID: \"d1279f77-ea98-417c-9845-cbbeec2536ac\") " Mar 09 15:06:05 crc kubenswrapper[4722]: I0309 15:06:05.059086 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1279f77-ea98-417c-9845-cbbeec2536ac-kube-api-access-r6j7j" (OuterVolumeSpecName: "kube-api-access-r6j7j") pod "d1279f77-ea98-417c-9845-cbbeec2536ac" (UID: "d1279f77-ea98-417c-9845-cbbeec2536ac"). InnerVolumeSpecName "kube-api-access-r6j7j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 15:06:05 crc kubenswrapper[4722]: I0309 15:06:05.145177 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6j7j\" (UniqueName: \"kubernetes.io/projected/d1279f77-ea98-417c-9845-cbbeec2536ac-kube-api-access-r6j7j\") on node \"crc\" DevicePath \"\"" Mar 09 15:06:05 crc kubenswrapper[4722]: I0309 15:06:05.564198 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551146-tbrjq" event={"ID":"d1279f77-ea98-417c-9845-cbbeec2536ac","Type":"ContainerDied","Data":"5d5b8af47ff9de3f9665f0d514a9a9a3c9060d8e203c3d97631f69d7fed29517"} Mar 09 15:06:05 crc kubenswrapper[4722]: I0309 15:06:05.564416 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d5b8af47ff9de3f9665f0d514a9a9a3c9060d8e203c3d97631f69d7fed29517" Mar 09 15:06:05 crc kubenswrapper[4722]: I0309 15:06:05.564496 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551146-tbrjq" Mar 09 15:06:05 crc kubenswrapper[4722]: I0309 15:06:05.641356 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551140-sgbwp"] Mar 09 15:06:05 crc kubenswrapper[4722]: I0309 15:06:05.657505 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551140-sgbwp"] Mar 09 15:06:06 crc kubenswrapper[4722]: I0309 15:06:06.149480 4722 scope.go:117] "RemoveContainer" containerID="cd181ecc4152a977f7835e27eed2a00ac92eaff69a01c8a79c0452e664e36b0f" Mar 09 15:06:06 crc kubenswrapper[4722]: E0309 15:06:06.149801 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 15:06:06 crc kubenswrapper[4722]: I0309 15:06:06.163172 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c493907c-5318-4159-930b-2c5108e7b026" path="/var/lib/kubelet/pods/c493907c-5318-4159-930b-2c5108e7b026/volumes" Mar 09 15:06:20 crc kubenswrapper[4722]: I0309 15:06:20.157091 4722 scope.go:117] "RemoveContainer" containerID="cd181ecc4152a977f7835e27eed2a00ac92eaff69a01c8a79c0452e664e36b0f" Mar 09 15:06:20 crc kubenswrapper[4722]: E0309 15:06:20.158255 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 15:06:29 crc kubenswrapper[4722]: I0309 15:06:29.518486 4722 scope.go:117] "RemoveContainer" containerID="8a743afe9645604bd6b4abac660fce26be70e597a537f1522d327b50577f37c5" Mar 09 15:06:33 crc kubenswrapper[4722]: I0309 15:06:33.149839 4722 scope.go:117] "RemoveContainer" containerID="cd181ecc4152a977f7835e27eed2a00ac92eaff69a01c8a79c0452e664e36b0f" Mar 09 15:06:33 crc kubenswrapper[4722]: E0309 15:06:33.151006 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 15:06:46 crc kubenswrapper[4722]: I0309 15:06:46.149358 4722 scope.go:117] "RemoveContainer" containerID="cd181ecc4152a977f7835e27eed2a00ac92eaff69a01c8a79c0452e664e36b0f" Mar 09 15:06:46 crc kubenswrapper[4722]: E0309 15:06:46.150433 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 15:06:59 crc kubenswrapper[4722]: I0309 15:06:59.149935 4722 scope.go:117] "RemoveContainer" containerID="cd181ecc4152a977f7835e27eed2a00ac92eaff69a01c8a79c0452e664e36b0f" Mar 09 15:06:59 crc kubenswrapper[4722]: E0309 15:06:59.150773 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 15:07:13 crc kubenswrapper[4722]: I0309 15:07:13.149974 4722 scope.go:117] "RemoveContainer" containerID="cd181ecc4152a977f7835e27eed2a00ac92eaff69a01c8a79c0452e664e36b0f" Mar 09 15:07:13 crc kubenswrapper[4722]: E0309 15:07:13.151308 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 15:07:28 crc kubenswrapper[4722]: I0309 15:07:28.149141 4722 scope.go:117] "RemoveContainer" containerID="cd181ecc4152a977f7835e27eed2a00ac92eaff69a01c8a79c0452e664e36b0f" Mar 09 15:07:28 crc kubenswrapper[4722]: E0309 15:07:28.150001 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 15:07:41 crc kubenswrapper[4722]: I0309 15:07:41.149716 4722 scope.go:117] "RemoveContainer" containerID="cd181ecc4152a977f7835e27eed2a00ac92eaff69a01c8a79c0452e664e36b0f" Mar 09 15:07:41 crc kubenswrapper[4722]: E0309 15:07:41.150813 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 15:07:54 crc kubenswrapper[4722]: I0309 15:07:54.150643 4722 scope.go:117] "RemoveContainer" containerID="cd181ecc4152a977f7835e27eed2a00ac92eaff69a01c8a79c0452e664e36b0f" Mar 09 15:07:54 crc kubenswrapper[4722]: E0309 15:07:54.151406 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 15:08:00 crc kubenswrapper[4722]: I0309 15:08:00.163483 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551148-j78st"] Mar 09 15:08:00 crc kubenswrapper[4722]: E0309 15:08:00.164640 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1279f77-ea98-417c-9845-cbbeec2536ac" containerName="oc" Mar 09 15:08:00 crc kubenswrapper[4722]: I0309 15:08:00.164662 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1279f77-ea98-417c-9845-cbbeec2536ac" containerName="oc" Mar 09 15:08:00 crc kubenswrapper[4722]: I0309 15:08:00.164964 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1279f77-ea98-417c-9845-cbbeec2536ac" containerName="oc" Mar 09 15:08:00 crc kubenswrapper[4722]: I0309 15:08:00.165938 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551148-j78st"] Mar 09 15:08:00 crc kubenswrapper[4722]: I0309 15:08:00.166062 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551148-j78st" Mar 09 15:08:00 crc kubenswrapper[4722]: I0309 15:08:00.169327 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 15:08:00 crc kubenswrapper[4722]: I0309 15:08:00.169370 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-dr2x6" Mar 09 15:08:00 crc kubenswrapper[4722]: I0309 15:08:00.169817 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 15:08:00 crc kubenswrapper[4722]: I0309 15:08:00.248868 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74zbw\" (UniqueName: \"kubernetes.io/projected/f0975b72-4ff8-4c2a-b11f-b3ade5a05e7c-kube-api-access-74zbw\") pod \"auto-csr-approver-29551148-j78st\" (UID: \"f0975b72-4ff8-4c2a-b11f-b3ade5a05e7c\") " pod="openshift-infra/auto-csr-approver-29551148-j78st" Mar 09 15:08:00 crc kubenswrapper[4722]: I0309 15:08:00.351160 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74zbw\" (UniqueName: \"kubernetes.io/projected/f0975b72-4ff8-4c2a-b11f-b3ade5a05e7c-kube-api-access-74zbw\") pod \"auto-csr-approver-29551148-j78st\" (UID: \"f0975b72-4ff8-4c2a-b11f-b3ade5a05e7c\") " pod="openshift-infra/auto-csr-approver-29551148-j78st" Mar 09 15:08:00 crc kubenswrapper[4722]: I0309 15:08:00.382120 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74zbw\" (UniqueName: \"kubernetes.io/projected/f0975b72-4ff8-4c2a-b11f-b3ade5a05e7c-kube-api-access-74zbw\") pod \"auto-csr-approver-29551148-j78st\" (UID: \"f0975b72-4ff8-4c2a-b11f-b3ade5a05e7c\") " pod="openshift-infra/auto-csr-approver-29551148-j78st" Mar 09 15:08:00 crc kubenswrapper[4722]: I0309 15:08:00.491849 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551148-j78st" Mar 09 15:08:01 crc kubenswrapper[4722]: I0309 15:08:01.045741 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551148-j78st"] Mar 09 15:08:01 crc kubenswrapper[4722]: I0309 15:08:01.933776 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551148-j78st" event={"ID":"f0975b72-4ff8-4c2a-b11f-b3ade5a05e7c","Type":"ContainerStarted","Data":"68e2c5492be8e4b60b0f173251de8c98a3088bfd6b8f43a06f514b0ff42f8f07"} Mar 09 15:08:02 crc kubenswrapper[4722]: I0309 15:08:02.946855 4722 generic.go:334] "Generic (PLEG): container finished" podID="f0975b72-4ff8-4c2a-b11f-b3ade5a05e7c" containerID="1d5c5c98c936ab05ca6826755ddb291638f469ab090a1d7528d6744a2c5a77c3" exitCode=0 Mar 09 15:08:02 crc kubenswrapper[4722]: I0309 15:08:02.947190 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551148-j78st" event={"ID":"f0975b72-4ff8-4c2a-b11f-b3ade5a05e7c","Type":"ContainerDied","Data":"1d5c5c98c936ab05ca6826755ddb291638f469ab090a1d7528d6744a2c5a77c3"} Mar 09 15:08:04 crc kubenswrapper[4722]: I0309 15:08:04.534713 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551148-j78st" Mar 09 15:08:04 crc kubenswrapper[4722]: I0309 15:08:04.675911 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74zbw\" (UniqueName: \"kubernetes.io/projected/f0975b72-4ff8-4c2a-b11f-b3ade5a05e7c-kube-api-access-74zbw\") pod \"f0975b72-4ff8-4c2a-b11f-b3ade5a05e7c\" (UID: \"f0975b72-4ff8-4c2a-b11f-b3ade5a05e7c\") " Mar 09 15:08:04 crc kubenswrapper[4722]: I0309 15:08:04.682071 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0975b72-4ff8-4c2a-b11f-b3ade5a05e7c-kube-api-access-74zbw" (OuterVolumeSpecName: "kube-api-access-74zbw") pod "f0975b72-4ff8-4c2a-b11f-b3ade5a05e7c" (UID: "f0975b72-4ff8-4c2a-b11f-b3ade5a05e7c"). InnerVolumeSpecName "kube-api-access-74zbw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 15:08:04 crc kubenswrapper[4722]: I0309 15:08:04.779295 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74zbw\" (UniqueName: \"kubernetes.io/projected/f0975b72-4ff8-4c2a-b11f-b3ade5a05e7c-kube-api-access-74zbw\") on node \"crc\" DevicePath \"\"" Mar 09 15:08:04 crc kubenswrapper[4722]: I0309 15:08:04.967121 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551148-j78st" event={"ID":"f0975b72-4ff8-4c2a-b11f-b3ade5a05e7c","Type":"ContainerDied","Data":"68e2c5492be8e4b60b0f173251de8c98a3088bfd6b8f43a06f514b0ff42f8f07"} Mar 09 15:08:04 crc kubenswrapper[4722]: I0309 15:08:04.967393 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68e2c5492be8e4b60b0f173251de8c98a3088bfd6b8f43a06f514b0ff42f8f07" Mar 09 15:08:04 crc kubenswrapper[4722]: I0309 15:08:04.967441 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551148-j78st" Mar 09 15:08:05 crc kubenswrapper[4722]: I0309 15:08:05.623654 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551142-r7rrz"] Mar 09 15:08:05 crc kubenswrapper[4722]: I0309 15:08:05.638103 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551142-r7rrz"] Mar 09 15:08:06 crc kubenswrapper[4722]: I0309 15:08:06.149012 4722 scope.go:117] "RemoveContainer" containerID="cd181ecc4152a977f7835e27eed2a00ac92eaff69a01c8a79c0452e664e36b0f" Mar 09 15:08:06 crc kubenswrapper[4722]: E0309 15:08:06.149668 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 15:08:06 crc kubenswrapper[4722]: I0309 15:08:06.170005 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e3cbe3a-4b01-4955-a466-64aeca1350d4" path="/var/lib/kubelet/pods/2e3cbe3a-4b01-4955-a466-64aeca1350d4/volumes" Mar 09 15:08:21 crc kubenswrapper[4722]: I0309 15:08:21.149724 4722 scope.go:117] "RemoveContainer" containerID="cd181ecc4152a977f7835e27eed2a00ac92eaff69a01c8a79c0452e664e36b0f" Mar 09 15:08:21 crc kubenswrapper[4722]: E0309 15:08:21.150662 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 15:08:29 crc kubenswrapper[4722]: I0309 15:08:29.636400 4722 scope.go:117] "RemoveContainer" containerID="ca2e5d2663ad9afb088fbb5e128106528854c30f7dfa3e0d597c7fecedeafbf9" Mar 09 15:08:35 crc kubenswrapper[4722]: I0309 15:08:35.150250 4722 scope.go:117] "RemoveContainer" containerID="cd181ecc4152a977f7835e27eed2a00ac92eaff69a01c8a79c0452e664e36b0f" Mar 09 15:08:35 crc kubenswrapper[4722]: E0309 15:08:35.151229 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 15:08:47 crc kubenswrapper[4722]: I0309 15:08:47.150111 4722 scope.go:117] "RemoveContainer" containerID="cd181ecc4152a977f7835e27eed2a00ac92eaff69a01c8a79c0452e664e36b0f" Mar 09 15:08:47 crc kubenswrapper[4722]: E0309 15:08:47.151335 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 
15:08:58 crc kubenswrapper[4722]: I0309 15:08:58.149756 4722 scope.go:117] "RemoveContainer" containerID="cd181ecc4152a977f7835e27eed2a00ac92eaff69a01c8a79c0452e664e36b0f" Mar 09 15:08:58 crc kubenswrapper[4722]: E0309 15:08:58.151177 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 15:09:12 crc kubenswrapper[4722]: I0309 15:09:12.569628 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 09 15:09:13 crc kubenswrapper[4722]: I0309 15:09:13.150222 4722 scope.go:117] "RemoveContainer" containerID="cd181ecc4152a977f7835e27eed2a00ac92eaff69a01c8a79c0452e664e36b0f" Mar 09 15:09:13 crc kubenswrapper[4722]: E0309 15:09:13.150743 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 15:09:24 crc kubenswrapper[4722]: I0309 15:09:24.150047 4722 scope.go:117] "RemoveContainer" containerID="cd181ecc4152a977f7835e27eed2a00ac92eaff69a01c8a79c0452e664e36b0f" Mar 09 15:09:24 crc kubenswrapper[4722]: I0309 15:09:24.901795 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" event={"ID":"dac2aaf5-653b-4b2a-8efe-ed26bac8d648","Type":"ContainerStarted","Data":"605e7cfbce2d49113bae14138b0367a77ac331e6056b4329529ab18e27f72b42"} Mar 09 15:10:00 crc kubenswrapper[4722]: I0309 15:10:00.173177 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551150-4lzhw"] Mar 09 15:10:00 crc kubenswrapper[4722]: E0309 15:10:00.174414 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0975b72-4ff8-4c2a-b11f-b3ade5a05e7c" containerName="oc" Mar 09 15:10:00 crc kubenswrapper[4722]: I0309 15:10:00.174434 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0975b72-4ff8-4c2a-b11f-b3ade5a05e7c" containerName="oc" Mar 09 15:10:00 crc kubenswrapper[4722]: I0309 15:10:00.174740 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0975b72-4ff8-4c2a-b11f-b3ade5a05e7c" containerName="oc" Mar 09 15:10:00 crc kubenswrapper[4722]: I0309 15:10:00.175737 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551150-4lzhw" Mar 09 15:10:00 crc kubenswrapper[4722]: I0309 15:10:00.176256 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551150-4lzhw"] Mar 09 15:10:00 crc kubenswrapper[4722]: I0309 15:10:00.178389 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-dr2x6" Mar 09 15:10:00 crc kubenswrapper[4722]: I0309 15:10:00.178675 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 15:10:00 crc kubenswrapper[4722]: I0309 15:10:00.178894 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 15:10:00 crc kubenswrapper[4722]: I0309 15:10:00.322675 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7d7zc\" (UniqueName: \"kubernetes.io/projected/e82423a0-60f3-4c62-82ba-2b34ae623247-kube-api-access-7d7zc\") pod \"auto-csr-approver-29551150-4lzhw\" (UID: \"e82423a0-60f3-4c62-82ba-2b34ae623247\") " pod="openshift-infra/auto-csr-approver-29551150-4lzhw" Mar 09 15:10:00 crc kubenswrapper[4722]: I0309 15:10:00.425110 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7d7zc\" (UniqueName: \"kubernetes.io/projected/e82423a0-60f3-4c62-82ba-2b34ae623247-kube-api-access-7d7zc\") pod \"auto-csr-approver-29551150-4lzhw\" (UID: \"e82423a0-60f3-4c62-82ba-2b34ae623247\") " pod="openshift-infra/auto-csr-approver-29551150-4lzhw" Mar 09 15:10:00 crc kubenswrapper[4722]: I0309 15:10:00.451313 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7d7zc\" (UniqueName: \"kubernetes.io/projected/e82423a0-60f3-4c62-82ba-2b34ae623247-kube-api-access-7d7zc\") pod \"auto-csr-approver-29551150-4lzhw\" (UID: \"e82423a0-60f3-4c62-82ba-2b34ae623247\") " pod="openshift-infra/auto-csr-approver-29551150-4lzhw" Mar 09 15:10:00 crc kubenswrapper[4722]: I0309 15:10:00.503660 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551150-4lzhw" Mar 09 15:10:01 crc kubenswrapper[4722]: I0309 15:10:01.033717 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551150-4lzhw"] Mar 09 15:10:01 crc kubenswrapper[4722]: I0309 15:10:01.047980 4722 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 15:10:01 crc kubenswrapper[4722]: I0309 15:10:01.333582 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551150-4lzhw" event={"ID":"e82423a0-60f3-4c62-82ba-2b34ae623247","Type":"ContainerStarted","Data":"64dd2e214303e6542d374602572a9633bc7dc8f0baa031f6203ce8364ff4712e"} Mar 09 15:10:04 crc kubenswrapper[4722]: I0309 15:10:04.370105 4722 generic.go:334] "Generic (PLEG): container finished" podID="e82423a0-60f3-4c62-82ba-2b34ae623247" containerID="7a6ee4b2904c6524c8599793fcfec2e6c3d24072549f2309fdda697531abfc6d" exitCode=0 Mar 09 15:10:04 crc kubenswrapper[4722]: I0309 15:10:04.370146 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551150-4lzhw" event={"ID":"e82423a0-60f3-4c62-82ba-2b34ae623247","Type":"ContainerDied","Data":"7a6ee4b2904c6524c8599793fcfec2e6c3d24072549f2309fdda697531abfc6d"} Mar 09 15:10:05 crc kubenswrapper[4722]: I0309 15:10:05.872029 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551150-4lzhw" Mar 09 15:10:05 crc kubenswrapper[4722]: I0309 15:10:05.957834 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7d7zc\" (UniqueName: \"kubernetes.io/projected/e82423a0-60f3-4c62-82ba-2b34ae623247-kube-api-access-7d7zc\") pod \"e82423a0-60f3-4c62-82ba-2b34ae623247\" (UID: \"e82423a0-60f3-4c62-82ba-2b34ae623247\") " Mar 09 15:10:05 crc kubenswrapper[4722]: I0309 15:10:05.968426 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e82423a0-60f3-4c62-82ba-2b34ae623247-kube-api-access-7d7zc" (OuterVolumeSpecName: "kube-api-access-7d7zc") pod "e82423a0-60f3-4c62-82ba-2b34ae623247" (UID: "e82423a0-60f3-4c62-82ba-2b34ae623247"). InnerVolumeSpecName "kube-api-access-7d7zc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 15:10:06 crc kubenswrapper[4722]: I0309 15:10:06.061087 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7d7zc\" (UniqueName: \"kubernetes.io/projected/e82423a0-60f3-4c62-82ba-2b34ae623247-kube-api-access-7d7zc\") on node \"crc\" DevicePath \"\"" Mar 09 15:10:06 crc kubenswrapper[4722]: I0309 15:10:06.398676 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551150-4lzhw" event={"ID":"e82423a0-60f3-4c62-82ba-2b34ae623247","Type":"ContainerDied","Data":"64dd2e214303e6542d374602572a9633bc7dc8f0baa031f6203ce8364ff4712e"} Mar 09 15:10:06 crc kubenswrapper[4722]: I0309 15:10:06.398779 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="64dd2e214303e6542d374602572a9633bc7dc8f0baa031f6203ce8364ff4712e" Mar 09 15:10:06 crc kubenswrapper[4722]: I0309 15:10:06.398889 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551150-4lzhw" Mar 09 15:10:06 crc kubenswrapper[4722]: I0309 15:10:06.963958 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551144-jtsn5"] Mar 09 15:10:06 crc kubenswrapper[4722]: I0309 15:10:06.978243 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551144-jtsn5"] Mar 09 15:10:08 crc kubenswrapper[4722]: I0309 15:10:08.161161 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b51890c5-640c-41fc-9fa5-46b27e01a70f" path="/var/lib/kubelet/pods/b51890c5-640c-41fc-9fa5-46b27e01a70f/volumes" Mar 09 15:10:29 crc kubenswrapper[4722]: I0309 15:10:29.813335 4722 scope.go:117] "RemoveContainer" containerID="64c22c60d207466957d4cb3128b0cd7668f4bfbb95f46ac446ec5f41037ff3d7" Mar 09 15:10:42 crc kubenswrapper[4722]: I0309 15:10:42.343469 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-b7q8f"] Mar 09 15:10:42 crc kubenswrapper[4722]: E0309 15:10:42.344730 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e82423a0-60f3-4c62-82ba-2b34ae623247" containerName="oc" Mar 09 15:10:42 crc kubenswrapper[4722]: I0309 15:10:42.344747 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="e82423a0-60f3-4c62-82ba-2b34ae623247" containerName="oc" Mar 09 15:10:42 crc kubenswrapper[4722]: I0309 15:10:42.345009 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="e82423a0-60f3-4c62-82ba-2b34ae623247" containerName="oc" Mar 09 15:10:42 crc kubenswrapper[4722]: I0309 15:10:42.347101 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b7q8f" Mar 09 15:10:42 crc kubenswrapper[4722]: I0309 15:10:42.363867 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b7q8f"] Mar 09 15:10:42 crc kubenswrapper[4722]: I0309 15:10:42.491950 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9tmj\" (UniqueName: \"kubernetes.io/projected/737ce8ca-c7e9-4771-9ef3-7ffd04d03c72-kube-api-access-h9tmj\") pod \"certified-operators-b7q8f\" (UID: \"737ce8ca-c7e9-4771-9ef3-7ffd04d03c72\") " pod="openshift-marketplace/certified-operators-b7q8f" Mar 09 15:10:42 crc kubenswrapper[4722]: I0309 15:10:42.492488 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/737ce8ca-c7e9-4771-9ef3-7ffd04d03c72-utilities\") pod \"certified-operators-b7q8f\" (UID: \"737ce8ca-c7e9-4771-9ef3-7ffd04d03c72\") " pod="openshift-marketplace/certified-operators-b7q8f" Mar 09 15:10:42 crc kubenswrapper[4722]: I0309 15:10:42.492856 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/737ce8ca-c7e9-4771-9ef3-7ffd04d03c72-catalog-content\") pod \"certified-operators-b7q8f\" (UID: \"737ce8ca-c7e9-4771-9ef3-7ffd04d03c72\") " pod="openshift-marketplace/certified-operators-b7q8f" Mar 09 15:10:42 crc kubenswrapper[4722]: I0309 15:10:42.595434 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/737ce8ca-c7e9-4771-9ef3-7ffd04d03c72-catalog-content\") pod \"certified-operators-b7q8f\" (UID: \"737ce8ca-c7e9-4771-9ef3-7ffd04d03c72\") " 
pod="openshift-marketplace/certified-operators-b7q8f" Mar 09 15:10:42 crc kubenswrapper[4722]: I0309 15:10:42.595597 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9tmj\" (UniqueName: \"kubernetes.io/projected/737ce8ca-c7e9-4771-9ef3-7ffd04d03c72-kube-api-access-h9tmj\") pod \"certified-operators-b7q8f\" (UID: \"737ce8ca-c7e9-4771-9ef3-7ffd04d03c72\") " pod="openshift-marketplace/certified-operators-b7q8f" Mar 09 15:10:42 crc kubenswrapper[4722]: I0309 15:10:42.595682 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/737ce8ca-c7e9-4771-9ef3-7ffd04d03c72-utilities\") pod \"certified-operators-b7q8f\" (UID: \"737ce8ca-c7e9-4771-9ef3-7ffd04d03c72\") " pod="openshift-marketplace/certified-operators-b7q8f" Mar 09 15:10:42 crc kubenswrapper[4722]: I0309 15:10:42.596441 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/737ce8ca-c7e9-4771-9ef3-7ffd04d03c72-utilities\") pod \"certified-operators-b7q8f\" (UID: \"737ce8ca-c7e9-4771-9ef3-7ffd04d03c72\") " pod="openshift-marketplace/certified-operators-b7q8f" Mar 09 15:10:42 crc kubenswrapper[4722]: I0309 15:10:42.596725 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/737ce8ca-c7e9-4771-9ef3-7ffd04d03c72-catalog-content\") pod \"certified-operators-b7q8f\" (UID: \"737ce8ca-c7e9-4771-9ef3-7ffd04d03c72\") " pod="openshift-marketplace/certified-operators-b7q8f" Mar 09 15:10:42 crc kubenswrapper[4722]: I0309 15:10:42.617168 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9tmj\" (UniqueName: \"kubernetes.io/projected/737ce8ca-c7e9-4771-9ef3-7ffd04d03c72-kube-api-access-h9tmj\") pod \"certified-operators-b7q8f\" (UID: \"737ce8ca-c7e9-4771-9ef3-7ffd04d03c72\") " pod="openshift-marketplace/certified-operators-b7q8f" Mar 09 15:10:42 crc kubenswrapper[4722]: I0309 15:10:42.677616 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-b7q8f" Mar 09 15:10:43 crc kubenswrapper[4722]: I0309 15:10:43.329121 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-b7q8f"] Mar 09 15:10:43 crc kubenswrapper[4722]: I0309 15:10:43.913875 4722 generic.go:334] "Generic (PLEG): container finished" podID="737ce8ca-c7e9-4771-9ef3-7ffd04d03c72" containerID="c88a832b4b7b26b5d0456e0af2acfda428546dd85380c78204bde78f86efbba0" exitCode=0 Mar 09 15:10:43 crc kubenswrapper[4722]: I0309 15:10:43.913987 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b7q8f" event={"ID":"737ce8ca-c7e9-4771-9ef3-7ffd04d03c72","Type":"ContainerDied","Data":"c88a832b4b7b26b5d0456e0af2acfda428546dd85380c78204bde78f86efbba0"} Mar 09 15:10:43 crc kubenswrapper[4722]: I0309 15:10:43.914237 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b7q8f" event={"ID":"737ce8ca-c7e9-4771-9ef3-7ffd04d03c72","Type":"ContainerStarted","Data":"62ca276939cfddd96b4a24fff460b81fdec4107209d6dcdbbcddfde3455316b5"} Mar 09 15:10:45 crc kubenswrapper[4722]: I0309 15:10:45.940737 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b7q8f" event={"ID":"737ce8ca-c7e9-4771-9ef3-7ffd04d03c72","Type":"ContainerStarted","Data":"5a1ecbcc87df18129c4d9e83ae8ed9a259145b1d9f31226ae3f117ea5ee256fb"} Mar 09 15:10:47 crc kubenswrapper[4722]: I0309 15:10:47.967615 4722 generic.go:334] "Generic (PLEG): container finished" podID="737ce8ca-c7e9-4771-9ef3-7ffd04d03c72" containerID="5a1ecbcc87df18129c4d9e83ae8ed9a259145b1d9f31226ae3f117ea5ee256fb" exitCode=0 Mar 09 15:10:47 crc kubenswrapper[4722]: I0309 15:10:47.967682 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b7q8f" event={"ID":"737ce8ca-c7e9-4771-9ef3-7ffd04d03c72","Type":"ContainerDied","Data":"5a1ecbcc87df18129c4d9e83ae8ed9a259145b1d9f31226ae3f117ea5ee256fb"} Mar 09 15:10:48 crc kubenswrapper[4722]: I0309 15:10:48.979023 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b7q8f" event={"ID":"737ce8ca-c7e9-4771-9ef3-7ffd04d03c72","Type":"ContainerStarted","Data":"3b30128c3c8c55b5908303e34f665fe1e706be598f9d79400546a173aafd201e"} Mar 09 15:10:49 crc kubenswrapper[4722]: I0309 15:10:49.026521 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-b7q8f" podStartSLOduration=2.551480876 podStartE2EDuration="7.026503323s" podCreationTimestamp="2026-03-09 15:10:42 +0000 UTC" firstStartedPulling="2026-03-09 15:10:43.916969115 +0000 UTC m=+4084.472537701" lastFinishedPulling="2026-03-09 15:10:48.391991572 +0000 UTC m=+4088.947560148" observedRunningTime="2026-03-09 15:10:49.018708513 +0000 UTC m=+4089.574277089" watchObservedRunningTime="2026-03-09 15:10:49.026503323 +0000 UTC m=+4089.582071889" Mar 09 15:10:52 crc kubenswrapper[4722]: I0309 15:10:52.678316 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-b7q8f" Mar 09 15:10:52 crc kubenswrapper[4722]: I0309 15:10:52.680356 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-b7q8f" Mar 09 15:10:52 crc kubenswrapper[4722]: I0309 15:10:52.763321 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-b7q8f" Mar 09 15:10:53 crc kubenswrapper[4722]: I0309 15:10:53.073492 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-b7q8f" Mar 09 15:10:53 crc kubenswrapper[4722]: I0309 15:10:53.123367 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-b7q8f"] Mar 09 15:10:55 crc kubenswrapper[4722]: I0309 15:10:55.046541 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-b7q8f" podUID="737ce8ca-c7e9-4771-9ef3-7ffd04d03c72" containerName="registry-server" containerID="cri-o://3b30128c3c8c55b5908303e34f665fe1e706be598f9d79400546a173aafd201e" gracePeriod=2 Mar 09 15:10:55 crc kubenswrapper[4722]: I0309 15:10:55.728482 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-b7q8f" Mar 09 15:10:55 crc kubenswrapper[4722]: I0309 15:10:55.873388 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/737ce8ca-c7e9-4771-9ef3-7ffd04d03c72-utilities\") pod \"737ce8ca-c7e9-4771-9ef3-7ffd04d03c72\" (UID: \"737ce8ca-c7e9-4771-9ef3-7ffd04d03c72\") " Mar 09 15:10:55 crc kubenswrapper[4722]: I0309 15:10:55.873479 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9tmj\" (UniqueName: \"kubernetes.io/projected/737ce8ca-c7e9-4771-9ef3-7ffd04d03c72-kube-api-access-h9tmj\") pod \"737ce8ca-c7e9-4771-9ef3-7ffd04d03c72\" (UID: \"737ce8ca-c7e9-4771-9ef3-7ffd04d03c72\") " Mar 09 15:10:55 crc kubenswrapper[4722]: I0309 15:10:55.873659 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/737ce8ca-c7e9-4771-9ef3-7ffd04d03c72-catalog-content\") pod \"737ce8ca-c7e9-4771-9ef3-7ffd04d03c72\" (UID: \"737ce8ca-c7e9-4771-9ef3-7ffd04d03c72\") " Mar 09 15:10:55 crc kubenswrapper[4722]: I0309 15:10:55.875501 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/737ce8ca-c7e9-4771-9ef3-7ffd04d03c72-utilities" (OuterVolumeSpecName: "utilities") pod "737ce8ca-c7e9-4771-9ef3-7ffd04d03c72" (UID: "737ce8ca-c7e9-4771-9ef3-7ffd04d03c72"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 15:10:55 crc kubenswrapper[4722]: I0309 15:10:55.892387 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/737ce8ca-c7e9-4771-9ef3-7ffd04d03c72-kube-api-access-h9tmj" (OuterVolumeSpecName: "kube-api-access-h9tmj") pod "737ce8ca-c7e9-4771-9ef3-7ffd04d03c72" (UID: "737ce8ca-c7e9-4771-9ef3-7ffd04d03c72"). InnerVolumeSpecName "kube-api-access-h9tmj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 15:10:55 crc kubenswrapper[4722]: I0309 15:10:55.948065 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/737ce8ca-c7e9-4771-9ef3-7ffd04d03c72-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "737ce8ca-c7e9-4771-9ef3-7ffd04d03c72" (UID: "737ce8ca-c7e9-4771-9ef3-7ffd04d03c72"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 15:10:55 crc kubenswrapper[4722]: I0309 15:10:55.976886 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/737ce8ca-c7e9-4771-9ef3-7ffd04d03c72-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 15:10:55 crc kubenswrapper[4722]: I0309 15:10:55.976942 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9tmj\" (UniqueName: \"kubernetes.io/projected/737ce8ca-c7e9-4771-9ef3-7ffd04d03c72-kube-api-access-h9tmj\") on node \"crc\" DevicePath \"\"" Mar 09 15:10:55 crc kubenswrapper[4722]: I0309 15:10:55.976963 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/737ce8ca-c7e9-4771-9ef3-7ffd04d03c72-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 15:10:56 crc kubenswrapper[4722]: I0309 15:10:56.062438 4722 generic.go:334] "Generic (PLEG): container finished" podID="737ce8ca-c7e9-4771-9ef3-7ffd04d03c72" containerID="3b30128c3c8c55b5908303e34f665fe1e706be598f9d79400546a173aafd201e" exitCode=0 Mar 09 15:10:56 crc kubenswrapper[4722]: I0309 15:10:56.062498 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b7q8f" event={"ID":"737ce8ca-c7e9-4771-9ef3-7ffd04d03c72","Type":"ContainerDied","Data":"3b30128c3c8c55b5908303e34f665fe1e706be598f9d79400546a173aafd201e"} Mar 09 15:10:56 crc kubenswrapper[4722]: I0309 15:10:56.062545 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-b7q8f" event={"ID":"737ce8ca-c7e9-4771-9ef3-7ffd04d03c72","Type":"ContainerDied","Data":"62ca276939cfddd96b4a24fff460b81fdec4107209d6dcdbbcddfde3455316b5"} Mar 09 15:10:56 crc kubenswrapper[4722]: I0309 15:10:56.062555 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-b7q8f" Mar 09 15:10:56 crc kubenswrapper[4722]: I0309 15:10:56.062575 4722 scope.go:117] "RemoveContainer" containerID="3b30128c3c8c55b5908303e34f665fe1e706be598f9d79400546a173aafd201e" Mar 09 15:10:56 crc kubenswrapper[4722]: I0309 15:10:56.091812 4722 scope.go:117] "RemoveContainer" containerID="5a1ecbcc87df18129c4d9e83ae8ed9a259145b1d9f31226ae3f117ea5ee256fb" Mar 09 15:10:56 crc kubenswrapper[4722]: I0309 15:10:56.104354 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-b7q8f"] Mar 09 15:10:56 crc kubenswrapper[4722]: I0309 15:10:56.115700 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-b7q8f"] Mar 09 15:10:56 crc kubenswrapper[4722]: I0309 15:10:56.162840 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="737ce8ca-c7e9-4771-9ef3-7ffd04d03c72" path="/var/lib/kubelet/pods/737ce8ca-c7e9-4771-9ef3-7ffd04d03c72/volumes" Mar 09 15:10:56 crc kubenswrapper[4722]: I0309 15:10:56.898592 4722 scope.go:117] "RemoveContainer" containerID="c88a832b4b7b26b5d0456e0af2acfda428546dd85380c78204bde78f86efbba0" Mar 09 15:10:56 crc kubenswrapper[4722]: I0309 15:10:56.964297 4722 scope.go:117] "RemoveContainer" containerID="3b30128c3c8c55b5908303e34f665fe1e706be598f9d79400546a173aafd201e" Mar 09 15:10:56 crc kubenswrapper[4722]: E0309 15:10:56.964906 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b30128c3c8c55b5908303e34f665fe1e706be598f9d79400546a173aafd201e\": container with ID starting with 3b30128c3c8c55b5908303e34f665fe1e706be598f9d79400546a173aafd201e not found: ID does not exist" containerID="3b30128c3c8c55b5908303e34f665fe1e706be598f9d79400546a173aafd201e" Mar 09 15:10:56 crc kubenswrapper[4722]: I0309 15:10:56.964939 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b30128c3c8c55b5908303e34f665fe1e706be598f9d79400546a173aafd201e"} err="failed to get container status \"3b30128c3c8c55b5908303e34f665fe1e706be598f9d79400546a173aafd201e\": rpc error: code = NotFound desc = could not find container \"3b30128c3c8c55b5908303e34f665fe1e706be598f9d79400546a173aafd201e\": container with ID starting with 3b30128c3c8c55b5908303e34f665fe1e706be598f9d79400546a173aafd201e not found: ID does not exist" Mar 09 15:10:56 crc kubenswrapper[4722]: I0309 15:10:56.964961 4722 scope.go:117] "RemoveContainer" containerID="5a1ecbcc87df18129c4d9e83ae8ed9a259145b1d9f31226ae3f117ea5ee256fb" Mar 09 15:10:56 crc kubenswrapper[4722]: E0309 15:10:56.966253 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a1ecbcc87df18129c4d9e83ae8ed9a259145b1d9f31226ae3f117ea5ee256fb\": container with ID starting with 5a1ecbcc87df18129c4d9e83ae8ed9a259145b1d9f31226ae3f117ea5ee256fb not found: ID does not exist" containerID="5a1ecbcc87df18129c4d9e83ae8ed9a259145b1d9f31226ae3f117ea5ee256fb" Mar 09 15:10:56 crc kubenswrapper[4722]: I0309 15:10:56.966276 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a1ecbcc87df18129c4d9e83ae8ed9a259145b1d9f31226ae3f117ea5ee256fb"} err="failed to get container status \"5a1ecbcc87df18129c4d9e83ae8ed9a259145b1d9f31226ae3f117ea5ee256fb\": rpc error: code = NotFound desc = could not find container 
\"5a1ecbcc87df18129c4d9e83ae8ed9a259145b1d9f31226ae3f117ea5ee256fb\": container with ID starting with 5a1ecbcc87df18129c4d9e83ae8ed9a259145b1d9f31226ae3f117ea5ee256fb not found: ID does not exist" Mar 09 15:10:56 crc kubenswrapper[4722]: I0309 15:10:56.966291 4722 scope.go:117] "RemoveContainer" containerID="c88a832b4b7b26b5d0456e0af2acfda428546dd85380c78204bde78f86efbba0" Mar 09 15:10:56 crc kubenswrapper[4722]: E0309 15:10:56.966505 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c88a832b4b7b26b5d0456e0af2acfda428546dd85380c78204bde78f86efbba0\": container with ID starting with c88a832b4b7b26b5d0456e0af2acfda428546dd85380c78204bde78f86efbba0 not found: ID does not exist" containerID="c88a832b4b7b26b5d0456e0af2acfda428546dd85380c78204bde78f86efbba0" Mar 09 15:10:56 crc kubenswrapper[4722]: I0309 15:10:56.966527 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c88a832b4b7b26b5d0456e0af2acfda428546dd85380c78204bde78f86efbba0"} err="failed to get container status \"c88a832b4b7b26b5d0456e0af2acfda428546dd85380c78204bde78f86efbba0\": rpc error: code = NotFound desc = could not find container \"c88a832b4b7b26b5d0456e0af2acfda428546dd85380c78204bde78f86efbba0\": container with ID starting with c88a832b4b7b26b5d0456e0af2acfda428546dd85380c78204bde78f86efbba0 not found: ID does not exist" Mar 09 15:11:12 crc kubenswrapper[4722]: E0309 15:11:12.674409 4722 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.194:60296->38.102.83.194:41525: write tcp 38.102.83.194:60296->38.102.83.194:41525: write: broken pipe Mar 09 15:11:25 crc kubenswrapper[4722]: I0309 15:11:25.665342 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7jdf5"] Mar 09 15:11:25 crc kubenswrapper[4722]: E0309 15:11:25.667783 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="737ce8ca-c7e9-4771-9ef3-7ffd04d03c72" containerName="registry-server" Mar 09 15:11:25 crc kubenswrapper[4722]: I0309 15:11:25.667816 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="737ce8ca-c7e9-4771-9ef3-7ffd04d03c72" containerName="registry-server" Mar 09 15:11:25 crc kubenswrapper[4722]: E0309 15:11:25.667841 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="737ce8ca-c7e9-4771-9ef3-7ffd04d03c72" containerName="extract-utilities" Mar 09 15:11:25 crc kubenswrapper[4722]: I0309 15:11:25.667849 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="737ce8ca-c7e9-4771-9ef3-7ffd04d03c72" containerName="extract-utilities" Mar 09 15:11:25 crc kubenswrapper[4722]: E0309 15:11:25.667920 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="737ce8ca-c7e9-4771-9ef3-7ffd04d03c72" containerName="extract-content" Mar 09 15:11:25 crc kubenswrapper[4722]: I0309 15:11:25.667928 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="737ce8ca-c7e9-4771-9ef3-7ffd04d03c72" containerName="extract-content" Mar 09 15:11:25 crc kubenswrapper[4722]: I0309 15:11:25.668286 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="737ce8ca-c7e9-4771-9ef3-7ffd04d03c72" containerName="registry-server" Mar 09 15:11:25 crc kubenswrapper[4722]: I0309 15:11:25.670224 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7jdf5" Mar 09 15:11:25 crc kubenswrapper[4722]: I0309 15:11:25.704063 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7jdf5"] Mar 09 15:11:25 crc kubenswrapper[4722]: I0309 15:11:25.790330 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2aec8c4-7f12-4b84-b06d-1f9bc4ab8424-utilities\") pod \"redhat-operators-7jdf5\" (UID: \"b2aec8c4-7f12-4b84-b06d-1f9bc4ab8424\") " pod="openshift-marketplace/redhat-operators-7jdf5" Mar 09 15:11:25 crc kubenswrapper[4722]: I0309 15:11:25.790835 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2aec8c4-7f12-4b84-b06d-1f9bc4ab8424-catalog-content\") pod \"redhat-operators-7jdf5\" (UID: \"b2aec8c4-7f12-4b84-b06d-1f9bc4ab8424\") " pod="openshift-marketplace/redhat-operators-7jdf5" Mar 09 15:11:25 crc kubenswrapper[4722]: I0309 15:11:25.790962 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7krp\" (UniqueName: \"kubernetes.io/projected/b2aec8c4-7f12-4b84-b06d-1f9bc4ab8424-kube-api-access-z7krp\") pod \"redhat-operators-7jdf5\" (UID: \"b2aec8c4-7f12-4b84-b06d-1f9bc4ab8424\") " pod="openshift-marketplace/redhat-operators-7jdf5" Mar 09 15:11:25 crc kubenswrapper[4722]: I0309 15:11:25.893099 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2aec8c4-7f12-4b84-b06d-1f9bc4ab8424-catalog-content\") pod \"redhat-operators-7jdf5\" (UID: \"b2aec8c4-7f12-4b84-b06d-1f9bc4ab8424\") " pod="openshift-marketplace/redhat-operators-7jdf5" Mar 09 15:11:25 crc kubenswrapper[4722]: I0309 15:11:25.893147 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7krp\" (UniqueName: \"kubernetes.io/projected/b2aec8c4-7f12-4b84-b06d-1f9bc4ab8424-kube-api-access-z7krp\") pod \"redhat-operators-7jdf5\" (UID: \"b2aec8c4-7f12-4b84-b06d-1f9bc4ab8424\") " pod="openshift-marketplace/redhat-operators-7jdf5" Mar 09 15:11:25 crc kubenswrapper[4722]: I0309 15:11:25.893305 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2aec8c4-7f12-4b84-b06d-1f9bc4ab8424-utilities\") pod \"redhat-operators-7jdf5\" (UID: \"b2aec8c4-7f12-4b84-b06d-1f9bc4ab8424\") " pod="openshift-marketplace/redhat-operators-7jdf5" Mar 09 15:11:25 crc kubenswrapper[4722]: I0309 15:11:25.894055 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2aec8c4-7f12-4b84-b06d-1f9bc4ab8424-catalog-content\") pod \"redhat-operators-7jdf5\" (UID: \"b2aec8c4-7f12-4b84-b06d-1f9bc4ab8424\") " pod="openshift-marketplace/redhat-operators-7jdf5" Mar 09 15:11:25 crc kubenswrapper[4722]: I0309 15:11:25.894079 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2aec8c4-7f12-4b84-b06d-1f9bc4ab8424-utilities\") pod \"redhat-operators-7jdf5\" (UID: \"b2aec8c4-7f12-4b84-b06d-1f9bc4ab8424\") " pod="openshift-marketplace/redhat-operators-7jdf5" Mar 09 15:11:25 crc kubenswrapper[4722]: I0309 15:11:25.914660 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-z7krp\" (UniqueName: \"kubernetes.io/projected/b2aec8c4-7f12-4b84-b06d-1f9bc4ab8424-kube-api-access-z7krp\") pod \"redhat-operators-7jdf5\" (UID: \"b2aec8c4-7f12-4b84-b06d-1f9bc4ab8424\") " pod="openshift-marketplace/redhat-operators-7jdf5" Mar 09 15:11:26 crc kubenswrapper[4722]: I0309 15:11:26.001263 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7jdf5" Mar 09 15:11:26 crc kubenswrapper[4722]: I0309 15:11:26.510156 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7jdf5"] Mar 09 15:11:27 crc kubenswrapper[4722]: I0309 15:11:27.418057 4722 generic.go:334] "Generic (PLEG): container finished" podID="b2aec8c4-7f12-4b84-b06d-1f9bc4ab8424" containerID="d8c81676585a8d512628ebff036fa8d52dfec8f6d4632791c408e03dd222c312" exitCode=0 Mar 09 15:11:27 crc kubenswrapper[4722]: I0309 15:11:27.418216 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7jdf5" event={"ID":"b2aec8c4-7f12-4b84-b06d-1f9bc4ab8424","Type":"ContainerDied","Data":"d8c81676585a8d512628ebff036fa8d52dfec8f6d4632791c408e03dd222c312"} Mar 09 15:11:27 crc kubenswrapper[4722]: I0309 15:11:27.418576 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7jdf5" event={"ID":"b2aec8c4-7f12-4b84-b06d-1f9bc4ab8424","Type":"ContainerStarted","Data":"1b7842f0d33e242b48b38731f9de69d5bac2b94362c6c9fa0cddd65c1659d1c1"} Mar 09 15:11:29 crc kubenswrapper[4722]: I0309 15:11:29.467946 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7jdf5" event={"ID":"b2aec8c4-7f12-4b84-b06d-1f9bc4ab8424","Type":"ContainerStarted","Data":"4a29f9088b742c314210a04d2d87720a3f3f0cfd4f772f90e37d9dffac0541e9"} Mar 09 15:11:36 crc kubenswrapper[4722]: I0309 15:11:36.557286 4722 generic.go:334] "Generic (PLEG): container finished" podID="b2aec8c4-7f12-4b84-b06d-1f9bc4ab8424" containerID="4a29f9088b742c314210a04d2d87720a3f3f0cfd4f772f90e37d9dffac0541e9" exitCode=0 Mar 09 15:11:36 crc kubenswrapper[4722]: I0309 15:11:36.557326 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7jdf5" event={"ID":"b2aec8c4-7f12-4b84-b06d-1f9bc4ab8424","Type":"ContainerDied","Data":"4a29f9088b742c314210a04d2d87720a3f3f0cfd4f772f90e37d9dffac0541e9"} Mar 09 15:11:37 crc kubenswrapper[4722]: I0309 15:11:37.571019 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7jdf5" event={"ID":"b2aec8c4-7f12-4b84-b06d-1f9bc4ab8424","Type":"ContainerStarted","Data":"4e5f7537dc57e231d5952660795f4fc34fb18f240c9e733f8a6926a975566247"} Mar 09 15:11:37 crc kubenswrapper[4722]: I0309 15:11:37.605057 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7jdf5" podStartSLOduration=2.991160958 podStartE2EDuration="12.605038686s" podCreationTimestamp="2026-03-09 15:11:25 +0000 UTC" firstStartedPulling="2026-03-09 15:11:27.421003626 +0000 UTC m=+4127.976572202" lastFinishedPulling="2026-03-09 15:11:37.034881354 +0000 UTC m=+4137.590449930" observedRunningTime="2026-03-09 15:11:37.59256354 +0000 UTC m=+4138.148132116" watchObservedRunningTime="2026-03-09 15:11:37.605038686 +0000 UTC m=+4138.160607262" Mar 09 15:11:46 crc kubenswrapper[4722]: I0309 15:11:46.002434 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7jdf5" 
Mar 09 15:11:46 crc kubenswrapper[4722]: I0309 15:11:46.003029 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7jdf5"
Mar 09 15:11:47 crc kubenswrapper[4722]: I0309 15:11:47.083481 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7jdf5" podUID="b2aec8c4-7f12-4b84-b06d-1f9bc4ab8424" containerName="registry-server" probeResult="failure" output=<
Mar 09 15:11:47 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s
Mar 09 15:11:47 crc kubenswrapper[4722]: >
Mar 09 15:11:51 crc kubenswrapper[4722]: I0309 15:11:51.528129 4722 patch_prober.go:28] interesting pod/machine-config-daemon-hjrrb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 15:11:51 crc kubenswrapper[4722]: I0309 15:11:51.528784 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 15:11:57 crc kubenswrapper[4722]: I0309 15:11:57.056099 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7jdf5" podUID="b2aec8c4-7f12-4b84-b06d-1f9bc4ab8424" containerName="registry-server" probeResult="failure" output=<
Mar 09 15:11:57 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s
Mar 09 15:11:57 crc kubenswrapper[4722]: >
Mar 09 15:12:00 crc kubenswrapper[4722]: I0309 15:12:00.176904 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551152-95ng8"]
Mar 09 15:12:00 crc kubenswrapper[4722]: I0309 15:12:00.180013 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551152-95ng8"]
Mar 09 15:12:00 crc kubenswrapper[4722]: I0309 15:12:00.180112 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551152-95ng8"
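The startup-probe output above, `timeout: failed to connect service ":50051" within 1s`, means the registry container's gRPC endpoint on port 50051 was not yet accepting connections when the probe ran. At the transport level the check reduces to a dial with a one-second deadline; a rough Go equivalent (the real check for these catalog pods runs as an exec probe inside the container, so this is only an approximation):

```go
package main

import (
	"fmt"
	"net"
	"time"
)

func main() {
	// Mirror the probe's observable behavior: try to reach the
	// registry's gRPC port, give up after one second.
	conn, err := net.DialTimeout("tcp", "localhost:50051", time.Second)
	if err != nil {
		fmt.Printf("timeout: failed to connect service %q within 1s\n", ":50051")
		return
	}
	conn.Close()
	fmt.Println("registry-server is accepting connections")
}
```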
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551152-95ng8" Mar 09 15:12:00 crc kubenswrapper[4722]: I0309 15:12:00.183677 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 15:12:00 crc kubenswrapper[4722]: I0309 15:12:00.183757 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-dr2x6" Mar 09 15:12:00 crc kubenswrapper[4722]: I0309 15:12:00.185633 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 15:12:00 crc kubenswrapper[4722]: I0309 15:12:00.316646 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bs4f\" (UniqueName: \"kubernetes.io/projected/9c64bc37-546f-48b6-9fae-e1c085dc63a4-kube-api-access-9bs4f\") pod \"auto-csr-approver-29551152-95ng8\" (UID: \"9c64bc37-546f-48b6-9fae-e1c085dc63a4\") " pod="openshift-infra/auto-csr-approver-29551152-95ng8" Mar 09 15:12:00 crc kubenswrapper[4722]: I0309 15:12:00.418858 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bs4f\" (UniqueName: \"kubernetes.io/projected/9c64bc37-546f-48b6-9fae-e1c085dc63a4-kube-api-access-9bs4f\") pod \"auto-csr-approver-29551152-95ng8\" (UID: \"9c64bc37-546f-48b6-9fae-e1c085dc63a4\") " pod="openshift-infra/auto-csr-approver-29551152-95ng8" Mar 09 15:12:00 crc kubenswrapper[4722]: I0309 15:12:00.437682 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bs4f\" (UniqueName: \"kubernetes.io/projected/9c64bc37-546f-48b6-9fae-e1c085dc63a4-kube-api-access-9bs4f\") pod \"auto-csr-approver-29551152-95ng8\" (UID: \"9c64bc37-546f-48b6-9fae-e1c085dc63a4\") " pod="openshift-infra/auto-csr-approver-29551152-95ng8" Mar 09 15:12:00 crc kubenswrapper[4722]: I0309 15:12:00.528597 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551152-95ng8" Mar 09 15:12:01 crc kubenswrapper[4722]: I0309 15:12:01.070360 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551152-95ng8"] Mar 09 15:12:01 crc kubenswrapper[4722]: I0309 15:12:01.884968 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551152-95ng8" event={"ID":"9c64bc37-546f-48b6-9fae-e1c085dc63a4","Type":"ContainerStarted","Data":"f55c6b770dfa08b8b57ed6183abd3ce5c9053f6a10f4dbd0c09a8c083f7eb50b"} Mar 09 15:12:02 crc kubenswrapper[4722]: I0309 15:12:02.898560 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551152-95ng8" event={"ID":"9c64bc37-546f-48b6-9fae-e1c085dc63a4","Type":"ContainerStarted","Data":"57f703087ce04c4eff5680bef89eac7a19d5670e5094d244222882ab07740ecc"} Mar 09 15:12:02 crc kubenswrapper[4722]: I0309 15:12:02.913099 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551152-95ng8" podStartSLOduration=1.694803732 podStartE2EDuration="2.913077151s" podCreationTimestamp="2026-03-09 15:12:00 +0000 UTC" firstStartedPulling="2026-03-09 15:12:01.071981553 +0000 UTC m=+4161.627550129" lastFinishedPulling="2026-03-09 15:12:02.290254972 +0000 UTC m=+4162.845823548" observedRunningTime="2026-03-09 15:12:02.912545186 +0000 UTC m=+4163.468113802" watchObservedRunningTime="2026-03-09 15:12:02.913077151 +0000 UTC m=+4163.468645767" Mar 09 15:12:03 crc kubenswrapper[4722]: I0309 15:12:03.913512 4722 generic.go:334] "Generic (PLEG): container finished" podID="9c64bc37-546f-48b6-9fae-e1c085dc63a4" containerID="57f703087ce04c4eff5680bef89eac7a19d5670e5094d244222882ab07740ecc" exitCode=0 Mar 09 15:12:03 crc kubenswrapper[4722]: I0309 15:12:03.913566 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551152-95ng8" event={"ID":"9c64bc37-546f-48b6-9fae-e1c085dc63a4","Type":"ContainerDied","Data":"57f703087ce04c4eff5680bef89eac7a19d5670e5094d244222882ab07740ecc"} Mar 09 15:12:05 crc kubenswrapper[4722]: I0309 15:12:05.938920 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551152-95ng8" event={"ID":"9c64bc37-546f-48b6-9fae-e1c085dc63a4","Type":"ContainerDied","Data":"f55c6b770dfa08b8b57ed6183abd3ce5c9053f6a10f4dbd0c09a8c083f7eb50b"} Mar 09 15:12:05 crc kubenswrapper[4722]: I0309 15:12:05.939413 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f55c6b770dfa08b8b57ed6183abd3ce5c9053f6a10f4dbd0c09a8c083f7eb50b" Mar 09 15:12:06 crc kubenswrapper[4722]: I0309 15:12:06.073928 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7jdf5" Mar 09 15:12:06 crc kubenswrapper[4722]: I0309 15:12:06.109447 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551152-95ng8" Mar 09 15:12:06 crc kubenswrapper[4722]: I0309 15:12:06.171006 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9bs4f\" (UniqueName: \"kubernetes.io/projected/9c64bc37-546f-48b6-9fae-e1c085dc63a4-kube-api-access-9bs4f\") pod \"9c64bc37-546f-48b6-9fae-e1c085dc63a4\" (UID: \"9c64bc37-546f-48b6-9fae-e1c085dc63a4\") " Mar 09 15:12:06 crc kubenswrapper[4722]: I0309 15:12:06.178362 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c64bc37-546f-48b6-9fae-e1c085dc63a4-kube-api-access-9bs4f" (OuterVolumeSpecName: "kube-api-access-9bs4f") pod "9c64bc37-546f-48b6-9fae-e1c085dc63a4" (UID: "9c64bc37-546f-48b6-9fae-e1c085dc63a4"). InnerVolumeSpecName "kube-api-access-9bs4f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 15:12:06 crc kubenswrapper[4722]: I0309 15:12:06.194879 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7jdf5" Mar 09 15:12:06 crc kubenswrapper[4722]: I0309 15:12:06.274646 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9bs4f\" (UniqueName: \"kubernetes.io/projected/9c64bc37-546f-48b6-9fae-e1c085dc63a4-kube-api-access-9bs4f\") on node \"crc\" DevicePath \"\"" Mar 09 15:12:06 crc kubenswrapper[4722]: I0309 15:12:06.326725 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7jdf5"] Mar 09 15:12:06 crc kubenswrapper[4722]: I0309 15:12:06.952511 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551152-95ng8" Mar 09 15:12:07 crc kubenswrapper[4722]: I0309 15:12:07.194806 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551146-tbrjq"] Mar 09 15:12:07 crc kubenswrapper[4722]: I0309 15:12:07.207982 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551146-tbrjq"] Mar 09 15:12:07 crc kubenswrapper[4722]: I0309 15:12:07.963151 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7jdf5" podUID="b2aec8c4-7f12-4b84-b06d-1f9bc4ab8424" containerName="registry-server" containerID="cri-o://4e5f7537dc57e231d5952660795f4fc34fb18f240c9e733f8a6926a975566247" gracePeriod=2 Mar 09 15:12:08 crc kubenswrapper[4722]: I0309 15:12:08.162282 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1279f77-ea98-417c-9845-cbbeec2536ac" path="/var/lib/kubelet/pods/d1279f77-ea98-417c-9845-cbbeec2536ac/volumes" Mar 09 15:12:08 crc kubenswrapper[4722]: I0309 15:12:08.548411 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7jdf5" Mar 09 15:12:08 crc kubenswrapper[4722]: I0309 15:12:08.631604 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2aec8c4-7f12-4b84-b06d-1f9bc4ab8424-catalog-content\") pod \"b2aec8c4-7f12-4b84-b06d-1f9bc4ab8424\" (UID: \"b2aec8c4-7f12-4b84-b06d-1f9bc4ab8424\") " Mar 09 15:12:08 crc kubenswrapper[4722]: I0309 15:12:08.631910 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7krp\" (UniqueName: \"kubernetes.io/projected/b2aec8c4-7f12-4b84-b06d-1f9bc4ab8424-kube-api-access-z7krp\") pod \"b2aec8c4-7f12-4b84-b06d-1f9bc4ab8424\" (UID: \"b2aec8c4-7f12-4b84-b06d-1f9bc4ab8424\") " Mar 09 15:12:08 crc kubenswrapper[4722]: I0309 15:12:08.632052 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2aec8c4-7f12-4b84-b06d-1f9bc4ab8424-utilities\") pod \"b2aec8c4-7f12-4b84-b06d-1f9bc4ab8424\" (UID: \"b2aec8c4-7f12-4b84-b06d-1f9bc4ab8424\") " Mar 09 15:12:08 crc kubenswrapper[4722]: I0309 15:12:08.633948 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2aec8c4-7f12-4b84-b06d-1f9bc4ab8424-utilities" (OuterVolumeSpecName: "utilities") pod "b2aec8c4-7f12-4b84-b06d-1f9bc4ab8424" (UID: "b2aec8c4-7f12-4b84-b06d-1f9bc4ab8424"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 15:12:08 crc kubenswrapper[4722]: I0309 15:12:08.634461 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2aec8c4-7f12-4b84-b06d-1f9bc4ab8424-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 15:12:08 crc kubenswrapper[4722]: I0309 15:12:08.637273 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2aec8c4-7f12-4b84-b06d-1f9bc4ab8424-kube-api-access-z7krp" (OuterVolumeSpecName: "kube-api-access-z7krp") pod "b2aec8c4-7f12-4b84-b06d-1f9bc4ab8424" (UID: "b2aec8c4-7f12-4b84-b06d-1f9bc4ab8424"). InnerVolumeSpecName "kube-api-access-z7krp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 15:12:08 crc kubenswrapper[4722]: I0309 15:12:08.736566 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7krp\" (UniqueName: \"kubernetes.io/projected/b2aec8c4-7f12-4b84-b06d-1f9bc4ab8424-kube-api-access-z7krp\") on node \"crc\" DevicePath \"\"" Mar 09 15:12:08 crc kubenswrapper[4722]: I0309 15:12:08.767797 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2aec8c4-7f12-4b84-b06d-1f9bc4ab8424-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b2aec8c4-7f12-4b84-b06d-1f9bc4ab8424" (UID: "b2aec8c4-7f12-4b84-b06d-1f9bc4ab8424"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 15:12:08 crc kubenswrapper[4722]: I0309 15:12:08.838974 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2aec8c4-7f12-4b84-b06d-1f9bc4ab8424-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 15:12:08 crc kubenswrapper[4722]: I0309 15:12:08.976723 4722 generic.go:334] "Generic (PLEG): container finished" podID="b2aec8c4-7f12-4b84-b06d-1f9bc4ab8424" containerID="4e5f7537dc57e231d5952660795f4fc34fb18f240c9e733f8a6926a975566247" exitCode=0 Mar 09 15:12:08 crc kubenswrapper[4722]: I0309 15:12:08.976780 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7jdf5" event={"ID":"b2aec8c4-7f12-4b84-b06d-1f9bc4ab8424","Type":"ContainerDied","Data":"4e5f7537dc57e231d5952660795f4fc34fb18f240c9e733f8a6926a975566247"} Mar 09 15:12:08 crc kubenswrapper[4722]: I0309 15:12:08.976812 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7jdf5" event={"ID":"b2aec8c4-7f12-4b84-b06d-1f9bc4ab8424","Type":"ContainerDied","Data":"1b7842f0d33e242b48b38731f9de69d5bac2b94362c6c9fa0cddd65c1659d1c1"} Mar 09 15:12:08 crc kubenswrapper[4722]: I0309 15:12:08.976833 4722 scope.go:117] "RemoveContainer" containerID="4e5f7537dc57e231d5952660795f4fc34fb18f240c9e733f8a6926a975566247" Mar 09 15:12:08 crc kubenswrapper[4722]: I0309 15:12:08.976863 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7jdf5" Mar 09 15:12:09 crc kubenswrapper[4722]: I0309 15:12:09.010291 4722 scope.go:117] "RemoveContainer" containerID="4a29f9088b742c314210a04d2d87720a3f3f0cfd4f772f90e37d9dffac0541e9" Mar 09 15:12:09 crc kubenswrapper[4722]: I0309 15:12:09.032877 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7jdf5"] Mar 09 15:12:09 crc kubenswrapper[4722]: I0309 15:12:09.045664 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7jdf5"] Mar 09 15:12:09 crc kubenswrapper[4722]: I0309 15:12:09.061687 4722 scope.go:117] "RemoveContainer" containerID="d8c81676585a8d512628ebff036fa8d52dfec8f6d4632791c408e03dd222c312" Mar 09 15:12:09 crc kubenswrapper[4722]: I0309 15:12:09.098569 4722 scope.go:117] "RemoveContainer" containerID="4e5f7537dc57e231d5952660795f4fc34fb18f240c9e733f8a6926a975566247" Mar 09 15:12:09 crc kubenswrapper[4722]: E0309 15:12:09.099019 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e5f7537dc57e231d5952660795f4fc34fb18f240c9e733f8a6926a975566247\": container with ID starting with 4e5f7537dc57e231d5952660795f4fc34fb18f240c9e733f8a6926a975566247 not found: ID does not exist" containerID="4e5f7537dc57e231d5952660795f4fc34fb18f240c9e733f8a6926a975566247" Mar 09 15:12:09 crc kubenswrapper[4722]: I0309 15:12:09.099065 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e5f7537dc57e231d5952660795f4fc34fb18f240c9e733f8a6926a975566247"} err="failed to get container status \"4e5f7537dc57e231d5952660795f4fc34fb18f240c9e733f8a6926a975566247\": rpc error: code = NotFound desc = could not find container \"4e5f7537dc57e231d5952660795f4fc34fb18f240c9e733f8a6926a975566247\": container with ID starting with 4e5f7537dc57e231d5952660795f4fc34fb18f240c9e733f8a6926a975566247 not found: ID does not exist" Mar 09 15:12:09 crc 
Mar 09 15:12:09 crc kubenswrapper[4722]: I0309 15:12:09.099092 4722 scope.go:117] "RemoveContainer" containerID="4a29f9088b742c314210a04d2d87720a3f3f0cfd4f772f90e37d9dffac0541e9"
Mar 09 15:12:09 crc kubenswrapper[4722]: E0309 15:12:09.099907 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a29f9088b742c314210a04d2d87720a3f3f0cfd4f772f90e37d9dffac0541e9\": container with ID starting with 4a29f9088b742c314210a04d2d87720a3f3f0cfd4f772f90e37d9dffac0541e9 not found: ID does not exist" containerID="4a29f9088b742c314210a04d2d87720a3f3f0cfd4f772f90e37d9dffac0541e9"
Mar 09 15:12:09 crc kubenswrapper[4722]: I0309 15:12:09.099935 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a29f9088b742c314210a04d2d87720a3f3f0cfd4f772f90e37d9dffac0541e9"} err="failed to get container status \"4a29f9088b742c314210a04d2d87720a3f3f0cfd4f772f90e37d9dffac0541e9\": rpc error: code = NotFound desc = could not find container \"4a29f9088b742c314210a04d2d87720a3f3f0cfd4f772f90e37d9dffac0541e9\": container with ID starting with 4a29f9088b742c314210a04d2d87720a3f3f0cfd4f772f90e37d9dffac0541e9 not found: ID does not exist"
Mar 09 15:12:09 crc kubenswrapper[4722]: I0309 15:12:09.099975 4722 scope.go:117] "RemoveContainer" containerID="d8c81676585a8d512628ebff036fa8d52dfec8f6d4632791c408e03dd222c312"
Mar 09 15:12:09 crc kubenswrapper[4722]: E0309 15:12:09.100173 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8c81676585a8d512628ebff036fa8d52dfec8f6d4632791c408e03dd222c312\": container with ID starting with d8c81676585a8d512628ebff036fa8d52dfec8f6d4632791c408e03dd222c312 not found: ID does not exist" containerID="d8c81676585a8d512628ebff036fa8d52dfec8f6d4632791c408e03dd222c312"
Mar 09 15:12:09 crc kubenswrapper[4722]: I0309 15:12:09.100192 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8c81676585a8d512628ebff036fa8d52dfec8f6d4632791c408e03dd222c312"} err="failed to get container status \"d8c81676585a8d512628ebff036fa8d52dfec8f6d4632791c408e03dd222c312\": rpc error: code = NotFound desc = could not find container \"d8c81676585a8d512628ebff036fa8d52dfec8f6d4632791c408e03dd222c312\": container with ID starting with d8c81676585a8d512628ebff036fa8d52dfec8f6d4632791c408e03dd222c312 not found: ID does not exist"
Mar 09 15:12:10 crc kubenswrapper[4722]: I0309 15:12:10.173066 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2aec8c4-7f12-4b84-b06d-1f9bc4ab8424" path="/var/lib/kubelet/pods/b2aec8c4-7f12-4b84-b06d-1f9bc4ab8424/volumes"
Mar 09 15:12:21 crc kubenswrapper[4722]: I0309 15:12:21.527992 4722 patch_prober.go:28] interesting pod/machine-config-daemon-hjrrb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 15:12:21 crc kubenswrapper[4722]: I0309 15:12:21.528619 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 15:12:29 crc kubenswrapper[4722]: I0309 15:12:29.971198 4722 scope.go:117] "RemoveContainer" containerID="1d6155c10421ca5dc174dea3ca1101056b3b3d28a85cf7edab09591ca6af7ff8"
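The liveness probe for machine-config-daemon-hjrrb has now failed with `connect: connection refused` at 15:11:51 and 15:12:21 above; after one more failure at 15:12:51 below, the kubelet kills the container with its 600s grace period and restarts it. The kubelet's HTTP prober is essentially a GET with a timeout that treats a non-error response below 400 as healthy; a rough stand-in in Go (`probeHTTP` is a made-up name, and the real prober additionally sets probe headers and handles redirects specially):

```go
package main

import (
	"fmt"
	"net/http"
	"time"
)

// probeHTTP approximates an HTTP liveness check: GET with a timeout,
// any status code below 400 counts as healthy.
func probeHTTP(url string, timeout time.Duration) error {
	client := &http.Client{Timeout: timeout}
	resp, err := client.Get(url)
	if err != nil {
		return err // e.g. "connect: connection refused", as in the log
	}
	defer resp.Body.Close()
	if resp.StatusCode >= 400 {
		return fmt.Errorf("unhealthy status %d", resp.StatusCode)
	}
	return nil
}

func main() {
	if err := probeHTTP("http://127.0.0.1:8798/health", time.Second); err != nil {
		fmt.Println("Probe failed:", err)
	}
}
```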
containerID="1d6155c10421ca5dc174dea3ca1101056b3b3d28a85cf7edab09591ca6af7ff8" Mar 09 15:12:42 crc kubenswrapper[4722]: I0309 15:12:42.789745 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wqq49"] Mar 09 15:12:42 crc kubenswrapper[4722]: E0309 15:12:42.791012 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2aec8c4-7f12-4b84-b06d-1f9bc4ab8424" containerName="registry-server" Mar 09 15:12:42 crc kubenswrapper[4722]: I0309 15:12:42.791034 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2aec8c4-7f12-4b84-b06d-1f9bc4ab8424" containerName="registry-server" Mar 09 15:12:42 crc kubenswrapper[4722]: E0309 15:12:42.791047 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2aec8c4-7f12-4b84-b06d-1f9bc4ab8424" containerName="extract-utilities" Mar 09 15:12:42 crc kubenswrapper[4722]: I0309 15:12:42.791055 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2aec8c4-7f12-4b84-b06d-1f9bc4ab8424" containerName="extract-utilities" Mar 09 15:12:42 crc kubenswrapper[4722]: E0309 15:12:42.791084 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c64bc37-546f-48b6-9fae-e1c085dc63a4" containerName="oc" Mar 09 15:12:42 crc kubenswrapper[4722]: I0309 15:12:42.791093 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c64bc37-546f-48b6-9fae-e1c085dc63a4" containerName="oc" Mar 09 15:12:42 crc kubenswrapper[4722]: E0309 15:12:42.791104 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2aec8c4-7f12-4b84-b06d-1f9bc4ab8424" containerName="extract-content" Mar 09 15:12:42 crc kubenswrapper[4722]: I0309 15:12:42.791112 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2aec8c4-7f12-4b84-b06d-1f9bc4ab8424" containerName="extract-content" Mar 09 15:12:42 crc kubenswrapper[4722]: I0309 15:12:42.791399 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2aec8c4-7f12-4b84-b06d-1f9bc4ab8424" containerName="registry-server" Mar 09 15:12:42 crc kubenswrapper[4722]: I0309 15:12:42.791433 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c64bc37-546f-48b6-9fae-e1c085dc63a4" containerName="oc" Mar 09 15:12:42 crc kubenswrapper[4722]: I0309 15:12:42.793224 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wqq49" Mar 09 15:12:42 crc kubenswrapper[4722]: I0309 15:12:42.826701 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wqq49"] Mar 09 15:12:42 crc kubenswrapper[4722]: I0309 15:12:42.962300 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3403af33-70e2-4010-9951-5d91f55b7c05-utilities\") pod \"community-operators-wqq49\" (UID: \"3403af33-70e2-4010-9951-5d91f55b7c05\") " pod="openshift-marketplace/community-operators-wqq49" Mar 09 15:12:42 crc kubenswrapper[4722]: I0309 15:12:42.962558 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3403af33-70e2-4010-9951-5d91f55b7c05-catalog-content\") pod \"community-operators-wqq49\" (UID: \"3403af33-70e2-4010-9951-5d91f55b7c05\") " pod="openshift-marketplace/community-operators-wqq49" Mar 09 15:12:42 crc kubenswrapper[4722]: I0309 15:12:42.963061 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrnvr\" (UniqueName: \"kubernetes.io/projected/3403af33-70e2-4010-9951-5d91f55b7c05-kube-api-access-lrnvr\") pod \"community-operators-wqq49\" (UID: \"3403af33-70e2-4010-9951-5d91f55b7c05\") " pod="openshift-marketplace/community-operators-wqq49" Mar 09 15:12:43 crc kubenswrapper[4722]: I0309 15:12:43.065116 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrnvr\" (UniqueName: \"kubernetes.io/projected/3403af33-70e2-4010-9951-5d91f55b7c05-kube-api-access-lrnvr\") pod \"community-operators-wqq49\" (UID: \"3403af33-70e2-4010-9951-5d91f55b7c05\") " pod="openshift-marketplace/community-operators-wqq49" Mar 09 15:12:43 crc kubenswrapper[4722]: I0309 15:12:43.065274 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3403af33-70e2-4010-9951-5d91f55b7c05-utilities\") pod \"community-operators-wqq49\" (UID: \"3403af33-70e2-4010-9951-5d91f55b7c05\") " pod="openshift-marketplace/community-operators-wqq49" Mar 09 15:12:43 crc kubenswrapper[4722]: I0309 15:12:43.065329 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3403af33-70e2-4010-9951-5d91f55b7c05-catalog-content\") pod \"community-operators-wqq49\" (UID: \"3403af33-70e2-4010-9951-5d91f55b7c05\") " pod="openshift-marketplace/community-operators-wqq49" Mar 09 15:12:43 crc kubenswrapper[4722]: I0309 15:12:43.065767 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3403af33-70e2-4010-9951-5d91f55b7c05-catalog-content\") pod \"community-operators-wqq49\" (UID: \"3403af33-70e2-4010-9951-5d91f55b7c05\") " pod="openshift-marketplace/community-operators-wqq49" Mar 09 15:12:43 crc kubenswrapper[4722]: I0309 15:12:43.065897 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3403af33-70e2-4010-9951-5d91f55b7c05-utilities\") pod \"community-operators-wqq49\" (UID: \"3403af33-70e2-4010-9951-5d91f55b7c05\") " pod="openshift-marketplace/community-operators-wqq49" Mar 09 15:12:43 crc kubenswrapper[4722]: I0309 15:12:43.083992 4722 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-lrnvr\" (UniqueName: \"kubernetes.io/projected/3403af33-70e2-4010-9951-5d91f55b7c05-kube-api-access-lrnvr\") pod \"community-operators-wqq49\" (UID: \"3403af33-70e2-4010-9951-5d91f55b7c05\") " pod="openshift-marketplace/community-operators-wqq49" Mar 09 15:12:43 crc kubenswrapper[4722]: I0309 15:12:43.141669 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wqq49" Mar 09 15:12:43 crc kubenswrapper[4722]: I0309 15:12:43.729130 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wqq49"] Mar 09 15:12:44 crc kubenswrapper[4722]: I0309 15:12:44.414067 4722 generic.go:334] "Generic (PLEG): container finished" podID="3403af33-70e2-4010-9951-5d91f55b7c05" containerID="005131e8bc90666fe251bde9faff5d4f20d14372de7c593085ce1ac5df2eeb80" exitCode=0 Mar 09 15:12:44 crc kubenswrapper[4722]: I0309 15:12:44.414129 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wqq49" event={"ID":"3403af33-70e2-4010-9951-5d91f55b7c05","Type":"ContainerDied","Data":"005131e8bc90666fe251bde9faff5d4f20d14372de7c593085ce1ac5df2eeb80"} Mar 09 15:12:44 crc kubenswrapper[4722]: I0309 15:12:44.414391 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wqq49" event={"ID":"3403af33-70e2-4010-9951-5d91f55b7c05","Type":"ContainerStarted","Data":"3fe12c719350c68a08f75e2b1c9ed06d285eb9707453830a9245b1f29f9564ea"} Mar 09 15:12:45 crc kubenswrapper[4722]: I0309 15:12:45.431607 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wqq49" event={"ID":"3403af33-70e2-4010-9951-5d91f55b7c05","Type":"ContainerStarted","Data":"131d65615f258e44277dd4df7af778f29b93ddc6e374ec388cba1b8347c67c0f"} Mar 09 15:12:47 crc kubenswrapper[4722]: I0309 15:12:47.456725 4722 generic.go:334] "Generic (PLEG): container finished" podID="3403af33-70e2-4010-9951-5d91f55b7c05" containerID="131d65615f258e44277dd4df7af778f29b93ddc6e374ec388cba1b8347c67c0f" exitCode=0 Mar 09 15:12:47 crc kubenswrapper[4722]: I0309 15:12:47.456836 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wqq49" event={"ID":"3403af33-70e2-4010-9951-5d91f55b7c05","Type":"ContainerDied","Data":"131d65615f258e44277dd4df7af778f29b93ddc6e374ec388cba1b8347c67c0f"} Mar 09 15:12:48 crc kubenswrapper[4722]: I0309 15:12:48.470400 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wqq49" event={"ID":"3403af33-70e2-4010-9951-5d91f55b7c05","Type":"ContainerStarted","Data":"682ab71742783d5e2969b60fbbaf52fadce7a2d3f707294eace89b47a8bf9b27"} Mar 09 15:12:48 crc kubenswrapper[4722]: I0309 15:12:48.496077 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wqq49" podStartSLOduration=2.816396217 podStartE2EDuration="6.496052692s" podCreationTimestamp="2026-03-09 15:12:42 +0000 UTC" firstStartedPulling="2026-03-09 15:12:44.445926179 +0000 UTC m=+4205.001494775" lastFinishedPulling="2026-03-09 15:12:48.125582674 +0000 UTC m=+4208.681151250" observedRunningTime="2026-03-09 15:12:48.493376738 +0000 UTC m=+4209.048945334" watchObservedRunningTime="2026-03-09 15:12:48.496052692 +0000 UTC m=+4209.051621288" Mar 09 15:12:51 crc kubenswrapper[4722]: I0309 15:12:51.528503 4722 patch_prober.go:28] interesting 
pod/machine-config-daemon-hjrrb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 15:12:51 crc kubenswrapper[4722]: I0309 15:12:51.529080 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 15:12:51 crc kubenswrapper[4722]: I0309 15:12:51.529132 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" Mar 09 15:12:51 crc kubenswrapper[4722]: I0309 15:12:51.530240 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"605e7cfbce2d49113bae14138b0367a77ac331e6056b4329529ab18e27f72b42"} pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 15:12:51 crc kubenswrapper[4722]: I0309 15:12:51.530349 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerName="machine-config-daemon" containerID="cri-o://605e7cfbce2d49113bae14138b0367a77ac331e6056b4329529ab18e27f72b42" gracePeriod=600 Mar 09 15:12:52 crc kubenswrapper[4722]: I0309 15:12:52.510365 4722 generic.go:334] "Generic (PLEG): container finished" podID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerID="605e7cfbce2d49113bae14138b0367a77ac331e6056b4329529ab18e27f72b42" exitCode=0 Mar 09 15:12:52 crc kubenswrapper[4722]: I0309 15:12:52.510442 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" event={"ID":"dac2aaf5-653b-4b2a-8efe-ed26bac8d648","Type":"ContainerDied","Data":"605e7cfbce2d49113bae14138b0367a77ac331e6056b4329529ab18e27f72b42"} Mar 09 15:12:52 crc kubenswrapper[4722]: I0309 15:12:52.511249 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" event={"ID":"dac2aaf5-653b-4b2a-8efe-ed26bac8d648","Type":"ContainerStarted","Data":"ed831924138fb872c73acc14a5ac7c6ed36b421017c8fe60dbbaede3d8f11789"} Mar 09 15:12:52 crc kubenswrapper[4722]: I0309 15:12:52.511286 4722 scope.go:117] "RemoveContainer" containerID="cd181ecc4152a977f7835e27eed2a00ac92eaff69a01c8a79c0452e664e36b0f" Mar 09 15:12:53 crc kubenswrapper[4722]: I0309 15:12:53.142334 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wqq49" Mar 09 15:12:53 crc kubenswrapper[4722]: I0309 15:12:53.142661 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wqq49" Mar 09 15:12:53 crc kubenswrapper[4722]: I0309 15:12:53.199179 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wqq49" Mar 09 15:12:53 crc kubenswrapper[4722]: I0309 15:12:53.596787 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wqq49" Mar 
09 15:12:53 crc kubenswrapper[4722]: I0309 15:12:53.649763 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wqq49"] Mar 09 15:12:55 crc kubenswrapper[4722]: I0309 15:12:55.542129 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wqq49" podUID="3403af33-70e2-4010-9951-5d91f55b7c05" containerName="registry-server" containerID="cri-o://682ab71742783d5e2969b60fbbaf52fadce7a2d3f707294eace89b47a8bf9b27" gracePeriod=2 Mar 09 15:12:56 crc kubenswrapper[4722]: I0309 15:12:56.057818 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wqq49" Mar 09 15:12:56 crc kubenswrapper[4722]: I0309 15:12:56.214195 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrnvr\" (UniqueName: \"kubernetes.io/projected/3403af33-70e2-4010-9951-5d91f55b7c05-kube-api-access-lrnvr\") pod \"3403af33-70e2-4010-9951-5d91f55b7c05\" (UID: \"3403af33-70e2-4010-9951-5d91f55b7c05\") " Mar 09 15:12:56 crc kubenswrapper[4722]: I0309 15:12:56.214340 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3403af33-70e2-4010-9951-5d91f55b7c05-utilities\") pod \"3403af33-70e2-4010-9951-5d91f55b7c05\" (UID: \"3403af33-70e2-4010-9951-5d91f55b7c05\") " Mar 09 15:12:56 crc kubenswrapper[4722]: I0309 15:12:56.214522 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3403af33-70e2-4010-9951-5d91f55b7c05-catalog-content\") pod \"3403af33-70e2-4010-9951-5d91f55b7c05\" (UID: \"3403af33-70e2-4010-9951-5d91f55b7c05\") " Mar 09 15:12:56 crc kubenswrapper[4722]: I0309 15:12:56.215412 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3403af33-70e2-4010-9951-5d91f55b7c05-utilities" (OuterVolumeSpecName: "utilities") pod "3403af33-70e2-4010-9951-5d91f55b7c05" (UID: "3403af33-70e2-4010-9951-5d91f55b7c05"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 15:12:56 crc kubenswrapper[4722]: I0309 15:12:56.215916 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3403af33-70e2-4010-9951-5d91f55b7c05-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 15:12:56 crc kubenswrapper[4722]: I0309 15:12:56.222501 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3403af33-70e2-4010-9951-5d91f55b7c05-kube-api-access-lrnvr" (OuterVolumeSpecName: "kube-api-access-lrnvr") pod "3403af33-70e2-4010-9951-5d91f55b7c05" (UID: "3403af33-70e2-4010-9951-5d91f55b7c05"). InnerVolumeSpecName "kube-api-access-lrnvr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 15:12:56 crc kubenswrapper[4722]: I0309 15:12:56.320467 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrnvr\" (UniqueName: \"kubernetes.io/projected/3403af33-70e2-4010-9951-5d91f55b7c05-kube-api-access-lrnvr\") on node \"crc\" DevicePath \"\"" Mar 09 15:12:56 crc kubenswrapper[4722]: I0309 15:12:56.449709 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3403af33-70e2-4010-9951-5d91f55b7c05-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3403af33-70e2-4010-9951-5d91f55b7c05" (UID: "3403af33-70e2-4010-9951-5d91f55b7c05"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 15:12:56 crc kubenswrapper[4722]: I0309 15:12:56.525445 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3403af33-70e2-4010-9951-5d91f55b7c05-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 15:12:56 crc kubenswrapper[4722]: I0309 15:12:56.557122 4722 generic.go:334] "Generic (PLEG): container finished" podID="3403af33-70e2-4010-9951-5d91f55b7c05" containerID="682ab71742783d5e2969b60fbbaf52fadce7a2d3f707294eace89b47a8bf9b27" exitCode=0 Mar 09 15:12:56 crc kubenswrapper[4722]: I0309 15:12:56.557169 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wqq49" event={"ID":"3403af33-70e2-4010-9951-5d91f55b7c05","Type":"ContainerDied","Data":"682ab71742783d5e2969b60fbbaf52fadce7a2d3f707294eace89b47a8bf9b27"} Mar 09 15:12:56 crc kubenswrapper[4722]: I0309 15:12:56.557222 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wqq49" event={"ID":"3403af33-70e2-4010-9951-5d91f55b7c05","Type":"ContainerDied","Data":"3fe12c719350c68a08f75e2b1c9ed06d285eb9707453830a9245b1f29f9564ea"} Mar 09 15:12:56 crc kubenswrapper[4722]: I0309 15:12:56.557235 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wqq49" Mar 09 15:12:56 crc kubenswrapper[4722]: I0309 15:12:56.557245 4722 scope.go:117] "RemoveContainer" containerID="682ab71742783d5e2969b60fbbaf52fadce7a2d3f707294eace89b47a8bf9b27" Mar 09 15:12:56 crc kubenswrapper[4722]: I0309 15:12:56.580766 4722 scope.go:117] "RemoveContainer" containerID="131d65615f258e44277dd4df7af778f29b93ddc6e374ec388cba1b8347c67c0f" Mar 09 15:12:56 crc kubenswrapper[4722]: I0309 15:12:56.608254 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wqq49"] Mar 09 15:12:56 crc kubenswrapper[4722]: I0309 15:12:56.622083 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wqq49"] Mar 09 15:12:56 crc kubenswrapper[4722]: I0309 15:12:56.624936 4722 scope.go:117] "RemoveContainer" containerID="005131e8bc90666fe251bde9faff5d4f20d14372de7c593085ce1ac5df2eeb80" Mar 09 15:12:56 crc kubenswrapper[4722]: I0309 15:12:56.667463 4722 scope.go:117] "RemoveContainer" containerID="682ab71742783d5e2969b60fbbaf52fadce7a2d3f707294eace89b47a8bf9b27" Mar 09 15:12:56 crc kubenswrapper[4722]: E0309 15:12:56.668012 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"682ab71742783d5e2969b60fbbaf52fadce7a2d3f707294eace89b47a8bf9b27\": container with ID starting with 682ab71742783d5e2969b60fbbaf52fadce7a2d3f707294eace89b47a8bf9b27 not found: ID does not exist" containerID="682ab71742783d5e2969b60fbbaf52fadce7a2d3f707294eace89b47a8bf9b27" Mar 09 15:12:56 crc kubenswrapper[4722]: I0309 15:12:56.668087 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"682ab71742783d5e2969b60fbbaf52fadce7a2d3f707294eace89b47a8bf9b27"} err="failed to get container status \"682ab71742783d5e2969b60fbbaf52fadce7a2d3f707294eace89b47a8bf9b27\": rpc error: code = NotFound desc = could not find container \"682ab71742783d5e2969b60fbbaf52fadce7a2d3f707294eace89b47a8bf9b27\": container with ID starting with 682ab71742783d5e2969b60fbbaf52fadce7a2d3f707294eace89b47a8bf9b27 not found: ID does not exist" Mar 09 15:12:56 crc kubenswrapper[4722]: I0309 15:12:56.668118 4722 scope.go:117] "RemoveContainer" containerID="131d65615f258e44277dd4df7af778f29b93ddc6e374ec388cba1b8347c67c0f" Mar 09 15:12:56 crc kubenswrapper[4722]: E0309 15:12:56.668709 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"131d65615f258e44277dd4df7af778f29b93ddc6e374ec388cba1b8347c67c0f\": container with ID starting with 131d65615f258e44277dd4df7af778f29b93ddc6e374ec388cba1b8347c67c0f not found: ID does not exist" containerID="131d65615f258e44277dd4df7af778f29b93ddc6e374ec388cba1b8347c67c0f" Mar 09 15:12:56 crc kubenswrapper[4722]: I0309 15:12:56.668747 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"131d65615f258e44277dd4df7af778f29b93ddc6e374ec388cba1b8347c67c0f"} err="failed to get container status \"131d65615f258e44277dd4df7af778f29b93ddc6e374ec388cba1b8347c67c0f\": rpc error: code = NotFound desc = could not find container \"131d65615f258e44277dd4df7af778f29b93ddc6e374ec388cba1b8347c67c0f\": container with ID starting with 131d65615f258e44277dd4df7af778f29b93ddc6e374ec388cba1b8347c67c0f not found: ID does not exist" Mar 09 15:12:56 crc kubenswrapper[4722]: I0309 15:12:56.668776 4722 scope.go:117] "RemoveContainer" 
containerID="005131e8bc90666fe251bde9faff5d4f20d14372de7c593085ce1ac5df2eeb80" Mar 09 15:12:56 crc kubenswrapper[4722]: E0309 15:12:56.669101 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"005131e8bc90666fe251bde9faff5d4f20d14372de7c593085ce1ac5df2eeb80\": container with ID starting with 005131e8bc90666fe251bde9faff5d4f20d14372de7c593085ce1ac5df2eeb80 not found: ID does not exist" containerID="005131e8bc90666fe251bde9faff5d4f20d14372de7c593085ce1ac5df2eeb80" Mar 09 15:12:56 crc kubenswrapper[4722]: I0309 15:12:56.669130 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"005131e8bc90666fe251bde9faff5d4f20d14372de7c593085ce1ac5df2eeb80"} err="failed to get container status \"005131e8bc90666fe251bde9faff5d4f20d14372de7c593085ce1ac5df2eeb80\": rpc error: code = NotFound desc = could not find container \"005131e8bc90666fe251bde9faff5d4f20d14372de7c593085ce1ac5df2eeb80\": container with ID starting with 005131e8bc90666fe251bde9faff5d4f20d14372de7c593085ce1ac5df2eeb80 not found: ID does not exist" Mar 09 15:12:58 crc kubenswrapper[4722]: I0309 15:12:58.161716 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3403af33-70e2-4010-9951-5d91f55b7c05" path="/var/lib/kubelet/pods/3403af33-70e2-4010-9951-5d91f55b7c05/volumes" Mar 09 15:14:00 crc kubenswrapper[4722]: I0309 15:14:00.197343 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551154-c478t"] Mar 09 15:14:00 crc kubenswrapper[4722]: E0309 15:14:00.198634 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3403af33-70e2-4010-9951-5d91f55b7c05" containerName="registry-server" Mar 09 15:14:00 crc kubenswrapper[4722]: I0309 15:14:00.198651 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="3403af33-70e2-4010-9951-5d91f55b7c05" containerName="registry-server" Mar 09 15:14:00 crc kubenswrapper[4722]: E0309 15:14:00.198711 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3403af33-70e2-4010-9951-5d91f55b7c05" containerName="extract-content" Mar 09 15:14:00 crc kubenswrapper[4722]: I0309 15:14:00.198721 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="3403af33-70e2-4010-9951-5d91f55b7c05" containerName="extract-content" Mar 09 15:14:00 crc kubenswrapper[4722]: E0309 15:14:00.198748 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3403af33-70e2-4010-9951-5d91f55b7c05" containerName="extract-utilities" Mar 09 15:14:00 crc kubenswrapper[4722]: I0309 15:14:00.198757 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="3403af33-70e2-4010-9951-5d91f55b7c05" containerName="extract-utilities" Mar 09 15:14:00 crc kubenswrapper[4722]: I0309 15:14:00.199032 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="3403af33-70e2-4010-9951-5d91f55b7c05" containerName="registry-server" Mar 09 15:14:00 crc kubenswrapper[4722]: I0309 15:14:00.200121 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551154-c478t"] Mar 09 15:14:00 crc kubenswrapper[4722]: I0309 15:14:00.200292 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551154-c478t" Mar 09 15:14:00 crc kubenswrapper[4722]: I0309 15:14:00.203853 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 15:14:00 crc kubenswrapper[4722]: I0309 15:14:00.204432 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 15:14:00 crc kubenswrapper[4722]: I0309 15:14:00.204923 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-dr2x6" Mar 09 15:14:00 crc kubenswrapper[4722]: I0309 15:14:00.317794 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnnvn\" (UniqueName: \"kubernetes.io/projected/a717f324-e899-4ed6-bdb8-23fac7294215-kube-api-access-lnnvn\") pod \"auto-csr-approver-29551154-c478t\" (UID: \"a717f324-e899-4ed6-bdb8-23fac7294215\") " pod="openshift-infra/auto-csr-approver-29551154-c478t" Mar 09 15:14:00 crc kubenswrapper[4722]: I0309 15:14:00.423187 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnnvn\" (UniqueName: \"kubernetes.io/projected/a717f324-e899-4ed6-bdb8-23fac7294215-kube-api-access-lnnvn\") pod \"auto-csr-approver-29551154-c478t\" (UID: \"a717f324-e899-4ed6-bdb8-23fac7294215\") " pod="openshift-infra/auto-csr-approver-29551154-c478t" Mar 09 15:14:00 crc kubenswrapper[4722]: I0309 15:14:00.451181 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnnvn\" (UniqueName: \"kubernetes.io/projected/a717f324-e899-4ed6-bdb8-23fac7294215-kube-api-access-lnnvn\") pod \"auto-csr-approver-29551154-c478t\" (UID: \"a717f324-e899-4ed6-bdb8-23fac7294215\") " pod="openshift-infra/auto-csr-approver-29551154-c478t" Mar 09 15:14:00 crc kubenswrapper[4722]: I0309 15:14:00.529153 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551154-c478t" Mar 09 15:14:01 crc kubenswrapper[4722]: I0309 15:14:01.013994 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551154-c478t"] Mar 09 15:14:01 crc kubenswrapper[4722]: W0309 15:14:01.020304 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda717f324_e899_4ed6_bdb8_23fac7294215.slice/crio-a4a82d447dc10545e32d00ac0b923b1030a62fb7dcc1324123d2843d16e2e315 WatchSource:0}: Error finding container a4a82d447dc10545e32d00ac0b923b1030a62fb7dcc1324123d2843d16e2e315: Status 404 returned error can't find the container with id a4a82d447dc10545e32d00ac0b923b1030a62fb7dcc1324123d2843d16e2e315 Mar 09 15:14:01 crc kubenswrapper[4722]: I0309 15:14:01.443962 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551154-c478t" event={"ID":"a717f324-e899-4ed6-bdb8-23fac7294215","Type":"ContainerStarted","Data":"a4a82d447dc10545e32d00ac0b923b1030a62fb7dcc1324123d2843d16e2e315"} Mar 09 15:14:03 crc kubenswrapper[4722]: I0309 15:14:03.478911 4722 generic.go:334] "Generic (PLEG): container finished" podID="a717f324-e899-4ed6-bdb8-23fac7294215" containerID="94a3861f8aa8959c666e8f15f78770bb8a16221e54f63dffdc7d9a2a9d2bfbd4" exitCode=0 Mar 09 15:14:03 crc kubenswrapper[4722]: I0309 15:14:03.479026 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551154-c478t" event={"ID":"a717f324-e899-4ed6-bdb8-23fac7294215","Type":"ContainerDied","Data":"94a3861f8aa8959c666e8f15f78770bb8a16221e54f63dffdc7d9a2a9d2bfbd4"} Mar 09 15:14:05 crc kubenswrapper[4722]: I0309 15:14:05.504408 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551154-c478t" event={"ID":"a717f324-e899-4ed6-bdb8-23fac7294215","Type":"ContainerDied","Data":"a4a82d447dc10545e32d00ac0b923b1030a62fb7dcc1324123d2843d16e2e315"} Mar 09 15:14:05 crc kubenswrapper[4722]: I0309 15:14:05.504931 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a4a82d447dc10545e32d00ac0b923b1030a62fb7dcc1324123d2843d16e2e315" Mar 09 15:14:05 crc kubenswrapper[4722]: I0309 15:14:05.514593 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551154-c478t" Mar 09 15:14:05 crc kubenswrapper[4722]: I0309 15:14:05.556709 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lnnvn\" (UniqueName: \"kubernetes.io/projected/a717f324-e899-4ed6-bdb8-23fac7294215-kube-api-access-lnnvn\") pod \"a717f324-e899-4ed6-bdb8-23fac7294215\" (UID: \"a717f324-e899-4ed6-bdb8-23fac7294215\") " Mar 09 15:14:05 crc kubenswrapper[4722]: I0309 15:14:05.576552 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a717f324-e899-4ed6-bdb8-23fac7294215-kube-api-access-lnnvn" (OuterVolumeSpecName: "kube-api-access-lnnvn") pod "a717f324-e899-4ed6-bdb8-23fac7294215" (UID: "a717f324-e899-4ed6-bdb8-23fac7294215"). InnerVolumeSpecName "kube-api-access-lnnvn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 15:14:05 crc kubenswrapper[4722]: I0309 15:14:05.661550 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lnnvn\" (UniqueName: \"kubernetes.io/projected/a717f324-e899-4ed6-bdb8-23fac7294215-kube-api-access-lnnvn\") on node \"crc\" DevicePath \"\"" Mar 09 15:14:06 crc kubenswrapper[4722]: I0309 15:14:06.516922 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551154-c478t" Mar 09 15:14:06 crc kubenswrapper[4722]: I0309 15:14:06.597593 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551148-j78st"] Mar 09 15:14:06 crc kubenswrapper[4722]: I0309 15:14:06.612158 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551148-j78st"] Mar 09 15:14:08 crc kubenswrapper[4722]: I0309 15:14:08.166105 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0975b72-4ff8-4c2a-b11f-b3ade5a05e7c" path="/var/lib/kubelet/pods/f0975b72-4ff8-4c2a-b11f-b3ade5a05e7c/volumes" Mar 09 15:14:19 crc kubenswrapper[4722]: I0309 15:14:19.095309 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bddwl"] Mar 09 15:14:19 crc kubenswrapper[4722]: E0309 15:14:19.097482 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a717f324-e899-4ed6-bdb8-23fac7294215" containerName="oc" Mar 09 15:14:19 crc kubenswrapper[4722]: I0309 15:14:19.097567 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a717f324-e899-4ed6-bdb8-23fac7294215" containerName="oc" Mar 09 15:14:19 crc kubenswrapper[4722]: I0309 15:14:19.097872 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="a717f324-e899-4ed6-bdb8-23fac7294215" containerName="oc" Mar 09 15:14:19 crc kubenswrapper[4722]: I0309 15:14:19.099788 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bddwl" Mar 09 15:14:19 crc kubenswrapper[4722]: I0309 15:14:19.124438 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bddwl"] Mar 09 15:14:19 crc kubenswrapper[4722]: I0309 15:14:19.184876 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3a04c11-128b-4891-8f8c-bbee1e1dd21e-catalog-content\") pod \"redhat-marketplace-bddwl\" (UID: \"e3a04c11-128b-4891-8f8c-bbee1e1dd21e\") " pod="openshift-marketplace/redhat-marketplace-bddwl" Mar 09 15:14:19 crc kubenswrapper[4722]: I0309 15:14:19.185583 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5g7xx\" (UniqueName: \"kubernetes.io/projected/e3a04c11-128b-4891-8f8c-bbee1e1dd21e-kube-api-access-5g7xx\") pod \"redhat-marketplace-bddwl\" (UID: \"e3a04c11-128b-4891-8f8c-bbee1e1dd21e\") " pod="openshift-marketplace/redhat-marketplace-bddwl" Mar 09 15:14:19 crc kubenswrapper[4722]: I0309 15:14:19.185812 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3a04c11-128b-4891-8f8c-bbee1e1dd21e-utilities\") pod \"redhat-marketplace-bddwl\" (UID: \"e3a04c11-128b-4891-8f8c-bbee1e1dd21e\") " pod="openshift-marketplace/redhat-marketplace-bddwl" Mar 09 15:14:19 crc kubenswrapper[4722]: I0309 15:14:19.288661 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3a04c11-128b-4891-8f8c-bbee1e1dd21e-catalog-content\") pod \"redhat-marketplace-bddwl\" (UID: \"e3a04c11-128b-4891-8f8c-bbee1e1dd21e\") " pod="openshift-marketplace/redhat-marketplace-bddwl" Mar 09 15:14:19 crc kubenswrapper[4722]: I0309 15:14:19.289126 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5g7xx\" (UniqueName: \"kubernetes.io/projected/e3a04c11-128b-4891-8f8c-bbee1e1dd21e-kube-api-access-5g7xx\") pod \"redhat-marketplace-bddwl\" (UID: \"e3a04c11-128b-4891-8f8c-bbee1e1dd21e\") " pod="openshift-marketplace/redhat-marketplace-bddwl" Mar 09 15:14:19 crc kubenswrapper[4722]: I0309 15:14:19.289215 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3a04c11-128b-4891-8f8c-bbee1e1dd21e-utilities\") pod \"redhat-marketplace-bddwl\" (UID: \"e3a04c11-128b-4891-8f8c-bbee1e1dd21e\") " pod="openshift-marketplace/redhat-marketplace-bddwl" Mar 09 15:14:19 crc kubenswrapper[4722]: I0309 15:14:19.289225 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3a04c11-128b-4891-8f8c-bbee1e1dd21e-catalog-content\") pod \"redhat-marketplace-bddwl\" (UID: \"e3a04c11-128b-4891-8f8c-bbee1e1dd21e\") " pod="openshift-marketplace/redhat-marketplace-bddwl" Mar 09 15:14:19 crc kubenswrapper[4722]: I0309 15:14:19.290126 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3a04c11-128b-4891-8f8c-bbee1e1dd21e-utilities\") pod \"redhat-marketplace-bddwl\" (UID: \"e3a04c11-128b-4891-8f8c-bbee1e1dd21e\") " pod="openshift-marketplace/redhat-marketplace-bddwl" Mar 09 15:14:19 crc kubenswrapper[4722]: I0309 15:14:19.315114 4722 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-5g7xx\" (UniqueName: \"kubernetes.io/projected/e3a04c11-128b-4891-8f8c-bbee1e1dd21e-kube-api-access-5g7xx\") pod \"redhat-marketplace-bddwl\" (UID: \"e3a04c11-128b-4891-8f8c-bbee1e1dd21e\") " pod="openshift-marketplace/redhat-marketplace-bddwl" Mar 09 15:14:19 crc kubenswrapper[4722]: I0309 15:14:19.438222 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bddwl" Mar 09 15:14:19 crc kubenswrapper[4722]: I0309 15:14:19.987975 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bddwl"] Mar 09 15:14:20 crc kubenswrapper[4722]: W0309 15:14:20.000005 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3a04c11_128b_4891_8f8c_bbee1e1dd21e.slice/crio-76cfe7d0614bafdfc70e6fd3caa74e9e49dfe96acf391e4151f2dc4d0e674a90 WatchSource:0}: Error finding container 76cfe7d0614bafdfc70e6fd3caa74e9e49dfe96acf391e4151f2dc4d0e674a90: Status 404 returned error can't find the container with id 76cfe7d0614bafdfc70e6fd3caa74e9e49dfe96acf391e4151f2dc4d0e674a90 Mar 09 15:14:20 crc kubenswrapper[4722]: I0309 15:14:20.713038 4722 generic.go:334] "Generic (PLEG): container finished" podID="e3a04c11-128b-4891-8f8c-bbee1e1dd21e" containerID="65dfa6129c4d5616d27b36f972c7cda566cbf810250a571da80eeb718cb6154b" exitCode=0 Mar 09 15:14:20 crc kubenswrapper[4722]: I0309 15:14:20.713354 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bddwl" event={"ID":"e3a04c11-128b-4891-8f8c-bbee1e1dd21e","Type":"ContainerDied","Data":"65dfa6129c4d5616d27b36f972c7cda566cbf810250a571da80eeb718cb6154b"} Mar 09 15:14:20 crc kubenswrapper[4722]: I0309 15:14:20.713384 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bddwl" event={"ID":"e3a04c11-128b-4891-8f8c-bbee1e1dd21e","Type":"ContainerStarted","Data":"76cfe7d0614bafdfc70e6fd3caa74e9e49dfe96acf391e4151f2dc4d0e674a90"} Mar 09 15:14:21 crc kubenswrapper[4722]: I0309 15:14:21.726642 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bddwl" event={"ID":"e3a04c11-128b-4891-8f8c-bbee1e1dd21e","Type":"ContainerStarted","Data":"475686c383ecab30ce7933d122d931a898319703a70187d40cf4b2fc78304324"} Mar 09 15:14:22 crc kubenswrapper[4722]: I0309 15:14:22.740344 4722 generic.go:334] "Generic (PLEG): container finished" podID="e3a04c11-128b-4891-8f8c-bbee1e1dd21e" containerID="475686c383ecab30ce7933d122d931a898319703a70187d40cf4b2fc78304324" exitCode=0 Mar 09 15:14:22 crc kubenswrapper[4722]: I0309 15:14:22.740446 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bddwl" event={"ID":"e3a04c11-128b-4891-8f8c-bbee1e1dd21e","Type":"ContainerDied","Data":"475686c383ecab30ce7933d122d931a898319703a70187d40cf4b2fc78304324"} Mar 09 15:14:23 crc kubenswrapper[4722]: I0309 15:14:23.755176 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bddwl" event={"ID":"e3a04c11-128b-4891-8f8c-bbee1e1dd21e","Type":"ContainerStarted","Data":"503cbeb05a72924176802c504e497cf4af41a34edf11edc8a0499ca03b5142b5"} Mar 09 15:14:23 crc kubenswrapper[4722]: I0309 15:14:23.792213 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bddwl" podStartSLOduration=2.238924993 
podStartE2EDuration="4.792179087s" podCreationTimestamp="2026-03-09 15:14:19 +0000 UTC" firstStartedPulling="2026-03-09 15:14:20.716187956 +0000 UTC m=+4301.271756532" lastFinishedPulling="2026-03-09 15:14:23.26944206 +0000 UTC m=+4303.825010626" observedRunningTime="2026-03-09 15:14:23.779857059 +0000 UTC m=+4304.335425645" watchObservedRunningTime="2026-03-09 15:14:23.792179087 +0000 UTC m=+4304.347747673" Mar 09 15:14:29 crc kubenswrapper[4722]: I0309 15:14:29.438615 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bddwl" Mar 09 15:14:29 crc kubenswrapper[4722]: I0309 15:14:29.439330 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bddwl" Mar 09 15:14:29 crc kubenswrapper[4722]: I0309 15:14:29.487721 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bddwl" Mar 09 15:14:29 crc kubenswrapper[4722]: I0309 15:14:29.881439 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bddwl" Mar 09 15:14:29 crc kubenswrapper[4722]: I0309 15:14:29.934438 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bddwl"] Mar 09 15:14:30 crc kubenswrapper[4722]: I0309 15:14:30.197159 4722 scope.go:117] "RemoveContainer" containerID="1d5c5c98c936ab05ca6826755ddb291638f469ab090a1d7528d6744a2c5a77c3" Mar 09 15:14:31 crc kubenswrapper[4722]: I0309 15:14:31.848804 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bddwl" podUID="e3a04c11-128b-4891-8f8c-bbee1e1dd21e" containerName="registry-server" containerID="cri-o://503cbeb05a72924176802c504e497cf4af41a34edf11edc8a0499ca03b5142b5" gracePeriod=2 Mar 09 15:14:32 crc kubenswrapper[4722]: I0309 15:14:32.331749 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bddwl" Mar 09 15:14:32 crc kubenswrapper[4722]: I0309 15:14:32.428366 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3a04c11-128b-4891-8f8c-bbee1e1dd21e-catalog-content\") pod \"e3a04c11-128b-4891-8f8c-bbee1e1dd21e\" (UID: \"e3a04c11-128b-4891-8f8c-bbee1e1dd21e\") " Mar 09 15:14:32 crc kubenswrapper[4722]: I0309 15:14:32.428640 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5g7xx\" (UniqueName: \"kubernetes.io/projected/e3a04c11-128b-4891-8f8c-bbee1e1dd21e-kube-api-access-5g7xx\") pod \"e3a04c11-128b-4891-8f8c-bbee1e1dd21e\" (UID: \"e3a04c11-128b-4891-8f8c-bbee1e1dd21e\") " Mar 09 15:14:32 crc kubenswrapper[4722]: I0309 15:14:32.428701 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3a04c11-128b-4891-8f8c-bbee1e1dd21e-utilities\") pod \"e3a04c11-128b-4891-8f8c-bbee1e1dd21e\" (UID: \"e3a04c11-128b-4891-8f8c-bbee1e1dd21e\") " Mar 09 15:14:32 crc kubenswrapper[4722]: I0309 15:14:32.429625 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3a04c11-128b-4891-8f8c-bbee1e1dd21e-utilities" (OuterVolumeSpecName: "utilities") pod "e3a04c11-128b-4891-8f8c-bbee1e1dd21e" (UID: "e3a04c11-128b-4891-8f8c-bbee1e1dd21e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 15:14:32 crc kubenswrapper[4722]: I0309 15:14:32.438382 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3a04c11-128b-4891-8f8c-bbee1e1dd21e-kube-api-access-5g7xx" (OuterVolumeSpecName: "kube-api-access-5g7xx") pod "e3a04c11-128b-4891-8f8c-bbee1e1dd21e" (UID: "e3a04c11-128b-4891-8f8c-bbee1e1dd21e"). InnerVolumeSpecName "kube-api-access-5g7xx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 15:14:32 crc kubenswrapper[4722]: I0309 15:14:32.454535 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3a04c11-128b-4891-8f8c-bbee1e1dd21e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e3a04c11-128b-4891-8f8c-bbee1e1dd21e" (UID: "e3a04c11-128b-4891-8f8c-bbee1e1dd21e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 15:14:32 crc kubenswrapper[4722]: I0309 15:14:32.531409 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5g7xx\" (UniqueName: \"kubernetes.io/projected/e3a04c11-128b-4891-8f8c-bbee1e1dd21e-kube-api-access-5g7xx\") on node \"crc\" DevicePath \"\"" Mar 09 15:14:32 crc kubenswrapper[4722]: I0309 15:14:32.531750 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3a04c11-128b-4891-8f8c-bbee1e1dd21e-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 15:14:32 crc kubenswrapper[4722]: I0309 15:14:32.531764 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3a04c11-128b-4891-8f8c-bbee1e1dd21e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 15:14:32 crc kubenswrapper[4722]: I0309 15:14:32.863476 4722 generic.go:334] "Generic (PLEG): container finished" podID="e3a04c11-128b-4891-8f8c-bbee1e1dd21e" containerID="503cbeb05a72924176802c504e497cf4af41a34edf11edc8a0499ca03b5142b5" exitCode=0 Mar 09 15:14:32 crc kubenswrapper[4722]: I0309 15:14:32.863517 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bddwl" event={"ID":"e3a04c11-128b-4891-8f8c-bbee1e1dd21e","Type":"ContainerDied","Data":"503cbeb05a72924176802c504e497cf4af41a34edf11edc8a0499ca03b5142b5"} Mar 09 15:14:32 crc kubenswrapper[4722]: I0309 15:14:32.863541 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bddwl" event={"ID":"e3a04c11-128b-4891-8f8c-bbee1e1dd21e","Type":"ContainerDied","Data":"76cfe7d0614bafdfc70e6fd3caa74e9e49dfe96acf391e4151f2dc4d0e674a90"} Mar 09 15:14:32 crc kubenswrapper[4722]: I0309 15:14:32.863556 4722 scope.go:117] "RemoveContainer" containerID="503cbeb05a72924176802c504e497cf4af41a34edf11edc8a0499ca03b5142b5" Mar 09 15:14:32 crc kubenswrapper[4722]: I0309 15:14:32.863670 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bddwl" Mar 09 15:14:32 crc kubenswrapper[4722]: I0309 15:14:32.905763 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bddwl"] Mar 09 15:14:32 crc kubenswrapper[4722]: I0309 15:14:32.909562 4722 scope.go:117] "RemoveContainer" containerID="475686c383ecab30ce7933d122d931a898319703a70187d40cf4b2fc78304324" Mar 09 15:14:32 crc kubenswrapper[4722]: I0309 15:14:32.928827 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bddwl"] Mar 09 15:14:32 crc kubenswrapper[4722]: I0309 15:14:32.932586 4722 scope.go:117] "RemoveContainer" containerID="65dfa6129c4d5616d27b36f972c7cda566cbf810250a571da80eeb718cb6154b" Mar 09 15:14:32 crc kubenswrapper[4722]: I0309 15:14:32.981430 4722 scope.go:117] "RemoveContainer" containerID="503cbeb05a72924176802c504e497cf4af41a34edf11edc8a0499ca03b5142b5" Mar 09 15:14:32 crc kubenswrapper[4722]: E0309 15:14:32.981813 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"503cbeb05a72924176802c504e497cf4af41a34edf11edc8a0499ca03b5142b5\": container with ID starting with 503cbeb05a72924176802c504e497cf4af41a34edf11edc8a0499ca03b5142b5 not found: ID does not exist" containerID="503cbeb05a72924176802c504e497cf4af41a34edf11edc8a0499ca03b5142b5" Mar 09 15:14:32 crc kubenswrapper[4722]: I0309 15:14:32.981846 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"503cbeb05a72924176802c504e497cf4af41a34edf11edc8a0499ca03b5142b5"} err="failed to get container status \"503cbeb05a72924176802c504e497cf4af41a34edf11edc8a0499ca03b5142b5\": rpc error: code = NotFound desc = could not find container \"503cbeb05a72924176802c504e497cf4af41a34edf11edc8a0499ca03b5142b5\": container with ID starting with 503cbeb05a72924176802c504e497cf4af41a34edf11edc8a0499ca03b5142b5 not found: ID does not exist" Mar 09 15:14:32 crc kubenswrapper[4722]: I0309 15:14:32.981867 4722 scope.go:117] "RemoveContainer" containerID="475686c383ecab30ce7933d122d931a898319703a70187d40cf4b2fc78304324" Mar 09 15:14:32 crc kubenswrapper[4722]: E0309 15:14:32.982063 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"475686c383ecab30ce7933d122d931a898319703a70187d40cf4b2fc78304324\": container with ID starting with 475686c383ecab30ce7933d122d931a898319703a70187d40cf4b2fc78304324 not found: ID does not exist" containerID="475686c383ecab30ce7933d122d931a898319703a70187d40cf4b2fc78304324" Mar 09 15:14:32 crc kubenswrapper[4722]: I0309 15:14:32.982085 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"475686c383ecab30ce7933d122d931a898319703a70187d40cf4b2fc78304324"} err="failed to get container status \"475686c383ecab30ce7933d122d931a898319703a70187d40cf4b2fc78304324\": rpc error: code = NotFound desc = could not find container \"475686c383ecab30ce7933d122d931a898319703a70187d40cf4b2fc78304324\": container with ID starting with 475686c383ecab30ce7933d122d931a898319703a70187d40cf4b2fc78304324 not found: ID does not exist" Mar 09 15:14:32 crc kubenswrapper[4722]: I0309 15:14:32.982101 4722 scope.go:117] "RemoveContainer" containerID="65dfa6129c4d5616d27b36f972c7cda566cbf810250a571da80eeb718cb6154b" Mar 09 15:14:32 crc kubenswrapper[4722]: E0309 15:14:32.982380 4722 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"65dfa6129c4d5616d27b36f972c7cda566cbf810250a571da80eeb718cb6154b\": container with ID starting with 65dfa6129c4d5616d27b36f972c7cda566cbf810250a571da80eeb718cb6154b not found: ID does not exist" containerID="65dfa6129c4d5616d27b36f972c7cda566cbf810250a571da80eeb718cb6154b" Mar 09 15:14:32 crc kubenswrapper[4722]: I0309 15:14:32.982402 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65dfa6129c4d5616d27b36f972c7cda566cbf810250a571da80eeb718cb6154b"} err="failed to get container status \"65dfa6129c4d5616d27b36f972c7cda566cbf810250a571da80eeb718cb6154b\": rpc error: code = NotFound desc = could not find container \"65dfa6129c4d5616d27b36f972c7cda566cbf810250a571da80eeb718cb6154b\": container with ID starting with 65dfa6129c4d5616d27b36f972c7cda566cbf810250a571da80eeb718cb6154b not found: ID does not exist" Mar 09 15:14:34 crc kubenswrapper[4722]: I0309 15:14:34.172813 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3a04c11-128b-4891-8f8c-bbee1e1dd21e" path="/var/lib/kubelet/pods/e3a04c11-128b-4891-8f8c-bbee1e1dd21e/volumes" Mar 09 15:14:51 crc kubenswrapper[4722]: I0309 15:14:51.528078 4722 patch_prober.go:28] interesting pod/machine-config-daemon-hjrrb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 15:14:51 crc kubenswrapper[4722]: I0309 15:14:51.528784 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 15:15:00 crc kubenswrapper[4722]: I0309 15:15:00.182054 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551155-rllq4"] Mar 09 15:15:00 crc kubenswrapper[4722]: E0309 15:15:00.182903 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3a04c11-128b-4891-8f8c-bbee1e1dd21e" containerName="registry-server" Mar 09 15:15:00 crc kubenswrapper[4722]: I0309 15:15:00.182916 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3a04c11-128b-4891-8f8c-bbee1e1dd21e" containerName="registry-server" Mar 09 15:15:00 crc kubenswrapper[4722]: E0309 15:15:00.182933 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3a04c11-128b-4891-8f8c-bbee1e1dd21e" containerName="extract-content" Mar 09 15:15:00 crc kubenswrapper[4722]: I0309 15:15:00.182941 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3a04c11-128b-4891-8f8c-bbee1e1dd21e" containerName="extract-content" Mar 09 15:15:00 crc kubenswrapper[4722]: E0309 15:15:00.182955 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3a04c11-128b-4891-8f8c-bbee1e1dd21e" containerName="extract-utilities" Mar 09 15:15:00 crc kubenswrapper[4722]: I0309 15:15:00.182962 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3a04c11-128b-4891-8f8c-bbee1e1dd21e" containerName="extract-utilities" Mar 09 15:15:00 crc kubenswrapper[4722]: I0309 15:15:00.183281 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3a04c11-128b-4891-8f8c-bbee1e1dd21e" containerName="registry-server" Mar 09 15:15:00 crc 
kubenswrapper[4722]: I0309 15:15:00.184441 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551155-rllq4" Mar 09 15:15:00 crc kubenswrapper[4722]: I0309 15:15:00.187134 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 09 15:15:00 crc kubenswrapper[4722]: I0309 15:15:00.187414 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 09 15:15:00 crc kubenswrapper[4722]: I0309 15:15:00.195652 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551155-rllq4"] Mar 09 15:15:00 crc kubenswrapper[4722]: I0309 15:15:00.324080 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3d3ff2c4-8779-4662-858b-4fe557fa2da9-config-volume\") pod \"collect-profiles-29551155-rllq4\" (UID: \"3d3ff2c4-8779-4662-858b-4fe557fa2da9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551155-rllq4" Mar 09 15:15:00 crc kubenswrapper[4722]: I0309 15:15:00.324401 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slk2w\" (UniqueName: \"kubernetes.io/projected/3d3ff2c4-8779-4662-858b-4fe557fa2da9-kube-api-access-slk2w\") pod \"collect-profiles-29551155-rllq4\" (UID: \"3d3ff2c4-8779-4662-858b-4fe557fa2da9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551155-rllq4" Mar 09 15:15:00 crc kubenswrapper[4722]: I0309 15:15:00.324594 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3d3ff2c4-8779-4662-858b-4fe557fa2da9-secret-volume\") pod \"collect-profiles-29551155-rllq4\" (UID: \"3d3ff2c4-8779-4662-858b-4fe557fa2da9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551155-rllq4" Mar 09 15:15:00 crc kubenswrapper[4722]: I0309 15:15:00.428379 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3d3ff2c4-8779-4662-858b-4fe557fa2da9-config-volume\") pod \"collect-profiles-29551155-rllq4\" (UID: \"3d3ff2c4-8779-4662-858b-4fe557fa2da9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551155-rllq4" Mar 09 15:15:00 crc kubenswrapper[4722]: I0309 15:15:00.428435 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slk2w\" (UniqueName: \"kubernetes.io/projected/3d3ff2c4-8779-4662-858b-4fe557fa2da9-kube-api-access-slk2w\") pod \"collect-profiles-29551155-rllq4\" (UID: \"3d3ff2c4-8779-4662-858b-4fe557fa2da9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551155-rllq4" Mar 09 15:15:00 crc kubenswrapper[4722]: I0309 15:15:00.428547 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3d3ff2c4-8779-4662-858b-4fe557fa2da9-secret-volume\") pod \"collect-profiles-29551155-rllq4\" (UID: \"3d3ff2c4-8779-4662-858b-4fe557fa2da9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551155-rllq4" Mar 09 15:15:00 crc kubenswrapper[4722]: I0309 15:15:00.431085 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" 
(UniqueName: \"kubernetes.io/configmap/3d3ff2c4-8779-4662-858b-4fe557fa2da9-config-volume\") pod \"collect-profiles-29551155-rllq4\" (UID: \"3d3ff2c4-8779-4662-858b-4fe557fa2da9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551155-rllq4" Mar 09 15:15:00 crc kubenswrapper[4722]: I0309 15:15:00.435926 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3d3ff2c4-8779-4662-858b-4fe557fa2da9-secret-volume\") pod \"collect-profiles-29551155-rllq4\" (UID: \"3d3ff2c4-8779-4662-858b-4fe557fa2da9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551155-rllq4" Mar 09 15:15:00 crc kubenswrapper[4722]: I0309 15:15:00.446562 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slk2w\" (UniqueName: \"kubernetes.io/projected/3d3ff2c4-8779-4662-858b-4fe557fa2da9-kube-api-access-slk2w\") pod \"collect-profiles-29551155-rllq4\" (UID: \"3d3ff2c4-8779-4662-858b-4fe557fa2da9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551155-rllq4" Mar 09 15:15:00 crc kubenswrapper[4722]: I0309 15:15:00.536726 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551155-rllq4" Mar 09 15:15:01 crc kubenswrapper[4722]: I0309 15:15:01.056557 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551155-rllq4"] Mar 09 15:15:01 crc kubenswrapper[4722]: I0309 15:15:01.262113 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551155-rllq4" event={"ID":"3d3ff2c4-8779-4662-858b-4fe557fa2da9","Type":"ContainerStarted","Data":"e118d179ac991a93210d28083535f6966572b1a9fc9065918d730c5f3b1621a7"} Mar 09 15:15:01 crc kubenswrapper[4722]: I0309 15:15:01.262498 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551155-rllq4" event={"ID":"3d3ff2c4-8779-4662-858b-4fe557fa2da9","Type":"ContainerStarted","Data":"b12f721fd655177326215a6501a2baa198482f56f26ada212a91161978a966af"} Mar 09 15:15:01 crc kubenswrapper[4722]: I0309 15:15:01.290258 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29551155-rllq4" podStartSLOduration=1.290213783 podStartE2EDuration="1.290213783s" podCreationTimestamp="2026-03-09 15:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 15:15:01.281507415 +0000 UTC m=+4341.837075991" watchObservedRunningTime="2026-03-09 15:15:01.290213783 +0000 UTC m=+4341.845782359" Mar 09 15:15:02 crc kubenswrapper[4722]: I0309 15:15:02.281661 4722 generic.go:334] "Generic (PLEG): container finished" podID="3d3ff2c4-8779-4662-858b-4fe557fa2da9" containerID="e118d179ac991a93210d28083535f6966572b1a9fc9065918d730c5f3b1621a7" exitCode=0 Mar 09 15:15:02 crc kubenswrapper[4722]: I0309 15:15:02.282065 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551155-rllq4" event={"ID":"3d3ff2c4-8779-4662-858b-4fe557fa2da9","Type":"ContainerDied","Data":"e118d179ac991a93210d28083535f6966572b1a9fc9065918d730c5f3b1621a7"} Mar 09 15:15:03 crc kubenswrapper[4722]: I0309 15:15:03.818604 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551155-rllq4" Mar 09 15:15:03 crc kubenswrapper[4722]: I0309 15:15:03.912648 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3d3ff2c4-8779-4662-858b-4fe557fa2da9-config-volume\") pod \"3d3ff2c4-8779-4662-858b-4fe557fa2da9\" (UID: \"3d3ff2c4-8779-4662-858b-4fe557fa2da9\") " Mar 09 15:15:03 crc kubenswrapper[4722]: I0309 15:15:03.912785 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3d3ff2c4-8779-4662-858b-4fe557fa2da9-secret-volume\") pod \"3d3ff2c4-8779-4662-858b-4fe557fa2da9\" (UID: \"3d3ff2c4-8779-4662-858b-4fe557fa2da9\") " Mar 09 15:15:03 crc kubenswrapper[4722]: I0309 15:15:03.913024 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-slk2w\" (UniqueName: \"kubernetes.io/projected/3d3ff2c4-8779-4662-858b-4fe557fa2da9-kube-api-access-slk2w\") pod \"3d3ff2c4-8779-4662-858b-4fe557fa2da9\" (UID: \"3d3ff2c4-8779-4662-858b-4fe557fa2da9\") " Mar 09 15:15:03 crc kubenswrapper[4722]: I0309 15:15:03.913641 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d3ff2c4-8779-4662-858b-4fe557fa2da9-config-volume" (OuterVolumeSpecName: "config-volume") pod "3d3ff2c4-8779-4662-858b-4fe557fa2da9" (UID: "3d3ff2c4-8779-4662-858b-4fe557fa2da9"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 15:15:03 crc kubenswrapper[4722]: I0309 15:15:03.914132 4722 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3d3ff2c4-8779-4662-858b-4fe557fa2da9-config-volume\") on node \"crc\" DevicePath \"\"" Mar 09 15:15:03 crc kubenswrapper[4722]: I0309 15:15:03.920373 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d3ff2c4-8779-4662-858b-4fe557fa2da9-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3d3ff2c4-8779-4662-858b-4fe557fa2da9" (UID: "3d3ff2c4-8779-4662-858b-4fe557fa2da9"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 15:15:03 crc kubenswrapper[4722]: I0309 15:15:03.920558 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d3ff2c4-8779-4662-858b-4fe557fa2da9-kube-api-access-slk2w" (OuterVolumeSpecName: "kube-api-access-slk2w") pod "3d3ff2c4-8779-4662-858b-4fe557fa2da9" (UID: "3d3ff2c4-8779-4662-858b-4fe557fa2da9"). InnerVolumeSpecName "kube-api-access-slk2w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 15:15:04 crc kubenswrapper[4722]: I0309 15:15:04.019105 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-slk2w\" (UniqueName: \"kubernetes.io/projected/3d3ff2c4-8779-4662-858b-4fe557fa2da9-kube-api-access-slk2w\") on node \"crc\" DevicePath \"\"" Mar 09 15:15:04 crc kubenswrapper[4722]: I0309 15:15:04.019152 4722 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3d3ff2c4-8779-4662-858b-4fe557fa2da9-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 09 15:15:04 crc kubenswrapper[4722]: I0309 15:15:04.327081 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551155-rllq4" event={"ID":"3d3ff2c4-8779-4662-858b-4fe557fa2da9","Type":"ContainerDied","Data":"b12f721fd655177326215a6501a2baa198482f56f26ada212a91161978a966af"} Mar 09 15:15:04 crc kubenswrapper[4722]: I0309 15:15:04.327448 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b12f721fd655177326215a6501a2baa198482f56f26ada212a91161978a966af" Mar 09 15:15:04 crc kubenswrapper[4722]: I0309 15:15:04.327458 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551155-rllq4" Mar 09 15:15:04 crc kubenswrapper[4722]: I0309 15:15:04.368505 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551110-8j24g"] Mar 09 15:15:04 crc kubenswrapper[4722]: I0309 15:15:04.378285 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551110-8j24g"] Mar 09 15:15:06 crc kubenswrapper[4722]: I0309 15:15:06.172099 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="673ece17-a82b-4e82-b811-5c704ab9a1b4" path="/var/lib/kubelet/pods/673ece17-a82b-4e82-b811-5c704ab9a1b4/volumes" Mar 09 15:15:21 crc kubenswrapper[4722]: I0309 15:15:21.528156 4722 patch_prober.go:28] interesting pod/machine-config-daemon-hjrrb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 15:15:21 crc kubenswrapper[4722]: I0309 15:15:21.528798 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 15:15:30 crc kubenswrapper[4722]: I0309 15:15:30.315158 4722 scope.go:117] "RemoveContainer" containerID="d6113c52887efd5fe65d1135756dbb3f60c61fb6a4dcfe637d9efc547c003933" Mar 09 15:15:51 crc kubenswrapper[4722]: I0309 15:15:51.527779 4722 patch_prober.go:28] interesting pod/machine-config-daemon-hjrrb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 15:15:51 crc kubenswrapper[4722]: I0309 15:15:51.528372 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 15:15:51 crc kubenswrapper[4722]: I0309 15:15:51.528419 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" Mar 09 15:15:51 crc kubenswrapper[4722]: I0309 15:15:51.529256 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ed831924138fb872c73acc14a5ac7c6ed36b421017c8fe60dbbaede3d8f11789"} pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 15:15:51 crc kubenswrapper[4722]: I0309 15:15:51.529311 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerName="machine-config-daemon" containerID="cri-o://ed831924138fb872c73acc14a5ac7c6ed36b421017c8fe60dbbaede3d8f11789" gracePeriod=600 Mar 09 15:15:51 crc kubenswrapper[4722]: E0309 15:15:51.651106 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 15:15:52 crc kubenswrapper[4722]: I0309 15:15:52.239345 4722 generic.go:334] "Generic (PLEG): container finished" podID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerID="ed831924138fb872c73acc14a5ac7c6ed36b421017c8fe60dbbaede3d8f11789" exitCode=0 Mar 09 15:15:52 crc kubenswrapper[4722]: I0309 15:15:52.239382 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" event={"ID":"dac2aaf5-653b-4b2a-8efe-ed26bac8d648","Type":"ContainerDied","Data":"ed831924138fb872c73acc14a5ac7c6ed36b421017c8fe60dbbaede3d8f11789"} Mar 09 15:15:52 crc kubenswrapper[4722]: I0309 15:15:52.239859 4722 scope.go:117] "RemoveContainer" containerID="605e7cfbce2d49113bae14138b0367a77ac331e6056b4329529ab18e27f72b42" Mar 09 15:15:52 crc kubenswrapper[4722]: I0309 15:15:52.240593 4722 scope.go:117] "RemoveContainer" containerID="ed831924138fb872c73acc14a5ac7c6ed36b421017c8fe60dbbaede3d8f11789" Mar 09 15:15:52 crc kubenswrapper[4722]: E0309 15:15:52.240886 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 15:16:00 crc kubenswrapper[4722]: I0309 15:16:00.166840 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551156-w2nm6"] Mar 09 15:16:00 crc kubenswrapper[4722]: E0309 15:16:00.168003 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d3ff2c4-8779-4662-858b-4fe557fa2da9" containerName="collect-profiles" Mar 09 15:16:00 crc kubenswrapper[4722]: I0309 
Mar 09 15:15:52 crc kubenswrapper[4722]: I0309 15:15:52.239345 4722 generic.go:334] "Generic (PLEG): container finished" podID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerID="ed831924138fb872c73acc14a5ac7c6ed36b421017c8fe60dbbaede3d8f11789" exitCode=0
Mar 09 15:15:52 crc kubenswrapper[4722]: I0309 15:15:52.239382 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" event={"ID":"dac2aaf5-653b-4b2a-8efe-ed26bac8d648","Type":"ContainerDied","Data":"ed831924138fb872c73acc14a5ac7c6ed36b421017c8fe60dbbaede3d8f11789"}
Mar 09 15:15:52 crc kubenswrapper[4722]: I0309 15:15:52.239859 4722 scope.go:117] "RemoveContainer" containerID="605e7cfbce2d49113bae14138b0367a77ac331e6056b4329529ab18e27f72b42"
Mar 09 15:15:52 crc kubenswrapper[4722]: I0309 15:15:52.240593 4722 scope.go:117] "RemoveContainer" containerID="ed831924138fb872c73acc14a5ac7c6ed36b421017c8fe60dbbaede3d8f11789"
Mar 09 15:15:52 crc kubenswrapper[4722]: E0309 15:15:52.240886 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648"
Mar 09 15:16:00 crc kubenswrapper[4722]: I0309 15:16:00.166840 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551156-w2nm6"]
Mar 09 15:16:00 crc kubenswrapper[4722]: E0309 15:16:00.168003 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d3ff2c4-8779-4662-858b-4fe557fa2da9" containerName="collect-profiles"
Mar 09 15:16:00 crc kubenswrapper[4722]: I0309 15:16:00.168020 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d3ff2c4-8779-4662-858b-4fe557fa2da9" containerName="collect-profiles"
Mar 09 15:16:00 crc kubenswrapper[4722]: I0309 15:16:00.168394 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d3ff2c4-8779-4662-858b-4fe557fa2da9" containerName="collect-profiles"
Mar 09 15:16:00 crc kubenswrapper[4722]: I0309 15:16:00.170161 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551156-w2nm6"
Mar 09 15:16:00 crc kubenswrapper[4722]: I0309 15:16:00.174693 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-dr2x6"
Mar 09 15:16:00 crc kubenswrapper[4722]: I0309 15:16:00.175338 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 15:16:00 crc kubenswrapper[4722]: I0309 15:16:00.175833 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 15:16:00 crc kubenswrapper[4722]: I0309 15:16:00.178453 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551156-w2nm6"]
Mar 09 15:16:00 crc kubenswrapper[4722]: I0309 15:16:00.254914 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28rwm\" (UniqueName: \"kubernetes.io/projected/635b0387-641e-4b9a-a7eb-adbe84ed5d01-kube-api-access-28rwm\") pod \"auto-csr-approver-29551156-w2nm6\" (UID: \"635b0387-641e-4b9a-a7eb-adbe84ed5d01\") " pod="openshift-infra/auto-csr-approver-29551156-w2nm6"
Mar 09 15:16:00 crc kubenswrapper[4722]: I0309 15:16:00.357392 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28rwm\" (UniqueName: \"kubernetes.io/projected/635b0387-641e-4b9a-a7eb-adbe84ed5d01-kube-api-access-28rwm\") pod \"auto-csr-approver-29551156-w2nm6\" (UID: \"635b0387-641e-4b9a-a7eb-adbe84ed5d01\") " pod="openshift-infra/auto-csr-approver-29551156-w2nm6"
Mar 09 15:16:00 crc kubenswrapper[4722]: I0309 15:16:00.378029 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28rwm\" (UniqueName: \"kubernetes.io/projected/635b0387-641e-4b9a-a7eb-adbe84ed5d01-kube-api-access-28rwm\") pod \"auto-csr-approver-29551156-w2nm6\" (UID: \"635b0387-641e-4b9a-a7eb-adbe84ed5d01\") " pod="openshift-infra/auto-csr-approver-29551156-w2nm6"
Mar 09 15:16:00 crc kubenswrapper[4722]: I0309 15:16:00.500336 4722 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551156-w2nm6" Mar 09 15:16:01 crc kubenswrapper[4722]: I0309 15:16:01.054650 4722 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 15:16:01 crc kubenswrapper[4722]: I0309 15:16:01.056391 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551156-w2nm6"] Mar 09 15:16:01 crc kubenswrapper[4722]: I0309 15:16:01.366886 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551156-w2nm6" event={"ID":"635b0387-641e-4b9a-a7eb-adbe84ed5d01","Type":"ContainerStarted","Data":"975a101aa08b4fcd390bd39a434fd3d66ac561a1c775b55b1117f06769869542"} Mar 09 15:16:03 crc kubenswrapper[4722]: I0309 15:16:03.394323 4722 generic.go:334] "Generic (PLEG): container finished" podID="635b0387-641e-4b9a-a7eb-adbe84ed5d01" containerID="4f4f60e84c0f4934025a4b6014bc5592a2608f1e7b3d5ee5cf6960d91c19b24e" exitCode=0 Mar 09 15:16:03 crc kubenswrapper[4722]: I0309 15:16:03.394432 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551156-w2nm6" event={"ID":"635b0387-641e-4b9a-a7eb-adbe84ed5d01","Type":"ContainerDied","Data":"4f4f60e84c0f4934025a4b6014bc5592a2608f1e7b3d5ee5cf6960d91c19b24e"} Mar 09 15:16:04 crc kubenswrapper[4722]: I0309 15:16:04.862319 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551156-w2nm6" Mar 09 15:16:04 crc kubenswrapper[4722]: I0309 15:16:04.979867 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28rwm\" (UniqueName: \"kubernetes.io/projected/635b0387-641e-4b9a-a7eb-adbe84ed5d01-kube-api-access-28rwm\") pod \"635b0387-641e-4b9a-a7eb-adbe84ed5d01\" (UID: \"635b0387-641e-4b9a-a7eb-adbe84ed5d01\") " Mar 09 15:16:04 crc kubenswrapper[4722]: I0309 15:16:04.986850 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/635b0387-641e-4b9a-a7eb-adbe84ed5d01-kube-api-access-28rwm" (OuterVolumeSpecName: "kube-api-access-28rwm") pod "635b0387-641e-4b9a-a7eb-adbe84ed5d01" (UID: "635b0387-641e-4b9a-a7eb-adbe84ed5d01"). InnerVolumeSpecName "kube-api-access-28rwm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 15:16:05 crc kubenswrapper[4722]: I0309 15:16:05.082355 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28rwm\" (UniqueName: \"kubernetes.io/projected/635b0387-641e-4b9a-a7eb-adbe84ed5d01-kube-api-access-28rwm\") on node \"crc\" DevicePath \"\"" Mar 09 15:16:05 crc kubenswrapper[4722]: I0309 15:16:05.420710 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551156-w2nm6" event={"ID":"635b0387-641e-4b9a-a7eb-adbe84ed5d01","Type":"ContainerDied","Data":"975a101aa08b4fcd390bd39a434fd3d66ac561a1c775b55b1117f06769869542"} Mar 09 15:16:05 crc kubenswrapper[4722]: I0309 15:16:05.421029 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="975a101aa08b4fcd390bd39a434fd3d66ac561a1c775b55b1117f06769869542" Mar 09 15:16:05 crc kubenswrapper[4722]: I0309 15:16:05.420837 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551156-w2nm6"
Mar 09 15:16:05 crc kubenswrapper[4722]: I0309 15:16:05.945479 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551150-4lzhw"]
Mar 09 15:16:05 crc kubenswrapper[4722]: I0309 15:16:05.956353 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551150-4lzhw"]
Mar 09 15:16:06 crc kubenswrapper[4722]: I0309 15:16:06.164496 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e82423a0-60f3-4c62-82ba-2b34ae623247" path="/var/lib/kubelet/pods/e82423a0-60f3-4c62-82ba-2b34ae623247/volumes"
Mar 09 15:16:08 crc kubenswrapper[4722]: I0309 15:16:08.149671 4722 scope.go:117] "RemoveContainer" containerID="ed831924138fb872c73acc14a5ac7c6ed36b421017c8fe60dbbaede3d8f11789"
Mar 09 15:16:08 crc kubenswrapper[4722]: E0309 15:16:08.150745 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648"
Mar 09 15:16:23 crc kubenswrapper[4722]: I0309 15:16:23.148812 4722 scope.go:117] "RemoveContainer" containerID="ed831924138fb872c73acc14a5ac7c6ed36b421017c8fe60dbbaede3d8f11789"
Mar 09 15:16:23 crc kubenswrapper[4722]: E0309 15:16:23.149528 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648"
Mar 09 15:16:30 crc kubenswrapper[4722]: I0309 15:16:30.433919 4722 scope.go:117] "RemoveContainer" containerID="7a6ee4b2904c6524c8599793fcfec2e6c3d24072549f2309fdda697531abfc6d"
Mar 09 15:16:37 crc kubenswrapper[4722]: I0309 15:16:37.149664 4722 scope.go:117] "RemoveContainer" containerID="ed831924138fb872c73acc14a5ac7c6ed36b421017c8fe60dbbaede3d8f11789"
Mar 09 15:16:37 crc kubenswrapper[4722]: E0309 15:16:37.150669 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648"
Mar 09 15:16:51 crc kubenswrapper[4722]: I0309 15:16:51.149559 4722 scope.go:117] "RemoveContainer" containerID="ed831924138fb872c73acc14a5ac7c6ed36b421017c8fe60dbbaede3d8f11789"
Mar 09 15:16:51 crc kubenswrapper[4722]: E0309 15:16:51.150500 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648"
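
[note] The recurring "back-off 5m0s" errors are kubelet's container restart back-off at its ceiling: per the Kubernetes documentation the delay starts at 10s and doubles per crash up to a 5m cap. A small sketch of that schedule (illustrative only; the real logic lives in kubelet's back-off tracking, not shown here):

    package main

    import (
    	"fmt"
    	"time"
    )

    // crashLoopDelay returns the documented restart back-off for a
    // container that has crashed `restarts` times in a row:
    // 10s, doubling per restart, capped at 5m.
    func crashLoopDelay(restarts int) time.Duration {
    	const (
    		base     = 10 * time.Second
    		maxDelay = 5 * time.Minute
    	)
    	d := base
    	for i := 0; i < restarts; i++ {
    		d *= 2
    		if d > maxDelay {
    			return maxDelay
    		}
    	}
    	return d
    }

    func main() {
    	for r := 0; r <= 6; r++ {
    		fmt.Printf("crash #%d -> next start delayed %v\n", r+1, crashLoopDelay(r))
    	}
    	// From crash #6 onward this prints 5m0s, matching "back-off 5m0s".
    }
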
Mar 09 15:17:06 crc kubenswrapper[4722]: I0309 15:17:06.151396 4722 scope.go:117] "RemoveContainer" containerID="ed831924138fb872c73acc14a5ac7c6ed36b421017c8fe60dbbaede3d8f11789"
Mar 09 15:17:06 crc kubenswrapper[4722]: E0309 15:17:06.152325 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648"
Mar 09 15:17:21 crc kubenswrapper[4722]: I0309 15:17:21.149039 4722 scope.go:117] "RemoveContainer" containerID="ed831924138fb872c73acc14a5ac7c6ed36b421017c8fe60dbbaede3d8f11789"
Mar 09 15:17:21 crc kubenswrapper[4722]: E0309 15:17:21.150035 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648"
Mar 09 15:17:34 crc kubenswrapper[4722]: I0309 15:17:34.150138 4722 scope.go:117] "RemoveContainer" containerID="ed831924138fb872c73acc14a5ac7c6ed36b421017c8fe60dbbaede3d8f11789"
Mar 09 15:17:34 crc kubenswrapper[4722]: E0309 15:17:34.152154 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648"
Mar 09 15:17:45 crc kubenswrapper[4722]: I0309 15:17:45.150251 4722 scope.go:117] "RemoveContainer" containerID="ed831924138fb872c73acc14a5ac7c6ed36b421017c8fe60dbbaede3d8f11789"
Mar 09 15:17:45 crc kubenswrapper[4722]: E0309 15:17:45.151440 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648"
Mar 09 15:17:56 crc kubenswrapper[4722]: I0309 15:17:56.150225 4722 scope.go:117] "RemoveContainer" containerID="ed831924138fb872c73acc14a5ac7c6ed36b421017c8fe60dbbaede3d8f11789"
Mar 09 15:17:56 crc kubenswrapper[4722]: E0309 15:17:56.151110 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648"
Mar 09 15:18:00 crc kubenswrapper[4722]: I0309 15:18:00.164065 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551158-pz25v"]
Mar 09 15:18:00 crc
kubenswrapper[4722]: E0309 15:18:00.165060 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="635b0387-641e-4b9a-a7eb-adbe84ed5d01" containerName="oc" Mar 09 15:18:00 crc kubenswrapper[4722]: I0309 15:18:00.165076 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="635b0387-641e-4b9a-a7eb-adbe84ed5d01" containerName="oc" Mar 09 15:18:00 crc kubenswrapper[4722]: I0309 15:18:00.165432 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="635b0387-641e-4b9a-a7eb-adbe84ed5d01" containerName="oc" Mar 09 15:18:00 crc kubenswrapper[4722]: I0309 15:18:00.166619 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551158-pz25v" Mar 09 15:18:00 crc kubenswrapper[4722]: I0309 15:18:00.169836 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 15:18:00 crc kubenswrapper[4722]: I0309 15:18:00.170111 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-dr2x6" Mar 09 15:18:00 crc kubenswrapper[4722]: I0309 15:18:00.170311 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 15:18:00 crc kubenswrapper[4722]: I0309 15:18:00.189068 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551158-pz25v"] Mar 09 15:18:00 crc kubenswrapper[4722]: I0309 15:18:00.238799 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9b2b2\" (UniqueName: \"kubernetes.io/projected/bf39b656-7744-4eed-9a1d-b278c8eeb3c0-kube-api-access-9b2b2\") pod \"auto-csr-approver-29551158-pz25v\" (UID: \"bf39b656-7744-4eed-9a1d-b278c8eeb3c0\") " pod="openshift-infra/auto-csr-approver-29551158-pz25v" Mar 09 15:18:00 crc kubenswrapper[4722]: I0309 15:18:00.341632 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9b2b2\" (UniqueName: \"kubernetes.io/projected/bf39b656-7744-4eed-9a1d-b278c8eeb3c0-kube-api-access-9b2b2\") pod \"auto-csr-approver-29551158-pz25v\" (UID: \"bf39b656-7744-4eed-9a1d-b278c8eeb3c0\") " pod="openshift-infra/auto-csr-approver-29551158-pz25v" Mar 09 15:18:00 crc kubenswrapper[4722]: I0309 15:18:00.361408 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9b2b2\" (UniqueName: \"kubernetes.io/projected/bf39b656-7744-4eed-9a1d-b278c8eeb3c0-kube-api-access-9b2b2\") pod \"auto-csr-approver-29551158-pz25v\" (UID: \"bf39b656-7744-4eed-9a1d-b278c8eeb3c0\") " pod="openshift-infra/auto-csr-approver-29551158-pz25v" Mar 09 15:18:00 crc kubenswrapper[4722]: I0309 15:18:00.498812 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551158-pz25v" Mar 09 15:18:01 crc kubenswrapper[4722]: I0309 15:18:01.079667 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551158-pz25v"] Mar 09 15:18:01 crc kubenswrapper[4722]: I0309 15:18:01.844636 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551158-pz25v" event={"ID":"bf39b656-7744-4eed-9a1d-b278c8eeb3c0","Type":"ContainerStarted","Data":"f029948796a42906a06b6b03b24c9590afdd7053b2665d1db226a2ab45a48980"} Mar 09 15:18:02 crc kubenswrapper[4722]: I0309 15:18:02.868472 4722 generic.go:334] "Generic (PLEG): container finished" podID="bf39b656-7744-4eed-9a1d-b278c8eeb3c0" containerID="7d831a8adee56ed8c32f1584589e4f2094a6800ee05874f152edb0d41a1042a0" exitCode=0 Mar 09 15:18:02 crc kubenswrapper[4722]: I0309 15:18:02.868533 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551158-pz25v" event={"ID":"bf39b656-7744-4eed-9a1d-b278c8eeb3c0","Type":"ContainerDied","Data":"7d831a8adee56ed8c32f1584589e4f2094a6800ee05874f152edb0d41a1042a0"} Mar 09 15:18:04 crc kubenswrapper[4722]: I0309 15:18:04.287993 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551158-pz25v" Mar 09 15:18:04 crc kubenswrapper[4722]: I0309 15:18:04.365692 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9b2b2\" (UniqueName: \"kubernetes.io/projected/bf39b656-7744-4eed-9a1d-b278c8eeb3c0-kube-api-access-9b2b2\") pod \"bf39b656-7744-4eed-9a1d-b278c8eeb3c0\" (UID: \"bf39b656-7744-4eed-9a1d-b278c8eeb3c0\") " Mar 09 15:18:04 crc kubenswrapper[4722]: I0309 15:18:04.373877 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf39b656-7744-4eed-9a1d-b278c8eeb3c0-kube-api-access-9b2b2" (OuterVolumeSpecName: "kube-api-access-9b2b2") pod "bf39b656-7744-4eed-9a1d-b278c8eeb3c0" (UID: "bf39b656-7744-4eed-9a1d-b278c8eeb3c0"). InnerVolumeSpecName "kube-api-access-9b2b2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 15:18:04 crc kubenswrapper[4722]: I0309 15:18:04.468786 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9b2b2\" (UniqueName: \"kubernetes.io/projected/bf39b656-7744-4eed-9a1d-b278c8eeb3c0-kube-api-access-9b2b2\") on node \"crc\" DevicePath \"\"" Mar 09 15:18:04 crc kubenswrapper[4722]: I0309 15:18:04.897517 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551158-pz25v" event={"ID":"bf39b656-7744-4eed-9a1d-b278c8eeb3c0","Type":"ContainerDied","Data":"f029948796a42906a06b6b03b24c9590afdd7053b2665d1db226a2ab45a48980"} Mar 09 15:18:04 crc kubenswrapper[4722]: I0309 15:18:04.897567 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f029948796a42906a06b6b03b24c9590afdd7053b2665d1db226a2ab45a48980" Mar 09 15:18:04 crc kubenswrapper[4722]: I0309 15:18:04.897646 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551158-pz25v"
Mar 09 15:18:05 crc kubenswrapper[4722]: I0309 15:18:05.361740 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551152-95ng8"]
Mar 09 15:18:05 crc kubenswrapper[4722]: I0309 15:18:05.374611 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551152-95ng8"]
Mar 09 15:18:06 crc kubenswrapper[4722]: I0309 15:18:06.165577 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c64bc37-546f-48b6-9fae-e1c085dc63a4" path="/var/lib/kubelet/pods/9c64bc37-546f-48b6-9fae-e1c085dc63a4/volumes"
Mar 09 15:18:07 crc kubenswrapper[4722]: I0309 15:18:07.150996 4722 scope.go:117] "RemoveContainer" containerID="ed831924138fb872c73acc14a5ac7c6ed36b421017c8fe60dbbaede3d8f11789"
Mar 09 15:18:07 crc kubenswrapper[4722]: E0309 15:18:07.154341 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648"
Mar 09 15:18:21 crc kubenswrapper[4722]: I0309 15:18:21.150436 4722 scope.go:117] "RemoveContainer" containerID="ed831924138fb872c73acc14a5ac7c6ed36b421017c8fe60dbbaede3d8f11789"
Mar 09 15:18:21 crc kubenswrapper[4722]: E0309 15:18:21.152323 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648"
Mar 09 15:18:30 crc kubenswrapper[4722]: I0309 15:18:30.572952 4722 scope.go:117] "RemoveContainer" containerID="57f703087ce04c4eff5680bef89eac7a19d5670e5094d244222882ab07740ecc"
Mar 09 15:18:36 crc kubenswrapper[4722]: I0309 15:18:36.150109 4722 scope.go:117] "RemoveContainer" containerID="ed831924138fb872c73acc14a5ac7c6ed36b421017c8fe60dbbaede3d8f11789"
Mar 09 15:18:36 crc kubenswrapper[4722]: E0309 15:18:36.150937 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648"
Mar 09 15:18:48 crc kubenswrapper[4722]: I0309 15:18:48.149927 4722 scope.go:117] "RemoveContainer" containerID="ed831924138fb872c73acc14a5ac7c6ed36b421017c8fe60dbbaede3d8f11789"
Mar 09 15:18:48 crc kubenswrapper[4722]: E0309 15:18:48.151064 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648"
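
[note] A side note on the job names: the numeric suffix in auto-csr-approver-29551158 is the CronJob controller's scheduled run time expressed in minutes since the Unix epoch (which is why consecutive runs here differ by 2, evidently a two-minute schedule). A quick Go check that the suffix decodes to the 15:18:00 creation time seen above:

    package main

    import (
    	"fmt"
    	"time"
    )

    // The CronJob controller names each Job <cronjob>-<minutes since
    // the Unix epoch of the scheduled time>. Decode the suffix:
    func main() {
    	const suffix = 29551158 // from auto-csr-approver-29551158
    	t := time.Unix(suffix*60, 0).UTC()
    	fmt.Println(t) // 2026-03-09 15:18:00 +0000 UTC
    }
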
Mar 09 15:19:01 crc kubenswrapper[4722]: I0309 15:19:01.149735 4722 scope.go:117] "RemoveContainer" containerID="ed831924138fb872c73acc14a5ac7c6ed36b421017c8fe60dbbaede3d8f11789"
Mar 09 15:19:01 crc kubenswrapper[4722]: E0309 15:19:01.150966 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648"
Mar 09 15:19:16 crc kubenswrapper[4722]: I0309 15:19:16.149410 4722 scope.go:117] "RemoveContainer" containerID="ed831924138fb872c73acc14a5ac7c6ed36b421017c8fe60dbbaede3d8f11789"
Mar 09 15:19:16 crc kubenswrapper[4722]: E0309 15:19:16.150105 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648"
Mar 09 15:19:29 crc kubenswrapper[4722]: I0309 15:19:29.152039 4722 scope.go:117] "RemoveContainer" containerID="ed831924138fb872c73acc14a5ac7c6ed36b421017c8fe60dbbaede3d8f11789"
Mar 09 15:19:29 crc kubenswrapper[4722]: E0309 15:19:29.153433 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648"
Mar 09 15:19:33 crc kubenswrapper[4722]: I0309 15:19:33.315506 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"]
Mar 09 15:19:33 crc kubenswrapper[4722]: E0309 15:19:33.317054 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf39b656-7744-4eed-9a1d-b278c8eeb3c0" containerName="oc"
Mar 09 15:19:33 crc kubenswrapper[4722]: I0309 15:19:33.317079 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf39b656-7744-4eed-9a1d-b278c8eeb3c0" containerName="oc"
Mar 09 15:19:33 crc kubenswrapper[4722]: I0309 15:19:33.317516 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf39b656-7744-4eed-9a1d-b278c8eeb3c0" containerName="oc"
Mar 09 15:19:33 crc kubenswrapper[4722]: I0309 15:19:33.318651 4722 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 09 15:19:33 crc kubenswrapper[4722]: I0309 15:19:33.322878 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-l2bdw" Mar 09 15:19:33 crc kubenswrapper[4722]: I0309 15:19:33.322933 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Mar 09 15:19:33 crc kubenswrapper[4722]: I0309 15:19:33.322968 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Mar 09 15:19:33 crc kubenswrapper[4722]: I0309 15:19:33.323490 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 09 15:19:33 crc kubenswrapper[4722]: I0309 15:19:33.331616 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 09 15:19:33 crc kubenswrapper[4722]: I0309 15:19:33.386036 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/89fd3e80-4c6c-4619-ad44-ef440c0b1fb6-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"89fd3e80-4c6c-4619-ad44-ef440c0b1fb6\") " pod="openstack/tempest-tests-tempest" Mar 09 15:19:33 crc kubenswrapper[4722]: I0309 15:19:33.386086 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5cgw\" (UniqueName: \"kubernetes.io/projected/89fd3e80-4c6c-4619-ad44-ef440c0b1fb6-kube-api-access-w5cgw\") pod \"tempest-tests-tempest\" (UID: \"89fd3e80-4c6c-4619-ad44-ef440c0b1fb6\") " pod="openstack/tempest-tests-tempest" Mar 09 15:19:33 crc kubenswrapper[4722]: I0309 15:19:33.386126 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/89fd3e80-4c6c-4619-ad44-ef440c0b1fb6-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"89fd3e80-4c6c-4619-ad44-ef440c0b1fb6\") " pod="openstack/tempest-tests-tempest" Mar 09 15:19:33 crc kubenswrapper[4722]: I0309 15:19:33.386344 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/89fd3e80-4c6c-4619-ad44-ef440c0b1fb6-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"89fd3e80-4c6c-4619-ad44-ef440c0b1fb6\") " pod="openstack/tempest-tests-tempest" Mar 09 15:19:33 crc kubenswrapper[4722]: I0309 15:19:33.386848 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/89fd3e80-4c6c-4619-ad44-ef440c0b1fb6-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"89fd3e80-4c6c-4619-ad44-ef440c0b1fb6\") " pod="openstack/tempest-tests-tempest" Mar 09 15:19:33 crc kubenswrapper[4722]: I0309 15:19:33.386960 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/89fd3e80-4c6c-4619-ad44-ef440c0b1fb6-config-data\") pod \"tempest-tests-tempest\" (UID: \"89fd3e80-4c6c-4619-ad44-ef440c0b1fb6\") " pod="openstack/tempest-tests-tempest" Mar 09 15:19:33 crc kubenswrapper[4722]: I0309 15:19:33.387053 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: \"89fd3e80-4c6c-4619-ad44-ef440c0b1fb6\") " pod="openstack/tempest-tests-tempest" Mar 09 15:19:33 crc kubenswrapper[4722]: I0309 15:19:33.387238 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/89fd3e80-4c6c-4619-ad44-ef440c0b1fb6-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"89fd3e80-4c6c-4619-ad44-ef440c0b1fb6\") " pod="openstack/tempest-tests-tempest" Mar 09 15:19:33 crc kubenswrapper[4722]: I0309 15:19:33.387338 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/89fd3e80-4c6c-4619-ad44-ef440c0b1fb6-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"89fd3e80-4c6c-4619-ad44-ef440c0b1fb6\") " pod="openstack/tempest-tests-tempest" Mar 09 15:19:33 crc kubenswrapper[4722]: I0309 15:19:33.490408 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: \"89fd3e80-4c6c-4619-ad44-ef440c0b1fb6\") " pod="openstack/tempest-tests-tempest" Mar 09 15:19:33 crc kubenswrapper[4722]: I0309 15:19:33.490485 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/89fd3e80-4c6c-4619-ad44-ef440c0b1fb6-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"89fd3e80-4c6c-4619-ad44-ef440c0b1fb6\") " pod="openstack/tempest-tests-tempest" Mar 09 15:19:33 crc kubenswrapper[4722]: I0309 15:19:33.490545 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/89fd3e80-4c6c-4619-ad44-ef440c0b1fb6-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"89fd3e80-4c6c-4619-ad44-ef440c0b1fb6\") " pod="openstack/tempest-tests-tempest" Mar 09 15:19:33 crc kubenswrapper[4722]: I0309 15:19:33.490647 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/89fd3e80-4c6c-4619-ad44-ef440c0b1fb6-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"89fd3e80-4c6c-4619-ad44-ef440c0b1fb6\") " pod="openstack/tempest-tests-tempest" Mar 09 15:19:33 crc kubenswrapper[4722]: I0309 15:19:33.490679 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5cgw\" (UniqueName: \"kubernetes.io/projected/89fd3e80-4c6c-4619-ad44-ef440c0b1fb6-kube-api-access-w5cgw\") pod \"tempest-tests-tempest\" (UID: \"89fd3e80-4c6c-4619-ad44-ef440c0b1fb6\") " pod="openstack/tempest-tests-tempest" Mar 09 15:19:33 crc kubenswrapper[4722]: I0309 15:19:33.490720 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/89fd3e80-4c6c-4619-ad44-ef440c0b1fb6-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"89fd3e80-4c6c-4619-ad44-ef440c0b1fb6\") " pod="openstack/tempest-tests-tempest" Mar 09 15:19:33 crc kubenswrapper[4722]: I0309 15:19:33.490771 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/89fd3e80-4c6c-4619-ad44-ef440c0b1fb6-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" 
(UID: \"89fd3e80-4c6c-4619-ad44-ef440c0b1fb6\") " pod="openstack/tempest-tests-tempest" Mar 09 15:19:33 crc kubenswrapper[4722]: I0309 15:19:33.491266 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/89fd3e80-4c6c-4619-ad44-ef440c0b1fb6-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"89fd3e80-4c6c-4619-ad44-ef440c0b1fb6\") " pod="openstack/tempest-tests-tempest" Mar 09 15:19:33 crc kubenswrapper[4722]: I0309 15:19:33.491357 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/89fd3e80-4c6c-4619-ad44-ef440c0b1fb6-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"89fd3e80-4c6c-4619-ad44-ef440c0b1fb6\") " pod="openstack/tempest-tests-tempest" Mar 09 15:19:33 crc kubenswrapper[4722]: I0309 15:19:33.491491 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/89fd3e80-4c6c-4619-ad44-ef440c0b1fb6-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"89fd3e80-4c6c-4619-ad44-ef440c0b1fb6\") " pod="openstack/tempest-tests-tempest" Mar 09 15:19:33 crc kubenswrapper[4722]: I0309 15:19:33.491612 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/89fd3e80-4c6c-4619-ad44-ef440c0b1fb6-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"89fd3e80-4c6c-4619-ad44-ef440c0b1fb6\") " pod="openstack/tempest-tests-tempest" Mar 09 15:19:33 crc kubenswrapper[4722]: I0309 15:19:33.492089 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/89fd3e80-4c6c-4619-ad44-ef440c0b1fb6-config-data\") pod \"tempest-tests-tempest\" (UID: \"89fd3e80-4c6c-4619-ad44-ef440c0b1fb6\") " pod="openstack/tempest-tests-tempest" Mar 09 15:19:33 crc kubenswrapper[4722]: I0309 15:19:33.493102 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/89fd3e80-4c6c-4619-ad44-ef440c0b1fb6-config-data\") pod \"tempest-tests-tempest\" (UID: \"89fd3e80-4c6c-4619-ad44-ef440c0b1fb6\") " pod="openstack/tempest-tests-tempest" Mar 09 15:19:33 crc kubenswrapper[4722]: I0309 15:19:33.513937 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/89fd3e80-4c6c-4619-ad44-ef440c0b1fb6-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"89fd3e80-4c6c-4619-ad44-ef440c0b1fb6\") " pod="openstack/tempest-tests-tempest" Mar 09 15:19:33 crc kubenswrapper[4722]: I0309 15:19:33.514119 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5cgw\" (UniqueName: \"kubernetes.io/projected/89fd3e80-4c6c-4619-ad44-ef440c0b1fb6-kube-api-access-w5cgw\") pod \"tempest-tests-tempest\" (UID: \"89fd3e80-4c6c-4619-ad44-ef440c0b1fb6\") " pod="openstack/tempest-tests-tempest" Mar 09 15:19:33 crc kubenswrapper[4722]: I0309 15:19:33.514387 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/89fd3e80-4c6c-4619-ad44-ef440c0b1fb6-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"89fd3e80-4c6c-4619-ad44-ef440c0b1fb6\") " pod="openstack/tempest-tests-tempest" Mar 09 15:19:33 crc kubenswrapper[4722]: I0309 15:19:33.518274 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/89fd3e80-4c6c-4619-ad44-ef440c0b1fb6-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"89fd3e80-4c6c-4619-ad44-ef440c0b1fb6\") " pod="openstack/tempest-tests-tempest"
Mar 09 15:19:33 crc kubenswrapper[4722]: I0309 15:19:33.529875 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: \"89fd3e80-4c6c-4619-ad44-ef440c0b1fb6\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/tempest-tests-tempest"
Mar 09 15:19:33 crc kubenswrapper[4722]: I0309 15:19:33.569431 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"tempest-tests-tempest\" (UID: \"89fd3e80-4c6c-4619-ad44-ef440c0b1fb6\") " pod="openstack/tempest-tests-tempest"
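
[note] The VerifyControllerAttachedVolume / MountVolume started / MountVolume.SetUp succeeded progression above comes from the kubelet volume manager's reconciler, which on each loop diffs a desired state of world (volumes the scheduled pods need) against an actual state (what is attached and mounted) and acts on the difference. A toy sketch of that diff-and-act pattern; the types and names here are illustrative, not kubelet's:

    package main

    import "fmt"

    // state maps a volume name to whether it is currently mounted.
    type state map[string]bool

    // reconcile mounts volumes that are desired but absent and
    // unmounts volumes that are present but no longer desired.
    func reconcile(desired, actual state) {
    	for v := range desired {
    		if !actual[v] {
    			fmt.Printf("MountVolume started for volume %q\n", v)
    			actual[v] = true
    		}
    	}
    	for v := range actual {
    		if !desired[v] {
    			fmt.Printf("UnmountVolume started for volume %q\n", v)
    			delete(actual, v)
    		}
    	}
    }

    func main() {
    	desired := state{"kube-api-access-w5cgw": true, "config-data": true}
    	actual := state{"kube-api-access-28rwm": true} // stale pod's volume
    	reconcile(desired, actual)
    }
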
Mar 09 15:19:33 crc kubenswrapper[4722]: I0309 15:19:33.645171 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Mar 09 15:19:34 crc kubenswrapper[4722]: I0309 15:19:34.003844 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"]
Mar 09 15:19:34 crc kubenswrapper[4722]: I0309 15:19:34.984685 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"89fd3e80-4c6c-4619-ad44-ef440c0b1fb6","Type":"ContainerStarted","Data":"3df5135e290af710c1f2c95f2c9f78d748d6cf1e02be5e74a3958bb146b287e0"}
Mar 09 15:19:40 crc kubenswrapper[4722]: I0309 15:19:40.190096 4722 scope.go:117] "RemoveContainer" containerID="ed831924138fb872c73acc14a5ac7c6ed36b421017c8fe60dbbaede3d8f11789"
Mar 09 15:19:40 crc kubenswrapper[4722]: E0309 15:19:40.191611 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648"
Mar 09 15:19:54 crc kubenswrapper[4722]: I0309 15:19:54.148987 4722 scope.go:117] "RemoveContainer" containerID="ed831924138fb872c73acc14a5ac7c6ed36b421017c8fe60dbbaede3d8f11789"
Mar 09 15:19:54 crc kubenswrapper[4722]: E0309 15:19:54.149733 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648"
Mar 09 15:20:00 crc kubenswrapper[4722]: I0309 15:20:00.199613 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551160-gwht7"]
Mar 09 15:20:00 crc kubenswrapper[4722]: I0309 15:20:00.201897 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551160-gwht7"
Mar 09 15:20:00 crc kubenswrapper[4722]: I0309 15:20:00.203902 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 15:20:00 crc kubenswrapper[4722]: I0309 15:20:00.204135 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 15:20:00 crc kubenswrapper[4722]: I0309 15:20:00.204232 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-dr2x6"
Mar 09 15:20:00 crc kubenswrapper[4722]: I0309 15:20:00.272435 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551160-gwht7"]
Mar 09 15:20:00 crc kubenswrapper[4722]: I0309 15:20:00.355318 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pv2jj\" (UniqueName: \"kubernetes.io/projected/29face16-3773-4db5-8795-9d16496a7ecd-kube-api-access-pv2jj\") pod \"auto-csr-approver-29551160-gwht7\" (UID: \"29face16-3773-4db5-8795-9d16496a7ecd\") " pod="openshift-infra/auto-csr-approver-29551160-gwht7"
Mar 09 15:20:00 crc kubenswrapper[4722]: I0309 15:20:00.457539 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pv2jj\" (UniqueName: \"kubernetes.io/projected/29face16-3773-4db5-8795-9d16496a7ecd-kube-api-access-pv2jj\") pod \"auto-csr-approver-29551160-gwht7\" (UID: \"29face16-3773-4db5-8795-9d16496a7ecd\") " pod="openshift-infra/auto-csr-approver-29551160-gwht7"
Mar 09 15:20:00 crc kubenswrapper[4722]: I0309 15:20:00.480928 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pv2jj\" (UniqueName: \"kubernetes.io/projected/29face16-3773-4db5-8795-9d16496a7ecd-kube-api-access-pv2jj\") pod \"auto-csr-approver-29551160-gwht7\" (UID: \"29face16-3773-4db5-8795-9d16496a7ecd\") " pod="openshift-infra/auto-csr-approver-29551160-gwht7"
Mar 09 15:20:00 crc kubenswrapper[4722]: I0309 15:20:00.528896 4722 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551160-gwht7" Mar 09 15:20:07 crc kubenswrapper[4722]: I0309 15:20:07.149793 4722 scope.go:117] "RemoveContainer" containerID="ed831924138fb872c73acc14a5ac7c6ed36b421017c8fe60dbbaede3d8f11789" Mar 09 15:20:07 crc kubenswrapper[4722]: E0309 15:20:07.150329 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 15:20:14 crc kubenswrapper[4722]: E0309 15:20:14.832271 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Mar 09 15:20:14 crc kubenswrapper[4722]: E0309 15:20:14.836272 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-w5cgw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,L
ocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(89fd3e80-4c6c-4619-ad44-ef440c0b1fb6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 09 15:20:14 crc kubenswrapper[4722]: E0309 15:20:14.837414 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="89fd3e80-4c6c-4619-ad44-ef440c0b1fb6"
Mar 09 15:20:15 crc kubenswrapper[4722]: I0309 15:20:15.327160 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551160-gwht7"]
Mar 09 15:20:15 crc kubenswrapper[4722]: I0309 15:20:15.519473 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551160-gwht7" event={"ID":"29face16-3773-4db5-8795-9d16496a7ecd","Type":"ContainerStarted","Data":"4851470ce271804e19aa04030d227f65456ae152d17f6b2ec82b1c47fd87250b"}
Mar 09 15:20:15 crc kubenswrapper[4722]: E0309 15:20:15.522135 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="89fd3e80-4c6c-4619-ad44-ef440c0b1fb6"
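
[note] Two distinct waiting reasons show up back to back here: ErrImagePull for the pull attempt that actually failed (the CRI copy was canceled mid-transfer), then ImagePullBackOff while kubelet waits before retrying. A small sketch of classifying these standard container waiting reasons when scanning pod status; the reason strings are Kubernetes' own, the helper itself is hypothetical:

    package main

    import "fmt"

    // describeWaiting maps the kubelet waiting reasons seen in this
    // log to what each one means. Illustrative helper only.
    func describeWaiting(reason string) string {
    	switch reason {
    	case "ErrImagePull":
    		return "an image pull attempt just failed"
    	case "ImagePullBackOff":
    		return "kubelet is delaying the next pull attempt"
    	case "CrashLoopBackOff":
    		return "kubelet is delaying the next restart of a crashing container"
    	default:
    		return "unrecognized waiting reason"
    	}
    }

    func main() {
    	for _, r := range []string{"ErrImagePull", "ImagePullBackOff", "CrashLoopBackOff"} {
    		fmt.Printf("%s: %s\n", r, describeWaiting(r))
    	}
    }
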
event={"ID":"29face16-3773-4db5-8795-9d16496a7ecd","Type":"ContainerDied","Data":"b96fc702ca0636d34a822bdccfc31ccc60982472d734b4e512abaaee940c9499"} Mar 09 15:20:20 crc kubenswrapper[4722]: I0309 15:20:20.141283 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551160-gwht7" Mar 09 15:20:20 crc kubenswrapper[4722]: I0309 15:20:20.326598 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pv2jj\" (UniqueName: \"kubernetes.io/projected/29face16-3773-4db5-8795-9d16496a7ecd-kube-api-access-pv2jj\") pod \"29face16-3773-4db5-8795-9d16496a7ecd\" (UID: \"29face16-3773-4db5-8795-9d16496a7ecd\") " Mar 09 15:20:20 crc kubenswrapper[4722]: I0309 15:20:20.335830 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29face16-3773-4db5-8795-9d16496a7ecd-kube-api-access-pv2jj" (OuterVolumeSpecName: "kube-api-access-pv2jj") pod "29face16-3773-4db5-8795-9d16496a7ecd" (UID: "29face16-3773-4db5-8795-9d16496a7ecd"). InnerVolumeSpecName "kube-api-access-pv2jj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 15:20:20 crc kubenswrapper[4722]: I0309 15:20:20.430281 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pv2jj\" (UniqueName: \"kubernetes.io/projected/29face16-3773-4db5-8795-9d16496a7ecd-kube-api-access-pv2jj\") on node \"crc\" DevicePath \"\"" Mar 09 15:20:20 crc kubenswrapper[4722]: I0309 15:20:20.589323 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551160-gwht7" event={"ID":"29face16-3773-4db5-8795-9d16496a7ecd","Type":"ContainerDied","Data":"4851470ce271804e19aa04030d227f65456ae152d17f6b2ec82b1c47fd87250b"} Mar 09 15:20:20 crc kubenswrapper[4722]: I0309 15:20:20.589767 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4851470ce271804e19aa04030d227f65456ae152d17f6b2ec82b1c47fd87250b" Mar 09 15:20:20 crc kubenswrapper[4722]: I0309 15:20:20.589360 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551160-gwht7"
Mar 09 15:20:20 crc kubenswrapper[4722]: I0309 15:20:20.674325 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551154-c478t"]
Mar 09 15:20:20 crc kubenswrapper[4722]: I0309 15:20:20.690958 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551154-c478t"]
Mar 09 15:20:21 crc kubenswrapper[4722]: I0309 15:20:21.149414 4722 scope.go:117] "RemoveContainer" containerID="ed831924138fb872c73acc14a5ac7c6ed36b421017c8fe60dbbaede3d8f11789"
Mar 09 15:20:21 crc kubenswrapper[4722]: E0309 15:20:21.150587 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648"
Mar 09 15:20:22 crc kubenswrapper[4722]: I0309 15:20:22.176934 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a717f324-e899-4ed6-bdb8-23fac7294215" path="/var/lib/kubelet/pods/a717f324-e899-4ed6-bdb8-23fac7294215/volumes"
Mar 09 15:20:27 crc kubenswrapper[4722]: I0309 15:20:27.800001 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0"
Mar 09 15:20:29 crc kubenswrapper[4722]: I0309 15:20:29.705473 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"89fd3e80-4c6c-4619-ad44-ef440c0b1fb6","Type":"ContainerStarted","Data":"59f9e2e53ec91c7b903e31572a6931dadcb8456c840d62bcc991024bba5ef79f"}
Mar 09 15:20:29 crc kubenswrapper[4722]: I0309 15:20:29.740394 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.950116132 podStartE2EDuration="57.740375717s" podCreationTimestamp="2026-03-09 15:19:32 +0000 UTC" firstStartedPulling="2026-03-09 15:19:34.006747573 +0000 UTC m=+4614.562316149" lastFinishedPulling="2026-03-09 15:20:27.797007148 +0000 UTC m=+4668.352575734" observedRunningTime="2026-03-09 15:20:29.731102162 +0000 UTC m=+4670.286670768" watchObservedRunningTime="2026-03-09 15:20:29.740375717 +0000 UTC m=+4670.295944293"
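
[note] The startup-latency entry above for tempest-tests-tempest is internally consistent and can be re-derived from its own timestamps: the image pull window accounts for nearly all of the 57.7s end-to-end startup. A sketch of the arithmetic in Go, with the timestamps copied from the entry (the tracker itself uses the monotonic m= readings, so the last decimals of the SLO figure differ slightly):

    package main

    import (
    	"fmt"
    	"time"
    )

    // Re-derive the durations in the "Observed pod startup duration"
    // entry for openstack/tempest-tests-tempest.
    func main() {
    	parse := func(s string) time.Time {
    		t, err := time.Parse("2006-01-02 15:04:05 -0700 MST", s)
    		if err != nil {
    			panic(err)
    		}
    		return t
    	}
    	created := parse("2026-03-09 15:19:32 +0000 UTC")
    	firstPull := parse("2026-03-09 15:19:34.006747573 +0000 UTC")
    	lastPull := parse("2026-03-09 15:20:27.797007148 +0000 UTC")
    	running := parse("2026-03-09 15:20:29.740375717 +0000 UTC")

    	fmt.Println("image pull window:", lastPull.Sub(firstPull)) // ~53.79s
    	fmt.Println("podStartE2EDuration:", running.Sub(created))  // 57.740375717s
    	// podStartSLOduration excludes the pull window: ~3.95s
    	fmt.Println("podStartSLOduration:", running.Sub(created)-lastPull.Sub(firstPull))
    }
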
Mar 09 15:20:30 crc kubenswrapper[4722]: I0309 15:20:30.690781 4722 scope.go:117] "RemoveContainer" containerID="94a3861f8aa8959c666e8f15f78770bb8a16221e54f63dffdc7d9a2a9d2bfbd4"
Mar 09 15:20:34 crc kubenswrapper[4722]: I0309 15:20:34.150593 4722 scope.go:117] "RemoveContainer" containerID="ed831924138fb872c73acc14a5ac7c6ed36b421017c8fe60dbbaede3d8f11789"
Mar 09 15:20:34 crc kubenswrapper[4722]: E0309 15:20:34.152019 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648"
Mar 09 15:20:46 crc kubenswrapper[4722]: I0309 15:20:46.149310 4722 scope.go:117] "RemoveContainer" containerID="ed831924138fb872c73acc14a5ac7c6ed36b421017c8fe60dbbaede3d8f11789"
Mar 09 15:20:46 crc kubenswrapper[4722]: E0309 15:20:46.150195 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648"
Mar 09 15:20:48 crc kubenswrapper[4722]: I0309 15:20:48.990571 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bb78l"]
Mar 09 15:20:48 crc kubenswrapper[4722]: E0309 15:20:48.991432 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29face16-3773-4db5-8795-9d16496a7ecd" containerName="oc"
Mar 09 15:20:48 crc kubenswrapper[4722]: I0309 15:20:48.991445 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="29face16-3773-4db5-8795-9d16496a7ecd" containerName="oc"
Mar 09 15:20:48 crc kubenswrapper[4722]: I0309 15:20:48.991646 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="29face16-3773-4db5-8795-9d16496a7ecd" containerName="oc"
Mar 09 15:20:48 crc kubenswrapper[4722]: I0309 15:20:48.993175 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bb78l"
Mar 09 15:20:49 crc kubenswrapper[4722]: I0309 15:20:49.070182 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bb78l"]
Mar 09 15:20:49 crc kubenswrapper[4722]: I0309 15:20:49.127482 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4004cffc-861e-4009-9fc4-f64482ad844e-utilities\") pod \"certified-operators-bb78l\" (UID: \"4004cffc-861e-4009-9fc4-f64482ad844e\") " pod="openshift-marketplace/certified-operators-bb78l"
Mar 09 15:20:49 crc kubenswrapper[4722]: I0309 15:20:49.127543 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jt9s9\" (UniqueName: \"kubernetes.io/projected/4004cffc-861e-4009-9fc4-f64482ad844e-kube-api-access-jt9s9\") pod \"certified-operators-bb78l\" (UID: \"4004cffc-861e-4009-9fc4-f64482ad844e\") " pod="openshift-marketplace/certified-operators-bb78l"
Mar 09 15:20:49 crc kubenswrapper[4722]: I0309 15:20:49.127640 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4004cffc-861e-4009-9fc4-f64482ad844e-catalog-content\") pod \"certified-operators-bb78l\" (UID: \"4004cffc-861e-4009-9fc4-f64482ad844e\") " pod="openshift-marketplace/certified-operators-bb78l"
Mar 09 15:20:49 crc kubenswrapper[4722]: I0309 15:20:49.229825 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4004cffc-861e-4009-9fc4-f64482ad844e-catalog-content\") pod \"certified-operators-bb78l\" (UID: \"4004cffc-861e-4009-9fc4-f64482ad844e\") " pod="openshift-marketplace/certified-operators-bb78l"
Mar 09 15:20:49 crc kubenswrapper[4722]: I0309 15:20:49.229981 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4004cffc-861e-4009-9fc4-f64482ad844e-utilities\") pod \"certified-operators-bb78l\" (UID: \"4004cffc-861e-4009-9fc4-f64482ad844e\") "
pod="openshift-marketplace/certified-operators-bb78l" Mar 09 15:20:49 crc kubenswrapper[4722]: I0309 15:20:49.230020 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jt9s9\" (UniqueName: \"kubernetes.io/projected/4004cffc-861e-4009-9fc4-f64482ad844e-kube-api-access-jt9s9\") pod \"certified-operators-bb78l\" (UID: \"4004cffc-861e-4009-9fc4-f64482ad844e\") " pod="openshift-marketplace/certified-operators-bb78l" Mar 09 15:20:49 crc kubenswrapper[4722]: I0309 15:20:49.230858 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4004cffc-861e-4009-9fc4-f64482ad844e-catalog-content\") pod \"certified-operators-bb78l\" (UID: \"4004cffc-861e-4009-9fc4-f64482ad844e\") " pod="openshift-marketplace/certified-operators-bb78l" Mar 09 15:20:49 crc kubenswrapper[4722]: I0309 15:20:49.231069 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4004cffc-861e-4009-9fc4-f64482ad844e-utilities\") pod \"certified-operators-bb78l\" (UID: \"4004cffc-861e-4009-9fc4-f64482ad844e\") " pod="openshift-marketplace/certified-operators-bb78l" Mar 09 15:20:49 crc kubenswrapper[4722]: I0309 15:20:49.678096 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jt9s9\" (UniqueName: \"kubernetes.io/projected/4004cffc-861e-4009-9fc4-f64482ad844e-kube-api-access-jt9s9\") pod \"certified-operators-bb78l\" (UID: \"4004cffc-861e-4009-9fc4-f64482ad844e\") " pod="openshift-marketplace/certified-operators-bb78l" Mar 09 15:20:49 crc kubenswrapper[4722]: I0309 15:20:49.920661 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bb78l" Mar 09 15:20:50 crc kubenswrapper[4722]: I0309 15:20:50.461803 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bb78l"] Mar 09 15:20:50 crc kubenswrapper[4722]: I0309 15:20:50.976125 4722 generic.go:334] "Generic (PLEG): container finished" podID="4004cffc-861e-4009-9fc4-f64482ad844e" containerID="3e39a736e2d4ec59104f2b1ac6d11e3687a90419e7e436f62ce78a7e7f94ca5c" exitCode=0 Mar 09 15:20:50 crc kubenswrapper[4722]: I0309 15:20:50.976224 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bb78l" event={"ID":"4004cffc-861e-4009-9fc4-f64482ad844e","Type":"ContainerDied","Data":"3e39a736e2d4ec59104f2b1ac6d11e3687a90419e7e436f62ce78a7e7f94ca5c"} Mar 09 15:20:50 crc kubenswrapper[4722]: I0309 15:20:50.976610 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bb78l" event={"ID":"4004cffc-861e-4009-9fc4-f64482ad844e","Type":"ContainerStarted","Data":"57e527636da4ae1dcc2a43a7b72f4b23a03203ef71d03f056772a917114b0c04"} Mar 09 15:20:53 crc kubenswrapper[4722]: I0309 15:20:53.007963 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bb78l" event={"ID":"4004cffc-861e-4009-9fc4-f64482ad844e","Type":"ContainerStarted","Data":"4fdee61d3e0bcd2fe3f74ee3e5ad73868ea80461eebf92d40ccbcf1ee51ff28e"} Mar 09 15:20:55 crc kubenswrapper[4722]: I0309 15:20:55.028614 4722 generic.go:334] "Generic (PLEG): container finished" podID="4004cffc-861e-4009-9fc4-f64482ad844e" containerID="4fdee61d3e0bcd2fe3f74ee3e5ad73868ea80461eebf92d40ccbcf1ee51ff28e" exitCode=0 Mar 09 15:20:55 crc kubenswrapper[4722]: I0309 15:20:55.028671 4722 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bb78l" event={"ID":"4004cffc-861e-4009-9fc4-f64482ad844e","Type":"ContainerDied","Data":"4fdee61d3e0bcd2fe3f74ee3e5ad73868ea80461eebf92d40ccbcf1ee51ff28e"} Mar 09 15:20:57 crc kubenswrapper[4722]: I0309 15:20:57.054506 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bb78l" event={"ID":"4004cffc-861e-4009-9fc4-f64482ad844e","Type":"ContainerStarted","Data":"e41835095b5c36ad492f387697c27295f6231e2511aa8a1f7e2808fb9ef0eb74"} Mar 09 15:20:57 crc kubenswrapper[4722]: I0309 15:20:57.079788 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bb78l" podStartSLOduration=4.160731916 podStartE2EDuration="9.079768229s" podCreationTimestamp="2026-03-09 15:20:48 +0000 UTC" firstStartedPulling="2026-03-09 15:20:50.980189911 +0000 UTC m=+4691.535758487" lastFinishedPulling="2026-03-09 15:20:55.899226224 +0000 UTC m=+4696.454794800" observedRunningTime="2026-03-09 15:20:57.071978435 +0000 UTC m=+4697.627547031" watchObservedRunningTime="2026-03-09 15:20:57.079768229 +0000 UTC m=+4697.635336815" Mar 09 15:20:59 crc kubenswrapper[4722]: I0309 15:20:59.921050 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bb78l" Mar 09 15:20:59 crc kubenswrapper[4722]: I0309 15:20:59.922632 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bb78l" Mar 09 15:20:59 crc kubenswrapper[4722]: I0309 15:20:59.995861 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bb78l" Mar 09 15:21:01 crc kubenswrapper[4722]: I0309 15:21:01.149842 4722 scope.go:117] "RemoveContainer" containerID="ed831924138fb872c73acc14a5ac7c6ed36b421017c8fe60dbbaede3d8f11789" Mar 09 15:21:01 crc kubenswrapper[4722]: I0309 15:21:01.179528 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bb78l" Mar 09 15:21:01 crc kubenswrapper[4722]: I0309 15:21:01.235276 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bb78l"] Mar 09 15:21:02 crc kubenswrapper[4722]: I0309 15:21:02.122591 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" event={"ID":"dac2aaf5-653b-4b2a-8efe-ed26bac8d648","Type":"ContainerStarted","Data":"f204c1f7d34e99a1176cb155859ff24baa5114f3011037f9d600a75f90dbf8fc"} Mar 09 15:21:03 crc kubenswrapper[4722]: I0309 15:21:03.132091 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bb78l" podUID="4004cffc-861e-4009-9fc4-f64482ad844e" containerName="registry-server" containerID="cri-o://e41835095b5c36ad492f387697c27295f6231e2511aa8a1f7e2808fb9ef0eb74" gracePeriod=2 Mar 09 15:21:03 crc kubenswrapper[4722]: I0309 15:21:03.775599 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bb78l" Mar 09 15:21:03 crc kubenswrapper[4722]: I0309 15:21:03.892626 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4004cffc-861e-4009-9fc4-f64482ad844e-utilities\") pod \"4004cffc-861e-4009-9fc4-f64482ad844e\" (UID: \"4004cffc-861e-4009-9fc4-f64482ad844e\") " Mar 09 15:21:03 crc kubenswrapper[4722]: I0309 15:21:03.893184 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4004cffc-861e-4009-9fc4-f64482ad844e-catalog-content\") pod \"4004cffc-861e-4009-9fc4-f64482ad844e\" (UID: \"4004cffc-861e-4009-9fc4-f64482ad844e\") " Mar 09 15:21:03 crc kubenswrapper[4722]: I0309 15:21:03.893484 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jt9s9\" (UniqueName: \"kubernetes.io/projected/4004cffc-861e-4009-9fc4-f64482ad844e-kube-api-access-jt9s9\") pod \"4004cffc-861e-4009-9fc4-f64482ad844e\" (UID: \"4004cffc-861e-4009-9fc4-f64482ad844e\") " Mar 09 15:21:03 crc kubenswrapper[4722]: I0309 15:21:03.897864 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4004cffc-861e-4009-9fc4-f64482ad844e-utilities" (OuterVolumeSpecName: "utilities") pod "4004cffc-861e-4009-9fc4-f64482ad844e" (UID: "4004cffc-861e-4009-9fc4-f64482ad844e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 15:21:03 crc kubenswrapper[4722]: I0309 15:21:03.925514 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4004cffc-861e-4009-9fc4-f64482ad844e-kube-api-access-jt9s9" (OuterVolumeSpecName: "kube-api-access-jt9s9") pod "4004cffc-861e-4009-9fc4-f64482ad844e" (UID: "4004cffc-861e-4009-9fc4-f64482ad844e"). InnerVolumeSpecName "kube-api-access-jt9s9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 15:21:03 crc kubenswrapper[4722]: I0309 15:21:03.996937 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4004cffc-861e-4009-9fc4-f64482ad844e-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 15:21:03 crc kubenswrapper[4722]: I0309 15:21:03.996971 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jt9s9\" (UniqueName: \"kubernetes.io/projected/4004cffc-861e-4009-9fc4-f64482ad844e-kube-api-access-jt9s9\") on node \"crc\" DevicePath \"\"" Mar 09 15:21:04 crc kubenswrapper[4722]: I0309 15:21:04.032888 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4004cffc-861e-4009-9fc4-f64482ad844e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4004cffc-861e-4009-9fc4-f64482ad844e" (UID: "4004cffc-861e-4009-9fc4-f64482ad844e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 15:21:04 crc kubenswrapper[4722]: I0309 15:21:04.098695 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4004cffc-861e-4009-9fc4-f64482ad844e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 15:21:04 crc kubenswrapper[4722]: I0309 15:21:04.148044 4722 generic.go:334] "Generic (PLEG): container finished" podID="4004cffc-861e-4009-9fc4-f64482ad844e" containerID="e41835095b5c36ad492f387697c27295f6231e2511aa8a1f7e2808fb9ef0eb74" exitCode=0 Mar 09 15:21:04 crc kubenswrapper[4722]: I0309 15:21:04.148047 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bb78l" event={"ID":"4004cffc-861e-4009-9fc4-f64482ad844e","Type":"ContainerDied","Data":"e41835095b5c36ad492f387697c27295f6231e2511aa8a1f7e2808fb9ef0eb74"} Mar 09 15:21:04 crc kubenswrapper[4722]: I0309 15:21:04.148170 4722 scope.go:117] "RemoveContainer" containerID="e41835095b5c36ad492f387697c27295f6231e2511aa8a1f7e2808fb9ef0eb74" Mar 09 15:21:04 crc kubenswrapper[4722]: I0309 15:21:04.148313 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bb78l" Mar 09 15:21:04 crc kubenswrapper[4722]: I0309 15:21:04.177654 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bb78l" event={"ID":"4004cffc-861e-4009-9fc4-f64482ad844e","Type":"ContainerDied","Data":"57e527636da4ae1dcc2a43a7b72f4b23a03203ef71d03f056772a917114b0c04"} Mar 09 15:21:04 crc kubenswrapper[4722]: I0309 15:21:04.206169 4722 scope.go:117] "RemoveContainer" containerID="4fdee61d3e0bcd2fe3f74ee3e5ad73868ea80461eebf92d40ccbcf1ee51ff28e" Mar 09 15:21:04 crc kubenswrapper[4722]: I0309 15:21:04.206750 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bb78l"] Mar 09 15:21:04 crc kubenswrapper[4722]: I0309 15:21:04.221063 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bb78l"] Mar 09 15:21:04 crc kubenswrapper[4722]: I0309 15:21:04.241053 4722 scope.go:117] "RemoveContainer" containerID="3e39a736e2d4ec59104f2b1ac6d11e3687a90419e7e436f62ce78a7e7f94ca5c" Mar 09 15:21:04 crc kubenswrapper[4722]: I0309 15:21:04.297960 4722 scope.go:117] "RemoveContainer" containerID="e41835095b5c36ad492f387697c27295f6231e2511aa8a1f7e2808fb9ef0eb74" Mar 09 15:21:04 crc kubenswrapper[4722]: E0309 15:21:04.298792 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e41835095b5c36ad492f387697c27295f6231e2511aa8a1f7e2808fb9ef0eb74\": container with ID starting with e41835095b5c36ad492f387697c27295f6231e2511aa8a1f7e2808fb9ef0eb74 not found: ID does not exist" containerID="e41835095b5c36ad492f387697c27295f6231e2511aa8a1f7e2808fb9ef0eb74" Mar 09 15:21:04 crc kubenswrapper[4722]: I0309 15:21:04.298838 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e41835095b5c36ad492f387697c27295f6231e2511aa8a1f7e2808fb9ef0eb74"} err="failed to get container status \"e41835095b5c36ad492f387697c27295f6231e2511aa8a1f7e2808fb9ef0eb74\": rpc error: code = NotFound desc = could not find container \"e41835095b5c36ad492f387697c27295f6231e2511aa8a1f7e2808fb9ef0eb74\": container with ID starting with e41835095b5c36ad492f387697c27295f6231e2511aa8a1f7e2808fb9ef0eb74 not found: ID does not exist" Mar 09 
15:21:04 crc kubenswrapper[4722]: I0309 15:21:04.298865 4722 scope.go:117] "RemoveContainer" containerID="4fdee61d3e0bcd2fe3f74ee3e5ad73868ea80461eebf92d40ccbcf1ee51ff28e" Mar 09 15:21:04 crc kubenswrapper[4722]: E0309 15:21:04.299382 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4fdee61d3e0bcd2fe3f74ee3e5ad73868ea80461eebf92d40ccbcf1ee51ff28e\": container with ID starting with 4fdee61d3e0bcd2fe3f74ee3e5ad73868ea80461eebf92d40ccbcf1ee51ff28e not found: ID does not exist" containerID="4fdee61d3e0bcd2fe3f74ee3e5ad73868ea80461eebf92d40ccbcf1ee51ff28e" Mar 09 15:21:04 crc kubenswrapper[4722]: I0309 15:21:04.299470 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fdee61d3e0bcd2fe3f74ee3e5ad73868ea80461eebf92d40ccbcf1ee51ff28e"} err="failed to get container status \"4fdee61d3e0bcd2fe3f74ee3e5ad73868ea80461eebf92d40ccbcf1ee51ff28e\": rpc error: code = NotFound desc = could not find container \"4fdee61d3e0bcd2fe3f74ee3e5ad73868ea80461eebf92d40ccbcf1ee51ff28e\": container with ID starting with 4fdee61d3e0bcd2fe3f74ee3e5ad73868ea80461eebf92d40ccbcf1ee51ff28e not found: ID does not exist" Mar 09 15:21:04 crc kubenswrapper[4722]: I0309 15:21:04.299501 4722 scope.go:117] "RemoveContainer" containerID="3e39a736e2d4ec59104f2b1ac6d11e3687a90419e7e436f62ce78a7e7f94ca5c" Mar 09 15:21:04 crc kubenswrapper[4722]: E0309 15:21:04.299863 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e39a736e2d4ec59104f2b1ac6d11e3687a90419e7e436f62ce78a7e7f94ca5c\": container with ID starting with 3e39a736e2d4ec59104f2b1ac6d11e3687a90419e7e436f62ce78a7e7f94ca5c not found: ID does not exist" containerID="3e39a736e2d4ec59104f2b1ac6d11e3687a90419e7e436f62ce78a7e7f94ca5c" Mar 09 15:21:04 crc kubenswrapper[4722]: I0309 15:21:04.299904 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e39a736e2d4ec59104f2b1ac6d11e3687a90419e7e436f62ce78a7e7f94ca5c"} err="failed to get container status \"3e39a736e2d4ec59104f2b1ac6d11e3687a90419e7e436f62ce78a7e7f94ca5c\": rpc error: code = NotFound desc = could not find container \"3e39a736e2d4ec59104f2b1ac6d11e3687a90419e7e436f62ce78a7e7f94ca5c\": container with ID starting with 3e39a736e2d4ec59104f2b1ac6d11e3687a90419e7e436f62ce78a7e7f94ca5c not found: ID does not exist" Mar 09 15:21:06 crc kubenswrapper[4722]: I0309 15:21:06.170541 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4004cffc-861e-4009-9fc4-f64482ad844e" path="/var/lib/kubelet/pods/4004cffc-861e-4009-9fc4-f64482ad844e/volumes" Mar 09 15:22:00 crc kubenswrapper[4722]: I0309 15:22:00.549472 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551162-kb65f"] Mar 09 15:22:00 crc kubenswrapper[4722]: E0309 15:22:00.566545 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4004cffc-861e-4009-9fc4-f64482ad844e" containerName="registry-server" Mar 09 15:22:00 crc kubenswrapper[4722]: I0309 15:22:00.566582 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="4004cffc-861e-4009-9fc4-f64482ad844e" containerName="registry-server" Mar 09 15:22:00 crc kubenswrapper[4722]: E0309 15:22:00.567800 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4004cffc-861e-4009-9fc4-f64482ad844e" containerName="extract-utilities" Mar 09 15:22:00 crc kubenswrapper[4722]: I0309 15:22:00.567818 
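The NotFound errors just above are benign: by the time the kubelet asks the runtime for the status of a container it is deleting, CRI-O has already removed it, and the cleanup is treated as complete. A sketch of that idempotent-delete pattern over a gRPC error (remove is a stand-in for the CRI call, not the real client):

package main

import (
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// removeContainer treats gRPC NotFound as success: an already-deleted
// container needs no further cleanup, which is why the log pairs each
// "ContainerStatus from runtime service failed" error with a
// "DeleteContainer returned error" line and simply moves on.
func removeContainer(remove func(id string) error, id string) error {
	if err := remove(id); err != nil {
		if status.Code(err) == codes.NotFound {
			return nil // already gone
		}
		return fmt.Errorf("remove container %q: %w", id, err)
	}
	return nil
}

func main() {
	alreadyGone := func(id string) error {
		return status.Errorf(codes.NotFound, "could not find container %q", id)
	}
	fmt.Println(removeContainer(alreadyGone, "e41835095b5c")) // <nil>
}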
Mar 09 15:22:00 crc kubenswrapper[4722]: E0309 15:22:00.567832 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4004cffc-861e-4009-9fc4-f64482ad844e" containerName="extract-content"
Mar 09 15:22:00 crc kubenswrapper[4722]: I0309 15:22:00.567839 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="4004cffc-861e-4009-9fc4-f64482ad844e" containerName="extract-content"
Mar 09 15:22:00 crc kubenswrapper[4722]: I0309 15:22:00.569651 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="4004cffc-861e-4009-9fc4-f64482ad844e" containerName="registry-server"
Mar 09 15:22:00 crc kubenswrapper[4722]: I0309 15:22:00.580819 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551162-kb65f"
Mar 09 15:22:00 crc kubenswrapper[4722]: I0309 15:22:00.601739 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 09 15:22:00 crc kubenswrapper[4722]: I0309 15:22:00.601747 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 09 15:22:00 crc kubenswrapper[4722]: I0309 15:22:00.605096 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-dr2x6"
Mar 09 15:22:00 crc kubenswrapper[4722]: I0309 15:22:00.729350 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551162-kb65f"]
Mar 09 15:22:00 crc kubenswrapper[4722]: I0309 15:22:00.737767 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkvkz\" (UniqueName: \"kubernetes.io/projected/4c42a923-6da3-4c85-b007-7d5937445ac9-kube-api-access-nkvkz\") pod \"auto-csr-approver-29551162-kb65f\" (UID: \"4c42a923-6da3-4c85-b007-7d5937445ac9\") " pod="openshift-infra/auto-csr-approver-29551162-kb65f"
Mar 09 15:22:00 crc kubenswrapper[4722]: I0309 15:22:00.839925 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkvkz\" (UniqueName: \"kubernetes.io/projected/4c42a923-6da3-4c85-b007-7d5937445ac9-kube-api-access-nkvkz\") pod \"auto-csr-approver-29551162-kb65f\" (UID: \"4c42a923-6da3-4c85-b007-7d5937445ac9\") " pod="openshift-infra/auto-csr-approver-29551162-kb65f"
Mar 09 15:22:00 crc kubenswrapper[4722]: I0309 15:22:00.920243 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkvkz\" (UniqueName: \"kubernetes.io/projected/4c42a923-6da3-4c85-b007-7d5937445ac9-kube-api-access-nkvkz\") pod \"auto-csr-approver-29551162-kb65f\" (UID: \"4c42a923-6da3-4c85-b007-7d5937445ac9\") " pod="openshift-infra/auto-csr-approver-29551162-kb65f"
Mar 09 15:22:00 crc kubenswrapper[4722]: I0309 15:22:00.959893 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551162-kb65f"
Mar 09 15:22:02 crc kubenswrapper[4722]: I0309 15:22:02.485067 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551162-kb65f"]
Mar 09 15:22:02 crc kubenswrapper[4722]: I0309 15:22:02.539234 4722 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 09 15:22:02 crc kubenswrapper[4722]: I0309 15:22:02.937365 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551162-kb65f" event={"ID":"4c42a923-6da3-4c85-b007-7d5937445ac9","Type":"ContainerStarted","Data":"3514a103a4055fda70e04334050cbe062236a0276354748f791e879f5325fdb4"}
Mar 09 15:22:04 crc kubenswrapper[4722]: I0309 15:22:04.971583 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551162-kb65f" event={"ID":"4c42a923-6da3-4c85-b007-7d5937445ac9","Type":"ContainerStarted","Data":"7af62118ef4254fa690052f139b54e3cf888fc27b5dbc5fb3e6784c20f0e26ec"}
Mar 09 15:22:05 crc kubenswrapper[4722]: I0309 15:22:05.000664 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551162-kb65f" podStartSLOduration=3.878891216 podStartE2EDuration="4.997665639s" podCreationTimestamp="2026-03-09 15:22:00 +0000 UTC" firstStartedPulling="2026-03-09 15:22:02.533524416 +0000 UTC m=+4763.089092992" lastFinishedPulling="2026-03-09 15:22:03.652298839 +0000 UTC m=+4764.207867415" observedRunningTime="2026-03-09 15:22:04.988028464 +0000 UTC m=+4765.543597040" watchObservedRunningTime="2026-03-09 15:22:04.997665639 +0000 UTC m=+4765.553234215"
Mar 09 15:22:06 crc kubenswrapper[4722]: I0309 15:22:06.202610 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-6vn96" podUID="29ed2858-4fd0-4817-8ed3-b3515ac035d7" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:08 crc kubenswrapper[4722]: I0309 15:22:08.006324 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551162-kb65f" event={"ID":"4c42a923-6da3-4c85-b007-7d5937445ac9","Type":"ContainerDied","Data":"7af62118ef4254fa690052f139b54e3cf888fc27b5dbc5fb3e6784c20f0e26ec"}
Mar 09 15:22:08 crc kubenswrapper[4722]: I0309 15:22:08.006879 4722 generic.go:334] "Generic (PLEG): container finished" podID="4c42a923-6da3-4c85-b007-7d5937445ac9" containerID="7af62118ef4254fa690052f139b54e3cf888fc27b5dbc5fb3e6784c20f0e26ec" exitCode=0
Mar 09 15:22:10 crc kubenswrapper[4722]: I0309 15:22:10.697605 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551162-kb65f"
Mar 09 15:22:10 crc kubenswrapper[4722]: I0309 15:22:10.733948 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nkvkz\" (UniqueName: \"kubernetes.io/projected/4c42a923-6da3-4c85-b007-7d5937445ac9-kube-api-access-nkvkz\") pod \"4c42a923-6da3-4c85-b007-7d5937445ac9\" (UID: \"4c42a923-6da3-4c85-b007-7d5937445ac9\") "
Mar 09 15:22:10 crc kubenswrapper[4722]: I0309 15:22:10.769904 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c42a923-6da3-4c85-b007-7d5937445ac9-kube-api-access-nkvkz" (OuterVolumeSpecName: "kube-api-access-nkvkz") pod "4c42a923-6da3-4c85-b007-7d5937445ac9" (UID: "4c42a923-6da3-4c85-b007-7d5937445ac9"). InnerVolumeSpecName "kube-api-access-nkvkz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 15:22:10 crc kubenswrapper[4722]: I0309 15:22:10.840889 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nkvkz\" (UniqueName: \"kubernetes.io/projected/4c42a923-6da3-4c85-b007-7d5937445ac9-kube-api-access-nkvkz\") on node \"crc\" DevicePath \"\""
Mar 09 15:22:11 crc kubenswrapper[4722]: I0309 15:22:11.059886 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551162-kb65f" event={"ID":"4c42a923-6da3-4c85-b007-7d5937445ac9","Type":"ContainerDied","Data":"3514a103a4055fda70e04334050cbe062236a0276354748f791e879f5325fdb4"}
Mar 09 15:22:11 crc kubenswrapper[4722]: I0309 15:22:11.061094 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3514a103a4055fda70e04334050cbe062236a0276354748f791e879f5325fdb4"
Mar 09 15:22:11 crc kubenswrapper[4722]: I0309 15:22:11.062556 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551162-kb65f"
Mar 09 15:22:12 crc kubenswrapper[4722]: I0309 15:22:12.104325 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551156-w2nm6"]
Mar 09 15:22:12 crc kubenswrapper[4722]: I0309 15:22:12.117468 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551156-w2nm6"]
Mar 09 15:22:12 crc kubenswrapper[4722]: I0309 15:22:12.166643 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="635b0387-641e-4b9a-a7eb-adbe84ed5d01" path="/var/lib/kubelet/pods/635b0387-641e-4b9a-a7eb-adbe84ed5d01/volumes"
Mar 09 15:22:12 crc kubenswrapper[4722]: I0309 15:22:12.769367 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-svbs7"]
Mar 09 15:22:12 crc kubenswrapper[4722]: E0309 15:22:12.776390 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c42a923-6da3-4c85-b007-7d5937445ac9" containerName="oc"
Mar 09 15:22:12 crc kubenswrapper[4722]: I0309 15:22:12.779144 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c42a923-6da3-4c85-b007-7d5937445ac9" containerName="oc"
Mar 09 15:22:12 crc kubenswrapper[4722]: I0309 15:22:12.785775 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c42a923-6da3-4c85-b007-7d5937445ac9" containerName="oc"
Mar 09 15:22:12 crc kubenswrapper[4722]: I0309 15:22:12.801026 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-svbs7"
Need to start a new one" pod="openshift-marketplace/redhat-operators-svbs7" Mar 09 15:22:12 crc kubenswrapper[4722]: I0309 15:22:12.890752 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-222qw\" (UniqueName: \"kubernetes.io/projected/44161991-883d-4494-80b3-b829ff355f47-kube-api-access-222qw\") pod \"redhat-operators-svbs7\" (UID: \"44161991-883d-4494-80b3-b829ff355f47\") " pod="openshift-marketplace/redhat-operators-svbs7" Mar 09 15:22:12 crc kubenswrapper[4722]: I0309 15:22:12.890910 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44161991-883d-4494-80b3-b829ff355f47-utilities\") pod \"redhat-operators-svbs7\" (UID: \"44161991-883d-4494-80b3-b829ff355f47\") " pod="openshift-marketplace/redhat-operators-svbs7" Mar 09 15:22:12 crc kubenswrapper[4722]: I0309 15:22:12.890969 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44161991-883d-4494-80b3-b829ff355f47-catalog-content\") pod \"redhat-operators-svbs7\" (UID: \"44161991-883d-4494-80b3-b829ff355f47\") " pod="openshift-marketplace/redhat-operators-svbs7" Mar 09 15:22:12 crc kubenswrapper[4722]: I0309 15:22:12.912684 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-svbs7"] Mar 09 15:22:12 crc kubenswrapper[4722]: I0309 15:22:12.992991 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44161991-883d-4494-80b3-b829ff355f47-utilities\") pod \"redhat-operators-svbs7\" (UID: \"44161991-883d-4494-80b3-b829ff355f47\") " pod="openshift-marketplace/redhat-operators-svbs7" Mar 09 15:22:12 crc kubenswrapper[4722]: I0309 15:22:12.993084 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44161991-883d-4494-80b3-b829ff355f47-catalog-content\") pod \"redhat-operators-svbs7\" (UID: \"44161991-883d-4494-80b3-b829ff355f47\") " pod="openshift-marketplace/redhat-operators-svbs7" Mar 09 15:22:12 crc kubenswrapper[4722]: I0309 15:22:12.993181 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-222qw\" (UniqueName: \"kubernetes.io/projected/44161991-883d-4494-80b3-b829ff355f47-kube-api-access-222qw\") pod \"redhat-operators-svbs7\" (UID: \"44161991-883d-4494-80b3-b829ff355f47\") " pod="openshift-marketplace/redhat-operators-svbs7" Mar 09 15:22:12 crc kubenswrapper[4722]: I0309 15:22:12.999548 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44161991-883d-4494-80b3-b829ff355f47-utilities\") pod \"redhat-operators-svbs7\" (UID: \"44161991-883d-4494-80b3-b829ff355f47\") " pod="openshift-marketplace/redhat-operators-svbs7" Mar 09 15:22:13 crc kubenswrapper[4722]: I0309 15:22:13.000846 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44161991-883d-4494-80b3-b829ff355f47-catalog-content\") pod \"redhat-operators-svbs7\" (UID: \"44161991-883d-4494-80b3-b829ff355f47\") " pod="openshift-marketplace/redhat-operators-svbs7" Mar 09 15:22:13 crc kubenswrapper[4722]: I0309 15:22:13.039102 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-222qw\" (UniqueName: \"kubernetes.io/projected/44161991-883d-4494-80b3-b829ff355f47-kube-api-access-222qw\") pod \"redhat-operators-svbs7\" (UID: \"44161991-883d-4494-80b3-b829ff355f47\") " pod="openshift-marketplace/redhat-operators-svbs7" Mar 09 15:22:13 crc kubenswrapper[4722]: I0309 15:22:13.148474 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-svbs7" Mar 09 15:22:14 crc kubenswrapper[4722]: I0309 15:22:14.258416 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-svbs7"] Mar 09 15:22:15 crc kubenswrapper[4722]: I0309 15:22:15.101904 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-svbs7" event={"ID":"44161991-883d-4494-80b3-b829ff355f47","Type":"ContainerDied","Data":"aedc98873afd42adba59fcbfbde977e28b5645f86910438a6f263830d1b51d20"} Mar 09 15:22:15 crc kubenswrapper[4722]: I0309 15:22:15.104615 4722 generic.go:334] "Generic (PLEG): container finished" podID="44161991-883d-4494-80b3-b829ff355f47" containerID="aedc98873afd42adba59fcbfbde977e28b5645f86910438a6f263830d1b51d20" exitCode=0 Mar 09 15:22:15 crc kubenswrapper[4722]: I0309 15:22:15.105012 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-svbs7" event={"ID":"44161991-883d-4494-80b3-b829ff355f47","Type":"ContainerStarted","Data":"428841f088edfd21acca847bdf06e66d5c0d29545acac520ba98928b8b361fa2"} Mar 09 15:22:16 crc kubenswrapper[4722]: I0309 15:22:16.118420 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-svbs7" event={"ID":"44161991-883d-4494-80b3-b829ff355f47","Type":"ContainerStarted","Data":"e17863f074bb023e7a57bc6dafe37da55973e0ac7840f2c4c87c71bc2b40aca2"} Mar 09 15:22:18 crc kubenswrapper[4722]: I0309 15:22:18.603042 4722 trace.go:236] Trace[740377259]: "Calculate volume metrics of lokistack-gateway for pod openshift-logging/logging-loki-gateway-6c5ff86c56-dvps5" (09-Mar-2026 15:22:17.135) (total time: 1391ms): Mar 09 15:22:18 crc kubenswrapper[4722]: Trace[740377259]: [1.391922515s] [1.391922515s] END Mar 09 15:22:18 crc kubenswrapper[4722]: I0309 15:22:18.962259 4722 patch_prober.go:28] interesting pod/oauth-openshift-5db757fd5b-t57qc container/oauth-openshift namespace/openshift-authentication: Liveness probe status=failure output="Get \"https://10.217.0.64:6443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:18 crc kubenswrapper[4722]: I0309 15:22:18.963033 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication/oauth-openshift-5db757fd5b-t57qc" podUID="ec3da47d-c782-4189-b195-d6b203bd7f7a" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.64:6443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:18 crc kubenswrapper[4722]: I0309 15:22:18.962245 4722 patch_prober.go:28] interesting pod/oauth-openshift-5db757fd5b-t57qc container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.64:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:18 crc kubenswrapper[4722]: I0309 15:22:18.963148 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-5db757fd5b-t57qc" 
podUID="ec3da47d-c782-4189-b195-d6b203bd7f7a" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.64:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:19 crc kubenswrapper[4722]: I0309 15:22:19.906458 4722 patch_prober.go:28] interesting pod/controller-manager-8445c785c8-hdmgl container/controller-manager namespace/openshift-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.65:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:19 crc kubenswrapper[4722]: I0309 15:22:19.906524 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-controller-manager/controller-manager-8445c785c8-hdmgl" podUID="7139e62b-5e90-4545-a264-aa8138821a55" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.65:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:19 crc kubenswrapper[4722]: I0309 15:22:19.906853 4722 patch_prober.go:28] interesting pod/controller-manager-8445c785c8-hdmgl container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.65:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:19 crc kubenswrapper[4722]: I0309 15:22:19.906877 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-8445c785c8-hdmgl" podUID="7139e62b-5e90-4545-a264-aa8138821a55" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.65:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:20 crc kubenswrapper[4722]: I0309 15:22:20.496388 4722 trace.go:236] Trace[696774191]: "Calculate volume metrics of prometheus-metric-storage-db for pod openstack/prometheus-metric-storage-0" (09-Mar-2026 15:22:18.531) (total time: 1964ms): Mar 09 15:22:20 crc kubenswrapper[4722]: Trace[696774191]: [1.964497307s] [1.964497307s] END Mar 09 15:22:20 crc kubenswrapper[4722]: I0309 15:22:20.496416 4722 trace.go:236] Trace[1289158185]: "Calculate volume metrics of glance for pod openstack/glance-default-external-api-0" (09-Mar-2026 15:22:16.834) (total time: 3661ms): Mar 09 15:22:20 crc kubenswrapper[4722]: Trace[1289158185]: [3.661849937s] [3.661849937s] END Mar 09 15:22:20 crc kubenswrapper[4722]: I0309 15:22:20.496441 4722 trace.go:236] Trace[1532469367]: "Calculate volume metrics of storage for pod minio-dev/minio" (09-Mar-2026 15:22:17.380) (total time: 3116ms): Mar 09 15:22:20 crc kubenswrapper[4722]: Trace[1532469367]: [3.116002036s] [3.116002036s] END Mar 09 15:22:20 crc kubenswrapper[4722]: I0309 15:22:20.775139 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="a79aaedb-92ba-42fc-8cc7-9ecb007d2ac7" containerName="galera" probeResult="failure" output="command timed out" Mar 09 15:22:20 crc kubenswrapper[4722]: I0309 15:22:20.775135 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-galera-0" podUID="a79aaedb-92ba-42fc-8cc7-9ecb007d2ac7" containerName="galera" probeResult="failure" output="command timed out" Mar 09 15:22:25 crc kubenswrapper[4722]: I0309 
Mar 09 15:22:25 crc kubenswrapper[4722]: I0309 15:22:25.437580 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-6dc4c5dd4b-rk4jt" podUID="6dc5b476-5a42-4c98-9a95-0e3b29f2f771" containerName="console" probeResult="failure" output="Get \"https://10.217.0.142:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:25 crc kubenswrapper[4722]: I0309 15:22:25.584900 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-2nlfl" podUID="8557439a-0367-4823-af83-28955a17cc08" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.99:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:26 crc kubenswrapper[4722]: I0309 15:22:26.064622 4722 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-8xdjl container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.75:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 15:22:26 crc kubenswrapper[4722]: I0309 15:22:26.065001 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-8xdjl" podUID="8b35adf7-a305-4f94-a5c9-02fbc3fca46f" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.75:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:26 crc kubenswrapper[4722]: I0309 15:22:26.065171 4722 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-8xdjl container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.75:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 15:22:26 crc kubenswrapper[4722]: I0309 15:22:26.065245 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-8xdjl" podUID="8b35adf7-a305-4f94-a5c9-02fbc3fca46f" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.75:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:26 crc kubenswrapper[4722]: I0309 15:22:26.203384 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-6vn96" podUID="29ed2858-4fd0-4817-8ed3-b3515ac035d7" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:26 crc kubenswrapper[4722]: I0309 15:22:26.685645 4722 patch_prober.go:28] interesting pod/loki-operator-controller-manager-7d6d6698bd-4r85k container/manager namespace/openshift-operators-redhat: Readiness probe status=failure output="Get \"http://10.217.0.50:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 15:22:26 crc kubenswrapper[4722]: I0309 15:22:26.685711 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators-redhat/loki-operator-controller-manager-7d6d6698bd-4r85k" podUID="497a07fc-9649-4620-9432-855aa3fdc327" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.50:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:27 crc kubenswrapper[4722]: I0309 15:22:27.857672 4722 trace.go:236] Trace[1401053170]: "Calculate volume metrics of mysql-db for pod openstack/openstack-galera-0" (09-Mar-2026 15:22:26.399) (total time: 1451ms):
Mar 09 15:22:27 crc kubenswrapper[4722]: Trace[1401053170]: [1.451901127s] [1.451901127s] END
Mar 09 15:22:29 crc kubenswrapper[4722]: I0309 15:22:29.044416 4722 patch_prober.go:28] interesting pod/oauth-openshift-5db757fd5b-t57qc container/oauth-openshift namespace/openshift-authentication: Liveness probe status=failure output="Get \"https://10.217.0.64:6443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 15:22:29 crc kubenswrapper[4722]: I0309 15:22:29.044429 4722 patch_prober.go:28] interesting pod/oauth-openshift-5db757fd5b-t57qc container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.64:6443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 15:22:29 crc kubenswrapper[4722]: I0309 15:22:29.044761 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication/oauth-openshift-5db757fd5b-t57qc" podUID="ec3da47d-c782-4189-b195-d6b203bd7f7a" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.64:6443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:29 crc kubenswrapper[4722]: I0309 15:22:29.044826 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-5db757fd5b-t57qc" podUID="ec3da47d-c782-4189-b195-d6b203bd7f7a" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.64:6443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:29 crc kubenswrapper[4722]: I0309 15:22:29.907384 4722 patch_prober.go:28] interesting pod/controller-manager-8445c785c8-hdmgl container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.65:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 15:22:29 crc kubenswrapper[4722]: I0309 15:22:29.907462 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-8445c785c8-hdmgl" podUID="7139e62b-5e90-4545-a264-aa8138821a55" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.65:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:29 crc kubenswrapper[4722]: I0309 15:22:29.907383 4722 patch_prober.go:28] interesting pod/controller-manager-8445c785c8-hdmgl container/controller-manager namespace/openshift-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.65:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
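The Trace[...] blocks scattered through this window (volume-metrics calculations taking 1.4-3.7s) appear to come from k8s.io/utils/trace, which logs an operation only when it exceeds a latency threshold. A minimal sketch of the same pattern under that assumption; the operation body and the 1s threshold are ours:

package main

import (
	"time"

	utiltrace "k8s.io/utils/trace"
)

// calculateVolumeMetrics wraps a slow operation in a trace that is only
// printed when the total time passes the threshold -- which would
// explain why only the slow "Calculate volume metrics" walks show up in
// the log, each with a Trace[...] header and an END line.
func calculateVolumeMetrics(pod string) {
	trace := utiltrace.New("Calculate volume metrics",
		utiltrace.Field{Key: "pod", Value: pod})
	defer trace.LogIfLong(1 * time.Second)

	time.Sleep(1500 * time.Millisecond) // stand-in for the real du/statfs walk
}

func main() {
	calculateVolumeMetrics("openstack/openstack-galera-0")
}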
Mar 09 15:22:29 crc kubenswrapper[4722]: I0309 15:22:29.907525 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-controller-manager/controller-manager-8445c785c8-hdmgl" podUID="7139e62b-5e90-4545-a264-aa8138821a55" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.65:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:29 crc kubenswrapper[4722]: I0309 15:22:29.913832 4722 patch_prober.go:28] interesting pod/route-controller-manager-67579949ff-g69dw container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.66:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 15:22:29 crc kubenswrapper[4722]: I0309 15:22:29.913863 4722 patch_prober.go:28] interesting pod/route-controller-manager-67579949ff-g69dw container/route-controller-manager namespace/openshift-route-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.66:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 15:22:29 crc kubenswrapper[4722]: I0309 15:22:29.913891 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-67579949ff-g69dw" podUID="fd45c8c5-9cad-404b-b14a-9cbc710c8468" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.66:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:29 crc kubenswrapper[4722]: I0309 15:22:29.913912 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-route-controller-manager/route-controller-manager-67579949ff-g69dw" podUID="fd45c8c5-9cad-404b-b14a-9cbc710c8468" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.66:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:30 crc kubenswrapper[4722]: I0309 15:22:30.787648 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-galera-0" podUID="a79aaedb-92ba-42fc-8cc7-9ecb007d2ac7" containerName="galera" probeResult="failure" output="command timed out"
Mar 09 15:22:30 crc kubenswrapper[4722]: I0309 15:22:30.791723 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="a79aaedb-92ba-42fc-8cc7-9ecb007d2ac7" containerName="galera" probeResult="failure" output="command timed out"
Mar 09 15:22:30 crc kubenswrapper[4722]: I0309 15:22:30.983413 4722 patch_prober.go:28] interesting pod/metrics-server-bcd6d9dd6-5rp7g container/metrics-server namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.83:10250/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 15:22:30 crc kubenswrapper[4722]: I0309 15:22:30.983483 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/metrics-server-bcd6d9dd6-5rp7g" podUID="f8a65e9f-5e0a-47d0-b251-aa4e52e2f581" containerName="metrics-server" probeResult="failure" output="Get \"https://10.217.0.83:10250/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:30 crc kubenswrapper[4722]: I0309 15:22:30.983807 4722 patch_prober.go:28] interesting pod/metrics-server-bcd6d9dd6-5rp7g container/metrics-server namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.83:10250/livez\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 15:22:30 crc kubenswrapper[4722]: I0309 15:22:30.983861 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/metrics-server-bcd6d9dd6-5rp7g" podUID="f8a65e9f-5e0a-47d0-b251-aa4e52e2f581" containerName="metrics-server" probeResult="failure" output="Get \"https://10.217.0.83:10250/livez\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:31 crc kubenswrapper[4722]: I0309 15:22:31.001969 4722 scope.go:117] "RemoveContainer" containerID="4f4f60e84c0f4934025a4b6014bc5592a2608f1e7b3d5ee5cf6960d91c19b24e"
Mar 09 15:22:31 crc kubenswrapper[4722]: I0309 15:22:31.253724 4722 patch_prober.go:28] interesting pod/monitoring-plugin-65599947bd-42bk4 container/monitoring-plugin namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.84:9443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 15:22:31 crc kubenswrapper[4722]: I0309 15:22:31.254083 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/monitoring-plugin-65599947bd-42bk4" podUID="85f5a76e-3679-44b3-8932-f5245c49b481" containerName="monitoring-plugin" probeResult="failure" output="Get \"https://10.217.0.84:9443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:31 crc kubenswrapper[4722]: I0309 15:22:31.698323 4722 patch_prober.go:28] interesting pod/logging-loki-distributor-5d5548c9f5-r6x4b container/loki-distributor namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.52:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 15:22:31 crc kubenswrapper[4722]: I0309 15:22:31.698404 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-distributor-5d5548c9f5-r6x4b" podUID="822bc43f-dfed-4440-be35-1bf58f50456b" containerName="loki-distributor" probeResult="failure" output="Get \"https://10.217.0.52:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:31 crc kubenswrapper[4722]: I0309 15:22:31.773645 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-nmstate/nmstate-handler-67pzj" podUID="ade25a79-1e43-41ed-be91-ce97aa1c4103" containerName="nmstate-handler" probeResult="failure" output="command timed out"
Mar 09 15:22:31 crc kubenswrapper[4722]: I0309 15:22:31.773733 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-cell1-galera-0" podUID="4159e308-3ccf-45d9-a97b-8133542007a8" containerName="galera" probeResult="failure" output="command timed out"
Mar 09 15:22:31 crc kubenswrapper[4722]: I0309 15:22:31.774135 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="4159e308-3ccf-45d9-a97b-8133542007a8" containerName="galera" probeResult="failure" output="command timed out"
prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="4159e308-3ccf-45d9-a97b-8133542007a8" containerName="galera" probeResult="failure" output="command timed out" Mar 09 15:22:31 crc kubenswrapper[4722]: I0309 15:22:31.819488 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-lvfgg" podUID="7a62b98d-e9d4-4cbc-bea8-0da13fcc4467" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.112:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:31 crc kubenswrapper[4722]: I0309 15:22:31.820222 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-lvfgg" podUID="7a62b98d-e9d4-4cbc-bea8-0da13fcc4467" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.112:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:31 crc kubenswrapper[4722]: I0309 15:22:31.906160 4722 patch_prober.go:28] interesting pod/logging-loki-querier-76bf7b6d45-fv4dh container/loki-querier namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.53:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:31 crc kubenswrapper[4722]: I0309 15:22:31.906227 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-querier-76bf7b6d45-fv4dh" podUID="ba829d53-02a8-4003-a5ee-b9b36d8404e3" containerName="loki-querier" probeResult="failure" output="Get \"https://10.217.0.53:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:32 crc kubenswrapper[4722]: I0309 15:22:32.090537 4722 patch_prober.go:28] interesting pod/logging-loki-query-frontend-6d6859c548-5b72h container/loki-query-frontend namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.54:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:32 crc kubenswrapper[4722]: I0309 15:22:32.090601 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-query-frontend-6d6859c548-5b72h" podUID="5ccc948e-2185-44fd-90c4-3ae3228f6224" containerName="loki-query-frontend" probeResult="failure" output="Get \"https://10.217.0.54:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:32 crc kubenswrapper[4722]: I0309 15:22:32.209870 4722 patch_prober.go:28] interesting pod/logging-loki-gateway-6c5ff86c56-n5jdr container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.55:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:32 crc kubenswrapper[4722]: I0309 15:22:32.209928 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-6c5ff86c56-n5jdr" podUID="8becd072-3095-4717-a83d-e56cf0d0f816" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.55:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:32 crc 
kubenswrapper[4722]: I0309 15:22:32.222848 4722 patch_prober.go:28] interesting pod/logging-loki-gateway-6c5ff86c56-dvps5 container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:32 crc kubenswrapper[4722]: I0309 15:22:32.222904 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-6c5ff86c56-dvps5" podUID="e265fe14-7154-4fbb-a7c3-33557166f71d" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.56:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:32 crc kubenswrapper[4722]: I0309 15:22:32.287374 4722 patch_prober.go:28] interesting pod/logging-loki-gateway-6c5ff86c56-n5jdr container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.55:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:32 crc kubenswrapper[4722]: I0309 15:22:32.287423 4722 patch_prober.go:28] interesting pod/logging-loki-gateway-6c5ff86c56-dvps5 container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:32 crc kubenswrapper[4722]: I0309 15:22:32.287460 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-6c5ff86c56-dvps5" podUID="e265fe14-7154-4fbb-a7c3-33557166f71d" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:32 crc kubenswrapper[4722]: I0309 15:22:32.287378 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9ctn5qw" podUID="5e25c11b-f9c6-4542-9c0c-394ea6bc2c17" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.120:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:32 crc kubenswrapper[4722]: I0309 15:22:32.287490 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9ctn5qw" podUID="5e25c11b-f9c6-4542-9c0c-394ea6bc2c17" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.120:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:32 crc kubenswrapper[4722]: I0309 15:22:32.287429 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-6c5ff86c56-n5jdr" podUID="8becd072-3095-4717-a83d-e56cf0d0f816" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.55:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:32 crc kubenswrapper[4722]: I0309 15:22:32.555489 4722 patch_prober.go:28] interesting pod/observability-operator-59bdc8b94-lc5zn container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.18:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting 
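For triaging a probe flood like the one above, a small offline helper can tally the "Probe failed" entries per probe type and pod. This is purely a reader-side convenience, not part of the kubelet; it assumes one log entry per line and an invocation like go run tally.go < kubelet.log:

package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

// Counts kubelet "Probe failed" entries by probe type and pod, reading
// the log from stdin and printing one count per distinct pair.
func main() {
	re := regexp.MustCompile(`"Probe failed" probeType="([^"]+)" pod="([^"]+)"`)
	counts := map[string]int{}
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 1024*1024), 1024*1024) // kubelet lines can be long
	for sc.Scan() {
		if m := re.FindStringSubmatch(sc.Text()); m != nil {
			counts[m[1]+"\t"+m[2]]++
		}
	}
	for k, n := range counts {
		fmt.Printf("%5d\t%s\n", n, k)
	}
}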
headers)" start-of-body= Mar 09 15:22:32 crc kubenswrapper[4722]: I0309 15:22:32.555816 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-59bdc8b94-lc5zn" podUID="14655c3d-02fe-4215-b566-0c4008fd34a0" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.18:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:32 crc kubenswrapper[4722]: I0309 15:22:32.611285 4722 patch_prober.go:28] interesting pod/thanos-querier-5f6c868b98-s8fg2 container/kube-rbac-proxy-web namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.81:9091/-/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:32 crc kubenswrapper[4722]: I0309 15:22:32.611364 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/thanos-querier-5f6c868b98-s8fg2" podUID="1eb00f12-c24e-46dc-8346-c096826564f5" containerName="kube-rbac-proxy-web" probeResult="failure" output="Get \"https://10.217.0.81:9091/-/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:32 crc kubenswrapper[4722]: I0309 15:22:32.637396 4722 patch_prober.go:28] interesting pod/observability-operator-59bdc8b94-lc5zn container/operator namespace/openshift-operators: Liveness probe status=failure output="Get \"http://10.217.0.18:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:32 crc kubenswrapper[4722]: I0309 15:22:32.637429 4722 patch_prober.go:28] interesting pod/perses-operator-5bf474d74f-2rhld container/perses-operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.11:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:32 crc kubenswrapper[4722]: I0309 15:22:32.637459 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operators/observability-operator-59bdc8b94-lc5zn" podUID="14655c3d-02fe-4215-b566-0c4008fd34a0" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.18:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:32 crc kubenswrapper[4722]: I0309 15:22:32.637496 4722 patch_prober.go:28] interesting pod/perses-operator-5bf474d74f-2rhld container/perses-operator namespace/openshift-operators: Liveness probe status=failure output="Get \"http://10.217.0.11:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:32 crc kubenswrapper[4722]: I0309 15:22:32.637568 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operators/perses-operator-5bf474d74f-2rhld" podUID="c123a767-e0e0-4432-b34f-cbe0b581d938" containerName="perses-operator" probeResult="failure" output="Get \"http://10.217.0.11:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:32 crc kubenswrapper[4722]: I0309 15:22:32.637498 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/perses-operator-5bf474d74f-2rhld" podUID="c123a767-e0e0-4432-b34f-cbe0b581d938" containerName="perses-operator" probeResult="failure" output="Get \"http://10.217.0.11:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:32 crc kubenswrapper[4722]: I0309 
15:22:32.640054 4722 patch_prober.go:28] interesting pod/console-operator-58897d9998-chrnr container/console-operator namespace/openshift-console-operator: Liveness probe status=failure output="Get \"https://10.217.0.14:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:32 crc kubenswrapper[4722]: I0309 15:22:32.640081 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console-operator/console-operator-58897d9998-chrnr" podUID="b919959a-1da1-4c74-9330-5bb8c33f5c26" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.14:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:32 crc kubenswrapper[4722]: I0309 15:22:32.640045 4722 patch_prober.go:28] interesting pod/console-operator-58897d9998-chrnr container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.14:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:32 crc kubenswrapper[4722]: I0309 15:22:32.640122 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-chrnr" podUID="b919959a-1da1-4c74-9330-5bb8c33f5c26" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.14:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:32 crc kubenswrapper[4722]: I0309 15:22:32.697965 4722 patch_prober.go:28] interesting pod/logging-loki-distributor-5d5548c9f5-r6x4b container/loki-distributor namespace/openshift-logging: Liveness probe status=failure output="Get \"https://10.217.0.52:3101/loki/api/v1/status/buildinfo\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:32 crc kubenswrapper[4722]: I0309 15:22:32.698107 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-logging/logging-loki-distributor-5d5548c9f5-r6x4b" podUID="822bc43f-dfed-4440-be35-1bf58f50456b" containerName="loki-distributor" probeResult="failure" output="Get \"https://10.217.0.52:3101/loki/api/v1/status/buildinfo\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:32 crc kubenswrapper[4722]: I0309 15:22:32.714706 4722 patch_prober.go:28] interesting pod/authentication-operator-69f744f599-8tkbn container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": context deadline exceeded" start-of-body= Mar 09 15:22:32 crc kubenswrapper[4722]: I0309 15:22:32.714774 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-69f744f599-8tkbn" podUID="427a4c04-99cd-4f53-ae98-20c1755d7658" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": context deadline exceeded" Mar 09 15:22:32 crc kubenswrapper[4722]: I0309 15:22:32.757781 4722 patch_prober.go:28] interesting pod/openshift-kube-scheduler-crc container/kube-scheduler namespace/openshift-kube-scheduler: Readiness probe status=failure output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection 
(Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:32 crc kubenswrapper[4722]: I0309 15:22:32.757860 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:32 crc kubenswrapper[4722]: I0309 15:22:32.777292 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="e4a22f8c-ed38-47cf-8238-baf804f573a1" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out" Mar 09 15:22:32 crc kubenswrapper[4722]: I0309 15:22:32.868443 4722 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.57:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:32 crc kubenswrapper[4722]: I0309 15:22:32.868506 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="6a23db8b-8a30-47b8-bf39-6f193899fcee" containerName="loki-ingester" probeResult="failure" output="Get \"https://10.217.0.57:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:33 crc kubenswrapper[4722]: I0309 15:22:33.208192 4722 patch_prober.go:28] interesting pod/logging-loki-gateway-6c5ff86c56-n5jdr container/opa namespace/openshift-logging: Liveness probe status=failure output="Get \"https://10.217.0.55:8083/live\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:33 crc kubenswrapper[4722]: I0309 15:22:33.208287 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-logging/logging-loki-gateway-6c5ff86c56-n5jdr" podUID="8becd072-3095-4717-a83d-e56cf0d0f816" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.55:8083/live\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:33 crc kubenswrapper[4722]: I0309 15:22:33.222337 4722 patch_prober.go:28] interesting pod/logging-loki-gateway-6c5ff86c56-dvps5 container/opa namespace/openshift-logging: Liveness probe status=failure output="Get \"https://10.217.0.56:8083/live\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:33 crc kubenswrapper[4722]: I0309 15:22:33.222403 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-logging/logging-loki-gateway-6c5ff86c56-dvps5" podUID="e265fe14-7154-4fbb-a7c3-33557166f71d" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.56:8083/live\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:33 crc kubenswrapper[4722]: I0309 15:22:33.289055 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="hostpath-provisioner/csi-hostpathplugin-b5hps" podUID="240d1325-4400-475e-8bc7-9915294148d8" containerName="hostpath-provisioner" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 15:22:33 crc kubenswrapper[4722]: I0309 15:22:33.761403 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-controller-init-5b979cff56-vwbnz" 
podUID="bdac45ca-36d4-41c5-b5e5-332d70558171" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.104:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:34 crc kubenswrapper[4722]: I0309 15:22:34.168421 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="cert-manager/cert-manager-webhook-687f57d79b-wb6rj" podUID="ac6d6e52-6a89-4f96-8894-8ed2c71cdcbc" containerName="cert-manager-webhook" probeResult="failure" output="Get \"http://10.217.0.45:6080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:34 crc kubenswrapper[4722]: I0309 15:22:34.328345 4722 patch_prober.go:28] interesting pod/router-default-5444994796-dp8wn container/router namespace/openshift-ingress: Readiness probe status=failure output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:34 crc kubenswrapper[4722]: I0309 15:22:34.328400 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ingress/router-default-5444994796-dp8wn" podUID="317444ee-0620-47d2-869e-77578a367a87" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:34 crc kubenswrapper[4722]: I0309 15:22:34.328411 4722 patch_prober.go:28] interesting pod/router-default-5444994796-dp8wn container/router namespace/openshift-ingress: Liveness probe status=failure output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:34 crc kubenswrapper[4722]: I0309 15:22:34.328472 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-ingress/router-default-5444994796-dp8wn" podUID="317444ee-0620-47d2-869e-77578a367a87" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:34 crc kubenswrapper[4722]: I0309 15:22:34.328510 4722 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-dgxnd container/catalog-operator namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.42:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:34 crc kubenswrapper[4722]: I0309 15:22:34.328529 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dgxnd" podUID="c219beb3-4ba5-43bd-b2ec-3855d19c2b57" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.42:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:34 crc kubenswrapper[4722]: I0309 15:22:34.328519 4722 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-dgxnd container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.42:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:34 crc kubenswrapper[4722]: I0309 15:22:34.328632 4722 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dgxnd" podUID="c219beb3-4ba5-43bd-b2ec-3855d19c2b57" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.42:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:34 crc kubenswrapper[4722]: I0309 15:22:34.464432 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-controller-manager-5689854475-89q94" podUID="3ea04cb5-4d36-42a9-bb83-c6f943619d16" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.97:8080/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:34 crc kubenswrapper[4722]: I0309 15:22:34.501706 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="0e07fbab-4a47-4e59-aa72-f0a4521296af" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.173:9090/-/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:34 crc kubenswrapper[4722]: I0309 15:22:34.501735 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/prometheus-metric-storage-0" podUID="0e07fbab-4a47-4e59-aa72-f0a4521296af" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.173:9090/-/healthy\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:34 crc kubenswrapper[4722]: I0309 15:22:34.645381 4722 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-lnkwt container/olm-operator namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.19:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:34 crc kubenswrapper[4722]: I0309 15:22:34.645426 4722 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-lnkwt container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.19:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:34 crc kubenswrapper[4722]: I0309 15:22:34.645402 4722 patch_prober.go:28] interesting pod/package-server-manager-789f6589d5-mrtb2 container/package-server-manager namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"http://10.217.0.23:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:34 crc kubenswrapper[4722]: I0309 15:22:34.645447 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lnkwt" podUID="f8055a95-6b09-4e32-88b8-82ad36ca5029" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.19:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:34 crc kubenswrapper[4722]: I0309 15:22:34.645456 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lnkwt" podUID="f8055a95-6b09-4e32-88b8-82ad36ca5029" containerName="olm-operator" probeResult="failure" output="Get 
\"https://10.217.0.19:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:34 crc kubenswrapper[4722]: I0309 15:22:34.645560 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mrtb2" podUID="228d39d8-b0bc-4491-be90-e473c090f412" containerName="package-server-manager" probeResult="failure" output="Get \"http://10.217.0.23:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:34 crc kubenswrapper[4722]: I0309 15:22:34.687408 4722 patch_prober.go:28] interesting pod/package-server-manager-789f6589d5-mrtb2 container/package-server-manager namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"http://10.217.0.23:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:34 crc kubenswrapper[4722]: I0309 15:22:34.687474 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mrtb2" podUID="228d39d8-b0bc-4491-be90-e473c090f412" containerName="package-server-manager" probeResult="failure" output="Get \"http://10.217.0.23:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:34 crc kubenswrapper[4722]: I0309 15:22:34.728438 4722 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-fp2th container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.44:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:34 crc kubenswrapper[4722]: I0309 15:22:34.728503 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fp2th" podUID="996087ed-6480-4650-8632-c991e5d16c99" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.44:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:34 crc kubenswrapper[4722]: I0309 15:22:34.728630 4722 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-jmwpv container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.12:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:34 crc kubenswrapper[4722]: I0309 15:22:34.728648 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-jmwpv" podUID="9f2a2160-888c-4101-8b1c-63498753a2b7" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.12:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:34 crc kubenswrapper[4722]: I0309 15:22:34.728708 4722 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-fp2th container/packageserver namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.44:5443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:34 crc kubenswrapper[4722]: I0309 15:22:34.728731 4722 prober.go:107] "Probe 
failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fp2th" podUID="996087ed-6480-4650-8632-c991e5d16c99" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.44:5443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:34 crc kubenswrapper[4722]: I0309 15:22:34.778828 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-k8s-0" podUID="df07ee71-98db-42a1-9df4-6f8707504f08" containerName="prometheus" probeResult="failure" output="command timed out" Mar 09 15:22:34 crc kubenswrapper[4722]: I0309 15:22:34.778828 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-k8s-0" podUID="df07ee71-98db-42a1-9df4-6f8707504f08" containerName="prometheus" probeResult="failure" output="command timed out" Mar 09 15:22:34 crc kubenswrapper[4722]: I0309 15:22:34.899532 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-webhook-server-798745ff96-864pz" podUID="f67efad4-1b85-4f64-9e98-55eb2da89fb6" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.98:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:34 crc kubenswrapper[4722]: I0309 15:22:34.899928 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/metallb-operator-webhook-server-798745ff96-864pz" podUID="f67efad4-1b85-4f64-9e98-55eb2da89fb6" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.98:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:35 crc kubenswrapper[4722]: I0309 15:22:35.478366 4722 patch_prober.go:28] interesting pod/console-6dc4c5dd4b-rk4jt container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.142:8443/health\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:35 crc kubenswrapper[4722]: I0309 15:22:35.478758 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-6dc4c5dd4b-rk4jt" podUID="6dc5b476-5a42-4c98-9a95-0e3b29f2f771" containerName="console" probeResult="failure" output="Get \"https://10.217.0.142:8443/health\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:35 crc kubenswrapper[4722]: I0309 15:22:35.633450 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-2nlfl" podUID="8557439a-0367-4823-af83-28955a17cc08" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.99:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:35 crc kubenswrapper[4722]: I0309 15:22:35.634453 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-2nlfl" podUID="8557439a-0367-4823-af83-28955a17cc08" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.99:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:35 crc kubenswrapper[4722]: I0309 15:22:35.685584 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/controller-86ddb6bd46-6w5ww" podUID="0b264ba4-1398-4d3e-bb2b-83cd34a0ccf6" containerName="controller" probeResult="failure" 
output="Get \"http://10.217.0.100:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:35 crc kubenswrapper[4722]: I0309 15:22:35.809718 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-vppnv" podUID="8f839106-1673-4589-9391-0cd7748e658c" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.106:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:35 crc kubenswrapper[4722]: I0309 15:22:35.892497 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-rsd9l" podUID="edd71e1d-6ff0-4918-9cd8-a342efba2df5" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.107:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:35 crc kubenswrapper[4722]: I0309 15:22:35.977742 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-bmpgd" podUID="a1a5e35a-83f6-4886-86db-55738f51f7e8" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.108:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:35 crc kubenswrapper[4722]: I0309 15:22:35.978387 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/controller-86ddb6bd46-6w5ww" podUID="0b264ba4-1398-4d3e-bb2b-83cd34a0ccf6" containerName="controller" probeResult="failure" output="Get \"http://10.217.0.100:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:35 crc kubenswrapper[4722]: I0309 15:22:35.978414 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-vppnv" podUID="8f839106-1673-4589-9391-0cd7748e658c" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.106:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:36 crc kubenswrapper[4722]: I0309 15:22:36.060603 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-rsd9l" podUID="edd71e1d-6ff0-4918-9cd8-a342efba2df5" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.107:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:36 crc kubenswrapper[4722]: I0309 15:22:36.060630 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-zjf7b" podUID="22043c71-5292-422c-99e5-c88ea1aef638" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.110:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:36 crc kubenswrapper[4722]: I0309 15:22:36.062723 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-bmpgd" podUID="a1a5e35a-83f6-4886-86db-55738f51f7e8" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.108:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:36 crc kubenswrapper[4722]: I0309 15:22:36.063621 4722 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack-operators/glance-operator-controller-manager-64db6967f8-zjf7b" podUID="22043c71-5292-422c-99e5-c88ea1aef638" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.110:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:36 crc kubenswrapper[4722]: I0309 15:22:36.063636 4722 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-8xdjl container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.75:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:36 crc kubenswrapper[4722]: I0309 15:22:36.063685 4722 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-8xdjl container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.75:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:36 crc kubenswrapper[4722]: I0309 15:22:36.063701 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-8xdjl" podUID="8b35adf7-a305-4f94-a5c9-02fbc3fca46f" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.75:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:36 crc kubenswrapper[4722]: I0309 15:22:36.063723 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-8xdjl" podUID="8b35adf7-a305-4f94-a5c9-02fbc3fca46f" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.75:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:36 crc kubenswrapper[4722]: I0309 15:22:36.111578 4722 trace.go:236] Trace[137578641]: "Calculate volume metrics of storage for pod openshift-logging/logging-loki-ingester-0" (09-Mar-2026 15:22:33.058) (total time: 3048ms): Mar 09 15:22:36 crc kubenswrapper[4722]: Trace[137578641]: [3.048749935s] [3.048749935s] END Mar 09 15:22:36 crc kubenswrapper[4722]: I0309 15:22:36.155990 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/kube-state-metrics-0" podUID="652c419b-2a86-4c6f-ac7a-c2d7818ef55f" containerName="kube-state-metrics" probeResult="failure" output="Get \"https://10.217.1.16:8081/readyz\": context deadline exceeded" Mar 09 15:22:36 crc kubenswrapper[4722]: I0309 15:22:36.156397 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/kube-state-metrics-0" podUID="652c419b-2a86-4c6f-ac7a-c2d7818ef55f" containerName="kube-state-metrics" probeResult="failure" output="Get \"https://10.217.1.16:8080/livez\": context deadline exceeded" Mar 09 15:22:36 crc kubenswrapper[4722]: I0309 15:22:36.367399 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-6vn96" podUID="29ed2858-4fd0-4817-8ed3-b3515ac035d7" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:36 crc kubenswrapper[4722]: I0309 
15:22:36.367424 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-6vn96" podUID="29ed2858-4fd0-4817-8ed3-b3515ac035d7" containerName="controller" probeResult="failure" output="Get \"http://127.0.0.1:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:36 crc kubenswrapper[4722]: I0309 15:22:36.367944 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-hnzfw" podUID="ec9f1f5e-26f5-4683-bf41-c85981da9d18" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.113:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:36 crc kubenswrapper[4722]: I0309 15:22:36.368079 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-hnzfw" podUID="ec9f1f5e-26f5-4683-bf41-c85981da9d18" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.113:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:36 crc kubenswrapper[4722]: I0309 15:22:36.449493 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-l8bds" podUID="74cb981b-ce89-479e-8573-fdda25190637" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.116:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:36 crc kubenswrapper[4722]: I0309 15:22:36.449504 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-6vn96" podUID="29ed2858-4fd0-4817-8ed3-b3515ac035d7" containerName="controller" probeResult="failure" output="Get \"http://127.0.0.1:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:36 crc kubenswrapper[4722]: I0309 15:22:36.532431 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-jwgfg" podUID="a9ff56ca-00a6-484f-a477-0dca4f3a0f5c" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.118:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:36 crc kubenswrapper[4722]: I0309 15:22:36.532440 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-l8bds" podUID="74cb981b-ce89-479e-8573-fdda25190637" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.116:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:36 crc kubenswrapper[4722]: I0309 15:22:36.615428 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-jwgfg" podUID="a9ff56ca-00a6-484f-a477-0dca4f3a0f5c" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.118:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:36 crc kubenswrapper[4722]: I0309 15:22:36.615432 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-n5zc7" podUID="717ffc3a-7a6d-4a7c-837f-d1ed92489b68" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.119:8081/healthz\": context deadline exceeded 
(Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:36 crc kubenswrapper[4722]: I0309 15:22:36.615538 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-n5zc7" podUID="717ffc3a-7a6d-4a7c-837f-d1ed92489b68" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.119:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:36 crc kubenswrapper[4722]: I0309 15:22:36.624113 4722 patch_prober.go:28] interesting pod/nmstate-webhook-786f45cff4-jdp69 container/nmstate-webhook namespace/openshift-nmstate: Readiness probe status=failure output="Get \"https://10.217.0.89:9443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:36 crc kubenswrapper[4722]: I0309 15:22:36.624188 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-nmstate/nmstate-webhook-786f45cff4-jdp69" podUID="c0af161b-a8d5-4a36-b1c2-0a4d43820c73" containerName="nmstate-webhook" probeResult="failure" output="Get \"https://10.217.0.89:9443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:36 crc kubenswrapper[4722]: I0309 15:22:36.686431 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-hgkzs" podUID="df8b52ff-f61e-4aca-a408-240590699ae6" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.121:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:36 crc kubenswrapper[4722]: I0309 15:22:36.775909 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/community-operators-rvtqn" podUID="3411289f-3e7c-4e43-b545-5e612822b18e" containerName="registry-server" probeResult="failure" output="command timed out" Mar 09 15:22:36 crc kubenswrapper[4722]: I0309 15:22:36.777770 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/community-operators-rvtqn" podUID="3411289f-3e7c-4e43-b545-5e612822b18e" containerName="registry-server" probeResult="failure" output="command timed out" Mar 09 15:22:36 crc kubenswrapper[4722]: I0309 15:22:36.798335 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/redhat-operators-v57f2" podUID="4e27c5a4-8cba-4119-8006-f9841d6121dc" containerName="registry-server" probeResult="failure" output=< Mar 09 15:22:36 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s Mar 09 15:22:36 crc kubenswrapper[4722]: > Mar 09 15:22:36 crc kubenswrapper[4722]: I0309 15:22:36.798420 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/redhat-marketplace-c2mmw" podUID="690e5ab0-3719-40ac-aba6-9278480ecb44" containerName="registry-server" probeResult="failure" output=< Mar 09 15:22:36 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s Mar 09 15:22:36 crc kubenswrapper[4722]: > Mar 09 15:22:36 crc kubenswrapper[4722]: I0309 15:22:36.810884 4722 patch_prober.go:28] interesting pod/loki-operator-controller-manager-7d6d6698bd-4r85k container/manager namespace/openshift-operators-redhat: Liveness probe status=failure output="Get \"http://10.217.0.50:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:36 crc 
kubenswrapper[4722]: I0309 15:22:36.810952 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operators-redhat/loki-operator-controller-manager-7d6d6698bd-4r85k" podUID="497a07fc-9649-4620-9432-855aa3fdc327" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.50:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:36 crc kubenswrapper[4722]: I0309 15:22:36.824969 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-c2mmw" podUID="690e5ab0-3719-40ac-aba6-9278480ecb44" containerName="registry-server" probeResult="failure" output=< Mar 09 15:22:36 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s Mar 09 15:22:36 crc kubenswrapper[4722]: > Mar 09 15:22:37 crc kubenswrapper[4722]: I0309 15:22:37.018383 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-hgkzs" podUID="df8b52ff-f61e-4aca-a408-240590699ae6" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.121:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:37 crc kubenswrapper[4722]: I0309 15:22:37.018407 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/telemetry-operator-controller-manager-66c8b7dfbb-m7fv2" podUID="5bf14ad6-64cf-48f7-99e6-fabac12849e2" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.124:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:37 crc kubenswrapper[4722]: I0309 15:22:37.018511 4722 patch_prober.go:28] interesting pod/loki-operator-controller-manager-7d6d6698bd-4r85k container/manager namespace/openshift-operators-redhat: Readiness probe status=failure output="Get \"http://10.217.0.50:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:37 crc kubenswrapper[4722]: I0309 15:22:37.018534 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators-redhat/loki-operator-controller-manager-7d6d6698bd-4r85k" podUID="497a07fc-9649-4620-9432-855aa3fdc327" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.50:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:37 crc kubenswrapper[4722]: I0309 15:22:37.018696 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-wrzrq" podUID="e7b4c7c9-7c4f-4a13-8367-759f5f5ce368" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.123:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:37 crc kubenswrapper[4722]: I0309 15:22:37.018740 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-56qz9" podUID="a9df5689-5d83-4206-be2b-cf6877d70e23" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.122:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:37 crc kubenswrapper[4722]: I0309 15:22:37.018736 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-wrzrq" podUID="e7b4c7c9-7c4f-4a13-8367-759f5f5ce368" containerName="manager" 
probeResult="failure" output="Get \"http://10.217.0.123:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:37 crc kubenswrapper[4722]: I0309 15:22:37.100418 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-pg8qn" podUID="ef36bc5a-2962-4c1e-a5fd-98f61d525d5d" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.125:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:37 crc kubenswrapper[4722]: I0309 15:22:37.100514 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-56qz9" podUID="a9df5689-5d83-4206-be2b-cf6877d70e23" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.122:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:37 crc kubenswrapper[4722]: I0309 15:22:37.182426 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/telemetry-operator-controller-manager-66c8b7dfbb-m7fv2" podUID="5bf14ad6-64cf-48f7-99e6-fabac12849e2" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.124:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:37 crc kubenswrapper[4722]: I0309 15:22:37.182458 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-pkbqb" podUID="0eac7341-5bab-4c97-a730-b7eeb0a75899" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.126:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:37 crc kubenswrapper[4722]: I0309 15:22:37.182832 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-pg8qn" podUID="ef36bc5a-2962-4c1e-a5fd-98f61d525d5d" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.125:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:37 crc kubenswrapper[4722]: I0309 15:22:37.207564 4722 patch_prober.go:28] interesting pod/logging-loki-gateway-6c5ff86c56-n5jdr container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.55:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:37 crc kubenswrapper[4722]: I0309 15:22:37.207634 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-6c5ff86c56-n5jdr" podUID="8becd072-3095-4717-a83d-e56cf0d0f816" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.55:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:37 crc kubenswrapper[4722]: I0309 15:22:37.222336 4722 patch_prober.go:28] interesting pod/logging-loki-gateway-6c5ff86c56-dvps5 container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8081/ready\": context deadline exceeded" start-of-body= Mar 09 15:22:37 crc kubenswrapper[4722]: I0309 15:22:37.222446 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-6c5ff86c56-dvps5" podUID="e265fe14-7154-4fbb-a7c3-33557166f71d" containerName="gateway" 
probeResult="failure" output="Get \"https://10.217.0.56:8081/ready\": context deadline exceeded" Mar 09 15:22:37 crc kubenswrapper[4722]: I0309 15:22:37.264420 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-pkbqb" podUID="0eac7341-5bab-4c97-a730-b7eeb0a75899" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.126:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:37 crc kubenswrapper[4722]: I0309 15:22:37.264447 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/speaker-kwml8" podUID="93b8f0be-bf52-4559-8cf6-338026cb6610" containerName="speaker" probeResult="failure" output="Get \"http://localhost:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:37 crc kubenswrapper[4722]: I0309 15:22:37.264553 4722 patch_prober.go:28] interesting pod/logging-loki-gateway-6c5ff86c56-n5jdr container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.55:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:37 crc kubenswrapper[4722]: I0309 15:22:37.264579 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-6c5ff86c56-n5jdr" podUID="8becd072-3095-4717-a83d-e56cf0d0f816" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.55:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:37 crc kubenswrapper[4722]: I0309 15:22:37.264622 4722 patch_prober.go:28] interesting pod/logging-loki-gateway-6c5ff86c56-dvps5 container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:37 crc kubenswrapper[4722]: I0309 15:22:37.264671 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-6c5ff86c56-dvps5" podUID="e265fe14-7154-4fbb-a7c3-33557166f71d" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:37 crc kubenswrapper[4722]: I0309 15:22:37.264706 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/speaker-kwml8" podUID="93b8f0be-bf52-4559-8cf6-338026cb6610" containerName="speaker" probeResult="failure" output="Get \"http://localhost:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:37 crc kubenswrapper[4722]: I0309 15:22:37.490978 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/redhat-operators-v57f2" podUID="4e27c5a4-8cba-4119-8006-f9841d6121dc" containerName="registry-server" probeResult="failure" output=< Mar 09 15:22:37 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s Mar 09 15:22:37 crc kubenswrapper[4722]: > Mar 09 15:22:37 crc kubenswrapper[4722]: I0309 15:22:37.547808 4722 patch_prober.go:28] interesting pod/openshift-kube-scheduler-crc container/kube-scheduler namespace/openshift-kube-scheduler: Liveness probe status=failure output="Get 
\"https://192.168.126.11:10259/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:37 crc kubenswrapper[4722]: I0309 15:22:37.547881 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:37 crc kubenswrapper[4722]: I0309 15:22:37.608863 4722 patch_prober.go:28] interesting pod/thanos-querier-5f6c868b98-s8fg2 container/kube-rbac-proxy-web namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.81:9091/-/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:37 crc kubenswrapper[4722]: I0309 15:22:37.608792 4722 patch_prober.go:28] interesting pod/thanos-querier-5f6c868b98-s8fg2 container/kube-rbac-proxy-web namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.81:9091/-/healthy\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:37 crc kubenswrapper[4722]: I0309 15:22:37.608931 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/thanos-querier-5f6c868b98-s8fg2" podUID="1eb00f12-c24e-46dc-8346-c096826564f5" containerName="kube-rbac-proxy-web" probeResult="failure" output="Get \"https://10.217.0.81:9091/-/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:37 crc kubenswrapper[4722]: I0309 15:22:37.608972 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/thanos-querier-5f6c868b98-s8fg2" podUID="1eb00f12-c24e-46dc-8346-c096826564f5" containerName="kube-rbac-proxy-web" probeResult="failure" output="Get \"https://10.217.0.81:9091/-/healthy\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:37 crc kubenswrapper[4722]: I0309 15:22:37.619217 4722 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-jmwpv container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.12:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:37 crc kubenswrapper[4722]: I0309 15:22:37.619301 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-jmwpv" podUID="9f2a2160-888c-4101-8b1c-63498753a2b7" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.12:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:37 crc kubenswrapper[4722]: I0309 15:22:37.773073 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-nmstate/nmstate-handler-67pzj" podUID="ade25a79-1e43-41ed-be91-ce97aa1c4103" containerName="nmstate-handler" probeResult="failure" output="command timed out" Mar 09 15:22:37 crc kubenswrapper[4722]: I0309 15:22:37.776764 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" 
podUID="e4a22f8c-ed38-47cf-8238-baf804f573a1" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out"
Mar 09 15:22:37 crc kubenswrapper[4722]: I0309 15:22:37.777351 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-index-7qbrd" podUID="8ac36d47-4501-4033-aee7-ce9ed8ed7002" containerName="registry-server" probeResult="failure" output="command timed out"
Mar 09 15:22:37 crc kubenswrapper[4722]: I0309 15:22:37.791046 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-operator-index-7qbrd" podUID="8ac36d47-4501-4033-aee7-ce9ed8ed7002" containerName="registry-server" probeResult="failure" output="command timed out"
Mar 09 15:22:37 crc kubenswrapper[4722]: I0309 15:22:37.843451 4722 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-svdsk container/marketplace-operator namespace/openshift-marketplace: Liveness probe status=failure output="Get \"http://10.217.0.68:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 15:22:37 crc kubenswrapper[4722]: I0309 15:22:37.843519 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/marketplace-operator-79b997595-svdsk" podUID="ea964ea5-3fad-4bd0-8ffe-d78f00229fbe" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.68:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:37 crc kubenswrapper[4722]: I0309 15:22:37.843570 4722 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-svdsk container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.68:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 15:22:37 crc kubenswrapper[4722]: I0309 15:22:37.843637 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-svdsk" podUID="ea964ea5-3fad-4bd0-8ffe-d78f00229fbe" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.68:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:38 crc kubenswrapper[4722]: I0309 15:22:38.111892 4722 trace.go:236] Trace[695777094]: "Calculate volume metrics of persistence for pod openstack/rabbitmq-server-1" (09-Mar-2026 15:22:36.313) (total time: 1797ms):
Mar 09 15:22:38 crc kubenswrapper[4722]: Trace[695777094]: [1.797517514s] [1.797517514s] END
Mar 09 15:22:38 crc kubenswrapper[4722]: I0309 15:22:38.116075 4722 trace.go:236] Trace[1117795473]: "Calculate volume metrics of wal for pod openshift-logging/logging-loki-ingester-0" (09-Mar-2026 15:22:36.194) (total time: 1921ms):
Mar 09 15:22:38 crc kubenswrapper[4722]: Trace[1117795473]: [1.921985753s] [1.921985753s] END
Mar 09 15:22:38 crc kubenswrapper[4722]: I0309 15:22:38.393795 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/certified-operators-wkdkx" podUID="69a0d1f2-276f-4062-91d9-8af8048a8d8f" containerName="registry-server" probeResult="failure" output=<
Mar 09 15:22:38 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s
Mar 09 15:22:38 crc kubenswrapper[4722]: >
Mar 09 15:22:38 crc kubenswrapper[4722]: I0309 15:22:38.790486 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/certified-operators-wkdkx" podUID="69a0d1f2-276f-4062-91d9-8af8048a8d8f" containerName="registry-server" probeResult="failure" output=<
Mar 09 15:22:38 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s
Mar 09 15:22:38 crc kubenswrapper[4722]: >
Mar 09 15:22:38 crc kubenswrapper[4722]: I0309 15:22:38.961293 4722 patch_prober.go:28] interesting pod/oauth-openshift-5db757fd5b-t57qc container/oauth-openshift namespace/openshift-authentication: Liveness probe status=failure output="Get \"https://10.217.0.64:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 15:22:38 crc kubenswrapper[4722]: I0309 15:22:38.961359 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication/oauth-openshift-5db757fd5b-t57qc" podUID="ec3da47d-c782-4189-b195-d6b203bd7f7a" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.64:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:38 crc kubenswrapper[4722]: I0309 15:22:38.961666 4722 patch_prober.go:28] interesting pod/oauth-openshift-5db757fd5b-t57qc container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.64:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 15:22:38 crc kubenswrapper[4722]: I0309 15:22:38.961727 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-5db757fd5b-t57qc" podUID="ec3da47d-c782-4189-b195-d6b203bd7f7a" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.64:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:38 crc kubenswrapper[4722]: I0309 15:22:38.962369 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-authentication/oauth-openshift-5db757fd5b-t57qc"
Mar 09 15:22:38 crc kubenswrapper[4722]: I0309 15:22:38.963431 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-5db757fd5b-t57qc"
Mar 09 15:22:38 crc kubenswrapper[4722]: I0309 15:22:38.967874 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="oauth-openshift" containerStatusID={"Type":"cri-o","ID":"1bef1f950532e1f9264d862747456e60811d3e386e3459a18b18cc7dad6b8a21"} pod="openshift-authentication/oauth-openshift-5db757fd5b-t57qc" containerMessage="Container oauth-openshift failed liveness probe, will be restarted"
Mar 09 15:22:39 crc kubenswrapper[4722]: I0309 15:22:39.775901 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-k8s-0" podUID="df07ee71-98db-42a1-9df4-6f8707504f08" containerName="prometheus" probeResult="failure" output="command timed out"
Mar 09 15:22:39 crc kubenswrapper[4722]: I0309 15:22:39.776679 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-k8s-0" podUID="df07ee71-98db-42a1-9df4-6f8707504f08" containerName="prometheus" probeResult="failure" output="command timed out"
Mar 09 15:22:39 crc kubenswrapper[4722]: I0309 15:22:39.858307 4722 patch_prober.go:28] interesting pod/image-registry-66df7c8f76-dch29 container/registry namespace/openshift-image-registry: Readiness probe status=failure output="Get \"https://10.217.0.74:5000/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 15:22:39 crc kubenswrapper[4722]: I0309 15:22:39.858350 4722 patch_prober.go:28] interesting pod/image-registry-66df7c8f76-dch29 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="Get \"https://10.217.0.74:5000/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 15:22:39 crc kubenswrapper[4722]: I0309 15:22:39.858378 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-66df7c8f76-dch29" podUID="f352fba2-f1d8-46bb-b4e6-d68f6c9b4fe7" containerName="registry" probeResult="failure" output="Get \"https://10.217.0.74:5000/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:39 crc kubenswrapper[4722]: I0309 15:22:39.858406 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-66df7c8f76-dch29" podUID="f352fba2-f1d8-46bb-b4e6-d68f6c9b4fe7" containerName="registry" probeResult="failure" output="Get \"https://10.217.0.74:5000/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:39 crc kubenswrapper[4722]: I0309 15:22:39.905695 4722 patch_prober.go:28] interesting pod/controller-manager-8445c785c8-hdmgl container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.65:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 15:22:39 crc kubenswrapper[4722]: I0309 15:22:39.905944 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-8445c785c8-hdmgl" podUID="7139e62b-5e90-4545-a264-aa8138821a55" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.65:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:39 crc kubenswrapper[4722]: I0309 15:22:39.905757 4722 patch_prober.go:28] interesting pod/controller-manager-8445c785c8-hdmgl container/controller-manager namespace/openshift-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.65:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 15:22:39 crc kubenswrapper[4722]: I0309 15:22:39.905996 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-controller-manager/controller-manager-8445c785c8-hdmgl" podUID="7139e62b-5e90-4545-a264-aa8138821a55" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.65:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:39 crc kubenswrapper[4722]: I0309 15:22:39.906045 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-controller-manager/controller-manager-8445c785c8-hdmgl"
Mar 09 15:22:39 crc kubenswrapper[4722]: I0309 15:22:39.907010 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="controller-manager" containerStatusID={"Type":"cri-o","ID":"4e30d1f5da18c63b781ab16a8bddd74f6fef64c6f69f6d7e9a89ed24675fbedc"} pod="openshift-controller-manager/controller-manager-8445c785c8-hdmgl" containerMessage="Container controller-manager failed liveness probe, will be restarted"
Mar 09 15:22:39 crc kubenswrapper[4722]: I0309 15:22:39.907505 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-8445c785c8-hdmgl" podUID="7139e62b-5e90-4545-a264-aa8138821a55" containerName="controller-manager" containerID="cri-o://4e30d1f5da18c63b781ab16a8bddd74f6fef64c6f69f6d7e9a89ed24675fbedc" gracePeriod=30
Mar 09 15:22:39 crc kubenswrapper[4722]: I0309 15:22:39.913550 4722 patch_prober.go:28] interesting pod/route-controller-manager-67579949ff-g69dw container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.66:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 15:22:39 crc kubenswrapper[4722]: I0309 15:22:39.913683 4722 patch_prober.go:28] interesting pod/route-controller-manager-67579949ff-g69dw container/route-controller-manager namespace/openshift-route-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.66:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 15:22:39 crc kubenswrapper[4722]: I0309 15:22:39.913733 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-route-controller-manager/route-controller-manager-67579949ff-g69dw" podUID="fd45c8c5-9cad-404b-b14a-9cbc710c8468" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.66:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:39 crc kubenswrapper[4722]: I0309 15:22:39.913854 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-67579949ff-g69dw" podUID="fd45c8c5-9cad-404b-b14a-9cbc710c8468" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.66:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:39 crc kubenswrapper[4722]: I0309 15:22:39.963905 4722 patch_prober.go:28] interesting pod/oauth-openshift-5db757fd5b-t57qc container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.64:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 15:22:39 crc kubenswrapper[4722]: I0309 15:22:39.964003 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-5db757fd5b-t57qc" podUID="ec3da47d-c782-4189-b195-d6b203bd7f7a" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.64:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:40 crc kubenswrapper[4722]: I0309 15:22:40.096655 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="hostpath-provisioner/csi-hostpathplugin-b5hps" podUID="240d1325-4400-475e-8bc7-9915294148d8" containerName="hostpath-provisioner" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 09 15:22:40 crc kubenswrapper[4722]: I0309 15:22:40.661165 4722 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-jmwpv container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.12:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 15:22:40 crc kubenswrapper[4722]: I0309 15:22:40.661636 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-jmwpv" podUID="9f2a2160-888c-4101-8b1c-63498753a2b7" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.12:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:40 crc kubenswrapper[4722]: I0309 15:22:40.661724 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-config-operator/openshift-config-operator-7777fb866f-jmwpv"
Mar 09 15:22:40 crc kubenswrapper[4722]: I0309 15:22:40.663788 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="openshift-config-operator" containerStatusID={"Type":"cri-o","ID":"85acb30f0b7edaa70a94e32978a937bc5732b62f2e5d01e71c5c4e5bc6878dfb"} pod="openshift-config-operator/openshift-config-operator-7777fb866f-jmwpv" containerMessage="Container openshift-config-operator failed liveness probe, will be restarted"
Mar 09 15:22:40 crc kubenswrapper[4722]: I0309 15:22:40.663835 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-config-operator/openshift-config-operator-7777fb866f-jmwpv" podUID="9f2a2160-888c-4101-8b1c-63498753a2b7" containerName="openshift-config-operator" containerID="cri-o://85acb30f0b7edaa70a94e32978a937bc5732b62f2e5d01e71c5c4e5bc6878dfb" gracePeriod=30
Mar 09 15:22:40 crc kubenswrapper[4722]: I0309 15:22:40.784470 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="a79aaedb-92ba-42fc-8cc7-9ecb007d2ac7" containerName="galera" probeResult="failure" output="command timed out"
Mar 09 15:22:40 crc kubenswrapper[4722]: I0309 15:22:40.791302 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-galera-0" podUID="a79aaedb-92ba-42fc-8cc7-9ecb007d2ac7" containerName="galera" probeResult="failure" output="command timed out"
Mar 09 15:22:40 crc kubenswrapper[4722]: I0309 15:22:40.835371 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0"
Mar 09 15:22:40 crc kubenswrapper[4722]: I0309 15:22:40.835431 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/openstack-galera-0"
Mar 09 15:22:40 crc kubenswrapper[4722]: I0309 15:22:40.840623 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="galera" containerStatusID={"Type":"cri-o","ID":"ed50884a2474119d9434961efa1de2e0e6821dcf7bc3597580905135c8f6ae50"} pod="openstack/openstack-galera-0" containerMessage="Container galera failed liveness probe, will be restarted"
Mar 09 15:22:40 crc kubenswrapper[4722]: I0309 15:22:40.900934 4722 patch_prober.go:28] interesting pod/metrics-server-bcd6d9dd6-5rp7g container/metrics-server namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.83:10250/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 15:22:40 crc kubenswrapper[4722]: I0309 15:22:40.901009 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/metrics-server-bcd6d9dd6-5rp7g" podUID="f8a65e9f-5e0a-47d0-b251-aa4e52e2f581" containerName="metrics-server" probeResult="failure" output="Get \"https://10.217.0.83:10250/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:41 crc kubenswrapper[4722]: I0309 15:22:41.253094 4722 patch_prober.go:28] interesting pod/monitoring-plugin-65599947bd-42bk4 container/monitoring-plugin namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.84:9443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 15:22:41 crc kubenswrapper[4722]: I0309 15:22:41.253172 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/monitoring-plugin-65599947bd-42bk4" podUID="85f5a76e-3679-44b3-8932-f5245c49b481" containerName="monitoring-plugin" probeResult="failure" output="Get \"https://10.217.0.84:9443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:41 crc kubenswrapper[4722]: I0309 15:22:41.698708 4722 patch_prober.go:28] interesting pod/logging-loki-distributor-5d5548c9f5-r6x4b container/loki-distributor namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.52:3101/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 15:22:41 crc kubenswrapper[4722]: I0309 15:22:41.699099 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-distributor-5d5548c9f5-r6x4b" podUID="822bc43f-dfed-4440-be35-1bf58f50456b" containerName="loki-distributor" probeResult="failure" output="Get \"https://10.217.0.52:3101/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:41 crc kubenswrapper[4722]: I0309 15:22:41.773274 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="4159e308-3ccf-45d9-a97b-8133542007a8" containerName="galera" probeResult="failure" output="command timed out"
Mar 09 15:22:41 crc kubenswrapper[4722]: I0309 15:22:41.773330 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-cell1-galera-0" podUID="4159e308-3ccf-45d9-a97b-8133542007a8" containerName="galera" probeResult="failure" output="command timed out"
Mar 09 15:22:41 crc kubenswrapper[4722]: I0309 15:22:41.773479 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="a79aaedb-92ba-42fc-8cc7-9ecb007d2ac7" containerName="galera" probeResult="failure" output="command timed out"
Mar 09 15:22:41 crc kubenswrapper[4722]: I0309 15:22:41.778448 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-lvfgg" podUID="7a62b98d-e9d4-4cbc-bea8-0da13fcc4467" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.112:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:41 crc kubenswrapper[4722]: I0309 15:22:41.906178 4722 patch_prober.go:28] interesting pod/logging-loki-querier-76bf7b6d45-fv4dh container/loki-querier namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.53:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 15:22:41 crc kubenswrapper[4722]: I0309 15:22:41.906285 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-querier-76bf7b6d45-fv4dh" podUID="ba829d53-02a8-4003-a5ee-b9b36d8404e3" containerName="loki-querier" probeResult="failure" output="Get \"https://10.217.0.53:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:42 crc kubenswrapper[4722]: I0309 15:22:42.087332 4722 patch_prober.go:28] interesting pod/logging-loki-query-frontend-6d6859c548-5b72h container/loki-query-frontend namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.54:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 15:22:42 crc kubenswrapper[4722]: I0309 15:22:42.087458 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-query-frontend-6d6859c548-5b72h" podUID="5ccc948e-2185-44fd-90c4-3ae3228f6224" containerName="loki-query-frontend" probeResult="failure" output="Get \"https://10.217.0.54:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:42 crc kubenswrapper[4722]: I0309 15:22:42.246379 4722 patch_prober.go:28] interesting pod/logging-loki-gateway-6c5ff86c56-n5jdr container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.55:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 15:22:42 crc kubenswrapper[4722]: I0309 15:22:42.246399 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9ctn5qw" podUID="5e25c11b-f9c6-4542-9c0c-394ea6bc2c17" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.120:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:42 crc kubenswrapper[4722]: I0309 15:22:42.246433 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-6c5ff86c56-n5jdr" podUID="8becd072-3095-4717-a83d-e56cf0d0f816" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.55:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:42 crc kubenswrapper[4722]: I0309 15:22:42.246461 4722 patch_prober.go:28] interesting pod/logging-loki-gateway-6c5ff86c56-dvps5 container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 15:22:42 crc kubenswrapper[4722]: I0309 15:22:42.246502 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-6c5ff86c56-dvps5" podUID="e265fe14-7154-4fbb-a7c3-33557166f71d" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:42 crc kubenswrapper[4722]: I0309 15:22:42.474430 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-controller-manager-55cd86c56-dm2dr" podUID="98c22319-d5f8-4a0b-8a30-89b9d832f354" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.127:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:42 crc kubenswrapper[4722]: I0309 15:22:42.558194 4722 patch_prober.go:28] interesting pod/observability-operator-59bdc8b94-lc5zn container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.18:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 15:22:42 crc kubenswrapper[4722]: I0309 15:22:42.558273 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-59bdc8b94-lc5zn" podUID="14655c3d-02fe-4215-b566-0c4008fd34a0" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.18:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:42 crc kubenswrapper[4722]: I0309 15:22:42.599438 4722 patch_prober.go:28] interesting pod/perses-operator-5bf474d74f-2rhld container/perses-operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.11:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 15:22:42 crc kubenswrapper[4722]: I0309 15:22:42.599501 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/perses-operator-5bf474d74f-2rhld" podUID="c123a767-e0e0-4432-b34f-cbe0b581d938" containerName="perses-operator" probeResult="failure" output="Get \"http://10.217.0.11:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:42 crc kubenswrapper[4722]: I0309 15:22:42.599444 4722 patch_prober.go:28] interesting pod/observability-operator-59bdc8b94-lc5zn container/operator namespace/openshift-operators: Liveness probe status=failure output="Get \"http://10.217.0.18:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 15:22:42 crc kubenswrapper[4722]: I0309 15:22:42.599553 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operators/observability-operator-59bdc8b94-lc5zn" podUID="14655c3d-02fe-4215-b566-0c4008fd34a0" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.18:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:42 crc kubenswrapper[4722]: I0309 15:22:42.608361 4722 patch_prober.go:28] interesting pod/thanos-querier-5f6c868b98-s8fg2 container/kube-rbac-proxy-web namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.81:9091/-/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 15:22:42 crc kubenswrapper[4722]: I0309 15:22:42.608694 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/thanos-querier-5f6c868b98-s8fg2" podUID="1eb00f12-c24e-46dc-8346-c096826564f5" containerName="kube-rbac-proxy-web" probeResult="failure" output="Get \"https://10.217.0.81:9091/-/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:42 crc kubenswrapper[4722]: I0309 15:22:42.708413 4722 patch_prober.go:28] interesting pod/console-operator-58897d9998-chrnr container/console-operator namespace/openshift-console-operator: Liveness probe status=failure output="Get \"https://10.217.0.14:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 15:22:42 crc kubenswrapper[4722]: I0309 15:22:42.708455 4722 patch_prober.go:28] interesting pod/console-operator-58897d9998-chrnr container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.14:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 15:22:42 crc kubenswrapper[4722]: I0309 15:22:42.708480 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-chrnr" podUID="b919959a-1da1-4c74-9330-5bb8c33f5c26" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.14:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:42 crc kubenswrapper[4722]: I0309 15:22:42.708480 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console-operator/console-operator-58897d9998-chrnr" podUID="b919959a-1da1-4c74-9330-5bb8c33f5c26" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.14:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:42 crc kubenswrapper[4722]: I0309 15:22:42.708413 4722 patch_prober.go:28] interesting pod/downloads-7954f5f757-knrzp container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.26:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 15:22:42 crc kubenswrapper[4722]: I0309 15:22:42.708542 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-knrzp" podUID="8f915ef9-5d9a-43ee-a333-def8766e083d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.26:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:42 crc kubenswrapper[4722]: I0309 15:22:42.708526 4722 patch_prober.go:28] interesting pod/downloads-7954f5f757-knrzp container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 15:22:42 crc kubenswrapper[4722]: I0309 15:22:42.708581 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-knrzp" podUID="8f915ef9-5d9a-43ee-a333-def8766e083d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.26:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:42 crc kubenswrapper[4722]: I0309 15:22:42.715136 4722 patch_prober.go:28] interesting pod/authentication-operator-69f744f599-8tkbn container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 15:22:42 crc kubenswrapper[4722]: I0309 15:22:42.715222 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-69f744f599-8tkbn" podUID="427a4c04-99cd-4f53-ae98-20c1755d7658" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:42 crc kubenswrapper[4722]: I0309 15:22:42.758228 4722 patch_prober.go:28] interesting pod/openshift-kube-scheduler-crc container/kube-scheduler namespace/openshift-kube-scheduler: Readiness probe status=failure output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 15:22:42 crc kubenswrapper[4722]: I0309 15:22:42.758544 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:42 crc kubenswrapper[4722]: I0309 15:22:42.774058 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-nmstate/nmstate-handler-67pzj" podUID="ade25a79-1e43-41ed-be91-ce97aa1c4103" containerName="nmstate-handler" probeResult="failure" output="command timed out"
Mar 09 15:22:42 crc kubenswrapper[4722]: I0309 15:22:42.774156 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-67pzj"
Mar 09 15:22:42 crc kubenswrapper[4722]: I0309 15:22:42.868690 4722 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.57:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 15:22:42 crc kubenswrapper[4722]: I0309 15:22:42.868774 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="6a23db8b-8a30-47b8-bf39-6f193899fcee" containerName="loki-ingester" probeResult="failure" output="Get \"https://10.217.0.57:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:43 crc kubenswrapper[4722]: I0309 15:22:43.146256 4722 patch_prober.go:28] interesting pod/logging-loki-index-gateway-0 container/loki-index-gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.60:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 15:22:43 crc kubenswrapper[4722]: I0309 15:22:43.146314 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-index-gateway-0" podUID="bce49c11-10b4-4c30-a1a4-16cf32cb42fd" containerName="loki-index-gateway" probeResult="failure" output="Get \"https://10.217.0.60:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:43 crc kubenswrapper[4722]: I0309 15:22:43.337303 4722 patch_prober.go:28] interesting pod/logging-loki-compactor-0 container/loki-compactor namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.58:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 15:22:43 crc kubenswrapper[4722]: I0309 15:22:43.337383 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-compactor-0" podUID="75ed49e3-dc17-45c0-96ec-1db69670395b" containerName="loki-compactor" probeResult="failure" output="Get \"https://10.217.0.58:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:43 crc kubenswrapper[4722]: I0309 15:22:43.653368 4722 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Readiness probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 15:22:43 crc kubenswrapper[4722]: I0309 15:22:43.653436 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:43 crc kubenswrapper[4722]: I0309 15:22:43.777888 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="e4a22f8c-ed38-47cf-8238-baf804f573a1" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out"
Mar 09 15:22:43 crc kubenswrapper[4722]: I0309 15:22:43.778002 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/ceilometer-0"
Mar 09 15:22:43 crc kubenswrapper[4722]: I0309 15:22:43.792951 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="ceilometer-central-agent" containerStatusID={"Type":"cri-o","ID":"cd582a254af464cea392ee9b3f2a8bbe909916d04f0053279129bca6b32f8102"} pod="openstack/ceilometer-0" containerMessage="Container ceilometer-central-agent failed liveness probe, will be restarted"
Mar 09 15:22:43 crc kubenswrapper[4722]: I0309 15:22:43.793080 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e4a22f8c-ed38-47cf-8238-baf804f573a1" containerName="ceilometer-central-agent" containerID="cri-o://cd582a254af464cea392ee9b3f2a8bbe909916d04f0053279129bca6b32f8102" gracePeriod=30
Mar 09 15:22:43 crc kubenswrapper[4722]: I0309 15:22:43.802377 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-operator-controller-init-5b979cff56-vwbnz" podUID="bdac45ca-36d4-41c5-b5e5-332d70558171" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.104:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:43 crc kubenswrapper[4722]: I0309 15:22:43.802590 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-controller-init-5b979cff56-vwbnz" podUID="bdac45ca-36d4-41c5-b5e5-332d70558171" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.104:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:44 crc kubenswrapper[4722]: I0309 15:22:44.167426 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="cert-manager/cert-manager-webhook-687f57d79b-wb6rj" podUID="ac6d6e52-6a89-4f96-8894-8ed2c71cdcbc" containerName="cert-manager-webhook" probeResult="failure" output="Get \"http://10.217.0.45:6080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:44 crc kubenswrapper[4722]: I0309 15:22:44.329426 4722 patch_prober.go:28] interesting pod/router-default-5444994796-dp8wn container/router namespace/openshift-ingress: Readiness probe status=failure output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 15:22:44 crc kubenswrapper[4722]: I0309 15:22:44.329507 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ingress/router-default-5444994796-dp8wn" podUID="317444ee-0620-47d2-869e-77578a367a87" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:44 crc kubenswrapper[4722]: I0309 15:22:44.329432 4722 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-dgxnd container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.42:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 15:22:44 crc kubenswrapper[4722]: I0309 15:22:44.329571 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dgxnd" podUID="c219beb3-4ba5-43bd-b2ec-3855d19c2b57" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.42:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:44 crc kubenswrapper[4722]: I0309 15:22:44.329430 4722 patch_prober.go:28] interesting pod/router-default-5444994796-dp8wn container/router namespace/openshift-ingress: Liveness probe status=failure output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 15:22:44 crc kubenswrapper[4722]: I0309 15:22:44.329619 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-ingress/router-default-5444994796-dp8wn" podUID="317444ee-0620-47d2-869e-77578a367a87" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:44 crc kubenswrapper[4722]: I0309 15:22:44.329455 4722 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-dgxnd container/catalog-operator namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.42:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 15:22:44 crc kubenswrapper[4722]: I0309 15:22:44.329665 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dgxnd" podUID="c219beb3-4ba5-43bd-b2ec-3855d19c2b57" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.42:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:44 crc kubenswrapper[4722]: I0309 15:22:44.464463 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-controller-manager-5689854475-89q94" podUID="3ea04cb5-4d36-42a9-bb83-c6f943619d16" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.97:8080/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:44 crc kubenswrapper[4722]: I0309 15:22:44.496822 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="0e07fbab-4a47-4e59-aa72-f0a4521296af" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.173:9090/-/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:44 crc kubenswrapper[4722]: I0309 15:22:44.496891 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/prometheus-metric-storage-0" podUID="0e07fbab-4a47-4e59-aa72-f0a4521296af" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.173:9090/-/healthy\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:44 crc kubenswrapper[4722]: I0309 15:22:44.645498 4722 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-lnkwt container/olm-operator namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.19:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 15:22:44 crc kubenswrapper[4722]: I0309 15:22:44.645498 4722 patch_prober.go:28] interesting pod/package-server-manager-789f6589d5-mrtb2 container/package-server-manager namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"http://10.217.0.23:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 15:22:44 crc kubenswrapper[4722]: I0309 15:22:44.645604 4722 patch_prober.go:28] interesting pod/package-server-manager-789f6589d5-mrtb2 container/package-server-manager namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"http://10.217.0.23:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 15:22:44 crc kubenswrapper[4722]: I0309 15:22:44.645624 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mrtb2" podUID="228d39d8-b0bc-4491-be90-e473c090f412" containerName="package-server-manager" probeResult="failure" output="Get \"http://10.217.0.23:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:44 crc kubenswrapper[4722]: I0309 15:22:44.645643 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mrtb2" podUID="228d39d8-b0bc-4491-be90-e473c090f412" containerName="package-server-manager" probeResult="failure" output="Get \"http://10.217.0.23:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:44 crc kubenswrapper[4722]: I0309 15:22:44.645565 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lnkwt" podUID="f8055a95-6b09-4e32-88b8-82ad36ca5029" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.19:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:44 crc kubenswrapper[4722]: I0309 15:22:44.645573 4722 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-lnkwt container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.19:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 15:22:44 crc kubenswrapper[4722]: I0309 15:22:44.645740 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lnkwt" podUID="f8055a95-6b09-4e32-88b8-82ad36ca5029" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.19:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:44 crc kubenswrapper[4722]: I0309 15:22:44.658142 4722 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-fp2th container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.44:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 15:22:44 crc kubenswrapper[4722]: I0309 15:22:44.658227 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fp2th" podUID="996087ed-6480-4650-8632-c991e5d16c99" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.44:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:44 crc kubenswrapper[4722]: I0309 15:22:44.658485 4722 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-fp2th container/packageserver namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.44:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 15:22:44 crc kubenswrapper[4722]: I0309 15:22:44.658551 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fp2th" podUID="996087ed-6480-4650-8632-c991e5d16c99" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.44:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:44 crc kubenswrapper[4722]: I0309 15:22:44.774454 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-nmstate/nmstate-handler-67pzj" podUID="ade25a79-1e43-41ed-be91-ce97aa1c4103" containerName="nmstate-handler" probeResult="failure" output="command timed out"
Mar 09 15:22:44 crc kubenswrapper[4722]: I0309 15:22:44.775442 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-k8s-0" podUID="df07ee71-98db-42a1-9df4-6f8707504f08" containerName="prometheus" probeResult="failure" output="command timed out"
Mar 09 15:22:44 crc kubenswrapper[4722]: I0309 15:22:44.775455 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-k8s-0" podUID="df07ee71-98db-42a1-9df4-6f8707504f08" containerName="prometheus" probeResult="failure" output="command timed out"
Mar 09 15:22:44 crc kubenswrapper[4722]: I0309 15:22:44.775637 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-k8s-0"
Mar 09 15:22:44 crc kubenswrapper[4722]: I0309 15:22:44.899016 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-webhook-server-798745ff96-864pz" podUID="f67efad4-1b85-4f64-9e98-55eb2da89fb6" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.98:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:44 crc kubenswrapper[4722]: I0309 15:22:44.899078 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/metallb-operator-webhook-server-798745ff96-864pz" podUID="f67efad4-1b85-4f64-9e98-55eb2da89fb6" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.98:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:45 crc kubenswrapper[4722]: I0309 15:22:45.436583 4722 patch_prober.go:28] interesting pod/console-6dc4c5dd4b-rk4jt container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.142:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 15:22:45 crc kubenswrapper[4722]: I0309 15:22:45.436875 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-6dc4c5dd4b-rk4jt" podUID="6dc5b476-5a42-4c98-9a95-0e3b29f2f771" containerName="console" probeResult="failure" output="Get \"https://10.217.0.142:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:45 crc kubenswrapper[4722]: I0309 15:22:45.436946 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-6dc4c5dd4b-rk4jt"
Mar 09 15:22:45 crc kubenswrapper[4722]: I0309 15:22:45.632347 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-2nlfl" podUID="8557439a-0367-4823-af83-28955a17cc08" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.99:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:45 crc kubenswrapper[4722]: I0309 15:22:45.632442 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-2nlfl"
Mar 09 15:22:45 crc kubenswrapper[4722]: I0309 15:22:45.632795 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-2nlfl" podUID="8557439a-0367-4823-af83-28955a17cc08" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.99:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:45 crc kubenswrapper[4722]: I0309 15:22:45.766430 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-vppnv" podUID="8f839106-1673-4589-9391-0cd7748e658c" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.106:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:45 crc kubenswrapper[4722]: I0309 15:22:45.808454 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/controller-86ddb6bd46-6w5ww" podUID="0b264ba4-1398-4d3e-bb2b-83cd34a0ccf6" containerName="controller" probeResult="failure" output="Get \"http://10.217.0.100:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:45 crc kubenswrapper[4722]: I0309 15:22:45.808555 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-rsd9l" podUID="edd71e1d-6ff0-4918-9cd8-a342efba2df5" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.107:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:45 crc kubenswrapper[4722]: I0309 15:22:45.849645 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-bmpgd" podUID="a1a5e35a-83f6-4886-86db-55738f51f7e8" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.108:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:45 crc kubenswrapper[4722]: I0309 15:22:45.849749 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/controller-86ddb6bd46-6w5ww" podUID="0b264ba4-1398-4d3e-bb2b-83cd34a0ccf6" containerName="controller" probeResult="failure" output="Get \"http://10.217.0.100:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:45 crc kubenswrapper[4722]: I0309 15:22:45.931421 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-ct7x8" podUID="f21c35ef-c8ea-4331-a747-44a62c6f2e74" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.109:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:45 crc kubenswrapper[4722]: I0309 15:22:45.931410 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-zjf7b" podUID="22043c71-5292-422c-99e5-c88ea1aef638" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.110:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:46 crc kubenswrapper[4722]: I0309 15:22:46.051423 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-6npqv" podUID="663f1719-30f7-4588-a183-4a59787e8d8d" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.111:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:46 crc kubenswrapper[4722]: I0309 15:22:46.063533 4722 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-8xdjl container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.75:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 15:22:46 crc kubenswrapper[4722]: I0309 15:22:46.063584 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-8xdjl" podUID="8b35adf7-a305-4f94-a5c9-02fbc3fca46f" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.75:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:46 crc kubenswrapper[4722]: I0309 15:22:46.063594 4722 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-8xdjl container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.75:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 15:22:46 crc kubenswrapper[4722]: I0309 15:22:46.063623 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-8xdjl"
Mar 09 15:22:46 crc kubenswrapper[4722]: I0309 15:22:46.063655 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-8xdjl" podUID="8b35adf7-a305-4f94-a5c9-02fbc3fca46f" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.75:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:46 crc kubenswrapper[4722]: I0309 15:22:46.063745 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-8xdjl"
Mar 09 15:22:46 crc kubenswrapper[4722]: I0309 15:22:46.065105 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="prometheus-operator-admission-webhook" containerStatusID={"Type":"cri-o","ID":"24504e3f082f14957d38d16a3a1fe7333e36577d396d752821c030b29c7c33f8"} pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-8xdjl" containerMessage="Container prometheus-operator-admission-webhook failed liveness probe, will be restarted"
Mar 09 15:22:46 crc kubenswrapper[4722]: I0309 15:22:46.065149 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-8xdjl" podUID="8b35adf7-a305-4f94-a5c9-02fbc3fca46f" containerName="prometheus-operator-admission-webhook" containerID="cri-o://24504e3f082f14957d38d16a3a1fe7333e36577d396d752821c030b29c7c33f8" gracePeriod=30
Mar 09 15:22:46 crc kubenswrapper[4722]: I0309 15:22:46.156484 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/kube-state-metrics-0" podUID="652c419b-2a86-4c6f-ac7a-c2d7818ef55f" containerName="kube-state-metrics" probeResult="failure" output="Get \"https://10.217.1.16:8081/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:46 crc kubenswrapper[4722]: I0309 15:22:46.156755 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/kube-state-metrics-0" podUID="652c419b-2a86-4c6f-ac7a-c2d7818ef55f" containerName="kube-state-metrics" probeResult="failure" output="Get \"https://10.217.1.16:8080/livez\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:46 crc kubenswrapper[4722]: I0309 15:22:46.325426 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-hnzfw" podUID="ec9f1f5e-26f5-4683-bf41-c85981da9d18" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.113:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:46 crc kubenswrapper[4722]: I0309 15:22:46.325598 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-6vn96" podUID="29ed2858-4fd0-4817-8ed3-b3515ac035d7" containerName="controller" probeResult="failure" output="Get \"http://127.0.0.1:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:46 crc kubenswrapper[4722]: I0309 15:22:46.366526 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-2hxzr" podUID="4de6db14-6f3e-4c4e-a61d-39c6648209dd" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.114:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:46 crc kubenswrapper[4722]: I0309 15:22:46.366652 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-6vn96" podUID="29ed2858-4fd0-4817-8ed3-b3515ac035d7" containerName="controller" probeResult="failure" output="Get \"http://127.0.0.1:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:46 crc kubenswrapper[4722]: I0309 15:22:46.409462 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-6vn96" podUID="29ed2858-4fd0-4817-8ed3-b3515ac035d7" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:46 crc kubenswrapper[4722]: I0309 15:22:46.409521 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/manila-operator-controller-manager-67d996989d-28855" podUID="dae536b6-7a22-435e-b307-a8ab6b54779d" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.115:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:46 crc kubenswrapper[4722]: I0309 15:22:46.409552 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="metallb-system/frr-k8s-6vn96"
Mar 09 15:22:46 crc kubenswrapper[4722]: I0309 15:22:46.411978 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="frr" containerStatusID={"Type":"cri-o","ID":"228312f95f7cd63c660e199315bd7b89094078a743e9a9f8c8185bc6054ecca4"} pod="metallb-system/frr-k8s-6vn96" containerMessage="Container frr failed liveness probe, will be restarted"
Mar 09 15:22:46 crc kubenswrapper[4722]: I0309 15:22:46.412087 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="metallb-system/frr-k8s-6vn96" podUID="29ed2858-4fd0-4817-8ed3-b3515ac035d7" containerName="frr" containerID="cri-o://228312f95f7cd63c660e199315bd7b89094078a743e9a9f8c8185bc6054ecca4" gracePeriod=2
Mar 09 15:22:46 crc kubenswrapper[4722]: I0309 15:22:46.495466 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/neutron-operator-controller-manager-54688575f-5fcw8" podUID="febfeb1a-d5a3-46b8-bc4f-fe3266905e8c" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.117:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:46 crc kubenswrapper[4722]: I0309 15:22:46.578445 4722 patch_prober.go:28] interesting pod/console-6dc4c5dd4b-rk4jt container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.142:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 15:22:46 crc kubenswrapper[4722]: I0309 15:22:46.578868 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-6dc4c5dd4b-rk4jt" podUID="6dc5b476-5a42-4c98-9a95-0e3b29f2f771" containerName="console" probeResult="failure" output="Get \"https://10.217.0.142:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:46 crc kubenswrapper[4722]: I0309 15:22:46.578939 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-n5zc7" podUID="717ffc3a-7a6d-4a7c-837f-d1ed92489b68" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.119:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:46 crc kubenswrapper[4722]: I0309 15:22:46.578974 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-l8bds" podUID="74cb981b-ce89-479e-8573-fdda25190637" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.116:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:46 crc kubenswrapper[4722]: I0309 15:22:46.579727 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-jwgfg" podUID="a9ff56ca-00a6-484f-a477-0dca4f3a0f5c" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.118:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:46 crc kubenswrapper[4722]: I0309 15:22:46.623953 4722 patch_prober.go:28] interesting pod/nmstate-webhook-786f45cff4-jdp69 container/nmstate-webhook namespace/openshift-nmstate: Readiness probe status=failure output="Get \"https://10.217.0.89:9443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 15:22:46 crc kubenswrapper[4722]: I0309 15:22:46.624009 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-nmstate/nmstate-webhook-786f45cff4-jdp69" podUID="c0af161b-a8d5-4a36-b1c2-0a4d43820c73" containerName="nmstate-webhook" probeResult="failure" output="Get \"https://10.217.0.89:9443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:46 crc kubenswrapper[4722]: I0309 15:22:46.758412 4722 patch_prober.go:28] interesting pod/loki-operator-controller-manager-7d6d6698bd-4r85k container/manager namespace/openshift-operators-redhat: Readiness probe status=failure output="Get \"http://10.217.0.50:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 09 15:22:46 crc kubenswrapper[4722]: I0309 15:22:46.758474 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators-redhat/loki-operator-controller-manager-7d6d6698bd-4r85k" podUID="497a07fc-9649-4620-9432-855aa3fdc327" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.50:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:46 crc kubenswrapper[4722]: I0309 15:22:46.758570 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators-redhat/loki-operator-controller-manager-7d6d6698bd-4r85k"
Mar 09 15:22:46 crc kubenswrapper[4722]: I0309 15:22:46.786774 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/community-operators-rvtqn" podUID="3411289f-3e7c-4e43-b545-5e612822b18e" containerName="registry-server" probeResult="failure" output="command timed out"
Mar 09 15:22:46 crc kubenswrapper[4722]: I0309 15:22:46.793270 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/community-operators-rvtqn" podUID="3411289f-3e7c-4e43-b545-5e612822b18e" containerName="registry-server" probeResult="failure" output="command timed out"
Mar 09 15:22:46 crc kubenswrapper[4722]: I0309 15:22:46.799367 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-2nlfl" podUID="8557439a-0367-4823-af83-28955a17cc08" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.99:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:46 crc kubenswrapper[4722]: I0309 15:22:46.799453 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-wrzrq" podUID="e7b4c7c9-7c4f-4a13-8367-759f5f5ce368" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.123:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:46 crc kubenswrapper[4722]: I0309 15:22:46.840361 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-hgkzs" podUID="df8b52ff-f61e-4aca-a408-240590699ae6" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.121:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:46 crc kubenswrapper[4722]: I0309 15:22:46.840398 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-56qz9" podUID="a9df5689-5d83-4206-be2b-cf6877d70e23" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.122:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:46 crc kubenswrapper[4722]: I0309 15:22:46.881607 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/telemetry-operator-controller-manager-66c8b7dfbb-m7fv2" podUID="5bf14ad6-64cf-48f7-99e6-fabac12849e2" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.124:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:46 crc kubenswrapper[4722]: I0309 15:22:46.992418 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-pg8qn" podUID="ef36bc5a-2962-4c1e-a5fd-98f61d525d5d" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.125:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 09 15:22:47 crc kubenswrapper[4722]: I0309 15:22:47.033395 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-pkbqb" podUID="0eac7341-5bab-4c97-a730-b7eeb0a75899" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.126:8081/readyz\": context
deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:47 crc kubenswrapper[4722]: I0309 15:22:47.064810 4722 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-8xdjl container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.75:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:47 crc kubenswrapper[4722]: I0309 15:22:47.065079 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-8xdjl" podUID="8b35adf7-a305-4f94-a5c9-02fbc3fca46f" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.75:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:47 crc kubenswrapper[4722]: I0309 15:22:47.208172 4722 patch_prober.go:28] interesting pod/logging-loki-gateway-6c5ff86c56-n5jdr container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.55:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:47 crc kubenswrapper[4722]: I0309 15:22:47.208263 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-6c5ff86c56-n5jdr" podUID="8becd072-3095-4717-a83d-e56cf0d0f816" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.55:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:47 crc kubenswrapper[4722]: I0309 15:22:47.212366 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/speaker-kwml8" podUID="93b8f0be-bf52-4559-8cf6-338026cb6610" containerName="speaker" probeResult="failure" output="Get \"http://localhost:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:47 crc kubenswrapper[4722]: I0309 15:22:47.212421 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/speaker-kwml8" podUID="93b8f0be-bf52-4559-8cf6-338026cb6610" containerName="speaker" probeResult="failure" output="Get \"http://localhost:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:47 crc kubenswrapper[4722]: I0309 15:22:47.212444 4722 patch_prober.go:28] interesting pod/logging-loki-gateway-6c5ff86c56-n5jdr container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.55:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:47 crc kubenswrapper[4722]: I0309 15:22:47.212629 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-6c5ff86c56-n5jdr" podUID="8becd072-3095-4717-a83d-e56cf0d0f816" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.55:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:47 crc kubenswrapper[4722]: I0309 15:22:47.222298 4722 patch_prober.go:28] interesting pod/logging-loki-gateway-6c5ff86c56-dvps5 container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8081/ready\": net/http: request canceled (Client.Timeout 
exceeded while awaiting headers)" start-of-body= Mar 09 15:22:47 crc kubenswrapper[4722]: I0309 15:22:47.222361 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-6c5ff86c56-dvps5" podUID="e265fe14-7154-4fbb-a7c3-33557166f71d" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.56:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:47 crc kubenswrapper[4722]: I0309 15:22:47.222318 4722 patch_prober.go:28] interesting pod/logging-loki-gateway-6c5ff86c56-dvps5 container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:47 crc kubenswrapper[4722]: I0309 15:22:47.222429 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-6c5ff86c56-dvps5" podUID="e265fe14-7154-4fbb-a7c3-33557166f71d" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:47 crc kubenswrapper[4722]: I0309 15:22:47.548220 4722 patch_prober.go:28] interesting pod/openshift-kube-scheduler-crc container/kube-scheduler namespace/openshift-kube-scheduler: Liveness probe status=failure output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:47 crc kubenswrapper[4722]: I0309 15:22:47.548317 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:47 crc kubenswrapper[4722]: I0309 15:22:47.609525 4722 patch_prober.go:28] interesting pod/thanos-querier-5f6c868b98-s8fg2 container/kube-rbac-proxy-web namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.81:9091/-/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:47 crc kubenswrapper[4722]: I0309 15:22:47.609631 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/thanos-querier-5f6c868b98-s8fg2" podUID="1eb00f12-c24e-46dc-8346-c096826564f5" containerName="kube-rbac-proxy-web" probeResult="failure" output="Get \"https://10.217.0.81:9091/-/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:47 crc kubenswrapper[4722]: I0309 15:22:47.773958 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-nmstate/nmstate-handler-67pzj" podUID="ade25a79-1e43-41ed-be91-ce97aa1c4103" containerName="nmstate-handler" probeResult="failure" output="command timed out" Mar 09 15:22:47 crc kubenswrapper[4722]: I0309 15:22:47.777057 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-engine-775d545446-nrcd2" podUID="b78f1ea8-1fc2-4469-966c-4568370bfae9" containerName="heat-engine" probeResult="failure" output="command timed out" Mar 09 15:22:47 crc kubenswrapper[4722]: I0309 15:22:47.778173 4722 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openstack-operators/openstack-operator-index-7qbrd" podUID="8ac36d47-4501-4033-aee7-ce9ed8ed7002" containerName="registry-server" probeResult="failure" output="command timed out" Mar 09 15:22:47 crc kubenswrapper[4722]: I0309 15:22:47.778624 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/heat-engine-775d545446-nrcd2" podUID="b78f1ea8-1fc2-4469-966c-4568370bfae9" containerName="heat-engine" probeResult="failure" output="command timed out" Mar 09 15:22:47 crc kubenswrapper[4722]: I0309 15:22:47.781045 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-operator-index-7qbrd" podUID="8ac36d47-4501-4033-aee7-ce9ed8ed7002" containerName="registry-server" probeResult="failure" output="command timed out" Mar 09 15:22:47 crc kubenswrapper[4722]: I0309 15:22:47.799427 4722 patch_prober.go:28] interesting pod/loki-operator-controller-manager-7d6d6698bd-4r85k container/manager namespace/openshift-operators-redhat: Readiness probe status=failure output="Get \"http://10.217.0.50:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:47 crc kubenswrapper[4722]: I0309 15:22:47.799513 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators-redhat/loki-operator-controller-manager-7d6d6698bd-4r85k" podUID="497a07fc-9649-4620-9432-855aa3fdc327" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.50:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:47 crc kubenswrapper[4722]: I0309 15:22:47.887987 4722 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-2mf4l container/oauth-apiserver namespace/openshift-oauth-apiserver: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/readyz\": context deadline exceeded" start-of-body= Mar 09 15:22:47 crc kubenswrapper[4722]: I0309 15:22:47.888069 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2mf4l" podUID="062db6f1-77ab-4eca-be53-6480160aff81" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.10:8443/readyz\": context deadline exceeded" Mar 09 15:22:47 crc kubenswrapper[4722]: I0309 15:22:47.906251 4722 trace.go:236] Trace[8128063]: "Calculate volume metrics of persistence for pod openstack/rabbitmq-server-0" (09-Mar-2026 15:22:39.793) (total time: 8097ms): Mar 09 15:22:47 crc kubenswrapper[4722]: Trace[8128063]: [8.097389857s] [8.097389857s] END Mar 09 15:22:48 crc kubenswrapper[4722]: I0309 15:22:48.023123 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6vn96" event={"ID":"29ed2858-4fd0-4817-8ed3-b3515ac035d7","Type":"ContainerDied","Data":"228312f95f7cd63c660e199315bd7b89094078a743e9a9f8c8185bc6054ecca4"} Mar 09 15:22:48 crc kubenswrapper[4722]: I0309 15:22:48.026682 4722 generic.go:334] "Generic (PLEG): container finished" podID="29ed2858-4fd0-4817-8ed3-b3515ac035d7" containerID="228312f95f7cd63c660e199315bd7b89094078a743e9a9f8c8185bc6054ecca4" exitCode=143 Mar 09 15:22:48 crc kubenswrapper[4722]: I0309 15:22:48.176505 4722 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:6443/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:48 crc kubenswrapper[4722]: I0309 
15:22:48.176572 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:48 crc kubenswrapper[4722]: I0309 15:22:48.177050 4722 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:6443/livez?exclude=etcd\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:48 crc kubenswrapper[4722]: I0309 15:22:48.177109 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/livez?exclude=etcd\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:48 crc kubenswrapper[4722]: I0309 15:22:48.774455 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-k8s-0" podUID="df07ee71-98db-42a1-9df4-6f8707504f08" containerName="prometheus" probeResult="failure" output="command timed out" Mar 09 15:22:48 crc kubenswrapper[4722]: I0309 15:22:48.774461 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-fmhcx" podUID="731c31a2-ded2-452f-b330-0cf118ab1e84" containerName="ovnkube-controller" probeResult="failure" output="command timed out" Mar 09 15:22:48 crc kubenswrapper[4722]: I0309 15:22:48.775391 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-k8s-0" podUID="df07ee71-98db-42a1-9df4-6f8707504f08" containerName="prometheus" probeResult="failure" output="command timed out" Mar 09 15:22:48 crc kubenswrapper[4722]: I0309 15:22:48.960797 4722 patch_prober.go:28] interesting pod/oauth-openshift-5db757fd5b-t57qc container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.64:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:48 crc kubenswrapper[4722]: I0309 15:22:48.960859 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-5db757fd5b-t57qc" podUID="ec3da47d-c782-4189-b195-d6b203bd7f7a" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.64:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:49 crc kubenswrapper[4722]: I0309 15:22:49.497713 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="0e07fbab-4a47-4e59-aa72-f0a4521296af" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.173:9090/-/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:49 crc kubenswrapper[4722]: I0309 15:22:49.497711 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/prometheus-metric-storage-0" podUID="0e07fbab-4a47-4e59-aa72-f0a4521296af" containerName="prometheus" probeResult="failure" output="Get 
\"https://10.217.0.173:9090/-/healthy\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:49 crc kubenswrapper[4722]: I0309 15:22:49.775529 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/redhat-operators-v57f2" podUID="4e27c5a4-8cba-4119-8006-f9841d6121dc" containerName="registry-server" probeResult="failure" output="command timed out" Mar 09 15:22:49 crc kubenswrapper[4722]: I0309 15:22:49.777513 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/redhat-operators-v57f2" podUID="4e27c5a4-8cba-4119-8006-f9841d6121dc" containerName="registry-server" probeResult="failure" output="command timed out" Mar 09 15:22:49 crc kubenswrapper[4722]: I0309 15:22:49.778333 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/redhat-marketplace-c2mmw" podUID="690e5ab0-3719-40ac-aba6-9278480ecb44" containerName="registry-server" probeResult="failure" output="command timed out" Mar 09 15:22:49 crc kubenswrapper[4722]: I0309 15:22:49.779419 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-c2mmw" podUID="690e5ab0-3719-40ac-aba6-9278480ecb44" containerName="registry-server" probeResult="failure" output="command timed out" Mar 09 15:22:49 crc kubenswrapper[4722]: I0309 15:22:49.859518 4722 patch_prober.go:28] interesting pod/image-registry-66df7c8f76-dch29 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="Get \"https://10.217.0.74:5000/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:49 crc kubenswrapper[4722]: I0309 15:22:49.859600 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-66df7c8f76-dch29" podUID="f352fba2-f1d8-46bb-b4e6-d68f6c9b4fe7" containerName="registry" probeResult="failure" output="Get \"https://10.217.0.74:5000/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:49 crc kubenswrapper[4722]: I0309 15:22:49.859791 4722 patch_prober.go:28] interesting pod/image-registry-66df7c8f76-dch29 container/registry namespace/openshift-image-registry: Readiness probe status=failure output="Get \"https://10.217.0.74:5000/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:49 crc kubenswrapper[4722]: I0309 15:22:49.859818 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-66df7c8f76-dch29" podUID="f352fba2-f1d8-46bb-b4e6-d68f6c9b4fe7" containerName="registry" probeResult="failure" output="Get \"https://10.217.0.74:5000/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:49 crc kubenswrapper[4722]: I0309 15:22:49.905429 4722 patch_prober.go:28] interesting pod/controller-manager-8445c785c8-hdmgl container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.65:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:49 crc kubenswrapper[4722]: I0309 15:22:49.905500 4722 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-controller-manager/controller-manager-8445c785c8-hdmgl" podUID="7139e62b-5e90-4545-a264-aa8138821a55" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.65:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:49 crc kubenswrapper[4722]: I0309 15:22:49.913945 4722 patch_prober.go:28] interesting pod/route-controller-manager-67579949ff-g69dw container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.66:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:49 crc kubenswrapper[4722]: I0309 15:22:49.913993 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-67579949ff-g69dw" podUID="fd45c8c5-9cad-404b-b14a-9cbc710c8468" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.66:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:49 crc kubenswrapper[4722]: I0309 15:22:49.914041 4722 patch_prober.go:28] interesting pod/route-controller-manager-67579949ff-g69dw container/route-controller-manager namespace/openshift-route-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.66:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:49 crc kubenswrapper[4722]: I0309 15:22:49.914055 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-route-controller-manager/route-controller-manager-67579949ff-g69dw" podUID="fd45c8c5-9cad-404b-b14a-9cbc710c8468" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.66:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:49 crc kubenswrapper[4722]: I0309 15:22:49.914550 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-route-controller-manager/route-controller-manager-67579949ff-g69dw" Mar 09 15:22:49 crc kubenswrapper[4722]: I0309 15:22:49.918967 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="route-controller-manager" containerStatusID={"Type":"cri-o","ID":"cd7063f2eab111d8560c940917127f1dca125e473c409c23b68096bf29d14642"} pod="openshift-route-controller-manager/route-controller-manager-67579949ff-g69dw" containerMessage="Container route-controller-manager failed liveness probe, will be restarted" Mar 09 15:22:49 crc kubenswrapper[4722]: I0309 15:22:49.920499 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-67579949ff-g69dw" podUID="fd45c8c5-9cad-404b-b14a-9cbc710c8468" containerName="route-controller-manager" containerID="cri-o://cd7063f2eab111d8560c940917127f1dca125e473c409c23b68096bf29d14642" gracePeriod=30 Mar 09 15:22:50 crc kubenswrapper[4722]: I0309 15:22:50.076299 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6vn96" event={"ID":"29ed2858-4fd0-4817-8ed3-b3515ac035d7","Type":"ContainerStarted","Data":"0cbe62ded263fafd545e35fd924f81cbf067663881d46e199cee7a97fcdec2b7"} Mar 09 15:22:50 crc kubenswrapper[4722]: 
I0309 15:22:50.168541 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-6vn96" Mar 09 15:22:50 crc kubenswrapper[4722]: I0309 15:22:50.221637 4722 patch_prober.go:28] interesting pod/apiserver-76f77b778f-gh244 container/openshift-apiserver namespace/openshift-apiserver: Readiness probe status=failure output="Get \"https://10.217.0.33:8443/readyz?exclude=etcd&exclude=etcd-readiness\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:50 crc kubenswrapper[4722]: I0309 15:22:50.221713 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-apiserver/apiserver-76f77b778f-gh244" podUID="4084fbb0-8fae-4b7e-a3f6-ec9d723bb367" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.33:8443/readyz?exclude=etcd&exclude=etcd-readiness\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:50 crc kubenswrapper[4722]: I0309 15:22:50.773812 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="a79aaedb-92ba-42fc-8cc7-9ecb007d2ac7" containerName="galera" probeResult="failure" output="command timed out" Mar 09 15:22:50 crc kubenswrapper[4722]: I0309 15:22:50.983457 4722 patch_prober.go:28] interesting pod/metrics-server-bcd6d9dd6-5rp7g container/metrics-server namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.83:10250/livez\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:50 crc kubenswrapper[4722]: I0309 15:22:50.983527 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/metrics-server-bcd6d9dd6-5rp7g" podUID="f8a65e9f-5e0a-47d0-b251-aa4e52e2f581" containerName="metrics-server" probeResult="failure" output="Get \"https://10.217.0.83:10250/livez\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:50 crc kubenswrapper[4722]: I0309 15:22:50.983570 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-monitoring/metrics-server-bcd6d9dd6-5rp7g" Mar 09 15:22:50 crc kubenswrapper[4722]: I0309 15:22:50.984016 4722 patch_prober.go:28] interesting pod/metrics-server-bcd6d9dd6-5rp7g container/metrics-server namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.83:10250/livez\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:50 crc kubenswrapper[4722]: I0309 15:22:50.984046 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/metrics-server-bcd6d9dd6-5rp7g" podUID="f8a65e9f-5e0a-47d0-b251-aa4e52e2f581" containerName="metrics-server" probeResult="failure" output="Get \"https://10.217.0.83:10250/livez\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:50 crc kubenswrapper[4722]: I0309 15:22:50.993161 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="metrics-server" containerStatusID={"Type":"cri-o","ID":"accf60ca5a63b2963a1bf6cb44c81ffff1c61e6d3e9b03e803b0af50dfcdb881"} pod="openshift-monitoring/metrics-server-bcd6d9dd6-5rp7g" containerMessage="Container metrics-server failed liveness probe, will be restarted" Mar 09 15:22:50 crc kubenswrapper[4722]: I0309 15:22:50.993244 4722 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-monitoring/metrics-server-bcd6d9dd6-5rp7g" podUID="f8a65e9f-5e0a-47d0-b251-aa4e52e2f581" containerName="metrics-server" containerID="cri-o://accf60ca5a63b2963a1bf6cb44c81ffff1c61e6d3e9b03e803b0af50dfcdb881" gracePeriod=170 Mar 09 15:22:51 crc kubenswrapper[4722]: I0309 15:22:51.202785 4722 prober.go:107] "Probe failed" probeType="Startup" pod="metallb-system/frr-k8s-6vn96" podUID="29ed2858-4fd0-4817-8ed3-b3515ac035d7" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:51 crc kubenswrapper[4722]: I0309 15:22:51.253787 4722 patch_prober.go:28] interesting pod/monitoring-plugin-65599947bd-42bk4 container/monitoring-plugin namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.84:9443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:51 crc kubenswrapper[4722]: I0309 15:22:51.253860 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/monitoring-plugin-65599947bd-42bk4" podUID="85f5a76e-3679-44b3-8932-f5245c49b481" containerName="monitoring-plugin" probeResult="failure" output="Get \"https://10.217.0.84:9443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:51 crc kubenswrapper[4722]: I0309 15:22:51.253971 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/monitoring-plugin-65599947bd-42bk4" Mar 09 15:22:51 crc kubenswrapper[4722]: I0309 15:22:51.633577 4722 patch_prober.go:28] interesting pod/apiserver-76f77b778f-gh244 container/openshift-apiserver namespace/openshift-apiserver: Liveness probe status=failure output="Get \"https://10.217.0.33:8443/livez?exclude=etcd\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:51 crc kubenswrapper[4722]: I0309 15:22:51.633669 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-apiserver/apiserver-76f77b778f-gh244" podUID="4084fbb0-8fae-4b7e-a3f6-ec9d723bb367" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.33:8443/livez?exclude=etcd\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:51 crc kubenswrapper[4722]: I0309 15:22:51.698227 4722 patch_prober.go:28] interesting pod/logging-loki-distributor-5d5548c9f5-r6x4b container/loki-distributor namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.52:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:51 crc kubenswrapper[4722]: I0309 15:22:51.698648 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-distributor-5d5548c9f5-r6x4b" podUID="822bc43f-dfed-4440-be35-1bf58f50456b" containerName="loki-distributor" probeResult="failure" output="Get \"https://10.217.0.52:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:51 crc kubenswrapper[4722]: I0309 15:22:51.698807 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-distributor-5d5548c9f5-r6x4b" Mar 09 15:22:51 crc kubenswrapper[4722]: I0309 15:22:51.702122 4722 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-67pzj" Mar 09 15:22:51 crc kubenswrapper[4722]: I0309 15:22:51.741002 4722 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-2mf4l container/oauth-apiserver namespace/openshift-oauth-apiserver: Liveness probe status=failure output="Get \"https://10.217.0.10:8443/livez?exclude=etcd\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:51 crc kubenswrapper[4722]: I0309 15:22:51.741064 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2mf4l" podUID="062db6f1-77ab-4eca-be53-6480160aff81" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.10:8443/livez?exclude=etcd\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:51 crc kubenswrapper[4722]: I0309 15:22:51.780938 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/certified-operators-wkdkx" podUID="69a0d1f2-276f-4062-91d9-8af8048a8d8f" containerName="registry-server" probeResult="failure" output="command timed out" Mar 09 15:22:51 crc kubenswrapper[4722]: I0309 15:22:51.781041 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-cell1-galera-0" podUID="4159e308-3ccf-45d9-a97b-8133542007a8" containerName="galera" probeResult="failure" output="command timed out" Mar 09 15:22:51 crc kubenswrapper[4722]: I0309 15:22:51.781077 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 09 15:22:51 crc kubenswrapper[4722]: I0309 15:22:51.781696 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="4159e308-3ccf-45d9-a97b-8133542007a8" containerName="galera" probeResult="failure" output="command timed out" Mar 09 15:22:51 crc kubenswrapper[4722]: I0309 15:22:51.781709 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/certified-operators-wkdkx" podUID="69a0d1f2-276f-4062-91d9-8af8048a8d8f" containerName="registry-server" probeResult="failure" output="command timed out" Mar 09 15:22:51 crc kubenswrapper[4722]: I0309 15:22:51.781799 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 09 15:22:51 crc kubenswrapper[4722]: I0309 15:22:51.789827 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="galera" containerStatusID={"Type":"cri-o","ID":"80ff28c7cda34665de0bebe76a8a1102a025d02953717f80f765be65e05ede59"} pod="openstack/openstack-cell1-galera-0" containerMessage="Container galera failed liveness probe, will be restarted" Mar 09 15:22:51 crc kubenswrapper[4722]: I0309 15:22:51.818609 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-lvfgg" podUID="7a62b98d-e9d4-4cbc-bea8-0da13fcc4467" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.112:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:51 crc kubenswrapper[4722]: I0309 15:22:51.818883 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-lvfgg" podUID="7a62b98d-e9d4-4cbc-bea8-0da13fcc4467" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.112:8081/readyz\": context deadline exceeded 
(Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:51 crc kubenswrapper[4722]: I0309 15:22:51.819100 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-lvfgg" Mar 09 15:22:51 crc kubenswrapper[4722]: I0309 15:22:51.905551 4722 patch_prober.go:28] interesting pod/logging-loki-querier-76bf7b6d45-fv4dh container/loki-querier namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.53:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:51 crc kubenswrapper[4722]: I0309 15:22:51.905643 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-querier-76bf7b6d45-fv4dh" podUID="ba829d53-02a8-4003-a5ee-b9b36d8404e3" containerName="loki-querier" probeResult="failure" output="Get \"https://10.217.0.53:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:51 crc kubenswrapper[4722]: I0309 15:22:51.905801 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-querier-76bf7b6d45-fv4dh" Mar 09 15:22:52 crc kubenswrapper[4722]: I0309 15:22:52.087139 4722 patch_prober.go:28] interesting pod/logging-loki-query-frontend-6d6859c548-5b72h container/loki-query-frontend namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.54:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:52 crc kubenswrapper[4722]: I0309 15:22:52.087904 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-query-frontend-6d6859c548-5b72h" podUID="5ccc948e-2185-44fd-90c4-3ae3228f6224" containerName="loki-query-frontend" probeResult="failure" output="Get \"https://10.217.0.54:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:52 crc kubenswrapper[4722]: I0309 15:22:52.088004 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-query-frontend-6d6859c548-5b72h" Mar 09 15:22:52 crc kubenswrapper[4722]: I0309 15:22:52.208412 4722 patch_prober.go:28] interesting pod/logging-loki-gateway-6c5ff86c56-n5jdr container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.55:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:52 crc kubenswrapper[4722]: I0309 15:22:52.208494 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-6c5ff86c56-n5jdr" podUID="8becd072-3095-4717-a83d-e56cf0d0f816" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.55:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:52 crc kubenswrapper[4722]: I0309 15:22:52.221976 4722 patch_prober.go:28] interesting pod/logging-loki-gateway-6c5ff86c56-dvps5 container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:52 crc kubenswrapper[4722]: I0309 15:22:52.222060 
4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-6c5ff86c56-dvps5" podUID="e265fe14-7154-4fbb-a7c3-33557166f71d" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:52 crc kubenswrapper[4722]: I0309 15:22:52.255188 4722 patch_prober.go:28] interesting pod/monitoring-plugin-65599947bd-42bk4 container/monitoring-plugin namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.84:9443/health\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:52 crc kubenswrapper[4722]: I0309 15:22:52.255433 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/monitoring-plugin-65599947bd-42bk4" podUID="85f5a76e-3679-44b3-8932-f5245c49b481" containerName="monitoring-plugin" probeResult="failure" output="Get \"https://10.217.0.84:9443/health\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:52 crc kubenswrapper[4722]: I0309 15:22:52.287442 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9ctn5qw" podUID="5e25c11b-f9c6-4542-9c0c-394ea6bc2c17" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.120:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:52 crc kubenswrapper[4722]: I0309 15:22:52.287484 4722 patch_prober.go:28] interesting pod/logging-loki-gateway-6c5ff86c56-n5jdr container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.55:8081/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:52 crc kubenswrapper[4722]: I0309 15:22:52.287566 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9ctn5qw" podUID="5e25c11b-f9c6-4542-9c0c-394ea6bc2c17" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.120:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:52 crc kubenswrapper[4722]: I0309 15:22:52.287519 4722 patch_prober.go:28] interesting pod/logging-loki-gateway-6c5ff86c56-dvps5 container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8081/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:52 crc kubenswrapper[4722]: I0309 15:22:52.287844 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-6c5ff86c56-n5jdr" podUID="8becd072-3095-4717-a83d-e56cf0d0f816" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.55:8081/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:52 crc kubenswrapper[4722]: I0309 15:22:52.287869 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-6c5ff86c56-dvps5" podUID="e265fe14-7154-4fbb-a7c3-33557166f71d" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.56:8081/ready\": net/http: request canceled while waiting for connection 
(Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:52 crc kubenswrapper[4722]: I0309 15:22:52.287934 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9ctn5qw" Mar 09 15:22:52 crc kubenswrapper[4722]: E0309 15:22:52.452377 4722 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:52 crc kubenswrapper[4722]: I0309 15:22:52.520388 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-controller-manager-55cd86c56-dm2dr" podUID="98c22319-d5f8-4a0b-8a30-89b9d832f354" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.127:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:52 crc kubenswrapper[4722]: I0309 15:22:52.606588 4722 patch_prober.go:28] interesting pod/observability-operator-59bdc8b94-lc5zn container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.18:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:52 crc kubenswrapper[4722]: I0309 15:22:52.606595 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-operator-controller-manager-55cd86c56-dm2dr" podUID="98c22319-d5f8-4a0b-8a30-89b9d832f354" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.127:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:52 crc kubenswrapper[4722]: I0309 15:22:52.606674 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-59bdc8b94-lc5zn" podUID="14655c3d-02fe-4215-b566-0c4008fd34a0" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.18:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:52 crc kubenswrapper[4722]: I0309 15:22:52.607029 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-lc5zn" Mar 09 15:22:52 crc kubenswrapper[4722]: I0309 15:22:52.608697 4722 patch_prober.go:28] interesting pod/thanos-querier-5f6c868b98-s8fg2 container/kube-rbac-proxy-web namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.81:9091/-/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:52 crc kubenswrapper[4722]: I0309 15:22:52.608756 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/thanos-querier-5f6c868b98-s8fg2" podUID="1eb00f12-c24e-46dc-8346-c096826564f5" containerName="kube-rbac-proxy-web" probeResult="failure" output="Get \"https://10.217.0.81:9091/-/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:52 crc kubenswrapper[4722]: I0309 15:22:52.688474 4722 patch_prober.go:28] interesting pod/perses-operator-5bf474d74f-2rhld container/perses-operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.11:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:52 crc 
kubenswrapper[4722]: I0309 15:22:52.688543 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/perses-operator-5bf474d74f-2rhld" podUID="c123a767-e0e0-4432-b34f-cbe0b581d938" containerName="perses-operator" probeResult="failure" output="Get \"http://10.217.0.11:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:52 crc kubenswrapper[4722]: I0309 15:22:52.688618 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-2rhld" Mar 09 15:22:52 crc kubenswrapper[4722]: I0309 15:22:52.688925 4722 patch_prober.go:28] interesting pod/console-operator-58897d9998-chrnr container/console-operator namespace/openshift-console-operator: Liveness probe status=failure output="Get \"https://10.217.0.14:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:52 crc kubenswrapper[4722]: I0309 15:22:52.688971 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console-operator/console-operator-58897d9998-chrnr" podUID="b919959a-1da1-4c74-9330-5bb8c33f5c26" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.14:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:52 crc kubenswrapper[4722]: I0309 15:22:52.689020 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-58897d9998-chrnr" Mar 09 15:22:52 crc kubenswrapper[4722]: I0309 15:22:52.689414 4722 patch_prober.go:28] interesting pod/console-operator-58897d9998-chrnr container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.14:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:52 crc kubenswrapper[4722]: I0309 15:22:52.689486 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-chrnr" podUID="b919959a-1da1-4c74-9330-5bb8c33f5c26" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.14:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:52 crc kubenswrapper[4722]: I0309 15:22:52.689591 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-chrnr" Mar 09 15:22:52 crc kubenswrapper[4722]: I0309 15:22:52.689637 4722 patch_prober.go:28] interesting pod/perses-operator-5bf474d74f-2rhld container/perses-operator namespace/openshift-operators: Liveness probe status=failure output="Get \"http://10.217.0.11:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:52 crc kubenswrapper[4722]: I0309 15:22:52.689662 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operators/perses-operator-5bf474d74f-2rhld" podUID="c123a767-e0e0-4432-b34f-cbe0b581d938" containerName="perses-operator" probeResult="failure" output="Get \"http://10.217.0.11:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:52 crc kubenswrapper[4722]: I0309 15:22:52.696533 4722 kuberuntime_manager.go:1027] "Message for 
Container of pod" containerName="console-operator" containerStatusID={"Type":"cri-o","ID":"51927e5090e0b3c63faeac1e7ba440eff196cb147decf4416045e5adc2db287e"} pod="openshift-console-operator/console-operator-58897d9998-chrnr" containerMessage="Container console-operator failed liveness probe, will be restarted" Mar 09 15:22:52 crc kubenswrapper[4722]: I0309 15:22:52.696599 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console-operator/console-operator-58897d9998-chrnr" podUID="b919959a-1da1-4c74-9330-5bb8c33f5c26" containerName="console-operator" containerID="cri-o://51927e5090e0b3c63faeac1e7ba440eff196cb147decf4416045e5adc2db287e" gracePeriod=30 Mar 09 15:22:52 crc kubenswrapper[4722]: I0309 15:22:52.696835 4722 patch_prober.go:28] interesting pod/observability-operator-59bdc8b94-lc5zn container/operator namespace/openshift-operators: Liveness probe status=failure output="Get \"http://10.217.0.18:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:52 crc kubenswrapper[4722]: I0309 15:22:52.696927 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operators/observability-operator-59bdc8b94-lc5zn" podUID="14655c3d-02fe-4215-b566-0c4008fd34a0" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.18:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:52 crc kubenswrapper[4722]: I0309 15:22:52.696997 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-operators/observability-operator-59bdc8b94-lc5zn" Mar 09 15:22:52 crc kubenswrapper[4722]: I0309 15:22:52.699878 4722 patch_prober.go:28] interesting pod/logging-loki-distributor-5d5548c9f5-r6x4b container/loki-distributor namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.52:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:52 crc kubenswrapper[4722]: I0309 15:22:52.700010 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-distributor-5d5548c9f5-r6x4b" podUID="822bc43f-dfed-4440-be35-1bf58f50456b" containerName="loki-distributor" probeResult="failure" output="Get \"https://10.217.0.52:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:52 crc kubenswrapper[4722]: I0309 15:22:52.714132 4722 patch_prober.go:28] interesting pod/authentication-operator-69f744f599-8tkbn container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.217.0.9:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:52 crc kubenswrapper[4722]: I0309 15:22:52.714188 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-69f744f599-8tkbn" podUID="427a4c04-99cd-4f53-ae98-20c1755d7658" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.217.0.9:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:52 crc kubenswrapper[4722]: I0309 15:22:52.714243 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-authentication-operator/authentication-operator-69f744f599-8tkbn" Mar 09 15:22:52 crc kubenswrapper[4722]: I0309 15:22:52.714721 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="authentication-operator" containerStatusID={"Type":"cri-o","ID":"e614a94c2fbf38f966a561149647e547de6024a28362b5b063c83cb79f8c1d5e"} pod="openshift-authentication-operator/authentication-operator-69f744f599-8tkbn" containerMessage="Container authentication-operator failed liveness probe, will be restarted" Mar 09 15:22:52 crc kubenswrapper[4722]: I0309 15:22:52.714760 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication-operator/authentication-operator-69f744f599-8tkbn" podUID="427a4c04-99cd-4f53-ae98-20c1755d7658" containerName="authentication-operator" containerID="cri-o://e614a94c2fbf38f966a561149647e547de6024a28362b5b063c83cb79f8c1d5e" gracePeriod=30 Mar 09 15:22:52 crc kubenswrapper[4722]: I0309 15:22:52.759091 4722 patch_prober.go:28] interesting pod/openshift-kube-scheduler-crc container/kube-scheduler namespace/openshift-kube-scheduler: Readiness probe status=failure output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:52 crc kubenswrapper[4722]: I0309 15:22:52.759163 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:52 crc kubenswrapper[4722]: I0309 15:22:52.759281 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 09 15:22:52 crc kubenswrapper[4722]: I0309 15:22:52.775611 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-k8s-0" podUID="df07ee71-98db-42a1-9df4-6f8707504f08" containerName="prometheus" probeResult="failure" output="command timed out" Mar 09 15:22:52 crc kubenswrapper[4722]: I0309 15:22:52.800077 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/community-operators-rvtqn" podUID="3411289f-3e7c-4e43-b545-5e612822b18e" containerName="registry-server" probeResult="failure" output=< Mar 09 15:22:52 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s Mar 09 15:22:52 crc kubenswrapper[4722]: > Mar 09 15:22:52 crc kubenswrapper[4722]: I0309 15:22:52.800658 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-marketplace/community-operators-rvtqn" Mar 09 15:22:52 crc kubenswrapper[4722]: I0309 15:22:52.800124 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/community-operators-rvtqn" podUID="3411289f-3e7c-4e43-b545-5e612822b18e" containerName="registry-server" probeResult="failure" output=< Mar 09 15:22:52 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s Mar 09 15:22:52 crc kubenswrapper[4722]: > Mar 09 15:22:52 crc kubenswrapper[4722]: I0309 15:22:52.800805 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rvtqn" Mar 09 15:22:52 crc kubenswrapper[4722]: I0309 15:22:52.802383 4722 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="registry-server" containerStatusID={"Type":"cri-o","ID":"6caf48dd254c82f3e713c8b7d146aa636738f0e702e7888634d4fbddac4cdbd7"} pod="openshift-marketplace/community-operators-rvtqn" containerMessage="Container registry-server failed liveness probe, will be restarted" Mar 09 15:22:52 crc kubenswrapper[4722]: I0309 15:22:52.802452 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rvtqn" podUID="3411289f-3e7c-4e43-b545-5e612822b18e" containerName="registry-server" containerID="cri-o://6caf48dd254c82f3e713c8b7d146aa636738f0e702e7888634d4fbddac4cdbd7" gracePeriod=30 Mar 09 15:22:52 crc kubenswrapper[4722]: I0309 15:22:52.861470 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-lvfgg" podUID="7a62b98d-e9d4-4cbc-bea8-0da13fcc4467" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.112:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:52 crc kubenswrapper[4722]: I0309 15:22:52.868292 4722 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.57:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:52 crc kubenswrapper[4722]: I0309 15:22:52.868343 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="6a23db8b-8a30-47b8-bf39-6f193899fcee" containerName="loki-ingester" probeResult="failure" output="Get \"https://10.217.0.57:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:52 crc kubenswrapper[4722]: I0309 15:22:52.868424 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-ingester-0" Mar 09 15:22:52 crc kubenswrapper[4722]: I0309 15:22:52.907074 4722 patch_prober.go:28] interesting pod/logging-loki-querier-76bf7b6d45-fv4dh container/loki-querier namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.53:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:52 crc kubenswrapper[4722]: I0309 15:22:52.907184 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-querier-76bf7b6d45-fv4dh" podUID="ba829d53-02a8-4003-a5ee-b9b36d8404e3" containerName="loki-querier" probeResult="failure" output="Get \"https://10.217.0.53:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:53 crc kubenswrapper[4722]: I0309 15:22:53.088970 4722 patch_prober.go:28] interesting pod/logging-loki-query-frontend-6d6859c548-5b72h container/loki-query-frontend namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.54:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:53 crc kubenswrapper[4722]: I0309 15:22:53.089026 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-query-frontend-6d6859c548-5b72h" 
podUID="5ccc948e-2185-44fd-90c4-3ae3228f6224" containerName="loki-query-frontend" probeResult="failure" output="Get \"https://10.217.0.54:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:53 crc kubenswrapper[4722]: I0309 15:22:53.133308 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="operator" containerStatusID={"Type":"cri-o","ID":"620d6d1f26b1e3ccb96ee8067fcb299cd682674fff256e53875db302966d4f26"} pod="openshift-operators/observability-operator-59bdc8b94-lc5zn" containerMessage="Container operator failed liveness probe, will be restarted" Mar 09 15:22:53 crc kubenswrapper[4722]: I0309 15:22:53.133609 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-operators/observability-operator-59bdc8b94-lc5zn" podUID="14655c3d-02fe-4215-b566-0c4008fd34a0" containerName="operator" containerID="cri-o://620d6d1f26b1e3ccb96ee8067fcb299cd682674fff256e53875db302966d4f26" gracePeriod=30 Mar 09 15:22:53 crc kubenswrapper[4722]: I0309 15:22:53.146848 4722 patch_prober.go:28] interesting pod/logging-loki-index-gateway-0 container/loki-index-gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.60:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:53 crc kubenswrapper[4722]: I0309 15:22:53.146909 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-index-gateway-0" podUID="bce49c11-10b4-4c30-a1a4-16cf32cb42fd" containerName="loki-index-gateway" probeResult="failure" output="Get \"https://10.217.0.60:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:53 crc kubenswrapper[4722]: I0309 15:22:53.330508 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9ctn5qw" podUID="5e25c11b-f9c6-4542-9c0c-394ea6bc2c17" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.120:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:53 crc kubenswrapper[4722]: I0309 15:22:53.337264 4722 patch_prober.go:28] interesting pod/logging-loki-compactor-0 container/loki-compactor namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.58:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:53 crc kubenswrapper[4722]: I0309 15:22:53.337347 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-compactor-0" podUID="75ed49e3-dc17-45c0-96ec-1db69670395b" containerName="loki-compactor" probeResult="failure" output="Get \"https://10.217.0.58:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:53 crc kubenswrapper[4722]: I0309 15:22:53.648345 4722 patch_prober.go:28] interesting pod/observability-operator-59bdc8b94-lc5zn container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.18:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:53 crc kubenswrapper[4722]: I0309 15:22:53.648653 4722 
prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-59bdc8b94-lc5zn" podUID="14655c3d-02fe-4215-b566-0c4008fd34a0" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.18:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:53 crc kubenswrapper[4722]: I0309 15:22:53.696468 4722 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Readiness probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:53 crc kubenswrapper[4722]: I0309 15:22:53.696535 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:53 crc kubenswrapper[4722]: I0309 15:22:53.737437 4722 patch_prober.go:28] interesting pod/perses-operator-5bf474d74f-2rhld container/perses-operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.11:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:53 crc kubenswrapper[4722]: I0309 15:22:53.737790 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/perses-operator-5bf474d74f-2rhld" podUID="c123a767-e0e0-4432-b34f-cbe0b581d938" containerName="perses-operator" probeResult="failure" output="Get \"http://10.217.0.11:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:53 crc kubenswrapper[4722]: I0309 15:22:53.778501 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-controller-init-5b979cff56-vwbnz" podUID="bdac45ca-36d4-41c5-b5e5-332d70558171" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.104:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:53 crc kubenswrapper[4722]: I0309 15:22:53.778564 4722 patch_prober.go:28] interesting pod/openshift-kube-scheduler-crc container/kube-scheduler namespace/openshift-kube-scheduler: Readiness probe status=failure output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:53 crc kubenswrapper[4722]: I0309 15:22:53.778625 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:53 crc kubenswrapper[4722]: I0309 15:22:53.778662 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-5b979cff56-vwbnz" Mar 09 15:22:53 crc kubenswrapper[4722]: I0309 15:22:53.801780 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-k8s-0" 
podUID="df07ee71-98db-42a1-9df4-6f8707504f08" containerName="prometheus" probeResult="failure" output="command timed out" Mar 09 15:22:53 crc kubenswrapper[4722]: I0309 15:22:53.869740 4722 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.57:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:53 crc kubenswrapper[4722]: I0309 15:22:53.869824 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="6a23db8b-8a30-47b8-bf39-6f193899fcee" containerName="loki-ingester" probeResult="failure" output="Get \"https://10.217.0.57:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:54 crc kubenswrapper[4722]: I0309 15:22:54.328384 4722 patch_prober.go:28] interesting pod/router-default-5444994796-dp8wn container/router namespace/openshift-ingress: Readiness probe status=failure output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:54 crc kubenswrapper[4722]: I0309 15:22:54.328429 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ingress/router-default-5444994796-dp8wn" podUID="317444ee-0620-47d2-869e-77578a367a87" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:54 crc kubenswrapper[4722]: I0309 15:22:54.328435 4722 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-dgxnd container/catalog-operator namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.42:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:54 crc kubenswrapper[4722]: I0309 15:22:54.328496 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-dp8wn" Mar 09 15:22:54 crc kubenswrapper[4722]: I0309 15:22:54.328487 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dgxnd" podUID="c219beb3-4ba5-43bd-b2ec-3855d19c2b57" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.42:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:54 crc kubenswrapper[4722]: I0309 15:22:54.328546 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dgxnd" Mar 09 15:22:54 crc kubenswrapper[4722]: I0309 15:22:54.328541 4722 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-dgxnd container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.42:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:54 crc kubenswrapper[4722]: I0309 15:22:54.328694 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dgxnd" podUID="c219beb3-4ba5-43bd-b2ec-3855d19c2b57" 
containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.42:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:54 crc kubenswrapper[4722]: I0309 15:22:54.329069 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dgxnd" Mar 09 15:22:54 crc kubenswrapper[4722]: I0309 15:22:54.328394 4722 patch_prober.go:28] interesting pod/router-default-5444994796-dp8wn container/router namespace/openshift-ingress: Liveness probe status=failure output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:54 crc kubenswrapper[4722]: I0309 15:22:54.329342 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-ingress/router-default-5444994796-dp8wn" podUID="317444ee-0620-47d2-869e-77578a367a87" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:54 crc kubenswrapper[4722]: I0309 15:22:54.329398 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-ingress/router-default-5444994796-dp8wn" Mar 09 15:22:54 crc kubenswrapper[4722]: I0309 15:22:54.329718 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="catalog-operator" containerStatusID={"Type":"cri-o","ID":"b664453e53fd2ade7f0e7440ee555365a0657f568b5c0929fcced9082eb6ae61"} pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dgxnd" containerMessage="Container catalog-operator failed liveness probe, will be restarted" Mar 09 15:22:54 crc kubenswrapper[4722]: I0309 15:22:54.329771 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dgxnd" podUID="c219beb3-4ba5-43bd-b2ec-3855d19c2b57" containerName="catalog-operator" containerID="cri-o://b664453e53fd2ade7f0e7440ee555365a0657f568b5c0929fcced9082eb6ae61" gracePeriod=30 Mar 09 15:22:54 crc kubenswrapper[4722]: I0309 15:22:54.463452 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-controller-manager-5689854475-89q94" podUID="3ea04cb5-4d36-42a9-bb83-c6f943619d16" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.97:8080/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:54 crc kubenswrapper[4722]: I0309 15:22:54.463668 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-5689854475-89q94" Mar 09 15:22:54 crc kubenswrapper[4722]: I0309 15:22:54.504555 4722 patch_prober.go:28] interesting pod/console-6dc4c5dd4b-rk4jt container/console namespace/openshift-console: Liveness probe status=failure output="Get \"https://10.217.0.142:8443/health\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:54 crc kubenswrapper[4722]: I0309 15:22:54.504654 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/console-6dc4c5dd4b-rk4jt" podUID="6dc5b476-5a42-4c98-9a95-0e3b29f2f771" containerName="console" probeResult="failure" output="Get \"https://10.217.0.142:8443/health\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:54 crc 
kubenswrapper[4722]: I0309 15:22:54.504786 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console/console-6dc4c5dd4b-rk4jt" Mar 09 15:22:54 crc kubenswrapper[4722]: I0309 15:22:54.504832 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="0e07fbab-4a47-4e59-aa72-f0a4521296af" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.173:9090/-/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:54 crc kubenswrapper[4722]: I0309 15:22:54.504901 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/prometheus-metric-storage-0" podUID="0e07fbab-4a47-4e59-aa72-f0a4521296af" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.173:9090/-/healthy\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:54 crc kubenswrapper[4722]: I0309 15:22:54.505040 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Mar 09 15:22:54 crc kubenswrapper[4722]: I0309 15:22:54.506272 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="console" containerStatusID={"Type":"cri-o","ID":"b8f5b1038e62f1fa81cc3b716e91ed6508a1746f147df7b94d2cb47945f17185"} pod="openshift-console/console-6dc4c5dd4b-rk4jt" containerMessage="Container console failed liveness probe, will be restarted" Mar 09 15:22:54 crc kubenswrapper[4722]: I0309 15:22:54.565296 4722 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-lnkwt container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.19:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:54 crc kubenswrapper[4722]: I0309 15:22:54.566041 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lnkwt" podUID="f8055a95-6b09-4e32-88b8-82ad36ca5029" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.19:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:54 crc kubenswrapper[4722]: I0309 15:22:54.566457 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lnkwt" Mar 09 15:22:54 crc kubenswrapper[4722]: I0309 15:22:54.580244 4722 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-lnkwt container/olm-operator namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.19:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:54 crc kubenswrapper[4722]: I0309 15:22:54.580690 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lnkwt" podUID="f8055a95-6b09-4e32-88b8-82ad36ca5029" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.19:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:54 crc kubenswrapper[4722]: I0309 15:22:54.580748 4722 kubelet.go:2542] "SyncLoop (probe)" 
probe="liveness" status="unhealthy" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lnkwt" Mar 09 15:22:54 crc kubenswrapper[4722]: I0309 15:22:54.664432 4722 patch_prober.go:28] interesting pod/package-server-manager-789f6589d5-mrtb2 container/package-server-manager namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"http://10.217.0.23:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:54 crc kubenswrapper[4722]: I0309 15:22:54.664475 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mrtb2" podUID="228d39d8-b0bc-4491-be90-e473c090f412" containerName="package-server-manager" probeResult="failure" output="Get \"http://10.217.0.23:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:54 crc kubenswrapper[4722]: I0309 15:22:54.664514 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mrtb2" Mar 09 15:22:54 crc kubenswrapper[4722]: I0309 15:22:54.664613 4722 patch_prober.go:28] interesting pod/package-server-manager-789f6589d5-mrtb2 container/package-server-manager namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"http://10.217.0.23:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:54 crc kubenswrapper[4722]: I0309 15:22:54.664672 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mrtb2" podUID="228d39d8-b0bc-4491-be90-e473c090f412" containerName="package-server-manager" probeResult="failure" output="Get \"http://10.217.0.23:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:54 crc kubenswrapper[4722]: I0309 15:22:54.664753 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mrtb2" Mar 09 15:22:54 crc kubenswrapper[4722]: I0309 15:22:54.666125 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="package-server-manager" containerStatusID={"Type":"cri-o","ID":"778991893ca7b914cdde2f5428e80ab6cfce3570917a24a4ccaaaff3928b14db"} pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mrtb2" containerMessage="Container package-server-manager failed liveness probe, will be restarted" Mar 09 15:22:54 crc kubenswrapper[4722]: I0309 15:22:54.666160 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mrtb2" podUID="228d39d8-b0bc-4491-be90-e473c090f412" containerName="package-server-manager" containerID="cri-o://778991893ca7b914cdde2f5428e80ab6cfce3570917a24a4ccaaaff3928b14db" gracePeriod=30 Mar 09 15:22:54 crc kubenswrapper[4722]: I0309 15:22:54.705394 4722 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-fp2th container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.44:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:54 crc kubenswrapper[4722]: I0309 15:22:54.705460 4722 
patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-fp2th container/packageserver namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.44:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:54 crc kubenswrapper[4722]: I0309 15:22:54.705459 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fp2th" podUID="996087ed-6480-4650-8632-c991e5d16c99" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.44:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:54 crc kubenswrapper[4722]: I0309 15:22:54.705487 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fp2th" podUID="996087ed-6480-4650-8632-c991e5d16c99" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.44:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:54 crc kubenswrapper[4722]: I0309 15:22:54.705528 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fp2th" Mar 09 15:22:54 crc kubenswrapper[4722]: I0309 15:22:54.705548 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fp2th" Mar 09 15:22:54 crc kubenswrapper[4722]: I0309 15:22:54.705866 4722 patch_prober.go:28] interesting pod/observability-operator-59bdc8b94-lc5zn container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.18:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:54 crc kubenswrapper[4722]: I0309 15:22:54.705941 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-59bdc8b94-lc5zn" podUID="14655c3d-02fe-4215-b566-0c4008fd34a0" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.18:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:54 crc kubenswrapper[4722]: I0309 15:22:54.706869 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="packageserver" containerStatusID={"Type":"cri-o","ID":"4b273112c5687a28adabe3b469c671154db487b3bd47cef0ffa59c7a855124c1"} pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fp2th" containerMessage="Container packageserver failed liveness probe, will be restarted" Mar 09 15:22:54 crc kubenswrapper[4722]: I0309 15:22:54.706911 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fp2th" podUID="996087ed-6480-4650-8632-c991e5d16c99" containerName="packageserver" containerID="cri-o://4b273112c5687a28adabe3b469c671154db487b3bd47cef0ffa59c7a855124c1" gracePeriod=30 Mar 09 15:22:54 crc kubenswrapper[4722]: I0309 15:22:54.821779 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-controller-init-5b979cff56-vwbnz" podUID="bdac45ca-36d4-41c5-b5e5-332d70558171" containerName="operator" probeResult="failure" output="Get 
\"http://10.217.0.104:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:54 crc kubenswrapper[4722]: I0309 15:22:54.905140 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/metallb-operator-webhook-server-798745ff96-864pz" podUID="f67efad4-1b85-4f64-9e98-55eb2da89fb6" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.98:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:54 crc kubenswrapper[4722]: I0309 15:22:54.905280 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="metallb-system/metallb-operator-webhook-server-798745ff96-864pz" Mar 09 15:22:54 crc kubenswrapper[4722]: I0309 15:22:54.905485 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-webhook-server-798745ff96-864pz" podUID="f67efad4-1b85-4f64-9e98-55eb2da89fb6" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.98:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:54 crc kubenswrapper[4722]: I0309 15:22:54.905596 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-798745ff96-864pz" Mar 09 15:22:54 crc kubenswrapper[4722]: I0309 15:22:54.909729 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="webhook-server" containerStatusID={"Type":"cri-o","ID":"40e8a42e49bdaa139e2f793ecc9c4f756fce5f98b5a4f6cd78ab4bf4c69dc0ee"} pod="metallb-system/metallb-operator-webhook-server-798745ff96-864pz" containerMessage="Container webhook-server failed liveness probe, will be restarted" Mar 09 15:22:54 crc kubenswrapper[4722]: I0309 15:22:54.909808 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="metallb-system/metallb-operator-webhook-server-798745ff96-864pz" podUID="f67efad4-1b85-4f64-9e98-55eb2da89fb6" containerName="webhook-server" containerID="cri-o://40e8a42e49bdaa139e2f793ecc9c4f756fce5f98b5a4f6cd78ab4bf4c69dc0ee" gracePeriod=2 Mar 09 15:22:55 crc kubenswrapper[4722]: I0309 15:22:55.064525 4722 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-8xdjl container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.75:8443/healthz\": dial tcp 10.217.0.75:8443: connect: connection refused" start-of-body= Mar 09 15:22:55 crc kubenswrapper[4722]: I0309 15:22:55.064605 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-8xdjl" podUID="8b35adf7-a305-4f94-a5c9-02fbc3fca46f" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.75:8443/healthz\": dial tcp 10.217.0.75:8443: connect: connection refused" Mar 09 15:22:55 crc kubenswrapper[4722]: I0309 15:22:55.162059 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="olm-operator" containerStatusID={"Type":"cri-o","ID":"d708c56ace6a0794be5e9b44e9d196661f8745ae35cf30b479af812d413dbd2a"} pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lnkwt" containerMessage="Container olm-operator failed liveness probe, will be restarted" Mar 09 15:22:55 crc kubenswrapper[4722]: I0309 15:22:55.162108 4722 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lnkwt" podUID="f8055a95-6b09-4e32-88b8-82ad36ca5029" containerName="olm-operator" containerID="cri-o://d708c56ace6a0794be5e9b44e9d196661f8745ae35cf30b479af812d413dbd2a" gracePeriod=30 Mar 09 15:22:55 crc kubenswrapper[4722]: I0309 15:22:55.163252 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="router" containerStatusID={"Type":"cri-o","ID":"10bc0de264d7ed06b3f68942e6e3c249950032a18e0566405054ad9fe8eaaee6"} pod="openshift-ingress/router-default-5444994796-dp8wn" containerMessage="Container router failed liveness probe, will be restarted" Mar 09 15:22:55 crc kubenswrapper[4722]: I0309 15:22:55.163322 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ingress/router-default-5444994796-dp8wn" podUID="317444ee-0620-47d2-869e-77578a367a87" containerName="router" containerID="cri-o://10bc0de264d7ed06b3f68942e6e3c249950032a18e0566405054ad9fe8eaaee6" gracePeriod=10 Mar 09 15:22:55 crc kubenswrapper[4722]: I0309 15:22:55.370432 4722 patch_prober.go:28] interesting pod/router-default-5444994796-dp8wn container/router namespace/openshift-ingress: Readiness probe status=failure output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:55 crc kubenswrapper[4722]: I0309 15:22:55.370506 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ingress/router-default-5444994796-dp8wn" podUID="317444ee-0620-47d2-869e-77578a367a87" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:55 crc kubenswrapper[4722]: I0309 15:22:55.449555 4722 patch_prober.go:28] interesting pod/console-6dc4c5dd4b-rk4jt container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.142:8443/health\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:55 crc kubenswrapper[4722]: I0309 15:22:55.449629 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-6dc4c5dd4b-rk4jt" podUID="6dc5b476-5a42-4c98-9a95-0e3b29f2f771" containerName="console" probeResult="failure" output="Get \"https://10.217.0.142:8443/health\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:55 crc kubenswrapper[4722]: I0309 15:22:55.461225 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-c2mmw" podUID="690e5ab0-3719-40ac-aba6-9278480ecb44" containerName="registry-server" probeResult="failure" output=< Mar 09 15:22:55 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s Mar 09 15:22:55 crc kubenswrapper[4722]: > Mar 09 15:22:55 crc kubenswrapper[4722]: I0309 15:22:55.461391 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-c2mmw" Mar 09 15:22:55 crc kubenswrapper[4722]: I0309 15:22:55.505716 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-controller-manager-5689854475-89q94" podUID="3ea04cb5-4d36-42a9-bb83-c6f943619d16" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.97:8080/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:55 crc kubenswrapper[4722]: 
I0309 15:22:55.505770 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6dc4c5dd4b-rk4jt" Mar 09 15:22:55 crc kubenswrapper[4722]: I0309 15:22:55.632505 4722 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-lnkwt container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.19:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:55 crc kubenswrapper[4722]: I0309 15:22:55.633142 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lnkwt" podUID="f8055a95-6b09-4e32-88b8-82ad36ca5029" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.19:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:55 crc kubenswrapper[4722]: I0309 15:22:55.632566 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-2nlfl" podUID="8557439a-0367-4823-af83-28955a17cc08" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.99:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:55 crc kubenswrapper[4722]: I0309 15:22:55.632533 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-2nlfl" podUID="8557439a-0367-4823-af83-28955a17cc08" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.99:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:55 crc kubenswrapper[4722]: I0309 15:22:55.633424 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-2nlfl" Mar 09 15:22:55 crc kubenswrapper[4722]: I0309 15:22:55.635028 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="frr-k8s-webhook-server" containerStatusID={"Type":"cri-o","ID":"e319108c72ee500be331451610739dd7591d749c08a2d18d2a197ae9514e65d4"} pod="metallb-system/frr-k8s-webhook-server-7f989f654f-2nlfl" containerMessage="Container frr-k8s-webhook-server failed liveness probe, will be restarted" Mar 09 15:22:55 crc kubenswrapper[4722]: I0309 15:22:55.635083 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-2nlfl" podUID="8557439a-0367-4823-af83-28955a17cc08" containerName="frr-k8s-webhook-server" containerID="cri-o://e319108c72ee500be331451610739dd7591d749c08a2d18d2a197ae9514e65d4" gracePeriod=10 Mar 09 15:22:55 crc kubenswrapper[4722]: I0309 15:22:55.663143 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/redhat-operators-v57f2" podUID="4e27c5a4-8cba-4119-8006-f9841d6121dc" containerName="registry-server" probeResult="failure" output=< Mar 09 15:22:55 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s Mar 09 15:22:55 crc kubenswrapper[4722]: > Mar 09 15:22:55 crc kubenswrapper[4722]: I0309 15:22:55.663240 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-marketplace/redhat-operators-v57f2" Mar 09 15:22:55 crc kubenswrapper[4722]: I0309 15:22:55.664130 4722 kuberuntime_manager.go:1027] "Message 
for Container of pod" containerName="registry-server" containerStatusID={"Type":"cri-o","ID":"39debea8587f6e6f170a7177c41011980494388f1c8da8fb895edab231f9cc8f"} pod="openshift-marketplace/redhat-operators-v57f2" containerMessage="Container registry-server failed liveness probe, will be restarted" Mar 09 15:22:55 crc kubenswrapper[4722]: I0309 15:22:55.664168 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-v57f2" podUID="4e27c5a4-8cba-4119-8006-f9841d6121dc" containerName="registry-server" containerID="cri-o://39debea8587f6e6f170a7177c41011980494388f1c8da8fb895edab231f9cc8f" gracePeriod=30 Mar 09 15:22:55 crc kubenswrapper[4722]: I0309 15:22:55.725380 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/controller-86ddb6bd46-6w5ww" podUID="0b264ba4-1398-4d3e-bb2b-83cd34a0ccf6" containerName="controller" probeResult="failure" output="Get \"http://10.217.0.100:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:55 crc kubenswrapper[4722]: I0309 15:22:55.725452 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="metallb-system/controller-86ddb6bd46-6w5ww" Mar 09 15:22:55 crc kubenswrapper[4722]: I0309 15:22:55.726644 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="controller" containerStatusID={"Type":"cri-o","ID":"a1a02869fd9ddd439c962393949fee7151323c975474b74ba0c3e6af42e7e3d9"} pod="metallb-system/controller-86ddb6bd46-6w5ww" containerMessage="Container controller failed liveness probe, will be restarted" Mar 09 15:22:55 crc kubenswrapper[4722]: I0309 15:22:55.726706 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="metallb-system/controller-86ddb6bd46-6w5ww" podUID="0b264ba4-1398-4d3e-bb2b-83cd34a0ccf6" containerName="controller" containerID="cri-o://a1a02869fd9ddd439c962393949fee7151323c975474b74ba0c3e6af42e7e3d9" gracePeriod=2 Mar 09 15:22:55 crc kubenswrapper[4722]: I0309 15:22:55.775449 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-k8s-0" podUID="df07ee71-98db-42a1-9df4-6f8707504f08" containerName="prometheus" probeResult="failure" output="command timed out" Mar 09 15:22:55 crc kubenswrapper[4722]: I0309 15:22:55.807518 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-vppnv" podUID="8f839106-1673-4589-9391-0cd7748e658c" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.106:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:55 crc kubenswrapper[4722]: I0309 15:22:55.807653 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-vppnv" Mar 09 15:22:55 crc kubenswrapper[4722]: I0309 15:22:55.808010 4722 patch_prober.go:28] interesting pod/loki-operator-controller-manager-7d6d6698bd-4r85k container/manager namespace/openshift-operators-redhat: Readiness probe status=failure output="Get \"http://10.217.0.50:8081/readyz\": dial tcp 10.217.0.50:8081: connect: connection refused" start-of-body= Mar 09 15:22:55 crc kubenswrapper[4722]: I0309 15:22:55.808059 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators-redhat/loki-operator-controller-manager-7d6d6698bd-4r85k" podUID="497a07fc-9649-4620-9432-855aa3fdc327" containerName="manager" 
probeResult="failure" output="Get \"http://10.217.0.50:8081/readyz\": dial tcp 10.217.0.50:8081: connect: connection refused" Mar 09 15:22:55 crc kubenswrapper[4722]: I0309 15:22:55.808007 4722 patch_prober.go:28] interesting pod/loki-operator-controller-manager-7d6d6698bd-4r85k container/manager namespace/openshift-operators-redhat: Liveness probe status=failure output="Get \"http://10.217.0.50:8081/healthz\": dial tcp 10.217.0.50:8081: connect: connection refused" start-of-body= Mar 09 15:22:55 crc kubenswrapper[4722]: I0309 15:22:55.808123 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operators-redhat/loki-operator-controller-manager-7d6d6698bd-4r85k" podUID="497a07fc-9649-4620-9432-855aa3fdc327" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.50:8081/healthz\": dial tcp 10.217.0.50:8081: connect: connection refused" Mar 09 15:22:55 crc kubenswrapper[4722]: I0309 15:22:55.889458 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-rsd9l" podUID="edd71e1d-6ff0-4918-9cd8-a342efba2df5" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.107:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:55 crc kubenswrapper[4722]: I0309 15:22:55.889478 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/controller-86ddb6bd46-6w5ww" podUID="0b264ba4-1398-4d3e-bb2b-83cd34a0ccf6" containerName="controller" probeResult="failure" output="Get \"http://10.217.0.100:29150/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:55 crc kubenswrapper[4722]: I0309 15:22:55.889593 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-rsd9l" Mar 09 15:22:55 crc kubenswrapper[4722]: I0309 15:22:55.889644 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-86ddb6bd46-6w5ww" Mar 09 15:22:55 crc kubenswrapper[4722]: I0309 15:22:55.930405 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-bmpgd" podUID="a1a5e35a-83f6-4886-86db-55738f51f7e8" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.108:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:55 crc kubenswrapper[4722]: I0309 15:22:55.971626 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-bmpgd" podUID="a1a5e35a-83f6-4886-86db-55738f51f7e8" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.108:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:55 crc kubenswrapper[4722]: I0309 15:22:55.971650 4722 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-fp2th container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.44:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:55 crc kubenswrapper[4722]: I0309 15:22:55.971723 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-bmpgd" Mar 09 15:22:55 
crc kubenswrapper[4722]: I0309 15:22:55.971922 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-vppnv" podUID="8f839106-1673-4589-9391-0cd7748e658c" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.106:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:55 crc kubenswrapper[4722]: I0309 15:22:55.971713 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fp2th" podUID="996087ed-6480-4650-8632-c991e5d16c99" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.44:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:56 crc kubenswrapper[4722]: I0309 15:22:56.054485 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-ct7x8" podUID="f21c35ef-c8ea-4331-a747-44a62c6f2e74" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.109:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:56 crc kubenswrapper[4722]: I0309 15:22:56.136402 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-rsd9l" podUID="edd71e1d-6ff0-4918-9cd8-a342efba2df5" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.107:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:56 crc kubenswrapper[4722]: I0309 15:22:56.136508 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-zjf7b" podUID="22043c71-5292-422c-99e5-c88ea1aef638" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.110:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:56 crc kubenswrapper[4722]: I0309 15:22:56.138062 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-ct7x8" podUID="f21c35ef-c8ea-4331-a747-44a62c6f2e74" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.109:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:56 crc kubenswrapper[4722]: I0309 15:22:56.138191 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-zjf7b" podUID="22043c71-5292-422c-99e5-c88ea1aef638" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.110:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:56 crc kubenswrapper[4722]: I0309 15:22:56.138285 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-zjf7b" Mar 09 15:22:56 crc kubenswrapper[4722]: I0309 15:22:56.157730 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/kube-state-metrics-0" podUID="652c419b-2a86-4c6f-ac7a-c2d7818ef55f" containerName="kube-state-metrics" probeResult="failure" output="Get \"https://10.217.1.16:8081/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 
15:22:56 crc kubenswrapper[4722]: I0309 15:22:56.157871 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/kube-state-metrics-0" podUID="652c419b-2a86-4c6f-ac7a-c2d7818ef55f" containerName="kube-state-metrics" probeResult="failure" output="Get \"https://10.217.1.16:8080/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:56 crc kubenswrapper[4722]: I0309 15:22:56.161173 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/kube-state-metrics-0" Mar 09 15:22:56 crc kubenswrapper[4722]: I0309 15:22:56.161317 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 09 15:22:56 crc kubenswrapper[4722]: I0309 15:22:56.162874 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-state-metrics" containerStatusID={"Type":"cri-o","ID":"095d76880e80c386aa9f7fca9391735b55ff0971e29fbda2b8126c70011b1679"} pod="openstack/kube-state-metrics-0" containerMessage="Container kube-state-metrics failed liveness probe, will be restarted" Mar 09 15:22:56 crc kubenswrapper[4722]: I0309 15:22:56.162941 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="652c419b-2a86-4c6f-ac7a-c2d7818ef55f" containerName="kube-state-metrics" containerID="cri-o://095d76880e80c386aa9f7fca9391735b55ff0971e29fbda2b8126c70011b1679" gracePeriod=30 Mar 09 15:22:56 crc kubenswrapper[4722]: I0309 15:22:56.175194 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-8xdjl" event={"ID":"8b35adf7-a305-4f94-a5c9-02fbc3fca46f","Type":"ContainerDied","Data":"24504e3f082f14957d38d16a3a1fe7333e36577d396d752821c030b29c7c33f8"} Mar 09 15:22:56 crc kubenswrapper[4722]: I0309 15:22:56.192426 4722 generic.go:334] "Generic (PLEG): container finished" podID="8b35adf7-a305-4f94-a5c9-02fbc3fca46f" containerID="24504e3f082f14957d38d16a3a1fe7333e36577d396d752821c030b29c7c33f8" exitCode=0 Mar 09 15:22:56 crc kubenswrapper[4722]: I0309 15:22:56.294173 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/redhat-marketplace-c2mmw" podUID="690e5ab0-3719-40ac-aba6-9278480ecb44" containerName="registry-server" probeResult="failure" output=< Mar 09 15:22:56 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s Mar 09 15:22:56 crc kubenswrapper[4722]: > Mar 09 15:22:56 crc kubenswrapper[4722]: I0309 15:22:56.294505 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-c2mmw" Mar 09 15:22:56 crc kubenswrapper[4722]: I0309 15:22:56.295256 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="registry-server" containerStatusID={"Type":"cri-o","ID":"90d71c8e051db0d4adfc07841fdf4a0886e86003f3b562c362dc6cd509599188"} pod="openshift-marketplace/redhat-marketplace-c2mmw" containerMessage="Container registry-server failed liveness probe, will be restarted" Mar 09 15:22:56 crc kubenswrapper[4722]: I0309 15:22:56.295290 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-c2mmw" podUID="690e5ab0-3719-40ac-aba6-9278480ecb44" containerName="registry-server" containerID="cri-o://90d71c8e051db0d4adfc07841fdf4a0886e86003f3b562c362dc6cd509599188" gracePeriod=30 Mar 09 15:22:56 crc kubenswrapper[4722]: I0309 
15:22:56.306352 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/redhat-operators-v57f2" podUID="4e27c5a4-8cba-4119-8006-f9841d6121dc" containerName="registry-server" probeResult="failure" output=< Mar 09 15:22:56 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s Mar 09 15:22:56 crc kubenswrapper[4722]: > Mar 09 15:22:56 crc kubenswrapper[4722]: I0309 15:22:56.306448 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-v57f2" Mar 09 15:22:56 crc kubenswrapper[4722]: I0309 15:22:56.307095 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/community-operators-rvtqn" podUID="3411289f-3e7c-4e43-b545-5e612822b18e" containerName="registry-server" probeResult="failure" output=< Mar 09 15:22:56 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s Mar 09 15:22:56 crc kubenswrapper[4722]: > Mar 09 15:22:56 crc kubenswrapper[4722]: I0309 15:22:56.365475 4722 prober.go:107] "Probe failed" probeType="Startup" pod="metallb-system/frr-k8s-6vn96" podUID="29ed2858-4fd0-4817-8ed3-b3515ac035d7" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:56 crc kubenswrapper[4722]: I0309 15:22:56.365527 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-hnzfw" podUID="ec9f1f5e-26f5-4683-bf41-c85981da9d18" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.113:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:56 crc kubenswrapper[4722]: I0309 15:22:56.365926 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/kube-state-metrics-0" podUID="652c419b-2a86-4c6f-ac7a-c2d7818ef55f" containerName="kube-state-metrics" probeResult="failure" output="Get \"https://10.217.1.16:8081/readyz\": dial tcp 10.217.1.16:8081: connect: connection refused" Mar 09 15:22:56 crc kubenswrapper[4722]: I0309 15:22:56.365933 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-6vn96" podUID="29ed2858-4fd0-4817-8ed3-b3515ac035d7" containerName="controller" probeResult="failure" output="Get \"http://127.0.0.1:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:56 crc kubenswrapper[4722]: I0309 15:22:56.366271 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-6vn96" Mar 09 15:22:56 crc kubenswrapper[4722]: I0309 15:22:56.449439 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-hnzfw" podUID="ec9f1f5e-26f5-4683-bf41-c85981da9d18" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.113:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:56 crc kubenswrapper[4722]: I0309 15:22:56.449494 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/manila-operator-controller-manager-67d996989d-28855" podUID="dae536b6-7a22-435e-b307-a8ab6b54779d" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.115:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:56 crc kubenswrapper[4722]: I0309 15:22:56.449553 4722 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-hnzfw" Mar 09 15:22:56 crc kubenswrapper[4722]: I0309 15:22:56.531419 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-l8bds" podUID="74cb981b-ce89-479e-8573-fdda25190637" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.116:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:56 crc kubenswrapper[4722]: I0309 15:22:56.531455 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-6vn96" podUID="29ed2858-4fd0-4817-8ed3-b3515ac035d7" containerName="controller" probeResult="failure" output="Get \"http://127.0.0.1:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:56 crc kubenswrapper[4722]: I0309 15:22:56.531580 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-l8bds" Mar 09 15:22:56 crc kubenswrapper[4722]: I0309 15:22:56.531611 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="metallb-system/frr-k8s-6vn96" Mar 09 15:22:56 crc kubenswrapper[4722]: I0309 15:22:56.572481 4722 patch_prober.go:28] interesting pod/router-default-5444994796-dp8wn container/router namespace/openshift-ingress: Readiness probe status=failure output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:56 crc kubenswrapper[4722]: I0309 15:22:56.572535 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ingress/router-default-5444994796-dp8wn" podUID="317444ee-0620-47d2-869e-77578a367a87" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:56 crc kubenswrapper[4722]: I0309 15:22:56.577719 4722 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Readiness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:56 crc kubenswrapper[4722]: I0309 15:22:56.577794 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:56 crc kubenswrapper[4722]: I0309 15:22:56.616453 4722 trace.go:236] Trace[87951939]: "Calculate volume metrics of storage for pod openshift-logging/logging-loki-compactor-0" (09-Mar-2026 15:22:47.169) (total time: 9441ms): Mar 09 15:22:56 crc kubenswrapper[4722]: Trace[87951939]: [9.441178072s] [9.441178072s] END Mar 09 15:22:56 crc kubenswrapper[4722]: I0309 15:22:56.654339 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/manila-operator-controller-manager-67d996989d-28855" podUID="dae536b6-7a22-435e-b307-a8ab6b54779d" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.115:8081/readyz\": context 
deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:56 crc kubenswrapper[4722]: I0309 15:22:56.654368 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-jwgfg" podUID="a9ff56ca-00a6-484f-a477-0dca4f3a0f5c" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.118:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:56 crc kubenswrapper[4722]: I0309 15:22:56.737449 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-l8bds" podUID="74cb981b-ce89-479e-8573-fdda25190637" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.116:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:56 crc kubenswrapper[4722]: I0309 15:22:56.737931 4722 patch_prober.go:28] interesting pod/nmstate-webhook-786f45cff4-jdp69 container/nmstate-webhook namespace/openshift-nmstate: Readiness probe status=failure output="Get \"https://10.217.0.89:9443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:56 crc kubenswrapper[4722]: I0309 15:22:56.737938 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-jwgfg" podUID="a9ff56ca-00a6-484f-a477-0dca4f3a0f5c" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.118:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:56 crc kubenswrapper[4722]: I0309 15:22:56.737981 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-nmstate/nmstate-webhook-786f45cff4-jdp69" podUID="c0af161b-a8d5-4a36-b1c2-0a4d43820c73" containerName="nmstate-webhook" probeResult="failure" output="Get \"https://10.217.0.89:9443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:56 crc kubenswrapper[4722]: I0309 15:22:56.738051 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-786f45cff4-jdp69" Mar 09 15:22:56 crc kubenswrapper[4722]: I0309 15:22:56.738082 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lnkwt" Mar 09 15:22:56 crc kubenswrapper[4722]: I0309 15:22:56.738070 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-n5zc7" podUID="717ffc3a-7a6d-4a7c-837f-d1ed92489b68" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.119:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:56 crc kubenswrapper[4722]: I0309 15:22:56.738104 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-jwgfg" Mar 09 15:22:56 crc kubenswrapper[4722]: I0309 15:22:56.862420 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-hgkzs" podUID="df8b52ff-f61e-4aca-a408-240590699ae6" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.121:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while 
awaiting headers)" Mar 09 15:22:56 crc kubenswrapper[4722]: I0309 15:22:56.911078 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-c2mmw" podUID="690e5ab0-3719-40ac-aba6-9278480ecb44" containerName="registry-server" probeResult="failure" output=< Mar 09 15:22:56 crc kubenswrapper[4722]: timeout: health rpc did not complete within 1s Mar 09 15:22:56 crc kubenswrapper[4722]: > Mar 09 15:22:56 crc kubenswrapper[4722]: E0309 15:22:56.921569 4722 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="90d71c8e051db0d4adfc07841fdf4a0886e86003f3b562c362dc6cd509599188" cmd=["grpc_health_probe","-addr=:50051"] Mar 09 15:22:56 crc kubenswrapper[4722]: E0309 15:22:56.924004 4722 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="90d71c8e051db0d4adfc07841fdf4a0886e86003f3b562c362dc6cd509599188" cmd=["grpc_health_probe","-addr=:50051"] Mar 09 15:22:56 crc kubenswrapper[4722]: E0309 15:22:56.927044 4722 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="90d71c8e051db0d4adfc07841fdf4a0886e86003f3b562c362dc6cd509599188" cmd=["grpc_health_probe","-addr=:50051"] Mar 09 15:22:56 crc kubenswrapper[4722]: E0309 15:22:56.927109 4722 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-c2mmw" podUID="690e5ab0-3719-40ac-aba6-9278480ecb44" containerName="registry-server" Mar 09 15:22:56 crc kubenswrapper[4722]: I0309 15:22:56.945397 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-wrzrq" podUID="e7b4c7c9-7c4f-4a13-8367-759f5f5ce368" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.123:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:56 crc kubenswrapper[4722]: I0309 15:22:56.945543 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-wrzrq" Mar 09 15:22:56 crc kubenswrapper[4722]: I0309 15:22:56.945400 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-n5zc7" podUID="717ffc3a-7a6d-4a7c-837f-d1ed92489b68" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.119:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:56 crc kubenswrapper[4722]: I0309 15:22:56.946412 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-n5zc7" Mar 09 15:22:57 crc kubenswrapper[4722]: I0309 15:22:57.028405 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/telemetry-operator-controller-manager-66c8b7dfbb-m7fv2" podUID="5bf14ad6-64cf-48f7-99e6-fabac12849e2" containerName="manager" probeResult="failure" output="Get 
\"http://10.217.0.124:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:57 crc kubenswrapper[4722]: I0309 15:22:57.028783 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-66c8b7dfbb-m7fv2" Mar 09 15:22:57 crc kubenswrapper[4722]: I0309 15:22:57.028982 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-2nlfl" podUID="8557439a-0367-4823-af83-28955a17cc08" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.99:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:57 crc kubenswrapper[4722]: I0309 15:22:57.069365 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-hgkzs" podUID="df8b52ff-f61e-4aca-a408-240590699ae6" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.121:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:57 crc kubenswrapper[4722]: I0309 15:22:57.069473 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-hgkzs" Mar 09 15:22:57 crc kubenswrapper[4722]: I0309 15:22:57.069382 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-vppnv" podUID="8f839106-1673-4589-9391-0cd7748e658c" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.106:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:57 crc kubenswrapper[4722]: I0309 15:22:57.111409 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-wrzrq" podUID="e7b4c7c9-7c4f-4a13-8367-759f5f5ce368" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.123:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:57 crc kubenswrapper[4722]: I0309 15:22:57.111459 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-rsd9l" podUID="edd71e1d-6ff0-4918-9cd8-a342efba2df5" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.107:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:57 crc kubenswrapper[4722]: I0309 15:22:57.111500 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-56qz9" podUID="a9df5689-5d83-4206-be2b-cf6877d70e23" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.122:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:57 crc kubenswrapper[4722]: I0309 15:22:57.111491 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-56qz9" podUID="a9df5689-5d83-4206-be2b-cf6877d70e23" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.122:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:57 crc kubenswrapper[4722]: I0309 15:22:57.111558 4722 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-56qz9" Mar 09 15:22:57 crc kubenswrapper[4722]: I0309 15:22:57.152441 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-pg8qn" podUID="ef36bc5a-2962-4c1e-a5fd-98f61d525d5d" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.125:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:57 crc kubenswrapper[4722]: I0309 15:22:57.152594 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-pg8qn" Mar 09 15:22:57 crc kubenswrapper[4722]: I0309 15:22:57.206882 4722 generic.go:334] "Generic (PLEG): container finished" podID="497a07fc-9649-4620-9432-855aa3fdc327" containerID="c59716ff78d5c2092732f1060e720e1a4bc901135d08a83828a4032b1d2c6101" exitCode=1 Mar 09 15:22:57 crc kubenswrapper[4722]: I0309 15:22:57.206931 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-7d6d6698bd-4r85k" event={"ID":"497a07fc-9649-4620-9432-855aa3fdc327","Type":"ContainerDied","Data":"c59716ff78d5c2092732f1060e720e1a4bc901135d08a83828a4032b1d2c6101"} Mar 09 15:22:57 crc kubenswrapper[4722]: I0309 15:22:57.207869 4722 patch_prober.go:28] interesting pod/logging-loki-gateway-6c5ff86c56-n5jdr container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.55:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:57 crc kubenswrapper[4722]: I0309 15:22:57.207908 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-6c5ff86c56-n5jdr" podUID="8becd072-3095-4717-a83d-e56cf0d0f816" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.55:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:57 crc kubenswrapper[4722]: I0309 15:22:57.211036 4722 scope.go:117] "RemoveContainer" containerID="c59716ff78d5c2092732f1060e720e1a4bc901135d08a83828a4032b1d2c6101" Mar 09 15:22:57 crc kubenswrapper[4722]: I0309 15:22:57.222415 4722 patch_prober.go:28] interesting pod/logging-loki-gateway-6c5ff86c56-dvps5 container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:57 crc kubenswrapper[4722]: I0309 15:22:57.222476 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-6c5ff86c56-dvps5" podUID="e265fe14-7154-4fbb-a7c3-33557166f71d" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.56:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:57 crc kubenswrapper[4722]: I0309 15:22:57.226229 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-58897d9998-chrnr_b919959a-1da1-4c74-9330-5bb8c33f5c26/console-operator/0.log" Mar 09 15:22:57 crc kubenswrapper[4722]: I0309 15:22:57.226305 4722 generic.go:334] "Generic (PLEG): container finished" podID="b919959a-1da1-4c74-9330-5bb8c33f5c26" containerID="51927e5090e0b3c63faeac1e7ba440eff196cb147decf4416045e5adc2db287e" exitCode=1 Mar 09 15:22:57 crc 
kubenswrapper[4722]: I0309 15:22:57.226385 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-chrnr" event={"ID":"b919959a-1da1-4c74-9330-5bb8c33f5c26","Type":"ContainerDied","Data":"51927e5090e0b3c63faeac1e7ba440eff196cb147decf4416045e5adc2db287e"} Mar 09 15:22:57 crc kubenswrapper[4722]: I0309 15:22:57.230433 4722 generic.go:334] "Generic (PLEG): container finished" podID="652c419b-2a86-4c6f-ac7a-c2d7818ef55f" containerID="095d76880e80c386aa9f7fca9391735b55ff0971e29fbda2b8126c70011b1679" exitCode=2 Mar 09 15:22:57 crc kubenswrapper[4722]: I0309 15:22:57.230654 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"652c419b-2a86-4c6f-ac7a-c2d7818ef55f","Type":"ContainerDied","Data":"095d76880e80c386aa9f7fca9391735b55ff0971e29fbda2b8126c70011b1679"} Mar 09 15:22:57 crc kubenswrapper[4722]: I0309 15:22:57.231165 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="controller" containerStatusID={"Type":"cri-o","ID":"5d32ee9d09c3f1a6f773c148a99e203bcff909b156f0b9409b87196b806bbf1a"} pod="metallb-system/frr-k8s-6vn96" containerMessage="Container controller failed liveness probe, will be restarted" Mar 09 15:22:57 crc kubenswrapper[4722]: I0309 15:22:57.231379 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="metallb-system/frr-k8s-6vn96" podUID="29ed2858-4fd0-4817-8ed3-b3515ac035d7" containerName="controller" containerID="cri-o://5d32ee9d09c3f1a6f773c148a99e203bcff909b156f0b9409b87196b806bbf1a" gracePeriod=2 Mar 09 15:22:57 crc kubenswrapper[4722]: E0309 15:22:57.262747 4722 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e25c11b_f9c6_4542_9c0c_394ea6bc2c17.slice/crio-2947e9366eb22bbce7325286e08c9f598b6274003b4528b8f993e61ce6bd571c.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc219beb3_4ba5_43bd_b2ec_3855d19c2b57.slice/crio-b664453e53fd2ade7f0e7440ee555365a0657f568b5c0929fcced9082eb6ae61.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf67efad4_1b85_4f64_9e98_55eb2da89fb6.slice/crio-40e8a42e49bdaa139e2f793ecc9c4f756fce5f98b5a4f6cd78ab4bf4c69dc0ee.scope\": RecentStats: unable to find data in memory cache]" Mar 09 15:22:57 crc kubenswrapper[4722]: I0309 15:22:57.317469 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-pkbqb" podUID="0eac7341-5bab-4c97-a730-b7eeb0a75899" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.126:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:57 crc kubenswrapper[4722]: I0309 15:22:57.317492 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/telemetry-operator-controller-manager-66c8b7dfbb-m7fv2" podUID="5bf14ad6-64cf-48f7-99e6-fabac12849e2" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.124:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:57 crc kubenswrapper[4722]: I0309 15:22:57.359416 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-zjf7b" 
podUID="22043c71-5292-422c-99e5-c88ea1aef638" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.110:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:57 crc kubenswrapper[4722]: I0309 15:22:57.359523 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-pg8qn" podUID="ef36bc5a-2962-4c1e-a5fd-98f61d525d5d" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.125:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:57 crc kubenswrapper[4722]: I0309 15:22:57.359559 4722 patch_prober.go:28] interesting pod/logging-loki-gateway-6c5ff86c56-n5jdr container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.55:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:57 crc kubenswrapper[4722]: I0309 15:22:57.359614 4722 patch_prober.go:28] interesting pod/logging-loki-gateway-6c5ff86c56-dvps5 container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:57 crc kubenswrapper[4722]: I0309 15:22:57.359652 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-bmpgd" podUID="a1a5e35a-83f6-4886-86db-55738f51f7e8" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.108:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:57 crc kubenswrapper[4722]: I0309 15:22:57.359653 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-6c5ff86c56-dvps5" podUID="e265fe14-7154-4fbb-a7c3-33557166f71d" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:57 crc kubenswrapper[4722]: I0309 15:22:57.359612 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-6c5ff86c56-n5jdr" podUID="8becd072-3095-4717-a83d-e56cf0d0f816" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.55:8083/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:57 crc kubenswrapper[4722]: I0309 15:22:57.359689 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-pkbqb" podUID="0eac7341-5bab-4c97-a730-b7eeb0a75899" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.126:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:57 crc kubenswrapper[4722]: I0309 15:22:57.359772 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-pkbqb" Mar 09 15:22:57 crc kubenswrapper[4722]: I0309 15:22:57.360583 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-56qz9" Mar 09 15:22:57 crc kubenswrapper[4722]: I0309 
15:22:57.412524 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-6vn96" podUID="29ed2858-4fd0-4817-8ed3-b3515ac035d7" containerName="controller" probeResult="failure" output="Get \"http://127.0.0.1:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:57 crc kubenswrapper[4722]: I0309 15:22:57.490786 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-hnzfw" podUID="ec9f1f5e-26f5-4683-bf41-c85981da9d18" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.113:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:57 crc kubenswrapper[4722]: I0309 15:22:57.503416 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-n5zc7" Mar 09 15:22:57 crc kubenswrapper[4722]: I0309 15:22:57.517008 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="0e07fbab-4a47-4e59-aa72-f0a4521296af" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.173:9090/-/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:57 crc kubenswrapper[4722]: I0309 15:22:57.573488 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-l8bds" podUID="74cb981b-ce89-479e-8573-fdda25190637" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.116:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:57 crc kubenswrapper[4722]: I0309 15:22:57.573608 4722 patch_prober.go:28] interesting pod/openshift-kube-scheduler-crc container/kube-scheduler namespace/openshift-kube-scheduler: Liveness probe status=failure output="Get \"https://192.168.126.11:10259/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:57 crc kubenswrapper[4722]: I0309 15:22:57.573741 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.126.11:10259/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:57 crc kubenswrapper[4722]: I0309 15:22:57.573826 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 09 15:22:57 crc kubenswrapper[4722]: I0309 15:22:57.586626 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-pkbqb" Mar 09 15:22:57 crc kubenswrapper[4722]: I0309 15:22:57.594762 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-scheduler" containerStatusID={"Type":"cri-o","ID":"c5c374ffb7928a20af6f154bfca0c494d692f1f6f55eb8de5d4d5db79a355013"} pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" containerMessage="Container kube-scheduler failed liveness probe, will be restarted" Mar 09 15:22:57 crc kubenswrapper[4722]: I0309 15:22:57.594890 4722 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" containerID="cri-o://c5c374ffb7928a20af6f154bfca0c494d692f1f6f55eb8de5d4d5db79a355013" gracePeriod=30 Mar 09 15:22:57 crc kubenswrapper[4722]: I0309 15:22:57.609846 4722 patch_prober.go:28] interesting pod/thanos-querier-5f6c868b98-s8fg2 container/kube-rbac-proxy-web namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.81:9091/-/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:57 crc kubenswrapper[4722]: I0309 15:22:57.610232 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/thanos-querier-5f6c868b98-s8fg2" podUID="1eb00f12-c24e-46dc-8346-c096826564f5" containerName="kube-rbac-proxy-web" probeResult="failure" output="Get \"https://10.217.0.81:9091/-/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:57 crc kubenswrapper[4722]: I0309 15:22:57.685736 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-jwgfg" Mar 09 15:22:57 crc kubenswrapper[4722]: I0309 15:22:57.778163 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-operator-index-7qbrd" podUID="8ac36d47-4501-4033-aee7-ce9ed8ed7002" containerName="registry-server" probeResult="failure" output="command timed out" Mar 09 15:22:57 crc kubenswrapper[4722]: I0309 15:22:57.778287 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack-operators/openstack-operator-index-7qbrd" Mar 09 15:22:57 crc kubenswrapper[4722]: I0309 15:22:57.778976 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="e4a22f8c-ed38-47cf-8238-baf804f573a1" containerName="ceilometer-notification-agent" probeResult="failure" output="command timed out" Mar 09 15:22:57 crc kubenswrapper[4722]: I0309 15:22:57.779067 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-index-7qbrd" podUID="8ac36d47-4501-4033-aee7-ce9ed8ed7002" containerName="registry-server" probeResult="failure" output="command timed out" Mar 09 15:22:57 crc kubenswrapper[4722]: I0309 15:22:57.779128 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-7qbrd" Mar 09 15:22:57 crc kubenswrapper[4722]: I0309 15:22:57.796947 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="registry-server" containerStatusID={"Type":"cri-o","ID":"6adf907913ba33a54b43059100ba51bbb9f8f927134943ccbc706b2478b42924"} pod="openstack-operators/openstack-operator-index-7qbrd" containerMessage="Container registry-server failed liveness probe, will be restarted" Mar 09 15:22:57 crc kubenswrapper[4722]: I0309 15:22:57.797057 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-7qbrd" podUID="8ac36d47-4501-4033-aee7-ce9ed8ed7002" containerName="registry-server" containerID="cri-o://6adf907913ba33a54b43059100ba51bbb9f8f927134943ccbc706b2478b42924" gracePeriod=30 Mar 09 15:22:57 crc kubenswrapper[4722]: I0309 15:22:57.862747 4722 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-svdsk container/marketplace-operator namespace/openshift-marketplace: Liveness probe status=failure output="Get 
\"http://10.217.0.68:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:57 crc kubenswrapper[4722]: I0309 15:22:57.862808 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/marketplace-operator-79b997595-svdsk" podUID="ea964ea5-3fad-4bd0-8ffe-d78f00229fbe" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.68:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:57 crc kubenswrapper[4722]: I0309 15:22:57.862874 4722 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-svdsk container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.68:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:57 crc kubenswrapper[4722]: I0309 15:22:57.862911 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-svdsk" podUID="ea964ea5-3fad-4bd0-8ffe-d78f00229fbe" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.68:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:57 crc kubenswrapper[4722]: I0309 15:22:57.862967 4722 patch_prober.go:28] interesting pod/nmstate-webhook-786f45cff4-jdp69 container/nmstate-webhook namespace/openshift-nmstate: Readiness probe status=failure output="Get \"https://10.217.0.89:9443/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:57 crc kubenswrapper[4722]: I0309 15:22:57.862989 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-nmstate/nmstate-webhook-786f45cff4-jdp69" podUID="c0af161b-a8d5-4a36-b1c2-0a4d43820c73" containerName="nmstate-webhook" probeResult="failure" output="Get \"https://10.217.0.89:9443/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:57 crc kubenswrapper[4722]: I0309 15:22:57.887901 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-66c8b7dfbb-m7fv2" Mar 09 15:22:57 crc kubenswrapper[4722]: I0309 15:22:57.891571 4722 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Liveness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:57 crc kubenswrapper[4722]: I0309 15:22:57.891633 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:57 crc kubenswrapper[4722]: I0309 15:22:57.893671 4722 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-2mf4l container/oauth-apiserver namespace/openshift-oauth-apiserver: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:57 crc kubenswrapper[4722]: I0309 
15:22:57.894216 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-2mf4l" podUID="062db6f1-77ab-4eca-be53-6480160aff81" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.10:8443/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:58 crc kubenswrapper[4722]: I0309 15:22:58.001926 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-wrzrq" podUID="e7b4c7c9-7c4f-4a13-8367-759f5f5ce368" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.123:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:58 crc kubenswrapper[4722]: I0309 15:22:58.135459 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-hgkzs" podUID="df8b52ff-f61e-4aca-a408-240590699ae6" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.121:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:58 crc kubenswrapper[4722]: I0309 15:22:58.243540 4722 generic.go:334] "Generic (PLEG): container finished" podID="f67efad4-1b85-4f64-9e98-55eb2da89fb6" containerID="40e8a42e49bdaa139e2f793ecc9c4f756fce5f98b5a4f6cd78ab4bf4c69dc0ee" exitCode=137 Mar 09 15:22:58 crc kubenswrapper[4722]: I0309 15:22:58.243644 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-798745ff96-864pz" event={"ID":"f67efad4-1b85-4f64-9e98-55eb2da89fb6","Type":"ContainerDied","Data":"40e8a42e49bdaa139e2f793ecc9c4f756fce5f98b5a4f6cd78ab4bf4c69dc0ee"} Mar 09 15:22:58 crc kubenswrapper[4722]: I0309 15:22:58.248379 4722 generic.go:334] "Generic (PLEG): container finished" podID="c219beb3-4ba5-43bd-b2ec-3855d19c2b57" containerID="b664453e53fd2ade7f0e7440ee555365a0657f568b5c0929fcced9082eb6ae61" exitCode=0 Mar 09 15:22:58 crc kubenswrapper[4722]: I0309 15:22:58.248439 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dgxnd" event={"ID":"c219beb3-4ba5-43bd-b2ec-3855d19c2b57","Type":"ContainerDied","Data":"b664453e53fd2ade7f0e7440ee555365a0657f568b5c0929fcced9082eb6ae61"} Mar 09 15:22:58 crc kubenswrapper[4722]: I0309 15:22:58.250566 4722 generic.go:334] "Generic (PLEG): container finished" podID="5e25c11b-f9c6-4542-9c0c-394ea6bc2c17" containerID="2947e9366eb22bbce7325286e08c9f598b6274003b4528b8f993e61ce6bd571c" exitCode=1 Mar 09 15:22:58 crc kubenswrapper[4722]: I0309 15:22:58.250665 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9ctn5qw" event={"ID":"5e25c11b-f9c6-4542-9c0c-394ea6bc2c17","Type":"ContainerDied","Data":"2947e9366eb22bbce7325286e08c9f598b6274003b4528b8f993e61ce6bd571c"} Mar 09 15:22:58 crc kubenswrapper[4722]: I0309 15:22:58.252828 4722 scope.go:117] "RemoveContainer" containerID="2947e9366eb22bbce7325286e08c9f598b6274003b4528b8f993e61ce6bd571c" Mar 09 15:22:58 crc kubenswrapper[4722]: I0309 15:22:58.277510 4722 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:6443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" 
start-of-body= Mar 09 15:22:58 crc kubenswrapper[4722]: I0309 15:22:58.277590 4722 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:58 crc kubenswrapper[4722]: I0309 15:22:58.277576 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:58 crc kubenswrapper[4722]: I0309 15:22:58.277648 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:58 crc kubenswrapper[4722]: I0309 15:22:58.278043 4722 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:6443/livez?exclude=etcd\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:58 crc kubenswrapper[4722]: I0309 15:22:58.278071 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/livez?exclude=etcd\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:58 crc kubenswrapper[4722]: I0309 15:22:58.278113 4722 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:58 crc kubenswrapper[4722]: I0309 15:22:58.278135 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:58 crc kubenswrapper[4722]: I0309 15:22:58.278071 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-pg8qn" podUID="ef36bc5a-2962-4c1e-a5fd-98f61d525d5d" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.125:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:58 crc kubenswrapper[4722]: I0309 15:22:58.390430 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-6vn96" Mar 09 15:22:58 crc kubenswrapper[4722]: I0309 15:22:58.581370 4722 patch_prober.go:28] interesting 
pod/openshift-kube-scheduler-crc container/kube-scheduler namespace/openshift-kube-scheduler: Readiness probe status=failure output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:58 crc kubenswrapper[4722]: I0309 15:22:58.581480 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:58 crc kubenswrapper[4722]: I0309 15:22:58.913492 4722 patch_prober.go:28] interesting pod/route-controller-manager-67579949ff-g69dw container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.66:8443/healthz\": dial tcp 10.217.0.66:8443: connect: connection refused" start-of-body= Mar 09 15:22:58 crc kubenswrapper[4722]: I0309 15:22:58.913840 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-67579949ff-g69dw" podUID="fd45c8c5-9cad-404b-b14a-9cbc710c8468" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.66:8443/healthz\": dial tcp 10.217.0.66:8443: connect: connection refused" Mar 09 15:22:59 crc kubenswrapper[4722]: I0309 15:22:59.002461 4722 patch_prober.go:28] interesting pod/oauth-openshift-5db757fd5b-t57qc container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.64:6443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:59 crc kubenswrapper[4722]: I0309 15:22:59.002530 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-5db757fd5b-t57qc" podUID="ec3da47d-c782-4189-b195-d6b203bd7f7a" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.64:6443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 15:22:59 crc kubenswrapper[4722]: I0309 15:22:59.265454 4722 generic.go:334] "Generic (PLEG): container finished" podID="14655c3d-02fe-4215-b566-0c4008fd34a0" containerID="620d6d1f26b1e3ccb96ee8067fcb299cd682674fff256e53875db302966d4f26" exitCode=0 Mar 09 15:22:59 crc kubenswrapper[4722]: I0309 15:22:59.265545 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-lc5zn" event={"ID":"14655c3d-02fe-4215-b566-0c4008fd34a0","Type":"ContainerDied","Data":"620d6d1f26b1e3ccb96ee8067fcb299cd682674fff256e53875db302966d4f26"} Mar 09 15:22:59 crc kubenswrapper[4722]: I0309 15:22:59.270249 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-8xdjl" event={"ID":"8b35adf7-a305-4f94-a5c9-02fbc3fca46f","Type":"ContainerStarted","Data":"c9dfac53693d62b45fa308b759a037ff2b4b84033e558d03cf7f90c2db64c779"} Mar 09 15:22:59 crc kubenswrapper[4722]: I0309 15:22:59.270670 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-8xdjl" Mar 09 15:22:59 crc kubenswrapper[4722]: I0309 15:22:59.270994 4722 patch_prober.go:28] interesting 
pod/prometheus-operator-admission-webhook-f54c54754-8xdjl container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.75:8443/healthz\": dial tcp 10.217.0.75:8443: connect: connection refused" start-of-body= Mar 09 15:22:59 crc kubenswrapper[4722]: I0309 15:22:59.271036 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-8xdjl" podUID="8b35adf7-a305-4f94-a5c9-02fbc3fca46f" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.75:8443/healthz\": dial tcp 10.217.0.75:8443: connect: connection refused" Mar 09 15:22:59 crc kubenswrapper[4722]: I0309 15:22:59.274180 4722 generic.go:334] "Generic (PLEG): container finished" podID="f8055a95-6b09-4e32-88b8-82ad36ca5029" containerID="d708c56ace6a0794be5e9b44e9d196661f8745ae35cf30b479af812d413dbd2a" exitCode=0 Mar 09 15:22:59 crc kubenswrapper[4722]: I0309 15:22:59.274264 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lnkwt" event={"ID":"f8055a95-6b09-4e32-88b8-82ad36ca5029","Type":"ContainerDied","Data":"d708c56ace6a0794be5e9b44e9d196661f8745ae35cf30b479af812d413dbd2a"} Mar 09 15:22:59 crc kubenswrapper[4722]: I0309 15:22:59.280430 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-58897d9998-chrnr_b919959a-1da1-4c74-9330-5bb8c33f5c26/console-operator/0.log" Mar 09 15:22:59 crc kubenswrapper[4722]: I0309 15:22:59.280610 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-chrnr" event={"ID":"b919959a-1da1-4c74-9330-5bb8c33f5c26","Type":"ContainerStarted","Data":"9ab0c40031b5798f443fa13ee54ec663456e9d642dd4d74a1442ef8ef610ce37"} Mar 09 15:22:59 crc kubenswrapper[4722]: I0309 15:22:59.280851 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-chrnr" Mar 09 15:22:59 crc kubenswrapper[4722]: I0309 15:22:59.281442 4722 patch_prober.go:28] interesting pod/console-operator-58897d9998-chrnr container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Mar 09 15:22:59 crc kubenswrapper[4722]: I0309 15:22:59.281506 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-chrnr" podUID="b919959a-1da1-4c74-9330-5bb8c33f5c26" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused" Mar 09 15:22:59 crc kubenswrapper[4722]: I0309 15:22:59.289294 4722 generic.go:334] "Generic (PLEG): container finished" podID="8557439a-0367-4823-af83-28955a17cc08" containerID="e319108c72ee500be331451610739dd7591d749c08a2d18d2a197ae9514e65d4" exitCode=0 Mar 09 15:22:59 crc kubenswrapper[4722]: I0309 15:22:59.289508 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-2nlfl" event={"ID":"8557439a-0367-4823-af83-28955a17cc08","Type":"ContainerDied","Data":"e319108c72ee500be331451610739dd7591d749c08a2d18d2a197ae9514e65d4"} Mar 09 15:22:59 crc kubenswrapper[4722]: I0309 15:22:59.302762 4722 generic.go:334] "Generic (PLEG): container finished" 
podID="29ed2858-4fd0-4817-8ed3-b3515ac035d7" containerID="5d32ee9d09c3f1a6f773c148a99e203bcff909b156f0b9409b87196b806bbf1a" exitCode=0 Mar 09 15:22:59 crc kubenswrapper[4722]: I0309 15:22:59.302847 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6vn96" event={"ID":"29ed2858-4fd0-4817-8ed3-b3515ac035d7","Type":"ContainerDied","Data":"5d32ee9d09c3f1a6f773c148a99e203bcff909b156f0b9409b87196b806bbf1a"} Mar 09 15:22:59 crc kubenswrapper[4722]: I0309 15:22:59.304945 4722 generic.go:334] "Generic (PLEG): container finished" podID="3ea04cb5-4d36-42a9-bb83-c6f943619d16" containerID="0d966fe7c2e8e3b2f7acee820d22775c7a77d23eaef3601e29397335bcda71ac" exitCode=1 Mar 09 15:22:59 crc kubenswrapper[4722]: I0309 15:22:59.305030 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5689854475-89q94" event={"ID":"3ea04cb5-4d36-42a9-bb83-c6f943619d16","Type":"ContainerDied","Data":"0d966fe7c2e8e3b2f7acee820d22775c7a77d23eaef3601e29397335bcda71ac"} Mar 09 15:22:59 crc kubenswrapper[4722]: I0309 15:22:59.306517 4722 scope.go:117] "RemoveContainer" containerID="0d966fe7c2e8e3b2f7acee820d22775c7a77d23eaef3601e29397335bcda71ac" Mar 09 15:22:59 crc kubenswrapper[4722]: I0309 15:22:59.309309 4722 generic.go:334] "Generic (PLEG): container finished" podID="0b264ba4-1398-4d3e-bb2b-83cd34a0ccf6" containerID="a1a02869fd9ddd439c962393949fee7151323c975474b74ba0c3e6af42e7e3d9" exitCode=137 Mar 09 15:22:59 crc kubenswrapper[4722]: I0309 15:22:59.309375 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-6w5ww" event={"ID":"0b264ba4-1398-4d3e-bb2b-83cd34a0ccf6","Type":"ContainerDied","Data":"a1a02869fd9ddd439c962393949fee7151323c975474b74ba0c3e6af42e7e3d9"} Mar 09 15:22:59 crc kubenswrapper[4722]: I0309 15:22:59.313175 4722 generic.go:334] "Generic (PLEG): container finished" podID="3411289f-3e7c-4e43-b545-5e612822b18e" containerID="6caf48dd254c82f3e713c8b7d146aa636738f0e702e7888634d4fbddac4cdbd7" exitCode=0 Mar 09 15:22:59 crc kubenswrapper[4722]: I0309 15:22:59.313370 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rvtqn" event={"ID":"3411289f-3e7c-4e43-b545-5e612822b18e","Type":"ContainerDied","Data":"6caf48dd254c82f3e713c8b7d146aa636738f0e702e7888634d4fbddac4cdbd7"} Mar 09 15:22:59 crc kubenswrapper[4722]: I0309 15:22:59.495026 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Mar 09 15:22:59 crc kubenswrapper[4722]: I0309 15:22:59.774848 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-k8s-0" podUID="df07ee71-98db-42a1-9df4-6f8707504f08" containerName="prometheus" probeResult="failure" output="command timed out" Mar 09 15:22:59 crc kubenswrapper[4722]: I0309 15:22:59.774916 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Mar 09 15:22:59 crc kubenswrapper[4722]: I0309 15:22:59.776393 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-k8s-0" podUID="df07ee71-98db-42a1-9df4-6f8707504f08" containerName="prometheus" probeResult="failure" output="command timed out" Mar 09 15:22:59 crc kubenswrapper[4722]: I0309 15:22:59.783824 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="prometheus" 
containerStatusID={"Type":"cri-o","ID":"5c6af4f5b1390e0fa350198e4b30efc579a7139fde74125479e6797423853804"} pod="openshift-monitoring/prometheus-k8s-0" containerMessage="Container prometheus failed liveness probe, will be restarted" Mar 09 15:22:59 crc kubenswrapper[4722]: I0309 15:22:59.783964 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="df07ee71-98db-42a1-9df4-6f8707504f08" containerName="prometheus" containerID="cri-o://5c6af4f5b1390e0fa350198e4b30efc579a7139fde74125479e6797423853804" gracePeriod=600 Mar 09 15:22:59 crc kubenswrapper[4722]: I0309 15:22:59.907447 4722 patch_prober.go:28] interesting pod/controller-manager-8445c785c8-hdmgl container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.65:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:22:59 crc kubenswrapper[4722]: I0309 15:22:59.907732 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-8445c785c8-hdmgl" podUID="7139e62b-5e90-4545-a264-aa8138821a55" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.65:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 15:23:00 crc kubenswrapper[4722]: I0309 15:23:00.356965 4722 generic.go:334] "Generic (PLEG): container finished" podID="4e27c5a4-8cba-4119-8006-f9841d6121dc" containerID="39debea8587f6e6f170a7177c41011980494388f1c8da8fb895edab231f9cc8f" exitCode=0 Mar 09 15:23:00 crc kubenswrapper[4722]: I0309 15:23:00.357082 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v57f2" event={"ID":"4e27c5a4-8cba-4119-8006-f9841d6121dc","Type":"ContainerDied","Data":"39debea8587f6e6f170a7177c41011980494388f1c8da8fb895edab231f9cc8f"} Mar 09 15:23:00 crc kubenswrapper[4722]: I0309 15:23:00.363135 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lnkwt" event={"ID":"f8055a95-6b09-4e32-88b8-82ad36ca5029","Type":"ContainerStarted","Data":"8392b4c375c2f393b78e68b2d6d47e60cbe28f71ff6fcf50508c36854940dd01"} Mar 09 15:23:00 crc kubenswrapper[4722]: I0309 15:23:00.363469 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lnkwt" Mar 09 15:23:00 crc kubenswrapper[4722]: I0309 15:23:00.363945 4722 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-lnkwt container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" start-of-body= Mar 09 15:23:00 crc kubenswrapper[4722]: I0309 15:23:00.364011 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lnkwt" podUID="f8055a95-6b09-4e32-88b8-82ad36ca5029" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" Mar 09 15:23:00 crc kubenswrapper[4722]: I0309 15:23:00.368074 4722 generic.go:334] "Generic (PLEG): container finished" podID="e4a22f8c-ed38-47cf-8238-baf804f573a1" 
containerID="cd582a254af464cea392ee9b3f2a8bbe909916d04f0053279129bca6b32f8102" exitCode=0 Mar 09 15:23:00 crc kubenswrapper[4722]: I0309 15:23:00.368124 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4a22f8c-ed38-47cf-8238-baf804f573a1","Type":"ContainerDied","Data":"cd582a254af464cea392ee9b3f2a8bbe909916d04f0053279129bca6b32f8102"} Mar 09 15:23:00 crc kubenswrapper[4722]: I0309 15:23:00.397309 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-6vn96" event={"ID":"29ed2858-4fd0-4817-8ed3-b3515ac035d7","Type":"ContainerStarted","Data":"7fc410c7f2062f45b147b886c46c6bc2ad8b1c9bc4e837d458172e9bc44c0d19"} Mar 09 15:23:00 crc kubenswrapper[4722]: I0309 15:23:00.397604 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-6vn96" Mar 09 15:23:00 crc kubenswrapper[4722]: I0309 15:23:00.398008 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-6vn96" Mar 09 15:23:00 crc kubenswrapper[4722]: I0309 15:23:00.400806 4722 generic.go:334] "Generic (PLEG): container finished" podID="ec9f1f5e-26f5-4683-bf41-c85981da9d18" containerID="ed3f78f85b1a4a65b61d1a60a0bc4c588ac85a3fc264827eba225ad765864e3b" exitCode=1 Mar 09 15:23:00 crc kubenswrapper[4722]: I0309 15:23:00.400892 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-hnzfw" event={"ID":"ec9f1f5e-26f5-4683-bf41-c85981da9d18","Type":"ContainerDied","Data":"ed3f78f85b1a4a65b61d1a60a0bc4c588ac85a3fc264827eba225ad765864e3b"} Mar 09 15:23:00 crc kubenswrapper[4722]: I0309 15:23:00.402090 4722 scope.go:117] "RemoveContainer" containerID="ed3f78f85b1a4a65b61d1a60a0bc4c588ac85a3fc264827eba225ad765864e3b" Mar 09 15:23:00 crc kubenswrapper[4722]: I0309 15:23:00.406921 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9ctn5qw" event={"ID":"5e25c11b-f9c6-4542-9c0c-394ea6bc2c17","Type":"ContainerStarted","Data":"3f7c4efc45b814f9eef69f8dd05dfe9377576e125d3272ddd3a3695e58cf8f05"} Mar 09 15:23:00 crc kubenswrapper[4722]: I0309 15:23:00.407223 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9ctn5qw" Mar 09 15:23:00 crc kubenswrapper[4722]: I0309 15:23:00.410631 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-6w5ww" event={"ID":"0b264ba4-1398-4d3e-bb2b-83cd34a0ccf6","Type":"ContainerStarted","Data":"da6879719f7719ef2aa782c1c092d09bc3195e3f95287048c4da669f07f24e9c"} Mar 09 15:23:00 crc kubenswrapper[4722]: I0309 15:23:00.410776 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-86ddb6bd46-6w5ww" Mar 09 15:23:00 crc kubenswrapper[4722]: I0309 15:23:00.414323 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-798745ff96-864pz" event={"ID":"f67efad4-1b85-4f64-9e98-55eb2da89fb6","Type":"ContainerStarted","Data":"660fad02b63164b1ee452b62d6450dca271332a15b461dcdf62806e2e81ca4c0"} Mar 09 15:23:00 crc kubenswrapper[4722]: I0309 15:23:00.414372 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-798745ff96-864pz" Mar 09 15:23:00 crc kubenswrapper[4722]: I0309 15:23:00.420309 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Mar 09 15:23:00 crc kubenswrapper[4722]: I0309 15:23:00.421902 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 09 15:23:00 crc kubenswrapper[4722]: I0309 15:23:00.426577 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 09 15:23:00 crc kubenswrapper[4722]: I0309 15:23:00.426641 4722 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="bcddc7eafb6c9687b9000e59dccf3f95a805e933ddcd42fb6f704c8b18dd5257" exitCode=1 Mar 09 15:23:00 crc kubenswrapper[4722]: I0309 15:23:00.426721 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"bcddc7eafb6c9687b9000e59dccf3f95a805e933ddcd42fb6f704c8b18dd5257"} Mar 09 15:23:00 crc kubenswrapper[4722]: I0309 15:23:00.426773 4722 scope.go:117] "RemoveContainer" containerID="ff66d43b6e872eed24aacaf3ea882c7148b6ef62bbe5b7bf10ede5d2691a3680" Mar 09 15:23:00 crc kubenswrapper[4722]: I0309 15:23:00.427731 4722 scope.go:117] "RemoveContainer" containerID="bcddc7eafb6c9687b9000e59dccf3f95a805e933ddcd42fb6f704c8b18dd5257" Mar 09 15:23:00 crc kubenswrapper[4722]: I0309 15:23:00.430016 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5689854475-89q94" event={"ID":"3ea04cb5-4d36-42a9-bb83-c6f943619d16","Type":"ContainerStarted","Data":"e42d3f4e218dcdff490e2696c6ee2e2628b1e9cf4b4289441817b71729da7d17"} Mar 09 15:23:00 crc kubenswrapper[4722]: I0309 15:23:00.430297 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-5689854475-89q94" Mar 09 15:23:00 crc kubenswrapper[4722]: I0309 15:23:00.435938 4722 generic.go:334] "Generic (PLEG): container finished" podID="690e5ab0-3719-40ac-aba6-9278480ecb44" containerID="90d71c8e051db0d4adfc07841fdf4a0886e86003f3b562c362dc6cd509599188" exitCode=0 Mar 09 15:23:00 crc kubenswrapper[4722]: I0309 15:23:00.436012 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c2mmw" event={"ID":"690e5ab0-3719-40ac-aba6-9278480ecb44","Type":"ContainerDied","Data":"90d71c8e051db0d4adfc07841fdf4a0886e86003f3b562c362dc6cd509599188"} Mar 09 15:23:00 crc kubenswrapper[4722]: I0309 15:23:00.446235 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dgxnd" event={"ID":"c219beb3-4ba5-43bd-b2ec-3855d19c2b57","Type":"ContainerStarted","Data":"049784189ccaad8b24dee7613306087361e1f7da80d6c3e1a5429117e2560239"} Mar 09 15:23:00 crc kubenswrapper[4722]: I0309 15:23:00.446726 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dgxnd" Mar 09 15:23:00 crc kubenswrapper[4722]: I0309 15:23:00.447060 4722 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-dgxnd container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get 
\"https://10.217.0.42:8443/healthz\": dial tcp 10.217.0.42:8443: connect: connection refused" start-of-body= Mar 09 15:23:00 crc kubenswrapper[4722]: I0309 15:23:00.447127 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dgxnd" podUID="c219beb3-4ba5-43bd-b2ec-3855d19c2b57" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.42:8443/healthz\": dial tcp 10.217.0.42:8443: connect: connection refused" Mar 09 15:23:00 crc kubenswrapper[4722]: I0309 15:23:00.449328 4722 generic.go:334] "Generic (PLEG): container finished" podID="7a62b98d-e9d4-4cbc-bea8-0da13fcc4467" containerID="32a0e58da063dd7bade8b3cf5a28e5e43ca5f2b14ee02f6b41515830777fa4f0" exitCode=1 Mar 09 15:23:00 crc kubenswrapper[4722]: I0309 15:23:00.449389 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-lvfgg" event={"ID":"7a62b98d-e9d4-4cbc-bea8-0da13fcc4467","Type":"ContainerDied","Data":"32a0e58da063dd7bade8b3cf5a28e5e43ca5f2b14ee02f6b41515830777fa4f0"} Mar 09 15:23:00 crc kubenswrapper[4722]: I0309 15:23:00.450351 4722 scope.go:117] "RemoveContainer" containerID="32a0e58da063dd7bade8b3cf5a28e5e43ca5f2b14ee02f6b41515830777fa4f0" Mar 09 15:23:00 crc kubenswrapper[4722]: I0309 15:23:00.454573 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-7d6d6698bd-4r85k" event={"ID":"497a07fc-9649-4620-9432-855aa3fdc327","Type":"ContainerStarted","Data":"ee5a0bb28ba65742cd01112b1a41ffc02574b8a908ffd66aef12e33dddf7a212"} Mar 09 15:23:00 crc kubenswrapper[4722]: I0309 15:23:00.454803 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators-redhat/loki-operator-controller-manager-7d6d6698bd-4r85k" Mar 09 15:23:00 crc kubenswrapper[4722]: I0309 15:23:00.458698 4722 generic.go:334] "Generic (PLEG): container finished" podID="996087ed-6480-4650-8632-c991e5d16c99" containerID="4b273112c5687a28adabe3b469c671154db487b3bd47cef0ffa59c7a855124c1" exitCode=0 Mar 09 15:23:00 crc kubenswrapper[4722]: I0309 15:23:00.458760 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fp2th" event={"ID":"996087ed-6480-4650-8632-c991e5d16c99","Type":"ContainerDied","Data":"4b273112c5687a28adabe3b469c671154db487b3bd47cef0ffa59c7a855124c1"} Mar 09 15:23:00 crc kubenswrapper[4722]: I0309 15:23:00.461541 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-2nlfl" event={"ID":"8557439a-0367-4823-af83-28955a17cc08","Type":"ContainerStarted","Data":"03edb554584ee27041b659b7987c203895ec36f369cdfa87f0bfbbf7a96a3939"} Mar 09 15:23:00 crc kubenswrapper[4722]: I0309 15:23:00.462112 4722 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-8xdjl container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.75:8443/healthz\": dial tcp 10.217.0.75:8443: connect: connection refused" start-of-body= Mar 09 15:23:00 crc kubenswrapper[4722]: I0309 15:23:00.462133 4722 patch_prober.go:28] interesting pod/console-operator-58897d9998-chrnr container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Mar 09 15:23:00 crc 
kubenswrapper[4722]: I0309 15:23:00.462154 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-8xdjl" podUID="8b35adf7-a305-4f94-a5c9-02fbc3fca46f" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.75:8443/healthz\": dial tcp 10.217.0.75:8443: connect: connection refused" Mar 09 15:23:00 crc kubenswrapper[4722]: I0309 15:23:00.462178 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-chrnr" podUID="b919959a-1da1-4c74-9330-5bb8c33f5c26" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused" Mar 09 15:23:00 crc kubenswrapper[4722]: I0309 15:23:00.462474 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-2nlfl" podUID="8557439a-0367-4823-af83-28955a17cc08" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.99:7572/metrics\": dial tcp 10.217.0.99:7572: connect: connection refused" Mar 09 15:23:00 crc kubenswrapper[4722]: I0309 15:23:00.775879 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="a79aaedb-92ba-42fc-8cc7-9ecb007d2ac7" containerName="galera" probeResult="failure" output="command timed out" Mar 09 15:23:00 crc kubenswrapper[4722]: I0309 15:23:00.992832 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-distributor-5d5548c9f5-r6x4b" Mar 09 15:23:00 crc kubenswrapper[4722]: I0309 15:23:00.997882 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-querier-76bf7b6d45-fv4dh" Mar 09 15:23:01 crc kubenswrapper[4722]: I0309 15:23:01.156441 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/kube-state-metrics-0" podUID="652c419b-2a86-4c6f-ac7a-c2d7818ef55f" containerName="kube-state-metrics" probeResult="failure" output="Get \"https://10.217.1.16:8081/readyz\": dial tcp 10.217.1.16:8081: connect: connection refused" Mar 09 15:23:01 crc kubenswrapper[4722]: I0309 15:23:01.201220 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-query-frontend-6d6859c548-5b72h" Mar 09 15:23:01 crc kubenswrapper[4722]: I0309 15:23:01.254189 4722 patch_prober.go:28] interesting pod/monitoring-plugin-65599947bd-42bk4 container/monitoring-plugin namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.84:9443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:23:01 crc kubenswrapper[4722]: I0309 15:23:01.254276 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/monitoring-plugin-65599947bd-42bk4" podUID="85f5a76e-3679-44b3-8932-f5245c49b481" containerName="monitoring-plugin" probeResult="failure" output="Get \"https://10.217.0.84:9443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 09 15:23:01 crc kubenswrapper[4722]: I0309 15:23:01.292160 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-k8s-0" podUID="df07ee71-98db-42a1-9df4-6f8707504f08" containerName="prometheus" probeResult="failure" output=< Mar 09 
15:23:01 crc kubenswrapper[4722]: % Total % Received % Xferd Average Speed Time Time Time Current Mar 09 15:23:01 crc kubenswrapper[4722]: Dload Upload Total Spent Left Speed Mar 09 15:23:01 crc kubenswrapper[4722]: [166B blob data] Mar 09 15:23:01 crc kubenswrapper[4722]: curl: (22) The requested URL returned error: 503 Mar 09 15:23:01 crc kubenswrapper[4722]: > Mar 09 15:23:01 crc kubenswrapper[4722]: E0309 15:23:01.304998 4722 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5c6af4f5b1390e0fa350198e4b30efc579a7139fde74125479e6797423853804" cmd=["sh","-c","if [ -x \"$(command -v curl)\" ]; then exec curl --fail http://localhost:9090/-/ready; elif [ -x \"$(command -v wget)\" ]; then exec wget -q -O /dev/null http://localhost:9090/-/ready; else exit 1; fi"] Mar 09 15:23:01 crc kubenswrapper[4722]: E0309 15:23:01.309107 4722 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5c6af4f5b1390e0fa350198e4b30efc579a7139fde74125479e6797423853804" cmd=["sh","-c","if [ -x \"$(command -v curl)\" ]; then exec curl --fail http://localhost:9090/-/ready; elif [ -x \"$(command -v wget)\" ]; then exec wget -q -O /dev/null http://localhost:9090/-/ready; else exit 1; fi"] Mar 09 15:23:01 crc kubenswrapper[4722]: E0309 15:23:01.313852 4722 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="5c6af4f5b1390e0fa350198e4b30efc579a7139fde74125479e6797423853804" cmd=["sh","-c","if [ -x \"$(command -v curl)\" ]; then exec curl --fail http://localhost:9090/-/ready; elif [ -x \"$(command -v wget)\" ]; then exec wget -q -O /dev/null http://localhost:9090/-/ready; else exit 1; fi"] Mar 09 15:23:01 crc kubenswrapper[4722]: E0309 15:23:01.313988 4722 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-monitoring/prometheus-k8s-0" podUID="df07ee71-98db-42a1-9df4-6f8707504f08" containerName="prometheus" Mar 09 15:23:01 crc kubenswrapper[4722]: I0309 15:23:01.434369 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-controller-manager-55cd86c56-dm2dr" podUID="98c22319-d5f8-4a0b-8a30-89b9d832f354" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.127:8081/readyz\": dial tcp 10.217.0.127:8081: connect: connection refused" Mar 09 15:23:01 crc kubenswrapper[4722]: I0309 15:23:01.434499 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-55cd86c56-dm2dr" Mar 09 15:23:01 crc kubenswrapper[4722]: I0309 15:23:01.435329 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-controller-manager-55cd86c56-dm2dr" podUID="98c22319-d5f8-4a0b-8a30-89b9d832f354" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.127:8081/readyz\": dial tcp 10.217.0.127:8081: connect: connection refused" Mar 09 15:23:01 crc kubenswrapper[4722]: I0309 15:23:01.471021 4722 patch_prober.go:28] interesting 
pod/observability-operator-59bdc8b94-lc5zn container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.18:8081/healthz\": dial tcp 10.217.0.18:8081: connect: connection refused" start-of-body= Mar 09 15:23:01 crc kubenswrapper[4722]: I0309 15:23:01.471073 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-59bdc8b94-lc5zn" podUID="14655c3d-02fe-4215-b566-0c4008fd34a0" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.18:8081/healthz\": dial tcp 10.217.0.18:8081: connect: connection refused" Mar 09 15:23:01 crc kubenswrapper[4722]: I0309 15:23:01.478040 4722 generic.go:334] "Generic (PLEG): container finished" podID="427a4c04-99cd-4f53-ae98-20c1755d7658" containerID="e614a94c2fbf38f966a561149647e547de6024a28362b5b063c83cb79f8c1d5e" exitCode=0 Mar 09 15:23:01 crc kubenswrapper[4722]: I0309 15:23:01.478122 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-8tkbn" event={"ID":"427a4c04-99cd-4f53-ae98-20c1755d7658","Type":"ContainerDied","Data":"e614a94c2fbf38f966a561149647e547de6024a28362b5b063c83cb79f8c1d5e"} Mar 09 15:23:01 crc kubenswrapper[4722]: I0309 15:23:01.501497 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"652c419b-2a86-4c6f-ac7a-c2d7818ef55f","Type":"ContainerStarted","Data":"44220b1447f2a77962d0bd83f40aba5f9958c06fde13b5c1ad10a7d1ffd47cf0"} Mar 09 15:23:01 crc kubenswrapper[4722]: I0309 15:23:01.501805 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 09 15:23:01 crc kubenswrapper[4722]: I0309 15:23:01.512015 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-2rhld" Mar 09 15:23:01 crc kubenswrapper[4722]: I0309 15:23:01.514821 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rvtqn" event={"ID":"3411289f-3e7c-4e43-b545-5e612822b18e","Type":"ContainerStarted","Data":"d0ccdf84ceefa3531aad4c015ee7332e908fd60a8adc8d0844978562c3e8dbfe"} Mar 09 15:23:01 crc kubenswrapper[4722]: I0309 15:23:01.518099 4722 generic.go:334] "Generic (PLEG): container finished" podID="fd45c8c5-9cad-404b-b14a-9cbc710c8468" containerID="cd7063f2eab111d8560c940917127f1dca125e473c409c23b68096bf29d14642" exitCode=0 Mar 09 15:23:01 crc kubenswrapper[4722]: I0309 15:23:01.518194 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67579949ff-g69dw" event={"ID":"fd45c8c5-9cad-404b-b14a-9cbc710c8468","Type":"ContainerDied","Data":"cd7063f2eab111d8560c940917127f1dca125e473c409c23b68096bf29d14642"} Mar 09 15:23:01 crc kubenswrapper[4722]: I0309 15:23:01.520813 4722 generic.go:334] "Generic (PLEG): container finished" podID="717ffc3a-7a6d-4a7c-837f-d1ed92489b68" containerID="4215330fca94e183b407b09a814fe82779e48c0e1475bfc1884f69234bef8531" exitCode=1 Mar 09 15:23:01 crc kubenswrapper[4722]: I0309 15:23:01.520881 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-n5zc7" event={"ID":"717ffc3a-7a6d-4a7c-837f-d1ed92489b68","Type":"ContainerDied","Data":"4215330fca94e183b407b09a814fe82779e48c0e1475bfc1884f69234bef8531"} Mar 09 15:23:01 crc kubenswrapper[4722]: I0309 15:23:01.521412 4722 scope.go:117] "RemoveContainer" 
containerID="4215330fca94e183b407b09a814fe82779e48c0e1475bfc1884f69234bef8531" Mar 09 15:23:01 crc kubenswrapper[4722]: I0309 15:23:01.523225 4722 generic.go:334] "Generic (PLEG): container finished" podID="5bf14ad6-64cf-48f7-99e6-fabac12849e2" containerID="787222444ce2eed0099002be713c411e9d3b09b2c30c469655e4a2716fd32c8c" exitCode=1 Mar 09 15:23:01 crc kubenswrapper[4722]: I0309 15:23:01.523324 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-66c8b7dfbb-m7fv2" event={"ID":"5bf14ad6-64cf-48f7-99e6-fabac12849e2","Type":"ContainerDied","Data":"787222444ce2eed0099002be713c411e9d3b09b2c30c469655e4a2716fd32c8c"} Mar 09 15:23:01 crc kubenswrapper[4722]: I0309 15:23:01.524223 4722 scope.go:117] "RemoveContainer" containerID="787222444ce2eed0099002be713c411e9d3b09b2c30c469655e4a2716fd32c8c" Mar 09 15:23:01 crc kubenswrapper[4722]: I0309 15:23:01.527435 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-lvfgg" event={"ID":"7a62b98d-e9d4-4cbc-bea8-0da13fcc4467","Type":"ContainerStarted","Data":"39fb4cf3e3a8b123623f103bba3ec576826cf5841b4da1cfc26201e54d1bd698"} Mar 09 15:23:01 crc kubenswrapper[4722]: I0309 15:23:01.527825 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-lvfgg" Mar 09 15:23:01 crc kubenswrapper[4722]: I0309 15:23:01.534073 4722 generic.go:334] "Generic (PLEG): container finished" podID="98c22319-d5f8-4a0b-8a30-89b9d832f354" containerID="eef7b6b9c2478f3c9be2954de876f0c104d23e496f6e5c9281a6f9a6be437ed8" exitCode=1 Mar 09 15:23:01 crc kubenswrapper[4722]: I0309 15:23:01.534274 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-55cd86c56-dm2dr" event={"ID":"98c22319-d5f8-4a0b-8a30-89b9d832f354","Type":"ContainerDied","Data":"eef7b6b9c2478f3c9be2954de876f0c104d23e496f6e5c9281a6f9a6be437ed8"} Mar 09 15:23:01 crc kubenswrapper[4722]: I0309 15:23:01.535167 4722 scope.go:117] "RemoveContainer" containerID="eef7b6b9c2478f3c9be2954de876f0c104d23e496f6e5c9281a6f9a6be437ed8" Mar 09 15:23:01 crc kubenswrapper[4722]: I0309 15:23:01.542315 4722 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="c5c374ffb7928a20af6f154bfca0c494d692f1f6f55eb8de5d4d5db79a355013" exitCode=0 Mar 09 15:23:01 crc kubenswrapper[4722]: I0309 15:23:01.542359 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"c5c374ffb7928a20af6f154bfca0c494d692f1f6f55eb8de5d4d5db79a355013"} Mar 09 15:23:01 crc kubenswrapper[4722]: I0309 15:23:01.547735 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rvtqn" Mar 09 15:23:01 crc kubenswrapper[4722]: I0309 15:23:01.547828 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rvtqn" Mar 09 15:23:01 crc kubenswrapper[4722]: I0309 15:23:01.549520 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c2mmw" event={"ID":"690e5ab0-3719-40ac-aba6-9278480ecb44","Type":"ContainerStarted","Data":"de4a4f509d172a69426566eadc67773d9173af813ac49930ca151f7f2e77a15f"} Mar 09 15:23:01 crc kubenswrapper[4722]: I0309 15:23:01.555567 4722 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fp2th" event={"ID":"996087ed-6480-4650-8632-c991e5d16c99","Type":"ContainerStarted","Data":"9e3c22dd25bd408c90b1a2936b224f36dcf05e44e14cf148ff1b91318386bef0"} Mar 09 15:23:01 crc kubenswrapper[4722]: I0309 15:23:01.555781 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fp2th" Mar 09 15:23:01 crc kubenswrapper[4722]: I0309 15:23:01.556404 4722 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-fp2th container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.44:5443/healthz\": dial tcp 10.217.0.44:5443: connect: connection refused" start-of-body= Mar 09 15:23:01 crc kubenswrapper[4722]: I0309 15:23:01.556740 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fp2th" podUID="996087ed-6480-4650-8632-c991e5d16c99" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.44:5443/healthz\": dial tcp 10.217.0.44:5443: connect: connection refused" Mar 09 15:23:01 crc kubenswrapper[4722]: I0309 15:23:01.562416 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-hnzfw" event={"ID":"ec9f1f5e-26f5-4683-bf41-c85981da9d18","Type":"ContainerStarted","Data":"6d2b2ddb34d6d924ce9af48340733b31b96792d5cda21875e329e25146b70191"} Mar 09 15:23:01 crc kubenswrapper[4722]: I0309 15:23:01.562894 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-hnzfw" Mar 09 15:23:01 crc kubenswrapper[4722]: I0309 15:23:01.567427 4722 generic.go:334] "Generic (PLEG): container finished" podID="ef36bc5a-2962-4c1e-a5fd-98f61d525d5d" containerID="6782d8882e0a59ca3e58a50a0e001600a53007a67f9e563627e6db6131091f0c" exitCode=1 Mar 09 15:23:01 crc kubenswrapper[4722]: I0309 15:23:01.567515 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-pg8qn" event={"ID":"ef36bc5a-2962-4c1e-a5fd-98f61d525d5d","Type":"ContainerDied","Data":"6782d8882e0a59ca3e58a50a0e001600a53007a67f9e563627e6db6131091f0c"} Mar 09 15:23:01 crc kubenswrapper[4722]: I0309 15:23:01.568143 4722 scope.go:117] "RemoveContainer" containerID="6782d8882e0a59ca3e58a50a0e001600a53007a67f9e563627e6db6131091f0c" Mar 09 15:23:01 crc kubenswrapper[4722]: I0309 15:23:01.570256 4722 generic.go:334] "Generic (PLEG): container finished" podID="74cb981b-ce89-479e-8573-fdda25190637" containerID="a154881d450e8d126219a8b7b71b123d2b85326acfe992710eb4d2e649f4e3c6" exitCode=1 Mar 09 15:23:01 crc kubenswrapper[4722]: I0309 15:23:01.570334 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-l8bds" event={"ID":"74cb981b-ce89-479e-8573-fdda25190637","Type":"ContainerDied","Data":"a154881d450e8d126219a8b7b71b123d2b85326acfe992710eb4d2e649f4e3c6"} Mar 09 15:23:01 crc kubenswrapper[4722]: I0309 15:23:01.571302 4722 scope.go:117] "RemoveContainer" containerID="a154881d450e8d126219a8b7b71b123d2b85326acfe992710eb4d2e649f4e3c6" Mar 09 15:23:01 crc kubenswrapper[4722]: I0309 15:23:01.583126 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Mar 09 15:23:01 crc kubenswrapper[4722]: I0309 15:23:01.584321 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 09 15:23:01 crc kubenswrapper[4722]: I0309 15:23:01.586951 4722 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-lnkwt container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" start-of-body= Mar 09 15:23:01 crc kubenswrapper[4722]: I0309 15:23:01.586994 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lnkwt" podUID="f8055a95-6b09-4e32-88b8-82ad36ca5029" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" Mar 09 15:23:01 crc kubenswrapper[4722]: I0309 15:23:01.587616 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-2nlfl" Mar 09 15:23:01 crc kubenswrapper[4722]: I0309 15:23:01.587857 4722 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-dgxnd container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.42:8443/healthz\": dial tcp 10.217.0.42:8443: connect: connection refused" start-of-body= Mar 09 15:23:01 crc kubenswrapper[4722]: I0309 15:23:01.587881 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dgxnd" podUID="c219beb3-4ba5-43bd-b2ec-3855d19c2b57" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.42:8443/healthz\": dial tcp 10.217.0.42:8443: connect: connection refused" Mar 09 15:23:01 crc kubenswrapper[4722]: I0309 15:23:01.639915 4722 patch_prober.go:28] interesting pod/console-operator-58897d9998-chrnr container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Mar 09 15:23:01 crc kubenswrapper[4722]: I0309 15:23:01.639945 4722 patch_prober.go:28] interesting pod/console-operator-58897d9998-chrnr container/console-operator namespace/openshift-console-operator: Liveness probe status=failure output="Get \"https://10.217.0.14:8443/healthz\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Mar 09 15:23:01 crc kubenswrapper[4722]: I0309 15:23:01.639957 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-chrnr" podUID="b919959a-1da1-4c74-9330-5bb8c33f5c26" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.14:8443/readyz\": dial tcp 10.217.0.14:8443: connect: connection refused" Mar 09 15:23:01 crc kubenswrapper[4722]: I0309 15:23:01.639978 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console-operator/console-operator-58897d9998-chrnr" podUID="b919959a-1da1-4c74-9330-5bb8c33f5c26" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.14:8443/healthz\": 
dial tcp 10.217.0.14:8443: connect: connection refused" Mar 09 15:23:01 crc kubenswrapper[4722]: I0309 15:23:01.757743 4722 patch_prober.go:28] interesting pod/openshift-kube-scheduler-crc container/kube-scheduler namespace/openshift-kube-scheduler: Readiness probe status=failure output="Get \"https://192.168.126.11:10259/healthz\": dial tcp 192.168.126.11:10259: connect: connection refused" start-of-body= Mar 09 15:23:01 crc kubenswrapper[4722]: I0309 15:23:01.758165 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.126.11:10259/healthz\": dial tcp 192.168.126.11:10259: connect: connection refused" Mar 09 15:23:01 crc kubenswrapper[4722]: I0309 15:23:01.772937 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="4159e308-3ccf-45d9-a97b-8133542007a8" containerName="galera" probeResult="failure" output="command timed out" Mar 09 15:23:01 crc kubenswrapper[4722]: I0309 15:23:01.872444 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-ingester-0" Mar 09 15:23:02 crc kubenswrapper[4722]: I0309 15:23:02.327669 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="a79aaedb-92ba-42fc-8cc7-9ecb007d2ac7" containerName="galera" containerID="cri-o://ed50884a2474119d9434961efa1de2e0e6821dcf7bc3597580905135c8f6ae50" gracePeriod=9 Mar 09 15:23:02 crc kubenswrapper[4722]: I0309 15:23:02.336925 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="4159e308-3ccf-45d9-a97b-8133542007a8" containerName="galera" containerID="cri-o://80ff28c7cda34665de0bebe76a8a1102a025d02953717f80f765be65e05ede59" gracePeriod=20 Mar 09 15:23:02 crc kubenswrapper[4722]: I0309 15:23:02.602986 4722 generic.go:334] "Generic (PLEG): container finished" podID="f21c35ef-c8ea-4331-a747-44a62c6f2e74" containerID="daf50fcb70e742726fdc3b24774f11f98dea479bc40bcb4838c7f9304343cbb9" exitCode=1 Mar 09 15:23:02 crc kubenswrapper[4722]: I0309 15:23:02.603061 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-ct7x8" event={"ID":"f21c35ef-c8ea-4331-a747-44a62c6f2e74","Type":"ContainerDied","Data":"daf50fcb70e742726fdc3b24774f11f98dea479bc40bcb4838c7f9304343cbb9"} Mar 09 15:23:02 crc kubenswrapper[4722]: I0309 15:23:02.603923 4722 scope.go:117] "RemoveContainer" containerID="daf50fcb70e742726fdc3b24774f11f98dea479bc40bcb4838c7f9304343cbb9" Mar 09 15:23:02 crc kubenswrapper[4722]: I0309 15:23:02.611765 4722 generic.go:334] "Generic (PLEG): container finished" podID="228d39d8-b0bc-4491-be90-e473c090f412" containerID="778991893ca7b914cdde2f5428e80ab6cfce3570917a24a4ccaaaff3928b14db" exitCode=0 Mar 09 15:23:02 crc kubenswrapper[4722]: I0309 15:23:02.611837 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mrtb2" event={"ID":"228d39d8-b0bc-4491-be90-e473c090f412","Type":"ContainerDied","Data":"778991893ca7b914cdde2f5428e80ab6cfce3570917a24a4ccaaaff3928b14db"} Mar 09 15:23:02 crc kubenswrapper[4722]: I0309 15:23:02.629132 4722 generic.go:334] "Generic (PLEG): container finished" podID="22043c71-5292-422c-99e5-c88ea1aef638" 
containerID="1233abd42f31e32e4b780ef1591655c23c9a6b3fe437a03d3f6f9931ed23c220" exitCode=1 Mar 09 15:23:02 crc kubenswrapper[4722]: I0309 15:23:02.629197 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-zjf7b" event={"ID":"22043c71-5292-422c-99e5-c88ea1aef638","Type":"ContainerDied","Data":"1233abd42f31e32e4b780ef1591655c23c9a6b3fe437a03d3f6f9931ed23c220"} Mar 09 15:23:02 crc kubenswrapper[4722]: I0309 15:23:02.630678 4722 scope.go:117] "RemoveContainer" containerID="1233abd42f31e32e4b780ef1591655c23c9a6b3fe437a03d3f6f9931ed23c220" Mar 09 15:23:02 crc kubenswrapper[4722]: I0309 15:23:02.637092 4722 generic.go:334] "Generic (PLEG): container finished" podID="4de6db14-6f3e-4c4e-a61d-39c6648209dd" containerID="708a193ff4194bb83d14fc922c7c15ba4f75bdbfb539cc02781e2e41893036b5" exitCode=1 Mar 09 15:23:02 crc kubenswrapper[4722]: I0309 15:23:02.637169 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-2hxzr" event={"ID":"4de6db14-6f3e-4c4e-a61d-39c6648209dd","Type":"ContainerDied","Data":"708a193ff4194bb83d14fc922c7c15ba4f75bdbfb539cc02781e2e41893036b5"} Mar 09 15:23:02 crc kubenswrapper[4722]: I0309 15:23:02.638220 4722 scope.go:117] "RemoveContainer" containerID="708a193ff4194bb83d14fc922c7c15ba4f75bdbfb539cc02781e2e41893036b5" Mar 09 15:23:02 crc kubenswrapper[4722]: I0309 15:23:02.642617 4722 generic.go:334] "Generic (PLEG): container finished" podID="df8b52ff-f61e-4aca-a408-240590699ae6" containerID="9ee98f0ba96b989ec81202064c9c5c6a9fc391943e93eafa7da4d27052db2cd9" exitCode=1 Mar 09 15:23:02 crc kubenswrapper[4722]: I0309 15:23:02.642679 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-hgkzs" event={"ID":"df8b52ff-f61e-4aca-a408-240590699ae6","Type":"ContainerDied","Data":"9ee98f0ba96b989ec81202064c9c5c6a9fc391943e93eafa7da4d27052db2cd9"} Mar 09 15:23:02 crc kubenswrapper[4722]: I0309 15:23:02.643518 4722 scope.go:117] "RemoveContainer" containerID="9ee98f0ba96b989ec81202064c9c5c6a9fc391943e93eafa7da4d27052db2cd9" Mar 09 15:23:02 crc kubenswrapper[4722]: I0309 15:23:02.655713 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-lc5zn" event={"ID":"14655c3d-02fe-4215-b566-0c4008fd34a0","Type":"ContainerStarted","Data":"3644deddf340c1402316d00e53983a5dd585051e3349ae8f804045f1ed54cc76"} Mar 09 15:23:02 crc kubenswrapper[4722]: I0309 15:23:02.656919 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-lc5zn" Mar 09 15:23:02 crc kubenswrapper[4722]: I0309 15:23:02.657012 4722 patch_prober.go:28] interesting pod/observability-operator-59bdc8b94-lc5zn container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.18:8081/healthz\": dial tcp 10.217.0.18:8081: connect: connection refused" start-of-body= Mar 09 15:23:02 crc kubenswrapper[4722]: I0309 15:23:02.657067 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-59bdc8b94-lc5zn" podUID="14655c3d-02fe-4215-b566-0c4008fd34a0" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.18:8081/healthz\": dial tcp 10.217.0.18:8081: connect: connection refused" Mar 09 15:23:02 crc kubenswrapper[4722]: I0309 15:23:02.658628 4722 generic.go:334] "Generic (PLEG): container finished" 
podID="f9ff9b26-9d5a-4194-bab5-1b9fb5dee947" containerID="ed3fbb43f36319a7eb8ab1a8588e8778eb342cc179a9ad24d7451bed03535581" exitCode=1 Mar 09 15:23:02 crc kubenswrapper[4722]: I0309 15:23:02.658701 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xswl4" event={"ID":"f9ff9b26-9d5a-4194-bab5-1b9fb5dee947","Type":"ContainerDied","Data":"ed3fbb43f36319a7eb8ab1a8588e8778eb342cc179a9ad24d7451bed03535581"} Mar 09 15:23:02 crc kubenswrapper[4722]: I0309 15:23:02.660857 4722 generic.go:334] "Generic (PLEG): container finished" podID="8ac36d47-4501-4033-aee7-ce9ed8ed7002" containerID="6adf907913ba33a54b43059100ba51bbb9f8f927134943ccbc706b2478b42924" exitCode=0 Mar 09 15:23:02 crc kubenswrapper[4722]: I0309 15:23:02.660925 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-7qbrd" event={"ID":"8ac36d47-4501-4033-aee7-ce9ed8ed7002","Type":"ContainerDied","Data":"6adf907913ba33a54b43059100ba51bbb9f8f927134943ccbc706b2478b42924"} Mar 09 15:23:02 crc kubenswrapper[4722]: I0309 15:23:02.662074 4722 scope.go:117] "RemoveContainer" containerID="ed3fbb43f36319a7eb8ab1a8588e8778eb342cc179a9ad24d7451bed03535581" Mar 09 15:23:02 crc kubenswrapper[4722]: I0309 15:23:02.667807 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4a22f8c-ed38-47cf-8238-baf804f573a1","Type":"ContainerStarted","Data":"4b9a57a7830d894ace19d3f55c8a7106923a4ca6e2dd258430d194d1c60e4052"} Mar 09 15:23:02 crc kubenswrapper[4722]: I0309 15:23:02.670004 4722 generic.go:334] "Generic (PLEG): container finished" podID="a9df5689-5d83-4206-be2b-cf6877d70e23" containerID="7b90ab1b4d8ef2c443f53b7a503c742ed75e29f430c3ef869ae535ea8c9a08c2" exitCode=1 Mar 09 15:23:02 crc kubenswrapper[4722]: I0309 15:23:02.670708 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-56qz9" event={"ID":"a9df5689-5d83-4206-be2b-cf6877d70e23","Type":"ContainerDied","Data":"7b90ab1b4d8ef2c443f53b7a503c742ed75e29f430c3ef869ae535ea8c9a08c2"} Mar 09 15:23:02 crc kubenswrapper[4722]: I0309 15:23:02.671056 4722 scope.go:117] "RemoveContainer" containerID="7b90ab1b4d8ef2c443f53b7a503c742ed75e29f430c3ef869ae535ea8c9a08c2" Mar 09 15:23:02 crc kubenswrapper[4722]: I0309 15:23:02.674525 4722 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-fp2th container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.44:5443/healthz\": dial tcp 10.217.0.44:5443: connect: connection refused" start-of-body= Mar 09 15:23:02 crc kubenswrapper[4722]: I0309 15:23:02.674579 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fp2th" podUID="996087ed-6480-4650-8632-c991e5d16c99" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.44:5443/healthz\": dial tcp 10.217.0.44:5443: connect: connection refused" Mar 09 15:23:02 crc kubenswrapper[4722]: I0309 15:23:02.724663 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-5b979cff56-vwbnz" Mar 09 15:23:02 crc kubenswrapper[4722]: E0309 15:23:02.822364 4722 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
6adf907913ba33a54b43059100ba51bbb9f8f927134943ccbc706b2478b42924 is running failed: container process not found" containerID="6adf907913ba33a54b43059100ba51bbb9f8f927134943ccbc706b2478b42924" cmd=["grpc_health_probe","-addr=:50051"]
Mar 09 15:23:02 crc kubenswrapper[4722]: E0309 15:23:02.822914 4722 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6adf907913ba33a54b43059100ba51bbb9f8f927134943ccbc706b2478b42924 is running failed: container process not found" containerID="6adf907913ba33a54b43059100ba51bbb9f8f927134943ccbc706b2478b42924" cmd=["grpc_health_probe","-addr=:50051"]
Mar 09 15:23:02 crc kubenswrapper[4722]: E0309 15:23:02.823143 4722 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6adf907913ba33a54b43059100ba51bbb9f8f927134943ccbc706b2478b42924 is running failed: container process not found" containerID="6adf907913ba33a54b43059100ba51bbb9f8f927134943ccbc706b2478b42924" cmd=["grpc_health_probe","-addr=:50051"]
Mar 09 15:23:02 crc kubenswrapper[4722]: E0309 15:23:02.823166 4722 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 6adf907913ba33a54b43059100ba51bbb9f8f927134943ccbc706b2478b42924 is running failed: container process not found" probeType="Readiness" pod="openstack-operators/openstack-operator-index-7qbrd" podUID="8ac36d47-4501-4033-aee7-ce9ed8ed7002" containerName="registry-server"
Mar 09 15:23:02 crc kubenswrapper[4722]: I0309 15:23:02.890737 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-rvtqn" podUID="3411289f-3e7c-4e43-b545-5e612822b18e" containerName="registry-server" probeResult="failure" output=<
Mar 09 15:23:02 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s
Mar 09 15:23:02 crc kubenswrapper[4722]: >
Mar 09 15:23:03 crc kubenswrapper[4722]: I0309 15:23:03.182928 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-scheduler-0" podUID="4c227d2b-e035-426b-b1e1-5be3a4e06090" containerName="cinder-scheduler" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 09 15:23:03 crc kubenswrapper[4722]: I0309 15:23:03.251878 4722 patch_prober.go:28] interesting pod/router-default-5444994796-dp8wn container/router namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]backend-http ok
Mar 09 15:23:03 crc kubenswrapper[4722]: [+]has-synced ok
Mar 09 15:23:03 crc kubenswrapper[4722]: [-]process-running failed: reason withheld
Mar 09 15:23:03 crc kubenswrapper[4722]: healthz check failed
Mar 09 15:23:03 crc kubenswrapper[4722]: I0309 15:23:03.251935 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ingress/router-default-5444994796-dp8wn" podUID="317444ee-0620-47d2-869e-77578a367a87" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 09 15:23:03 crc kubenswrapper[4722]: I0309 15:23:03.319916 4722 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-dgxnd container/catalog-operator namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.42:8443/healthz\": dial tcp 10.217.0.42:8443: connect: connection refused" start-of-body=
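
The three identical "ExecSync cmd from runtime service failed" errors above are retries of a readiness exec probe (grpc_health_probe -addr=:50051) against a registry-server whose process had already exited, and the Startup failure that follows is the same gRPC port timing out on a freshly started catalog pod. Roughly what that exec probe does, as a Go sketch under stated assumptions (a plaintext gRPC health endpoint on localhost:50051 and the 1s budget seen in the log; the real grpc_health_probe binary has more options):

    package main

    import (
    	"context"
    	"fmt"
    	"os"
    	"time"

    	"google.golang.org/grpc"
    	"google.golang.org/grpc/credentials/insecure"
    	healthpb "google.golang.org/grpc/health/grpc_health_v1"
    )

    func main() {
    	// Match the 1s probe budget behind: timeout: failed to connect service ":50051" within 1s
    	ctx, cancel := context.WithTimeout(context.Background(), time.Second)
    	defer cancel()

    	conn, err := grpc.DialContext(ctx, "localhost:50051",
    		grpc.WithTransportCredentials(insecure.NewCredentials()), grpc.WithBlock())
    	if err != nil {
    		fmt.Fprintln(os.Stderr, `timeout: failed to connect service ":50051" within 1s`)
    		os.Exit(1)
    	}
    	defer conn.Close()

    	// Ask the standard gRPC health service whether the server is SERVING.
    	resp, err := healthpb.NewHealthClient(conn).Check(ctx, &healthpb.HealthCheckRequest{})
    	if err != nil || resp.GetStatus() != healthpb.HealthCheckResponse_SERVING {
    		os.Exit(1) // a non-zero exit is what marks the exec probe unhealthy
    	}
    }

Mar 09 15:23:03 crc kubenswrapper[4722]: I0309 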
15:23:03.319979 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dgxnd" podUID="c219beb3-4ba5-43bd-b2ec-3855d19c2b57" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.42:8443/healthz\": dial tcp 10.217.0.42:8443: connect: connection refused" Mar 09 15:23:03 crc kubenswrapper[4722]: I0309 15:23:03.320796 4722 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-dgxnd container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.42:8443/healthz\": dial tcp 10.217.0.42:8443: connect: connection refused" start-of-body= Mar 09 15:23:03 crc kubenswrapper[4722]: I0309 15:23:03.320827 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dgxnd" podUID="c219beb3-4ba5-43bd-b2ec-3855d19c2b57" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.42:8443/healthz\": dial tcp 10.217.0.42:8443: connect: connection refused" Mar 09 15:23:03 crc kubenswrapper[4722]: I0309 15:23:03.562869 4722 patch_prober.go:28] interesting pod/package-server-manager-789f6589d5-mrtb2 container/package-server-manager namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"http://10.217.0.23:8080/healthz\": dial tcp 10.217.0.23:8080: connect: connection refused" start-of-body= Mar 09 15:23:03 crc kubenswrapper[4722]: I0309 15:23:03.562926 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mrtb2" podUID="228d39d8-b0bc-4491-be90-e473c090f412" containerName="package-server-manager" probeResult="failure" output="Get \"http://10.217.0.23:8080/healthz\": dial tcp 10.217.0.23:8080: connect: connection refused" Mar 09 15:23:03 crc kubenswrapper[4722]: I0309 15:23:03.565413 4722 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-lnkwt container/olm-operator namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" start-of-body= Mar 09 15:23:03 crc kubenswrapper[4722]: I0309 15:23:03.565496 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lnkwt" podUID="f8055a95-6b09-4e32-88b8-82ad36ca5029" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" Mar 09 15:23:03 crc kubenswrapper[4722]: I0309 15:23:03.565714 4722 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-lnkwt container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" start-of-body= Mar 09 15:23:03 crc kubenswrapper[4722]: I0309 15:23:03.565749 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lnkwt" podUID="f8055a95-6b09-4e32-88b8-82ad36ca5029" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.19:8443/healthz\": dial tcp 10.217.0.19:8443: connect: connection refused" Mar 09 15:23:03 crc kubenswrapper[4722]: I0309 15:23:03.659109 4722 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-fp2th 
container/packageserver namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.44:5443/healthz\": dial tcp 10.217.0.44:5443: connect: connection refused" start-of-body= Mar 09 15:23:03 crc kubenswrapper[4722]: I0309 15:23:03.659406 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fp2th" podUID="996087ed-6480-4650-8632-c991e5d16c99" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.44:5443/healthz\": dial tcp 10.217.0.44:5443: connect: connection refused" Mar 09 15:23:03 crc kubenswrapper[4722]: I0309 15:23:03.659306 4722 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-fp2th container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.44:5443/healthz\": dial tcp 10.217.0.44:5443: connect: connection refused" start-of-body= Mar 09 15:23:03 crc kubenswrapper[4722]: I0309 15:23:03.659451 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fp2th" podUID="996087ed-6480-4650-8632-c991e5d16c99" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.44:5443/healthz\": dial tcp 10.217.0.44:5443: connect: connection refused" Mar 09 15:23:03 crc kubenswrapper[4722]: I0309 15:23:03.686445 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 09 15:23:03 crc kubenswrapper[4722]: I0309 15:23:03.704764 4722 generic.go:334] "Generic (PLEG): container finished" podID="7139e62b-5e90-4545-a264-aa8138821a55" containerID="4e30d1f5da18c63b781ab16a8bddd74f6fef64c6f69f6d7e9a89ed24675fbedc" exitCode=0 Mar 09 15:23:03 crc kubenswrapper[4722]: I0309 15:23:03.704878 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8445c785c8-hdmgl" event={"ID":"7139e62b-5e90-4545-a264-aa8138821a55","Type":"ContainerDied","Data":"4e30d1f5da18c63b781ab16a8bddd74f6fef64c6f69f6d7e9a89ed24675fbedc"} Mar 09 15:23:03 crc kubenswrapper[4722]: I0309 15:23:03.731195 4722 generic.go:334] "Generic (PLEG): container finished" podID="df07ee71-98db-42a1-9df4-6f8707504f08" containerID="5c6af4f5b1390e0fa350198e4b30efc579a7139fde74125479e6797423853804" exitCode=0 Mar 09 15:23:03 crc kubenswrapper[4722]: I0309 15:23:03.731245 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"df07ee71-98db-42a1-9df4-6f8707504f08","Type":"ContainerDied","Data":"5c6af4f5b1390e0fa350198e4b30efc579a7139fde74125479e6797423853804"} Mar 09 15:23:03 crc kubenswrapper[4722]: I0309 15:23:03.736558 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xswl4" event={"ID":"f9ff9b26-9d5a-4194-bab5-1b9fb5dee947","Type":"ContainerStarted","Data":"308f5babf32260efa40e7268213ec97367623d6cd848df270c1daef10e0c95c3"} Mar 09 15:23:03 crc kubenswrapper[4722]: I0309 15:23:03.741400 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mrtb2" event={"ID":"228d39d8-b0bc-4491-be90-e473c090f412","Type":"ContainerStarted","Data":"a94717d3e420b3096b3442ca2b4ccc8eb1105539244931a2767eb0cf965f27de"} Mar 09 15:23:03 crc kubenswrapper[4722]: I0309 15:23:03.742462 4722 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mrtb2" Mar 09 15:23:03 crc kubenswrapper[4722]: I0309 15:23:03.745639 4722 generic.go:334] "Generic (PLEG): container finished" podID="9f2a2160-888c-4101-8b1c-63498753a2b7" containerID="85acb30f0b7edaa70a94e32978a937bc5732b62f2e5d01e71c5c4e5bc6878dfb" exitCode=0 Mar 09 15:23:03 crc kubenswrapper[4722]: I0309 15:23:03.745715 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-jmwpv" event={"ID":"9f2a2160-888c-4101-8b1c-63498753a2b7","Type":"ContainerDied","Data":"85acb30f0b7edaa70a94e32978a937bc5732b62f2e5d01e71c5c4e5bc6878dfb"} Mar 09 15:23:03 crc kubenswrapper[4722]: I0309 15:23:03.745739 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-jmwpv" event={"ID":"9f2a2160-888c-4101-8b1c-63498753a2b7","Type":"ContainerStarted","Data":"5c111f913008d8881ee4938c3516ef6dcd35631ad7f0d95b37fe950e42a44223"} Mar 09 15:23:03 crc kubenswrapper[4722]: I0309 15:23:03.749193 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v57f2" event={"ID":"4e27c5a4-8cba-4119-8006-f9841d6121dc","Type":"ContainerStarted","Data":"64801968ad1fa38f05d01df1506ac098993ded0a01af660e30f5070685f78f41"} Mar 09 15:23:03 crc kubenswrapper[4722]: I0309 15:23:03.756733 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-pg8qn" event={"ID":"ef36bc5a-2962-4c1e-a5fd-98f61d525d5d","Type":"ContainerStarted","Data":"cff1ce349fdcc93351a701a4f631c6b8996282f25dfb06d3f3d1471bada73b8d"} Mar 09 15:23:03 crc kubenswrapper[4722]: I0309 15:23:03.757028 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-pg8qn" Mar 09 15:23:03 crc kubenswrapper[4722]: I0309 15:23:03.763737 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67579949ff-g69dw" event={"ID":"fd45c8c5-9cad-404b-b14a-9cbc710c8468","Type":"ContainerStarted","Data":"deb6d9f4c7d3845369b76e9bff90275eb958ff3306f3457380be04bc9f49f05e"} Mar 09 15:23:03 crc kubenswrapper[4722]: I0309 15:23:03.764024 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-67579949ff-g69dw" Mar 09 15:23:03 crc kubenswrapper[4722]: I0309 15:23:03.764420 4722 patch_prober.go:28] interesting pod/route-controller-manager-67579949ff-g69dw container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.66:8443/healthz\": dial tcp 10.217.0.66:8443: connect: connection refused" start-of-body= Mar 09 15:23:03 crc kubenswrapper[4722]: I0309 15:23:03.764545 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-67579949ff-g69dw" podUID="fd45c8c5-9cad-404b-b14a-9cbc710c8468" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.66:8443/healthz\": dial tcp 10.217.0.66:8443: connect: connection refused" Mar 09 15:23:03 crc kubenswrapper[4722]: I0309 15:23:03.764586 4722 patch_prober.go:28] interesting pod/observability-operator-59bdc8b94-lc5zn container/operator namespace/openshift-operators: Readiness probe status=failure output="Get 
\"http://10.217.0.18:8081/healthz\": dial tcp 10.217.0.18:8081: connect: connection refused" start-of-body= Mar 09 15:23:03 crc kubenswrapper[4722]: I0309 15:23:03.764670 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-59bdc8b94-lc5zn" podUID="14655c3d-02fe-4215-b566-0c4008fd34a0" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.18:8081/healthz\": dial tcp 10.217.0.18:8081: connect: connection refused" Mar 09 15:23:04 crc kubenswrapper[4722]: I0309 15:23:04.393283 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-c2mmw" Mar 09 15:23:04 crc kubenswrapper[4722]: I0309 15:23:04.393896 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-c2mmw" Mar 09 15:23:04 crc kubenswrapper[4722]: I0309 15:23:04.584782 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-v57f2" Mar 09 15:23:04 crc kubenswrapper[4722]: I0309 15:23:04.584832 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-v57f2" Mar 09 15:23:04 crc kubenswrapper[4722]: I0309 15:23:04.646138 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-vppnv" Mar 09 15:23:04 crc kubenswrapper[4722]: I0309 15:23:04.672980 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-rsd9l" Mar 09 15:23:04 crc kubenswrapper[4722]: I0309 15:23:04.699171 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-bmpgd" Mar 09 15:23:04 crc kubenswrapper[4722]: I0309 15:23:04.776040 4722 generic.go:334] "Generic (PLEG): container finished" podID="4159e308-3ccf-45d9-a97b-8133542007a8" containerID="80ff28c7cda34665de0bebe76a8a1102a025d02953717f80f765be65e05ede59" exitCode=0 Mar 09 15:23:04 crc kubenswrapper[4722]: I0309 15:23:04.776110 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"4159e308-3ccf-45d9-a97b-8133542007a8","Type":"ContainerDied","Data":"80ff28c7cda34665de0bebe76a8a1102a025d02953717f80f765be65e05ede59"} Mar 09 15:23:04 crc kubenswrapper[4722]: I0309 15:23:04.778584 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-66c8b7dfbb-m7fv2" event={"ID":"5bf14ad6-64cf-48f7-99e6-fabac12849e2","Type":"ContainerStarted","Data":"3acd4030fce98a2fe15da2697b4d1bba26663ceae5b17184167459abc8d17b1f"} Mar 09 15:23:04 crc kubenswrapper[4722]: I0309 15:23:04.778782 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-66c8b7dfbb-m7fv2" Mar 09 15:23:04 crc kubenswrapper[4722]: I0309 15:23:04.781806 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-7qbrd" event={"ID":"8ac36d47-4501-4033-aee7-ce9ed8ed7002","Type":"ContainerStarted","Data":"59a0830201dfd07caf7b0a4b730546fd303383132629d28698d454b1d894c228"} Mar 09 15:23:04 crc kubenswrapper[4722]: I0309 15:23:04.784346 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-ct7x8" 
Mar 09 15:23:04 crc kubenswrapper[4722]: I0309 15:23:04.784458 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-55cd86c56-dm2dr" event={"ID":"98c22319-d5f8-4a0b-8a30-89b9d832f354","Type":"ContainerStarted","Data":"dbf4c56c8627ce6087197ab755d118d363eee77bdcf16820341af7fccc1a8467"} Mar 09 15:23:04 crc kubenswrapper[4722]: I0309 15:23:04.784724 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-55cd86c56-dm2dr" Mar 09 15:23:04 crc kubenswrapper[4722]: I0309 15:23:04.787315 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-56qz9" event={"ID":"a9df5689-5d83-4206-be2b-cf6877d70e23","Type":"ContainerStarted","Data":"35e2b8e7dbe921f26219eabb0e47753b6f467ae0142abb56dd93508148bf2aa8"} Mar 09 15:23:04 crc kubenswrapper[4722]: I0309 15:23:04.787529 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-56qz9" Mar 09 15:23:04 crc kubenswrapper[4722]: I0309 15:23:04.789862 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-n5zc7" event={"ID":"717ffc3a-7a6d-4a7c-837f-d1ed92489b68","Type":"ContainerStarted","Data":"04aa2e9f6ad56a2c58badbd80e68c5e6ff17dfe778e344446cc105691fbe62ce"} Mar 09 15:23:04 crc kubenswrapper[4722]: I0309 15:23:04.790091 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-n5zc7" Mar 09 15:23:04 crc kubenswrapper[4722]: I0309 15:23:04.793778 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8445c785c8-hdmgl" event={"ID":"7139e62b-5e90-4545-a264-aa8138821a55","Type":"ContainerStarted","Data":"46478759e5a6a9fdf355ed688a0201008053fb58afb03316123b98bda6f59137"} Mar 09 15:23:04 crc kubenswrapper[4722]: I0309 15:23:04.793890 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-8445c785c8-hdmgl" Mar 09 15:23:04 crc kubenswrapper[4722]: I0309 15:23:04.794305 4722 patch_prober.go:28] interesting pod/controller-manager-8445c785c8-hdmgl container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.65:8443/healthz\": dial tcp 10.217.0.65:8443: connect: connection refused" start-of-body= Mar 09 15:23:04 crc kubenswrapper[4722]: I0309 15:23:04.794378 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-8445c785c8-hdmgl" podUID="7139e62b-5e90-4545-a264-aa8138821a55" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.65:8443/healthz\": dial tcp 10.217.0.65:8443: connect: connection refused" Mar 09 15:23:04 crc kubenswrapper[4722]: I0309 15:23:04.796862 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-hgkzs" event={"ID":"df8b52ff-f61e-4aca-a408-240590699ae6","Type":"ContainerStarted","Data":"33b1a8eabbe2679e0060f9a72342e45dddc55a033c5247bdfb37e7764b61dfec"} Mar 09 15:23:04 crc kubenswrapper[4722]: I0309 15:23:04.797065 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-hgkzs" Mar 09 15:23:04 crc 
kubenswrapper[4722]: I0309 15:23:04.799149 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-l8bds" event={"ID":"74cb981b-ce89-479e-8573-fdda25190637","Type":"ContainerStarted","Data":"57ab355b478e3819f551c1755fd2861dccb2a4a3d7f4489333e062d094e82203"}
Mar 09 15:23:04 crc kubenswrapper[4722]: I0309 15:23:04.799384 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-l8bds"
Mar 09 15:23:04 crc kubenswrapper[4722]: I0309 15:23:04.802250 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log"
Mar 09 15:23:04 crc kubenswrapper[4722]: I0309 15:23:04.803195 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log"
Mar 09 15:23:04 crc kubenswrapper[4722]: I0309 15:23:04.804183 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e7a2b54a79964706e6d4565f33f76ddff032629e4252efcd72788c5648fa2238"}
Mar 09 15:23:04 crc kubenswrapper[4722]: I0309 15:23:04.807509 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-zjf7b" event={"ID":"22043c71-5292-422c-99e5-c88ea1aef638","Type":"ContainerStarted","Data":"34632c657ef04d1fbd15d3656d56370965b077a76f3874cf41fbb0007ed5000c"}
Mar 09 15:23:04 crc kubenswrapper[4722]: I0309 15:23:04.807737 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-zjf7b"
Mar 09 15:23:04 crc kubenswrapper[4722]: I0309 15:23:04.810831 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"cea3822fd3bf457f7abaa63d2f4ebe4b7414cd38672eb4221653bd6b4ebde9af"}
Mar 09 15:23:04 crc kubenswrapper[4722]: I0309 15:23:04.810987 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 09 15:23:04 crc kubenswrapper[4722]: I0309 15:23:04.813450 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-ct7x8" event={"ID":"f21c35ef-c8ea-4331-a747-44a62c6f2e74","Type":"ContainerStarted","Data":"8c20ca15f5a4153d29df1fbc68b81995a5d711c5371963f6e8f7643638b4ba84"}
Mar 09 15:23:04 crc kubenswrapper[4722]: I0309 15:23:04.813861 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-ct7x8"
Mar 09 15:23:04 crc kubenswrapper[4722]: I0309 15:23:04.851021 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"df07ee71-98db-42a1-9df4-6f8707504f08","Type":"ContainerStarted","Data":"07fef073afecf7ef127a0f348359960c67de4342970e4600c9e18b62db7c6a5f"}
Mar 09 15:23:04 crc kubenswrapper[4722]: I0309 15:23:04.855094 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-8tkbn" event={"ID":"427a4c04-99cd-4f53-ae98-20c1755d7658","Type":"ContainerStarted","Data":"9c22f82503c721aebe141fbffa06f3fd844d3a645d31fc9449828e2f017eb1e3"}
Mar 09 15:23:04 crc kubenswrapper[4722]: I0309 15:23:04.862263 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-2hxzr" event={"ID":"4de6db14-6f3e-4c4e-a61d-39c6648209dd","Type":"ContainerStarted","Data":"f2436986b9487ead49a51b8fb899b94b32377373d0c00a081ce27495c8bff681"}
Mar 09 15:23:04 crc kubenswrapper[4722]: I0309 15:23:04.862838 4722 patch_prober.go:28] interesting pod/route-controller-manager-67579949ff-g69dw container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.66:8443/healthz\": dial tcp 10.217.0.66:8443: connect: connection refused" start-of-body=
Mar 09 15:23:04 crc kubenswrapper[4722]: I0309 15:23:04.862867 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-2hxzr"
Mar 09 15:23:04 crc kubenswrapper[4722]: I0309 15:23:04.862905 4722 patch_prober.go:28] interesting pod/observability-operator-59bdc8b94-lc5zn container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.18:8081/healthz\": dial tcp 10.217.0.18:8081: connect: connection refused" start-of-body=
Mar 09 15:23:04 crc kubenswrapper[4722]: I0309 15:23:04.862904 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-67579949ff-g69dw" podUID="fd45c8c5-9cad-404b-b14a-9cbc710c8468" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.66:8443/healthz\": dial tcp 10.217.0.66:8443: connect: connection refused"
Mar 09 15:23:04 crc kubenswrapper[4722]: I0309 15:23:04.862973 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-59bdc8b94-lc5zn" podUID="14655c3d-02fe-4215-b566-0c4008fd34a0" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.18:8081/healthz\": dial tcp 10.217.0.18:8081: connect: connection refused"
Mar 09 15:23:05 crc kubenswrapper[4722]: I0309 15:23:05.066809 4722 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-8xdjl container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.75:8443/healthz\": dial tcp 10.217.0.75:8443: connect: connection refused" start-of-body=
Mar 09 15:23:05 crc kubenswrapper[4722]: I0309 15:23:05.066875 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-8xdjl" podUID="8b35adf7-a305-4f94-a5c9-02fbc3fca46f" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.75:8443/healthz\": dial tcp 10.217.0.75:8443: connect: connection refused"
Mar 09 15:23:05 crc kubenswrapper[4722]: I0309 15:23:05.066958 4722 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-8xdjl container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.75:8443/healthz\": dial tcp 10.217.0.75:8443: connect: connection refused" start-of-body=
Mar 09 15:23:05 crc kubenswrapper[4722]: I0309 15:23:05.066979 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-8xdjl" podUID="8b35adf7-a305-4f94-a5c9-02fbc3fca46f" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.75:8443/healthz\": dial tcp 10.217.0.75:8443: connect: connection refused"
Mar 09 15:23:05 crc kubenswrapper[4722]: I0309 15:23:05.208864 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-2hxzr"
Mar 09 15:23:05 crc kubenswrapper[4722]: I0309 15:23:05.299772 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-5db757fd5b-t57qc" podUID="ec3da47d-c782-4189-b195-d6b203bd7f7a" containerName="oauth-openshift" containerID="cri-o://1bef1f950532e1f9264d862747456e60811d3e386e3459a18b18cc7dad6b8a21" gracePeriod=14
Mar 09 15:23:05 crc kubenswrapper[4722]: I0309 15:23:05.362000 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-scheduler-0" podUID="4c227d2b-e035-426b-b1e1-5be3a4e06090" containerName="cinder-scheduler" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 09 15:23:05 crc kubenswrapper[4722]: I0309 15:23:05.449158 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-c2mmw" podUID="690e5ab0-3719-40ac-aba6-9278480ecb44" containerName="registry-server" probeResult="failure" output=<
Mar 09 15:23:05 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s
Mar 09 15:23:05 crc kubenswrapper[4722]: >
Mar 09 15:23:05 crc kubenswrapper[4722]: I0309 15:23:05.628809 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-786f45cff4-jdp69"
Mar 09 15:23:05 crc kubenswrapper[4722]: I0309 15:23:05.649442 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-v57f2" podUID="4e27c5a4-8cba-4119-8006-f9841d6121dc" containerName="registry-server" probeResult="failure" output=<
Mar 09 15:23:05 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s
Mar 09 15:23:05 crc kubenswrapper[4722]: >
Mar 09 15:23:05 crc kubenswrapper[4722]: I0309 15:23:05.658357 4722 patch_prober.go:28] interesting pod/loki-operator-controller-manager-7d6d6698bd-4r85k container/manager namespace/openshift-operators-redhat: Readiness probe status=failure output="Get \"http://10.217.0.50:8081/readyz\": dial tcp 10.217.0.50:8081: connect: connection refused" start-of-body=
Mar 09 15:23:05 crc kubenswrapper[4722]: I0309 15:23:05.658426 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators-redhat/loki-operator-controller-manager-7d6d6698bd-4r85k" podUID="497a07fc-9649-4620-9432-855aa3fdc327" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.50:8081/readyz\": dial tcp 10.217.0.50:8081: connect: connection refused"
Mar 09 15:23:05 crc kubenswrapper[4722]: I0309 15:23:05.668241 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-wrzrq"
Mar 09 15:23:05 crc kubenswrapper[4722]: I0309 15:23:05.876457 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5db757fd5b-t57qc" event={"ID":"ec3da47d-c782-4189-b195-d6b203bd7f7a","Type":"ContainerDied","Data":"1bef1f950532e1f9264d862747456e60811d3e386e3459a18b18cc7dad6b8a21"}
Mar 09 15:23:05 crc kubenswrapper[4722]: I0309 15:23:05.876492 4722 generic.go:334] "Generic (PLEG): container finished" podID="ec3da47d-c782-4189-b195-d6b203bd7f7a" containerID="1bef1f950532e1f9264d862747456e60811d3e386e3459a18b18cc7dad6b8a21" exitCode=0
Mar 09 15:23:05 crc kubenswrapper[4722]: I0309 15:23:05.890772 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"4159e308-3ccf-45d9-a97b-8133542007a8","Type":"ContainerStarted","Data":"db8fc32ba5d70682965f33e4da28be37add2c49b65d47a3ce1e7f0446407271d"}
Mar 09 15:23:05 crc kubenswrapper[4722]: I0309 15:23:05.901762 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-5444994796-dp8wn_317444ee-0620-47d2-869e-77578a367a87/router/0.log"
Mar 09 15:23:05 crc kubenswrapper[4722]: I0309 15:23:05.901815 4722 generic.go:334] "Generic (PLEG): container finished" podID="317444ee-0620-47d2-869e-77578a367a87" containerID="10bc0de264d7ed06b3f68942e6e3c249950032a18e0566405054ad9fe8eaaee6" exitCode=137
Mar 09 15:23:05 crc kubenswrapper[4722]: I0309 15:23:05.902799 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-dp8wn" event={"ID":"317444ee-0620-47d2-869e-77578a367a87","Type":"ContainerDied","Data":"10bc0de264d7ed06b3f68942e6e3c249950032a18e0566405054ad9fe8eaaee6"}
Mar 09 15:23:05 crc kubenswrapper[4722]: I0309 15:23:05.904150 4722 patch_prober.go:28] interesting pod/controller-manager-8445c785c8-hdmgl container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.65:8443/healthz\": dial tcp 10.217.0.65:8443: connect: connection refused" start-of-body=
Mar 09 15:23:05 crc kubenswrapper[4722]: I0309 15:23:05.904185 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-8445c785c8-hdmgl" podUID="7139e62b-5e90-4545-a264-aa8138821a55" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.65:8443/healthz\": dial tcp 10.217.0.65:8443: connect: connection refused"
Mar 09 15:23:05 crc kubenswrapper[4722]: I0309 15:23:05.965070 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-k8s-0"
Mar 09 15:23:06 crc kubenswrapper[4722]: I0309 15:23:06.183059 4722 trace.go:236] Trace[956882338]: "Calculate volume metrics of mysql-db for pod openstack/openstack-cell1-galera-0" (09-Mar-2026 15:23:04.467) (total time: 1715ms):
Mar 09 15:23:06 crc kubenswrapper[4722]: Trace[956882338]: [1.715672071s] [1.715672071s] END
Mar 09 15:23:06 crc kubenswrapper[4722]: I0309 15:23:06.576497 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 09 15:23:06 crc kubenswrapper[4722]: I0309 15:23:06.620932 4722 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-jmwpv container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body=
Mar 09 15:23:06 crc kubenswrapper[4722]: I0309 15:23:06.621005 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-jmwpv" podUID="9f2a2160-888c-4101-8b1c-63498753a2b7" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused"
Mar 09 15:23:06 crc kubenswrapper[4722]: I0309 15:23:06.924578 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-5444994796-dp8wn_317444ee-0620-47d2-869e-77578a367a87/router/0.log"
Mar 09 15:23:06 crc kubenswrapper[4722]: I0309 15:23:06.924991 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-dp8wn" event={"ID":"317444ee-0620-47d2-869e-77578a367a87","Type":"ContainerStarted","Data":"be0cb5ef61590a08f356a2dde936066c29cc3a2b9e825a2b487d1cc0331fb811"}
Mar 09 15:23:06 crc kubenswrapper[4722]: I0309 15:23:06.927588 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5db757fd5b-t57qc" event={"ID":"ec3da47d-c782-4189-b195-d6b203bd7f7a","Type":"ContainerStarted","Data":"fc62c65abfd4deeae246417095948c3887206d2693ed1f5e4ac0d12180668d85"}
Mar 09 15:23:06 crc kubenswrapper[4722]: I0309 15:23:06.928896 4722 patch_prober.go:28] interesting pod/oauth-openshift-5db757fd5b-t57qc container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.64:6443/healthz\": dial tcp 10.217.0.64:6443: connect: connection refused" start-of-body=
Mar 09 15:23:06 crc kubenswrapper[4722]: I0309 15:23:06.928934 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-5db757fd5b-t57qc" podUID="ec3da47d-c782-4189-b195-d6b203bd7f7a" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.64:6443/healthz\": dial tcp 10.217.0.64:6443: connect: connection refused"
Mar 09 15:23:07 crc kubenswrapper[4722]: I0309 15:23:07.246603 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-dp8wn"
Mar 09 15:23:07 crc kubenswrapper[4722]: I0309 15:23:07.248047 4722 patch_prober.go:28] interesting pod/router-default-5444994796-dp8wn container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body=
Mar 09 15:23:07 crc kubenswrapper[4722]: I0309 15:23:07.248097 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dp8wn" podUID="317444ee-0620-47d2-869e-77578a367a87" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused"
Mar 09 15:23:07 crc kubenswrapper[4722]: I0309 15:23:07.960015 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-5db757fd5b-t57qc"
Mar 09 15:23:07 crc kubenswrapper[4722]: I0309 15:23:07.960240 4722 patch_prober.go:28] interesting pod/oauth-openshift-5db757fd5b-t57qc container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.64:6443/healthz\": dial tcp 10.217.0.64:6443: connect: connection refused" start-of-body=
Mar 09 15:23:07 crc kubenswrapper[4722]: I0309 15:23:07.960515 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-5db757fd5b-t57qc" podUID="ec3da47d-c782-4189-b195-d6b203bd7f7a" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.64:6443/healthz\": dial tcp 10.217.0.64:6443: connect: connection refused"
Mar 09 15:23:07 crc kubenswrapper[4722]: I0309 15:23:07.963797 4722 generic.go:334] "Generic (PLEG): container finished" podID="a79aaedb-92ba-42fc-8cc7-9ecb007d2ac7" containerID="ed50884a2474119d9434961efa1de2e0e6821dcf7bc3597580905135c8f6ae50" exitCode=0
Mar 09 15:23:07 crc kubenswrapper[4722]: I0309 15:23:07.964118 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"a79aaedb-92ba-42fc-8cc7-9ecb007d2ac7","Type":"ContainerDied","Data":"ed50884a2474119d9434961efa1de2e0e6821dcf7bc3597580905135c8f6ae50"}
Mar 09 15:23:07 crc kubenswrapper[4722]: I0309 15:23:07.964935 4722 patch_prober.go:28] interesting pod/oauth-openshift-5db757fd5b-t57qc container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.64:6443/healthz\": dial tcp 10.217.0.64:6443: connect: connection refused" start-of-body=
Mar 09 15:23:07 crc kubenswrapper[4722]: I0309 15:23:07.964972 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-5db757fd5b-t57qc" podUID="ec3da47d-c782-4189-b195-d6b203bd7f7a" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.64:6443/healthz\": dial tcp 10.217.0.64:6443: connect: connection refused"
Mar 09 15:23:08 crc kubenswrapper[4722]: I0309 15:23:08.246692 4722 patch_prober.go:28] interesting pod/router-default-5444994796-dp8wn container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body=
Mar 09 15:23:08 crc kubenswrapper[4722]: I0309 15:23:08.246740 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-dp8wn" podUID="317444ee-0620-47d2-869e-77578a367a87" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused"
Mar 09 15:23:08 crc kubenswrapper[4722]: I0309 15:23:08.430976 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-scheduler-0" podUID="4c227d2b-e035-426b-b1e1-5be3a4e06090" containerName="cinder-scheduler" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 09 15:23:08 crc kubenswrapper[4722]: I0309 15:23:08.431307 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/cinder-scheduler-0"
Mar 09 15:23:08 crc kubenswrapper[4722]: I0309 15:23:08.432170 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cinder-scheduler" containerStatusID={"Type":"cri-o","ID":"231f54731c6756f7b9f1284a4fb32b33d4ba1b6c9bd8d9fce721d043fab1848e"} pod="openstack/cinder-scheduler-0" containerMessage="Container cinder-scheduler failed liveness probe, will be restarted"
Mar 09 15:23:08 crc kubenswrapper[4722]: I0309 15:23:08.432233 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="4c227d2b-e035-426b-b1e1-5be3a4e06090" containerName="cinder-scheduler" containerID="cri-o://231f54731c6756f7b9f1284a4fb32b33d4ba1b6c9bd8d9fce721d043fab1848e" gracePeriod=30
Mar 09 15:23:08 crc kubenswrapper[4722]: I0309 15:23:08.906195 4722 patch_prober.go:28] interesting pod/controller-manager-8445c785c8-hdmgl container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.65:8443/healthz\": dial tcp 10.217.0.65:8443: connect: connection refused" start-of-body=
Mar 09 15:23:08 crc kubenswrapper[4722]: I0309 15:23:08.906461 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-8445c785c8-hdmgl" podUID="7139e62b-5e90-4545-a264-aa8138821a55" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.65:8443/healthz\": dial tcp 10.217.0.65:8443: connect: connection refused"
Mar 09 15:23:08 crc kubenswrapper[4722]: I0309 15:23:08.939901 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-67579949ff-g69dw"
Mar 09 15:23:08 crc kubenswrapper[4722]: I0309 15:23:08.994686 4722 generic.go:334] "Generic (PLEG): container finished" podID="44161991-883d-4494-80b3-b829ff355f47" containerID="e17863f074bb023e7a57bc6dafe37da55973e0ac7840f2c4c87c71bc2b40aca2" exitCode=0
Mar 09 15:23:08 crc kubenswrapper[4722]: I0309 15:23:08.995263 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-svbs7" event={"ID":"44161991-883d-4494-80b3-b829ff355f47","Type":"ContainerDied","Data":"e17863f074bb023e7a57bc6dafe37da55973e0ac7840f2c4c87c71bc2b40aca2"}
Mar 09 15:23:08 crc kubenswrapper[4722]: I0309 15:23:08.996849 4722 patch_prober.go:28] interesting pod/oauth-openshift-5db757fd5b-t57qc container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.64:6443/healthz\": dial tcp 10.217.0.64:6443: connect: connection refused" start-of-body=
Mar 09 15:23:08 crc kubenswrapper[4722]: I0309 15:23:08.996894 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-5db757fd5b-t57qc" podUID="ec3da47d-c782-4189-b195-d6b203bd7f7a" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.64:6443/healthz\": dial tcp 10.217.0.64:6443: connect: connection refused"
Mar 09 15:23:09 crc kubenswrapper[4722]: E0309 15:23:09.169384 4722 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ed50884a2474119d9434961efa1de2e0e6821dcf7bc3597580905135c8f6ae50 is running failed: container process not found" containerID="ed50884a2474119d9434961efa1de2e0e6821dcf7bc3597580905135c8f6ae50" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"]
Mar 09 15:23:09 crc kubenswrapper[4722]: E0309 15:23:09.174909 4722 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ed50884a2474119d9434961efa1de2e0e6821dcf7bc3597580905135c8f6ae50 is running failed: container process not found" containerID="ed50884a2474119d9434961efa1de2e0e6821dcf7bc3597580905135c8f6ae50" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"]
Mar 09 15:23:09 crc kubenswrapper[4722]: E0309 15:23:09.176492 4722 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ed50884a2474119d9434961efa1de2e0e6821dcf7bc3597580905135c8f6ae50 is running failed: container process not found" containerID="ed50884a2474119d9434961efa1de2e0e6821dcf7bc3597580905135c8f6ae50" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"]
Mar 09 15:23:09 crc kubenswrapper[4722]: E0309 15:23:09.176548 4722 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of ed50884a2474119d9434961efa1de2e0e6821dcf7bc3597580905135c8f6ae50 is running failed: container process not found" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="a79aaedb-92ba-42fc-8cc7-9ecb007d2ac7" containerName="galera"
Mar 09 15:23:09 crc kubenswrapper[4722]: I0309 15:23:09.259486 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-dp8wn"
Mar 09 15:23:10 crc kubenswrapper[4722]: I0309 15:23:10.014484 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"a79aaedb-92ba-42fc-8cc7-9ecb007d2ac7","Type":"ContainerStarted","Data":"4c455a892a8a1c25518980c598ac51d1b803d038f30fcb6041dd74044e1c9980"}
Mar 09 15:23:10 crc kubenswrapper[4722]: I0309 15:23:10.015616 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-dp8wn"
Mar 09 15:23:10 crc kubenswrapper[4722]: I0309 15:23:10.031158 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-dp8wn"
Mar 09 15:23:10 crc kubenswrapper[4722]: I0309 15:23:10.261428 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-65599947bd-42bk4"
Mar 09 15:23:10 crc kubenswrapper[4722]: I0309 15:23:10.506365 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0"
Mar 09 15:23:10 crc kubenswrapper[4722]: I0309 15:23:10.507542 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0"
Mar 09 15:23:10 crc kubenswrapper[4722]: I0309 15:23:10.742096 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-f7fcc58b9-lvfgg"
Mar 09 15:23:10 crc kubenswrapper[4722]: I0309 15:23:10.964506 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Mar 09 15:23:11 crc kubenswrapper[4722]: I0309 15:23:11.030979 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-svbs7" event={"ID":"44161991-883d-4494-80b3-b829ff355f47","Type":"ContainerStarted","Data":"0c3a05cd3ea849665c307ddf7b879759c27b1f5aedfe3354dd3ac827df5ecb41"}
Mar 09 15:23:11 crc kubenswrapper[4722]: I0309 15:23:11.043776 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-monitoring/prometheus-k8s-0" podUID="df07ee71-98db-42a1-9df4-6f8707504f08" containerName="prometheus" probeResult="failure" output=<
Mar 09 15:23:11 crc kubenswrapper[4722]: % Total % Received % Xferd Average Speed Time Time Time Current
Mar 09 15:23:11 crc kubenswrapper[4722]: Dload Upload Total Spent Left Speed
Mar 09 15:23:11 crc kubenswrapper[4722]: [166B blob data]
Mar 09 15:23:11 crc kubenswrapper[4722]: curl: (22) The requested URL returned error: 503
Mar 09 15:23:11 crc kubenswrapper[4722]: >
Mar 09 15:23:11 crc kubenswrapper[4722]: I0309 15:23:11.051614 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-svbs7" podStartSLOduration=4.661815747 podStartE2EDuration="59.051580293s" podCreationTimestamp="2026-03-09 15:22:12 +0000 UTC" firstStartedPulling="2026-03-09 15:22:15.106316652 +0000 UTC m=+4775.661885228" lastFinishedPulling="2026-03-09 15:23:09.496081198 +0000 UTC m=+4830.051649774" observedRunningTime="2026-03-09 15:23:11.04781576 +0000 UTC m=+4831.603384336" watchObservedRunningTime="2026-03-09 15:23:11.051580293 +0000 UTC m=+4831.607148869"
Mar 09 15:23:11 crc kubenswrapper[4722]: I0309 15:23:11.173564 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Mar 09 15:23:11 crc kubenswrapper[4722]: I0309 15:23:11.246553 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9ctn5qw"
Mar 09 15:23:11 crc kubenswrapper[4722]: I0309 15:23:11.451482 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-55cd86c56-dm2dr"
Mar 09 15:23:11 crc kubenswrapper[4722]: I0309 15:23:11.472759 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-lc5zn"
Mar 09 15:23:11 crc kubenswrapper[4722]: I0309 15:23:11.649755 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-chrnr"
Mar 09 15:23:12 crc kubenswrapper[4722]: I0309 15:23:12.423463 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-cnjgj"]
Mar 09 15:23:12 crc kubenswrapper[4722]: I0309 15:23:12.426065 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cnjgj"
Mar 09 15:23:12 crc kubenswrapper[4722]: I0309 15:23:12.455379 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cnjgj"]
Mar 09 15:23:12 crc kubenswrapper[4722]: I0309 15:23:12.498539 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgnr8\" (UniqueName: \"kubernetes.io/projected/259f49a9-0c80-4b3c-96a7-24d3e0139a26-kube-api-access-bgnr8\") pod \"community-operators-cnjgj\" (UID: \"259f49a9-0c80-4b3c-96a7-24d3e0139a26\") " pod="openshift-marketplace/community-operators-cnjgj"
Mar 09 15:23:12 crc kubenswrapper[4722]: I0309 15:23:12.499093 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/259f49a9-0c80-4b3c-96a7-24d3e0139a26-utilities\") pod \"community-operators-cnjgj\" (UID: \"259f49a9-0c80-4b3c-96a7-24d3e0139a26\") " pod="openshift-marketplace/community-operators-cnjgj"
Mar 09 15:23:12 crc kubenswrapper[4722]: I0309 15:23:12.499120 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/259f49a9-0c80-4b3c-96a7-24d3e0139a26-catalog-content\") pod \"community-operators-cnjgj\" (UID: \"259f49a9-0c80-4b3c-96a7-24d3e0139a26\") " pod="openshift-marketplace/community-operators-cnjgj"
Mar 09 15:23:12 crc kubenswrapper[4722]: I0309 15:23:12.601530 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/259f49a9-0c80-4b3c-96a7-24d3e0139a26-utilities\") pod \"community-operators-cnjgj\" (UID: \"259f49a9-0c80-4b3c-96a7-24d3e0139a26\") " pod="openshift-marketplace/community-operators-cnjgj"
Mar 09 15:23:12 crc kubenswrapper[4722]: I0309 15:23:12.601567 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/259f49a9-0c80-4b3c-96a7-24d3e0139a26-catalog-content\") pod \"community-operators-cnjgj\" (UID: \"259f49a9-0c80-4b3c-96a7-24d3e0139a26\") " pod="openshift-marketplace/community-operators-cnjgj"
Mar 09 15:23:12 crc kubenswrapper[4722]: I0309 15:23:12.601703 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgnr8\" (UniqueName: \"kubernetes.io/projected/259f49a9-0c80-4b3c-96a7-24d3e0139a26-kube-api-access-bgnr8\") pod \"community-operators-cnjgj\" (UID: \"259f49a9-0c80-4b3c-96a7-24d3e0139a26\") " pod="openshift-marketplace/community-operators-cnjgj"
Mar 09 15:23:12 crc kubenswrapper[4722]: I0309 15:23:12.602168 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/259f49a9-0c80-4b3c-96a7-24d3e0139a26-utilities\") pod \"community-operators-cnjgj\" (UID: \"259f49a9-0c80-4b3c-96a7-24d3e0139a26\") " pod="openshift-marketplace/community-operators-cnjgj"
Mar 09 15:23:12 crc kubenswrapper[4722]: I0309 15:23:12.602259 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/259f49a9-0c80-4b3c-96a7-24d3e0139a26-catalog-content\") pod \"community-operators-cnjgj\" (UID: \"259f49a9-0c80-4b3c-96a7-24d3e0139a26\") " pod="openshift-marketplace/community-operators-cnjgj"
Mar 09 15:23:12 crc kubenswrapper[4722]: I0309 15:23:12.634169 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgnr8\" (UniqueName: \"kubernetes.io/projected/259f49a9-0c80-4b3c-96a7-24d3e0139a26-kube-api-access-bgnr8\") pod \"community-operators-cnjgj\" (UID: \"259f49a9-0c80-4b3c-96a7-24d3e0139a26\") " pod="openshift-marketplace/community-operators-cnjgj"
Mar 09 15:23:12 crc kubenswrapper[4722]: I0309 15:23:12.665079 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-rvtqn" podUID="3411289f-3e7c-4e43-b545-5e612822b18e" containerName="registry-server" probeResult="failure" output=<
Mar 09 15:23:12 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s
Mar 09 15:23:12 crc kubenswrapper[4722]: >
Mar 09 15:23:12 crc kubenswrapper[4722]: I0309 15:23:12.752144 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cnjgj"
Mar 09 15:23:12 crc kubenswrapper[4722]: I0309 15:23:12.826292 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-7qbrd"
Mar 09 15:23:12 crc kubenswrapper[4722]: I0309 15:23:12.827565 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-7qbrd"
Mar 09 15:23:12 crc kubenswrapper[4722]: I0309 15:23:12.957290 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-7qbrd"
Mar 09 15:23:13 crc kubenswrapper[4722]: I0309 15:23:13.114371 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-7qbrd"
Mar 09 15:23:13 crc kubenswrapper[4722]: I0309 15:23:13.149051 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-svbs7"
Mar 09 15:23:13 crc kubenswrapper[4722]: I0309 15:23:13.149103 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-svbs7"
Mar 09 15:23:13 crc kubenswrapper[4722]: I0309 15:23:13.333112 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dgxnd"
Mar 09 15:23:13 crc kubenswrapper[4722]: I0309 15:23:13.573366 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lnkwt"
Mar 09 15:23:13 crc kubenswrapper[4722]: I0309 15:23:13.668634 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-fp2th"
Mar 09 15:23:13 crc kubenswrapper[4722]: I0309 15:23:13.690635 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 09 15:23:13 crc kubenswrapper[4722]: I0309 15:23:13.701544 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 09 15:23:13 crc kubenswrapper[4722]: I0309 15:23:13.853247 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-798745ff96-864pz"
Mar 09 15:23:14 crc kubenswrapper[4722]: I0309 15:23:14.070342 4722 generic.go:334] "Generic (PLEG): container finished" podID="4c227d2b-e035-426b-b1e1-5be3a4e06090" containerID="231f54731c6756f7b9f1284a4fb32b33d4ba1b6c9bd8d9fce721d043fab1848e" exitCode=0
Mar 09 15:23:14 crc kubenswrapper[4722]: I0309 15:23:14.071995 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4c227d2b-e035-426b-b1e1-5be3a4e06090","Type":"ContainerDied","Data":"231f54731c6756f7b9f1284a4fb32b33d4ba1b6c9bd8d9fce721d043fab1848e"}
Mar 09 15:23:14 crc kubenswrapper[4722]: I0309 15:23:14.117386 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 09 15:23:14 crc kubenswrapper[4722]: I0309 15:23:14.276833 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-svbs7" podUID="44161991-883d-4494-80b3-b829ff355f47" containerName="registry-server" probeResult="failure" output=<
Mar 09 15:23:14 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s
Mar 09 15:23:14 crc kubenswrapper[4722]: >
Mar 09 15:23:14 crc kubenswrapper[4722]: I0309 15:23:14.624345 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-2nlfl"
Mar 09 15:23:14 crc kubenswrapper[4722]: I0309 15:23:14.666281 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-86ddb6bd46-6w5ww"
Mar 09 15:23:14 crc kubenswrapper[4722]: W0309 15:23:14.720449 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod259f49a9_0c80_4b3c_96a7_24d3e0139a26.slice/crio-a8dd708893314923306fb1fb6bc2c6b86787a31c6d8ccd5d5a524e00e5f8614f WatchSource:0}: Error finding container a8dd708893314923306fb1fb6bc2c6b86787a31c6d8ccd5d5a524e00e5f8614f: Status 404 returned error can't find the container with id a8dd708893314923306fb1fb6bc2c6b86787a31c6d8ccd5d5a524e00e5f8614f
Mar 09 15:23:14 crc kubenswrapper[4722]: I0309 15:23:14.770518 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cnjgj"]
Mar 09 15:23:14 crc kubenswrapper[4722]: I0309 15:23:14.788496 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-ct7x8"
Mar 09 15:23:14 crc kubenswrapper[4722]: I0309 15:23:14.844743 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-zjf7b"
Mar 09 15:23:15 crc kubenswrapper[4722]: I0309 15:23:15.068388 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-8xdjl"
Mar 09 15:23:15 crc kubenswrapper[4722]: I0309 15:23:15.119288 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cnjgj" event={"ID":"259f49a9-0c80-4b3c-96a7-24d3e0139a26","Type":"ContainerStarted","Data":"a8dd708893314923306fb1fb6bc2c6b86787a31c6d8ccd5d5a524e00e5f8614f"}
Mar 09 15:23:15 crc kubenswrapper[4722]: I0309 15:23:15.165079 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-6vn96"
Mar 09 15:23:15 crc kubenswrapper[4722]: I0309 15:23:15.169432 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-hnzfw"
Mar 09 15:23:15 crc kubenswrapper[4722]: I0309 15:23:15.209718 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-2hxzr"
Mar 09 15:23:15 crc kubenswrapper[4722]: I0309 15:23:15.290032 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-l8bds"
Mar 09 15:23:15 crc kubenswrapper[4722]: I0309 15:23:15.429099 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-n5zc7"
Mar 09 15:23:15 crc kubenswrapper[4722]: I0309 15:23:15.463501 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-c2mmw" podUID="690e5ab0-3719-40ac-aba6-9278480ecb44" containerName="registry-server" probeResult="failure" output=<
Mar 09 15:23:15 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s
Mar 09 15:23:15 crc kubenswrapper[4722]: >
Mar 09 15:23:15 crc kubenswrapper[4722]: I0309 15:23:15.646831 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-hgkzs"
Mar 09 15:23:15 crc kubenswrapper[4722]: I0309 15:23:15.676872 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators-redhat/loki-operator-controller-manager-7d6d6698bd-4r85k"
Mar 09 15:23:15 crc kubenswrapper[4722]: I0309 15:23:15.690731 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-v57f2" podUID="4e27c5a4-8cba-4119-8006-f9841d6121dc" containerName="registry-server" probeResult="failure" output=<
Mar 09 15:23:15 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s
Mar 09 15:23:15 crc kubenswrapper[4722]: >
Mar 09 15:23:15 crc kubenswrapper[4722]: I0309 15:23:15.711720 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-56qz9"
Mar 09 15:23:15 crc kubenswrapper[4722]: I0309 15:23:15.736426 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-66c8b7dfbb-m7fv2"
Mar 09 15:23:15 crc kubenswrapper[4722]: I0309 15:23:15.954196 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-pg8qn"
Mar 09 15:23:16 crc kubenswrapper[4722]: I0309 15:23:16.131051 4722 generic.go:334] "Generic (PLEG): container finished" podID="259f49a9-0c80-4b3c-96a7-24d3e0139a26" containerID="2aada6aeae14620a04f1c221ca630c9e6dc733854bcb233a9dbd039677fe36fe" exitCode=0
Mar 09 15:23:16 crc kubenswrapper[4722]: I0309 15:23:16.131400 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cnjgj" event={"ID":"259f49a9-0c80-4b3c-96a7-24d3e0139a26","Type":"ContainerDied","Data":"2aada6aeae14620a04f1c221ca630c9e6dc733854bcb233a9dbd039677fe36fe"}
Mar 09 15:23:16 crc kubenswrapper[4722]: I0309 15:23:16.146921 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4c227d2b-e035-426b-b1e1-5be3a4e06090","Type":"ContainerStarted","Data":"6d7a07cdc7286168b959730e6235b24d9421093485910ba6aff0b20cb7df66ea"}
Mar 09 15:23:17 crc kubenswrapper[4722]: I0309 15:23:17.159855 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cnjgj" event={"ID":"259f49a9-0c80-4b3c-96a7-24d3e0139a26","Type":"ContainerStarted","Data":"6c9115a8d05760586be02a1b2d44e99ba4fdfd5544ed91fd67bf7d955acdf2a9"}
Mar 09 15:23:17 crc kubenswrapper[4722]: I0309 15:23:17.162078 4722 generic.go:334] "Generic (PLEG): container finished" podID="89fd3e80-4c6c-4619-ad44-ef440c0b1fb6" containerID="59f9e2e53ec91c7b903e31572a6931dadcb8456c840d62bcc991024bba5ef79f" exitCode=1
Mar 09 15:23:17 crc kubenswrapper[4722]: I0309 15:23:17.162139 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"89fd3e80-4c6c-4619-ad44-ef440c0b1fb6","Type":"ContainerDied","Data":"59f9e2e53ec91c7b903e31572a6931dadcb8456c840d62bcc991024bba5ef79f"}
Mar 09 15:23:17 crc kubenswrapper[4722]: I0309 15:23:17.964263 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-5db757fd5b-t57qc"
Mar 09 15:23:18 crc kubenswrapper[4722]: I0309 15:23:18.343163 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Mar 09 15:23:18 crc kubenswrapper[4722]: I0309 15:23:18.911685 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-8445c785c8-hdmgl"
Mar 09 15:23:18 crc kubenswrapper[4722]: I0309 15:23:18.975776 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Mar 09 15:23:19 crc kubenswrapper[4722]: I0309 15:23:19.033277 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/89fd3e80-4c6c-4619-ad44-ef440c0b1fb6-test-operator-ephemeral-temporary\") pod \"89fd3e80-4c6c-4619-ad44-ef440c0b1fb6\" (UID: \"89fd3e80-4c6c-4619-ad44-ef440c0b1fb6\") "
Mar 09 15:23:19 crc kubenswrapper[4722]: I0309 15:23:19.033366 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/89fd3e80-4c6c-4619-ad44-ef440c0b1fb6-config-data\") pod \"89fd3e80-4c6c-4619-ad44-ef440c0b1fb6\" (UID: \"89fd3e80-4c6c-4619-ad44-ef440c0b1fb6\") "
Mar 09 15:23:19 crc kubenswrapper[4722]: I0309 15:23:19.033824 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/89fd3e80-4c6c-4619-ad44-ef440c0b1fb6-openstack-config\") pod \"89fd3e80-4c6c-4619-ad44-ef440c0b1fb6\" (UID: \"89fd3e80-4c6c-4619-ad44-ef440c0b1fb6\") "
Mar 09 15:23:19 crc kubenswrapper[4722]: I0309 15:23:19.034070 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/89fd3e80-4c6c-4619-ad44-ef440c0b1fb6-openstack-config-secret\") pod \"89fd3e80-4c6c-4619-ad44-ef440c0b1fb6\" (UID: \"89fd3e80-4c6c-4619-ad44-ef440c0b1fb6\") "
Mar 09 15:23:19 crc kubenswrapper[4722]: I0309 15:23:19.034114 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/89fd3e80-4c6c-4619-ad44-ef440c0b1fb6-ca-certs\") pod \"89fd3e80-4c6c-4619-ad44-ef440c0b1fb6\" (UID: \"89fd3e80-4c6c-4619-ad44-ef440c0b1fb6\") "
Mar 09 15:23:19 crc kubenswrapper[4722]: I0309 15:23:19.034180 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"89fd3e80-4c6c-4619-ad44-ef440c0b1fb6\" (UID: \"89fd3e80-4c6c-4619-ad44-ef440c0b1fb6\") "
Mar 09 15:23:19 crc kubenswrapper[4722]: I0309 15:23:19.034222 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/89fd3e80-4c6c-4619-ad44-ef440c0b1fb6-ssh-key\") pod \"89fd3e80-4c6c-4619-ad44-ef440c0b1fb6\" (UID: \"89fd3e80-4c6c-4619-ad44-ef440c0b1fb6\") "
Mar 09 15:23:19 crc kubenswrapper[4722]: I0309 15:23:19.034269 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/89fd3e80-4c6c-4619-ad44-ef440c0b1fb6-test-operator-ephemeral-workdir\") pod \"89fd3e80-4c6c-4619-ad44-ef440c0b1fb6\" (UID: \"89fd3e80-4c6c-4619-ad44-ef440c0b1fb6\") "
Mar 09 15:23:19 crc kubenswrapper[4722]: I0309 15:23:19.034347 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5cgw\" (UniqueName: \"kubernetes.io/projected/89fd3e80-4c6c-4619-ad44-ef440c0b1fb6-kube-api-access-w5cgw\") pod \"89fd3e80-4c6c-4619-ad44-ef440c0b1fb6\" (UID: \"89fd3e80-4c6c-4619-ad44-ef440c0b1fb6\") "
Mar 09 15:23:19 crc kubenswrapper[4722]: I0309 15:23:19.034955 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89fd3e80-4c6c-4619-ad44-ef440c0b1fb6-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "89fd3e80-4c6c-4619-ad44-ef440c0b1fb6" (UID: "89fd3e80-4c6c-4619-ad44-ef440c0b1fb6"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 15:23:19 crc kubenswrapper[4722]: I0309 15:23:19.039317 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89fd3e80-4c6c-4619-ad44-ef440c0b1fb6-config-data" (OuterVolumeSpecName: "config-data") pod "89fd3e80-4c6c-4619-ad44-ef440c0b1fb6" (UID: "89fd3e80-4c6c-4619-ad44-ef440c0b1fb6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 15:23:19 crc kubenswrapper[4722]: I0309 15:23:19.046301 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89fd3e80-4c6c-4619-ad44-ef440c0b1fb6-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "89fd3e80-4c6c-4619-ad44-ef440c0b1fb6" (UID: "89fd3e80-4c6c-4619-ad44-ef440c0b1fb6"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 09 15:23:19 crc kubenswrapper[4722]: I0309 15:23:19.046613 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89fd3e80-4c6c-4619-ad44-ef440c0b1fb6-kube-api-access-w5cgw" (OuterVolumeSpecName: "kube-api-access-w5cgw") pod "89fd3e80-4c6c-4619-ad44-ef440c0b1fb6" (UID: "89fd3e80-4c6c-4619-ad44-ef440c0b1fb6"). InnerVolumeSpecName "kube-api-access-w5cgw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 09 15:23:19 crc kubenswrapper[4722]: I0309 15:23:19.083394 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "test-operator-logs") pod "89fd3e80-4c6c-4619-ad44-ef440c0b1fb6" (UID: "89fd3e80-4c6c-4619-ad44-ef440c0b1fb6"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 09 15:23:19 crc kubenswrapper[4722]: I0309 15:23:19.092385 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89fd3e80-4c6c-4619-ad44-ef440c0b1fb6-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "89fd3e80-4c6c-4619-ad44-ef440c0b1fb6" (UID: "89fd3e80-4c6c-4619-ad44-ef440c0b1fb6"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 15:23:19 crc kubenswrapper[4722]: I0309 15:23:19.101480 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89fd3e80-4c6c-4619-ad44-ef440c0b1fb6-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "89fd3e80-4c6c-4619-ad44-ef440c0b1fb6" (UID: "89fd3e80-4c6c-4619-ad44-ef440c0b1fb6"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 15:23:19 crc kubenswrapper[4722]: I0309 15:23:19.132672 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89fd3e80-4c6c-4619-ad44-ef440c0b1fb6-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "89fd3e80-4c6c-4619-ad44-ef440c0b1fb6" (UID: "89fd3e80-4c6c-4619-ad44-ef440c0b1fb6"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 09 15:23:19 crc kubenswrapper[4722]: I0309 15:23:19.137451 4722 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" "
Mar 09 15:23:19 crc kubenswrapper[4722]: I0309 15:23:19.137486 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/89fd3e80-4c6c-4619-ad44-ef440c0b1fb6-ssh-key\") on node \"crc\" DevicePath \"\""
Mar 09 15:23:19 crc kubenswrapper[4722]: I0309 15:23:19.137496 4722 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/89fd3e80-4c6c-4619-ad44-ef440c0b1fb6-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\""
Mar 09 15:23:19 crc kubenswrapper[4722]: I0309 15:23:19.137507 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5cgw\" (UniqueName: \"kubernetes.io/projected/89fd3e80-4c6c-4619-ad44-ef440c0b1fb6-kube-api-access-w5cgw\") on node \"crc\" DevicePath \"\""
Mar 09 15:23:19 crc kubenswrapper[4722]: I0309 15:23:19.137517 4722 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/89fd3e80-4c6c-4619-ad44-ef440c0b1fb6-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\""
Mar 09 15:23:19 crc kubenswrapper[4722]: I0309 15:23:19.137527 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/89fd3e80-4c6c-4619-ad44-ef440c0b1fb6-config-data\") on node \"crc\" DevicePath \"\""
Mar 09 15:23:19 crc kubenswrapper[4722]: I0309 15:23:19.137537 4722 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/89fd3e80-4c6c-4619-ad44-ef440c0b1fb6-openstack-config-secret\") on node \"crc\" DevicePath \"\""
Mar 09 15:23:19 crc kubenswrapper[4722]: I0309 15:23:19.137545 4722 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/89fd3e80-4c6c-4619-ad44-ef440c0b1fb6-ca-certs\") on node \"crc\" DevicePath \"\""
Mar 09 15:23:19 crc kubenswrapper[4722]: I0309 15:23:19.149013 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0"
Mar 09 15:23:19 crc kubenswrapper[4722]: I0309 15:23:19.149061 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0"
Mar 09 15:23:19 crc kubenswrapper[4722]: I0309 15:23:19.156708 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89fd3e80-4c6c-4619-ad44-ef440c0b1fb6-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "89fd3e80-4c6c-4619-ad44-ef440c0b1fb6" (UID: "89fd3e80-4c6c-4619-ad44-ef440c0b1fb6"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 09 15:23:19 crc kubenswrapper[4722]: I0309 15:23:19.170334 4722 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc"
Mar 09 15:23:19 crc kubenswrapper[4722]: I0309 15:23:19.192470 4722 generic.go:334] "Generic (PLEG): container finished" podID="259f49a9-0c80-4b3c-96a7-24d3e0139a26" containerID="6c9115a8d05760586be02a1b2d44e99ba4fdfd5544ed91fd67bf7d955acdf2a9" exitCode=0
Mar 09 15:23:19 crc kubenswrapper[4722]: I0309 15:23:19.192534 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cnjgj" event={"ID":"259f49a9-0c80-4b3c-96a7-24d3e0139a26","Type":"ContainerDied","Data":"6c9115a8d05760586be02a1b2d44e99ba4fdfd5544ed91fd67bf7d955acdf2a9"}
Mar 09 15:23:19 crc kubenswrapper[4722]: I0309 15:23:19.200585 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"89fd3e80-4c6c-4619-ad44-ef440c0b1fb6","Type":"ContainerDied","Data":"3df5135e290af710c1f2c95f2c9f78d748d6cf1e02be5e74a3958bb146b287e0"}
Mar 09 15:23:19 crc kubenswrapper[4722]: I0309 15:23:19.200823 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3df5135e290af710c1f2c95f2c9f78d748d6cf1e02be5e74a3958bb146b287e0"
Mar 09 15:23:19 crc kubenswrapper[4722]: I0309 15:23:19.200934 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Mar 09 15:23:19 crc kubenswrapper[4722]: I0309 15:23:19.241393 4722 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/89fd3e80-4c6c-4619-ad44-ef440c0b1fb6-openstack-config\") on node \"crc\" DevicePath \"\""
Mar 09 15:23:19 crc kubenswrapper[4722]: I0309 15:23:19.241450 4722 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\""
Mar 09 15:23:19 crc kubenswrapper[4722]: I0309 15:23:19.904641 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-6dc4c5dd4b-rk4jt" podUID="6dc5b476-5a42-4c98-9a95-0e3b29f2f771" containerName="console" containerID="cri-o://b8f5b1038e62f1fa81cc3b716e91ed6508a1746f147df7b94d2cb47945f17185" gracePeriod=15
Mar 09 15:23:20 crc kubenswrapper[4722]: I0309 15:23:20.214968 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cnjgj" event={"ID":"259f49a9-0c80-4b3c-96a7-24d3e0139a26","Type":"ContainerStarted","Data":"7a0e2b436fd8a4c44adc95f64130576f7b352d194143f429ca7aa730791bf082"}
Mar 09 15:23:20 crc kubenswrapper[4722]: I0309 15:23:20.218608 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6dc4c5dd4b-rk4jt_6dc5b476-5a42-4c98-9a95-0e3b29f2f771/console/0.log"
Mar 09 15:23:20 crc kubenswrapper[4722]: I0309 15:23:20.218667 4722 generic.go:334] "Generic (PLEG): container finished" podID="6dc5b476-5a42-4c98-9a95-0e3b29f2f771" containerID="b8f5b1038e62f1fa81cc3b716e91ed6508a1746f147df7b94d2cb47945f17185" exitCode=2
Mar 09 15:23:20 crc kubenswrapper[4722]: I0309 15:23:20.218697 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6dc4c5dd4b-rk4jt" event={"ID":"6dc5b476-5a42-4c98-9a95-0e3b29f2f771","Type":"ContainerDied","Data":"b8f5b1038e62f1fa81cc3b716e91ed6508a1746f147df7b94d2cb47945f17185"}
Mar 09 15:23:20 crc kubenswrapper[4722]: I0309 15:23:20.243430 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-cnjgj" podStartSLOduration=4.789624878 podStartE2EDuration="8.243412196s" podCreationTimestamp="2026-03-09 15:23:12 +0000 UTC" firstStartedPulling="2026-03-09 15:23:16.139172922 +0000 UTC m=+4836.694741498" lastFinishedPulling="2026-03-09 15:23:19.59296024 +0000 UTC m=+4840.148528816" observedRunningTime="2026-03-09 15:23:20.242329036 +0000 UTC m=+4840.797897612" watchObservedRunningTime="2026-03-09 15:23:20.243412196 +0000 UTC m=+4840.798980772"
Mar 09 15:23:21 crc kubenswrapper[4722]: I0309 15:23:21.233956 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6dc4c5dd4b-rk4jt_6dc5b476-5a42-4c98-9a95-0e3b29f2f771/console/0.log"
Mar 09 15:23:21 crc kubenswrapper[4722]: I0309 15:23:21.234258 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6dc4c5dd4b-rk4jt" event={"ID":"6dc5b476-5a42-4c98-9a95-0e3b29f2f771","Type":"ContainerStarted","Data":"e6b68ecc17cd3129db93ce4556a922d23b22b96d5bad11f04ed33e92aabec48a"}
Mar 09 15:23:21 crc kubenswrapper[4722]: I0309 15:23:21.527991 4722 patch_prober.go:28] interesting pod/machine-config-daemon-hjrrb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 09 15:23:21 crc kubenswrapper[4722]: I0309 15:23:21.528049 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 09 15:23:22 crc kubenswrapper[4722]: I0309 15:23:22.603885 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-rvtqn" podUID="3411289f-3e7c-4e43-b545-5e612822b18e" containerName="registry-server" probeResult="failure" output=<
Mar 09 15:23:22 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s
Mar 09 15:23:22 crc kubenswrapper[4722]: >
Mar 09 15:23:22 crc kubenswrapper[4722]: I0309 15:23:22.752881 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-cnjgj"
Mar 09 15:23:22 crc kubenswrapper[4722]: I0309 15:23:22.752920 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-cnjgj"
Mar 09 15:23:22 crc kubenswrapper[4722]: I0309 15:23:22.819106 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-cnjgj"
Mar 09 15:23:23 crc kubenswrapper[4722]: I0309 15:23:23.375567 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/cinder-scheduler-0" podUID="4c227d2b-e035-426b-b1e1-5be3a4e06090" containerName="cinder-scheduler" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 09 15:23:24 crc kubenswrapper[4722]: I0309 15:23:24.120870 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"]
Mar 09 15:23:24 crc kubenswrapper[4722]: E0309 15:23:24.121495 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89fd3e80-4c6c-4619-ad44-ef440c0b1fb6" containerName="tempest-tests-tempest-tests-runner"
Mar 09 15:23:24 crc kubenswrapper[4722]: I0309 15:23:24.121525 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="89fd3e80-4c6c-4619-ad44-ef440c0b1fb6" containerName="tempest-tests-tempest-tests-runner"
Mar 09 15:23:24 crc kubenswrapper[4722]: I0309 15:23:24.122318 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="89fd3e80-4c6c-4619-ad44-ef440c0b1fb6" containerName="tempest-tests-tempest-tests-runner"
Mar 09 15:23:24 crc kubenswrapper[4722]: I0309 15:23:24.123228 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 09 15:23:24 crc kubenswrapper[4722]: I0309 15:23:24.125067 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-l2bdw"
Mar 09 15:23:24 crc kubenswrapper[4722]: I0309 15:23:24.134408 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"]
Mar 09 15:23:24 crc kubenswrapper[4722]: I0309 15:23:24.201833 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-svbs7" podUID="44161991-883d-4494-80b3-b829ff355f47" containerName="registry-server" probeResult="failure" output=<
Mar 09 15:23:24 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s
Mar 09 15:23:24 crc kubenswrapper[4722]: >
Mar 09 15:23:24 crc kubenswrapper[4722]: I0309 15:23:24.267225 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"1ed3bba7-c106-49a1-96d6-672710c534bf\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 09 15:23:24 crc kubenswrapper[4722]: I0309 15:23:24.267730 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvmnx\" (UniqueName: \"kubernetes.io/projected/1ed3bba7-c106-49a1-96d6-672710c534bf-kube-api-access-pvmnx\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"1ed3bba7-c106-49a1-96d6-672710c534bf\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 09 15:23:24 crc kubenswrapper[4722]: I0309 15:23:24.328428 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-cnjgj"
Mar 09 15:23:24 crc kubenswrapper[4722]: I0309 15:23:24.370253 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvmnx\" (UniqueName: \"kubernetes.io/projected/1ed3bba7-c106-49a1-96d6-672710c534bf-kube-api-access-pvmnx\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"1ed3bba7-c106-49a1-96d6-672710c534bf\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 09 15:23:24 crc kubenswrapper[4722]: I0309 15:23:24.370766 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"1ed3bba7-c106-49a1-96d6-672710c534bf\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 09 15:23:24 crc kubenswrapper[4722]: I0309 15:23:24.371473 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"1ed3bba7-c106-49a1-96d6-672710c534bf\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 09 15:23:24 crc kubenswrapper[4722]: I0309 15:23:24.405733 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvmnx\" (UniqueName: \"kubernetes.io/projected/1ed3bba7-c106-49a1-96d6-672710c534bf-kube-api-access-pvmnx\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"1ed3bba7-c106-49a1-96d6-672710c534bf\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 09 15:23:24 crc kubenswrapper[4722]: I0309 15:23:24.406929 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"1ed3bba7-c106-49a1-96d6-672710c534bf\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 09 15:23:24 crc kubenswrapper[4722]: I0309 15:23:24.436611 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6dc4c5dd4b-rk4jt"
Mar 09 15:23:24 crc kubenswrapper[4722]: I0309 15:23:24.436657 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-6dc4c5dd4b-rk4jt"
Mar 09 15:23:24 crc kubenswrapper[4722]: I0309 15:23:24.442748 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 09 15:23:24 crc kubenswrapper[4722]: I0309 15:23:24.444265 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6dc4c5dd4b-rk4jt"
Mar 09 15:23:25 crc kubenswrapper[4722]: I0309 15:23:25.109018 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"]
Mar 09 15:23:25 crc kubenswrapper[4722]: I0309 15:23:25.280600 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"1ed3bba7-c106-49a1-96d6-672710c534bf","Type":"ContainerStarted","Data":"be28956430215ae1731749d2f579edcd52bd1f0fb7102e951efeac1406657f5f"}
Mar 09 15:23:25 crc kubenswrapper[4722]: I0309 15:23:25.286450 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6dc4c5dd4b-rk4jt"
Mar 09 15:23:25 crc kubenswrapper[4722]: I0309 15:23:25.464255 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-c2mmw" podUID="690e5ab0-3719-40ac-aba6-9278480ecb44" containerName="registry-server" probeResult="failure" output=<
Mar 09 15:23:25 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s
Mar 09 15:23:25 crc kubenswrapper[4722]: >
Mar 09 15:23:25 crc kubenswrapper[4722]: I0309 15:23:25.647966 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-v57f2" podUID="4e27c5a4-8cba-4119-8006-f9841d6121dc" containerName="registry-server" probeResult="failure" output=<
Mar 09 15:23:25 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s
Mar 09 15:23:25 crc kubenswrapper[4722]: >
Mar 09 15:23:27 crc kubenswrapper[4722]: I0309 15:23:27.242560 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cnjgj"]
Mar 09 15:23:27 crc kubenswrapper[4722]: I0309 15:23:27.243466 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-cnjgj" podUID="259f49a9-0c80-4b3c-96a7-24d3e0139a26" containerName="registry-server" containerID="cri-o://7a0e2b436fd8a4c44adc95f64130576f7b352d194143f429ca7aa730791bf082" gracePeriod=2
Mar 09 15:23:27 crc kubenswrapper[4722]: I0309 15:23:27.318023 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"1ed3bba7-c106-49a1-96d6-672710c534bf","Type":"ContainerStarted","Data":"8e412a94d4604e7463e51f0577a8b141ae83b19973d56c7d6c2a3150e6f09706"}
Mar 09 15:23:27 crc kubenswrapper[4722]: I0309 15:23:27.334082 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.225842765 podStartE2EDuration="3.334061248s" podCreationTimestamp="2026-03-09 15:23:24 +0000 UTC" firstStartedPulling="2026-03-09 15:23:25.115625616 +0000 UTC m=+4845.671194192" lastFinishedPulling="2026-03-09 15:23:26.223844099 +0000 UTC m=+4846.779412675" observedRunningTime="2026-03-09 15:23:27.33339316 +0000 UTC m=+4847.888961726" watchObservedRunningTime="2026-03-09 15:23:27.334061248 +0000 UTC m=+4847.889629824"
Mar 09 15:23:27 crc kubenswrapper[4722]: I0309 15:23:27.983486 4722 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/community-operators-cnjgj" Mar 09 15:23:28 crc kubenswrapper[4722]: I0309 15:23:28.168307 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/259f49a9-0c80-4b3c-96a7-24d3e0139a26-utilities\") pod \"259f49a9-0c80-4b3c-96a7-24d3e0139a26\" (UID: \"259f49a9-0c80-4b3c-96a7-24d3e0139a26\") " Mar 09 15:23:28 crc kubenswrapper[4722]: I0309 15:23:28.168722 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bgnr8\" (UniqueName: \"kubernetes.io/projected/259f49a9-0c80-4b3c-96a7-24d3e0139a26-kube-api-access-bgnr8\") pod \"259f49a9-0c80-4b3c-96a7-24d3e0139a26\" (UID: \"259f49a9-0c80-4b3c-96a7-24d3e0139a26\") " Mar 09 15:23:28 crc kubenswrapper[4722]: I0309 15:23:28.168848 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/259f49a9-0c80-4b3c-96a7-24d3e0139a26-catalog-content\") pod \"259f49a9-0c80-4b3c-96a7-24d3e0139a26\" (UID: \"259f49a9-0c80-4b3c-96a7-24d3e0139a26\") " Mar 09 15:23:28 crc kubenswrapper[4722]: I0309 15:23:28.169040 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/259f49a9-0c80-4b3c-96a7-24d3e0139a26-utilities" (OuterVolumeSpecName: "utilities") pod "259f49a9-0c80-4b3c-96a7-24d3e0139a26" (UID: "259f49a9-0c80-4b3c-96a7-24d3e0139a26"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 15:23:28 crc kubenswrapper[4722]: I0309 15:23:28.169432 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/259f49a9-0c80-4b3c-96a7-24d3e0139a26-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 15:23:28 crc kubenswrapper[4722]: I0309 15:23:28.186720 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/259f49a9-0c80-4b3c-96a7-24d3e0139a26-kube-api-access-bgnr8" (OuterVolumeSpecName: "kube-api-access-bgnr8") pod "259f49a9-0c80-4b3c-96a7-24d3e0139a26" (UID: "259f49a9-0c80-4b3c-96a7-24d3e0139a26"). InnerVolumeSpecName "kube-api-access-bgnr8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 15:23:28 crc kubenswrapper[4722]: I0309 15:23:28.274838 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bgnr8\" (UniqueName: \"kubernetes.io/projected/259f49a9-0c80-4b3c-96a7-24d3e0139a26-kube-api-access-bgnr8\") on node \"crc\" DevicePath \"\"" Mar 09 15:23:28 crc kubenswrapper[4722]: I0309 15:23:28.280037 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/259f49a9-0c80-4b3c-96a7-24d3e0139a26-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "259f49a9-0c80-4b3c-96a7-24d3e0139a26" (UID: "259f49a9-0c80-4b3c-96a7-24d3e0139a26"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 15:23:28 crc kubenswrapper[4722]: I0309 15:23:28.332880 4722 generic.go:334] "Generic (PLEG): container finished" podID="259f49a9-0c80-4b3c-96a7-24d3e0139a26" containerID="7a0e2b436fd8a4c44adc95f64130576f7b352d194143f429ca7aa730791bf082" exitCode=0 Mar 09 15:23:28 crc kubenswrapper[4722]: I0309 15:23:28.332948 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cnjgj" Mar 09 15:23:28 crc kubenswrapper[4722]: I0309 15:23:28.332989 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cnjgj" event={"ID":"259f49a9-0c80-4b3c-96a7-24d3e0139a26","Type":"ContainerDied","Data":"7a0e2b436fd8a4c44adc95f64130576f7b352d194143f429ca7aa730791bf082"} Mar 09 15:23:28 crc kubenswrapper[4722]: I0309 15:23:28.333043 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cnjgj" event={"ID":"259f49a9-0c80-4b3c-96a7-24d3e0139a26","Type":"ContainerDied","Data":"a8dd708893314923306fb1fb6bc2c6b86787a31c6d8ccd5d5a524e00e5f8614f"} Mar 09 15:23:28 crc kubenswrapper[4722]: I0309 15:23:28.333060 4722 scope.go:117] "RemoveContainer" containerID="7a0e2b436fd8a4c44adc95f64130576f7b352d194143f429ca7aa730791bf082" Mar 09 15:23:28 crc kubenswrapper[4722]: I0309 15:23:28.377149 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/259f49a9-0c80-4b3c-96a7-24d3e0139a26-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 15:23:28 crc kubenswrapper[4722]: I0309 15:23:28.379644 4722 scope.go:117] "RemoveContainer" containerID="6c9115a8d05760586be02a1b2d44e99ba4fdfd5544ed91fd67bf7d955acdf2a9" Mar 09 15:23:28 crc kubenswrapper[4722]: I0309 15:23:28.382252 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cnjgj"] Mar 09 15:23:28 crc kubenswrapper[4722]: I0309 15:23:28.397620 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-cnjgj"] Mar 09 15:23:28 crc kubenswrapper[4722]: I0309 15:23:28.400646 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/cinder-scheduler-0" podUID="4c227d2b-e035-426b-b1e1-5be3a4e06090" containerName="cinder-scheduler" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 15:23:28 crc kubenswrapper[4722]: I0309 15:23:28.986911 4722 scope.go:117] "RemoveContainer" containerID="2aada6aeae14620a04f1c221ca630c9e6dc733854bcb233a9dbd039677fe36fe" Mar 09 15:23:29 crc kubenswrapper[4722]: I0309 15:23:29.055633 4722 scope.go:117] "RemoveContainer" containerID="7a0e2b436fd8a4c44adc95f64130576f7b352d194143f429ca7aa730791bf082" Mar 09 15:23:29 crc kubenswrapper[4722]: E0309 15:23:29.056178 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a0e2b436fd8a4c44adc95f64130576f7b352d194143f429ca7aa730791bf082\": container with ID starting with 7a0e2b436fd8a4c44adc95f64130576f7b352d194143f429ca7aa730791bf082 not found: ID does not exist" containerID="7a0e2b436fd8a4c44adc95f64130576f7b352d194143f429ca7aa730791bf082" Mar 09 15:23:29 crc kubenswrapper[4722]: I0309 15:23:29.056244 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a0e2b436fd8a4c44adc95f64130576f7b352d194143f429ca7aa730791bf082"} err="failed to get container status \"7a0e2b436fd8a4c44adc95f64130576f7b352d194143f429ca7aa730791bf082\": rpc error: code = NotFound desc = could not find container \"7a0e2b436fd8a4c44adc95f64130576f7b352d194143f429ca7aa730791bf082\": container with ID starting with 7a0e2b436fd8a4c44adc95f64130576f7b352d194143f429ca7aa730791bf082 not found: ID does not exist" Mar 09 15:23:29 crc kubenswrapper[4722]: I0309 15:23:29.056295 4722 scope.go:117] "RemoveContainer" 
containerID="6c9115a8d05760586be02a1b2d44e99ba4fdfd5544ed91fd67bf7d955acdf2a9" Mar 09 15:23:29 crc kubenswrapper[4722]: E0309 15:23:29.056792 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c9115a8d05760586be02a1b2d44e99ba4fdfd5544ed91fd67bf7d955acdf2a9\": container with ID starting with 6c9115a8d05760586be02a1b2d44e99ba4fdfd5544ed91fd67bf7d955acdf2a9 not found: ID does not exist" containerID="6c9115a8d05760586be02a1b2d44e99ba4fdfd5544ed91fd67bf7d955acdf2a9" Mar 09 15:23:29 crc kubenswrapper[4722]: I0309 15:23:29.056896 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c9115a8d05760586be02a1b2d44e99ba4fdfd5544ed91fd67bf7d955acdf2a9"} err="failed to get container status \"6c9115a8d05760586be02a1b2d44e99ba4fdfd5544ed91fd67bf7d955acdf2a9\": rpc error: code = NotFound desc = could not find container \"6c9115a8d05760586be02a1b2d44e99ba4fdfd5544ed91fd67bf7d955acdf2a9\": container with ID starting with 6c9115a8d05760586be02a1b2d44e99ba4fdfd5544ed91fd67bf7d955acdf2a9 not found: ID does not exist" Mar 09 15:23:29 crc kubenswrapper[4722]: I0309 15:23:29.056939 4722 scope.go:117] "RemoveContainer" containerID="2aada6aeae14620a04f1c221ca630c9e6dc733854bcb233a9dbd039677fe36fe" Mar 09 15:23:29 crc kubenswrapper[4722]: E0309 15:23:29.057313 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2aada6aeae14620a04f1c221ca630c9e6dc733854bcb233a9dbd039677fe36fe\": container with ID starting with 2aada6aeae14620a04f1c221ca630c9e6dc733854bcb233a9dbd039677fe36fe not found: ID does not exist" containerID="2aada6aeae14620a04f1c221ca630c9e6dc733854bcb233a9dbd039677fe36fe" Mar 09 15:23:29 crc kubenswrapper[4722]: I0309 15:23:29.057410 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2aada6aeae14620a04f1c221ca630c9e6dc733854bcb233a9dbd039677fe36fe"} err="failed to get container status \"2aada6aeae14620a04f1c221ca630c9e6dc733854bcb233a9dbd039677fe36fe\": rpc error: code = NotFound desc = could not find container \"2aada6aeae14620a04f1c221ca630c9e6dc733854bcb233a9dbd039677fe36fe\": container with ID starting with 2aada6aeae14620a04f1c221ca630c9e6dc733854bcb233a9dbd039677fe36fe not found: ID does not exist" Mar 09 15:23:30 crc kubenswrapper[4722]: I0309 15:23:30.174306 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="259f49a9-0c80-4b3c-96a7-24d3e0139a26" path="/var/lib/kubelet/pods/259f49a9-0c80-4b3c-96a7-24d3e0139a26/volumes" Mar 09 15:23:31 crc kubenswrapper[4722]: I0309 15:23:31.654684 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rvtqn" Mar 09 15:23:31 crc kubenswrapper[4722]: I0309 15:23:31.732512 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rvtqn" Mar 09 15:23:33 crc kubenswrapper[4722]: I0309 15:23:33.385641 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/cinder-scheduler-0" podUID="4c227d2b-e035-426b-b1e1-5be3a4e06090" containerName="cinder-scheduler" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 15:23:33 crc kubenswrapper[4722]: I0309 15:23:33.439105 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-5689854475-89q94" Mar 09 15:23:33 crc 
kubenswrapper[4722]: I0309 15:23:33.566579 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mrtb2" Mar 09 15:23:34 crc kubenswrapper[4722]: I0309 15:23:34.216777 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-svbs7" podUID="44161991-883d-4494-80b3-b829ff355f47" containerName="registry-server" probeResult="failure" output=< Mar 09 15:23:34 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s Mar 09 15:23:34 crc kubenswrapper[4722]: > Mar 09 15:23:34 crc kubenswrapper[4722]: I0309 15:23:34.466253 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-c2mmw" Mar 09 15:23:34 crc kubenswrapper[4722]: I0309 15:23:34.561944 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-c2mmw" Mar 09 15:23:35 crc kubenswrapper[4722]: I0309 15:23:35.786172 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-v57f2" podUID="4e27c5a4-8cba-4119-8006-f9841d6121dc" containerName="registry-server" probeResult="failure" output=< Mar 09 15:23:35 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s Mar 09 15:23:35 crc kubenswrapper[4722]: > Mar 09 15:23:38 crc kubenswrapper[4722]: I0309 15:23:38.362143 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/cinder-scheduler-0" podUID="4c227d2b-e035-426b-b1e1-5be3a4e06090" containerName="cinder-scheduler" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 15:23:43 crc kubenswrapper[4722]: I0309 15:23:43.336861 4722 patch_prober.go:28] interesting pod/logging-loki-compactor-0 container/loki-compactor namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.58:3101/ready\": context deadline exceeded" start-of-body= Mar 09 15:23:43 crc kubenswrapper[4722]: I0309 15:23:43.337350 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-compactor-0" podUID="75ed49e3-dc17-45c0-96ec-1db69670395b" containerName="loki-compactor" probeResult="failure" output="Get \"https://10.217.0.58:3101/ready\": context deadline exceeded" Mar 09 15:23:43 crc kubenswrapper[4722]: I0309 15:23:43.477894 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/cinder-scheduler-0" podUID="4c227d2b-e035-426b-b1e1-5be3a4e06090" containerName="cinder-scheduler" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 15:23:43 crc kubenswrapper[4722]: I0309 15:23:43.760448 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-operator-controller-init-5b979cff56-vwbnz" podUID="bdac45ca-36d4-41c5-b5e5-332d70558171" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.104:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:23:44 crc kubenswrapper[4722]: I0309 15:23:44.328316 4722 patch_prober.go:28] interesting pod/router-default-5444994796-dp8wn container/router namespace/openshift-ingress: Readiness probe status=failure output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:23:44 crc kubenswrapper[4722]: I0309 15:23:44.328375 4722 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-ingress/router-default-5444994796-dp8wn" podUID="317444ee-0620-47d2-869e-77578a367a87" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:23:44 crc kubenswrapper[4722]: I0309 15:23:44.328372 4722 patch_prober.go:28] interesting pod/router-default-5444994796-dp8wn container/router namespace/openshift-ingress: Liveness probe status=failure output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:23:44 crc kubenswrapper[4722]: I0309 15:23:44.328462 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-ingress/router-default-5444994796-dp8wn" podUID="317444ee-0620-47d2-869e-77578a367a87" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:23:44 crc kubenswrapper[4722]: I0309 15:23:44.685369 4722 patch_prober.go:28] interesting pod/package-server-manager-789f6589d5-mrtb2 container/package-server-manager namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"http://10.217.0.23:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:23:44 crc kubenswrapper[4722]: I0309 15:23:44.685387 4722 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-lnkwt container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.19:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:23:44 crc kubenswrapper[4722]: I0309 15:23:44.685436 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mrtb2" podUID="228d39d8-b0bc-4491-be90-e473c090f412" containerName="package-server-manager" probeResult="failure" output="Get \"http://10.217.0.23:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:23:44 crc kubenswrapper[4722]: I0309 15:23:44.685450 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lnkwt" podUID="f8055a95-6b09-4e32-88b8-82ad36ca5029" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.19:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 09 15:23:44 crc kubenswrapper[4722]: I0309 15:23:44.685508 4722 patch_prober.go:28] interesting pod/package-server-manager-789f6589d5-mrtb2 container/package-server-manager namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"http://10.217.0.23:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 09 15:23:44 crc kubenswrapper[4722]: I0309 15:23:44.685530 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-mrtb2" podUID="228d39d8-b0bc-4491-be90-e473c090f412" containerName="package-server-manager" probeResult="failure" output="Get \"http://10.217.0.23:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 09 15:23:44 crc kubenswrapper[4722]: I0309 15:23:44.925662 
4722 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 2.130859744s: [/var/lib/containers/storage/overlay/9d010f4f414981b87bfba875d29132870cfddeae11f8ad5f8a56c5b79ad43c11/diff /var/log/pods/cert-manager_cert-manager-858654f9db-rrczb_344178ce-f6d3-47f4-ab3c-69c394e2f677/cert-manager-controller/0.log]; will not log again for this container unless duration exceeds 2s Mar 09 15:23:44 crc kubenswrapper[4722]: I0309 15:23:44.926159 4722 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 2.06755727s: [/var/lib/containers/storage/overlay/2a77e1e86f508f661411248b30fc1298c49fec929b50d253984f0296a9c3333d/diff /var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-kxd7l_5548bcb9-3490-4e2b-982f-adc9ff86db62/cert-manager-cainjector/0.log]; will not log again for this container unless duration exceeds 2s Mar 09 15:23:44 crc kubenswrapper[4722]: I0309 15:23:44.929164 4722 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 2.070581843s: [/var/lib/containers/storage/overlay/4a0e1011b86657b5415560ce6a42be1fda5d1a3605e35c4c029ee0959e47aa02/diff /var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-wb6rj_ac6d6e52-6a89-4f96-8894-8ed2c71cdcbc/cert-manager-webhook/0.log]; will not log again for this container unless duration exceeds 2s Mar 09 15:23:44 crc kubenswrapper[4722]: I0309 15:23:44.934787 4722 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 2.691250072s: [/var/lib/containers/storage/overlay/7a9388b90a0bad64b9d12faa50ca0366aa31a123c4c8cbc7aa51f481007440a0/diff /var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager-recovery-controller/0.log]; will not log again for this container unless duration exceeds 2s Mar 09 15:23:44 crc kubenswrapper[4722]: I0309 15:23:44.934920 4722 fsHandler.go:133] fs: disk usage and inodes count on following dirs took 1.970156202s: [/var/lib/containers/storage/overlay/0fbacbdff75476401408e31e3dd3c3986e75739ef64f8fd07c3d839a8c55acba/diff /var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-crc_3dcd261975c3d6b9a6ad6367fd4facd3/kube-scheduler-recovery-controller/0.log]; will not log again for this container unless duration exceeds 2s Mar 09 15:23:46 crc kubenswrapper[4722]: I0309 15:23:46.261550 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-v57f2" podUID="4e27c5a4-8cba-4119-8006-f9841d6121dc" containerName="registry-server" probeResult="failure" output=< Mar 09 15:23:46 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s Mar 09 15:23:46 crc kubenswrapper[4722]: > Mar 09 15:23:46 crc kubenswrapper[4722]: I0309 15:23:46.263188 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-svbs7" podUID="44161991-883d-4494-80b3-b829ff355f47" containerName="registry-server" probeResult="failure" output=< Mar 09 15:23:46 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s Mar 09 15:23:46 crc kubenswrapper[4722]: > Mar 09 15:23:47 crc kubenswrapper[4722]: I0309 15:23:47.687520 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 15:23:47 crc kubenswrapper[4722]: I0309 15:23:47.695162 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e4a22f8c-ed38-47cf-8238-baf804f573a1" containerName="ceilometer-notification-agent" 
containerID="cri-o://85414c6aa3d412ee02c488bb1d4b37cbe6610e47d25cb2cf02f72ee5191cff5d" gracePeriod=30 Mar 09 15:23:47 crc kubenswrapper[4722]: I0309 15:23:47.695259 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e4a22f8c-ed38-47cf-8238-baf804f573a1" containerName="ceilometer-central-agent" containerID="cri-o://4b9a57a7830d894ace19d3f55c8a7106923a4ca6e2dd258430d194d1c60e4052" gracePeriod=30 Mar 09 15:23:47 crc kubenswrapper[4722]: I0309 15:23:47.695269 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e4a22f8c-ed38-47cf-8238-baf804f573a1" containerName="sg-core" containerID="cri-o://6287639f5110871948ab340f5fc565b05341b6ee17f9cfceba7413ea012ddad7" gracePeriod=30 Mar 09 15:23:47 crc kubenswrapper[4722]: I0309 15:23:47.695233 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e4a22f8c-ed38-47cf-8238-baf804f573a1" containerName="proxy-httpd" containerID="cri-o://2227fa96630e194da4addb6cfc3a43b10b936a7d8ca77af3b21c6ddd6d69615d" gracePeriod=30 Mar 09 15:23:48 crc kubenswrapper[4722]: I0309 15:23:48.385716 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/cinder-scheduler-0" podUID="4c227d2b-e035-426b-b1e1-5be3a4e06090" containerName="cinder-scheduler" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 15:23:48 crc kubenswrapper[4722]: I0309 15:23:48.575454 4722 generic.go:334] "Generic (PLEG): container finished" podID="e4a22f8c-ed38-47cf-8238-baf804f573a1" containerID="6287639f5110871948ab340f5fc565b05341b6ee17f9cfceba7413ea012ddad7" exitCode=2 Mar 09 15:23:48 crc kubenswrapper[4722]: I0309 15:23:48.575501 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4a22f8c-ed38-47cf-8238-baf804f573a1","Type":"ContainerDied","Data":"6287639f5110871948ab340f5fc565b05341b6ee17f9cfceba7413ea012ddad7"} Mar 09 15:23:50 crc kubenswrapper[4722]: I0309 15:23:50.601040 4722 generic.go:334] "Generic (PLEG): container finished" podID="e4a22f8c-ed38-47cf-8238-baf804f573a1" containerID="4b9a57a7830d894ace19d3f55c8a7106923a4ca6e2dd258430d194d1c60e4052" exitCode=0 Mar 09 15:23:50 crc kubenswrapper[4722]: I0309 15:23:50.601699 4722 generic.go:334] "Generic (PLEG): container finished" podID="e4a22f8c-ed38-47cf-8238-baf804f573a1" containerID="2227fa96630e194da4addb6cfc3a43b10b936a7d8ca77af3b21c6ddd6d69615d" exitCode=0 Mar 09 15:23:50 crc kubenswrapper[4722]: I0309 15:23:50.601103 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4a22f8c-ed38-47cf-8238-baf804f573a1","Type":"ContainerDied","Data":"4b9a57a7830d894ace19d3f55c8a7106923a4ca6e2dd258430d194d1c60e4052"} Mar 09 15:23:50 crc kubenswrapper[4722]: I0309 15:23:50.601749 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4a22f8c-ed38-47cf-8238-baf804f573a1","Type":"ContainerDied","Data":"2227fa96630e194da4addb6cfc3a43b10b936a7d8ca77af3b21c6ddd6d69615d"} Mar 09 15:23:50 crc kubenswrapper[4722]: I0309 15:23:50.635238 4722 scope.go:117] "RemoveContainer" containerID="cd582a254af464cea392ee9b3f2a8bbe909916d04f0053279129bca6b32f8102" Mar 09 15:23:51 crc kubenswrapper[4722]: I0309 15:23:51.527597 4722 patch_prober.go:28] interesting pod/machine-config-daemon-hjrrb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 15:23:51 crc kubenswrapper[4722]: I0309 15:23:51.528162 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 15:23:51 crc kubenswrapper[4722]: I0309 15:23:51.763488 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 09 15:23:51 crc kubenswrapper[4722]: I0309 15:23:51.872354 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 09 15:23:52 crc kubenswrapper[4722]: I0309 15:23:52.147195 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="e4a22f8c-ed38-47cf-8238-baf804f573a1" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.1.21:3000/\": dial tcp 10.217.1.21:3000: connect: connection refused" Mar 09 15:23:52 crc kubenswrapper[4722]: I0309 15:23:52.448006 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 09 15:23:52 crc kubenswrapper[4722]: I0309 15:23:52.597684 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="a79aaedb-92ba-42fc-8cc7-9ecb007d2ac7" containerName="galera" probeResult="failure" output=< Mar 09 15:23:52 crc kubenswrapper[4722]: wsrep_local_state_comment (Joined) differs from Synced Mar 09 15:23:52 crc kubenswrapper[4722]: > Mar 09 15:23:52 crc kubenswrapper[4722]: I0309 15:23:52.598104 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="4159e308-3ccf-45d9-a97b-8133542007a8" containerName="galera" probeResult="failure" output=< Mar 09 15:23:52 crc kubenswrapper[4722]: wsrep_local_state_comment (Joined) differs from Synced Mar 09 15:23:52 crc kubenswrapper[4722]: > Mar 09 15:23:52 crc kubenswrapper[4722]: I0309 15:23:52.652498 4722 generic.go:334] "Generic (PLEG): container finished" podID="e4a22f8c-ed38-47cf-8238-baf804f573a1" containerID="85414c6aa3d412ee02c488bb1d4b37cbe6610e47d25cb2cf02f72ee5191cff5d" exitCode=0 Mar 09 15:23:52 crc kubenswrapper[4722]: I0309 15:23:52.652729 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4a22f8c-ed38-47cf-8238-baf804f573a1","Type":"ContainerDied","Data":"85414c6aa3d412ee02c488bb1d4b37cbe6610e47d25cb2cf02f72ee5191cff5d"} Mar 09 15:23:53 crc kubenswrapper[4722]: I0309 15:23:53.369667 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/cinder-scheduler-0" podUID="4c227d2b-e035-426b-b1e1-5be3a4e06090" containerName="cinder-scheduler" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 15:23:53 crc kubenswrapper[4722]: I0309 15:23:53.471596 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 15:23:53 crc kubenswrapper[4722]: I0309 15:23:53.670866 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e4a22f8c-ed38-47cf-8238-baf804f573a1-sg-core-conf-yaml\") pod \"e4a22f8c-ed38-47cf-8238-baf804f573a1\" (UID: \"e4a22f8c-ed38-47cf-8238-baf804f573a1\") " Mar 09 15:23:53 crc kubenswrapper[4722]: I0309 15:23:53.670986 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4a22f8c-ed38-47cf-8238-baf804f573a1-log-httpd\") pod \"e4a22f8c-ed38-47cf-8238-baf804f573a1\" (UID: \"e4a22f8c-ed38-47cf-8238-baf804f573a1\") " Mar 09 15:23:53 crc kubenswrapper[4722]: I0309 15:23:53.671097 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4a22f8c-ed38-47cf-8238-baf804f573a1-ceilometer-tls-certs\") pod \"e4a22f8c-ed38-47cf-8238-baf804f573a1\" (UID: \"e4a22f8c-ed38-47cf-8238-baf804f573a1\") " Mar 09 15:23:53 crc kubenswrapper[4722]: I0309 15:23:53.671228 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4a22f8c-ed38-47cf-8238-baf804f573a1-run-httpd\") pod \"e4a22f8c-ed38-47cf-8238-baf804f573a1\" (UID: \"e4a22f8c-ed38-47cf-8238-baf804f573a1\") " Mar 09 15:23:53 crc kubenswrapper[4722]: I0309 15:23:53.671252 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4a22f8c-ed38-47cf-8238-baf804f573a1-combined-ca-bundle\") pod \"e4a22f8c-ed38-47cf-8238-baf804f573a1\" (UID: \"e4a22f8c-ed38-47cf-8238-baf804f573a1\") " Mar 09 15:23:53 crc kubenswrapper[4722]: I0309 15:23:53.671286 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4a22f8c-ed38-47cf-8238-baf804f573a1-scripts\") pod \"e4a22f8c-ed38-47cf-8238-baf804f573a1\" (UID: \"e4a22f8c-ed38-47cf-8238-baf804f573a1\") " Mar 09 15:23:53 crc kubenswrapper[4722]: I0309 15:23:53.671374 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ff2j9\" (UniqueName: \"kubernetes.io/projected/e4a22f8c-ed38-47cf-8238-baf804f573a1-kube-api-access-ff2j9\") pod \"e4a22f8c-ed38-47cf-8238-baf804f573a1\" (UID: \"e4a22f8c-ed38-47cf-8238-baf804f573a1\") " Mar 09 15:23:53 crc kubenswrapper[4722]: I0309 15:23:53.671443 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4a22f8c-ed38-47cf-8238-baf804f573a1-config-data\") pod \"e4a22f8c-ed38-47cf-8238-baf804f573a1\" (UID: \"e4a22f8c-ed38-47cf-8238-baf804f573a1\") " Mar 09 15:23:53 crc kubenswrapper[4722]: I0309 15:23:53.683141 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4a22f8c-ed38-47cf-8238-baf804f573a1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e4a22f8c-ed38-47cf-8238-baf804f573a1" (UID: "e4a22f8c-ed38-47cf-8238-baf804f573a1"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 15:23:53 crc kubenswrapper[4722]: I0309 15:23:53.684281 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4a22f8c-ed38-47cf-8238-baf804f573a1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e4a22f8c-ed38-47cf-8238-baf804f573a1" (UID: "e4a22f8c-ed38-47cf-8238-baf804f573a1"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 15:23:53 crc kubenswrapper[4722]: I0309 15:23:53.685021 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e4a22f8c-ed38-47cf-8238-baf804f573a1","Type":"ContainerDied","Data":"63ae7fe34969d0cd121ab26d69678707d71b9e4e87f10d6776f6cdcad4b23266"} Mar 09 15:23:53 crc kubenswrapper[4722]: I0309 15:23:53.685072 4722 scope.go:117] "RemoveContainer" containerID="4b9a57a7830d894ace19d3f55c8a7106923a4ca6e2dd258430d194d1c60e4052" Mar 09 15:23:53 crc kubenswrapper[4722]: I0309 15:23:53.685234 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 15:23:53 crc kubenswrapper[4722]: I0309 15:23:53.708956 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4a22f8c-ed38-47cf-8238-baf804f573a1-kube-api-access-ff2j9" (OuterVolumeSpecName: "kube-api-access-ff2j9") pod "e4a22f8c-ed38-47cf-8238-baf804f573a1" (UID: "e4a22f8c-ed38-47cf-8238-baf804f573a1"). InnerVolumeSpecName "kube-api-access-ff2j9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 15:23:53 crc kubenswrapper[4722]: I0309 15:23:53.718027 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4a22f8c-ed38-47cf-8238-baf804f573a1-scripts" (OuterVolumeSpecName: "scripts") pod "e4a22f8c-ed38-47cf-8238-baf804f573a1" (UID: "e4a22f8c-ed38-47cf-8238-baf804f573a1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 15:23:53 crc kubenswrapper[4722]: I0309 15:23:53.743611 4722 scope.go:117] "RemoveContainer" containerID="2227fa96630e194da4addb6cfc3a43b10b936a7d8ca77af3b21c6ddd6d69615d" Mar 09 15:23:53 crc kubenswrapper[4722]: I0309 15:23:53.757125 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4a22f8c-ed38-47cf-8238-baf804f573a1-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e4a22f8c-ed38-47cf-8238-baf804f573a1" (UID: "e4a22f8c-ed38-47cf-8238-baf804f573a1"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 15:23:53 crc kubenswrapper[4722]: I0309 15:23:53.780837 4722 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4a22f8c-ed38-47cf-8238-baf804f573a1-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 15:23:53 crc kubenswrapper[4722]: I0309 15:23:53.780874 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e4a22f8c-ed38-47cf-8238-baf804f573a1-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 15:23:53 crc kubenswrapper[4722]: I0309 15:23:53.780885 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ff2j9\" (UniqueName: \"kubernetes.io/projected/e4a22f8c-ed38-47cf-8238-baf804f573a1-kube-api-access-ff2j9\") on node \"crc\" DevicePath \"\"" Mar 09 15:23:53 crc kubenswrapper[4722]: I0309 15:23:53.780897 4722 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e4a22f8c-ed38-47cf-8238-baf804f573a1-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 09 15:23:53 crc kubenswrapper[4722]: I0309 15:23:53.780906 4722 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4a22f8c-ed38-47cf-8238-baf804f573a1-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 15:23:54 crc kubenswrapper[4722]: I0309 15:23:54.212358 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-svbs7" podUID="44161991-883d-4494-80b3-b829ff355f47" containerName="registry-server" probeResult="failure" output=< Mar 09 15:23:54 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s Mar 09 15:23:54 crc kubenswrapper[4722]: > Mar 09 15:23:54 crc kubenswrapper[4722]: I0309 15:23:54.500290 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4a22f8c-ed38-47cf-8238-baf804f573a1-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "e4a22f8c-ed38-47cf-8238-baf804f573a1" (UID: "e4a22f8c-ed38-47cf-8238-baf804f573a1"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 15:23:54 crc kubenswrapper[4722]: I0309 15:23:54.553803 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4a22f8c-ed38-47cf-8238-baf804f573a1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e4a22f8c-ed38-47cf-8238-baf804f573a1" (UID: "e4a22f8c-ed38-47cf-8238-baf804f573a1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 15:23:54 crc kubenswrapper[4722]: I0309 15:23:54.607000 4722 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4a22f8c-ed38-47cf-8238-baf804f573a1-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 15:23:54 crc kubenswrapper[4722]: I0309 15:23:54.607027 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4a22f8c-ed38-47cf-8238-baf804f573a1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 15:23:54 crc kubenswrapper[4722]: I0309 15:23:54.707714 4722 scope.go:117] "RemoveContainer" containerID="6287639f5110871948ab340f5fc565b05341b6ee17f9cfceba7413ea012ddad7" Mar 09 15:23:54 crc kubenswrapper[4722]: I0309 15:23:54.733325 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4a22f8c-ed38-47cf-8238-baf804f573a1-config-data" (OuterVolumeSpecName: "config-data") pod "e4a22f8c-ed38-47cf-8238-baf804f573a1" (UID: "e4a22f8c-ed38-47cf-8238-baf804f573a1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 15:23:54 crc kubenswrapper[4722]: I0309 15:23:54.735580 4722 scope.go:117] "RemoveContainer" containerID="85414c6aa3d412ee02c488bb1d4b37cbe6610e47d25cb2cf02f72ee5191cff5d" Mar 09 15:23:54 crc kubenswrapper[4722]: I0309 15:23:54.810280 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4a22f8c-ed38-47cf-8238-baf804f573a1-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 15:23:54 crc kubenswrapper[4722]: I0309 15:23:54.925739 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 15:23:54 crc kubenswrapper[4722]: I0309 15:23:54.947451 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 09 15:23:55 crc kubenswrapper[4722]: I0309 15:23:55.004167 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 09 15:23:55 crc kubenswrapper[4722]: E0309 15:23:55.007023 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4a22f8c-ed38-47cf-8238-baf804f573a1" containerName="ceilometer-central-agent" Mar 09 15:23:55 crc kubenswrapper[4722]: I0309 15:23:55.007046 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4a22f8c-ed38-47cf-8238-baf804f573a1" containerName="ceilometer-central-agent" Mar 09 15:23:55 crc kubenswrapper[4722]: E0309 15:23:55.007061 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4a22f8c-ed38-47cf-8238-baf804f573a1" containerName="ceilometer-notification-agent" Mar 09 15:23:55 crc kubenswrapper[4722]: I0309 15:23:55.007070 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4a22f8c-ed38-47cf-8238-baf804f573a1" containerName="ceilometer-notification-agent" Mar 09 15:23:55 crc kubenswrapper[4722]: E0309 15:23:55.007085 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4a22f8c-ed38-47cf-8238-baf804f573a1" containerName="ceilometer-central-agent" Mar 09 15:23:55 crc kubenswrapper[4722]: I0309 15:23:55.007092 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4a22f8c-ed38-47cf-8238-baf804f573a1" containerName="ceilometer-central-agent" Mar 09 15:23:55 crc kubenswrapper[4722]: E0309 15:23:55.007115 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4a22f8c-ed38-47cf-8238-baf804f573a1" containerName="sg-core" Mar 09 
15:23:55 crc kubenswrapper[4722]: I0309 15:23:55.007123 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4a22f8c-ed38-47cf-8238-baf804f573a1" containerName="sg-core" Mar 09 15:23:55 crc kubenswrapper[4722]: E0309 15:23:55.007145 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="259f49a9-0c80-4b3c-96a7-24d3e0139a26" containerName="registry-server" Mar 09 15:23:55 crc kubenswrapper[4722]: I0309 15:23:55.007150 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="259f49a9-0c80-4b3c-96a7-24d3e0139a26" containerName="registry-server" Mar 09 15:23:55 crc kubenswrapper[4722]: E0309 15:23:55.007166 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4a22f8c-ed38-47cf-8238-baf804f573a1" containerName="proxy-httpd" Mar 09 15:23:55 crc kubenswrapper[4722]: I0309 15:23:55.007171 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4a22f8c-ed38-47cf-8238-baf804f573a1" containerName="proxy-httpd" Mar 09 15:23:55 crc kubenswrapper[4722]: E0309 15:23:55.007190 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="259f49a9-0c80-4b3c-96a7-24d3e0139a26" containerName="extract-utilities" Mar 09 15:23:55 crc kubenswrapper[4722]: I0309 15:23:55.007196 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="259f49a9-0c80-4b3c-96a7-24d3e0139a26" containerName="extract-utilities" Mar 09 15:23:55 crc kubenswrapper[4722]: E0309 15:23:55.007227 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="259f49a9-0c80-4b3c-96a7-24d3e0139a26" containerName="extract-content" Mar 09 15:23:55 crc kubenswrapper[4722]: I0309 15:23:55.007232 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="259f49a9-0c80-4b3c-96a7-24d3e0139a26" containerName="extract-content" Mar 09 15:23:55 crc kubenswrapper[4722]: I0309 15:23:55.007508 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4a22f8c-ed38-47cf-8238-baf804f573a1" containerName="ceilometer-central-agent" Mar 09 15:23:55 crc kubenswrapper[4722]: I0309 15:23:55.007533 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4a22f8c-ed38-47cf-8238-baf804f573a1" containerName="ceilometer-notification-agent" Mar 09 15:23:55 crc kubenswrapper[4722]: I0309 15:23:55.007544 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4a22f8c-ed38-47cf-8238-baf804f573a1" containerName="proxy-httpd" Mar 09 15:23:55 crc kubenswrapper[4722]: I0309 15:23:55.007559 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4a22f8c-ed38-47cf-8238-baf804f573a1" containerName="sg-core" Mar 09 15:23:55 crc kubenswrapper[4722]: I0309 15:23:55.007571 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="259f49a9-0c80-4b3c-96a7-24d3e0139a26" containerName="registry-server" Mar 09 15:23:55 crc kubenswrapper[4722]: I0309 15:23:55.008046 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4a22f8c-ed38-47cf-8238-baf804f573a1" containerName="ceilometer-central-agent" Mar 09 15:23:55 crc kubenswrapper[4722]: I0309 15:23:55.009768 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 15:23:55 crc kubenswrapper[4722]: I0309 15:23:55.017862 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 09 15:23:55 crc kubenswrapper[4722]: I0309 15:23:55.017874 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 09 15:23:55 crc kubenswrapper[4722]: I0309 15:23:55.017867 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 09 15:23:55 crc kubenswrapper[4722]: I0309 15:23:55.018265 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0d962040-8d2b-4827-8a2f-1c5771970a8d-log-httpd\") pod \"ceilometer-0\" (UID: \"0d962040-8d2b-4827-8a2f-1c5771970a8d\") " pod="openstack/ceilometer-0" Mar 09 15:23:55 crc kubenswrapper[4722]: I0309 15:23:55.018557 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0d962040-8d2b-4827-8a2f-1c5771970a8d-run-httpd\") pod \"ceilometer-0\" (UID: \"0d962040-8d2b-4827-8a2f-1c5771970a8d\") " pod="openstack/ceilometer-0" Mar 09 15:23:55 crc kubenswrapper[4722]: I0309 15:23:55.018830 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d962040-8d2b-4827-8a2f-1c5771970a8d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0d962040-8d2b-4827-8a2f-1c5771970a8d\") " pod="openstack/ceilometer-0" Mar 09 15:23:55 crc kubenswrapper[4722]: I0309 15:23:55.019009 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0d962040-8d2b-4827-8a2f-1c5771970a8d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0d962040-8d2b-4827-8a2f-1c5771970a8d\") " pod="openstack/ceilometer-0" Mar 09 15:23:55 crc kubenswrapper[4722]: I0309 15:23:55.019067 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d962040-8d2b-4827-8a2f-1c5771970a8d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0d962040-8d2b-4827-8a2f-1c5771970a8d\") " pod="openstack/ceilometer-0" Mar 09 15:23:55 crc kubenswrapper[4722]: I0309 15:23:55.019111 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d962040-8d2b-4827-8a2f-1c5771970a8d-scripts\") pod \"ceilometer-0\" (UID: \"0d962040-8d2b-4827-8a2f-1c5771970a8d\") " pod="openstack/ceilometer-0" Mar 09 15:23:55 crc kubenswrapper[4722]: I0309 15:23:55.019225 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vb2kk\" (UniqueName: \"kubernetes.io/projected/0d962040-8d2b-4827-8a2f-1c5771970a8d-kube-api-access-vb2kk\") pod \"ceilometer-0\" (UID: \"0d962040-8d2b-4827-8a2f-1c5771970a8d\") " pod="openstack/ceilometer-0" Mar 09 15:23:55 crc kubenswrapper[4722]: I0309 15:23:55.019337 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d962040-8d2b-4827-8a2f-1c5771970a8d-config-data\") pod \"ceilometer-0\" (UID: \"0d962040-8d2b-4827-8a2f-1c5771970a8d\") " pod="openstack/ceilometer-0" Mar 
09 15:23:55 crc kubenswrapper[4722]: I0309 15:23:55.040193 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 15:23:55 crc kubenswrapper[4722]: I0309 15:23:55.124503 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d962040-8d2b-4827-8a2f-1c5771970a8d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0d962040-8d2b-4827-8a2f-1c5771970a8d\") " pod="openstack/ceilometer-0" Mar 09 15:23:55 crc kubenswrapper[4722]: I0309 15:23:55.124556 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d962040-8d2b-4827-8a2f-1c5771970a8d-scripts\") pod \"ceilometer-0\" (UID: \"0d962040-8d2b-4827-8a2f-1c5771970a8d\") " pod="openstack/ceilometer-0" Mar 09 15:23:55 crc kubenswrapper[4722]: I0309 15:23:55.124608 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vb2kk\" (UniqueName: \"kubernetes.io/projected/0d962040-8d2b-4827-8a2f-1c5771970a8d-kube-api-access-vb2kk\") pod \"ceilometer-0\" (UID: \"0d962040-8d2b-4827-8a2f-1c5771970a8d\") " pod="openstack/ceilometer-0" Mar 09 15:23:55 crc kubenswrapper[4722]: I0309 15:23:55.124649 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d962040-8d2b-4827-8a2f-1c5771970a8d-config-data\") pod \"ceilometer-0\" (UID: \"0d962040-8d2b-4827-8a2f-1c5771970a8d\") " pod="openstack/ceilometer-0" Mar 09 15:23:55 crc kubenswrapper[4722]: I0309 15:23:55.124688 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0d962040-8d2b-4827-8a2f-1c5771970a8d-log-httpd\") pod \"ceilometer-0\" (UID: \"0d962040-8d2b-4827-8a2f-1c5771970a8d\") " pod="openstack/ceilometer-0" Mar 09 15:23:55 crc kubenswrapper[4722]: I0309 15:23:55.124785 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0d962040-8d2b-4827-8a2f-1c5771970a8d-run-httpd\") pod \"ceilometer-0\" (UID: \"0d962040-8d2b-4827-8a2f-1c5771970a8d\") " pod="openstack/ceilometer-0" Mar 09 15:23:55 crc kubenswrapper[4722]: I0309 15:23:55.124872 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d962040-8d2b-4827-8a2f-1c5771970a8d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0d962040-8d2b-4827-8a2f-1c5771970a8d\") " pod="openstack/ceilometer-0" Mar 09 15:23:55 crc kubenswrapper[4722]: I0309 15:23:55.124908 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0d962040-8d2b-4827-8a2f-1c5771970a8d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0d962040-8d2b-4827-8a2f-1c5771970a8d\") " pod="openstack/ceilometer-0" Mar 09 15:23:55 crc kubenswrapper[4722]: I0309 15:23:55.126558 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0d962040-8d2b-4827-8a2f-1c5771970a8d-run-httpd\") pod \"ceilometer-0\" (UID: \"0d962040-8d2b-4827-8a2f-1c5771970a8d\") " pod="openstack/ceilometer-0" Mar 09 15:23:55 crc kubenswrapper[4722]: I0309 15:23:55.130493 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/0d962040-8d2b-4827-8a2f-1c5771970a8d-log-httpd\") pod \"ceilometer-0\" (UID: \"0d962040-8d2b-4827-8a2f-1c5771970a8d\") " pod="openstack/ceilometer-0" Mar 09 15:23:55 crc kubenswrapper[4722]: I0309 15:23:55.143864 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0d962040-8d2b-4827-8a2f-1c5771970a8d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0d962040-8d2b-4827-8a2f-1c5771970a8d\") " pod="openstack/ceilometer-0" Mar 09 15:23:55 crc kubenswrapper[4722]: I0309 15:23:55.145091 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d962040-8d2b-4827-8a2f-1c5771970a8d-scripts\") pod \"ceilometer-0\" (UID: \"0d962040-8d2b-4827-8a2f-1c5771970a8d\") " pod="openstack/ceilometer-0" Mar 09 15:23:55 crc kubenswrapper[4722]: I0309 15:23:55.145100 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d962040-8d2b-4827-8a2f-1c5771970a8d-config-data\") pod \"ceilometer-0\" (UID: \"0d962040-8d2b-4827-8a2f-1c5771970a8d\") " pod="openstack/ceilometer-0" Mar 09 15:23:55 crc kubenswrapper[4722]: I0309 15:23:55.146721 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d962040-8d2b-4827-8a2f-1c5771970a8d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0d962040-8d2b-4827-8a2f-1c5771970a8d\") " pod="openstack/ceilometer-0" Mar 09 15:23:55 crc kubenswrapper[4722]: I0309 15:23:55.151996 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d962040-8d2b-4827-8a2f-1c5771970a8d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0d962040-8d2b-4827-8a2f-1c5771970a8d\") " pod="openstack/ceilometer-0" Mar 09 15:23:55 crc kubenswrapper[4722]: I0309 15:23:55.152347 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vb2kk\" (UniqueName: \"kubernetes.io/projected/0d962040-8d2b-4827-8a2f-1c5771970a8d-kube-api-access-vb2kk\") pod \"ceilometer-0\" (UID: \"0d962040-8d2b-4827-8a2f-1c5771970a8d\") " pod="openstack/ceilometer-0" Mar 09 15:23:55 crc kubenswrapper[4722]: I0309 15:23:55.336318 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 15:23:55 crc kubenswrapper[4722]: I0309 15:23:55.633587 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-v57f2" podUID="4e27c5a4-8cba-4119-8006-f9841d6121dc" containerName="registry-server" probeResult="failure" output=< Mar 09 15:23:55 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s Mar 09 15:23:55 crc kubenswrapper[4722]: > Mar 09 15:23:55 crc kubenswrapper[4722]: I0309 15:23:55.960161 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 15:23:56 crc kubenswrapper[4722]: I0309 15:23:56.161754 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4a22f8c-ed38-47cf-8238-baf804f573a1" path="/var/lib/kubelet/pods/e4a22f8c-ed38-47cf-8238-baf804f573a1/volumes" Mar 09 15:23:56 crc kubenswrapper[4722]: I0309 15:23:56.231148 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-bgdq7/must-gather-sd2h4"] Mar 09 15:23:56 crc kubenswrapper[4722]: I0309 15:23:56.233563 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bgdq7/must-gather-sd2h4" Mar 09 15:23:56 crc kubenswrapper[4722]: I0309 15:23:56.236361 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-bgdq7"/"default-dockercfg-cwxpx" Mar 09 15:23:56 crc kubenswrapper[4722]: I0309 15:23:56.238318 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-bgdq7"/"openshift-service-ca.crt" Mar 09 15:23:56 crc kubenswrapper[4722]: I0309 15:23:56.238535 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-bgdq7"/"kube-root-ca.crt" Mar 09 15:23:56 crc kubenswrapper[4722]: I0309 15:23:56.249707 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-bgdq7/must-gather-sd2h4"] Mar 09 15:23:56 crc kubenswrapper[4722]: I0309 15:23:56.301228 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzdw9\" (UniqueName: \"kubernetes.io/projected/13a7cb7f-e80c-4ff8-9a4e-1c1048399cd5-kube-api-access-hzdw9\") pod \"must-gather-sd2h4\" (UID: \"13a7cb7f-e80c-4ff8-9a4e-1c1048399cd5\") " pod="openshift-must-gather-bgdq7/must-gather-sd2h4" Mar 09 15:23:56 crc kubenswrapper[4722]: I0309 15:23:56.301355 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/13a7cb7f-e80c-4ff8-9a4e-1c1048399cd5-must-gather-output\") pod \"must-gather-sd2h4\" (UID: \"13a7cb7f-e80c-4ff8-9a4e-1c1048399cd5\") " pod="openshift-must-gather-bgdq7/must-gather-sd2h4" Mar 09 15:23:56 crc kubenswrapper[4722]: I0309 15:23:56.403724 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzdw9\" (UniqueName: \"kubernetes.io/projected/13a7cb7f-e80c-4ff8-9a4e-1c1048399cd5-kube-api-access-hzdw9\") pod \"must-gather-sd2h4\" (UID: \"13a7cb7f-e80c-4ff8-9a4e-1c1048399cd5\") " pod="openshift-must-gather-bgdq7/must-gather-sd2h4" Mar 09 15:23:56 crc kubenswrapper[4722]: I0309 15:23:56.404135 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/13a7cb7f-e80c-4ff8-9a4e-1c1048399cd5-must-gather-output\") pod \"must-gather-sd2h4\" (UID: \"13a7cb7f-e80c-4ff8-9a4e-1c1048399cd5\") " pod="openshift-must-gather-bgdq7/must-gather-sd2h4" Mar 09 15:23:56 crc kubenswrapper[4722]: I0309 15:23:56.405760 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/13a7cb7f-e80c-4ff8-9a4e-1c1048399cd5-must-gather-output\") pod \"must-gather-sd2h4\" (UID: \"13a7cb7f-e80c-4ff8-9a4e-1c1048399cd5\") " pod="openshift-must-gather-bgdq7/must-gather-sd2h4" Mar 09 15:23:56 crc kubenswrapper[4722]: I0309 15:23:56.425078 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzdw9\" (UniqueName: \"kubernetes.io/projected/13a7cb7f-e80c-4ff8-9a4e-1c1048399cd5-kube-api-access-hzdw9\") pod \"must-gather-sd2h4\" (UID: \"13a7cb7f-e80c-4ff8-9a4e-1c1048399cd5\") " pod="openshift-must-gather-bgdq7/must-gather-sd2h4" Mar 09 15:23:56 crc kubenswrapper[4722]: I0309 15:23:56.555607 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bgdq7/must-gather-sd2h4" Mar 09 15:23:56 crc kubenswrapper[4722]: I0309 15:23:56.766451 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0d962040-8d2b-4827-8a2f-1c5771970a8d","Type":"ContainerStarted","Data":"965ea04502da9365aa29eedded9b80e15224a8c81534bba570ab5c1917f95991"} Mar 09 15:23:56 crc kubenswrapper[4722]: I0309 15:23:56.766820 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0d962040-8d2b-4827-8a2f-1c5771970a8d","Type":"ContainerStarted","Data":"f849cac505357dd19cab995379d0f5def9397acd79c5c06a296d4c27b3fa4e99"} Mar 09 15:23:57 crc kubenswrapper[4722]: I0309 15:23:57.102384 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-bgdq7/must-gather-sd2h4"] Mar 09 15:23:57 crc kubenswrapper[4722]: I0309 15:23:57.782311 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bgdq7/must-gather-sd2h4" event={"ID":"13a7cb7f-e80c-4ff8-9a4e-1c1048399cd5","Type":"ContainerStarted","Data":"bd945971cff8d5fa0a8116e1606ce9b6141a00521c22d3198930f7f02f3b50fe"} Mar 09 15:23:57 crc kubenswrapper[4722]: I0309 15:23:57.785626 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0d962040-8d2b-4827-8a2f-1c5771970a8d","Type":"ContainerStarted","Data":"cf28409412fda95e3e91755b05fe62d4a98690c171f347747053923f6d0d927c"} Mar 09 15:23:58 crc kubenswrapper[4722]: I0309 15:23:58.379559 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/cinder-scheduler-0" podUID="4c227d2b-e035-426b-b1e1-5be3a4e06090" containerName="cinder-scheduler" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 09 15:23:58 crc kubenswrapper[4722]: I0309 15:23:58.852145 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0d962040-8d2b-4827-8a2f-1c5771970a8d","Type":"ContainerStarted","Data":"41e2b3c952e9ca88c4bde1a07b5d6986a406f4d45580077ef3d5e0e8ccbaef67"} Mar 09 15:23:59 crc kubenswrapper[4722]: I0309 15:23:59.312128 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 09 15:24:00 crc kubenswrapper[4722]: I0309 15:24:00.201564 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551164-b2lll"] Mar 09 15:24:00 crc kubenswrapper[4722]: I0309 15:24:00.204027 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551164-b2lll" Mar 09 15:24:00 crc kubenswrapper[4722]: I0309 15:24:00.208068 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 15:24:00 crc kubenswrapper[4722]: I0309 15:24:00.208269 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 15:24:00 crc kubenswrapper[4722]: I0309 15:24:00.208405 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-dr2x6" Mar 09 15:24:00 crc kubenswrapper[4722]: I0309 15:24:00.234429 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551164-b2lll"] Mar 09 15:24:00 crc kubenswrapper[4722]: I0309 15:24:00.328464 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdh4n\" (UniqueName: \"kubernetes.io/projected/aae15a8c-7e4e-4db4-b209-d1243d668860-kube-api-access-cdh4n\") pod \"auto-csr-approver-29551164-b2lll\" (UID: \"aae15a8c-7e4e-4db4-b209-d1243d668860\") " pod="openshift-infra/auto-csr-approver-29551164-b2lll" Mar 09 15:24:00 crc kubenswrapper[4722]: I0309 15:24:00.431359 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdh4n\" (UniqueName: \"kubernetes.io/projected/aae15a8c-7e4e-4db4-b209-d1243d668860-kube-api-access-cdh4n\") pod \"auto-csr-approver-29551164-b2lll\" (UID: \"aae15a8c-7e4e-4db4-b209-d1243d668860\") " pod="openshift-infra/auto-csr-approver-29551164-b2lll" Mar 09 15:24:00 crc kubenswrapper[4722]: I0309 15:24:00.467093 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdh4n\" (UniqueName: \"kubernetes.io/projected/aae15a8c-7e4e-4db4-b209-d1243d668860-kube-api-access-cdh4n\") pod \"auto-csr-approver-29551164-b2lll\" (UID: \"aae15a8c-7e4e-4db4-b209-d1243d668860\") " pod="openshift-infra/auto-csr-approver-29551164-b2lll" Mar 09 15:24:00 crc kubenswrapper[4722]: I0309 15:24:00.527484 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551164-b2lll" Mar 09 15:24:00 crc kubenswrapper[4722]: I0309 15:24:00.613824 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 09 15:24:01 crc kubenswrapper[4722]: I0309 15:24:01.417218 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551164-b2lll"] Mar 09 15:24:02 crc kubenswrapper[4722]: I0309 15:24:02.009865 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0d962040-8d2b-4827-8a2f-1c5771970a8d","Type":"ContainerStarted","Data":"467c0d05f435e710d42f3c400e9dd56788c214f93663c3cf5997a70ad7dfd284"} Mar 09 15:24:02 crc kubenswrapper[4722]: I0309 15:24:02.010657 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 09 15:24:02 crc kubenswrapper[4722]: I0309 15:24:02.037487 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551164-b2lll" event={"ID":"aae15a8c-7e4e-4db4-b209-d1243d668860","Type":"ContainerStarted","Data":"fd19bad59f11412cc640fecb0e53652e203cba5ce390fd3f1c222b8b7419f22a"} Mar 09 15:24:02 crc kubenswrapper[4722]: I0309 15:24:02.074826 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.421839038 podStartE2EDuration="8.074804572s" podCreationTimestamp="2026-03-09 15:23:54 +0000 UTC" firstStartedPulling="2026-03-09 15:23:55.968017918 +0000 UTC m=+4876.523586484" lastFinishedPulling="2026-03-09 15:24:00.620983442 +0000 UTC m=+4881.176552018" observedRunningTime="2026-03-09 15:24:02.063314587 +0000 UTC m=+4882.618883163" watchObservedRunningTime="2026-03-09 15:24:02.074804572 +0000 UTC m=+4882.630373138" Mar 09 15:24:03 crc kubenswrapper[4722]: I0309 15:24:03.427836 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 09 15:24:04 crc kubenswrapper[4722]: I0309 15:24:04.235888 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-svbs7" podUID="44161991-883d-4494-80b3-b829ff355f47" containerName="registry-server" probeResult="failure" output=< Mar 09 15:24:04 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s Mar 09 15:24:04 crc kubenswrapper[4722]: > Mar 09 15:24:05 crc kubenswrapper[4722]: I0309 15:24:05.343511 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 15:24:05 crc kubenswrapper[4722]: I0309 15:24:05.344417 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0d962040-8d2b-4827-8a2f-1c5771970a8d" containerName="ceilometer-central-agent" containerID="cri-o://965ea04502da9365aa29eedded9b80e15224a8c81534bba570ab5c1917f95991" gracePeriod=30 Mar 09 15:24:05 crc kubenswrapper[4722]: I0309 15:24:05.344591 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0d962040-8d2b-4827-8a2f-1c5771970a8d" containerName="proxy-httpd" containerID="cri-o://467c0d05f435e710d42f3c400e9dd56788c214f93663c3cf5997a70ad7dfd284" gracePeriod=30 Mar 09 15:24:05 crc kubenswrapper[4722]: I0309 15:24:05.344656 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0d962040-8d2b-4827-8a2f-1c5771970a8d" containerName="sg-core" 
containerID="cri-o://41e2b3c952e9ca88c4bde1a07b5d6986a406f4d45580077ef3d5e0e8ccbaef67" gracePeriod=30 Mar 09 15:24:05 crc kubenswrapper[4722]: I0309 15:24:05.344702 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0d962040-8d2b-4827-8a2f-1c5771970a8d" containerName="ceilometer-notification-agent" containerID="cri-o://cf28409412fda95e3e91755b05fe62d4a98690c171f347747053923f6d0d927c" gracePeriod=30 Mar 09 15:24:05 crc kubenswrapper[4722]: I0309 15:24:05.674250 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-v57f2" podUID="4e27c5a4-8cba-4119-8006-f9841d6121dc" containerName="registry-server" probeResult="failure" output=< Mar 09 15:24:05 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s Mar 09 15:24:05 crc kubenswrapper[4722]: > Mar 09 15:24:06 crc kubenswrapper[4722]: I0309 15:24:06.158503 4722 generic.go:334] "Generic (PLEG): container finished" podID="0d962040-8d2b-4827-8a2f-1c5771970a8d" containerID="467c0d05f435e710d42f3c400e9dd56788c214f93663c3cf5997a70ad7dfd284" exitCode=0 Mar 09 15:24:06 crc kubenswrapper[4722]: I0309 15:24:06.158580 4722 generic.go:334] "Generic (PLEG): container finished" podID="0d962040-8d2b-4827-8a2f-1c5771970a8d" containerID="41e2b3c952e9ca88c4bde1a07b5d6986a406f4d45580077ef3d5e0e8ccbaef67" exitCode=2 Mar 09 15:24:06 crc kubenswrapper[4722]: I0309 15:24:06.158597 4722 generic.go:334] "Generic (PLEG): container finished" podID="0d962040-8d2b-4827-8a2f-1c5771970a8d" containerID="cf28409412fda95e3e91755b05fe62d4a98690c171f347747053923f6d0d927c" exitCode=0 Mar 09 15:24:06 crc kubenswrapper[4722]: I0309 15:24:06.167721 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0d962040-8d2b-4827-8a2f-1c5771970a8d","Type":"ContainerDied","Data":"467c0d05f435e710d42f3c400e9dd56788c214f93663c3cf5997a70ad7dfd284"} Mar 09 15:24:06 crc kubenswrapper[4722]: I0309 15:24:06.167763 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0d962040-8d2b-4827-8a2f-1c5771970a8d","Type":"ContainerDied","Data":"41e2b3c952e9ca88c4bde1a07b5d6986a406f4d45580077ef3d5e0e8ccbaef67"} Mar 09 15:24:06 crc kubenswrapper[4722]: I0309 15:24:06.167773 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0d962040-8d2b-4827-8a2f-1c5771970a8d","Type":"ContainerDied","Data":"cf28409412fda95e3e91755b05fe62d4a98690c171f347747053923f6d0d927c"} Mar 09 15:24:11 crc kubenswrapper[4722]: I0309 15:24:11.032356 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Mar 09 15:24:11 crc kubenswrapper[4722]: I0309 15:24:11.081184 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Mar 09 15:24:12 crc kubenswrapper[4722]: I0309 15:24:12.280092 4722 generic.go:334] "Generic (PLEG): container finished" podID="0d962040-8d2b-4827-8a2f-1c5771970a8d" containerID="965ea04502da9365aa29eedded9b80e15224a8c81534bba570ab5c1917f95991" exitCode=0 Mar 09 15:24:12 crc kubenswrapper[4722]: I0309 15:24:12.280175 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0d962040-8d2b-4827-8a2f-1c5771970a8d","Type":"ContainerDied","Data":"965ea04502da9365aa29eedded9b80e15224a8c81534bba570ab5c1917f95991"} Mar 09 15:24:13 crc kubenswrapper[4722]: I0309 15:24:13.098352 4722 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 15:24:13 crc kubenswrapper[4722]: I0309 15:24:13.134889 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d962040-8d2b-4827-8a2f-1c5771970a8d-config-data\") pod \"0d962040-8d2b-4827-8a2f-1c5771970a8d\" (UID: \"0d962040-8d2b-4827-8a2f-1c5771970a8d\") " Mar 09 15:24:13 crc kubenswrapper[4722]: I0309 15:24:13.134987 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0d962040-8d2b-4827-8a2f-1c5771970a8d-log-httpd\") pod \"0d962040-8d2b-4827-8a2f-1c5771970a8d\" (UID: \"0d962040-8d2b-4827-8a2f-1c5771970a8d\") " Mar 09 15:24:13 crc kubenswrapper[4722]: I0309 15:24:13.135121 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vb2kk\" (UniqueName: \"kubernetes.io/projected/0d962040-8d2b-4827-8a2f-1c5771970a8d-kube-api-access-vb2kk\") pod \"0d962040-8d2b-4827-8a2f-1c5771970a8d\" (UID: \"0d962040-8d2b-4827-8a2f-1c5771970a8d\") " Mar 09 15:24:13 crc kubenswrapper[4722]: I0309 15:24:13.135147 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0d962040-8d2b-4827-8a2f-1c5771970a8d-sg-core-conf-yaml\") pod \"0d962040-8d2b-4827-8a2f-1c5771970a8d\" (UID: \"0d962040-8d2b-4827-8a2f-1c5771970a8d\") " Mar 09 15:24:13 crc kubenswrapper[4722]: I0309 15:24:13.135166 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d962040-8d2b-4827-8a2f-1c5771970a8d-ceilometer-tls-certs\") pod \"0d962040-8d2b-4827-8a2f-1c5771970a8d\" (UID: \"0d962040-8d2b-4827-8a2f-1c5771970a8d\") " Mar 09 15:24:13 crc kubenswrapper[4722]: I0309 15:24:13.135277 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0d962040-8d2b-4827-8a2f-1c5771970a8d-run-httpd\") pod \"0d962040-8d2b-4827-8a2f-1c5771970a8d\" (UID: \"0d962040-8d2b-4827-8a2f-1c5771970a8d\") " Mar 09 15:24:13 crc kubenswrapper[4722]: I0309 15:24:13.135381 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d962040-8d2b-4827-8a2f-1c5771970a8d-combined-ca-bundle\") pod \"0d962040-8d2b-4827-8a2f-1c5771970a8d\" (UID: \"0d962040-8d2b-4827-8a2f-1c5771970a8d\") " Mar 09 15:24:13 crc kubenswrapper[4722]: I0309 15:24:13.135417 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d962040-8d2b-4827-8a2f-1c5771970a8d-scripts\") pod \"0d962040-8d2b-4827-8a2f-1c5771970a8d\" (UID: \"0d962040-8d2b-4827-8a2f-1c5771970a8d\") " Mar 09 15:24:13 crc kubenswrapper[4722]: I0309 15:24:13.135597 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d962040-8d2b-4827-8a2f-1c5771970a8d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0d962040-8d2b-4827-8a2f-1c5771970a8d" (UID: "0d962040-8d2b-4827-8a2f-1c5771970a8d"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 15:24:13 crc kubenswrapper[4722]: I0309 15:24:13.135806 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d962040-8d2b-4827-8a2f-1c5771970a8d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0d962040-8d2b-4827-8a2f-1c5771970a8d" (UID: "0d962040-8d2b-4827-8a2f-1c5771970a8d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 15:24:13 crc kubenswrapper[4722]: I0309 15:24:13.136414 4722 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0d962040-8d2b-4827-8a2f-1c5771970a8d-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 15:24:13 crc kubenswrapper[4722]: I0309 15:24:13.136438 4722 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0d962040-8d2b-4827-8a2f-1c5771970a8d-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 09 15:24:13 crc kubenswrapper[4722]: I0309 15:24:13.147629 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d962040-8d2b-4827-8a2f-1c5771970a8d-kube-api-access-vb2kk" (OuterVolumeSpecName: "kube-api-access-vb2kk") pod "0d962040-8d2b-4827-8a2f-1c5771970a8d" (UID: "0d962040-8d2b-4827-8a2f-1c5771970a8d"). InnerVolumeSpecName "kube-api-access-vb2kk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 15:24:13 crc kubenswrapper[4722]: I0309 15:24:13.150602 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d962040-8d2b-4827-8a2f-1c5771970a8d-scripts" (OuterVolumeSpecName: "scripts") pod "0d962040-8d2b-4827-8a2f-1c5771970a8d" (UID: "0d962040-8d2b-4827-8a2f-1c5771970a8d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 15:24:13 crc kubenswrapper[4722]: I0309 15:24:13.181337 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d962040-8d2b-4827-8a2f-1c5771970a8d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "0d962040-8d2b-4827-8a2f-1c5771970a8d" (UID: "0d962040-8d2b-4827-8a2f-1c5771970a8d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 15:24:13 crc kubenswrapper[4722]: I0309 15:24:13.241396 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d962040-8d2b-4827-8a2f-1c5771970a8d-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "0d962040-8d2b-4827-8a2f-1c5771970a8d" (UID: "0d962040-8d2b-4827-8a2f-1c5771970a8d"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 15:24:13 crc kubenswrapper[4722]: I0309 15:24:13.242139 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vb2kk\" (UniqueName: \"kubernetes.io/projected/0d962040-8d2b-4827-8a2f-1c5771970a8d-kube-api-access-vb2kk\") on node \"crc\" DevicePath \"\"" Mar 09 15:24:13 crc kubenswrapper[4722]: I0309 15:24:13.242227 4722 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0d962040-8d2b-4827-8a2f-1c5771970a8d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 09 15:24:13 crc kubenswrapper[4722]: I0309 15:24:13.242252 4722 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d962040-8d2b-4827-8a2f-1c5771970a8d-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 09 15:24:13 crc kubenswrapper[4722]: I0309 15:24:13.242262 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d962040-8d2b-4827-8a2f-1c5771970a8d-scripts\") on node \"crc\" DevicePath \"\"" Mar 09 15:24:13 crc kubenswrapper[4722]: I0309 15:24:13.280062 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d962040-8d2b-4827-8a2f-1c5771970a8d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0d962040-8d2b-4827-8a2f-1c5771970a8d" (UID: "0d962040-8d2b-4827-8a2f-1c5771970a8d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 15:24:13 crc kubenswrapper[4722]: I0309 15:24:13.303249 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bgdq7/must-gather-sd2h4" event={"ID":"13a7cb7f-e80c-4ff8-9a4e-1c1048399cd5","Type":"ContainerStarted","Data":"10da958cc3f84b9190c63354e4053d2f9fad8b6733521bf86ee17326bf2d8503"} Mar 09 15:24:13 crc kubenswrapper[4722]: I0309 15:24:13.315855 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0d962040-8d2b-4827-8a2f-1c5771970a8d","Type":"ContainerDied","Data":"f849cac505357dd19cab995379d0f5def9397acd79c5c06a296d4c27b3fa4e99"} Mar 09 15:24:13 crc kubenswrapper[4722]: I0309 15:24:13.316279 4722 scope.go:117] "RemoveContainer" containerID="467c0d05f435e710d42f3c400e9dd56788c214f93663c3cf5997a70ad7dfd284" Mar 09 15:24:13 crc kubenswrapper[4722]: I0309 15:24:13.315935 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 15:24:13 crc kubenswrapper[4722]: I0309 15:24:13.346015 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d962040-8d2b-4827-8a2f-1c5771970a8d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 09 15:24:13 crc kubenswrapper[4722]: I0309 15:24:13.350262 4722 scope.go:117] "RemoveContainer" containerID="41e2b3c952e9ca88c4bde1a07b5d6986a406f4d45580077ef3d5e0e8ccbaef67" Mar 09 15:24:13 crc kubenswrapper[4722]: I0309 15:24:13.350377 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d962040-8d2b-4827-8a2f-1c5771970a8d-config-data" (OuterVolumeSpecName: "config-data") pod "0d962040-8d2b-4827-8a2f-1c5771970a8d" (UID: "0d962040-8d2b-4827-8a2f-1c5771970a8d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 15:24:13 crc kubenswrapper[4722]: I0309 15:24:13.386162 4722 scope.go:117] "RemoveContainer" containerID="cf28409412fda95e3e91755b05fe62d4a98690c171f347747053923f6d0d927c" Mar 09 15:24:13 crc kubenswrapper[4722]: I0309 15:24:13.436497 4722 scope.go:117] "RemoveContainer" containerID="965ea04502da9365aa29eedded9b80e15224a8c81534bba570ab5c1917f95991" Mar 09 15:24:13 crc kubenswrapper[4722]: I0309 15:24:13.448601 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d962040-8d2b-4827-8a2f-1c5771970a8d-config-data\") on node \"crc\" DevicePath \"\"" Mar 09 15:24:13 crc kubenswrapper[4722]: I0309 15:24:13.658750 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 09 15:24:13 crc kubenswrapper[4722]: I0309 15:24:13.679085 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 09 15:24:13 crc kubenswrapper[4722]: I0309 15:24:13.695223 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 09 15:24:13 crc kubenswrapper[4722]: E0309 15:24:13.695965 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d962040-8d2b-4827-8a2f-1c5771970a8d" containerName="sg-core" Mar 09 15:24:13 crc kubenswrapper[4722]: I0309 15:24:13.695989 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d962040-8d2b-4827-8a2f-1c5771970a8d" containerName="sg-core" Mar 09 15:24:13 crc kubenswrapper[4722]: E0309 15:24:13.696008 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d962040-8d2b-4827-8a2f-1c5771970a8d" containerName="ceilometer-notification-agent" Mar 09 15:24:13 crc kubenswrapper[4722]: I0309 15:24:13.696015 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d962040-8d2b-4827-8a2f-1c5771970a8d" containerName="ceilometer-notification-agent" Mar 09 15:24:13 crc kubenswrapper[4722]: E0309 15:24:13.696047 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d962040-8d2b-4827-8a2f-1c5771970a8d" containerName="ceilometer-central-agent" Mar 09 15:24:13 crc kubenswrapper[4722]: I0309 15:24:13.696055 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d962040-8d2b-4827-8a2f-1c5771970a8d" containerName="ceilometer-central-agent" Mar 09 15:24:13 crc kubenswrapper[4722]: E0309 15:24:13.696070 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d962040-8d2b-4827-8a2f-1c5771970a8d" containerName="proxy-httpd" Mar 09 15:24:13 crc kubenswrapper[4722]: I0309 15:24:13.696076 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d962040-8d2b-4827-8a2f-1c5771970a8d" containerName="proxy-httpd" Mar 09 15:24:13 crc kubenswrapper[4722]: I0309 15:24:13.696359 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d962040-8d2b-4827-8a2f-1c5771970a8d" containerName="ceilometer-notification-agent" Mar 09 15:24:13 crc kubenswrapper[4722]: I0309 15:24:13.697288 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d962040-8d2b-4827-8a2f-1c5771970a8d" containerName="proxy-httpd" Mar 09 15:24:13 crc kubenswrapper[4722]: I0309 15:24:13.697474 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d962040-8d2b-4827-8a2f-1c5771970a8d" containerName="ceilometer-central-agent" Mar 09 15:24:13 crc kubenswrapper[4722]: I0309 15:24:13.697522 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d962040-8d2b-4827-8a2f-1c5771970a8d" containerName="sg-core" Mar 09 
15:24:13 crc kubenswrapper[4722]: I0309 15:24:13.720874 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 09 15:24:13 crc kubenswrapper[4722]: I0309 15:24:13.724459 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 09 15:24:13 crc kubenswrapper[4722]: I0309 15:24:13.724630 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 09 15:24:13 crc kubenswrapper[4722]: I0309 15:24:13.730539 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 09 15:24:13 crc kubenswrapper[4722]: I0309 15:24:13.740654 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 15:24:13 crc kubenswrapper[4722]: I0309 15:24:13.757050 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cd177746-866b-44d0-b46b-5b0f9b683a7b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cd177746-866b-44d0-b46b-5b0f9b683a7b\") " pod="openstack/ceilometer-0" Mar 09 15:24:13 crc kubenswrapper[4722]: I0309 15:24:13.757115 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd177746-866b-44d0-b46b-5b0f9b683a7b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cd177746-866b-44d0-b46b-5b0f9b683a7b\") " pod="openstack/ceilometer-0" Mar 09 15:24:13 crc kubenswrapper[4722]: I0309 15:24:13.757139 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd177746-866b-44d0-b46b-5b0f9b683a7b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"cd177746-866b-44d0-b46b-5b0f9b683a7b\") " pod="openstack/ceilometer-0" Mar 09 15:24:13 crc kubenswrapper[4722]: I0309 15:24:13.757408 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd177746-866b-44d0-b46b-5b0f9b683a7b-config-data\") pod \"ceilometer-0\" (UID: \"cd177746-866b-44d0-b46b-5b0f9b683a7b\") " pod="openstack/ceilometer-0" Mar 09 15:24:13 crc kubenswrapper[4722]: I0309 15:24:13.757494 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cd177746-866b-44d0-b46b-5b0f9b683a7b-run-httpd\") pod \"ceilometer-0\" (UID: \"cd177746-866b-44d0-b46b-5b0f9b683a7b\") " pod="openstack/ceilometer-0" Mar 09 15:24:13 crc kubenswrapper[4722]: I0309 15:24:13.757539 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfdbl\" (UniqueName: \"kubernetes.io/projected/cd177746-866b-44d0-b46b-5b0f9b683a7b-kube-api-access-pfdbl\") pod \"ceilometer-0\" (UID: \"cd177746-866b-44d0-b46b-5b0f9b683a7b\") " pod="openstack/ceilometer-0" Mar 09 15:24:13 crc kubenswrapper[4722]: I0309 15:24:13.757687 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd177746-866b-44d0-b46b-5b0f9b683a7b-scripts\") pod \"ceilometer-0\" (UID: \"cd177746-866b-44d0-b46b-5b0f9b683a7b\") " pod="openstack/ceilometer-0" Mar 09 15:24:13 crc kubenswrapper[4722]: I0309 15:24:13.757812 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cd177746-866b-44d0-b46b-5b0f9b683a7b-log-httpd\") pod \"ceilometer-0\" (UID: \"cd177746-866b-44d0-b46b-5b0f9b683a7b\") " pod="openstack/ceilometer-0" Mar 09 15:24:13 crc kubenswrapper[4722]: I0309 15:24:13.859507 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cd177746-866b-44d0-b46b-5b0f9b683a7b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cd177746-866b-44d0-b46b-5b0f9b683a7b\") " pod="openstack/ceilometer-0" Mar 09 15:24:13 crc kubenswrapper[4722]: I0309 15:24:13.859561 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd177746-866b-44d0-b46b-5b0f9b683a7b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cd177746-866b-44d0-b46b-5b0f9b683a7b\") " pod="openstack/ceilometer-0" Mar 09 15:24:13 crc kubenswrapper[4722]: I0309 15:24:13.859593 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd177746-866b-44d0-b46b-5b0f9b683a7b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"cd177746-866b-44d0-b46b-5b0f9b683a7b\") " pod="openstack/ceilometer-0" Mar 09 15:24:13 crc kubenswrapper[4722]: I0309 15:24:13.859659 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd177746-866b-44d0-b46b-5b0f9b683a7b-config-data\") pod \"ceilometer-0\" (UID: \"cd177746-866b-44d0-b46b-5b0f9b683a7b\") " pod="openstack/ceilometer-0" Mar 09 15:24:13 crc kubenswrapper[4722]: I0309 15:24:13.859693 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cd177746-866b-44d0-b46b-5b0f9b683a7b-run-httpd\") pod \"ceilometer-0\" (UID: \"cd177746-866b-44d0-b46b-5b0f9b683a7b\") " pod="openstack/ceilometer-0" Mar 09 15:24:13 crc kubenswrapper[4722]: I0309 15:24:13.859713 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfdbl\" (UniqueName: \"kubernetes.io/projected/cd177746-866b-44d0-b46b-5b0f9b683a7b-kube-api-access-pfdbl\") pod \"ceilometer-0\" (UID: \"cd177746-866b-44d0-b46b-5b0f9b683a7b\") " pod="openstack/ceilometer-0" Mar 09 15:24:13 crc kubenswrapper[4722]: I0309 15:24:13.859787 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd177746-866b-44d0-b46b-5b0f9b683a7b-scripts\") pod \"ceilometer-0\" (UID: \"cd177746-866b-44d0-b46b-5b0f9b683a7b\") " pod="openstack/ceilometer-0" Mar 09 15:24:13 crc kubenswrapper[4722]: I0309 15:24:13.859847 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cd177746-866b-44d0-b46b-5b0f9b683a7b-log-httpd\") pod \"ceilometer-0\" (UID: \"cd177746-866b-44d0-b46b-5b0f9b683a7b\") " pod="openstack/ceilometer-0" Mar 09 15:24:13 crc kubenswrapper[4722]: I0309 15:24:13.860364 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cd177746-866b-44d0-b46b-5b0f9b683a7b-log-httpd\") pod \"ceilometer-0\" (UID: \"cd177746-866b-44d0-b46b-5b0f9b683a7b\") " pod="openstack/ceilometer-0" Mar 09 15:24:13 crc kubenswrapper[4722]: I0309 15:24:13.860721 4722 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cd177746-866b-44d0-b46b-5b0f9b683a7b-run-httpd\") pod \"ceilometer-0\" (UID: \"cd177746-866b-44d0-b46b-5b0f9b683a7b\") " pod="openstack/ceilometer-0" Mar 09 15:24:13 crc kubenswrapper[4722]: I0309 15:24:13.864405 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cd177746-866b-44d0-b46b-5b0f9b683a7b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cd177746-866b-44d0-b46b-5b0f9b683a7b\") " pod="openstack/ceilometer-0" Mar 09 15:24:13 crc kubenswrapper[4722]: I0309 15:24:13.865900 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd177746-866b-44d0-b46b-5b0f9b683a7b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cd177746-866b-44d0-b46b-5b0f9b683a7b\") " pod="openstack/ceilometer-0" Mar 09 15:24:13 crc kubenswrapper[4722]: I0309 15:24:13.869539 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd177746-866b-44d0-b46b-5b0f9b683a7b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"cd177746-866b-44d0-b46b-5b0f9b683a7b\") " pod="openstack/ceilometer-0" Mar 09 15:24:13 crc kubenswrapper[4722]: I0309 15:24:13.873087 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd177746-866b-44d0-b46b-5b0f9b683a7b-scripts\") pod \"ceilometer-0\" (UID: \"cd177746-866b-44d0-b46b-5b0f9b683a7b\") " pod="openstack/ceilometer-0" Mar 09 15:24:13 crc kubenswrapper[4722]: I0309 15:24:13.874248 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd177746-866b-44d0-b46b-5b0f9b683a7b-config-data\") pod \"ceilometer-0\" (UID: \"cd177746-866b-44d0-b46b-5b0f9b683a7b\") " pod="openstack/ceilometer-0" Mar 09 15:24:13 crc kubenswrapper[4722]: I0309 15:24:13.923944 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfdbl\" (UniqueName: \"kubernetes.io/projected/cd177746-866b-44d0-b46b-5b0f9b683a7b-kube-api-access-pfdbl\") pod \"ceilometer-0\" (UID: \"cd177746-866b-44d0-b46b-5b0f9b683a7b\") " pod="openstack/ceilometer-0" Mar 09 15:24:14 crc kubenswrapper[4722]: I0309 15:24:14.039594 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 09 15:24:14 crc kubenswrapper[4722]: I0309 15:24:14.190236 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d962040-8d2b-4827-8a2f-1c5771970a8d" path="/var/lib/kubelet/pods/0d962040-8d2b-4827-8a2f-1c5771970a8d/volumes" Mar 09 15:24:14 crc kubenswrapper[4722]: I0309 15:24:14.234378 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-svbs7" podUID="44161991-883d-4494-80b3-b829ff355f47" containerName="registry-server" probeResult="failure" output=< Mar 09 15:24:14 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s Mar 09 15:24:14 crc kubenswrapper[4722]: > Mar 09 15:24:14 crc kubenswrapper[4722]: I0309 15:24:14.352409 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bgdq7/must-gather-sd2h4" event={"ID":"13a7cb7f-e80c-4ff8-9a4e-1c1048399cd5","Type":"ContainerStarted","Data":"1d7ec723dc1c1c0a85c1e18d64e41de3d9f4364b2e000c15dc8996ae9106febb"} Mar 09 15:24:14 crc kubenswrapper[4722]: I0309 15:24:14.370355 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-bgdq7/must-gather-sd2h4" podStartSLOduration=2.84764584 podStartE2EDuration="18.370339194s" podCreationTimestamp="2026-03-09 15:23:56 +0000 UTC" firstStartedPulling="2026-03-09 15:23:57.113981656 +0000 UTC m=+4877.669550232" lastFinishedPulling="2026-03-09 15:24:12.63667501 +0000 UTC m=+4893.192243586" observedRunningTime="2026-03-09 15:24:14.365014548 +0000 UTC m=+4894.920583134" watchObservedRunningTime="2026-03-09 15:24:14.370339194 +0000 UTC m=+4894.925907770" Mar 09 15:24:15 crc kubenswrapper[4722]: I0309 15:24:15.049228 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 09 15:24:15 crc kubenswrapper[4722]: I0309 15:24:15.366683 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551164-b2lll" event={"ID":"aae15a8c-7e4e-4db4-b209-d1243d668860","Type":"ContainerStarted","Data":"3444c3b817df432becaa95c2f52ef4df00df7db9375eb54518de33ccd7ebf670"} Mar 09 15:24:15 crc kubenswrapper[4722]: I0309 15:24:15.368254 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cd177746-866b-44d0-b46b-5b0f9b683a7b","Type":"ContainerStarted","Data":"85c11659d54dae79dd091b332ce4e6cafb4d5ce0dfe94a460dd224cb83f8b481"} Mar 09 15:24:15 crc kubenswrapper[4722]: I0309 15:24:15.646514 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-v57f2" podUID="4e27c5a4-8cba-4119-8006-f9841d6121dc" containerName="registry-server" probeResult="failure" output=< Mar 09 15:24:15 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s Mar 09 15:24:15 crc kubenswrapper[4722]: > Mar 09 15:24:16 crc kubenswrapper[4722]: I0309 15:24:16.389814 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cd177746-866b-44d0-b46b-5b0f9b683a7b","Type":"ContainerStarted","Data":"c531a5dd208004b5498cd8d63ff5d35bb7266974e9714792f4c6d7c9d82f4d45"} Mar 09 15:24:17 crc kubenswrapper[4722]: I0309 15:24:17.399760 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cd177746-866b-44d0-b46b-5b0f9b683a7b","Type":"ContainerStarted","Data":"b38747e6253cb34ff5bdad4ecdbffa457e8b7a6a04b23dd2877503e0a5528179"} Mar 09 15:24:18 crc kubenswrapper[4722]: I0309 15:24:18.412057 4722 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cd177746-866b-44d0-b46b-5b0f9b683a7b","Type":"ContainerStarted","Data":"1f9d5e83ffcee1bb50849a4c892f8e4d46bda73b596cac479040e34aeb718163"} Mar 09 15:24:19 crc kubenswrapper[4722]: I0309 15:24:19.425610 4722 generic.go:334] "Generic (PLEG): container finished" podID="aae15a8c-7e4e-4db4-b209-d1243d668860" containerID="3444c3b817df432becaa95c2f52ef4df00df7db9375eb54518de33ccd7ebf670" exitCode=0 Mar 09 15:24:19 crc kubenswrapper[4722]: I0309 15:24:19.427406 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551164-b2lll" event={"ID":"aae15a8c-7e4e-4db4-b209-d1243d668860","Type":"ContainerDied","Data":"3444c3b817df432becaa95c2f52ef4df00df7db9375eb54518de33ccd7ebf670"} Mar 09 15:24:21 crc kubenswrapper[4722]: I0309 15:24:21.097803 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551164-b2lll" Mar 09 15:24:21 crc kubenswrapper[4722]: I0309 15:24:21.173772 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cdh4n\" (UniqueName: \"kubernetes.io/projected/aae15a8c-7e4e-4db4-b209-d1243d668860-kube-api-access-cdh4n\") pod \"aae15a8c-7e4e-4db4-b209-d1243d668860\" (UID: \"aae15a8c-7e4e-4db4-b209-d1243d668860\") " Mar 09 15:24:21 crc kubenswrapper[4722]: I0309 15:24:21.181999 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aae15a8c-7e4e-4db4-b209-d1243d668860-kube-api-access-cdh4n" (OuterVolumeSpecName: "kube-api-access-cdh4n") pod "aae15a8c-7e4e-4db4-b209-d1243d668860" (UID: "aae15a8c-7e4e-4db4-b209-d1243d668860"). InnerVolumeSpecName "kube-api-access-cdh4n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 15:24:21 crc kubenswrapper[4722]: I0309 15:24:21.277254 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cdh4n\" (UniqueName: \"kubernetes.io/projected/aae15a8c-7e4e-4db4-b209-d1243d668860-kube-api-access-cdh4n\") on node \"crc\" DevicePath \"\"" Mar 09 15:24:21 crc kubenswrapper[4722]: I0309 15:24:21.484763 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cd177746-866b-44d0-b46b-5b0f9b683a7b","Type":"ContainerStarted","Data":"be87af72fabe31f73651cebbc2937387c0d7fc747703f6ca7e86ba54de838f76"} Mar 09 15:24:21 crc kubenswrapper[4722]: I0309 15:24:21.484850 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 09 15:24:21 crc kubenswrapper[4722]: I0309 15:24:21.493146 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551164-b2lll" event={"ID":"aae15a8c-7e4e-4db4-b209-d1243d668860","Type":"ContainerDied","Data":"fd19bad59f11412cc640fecb0e53652e203cba5ce390fd3f1c222b8b7419f22a"} Mar 09 15:24:21 crc kubenswrapper[4722]: I0309 15:24:21.493355 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551164-b2lll" Mar 09 15:24:21 crc kubenswrapper[4722]: I0309 15:24:21.499258 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd19bad59f11412cc640fecb0e53652e203cba5ce390fd3f1c222b8b7419f22a" Mar 09 15:24:21 crc kubenswrapper[4722]: I0309 15:24:21.520708 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.173627036 podStartE2EDuration="8.520683581s" podCreationTimestamp="2026-03-09 15:24:13 +0000 UTC" firstStartedPulling="2026-03-09 15:24:15.060121258 +0000 UTC m=+4895.615689834" lastFinishedPulling="2026-03-09 15:24:20.407177803 +0000 UTC m=+4900.962746379" observedRunningTime="2026-03-09 15:24:21.517037001 +0000 UTC m=+4902.072605577" watchObservedRunningTime="2026-03-09 15:24:21.520683581 +0000 UTC m=+4902.076252157" Mar 09 15:24:21 crc kubenswrapper[4722]: I0309 15:24:21.527920 4722 patch_prober.go:28] interesting pod/machine-config-daemon-hjrrb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 15:24:21 crc kubenswrapper[4722]: I0309 15:24:21.527965 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 15:24:21 crc kubenswrapper[4722]: I0309 15:24:21.527999 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" Mar 09 15:24:21 crc kubenswrapper[4722]: I0309 15:24:21.529698 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f204c1f7d34e99a1176cb155859ff24baa5114f3011037f9d600a75f90dbf8fc"} pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 15:24:21 crc kubenswrapper[4722]: I0309 15:24:21.529767 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerName="machine-config-daemon" containerID="cri-o://f204c1f7d34e99a1176cb155859ff24baa5114f3011037f9d600a75f90dbf8fc" gracePeriod=600 Mar 09 15:24:21 crc kubenswrapper[4722]: I0309 15:24:21.537692 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551158-pz25v"] Mar 09 15:24:21 crc kubenswrapper[4722]: I0309 15:24:21.549532 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551158-pz25v"] Mar 09 15:24:22 crc kubenswrapper[4722]: I0309 15:24:22.162683 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf39b656-7744-4eed-9a1d-b278c8eeb3c0" path="/var/lib/kubelet/pods/bf39b656-7744-4eed-9a1d-b278c8eeb3c0/volumes" Mar 09 15:24:22 crc kubenswrapper[4722]: I0309 15:24:22.507117 4722 generic.go:334] "Generic (PLEG): container finished" podID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerID="f204c1f7d34e99a1176cb155859ff24baa5114f3011037f9d600a75f90dbf8fc" exitCode=0 Mar 09 
Mar 09 15:24:22 crc kubenswrapper[4722]: I0309 15:24:22.508436 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" event={"ID":"dac2aaf5-653b-4b2a-8efe-ed26bac8d648","Type":"ContainerDied","Data":"f204c1f7d34e99a1176cb155859ff24baa5114f3011037f9d600a75f90dbf8fc"}
Mar 09 15:24:22 crc kubenswrapper[4722]: I0309 15:24:22.508474 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" event={"ID":"dac2aaf5-653b-4b2a-8efe-ed26bac8d648","Type":"ContainerStarted","Data":"05d0cab124a66072241bc2a93dace83012b26d6bbd0de5507999fb5df76700cb"}
Mar 09 15:24:22 crc kubenswrapper[4722]: I0309 15:24:22.508496 4722 scope.go:117] "RemoveContainer" containerID="ed831924138fb872c73acc14a5ac7c6ed36b421017c8fe60dbbaede3d8f11789"
Mar 09 15:24:24 crc kubenswrapper[4722]: I0309 15:24:24.202904 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-svbs7" podUID="44161991-883d-4494-80b3-b829ff355f47" containerName="registry-server" probeResult="failure" output=<
Mar 09 15:24:24 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s
Mar 09 15:24:24 crc kubenswrapper[4722]: >
Mar 09 15:24:25 crc kubenswrapper[4722]: I0309 15:24:25.671357 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-v57f2" podUID="4e27c5a4-8cba-4119-8006-f9841d6121dc" containerName="registry-server" probeResult="failure" output=<
Mar 09 15:24:25 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s
Mar 09 15:24:25 crc kubenswrapper[4722]: >
Mar 09 15:24:26 crc kubenswrapper[4722]: I0309 15:24:26.775316 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-bgdq7/crc-debug-trwmw"]
Mar 09 15:24:26 crc kubenswrapper[4722]: E0309 15:24:26.776794 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aae15a8c-7e4e-4db4-b209-d1243d668860" containerName="oc"
Mar 09 15:24:26 crc kubenswrapper[4722]: I0309 15:24:26.776814 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="aae15a8c-7e4e-4db4-b209-d1243d668860" containerName="oc"
Mar 09 15:24:26 crc kubenswrapper[4722]: I0309 15:24:26.777061 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="aae15a8c-7e4e-4db4-b209-d1243d668860" containerName="oc"
Mar 09 15:24:26 crc kubenswrapper[4722]: I0309 15:24:26.779306 4722 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-must-gather-bgdq7/crc-debug-trwmw" Mar 09 15:24:26 crc kubenswrapper[4722]: I0309 15:24:26.812847 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cd703b81-b135-46a9-b2e0-5ef4743376cc-host\") pod \"crc-debug-trwmw\" (UID: \"cd703b81-b135-46a9-b2e0-5ef4743376cc\") " pod="openshift-must-gather-bgdq7/crc-debug-trwmw" Mar 09 15:24:26 crc kubenswrapper[4722]: I0309 15:24:26.813267 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84h8f\" (UniqueName: \"kubernetes.io/projected/cd703b81-b135-46a9-b2e0-5ef4743376cc-kube-api-access-84h8f\") pod \"crc-debug-trwmw\" (UID: \"cd703b81-b135-46a9-b2e0-5ef4743376cc\") " pod="openshift-must-gather-bgdq7/crc-debug-trwmw" Mar 09 15:24:26 crc kubenswrapper[4722]: I0309 15:24:26.915400 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cd703b81-b135-46a9-b2e0-5ef4743376cc-host\") pod \"crc-debug-trwmw\" (UID: \"cd703b81-b135-46a9-b2e0-5ef4743376cc\") " pod="openshift-must-gather-bgdq7/crc-debug-trwmw" Mar 09 15:24:26 crc kubenswrapper[4722]: I0309 15:24:26.915449 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84h8f\" (UniqueName: \"kubernetes.io/projected/cd703b81-b135-46a9-b2e0-5ef4743376cc-kube-api-access-84h8f\") pod \"crc-debug-trwmw\" (UID: \"cd703b81-b135-46a9-b2e0-5ef4743376cc\") " pod="openshift-must-gather-bgdq7/crc-debug-trwmw" Mar 09 15:24:26 crc kubenswrapper[4722]: I0309 15:24:26.916053 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cd703b81-b135-46a9-b2e0-5ef4743376cc-host\") pod \"crc-debug-trwmw\" (UID: \"cd703b81-b135-46a9-b2e0-5ef4743376cc\") " pod="openshift-must-gather-bgdq7/crc-debug-trwmw" Mar 09 15:24:26 crc kubenswrapper[4722]: I0309 15:24:26.943828 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84h8f\" (UniqueName: \"kubernetes.io/projected/cd703b81-b135-46a9-b2e0-5ef4743376cc-kube-api-access-84h8f\") pod \"crc-debug-trwmw\" (UID: \"cd703b81-b135-46a9-b2e0-5ef4743376cc\") " pod="openshift-must-gather-bgdq7/crc-debug-trwmw" Mar 09 15:24:27 crc kubenswrapper[4722]: I0309 15:24:27.118557 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bgdq7/crc-debug-trwmw" Mar 09 15:24:27 crc kubenswrapper[4722]: I0309 15:24:27.564743 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bgdq7/crc-debug-trwmw" event={"ID":"cd703b81-b135-46a9-b2e0-5ef4743376cc","Type":"ContainerStarted","Data":"9ded4f4bef38c9d1f28294a7e57a66b705e54ce47adbc0a18f584cd8f3518d88"} Mar 09 15:24:31 crc kubenswrapper[4722]: I0309 15:24:31.985635 4722 scope.go:117] "RemoveContainer" containerID="7d831a8adee56ed8c32f1584589e4f2094a6800ee05874f152edb0d41a1042a0" Mar 09 15:24:33 crc kubenswrapper[4722]: I0309 15:24:33.148684 4722 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 09 15:24:34 crc kubenswrapper[4722]: I0309 15:24:34.211715 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-svbs7" podUID="44161991-883d-4494-80b3-b829ff355f47" containerName="registry-server" probeResult="failure" output=< Mar 09 15:24:34 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s Mar 09 15:24:34 crc kubenswrapper[4722]: > Mar 09 15:24:35 crc kubenswrapper[4722]: I0309 15:24:35.648094 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-v57f2" podUID="4e27c5a4-8cba-4119-8006-f9841d6121dc" containerName="registry-server" probeResult="failure" output=< Mar 09 15:24:35 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s Mar 09 15:24:35 crc kubenswrapper[4722]: > Mar 09 15:24:35 crc kubenswrapper[4722]: I0309 15:24:35.648493 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-v57f2" Mar 09 15:24:35 crc kubenswrapper[4722]: I0309 15:24:35.649542 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="registry-server" containerStatusID={"Type":"cri-o","ID":"64801968ad1fa38f05d01df1506ac098993ded0a01af660e30f5070685f78f41"} pod="openshift-marketplace/redhat-operators-v57f2" containerMessage="Container registry-server failed startup probe, will be restarted" Mar 09 15:24:35 crc kubenswrapper[4722]: I0309 15:24:35.649583 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-v57f2" podUID="4e27c5a4-8cba-4119-8006-f9841d6121dc" containerName="registry-server" containerID="cri-o://64801968ad1fa38f05d01df1506ac098993ded0a01af660e30f5070685f78f41" gracePeriod=30 Mar 09 15:24:42 crc kubenswrapper[4722]: E0309 15:24:42.968859 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6ab858aed98e4fe57e6b144da8e90ad5d6698bb4cc5521206f5c05809f0f9296" Mar 09 15:24:42 crc kubenswrapper[4722]: E0309 15:24:42.973585 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:container-00,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6ab858aed98e4fe57e6b144da8e90ad5d6698bb4cc5521206f5c05809f0f9296,Command:[chroot /host bash -c echo 'TOOLBOX_NAME=toolbox-osp' > /root/.toolboxrc ; rm -rf \"/var/tmp/sos-osp\" && mkdir -p \"/var/tmp/sos-osp\" && sudo podman rm --force toolbox-osp; sudo --preserve-env podman pull --authfile /var/lib/kubelet/config.json registry.redhat.io/rhel9/support-tools && toolbox sos report --batch --all-logs --only-plugins 
block,cifs,crio,devicemapper,devices,firewall_tables,firewalld,iscsi,lvm2,memory,multipath,nfs,nis,nvme,podman,process,processor,selinux,scsi,udev,logs,crypto --tmp-dir=\"/var/tmp/sos-osp\" && if [[ \"$(ls /var/log/pods/*/{*.log.*,*/*.log.*} 2>/dev/null)\" != '' ]]; then tar --ignore-failed-read --warning=no-file-changed -cJf \"/var/tmp/sos-osp/podlogs.tar.xz\" --transform 's,^,podlogs/,' /var/log/pods/*/{*.log.*,*/*.log.*} || true; fi],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:TMOUT,Value:900,ValueFrom:nil,},EnvVar{Name:HOST,Value:/host,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host,ReadOnly:false,MountPath:/host,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-84h8f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod crc-debug-trwmw_openshift-must-gather-bgdq7(cd703b81-b135-46a9-b2e0-5ef4743376cc): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 09 15:24:42 crc kubenswrapper[4722]: E0309 15:24:42.975191 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"container-00\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-must-gather-bgdq7/crc-debug-trwmw" podUID="cd703b81-b135-46a9-b2e0-5ef4743376cc"
Mar 09 15:24:43 crc kubenswrapper[4722]: E0309 15:24:43.762242 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"container-00\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6ab858aed98e4fe57e6b144da8e90ad5d6698bb4cc5521206f5c05809f0f9296\\\"\"" pod="openshift-must-gather-bgdq7/crc-debug-trwmw" podUID="cd703b81-b135-46a9-b2e0-5ef4743376cc"
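
The pair of errors above is the standard pull-failure sequence: the first sync attempt fails with ErrImagePull, and subsequent syncs report ImagePullBackOff while the kubelet waits out a growing delay between retries. Kubernetes documents the image-pull back-off as roughly exponential with a ceiling of about five minutes; the sketch below only illustrates that pattern and is not kubelet code.

```go
// Minimal sketch of the back-off pattern behind ErrImagePull ->
// ImagePullBackOff; the constants are illustrative assumptions.
package main

import (
	"fmt"
	"time"
)

func main() {
	delay, maxDelay := 10*time.Second, 5*time.Minute
	for attempt := 1; attempt <= 7; attempt++ {
		fmt.Printf("attempt %d: back off %s before retrying the pull\n", attempt, delay)
		delay *= 2
		if delay > maxDelay {
			delay = maxDelay
		}
	}
	// A successful pull resets the back-off; until then the pod's status
	// shows ImagePullBackOff between attempts.
}
```

In this log the retry does eventually succeed: crc-debug-trwmw's container-00 reaches ContainerStarted at 15:24:58.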
containerStatusID={"Type":"cri-o","ID":"0c3a05cd3ea849665c307ddf7b879759c27b1f5aedfe3354dd3ac827df5ecb41"} pod="openshift-marketplace/redhat-operators-svbs7" containerMessage="Container registry-server failed startup probe, will be restarted" Mar 09 15:24:44 crc kubenswrapper[4722]: I0309 15:24:44.198891 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-svbs7" podUID="44161991-883d-4494-80b3-b829ff355f47" containerName="registry-server" containerID="cri-o://0c3a05cd3ea849665c307ddf7b879759c27b1f5aedfe3354dd3ac827df5ecb41" gracePeriod=30 Mar 09 15:24:57 crc kubenswrapper[4722]: I0309 15:24:57.936061 4722 generic.go:334] "Generic (PLEG): container finished" podID="4e27c5a4-8cba-4119-8006-f9841d6121dc" containerID="64801968ad1fa38f05d01df1506ac098993ded0a01af660e30f5070685f78f41" exitCode=0 Mar 09 15:24:57 crc kubenswrapper[4722]: I0309 15:24:57.936107 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v57f2" event={"ID":"4e27c5a4-8cba-4119-8006-f9841d6121dc","Type":"ContainerDied","Data":"64801968ad1fa38f05d01df1506ac098993ded0a01af660e30f5070685f78f41"} Mar 09 15:24:57 crc kubenswrapper[4722]: I0309 15:24:57.936686 4722 scope.go:117] "RemoveContainer" containerID="39debea8587f6e6f170a7177c41011980494388f1c8da8fb895edab231f9cc8f" Mar 09 15:24:58 crc kubenswrapper[4722]: I0309 15:24:58.949978 4722 generic.go:334] "Generic (PLEG): container finished" podID="44161991-883d-4494-80b3-b829ff355f47" containerID="0c3a05cd3ea849665c307ddf7b879759c27b1f5aedfe3354dd3ac827df5ecb41" exitCode=0 Mar 09 15:24:58 crc kubenswrapper[4722]: I0309 15:24:58.950060 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-svbs7" event={"ID":"44161991-883d-4494-80b3-b829ff355f47","Type":"ContainerDied","Data":"0c3a05cd3ea849665c307ddf7b879759c27b1f5aedfe3354dd3ac827df5ecb41"} Mar 09 15:24:58 crc kubenswrapper[4722]: I0309 15:24:58.953159 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v57f2" event={"ID":"4e27c5a4-8cba-4119-8006-f9841d6121dc","Type":"ContainerStarted","Data":"7460cd647230927187709ade14224bc0049f2f449b47e44fb0cf9eccbfb14ba9"} Mar 09 15:24:58 crc kubenswrapper[4722]: I0309 15:24:58.955053 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bgdq7/crc-debug-trwmw" event={"ID":"cd703b81-b135-46a9-b2e0-5ef4743376cc","Type":"ContainerStarted","Data":"c271b8211cefb56ca605873d18ec367536653bbe35acdf81a77dd4fcc192810c"} Mar 09 15:24:59 crc kubenswrapper[4722]: I0309 15:24:59.022035 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-bgdq7/crc-debug-trwmw" podStartSLOduration=1.616051992 podStartE2EDuration="33.022015887s" podCreationTimestamp="2026-03-09 15:24:26 +0000 UTC" firstStartedPulling="2026-03-09 15:24:27.205840456 +0000 UTC m=+4907.761409032" lastFinishedPulling="2026-03-09 15:24:58.611804351 +0000 UTC m=+4939.167372927" observedRunningTime="2026-03-09 15:24:59.002582264 +0000 UTC m=+4939.558150840" watchObservedRunningTime="2026-03-09 15:24:59.022015887 +0000 UTC m=+4939.577584463" Mar 09 15:24:59 crc kubenswrapper[4722]: I0309 15:24:59.966928 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-svbs7" event={"ID":"44161991-883d-4494-80b3-b829ff355f47","Type":"ContainerStarted","Data":"8a26a5a101608ceed68af2af2a434e39fb9013308aa9272021748678ec2a6f1e"} Mar 09 15:25:03 crc 
kubenswrapper[4722]: I0309 15:25:03.151018 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-svbs7" Mar 09 15:25:03 crc kubenswrapper[4722]: I0309 15:25:03.151569 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-svbs7" Mar 09 15:25:04 crc kubenswrapper[4722]: I0309 15:25:04.205430 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-svbs7" podUID="44161991-883d-4494-80b3-b829ff355f47" containerName="registry-server" probeResult="failure" output=< Mar 09 15:25:04 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s Mar 09 15:25:04 crc kubenswrapper[4722]: > Mar 09 15:25:04 crc kubenswrapper[4722]: I0309 15:25:04.584636 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-v57f2" Mar 09 15:25:04 crc kubenswrapper[4722]: I0309 15:25:04.584722 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-v57f2" Mar 09 15:25:05 crc kubenswrapper[4722]: I0309 15:25:05.632680 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-v57f2" podUID="4e27c5a4-8cba-4119-8006-f9841d6121dc" containerName="registry-server" probeResult="failure" output=< Mar 09 15:25:05 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s Mar 09 15:25:05 crc kubenswrapper[4722]: > Mar 09 15:25:14 crc kubenswrapper[4722]: I0309 15:25:14.208543 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-svbs7" podUID="44161991-883d-4494-80b3-b829ff355f47" containerName="registry-server" probeResult="failure" output=< Mar 09 15:25:14 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s Mar 09 15:25:14 crc kubenswrapper[4722]: > Mar 09 15:25:15 crc kubenswrapper[4722]: I0309 15:25:15.641404 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-v57f2" podUID="4e27c5a4-8cba-4119-8006-f9841d6121dc" containerName="registry-server" probeResult="failure" output=< Mar 09 15:25:15 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s Mar 09 15:25:15 crc kubenswrapper[4722]: > Mar 09 15:25:22 crc kubenswrapper[4722]: I0309 15:25:22.227986 4722 generic.go:334] "Generic (PLEG): container finished" podID="f8a65e9f-5e0a-47d0-b251-aa4e52e2f581" containerID="accf60ca5a63b2963a1bf6cb44c81ffff1c61e6d3e9b03e803b0af50dfcdb881" exitCode=0 Mar 09 15:25:22 crc kubenswrapper[4722]: I0309 15:25:22.228224 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-bcd6d9dd6-5rp7g" event={"ID":"f8a65e9f-5e0a-47d0-b251-aa4e52e2f581","Type":"ContainerDied","Data":"accf60ca5a63b2963a1bf6cb44c81ffff1c61e6d3e9b03e803b0af50dfcdb881"} Mar 09 15:25:22 crc kubenswrapper[4722]: I0309 15:25:22.228641 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-bcd6d9dd6-5rp7g" event={"ID":"f8a65e9f-5e0a-47d0-b251-aa4e52e2f581","Type":"ContainerStarted","Data":"e2b93eefed6e2773e0461f5ccc0540525febf79ce1ee23f31d335615535e3a14"} Mar 09 15:25:24 crc kubenswrapper[4722]: I0309 15:25:24.221832 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-svbs7" podUID="44161991-883d-4494-80b3-b829ff355f47" containerName="registry-server" 
probeResult="failure" output=< Mar 09 15:25:24 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s Mar 09 15:25:24 crc kubenswrapper[4722]: > Mar 09 15:25:24 crc kubenswrapper[4722]: I0309 15:25:24.642798 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-v57f2" Mar 09 15:25:24 crc kubenswrapper[4722]: I0309 15:25:24.714596 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-v57f2" Mar 09 15:25:29 crc kubenswrapper[4722]: I0309 15:25:29.900402 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/metrics-server-bcd6d9dd6-5rp7g" Mar 09 15:25:29 crc kubenswrapper[4722]: I0309 15:25:29.900944 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-bcd6d9dd6-5rp7g" Mar 09 15:25:33 crc kubenswrapper[4722]: I0309 15:25:33.199256 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-svbs7" Mar 09 15:25:33 crc kubenswrapper[4722]: I0309 15:25:33.252650 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-svbs7" Mar 09 15:25:33 crc kubenswrapper[4722]: I0309 15:25:33.450999 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-svbs7"] Mar 09 15:25:34 crc kubenswrapper[4722]: I0309 15:25:34.368729 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-svbs7" podUID="44161991-883d-4494-80b3-b829ff355f47" containerName="registry-server" containerID="cri-o://8a26a5a101608ceed68af2af2a434e39fb9013308aa9272021748678ec2a6f1e" gracePeriod=2 Mar 09 15:25:35 crc kubenswrapper[4722]: I0309 15:25:35.382087 4722 generic.go:334] "Generic (PLEG): container finished" podID="44161991-883d-4494-80b3-b829ff355f47" containerID="8a26a5a101608ceed68af2af2a434e39fb9013308aa9272021748678ec2a6f1e" exitCode=0 Mar 09 15:25:35 crc kubenswrapper[4722]: I0309 15:25:35.382234 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-svbs7" event={"ID":"44161991-883d-4494-80b3-b829ff355f47","Type":"ContainerDied","Data":"8a26a5a101608ceed68af2af2a434e39fb9013308aa9272021748678ec2a6f1e"} Mar 09 15:25:35 crc kubenswrapper[4722]: I0309 15:25:35.382488 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-svbs7" event={"ID":"44161991-883d-4494-80b3-b829ff355f47","Type":"ContainerDied","Data":"428841f088edfd21acca847bdf06e66d5c0d29545acac520ba98928b8b361fa2"} Mar 09 15:25:35 crc kubenswrapper[4722]: I0309 15:25:35.382500 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="428841f088edfd21acca847bdf06e66d5c0d29545acac520ba98928b8b361fa2" Mar 09 15:25:35 crc kubenswrapper[4722]: I0309 15:25:35.382516 4722 scope.go:117] "RemoveContainer" containerID="0c3a05cd3ea849665c307ddf7b879759c27b1f5aedfe3354dd3ac827df5ecb41" Mar 09 15:25:35 crc kubenswrapper[4722]: I0309 15:25:35.444664 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-svbs7" Mar 09 15:25:35 crc kubenswrapper[4722]: I0309 15:25:35.573312 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-222qw\" (UniqueName: \"kubernetes.io/projected/44161991-883d-4494-80b3-b829ff355f47-kube-api-access-222qw\") pod \"44161991-883d-4494-80b3-b829ff355f47\" (UID: \"44161991-883d-4494-80b3-b829ff355f47\") " Mar 09 15:25:35 crc kubenswrapper[4722]: I0309 15:25:35.573880 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44161991-883d-4494-80b3-b829ff355f47-catalog-content\") pod \"44161991-883d-4494-80b3-b829ff355f47\" (UID: \"44161991-883d-4494-80b3-b829ff355f47\") " Mar 09 15:25:35 crc kubenswrapper[4722]: I0309 15:25:35.574008 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44161991-883d-4494-80b3-b829ff355f47-utilities\") pod \"44161991-883d-4494-80b3-b829ff355f47\" (UID: \"44161991-883d-4494-80b3-b829ff355f47\") " Mar 09 15:25:35 crc kubenswrapper[4722]: I0309 15:25:35.574752 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44161991-883d-4494-80b3-b829ff355f47-utilities" (OuterVolumeSpecName: "utilities") pod "44161991-883d-4494-80b3-b829ff355f47" (UID: "44161991-883d-4494-80b3-b829ff355f47"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 15:25:35 crc kubenswrapper[4722]: I0309 15:25:35.575058 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/44161991-883d-4494-80b3-b829ff355f47-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 15:25:35 crc kubenswrapper[4722]: I0309 15:25:35.591017 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44161991-883d-4494-80b3-b829ff355f47-kube-api-access-222qw" (OuterVolumeSpecName: "kube-api-access-222qw") pod "44161991-883d-4494-80b3-b829ff355f47" (UID: "44161991-883d-4494-80b3-b829ff355f47"). InnerVolumeSpecName "kube-api-access-222qw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 15:25:35 crc kubenswrapper[4722]: I0309 15:25:35.676836 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-222qw\" (UniqueName: \"kubernetes.io/projected/44161991-883d-4494-80b3-b829ff355f47-kube-api-access-222qw\") on node \"crc\" DevicePath \"\"" Mar 09 15:25:35 crc kubenswrapper[4722]: I0309 15:25:35.693214 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/44161991-883d-4494-80b3-b829ff355f47-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "44161991-883d-4494-80b3-b829ff355f47" (UID: "44161991-883d-4494-80b3-b829ff355f47"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 15:25:35 crc kubenswrapper[4722]: I0309 15:25:35.778629 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/44161991-883d-4494-80b3-b829ff355f47-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 15:25:36 crc kubenswrapper[4722]: I0309 15:25:36.395337 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-svbs7" Mar 09 15:25:36 crc kubenswrapper[4722]: I0309 15:25:36.459637 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-svbs7"] Mar 09 15:25:36 crc kubenswrapper[4722]: I0309 15:25:36.474190 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-svbs7"] Mar 09 15:25:38 crc kubenswrapper[4722]: I0309 15:25:38.164689 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44161991-883d-4494-80b3-b829ff355f47" path="/var/lib/kubelet/pods/44161991-883d-4494-80b3-b829ff355f47/volumes" Mar 09 15:25:43 crc kubenswrapper[4722]: I0309 15:25:43.509013 4722 generic.go:334] "Generic (PLEG): container finished" podID="cd703b81-b135-46a9-b2e0-5ef4743376cc" containerID="c271b8211cefb56ca605873d18ec367536653bbe35acdf81a77dd4fcc192810c" exitCode=0 Mar 09 15:25:43 crc kubenswrapper[4722]: I0309 15:25:43.509661 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bgdq7/crc-debug-trwmw" event={"ID":"cd703b81-b135-46a9-b2e0-5ef4743376cc","Type":"ContainerDied","Data":"c271b8211cefb56ca605873d18ec367536653bbe35acdf81a77dd4fcc192810c"} Mar 09 15:25:44 crc kubenswrapper[4722]: I0309 15:25:44.678549 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bgdq7/crc-debug-trwmw" Mar 09 15:25:44 crc kubenswrapper[4722]: I0309 15:25:44.723981 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-bgdq7/crc-debug-trwmw"] Mar 09 15:25:44 crc kubenswrapper[4722]: I0309 15:25:44.743340 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-bgdq7/crc-debug-trwmw"] Mar 09 15:25:44 crc kubenswrapper[4722]: I0309 15:25:44.806772 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cd703b81-b135-46a9-b2e0-5ef4743376cc-host\") pod \"cd703b81-b135-46a9-b2e0-5ef4743376cc\" (UID: \"cd703b81-b135-46a9-b2e0-5ef4743376cc\") " Mar 09 15:25:44 crc kubenswrapper[4722]: I0309 15:25:44.806864 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84h8f\" (UniqueName: \"kubernetes.io/projected/cd703b81-b135-46a9-b2e0-5ef4743376cc-kube-api-access-84h8f\") pod \"cd703b81-b135-46a9-b2e0-5ef4743376cc\" (UID: \"cd703b81-b135-46a9-b2e0-5ef4743376cc\") " Mar 09 15:25:44 crc kubenswrapper[4722]: I0309 15:25:44.806896 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cd703b81-b135-46a9-b2e0-5ef4743376cc-host" (OuterVolumeSpecName: "host") pod "cd703b81-b135-46a9-b2e0-5ef4743376cc" (UID: "cd703b81-b135-46a9-b2e0-5ef4743376cc"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 15:25:44 crc kubenswrapper[4722]: I0309 15:25:44.810286 4722 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cd703b81-b135-46a9-b2e0-5ef4743376cc-host\") on node \"crc\" DevicePath \"\"" Mar 09 15:25:44 crc kubenswrapper[4722]: I0309 15:25:44.819688 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd703b81-b135-46a9-b2e0-5ef4743376cc-kube-api-access-84h8f" (OuterVolumeSpecName: "kube-api-access-84h8f") pod "cd703b81-b135-46a9-b2e0-5ef4743376cc" (UID: "cd703b81-b135-46a9-b2e0-5ef4743376cc"). InnerVolumeSpecName "kube-api-access-84h8f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 15:25:44 crc kubenswrapper[4722]: I0309 15:25:44.912658 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84h8f\" (UniqueName: \"kubernetes.io/projected/cd703b81-b135-46a9-b2e0-5ef4743376cc-kube-api-access-84h8f\") on node \"crc\" DevicePath \"\"" Mar 09 15:25:45 crc kubenswrapper[4722]: I0309 15:25:45.532115 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ded4f4bef38c9d1f28294a7e57a66b705e54ce47adbc0a18f584cd8f3518d88" Mar 09 15:25:45 crc kubenswrapper[4722]: I0309 15:25:45.532722 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bgdq7/crc-debug-trwmw" Mar 09 15:25:45 crc kubenswrapper[4722]: I0309 15:25:45.927860 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-bgdq7/crc-debug-45hp2"] Mar 09 15:25:45 crc kubenswrapper[4722]: E0309 15:25:45.929134 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44161991-883d-4494-80b3-b829ff355f47" containerName="extract-content" Mar 09 15:25:45 crc kubenswrapper[4722]: I0309 15:25:45.929169 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="44161991-883d-4494-80b3-b829ff355f47" containerName="extract-content" Mar 09 15:25:45 crc kubenswrapper[4722]: E0309 15:25:45.929186 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44161991-883d-4494-80b3-b829ff355f47" containerName="registry-server" Mar 09 15:25:45 crc kubenswrapper[4722]: I0309 15:25:45.929195 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="44161991-883d-4494-80b3-b829ff355f47" containerName="registry-server" Mar 09 15:25:45 crc kubenswrapper[4722]: E0309 15:25:45.929226 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd703b81-b135-46a9-b2e0-5ef4743376cc" containerName="container-00" Mar 09 15:25:45 crc kubenswrapper[4722]: I0309 15:25:45.929237 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd703b81-b135-46a9-b2e0-5ef4743376cc" containerName="container-00" Mar 09 15:25:45 crc kubenswrapper[4722]: E0309 15:25:45.929272 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44161991-883d-4494-80b3-b829ff355f47" containerName="registry-server" Mar 09 15:25:45 crc kubenswrapper[4722]: I0309 15:25:45.929282 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="44161991-883d-4494-80b3-b829ff355f47" containerName="registry-server" Mar 09 15:25:45 crc kubenswrapper[4722]: E0309 15:25:45.929304 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44161991-883d-4494-80b3-b829ff355f47" containerName="extract-utilities" Mar 09 15:25:45 crc kubenswrapper[4722]: I0309 15:25:45.929314 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="44161991-883d-4494-80b3-b829ff355f47" containerName="extract-utilities" Mar 09 15:25:45 crc kubenswrapper[4722]: I0309 15:25:45.929687 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="44161991-883d-4494-80b3-b829ff355f47" containerName="registry-server" Mar 09 15:25:45 crc kubenswrapper[4722]: I0309 15:25:45.929724 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd703b81-b135-46a9-b2e0-5ef4743376cc" containerName="container-00" Mar 09 15:25:45 crc kubenswrapper[4722]: I0309 15:25:45.929738 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="44161991-883d-4494-80b3-b829ff355f47" containerName="registry-server" Mar 09 15:25:45 crc kubenswrapper[4722]: I0309 
15:25:45.931154 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bgdq7/crc-debug-45hp2" Mar 09 15:25:46 crc kubenswrapper[4722]: I0309 15:25:46.038864 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8ffb1106-bf05-4405-850b-6d5fa063f8a8-host\") pod \"crc-debug-45hp2\" (UID: \"8ffb1106-bf05-4405-850b-6d5fa063f8a8\") " pod="openshift-must-gather-bgdq7/crc-debug-45hp2" Mar 09 15:25:46 crc kubenswrapper[4722]: I0309 15:25:46.039411 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msbjs\" (UniqueName: \"kubernetes.io/projected/8ffb1106-bf05-4405-850b-6d5fa063f8a8-kube-api-access-msbjs\") pod \"crc-debug-45hp2\" (UID: \"8ffb1106-bf05-4405-850b-6d5fa063f8a8\") " pod="openshift-must-gather-bgdq7/crc-debug-45hp2" Mar 09 15:25:46 crc kubenswrapper[4722]: I0309 15:25:46.141182 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msbjs\" (UniqueName: \"kubernetes.io/projected/8ffb1106-bf05-4405-850b-6d5fa063f8a8-kube-api-access-msbjs\") pod \"crc-debug-45hp2\" (UID: \"8ffb1106-bf05-4405-850b-6d5fa063f8a8\") " pod="openshift-must-gather-bgdq7/crc-debug-45hp2" Mar 09 15:25:46 crc kubenswrapper[4722]: I0309 15:25:46.141648 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8ffb1106-bf05-4405-850b-6d5fa063f8a8-host\") pod \"crc-debug-45hp2\" (UID: \"8ffb1106-bf05-4405-850b-6d5fa063f8a8\") " pod="openshift-must-gather-bgdq7/crc-debug-45hp2" Mar 09 15:25:46 crc kubenswrapper[4722]: I0309 15:25:46.141955 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8ffb1106-bf05-4405-850b-6d5fa063f8a8-host\") pod \"crc-debug-45hp2\" (UID: \"8ffb1106-bf05-4405-850b-6d5fa063f8a8\") " pod="openshift-must-gather-bgdq7/crc-debug-45hp2" Mar 09 15:25:46 crc kubenswrapper[4722]: I0309 15:25:46.164076 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd703b81-b135-46a9-b2e0-5ef4743376cc" path="/var/lib/kubelet/pods/cd703b81-b135-46a9-b2e0-5ef4743376cc/volumes" Mar 09 15:25:46 crc kubenswrapper[4722]: I0309 15:25:46.169746 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msbjs\" (UniqueName: \"kubernetes.io/projected/8ffb1106-bf05-4405-850b-6d5fa063f8a8-kube-api-access-msbjs\") pod \"crc-debug-45hp2\" (UID: \"8ffb1106-bf05-4405-850b-6d5fa063f8a8\") " pod="openshift-must-gather-bgdq7/crc-debug-45hp2" Mar 09 15:25:46 crc kubenswrapper[4722]: I0309 15:25:46.251120 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bgdq7/crc-debug-45hp2" Mar 09 15:25:46 crc kubenswrapper[4722]: W0309 15:25:46.358788 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ffb1106_bf05_4405_850b_6d5fa063f8a8.slice/crio-c9af42fee94264e32e226b1b8b8b31c3c29375a2f15be1ffd4f46c0ac358928d WatchSource:0}: Error finding container c9af42fee94264e32e226b1b8b8b31c3c29375a2f15be1ffd4f46c0ac358928d: Status 404 returned error can't find the container with id c9af42fee94264e32e226b1b8b8b31c3c29375a2f15be1ffd4f46c0ac358928d Mar 09 15:25:46 crc kubenswrapper[4722]: I0309 15:25:46.547319 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bgdq7/crc-debug-45hp2" event={"ID":"8ffb1106-bf05-4405-850b-6d5fa063f8a8","Type":"ContainerStarted","Data":"c9af42fee94264e32e226b1b8b8b31c3c29375a2f15be1ffd4f46c0ac358928d"} Mar 09 15:25:47 crc kubenswrapper[4722]: I0309 15:25:47.566102 4722 generic.go:334] "Generic (PLEG): container finished" podID="8ffb1106-bf05-4405-850b-6d5fa063f8a8" containerID="60a3f7a2763968e65f4b292ed56000b48c001ad5126d8f59220768804bc386e5" exitCode=0 Mar 09 15:25:47 crc kubenswrapper[4722]: I0309 15:25:47.566195 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bgdq7/crc-debug-45hp2" event={"ID":"8ffb1106-bf05-4405-850b-6d5fa063f8a8","Type":"ContainerDied","Data":"60a3f7a2763968e65f4b292ed56000b48c001ad5126d8f59220768804bc386e5"} Mar 09 15:25:48 crc kubenswrapper[4722]: I0309 15:25:48.694799 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bgdq7/crc-debug-45hp2" Mar 09 15:25:48 crc kubenswrapper[4722]: I0309 15:25:48.788858 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-bgdq7/crc-debug-45hp2"] Mar 09 15:25:48 crc kubenswrapper[4722]: I0309 15:25:48.801522 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-bgdq7/crc-debug-45hp2"] Mar 09 15:25:48 crc kubenswrapper[4722]: I0309 15:25:48.812432 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-msbjs\" (UniqueName: \"kubernetes.io/projected/8ffb1106-bf05-4405-850b-6d5fa063f8a8-kube-api-access-msbjs\") pod \"8ffb1106-bf05-4405-850b-6d5fa063f8a8\" (UID: \"8ffb1106-bf05-4405-850b-6d5fa063f8a8\") " Mar 09 15:25:48 crc kubenswrapper[4722]: I0309 15:25:48.812464 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8ffb1106-bf05-4405-850b-6d5fa063f8a8-host\") pod \"8ffb1106-bf05-4405-850b-6d5fa063f8a8\" (UID: \"8ffb1106-bf05-4405-850b-6d5fa063f8a8\") " Mar 09 15:25:48 crc kubenswrapper[4722]: I0309 15:25:48.812688 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8ffb1106-bf05-4405-850b-6d5fa063f8a8-host" (OuterVolumeSpecName: "host") pod "8ffb1106-bf05-4405-850b-6d5fa063f8a8" (UID: "8ffb1106-bf05-4405-850b-6d5fa063f8a8"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 15:25:48 crc kubenswrapper[4722]: I0309 15:25:48.812982 4722 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8ffb1106-bf05-4405-850b-6d5fa063f8a8-host\") on node \"crc\" DevicePath \"\"" Mar 09 15:25:48 crc kubenswrapper[4722]: I0309 15:25:48.818102 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ffb1106-bf05-4405-850b-6d5fa063f8a8-kube-api-access-msbjs" (OuterVolumeSpecName: "kube-api-access-msbjs") pod "8ffb1106-bf05-4405-850b-6d5fa063f8a8" (UID: "8ffb1106-bf05-4405-850b-6d5fa063f8a8"). InnerVolumeSpecName "kube-api-access-msbjs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 15:25:48 crc kubenswrapper[4722]: I0309 15:25:48.914518 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-msbjs\" (UniqueName: \"kubernetes.io/projected/8ffb1106-bf05-4405-850b-6d5fa063f8a8-kube-api-access-msbjs\") on node \"crc\" DevicePath \"\"" Mar 09 15:25:49 crc kubenswrapper[4722]: I0309 15:25:49.590766 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c9af42fee94264e32e226b1b8b8b31c3c29375a2f15be1ffd4f46c0ac358928d" Mar 09 15:25:49 crc kubenswrapper[4722]: I0309 15:25:49.590808 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bgdq7/crc-debug-45hp2" Mar 09 15:25:49 crc kubenswrapper[4722]: I0309 15:25:49.906047 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-bcd6d9dd6-5rp7g" Mar 09 15:25:49 crc kubenswrapper[4722]: I0309 15:25:49.909834 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-bcd6d9dd6-5rp7g" Mar 09 15:25:50 crc kubenswrapper[4722]: I0309 15:25:50.015368 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-bgdq7/crc-debug-rt9wz"] Mar 09 15:25:50 crc kubenswrapper[4722]: E0309 15:25:50.015820 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ffb1106-bf05-4405-850b-6d5fa063f8a8" containerName="container-00" Mar 09 15:25:50 crc kubenswrapper[4722]: I0309 15:25:50.015835 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ffb1106-bf05-4405-850b-6d5fa063f8a8" containerName="container-00" Mar 09 15:25:50 crc kubenswrapper[4722]: I0309 15:25:50.016114 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ffb1106-bf05-4405-850b-6d5fa063f8a8" containerName="container-00" Mar 09 15:25:50 crc kubenswrapper[4722]: I0309 15:25:50.016891 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bgdq7/crc-debug-rt9wz" Mar 09 15:25:50 crc kubenswrapper[4722]: I0309 15:25:50.145408 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnnj4\" (UniqueName: \"kubernetes.io/projected/39d38ec1-fadc-4877-9992-62a6db0d1a27-kube-api-access-pnnj4\") pod \"crc-debug-rt9wz\" (UID: \"39d38ec1-fadc-4877-9992-62a6db0d1a27\") " pod="openshift-must-gather-bgdq7/crc-debug-rt9wz" Mar 09 15:25:50 crc kubenswrapper[4722]: I0309 15:25:50.145836 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/39d38ec1-fadc-4877-9992-62a6db0d1a27-host\") pod \"crc-debug-rt9wz\" (UID: \"39d38ec1-fadc-4877-9992-62a6db0d1a27\") " pod="openshift-must-gather-bgdq7/crc-debug-rt9wz" Mar 09 15:25:50 crc kubenswrapper[4722]: I0309 15:25:50.165510 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ffb1106-bf05-4405-850b-6d5fa063f8a8" path="/var/lib/kubelet/pods/8ffb1106-bf05-4405-850b-6d5fa063f8a8/volumes" Mar 09 15:25:50 crc kubenswrapper[4722]: I0309 15:25:50.248395 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/39d38ec1-fadc-4877-9992-62a6db0d1a27-host\") pod \"crc-debug-rt9wz\" (UID: \"39d38ec1-fadc-4877-9992-62a6db0d1a27\") " pod="openshift-must-gather-bgdq7/crc-debug-rt9wz" Mar 09 15:25:50 crc kubenswrapper[4722]: I0309 15:25:50.248570 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/39d38ec1-fadc-4877-9992-62a6db0d1a27-host\") pod \"crc-debug-rt9wz\" (UID: \"39d38ec1-fadc-4877-9992-62a6db0d1a27\") " pod="openshift-must-gather-bgdq7/crc-debug-rt9wz" Mar 09 15:25:50 crc kubenswrapper[4722]: I0309 15:25:50.248619 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnnj4\" (UniqueName: \"kubernetes.io/projected/39d38ec1-fadc-4877-9992-62a6db0d1a27-kube-api-access-pnnj4\") pod \"crc-debug-rt9wz\" (UID: \"39d38ec1-fadc-4877-9992-62a6db0d1a27\") " pod="openshift-must-gather-bgdq7/crc-debug-rt9wz" Mar 09 15:25:50 crc kubenswrapper[4722]: I0309 15:25:50.975106 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnnj4\" (UniqueName: \"kubernetes.io/projected/39d38ec1-fadc-4877-9992-62a6db0d1a27-kube-api-access-pnnj4\") pod \"crc-debug-rt9wz\" (UID: \"39d38ec1-fadc-4877-9992-62a6db0d1a27\") " pod="openshift-must-gather-bgdq7/crc-debug-rt9wz" Mar 09 15:25:51 crc kubenswrapper[4722]: I0309 15:25:51.237404 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bgdq7/crc-debug-rt9wz" Mar 09 15:25:51 crc kubenswrapper[4722]: I0309 15:25:51.623488 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bgdq7/crc-debug-rt9wz" event={"ID":"39d38ec1-fadc-4877-9992-62a6db0d1a27","Type":"ContainerStarted","Data":"4b4f4d7c319232b3b1372d6c1675bea7598166f3c3f9bdb9f22ab8b2054513fe"} Mar 09 15:25:51 crc kubenswrapper[4722]: I0309 15:25:51.623763 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bgdq7/crc-debug-rt9wz" event={"ID":"39d38ec1-fadc-4877-9992-62a6db0d1a27","Type":"ContainerStarted","Data":"0b0200fa67c955a8ed0ce641da19a89352ca71fffb34f1fddf8a59f5f215a30d"} Mar 09 15:25:51 crc kubenswrapper[4722]: I0309 15:25:51.700555 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-bgdq7/crc-debug-rt9wz"] Mar 09 15:25:51 crc kubenswrapper[4722]: I0309 15:25:51.712678 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-bgdq7/crc-debug-rt9wz"] Mar 09 15:25:52 crc kubenswrapper[4722]: I0309 15:25:52.640259 4722 generic.go:334] "Generic (PLEG): container finished" podID="39d38ec1-fadc-4877-9992-62a6db0d1a27" containerID="4b4f4d7c319232b3b1372d6c1675bea7598166f3c3f9bdb9f22ab8b2054513fe" exitCode=0 Mar 09 15:25:52 crc kubenswrapper[4722]: I0309 15:25:52.803550 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bgdq7/crc-debug-rt9wz" Mar 09 15:25:52 crc kubenswrapper[4722]: I0309 15:25:52.936256 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/39d38ec1-fadc-4877-9992-62a6db0d1a27-host\") pod \"39d38ec1-fadc-4877-9992-62a6db0d1a27\" (UID: \"39d38ec1-fadc-4877-9992-62a6db0d1a27\") " Mar 09 15:25:52 crc kubenswrapper[4722]: I0309 15:25:52.936381 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/39d38ec1-fadc-4877-9992-62a6db0d1a27-host" (OuterVolumeSpecName: "host") pod "39d38ec1-fadc-4877-9992-62a6db0d1a27" (UID: "39d38ec1-fadc-4877-9992-62a6db0d1a27"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 09 15:25:52 crc kubenswrapper[4722]: I0309 15:25:52.936482 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pnnj4\" (UniqueName: \"kubernetes.io/projected/39d38ec1-fadc-4877-9992-62a6db0d1a27-kube-api-access-pnnj4\") pod \"39d38ec1-fadc-4877-9992-62a6db0d1a27\" (UID: \"39d38ec1-fadc-4877-9992-62a6db0d1a27\") " Mar 09 15:25:52 crc kubenswrapper[4722]: I0309 15:25:52.937176 4722 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/39d38ec1-fadc-4877-9992-62a6db0d1a27-host\") on node \"crc\" DevicePath \"\"" Mar 09 15:25:53 crc kubenswrapper[4722]: I0309 15:25:53.375885 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39d38ec1-fadc-4877-9992-62a6db0d1a27-kube-api-access-pnnj4" (OuterVolumeSpecName: "kube-api-access-pnnj4") pod "39d38ec1-fadc-4877-9992-62a6db0d1a27" (UID: "39d38ec1-fadc-4877-9992-62a6db0d1a27"). InnerVolumeSpecName "kube-api-access-pnnj4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 15:25:53 crc kubenswrapper[4722]: I0309 15:25:53.450827 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pnnj4\" (UniqueName: \"kubernetes.io/projected/39d38ec1-fadc-4877-9992-62a6db0d1a27-kube-api-access-pnnj4\") on node \"crc\" DevicePath \"\"" Mar 09 15:25:53 crc kubenswrapper[4722]: I0309 15:25:53.652420 4722 scope.go:117] "RemoveContainer" containerID="4b4f4d7c319232b3b1372d6c1675bea7598166f3c3f9bdb9f22ab8b2054513fe" Mar 09 15:25:53 crc kubenswrapper[4722]: I0309 15:25:53.652447 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bgdq7/crc-debug-rt9wz" Mar 09 15:25:54 crc kubenswrapper[4722]: I0309 15:25:54.148394 4722 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 09 15:25:54 crc kubenswrapper[4722]: I0309 15:25:54.161981 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39d38ec1-fadc-4877-9992-62a6db0d1a27" path="/var/lib/kubelet/pods/39d38ec1-fadc-4877-9992-62a6db0d1a27/volumes" Mar 09 15:26:00 crc kubenswrapper[4722]: I0309 15:26:00.145740 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551166-bgm8q"] Mar 09 15:26:00 crc kubenswrapper[4722]: E0309 15:26:00.146875 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39d38ec1-fadc-4877-9992-62a6db0d1a27" containerName="container-00" Mar 09 15:26:00 crc kubenswrapper[4722]: I0309 15:26:00.146907 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="39d38ec1-fadc-4877-9992-62a6db0d1a27" containerName="container-00" Mar 09 15:26:00 crc kubenswrapper[4722]: I0309 15:26:00.147228 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="39d38ec1-fadc-4877-9992-62a6db0d1a27" containerName="container-00" Mar 09 15:26:00 crc kubenswrapper[4722]: I0309 15:26:00.148161 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551166-bgm8q" Mar 09 15:26:00 crc kubenswrapper[4722]: I0309 15:26:00.160982 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551166-bgm8q"] Mar 09 15:26:00 crc kubenswrapper[4722]: I0309 15:26:00.161020 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 15:26:00 crc kubenswrapper[4722]: I0309 15:26:00.162078 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 15:26:00 crc kubenswrapper[4722]: I0309 15:26:00.165900 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-dr2x6" Mar 09 15:26:00 crc kubenswrapper[4722]: I0309 15:26:00.207235 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-424wl\" (UniqueName: \"kubernetes.io/projected/ca84248b-e4d2-4c20-a7e6-4c41d64cbb7e-kube-api-access-424wl\") pod \"auto-csr-approver-29551166-bgm8q\" (UID: \"ca84248b-e4d2-4c20-a7e6-4c41d64cbb7e\") " pod="openshift-infra/auto-csr-approver-29551166-bgm8q" Mar 09 15:26:00 crc kubenswrapper[4722]: I0309 15:26:00.311306 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-424wl\" (UniqueName: \"kubernetes.io/projected/ca84248b-e4d2-4c20-a7e6-4c41d64cbb7e-kube-api-access-424wl\") pod \"auto-csr-approver-29551166-bgm8q\" (UID: \"ca84248b-e4d2-4c20-a7e6-4c41d64cbb7e\") " pod="openshift-infra/auto-csr-approver-29551166-bgm8q" Mar 09 15:26:00 crc kubenswrapper[4722]: I0309 15:26:00.346715 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-424wl\" (UniqueName: \"kubernetes.io/projected/ca84248b-e4d2-4c20-a7e6-4c41d64cbb7e-kube-api-access-424wl\") pod \"auto-csr-approver-29551166-bgm8q\" (UID: \"ca84248b-e4d2-4c20-a7e6-4c41d64cbb7e\") " pod="openshift-infra/auto-csr-approver-29551166-bgm8q" Mar 09 15:26:00 crc kubenswrapper[4722]: I0309 15:26:00.487998 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551166-bgm8q" Mar 09 15:26:01 crc kubenswrapper[4722]: I0309 15:26:01.043090 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551166-bgm8q"] Mar 09 15:26:01 crc kubenswrapper[4722]: W0309 15:26:01.047375 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca84248b_e4d2_4c20_a7e6_4c41d64cbb7e.slice/crio-0977f435eac0da08986ea9165f59870fb1236a21f1618184fdb5818e31e986c4 WatchSource:0}: Error finding container 0977f435eac0da08986ea9165f59870fb1236a21f1618184fdb5818e31e986c4: Status 404 returned error can't find the container with id 0977f435eac0da08986ea9165f59870fb1236a21f1618184fdb5818e31e986c4 Mar 09 15:26:01 crc kubenswrapper[4722]: I0309 15:26:01.764632 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551166-bgm8q" event={"ID":"ca84248b-e4d2-4c20-a7e6-4c41d64cbb7e","Type":"ContainerStarted","Data":"0977f435eac0da08986ea9165f59870fb1236a21f1618184fdb5818e31e986c4"} Mar 09 15:26:02 crc kubenswrapper[4722]: I0309 15:26:02.777568 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551166-bgm8q" event={"ID":"ca84248b-e4d2-4c20-a7e6-4c41d64cbb7e","Type":"ContainerStarted","Data":"824121ab0ac52d5e8b6e0e8c57813649c3d2f7180b2f52686cc3e32d5064cd75"} Mar 09 15:26:02 crc kubenswrapper[4722]: I0309 15:26:02.800132 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551166-bgm8q" podStartSLOduration=1.5067289449999999 podStartE2EDuration="2.800113469s" podCreationTimestamp="2026-03-09 15:26:00 +0000 UTC" firstStartedPulling="2026-03-09 15:26:01.050899919 +0000 UTC m=+5001.606468495" lastFinishedPulling="2026-03-09 15:26:02.344284423 +0000 UTC m=+5002.899853019" observedRunningTime="2026-03-09 15:26:02.795138232 +0000 UTC m=+5003.350706838" watchObservedRunningTime="2026-03-09 15:26:02.800113469 +0000 UTC m=+5003.355682045" Mar 09 15:26:04 crc kubenswrapper[4722]: I0309 15:26:04.804615 4722 generic.go:334] "Generic (PLEG): container finished" podID="ca84248b-e4d2-4c20-a7e6-4c41d64cbb7e" containerID="824121ab0ac52d5e8b6e0e8c57813649c3d2f7180b2f52686cc3e32d5064cd75" exitCode=0 Mar 09 15:26:04 crc kubenswrapper[4722]: I0309 15:26:04.804752 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551166-bgm8q" event={"ID":"ca84248b-e4d2-4c20-a7e6-4c41d64cbb7e","Type":"ContainerDied","Data":"824121ab0ac52d5e8b6e0e8c57813649c3d2f7180b2f52686cc3e32d5064cd75"} Mar 09 15:26:06 crc kubenswrapper[4722]: I0309 15:26:06.355706 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551166-bgm8q" Mar 09 15:26:06 crc kubenswrapper[4722]: I0309 15:26:06.480136 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-424wl\" (UniqueName: \"kubernetes.io/projected/ca84248b-e4d2-4c20-a7e6-4c41d64cbb7e-kube-api-access-424wl\") pod \"ca84248b-e4d2-4c20-a7e6-4c41d64cbb7e\" (UID: \"ca84248b-e4d2-4c20-a7e6-4c41d64cbb7e\") " Mar 09 15:26:06 crc kubenswrapper[4722]: I0309 15:26:06.485852 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca84248b-e4d2-4c20-a7e6-4c41d64cbb7e-kube-api-access-424wl" (OuterVolumeSpecName: "kube-api-access-424wl") pod "ca84248b-e4d2-4c20-a7e6-4c41d64cbb7e" (UID: "ca84248b-e4d2-4c20-a7e6-4c41d64cbb7e"). InnerVolumeSpecName "kube-api-access-424wl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 15:26:06 crc kubenswrapper[4722]: I0309 15:26:06.583833 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-424wl\" (UniqueName: \"kubernetes.io/projected/ca84248b-e4d2-4c20-a7e6-4c41d64cbb7e-kube-api-access-424wl\") on node \"crc\" DevicePath \"\"" Mar 09 15:26:06 crc kubenswrapper[4722]: I0309 15:26:06.850535 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551166-bgm8q" event={"ID":"ca84248b-e4d2-4c20-a7e6-4c41d64cbb7e","Type":"ContainerDied","Data":"0977f435eac0da08986ea9165f59870fb1236a21f1618184fdb5818e31e986c4"} Mar 09 15:26:06 crc kubenswrapper[4722]: I0309 15:26:06.850815 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0977f435eac0da08986ea9165f59870fb1236a21f1618184fdb5818e31e986c4" Mar 09 15:26:06 crc kubenswrapper[4722]: I0309 15:26:06.850660 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551166-bgm8q" Mar 09 15:26:06 crc kubenswrapper[4722]: I0309 15:26:06.911094 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551160-gwht7"] Mar 09 15:26:06 crc kubenswrapper[4722]: I0309 15:26:06.934132 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551160-gwht7"] Mar 09 15:26:08 crc kubenswrapper[4722]: I0309 15:26:08.183174 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29face16-3773-4db5-8795-9d16496a7ecd" path="/var/lib/kubelet/pods/29face16-3773-4db5-8795-9d16496a7ecd/volumes" Mar 09 15:26:18 crc kubenswrapper[4722]: I0309 15:26:18.649020 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_0083dc56-7904-4b06-8c12-2429f8a7fe9a/aodh-api/0.log" Mar 09 15:26:18 crc kubenswrapper[4722]: I0309 15:26:18.969446 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_0083dc56-7904-4b06-8c12-2429f8a7fe9a/aodh-evaluator/0.log" Mar 09 15:26:18 crc kubenswrapper[4722]: I0309 15:26:18.996794 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_0083dc56-7904-4b06-8c12-2429f8a7fe9a/aodh-listener/0.log" Mar 09 15:26:19 crc kubenswrapper[4722]: I0309 15:26:19.173269 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_0083dc56-7904-4b06-8c12-2429f8a7fe9a/aodh-notifier/0.log" Mar 09 15:26:19 crc kubenswrapper[4722]: I0309 15:26:19.312342 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-847f5f7dcd-6xz74_339f60b7-5615-4bbf-a907-ec8daeb69158/barbican-api-log/0.log" Mar 09 15:26:19 crc kubenswrapper[4722]: I0309 15:26:19.337980 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-847f5f7dcd-6xz74_339f60b7-5615-4bbf-a907-ec8daeb69158/barbican-api/0.log" Mar 09 15:26:19 crc kubenswrapper[4722]: I0309 15:26:19.384571 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5c8c9c48fd-dnbhs_612e965f-d243-486c-90a5-e4c867ef6fd5/barbican-keystone-listener/0.log" Mar 09 15:26:19 crc kubenswrapper[4722]: I0309 15:26:19.607218 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5c8c9c48fd-dnbhs_612e965f-d243-486c-90a5-e4c867ef6fd5/barbican-keystone-listener-log/0.log" Mar 09 15:26:19 crc kubenswrapper[4722]: I0309 15:26:19.641758 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5576b4c89f-ddl4q_7ff2eef6-823d-496e-b64d-abb692d53b42/barbican-worker/0.log" Mar 09 15:26:19 crc kubenswrapper[4722]: I0309 15:26:19.666618 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5576b4c89f-ddl4q_7ff2eef6-823d-496e-b64d-abb692d53b42/barbican-worker-log/0.log" Mar 09 15:26:19 crc kubenswrapper[4722]: I0309 15:26:19.908330 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-fsqlh_4a60c1ca-738f-48a9-b972-3eef08a28fc6/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 15:26:19 crc kubenswrapper[4722]: I0309 15:26:19.933894 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_cd177746-866b-44d0-b46b-5b0f9b683a7b/ceilometer-central-agent/0.log" Mar 09 15:26:20 crc kubenswrapper[4722]: I0309 15:26:20.202554 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_cd177746-866b-44d0-b46b-5b0f9b683a7b/ceilometer-notification-agent/0.log" Mar 09 15:26:20 crc kubenswrapper[4722]: I0309 15:26:20.759639 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_cd177746-866b-44d0-b46b-5b0f9b683a7b/sg-core/0.log" Mar 09 15:26:20 crc kubenswrapper[4722]: I0309 15:26:20.787677 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_cd177746-866b-44d0-b46b-5b0f9b683a7b/proxy-httpd/0.log" Mar 09 15:26:20 crc kubenswrapper[4722]: I0309 15:26:20.908563 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_dba3cf0e-d449-48f5-86c6-080681acf177/cinder-api/0.log" Mar 09 15:26:20 crc kubenswrapper[4722]: I0309 15:26:20.999495 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_dba3cf0e-d449-48f5-86c6-080681acf177/cinder-api-log/0.log" Mar 09 15:26:21 crc kubenswrapper[4722]: I0309 15:26:21.125843 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_4c227d2b-e035-426b-b1e1-5be3a4e06090/cinder-scheduler/1.log" Mar 09 15:26:21 crc kubenswrapper[4722]: I0309 15:26:21.158946 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_4c227d2b-e035-426b-b1e1-5be3a4e06090/cinder-scheduler/0.log" Mar 09 15:26:21 crc kubenswrapper[4722]: I0309 15:26:21.235434 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_4c227d2b-e035-426b-b1e1-5be3a4e06090/probe/0.log" Mar 09 15:26:21 crc kubenswrapper[4722]: I0309 15:26:21.389682 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-rtjd6_ecc0ba5d-ebce-4f74-b55c-284c3c6edc05/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 15:26:21 crc kubenswrapper[4722]: I0309 15:26:21.509754 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-rpdzb_1ece0886-09c8-4a9e-a309-5d538fefed94/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 15:26:21 crc kubenswrapper[4722]: I0309 15:26:21.528570 4722 patch_prober.go:28] interesting pod/machine-config-daemon-hjrrb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 15:26:21 crc kubenswrapper[4722]: I0309 15:26:21.528681 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 15:26:21 crc kubenswrapper[4722]: I0309 15:26:21.595313 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6f6df4f56c-8q8pp_868763d5-a256-477e-b82e-dd85f1e05dea/init/0.log" Mar 09 15:26:21 crc kubenswrapper[4722]: I0309 15:26:21.816571 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6f6df4f56c-8q8pp_868763d5-a256-477e-b82e-dd85f1e05dea/init/0.log" Mar 09 15:26:21 crc kubenswrapper[4722]: I0309 15:26:21.859302 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-gk84b_ed3d0a76-d1d6-4c0b-afb3-e9a74176d95f/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 15:26:21 crc kubenswrapper[4722]: I0309 15:26:21.990520 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6f6df4f56c-8q8pp_868763d5-a256-477e-b82e-dd85f1e05dea/dnsmasq-dns/0.log" Mar 09 15:26:22 crc kubenswrapper[4722]: I0309 15:26:22.372018 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_ff21e8aa-a39e-4355-a251-a86106a089c7/glance-log/0.log" Mar 09 15:26:22 crc kubenswrapper[4722]: I0309 15:26:22.391715 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_3cebe5cd-3f7d-4c97-beb5-5702d5ad9aae/glance-httpd/0.log" Mar 09 15:26:22 crc kubenswrapper[4722]: I0309 15:26:22.417519 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_3cebe5cd-3f7d-4c97-beb5-5702d5ad9aae/glance-log/0.log" Mar 09 15:26:22 crc kubenswrapper[4722]: I0309 15:26:22.447826 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_ff21e8aa-a39e-4355-a251-a86106a089c7/glance-httpd/0.log" Mar 09 15:26:23 crc kubenswrapper[4722]: I0309 15:26:23.022672 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-hmwmh_7e37df29-c0ce-40d1-bd50-50936c544bb0/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 15:26:23 crc kubenswrapper[4722]: I0309 15:26:23.117602 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-775d545446-nrcd2_b78f1ea8-1fc2-4469-966c-4568370bfae9/heat-engine/0.log" Mar 09 15:26:23 crc kubenswrapper[4722]: I0309 15:26:23.205487 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-867b85dcfc-rx6pp_b87d6ac1-cd7f-4e42-b8b3-7c70fdffda0d/heat-api/0.log" Mar 09 15:26:23 crc kubenswrapper[4722]: I0309 15:26:23.251412 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-6df8cdc54c-dd8gs_b4de9ab3-ed0f-4ad7-8fe6-e4e67aa9eabe/heat-cfnapi/0.log" Mar 09 15:26:23 crc kubenswrapper[4722]: I0309 15:26:23.298531 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-745rg_fb7d79c1-83d1-4e4a-aab1-7f9e069e6442/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 15:26:23 crc kubenswrapper[4722]: I0309 15:26:23.464593 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29551141-nnsmc_a88cb554-d13a-427d-959b-70271e698efe/keystone-cron/0.log" Mar 09 15:26:23 crc kubenswrapper[4722]: I0309 15:26:23.579802 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_652c419b-2a86-4c6f-ac7a-c2d7818ef55f/kube-state-metrics/1.log" Mar 09 15:26:23 crc kubenswrapper[4722]: I0309 15:26:23.901140 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_652c419b-2a86-4c6f-ac7a-c2d7818ef55f/kube-state-metrics/0.log" Mar 09 15:26:24 crc kubenswrapper[4722]: I0309 15:26:24.066457 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-wqbn2_b0bc3b99-5368-4287-8a9d-7b19b8b33e40/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 15:26:24 crc kubenswrapper[4722]: I0309 15:26:24.105961 4722 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_logging-edpm-deployment-openstack-edpm-ipam-vv7zp_be833819-c229-4d0f-b489-a733e1b26a68/logging-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 15:26:24 crc kubenswrapper[4722]: I0309 15:26:24.418357 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mysqld-exporter-0_5f04dc47-34bc-4124-b129-f0c643f73284/mysqld-exporter/0.log" Mar 09 15:26:24 crc kubenswrapper[4722]: I0309 15:26:24.759090 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6f9b466589-25vjm_36f6d192-80a4-427c-8869-643481617222/neutron-httpd/0.log" Mar 09 15:26:24 crc kubenswrapper[4722]: I0309 15:26:24.795645 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6f9b466589-25vjm_36f6d192-80a4-427c-8869-643481617222/neutron-api/0.log" Mar 09 15:26:24 crc kubenswrapper[4722]: I0309 15:26:24.977466 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-26wzs_38ca037e-8b49-4fa3-a8e2-edbfacecdaf5/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 15:26:25 crc kubenswrapper[4722]: I0309 15:26:25.634920 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_b915b224-7fbf-4ec6-be9a-7205dd818ed4/nova-api-log/0.log" Mar 09 15:26:25 crc kubenswrapper[4722]: I0309 15:26:25.777029 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_2b3ff5cc-27b8-4242-b213-41632f062f72/nova-cell0-conductor-conductor/0.log" Mar 09 15:26:26 crc kubenswrapper[4722]: I0309 15:26:26.143619 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_b915b224-7fbf-4ec6-be9a-7205dd818ed4/nova-api-api/0.log" Mar 09 15:26:26 crc kubenswrapper[4722]: I0309 15:26:26.178478 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_6b61a6cf-d8fb-40e9-ae4f-19441d83beed/nova-cell1-conductor-conductor/0.log" Mar 09 15:26:26 crc kubenswrapper[4722]: I0309 15:26:26.465947 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_3a4409e4-5fc9-4f5b-ad06-af09ea87f10b/nova-cell1-novncproxy-novncproxy/0.log" Mar 09 15:26:26 crc kubenswrapper[4722]: I0309 15:26:26.551706 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-kbmxh_d10fe7e3-dd24-40b8-a94c-63dce2cb64ca/nova-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 15:26:26 crc kubenswrapper[4722]: I0309 15:26:26.833378 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_f5a79b9f-1947-4307-bc83-cba88e1e00cf/nova-metadata-log/0.log" Mar 09 15:26:27 crc kubenswrapper[4722]: I0309 15:26:27.357684 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_bbc84d58-b38f-49b4-b80f-f3377b43d7a4/nova-scheduler-scheduler/0.log" Mar 09 15:26:27 crc kubenswrapper[4722]: I0309 15:26:27.534403 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_4159e308-3ccf-45d9-a97b-8133542007a8/mysql-bootstrap/0.log" Mar 09 15:26:27 crc kubenswrapper[4722]: I0309 15:26:27.791112 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_4159e308-3ccf-45d9-a97b-8133542007a8/mysql-bootstrap/0.log" Mar 09 15:26:27 crc kubenswrapper[4722]: I0309 15:26:27.839532 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-cell1-galera-0_4159e308-3ccf-45d9-a97b-8133542007a8/galera/1.log" Mar 09 15:26:28 crc kubenswrapper[4722]: I0309 15:26:28.035805 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_4159e308-3ccf-45d9-a97b-8133542007a8/galera/0.log" Mar 09 15:26:28 crc kubenswrapper[4722]: I0309 15:26:28.311424 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_a79aaedb-92ba-42fc-8cc7-9ecb007d2ac7/mysql-bootstrap/0.log" Mar 09 15:26:28 crc kubenswrapper[4722]: I0309 15:26:28.448106 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_a79aaedb-92ba-42fc-8cc7-9ecb007d2ac7/mysql-bootstrap/0.log" Mar 09 15:26:28 crc kubenswrapper[4722]: I0309 15:26:28.613241 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_a79aaedb-92ba-42fc-8cc7-9ecb007d2ac7/galera/1.log" Mar 09 15:26:28 crc kubenswrapper[4722]: I0309 15:26:28.710399 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_a79aaedb-92ba-42fc-8cc7-9ecb007d2ac7/galera/0.log" Mar 09 15:26:28 crc kubenswrapper[4722]: I0309 15:26:28.965174 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_363b3657-6ba6-40e5-a353-7e1440ce3d01/openstackclient/0.log" Mar 09 15:26:29 crc kubenswrapper[4722]: I0309 15:26:29.123451 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_f5a79b9f-1947-4307-bc83-cba88e1e00cf/nova-metadata-metadata/0.log" Mar 09 15:26:29 crc kubenswrapper[4722]: I0309 15:26:29.214597 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-b8gzx_32bc4279-b6a2-4846-801c-ddf3a01db8b2/ovn-controller/0.log" Mar 09 15:26:29 crc kubenswrapper[4722]: I0309 15:26:29.444895 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-mw6tx_f28b2af2-0f97-4160-9641-6771f3deb9d1/openstack-network-exporter/0.log" Mar 09 15:26:29 crc kubenswrapper[4722]: I0309 15:26:29.700133 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-k6ng6_d6228268-4d1f-464b-b733-a2f308211670/ovsdb-server-init/0.log" Mar 09 15:26:29 crc kubenswrapper[4722]: I0309 15:26:29.865288 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-k6ng6_d6228268-4d1f-464b-b733-a2f308211670/ovsdb-server-init/0.log" Mar 09 15:26:29 crc kubenswrapper[4722]: I0309 15:26:29.917552 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-k6ng6_d6228268-4d1f-464b-b733-a2f308211670/ovsdb-server/0.log" Mar 09 15:26:29 crc kubenswrapper[4722]: I0309 15:26:29.943532 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-k6ng6_d6228268-4d1f-464b-b733-a2f308211670/ovs-vswitchd/0.log" Mar 09 15:26:30 crc kubenswrapper[4722]: I0309 15:26:30.255011 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-c6hrn_6d67f5f0-866e-4de7-ba1a-9a1b4a4086e5/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 15:26:30 crc kubenswrapper[4722]: I0309 15:26:30.519320 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_6585120e-2007-43b3-a72d-4e80fb7ab2fb/ovn-northd/0.log" Mar 09 15:26:30 crc kubenswrapper[4722]: I0309 15:26:30.535315 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-northd-0_6585120e-2007-43b3-a72d-4e80fb7ab2fb/openstack-network-exporter/0.log" Mar 09 15:26:30 crc kubenswrapper[4722]: I0309 15:26:30.752250 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_3f2d1a87-0e77-4753-b87a-39b2b5f333a4/ovsdbserver-nb/0.log" Mar 09 15:26:30 crc kubenswrapper[4722]: I0309 15:26:30.808735 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_3f2d1a87-0e77-4753-b87a-39b2b5f333a4/openstack-network-exporter/0.log" Mar 09 15:26:30 crc kubenswrapper[4722]: I0309 15:26:30.958823 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-8495c95fc9-42qqz_1343ca6f-93ab-45e7-8887-261b10bb1e88/keystone-api/0.log" Mar 09 15:26:31 crc kubenswrapper[4722]: I0309 15:26:31.052787 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_9aed43b2-88da-4388-b0ce-77699c8f978c/ovsdbserver-sb/0.log" Mar 09 15:26:31 crc kubenswrapper[4722]: I0309 15:26:31.058426 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_9aed43b2-88da-4388-b0ce-77699c8f978c/openstack-network-exporter/0.log" Mar 09 15:26:31 crc kubenswrapper[4722]: I0309 15:26:31.317410 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7ccbf8d8bb-gjl7f_b46c2ebf-e484-41c4-9f12-392f46798dfd/placement-api/0.log" Mar 09 15:26:31 crc kubenswrapper[4722]: I0309 15:26:31.536606 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7ccbf8d8bb-gjl7f_b46c2ebf-e484-41c4-9f12-392f46798dfd/placement-log/0.log" Mar 09 15:26:31 crc kubenswrapper[4722]: I0309 15:26:31.616140 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_0e07fbab-4a47-4e59-aa72-f0a4521296af/init-config-reloader/0.log" Mar 09 15:26:31 crc kubenswrapper[4722]: I0309 15:26:31.791945 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_0e07fbab-4a47-4e59-aa72-f0a4521296af/config-reloader/0.log" Mar 09 15:26:31 crc kubenswrapper[4722]: I0309 15:26:31.899147 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_0e07fbab-4a47-4e59-aa72-f0a4521296af/init-config-reloader/0.log" Mar 09 15:26:31 crc kubenswrapper[4722]: I0309 15:26:31.931610 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_0e07fbab-4a47-4e59-aa72-f0a4521296af/thanos-sidecar/0.log" Mar 09 15:26:32 crc kubenswrapper[4722]: I0309 15:26:32.006682 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_0e07fbab-4a47-4e59-aa72-f0a4521296af/prometheus/0.log" Mar 09 15:26:32 crc kubenswrapper[4722]: I0309 15:26:32.101415 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_8b6ee542-26e6-4126-8566-a34f7621d104/setup-container/0.log" Mar 09 15:26:32 crc kubenswrapper[4722]: I0309 15:26:32.290082 4722 scope.go:117] "RemoveContainer" containerID="b96fc702ca0636d34a822bdccfc31ccc60982472d734b4e512abaaee940c9499" Mar 09 15:26:32 crc kubenswrapper[4722]: I0309 15:26:32.434002 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_8b6ee542-26e6-4126-8566-a34f7621d104/rabbitmq/0.log" Mar 09 15:26:32 crc kubenswrapper[4722]: I0309 15:26:32.514710 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_rabbitmq-server-0_49886118-4852-41ba-bbed-a946764f4649/setup-container/0.log" Mar 09 15:26:32 crc kubenswrapper[4722]: I0309 15:26:32.569892 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_8b6ee542-26e6-4126-8566-a34f7621d104/setup-container/0.log" Mar 09 15:26:33 crc kubenswrapper[4722]: I0309 15:26:33.008218 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-1_694c0bbd-9c21-4de1-b82b-e79aa32feb6b/setup-container/0.log" Mar 09 15:26:33 crc kubenswrapper[4722]: I0309 15:26:33.041297 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_49886118-4852-41ba-bbed-a946764f4649/setup-container/0.log" Mar 09 15:26:33 crc kubenswrapper[4722]: I0309 15:26:33.148680 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_49886118-4852-41ba-bbed-a946764f4649/rabbitmq/0.log" Mar 09 15:26:33 crc kubenswrapper[4722]: I0309 15:26:33.659069 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-1_694c0bbd-9c21-4de1-b82b-e79aa32feb6b/setup-container/0.log" Mar 09 15:26:33 crc kubenswrapper[4722]: I0309 15:26:33.723079 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-1_694c0bbd-9c21-4de1-b82b-e79aa32feb6b/rabbitmq/0.log" Mar 09 15:26:34 crc kubenswrapper[4722]: I0309 15:26:34.029261 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-2_fabf84f5-0f35-4400-b612-235235a21f3c/setup-container/0.log" Mar 09 15:26:34 crc kubenswrapper[4722]: I0309 15:26:34.276719 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-2_fabf84f5-0f35-4400-b612-235235a21f3c/setup-container/0.log" Mar 09 15:26:34 crc kubenswrapper[4722]: I0309 15:26:34.325171 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-2hn5t_cfb8fcd2-1bf9-44f5-a82c-6049951ea321/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 15:26:34 crc kubenswrapper[4722]: I0309 15:26:34.468514 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-2_fabf84f5-0f35-4400-b612-235235a21f3c/rabbitmq/0.log" Mar 09 15:26:34 crc kubenswrapper[4722]: I0309 15:26:34.541428 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-7k5vg_fd8102b7-59d6-4f93-8f71-43701a8b99ad/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 15:26:34 crc kubenswrapper[4722]: I0309 15:26:34.784246 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-rct52_5f8120f5-690a-4bb4-ba23-dead16d6946f/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 15:26:34 crc kubenswrapper[4722]: I0309 15:26:34.825614 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-7fht9_36900a74-39f3-4977-9151-b3b4bdc64554/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 15:26:35 crc kubenswrapper[4722]: I0309 15:26:35.078355 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-hj7ts_0104a7c4-89e8-4e4e-a184-d514fb780bb0/ssh-known-hosts-edpm-deployment/0.log" Mar 09 15:26:35 crc kubenswrapper[4722]: I0309 15:26:35.303825 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-proxy-77cd88d8c5-gml97_685a5733-d06e-4523-a35a-051db91eb0be/proxy-server/0.log" Mar 09 15:26:35 crc kubenswrapper[4722]: I0309 15:26:35.421964 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-vtpb7_cb0983d9-3f03-406f-a485-f89ba50341fc/swift-ring-rebalance/0.log" Mar 09 15:26:35 crc kubenswrapper[4722]: I0309 15:26:35.430485 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-77cd88d8c5-gml97_685a5733-d06e-4523-a35a-051db91eb0be/proxy-httpd/0.log" Mar 09 15:26:35 crc kubenswrapper[4722]: I0309 15:26:35.616651 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_7463e84f-f457-4409-9621-507d331e06b5/account-auditor/0.log" Mar 09 15:26:35 crc kubenswrapper[4722]: I0309 15:26:35.742863 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_7463e84f-f457-4409-9621-507d331e06b5/account-reaper/0.log" Mar 09 15:26:35 crc kubenswrapper[4722]: I0309 15:26:35.750428 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_7463e84f-f457-4409-9621-507d331e06b5/account-replicator/0.log" Mar 09 15:26:35 crc kubenswrapper[4722]: I0309 15:26:35.772867 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_7463e84f-f457-4409-9621-507d331e06b5/account-server/0.log" Mar 09 15:26:36 crc kubenswrapper[4722]: I0309 15:26:36.004338 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_7463e84f-f457-4409-9621-507d331e06b5/container-auditor/0.log" Mar 09 15:26:36 crc kubenswrapper[4722]: I0309 15:26:36.028430 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_7463e84f-f457-4409-9621-507d331e06b5/container-replicator/0.log" Mar 09 15:26:36 crc kubenswrapper[4722]: I0309 15:26:36.034227 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_7463e84f-f457-4409-9621-507d331e06b5/container-updater/0.log" Mar 09 15:26:36 crc kubenswrapper[4722]: I0309 15:26:36.064159 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_7463e84f-f457-4409-9621-507d331e06b5/container-server/0.log" Mar 09 15:26:36 crc kubenswrapper[4722]: I0309 15:26:36.214244 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_7463e84f-f457-4409-9621-507d331e06b5/object-expirer/0.log" Mar 09 15:26:36 crc kubenswrapper[4722]: I0309 15:26:36.287679 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_7463e84f-f457-4409-9621-507d331e06b5/object-auditor/0.log" Mar 09 15:26:36 crc kubenswrapper[4722]: I0309 15:26:36.304645 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_7463e84f-f457-4409-9621-507d331e06b5/object-replicator/0.log" Mar 09 15:26:36 crc kubenswrapper[4722]: I0309 15:26:36.408954 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_7463e84f-f457-4409-9621-507d331e06b5/object-server/0.log" Mar 09 15:26:36 crc kubenswrapper[4722]: I0309 15:26:36.531276 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_7463e84f-f457-4409-9621-507d331e06b5/rsync/0.log" Mar 09 15:26:36 crc kubenswrapper[4722]: I0309 15:26:36.551187 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_7463e84f-f457-4409-9621-507d331e06b5/object-updater/0.log" Mar 09 
15:26:36 crc kubenswrapper[4722]: I0309 15:26:36.611789 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_7463e84f-f457-4409-9621-507d331e06b5/swift-recon-cron/0.log" Mar 09 15:26:37 crc kubenswrapper[4722]: I0309 15:26:37.180410 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-7lzxs_713bd472-187b-47a0-9094-5ac6496d7830/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 15:26:37 crc kubenswrapper[4722]: I0309 15:26:37.242868 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-power-monitoring-edpm-deployment-openstack-edpm-9sbwv_6c87731f-8737-43d6-ba7a-e1427fc96fd4/telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 15:26:37 crc kubenswrapper[4722]: I0309 15:26:37.494830 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_1ed3bba7-c106-49a1-96d6-672710c534bf/test-operator-logs-container/0.log" Mar 09 15:26:37 crc kubenswrapper[4722]: I0309 15:26:37.708841 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-tf98x_0b760182-c6d3-4f80-8f18-89b16c3c480d/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 09 15:26:38 crc kubenswrapper[4722]: I0309 15:26:38.088443 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_89fd3e80-4c6c-4619-ad44-ef440c0b1fb6/tempest-tests-tempest-tests-runner/0.log" Mar 09 15:26:46 crc kubenswrapper[4722]: I0309 15:26:46.623942 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_0a6188a6-3f71-48b5-9013-66d297c205a7/memcached/0.log" Mar 09 15:26:51 crc kubenswrapper[4722]: I0309 15:26:51.528242 4722 patch_prober.go:28] interesting pod/machine-config-daemon-hjrrb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 15:26:51 crc kubenswrapper[4722]: I0309 15:26:51.528872 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 15:27:08 crc kubenswrapper[4722]: I0309 15:27:08.148158 4722 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 09 15:27:11 crc kubenswrapper[4722]: I0309 15:27:11.123498 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b46038f17be43086a4100f87067c3f77525f8955a044026c95f8e3646bc72km_fcc40a21-72a2-4ffc-9148-848cc22b9ada/util/0.log" Mar 09 15:27:11 crc kubenswrapper[4722]: I0309 15:27:11.352063 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b46038f17be43086a4100f87067c3f77525f8955a044026c95f8e3646bc72km_fcc40a21-72a2-4ffc-9148-848cc22b9ada/util/0.log" Mar 09 15:27:11 crc kubenswrapper[4722]: I0309 15:27:11.354358 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b46038f17be43086a4100f87067c3f77525f8955a044026c95f8e3646bc72km_fcc40a21-72a2-4ffc-9148-848cc22b9ada/pull/0.log" Mar 09 15:27:11 crc kubenswrapper[4722]: I0309 15:27:11.354624 4722 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_b46038f17be43086a4100f87067c3f77525f8955a044026c95f8e3646bc72km_fcc40a21-72a2-4ffc-9148-848cc22b9ada/pull/0.log" Mar 09 15:27:11 crc kubenswrapper[4722]: I0309 15:27:11.507184 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b46038f17be43086a4100f87067c3f77525f8955a044026c95f8e3646bc72km_fcc40a21-72a2-4ffc-9148-848cc22b9ada/util/0.log" Mar 09 15:27:11 crc kubenswrapper[4722]: I0309 15:27:11.526048 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b46038f17be43086a4100f87067c3f77525f8955a044026c95f8e3646bc72km_fcc40a21-72a2-4ffc-9148-848cc22b9ada/pull/0.log" Mar 09 15:27:11 crc kubenswrapper[4722]: I0309 15:27:11.619101 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b46038f17be43086a4100f87067c3f77525f8955a044026c95f8e3646bc72km_fcc40a21-72a2-4ffc-9148-848cc22b9ada/extract/0.log" Mar 09 15:27:12 crc kubenswrapper[4722]: I0309 15:27:12.549015 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-5d87c9d997-bmpgd_a1a5e35a-83f6-4886-86db-55738f51f7e8/manager/0.log" Mar 09 15:27:12 crc kubenswrapper[4722]: I0309 15:27:12.767314 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-64db6967f8-zjf7b_22043c71-5292-422c-99e5-c88ea1aef638/manager/1.log" Mar 09 15:27:12 crc kubenswrapper[4722]: I0309 15:27:12.902823 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-64db6967f8-zjf7b_22043c71-5292-422c-99e5-c88ea1aef638/manager/0.log" Mar 09 15:27:13 crc kubenswrapper[4722]: I0309 15:27:13.187423 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-cf99c678f-ct7x8_f21c35ef-c8ea-4331-a747-44a62c6f2e74/manager/1.log" Mar 09 15:27:13 crc kubenswrapper[4722]: I0309 15:27:13.432036 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-cf99c678f-ct7x8_f21c35ef-c8ea-4331-a747-44a62c6f2e74/manager/0.log" Mar 09 15:27:13 crc kubenswrapper[4722]: I0309 15:27:13.676236 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-78bc7f9bd9-6npqv_663f1719-30f7-4588-a183-4a59787e8d8d/manager/0.log" Mar 09 15:27:13 crc kubenswrapper[4722]: I0309 15:27:13.968161 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-f7fcc58b9-lvfgg_7a62b98d-e9d4-4cbc-bea8-0da13fcc4467/manager/1.log" Mar 09 15:27:14 crc kubenswrapper[4722]: I0309 15:27:14.371333 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-55d77d7b5c-rsd9l_edd71e1d-6ff0-4918-9cd8-a342efba2df5/manager/0.log" Mar 09 15:27:14 crc kubenswrapper[4722]: I0309 15:27:14.609493 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-545456dc4-hnzfw_ec9f1f5e-26f5-4683-bf41-c85981da9d18/manager/1.log" Mar 09 15:27:14 crc kubenswrapper[4722]: I0309 15:27:14.872305 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-545456dc4-hnzfw_ec9f1f5e-26f5-4683-bf41-c85981da9d18/manager/0.log" Mar 09 15:27:15 crc kubenswrapper[4722]: I0309 15:27:15.137179 4722 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-f7fcc58b9-lvfgg_7a62b98d-e9d4-4cbc-bea8-0da13fcc4467/manager/0.log" Mar 09 15:27:15 crc kubenswrapper[4722]: I0309 15:27:15.195280 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7c789f89c6-2hxzr_4de6db14-6f3e-4c4e-a61d-39c6648209dd/manager/1.log" Mar 09 15:27:15 crc kubenswrapper[4722]: I0309 15:27:15.245116 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7c789f89c6-2hxzr_4de6db14-6f3e-4c4e-a61d-39c6648209dd/manager/0.log" Mar 09 15:27:15 crc kubenswrapper[4722]: I0309 15:27:15.470797 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-7b6bfb6475-l8bds_74cb981b-ce89-479e-8573-fdda25190637/manager/1.log" Mar 09 15:27:15 crc kubenswrapper[4722]: I0309 15:27:15.517045 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-67d996989d-28855_dae536b6-7a22-435e-b307-a8ab6b54779d/manager/0.log" Mar 09 15:27:15 crc kubenswrapper[4722]: I0309 15:27:15.767343 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-7b6bfb6475-l8bds_74cb981b-ce89-479e-8573-fdda25190637/manager/0.log" Mar 09 15:27:15 crc kubenswrapper[4722]: I0309 15:27:15.786654 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-54688575f-5fcw8_febfeb1a-d5a3-46b8-bc4f-fe3266905e8c/manager/0.log" Mar 09 15:27:16 crc kubenswrapper[4722]: I0309 15:27:16.008440 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5d86c7ddb7-n5zc7_717ffc3a-7a6d-4a7c-837f-d1ed92489b68/manager/0.log" Mar 09 15:27:16 crc kubenswrapper[4722]: I0309 15:27:16.008614 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5d86c7ddb7-n5zc7_717ffc3a-7a6d-4a7c-837f-d1ed92489b68/manager/1.log" Mar 09 15:27:16 crc kubenswrapper[4722]: I0309 15:27:16.100017 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-74b6b5dc96-jwgfg_a9ff56ca-00a6-484f-a477-0dca4f3a0f5c/manager/0.log" Mar 09 15:27:16 crc kubenswrapper[4722]: I0309 15:27:16.238503 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-7c6767dc9ctn5qw_5e25c11b-f9c6-4542-9c0c-394ea6bc2c17/manager/0.log" Mar 09 15:27:16 crc kubenswrapper[4722]: I0309 15:27:16.248160 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-7c6767dc9ctn5qw_5e25c11b-f9c6-4542-9c0c-394ea6bc2c17/manager/1.log" Mar 09 15:27:16 crc kubenswrapper[4722]: I0309 15:27:16.792090 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-55cd86c56-dm2dr_98c22319-d5f8-4a0b-8a30-89b9d832f354/manager/1.log" Mar 09 15:27:16 crc kubenswrapper[4722]: I0309 15:27:16.831256 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-5b979cff56-vwbnz_bdac45ca-36d4-41c5-b5e5-332d70558171/operator/0.log" Mar 09 15:27:17 crc kubenswrapper[4722]: I0309 15:27:17.041559 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-operator-index-7qbrd_8ac36d47-4501-4033-aee7-ce9ed8ed7002/registry-server/1.log" Mar 09 15:27:17 crc kubenswrapper[4722]: I0309 15:27:17.088363 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-7qbrd_8ac36d47-4501-4033-aee7-ce9ed8ed7002/registry-server/0.log" Mar 09 15:27:17 crc kubenswrapper[4722]: I0309 15:27:17.263715 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-75684d597f-hgkzs_df8b52ff-f61e-4aca-a408-240590699ae6/manager/1.log" Mar 09 15:27:17 crc kubenswrapper[4722]: I0309 15:27:17.427078 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-75684d597f-hgkzs_df8b52ff-f61e-4aca-a408-240590699ae6/manager/0.log" Mar 09 15:27:17 crc kubenswrapper[4722]: I0309 15:27:17.560470 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-648564c9fc-wrzrq_e7b4c7c9-7c4f-4a13-8367-759f5f5ce368/manager/0.log" Mar 09 15:27:18 crc kubenswrapper[4722]: I0309 15:27:18.008360 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-xswl4_f9ff9b26-9d5a-4194-bab5-1b9fb5dee947/operator/1.log" Mar 09 15:27:18 crc kubenswrapper[4722]: I0309 15:27:18.050661 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-xswl4_f9ff9b26-9d5a-4194-bab5-1b9fb5dee947/operator/0.log" Mar 09 15:27:18 crc kubenswrapper[4722]: I0309 15:27:18.310657 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-9b9ff9f4d-56qz9_a9df5689-5d83-4206-be2b-cf6877d70e23/manager/1.log" Mar 09 15:27:18 crc kubenswrapper[4722]: I0309 15:27:18.351429 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-9b9ff9f4d-56qz9_a9df5689-5d83-4206-be2b-cf6877d70e23/manager/0.log" Mar 09 15:27:18 crc kubenswrapper[4722]: I0309 15:27:18.688143 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-66c8b7dfbb-m7fv2_5bf14ad6-64cf-48f7-99e6-fabac12849e2/manager/1.log" Mar 09 15:27:19 crc kubenswrapper[4722]: I0309 15:27:19.000351 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-55b5ff4dbb-pg8qn_ef36bc5a-2962-4c1e-a5fd-98f61d525d5d/manager/1.log" Mar 09 15:27:19 crc kubenswrapper[4722]: I0309 15:27:19.148009 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-55b5ff4dbb-pg8qn_ef36bc5a-2962-4c1e-a5fd-98f61d525d5d/manager/0.log" Mar 09 15:27:19 crc kubenswrapper[4722]: I0309 15:27:19.356438 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-66c8b7dfbb-m7fv2_5bf14ad6-64cf-48f7-99e6-fabac12849e2/manager/0.log" Mar 09 15:27:19 crc kubenswrapper[4722]: I0309 15:27:19.628792 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-55cd86c56-dm2dr_98c22319-d5f8-4a0b-8a30-89b9d832f354/manager/0.log" Mar 09 15:27:19 crc kubenswrapper[4722]: I0309 15:27:19.649725 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-bccc79885-pkbqb_0eac7341-5bab-4c97-a730-b7eeb0a75899/manager/0.log" Mar 09 15:27:21 crc kubenswrapper[4722]: I0309 15:27:21.527768 4722 patch_prober.go:28] interesting pod/machine-config-daemon-hjrrb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 15:27:21 crc kubenswrapper[4722]: I0309 15:27:21.528072 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 15:27:21 crc kubenswrapper[4722]: I0309 15:27:21.528116 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" Mar 09 15:27:21 crc kubenswrapper[4722]: I0309 15:27:21.529123 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"05d0cab124a66072241bc2a93dace83012b26d6bbd0de5507999fb5df76700cb"} pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 15:27:21 crc kubenswrapper[4722]: I0309 15:27:21.529176 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerName="machine-config-daemon" containerID="cri-o://05d0cab124a66072241bc2a93dace83012b26d6bbd0de5507999fb5df76700cb" gracePeriod=600 Mar 09 15:27:21 crc kubenswrapper[4722]: E0309 15:27:21.662334 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 15:27:21 crc kubenswrapper[4722]: I0309 15:27:21.804994 4722 generic.go:334] "Generic (PLEG): container finished" podID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerID="05d0cab124a66072241bc2a93dace83012b26d6bbd0de5507999fb5df76700cb" exitCode=0 Mar 09 15:27:21 crc kubenswrapper[4722]: I0309 15:27:21.805052 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" event={"ID":"dac2aaf5-653b-4b2a-8efe-ed26bac8d648","Type":"ContainerDied","Data":"05d0cab124a66072241bc2a93dace83012b26d6bbd0de5507999fb5df76700cb"} Mar 09 15:27:21 crc kubenswrapper[4722]: I0309 15:27:21.805099 4722 scope.go:117] "RemoveContainer" containerID="f204c1f7d34e99a1176cb155859ff24baa5114f3011037f9d600a75f90dbf8fc" Mar 09 15:27:21 crc kubenswrapper[4722]: I0309 15:27:21.806130 4722 scope.go:117] "RemoveContainer" containerID="05d0cab124a66072241bc2a93dace83012b26d6bbd0de5507999fb5df76700cb" Mar 09 15:27:21 crc kubenswrapper[4722]: E0309 15:27:21.806602 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 15:27:25 crc kubenswrapper[4722]: I0309 15:27:25.228911 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6db6876945-vppnv_8f839106-1673-4589-9391-0cd7748e658c/manager/0.log" Mar 09 15:27:35 crc kubenswrapper[4722]: I0309 15:27:35.149345 4722 scope.go:117] "RemoveContainer" containerID="05d0cab124a66072241bc2a93dace83012b26d6bbd0de5507999fb5df76700cb" Mar 09 15:27:35 crc kubenswrapper[4722]: E0309 15:27:35.150126 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 15:27:46 crc kubenswrapper[4722]: I0309 15:27:46.074152 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-2xkrn_9fb5d9ef-bb16-45d0-a7c0-a9fb43edeb34/control-plane-machine-set-operator/0.log" Mar 09 15:27:46 crc kubenswrapper[4722]: I0309 15:27:46.165365 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-xxz9x_8a0be9cc-2cda-4b22-838b-0036cfa4405c/kube-rbac-proxy/0.log" Mar 09 15:27:46 crc kubenswrapper[4722]: I0309 15:27:46.297917 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-xxz9x_8a0be9cc-2cda-4b22-838b-0036cfa4405c/machine-api-operator/0.log" Mar 09 15:27:48 crc kubenswrapper[4722]: I0309 15:27:48.153359 4722 scope.go:117] "RemoveContainer" containerID="05d0cab124a66072241bc2a93dace83012b26d6bbd0de5507999fb5df76700cb" Mar 09 15:27:48 crc kubenswrapper[4722]: E0309 15:27:48.163505 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 15:28:00 crc kubenswrapper[4722]: I0309 15:28:00.162728 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551168-cxpl9"] Mar 09 15:28:00 crc kubenswrapper[4722]: E0309 15:28:00.164013 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca84248b-e4d2-4c20-a7e6-4c41d64cbb7e" containerName="oc" Mar 09 15:28:00 crc kubenswrapper[4722]: I0309 15:28:00.164032 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca84248b-e4d2-4c20-a7e6-4c41d64cbb7e" containerName="oc" Mar 09 15:28:00 crc kubenswrapper[4722]: I0309 15:28:00.164436 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca84248b-e4d2-4c20-a7e6-4c41d64cbb7e" containerName="oc" Mar 09 15:28:00 crc kubenswrapper[4722]: I0309 15:28:00.165560 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551168-cxpl9" Mar 09 15:28:00 crc kubenswrapper[4722]: I0309 15:28:00.167919 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 15:28:00 crc kubenswrapper[4722]: I0309 15:28:00.168104 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 15:28:00 crc kubenswrapper[4722]: I0309 15:28:00.168691 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-dr2x6" Mar 09 15:28:00 crc kubenswrapper[4722]: I0309 15:28:00.197887 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551168-cxpl9"] Mar 09 15:28:00 crc kubenswrapper[4722]: I0309 15:28:00.297320 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7df6h\" (UniqueName: \"kubernetes.io/projected/f65d5c88-8e7d-4c6e-a14d-b838623e152a-kube-api-access-7df6h\") pod \"auto-csr-approver-29551168-cxpl9\" (UID: \"f65d5c88-8e7d-4c6e-a14d-b838623e152a\") " pod="openshift-infra/auto-csr-approver-29551168-cxpl9" Mar 09 15:28:00 crc kubenswrapper[4722]: I0309 15:28:00.399995 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7df6h\" (UniqueName: \"kubernetes.io/projected/f65d5c88-8e7d-4c6e-a14d-b838623e152a-kube-api-access-7df6h\") pod \"auto-csr-approver-29551168-cxpl9\" (UID: \"f65d5c88-8e7d-4c6e-a14d-b838623e152a\") " pod="openshift-infra/auto-csr-approver-29551168-cxpl9" Mar 09 15:28:00 crc kubenswrapper[4722]: I0309 15:28:00.444730 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7df6h\" (UniqueName: \"kubernetes.io/projected/f65d5c88-8e7d-4c6e-a14d-b838623e152a-kube-api-access-7df6h\") pod \"auto-csr-approver-29551168-cxpl9\" (UID: \"f65d5c88-8e7d-4c6e-a14d-b838623e152a\") " pod="openshift-infra/auto-csr-approver-29551168-cxpl9" Mar 09 15:28:00 crc kubenswrapper[4722]: I0309 15:28:00.492038 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551168-cxpl9" Mar 09 15:28:01 crc kubenswrapper[4722]: I0309 15:28:01.097941 4722 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 15:28:01 crc kubenswrapper[4722]: I0309 15:28:01.099463 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551168-cxpl9"] Mar 09 15:28:01 crc kubenswrapper[4722]: I0309 15:28:01.308357 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551168-cxpl9" event={"ID":"f65d5c88-8e7d-4c6e-a14d-b838623e152a","Type":"ContainerStarted","Data":"061342d230629b01a1ab71df9946da1bdfe6f6898662bef00a710f7c271c902f"} Mar 09 15:28:01 crc kubenswrapper[4722]: I0309 15:28:01.624132 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-rrczb_344178ce-f6d3-47f4-ab3c-69c394e2f677/cert-manager-controller/0.log" Mar 09 15:28:01 crc kubenswrapper[4722]: I0309 15:28:01.871102 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-kxd7l_5548bcb9-3490-4e2b-982f-adc9ff86db62/cert-manager-cainjector/0.log" Mar 09 15:28:01 crc kubenswrapper[4722]: I0309 15:28:01.999396 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-wb6rj_ac6d6e52-6a89-4f96-8894-8ed2c71cdcbc/cert-manager-webhook/0.log" Mar 09 15:28:03 crc kubenswrapper[4722]: I0309 15:28:03.149617 4722 scope.go:117] "RemoveContainer" containerID="05d0cab124a66072241bc2a93dace83012b26d6bbd0de5507999fb5df76700cb" Mar 09 15:28:03 crc kubenswrapper[4722]: E0309 15:28:03.150333 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 15:28:03 crc kubenswrapper[4722]: I0309 15:28:03.337373 4722 generic.go:334] "Generic (PLEG): container finished" podID="f65d5c88-8e7d-4c6e-a14d-b838623e152a" containerID="14fee07583b34b908693441432e9847512fa63e284d6b01fe43f82e351ad444b" exitCode=0 Mar 09 15:28:03 crc kubenswrapper[4722]: I0309 15:28:03.337472 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551168-cxpl9" event={"ID":"f65d5c88-8e7d-4c6e-a14d-b838623e152a","Type":"ContainerDied","Data":"14fee07583b34b908693441432e9847512fa63e284d6b01fe43f82e351ad444b"} Mar 09 15:28:04 crc kubenswrapper[4722]: I0309 15:28:04.815762 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551168-cxpl9" Mar 09 15:28:04 crc kubenswrapper[4722]: I0309 15:28:04.917612 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7df6h\" (UniqueName: \"kubernetes.io/projected/f65d5c88-8e7d-4c6e-a14d-b838623e152a-kube-api-access-7df6h\") pod \"f65d5c88-8e7d-4c6e-a14d-b838623e152a\" (UID: \"f65d5c88-8e7d-4c6e-a14d-b838623e152a\") " Mar 09 15:28:04 crc kubenswrapper[4722]: I0309 15:28:04.925277 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f65d5c88-8e7d-4c6e-a14d-b838623e152a-kube-api-access-7df6h" (OuterVolumeSpecName: "kube-api-access-7df6h") pod "f65d5c88-8e7d-4c6e-a14d-b838623e152a" (UID: "f65d5c88-8e7d-4c6e-a14d-b838623e152a"). InnerVolumeSpecName "kube-api-access-7df6h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 15:28:05 crc kubenswrapper[4722]: I0309 15:28:05.020570 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7df6h\" (UniqueName: \"kubernetes.io/projected/f65d5c88-8e7d-4c6e-a14d-b838623e152a-kube-api-access-7df6h\") on node \"crc\" DevicePath \"\"" Mar 09 15:28:05 crc kubenswrapper[4722]: I0309 15:28:05.363708 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551168-cxpl9" event={"ID":"f65d5c88-8e7d-4c6e-a14d-b838623e152a","Type":"ContainerDied","Data":"061342d230629b01a1ab71df9946da1bdfe6f6898662bef00a710f7c271c902f"} Mar 09 15:28:05 crc kubenswrapper[4722]: I0309 15:28:05.363964 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="061342d230629b01a1ab71df9946da1bdfe6f6898662bef00a710f7c271c902f" Mar 09 15:28:05 crc kubenswrapper[4722]: I0309 15:28:05.364074 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551168-cxpl9" Mar 09 15:28:05 crc kubenswrapper[4722]: I0309 15:28:05.899417 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551162-kb65f"] Mar 09 15:28:05 crc kubenswrapper[4722]: I0309 15:28:05.911022 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551162-kb65f"] Mar 09 15:28:06 crc kubenswrapper[4722]: I0309 15:28:06.161916 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c42a923-6da3-4c85-b007-7d5937445ac9" path="/var/lib/kubelet/pods/4c42a923-6da3-4c85-b007-7d5937445ac9/volumes" Mar 09 15:28:14 crc kubenswrapper[4722]: I0309 15:28:14.154671 4722 scope.go:117] "RemoveContainer" containerID="05d0cab124a66072241bc2a93dace83012b26d6bbd0de5507999fb5df76700cb" Mar 09 15:28:14 crc kubenswrapper[4722]: E0309 15:28:14.155697 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 15:28:15 crc kubenswrapper[4722]: I0309 15:28:15.148619 4722 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 09 15:28:19 crc kubenswrapper[4722]: I0309 15:28:19.011121 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5dcbbd79cf-vnpzr_6a030cdd-139c-4d44-bf79-43f14e97d9f9/nmstate-console-plugin/0.log" Mar 09 15:28:19 crc kubenswrapper[4722]: I0309 15:28:19.375906 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-67pzj_ade25a79-1e43-41ed-be91-ce97aa1c4103/nmstate-handler/0.log" Mar 09 15:28:19 crc kubenswrapper[4722]: I0309 15:28:19.417516 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-zvlr9_6f9a1c28-1ff4-4ead-91d8-5dc47c7b4d55/kube-rbac-proxy/0.log" Mar 09 15:28:19 crc kubenswrapper[4722]: I0309 15:28:19.493875 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-zvlr9_6f9a1c28-1ff4-4ead-91d8-5dc47c7b4d55/nmstate-metrics/0.log" Mar 09 15:28:19 crc kubenswrapper[4722]: I0309 15:28:19.580496 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-75c5dccd6c-svlc5_0556dc80-2dcf-4f82-8b4c-96198af98e00/nmstate-operator/0.log" Mar 09 15:28:19 crc kubenswrapper[4722]: I0309 15:28:19.705166 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-786f45cff4-jdp69_c0af161b-a8d5-4a36-b1c2-0a4d43820c73/nmstate-webhook/0.log" Mar 09 15:28:28 crc kubenswrapper[4722]: I0309 15:28:28.151071 4722 scope.go:117] "RemoveContainer" containerID="05d0cab124a66072241bc2a93dace83012b26d6bbd0de5507999fb5df76700cb" Mar 09 15:28:28 crc kubenswrapper[4722]: E0309 15:28:28.152447 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" 
podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 15:28:32 crc kubenswrapper[4722]: I0309 15:28:32.467709 4722 scope.go:117] "RemoveContainer" containerID="aedc98873afd42adba59fcbfbde977e28b5645f86910438a6f263830d1b51d20" Mar 09 15:28:32 crc kubenswrapper[4722]: I0309 15:28:32.707067 4722 scope.go:117] "RemoveContainer" containerID="7af62118ef4254fa690052f139b54e3cf888fc27b5dbc5fb3e6784c20f0e26ec" Mar 09 15:28:32 crc kubenswrapper[4722]: I0309 15:28:32.786536 4722 scope.go:117] "RemoveContainer" containerID="e17863f074bb023e7a57bc6dafe37da55973e0ac7840f2c4c87c71bc2b40aca2" Mar 09 15:28:36 crc kubenswrapper[4722]: I0309 15:28:36.809071 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-7d6d6698bd-4r85k_497a07fc-9649-4620-9432-855aa3fdc327/manager/1.log" Mar 09 15:28:36 crc kubenswrapper[4722]: I0309 15:28:36.884688 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-7d6d6698bd-4r85k_497a07fc-9649-4620-9432-855aa3fdc327/kube-rbac-proxy/0.log" Mar 09 15:28:37 crc kubenswrapper[4722]: I0309 15:28:37.076510 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-7d6d6698bd-4r85k_497a07fc-9649-4620-9432-855aa3fdc327/manager/0.log" Mar 09 15:28:41 crc kubenswrapper[4722]: I0309 15:28:41.150053 4722 scope.go:117] "RemoveContainer" containerID="05d0cab124a66072241bc2a93dace83012b26d6bbd0de5507999fb5df76700cb" Mar 09 15:28:41 crc kubenswrapper[4722]: E0309 15:28:41.152200 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 15:28:52 crc kubenswrapper[4722]: I0309 15:28:52.602878 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-pw2x9_f612329a-8162-4440-aae8-a5467e713976/prometheus-operator/0.log" Mar 09 15:28:52 crc kubenswrapper[4722]: I0309 15:28:52.853930 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-5d974998df-p2w5j_a58a5d27-2898-4346-b0c0-08507cf2eb44/prometheus-operator-admission-webhook/0.log" Mar 09 15:28:52 crc kubenswrapper[4722]: I0309 15:28:52.893765 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-5d974998df-jzrsl_85953f91-dda3-42f4-b308-7b553054dad6/prometheus-operator-admission-webhook/0.log" Mar 09 15:28:53 crc kubenswrapper[4722]: I0309 15:28:53.095514 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-lc5zn_14655c3d-02fe-4215-b566-0c4008fd34a0/operator/1.log" Mar 09 15:28:53 crc kubenswrapper[4722]: I0309 15:28:53.112782 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-lc5zn_14655c3d-02fe-4215-b566-0c4008fd34a0/operator/0.log" Mar 09 15:28:53 crc kubenswrapper[4722]: I0309 15:28:53.198663 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_observability-ui-dashboards-66cbf594b5-bzqhp_0d15a083-af10-4638-b6bc-9de1f89f123f/observability-ui-dashboards/0.log" Mar 09 15:28:53 crc kubenswrapper[4722]: I0309 15:28:53.322238 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-2rhld_c123a767-e0e0-4432-b34f-cbe0b581d938/perses-operator/0.log" Mar 09 15:28:54 crc kubenswrapper[4722]: I0309 15:28:54.149614 4722 scope.go:117] "RemoveContainer" containerID="05d0cab124a66072241bc2a93dace83012b26d6bbd0de5507999fb5df76700cb" Mar 09 15:28:54 crc kubenswrapper[4722]: E0309 15:28:54.150250 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 15:29:09 crc kubenswrapper[4722]: I0309 15:29:09.149333 4722 scope.go:117] "RemoveContainer" containerID="05d0cab124a66072241bc2a93dace83012b26d6bbd0de5507999fb5df76700cb" Mar 09 15:29:09 crc kubenswrapper[4722]: E0309 15:29:09.150247 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 15:29:12 crc kubenswrapper[4722]: I0309 15:29:12.624717 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_cluster-logging-operator-c769fd969-gjzjg_b08e9124-838c-47b3-9452-62c3388a66e0/cluster-logging-operator/0.log" Mar 09 15:29:12 crc kubenswrapper[4722]: I0309 15:29:12.843653 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_collector-rj4pp_b7932da8-d764-41c7-b8ac-038cc75e50fd/collector/0.log" Mar 09 15:29:12 crc kubenswrapper[4722]: I0309 15:29:12.849182 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-compactor-0_75ed49e3-dc17-45c0-96ec-1db69670395b/loki-compactor/0.log" Mar 09 15:29:13 crc kubenswrapper[4722]: I0309 15:29:13.526534 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-distributor-5d5548c9f5-r6x4b_822bc43f-dfed-4440-be35-1bf58f50456b/loki-distributor/0.log" Mar 09 15:29:13 crc kubenswrapper[4722]: I0309 15:29:13.588177 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-6c5ff86c56-dvps5_e265fe14-7154-4fbb-a7c3-33557166f71d/gateway/0.log" Mar 09 15:29:13 crc kubenswrapper[4722]: I0309 15:29:13.773634 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-6c5ff86c56-dvps5_e265fe14-7154-4fbb-a7c3-33557166f71d/opa/0.log" Mar 09 15:29:13 crc kubenswrapper[4722]: I0309 15:29:13.806575 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-6c5ff86c56-n5jdr_8becd072-3095-4717-a83d-e56cf0d0f816/opa/0.log" Mar 09 15:29:13 crc kubenswrapper[4722]: I0309 15:29:13.847708 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-logging_logging-loki-gateway-6c5ff86c56-n5jdr_8becd072-3095-4717-a83d-e56cf0d0f816/gateway/0.log" Mar 09 15:29:13 crc kubenswrapper[4722]: I0309 15:29:13.997502 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-index-gateway-0_bce49c11-10b4-4c30-a1a4-16cf32cb42fd/loki-index-gateway/0.log" Mar 09 15:29:14 crc kubenswrapper[4722]: I0309 15:29:14.107118 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-ingester-0_6a23db8b-8a30-47b8-bf39-6f193899fcee/loki-ingester/0.log" Mar 09 15:29:14 crc kubenswrapper[4722]: I0309 15:29:14.304269 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-querier-76bf7b6d45-fv4dh_ba829d53-02a8-4003-a5ee-b9b36d8404e3/loki-querier/0.log" Mar 09 15:29:14 crc kubenswrapper[4722]: I0309 15:29:14.380360 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-query-frontend-6d6859c548-5b72h_5ccc948e-2185-44fd-90c4-3ae3228f6224/loki-query-frontend/0.log" Mar 09 15:29:22 crc kubenswrapper[4722]: I0309 15:29:22.149321 4722 scope.go:117] "RemoveContainer" containerID="05d0cab124a66072241bc2a93dace83012b26d6bbd0de5507999fb5df76700cb" Mar 09 15:29:22 crc kubenswrapper[4722]: E0309 15:29:22.150374 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 15:29:24 crc kubenswrapper[4722]: I0309 15:29:24.148381 4722 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 09 15:29:33 crc kubenswrapper[4722]: I0309 15:29:33.191868 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-6w5ww_0b264ba4-1398-4d3e-bb2b-83cd34a0ccf6/controller/1.log" Mar 09 15:29:33 crc kubenswrapper[4722]: I0309 15:29:33.392184 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-6w5ww_0b264ba4-1398-4d3e-bb2b-83cd34a0ccf6/controller/0.log" Mar 09 15:29:34 crc kubenswrapper[4722]: I0309 15:29:34.141440 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-6w5ww_0b264ba4-1398-4d3e-bb2b-83cd34a0ccf6/kube-rbac-proxy/0.log" Mar 09 15:29:34 crc kubenswrapper[4722]: I0309 15:29:34.179831 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6vn96_29ed2858-4fd0-4817-8ed3-b3515ac035d7/cp-frr-files/0.log" Mar 09 15:29:34 crc kubenswrapper[4722]: I0309 15:29:34.377696 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6vn96_29ed2858-4fd0-4817-8ed3-b3515ac035d7/cp-frr-files/0.log" Mar 09 15:29:34 crc kubenswrapper[4722]: I0309 15:29:34.438040 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6vn96_29ed2858-4fd0-4817-8ed3-b3515ac035d7/cp-metrics/0.log" Mar 09 15:29:34 crc kubenswrapper[4722]: I0309 15:29:34.438412 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6vn96_29ed2858-4fd0-4817-8ed3-b3515ac035d7/cp-reloader/0.log" Mar 09 15:29:34 crc kubenswrapper[4722]: I0309 15:29:34.447583 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-6vn96_29ed2858-4fd0-4817-8ed3-b3515ac035d7/cp-reloader/0.log" Mar 09 15:29:34 crc kubenswrapper[4722]: I0309 15:29:34.640035 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6vn96_29ed2858-4fd0-4817-8ed3-b3515ac035d7/cp-reloader/0.log" Mar 09 15:29:34 crc kubenswrapper[4722]: I0309 15:29:34.648569 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6vn96_29ed2858-4fd0-4817-8ed3-b3515ac035d7/cp-frr-files/0.log" Mar 09 15:29:34 crc kubenswrapper[4722]: I0309 15:29:34.667549 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6vn96_29ed2858-4fd0-4817-8ed3-b3515ac035d7/cp-metrics/0.log" Mar 09 15:29:34 crc kubenswrapper[4722]: I0309 15:29:34.695900 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6vn96_29ed2858-4fd0-4817-8ed3-b3515ac035d7/cp-metrics/0.log" Mar 09 15:29:34 crc kubenswrapper[4722]: I0309 15:29:34.932780 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6vn96_29ed2858-4fd0-4817-8ed3-b3515ac035d7/cp-reloader/0.log" Mar 09 15:29:34 crc kubenswrapper[4722]: I0309 15:29:34.933849 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6vn96_29ed2858-4fd0-4817-8ed3-b3515ac035d7/cp-metrics/0.log" Mar 09 15:29:34 crc kubenswrapper[4722]: I0309 15:29:34.936043 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6vn96_29ed2858-4fd0-4817-8ed3-b3515ac035d7/cp-frr-files/0.log" Mar 09 15:29:34 crc kubenswrapper[4722]: I0309 15:29:34.949354 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6vn96_29ed2858-4fd0-4817-8ed3-b3515ac035d7/controller/1.log" Mar 09 15:29:35 crc kubenswrapper[4722]: I0309 15:29:35.128856 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6vn96_29ed2858-4fd0-4817-8ed3-b3515ac035d7/controller/0.log" Mar 09 15:29:35 crc kubenswrapper[4722]: I0309 15:29:35.149060 4722 scope.go:117] "RemoveContainer" containerID="05d0cab124a66072241bc2a93dace83012b26d6bbd0de5507999fb5df76700cb" Mar 09 15:29:35 crc kubenswrapper[4722]: E0309 15:29:35.149419 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 15:29:35 crc kubenswrapper[4722]: I0309 15:29:35.176143 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6vn96_29ed2858-4fd0-4817-8ed3-b3515ac035d7/frr-metrics/0.log" Mar 09 15:29:35 crc kubenswrapper[4722]: I0309 15:29:35.215938 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6vn96_29ed2858-4fd0-4817-8ed3-b3515ac035d7/frr/1.log" Mar 09 15:29:35 crc kubenswrapper[4722]: I0309 15:29:35.328710 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6vn96_29ed2858-4fd0-4817-8ed3-b3515ac035d7/kube-rbac-proxy/0.log" Mar 09 15:29:36 crc kubenswrapper[4722]: I0309 15:29:36.069161 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6vn96_29ed2858-4fd0-4817-8ed3-b3515ac035d7/kube-rbac-proxy-frr/0.log" Mar 09 15:29:36 crc 
kubenswrapper[4722]: I0309 15:29:36.128657 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6vn96_29ed2858-4fd0-4817-8ed3-b3515ac035d7/reloader/0.log" Mar 09 15:29:36 crc kubenswrapper[4722]: I0309 15:29:36.354959 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7f989f654f-2nlfl_8557439a-0367-4823-af83-28955a17cc08/frr-k8s-webhook-server/0.log" Mar 09 15:29:36 crc kubenswrapper[4722]: I0309 15:29:36.363095 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7f989f654f-2nlfl_8557439a-0367-4823-af83-28955a17cc08/frr-k8s-webhook-server/1.log" Mar 09 15:29:36 crc kubenswrapper[4722]: I0309 15:29:36.590685 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-5689854475-89q94_3ea04cb5-4d36-42a9-bb83-c6f943619d16/manager/1.log" Mar 09 15:29:36 crc kubenswrapper[4722]: I0309 15:29:36.640572 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-5689854475-89q94_3ea04cb5-4d36-42a9-bb83-c6f943619d16/manager/0.log" Mar 09 15:29:36 crc kubenswrapper[4722]: I0309 15:29:36.888752 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-798745ff96-864pz_f67efad4-1b85-4f64-9e98-55eb2da89fb6/webhook-server/1.log" Mar 09 15:29:36 crc kubenswrapper[4722]: I0309 15:29:36.924009 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-6vn96_29ed2858-4fd0-4817-8ed3-b3515ac035d7/frr/0.log" Mar 09 15:29:36 crc kubenswrapper[4722]: I0309 15:29:36.945866 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-798745ff96-864pz_f67efad4-1b85-4f64-9e98-55eb2da89fb6/webhook-server/0.log" Mar 09 15:29:37 crc kubenswrapper[4722]: I0309 15:29:37.092611 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-kwml8_93b8f0be-bf52-4559-8cf6-338026cb6610/kube-rbac-proxy/0.log" Mar 09 15:29:37 crc kubenswrapper[4722]: I0309 15:29:37.644898 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-kwml8_93b8f0be-bf52-4559-8cf6-338026cb6610/speaker/0.log" Mar 09 15:29:49 crc kubenswrapper[4722]: I0309 15:29:49.149388 4722 scope.go:117] "RemoveContainer" containerID="05d0cab124a66072241bc2a93dace83012b26d6bbd0de5507999fb5df76700cb" Mar 09 15:29:49 crc kubenswrapper[4722]: E0309 15:29:49.150147 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 15:29:51 crc kubenswrapper[4722]: I0309 15:29:51.683669 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82l5sn8_55a96bf1-f247-49ec-9ecf-feb3ae63a814/util/0.log" Mar 09 15:29:51 crc kubenswrapper[4722]: I0309 15:29:51.926307 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82l5sn8_55a96bf1-f247-49ec-9ecf-feb3ae63a814/util/0.log" Mar 09 15:29:51 crc kubenswrapper[4722]: I0309 
15:29:51.941320 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82l5sn8_55a96bf1-f247-49ec-9ecf-feb3ae63a814/pull/0.log" Mar 09 15:29:51 crc kubenswrapper[4722]: I0309 15:29:51.946300 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82l5sn8_55a96bf1-f247-49ec-9ecf-feb3ae63a814/pull/0.log" Mar 09 15:29:52 crc kubenswrapper[4722]: I0309 15:29:52.095218 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82l5sn8_55a96bf1-f247-49ec-9ecf-feb3ae63a814/util/0.log" Mar 09 15:29:52 crc kubenswrapper[4722]: I0309 15:29:52.123308 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82l5sn8_55a96bf1-f247-49ec-9ecf-feb3ae63a814/pull/0.log" Mar 09 15:29:52 crc kubenswrapper[4722]: I0309 15:29:52.134769 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82l5sn8_55a96bf1-f247-49ec-9ecf-feb3ae63a814/extract/0.log" Mar 09 15:29:52 crc kubenswrapper[4722]: I0309 15:29:52.293689 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e19k78qz_af1c9e69-9bc2-4c93-8e62-132b4470617d/util/0.log" Mar 09 15:29:52 crc kubenswrapper[4722]: I0309 15:29:52.477666 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e19k78qz_af1c9e69-9bc2-4c93-8e62-132b4470617d/pull/0.log" Mar 09 15:29:52 crc kubenswrapper[4722]: I0309 15:29:52.492957 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e19k78qz_af1c9e69-9bc2-4c93-8e62-132b4470617d/util/0.log" Mar 09 15:29:52 crc kubenswrapper[4722]: I0309 15:29:52.561768 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e19k78qz_af1c9e69-9bc2-4c93-8e62-132b4470617d/pull/0.log" Mar 09 15:29:52 crc kubenswrapper[4722]: I0309 15:29:52.705913 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e19k78qz_af1c9e69-9bc2-4c93-8e62-132b4470617d/util/0.log" Mar 09 15:29:52 crc kubenswrapper[4722]: I0309 15:29:52.707077 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e19k78qz_af1c9e69-9bc2-4c93-8e62-132b4470617d/extract/0.log" Mar 09 15:29:52 crc kubenswrapper[4722]: I0309 15:29:52.789192 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_371ee4810f5f68c5176d7257cefd8758df33c232524c25acbf90f69e19k78qz_af1c9e69-9bc2-4c93-8e62-132b4470617d/pull/0.log" Mar 09 15:29:52 crc kubenswrapper[4722]: I0309 15:29:52.923582 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08pcbpj_1c250d34-2965-46ae-81ec-c73a372d0380/util/0.log" Mar 09 15:29:53 crc kubenswrapper[4722]: I0309 15:29:53.164681 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08pcbpj_1c250d34-2965-46ae-81ec-c73a372d0380/util/0.log" Mar 09 15:29:53 crc kubenswrapper[4722]: I0309 15:29:53.215509 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08pcbpj_1c250d34-2965-46ae-81ec-c73a372d0380/pull/0.log" Mar 09 15:29:53 crc kubenswrapper[4722]: I0309 15:29:53.219357 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08pcbpj_1c250d34-2965-46ae-81ec-c73a372d0380/pull/0.log" Mar 09 15:29:53 crc kubenswrapper[4722]: I0309 15:29:53.447832 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08pcbpj_1c250d34-2965-46ae-81ec-c73a372d0380/extract/0.log" Mar 09 15:29:53 crc kubenswrapper[4722]: I0309 15:29:53.452905 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08pcbpj_1c250d34-2965-46ae-81ec-c73a372d0380/pull/0.log" Mar 09 15:29:53 crc kubenswrapper[4722]: I0309 15:29:53.482831 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08pcbpj_1c250d34-2965-46ae-81ec-c73a372d0380/util/0.log" Mar 09 15:29:53 crc kubenswrapper[4722]: I0309 15:29:53.653722 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wkdkx_69a0d1f2-276f-4062-91d9-8af8048a8d8f/extract-utilities/0.log" Mar 09 15:29:53 crc kubenswrapper[4722]: I0309 15:29:53.867928 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wkdkx_69a0d1f2-276f-4062-91d9-8af8048a8d8f/extract-utilities/0.log" Mar 09 15:29:53 crc kubenswrapper[4722]: I0309 15:29:53.884633 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wkdkx_69a0d1f2-276f-4062-91d9-8af8048a8d8f/extract-content/0.log" Mar 09 15:29:53 crc kubenswrapper[4722]: I0309 15:29:53.889176 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wkdkx_69a0d1f2-276f-4062-91d9-8af8048a8d8f/extract-content/0.log" Mar 09 15:29:54 crc kubenswrapper[4722]: I0309 15:29:54.081059 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wkdkx_69a0d1f2-276f-4062-91d9-8af8048a8d8f/extract-content/0.log" Mar 09 15:29:54 crc kubenswrapper[4722]: I0309 15:29:54.082275 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wkdkx_69a0d1f2-276f-4062-91d9-8af8048a8d8f/extract-utilities/0.log" Mar 09 15:29:54 crc kubenswrapper[4722]: I0309 15:29:54.343735 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rvtqn_3411289f-3e7c-4e43-b545-5e612822b18e/extract-utilities/0.log" Mar 09 15:29:54 crc kubenswrapper[4722]: I0309 15:29:54.573710 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rvtqn_3411289f-3e7c-4e43-b545-5e612822b18e/extract-utilities/0.log" Mar 09 15:29:54 crc kubenswrapper[4722]: I0309 15:29:54.681743 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-rvtqn_3411289f-3e7c-4e43-b545-5e612822b18e/extract-content/0.log" Mar 09 15:29:54 crc kubenswrapper[4722]: I0309 15:29:54.737414 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rvtqn_3411289f-3e7c-4e43-b545-5e612822b18e/extract-content/0.log" Mar 09 15:29:54 crc kubenswrapper[4722]: I0309 15:29:54.926352 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-wkdkx_69a0d1f2-276f-4062-91d9-8af8048a8d8f/registry-server/0.log" Mar 09 15:29:54 crc kubenswrapper[4722]: I0309 15:29:54.979837 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rvtqn_3411289f-3e7c-4e43-b545-5e612822b18e/extract-utilities/0.log" Mar 09 15:29:55 crc kubenswrapper[4722]: I0309 15:29:55.001565 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rvtqn_3411289f-3e7c-4e43-b545-5e612822b18e/extract-content/0.log" Mar 09 15:29:55 crc kubenswrapper[4722]: I0309 15:29:55.140076 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rvtqn_3411289f-3e7c-4e43-b545-5e612822b18e/registry-server/1.log" Mar 09 15:29:55 crc kubenswrapper[4722]: I0309 15:29:55.308620 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4tw5s7_ef685835-f0f9-45e4-b0e1-9213895704e4/util/0.log" Mar 09 15:29:55 crc kubenswrapper[4722]: I0309 15:29:55.495792 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4tw5s7_ef685835-f0f9-45e4-b0e1-9213895704e4/util/0.log" Mar 09 15:29:55 crc kubenswrapper[4722]: I0309 15:29:55.531491 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rvtqn_3411289f-3e7c-4e43-b545-5e612822b18e/registry-server/0.log" Mar 09 15:29:55 crc kubenswrapper[4722]: I0309 15:29:55.539055 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4tw5s7_ef685835-f0f9-45e4-b0e1-9213895704e4/pull/0.log" Mar 09 15:29:55 crc kubenswrapper[4722]: I0309 15:29:55.582006 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4tw5s7_ef685835-f0f9-45e4-b0e1-9213895704e4/pull/0.log" Mar 09 15:29:55 crc kubenswrapper[4722]: I0309 15:29:55.774028 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4tw5s7_ef685835-f0f9-45e4-b0e1-9213895704e4/util/0.log" Mar 09 15:29:55 crc kubenswrapper[4722]: I0309 15:29:55.800525 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4tw5s7_ef685835-f0f9-45e4-b0e1-9213895704e4/extract/0.log" Mar 09 15:29:55 crc kubenswrapper[4722]: I0309 15:29:55.812577 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4tw5s7_ef685835-f0f9-45e4-b0e1-9213895704e4/pull/0.log" Mar 09 15:29:55 crc kubenswrapper[4722]: I0309 15:29:55.973821 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e08989dhs6v_adf5a27c-0e62-496e-a0a2-a0d2d98f898f/util/0.log" Mar 09 15:29:56 crc kubenswrapper[4722]: I0309 15:29:56.165338 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e08989dhs6v_adf5a27c-0e62-496e-a0a2-a0d2d98f898f/util/0.log" Mar 09 15:29:56 crc kubenswrapper[4722]: I0309 15:29:56.206765 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e08989dhs6v_adf5a27c-0e62-496e-a0a2-a0d2d98f898f/pull/0.log" Mar 09 15:29:56 crc kubenswrapper[4722]: I0309 15:29:56.222160 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e08989dhs6v_adf5a27c-0e62-496e-a0a2-a0d2d98f898f/pull/0.log" Mar 09 15:29:56 crc kubenswrapper[4722]: I0309 15:29:56.397546 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e08989dhs6v_adf5a27c-0e62-496e-a0a2-a0d2d98f898f/util/0.log" Mar 09 15:29:56 crc kubenswrapper[4722]: I0309 15:29:56.398886 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e08989dhs6v_adf5a27c-0e62-496e-a0a2-a0d2d98f898f/extract/0.log" Mar 09 15:29:56 crc kubenswrapper[4722]: I0309 15:29:56.444567 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_e2b87168fae98cca1c2d05d26ceb83b1b30b4b54c6968a79bb91e08989dhs6v_adf5a27c-0e62-496e-a0a2-a0d2d98f898f/pull/0.log" Mar 09 15:29:56 crc kubenswrapper[4722]: I0309 15:29:56.461762 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-svdsk_ea964ea5-3fad-4bd0-8ffe-d78f00229fbe/marketplace-operator/0.log" Mar 09 15:29:56 crc kubenswrapper[4722]: I0309 15:29:56.630636 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-c2mmw_690e5ab0-3719-40ac-aba6-9278480ecb44/extract-utilities/0.log" Mar 09 15:29:56 crc kubenswrapper[4722]: I0309 15:29:56.807085 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-c2mmw_690e5ab0-3719-40ac-aba6-9278480ecb44/extract-utilities/0.log" Mar 09 15:29:56 crc kubenswrapper[4722]: I0309 15:29:56.813922 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-c2mmw_690e5ab0-3719-40ac-aba6-9278480ecb44/extract-content/0.log" Mar 09 15:29:56 crc kubenswrapper[4722]: I0309 15:29:56.890804 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-c2mmw_690e5ab0-3719-40ac-aba6-9278480ecb44/extract-content/0.log" Mar 09 15:29:57 crc kubenswrapper[4722]: I0309 15:29:57.109374 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-c2mmw_690e5ab0-3719-40ac-aba6-9278480ecb44/registry-server/1.log" Mar 09 15:29:57 crc kubenswrapper[4722]: I0309 15:29:57.150572 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-c2mmw_690e5ab0-3719-40ac-aba6-9278480ecb44/extract-content/0.log" Mar 09 15:29:57 crc kubenswrapper[4722]: I0309 15:29:57.166760 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-c2mmw_690e5ab0-3719-40ac-aba6-9278480ecb44/extract-utilities/0.log" Mar 09 15:29:57 crc kubenswrapper[4722]: I0309 15:29:57.308576 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-c2mmw_690e5ab0-3719-40ac-aba6-9278480ecb44/registry-server/0.log" Mar 09 15:29:57 crc kubenswrapper[4722]: I0309 15:29:57.363014 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-v57f2_4e27c5a4-8cba-4119-8006-f9841d6121dc/extract-utilities/0.log" Mar 09 15:29:57 crc kubenswrapper[4722]: I0309 15:29:57.551601 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-v57f2_4e27c5a4-8cba-4119-8006-f9841d6121dc/extract-utilities/0.log" Mar 09 15:29:57 crc kubenswrapper[4722]: I0309 15:29:57.608952 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-v57f2_4e27c5a4-8cba-4119-8006-f9841d6121dc/extract-content/0.log" Mar 09 15:29:57 crc kubenswrapper[4722]: I0309 15:29:57.614097 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-v57f2_4e27c5a4-8cba-4119-8006-f9841d6121dc/extract-content/0.log" Mar 09 15:29:58 crc kubenswrapper[4722]: I0309 15:29:58.567406 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-v57f2_4e27c5a4-8cba-4119-8006-f9841d6121dc/extract-utilities/0.log" Mar 09 15:29:58 crc kubenswrapper[4722]: I0309 15:29:58.647820 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-v57f2_4e27c5a4-8cba-4119-8006-f9841d6121dc/extract-content/0.log" Mar 09 15:29:58 crc kubenswrapper[4722]: I0309 15:29:58.667842 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-v57f2_4e27c5a4-8cba-4119-8006-f9841d6121dc/registry-server/1.log" Mar 09 15:29:58 crc kubenswrapper[4722]: I0309 15:29:58.669741 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-v57f2_4e27c5a4-8cba-4119-8006-f9841d6121dc/registry-server/2.log" Mar 09 15:30:00 crc kubenswrapper[4722]: I0309 15:30:00.167640 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551170-lxxqz"] Mar 09 15:30:00 crc kubenswrapper[4722]: E0309 15:30:00.168162 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f65d5c88-8e7d-4c6e-a14d-b838623e152a" containerName="oc" Mar 09 15:30:00 crc kubenswrapper[4722]: I0309 15:30:00.168184 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f65d5c88-8e7d-4c6e-a14d-b838623e152a" containerName="oc" Mar 09 15:30:00 crc kubenswrapper[4722]: I0309 15:30:00.168644 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f65d5c88-8e7d-4c6e-a14d-b838623e152a" containerName="oc" Mar 09 15:30:00 crc kubenswrapper[4722]: I0309 15:30:00.170329 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551170-lxxqz" Mar 09 15:30:00 crc kubenswrapper[4722]: I0309 15:30:00.177596 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 15:30:00 crc kubenswrapper[4722]: I0309 15:30:00.177895 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-dr2x6" Mar 09 15:30:00 crc kubenswrapper[4722]: I0309 15:30:00.177967 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 15:30:00 crc kubenswrapper[4722]: I0309 15:30:00.182986 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551170-clzmq"] Mar 09 15:30:00 crc kubenswrapper[4722]: I0309 15:30:00.186774 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551170-clzmq" Mar 09 15:30:00 crc kubenswrapper[4722]: I0309 15:30:00.190018 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 09 15:30:00 crc kubenswrapper[4722]: I0309 15:30:00.194983 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 09 15:30:00 crc kubenswrapper[4722]: I0309 15:30:00.204631 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551170-lxxqz"] Mar 09 15:30:00 crc kubenswrapper[4722]: I0309 15:30:00.224689 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551170-clzmq"] Mar 09 15:30:00 crc kubenswrapper[4722]: I0309 15:30:00.280835 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twpdk\" (UniqueName: \"kubernetes.io/projected/d2b03ceb-a279-4acb-b31a-61b62dd37ae0-kube-api-access-twpdk\") pod \"auto-csr-approver-29551170-lxxqz\" (UID: \"d2b03ceb-a279-4acb-b31a-61b62dd37ae0\") " pod="openshift-infra/auto-csr-approver-29551170-lxxqz" Mar 09 15:30:00 crc kubenswrapper[4722]: I0309 15:30:00.281880 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pscb9\" (UniqueName: \"kubernetes.io/projected/316fc048-f856-4e84-8b39-c50cb43848a9-kube-api-access-pscb9\") pod \"collect-profiles-29551170-clzmq\" (UID: \"316fc048-f856-4e84-8b39-c50cb43848a9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551170-clzmq" Mar 09 15:30:00 crc kubenswrapper[4722]: I0309 15:30:00.281981 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/316fc048-f856-4e84-8b39-c50cb43848a9-secret-volume\") pod \"collect-profiles-29551170-clzmq\" (UID: \"316fc048-f856-4e84-8b39-c50cb43848a9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551170-clzmq" Mar 09 15:30:00 crc kubenswrapper[4722]: I0309 15:30:00.282018 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/316fc048-f856-4e84-8b39-c50cb43848a9-config-volume\") pod \"collect-profiles-29551170-clzmq\" (UID: \"316fc048-f856-4e84-8b39-c50cb43848a9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551170-clzmq" Mar 
09 15:30:00 crc kubenswrapper[4722]: I0309 15:30:00.384050 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twpdk\" (UniqueName: \"kubernetes.io/projected/d2b03ceb-a279-4acb-b31a-61b62dd37ae0-kube-api-access-twpdk\") pod \"auto-csr-approver-29551170-lxxqz\" (UID: \"d2b03ceb-a279-4acb-b31a-61b62dd37ae0\") " pod="openshift-infra/auto-csr-approver-29551170-lxxqz" Mar 09 15:30:00 crc kubenswrapper[4722]: I0309 15:30:00.384113 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pscb9\" (UniqueName: \"kubernetes.io/projected/316fc048-f856-4e84-8b39-c50cb43848a9-kube-api-access-pscb9\") pod \"collect-profiles-29551170-clzmq\" (UID: \"316fc048-f856-4e84-8b39-c50cb43848a9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551170-clzmq" Mar 09 15:30:00 crc kubenswrapper[4722]: I0309 15:30:00.384180 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/316fc048-f856-4e84-8b39-c50cb43848a9-secret-volume\") pod \"collect-profiles-29551170-clzmq\" (UID: \"316fc048-f856-4e84-8b39-c50cb43848a9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551170-clzmq" Mar 09 15:30:00 crc kubenswrapper[4722]: I0309 15:30:00.384247 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/316fc048-f856-4e84-8b39-c50cb43848a9-config-volume\") pod \"collect-profiles-29551170-clzmq\" (UID: \"316fc048-f856-4e84-8b39-c50cb43848a9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551170-clzmq" Mar 09 15:30:00 crc kubenswrapper[4722]: I0309 15:30:00.387087 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/316fc048-f856-4e84-8b39-c50cb43848a9-config-volume\") pod \"collect-profiles-29551170-clzmq\" (UID: \"316fc048-f856-4e84-8b39-c50cb43848a9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551170-clzmq" Mar 09 15:30:00 crc kubenswrapper[4722]: I0309 15:30:00.401848 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/316fc048-f856-4e84-8b39-c50cb43848a9-secret-volume\") pod \"collect-profiles-29551170-clzmq\" (UID: \"316fc048-f856-4e84-8b39-c50cb43848a9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551170-clzmq" Mar 09 15:30:00 crc kubenswrapper[4722]: I0309 15:30:00.409851 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pscb9\" (UniqueName: \"kubernetes.io/projected/316fc048-f856-4e84-8b39-c50cb43848a9-kube-api-access-pscb9\") pod \"collect-profiles-29551170-clzmq\" (UID: \"316fc048-f856-4e84-8b39-c50cb43848a9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29551170-clzmq" Mar 09 15:30:00 crc kubenswrapper[4722]: I0309 15:30:00.414490 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twpdk\" (UniqueName: \"kubernetes.io/projected/d2b03ceb-a279-4acb-b31a-61b62dd37ae0-kube-api-access-twpdk\") pod \"auto-csr-approver-29551170-lxxqz\" (UID: \"d2b03ceb-a279-4acb-b31a-61b62dd37ae0\") " pod="openshift-infra/auto-csr-approver-29551170-lxxqz" Mar 09 15:30:00 crc kubenswrapper[4722]: I0309 15:30:00.504504 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551170-lxxqz" Mar 09 15:30:00 crc kubenswrapper[4722]: I0309 15:30:00.520026 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551170-clzmq" Mar 09 15:30:01 crc kubenswrapper[4722]: I0309 15:30:01.109842 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551170-lxxqz"] Mar 09 15:30:01 crc kubenswrapper[4722]: W0309 15:30:01.112421 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod316fc048_f856_4e84_8b39_c50cb43848a9.slice/crio-8bb1f6da5b1ae8ebb44d4c573ad21edcf42b3c36a3029a481800542166ddb0aa WatchSource:0}: Error finding container 8bb1f6da5b1ae8ebb44d4c573ad21edcf42b3c36a3029a481800542166ddb0aa: Status 404 returned error can't find the container with id 8bb1f6da5b1ae8ebb44d4c573ad21edcf42b3c36a3029a481800542166ddb0aa Mar 09 15:30:01 crc kubenswrapper[4722]: I0309 15:30:01.124916 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551170-clzmq"] Mar 09 15:30:01 crc kubenswrapper[4722]: I0309 15:30:01.801795 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551170-lxxqz" event={"ID":"d2b03ceb-a279-4acb-b31a-61b62dd37ae0","Type":"ContainerStarted","Data":"5afae24136b200dea2133b4dff57442d343dbb995848d2afc0101573faa9163e"} Mar 09 15:30:01 crc kubenswrapper[4722]: I0309 15:30:01.804813 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551170-clzmq" event={"ID":"316fc048-f856-4e84-8b39-c50cb43848a9","Type":"ContainerStarted","Data":"fbc4d7ecfd94ddfade34804dc4417dabf2136538b445d121681bfd6bc633a689"} Mar 09 15:30:01 crc kubenswrapper[4722]: I0309 15:30:01.804837 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551170-clzmq" event={"ID":"316fc048-f856-4e84-8b39-c50cb43848a9","Type":"ContainerStarted","Data":"8bb1f6da5b1ae8ebb44d4c573ad21edcf42b3c36a3029a481800542166ddb0aa"} Mar 09 15:30:01 crc kubenswrapper[4722]: I0309 15:30:01.832964 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29551170-clzmq" podStartSLOduration=1.8329461409999999 podStartE2EDuration="1.832946141s" podCreationTimestamp="2026-03-09 15:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-09 15:30:01.823138932 +0000 UTC m=+5242.378707518" watchObservedRunningTime="2026-03-09 15:30:01.832946141 +0000 UTC m=+5242.388514717" Mar 09 15:30:02 crc kubenswrapper[4722]: I0309 15:30:02.821432 4722 generic.go:334] "Generic (PLEG): container finished" podID="316fc048-f856-4e84-8b39-c50cb43848a9" containerID="fbc4d7ecfd94ddfade34804dc4417dabf2136538b445d121681bfd6bc633a689" exitCode=0 Mar 09 15:30:02 crc kubenswrapper[4722]: I0309 15:30:02.821576 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551170-clzmq" event={"ID":"316fc048-f856-4e84-8b39-c50cb43848a9","Type":"ContainerDied","Data":"fbc4d7ecfd94ddfade34804dc4417dabf2136538b445d121681bfd6bc633a689"} Mar 09 15:30:03 crc kubenswrapper[4722]: I0309 15:30:03.850777 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29551170-lxxqz" event={"ID":"d2b03ceb-a279-4acb-b31a-61b62dd37ae0","Type":"ContainerStarted","Data":"f253d61918f887f00e856fd1ce870c503a753b41202f6302d3674397bd593ac2"} Mar 09 15:30:03 crc kubenswrapper[4722]: I0309 15:30:03.865388 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551170-lxxqz" podStartSLOduration=1.551675897 podStartE2EDuration="3.865366274s" podCreationTimestamp="2026-03-09 15:30:00 +0000 UTC" firstStartedPulling="2026-03-09 15:30:01.113347924 +0000 UTC m=+5241.668916500" lastFinishedPulling="2026-03-09 15:30:03.427038301 +0000 UTC m=+5243.982606877" observedRunningTime="2026-03-09 15:30:03.864477929 +0000 UTC m=+5244.420046505" watchObservedRunningTime="2026-03-09 15:30:03.865366274 +0000 UTC m=+5244.420934850" Mar 09 15:30:04 crc kubenswrapper[4722]: I0309 15:30:04.149874 4722 scope.go:117] "RemoveContainer" containerID="05d0cab124a66072241bc2a93dace83012b26d6bbd0de5507999fb5df76700cb" Mar 09 15:30:04 crc kubenswrapper[4722]: E0309 15:30:04.155034 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 15:30:04 crc kubenswrapper[4722]: I0309 15:30:04.307105 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551170-clzmq" Mar 09 15:30:04 crc kubenswrapper[4722]: I0309 15:30:04.402875 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/316fc048-f856-4e84-8b39-c50cb43848a9-secret-volume\") pod \"316fc048-f856-4e84-8b39-c50cb43848a9\" (UID: \"316fc048-f856-4e84-8b39-c50cb43848a9\") " Mar 09 15:30:04 crc kubenswrapper[4722]: I0309 15:30:04.403419 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pscb9\" (UniqueName: \"kubernetes.io/projected/316fc048-f856-4e84-8b39-c50cb43848a9-kube-api-access-pscb9\") pod \"316fc048-f856-4e84-8b39-c50cb43848a9\" (UID: \"316fc048-f856-4e84-8b39-c50cb43848a9\") " Mar 09 15:30:04 crc kubenswrapper[4722]: I0309 15:30:04.403718 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/316fc048-f856-4e84-8b39-c50cb43848a9-config-volume\") pod \"316fc048-f856-4e84-8b39-c50cb43848a9\" (UID: \"316fc048-f856-4e84-8b39-c50cb43848a9\") " Mar 09 15:30:04 crc kubenswrapper[4722]: I0309 15:30:04.405606 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/316fc048-f856-4e84-8b39-c50cb43848a9-config-volume" (OuterVolumeSpecName: "config-volume") pod "316fc048-f856-4e84-8b39-c50cb43848a9" (UID: "316fc048-f856-4e84-8b39-c50cb43848a9"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 09 15:30:04 crc kubenswrapper[4722]: I0309 15:30:04.411774 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/316fc048-f856-4e84-8b39-c50cb43848a9-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "316fc048-f856-4e84-8b39-c50cb43848a9" (UID: "316fc048-f856-4e84-8b39-c50cb43848a9"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 09 15:30:04 crc kubenswrapper[4722]: I0309 15:30:04.411774 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/316fc048-f856-4e84-8b39-c50cb43848a9-kube-api-access-pscb9" (OuterVolumeSpecName: "kube-api-access-pscb9") pod "316fc048-f856-4e84-8b39-c50cb43848a9" (UID: "316fc048-f856-4e84-8b39-c50cb43848a9"). InnerVolumeSpecName "kube-api-access-pscb9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 15:30:04 crc kubenswrapper[4722]: I0309 15:30:04.507450 4722 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/316fc048-f856-4e84-8b39-c50cb43848a9-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 09 15:30:04 crc kubenswrapper[4722]: I0309 15:30:04.507488 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pscb9\" (UniqueName: \"kubernetes.io/projected/316fc048-f856-4e84-8b39-c50cb43848a9-kube-api-access-pscb9\") on node \"crc\" DevicePath \"\"" Mar 09 15:30:04 crc kubenswrapper[4722]: I0309 15:30:04.507497 4722 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/316fc048-f856-4e84-8b39-c50cb43848a9-config-volume\") on node \"crc\" DevicePath \"\"" Mar 09 15:30:04 crc kubenswrapper[4722]: I0309 15:30:04.905977 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29551170-clzmq" Mar 09 15:30:04 crc kubenswrapper[4722]: I0309 15:30:04.906108 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29551170-clzmq" event={"ID":"316fc048-f856-4e84-8b39-c50cb43848a9","Type":"ContainerDied","Data":"8bb1f6da5b1ae8ebb44d4c573ad21edcf42b3c36a3029a481800542166ddb0aa"} Mar 09 15:30:04 crc kubenswrapper[4722]: I0309 15:30:04.907735 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8bb1f6da5b1ae8ebb44d4c573ad21edcf42b3c36a3029a481800542166ddb0aa" Mar 09 15:30:04 crc kubenswrapper[4722]: I0309 15:30:04.924999 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551125-s9v26"] Mar 09 15:30:04 crc kubenswrapper[4722]: I0309 15:30:04.936830 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29551125-s9v26"] Mar 09 15:30:05 crc kubenswrapper[4722]: I0309 15:30:05.921980 4722 generic.go:334] "Generic (PLEG): container finished" podID="d2b03ceb-a279-4acb-b31a-61b62dd37ae0" containerID="f253d61918f887f00e856fd1ce870c503a753b41202f6302d3674397bd593ac2" exitCode=0 Mar 09 15:30:05 crc kubenswrapper[4722]: I0309 15:30:05.922075 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551170-lxxqz" event={"ID":"d2b03ceb-a279-4acb-b31a-61b62dd37ae0","Type":"ContainerDied","Data":"f253d61918f887f00e856fd1ce870c503a753b41202f6302d3674397bd593ac2"} Mar 09 15:30:06 crc kubenswrapper[4722]: I0309 15:30:06.169553 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9db8477f-7668-4dee-8ffd-8ceec067e99f" path="/var/lib/kubelet/pods/9db8477f-7668-4dee-8ffd-8ceec067e99f/volumes" Mar 09 15:30:07 crc kubenswrapper[4722]: I0309 15:30:07.363790 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551170-lxxqz" Mar 09 15:30:07 crc kubenswrapper[4722]: I0309 15:30:07.474772 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twpdk\" (UniqueName: \"kubernetes.io/projected/d2b03ceb-a279-4acb-b31a-61b62dd37ae0-kube-api-access-twpdk\") pod \"d2b03ceb-a279-4acb-b31a-61b62dd37ae0\" (UID: \"d2b03ceb-a279-4acb-b31a-61b62dd37ae0\") " Mar 09 15:30:07 crc kubenswrapper[4722]: I0309 15:30:07.492014 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2b03ceb-a279-4acb-b31a-61b62dd37ae0-kube-api-access-twpdk" (OuterVolumeSpecName: "kube-api-access-twpdk") pod "d2b03ceb-a279-4acb-b31a-61b62dd37ae0" (UID: "d2b03ceb-a279-4acb-b31a-61b62dd37ae0"). InnerVolumeSpecName "kube-api-access-twpdk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 15:30:07 crc kubenswrapper[4722]: I0309 15:30:07.578236 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-twpdk\" (UniqueName: \"kubernetes.io/projected/d2b03ceb-a279-4acb-b31a-61b62dd37ae0-kube-api-access-twpdk\") on node \"crc\" DevicePath \"\"" Mar 09 15:30:07 crc kubenswrapper[4722]: I0309 15:30:07.953685 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551170-lxxqz" event={"ID":"d2b03ceb-a279-4acb-b31a-61b62dd37ae0","Type":"ContainerDied","Data":"5afae24136b200dea2133b4dff57442d343dbb995848d2afc0101573faa9163e"} Mar 09 15:30:07 crc kubenswrapper[4722]: I0309 15:30:07.953893 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5afae24136b200dea2133b4dff57442d343dbb995848d2afc0101573faa9163e" Mar 09 15:30:07 crc kubenswrapper[4722]: I0309 15:30:07.953781 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551170-lxxqz" Mar 09 15:30:07 crc kubenswrapper[4722]: I0309 15:30:07.985444 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551164-b2lll"] Mar 09 15:30:07 crc kubenswrapper[4722]: I0309 15:30:07.996295 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551164-b2lll"] Mar 09 15:30:08 crc kubenswrapper[4722]: I0309 15:30:08.161783 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aae15a8c-7e4e-4db4-b209-d1243d668860" path="/var/lib/kubelet/pods/aae15a8c-7e4e-4db4-b209-d1243d668860/volumes" Mar 09 15:30:13 crc kubenswrapper[4722]: I0309 15:30:13.353463 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-5d974998df-jzrsl_85953f91-dda3-42f4-b308-7b553054dad6/prometheus-operator-admission-webhook/0.log" Mar 09 15:30:13 crc kubenswrapper[4722]: I0309 15:30:13.371757 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-pw2x9_f612329a-8162-4440-aae8-a5467e713976/prometheus-operator/0.log" Mar 09 15:30:13 crc kubenswrapper[4722]: I0309 15:30:13.432610 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-5d974998df-p2w5j_a58a5d27-2898-4346-b0c0-08507cf2eb44/prometheus-operator-admission-webhook/0.log" Mar 09 15:30:13 crc kubenswrapper[4722]: I0309 15:30:13.558252 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-lc5zn_14655c3d-02fe-4215-b566-0c4008fd34a0/operator/1.log" Mar 09 15:30:13 crc kubenswrapper[4722]: I0309 15:30:13.637179 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-lc5zn_14655c3d-02fe-4215-b566-0c4008fd34a0/operator/0.log" Mar 09 15:30:13 crc kubenswrapper[4722]: I0309 15:30:13.650411 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-ui-dashboards-66cbf594b5-bzqhp_0d15a083-af10-4638-b6bc-9de1f89f123f/observability-ui-dashboards/0.log" Mar 09 15:30:13 crc kubenswrapper[4722]: I0309 15:30:13.740485 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-2rhld_c123a767-e0e0-4432-b34f-cbe0b581d938/perses-operator/0.log" Mar 09 15:30:16 crc kubenswrapper[4722]: I0309 15:30:16.149148 4722 
scope.go:117] "RemoveContainer" containerID="05d0cab124a66072241bc2a93dace83012b26d6bbd0de5507999fb5df76700cb" Mar 09 15:30:16 crc kubenswrapper[4722]: E0309 15:30:16.149732 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 15:30:27 crc kubenswrapper[4722]: I0309 15:30:27.149652 4722 scope.go:117] "RemoveContainer" containerID="05d0cab124a66072241bc2a93dace83012b26d6bbd0de5507999fb5df76700cb" Mar 09 15:30:27 crc kubenswrapper[4722]: E0309 15:30:27.150618 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 15:30:29 crc kubenswrapper[4722]: I0309 15:30:29.011498 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-7d6d6698bd-4r85k_497a07fc-9649-4620-9432-855aa3fdc327/manager/1.log" Mar 09 15:30:29 crc kubenswrapper[4722]: I0309 15:30:29.018630 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-7d6d6698bd-4r85k_497a07fc-9649-4620-9432-855aa3fdc327/kube-rbac-proxy/0.log" Mar 09 15:30:29 crc kubenswrapper[4722]: I0309 15:30:29.065099 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-7d6d6698bd-4r85k_497a07fc-9649-4620-9432-855aa3fdc327/manager/0.log" Mar 09 15:30:29 crc kubenswrapper[4722]: I0309 15:30:29.149028 4722 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 09 15:30:32 crc kubenswrapper[4722]: I0309 15:30:32.968918 4722 scope.go:117] "RemoveContainer" containerID="3444c3b817df432becaa95c2f52ef4df00df7db9375eb54518de33ccd7ebf670" Mar 09 15:30:33 crc kubenswrapper[4722]: I0309 15:30:33.038508 4722 scope.go:117] "RemoveContainer" containerID="686f78f6fd8badc955a3137b91963993db8394a75fbc1f22cbbf6617559d2bfe" Mar 09 15:30:40 crc kubenswrapper[4722]: I0309 15:30:40.219277 4722 scope.go:117] "RemoveContainer" containerID="05d0cab124a66072241bc2a93dace83012b26d6bbd0de5507999fb5df76700cb" Mar 09 15:30:40 crc kubenswrapper[4722]: E0309 15:30:40.220367 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 15:30:54 crc kubenswrapper[4722]: I0309 15:30:54.148971 4722 scope.go:117] "RemoveContainer" containerID="05d0cab124a66072241bc2a93dace83012b26d6bbd0de5507999fb5df76700cb" Mar 09 15:30:54 crc kubenswrapper[4722]: E0309 15:30:54.149759 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 15:31:07 crc kubenswrapper[4722]: I0309 15:31:07.150188 4722 scope.go:117] "RemoveContainer" containerID="05d0cab124a66072241bc2a93dace83012b26d6bbd0de5507999fb5df76700cb" Mar 09 15:31:07 crc kubenswrapper[4722]: E0309 15:31:07.151766 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 15:31:22 crc kubenswrapper[4722]: I0309 15:31:22.150449 4722 scope.go:117] "RemoveContainer" containerID="05d0cab124a66072241bc2a93dace83012b26d6bbd0de5507999fb5df76700cb" Mar 09 15:31:22 crc kubenswrapper[4722]: E0309 15:31:22.151307 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 15:31:33 crc kubenswrapper[4722]: I0309 15:31:33.508455 4722 scope.go:117] "RemoveContainer" containerID="c271b8211cefb56ca605873d18ec367536653bbe35acdf81a77dd4fcc192810c" Mar 09 15:31:33 crc kubenswrapper[4722]: I0309 15:31:33.556401 4722 scope.go:117] "RemoveContainer" containerID="8a26a5a101608ceed68af2af2a434e39fb9013308aa9272021748678ec2a6f1e" Mar 09 15:31:35 crc kubenswrapper[4722]: I0309 15:31:35.149712 4722 scope.go:117] "RemoveContainer" containerID="05d0cab124a66072241bc2a93dace83012b26d6bbd0de5507999fb5df76700cb" Mar 09 15:31:35 crc kubenswrapper[4722]: E0309 15:31:35.150364 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 15:31:44 crc kubenswrapper[4722]: I0309 15:31:44.149426 4722 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 09 15:31:50 crc kubenswrapper[4722]: I0309 15:31:50.163016 4722 scope.go:117] "RemoveContainer" containerID="05d0cab124a66072241bc2a93dace83012b26d6bbd0de5507999fb5df76700cb" Mar 09 15:31:50 crc kubenswrapper[4722]: E0309 15:31:50.163904 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 
15:32:00 crc kubenswrapper[4722]: I0309 15:32:00.147321 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551172-c6qz8"] Mar 09 15:32:00 crc kubenswrapper[4722]: E0309 15:32:00.149961 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2b03ceb-a279-4acb-b31a-61b62dd37ae0" containerName="oc" Mar 09 15:32:00 crc kubenswrapper[4722]: I0309 15:32:00.150000 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2b03ceb-a279-4acb-b31a-61b62dd37ae0" containerName="oc" Mar 09 15:32:00 crc kubenswrapper[4722]: E0309 15:32:00.150013 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="316fc048-f856-4e84-8b39-c50cb43848a9" containerName="collect-profiles" Mar 09 15:32:00 crc kubenswrapper[4722]: I0309 15:32:00.150019 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="316fc048-f856-4e84-8b39-c50cb43848a9" containerName="collect-profiles" Mar 09 15:32:00 crc kubenswrapper[4722]: I0309 15:32:00.150301 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="316fc048-f856-4e84-8b39-c50cb43848a9" containerName="collect-profiles" Mar 09 15:32:00 crc kubenswrapper[4722]: I0309 15:32:00.150341 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2b03ceb-a279-4acb-b31a-61b62dd37ae0" containerName="oc" Mar 09 15:32:00 crc kubenswrapper[4722]: I0309 15:32:00.151402 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551172-c6qz8" Mar 09 15:32:00 crc kubenswrapper[4722]: I0309 15:32:00.175006 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-dr2x6" Mar 09 15:32:00 crc kubenswrapper[4722]: I0309 15:32:00.179795 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 15:32:00 crc kubenswrapper[4722]: I0309 15:32:00.179997 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 15:32:00 crc kubenswrapper[4722]: I0309 15:32:00.201252 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551172-c6qz8"] Mar 09 15:32:00 crc kubenswrapper[4722]: I0309 15:32:00.267190 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvxkz\" (UniqueName: \"kubernetes.io/projected/2378821c-5194-4bff-a140-c2a703479347-kube-api-access-dvxkz\") pod \"auto-csr-approver-29551172-c6qz8\" (UID: \"2378821c-5194-4bff-a140-c2a703479347\") " pod="openshift-infra/auto-csr-approver-29551172-c6qz8" Mar 09 15:32:00 crc kubenswrapper[4722]: I0309 15:32:00.369666 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvxkz\" (UniqueName: \"kubernetes.io/projected/2378821c-5194-4bff-a140-c2a703479347-kube-api-access-dvxkz\") pod \"auto-csr-approver-29551172-c6qz8\" (UID: \"2378821c-5194-4bff-a140-c2a703479347\") " pod="openshift-infra/auto-csr-approver-29551172-c6qz8" Mar 09 15:32:00 crc kubenswrapper[4722]: I0309 15:32:00.409303 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvxkz\" (UniqueName: \"kubernetes.io/projected/2378821c-5194-4bff-a140-c2a703479347-kube-api-access-dvxkz\") pod \"auto-csr-approver-29551172-c6qz8\" (UID: \"2378821c-5194-4bff-a140-c2a703479347\") " pod="openshift-infra/auto-csr-approver-29551172-c6qz8" Mar 09 15:32:00 crc kubenswrapper[4722]: I0309 15:32:00.513770 
4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551172-c6qz8" Mar 09 15:32:01 crc kubenswrapper[4722]: I0309 15:32:01.682412 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551172-c6qz8"] Mar 09 15:32:02 crc kubenswrapper[4722]: I0309 15:32:02.406338 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551172-c6qz8" event={"ID":"2378821c-5194-4bff-a140-c2a703479347","Type":"ContainerStarted","Data":"59d742f22c60b4b57c8d218178153ae722dd54b7ca78c6252084cd72eb8a18fd"} Mar 09 15:32:03 crc kubenswrapper[4722]: I0309 15:32:03.423563 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551172-c6qz8" event={"ID":"2378821c-5194-4bff-a140-c2a703479347","Type":"ContainerStarted","Data":"6d7459202f7933f771a1f696937f359f9ac4ce1e9144c811ae0db1459862d78e"} Mar 09 15:32:04 crc kubenswrapper[4722]: I0309 15:32:04.150074 4722 scope.go:117] "RemoveContainer" containerID="05d0cab124a66072241bc2a93dace83012b26d6bbd0de5507999fb5df76700cb" Mar 09 15:32:04 crc kubenswrapper[4722]: E0309 15:32:04.150811 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 15:32:04 crc kubenswrapper[4722]: I0309 15:32:04.437404 4722 generic.go:334] "Generic (PLEG): container finished" podID="2378821c-5194-4bff-a140-c2a703479347" containerID="6d7459202f7933f771a1f696937f359f9ac4ce1e9144c811ae0db1459862d78e" exitCode=0 Mar 09 15:32:04 crc kubenswrapper[4722]: I0309 15:32:04.437455 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551172-c6qz8" event={"ID":"2378821c-5194-4bff-a140-c2a703479347","Type":"ContainerDied","Data":"6d7459202f7933f771a1f696937f359f9ac4ce1e9144c811ae0db1459862d78e"} Mar 09 15:32:05 crc kubenswrapper[4722]: I0309 15:32:05.938953 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551172-c6qz8" Mar 09 15:32:06 crc kubenswrapper[4722]: I0309 15:32:06.123005 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvxkz\" (UniqueName: \"kubernetes.io/projected/2378821c-5194-4bff-a140-c2a703479347-kube-api-access-dvxkz\") pod \"2378821c-5194-4bff-a140-c2a703479347\" (UID: \"2378821c-5194-4bff-a140-c2a703479347\") " Mar 09 15:32:06 crc kubenswrapper[4722]: I0309 15:32:06.132557 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2378821c-5194-4bff-a140-c2a703479347-kube-api-access-dvxkz" (OuterVolumeSpecName: "kube-api-access-dvxkz") pod "2378821c-5194-4bff-a140-c2a703479347" (UID: "2378821c-5194-4bff-a140-c2a703479347"). InnerVolumeSpecName "kube-api-access-dvxkz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 15:32:06 crc kubenswrapper[4722]: I0309 15:32:06.226646 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvxkz\" (UniqueName: \"kubernetes.io/projected/2378821c-5194-4bff-a140-c2a703479347-kube-api-access-dvxkz\") on node \"crc\" DevicePath \"\"" Mar 09 15:32:06 crc kubenswrapper[4722]: I0309 15:32:06.486940 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551172-c6qz8" event={"ID":"2378821c-5194-4bff-a140-c2a703479347","Type":"ContainerDied","Data":"59d742f22c60b4b57c8d218178153ae722dd54b7ca78c6252084cd72eb8a18fd"} Mar 09 15:32:06 crc kubenswrapper[4722]: I0309 15:32:06.486987 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="59d742f22c60b4b57c8d218178153ae722dd54b7ca78c6252084cd72eb8a18fd" Mar 09 15:32:06 crc kubenswrapper[4722]: I0309 15:32:06.486988 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551172-c6qz8" Mar 09 15:32:06 crc kubenswrapper[4722]: I0309 15:32:06.530086 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551166-bgm8q"] Mar 09 15:32:06 crc kubenswrapper[4722]: I0309 15:32:06.545861 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551166-bgm8q"] Mar 09 15:32:08 crc kubenswrapper[4722]: I0309 15:32:08.168904 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca84248b-e4d2-4c20-a7e6-4c41d64cbb7e" path="/var/lib/kubelet/pods/ca84248b-e4d2-4c20-a7e6-4c41d64cbb7e/volumes" Mar 09 15:32:17 crc kubenswrapper[4722]: I0309 15:32:17.149661 4722 scope.go:117] "RemoveContainer" containerID="05d0cab124a66072241bc2a93dace83012b26d6bbd0de5507999fb5df76700cb" Mar 09 15:32:17 crc kubenswrapper[4722]: E0309 15:32:17.150590 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 15:32:27 crc kubenswrapper[4722]: I0309 15:32:27.317985 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-72b9t"] Mar 09 15:32:27 crc kubenswrapper[4722]: E0309 15:32:27.319602 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2378821c-5194-4bff-a140-c2a703479347" containerName="oc" Mar 09 15:32:27 crc kubenswrapper[4722]: I0309 15:32:27.319620 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="2378821c-5194-4bff-a140-c2a703479347" containerName="oc" Mar 09 15:32:27 crc kubenswrapper[4722]: I0309 15:32:27.319918 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="2378821c-5194-4bff-a140-c2a703479347" containerName="oc" Mar 09 15:32:27 crc kubenswrapper[4722]: I0309 15:32:27.328593 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-72b9t" Mar 09 15:32:27 crc kubenswrapper[4722]: I0309 15:32:27.341961 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-72b9t"] Mar 09 15:32:27 crc kubenswrapper[4722]: I0309 15:32:27.419845 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjv85\" (UniqueName: \"kubernetes.io/projected/407240ef-1832-40c9-8460-17164f5e1666-kube-api-access-bjv85\") pod \"certified-operators-72b9t\" (UID: \"407240ef-1832-40c9-8460-17164f5e1666\") " pod="openshift-marketplace/certified-operators-72b9t" Mar 09 15:32:27 crc kubenswrapper[4722]: I0309 15:32:27.420277 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/407240ef-1832-40c9-8460-17164f5e1666-utilities\") pod \"certified-operators-72b9t\" (UID: \"407240ef-1832-40c9-8460-17164f5e1666\") " pod="openshift-marketplace/certified-operators-72b9t" Mar 09 15:32:27 crc kubenswrapper[4722]: I0309 15:32:27.420573 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/407240ef-1832-40c9-8460-17164f5e1666-catalog-content\") pod \"certified-operators-72b9t\" (UID: \"407240ef-1832-40c9-8460-17164f5e1666\") " pod="openshift-marketplace/certified-operators-72b9t" Mar 09 15:32:27 crc kubenswrapper[4722]: I0309 15:32:27.522544 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjv85\" (UniqueName: \"kubernetes.io/projected/407240ef-1832-40c9-8460-17164f5e1666-kube-api-access-bjv85\") pod \"certified-operators-72b9t\" (UID: \"407240ef-1832-40c9-8460-17164f5e1666\") " pod="openshift-marketplace/certified-operators-72b9t" Mar 09 15:32:27 crc kubenswrapper[4722]: I0309 15:32:27.522653 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/407240ef-1832-40c9-8460-17164f5e1666-utilities\") pod \"certified-operators-72b9t\" (UID: \"407240ef-1832-40c9-8460-17164f5e1666\") " pod="openshift-marketplace/certified-operators-72b9t" Mar 09 15:32:27 crc kubenswrapper[4722]: I0309 15:32:27.522704 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/407240ef-1832-40c9-8460-17164f5e1666-catalog-content\") pod \"certified-operators-72b9t\" (UID: \"407240ef-1832-40c9-8460-17164f5e1666\") " pod="openshift-marketplace/certified-operators-72b9t" Mar 09 15:32:27 crc kubenswrapper[4722]: I0309 15:32:27.523314 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/407240ef-1832-40c9-8460-17164f5e1666-catalog-content\") pod \"certified-operators-72b9t\" (UID: \"407240ef-1832-40c9-8460-17164f5e1666\") " pod="openshift-marketplace/certified-operators-72b9t" Mar 09 15:32:27 crc kubenswrapper[4722]: I0309 15:32:27.523330 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/407240ef-1832-40c9-8460-17164f5e1666-utilities\") pod \"certified-operators-72b9t\" (UID: \"407240ef-1832-40c9-8460-17164f5e1666\") " pod="openshift-marketplace/certified-operators-72b9t" Mar 09 15:32:27 crc kubenswrapper[4722]: I0309 15:32:27.552453 4722 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-bjv85\" (UniqueName: \"kubernetes.io/projected/407240ef-1832-40c9-8460-17164f5e1666-kube-api-access-bjv85\") pod \"certified-operators-72b9t\" (UID: \"407240ef-1832-40c9-8460-17164f5e1666\") " pod="openshift-marketplace/certified-operators-72b9t" Mar 09 15:32:27 crc kubenswrapper[4722]: I0309 15:32:27.651968 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-72b9t" Mar 09 15:32:28 crc kubenswrapper[4722]: I0309 15:32:28.201723 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-72b9t"] Mar 09 15:32:29 crc kubenswrapper[4722]: I0309 15:32:29.057856 4722 generic.go:334] "Generic (PLEG): container finished" podID="407240ef-1832-40c9-8460-17164f5e1666" containerID="a3a300d6ec98fb7c1db8a21c5525bfb75cf2b8f57743eb0e8d4e4643b4936513" exitCode=0 Mar 09 15:32:29 crc kubenswrapper[4722]: I0309 15:32:29.057943 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-72b9t" event={"ID":"407240ef-1832-40c9-8460-17164f5e1666","Type":"ContainerDied","Data":"a3a300d6ec98fb7c1db8a21c5525bfb75cf2b8f57743eb0e8d4e4643b4936513"} Mar 09 15:32:29 crc kubenswrapper[4722]: I0309 15:32:29.058232 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-72b9t" event={"ID":"407240ef-1832-40c9-8460-17164f5e1666","Type":"ContainerStarted","Data":"760feac865af11b8c805582af1c65fd194de15336305b68f69aac5e83a7797e3"} Mar 09 15:32:30 crc kubenswrapper[4722]: I0309 15:32:30.161290 4722 scope.go:117] "RemoveContainer" containerID="05d0cab124a66072241bc2a93dace83012b26d6bbd0de5507999fb5df76700cb" Mar 09 15:32:31 crc kubenswrapper[4722]: I0309 15:32:31.088087 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" event={"ID":"dac2aaf5-653b-4b2a-8efe-ed26bac8d648","Type":"ContainerStarted","Data":"4a53fd9ee2bab0740d7bdf686e82ec34043eb1813cf987fb88ca038e6c64eb91"} Mar 09 15:32:32 crc kubenswrapper[4722]: I0309 15:32:32.110772 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-72b9t" event={"ID":"407240ef-1832-40c9-8460-17164f5e1666","Type":"ContainerStarted","Data":"441274cdf7bf65e710c44dfc8a038a5c65de7bce33c8d38e219f27d84201f408"} Mar 09 15:32:33 crc kubenswrapper[4722]: I0309 15:32:33.715631 4722 scope.go:117] "RemoveContainer" containerID="824121ab0ac52d5e8b6e0e8c57813649c3d2f7180b2f52686cc3e32d5064cd75" Mar 09 15:32:33 crc kubenswrapper[4722]: I0309 15:32:33.790101 4722 scope.go:117] "RemoveContainer" containerID="60a3f7a2763968e65f4b292ed56000b48c001ad5126d8f59220768804bc386e5" Mar 09 15:32:34 crc kubenswrapper[4722]: I0309 15:32:34.133978 4722 generic.go:334] "Generic (PLEG): container finished" podID="407240ef-1832-40c9-8460-17164f5e1666" containerID="441274cdf7bf65e710c44dfc8a038a5c65de7bce33c8d38e219f27d84201f408" exitCode=0 Mar 09 15:32:34 crc kubenswrapper[4722]: I0309 15:32:34.134045 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-72b9t" event={"ID":"407240ef-1832-40c9-8460-17164f5e1666","Type":"ContainerDied","Data":"441274cdf7bf65e710c44dfc8a038a5c65de7bce33c8d38e219f27d84201f408"} Mar 09 15:32:35 crc kubenswrapper[4722]: I0309 15:32:35.164721 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-72b9t" 
event={"ID":"407240ef-1832-40c9-8460-17164f5e1666","Type":"ContainerStarted","Data":"27ac1935d0f59c67f773f9e632d31b644cc423a747d070c0ef52819054b193a2"} Mar 09 15:32:35 crc kubenswrapper[4722]: I0309 15:32:35.216294 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-72b9t" podStartSLOduration=2.550008943 podStartE2EDuration="8.216269651s" podCreationTimestamp="2026-03-09 15:32:27 +0000 UTC" firstStartedPulling="2026-03-09 15:32:29.060871281 +0000 UTC m=+5389.616439857" lastFinishedPulling="2026-03-09 15:32:34.727131989 +0000 UTC m=+5395.282700565" observedRunningTime="2026-03-09 15:32:35.20561781 +0000 UTC m=+5395.761186436" watchObservedRunningTime="2026-03-09 15:32:35.216269651 +0000 UTC m=+5395.771838227" Mar 09 15:32:37 crc kubenswrapper[4722]: I0309 15:32:37.652339 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-72b9t" Mar 09 15:32:37 crc kubenswrapper[4722]: I0309 15:32:37.652922 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-72b9t" Mar 09 15:32:38 crc kubenswrapper[4722]: I0309 15:32:38.706488 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-72b9t" podUID="407240ef-1832-40c9-8460-17164f5e1666" containerName="registry-server" probeResult="failure" output=< Mar 09 15:32:38 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s Mar 09 15:32:38 crc kubenswrapper[4722]: > Mar 09 15:32:40 crc kubenswrapper[4722]: I0309 15:32:40.251265 4722 generic.go:334] "Generic (PLEG): container finished" podID="13a7cb7f-e80c-4ff8-9a4e-1c1048399cd5" containerID="10da958cc3f84b9190c63354e4053d2f9fad8b6733521bf86ee17326bf2d8503" exitCode=0 Mar 09 15:32:40 crc kubenswrapper[4722]: I0309 15:32:40.251606 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bgdq7/must-gather-sd2h4" event={"ID":"13a7cb7f-e80c-4ff8-9a4e-1c1048399cd5","Type":"ContainerDied","Data":"10da958cc3f84b9190c63354e4053d2f9fad8b6733521bf86ee17326bf2d8503"} Mar 09 15:32:40 crc kubenswrapper[4722]: I0309 15:32:40.252396 4722 scope.go:117] "RemoveContainer" containerID="10da958cc3f84b9190c63354e4053d2f9fad8b6733521bf86ee17326bf2d8503" Mar 09 15:32:40 crc kubenswrapper[4722]: I0309 15:32:40.922800 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-bgdq7_must-gather-sd2h4_13a7cb7f-e80c-4ff8-9a4e-1c1048399cd5/gather/0.log" Mar 09 15:32:43 crc kubenswrapper[4722]: I0309 15:32:43.863383 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-h64qb"] Mar 09 15:32:43 crc kubenswrapper[4722]: I0309 15:32:43.868131 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h64qb" Mar 09 15:32:43 crc kubenswrapper[4722]: I0309 15:32:43.887235 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h64qb"] Mar 09 15:32:43 crc kubenswrapper[4722]: I0309 15:32:43.969000 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76lrf\" (UniqueName: \"kubernetes.io/projected/1910c215-3bd9-4767-8839-12ed902c7a84-kube-api-access-76lrf\") pod \"redhat-marketplace-h64qb\" (UID: \"1910c215-3bd9-4767-8839-12ed902c7a84\") " pod="openshift-marketplace/redhat-marketplace-h64qb" Mar 09 15:32:43 crc kubenswrapper[4722]: I0309 15:32:43.969060 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1910c215-3bd9-4767-8839-12ed902c7a84-utilities\") pod \"redhat-marketplace-h64qb\" (UID: \"1910c215-3bd9-4767-8839-12ed902c7a84\") " pod="openshift-marketplace/redhat-marketplace-h64qb" Mar 09 15:32:43 crc kubenswrapper[4722]: I0309 15:32:43.970091 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1910c215-3bd9-4767-8839-12ed902c7a84-catalog-content\") pod \"redhat-marketplace-h64qb\" (UID: \"1910c215-3bd9-4767-8839-12ed902c7a84\") " pod="openshift-marketplace/redhat-marketplace-h64qb" Mar 09 15:32:44 crc kubenswrapper[4722]: I0309 15:32:44.073464 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76lrf\" (UniqueName: \"kubernetes.io/projected/1910c215-3bd9-4767-8839-12ed902c7a84-kube-api-access-76lrf\") pod \"redhat-marketplace-h64qb\" (UID: \"1910c215-3bd9-4767-8839-12ed902c7a84\") " pod="openshift-marketplace/redhat-marketplace-h64qb" Mar 09 15:32:44 crc kubenswrapper[4722]: I0309 15:32:44.073531 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1910c215-3bd9-4767-8839-12ed902c7a84-utilities\") pod \"redhat-marketplace-h64qb\" (UID: \"1910c215-3bd9-4767-8839-12ed902c7a84\") " pod="openshift-marketplace/redhat-marketplace-h64qb" Mar 09 15:32:44 crc kubenswrapper[4722]: I0309 15:32:44.073881 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1910c215-3bd9-4767-8839-12ed902c7a84-catalog-content\") pod \"redhat-marketplace-h64qb\" (UID: \"1910c215-3bd9-4767-8839-12ed902c7a84\") " pod="openshift-marketplace/redhat-marketplace-h64qb" Mar 09 15:32:44 crc kubenswrapper[4722]: I0309 15:32:44.074131 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1910c215-3bd9-4767-8839-12ed902c7a84-utilities\") pod \"redhat-marketplace-h64qb\" (UID: \"1910c215-3bd9-4767-8839-12ed902c7a84\") " pod="openshift-marketplace/redhat-marketplace-h64qb" Mar 09 15:32:44 crc kubenswrapper[4722]: I0309 15:32:44.074373 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1910c215-3bd9-4767-8839-12ed902c7a84-catalog-content\") pod \"redhat-marketplace-h64qb\" (UID: \"1910c215-3bd9-4767-8839-12ed902c7a84\") " pod="openshift-marketplace/redhat-marketplace-h64qb" Mar 09 15:32:44 crc kubenswrapper[4722]: I0309 15:32:44.100135 4722 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-76lrf\" (UniqueName: \"kubernetes.io/projected/1910c215-3bd9-4767-8839-12ed902c7a84-kube-api-access-76lrf\") pod \"redhat-marketplace-h64qb\" (UID: \"1910c215-3bd9-4767-8839-12ed902c7a84\") " pod="openshift-marketplace/redhat-marketplace-h64qb" Mar 09 15:32:44 crc kubenswrapper[4722]: I0309 15:32:44.198337 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h64qb" Mar 09 15:32:44 crc kubenswrapper[4722]: I0309 15:32:44.716194 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h64qb"] Mar 09 15:32:44 crc kubenswrapper[4722]: W0309 15:32:44.983957 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1910c215_3bd9_4767_8839_12ed902c7a84.slice/crio-9ba92a5c97472e666d993a9f03efd3f0a940f97191ea1db6e864f73040ad67b4 WatchSource:0}: Error finding container 9ba92a5c97472e666d993a9f03efd3f0a940f97191ea1db6e864f73040ad67b4: Status 404 returned error can't find the container with id 9ba92a5c97472e666d993a9f03efd3f0a940f97191ea1db6e864f73040ad67b4 Mar 09 15:32:45 crc kubenswrapper[4722]: I0309 15:32:45.325781 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h64qb" event={"ID":"1910c215-3bd9-4767-8839-12ed902c7a84","Type":"ContainerStarted","Data":"9ba92a5c97472e666d993a9f03efd3f0a940f97191ea1db6e864f73040ad67b4"} Mar 09 15:32:46 crc kubenswrapper[4722]: I0309 15:32:46.340105 4722 generic.go:334] "Generic (PLEG): container finished" podID="1910c215-3bd9-4767-8839-12ed902c7a84" containerID="e23562ac3309bc10c6dcf6b196a690279859571bc7352770360d99e3d5187a11" exitCode=0 Mar 09 15:32:46 crc kubenswrapper[4722]: I0309 15:32:46.340426 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h64qb" event={"ID":"1910c215-3bd9-4767-8839-12ed902c7a84","Type":"ContainerDied","Data":"e23562ac3309bc10c6dcf6b196a690279859571bc7352770360d99e3d5187a11"} Mar 09 15:32:47 crc kubenswrapper[4722]: I0309 15:32:47.148725 4722 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 09 15:32:47 crc kubenswrapper[4722]: I0309 15:32:47.353602 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h64qb" event={"ID":"1910c215-3bd9-4767-8839-12ed902c7a84","Type":"ContainerStarted","Data":"4318593f98957e1f3387172dc5f613b46a21ea8327c4788ff112e84aab38c7e2"} Mar 09 15:32:48 crc kubenswrapper[4722]: I0309 15:32:48.704817 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-72b9t" podUID="407240ef-1832-40c9-8460-17164f5e1666" containerName="registry-server" probeResult="failure" output=< Mar 09 15:32:48 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s Mar 09 15:32:48 crc kubenswrapper[4722]: > Mar 09 15:32:49 crc kubenswrapper[4722]: I0309 15:32:49.378758 4722 generic.go:334] "Generic (PLEG): container finished" podID="1910c215-3bd9-4767-8839-12ed902c7a84" containerID="4318593f98957e1f3387172dc5f613b46a21ea8327c4788ff112e84aab38c7e2" exitCode=0 Mar 09 15:32:49 crc kubenswrapper[4722]: I0309 15:32:49.378966 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h64qb" event={"ID":"1910c215-3bd9-4767-8839-12ed902c7a84","Type":"ContainerDied","Data":"4318593f98957e1f3387172dc5f613b46a21ea8327c4788ff112e84aab38c7e2"} Mar 09 
15:32:50 crc kubenswrapper[4722]: I0309 15:32:50.392915 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h64qb" event={"ID":"1910c215-3bd9-4767-8839-12ed902c7a84","Type":"ContainerStarted","Data":"ef0c010bfe456ca2ddb3b5371f0eee92efff01b7399a6f4b0827c9059f87a838"} Mar 09 15:32:50 crc kubenswrapper[4722]: I0309 15:32:50.417218 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-h64qb" podStartSLOduration=3.924820945 podStartE2EDuration="7.41717611s" podCreationTimestamp="2026-03-09 15:32:43 +0000 UTC" firstStartedPulling="2026-03-09 15:32:46.351150431 +0000 UTC m=+5406.906719007" lastFinishedPulling="2026-03-09 15:32:49.843505596 +0000 UTC m=+5410.399074172" observedRunningTime="2026-03-09 15:32:50.412235965 +0000 UTC m=+5410.967804541" watchObservedRunningTime="2026-03-09 15:32:50.41717611 +0000 UTC m=+5410.972744686" Mar 09 15:32:54 crc kubenswrapper[4722]: I0309 15:32:54.028151 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-bgdq7/must-gather-sd2h4"] Mar 09 15:32:54 crc kubenswrapper[4722]: I0309 15:32:54.029317 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-bgdq7/must-gather-sd2h4" podUID="13a7cb7f-e80c-4ff8-9a4e-1c1048399cd5" containerName="copy" containerID="cri-o://1d7ec723dc1c1c0a85c1e18d64e41de3d9f4364b2e000c15dc8996ae9106febb" gracePeriod=2 Mar 09 15:32:54 crc kubenswrapper[4722]: I0309 15:32:54.040172 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-bgdq7/must-gather-sd2h4"] Mar 09 15:32:54 crc kubenswrapper[4722]: I0309 15:32:54.200849 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-h64qb" Mar 09 15:32:54 crc kubenswrapper[4722]: I0309 15:32:54.201548 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-h64qb" Mar 09 15:32:54 crc kubenswrapper[4722]: I0309 15:32:54.445145 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-bgdq7_must-gather-sd2h4_13a7cb7f-e80c-4ff8-9a4e-1c1048399cd5/copy/0.log" Mar 09 15:32:54 crc kubenswrapper[4722]: I0309 15:32:54.445703 4722 generic.go:334] "Generic (PLEG): container finished" podID="13a7cb7f-e80c-4ff8-9a4e-1c1048399cd5" containerID="1d7ec723dc1c1c0a85c1e18d64e41de3d9f4364b2e000c15dc8996ae9106febb" exitCode=143 Mar 09 15:32:54 crc kubenswrapper[4722]: I0309 15:32:54.744075 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-bgdq7_must-gather-sd2h4_13a7cb7f-e80c-4ff8-9a4e-1c1048399cd5/copy/0.log" Mar 09 15:32:54 crc kubenswrapper[4722]: I0309 15:32:54.744787 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bgdq7/must-gather-sd2h4" Mar 09 15:32:54 crc kubenswrapper[4722]: I0309 15:32:54.842102 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hzdw9\" (UniqueName: \"kubernetes.io/projected/13a7cb7f-e80c-4ff8-9a4e-1c1048399cd5-kube-api-access-hzdw9\") pod \"13a7cb7f-e80c-4ff8-9a4e-1c1048399cd5\" (UID: \"13a7cb7f-e80c-4ff8-9a4e-1c1048399cd5\") " Mar 09 15:32:54 crc kubenswrapper[4722]: I0309 15:32:54.842173 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/13a7cb7f-e80c-4ff8-9a4e-1c1048399cd5-must-gather-output\") pod \"13a7cb7f-e80c-4ff8-9a4e-1c1048399cd5\" (UID: \"13a7cb7f-e80c-4ff8-9a4e-1c1048399cd5\") " Mar 09 15:32:54 crc kubenswrapper[4722]: I0309 15:32:54.848806 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13a7cb7f-e80c-4ff8-9a4e-1c1048399cd5-kube-api-access-hzdw9" (OuterVolumeSpecName: "kube-api-access-hzdw9") pod "13a7cb7f-e80c-4ff8-9a4e-1c1048399cd5" (UID: "13a7cb7f-e80c-4ff8-9a4e-1c1048399cd5"). InnerVolumeSpecName "kube-api-access-hzdw9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 15:32:54 crc kubenswrapper[4722]: I0309 15:32:54.944913 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hzdw9\" (UniqueName: \"kubernetes.io/projected/13a7cb7f-e80c-4ff8-9a4e-1c1048399cd5-kube-api-access-hzdw9\") on node \"crc\" DevicePath \"\"" Mar 09 15:32:55 crc kubenswrapper[4722]: I0309 15:32:55.017763 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13a7cb7f-e80c-4ff8-9a4e-1c1048399cd5-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "13a7cb7f-e80c-4ff8-9a4e-1c1048399cd5" (UID: "13a7cb7f-e80c-4ff8-9a4e-1c1048399cd5"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 15:32:55 crc kubenswrapper[4722]: I0309 15:32:55.046538 4722 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/13a7cb7f-e80c-4ff8-9a4e-1c1048399cd5-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 09 15:32:55 crc kubenswrapper[4722]: I0309 15:32:55.281253 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-h64qb" podUID="1910c215-3bd9-4767-8839-12ed902c7a84" containerName="registry-server" probeResult="failure" output=< Mar 09 15:32:55 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s Mar 09 15:32:55 crc kubenswrapper[4722]: > Mar 09 15:32:55 crc kubenswrapper[4722]: I0309 15:32:55.456904 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-bgdq7_must-gather-sd2h4_13a7cb7f-e80c-4ff8-9a4e-1c1048399cd5/copy/0.log" Mar 09 15:32:55 crc kubenswrapper[4722]: I0309 15:32:55.457318 4722 scope.go:117] "RemoveContainer" containerID="1d7ec723dc1c1c0a85c1e18d64e41de3d9f4364b2e000c15dc8996ae9106febb" Mar 09 15:32:55 crc kubenswrapper[4722]: I0309 15:32:55.457394 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bgdq7/must-gather-sd2h4" Mar 09 15:32:55 crc kubenswrapper[4722]: I0309 15:32:55.479047 4722 scope.go:117] "RemoveContainer" containerID="10da958cc3f84b9190c63354e4053d2f9fad8b6733521bf86ee17326bf2d8503" Mar 09 15:32:56 crc kubenswrapper[4722]: I0309 15:32:56.163233 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13a7cb7f-e80c-4ff8-9a4e-1c1048399cd5" path="/var/lib/kubelet/pods/13a7cb7f-e80c-4ff8-9a4e-1c1048399cd5/volumes" Mar 09 15:32:58 crc kubenswrapper[4722]: I0309 15:32:58.709018 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-72b9t" podUID="407240ef-1832-40c9-8460-17164f5e1666" containerName="registry-server" probeResult="failure" output=< Mar 09 15:32:58 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s Mar 09 15:32:58 crc kubenswrapper[4722]: > Mar 09 15:33:04 crc kubenswrapper[4722]: I0309 15:33:04.255907 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-h64qb" Mar 09 15:33:04 crc kubenswrapper[4722]: I0309 15:33:04.311065 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-h64qb" Mar 09 15:33:04 crc kubenswrapper[4722]: I0309 15:33:04.513318 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h64qb"] Mar 09 15:33:05 crc kubenswrapper[4722]: I0309 15:33:05.574821 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-h64qb" podUID="1910c215-3bd9-4767-8839-12ed902c7a84" containerName="registry-server" containerID="cri-o://ef0c010bfe456ca2ddb3b5371f0eee92efff01b7399a6f4b0827c9059f87a838" gracePeriod=2 Mar 09 15:33:06 crc kubenswrapper[4722]: I0309 15:33:06.487068 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h64qb" Mar 09 15:33:06 crc kubenswrapper[4722]: I0309 15:33:06.589292 4722 generic.go:334] "Generic (PLEG): container finished" podID="1910c215-3bd9-4767-8839-12ed902c7a84" containerID="ef0c010bfe456ca2ddb3b5371f0eee92efff01b7399a6f4b0827c9059f87a838" exitCode=0 Mar 09 15:33:06 crc kubenswrapper[4722]: I0309 15:33:06.589336 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h64qb" event={"ID":"1910c215-3bd9-4767-8839-12ed902c7a84","Type":"ContainerDied","Data":"ef0c010bfe456ca2ddb3b5371f0eee92efff01b7399a6f4b0827c9059f87a838"} Mar 09 15:33:06 crc kubenswrapper[4722]: I0309 15:33:06.589362 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h64qb" event={"ID":"1910c215-3bd9-4767-8839-12ed902c7a84","Type":"ContainerDied","Data":"9ba92a5c97472e666d993a9f03efd3f0a940f97191ea1db6e864f73040ad67b4"} Mar 09 15:33:06 crc kubenswrapper[4722]: I0309 15:33:06.589378 4722 scope.go:117] "RemoveContainer" containerID="ef0c010bfe456ca2ddb3b5371f0eee92efff01b7399a6f4b0827c9059f87a838" Mar 09 15:33:06 crc kubenswrapper[4722]: I0309 15:33:06.589500 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h64qb" Mar 09 15:33:06 crc kubenswrapper[4722]: I0309 15:33:06.596282 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1910c215-3bd9-4767-8839-12ed902c7a84-utilities\") pod \"1910c215-3bd9-4767-8839-12ed902c7a84\" (UID: \"1910c215-3bd9-4767-8839-12ed902c7a84\") " Mar 09 15:33:06 crc kubenswrapper[4722]: I0309 15:33:06.596446 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76lrf\" (UniqueName: \"kubernetes.io/projected/1910c215-3bd9-4767-8839-12ed902c7a84-kube-api-access-76lrf\") pod \"1910c215-3bd9-4767-8839-12ed902c7a84\" (UID: \"1910c215-3bd9-4767-8839-12ed902c7a84\") " Mar 09 15:33:06 crc kubenswrapper[4722]: I0309 15:33:06.596709 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1910c215-3bd9-4767-8839-12ed902c7a84-catalog-content\") pod \"1910c215-3bd9-4767-8839-12ed902c7a84\" (UID: \"1910c215-3bd9-4767-8839-12ed902c7a84\") " Mar 09 15:33:06 crc kubenswrapper[4722]: I0309 15:33:06.597344 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1910c215-3bd9-4767-8839-12ed902c7a84-utilities" (OuterVolumeSpecName: "utilities") pod "1910c215-3bd9-4767-8839-12ed902c7a84" (UID: "1910c215-3bd9-4767-8839-12ed902c7a84"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 15:33:06 crc kubenswrapper[4722]: I0309 15:33:06.608502 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1910c215-3bd9-4767-8839-12ed902c7a84-kube-api-access-76lrf" (OuterVolumeSpecName: "kube-api-access-76lrf") pod "1910c215-3bd9-4767-8839-12ed902c7a84" (UID: "1910c215-3bd9-4767-8839-12ed902c7a84"). InnerVolumeSpecName "kube-api-access-76lrf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 15:33:06 crc kubenswrapper[4722]: I0309 15:33:06.628408 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1910c215-3bd9-4767-8839-12ed902c7a84-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1910c215-3bd9-4767-8839-12ed902c7a84" (UID: "1910c215-3bd9-4767-8839-12ed902c7a84"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 15:33:06 crc kubenswrapper[4722]: I0309 15:33:06.629003 4722 scope.go:117] "RemoveContainer" containerID="4318593f98957e1f3387172dc5f613b46a21ea8327c4788ff112e84aab38c7e2" Mar 09 15:33:06 crc kubenswrapper[4722]: I0309 15:33:06.690940 4722 scope.go:117] "RemoveContainer" containerID="e23562ac3309bc10c6dcf6b196a690279859571bc7352770360d99e3d5187a11" Mar 09 15:33:06 crc kubenswrapper[4722]: I0309 15:33:06.703229 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1910c215-3bd9-4767-8839-12ed902c7a84-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 15:33:06 crc kubenswrapper[4722]: I0309 15:33:06.703262 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1910c215-3bd9-4767-8839-12ed902c7a84-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 15:33:06 crc kubenswrapper[4722]: I0309 15:33:06.703276 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76lrf\" (UniqueName: \"kubernetes.io/projected/1910c215-3bd9-4767-8839-12ed902c7a84-kube-api-access-76lrf\") on node \"crc\" DevicePath \"\"" Mar 09 15:33:06 crc kubenswrapper[4722]: I0309 15:33:06.736648 4722 scope.go:117] "RemoveContainer" containerID="ef0c010bfe456ca2ddb3b5371f0eee92efff01b7399a6f4b0827c9059f87a838" Mar 09 15:33:06 crc kubenswrapper[4722]: E0309 15:33:06.737860 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef0c010bfe456ca2ddb3b5371f0eee92efff01b7399a6f4b0827c9059f87a838\": container with ID starting with ef0c010bfe456ca2ddb3b5371f0eee92efff01b7399a6f4b0827c9059f87a838 not found: ID does not exist" containerID="ef0c010bfe456ca2ddb3b5371f0eee92efff01b7399a6f4b0827c9059f87a838" Mar 09 15:33:06 crc kubenswrapper[4722]: I0309 15:33:06.737912 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef0c010bfe456ca2ddb3b5371f0eee92efff01b7399a6f4b0827c9059f87a838"} err="failed to get container status \"ef0c010bfe456ca2ddb3b5371f0eee92efff01b7399a6f4b0827c9059f87a838\": rpc error: code = NotFound desc = could not find container \"ef0c010bfe456ca2ddb3b5371f0eee92efff01b7399a6f4b0827c9059f87a838\": container with ID starting with ef0c010bfe456ca2ddb3b5371f0eee92efff01b7399a6f4b0827c9059f87a838 not found: ID does not exist" Mar 09 15:33:06 crc kubenswrapper[4722]: I0309 15:33:06.737939 4722 scope.go:117] "RemoveContainer" containerID="4318593f98957e1f3387172dc5f613b46a21ea8327c4788ff112e84aab38c7e2" Mar 09 15:33:06 crc kubenswrapper[4722]: E0309 15:33:06.738161 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4318593f98957e1f3387172dc5f613b46a21ea8327c4788ff112e84aab38c7e2\": container with ID starting with 4318593f98957e1f3387172dc5f613b46a21ea8327c4788ff112e84aab38c7e2 not found: ID does not exist" containerID="4318593f98957e1f3387172dc5f613b46a21ea8327c4788ff112e84aab38c7e2" Mar 09 15:33:06 crc kubenswrapper[4722]: I0309 15:33:06.738184 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4318593f98957e1f3387172dc5f613b46a21ea8327c4788ff112e84aab38c7e2"} err="failed to get container status \"4318593f98957e1f3387172dc5f613b46a21ea8327c4788ff112e84aab38c7e2\": rpc error: code = NotFound desc = could not find container 
\"4318593f98957e1f3387172dc5f613b46a21ea8327c4788ff112e84aab38c7e2\": container with ID starting with 4318593f98957e1f3387172dc5f613b46a21ea8327c4788ff112e84aab38c7e2 not found: ID does not exist" Mar 09 15:33:06 crc kubenswrapper[4722]: I0309 15:33:06.738214 4722 scope.go:117] "RemoveContainer" containerID="e23562ac3309bc10c6dcf6b196a690279859571bc7352770360d99e3d5187a11" Mar 09 15:33:06 crc kubenswrapper[4722]: E0309 15:33:06.738471 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e23562ac3309bc10c6dcf6b196a690279859571bc7352770360d99e3d5187a11\": container with ID starting with e23562ac3309bc10c6dcf6b196a690279859571bc7352770360d99e3d5187a11 not found: ID does not exist" containerID="e23562ac3309bc10c6dcf6b196a690279859571bc7352770360d99e3d5187a11" Mar 09 15:33:06 crc kubenswrapper[4722]: I0309 15:33:06.738502 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e23562ac3309bc10c6dcf6b196a690279859571bc7352770360d99e3d5187a11"} err="failed to get container status \"e23562ac3309bc10c6dcf6b196a690279859571bc7352770360d99e3d5187a11\": rpc error: code = NotFound desc = could not find container \"e23562ac3309bc10c6dcf6b196a690279859571bc7352770360d99e3d5187a11\": container with ID starting with e23562ac3309bc10c6dcf6b196a690279859571bc7352770360d99e3d5187a11 not found: ID does not exist" Mar 09 15:33:06 crc kubenswrapper[4722]: I0309 15:33:06.930781 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h64qb"] Mar 09 15:33:06 crc kubenswrapper[4722]: I0309 15:33:06.944618 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-h64qb"] Mar 09 15:33:08 crc kubenswrapper[4722]: I0309 15:33:08.120081 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-72b9t" Mar 09 15:33:08 crc kubenswrapper[4722]: I0309 15:33:08.166610 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1910c215-3bd9-4767-8839-12ed902c7a84" path="/var/lib/kubelet/pods/1910c215-3bd9-4767-8839-12ed902c7a84/volumes" Mar 09 15:33:08 crc kubenswrapper[4722]: I0309 15:33:08.175024 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-72b9t" Mar 09 15:33:09 crc kubenswrapper[4722]: I0309 15:33:09.901894 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-72b9t"] Mar 09 15:33:09 crc kubenswrapper[4722]: I0309 15:33:09.902758 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-72b9t" podUID="407240ef-1832-40c9-8460-17164f5e1666" containerName="registry-server" containerID="cri-o://27ac1935d0f59c67f773f9e632d31b644cc423a747d070c0ef52819054b193a2" gracePeriod=2 Mar 09 15:33:10 crc kubenswrapper[4722]: I0309 15:33:10.532601 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-72b9t" Mar 09 15:33:10 crc kubenswrapper[4722]: I0309 15:33:10.654641 4722 generic.go:334] "Generic (PLEG): container finished" podID="407240ef-1832-40c9-8460-17164f5e1666" containerID="27ac1935d0f59c67f773f9e632d31b644cc423a747d070c0ef52819054b193a2" exitCode=0 Mar 09 15:33:10 crc kubenswrapper[4722]: I0309 15:33:10.654684 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-72b9t" event={"ID":"407240ef-1832-40c9-8460-17164f5e1666","Type":"ContainerDied","Data":"27ac1935d0f59c67f773f9e632d31b644cc423a747d070c0ef52819054b193a2"} Mar 09 15:33:10 crc kubenswrapper[4722]: I0309 15:33:10.654715 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-72b9t" event={"ID":"407240ef-1832-40c9-8460-17164f5e1666","Type":"ContainerDied","Data":"760feac865af11b8c805582af1c65fd194de15336305b68f69aac5e83a7797e3"} Mar 09 15:33:10 crc kubenswrapper[4722]: I0309 15:33:10.654732 4722 scope.go:117] "RemoveContainer" containerID="27ac1935d0f59c67f773f9e632d31b644cc423a747d070c0ef52819054b193a2" Mar 09 15:33:10 crc kubenswrapper[4722]: I0309 15:33:10.654998 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-72b9t" Mar 09 15:33:10 crc kubenswrapper[4722]: I0309 15:33:10.674136 4722 scope.go:117] "RemoveContainer" containerID="441274cdf7bf65e710c44dfc8a038a5c65de7bce33c8d38e219f27d84201f408" Mar 09 15:33:10 crc kubenswrapper[4722]: I0309 15:33:10.697022 4722 scope.go:117] "RemoveContainer" containerID="a3a300d6ec98fb7c1db8a21c5525bfb75cf2b8f57743eb0e8d4e4643b4936513" Mar 09 15:33:10 crc kubenswrapper[4722]: I0309 15:33:10.701461 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/407240ef-1832-40c9-8460-17164f5e1666-catalog-content\") pod \"407240ef-1832-40c9-8460-17164f5e1666\" (UID: \"407240ef-1832-40c9-8460-17164f5e1666\") " Mar 09 15:33:10 crc kubenswrapper[4722]: I0309 15:33:10.701568 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/407240ef-1832-40c9-8460-17164f5e1666-utilities\") pod \"407240ef-1832-40c9-8460-17164f5e1666\" (UID: \"407240ef-1832-40c9-8460-17164f5e1666\") " Mar 09 15:33:10 crc kubenswrapper[4722]: I0309 15:33:10.701783 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjv85\" (UniqueName: \"kubernetes.io/projected/407240ef-1832-40c9-8460-17164f5e1666-kube-api-access-bjv85\") pod \"407240ef-1832-40c9-8460-17164f5e1666\" (UID: \"407240ef-1832-40c9-8460-17164f5e1666\") " Mar 09 15:33:10 crc kubenswrapper[4722]: I0309 15:33:10.704039 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/407240ef-1832-40c9-8460-17164f5e1666-utilities" (OuterVolumeSpecName: "utilities") pod "407240ef-1832-40c9-8460-17164f5e1666" (UID: "407240ef-1832-40c9-8460-17164f5e1666"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 15:33:10 crc kubenswrapper[4722]: I0309 15:33:10.709700 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/407240ef-1832-40c9-8460-17164f5e1666-kube-api-access-bjv85" (OuterVolumeSpecName: "kube-api-access-bjv85") pod "407240ef-1832-40c9-8460-17164f5e1666" (UID: "407240ef-1832-40c9-8460-17164f5e1666"). InnerVolumeSpecName "kube-api-access-bjv85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 15:33:10 crc kubenswrapper[4722]: I0309 15:33:10.760442 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/407240ef-1832-40c9-8460-17164f5e1666-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "407240ef-1832-40c9-8460-17164f5e1666" (UID: "407240ef-1832-40c9-8460-17164f5e1666"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 15:33:10 crc kubenswrapper[4722]: I0309 15:33:10.805071 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjv85\" (UniqueName: \"kubernetes.io/projected/407240ef-1832-40c9-8460-17164f5e1666-kube-api-access-bjv85\") on node \"crc\" DevicePath \"\"" Mar 09 15:33:10 crc kubenswrapper[4722]: I0309 15:33:10.805110 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/407240ef-1832-40c9-8460-17164f5e1666-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 15:33:10 crc kubenswrapper[4722]: I0309 15:33:10.805122 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/407240ef-1832-40c9-8460-17164f5e1666-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 15:33:10 crc kubenswrapper[4722]: I0309 15:33:10.824941 4722 scope.go:117] "RemoveContainer" containerID="27ac1935d0f59c67f773f9e632d31b644cc423a747d070c0ef52819054b193a2" Mar 09 15:33:10 crc kubenswrapper[4722]: E0309 15:33:10.825810 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27ac1935d0f59c67f773f9e632d31b644cc423a747d070c0ef52819054b193a2\": container with ID starting with 27ac1935d0f59c67f773f9e632d31b644cc423a747d070c0ef52819054b193a2 not found: ID does not exist" containerID="27ac1935d0f59c67f773f9e632d31b644cc423a747d070c0ef52819054b193a2" Mar 09 15:33:10 crc kubenswrapper[4722]: I0309 15:33:10.825881 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27ac1935d0f59c67f773f9e632d31b644cc423a747d070c0ef52819054b193a2"} err="failed to get container status \"27ac1935d0f59c67f773f9e632d31b644cc423a747d070c0ef52819054b193a2\": rpc error: code = NotFound desc = could not find container \"27ac1935d0f59c67f773f9e632d31b644cc423a747d070c0ef52819054b193a2\": container with ID starting with 27ac1935d0f59c67f773f9e632d31b644cc423a747d070c0ef52819054b193a2 not found: ID does not exist" Mar 09 15:33:10 crc kubenswrapper[4722]: I0309 15:33:10.825920 4722 scope.go:117] "RemoveContainer" containerID="441274cdf7bf65e710c44dfc8a038a5c65de7bce33c8d38e219f27d84201f408" Mar 09 15:33:10 crc kubenswrapper[4722]: E0309 15:33:10.826551 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"441274cdf7bf65e710c44dfc8a038a5c65de7bce33c8d38e219f27d84201f408\": container with ID starting with 441274cdf7bf65e710c44dfc8a038a5c65de7bce33c8d38e219f27d84201f408 not 
found: ID does not exist" containerID="441274cdf7bf65e710c44dfc8a038a5c65de7bce33c8d38e219f27d84201f408" Mar 09 15:33:10 crc kubenswrapper[4722]: I0309 15:33:10.826611 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"441274cdf7bf65e710c44dfc8a038a5c65de7bce33c8d38e219f27d84201f408"} err="failed to get container status \"441274cdf7bf65e710c44dfc8a038a5c65de7bce33c8d38e219f27d84201f408\": rpc error: code = NotFound desc = could not find container \"441274cdf7bf65e710c44dfc8a038a5c65de7bce33c8d38e219f27d84201f408\": container with ID starting with 441274cdf7bf65e710c44dfc8a038a5c65de7bce33c8d38e219f27d84201f408 not found: ID does not exist" Mar 09 15:33:10 crc kubenswrapper[4722]: I0309 15:33:10.826638 4722 scope.go:117] "RemoveContainer" containerID="a3a300d6ec98fb7c1db8a21c5525bfb75cf2b8f57743eb0e8d4e4643b4936513" Mar 09 15:33:10 crc kubenswrapper[4722]: E0309 15:33:10.826968 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3a300d6ec98fb7c1db8a21c5525bfb75cf2b8f57743eb0e8d4e4643b4936513\": container with ID starting with a3a300d6ec98fb7c1db8a21c5525bfb75cf2b8f57743eb0e8d4e4643b4936513 not found: ID does not exist" containerID="a3a300d6ec98fb7c1db8a21c5525bfb75cf2b8f57743eb0e8d4e4643b4936513" Mar 09 15:33:10 crc kubenswrapper[4722]: I0309 15:33:10.827053 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3a300d6ec98fb7c1db8a21c5525bfb75cf2b8f57743eb0e8d4e4643b4936513"} err="failed to get container status \"a3a300d6ec98fb7c1db8a21c5525bfb75cf2b8f57743eb0e8d4e4643b4936513\": rpc error: code = NotFound desc = could not find container \"a3a300d6ec98fb7c1db8a21c5525bfb75cf2b8f57743eb0e8d4e4643b4936513\": container with ID starting with a3a300d6ec98fb7c1db8a21c5525bfb75cf2b8f57743eb0e8d4e4643b4936513 not found: ID does not exist" Mar 09 15:33:10 crc kubenswrapper[4722]: I0309 15:33:10.998241 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-72b9t"] Mar 09 15:33:11 crc kubenswrapper[4722]: I0309 15:33:11.029605 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-72b9t"] Mar 09 15:33:12 crc kubenswrapper[4722]: I0309 15:33:12.176040 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="407240ef-1832-40c9-8460-17164f5e1666" path="/var/lib/kubelet/pods/407240ef-1832-40c9-8460-17164f5e1666/volumes" Mar 09 15:34:00 crc kubenswrapper[4722]: I0309 15:34:00.148841 4722 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 09 15:34:00 crc kubenswrapper[4722]: I0309 15:34:00.172174 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551174-bzkws"] Mar 09 15:34:00 crc kubenswrapper[4722]: E0309 15:34:00.181007 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1910c215-3bd9-4767-8839-12ed902c7a84" containerName="registry-server" Mar 09 15:34:00 crc kubenswrapper[4722]: I0309 15:34:00.181049 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="1910c215-3bd9-4767-8839-12ed902c7a84" containerName="registry-server" Mar 09 15:34:00 crc kubenswrapper[4722]: E0309 15:34:00.181103 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="407240ef-1832-40c9-8460-17164f5e1666" containerName="extract-content" Mar 09 15:34:00 crc kubenswrapper[4722]: I0309 15:34:00.181112 4722 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="407240ef-1832-40c9-8460-17164f5e1666" containerName="extract-content" Mar 09 15:34:00 crc kubenswrapper[4722]: E0309 15:34:00.181162 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1910c215-3bd9-4767-8839-12ed902c7a84" containerName="extract-content" Mar 09 15:34:00 crc kubenswrapper[4722]: I0309 15:34:00.181171 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="1910c215-3bd9-4767-8839-12ed902c7a84" containerName="extract-content" Mar 09 15:34:00 crc kubenswrapper[4722]: E0309 15:34:00.181242 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13a7cb7f-e80c-4ff8-9a4e-1c1048399cd5" containerName="copy" Mar 09 15:34:00 crc kubenswrapper[4722]: I0309 15:34:00.181251 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="13a7cb7f-e80c-4ff8-9a4e-1c1048399cd5" containerName="copy" Mar 09 15:34:00 crc kubenswrapper[4722]: E0309 15:34:00.181269 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13a7cb7f-e80c-4ff8-9a4e-1c1048399cd5" containerName="gather" Mar 09 15:34:00 crc kubenswrapper[4722]: I0309 15:34:00.181277 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="13a7cb7f-e80c-4ff8-9a4e-1c1048399cd5" containerName="gather" Mar 09 15:34:00 crc kubenswrapper[4722]: E0309 15:34:00.181293 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1910c215-3bd9-4767-8839-12ed902c7a84" containerName="extract-utilities" Mar 09 15:34:00 crc kubenswrapper[4722]: I0309 15:34:00.181301 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="1910c215-3bd9-4767-8839-12ed902c7a84" containerName="extract-utilities" Mar 09 15:34:00 crc kubenswrapper[4722]: E0309 15:34:00.181318 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="407240ef-1832-40c9-8460-17164f5e1666" containerName="registry-server" Mar 09 15:34:00 crc kubenswrapper[4722]: I0309 15:34:00.181326 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="407240ef-1832-40c9-8460-17164f5e1666" containerName="registry-server" Mar 09 15:34:00 crc kubenswrapper[4722]: E0309 15:34:00.181343 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="407240ef-1832-40c9-8460-17164f5e1666" containerName="extract-utilities" Mar 09 15:34:00 crc kubenswrapper[4722]: I0309 15:34:00.181350 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="407240ef-1832-40c9-8460-17164f5e1666" containerName="extract-utilities" Mar 09 15:34:00 crc kubenswrapper[4722]: I0309 15:34:00.181800 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="1910c215-3bd9-4767-8839-12ed902c7a84" containerName="registry-server" Mar 09 15:34:00 crc kubenswrapper[4722]: I0309 15:34:00.181822 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="13a7cb7f-e80c-4ff8-9a4e-1c1048399cd5" containerName="copy" Mar 09 15:34:00 crc kubenswrapper[4722]: I0309 15:34:00.181854 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="13a7cb7f-e80c-4ff8-9a4e-1c1048399cd5" containerName="gather" Mar 09 15:34:00 crc kubenswrapper[4722]: I0309 15:34:00.181871 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="407240ef-1832-40c9-8460-17164f5e1666" containerName="registry-server" Mar 09 15:34:00 crc kubenswrapper[4722]: I0309 15:34:00.182897 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551174-bzkws"] Mar 09 15:34:00 crc kubenswrapper[4722]: I0309 15:34:00.183008 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551174-bzkws" Mar 09 15:34:00 crc kubenswrapper[4722]: I0309 15:34:00.186141 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 15:34:00 crc kubenswrapper[4722]: I0309 15:34:00.187374 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-dr2x6" Mar 09 15:34:00 crc kubenswrapper[4722]: I0309 15:34:00.187436 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 15:34:00 crc kubenswrapper[4722]: I0309 15:34:00.329414 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbxdp\" (UniqueName: \"kubernetes.io/projected/27d1b542-78a5-4f99-bf22-c30efb0a190b-kube-api-access-qbxdp\") pod \"auto-csr-approver-29551174-bzkws\" (UID: \"27d1b542-78a5-4f99-bf22-c30efb0a190b\") " pod="openshift-infra/auto-csr-approver-29551174-bzkws" Mar 09 15:34:00 crc kubenswrapper[4722]: I0309 15:34:00.432112 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbxdp\" (UniqueName: \"kubernetes.io/projected/27d1b542-78a5-4f99-bf22-c30efb0a190b-kube-api-access-qbxdp\") pod \"auto-csr-approver-29551174-bzkws\" (UID: \"27d1b542-78a5-4f99-bf22-c30efb0a190b\") " pod="openshift-infra/auto-csr-approver-29551174-bzkws" Mar 09 15:34:00 crc kubenswrapper[4722]: I0309 15:34:00.456885 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbxdp\" (UniqueName: \"kubernetes.io/projected/27d1b542-78a5-4f99-bf22-c30efb0a190b-kube-api-access-qbxdp\") pod \"auto-csr-approver-29551174-bzkws\" (UID: \"27d1b542-78a5-4f99-bf22-c30efb0a190b\") " pod="openshift-infra/auto-csr-approver-29551174-bzkws" Mar 09 15:34:00 crc kubenswrapper[4722]: I0309 15:34:00.505388 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551174-bzkws" Mar 09 15:34:00 crc kubenswrapper[4722]: I0309 15:34:00.995813 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551174-bzkws"] Mar 09 15:34:01 crc kubenswrapper[4722]: I0309 15:34:01.008790 4722 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 09 15:34:01 crc kubenswrapper[4722]: I0309 15:34:01.314016 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551174-bzkws" event={"ID":"27d1b542-78a5-4f99-bf22-c30efb0a190b","Type":"ContainerStarted","Data":"d779a64ead8d7a5053bd4ceb4a97d25e6ecbe53b61621f162231fdee0e508210"} Mar 09 15:34:03 crc kubenswrapper[4722]: I0309 15:34:03.334329 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551174-bzkws" event={"ID":"27d1b542-78a5-4f99-bf22-c30efb0a190b","Type":"ContainerStarted","Data":"fab2fd7dd30201799fbf375a1892ea949b1452cf414c215dda0ff35213cf6e19"} Mar 09 15:34:03 crc kubenswrapper[4722]: I0309 15:34:03.362792 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551174-bzkws" podStartSLOduration=2.032622434 podStartE2EDuration="3.362769134s" podCreationTimestamp="2026-03-09 15:34:00 +0000 UTC" firstStartedPulling="2026-03-09 15:34:01.008534267 +0000 UTC m=+5481.564102843" lastFinishedPulling="2026-03-09 15:34:02.338680957 +0000 UTC m=+5482.894249543" observedRunningTime="2026-03-09 15:34:03.348812932 +0000 UTC m=+5483.904381528" watchObservedRunningTime="2026-03-09 15:34:03.362769134 +0000 UTC m=+5483.918337720" Mar 09 15:34:04 crc kubenswrapper[4722]: I0309 15:34:04.348079 4722 generic.go:334] "Generic (PLEG): container finished" podID="27d1b542-78a5-4f99-bf22-c30efb0a190b" containerID="fab2fd7dd30201799fbf375a1892ea949b1452cf414c215dda0ff35213cf6e19" exitCode=0 Mar 09 15:34:04 crc kubenswrapper[4722]: I0309 15:34:04.348220 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551174-bzkws" event={"ID":"27d1b542-78a5-4f99-bf22-c30efb0a190b","Type":"ContainerDied","Data":"fab2fd7dd30201799fbf375a1892ea949b1452cf414c215dda0ff35213cf6e19"} Mar 09 15:34:05 crc kubenswrapper[4722]: I0309 15:34:05.805936 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551174-bzkws" Mar 09 15:34:05 crc kubenswrapper[4722]: I0309 15:34:05.868139 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qbxdp\" (UniqueName: \"kubernetes.io/projected/27d1b542-78a5-4f99-bf22-c30efb0a190b-kube-api-access-qbxdp\") pod \"27d1b542-78a5-4f99-bf22-c30efb0a190b\" (UID: \"27d1b542-78a5-4f99-bf22-c30efb0a190b\") " Mar 09 15:34:05 crc kubenswrapper[4722]: I0309 15:34:05.874780 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27d1b542-78a5-4f99-bf22-c30efb0a190b-kube-api-access-qbxdp" (OuterVolumeSpecName: "kube-api-access-qbxdp") pod "27d1b542-78a5-4f99-bf22-c30efb0a190b" (UID: "27d1b542-78a5-4f99-bf22-c30efb0a190b"). InnerVolumeSpecName "kube-api-access-qbxdp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 15:34:05 crc kubenswrapper[4722]: I0309 15:34:05.971585 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qbxdp\" (UniqueName: \"kubernetes.io/projected/27d1b542-78a5-4f99-bf22-c30efb0a190b-kube-api-access-qbxdp\") on node \"crc\" DevicePath \"\"" Mar 09 15:34:06 crc kubenswrapper[4722]: I0309 15:34:06.376566 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551174-bzkws" event={"ID":"27d1b542-78a5-4f99-bf22-c30efb0a190b","Type":"ContainerDied","Data":"d779a64ead8d7a5053bd4ceb4a97d25e6ecbe53b61621f162231fdee0e508210"} Mar 09 15:34:06 crc kubenswrapper[4722]: I0309 15:34:06.376614 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d779a64ead8d7a5053bd4ceb4a97d25e6ecbe53b61621f162231fdee0e508210" Mar 09 15:34:06 crc kubenswrapper[4722]: I0309 15:34:06.376683 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551174-bzkws" Mar 09 15:34:06 crc kubenswrapper[4722]: I0309 15:34:06.439707 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551168-cxpl9"] Mar 09 15:34:06 crc kubenswrapper[4722]: I0309 15:34:06.458029 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551168-cxpl9"] Mar 09 15:34:08 crc kubenswrapper[4722]: I0309 15:34:08.173751 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f65d5c88-8e7d-4c6e-a14d-b838623e152a" path="/var/lib/kubelet/pods/f65d5c88-8e7d-4c6e-a14d-b838623e152a/volumes" Mar 09 15:34:29 crc kubenswrapper[4722]: I0309 15:34:29.248544 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9jxrv"] Mar 09 15:34:29 crc kubenswrapper[4722]: E0309 15:34:29.250120 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27d1b542-78a5-4f99-bf22-c30efb0a190b" containerName="oc" Mar 09 15:34:29 crc kubenswrapper[4722]: I0309 15:34:29.250137 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="27d1b542-78a5-4f99-bf22-c30efb0a190b" containerName="oc" Mar 09 15:34:29 crc kubenswrapper[4722]: I0309 15:34:29.250517 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="27d1b542-78a5-4f99-bf22-c30efb0a190b" containerName="oc" Mar 09 15:34:29 crc kubenswrapper[4722]: I0309 15:34:29.253789 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9jxrv" Mar 09 15:34:29 crc kubenswrapper[4722]: I0309 15:34:29.278628 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9jxrv"] Mar 09 15:34:29 crc kubenswrapper[4722]: I0309 15:34:29.380837 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2aca18bf-036f-47e0-b011-5fb536c536dc-utilities\") pod \"community-operators-9jxrv\" (UID: \"2aca18bf-036f-47e0-b011-5fb536c536dc\") " pod="openshift-marketplace/community-operators-9jxrv" Mar 09 15:34:29 crc kubenswrapper[4722]: I0309 15:34:29.380909 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mn46q\" (UniqueName: \"kubernetes.io/projected/2aca18bf-036f-47e0-b011-5fb536c536dc-kube-api-access-mn46q\") pod \"community-operators-9jxrv\" (UID: \"2aca18bf-036f-47e0-b011-5fb536c536dc\") " pod="openshift-marketplace/community-operators-9jxrv" Mar 09 15:34:29 crc kubenswrapper[4722]: I0309 15:34:29.380972 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2aca18bf-036f-47e0-b011-5fb536c536dc-catalog-content\") pod \"community-operators-9jxrv\" (UID: \"2aca18bf-036f-47e0-b011-5fb536c536dc\") " pod="openshift-marketplace/community-operators-9jxrv" Mar 09 15:34:29 crc kubenswrapper[4722]: I0309 15:34:29.483270 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2aca18bf-036f-47e0-b011-5fb536c536dc-utilities\") pod \"community-operators-9jxrv\" (UID: \"2aca18bf-036f-47e0-b011-5fb536c536dc\") " pod="openshift-marketplace/community-operators-9jxrv" Mar 09 15:34:29 crc kubenswrapper[4722]: I0309 15:34:29.483387 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mn46q\" (UniqueName: \"kubernetes.io/projected/2aca18bf-036f-47e0-b011-5fb536c536dc-kube-api-access-mn46q\") pod \"community-operators-9jxrv\" (UID: \"2aca18bf-036f-47e0-b011-5fb536c536dc\") " pod="openshift-marketplace/community-operators-9jxrv" Mar 09 15:34:29 crc kubenswrapper[4722]: I0309 15:34:29.483427 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2aca18bf-036f-47e0-b011-5fb536c536dc-catalog-content\") pod \"community-operators-9jxrv\" (UID: \"2aca18bf-036f-47e0-b011-5fb536c536dc\") " pod="openshift-marketplace/community-operators-9jxrv" Mar 09 15:34:29 crc kubenswrapper[4722]: I0309 15:34:29.484076 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2aca18bf-036f-47e0-b011-5fb536c536dc-catalog-content\") pod \"community-operators-9jxrv\" (UID: \"2aca18bf-036f-47e0-b011-5fb536c536dc\") " pod="openshift-marketplace/community-operators-9jxrv" Mar 09 15:34:29 crc kubenswrapper[4722]: I0309 15:34:29.484076 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2aca18bf-036f-47e0-b011-5fb536c536dc-utilities\") pod \"community-operators-9jxrv\" (UID: \"2aca18bf-036f-47e0-b011-5fb536c536dc\") " pod="openshift-marketplace/community-operators-9jxrv" Mar 09 15:34:29 crc kubenswrapper[4722]: I0309 15:34:29.501715 4722 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-mn46q\" (UniqueName: \"kubernetes.io/projected/2aca18bf-036f-47e0-b011-5fb536c536dc-kube-api-access-mn46q\") pod \"community-operators-9jxrv\" (UID: \"2aca18bf-036f-47e0-b011-5fb536c536dc\") " pod="openshift-marketplace/community-operators-9jxrv" Mar 09 15:34:29 crc kubenswrapper[4722]: I0309 15:34:29.581953 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9jxrv" Mar 09 15:34:30 crc kubenswrapper[4722]: I0309 15:34:30.123005 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9jxrv"] Mar 09 15:34:30 crc kubenswrapper[4722]: W0309 15:34:30.782583 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2aca18bf_036f_47e0_b011_5fb536c536dc.slice/crio-0e39078b5a6dcb5820fe10c98628701ae58a070c0a9a9b4c6659b5c9dc5d6ce0 WatchSource:0}: Error finding container 0e39078b5a6dcb5820fe10c98628701ae58a070c0a9a9b4c6659b5c9dc5d6ce0: Status 404 returned error can't find the container with id 0e39078b5a6dcb5820fe10c98628701ae58a070c0a9a9b4c6659b5c9dc5d6ce0 Mar 09 15:34:31 crc kubenswrapper[4722]: I0309 15:34:31.695474 4722 generic.go:334] "Generic (PLEG): container finished" podID="2aca18bf-036f-47e0-b011-5fb536c536dc" containerID="fca61f92c3131f46f4dc0a628e5727ebecc08232860a5dcdccd8d69dd20b3621" exitCode=0 Mar 09 15:34:31 crc kubenswrapper[4722]: I0309 15:34:31.695546 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9jxrv" event={"ID":"2aca18bf-036f-47e0-b011-5fb536c536dc","Type":"ContainerDied","Data":"fca61f92c3131f46f4dc0a628e5727ebecc08232860a5dcdccd8d69dd20b3621"} Mar 09 15:34:31 crc kubenswrapper[4722]: I0309 15:34:31.696189 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9jxrv" event={"ID":"2aca18bf-036f-47e0-b011-5fb536c536dc","Type":"ContainerStarted","Data":"0e39078b5a6dcb5820fe10c98628701ae58a070c0a9a9b4c6659b5c9dc5d6ce0"} Mar 09 15:34:32 crc kubenswrapper[4722]: I0309 15:34:32.707281 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9jxrv" event={"ID":"2aca18bf-036f-47e0-b011-5fb536c536dc","Type":"ContainerStarted","Data":"4d335a0eb1d293e13a3b81c8160d062f8f5b9d928d48d0db2c73c98cdc87e65e"} Mar 09 15:34:34 crc kubenswrapper[4722]: I0309 15:34:34.013383 4722 scope.go:117] "RemoveContainer" containerID="14fee07583b34b908693441432e9847512fa63e284d6b01fe43f82e351ad444b" Mar 09 15:34:34 crc kubenswrapper[4722]: I0309 15:34:34.732365 4722 generic.go:334] "Generic (PLEG): container finished" podID="2aca18bf-036f-47e0-b011-5fb536c536dc" containerID="4d335a0eb1d293e13a3b81c8160d062f8f5b9d928d48d0db2c73c98cdc87e65e" exitCode=0 Mar 09 15:34:34 crc kubenswrapper[4722]: I0309 15:34:34.732449 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9jxrv" event={"ID":"2aca18bf-036f-47e0-b011-5fb536c536dc","Type":"ContainerDied","Data":"4d335a0eb1d293e13a3b81c8160d062f8f5b9d928d48d0db2c73c98cdc87e65e"} Mar 09 15:34:35 crc kubenswrapper[4722]: I0309 15:34:35.745984 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9jxrv" event={"ID":"2aca18bf-036f-47e0-b011-5fb536c536dc","Type":"ContainerStarted","Data":"fe1396d48e4a8308e05d791247be11778fa1b694198bb2cfd0f202fa1fad5db1"} Mar 09 15:34:35 crc 
kubenswrapper[4722]: I0309 15:34:35.767728 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9jxrv" podStartSLOduration=3.321152662 podStartE2EDuration="6.767710182s" podCreationTimestamp="2026-03-09 15:34:29 +0000 UTC" firstStartedPulling="2026-03-09 15:34:31.697916391 +0000 UTC m=+5512.253484967" lastFinishedPulling="2026-03-09 15:34:35.144473911 +0000 UTC m=+5515.700042487" observedRunningTime="2026-03-09 15:34:35.765587274 +0000 UTC m=+5516.321155860" watchObservedRunningTime="2026-03-09 15:34:35.767710182 +0000 UTC m=+5516.323278758" Mar 09 15:34:39 crc kubenswrapper[4722]: I0309 15:34:39.583141 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9jxrv" Mar 09 15:34:39 crc kubenswrapper[4722]: I0309 15:34:39.584449 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9jxrv" Mar 09 15:34:40 crc kubenswrapper[4722]: I0309 15:34:40.663777 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-9jxrv" podUID="2aca18bf-036f-47e0-b011-5fb536c536dc" containerName="registry-server" probeResult="failure" output=< Mar 09 15:34:40 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s Mar 09 15:34:40 crc kubenswrapper[4722]: > Mar 09 15:34:49 crc kubenswrapper[4722]: I0309 15:34:49.682815 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9jxrv" Mar 09 15:34:49 crc kubenswrapper[4722]: I0309 15:34:49.753956 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9jxrv" Mar 09 15:34:49 crc kubenswrapper[4722]: I0309 15:34:49.948269 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9jxrv"] Mar 09 15:34:50 crc kubenswrapper[4722]: I0309 15:34:50.938093 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9jxrv" podUID="2aca18bf-036f-47e0-b011-5fb536c536dc" containerName="registry-server" containerID="cri-o://fe1396d48e4a8308e05d791247be11778fa1b694198bb2cfd0f202fa1fad5db1" gracePeriod=2 Mar 09 15:34:51 crc kubenswrapper[4722]: I0309 15:34:51.528267 4722 patch_prober.go:28] interesting pod/machine-config-daemon-hjrrb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 15:34:51 crc kubenswrapper[4722]: I0309 15:34:51.528532 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 15:34:51 crc kubenswrapper[4722]: I0309 15:34:51.544551 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9jxrv" Mar 09 15:34:51 crc kubenswrapper[4722]: I0309 15:34:51.620621 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2aca18bf-036f-47e0-b011-5fb536c536dc-catalog-content\") pod \"2aca18bf-036f-47e0-b011-5fb536c536dc\" (UID: \"2aca18bf-036f-47e0-b011-5fb536c536dc\") " Mar 09 15:34:51 crc kubenswrapper[4722]: I0309 15:34:51.620712 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2aca18bf-036f-47e0-b011-5fb536c536dc-utilities\") pod \"2aca18bf-036f-47e0-b011-5fb536c536dc\" (UID: \"2aca18bf-036f-47e0-b011-5fb536c536dc\") " Mar 09 15:34:51 crc kubenswrapper[4722]: I0309 15:34:51.620855 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mn46q\" (UniqueName: \"kubernetes.io/projected/2aca18bf-036f-47e0-b011-5fb536c536dc-kube-api-access-mn46q\") pod \"2aca18bf-036f-47e0-b011-5fb536c536dc\" (UID: \"2aca18bf-036f-47e0-b011-5fb536c536dc\") " Mar 09 15:34:51 crc kubenswrapper[4722]: I0309 15:34:51.623328 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2aca18bf-036f-47e0-b011-5fb536c536dc-utilities" (OuterVolumeSpecName: "utilities") pod "2aca18bf-036f-47e0-b011-5fb536c536dc" (UID: "2aca18bf-036f-47e0-b011-5fb536c536dc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 15:34:51 crc kubenswrapper[4722]: I0309 15:34:51.634086 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2aca18bf-036f-47e0-b011-5fb536c536dc-kube-api-access-mn46q" (OuterVolumeSpecName: "kube-api-access-mn46q") pod "2aca18bf-036f-47e0-b011-5fb536c536dc" (UID: "2aca18bf-036f-47e0-b011-5fb536c536dc"). InnerVolumeSpecName "kube-api-access-mn46q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 15:34:51 crc kubenswrapper[4722]: I0309 15:34:51.691438 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2aca18bf-036f-47e0-b011-5fb536c536dc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2aca18bf-036f-47e0-b011-5fb536c536dc" (UID: "2aca18bf-036f-47e0-b011-5fb536c536dc"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 15:34:51 crc kubenswrapper[4722]: I0309 15:34:51.723297 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mn46q\" (UniqueName: \"kubernetes.io/projected/2aca18bf-036f-47e0-b011-5fb536c536dc-kube-api-access-mn46q\") on node \"crc\" DevicePath \"\"" Mar 09 15:34:51 crc kubenswrapper[4722]: I0309 15:34:51.723333 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2aca18bf-036f-47e0-b011-5fb536c536dc-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 15:34:51 crc kubenswrapper[4722]: I0309 15:34:51.723342 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2aca18bf-036f-47e0-b011-5fb536c536dc-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 15:34:51 crc kubenswrapper[4722]: I0309 15:34:51.949185 4722 generic.go:334] "Generic (PLEG): container finished" podID="2aca18bf-036f-47e0-b011-5fb536c536dc" containerID="fe1396d48e4a8308e05d791247be11778fa1b694198bb2cfd0f202fa1fad5db1" exitCode=0 Mar 09 15:34:51 crc kubenswrapper[4722]: I0309 15:34:51.949244 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9jxrv" event={"ID":"2aca18bf-036f-47e0-b011-5fb536c536dc","Type":"ContainerDied","Data":"fe1396d48e4a8308e05d791247be11778fa1b694198bb2cfd0f202fa1fad5db1"} Mar 09 15:34:51 crc kubenswrapper[4722]: I0309 15:34:51.949272 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9jxrv" event={"ID":"2aca18bf-036f-47e0-b011-5fb536c536dc","Type":"ContainerDied","Data":"0e39078b5a6dcb5820fe10c98628701ae58a070c0a9a9b4c6659b5c9dc5d6ce0"} Mar 09 15:34:51 crc kubenswrapper[4722]: I0309 15:34:51.949289 4722 scope.go:117] "RemoveContainer" containerID="fe1396d48e4a8308e05d791247be11778fa1b694198bb2cfd0f202fa1fad5db1" Mar 09 15:34:51 crc kubenswrapper[4722]: I0309 15:34:51.949287 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9jxrv" Mar 09 15:34:51 crc kubenswrapper[4722]: I0309 15:34:51.997269 4722 scope.go:117] "RemoveContainer" containerID="4d335a0eb1d293e13a3b81c8160d062f8f5b9d928d48d0db2c73c98cdc87e65e" Mar 09 15:34:52 crc kubenswrapper[4722]: I0309 15:34:52.001883 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9jxrv"] Mar 09 15:34:52 crc kubenswrapper[4722]: I0309 15:34:52.012445 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9jxrv"] Mar 09 15:34:52 crc kubenswrapper[4722]: I0309 15:34:52.036731 4722 scope.go:117] "RemoveContainer" containerID="fca61f92c3131f46f4dc0a628e5727ebecc08232860a5dcdccd8d69dd20b3621" Mar 09 15:34:52 crc kubenswrapper[4722]: I0309 15:34:52.087055 4722 scope.go:117] "RemoveContainer" containerID="fe1396d48e4a8308e05d791247be11778fa1b694198bb2cfd0f202fa1fad5db1" Mar 09 15:34:52 crc kubenswrapper[4722]: E0309 15:34:52.087634 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe1396d48e4a8308e05d791247be11778fa1b694198bb2cfd0f202fa1fad5db1\": container with ID starting with fe1396d48e4a8308e05d791247be11778fa1b694198bb2cfd0f202fa1fad5db1 not found: ID does not exist" containerID="fe1396d48e4a8308e05d791247be11778fa1b694198bb2cfd0f202fa1fad5db1" Mar 09 15:34:52 crc kubenswrapper[4722]: I0309 15:34:52.087712 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe1396d48e4a8308e05d791247be11778fa1b694198bb2cfd0f202fa1fad5db1"} err="failed to get container status \"fe1396d48e4a8308e05d791247be11778fa1b694198bb2cfd0f202fa1fad5db1\": rpc error: code = NotFound desc = could not find container \"fe1396d48e4a8308e05d791247be11778fa1b694198bb2cfd0f202fa1fad5db1\": container with ID starting with fe1396d48e4a8308e05d791247be11778fa1b694198bb2cfd0f202fa1fad5db1 not found: ID does not exist" Mar 09 15:34:52 crc kubenswrapper[4722]: I0309 15:34:52.087760 4722 scope.go:117] "RemoveContainer" containerID="4d335a0eb1d293e13a3b81c8160d062f8f5b9d928d48d0db2c73c98cdc87e65e" Mar 09 15:34:52 crc kubenswrapper[4722]: E0309 15:34:52.088123 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d335a0eb1d293e13a3b81c8160d062f8f5b9d928d48d0db2c73c98cdc87e65e\": container with ID starting with 4d335a0eb1d293e13a3b81c8160d062f8f5b9d928d48d0db2c73c98cdc87e65e not found: ID does not exist" containerID="4d335a0eb1d293e13a3b81c8160d062f8f5b9d928d48d0db2c73c98cdc87e65e" Mar 09 15:34:52 crc kubenswrapper[4722]: I0309 15:34:52.088170 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d335a0eb1d293e13a3b81c8160d062f8f5b9d928d48d0db2c73c98cdc87e65e"} err="failed to get container status \"4d335a0eb1d293e13a3b81c8160d062f8f5b9d928d48d0db2c73c98cdc87e65e\": rpc error: code = NotFound desc = could not find container \"4d335a0eb1d293e13a3b81c8160d062f8f5b9d928d48d0db2c73c98cdc87e65e\": container with ID starting with 4d335a0eb1d293e13a3b81c8160d062f8f5b9d928d48d0db2c73c98cdc87e65e not found: ID does not exist" Mar 09 15:34:52 crc kubenswrapper[4722]: I0309 15:34:52.088193 4722 scope.go:117] "RemoveContainer" containerID="fca61f92c3131f46f4dc0a628e5727ebecc08232860a5dcdccd8d69dd20b3621" Mar 09 15:34:52 crc kubenswrapper[4722]: E0309 15:34:52.088560 4722 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"fca61f92c3131f46f4dc0a628e5727ebecc08232860a5dcdccd8d69dd20b3621\": container with ID starting with fca61f92c3131f46f4dc0a628e5727ebecc08232860a5dcdccd8d69dd20b3621 not found: ID does not exist" containerID="fca61f92c3131f46f4dc0a628e5727ebecc08232860a5dcdccd8d69dd20b3621" Mar 09 15:34:52 crc kubenswrapper[4722]: I0309 15:34:52.088581 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fca61f92c3131f46f4dc0a628e5727ebecc08232860a5dcdccd8d69dd20b3621"} err="failed to get container status \"fca61f92c3131f46f4dc0a628e5727ebecc08232860a5dcdccd8d69dd20b3621\": rpc error: code = NotFound desc = could not find container \"fca61f92c3131f46f4dc0a628e5727ebecc08232860a5dcdccd8d69dd20b3621\": container with ID starting with fca61f92c3131f46f4dc0a628e5727ebecc08232860a5dcdccd8d69dd20b3621 not found: ID does not exist" Mar 09 15:34:52 crc kubenswrapper[4722]: I0309 15:34:52.162156 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2aca18bf-036f-47e0-b011-5fb536c536dc" path="/var/lib/kubelet/pods/2aca18bf-036f-47e0-b011-5fb536c536dc/volumes" Mar 09 15:35:21 crc kubenswrapper[4722]: I0309 15:35:21.528081 4722 patch_prober.go:28] interesting pod/machine-config-daemon-hjrrb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 15:35:21 crc kubenswrapper[4722]: I0309 15:35:21.528605 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 15:35:26 crc kubenswrapper[4722]: I0309 15:35:26.148829 4722 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 09 15:35:30 crc kubenswrapper[4722]: I0309 15:35:30.224489 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bth2g"] Mar 09 15:35:30 crc kubenswrapper[4722]: E0309 15:35:30.226793 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2aca18bf-036f-47e0-b011-5fb536c536dc" containerName="extract-content" Mar 09 15:35:30 crc kubenswrapper[4722]: I0309 15:35:30.226933 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="2aca18bf-036f-47e0-b011-5fb536c536dc" containerName="extract-content" Mar 09 15:35:30 crc kubenswrapper[4722]: E0309 15:35:30.227134 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2aca18bf-036f-47e0-b011-5fb536c536dc" containerName="extract-utilities" Mar 09 15:35:30 crc kubenswrapper[4722]: I0309 15:35:30.227265 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="2aca18bf-036f-47e0-b011-5fb536c536dc" containerName="extract-utilities" Mar 09 15:35:30 crc kubenswrapper[4722]: E0309 15:35:30.227365 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2aca18bf-036f-47e0-b011-5fb536c536dc" containerName="registry-server" Mar 09 15:35:30 crc kubenswrapper[4722]: I0309 15:35:30.227379 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="2aca18bf-036f-47e0-b011-5fb536c536dc" containerName="registry-server" Mar 09 15:35:30 crc kubenswrapper[4722]: I0309 15:35:30.227877 4722 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="2aca18bf-036f-47e0-b011-5fb536c536dc" containerName="registry-server" Mar 09 15:35:30 crc kubenswrapper[4722]: I0309 15:35:30.230333 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bth2g" Mar 09 15:35:30 crc kubenswrapper[4722]: I0309 15:35:30.244640 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bth2g"] Mar 09 15:35:30 crc kubenswrapper[4722]: I0309 15:35:30.365149 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34452a38-1c2d-44a0-87b4-8d22c9774b81-catalog-content\") pod \"redhat-operators-bth2g\" (UID: \"34452a38-1c2d-44a0-87b4-8d22c9774b81\") " pod="openshift-marketplace/redhat-operators-bth2g" Mar 09 15:35:30 crc kubenswrapper[4722]: I0309 15:35:30.365223 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34452a38-1c2d-44a0-87b4-8d22c9774b81-utilities\") pod \"redhat-operators-bth2g\" (UID: \"34452a38-1c2d-44a0-87b4-8d22c9774b81\") " pod="openshift-marketplace/redhat-operators-bth2g" Mar 09 15:35:30 crc kubenswrapper[4722]: I0309 15:35:30.365358 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kc9c8\" (UniqueName: \"kubernetes.io/projected/34452a38-1c2d-44a0-87b4-8d22c9774b81-kube-api-access-kc9c8\") pod \"redhat-operators-bth2g\" (UID: \"34452a38-1c2d-44a0-87b4-8d22c9774b81\") " pod="openshift-marketplace/redhat-operators-bth2g" Mar 09 15:35:30 crc kubenswrapper[4722]: I0309 15:35:30.467465 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34452a38-1c2d-44a0-87b4-8d22c9774b81-catalog-content\") pod \"redhat-operators-bth2g\" (UID: \"34452a38-1c2d-44a0-87b4-8d22c9774b81\") " pod="openshift-marketplace/redhat-operators-bth2g" Mar 09 15:35:30 crc kubenswrapper[4722]: I0309 15:35:30.467522 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34452a38-1c2d-44a0-87b4-8d22c9774b81-utilities\") pod \"redhat-operators-bth2g\" (UID: \"34452a38-1c2d-44a0-87b4-8d22c9774b81\") " pod="openshift-marketplace/redhat-operators-bth2g" Mar 09 15:35:30 crc kubenswrapper[4722]: I0309 15:35:30.467625 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kc9c8\" (UniqueName: \"kubernetes.io/projected/34452a38-1c2d-44a0-87b4-8d22c9774b81-kube-api-access-kc9c8\") pod \"redhat-operators-bth2g\" (UID: \"34452a38-1c2d-44a0-87b4-8d22c9774b81\") " pod="openshift-marketplace/redhat-operators-bth2g" Mar 09 15:35:30 crc kubenswrapper[4722]: I0309 15:35:30.468763 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34452a38-1c2d-44a0-87b4-8d22c9774b81-catalog-content\") pod \"redhat-operators-bth2g\" (UID: \"34452a38-1c2d-44a0-87b4-8d22c9774b81\") " pod="openshift-marketplace/redhat-operators-bth2g" Mar 09 15:35:30 crc kubenswrapper[4722]: I0309 15:35:30.469478 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34452a38-1c2d-44a0-87b4-8d22c9774b81-utilities\") pod \"redhat-operators-bth2g\" (UID: 
\"34452a38-1c2d-44a0-87b4-8d22c9774b81\") " pod="openshift-marketplace/redhat-operators-bth2g" Mar 09 15:35:30 crc kubenswrapper[4722]: I0309 15:35:30.495976 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kc9c8\" (UniqueName: \"kubernetes.io/projected/34452a38-1c2d-44a0-87b4-8d22c9774b81-kube-api-access-kc9c8\") pod \"redhat-operators-bth2g\" (UID: \"34452a38-1c2d-44a0-87b4-8d22c9774b81\") " pod="openshift-marketplace/redhat-operators-bth2g" Mar 09 15:35:30 crc kubenswrapper[4722]: I0309 15:35:30.573407 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bth2g" Mar 09 15:35:31 crc kubenswrapper[4722]: I0309 15:35:31.098135 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bth2g"] Mar 09 15:35:31 crc kubenswrapper[4722]: I0309 15:35:31.535704 4722 generic.go:334] "Generic (PLEG): container finished" podID="34452a38-1c2d-44a0-87b4-8d22c9774b81" containerID="77d9080b788e1ad459f467fa139ed79df6e3207ce4051b32f1ff09b52cbb5f6d" exitCode=0 Mar 09 15:35:31 crc kubenswrapper[4722]: I0309 15:35:31.535801 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bth2g" event={"ID":"34452a38-1c2d-44a0-87b4-8d22c9774b81","Type":"ContainerDied","Data":"77d9080b788e1ad459f467fa139ed79df6e3207ce4051b32f1ff09b52cbb5f6d"} Mar 09 15:35:31 crc kubenswrapper[4722]: I0309 15:35:31.536013 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bth2g" event={"ID":"34452a38-1c2d-44a0-87b4-8d22c9774b81","Type":"ContainerStarted","Data":"17f1efa6772f58c33c7b7e5a99ee6229cc0dd057396ffbc77609b4862f7f3793"} Mar 09 15:35:33 crc kubenswrapper[4722]: I0309 15:35:33.563837 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bth2g" event={"ID":"34452a38-1c2d-44a0-87b4-8d22c9774b81","Type":"ContainerStarted","Data":"d153acbff25a077b0ebf4a4b56c5df90907d0b7e3f63602d40f9ccb8e2e7304b"} Mar 09 15:35:38 crc kubenswrapper[4722]: I0309 15:35:38.623944 4722 generic.go:334] "Generic (PLEG): container finished" podID="34452a38-1c2d-44a0-87b4-8d22c9774b81" containerID="d153acbff25a077b0ebf4a4b56c5df90907d0b7e3f63602d40f9ccb8e2e7304b" exitCode=0 Mar 09 15:35:38 crc kubenswrapper[4722]: I0309 15:35:38.624007 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bth2g" event={"ID":"34452a38-1c2d-44a0-87b4-8d22c9774b81","Type":"ContainerDied","Data":"d153acbff25a077b0ebf4a4b56c5df90907d0b7e3f63602d40f9ccb8e2e7304b"} Mar 09 15:35:39 crc kubenswrapper[4722]: I0309 15:35:39.639750 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bth2g" event={"ID":"34452a38-1c2d-44a0-87b4-8d22c9774b81","Type":"ContainerStarted","Data":"8506a33963f9ebbee59fefd785189a8c6dadea5e8a4a960d8fec2b3ca2d32641"} Mar 09 15:35:39 crc kubenswrapper[4722]: I0309 15:35:39.676370 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bth2g" podStartSLOduration=2.083981257 podStartE2EDuration="9.67634542s" podCreationTimestamp="2026-03-09 15:35:30 +0000 UTC" firstStartedPulling="2026-03-09 15:35:31.537979559 +0000 UTC m=+5572.093548145" lastFinishedPulling="2026-03-09 15:35:39.130343732 +0000 UTC m=+5579.685912308" observedRunningTime="2026-03-09 15:35:39.662728297 +0000 UTC m=+5580.218296883" watchObservedRunningTime="2026-03-09 
15:35:39.67634542 +0000 UTC m=+5580.231914016" Mar 09 15:35:40 crc kubenswrapper[4722]: I0309 15:35:40.573481 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bth2g" Mar 09 15:35:40 crc kubenswrapper[4722]: I0309 15:35:40.573858 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bth2g" Mar 09 15:35:41 crc kubenswrapper[4722]: I0309 15:35:41.625545 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bth2g" podUID="34452a38-1c2d-44a0-87b4-8d22c9774b81" containerName="registry-server" probeResult="failure" output=< Mar 09 15:35:41 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s Mar 09 15:35:41 crc kubenswrapper[4722]: > Mar 09 15:35:51 crc kubenswrapper[4722]: I0309 15:35:51.528431 4722 patch_prober.go:28] interesting pod/machine-config-daemon-hjrrb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 15:35:51 crc kubenswrapper[4722]: I0309 15:35:51.528886 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 15:35:51 crc kubenswrapper[4722]: I0309 15:35:51.528929 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" Mar 09 15:35:51 crc kubenswrapper[4722]: I0309 15:35:51.530258 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4a53fd9ee2bab0740d7bdf686e82ec34043eb1813cf987fb88ca038e6c64eb91"} pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 15:35:51 crc kubenswrapper[4722]: I0309 15:35:51.530316 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerName="machine-config-daemon" containerID="cri-o://4a53fd9ee2bab0740d7bdf686e82ec34043eb1813cf987fb88ca038e6c64eb91" gracePeriod=600 Mar 09 15:35:51 crc kubenswrapper[4722]: I0309 15:35:51.642888 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bth2g" podUID="34452a38-1c2d-44a0-87b4-8d22c9774b81" containerName="registry-server" probeResult="failure" output=< Mar 09 15:35:51 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s Mar 09 15:35:51 crc kubenswrapper[4722]: > Mar 09 15:35:51 crc kubenswrapper[4722]: I0309 15:35:51.812408 4722 generic.go:334] "Generic (PLEG): container finished" podID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerID="4a53fd9ee2bab0740d7bdf686e82ec34043eb1813cf987fb88ca038e6c64eb91" exitCode=0 Mar 09 15:35:51 crc kubenswrapper[4722]: I0309 15:35:51.812582 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" 
event={"ID":"dac2aaf5-653b-4b2a-8efe-ed26bac8d648","Type":"ContainerDied","Data":"4a53fd9ee2bab0740d7bdf686e82ec34043eb1813cf987fb88ca038e6c64eb91"} Mar 09 15:35:51 crc kubenswrapper[4722]: I0309 15:35:51.814327 4722 scope.go:117] "RemoveContainer" containerID="05d0cab124a66072241bc2a93dace83012b26d6bbd0de5507999fb5df76700cb" Mar 09 15:35:52 crc kubenswrapper[4722]: I0309 15:35:52.826701 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" event={"ID":"dac2aaf5-653b-4b2a-8efe-ed26bac8d648","Type":"ContainerStarted","Data":"212e82f1263ddfd9c596ba4b91b126e5985c2c4e08e0901d34421afe340b4cd9"} Mar 09 15:36:00 crc kubenswrapper[4722]: I0309 15:36:00.165340 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551176-6ts4l"] Mar 09 15:36:00 crc kubenswrapper[4722]: I0309 15:36:00.169517 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551176-6ts4l" Mar 09 15:36:00 crc kubenswrapper[4722]: I0309 15:36:00.173391 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-dr2x6" Mar 09 15:36:00 crc kubenswrapper[4722]: I0309 15:36:00.174113 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 15:36:00 crc kubenswrapper[4722]: I0309 15:36:00.174271 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 15:36:00 crc kubenswrapper[4722]: I0309 15:36:00.183966 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551176-6ts4l"] Mar 09 15:36:00 crc kubenswrapper[4722]: I0309 15:36:00.320103 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scnk4\" (UniqueName: \"kubernetes.io/projected/b6ba7f09-db1f-44d9-ae72-b668460209a2-kube-api-access-scnk4\") pod \"auto-csr-approver-29551176-6ts4l\" (UID: \"b6ba7f09-db1f-44d9-ae72-b668460209a2\") " pod="openshift-infra/auto-csr-approver-29551176-6ts4l" Mar 09 15:36:00 crc kubenswrapper[4722]: I0309 15:36:00.422724 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scnk4\" (UniqueName: \"kubernetes.io/projected/b6ba7f09-db1f-44d9-ae72-b668460209a2-kube-api-access-scnk4\") pod \"auto-csr-approver-29551176-6ts4l\" (UID: \"b6ba7f09-db1f-44d9-ae72-b668460209a2\") " pod="openshift-infra/auto-csr-approver-29551176-6ts4l" Mar 09 15:36:00 crc kubenswrapper[4722]: I0309 15:36:00.448369 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scnk4\" (UniqueName: \"kubernetes.io/projected/b6ba7f09-db1f-44d9-ae72-b668460209a2-kube-api-access-scnk4\") pod \"auto-csr-approver-29551176-6ts4l\" (UID: \"b6ba7f09-db1f-44d9-ae72-b668460209a2\") " pod="openshift-infra/auto-csr-approver-29551176-6ts4l" Mar 09 15:36:00 crc kubenswrapper[4722]: I0309 15:36:00.498931 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551176-6ts4l" Mar 09 15:36:01 crc kubenswrapper[4722]: W0309 15:36:01.168489 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb6ba7f09_db1f_44d9_ae72_b668460209a2.slice/crio-bb4b8eb6d5569828312fd42a27c1f0a185d51515c90dcaabad59e1163e3767db WatchSource:0}: Error finding container bb4b8eb6d5569828312fd42a27c1f0a185d51515c90dcaabad59e1163e3767db: Status 404 returned error can't find the container with id bb4b8eb6d5569828312fd42a27c1f0a185d51515c90dcaabad59e1163e3767db Mar 09 15:36:01 crc kubenswrapper[4722]: I0309 15:36:01.168878 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551176-6ts4l"] Mar 09 15:36:01 crc kubenswrapper[4722]: I0309 15:36:01.644110 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bth2g" podUID="34452a38-1c2d-44a0-87b4-8d22c9774b81" containerName="registry-server" probeResult="failure" output=< Mar 09 15:36:01 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s Mar 09 15:36:01 crc kubenswrapper[4722]: > Mar 09 15:36:01 crc kubenswrapper[4722]: I0309 15:36:01.923103 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551176-6ts4l" event={"ID":"b6ba7f09-db1f-44d9-ae72-b668460209a2","Type":"ContainerStarted","Data":"bb4b8eb6d5569828312fd42a27c1f0a185d51515c90dcaabad59e1163e3767db"} Mar 09 15:36:02 crc kubenswrapper[4722]: I0309 15:36:02.954023 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551176-6ts4l" event={"ID":"b6ba7f09-db1f-44d9-ae72-b668460209a2","Type":"ContainerStarted","Data":"612651f2bee115e76c2a85ea219638791772b87c4bf710d86853bab9073fb28c"} Mar 09 15:36:02 crc kubenswrapper[4722]: I0309 15:36:02.984803 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551176-6ts4l" podStartSLOduration=2.003392037 podStartE2EDuration="2.984777865s" podCreationTimestamp="2026-03-09 15:36:00 +0000 UTC" firstStartedPulling="2026-03-09 15:36:01.172576476 +0000 UTC m=+5601.728145052" lastFinishedPulling="2026-03-09 15:36:02.153962304 +0000 UTC m=+5602.709530880" observedRunningTime="2026-03-09 15:36:02.967531793 +0000 UTC m=+5603.523100379" watchObservedRunningTime="2026-03-09 15:36:02.984777865 +0000 UTC m=+5603.540346441" Mar 09 15:36:03 crc kubenswrapper[4722]: I0309 15:36:03.970999 4722 generic.go:334] "Generic (PLEG): container finished" podID="b6ba7f09-db1f-44d9-ae72-b668460209a2" containerID="612651f2bee115e76c2a85ea219638791772b87c4bf710d86853bab9073fb28c" exitCode=0 Mar 09 15:36:03 crc kubenswrapper[4722]: I0309 15:36:03.971387 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551176-6ts4l" event={"ID":"b6ba7f09-db1f-44d9-ae72-b668460209a2","Type":"ContainerDied","Data":"612651f2bee115e76c2a85ea219638791772b87c4bf710d86853bab9073fb28c"} Mar 09 15:36:05 crc kubenswrapper[4722]: I0309 15:36:05.405687 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551176-6ts4l" Mar 09 15:36:05 crc kubenswrapper[4722]: I0309 15:36:05.449454 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-scnk4\" (UniqueName: \"kubernetes.io/projected/b6ba7f09-db1f-44d9-ae72-b668460209a2-kube-api-access-scnk4\") pod \"b6ba7f09-db1f-44d9-ae72-b668460209a2\" (UID: \"b6ba7f09-db1f-44d9-ae72-b668460209a2\") " Mar 09 15:36:05 crc kubenswrapper[4722]: I0309 15:36:05.462030 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6ba7f09-db1f-44d9-ae72-b668460209a2-kube-api-access-scnk4" (OuterVolumeSpecName: "kube-api-access-scnk4") pod "b6ba7f09-db1f-44d9-ae72-b668460209a2" (UID: "b6ba7f09-db1f-44d9-ae72-b668460209a2"). InnerVolumeSpecName "kube-api-access-scnk4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 15:36:05 crc kubenswrapper[4722]: I0309 15:36:05.552855 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-scnk4\" (UniqueName: \"kubernetes.io/projected/b6ba7f09-db1f-44d9-ae72-b668460209a2-kube-api-access-scnk4\") on node \"crc\" DevicePath \"\"" Mar 09 15:36:06 crc kubenswrapper[4722]: I0309 15:36:06.008816 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551176-6ts4l" event={"ID":"b6ba7f09-db1f-44d9-ae72-b668460209a2","Type":"ContainerDied","Data":"bb4b8eb6d5569828312fd42a27c1f0a185d51515c90dcaabad59e1163e3767db"} Mar 09 15:36:06 crc kubenswrapper[4722]: I0309 15:36:06.008883 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb4b8eb6d5569828312fd42a27c1f0a185d51515c90dcaabad59e1163e3767db" Mar 09 15:36:06 crc kubenswrapper[4722]: I0309 15:36:06.008956 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551176-6ts4l" Mar 09 15:36:06 crc kubenswrapper[4722]: I0309 15:36:06.067322 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551170-lxxqz"] Mar 09 15:36:06 crc kubenswrapper[4722]: I0309 15:36:06.083740 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551170-lxxqz"] Mar 09 15:36:06 crc kubenswrapper[4722]: I0309 15:36:06.163270 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2b03ceb-a279-4acb-b31a-61b62dd37ae0" path="/var/lib/kubelet/pods/d2b03ceb-a279-4acb-b31a-61b62dd37ae0/volumes" Mar 09 15:36:10 crc kubenswrapper[4722]: I0309 15:36:10.663699 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bth2g" Mar 09 15:36:10 crc kubenswrapper[4722]: I0309 15:36:10.729392 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bth2g" Mar 09 15:36:10 crc kubenswrapper[4722]: I0309 15:36:10.905165 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bth2g"] Mar 09 15:36:12 crc kubenswrapper[4722]: I0309 15:36:12.092274 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bth2g" podUID="34452a38-1c2d-44a0-87b4-8d22c9774b81" containerName="registry-server" containerID="cri-o://8506a33963f9ebbee59fefd785189a8c6dadea5e8a4a960d8fec2b3ca2d32641" gracePeriod=2 Mar 09 15:36:12 crc kubenswrapper[4722]: I0309 15:36:12.684999 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bth2g" Mar 09 15:36:12 crc kubenswrapper[4722]: I0309 15:36:12.752428 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34452a38-1c2d-44a0-87b4-8d22c9774b81-catalog-content\") pod \"34452a38-1c2d-44a0-87b4-8d22c9774b81\" (UID: \"34452a38-1c2d-44a0-87b4-8d22c9774b81\") " Mar 09 15:36:12 crc kubenswrapper[4722]: I0309 15:36:12.752522 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34452a38-1c2d-44a0-87b4-8d22c9774b81-utilities\") pod \"34452a38-1c2d-44a0-87b4-8d22c9774b81\" (UID: \"34452a38-1c2d-44a0-87b4-8d22c9774b81\") " Mar 09 15:36:12 crc kubenswrapper[4722]: I0309 15:36:12.752741 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kc9c8\" (UniqueName: \"kubernetes.io/projected/34452a38-1c2d-44a0-87b4-8d22c9774b81-kube-api-access-kc9c8\") pod \"34452a38-1c2d-44a0-87b4-8d22c9774b81\" (UID: \"34452a38-1c2d-44a0-87b4-8d22c9774b81\") " Mar 09 15:36:12 crc kubenswrapper[4722]: I0309 15:36:12.753826 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34452a38-1c2d-44a0-87b4-8d22c9774b81-utilities" (OuterVolumeSpecName: "utilities") pod "34452a38-1c2d-44a0-87b4-8d22c9774b81" (UID: "34452a38-1c2d-44a0-87b4-8d22c9774b81"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 15:36:12 crc kubenswrapper[4722]: I0309 15:36:12.780496 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34452a38-1c2d-44a0-87b4-8d22c9774b81-kube-api-access-kc9c8" (OuterVolumeSpecName: "kube-api-access-kc9c8") pod "34452a38-1c2d-44a0-87b4-8d22c9774b81" (UID: "34452a38-1c2d-44a0-87b4-8d22c9774b81"). InnerVolumeSpecName "kube-api-access-kc9c8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 15:36:12 crc kubenswrapper[4722]: I0309 15:36:12.855977 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kc9c8\" (UniqueName: \"kubernetes.io/projected/34452a38-1c2d-44a0-87b4-8d22c9774b81-kube-api-access-kc9c8\") on node \"crc\" DevicePath \"\"" Mar 09 15:36:12 crc kubenswrapper[4722]: I0309 15:36:12.856369 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34452a38-1c2d-44a0-87b4-8d22c9774b81-utilities\") on node \"crc\" DevicePath \"\"" Mar 09 15:36:12 crc kubenswrapper[4722]: I0309 15:36:12.907540 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34452a38-1c2d-44a0-87b4-8d22c9774b81-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "34452a38-1c2d-44a0-87b4-8d22c9774b81" (UID: "34452a38-1c2d-44a0-87b4-8d22c9774b81"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 09 15:36:12 crc kubenswrapper[4722]: I0309 15:36:12.958752 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34452a38-1c2d-44a0-87b4-8d22c9774b81-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 09 15:36:13 crc kubenswrapper[4722]: I0309 15:36:13.110639 4722 generic.go:334] "Generic (PLEG): container finished" podID="34452a38-1c2d-44a0-87b4-8d22c9774b81" containerID="8506a33963f9ebbee59fefd785189a8c6dadea5e8a4a960d8fec2b3ca2d32641" exitCode=0 Mar 09 15:36:13 crc kubenswrapper[4722]: I0309 15:36:13.110675 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bth2g" event={"ID":"34452a38-1c2d-44a0-87b4-8d22c9774b81","Type":"ContainerDied","Data":"8506a33963f9ebbee59fefd785189a8c6dadea5e8a4a960d8fec2b3ca2d32641"} Mar 09 15:36:13 crc kubenswrapper[4722]: I0309 15:36:13.110721 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bth2g" event={"ID":"34452a38-1c2d-44a0-87b4-8d22c9774b81","Type":"ContainerDied","Data":"17f1efa6772f58c33c7b7e5a99ee6229cc0dd057396ffbc77609b4862f7f3793"} Mar 09 15:36:13 crc kubenswrapper[4722]: I0309 15:36:13.110743 4722 scope.go:117] "RemoveContainer" containerID="8506a33963f9ebbee59fefd785189a8c6dadea5e8a4a960d8fec2b3ca2d32641" Mar 09 15:36:13 crc kubenswrapper[4722]: I0309 15:36:13.110756 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bth2g" Mar 09 15:36:13 crc kubenswrapper[4722]: I0309 15:36:13.140415 4722 scope.go:117] "RemoveContainer" containerID="d153acbff25a077b0ebf4a4b56c5df90907d0b7e3f63602d40f9ccb8e2e7304b" Mar 09 15:36:13 crc kubenswrapper[4722]: I0309 15:36:13.168239 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bth2g"] Mar 09 15:36:13 crc kubenswrapper[4722]: I0309 15:36:13.173516 4722 scope.go:117] "RemoveContainer" containerID="77d9080b788e1ad459f467fa139ed79df6e3207ce4051b32f1ff09b52cbb5f6d" Mar 09 15:36:13 crc kubenswrapper[4722]: I0309 15:36:13.182435 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bth2g"] Mar 09 15:36:13 crc kubenswrapper[4722]: I0309 15:36:13.222146 4722 scope.go:117] "RemoveContainer" containerID="8506a33963f9ebbee59fefd785189a8c6dadea5e8a4a960d8fec2b3ca2d32641" Mar 09 15:36:13 crc kubenswrapper[4722]: E0309 15:36:13.222663 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8506a33963f9ebbee59fefd785189a8c6dadea5e8a4a960d8fec2b3ca2d32641\": container with ID starting with 8506a33963f9ebbee59fefd785189a8c6dadea5e8a4a960d8fec2b3ca2d32641 not found: ID does not exist" containerID="8506a33963f9ebbee59fefd785189a8c6dadea5e8a4a960d8fec2b3ca2d32641" Mar 09 15:36:13 crc kubenswrapper[4722]: I0309 15:36:13.222710 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8506a33963f9ebbee59fefd785189a8c6dadea5e8a4a960d8fec2b3ca2d32641"} err="failed to get container status \"8506a33963f9ebbee59fefd785189a8c6dadea5e8a4a960d8fec2b3ca2d32641\": rpc error: code = NotFound desc = could not find container \"8506a33963f9ebbee59fefd785189a8c6dadea5e8a4a960d8fec2b3ca2d32641\": container with ID starting with 8506a33963f9ebbee59fefd785189a8c6dadea5e8a4a960d8fec2b3ca2d32641 not found: ID does not exist" Mar 09 15:36:13 crc kubenswrapper[4722]: I0309 15:36:13.222737 4722 scope.go:117] "RemoveContainer" containerID="d153acbff25a077b0ebf4a4b56c5df90907d0b7e3f63602d40f9ccb8e2e7304b" Mar 09 15:36:13 crc kubenswrapper[4722]: E0309 15:36:13.223075 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d153acbff25a077b0ebf4a4b56c5df90907d0b7e3f63602d40f9ccb8e2e7304b\": container with ID starting with d153acbff25a077b0ebf4a4b56c5df90907d0b7e3f63602d40f9ccb8e2e7304b not found: ID does not exist" containerID="d153acbff25a077b0ebf4a4b56c5df90907d0b7e3f63602d40f9ccb8e2e7304b" Mar 09 15:36:13 crc kubenswrapper[4722]: I0309 15:36:13.223114 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d153acbff25a077b0ebf4a4b56c5df90907d0b7e3f63602d40f9ccb8e2e7304b"} err="failed to get container status \"d153acbff25a077b0ebf4a4b56c5df90907d0b7e3f63602d40f9ccb8e2e7304b\": rpc error: code = NotFound desc = could not find container \"d153acbff25a077b0ebf4a4b56c5df90907d0b7e3f63602d40f9ccb8e2e7304b\": container with ID starting with d153acbff25a077b0ebf4a4b56c5df90907d0b7e3f63602d40f9ccb8e2e7304b not found: ID does not exist" Mar 09 15:36:13 crc kubenswrapper[4722]: I0309 15:36:13.223142 4722 scope.go:117] "RemoveContainer" containerID="77d9080b788e1ad459f467fa139ed79df6e3207ce4051b32f1ff09b52cbb5f6d" Mar 09 15:36:13 crc kubenswrapper[4722]: E0309 15:36:13.223508 4722 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"77d9080b788e1ad459f467fa139ed79df6e3207ce4051b32f1ff09b52cbb5f6d\": container with ID starting with 77d9080b788e1ad459f467fa139ed79df6e3207ce4051b32f1ff09b52cbb5f6d not found: ID does not exist" containerID="77d9080b788e1ad459f467fa139ed79df6e3207ce4051b32f1ff09b52cbb5f6d" Mar 09 15:36:13 crc kubenswrapper[4722]: I0309 15:36:13.223538 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77d9080b788e1ad459f467fa139ed79df6e3207ce4051b32f1ff09b52cbb5f6d"} err="failed to get container status \"77d9080b788e1ad459f467fa139ed79df6e3207ce4051b32f1ff09b52cbb5f6d\": rpc error: code = NotFound desc = could not find container \"77d9080b788e1ad459f467fa139ed79df6e3207ce4051b32f1ff09b52cbb5f6d\": container with ID starting with 77d9080b788e1ad459f467fa139ed79df6e3207ce4051b32f1ff09b52cbb5f6d not found: ID does not exist" Mar 09 15:36:14 crc kubenswrapper[4722]: I0309 15:36:14.165690 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34452a38-1c2d-44a0-87b4-8d22c9774b81" path="/var/lib/kubelet/pods/34452a38-1c2d-44a0-87b4-8d22c9774b81/volumes" Mar 09 15:36:23 crc kubenswrapper[4722]: I0309 15:36:23.840282 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-77cd88d8c5-gml97" podUID="685a5733-d06e-4523-a35a-051db91eb0be" containerName="proxy-server" probeResult="failure" output="HTTP probe failed with statuscode: 502" Mar 09 15:36:34 crc kubenswrapper[4722]: I0309 15:36:34.197437 4722 scope.go:117] "RemoveContainer" containerID="f253d61918f887f00e856fd1ce870c503a753b41202f6302d3674397bd593ac2" Mar 09 15:36:43 crc kubenswrapper[4722]: I0309 15:36:43.148729 4722 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 09 15:37:51 crc kubenswrapper[4722]: I0309 15:37:51.528191 4722 patch_prober.go:28] interesting pod/machine-config-daemon-hjrrb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 15:37:51 crc kubenswrapper[4722]: I0309 15:37:51.528772 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 15:38:00 crc kubenswrapper[4722]: I0309 15:38:00.162098 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29551178-s95kv"] Mar 09 15:38:00 crc kubenswrapper[4722]: E0309 15:38:00.162963 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34452a38-1c2d-44a0-87b4-8d22c9774b81" containerName="extract-utilities" Mar 09 15:38:00 crc kubenswrapper[4722]: I0309 15:38:00.162975 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="34452a38-1c2d-44a0-87b4-8d22c9774b81" containerName="extract-utilities" Mar 09 15:38:00 crc kubenswrapper[4722]: E0309 15:38:00.163011 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6ba7f09-db1f-44d9-ae72-b668460209a2" containerName="oc" Mar 09 15:38:00 crc kubenswrapper[4722]: I0309 15:38:00.163016 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6ba7f09-db1f-44d9-ae72-b668460209a2" containerName="oc" Mar 09 15:38:00 crc kubenswrapper[4722]: E0309 
15:38:00.163026 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34452a38-1c2d-44a0-87b4-8d22c9774b81" containerName="registry-server" Mar 09 15:38:00 crc kubenswrapper[4722]: I0309 15:38:00.163032 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="34452a38-1c2d-44a0-87b4-8d22c9774b81" containerName="registry-server" Mar 09 15:38:00 crc kubenswrapper[4722]: E0309 15:38:00.163044 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34452a38-1c2d-44a0-87b4-8d22c9774b81" containerName="extract-content" Mar 09 15:38:00 crc kubenswrapper[4722]: I0309 15:38:00.163049 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="34452a38-1c2d-44a0-87b4-8d22c9774b81" containerName="extract-content" Mar 09 15:38:00 crc kubenswrapper[4722]: I0309 15:38:00.163333 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="34452a38-1c2d-44a0-87b4-8d22c9774b81" containerName="registry-server" Mar 09 15:38:00 crc kubenswrapper[4722]: I0309 15:38:00.163363 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6ba7f09-db1f-44d9-ae72-b668460209a2" containerName="oc" Mar 09 15:38:00 crc kubenswrapper[4722]: I0309 15:38:00.164290 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551178-s95kv" Mar 09 15:38:00 crc kubenswrapper[4722]: I0309 15:38:00.168335 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-dr2x6" Mar 09 15:38:00 crc kubenswrapper[4722]: I0309 15:38:00.168337 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 09 15:38:00 crc kubenswrapper[4722]: I0309 15:38:00.168392 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 09 15:38:00 crc kubenswrapper[4722]: I0309 15:38:00.182445 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551178-s95kv"] Mar 09 15:38:00 crc kubenswrapper[4722]: I0309 15:38:00.256169 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fhfm\" (UniqueName: \"kubernetes.io/projected/51d94f54-b9da-4765-af7d-44f51913d45f-kube-api-access-8fhfm\") pod \"auto-csr-approver-29551178-s95kv\" (UID: \"51d94f54-b9da-4765-af7d-44f51913d45f\") " pod="openshift-infra/auto-csr-approver-29551178-s95kv" Mar 09 15:38:00 crc kubenswrapper[4722]: I0309 15:38:00.357576 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fhfm\" (UniqueName: \"kubernetes.io/projected/51d94f54-b9da-4765-af7d-44f51913d45f-kube-api-access-8fhfm\") pod \"auto-csr-approver-29551178-s95kv\" (UID: \"51d94f54-b9da-4765-af7d-44f51913d45f\") " pod="openshift-infra/auto-csr-approver-29551178-s95kv" Mar 09 15:38:00 crc kubenswrapper[4722]: I0309 15:38:00.380101 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fhfm\" (UniqueName: \"kubernetes.io/projected/51d94f54-b9da-4765-af7d-44f51913d45f-kube-api-access-8fhfm\") pod \"auto-csr-approver-29551178-s95kv\" (UID: \"51d94f54-b9da-4765-af7d-44f51913d45f\") " pod="openshift-infra/auto-csr-approver-29551178-s95kv" Mar 09 15:38:00 crc kubenswrapper[4722]: I0309 15:38:00.489780 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29551178-s95kv" Mar 09 15:38:01 crc kubenswrapper[4722]: I0309 15:38:01.026086 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29551178-s95kv"] Mar 09 15:38:01 crc kubenswrapper[4722]: I0309 15:38:01.148975 4722 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 09 15:38:01 crc kubenswrapper[4722]: I0309 15:38:01.560670 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551178-s95kv" event={"ID":"51d94f54-b9da-4765-af7d-44f51913d45f","Type":"ContainerStarted","Data":"3993303cb591452e09f5ca2e6ab1a2375147f010601efbd0a7285f892c280bf8"} Mar 09 15:38:02 crc kubenswrapper[4722]: I0309 15:38:02.575501 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551178-s95kv" event={"ID":"51d94f54-b9da-4765-af7d-44f51913d45f","Type":"ContainerStarted","Data":"82efaabe2c219cc78898a3bfa42d1cff597f3966087e42813f7952a6129b3499"} Mar 09 15:38:02 crc kubenswrapper[4722]: I0309 15:38:02.622567 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29551178-s95kv" podStartSLOduration=1.603947175 podStartE2EDuration="2.62252323s" podCreationTimestamp="2026-03-09 15:38:00 +0000 UTC" firstStartedPulling="2026-03-09 15:38:01.038136794 +0000 UTC m=+5721.593705370" lastFinishedPulling="2026-03-09 15:38:02.056712849 +0000 UTC m=+5722.612281425" observedRunningTime="2026-03-09 15:38:02.600328482 +0000 UTC m=+5723.155897078" watchObservedRunningTime="2026-03-09 15:38:02.62252323 +0000 UTC m=+5723.178091806" Mar 09 15:38:03 crc kubenswrapper[4722]: I0309 15:38:03.590639 4722 generic.go:334] "Generic (PLEG): container finished" podID="51d94f54-b9da-4765-af7d-44f51913d45f" containerID="82efaabe2c219cc78898a3bfa42d1cff597f3966087e42813f7952a6129b3499" exitCode=0 Mar 09 15:38:03 crc kubenswrapper[4722]: I0309 15:38:03.590686 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551178-s95kv" event={"ID":"51d94f54-b9da-4765-af7d-44f51913d45f","Type":"ContainerDied","Data":"82efaabe2c219cc78898a3bfa42d1cff597f3966087e42813f7952a6129b3499"} Mar 09 15:38:05 crc kubenswrapper[4722]: I0309 15:38:05.036534 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551178-s95kv" Mar 09 15:38:05 crc kubenswrapper[4722]: I0309 15:38:05.077757 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8fhfm\" (UniqueName: \"kubernetes.io/projected/51d94f54-b9da-4765-af7d-44f51913d45f-kube-api-access-8fhfm\") pod \"51d94f54-b9da-4765-af7d-44f51913d45f\" (UID: \"51d94f54-b9da-4765-af7d-44f51913d45f\") " Mar 09 15:38:05 crc kubenswrapper[4722]: I0309 15:38:05.084336 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51d94f54-b9da-4765-af7d-44f51913d45f-kube-api-access-8fhfm" (OuterVolumeSpecName: "kube-api-access-8fhfm") pod "51d94f54-b9da-4765-af7d-44f51913d45f" (UID: "51d94f54-b9da-4765-af7d-44f51913d45f"). InnerVolumeSpecName "kube-api-access-8fhfm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 09 15:38:05 crc kubenswrapper[4722]: I0309 15:38:05.181224 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8fhfm\" (UniqueName: \"kubernetes.io/projected/51d94f54-b9da-4765-af7d-44f51913d45f-kube-api-access-8fhfm\") on node \"crc\" DevicePath \"\"" Mar 09 15:38:05 crc kubenswrapper[4722]: I0309 15:38:05.638331 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29551178-s95kv" event={"ID":"51d94f54-b9da-4765-af7d-44f51913d45f","Type":"ContainerDied","Data":"3993303cb591452e09f5ca2e6ab1a2375147f010601efbd0a7285f892c280bf8"} Mar 09 15:38:05 crc kubenswrapper[4722]: I0309 15:38:05.638794 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3993303cb591452e09f5ca2e6ab1a2375147f010601efbd0a7285f892c280bf8" Mar 09 15:38:05 crc kubenswrapper[4722]: I0309 15:38:05.638451 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29551178-s95kv" Mar 09 15:38:05 crc kubenswrapper[4722]: I0309 15:38:05.714974 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29551172-c6qz8"] Mar 09 15:38:05 crc kubenswrapper[4722]: I0309 15:38:05.732644 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29551172-c6qz8"] Mar 09 15:38:06 crc kubenswrapper[4722]: I0309 15:38:06.165742 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2378821c-5194-4bff-a140-c2a703479347" path="/var/lib/kubelet/pods/2378821c-5194-4bff-a140-c2a703479347/volumes" Mar 09 15:38:21 crc kubenswrapper[4722]: I0309 15:38:21.528537 4722 patch_prober.go:28] interesting pod/machine-config-daemon-hjrrb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 15:38:21 crc kubenswrapper[4722]: I0309 15:38:21.529177 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 15:38:34 crc kubenswrapper[4722]: I0309 15:38:34.358302 4722 scope.go:117] "RemoveContainer" containerID="6d7459202f7933f771a1f696937f359f9ac4ce1e9144c811ae0db1459862d78e" Mar 09 15:38:51 crc kubenswrapper[4722]: I0309 15:38:51.528313 4722 patch_prober.go:28] interesting pod/machine-config-daemon-hjrrb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 09 15:38:51 crc kubenswrapper[4722]: I0309 15:38:51.529037 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 09 15:38:51 crc kubenswrapper[4722]: I0309 15:38:51.529101 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" Mar 09 15:38:51 crc kubenswrapper[4722]: I0309 15:38:51.530186 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"212e82f1263ddfd9c596ba4b91b126e5985c2c4e08e0901d34421afe340b4cd9"} pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 09 15:38:51 crc kubenswrapper[4722]: I0309 15:38:51.530353 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerName="machine-config-daemon" containerID="cri-o://212e82f1263ddfd9c596ba4b91b126e5985c2c4e08e0901d34421afe340b4cd9" gracePeriod=600 Mar 09 15:38:51 crc kubenswrapper[4722]: E0309 15:38:51.657538 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 15:38:51 crc kubenswrapper[4722]: I0309 15:38:51.747682 4722 generic.go:334] "Generic (PLEG): container finished" podID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" containerID="212e82f1263ddfd9c596ba4b91b126e5985c2c4e08e0901d34421afe340b4cd9" exitCode=0 Mar 09 15:38:51 crc kubenswrapper[4722]: I0309 15:38:51.747740 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" event={"ID":"dac2aaf5-653b-4b2a-8efe-ed26bac8d648","Type":"ContainerDied","Data":"212e82f1263ddfd9c596ba4b91b126e5985c2c4e08e0901d34421afe340b4cd9"} Mar 09 15:38:51 crc kubenswrapper[4722]: I0309 15:38:51.747783 4722 scope.go:117] "RemoveContainer" containerID="4a53fd9ee2bab0740d7bdf686e82ec34043eb1813cf987fb88ca038e6c64eb91" Mar 09 15:38:51 crc kubenswrapper[4722]: I0309 15:38:51.749345 4722 scope.go:117] "RemoveContainer" containerID="212e82f1263ddfd9c596ba4b91b126e5985c2c4e08e0901d34421afe340b4cd9" Mar 09 15:38:51 crc kubenswrapper[4722]: E0309 15:38:51.750471 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 15:39:03 crc kubenswrapper[4722]: I0309 15:39:03.149505 4722 scope.go:117] "RemoveContainer" containerID="212e82f1263ddfd9c596ba4b91b126e5985c2c4e08e0901d34421afe340b4cd9" Mar 09 15:39:03 crc kubenswrapper[4722]: E0309 15:39:03.150483 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648" Mar 09 15:39:17 crc kubenswrapper[4722]: I0309 
Mar 09 15:39:18 crc kubenswrapper[4722]: I0309 15:39:18.150858 4722 scope.go:117] "RemoveContainer" containerID="212e82f1263ddfd9c596ba4b91b126e5985c2c4e08e0901d34421afe340b4cd9"
Mar 09 15:39:18 crc kubenswrapper[4722]: E0309 15:39:18.151991 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648"
Mar 09 15:39:33 crc kubenswrapper[4722]: I0309 15:39:33.150123 4722 scope.go:117] "RemoveContainer" containerID="212e82f1263ddfd9c596ba4b91b126e5985c2c4e08e0901d34421afe340b4cd9"
Mar 09 15:39:33 crc kubenswrapper[4722]: E0309 15:39:33.151543 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-hjrrb_openshift-machine-config-operator(dac2aaf5-653b-4b2a-8efe-ed26bac8d648)\"" pod="openshift-machine-config-operator/machine-config-daemon-hjrrb" podUID="dac2aaf5-653b-4b2a-8efe-ed26bac8d648"